Here is a quick missed factors post for Joe's graph 1M<n<20M ...
Code:
Lower n  1M  2M  5M  6M  9M  11M  14M  15M  16M  17M  19M
Upper n  2M  3M  6M  7M  10M  12M  15M  16M  17M  18M  20M
49T      3   2
50T      6   5   1   1   1   1   2   2
78T      2
89T      2   1
99T      1   1   1   1   1   1
O.K. posting Joe's work on his behalf...
You can also see some of the other ranges we tested above 44T...
- Black dots: what was found before, plus factrange.txt submissions for the 1M<n<20M dat.
- Blue dots: new factors found using the 991<n<50M dat (and missed factors n<20M).
- Green dots: Ironbits' work in progress; he hasn't submitted, or we haven't processed, those n>20M factors yet.
Now if you look at the blue strips:
The blue strips 20M<n<50M are ranges we tested with 991<n<50M. If there are blue dots below 20M, these were previously missed factors. So you can see that Joe actually "hit a jackpot" around 99T.
Last edited by vjs; 06-14-2005 at 10:08 AM.
I think the next aim should be to get to 88T as fast as possible, in order to increase sieving speed, too.
vjs, can you assign me a range of 1T near 80T, please?
Thank you in advance, Yours H.
79000-80000 HHH

Originally posted by hhh
I think the next aim should be to get to 88T as fast as possible, in order to increase sieving speed, too.
vjs, can you assign me a range of 1T near 80T, please?
Thank you in advance, Yours H.
Is that near enough?
It's not VJS but that Other guy. Will that do?
Joe O
HHH,
Just to let you know, a lot of the other ranges less than 88T are being worked on as well. Ironbits just took a chunk below yours. We will see what happens with this slightly higher T. There are a few places where no factors <3M were reported, and maybe some vice versa; can you say 300K<n<3M dat and 3M<n<20M dat confusion?
This is part of the reason why we are sticking with 991<n<50M.
If I'm nice to Joe he'll run another stats update once we get everything below 50T done. I'm the only one with a range less than 50T outstanding; we will also see what effect the 9k had, etc.
I've got four machines running on 991<n<50M. Currently I can bang out 1T in about 2 weeks (assuming none of the machines crashes, that is). Right now I'm doing high-range (862T-863T) sieving for the main effort. If it would help at all, I'd be happy to switch to second-pass sieving on lower T ranges after the latest range finishes (around July 18).
Jandersonlee,
It's your call on the low-p high-n or high-p high-n. Sieving both is useful, but we haven't been finding a lot of missed factors lately with the low-p. The major push right now is basically to finish everything less than 100T. It's not a rush as long as we can get them done.
If you'd like to try a 1T range I can assign one to you. If you'd like to continue with the main effort using 991-50M from 862000 down this is great as well.
Thanks for the help and let me know if you want a range of low-p.
I guess you are checking every number sent to factrange@yahoo.com to see if it is really a factor, like the submission script does, but are you also checking for factors found the first time but missed the second time? This might help discover hardware or software problems.
H.
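For what it's worth, the double-check hhh describes reduces to a set comparison of the two factor files. A minimal sketch, assuming each file holds one factor per line in the usual p | k*2^n+1 format (the project's actual scripts may differ):

```python
# Report factors present in a first-pass factor file but missing from the
# second pass -- a possible sign of a hardware or software problem.
def load_factors(path):
    factors = set()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                # normalize whitespace so "p | k*2^n+1" and "p|k*2^n+1" match
                factors.add(line.replace(" ", ""))
    return factors

def missed_second_time(first_pass_file, second_pass_file):
    return load_factors(first_pass_file) - load_factors(second_pass_file)
```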
Actually yes. That's one of the reasons why we are asking for factexcl.txt files.

Originally posted by hhh
I guess you are checking every number sent to factrange@yahoo.com to see if it is really a factor, like the submission script does, but are you also checking for factors found the first time but missed the second time? This might help discover hardware or software problems.
H.
Joe O
O.K. Guys and Gals!!!
High Time for a High-n_low-p Update.
Joe, Just ran another dat update on the 991-50M dat, we are doing great. Looking at the data a couple things stand out.
- We almost eliminated an additional 1% of the tests from 20M<n<50M with this update. This is really saying something since we are now at a higher T than before.
I'm always curious about how many missed factors we have found.
- 18 missed factors were found!!!
18 when I only count those which eliminate tests. I always try to be careful when counting these; they are for 2M<n<20M and only for those k which have no prime.
We found quite a few missed ones n<2M, but since secondpass is beyond that level it's hardly fair to count them.
Look for more data tomorrow; also, a new dat will be working its way through the system. But for now here are the first couple of pics.
O.K., one more graph; this one is a little difficult to explain. The axes have changed slightly...
The 100% sieve level for the 991-50M dat is currently sitting around 53T, however a large number of ranges above 53T and less than 100T have been totally sieved.
The y-axis has not changed; it still represents the total number of k/n pairs in the 991-50M dat. The x-axis represents the number of T sieved below 100T. I couldn't think of a better variable than this one to show our progress; it seems to work well.
From the graph it's apparent that we had a fairly large decrease. I'd also like to point out k=27653 has been removed from all data past and present. In other words there is no effect from the prime in these results.
Last edited by vjs; 07-19-2005 at 04:19 PM.
Time for a monthly update??
Hey Nuri,
You're probably right, but let's give it another week or two; we will have a little more by then.
don't we always have a little more every week?
Yes, we even have a little more every day. It's hard to balance the work of producing a dat vs. the work saved by using a smaller dat. There is also the fact that many people do not update their dat that often. A 1% or 2% savings does not warrant stopping and starting the sieve programs. It's just easier to keep the nextrange.txt file full with the next ranges to sieve, and possibly more productive.

Originally posted by ShoeLace
don't we always have a little more every week?
Now if you are talking about pictures, that is a different story. Those are easy to produce. Now having said that here they are. First the 50T-100T range:
Joe O
Thanks for doing the pictures Joe,
Shoelace, yes we do have a little more every week. But it's more an issue of users submitting the data and what data has been submitted. There are only two people working below 100T afaik, Hades_Au and Stromkarl. Both of these guys are trying to finish off the gap around 55T.
I was thinking it would be nice to do another stats run once they hand in those ranges. I'd like to tabulate everything that has happened at some point.
The greater than 500T stuff is best represented graphically, as Joe_O has shown.
vjs/JoeO
can you please give a list of unassigned ranges below 100T. This is just to get a feeling of where we are.
And as far as the request is concerned, yes, I was talking about the pictures, and thx for the quick reply.
In my opinion, an update after each 100T is finished would be enough for the dat file.
Originally posted by Nuri
vjs/JoeO
can you please give a list of unassigned ranges below 100T. This is just to get a feeling of where we are.

Or, you can see for yourself here.
Code:
70000 74000 991<n<50M <------------ Available ------------->
81000 85000 991<n<50M <------------ Available ------------->
Joe O
Hmm. Have some idle machines at the moment.
70000-71000 jandersonlee [reserved] (w/ 991<n<50M)
Or is there somewhere else to record this?
Jeff
I'll record it for you. But yes, there is a place to make reservations. The best way is to email factrange at yahoo dot com with, for example:

Originally posted by jandersonlee
Hmm. Have some idle machines at the moment.
70000-71000 jandersonlee [reserved] (w/ 991<n<50M)
Or is there somewhere else to record this?
Jeff
70000-71000 jandersonlee [reserved] (w/ 991<n<50M) in both the subject and the body of the email. But any old email will do.
Joe O
I'm not sure if I want to help with the low-p sieving or not. I have 2 computers, 256K cache on each, 1.25GHz AMD something and a 1.75GHz Sempron.
If anybody knows what these two babies can do in two weeks time, check to see if the number is more than 1 trillion. If it is, go ahead and assign me something.
Alternately, tell me a range that has been sieved, and I'll tweak the necessary .dat file so it'll run 200 million and give me an accurate measure in Sobistrator. (You know what? Never mind. I'm going to go ahead and do the tweaking. brb)
Edit: Total sieving speed of 625 kp/s, about 2 wks 4 days for a trillion. A little too much, but if you want to break the rule about 1T chunks, I'd be willing to tackle 0.5 or 0.75 trillion.
Last edited by jasong; 08-18-2005 at 10:27 PM.
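As a sanity check on that estimate: at 625 kp/s, a 1T chunk takes 10^12 / 625,000 seconds, roughly 18.5 days, which matches the quoted 2 wks 4 days. A quick sketch:

```python
def days_for_range(range_size_p, rate_p_per_sec):
    """Days needed to sieve a p-range of the given size at a given rate."""
    seconds = range_size_p / rate_p_per_sec
    return seconds / 86400  # seconds per day

# 1T range at a combined 625 kp/s
print(round(days_for_range(1e12, 625e3), 1))  # 18.5 days, about 2 wks 4.5 days
```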
Well, I finished up my 865000-870000 range over the weekend; here are some stats for you guys. Out of the 5T range I found 109 factors with n<20M and 247 factors with 991<n<50M.
Basically the factor density over this range was....
20 factors / 1T with n<20M
49 factors / 1T with 991<n<50M
Still pretty good IMHO, I'll let you guys and gals know what happens with the 1120T.
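Those densities follow directly from the raw counts, assuming the 247 figure is the total for the full 991<n<50M dat over the 5T range:

```python
range_T = 5             # 865000-870000, i.e. a 5T range
factors_below_20M = 109
factors_full_dat = 247  # 991 < n < 50M

print(factors_below_20M / range_T)  # 21.8, quoted as ~20 factors/T for n<20M
print(factors_full_dat / range_T)   # 49.4, quoted as ~49 factors/T for 991<n<50M
```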
Looks like the factor submission form http://www.seventeenorbust.com/sieve could now be accepting some n values above 20M. Not sure if it's intentional, but I was alerted to a few being in results.txt
n=22418391
n=23423165
n=31287881
n=20134244
n=28359068
then I submitted this one that I had lying around
Code:
Factors
899671439917729|4847*2^26696511+1

Verification Results
899671439917729 4847 26696511 verified.
Factor table setup returned 1
Test table setup returned 1
1 of 1 verified in 0.11 secs.
1 of the results were new results and saved to the database.
This is what the factor submit page says right now:
Appears someone worked on it.

For those interested, here are the constraints on what the verifier will accept:
k = one of the 9 left
1000 < n < 50000000
1000000000000 < p < 2^64
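Given those constraints, a factor can be pre-checked locally before pasting it into the form. A sketch, not the server's actual code; the set of nine k values is my reading of which k were still open at this point in the project:

```python
def check_submission(p, k, n):
    """Pre-check a candidate factor p | k*2^n+1 against the verifier's stated rules."""
    # the nine k values still without a prime at the time of this thread (assumption)
    open_k = {4847, 10223, 19249, 21181, 22699, 24737, 33661, 55459, 67607}
    if k not in open_k:
        return False
    if not (1000 < n < 50_000_000):
        return False
    if not (1_000_000_000_000 < p < 2**64):
        return False
    # finally, make sure p really divides k*2^n+1 (modular exponentiation)
    return (k * pow(2, n, p) + 1) % p == 0

# the factor verified earlier in the thread
print(check_submission(899671439917729, 4847, 26696511))  # True
```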
I'll have to try and submit my old factrange.txt contents, I still have all of them around.
I just wanted to point out the status of the 991<n<50M dat double check...
All ranges 0-72000 have been completed using the 991<n<50M.
0 72000 Completed
Here is what's going on between 72T and 100T; as you can see, we are well on our way to finishing everything less than 100T.
72000 74000 Reserved Stromkarl
74000 78000 Completed Ironbits
78000 79000 Completed Joe O
79000 80000 Completed HHH
80000 81000 Completed Hades_Au
81000 85000 Available <---- lowest reservation point
85000 88000 On Hold Or combined Effort
88000 89000 Reserved Joe_o
89000 90000 Completed Joe O
90000 91000 Combined Effort Reserved HHH
91000 95000 On hold Or Combined effort
95000 97000 Completed e
97000 100000 Complete VJS
I'd also like to add that several smaller ranges 100T<p<500T have been completed by Joe_O, checking holes etc.
Notice! The sieve posting page may be broken: if you enter a factor with n above 32 million (probably an unsigned 32-bit integer) it does not verify the factor, and it appears not to verify factors entered after it on the same page! Will an admin please check if that's the case; some people may be losing factors to this.
920902464340249 | 4847*2^27525327+1 : Works
920526825028453 | 21181*2^28359068+1 : Works
920754397985377 | 21181*2^33801212+1 : Does not work
It's only recently that the server accepts any factors with n>20M; those >20M must be e-mailed to factrange@yahoo.com once the entire range is finished.
32 million is kind of a random number for it to break on. A 32-bit signed integer would have problems at 2.1 billion, whereas an unsigned 32-bit would mess up at 4.2 billion. 32 million is roughly 25 bits... doesn't ring any bells for me...

Originally posted by pixl97
Notice! The sieve posting page may be broken: if you enter a factor with n above 32 million (probably an unsigned 32-bit integer) it does not verify the factor, and it appears not to verify factors entered after it on the same page! Will an admin please check if that's the case; some people may be losing factors to this.
920902464340249 | 4847*2^27525327+1 : Works
920526825028453 | 21181*2^28359068+1 : Works
920754397985377 | 21181*2^33801212+1 : Does not work
It actually rings a bell.
Somewhere around that range is the minimum n for the EFF prize (money) winning prime. So in other words, the limit may be an intentional program limit created by the coder, etc...
Ya, I tested this again today. Still, in my eyes, a serious problem..
1 of 17 verified
921011660940941 | 55459*2^17205790+1 : Verified
921068079832919 | 4847*2^36387327+1 : No
921076620830321 | 55459*2^13850266+1 : No - should have
921105537390287 | 24737*2^15437407+1 : No - should have
921126590368079 | 22699*2^48725398+1 : No
921156430133813 | 55459*2^30391546+1 : No - should have
921165601914619 | 55459*2^45318190+1: No
921286274117719 | 4847*2^17870991+1: No - should have
921297091642001 | 4847*2^23853111+1: No - should have
921300547253111 | 4847*2^41540007+1 : No
921310300068559 | 55459*2^46574014+1 : No
921319782560293 | 21181*2^40936292+1 : No
921327573029173 | 33661*2^9838296+1: No - should have
921340761412397 | 55459*2^28030474+1: No - should have
921355297438757 | 55459*2^33958954+1 : No
921358184252041 | 24737*2^13649527+1: No - should have
921368678312893 | 24737*2^31790383+1: No - should have
Personally I think this is a major problem: people using the 50M dat who are not paying attention will miss entering factors. Just one factor above 32M and the rest after it DO NOT VERIFY.
I've been removing the ones over 32M when posting now so they will show up and sending everything to factrange@yahoo.com
Update: I entered the 9 'should have' factors afterwards by themselves and they entered correctly; just remember anything over 32M will cause problems.
9 of 9 verified in 0.61 secs.
9 of the results were new results and saved to the database.
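The workaround described above (keep the n<32M factors for the form, e-mail the rest) is easy to script. A sketch, assuming factors come one per line in p | k*2^n+1 form:

```python
import re

# matches lines like "920902464340249 | 4847*2^27525327+1"
FACTOR = re.compile(r"^\s*(\d+)\s*\|\s*(\d+)\*2\^(\d+)\+1\s*$")

def split_results(lines, n_limit=32_000_000):
    """Split factor lines into (form-safe, e-mail) lists based on the form's n limit."""
    form_ok, email_only = [], []
    for line in lines:
        m = FACTOR.match(line)
        if m and int(m.group(3)) > n_limit:
            email_only.append(line.strip())  # send these to factrange@yahoo.com
        else:
            form_ok.append(line.strip())     # safe for the web submission form
    return form_ok, email_only
```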
Well, the ones less than 20M are very troubling. Could you e-mail Louie or Alien88 about this problem... Thankfully we have a backup record with factrange@yahoo.com.
I wrote Louie about two factors I found by P-1 that I can't submit at the beginning of this month, still haven't heard from him.
If that were the case, then it would only apply to n >= 33,219,281. Of course, that's not considering k, so if k is 67607, then it would be n >= 33,219,265.

Originally posted by vjs
It actually rings a bell.
Somewhere around that range is the minimum n for the EFF prize (money) winning prime. So in other words, the limit may be an intentional program limit created by the coder, etc...
However, this probably isn't the case, simply because the EFF prize only applies to prime numbers; if we have a factor, then that k/n pair is definitely not prime. I wonder what the admins are up to; they seem to be busy/absent recently, and they're really the only ones who can give us a heads-up on what might be happening.
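The thresholds quoted above are easy to reproduce: 33,219,281 is the smallest n with 2^n exceeding 10^10,000,000 (the EFF ten-million-digit mark), and the k=67607 figure just subtracts the digits contributed by k:

```python
import math

log10_2 = math.log10(2)

# smallest n with 2^n exceeding a 1 followed by ten million zeros
print(math.ceil(10_000_000 / log10_2))                        # 33219281

# with k = 67607 in front, k contributes log10(67607) ~ 4.83 digits
print(math.ceil((10_000_000 - math.log10(67607)) / log10_2))  # 33219265
```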
We are still accepting all factors at factrange at yahoo dot com.
Here is a picture of the most recent results:
Joe O
It should be fixed now; let me know if you're still having problems.

Originally posted by hc_grove
I wrote Louie about two factors I found by P-1 that I can't submit at the beginning of this month, still haven't heard from him.
It never ends does it...
How about it saying:
"k = one of the 8 left"
;-)