
Thread: A New Estimate for When We Will Find Primes above n = 3 million

  1. #41
    There is a 50% probability that we will get all 12 primes by n=2.9 x 10^13

    Does this mean that there is a 50% probability that we will NOT see the Sierpinski problem solved in our lifetimes, assuming that no quantum computers are invented?
    You think we'll reach n=2.9 x 10^13 in our lifetimes?

  2. #42
    Sieve it, baby!
    You think we'll reach n=2.9 x 10^13 in our lifetimes?
    Depends on the lifetime we've got left.
    I think it is widely distributed among the participants...

  3. #43
    Depends on the lifetime we've got left.
    Let's assume the following:

    1) We tested 12 K's from 1 to 3M in 1 year

    2) Computer speed doubles every year

    3) Number of searchers doubles every year

    4) A test twice as large takes 4 times longer

    All of the above are very optimistic assumptions, to make up for the removal of K's.

    This means we'll make progress of about 3M per year.

    (2.9 x 10^13) / (3 x 10^6) is about 10 million years

    Even if I'm a couple of orders of magnitude off...
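
    A minimal sketch of that division in Python (the 3M-per-year rate and the n = 2.9 x 10^13 target are taken from the posts above; the assumption that the optimistic factors exactly cancel the growing cost per test comes from the list of assumptions, not from any official Seventeen or Bust figure):

    ```python
    # Back-of-the-envelope check of the estimate above.
    # Assumes a constant net progress of 3 million in n per year,
    # i.e. that the optimistic factors (more users, faster machines)
    # exactly cancel the growing cost per test.
    target_n = 2.9e13          # 50%-probability level quoted in post #41
    progress_per_year = 3e6    # assumed net rate from the list above

    years = target_n / progress_per_year
    print(f"{years:.1e} years")  # ~9.7e+06, i.e. roughly 10 million years
    ```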

  4. #44
    Sieve it, baby!
    (2.9 x 10^13) / (3 x 10^6) is about 10 million years
    If you want to die that early...

  5. #45
    Senior Member
    2) Computer speed doubles every year
    That would only be true if every new user has a faster computer than the current searchers have, and/or if the current searchers bring newer machines to the project.

    3) Number of searchers doubles every year
    This is a very, very, very optimistic assumption. Yes, the number of new users grows a bit every day (and will grow a lot when a new prime is discovered), but the number of active users hasn't increased significantly in recent months (if anything, I think it has decreased).

    It would be nice to have an "active users in recent months" graph in the stats.

  6. #46
    Originally posted by Moo_the_cow
    Does this mean that there is a 50% probability that we will NOT see the Sierpinski problem solved in our lifetimes, assuming that no quantum computers are invented?
    First let’s update that estimate. The bad news is that several of Yves Gallot’s Proth Weights were smaller than my estimates, and this substantially increases the time until we are likely to have found all of the primes. Using his Proth Weights and assuming no primes found by n=3.5M, the 50% probability estimate for finding all the primes is n=10^14.

    Let’s assume this Proth Weight model is good enough for such ridiculous estimates, and assume no new algorithmic discoveries over the course of Seventeen or Bust, and assume that Moore’s Law continues to double processing power every eighteen months, and then do a back-of-the-envelope estimation.

    On the resource allocation thread we are discussing models of the time needed to complete a primality test. One of the models says it grows with n as n^2 * (log(n))^2. If that's right, then the 50th-percentile primality test will take 4*10^15 times as long as the present tests around n=3.5M. If we want to finish that test in the same amount of time as today, we need about 52 doublings of computer power, which, at one doubling every 18 months, will take about 78 years.

    But there is another problem: clearing each successive power of 10 takes about 10 times as many tests. We cleared about a half-power of 10 in a year. To clear powers of 10 at the same pace at the end, we would need to complete each primality test about 3*10^7 times faster. Maintaining that pace through powers of 10 would finish in only 15 years, though, so we can slow down by a factor of 5, to 6*10^6. In addition, we will only be testing one k value instead of 17, and it will be a low Proth weight value, so a speed increase of only 2*10^5 is probably sufficient. This is another 18 doublings, requiring an additional 26 years.
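
    A minimal sketch of that arithmetic in Python, assuming the n^2 * (log(n))^2 cost model, the current frontier around n=3.5M, the n=10^14 target, and one Moore's-law doubling every 18 months; the correction factors (3*10^7, the slow-down by 5, and roughly 30x for testing one low-Proth-weight k instead of 17) are taken from the paragraph above, so this is only a rough check, not an official projection:

    ```python
    import math

    N_NOW, N_TARGET = 3.5e6, 1e14   # current frontier and 50%-probability level
    DOUBLING_YEARS = 1.5            # Moore's law: one doubling every 18 months

    # Assumed cost model for one primality test: t(n) ~ n^2 * (log n)^2
    def relative_cost(n, n0=N_NOW):
        return (n / n0) ** 2 * (math.log(n) / math.log(n0)) ** 2

    # 1) Speed: make one test at n = 1e14 take as long as a test does today.
    slowdown = relative_cost(N_TARGET)                   # ~4e15 times more work per test
    years_speed = math.log2(slowdown) * DOUBLING_YEARS   # ~52 doublings -> ~78 years

    # 2) Throughput: each power of 10 needs ~10x as many tests as the last.
    factor = N_TARGET / N_NOW   # ~3e7: tests per power of 10 at the end vs. today
    factor /= 5                 # allowed slow-down vs. the half-power-per-year pace
    factor /= 30                # one low-Proth-weight k left instead of 17 -> ~2e5
    years_throughput = math.log2(factor) * DOUBLING_YEARS   # ~18 doublings -> ~26 years

    print(round(years_speed + years_throughput))     # ~104 years in total
    ```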

    So, assuming no new algorithmic improvements, and assuming all these modeling assumptions, it looks like there is a 50% probability that Seventeen or Bust will require more than 104 years to finish. Under these assumptions, I would say that yes, there is a greater than 50% chance the Sierpinski problem will not be solved in our lifetimes.

    Note to smh: you missed that computer speed overtakes you. If you are only advancing 3M in n per year, the n's don't double fast enough, so the growing computing power goes into completing tests faster rather than into harder tests in the same amount of time.

    I don’t despair, though. It’s silly to assume the probability model is still useful at such extremes, and it’s even sillier to assume no new algorithmic breakthroughs will occur.

  7. #47
    Unholy Undead Death

    Almost a year has passed.

    Well, how about CURRENT estimates...
    wbr, Me. Dead J. Dona


  8. #48
    In the near and long-term future, advances in optical processing, nanotechnology, quantum computers, ... will certainly help in solving this and many more problems much faster!
    Don't you agree?

  9. #49
    Ah, something I forgot: the development of grid computing or utility computing, or whatever you call it. These developments will certainly reduce the number of CPU cycles wasted in the future by the millions of computer users in the world.
    Another thing: possible alliances in the near future with platforms such as BOINC, so you can reach many more people...
