Algorithm breakthrough
I think an algorithmic breakthrough, or at least a complete redesign, will be required once you pass a certain limit - specifically, when the size of the number 2^n and its intermediate calculations exceed available memory, say 1,000 GB (RAM plus swap).
When will we reach it? Will we factor, sieve and PRP faster than Moore's law? If not, we will hit a limit.
2^18T is a truly large number, with an exponent something like 3,000,000 times bigger than where we are now. A new algorithm will be required, unless we are here a long time.
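Rough back-of-the-envelope sketch of the memory argument above (my own illustration, not from the post): a number of the form 2^n takes about n bits to store in binary, so even holding one copy of a candidate near 2^18T already blows past the 1,000 GB figure, before counting FFT scratch buffers.

```python
def storage_bytes(n):
    # 2^n is an (n+1)-bit number; n/8 bytes is a close-enough estimate.
    return n // 8

# n = 18 trillion, i.e. a candidate near 2^18T (illustrative value)
n = 18 * 10**12
gib = storage_bytes(n) / 2**30
print(f"Storing 2^{n} alone needs roughly {gib:.0f} GiB")
# Multiplication via FFT typically needs several times that in working space.
```

Even this single-copy estimate is over 2 TiB, so the 1,000 GB (RAM plus swap) ceiling is hit well before 2^18T.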
Dave/louie/mathguy/someone else - what n will the new client be able to calculate up to?