Since GIMPS reports work done in P90 CPU years, is there a good reference for what this means in terms of modern chips? Now that we've got all these new instruction sets and bigger caches, it would seem that simply scaling by clock speed (megahertz) would no longer give adequate results.

I'm curious how many P90 years an hour on something like a 2.4 GHz P4 would be.
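
To be clear, I get that the arithmetic is trivial once you know the throughput ratio; the ratio itself is what I'm after. Here's a quick Python sketch of the conversion. The 40x speedup figure for the P4 is a placeholder I made up, not a real benchmark number:

# One hour on a machine that is N times faster than a P90 at LL
# iterations counts as N hours of P90 work; divide by hours per
# year to express that as P90 years.
HOURS_PER_YEAR = 24 * 365  # 8760

def p90_years(hours_on_machine, speedup_vs_p90):
    return hours_on_machine * speedup_vs_p90 / HOURS_PER_YEAR

# Hypothetical: if a P4 2.4 were 40x a P90 on LL iterations
# (a guess; the real ratio depends on SSE2, cache, and the FFT code),
# one hour would be about 0.0046 P90 years.
print(p90_years(1.0, 40.0))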


Anyone remember the Murphy's Law that says any measurement will be given in the least useful units, such as expressing the speed of light in furlongs per fortnight?