View Full Version : Nvidia working on first GPGPUs for Apple Macs



IronBits
01-24-2008, 06:30 PM
Graphics chipmaker Nvidia Corp. is in the early developmental stages of its first Mac-bound GPGPUs, AppleInsider has learned.
http://www.appleinsider.com/articles/08/01/24/nvidia_working_on_first_gpgpus_for_apple_macs.html

Short for general-purpose computing on graphics processing units, GPGPUs are a new wave of graphics processors that can be instructed to perform computations previously reserved for a system's primary CPU, allowing them to accelerate non-graphics applications.

The technology -- in Nvidia's case -- leverages a proprietary architecture called CUDA, which is short for Compute Unified Device Architecture. It's currently compatible with the company's new GeForce 8 Series of graphics cards, allowing developers to use the C programming language to write algorithms for execution on the GPU.
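For a sense of what that looks like in practice, here is a minimal sketch of a CUDA kernel and its launch, assuming the standard CUDA toolkit; the kernel name, array size and scale factor are made up for illustration and are not from the article:

#include <cuda_runtime.h>

// Each GPU thread scales one element of the array (illustrative kernel, written in C).
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main(void)
{
    const int n = 1 << 20;                     // about one million floats (arbitrary size)
    float *d_data = NULL;
    cudaMalloc((void **)&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));  // zero-fill so the kernel has defined input

    // Launch enough 256-thread blocks to cover all n elements; the GPU runs them in parallel.
    scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d_data);
    return 0;
}

The point of the sketch is simply that the per-element work is ordinary C code, while the launch syntax spreads it across thousands of GPU threads at once.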

GPGPUs have proven most beneficial in applications requiring intense number crunching, examples of which include high-performance computer clusters, raytracing, scientific computing applications, database operations, cryptography, physics-based simulation engines, and video, audio and digital image processing.

It's likely that the first Mac-compatible GPGPUs would turn up as build-to-order options for Apple's Mac Pro workstations due to their ability to aid digital video and audio professionals in sound effects processing, video decoding and post-processing.

Precisely when those cards will crop up is unclear, though Nvidia through its Santa Clara, Calif.-based offices this week put out an urgent call for a full-time staffer to help design and implement kernel-level Mac OS X drivers for the cards.

Nvidia's $1500 Tesla graphics and computing hybrid card, released in June, is the chipmaker's first product explicitly built for both graphics and high-intensity, general-purpose computing.

Programs based on the CUDA architecture can not only tap the card's 3D performance but also repurpose its shader processors for advanced math. The massively parallel nature of the hardware leads to tremendous performance gains compared to regular CPUs, Nvidia claims.

In science applications, calculations have seen speed boosts from 45 times to as much as 415 times when processing MRI scans for hospitals. Increases such as this can mean the difference between using a single system and a whole computer cluster to do the same work, the company says.

Paratima
01-24-2008, 07:47 PM
In science applications, calculations have seen speed boosts from 45 times to as much as 415 times...

Yow! :scared:

jasong
02-03-2008, 01:43 AM
Yow! :scared:
I have a friend who made an LLR client that utilizes the graphics card. He says it goes so fast that it makes you think the program's busted. A Mersenne-number test that now takes ~30 days? How about taking that down to hours? What about a sieving client that's about 19 times faster (I think that's the number, not sure) simply because the code has been ported to run on the graphics card?

Paratima
02-03-2008, 10:52 AM
Everything digital is slower than almost anything analog. Analog computers were blazingly FAST! But (sigh) we didn't take that road.

As to super-fast computing on video cards... Talk is cheap. Someone gimme a linky.

jasong
02-05-2008, 05:29 PM
Everything digital is slower than almost anything analog. Analog computers were blazingly FAST! But (sigh) we didn't take that road.

As to super-fast computing on video cards... Talk is cheap. Someone gimme a linky.
The dude is really sick. The doctors say he has about a year left.

At the moment, I'm trying to convince him to put me in his will. Not for any material assets, mind you, but for possession of the source code and some other stuff. If he dies (I kind of like the idea that someone might have some way to treat him, a just-in-time sort of thing. :) ), I'd like to place the code under an open-source license, with a plea to the open-source community to make the original source (not just the later, edited stuff) accessible for as long as it's relevant. It's really important to me that he gets credit for his work.

As for why it's not released RIGHT NOW, the guy has about a year to live. He's not interested in fame and fortune when the timeline is so short. If he were looking forward to 30-50 more years, then yeah, people would be using the program right now.

Not to be morbid, but I haven't heard from him in about 8.5 days, so this stuff may be out there in the next month or two.

PCZ
02-06-2008, 02:28 AM
Nvidia have bought Ageia.
The physics card makers.

Don't think we will see NV's number-crunching solution restricted to Macs for very long.