I WISH!!!

I almost never talk about absolute amps/cost because of the on/off nature of electrical devices.
Video cards, for example: do they draw less power when you put the card/monitor to sleep?
How about when the monitor is just blanked out or turned off?
How about when it's just displaying text/graphics while surfing, etc.?
How about when you fire the beast up for some heavy gaming?
I'm not sticking my Kill-A-Watt on each system and getting anal about it...
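That said, if you ever do want a ballpark, it's really just a duty-cycle weighted average across those states. A quick Python sketch; every wattage and hours-per-day figure in it is a made-up example, so plug in your own meter readings:

# Back-of-envelope duty-cycle average for a video card.
# All wattages and hours below are invented examples, not measurements.
states = {
    "sleep":   (5,   8),   # (watts, hours per day)
    "blanked": (25,  4),
    "surfing": (80,  8),
    "gaming":  (250, 4),
}

total_wh = sum(watts * hours for watts, hours in states.values())
avg_watts = total_wh / 24
print(f"average draw: {avg_watts:.0f} W ({total_wh / 1000:.1f} kWh/day)")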

Yes, it can reach the 120s here in the summer, and the BIG A/C unit eats 30 amps each time it comes on.
So I tossed a nice window A/C unit into the computer room window; it runs at about 15 amps when the compressor is on.
As you said, probably half the time with the compressor on, half the time in fan mode (~5 amps)...
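Rough math on that window unit, assuming 120 V, a 50/50 compressor duty cycle, about 12 hours of run time a day, and a guessed-at rate; none of those numbers are gospel, swap in your own:

# Window A/C estimate; treats amps x volts as watts (power factor ignored).
VOLTS = 120             # typical US window-unit circuit
COMPRESSOR_AMPS = 15    # draw with the compressor on
FAN_AMPS = 5            # fan-only draw
DUTY = 0.5              # fraction of run time the compressor is on (assumed)
HOURS_PER_DAY = 12      # assumed run time; climate-dependent
RATE = 0.25             # $/kWh, a guess; check your own bill

avg_watts = VOLTS * (DUTY * COMPRESSOR_AMPS + (1 - DUTY) * FAN_AMPS)
kwh_month = avg_watts / 1000 * HOURS_PER_DAY * 30
print(f"~{avg_watts:.0f} W average, ~{kwh_month:.0f} kWh/mo, ~${kwh_month * RATE:.0f}/mo")

On those guesses it lands around a hundred bucks a month, which beats cycling the big 30-amp unit just to cool one room.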

I try to keep my computer room free of crunchers so the servers and my game box can enjoy nice cool, dry air in here, with the door shut.
I've run $500-per-month electricity bills over in Laguna, California, and it doesn't take much around here to get that high either.
If I turn everything off, I can get it down to around $250 in the summer months and about $75 in the winter (gas heat).
As it is, I pay close to $300 every month, all year long now, on that new monthly average plan they have here.
$3,600 per year in electricity to survive the heat and for the privilege of running a few servers and a half-dozen crunchers...
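For what it's worth, working backwards from that bill at an assumed flat rate gives a feel for the continuous load; the $0.25/kWh below is a guess, not my actual tiered rate:

# Sanity check: what does $3,600/year look like as a 24/7 load?
ANNUAL_COST = 3600   # $/year, from the bill
RATE = 0.25          # $/kWh, assumed flat rate; real tiered rates vary

kwh_year = ANNUAL_COST / RATE
avg_watts = kwh_year * 1000 / (365 * 24)
print(f"{kwh_year:.0f} kWh/year, i.e. ~{avg_watts:.0f} W running 24/7")

Call it roughly a kilowatt and a half humming around the clock, which is in the right neighborhood for a few servers, a half-dozen crunchers, and the A/C on top.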