Argent, I thought we had agreed to disagree. That didn't fly for you, huh? I guess I did open the box by throwing out the Ferrari thing after I'd offered to agree to disagree. That was a mistake. OK, I've got a little more left before I declare this debate pointless.

From: Argent Stonecutter
Digital electronic components, however, don't "wear out faster" if you run them slower than 100% of rated max.
If one were to argue purely for the sake of arguing, one could take issue with this point. I don't necessarily agree, but here's how the logic might go:
If you leave an electronic component completely unused, it will simply rot like anything else, as its various components corrode. The flow of electricity through circuitry can and does prevent oxidation of conductive materials. Therefore, SOME degree of steady usage will make an electronic device last longer than no usage at all. Carry that to the next logical step, and it's not inconceivable that constant full throttle usage will keep a device in better shape than intermittent limited usage.
We see analogous behavior all the time in mechanical devices like cars. Stop and go city driving kills a vehicle MUCH faster than steady highway driving.
A processor is an electronic device, but in a sense it's also a mechanical one. It's full of millions upon millions of tiny switches. I'm not sure anyone really knows whether those switches will break down faster through the mechanical wear and tear of flipping on and off constantly, or through corrosion from sitting idle. There are theoretical arguments to be made in both directions.
Feel free to agree, disagree, or simply ignore all this so far. It's just a silly logical exercise, and not even a terribly good one, meant mainly to show the futility of continuing to argue about this.
So, silly-logic aside, here's what I'll now say on the subject from a more practical standpoint. Hopefully this will be the point we can agree to disagree on.
Even if the constant full usage of a CPU will wear it out faster than more moderate use, I don't believe for a second that that wear and tear could possibly shorten the processor's lifespan so dramatically as to cause it to fail before it simply becomes obsolete. That's the only point that's really important, as far as I'm concerned.
The simple fact is that by the time the 3-year warranty on my laptop is up, it will be time to upgrade to a new one, whether or not the old one is physically on its last legs. So what would be the point in lessening my enjoyment of it in order to make it last longer physically? I'd much rather continue to drive it at 100 MPH across the finish line than Sunday-drive to the same inevitable destination.
From: Argent Stonecutter
The hotter they are, the faster they wear out, this is uncontroversial.
I'm not sure I can fully agree with that. If a device is heated beyond its safety margin, then sure, it will wear out prematurely. But as long as it's kept within its design limits, it will just run and run and run.
But again, even if you're right that every degree of heat lessens the lifespan, you'll never convince me that any amount of heat within the design threshold will lessen it so much that the device will fail before it's obsolete.
Fact: any laptop older than 3-4 years is a relative dinosaur. Operating systems and programs continue to evolve at an ever-increasing pace, and only get more and more power-hungry. Desktops tend to get upgraded little by little along the way, so they're easier to keep current. Laptops are closed systems, which need to be replaced every few years no matter what. Barring defects, CPUs last longer than that 3-4 years, no matter how much you use them, as long as you don't overheat them.
From: Argent Stonecutter
You may consider that I am too conservative, but you can't contradict the fact that all other things being equal, running a system cooler will increase its expected lifespan
For all practical purposes, I just did. Running a laptop cooler won't make it any more usable in 3-4 years than running it hot the whole time. Obsolescence is obsolescence.
From: Argent Stonecutter
Since there is no downside to throttling FPS and running SL at less than 100% of the available frame rate, what exactly are you trying to accomplish by belittling my digital phallus?
Excuse me, what? Now I KNOW you're arguing just to argue. With all due respect to you and your self-anointed giant pillar of virtual masculinity, please don't go there again.
From: Argent Stonecutter
I don't mind having a smaller clock than you, I'm just pointing out that a benefit (however derisory you consider it) of throttling SL.
I don't think it's a significant enough benefit, if it's even a benefit at all, to bother with.
From: Shambolic Walkenberg
To me it doesn't matter if we're talking laptop or desktop - Why have an application that even when doing very little still sucks up all available resources?
Why not? There's no harm in it, as I've been trying to point out, between the "digital phallus" remarks.
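For clarity on what we're actually arguing about, the "throttling" in question is nothing exotic. Here's a rough sketch in Python (purely illustrative; the viewer is obviously not written this way) of the difference between an uncapped render loop and one capped to a target frame rate:

```python
import time

TARGET_FPS = 30                     # hypothetical cap; pick whatever you like
FRAME_BUDGET = 1.0 / TARGET_FPS     # seconds allowed per frame

def render_frame():
    """Stand-in for the real per-frame update/draw work."""
    pass

def run_uncapped():
    # Renders as fast as the hardware allows, so the CPU stays pegged
    # at 100% even when nothing interesting is happening on screen.
    while True:
        render_frame()

def run_capped():
    # Sleeps away whatever is left of each frame's time budget, which
    # is what lets CPU usage drop below 100% while idling in a scene.
    while True:
        start = time.monotonic()
        render_frame()
        leftover = FRAME_BUDGET - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)
```

Whether that sleep is worth having is exactly what we're disagreeing about.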
From: Shambolic Walkenberg
Maybe I don't want my CPU fan to be spinning faster than necessary (no matter what fan you have, the lifespan is still revolution dependent), or to have so much air flowing through my case I can barely hear myself think just to keep the internal temps down.
Realistically, do you know how long it would take to kill a CPU fan just by letting it spin? In practice, it would comfortably outlast the useful life of the machine it's cooling. I've personally never seen a fan die, be it a CPU fan, a house fan, an automobile engine fan, or any other type. I'm sure they do fail from time to time, just like all mechanical devices, but it's an exceedingly rare occurrence.
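If you want a rough back-of-the-envelope check (the rated-life figure below is an assumption for illustration, not the spec of any particular fan), it works out to something like this:

```python
# Rough fan-lifespan math with an ASSUMED bearing-life rating;
# real fans vary widely, so treat the result as illustrative only.
RATED_LIFE_HOURS = 50_000           # assumed manufacturer rating
HOURS_PER_YEAR = 24 * 365           # 8,760 hours of nonstop spinning

years_of_continuous_use = RATED_LIFE_HOURS / HOURS_PER_YEAR
print(f"~{years_of_continuous_use:.1f} years spinning 24/7")   # ~5.7 years
```

Even spinning flat out around the clock against an assumed rating like that, the fan outlives the 3-4 year useful life of the laptop it's cooling.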
And as for noise, I can't help but point out the obvious. If your fan is really so loud you can't hear yourself think, get a quieter fan. That's kind of a no-brainer, isn't it?
I recommend Zalman 9500s. They're very, very quiet, and their heat dissipation is phenomenal. My system temps dropped a full 10 degrees, across the board, when I switched from a (very expensive) liquid cooling system to my Zalman, and the noise level dropped too, as the Zalman turned out to be infinitely quieter than the radiator fan on the liquid system. Total win-win.
From: Shambolic Walkenberg
A CPU at full pelt will be warmer than one that isn't, and that heat will be raising the temp of every component.
Technically yes, but not by much. By the time you get more than a millimeter or so from the surface of the fan motor chassis, you're talking about fractions of a degree.
If you want to argue that the heat coming off the processor itself, dissipating off the heat pipes that connect to the fan, or even the exhaust blowing out FROM the fan, if not aimed properly out an exhaust port, can all serve to raise the temperature of the interior space inside a case, fine. But to argue that the heat from the fan motor itself is in any way a significant contributor to that temperature is more than a little silly.
From: Shambolic Walkenberg
Even if I compromise between noise levels and ideal temps and keep everything below manufacturer spec, the warmer something runs the less long it will function.
Again, get a quieter fan, and you won't have to compromise at all. In any case, I take issue with the notion that just because something runs warm, it's destined to die sooner. As long as it stays within its design threshold, it will be fine for the practical lifespan of the computer. It's only when things start to overheat that there's a problem. There's a big difference between ordinary operational heat and overheating.
From: Shambolic Walkenberg
Underclocking isn't an option, why should I either run a flakey utility to try and do it on the fly, or have to manually keep adjusting things depending on what I want to do?
Agreed. Kind of beside the point, but agreed.
From: Shambolic Walkenberg
And aside from all that, why with everything as standard, should I be pumping extra heat into a room and sucking more electricity just to maybe stand around in SL chatting in IMs?
Maybe, maybe not. But the differences you're talking about are really negligible. So there's not much point in worrying about it.
Want an example to prove it? I'll give you a great one: my office.
The presence of the computers in my office causes that room to be a good 10 degrees hotter than the rest of the house, at all times. I've got central A/C, but I also keep a secondary portable air conditioner in the office, for when the heat gets too excessive. I monitor the room temperature pretty closely, since I actually have a pretty severe heat-related medical condition. If I spend too much time in a room that is too hot, my system starts to shut down, and I could literally die within hours if I'm unable to cool my body in time. It's the lingering aftermath of a heat stroke I suffered about 10 years ago, during which my body's internal temperature regulation ability was partially destroyed. So as you can imagine, the subject of heat output from computers (and from everything else) is not one I take lightly.
Want to know how hot the room is with the secondary A/C off, and the computers idle? About 80 degrees, usually. Want to know how hot it is with the secondary A/C off and my main computer running full throttle all day long? Still about 80 degrees, usually. It only starts to creep above 80 when the temperature outside climbs above the low 70s. Otherwise, that 80 degree mark is quite steady.
What heats up the room that extra 10 degrees above the rest of the house is the fact that the computers are on at all. The amount of CPU usage at any given time makes no measurable difference whatsoever.
The same is true for electricity as well. Since I work freelance, there are times when I'm working 8-16 hours per day for weeks, and other times when I've got almost no work at all. When work is slow, I'm not in SL very much. I still use my computers, but my CPU usage is practically nonexistent, compared to when I'm working. My electric bill remains pretty constant, however.
Surely, if CPU usage were a major factor in power consumption, then there would be a significant spike in my electric bill in months when I've had a lot of work. But that spike just isn't there.
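For a rough sanity check on why that is (every wattage and rate figure here is an assumption for illustration, not a measurement from my setup):

```python
# Back-of-the-envelope monthly cost of running a CPU flat out, using
# ASSUMED figures; substitute your own wattage delta and utility rate.
EXTRA_WATTS_AT_FULL_LOAD = 80       # assumed idle-to-100% power difference
HOURS_PER_DAY = 12                  # assumed heavy-work schedule
DAYS_PER_MONTH = 30
RATE_PER_KWH = 0.12                 # assumed utility rate, in dollars

extra_kwh = EXTRA_WATTS_AT_FULL_LOAD / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
extra_dollars = extra_kwh * RATE_PER_KWH
print(f"~{extra_kwh:.0f} kWh extra, roughly ${extra_dollars:.2f} on the bill")
# ~29 kWh extra, roughly $3.46 on the bill
```

A few dollars a month disappears into the normal fluctuation of a household electric bill, which is exactly what I see.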
I know what you're thinking. I probably watch more TV or something when I'm not working, and that's what evens out the cost, right? Wrong. I have a TV in my office, which is ALWAYS on when I'm working. It's a habit I've had since childhood. For some reason I concentrate better when the TV is on than when it's not. On the rare occasions that I don't turn it on, I don't get nearly as much done. Somehow I procrastinate more when my shows and movies aren't going. I've never understood why.
And beyond computers and TVs, there's not much else I really use discretionary electricity for. I do have a woodshop full of power tools, but I only ever get in there to work on things a few days a month. Other than that, there's really nothing.
Bottom line, CPU usage doesn't change ambient room temperature or household electricity usage in any statistically significant way.
In any case, one thing we all seem to keep ignoring is the fact that SL is hardly the only program to exhibit this behavior. Most video games do exactly the same thing. Should we be discussing throttling them as well?