On Mon, 6 Feb 2012, Jack Oley wrote:
> There's a lot of talk of reduced life expectancy due to thermal changes.
It's not just talk... It's a well-known and widely used technique in the design and manufacture of electronic components - you take a board and thermally cycle it from one extreme to the other. This causes all sorts of issues to crop up which you might not be expecting.
Every company I've worked with that designs or makes electronics systems has such an "oven", typically capable of cycling from -40C to +40C (or more) in a controlled manner. One unit I remember using could take a 4kW load in both directions at 10C/minute - that's a lot of cooling!
One of the issues I used to find in early microprocessor systems was chips working their way out of their sockets! The Apple II was good (bad) for it - in a classroom situation they'd be switched on/off several times through the day, and it wasn't uncommon to pop the lid and just push all the chips back down. Most other systems (Spectrum onwards) would have the chips soldered to the motherboard for economy...
You could argue that the board in your PC has already undergone the thermal cycling, so it ought to be fine, but even so, I don't like pushing things...
> First, I find it difficult to believe that this is not accounted for, to some degree, in the design of the components,
It is - to a degree, but what if you have devices that have to be exposed to the elements and go from -30C to +50C? That's not unusual for outdoor wireless kit in the desert...
> and second, I suggest that you'd have to turn your PC on and off so many times that you're more likely to be upgrading components naturally before any failure due to metal fatigue.
There are plenty more things at issue too - power supplies suddenly putting power to all the RAM chips, for example. Early DRAM required -5V as well as +5V and +12V, and the -5V was supposed to come up before the rest; turn the machine off and back on again with some of the cheaper PSUs and you ran a real risk of blowing up the RAM...
> I could be wrong, of course, but environmental considerations should be paramount and I don't believe that there's any good reason for leaving equipment - including routers - on when not being used. Where's the evidence to the contrary? If anyone's seen any articles, etc. to this effect, please post links. I turn all my gear off when not in use, never had any problems that I'm aware of. Jack.
I leave my kit on because I'm mostly lazy - I want my workstation to be just the way I left it - and as I often leave monitoring running overnight on it anyway, putting it into standby isn't really an option.
Of course my servers run all the time.

As for routers - it's more the ADSL modem - you really do need to leave that on all the time - well, with BT lines anyway - especially on new lines, where the line spends its first days training and frequent disconnects can leave it stuck at a lower sync speed. Most of them are under 5 watts anyway - it's not zero, but think of that the next time you fill your 3kW kettle for a single cup of tea - you've just wasted about as much electricity boiling the extra water as the router uses in a whole day.
(ok, I'll do the sums - 3kW for 3 minutes is 0.15kWh per full boil. Let's say it only takes 1 minute to boil a cup rather than 3 minutes to boil the lot, so each over-filled boil wastes 3kW x 2 minutes = 0.1kWh. A 5W router uses 0.12kWh a day - almost exactly one wasted boil - or nearly 44kWh a year, about the same as boiling the full kettle 290-odd times.)
Even if you only boil the water you need, you get the idea - put into perspective, it's not a lot, really.
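If you want to check the sums yourself, here's a quick back-of-the-envelope script - the wattages and timings are just the assumptions above, not measurements:

# Back-of-the-envelope comparison: kettle wastage vs. an always-on router.
# Figures are the assumptions from the post: 3kW kettle, 3 minutes for a
# full boil, 1 minute for a single cup, 5W router.

KETTLE_KW = 3.0        # kettle power draw, kilowatts
FULL_BOIL_MIN = 3.0    # minutes to boil a full kettle
CUP_BOIL_MIN = 1.0     # minutes to boil just one cup
ROUTER_W = 5.0         # router power draw, watts

# Energy = power x time; divide minutes by 60 to get hours.
full_boil_kwh = KETTLE_KW * FULL_BOIL_MIN / 60
wasted_per_boil_kwh = KETTLE_KW * (FULL_BOIL_MIN - CUP_BOIL_MIN) / 60

router_day_kwh = ROUTER_W / 1000 * 24
router_year_kwh = router_day_kwh * 365

print("Full boil:             %.2f kWh" % full_boil_kwh)
print("Wasted per over-fill:  %.2f kWh" % wasted_per_boil_kwh)
print("Router per day:        %.2f kWh" % router_day_kwh)
print("Router per year:       %.1f kWh" % router_year_kwh)
print("Router year in full boils:   %.0f" % (router_year_kwh / full_boil_kwh))
print("Router year in wasted boils: %.0f" % (router_year_kwh / wasted_per_boil_kwh))

Change the constants at the top to match your own kettle and router and the comparison falls out directly.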
Gordon

--
The Mailing List for the Devon & Cornwall LUG
http://mailman.dclug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/listfaq