We recently took delivery of a new immersion water heater, the old one having given up the ghost after less than four years. Two problems arise, one of which I don’t really care about. I seem to remember, long ago in a distant galaxy, frequently getting embroiled, over cocoa and crumpets, in arguments about whether leaving an immersion heater on all the time requires more energy to maintain some desirable temperature than switching it on and off. It all had to do with how far the desired temperature exceeded ambient, and rates of heat loss, and all sorts of similarly important physics stuff, and I cannot remember the answer. If you have one, of course, please share it.
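For what it’s worth, here is a rough sketch of the physics as I half-remember it, assuming Newtonian cooling and entirely invented numbers for the tank (setting, lagging, size — all assumptions, not measurements):

```python
import math

# Back-of-envelope comparison: hold the tank at temperature for 8 hours,
# versus switch off, let it cool, and reheat at the end.
# Model: Newtonian cooling, dT/dt = -k * (T - T_ambient).
# Every number below is an assumption, not a measurement.

T_SET = 65.0    # thermostat setting, deg C
T_AMB = 20.0    # ambient temperature around the tank, deg C
K = 0.003       # fractional cooling rate per hour for a well-lagged tank
C = 0.5         # tank heat capacity, kWh per deg C (roughly 100 L of water)
HOURS = 8.0

# Always on: the heater replaces exactly what leaks out, and the leak
# rate stays constant because the temperature difference never drops.
e_always_on = K * (T_SET - T_AMB) * C * HOURS

# Switched off: the tank cools exponentially, leaking less and less as it
# approaches ambient; the reheat energy equals only the heat actually lost.
t_final = T_AMB + (T_SET - T_AMB) * math.exp(-K * HOURS)
e_switched = C * (T_SET - t_final)

print(f"always on:  {e_always_on:.3f} kWh")
print(f"off/reheat: {e_switched:.3f} kWh")
```

On this toy model switching off always wins, because a cooling tank loses heat more slowly as the temperature difference shrinks — but with decent lagging the saving over a few hours is tiny, which may be why the cocoa-and-crumpets arguments never resolved.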
The question that concerns me now is: how high should the thermostat be set? That is, if it is set way up high, and the tank is well lagged, maybe the heater uses less energy, because you are adding a lot of cold water to make the shower bearable. Less hot water drawn means less cold water to heat back up, if you get my drift. But maybe a lower thermostat setting — drawing more hot and less cold, and therefore having to heat more replacement cold water when the shower is done, but only to a lower temperature — actually uses less energy.
I wonder. But I don’t have the wherewithal to do an actual experiment. So please weigh in.
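In lieu of the experiment, here is the armchair version, with the same caveat that every number is invented. The one thing I am fairly confident of is that the heat the shower consumes is fixed by the shower itself (litres times temperature rise), whatever the hot/cold mix, so the thermostat setting only changes the standing loss from the tank:

```python
# Armchair comparison of two thermostat settings for one shower a day.
# Mixing hot and cold water conserves energy, so the shower itself uses a
# fixed amount of heat at any setting; only the standing loss from the
# tank depends on how hot the thermostat keeps it. All numbers assumed.

SHOWER_LITRES = 50.0
T_SHOWER = 40.0   # mixed shower temperature, deg C
T_COLD = 10.0     # mains cold water, deg C
T_AMB = 20.0      # ambient around the tank, deg C
K = 0.003         # fractional cooling rate per hour, well-lagged tank
C = 0.5           # tank heat capacity, kWh per deg C
KWH_PER_L_DEG = 4186.0 / 3.6e6  # specific heat of water, kWh per litre-degree

# Heat the shower actually uses: identical at any thermostat setting.
shower_kwh = SHOWER_LITRES * (T_SHOWER - T_COLD) * KWH_PER_L_DEG

def daily_standing_loss(t_set):
    """kWh leaked from the tank (and re-supplied) over 24 h held at t_set."""
    return K * (t_set - T_AMB) * C * 24.0

for t_set in (55.0, 75.0):
    total = shower_kwh + daily_standing_loss(t_set)
    print(f"{t_set:.0f} C setting: {shower_kwh:.2f} kWh shower + "
          f"{daily_standing_loss(t_set):.2f} kWh standing loss = {total:.2f} kWh")
```

On this model the lower setting wins on energy, since the shower term is the same either way. The standard caveat is that stored hot water is usually kept at 60 °C or above to guard against Legionella, so there is a floor below which the thermostat should not go.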
A contributory factor to the demise of the old geyser might have been the very hot temperature at which it ran, which might have precipitated more of the crap from the very hard water we are cursed with, and may have accelerated general wear and tear on the various bits. But that requires a chemist, rather than a thermodynamicist.