> With all due respect to Trevor: Saying it don't make it so. Far from
> being debunked, the assertion that there is an optimum flow rate for
> maximum heat transfer out the radiator has received ample backup on the
> list. This thread resumes each year about the time the trees begin to
> bud, and people are as passionate in support of their analyses as they
> are in support of their religious beliefs.
> I have been meaning to go to the local engineering college and look up
> "the radiator problem", which surely appears as a homework assignment or
> on Heat Transfer midterms hundreds of times each year. Stay tuned.
> John Cowan
John, I will address this as a mostly theoretical issue, with some
real-world considerations mixed in as well.
First, for a given radiator surface area, the heat flux (amount of heat
transferred) is proportional to the temperature differential between the
coolant in the radiator and the air outside the radiator. To increase
cooling, you want to maximize the flow through the radiator, so that no
part of the radiator is less efficient because the fluid within that
area has already cooled.
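To put some toy numbers on that first point, here is a minimal sketch of the basic convective relation Q = U * A * dT. Every number in it (the heat-transfer coefficient, the area, the temperatures) is an illustrative assumption, not a measurement from any real radiator:

```python
# Toy model: radiator heat flux is proportional to the coolant-air
# temperature difference. All values below are assumed for illustration.

def heat_flux(u_coeff, area_m2, t_coolant_c, t_air_c):
    """Q = U * A * dT, the basic convective heat-transfer relation (watts)."""
    return u_coeff * area_m2 * (t_coolant_c - t_air_c)

# Assumed: overall coefficient U = 100 W/(m^2*K), area = 0.5 m^2.
q_hot_region = heat_flux(100.0, 0.5, 90.0, 30.0)   # coolant still hot here
q_cool_region = heat_flux(100.0, 0.5, 60.0, 30.0)  # coolant already cooled

print(q_hot_region)   # 3000.0 W
print(q_cool_region)  # 1500.0 W
```

The point the sketch makes: a region where the coolant has already dropped from 90 C to 60 C moves only half the heat, which is why higher flow (keeping the whole core hot) helps.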
But in real life that is a problem, because as you try to push the fluid
through faster you incur greater energy expenditure as pumping losses.
(You can see this by trying to suck a milkshake through a straw: suck
slowly and the effort isn't great, but if you want faster flow you have
to put in great effort and may collapse the straw.) So there is a
certain law of diminishing returns of cooling vs. total energy
expenditure.
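The diminishing-returns point can be sketched with the rough rule that, in turbulent pipe flow, pressure drop scales with about the square of the flow rate, so pumping power (pressure drop times flow) scales with about the cube. The constant here is arbitrary; this is a scaling argument, not a pump calculation:

```python
# Rough scaling: pressure drop ~ flow^2 (turbulent rule of thumb),
# so pumping power = dP * flow ~ flow^3. k is an arbitrary constant.

def pumping_power(flow, k=1.0):
    dp = k * flow ** 2   # pressure drop grows with the square of flow
    return dp * flow     # power = pressure drop * volumetric flow

print(pumping_power(1.0))  # baseline: 1.0
print(pumping_power(2.0))  # doubling flow costs ~8x the pumping power: 8.0
```

So doubling coolant flow costs roughly eight times the pumping power, while the cooling gain (per the first point) is far more modest.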
Next problem is the flow of the coolant within the radiator. As the
velocity increases, you will at some point (I don't know whether this
point is reached in radiators) go from laminar flow to turbulent flow,
and the pressure required to move the fluid goes up drastically.
(Turbulent flow is GOOD for heat exchange, but it makes the flow
resistance greater. It may be better to go with a larger radiator and
keep laminar flow.)
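Whether the transition is reached comes down to the Reynolds number, with ~2300 the usual laminar/turbulent threshold for flow in a pipe. The velocity, passage size, and viscosity below are assumptions for illustration only:

```python
# Reynolds number check for the laminar/turbulent question.
# Re = V * D / nu; ~2300 is the usual pipe-flow transition threshold.

def reynolds(velocity_m_s, diameter_m, kinematic_viscosity_m2_s):
    return velocity_m_s * diameter_m / kinematic_viscosity_m2_s

# Assumed: 0.5 m/s in a 2 mm tube passage, water at ~80 C
# (kinematic viscosity roughly 3.6e-7 m^2/s).
re = reynolds(0.5, 0.002, 3.6e-7)
print(re, "turbulent" if re > 2300 else "laminar")
```

With these assumed numbers the flow lands just past the transition, which at least suggests real radiator passages can sit near the laminar/turbulent boundary.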
Next problem is that for a given speed of air flow through the radiator,
there is a certain maximum (fairly low) amount of heat exchange
possible. If you are already close to the maximum for the air,
forcing more coolant through the radiator will do little to change the
heat transfer. I bet that the air is about maxed out in real life.
To summarize: more coolant flow CAN increase heat transfer,
but in real life you probably would have to increase BOTH air
flow AND coolant flow to change the overall heat dissipation.
Also in real life, since the specific heat (ability to absorb heat) of
air is MUCH lower than that of coolant, you would have to greatly
increase air flow for a given small increase in coolant flow
(e.g. [not based on any calculations] double the air flow for a
50% increase in coolant flow).
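A back-of-envelope energy balance shows the air-side problem: whatever heat the coolant gives up, the air must carry away, so m_cool * cp_cool * dT_cool = m_air * cp_air * dT_air. The specific heats are textbook-ish values; the flow rate and temperature drops are assumptions:

```python
# Energy balance between coolant side and air side of the radiator.
# cp values are standard textbook figures; flows and dTs are assumed.

cp_water = 4186.0   # J/(kg*K), specific heat of water
cp_air = 1005.0     # J/(kg*K), specific heat of air

# Assume coolant flow of 1.0 kg/s dropping 10 C across the radiator:
q_watts = 1.0 * cp_water * 10.0

# If the air warms, say, 25 C passing through the core, the required
# air mass flow is:
m_air = q_watts / (cp_air * 25.0)
print(q_watts, m_air)
```

With these assumptions, shedding ~42 kW takes about 1.7 kg/s of air, which at roughly 1.2 kg/m^3 is well over a cubic meter of air every second; the air side is doing a lot of work.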
Applying real-world numbers to the engineering theory is not a
trivial task, because the math is all differential equations. However,
I am positive that there are some really good cookbook equations
in a CRC handbook somewhere that a very smart engineer has
devised to cover the fairly narrow temperature range that we are
dealing with.
Sorry for the long-winded reply, but you need to see just how
complicated the issue is, and how people on both sides of the
issue can be "correct".
-Tony
P.S. There can be TOO much cooling. There is an optimum
operating temperature of an engine.