Air compressors are sized for maximum load.
In many plants that maximum load is rarely reached, so running at a lower stage burns less energy while still keeping up with the air demand.
2007-05-12 05:08:47 · answer #1 · answered by Mike J 4 · 0⤊ 0⤋
There is a tradeoff at work.
If you wanted to make the most efficient possible compression cycle, you would compress the air in an ideal adiabatic compressor (without any loss of heat) and then have an ideal heat engine that would recover work while cooling the air back to ambient temperature. If you then looked at the net of the work done by the compressor minus the work recovered by the heat engine, you would have compressed the air with the minimum possible net work.
However, in the real world, nobody ever does that. Instead, the heat generated by compressing the air is rejected to the environment without recovering any work. When you take this as the cycle, then the only thing that counts is the work put into the compressor. And, it turns out, if you want to compress air with the minimum amount of work put into the compressor, then the way to do that is isothermally - to maintain constant temperature while the pressure is increased.
No known machine can actually do that. As a compromise, we build multi-stage compressors, where the air is compressed nearly adiabatically, then cooled, then compressed again. A two stage compressor is pretty common, but more stages are used either to achieve high compression ratios or when maximum efficiency is more important than the cost of the machinery.
As for why this saves energy, it is a consequence of the thermodynamic behavior of gases. If you fix the starting and ending pressures of the process, the minimum increase in enthalpy occurs when the starting and ending temperatures are the same (presuming there is a minimum possible temperature dictated by the ambient environment). Extend that thinking to infinitesimally small increments of the process, and you arrive at a constant-temperature process. This can be proven by applying the ideal gas law.
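A quick back-of-the-envelope check of AnswerMan's point, assuming an ideal gas (air, k = 1.4) and an illustrative pressure ratio of about 7.8 (roughly atmospheric to 100 psig; the specific numbers here are my own, not from the answer):

```python
import math

k = 1.4        # ratio of specific heats for air
ratio = 7.8    # overall pressure ratio p2/p1

# Reversible isothermal compressor work, per unit of inlet p1*V1.
# This is the minimum when the heat of compression is simply
# rejected to the environment rather than recovered.
w_isothermal = math.log(ratio)

# Reversible adiabatic (isentropic) compressor work, same basis.
w_adiabatic = (k / (k - 1)) * (ratio ** ((k - 1) / k) - 1)

print(w_isothermal)  # ~2.05 x p1*V1
print(w_adiabatic)   # ~2.79 x p1*V1, roughly 36% more
```

Multi-stage compression with intercooling lands between these two limits; the more stages, the closer it gets to the isothermal figure.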
2007-05-12 13:16:31 · answer #2 · answered by AnswerMan 4 · 1⤊ 0⤋
Having multiple stages makes the compressor more efficient. Think about a reciprocating compressor. When the piston is at the top of the stroke, there is a little bit of space left (the clearance volume). The air in that space never leaves the compressor; it just re-expands when the piston drops, causing an efficiency loss.
So, at a lower pressure ratio, less compression occurs per stage, so there is less of this waste at the first stage. The air then goes through an intercooler (cooling air makes it denser) and is compressed in the second stage. Because the second stage also sees a modest pressure ratio, its clearance loss is small too.
So the two clearance savings, plus the cooler air entering the second stage, combine to increase the efficiency.
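The clearance effect Doug B describes can be put in numbers with the standard textbook relation for the volumetric efficiency of a reciprocating compressor; the clearance fraction and polytropic exponent below are illustrative values of my own, not from the answer:

```python
def volumetric_efficiency(pressure_ratio, clearance=0.05, n=1.3):
    """Fraction of the swept volume that actually draws in fresh air.

    The trapped clearance gas re-expands by (p2/p1)^(1/n) on the
    intake stroke and displaces fresh charge; clearance is the
    clearance volume as a fraction of swept volume.
    """
    return 1 - clearance * (pressure_ratio ** (1 / n) - 1)

overall = 7.8                  # overall pressure ratio
per_stage = overall ** 0.5     # equal split across two stages

print(volumetric_efficiency(overall))    # ~0.81 for one stage
print(volumetric_efficiency(per_stage))  # ~0.94 per stage when split
```

Splitting the ratio across two cylinders keeps each stage's re-expansion loss small, which is exactly the "space savings" in the answer above.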
2007-05-12 12:44:22 · answer #3 · answered by Doug B 3 · 0⤊ 0⤋
Two-stage compressors are more efficient than single-stage compressors. A two-stage air compressor discharging at 100 psig can be as much as 10% more efficient than a single-stage unit, and the higher the discharge pressure, the greater the advantage.
If you set up and solve the equations for the power required for single-stage air compression (polytropic) and for two-stage air compression (polytropic), you will get a lower power requirement for the two-stage process.
For single-stage compression:
P = 144 (p1)(Va) (n/(n−1)) [(p2/p1)^((n−1)/n) − 1]
For two-stage compression (equal pressure ratio per stage, with intercooling):
P = 288 (p1)(Va) (n/(n−1)) [(p2/p1)^((n−1)/(2n)) − 1]
It is the exponent, (n−1)/n versus (n−1)/(2n), that makes all the difference.
Use absolute pressures, in psia, with n = 1.25 and any value of Va in CFM that you want.
P then comes out in ft·lb per minute.
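Plugging gatorbait's two formulas into a short script bears out the roughly 10% figure; the inlet pressure, discharge pressure, and flow below are example values I chose (14.7 psia in, 100 psig out, 100 CFM):

```python
def single_stage_power(p1, p2, Va, n=1.25):
    """Polytropic single-stage compression power, ft*lbf/min.
    p1, p2 in psia; Va in CFM."""
    return 144 * p1 * Va * (n / (n - 1)) * ((p2 / p1) ** ((n - 1) / n) - 1)

def two_stage_power(p1, p2, Va, n=1.25):
    """Polytropic two-stage compression power, ft*lbf/min, assuming
    equal pressure ratios per stage and ideal intercooling."""
    return 288 * p1 * Va * (n / (n - 1)) * ((p2 / p1) ** ((n - 1) / (2 * n)) - 1)

p1, p2, Va = 14.7, 114.7, 100.0   # 100 psig discharge, 100 CFM intake
P_one = single_stage_power(p1, p2, Va)
P_two = two_stage_power(p1, p2, Va)
savings = 1 - P_two / P_one

print(P_one)    # ~538,000 ft*lbf/min
print(P_two)    # ~483,000 ft*lbf/min
print(savings)  # ~0.10, i.e. about a 10% reduction
```

Note the two-stage saving comes entirely from the halved exponent; the flow rate Va cancels out of the ratio, so the percentage saving is the same at any flow.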
2007-05-12 14:58:50 · answer #4 · answered by gatorbait 7 · 0⤊ 0⤋
Give the points to AnswerMan.
I agree; the most efficient practical compression process is isothermal. This is achieved best with inter-stage cooling.
2007-05-12 21:45:01 · answer #5 · answered by Bomba 7 · 0⤊ 0⤋