Relationship of Theoretical Static and Dynamic Compression Ratios
I have a normally aspirated Reliant 862 cc car engine (petrol/gasoline,
spark ignition), remanufactured from one which (unknown to me at that
time) already had a CR of 10.5 to 1.
Because the car I built it
for is a competition car (it's called a Liege, for on and off-road
trials), I wanted the engine uprated during the rebuild. The motor
engineering company who did the work (now closed) skimmed the head by 25
thou and later had to skim the block to get the cylinder deck heights
correct.
They never told me what the actual CR of the engine was,
but it's obviously higher than 10.5 to 1. Because my engine runs very
well, other owners (including the designer of the car) have asked me
about the specifications and in particular what CR it runs.
Short
of stripping the engine and measuring it properly, which I'm not
inclined to do, all I can do is take the cylinder compression pressure
readings with my testing gauge.
The cylinder pressures are all a
good 225 psi, engine warm and cranked on the starter at about 300 rpm,
with the plugs out and the throttle fully open. That figure comes up
after about 3 compression strokes.
I understand the international
standard atmosphere (about one bar) is roughly 14.7 psi. Can this figure be
used to calculate or even approximate the CR? If so, my engine has an
extremely high compression ratio, a lot higher than I expected.
I am reluctant to think it was any higher than about 12.5 to 1 but
using this formula it would seem to be more like 15 to 1.
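To spell out the arithmetic I'm using, I'm simply treating the gauge reading as a straight pressure ratio over atmospheric:

```python
# Naive estimate: treat measured cranking compression divided by
# atmospheric pressure as if it were the compression ratio directly.
p_gauge = 225.0   # psi, measured cranking compression
p_atm = 14.7      # psi, approximate standard atmosphere

naive_cr = p_gauge / p_atm
print(f"naive 'CR' = {naive_cr:.1f} to 1")  # about 15.3 to 1
```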
Or have
I got this all around my neck somehow? Can gauge measured compression
pressure be reliably used as a guide to actual compression ratio?
I
have been trawling the internet for answers without success, and in
discussing it with other enthusiasts over time I have heard differing
opinions.
No. The compression ratio is a mathematical calculation. The gauge
pressure is the result of a number of things: cam timing, temperature
rise from compressing the air, etc.
Measure with oil poured through the plug hole, with the piston at TDC, to find the compressed (clearance) volume.
If there were no inertial ramming (at low cranking speed the effect is
negligible) and no appreciable effect from charge heating or
leakage, then the compression pressure might be a reasonably good guide
to the theoretical, mathematically calculated CR.
Well, the
trouble with that is that you don't know when compression starts. With
no "inertial ramming," you won't start compressing the charge until
intake valve closure (IVC), which almost certainly happens a while into
the compression stroke.
Without
knowing when IVC happens, you can estimate the compression ratio from
IVC to TDC by assuming that polytropic compression of air is a good
approximation of what's going on inside your engine.
p2 = p1*(r.c)^k
k = 1.37
r.c = exp(ln(p2/p1)/k)
using p2 = 225 and p1 = 14.7 gives r.c = 7.33
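The polytropic estimate above can be reproduced in a few lines of Python (k = 1.37 as given, pressures in psi):

```python
p1 = 14.7   # psi, pressure at start of compression (atmospheric)
p2 = 225.0  # psi, measured cranking compression pressure
k = 1.37    # polytropic exponent assumed for air

# p2 = p1 * rc**k  =>  rc = (p2 / p1)**(1 / k)
rc = (p2 / p1) ** (1.0 / k)
print(f"effective compression ratio from IVC to TDC: {rc:.2f}")  # ~7.33
```

This gives the effective ratio from IVC to TDC, which is why it comes out well below the geometric CR.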
Original CR -> 10.5
Original Bore -> 2.46 in
Original Stroke -> 2.72 in
Here's what I calculate:
Original TDC volume -> 22.30 cc
Original BDC volume -> 234.15 cc
TDC volume is increased by bore change
TDC volume is reduced by head shaving
Pistons stay the same
TDC2 = TDC1 * Bore2^2/Bore1^2 - (Bore2^2/4 * pi * .025 in)
TDC2 -> 20.685 cc
BDC2 = BDC1 * Bore2^2/Bore1^2 - (Bore2^2/4 * pi * .025 in)
BDC2 -> 235.995 cc
CR2 -> 11.41:1
So
the known geometry changes give you a compression ratio change from
10.5:1 to 11.4:1, and the unknowns will likely change it further.
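The geometry figures above can be reproduced with a short script. Note that the answer never states the new bore; working backwards from its TDC2 figure, it appears to assume an oversize bore of about 2.48 in, so that value below is an inference, not a quoted specification:

```python
import math

IN_TO_CM = 2.54

cr1 = 10.5     # original compression ratio
bore1 = 2.46   # in, original bore
bore2 = 2.48   # in, ASSUMED oversize bore (inferred from the posted figures)
stroke = 2.72  # in, stroke (unchanged)
skim = 0.025   # in, amount skimmed off the head

# Original swept and clearance volumes per cylinder, in cc
swept1 = math.pi / 4 * (bore1 * IN_TO_CM) ** 2 * (stroke * IN_TO_CM)
tdc1 = swept1 / (cr1 - 1)   # original TDC (clearance) volume
bdc1 = tdc1 + swept1        # original BDC volume

# Scale volumes by the bore-area ratio, then subtract the skimmed slice
area_ratio = (bore2 / bore1) ** 2
skim_cc = math.pi / 4 * (bore2 * IN_TO_CM) ** 2 * (skim * IN_TO_CM)
tdc2 = tdc1 * area_ratio - skim_cc
bdc2 = bdc1 * area_ratio - skim_cc

print(f"TDC1 = {tdc1:.2f} cc, BDC1 = {bdc1:.2f} cc")  # ~22.30, ~234.15
print(f"TDC2 = {tdc2:.3f} cc, BDC2 = {bdc2:.3f} cc")  # ~20.685, ~235.99
print(f"CR2  = {bdc2 / tdc2:.2f}:1")                  # ~11.41:1
```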