A method for modeling the effect of turbulence on the cut point function of a decanting centrifuge includes: inputting data related to the centrifuge, including the bowl radius, r_2, the pond top radius, r_1, the bowl length, L_B, and the angular velocity, ω, of the centrifuge, into an analyzer; measuring data related to the feed fluid, including a fluid density, ρ_f, a particle density, ρ_s, a fluid viscosity, μ, and the flow rate, Q, and inputting the measured data into the analyzer; and calculating a d_100 cut point as a function of the flow rate.
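The abstract does not state the form of the d_100 calculation. One common Stokes-law (sigma-theory) form, assuming plug flow through the pond and requiring a particle to settle from the pond surface at r_1 to the bowl wall at r_2 within the residence time (the patented method may use a different expression), is:

$$
d_{100} = \sqrt{\frac{18\,\mu\,Q\,\ln(r_2/r_1)}{\pi\,(\rho_s - \rho_f)\,\omega^2\,(r_2^2 - r_1^2)\,L_B}}
$$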
A first cut function for laminar flow is then modeled as a function of a particle diameter, d_i, using the calculated d_100 cut point function.
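A minimal Python sketch of these two steps, assuming the Stokes-law d_100 form above and the common Stokes-regime result that capture efficiency scales with d_i² up to d_100 (the function names and the grade-efficiency form are illustrative assumptions, not taken from the patent):

```python
import math

def d100_cut_point(mu, Q, rho_s, rho_f, omega, r1, r2, L_B):
    """d_100 cut point (m) from a Stokes-law / residence-time balance.

    Assumes plug flow through the pond: a particle of diameter d_100
    released at the pond surface (r1) just reaches the bowl wall (r2)
    within the residence time V_pond / Q. A common textbook form, not
    necessarily the patented formula.
    """
    pond_volume = math.pi * (r2**2 - r1**2) * L_B
    residence_time = pond_volume / Q
    return math.sqrt(
        18.0 * mu * math.log(r2 / r1)
        / ((rho_s - rho_f) * omega**2 * residence_time)
    )

def laminar_cut_function(d_i, d100):
    """First (laminar-flow) cut function: fraction of particles of
    diameter d_i captured in the bowl. In the Stokes regime the
    settling velocity scales with d_i**2, so a common grade-efficiency
    model is min(1, (d_i / d100)**2)."""
    return min(1.0, (d_i / d100) ** 2)
```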
Next, a maximum point within the pond, (I_b, I_p), is calculated at which a particle having particle diameter d_i must reach a laminar boundary layer within the pond in order to be captured within the bowl. Here I_b is defined as the distance from the inlet at which the particle of diameter d_i encounters the laminar boundary layer, and I_p is defined as the height of the boundary layer above the bottom of the pond.
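The abstract does not specify how (I_b, I_p) is computed. A purely illustrative sketch, assuming a uniform axial velocity Q/A in the turbulent core, a Stokes settling velocity evaluated at the mid-pond radius, and an assumed boundary-layer thickness delta (none of these choices is taken from the patent):

```python
import math

def capture_point(d_i, mu, Q, rho_s, rho_f, omega, r1, r2, L_B, delta):
    """Illustrative estimate of the point (I_b, I_p) at which a particle
    of diameter d_i, released at the pond surface, first reaches a
    laminar boundary layer of assumed thickness delta at the bowl wall.

    Assumes a uniform axial velocity Q / A_pond in the turbulent core
    and a Stokes settling velocity evaluated at the mid-pond radius.
    Returns None if the particle exits the bowl (I_b > L_B) first.
    """
    area = math.pi * (r2**2 - r1**2)          # pond cross-section
    u_axial = Q / area                        # axial transport velocity
    r_mid = 0.5 * (r1 + r2)                   # radius for the settling estimate
    v_settle = d_i**2 * (rho_s - rho_f) * omega**2 * r_mid / (18.0 * mu)
    radial_travel = (r2 - delta) - r1         # distance to the boundary layer
    I_b = u_axial * radial_travel / v_settle  # axial distance from the inlet
    I_p = delta                               # boundary-layer height above the pond bottom
    return (I_b, I_p) if I_b <= L_B else None
```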
Finally, a second cut function is modeled for turbulent flow having a laminar boundary layer along the bowl inner surface, as a function of particle diameter, d_i. A turbulence factor is used to model the second cut function and may be used to predict a corrected cut point as a function of particle diameter.
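The abstract names a turbulence factor but not its functional form. As one illustrative possibility (the symbol k_t and the scaling below are assumptions, not the patent's definition), a factor k_t in (0, 1] can attenuate the laminar capture efficiency, from which a corrected cut point at a chosen capture fraction follows:

```python
import math

def turbulent_cut_function(d_i, d100, k_t):
    """Second (turbulent-flow) cut function: illustrative model in which
    an assumed turbulence factor k_t in (0, 1] scales the laminar
    capture efficiency; k_t = 1 recovers the laminar cut function."""
    return k_t * min(1.0, (d_i / d100) ** 2)

def corrected_cut_point(d100, k_t, f=0.5):
    """Particle diameter at which the turbulent cut function reaches a
    chosen capture fraction f (e.g. f = 0.5 for a d_50-style cut),
    found by solving k_t * (d / d100)**2 = f for d. Valid only for
    f <= k_t, the plateau of this illustrative model."""
    if f > k_t:
        raise ValueError("capture fraction f exceeds the model plateau k_t")
    return d100 * math.sqrt(f / k_t)
```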