Hi,
I'm trying to use the Brent algorithm to find the minimum of a given function. It converges very quickly, but it doesn't "stop" once the result is close enough; I need it to stop at the nearest 1/100. I believe the accuracy parameter is the value I need to set, but the values I've tested (1, 0.01) don't give me the result I need.
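For context, here is roughly how I'm calling it. The function, bracket, and values below are placeholders rather than my real code, and I'm assuming the root-finding Brent from MathNet.Numerics.RootFinding applied to the derivative of my function, since that's where the convergence check quoted further down appears to live; the accuracy values I mentioned go in as the accuracy argument:

using System;
using MathNet.Numerics.RootFinding;

// Placeholder: minimising (x - 3)^2 by running Brent on its derivative.
Func<double, double> df = x => 2.0*(x - 3.0);
double xMin = Brent.FindRoot(df, lowerBound: 0.0, upperBound: 10.0, accuracy: 0.01, maxIterations: 100);
Console.WriteLine(xMin); // I expected this to stop once the answer is within ~1/100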
From looking at the code, it seems I should hit "return true;" once my required accuracy is reached...
// convergence check
double xAcc1 = 2.0*Precision.DoublePrecision*Math.Abs(root) + 0.5*accuracy;
double xMidOld = xMid;
xMid = (upperBound - root)/2.0;

if (Math.Abs(xMid) <= xAcc1 && froot.AlmostEqualNormRelative(0, froot, accuracy))
{
    return true;
}
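If I'm reading that right, then with accuracy = 0.01 (and assuming Precision.DoublePrecision is on the order of machine epsilon, roughly 1e-16, which is just my assumption), the tolerance works out to something like this back-of-the-envelope sketch:

using System;

// Rough reading of the check above; the root value and epsilon here are assumptions, not from the library.
double accuracy = 0.01;
double root = 2.0; // hypothetical current estimate
double xAcc1 = 2.0*1.1e-16*Math.Abs(root) + 0.5*accuracy;
Console.WriteLine(xAcc1); // about 0.005

So as far as I can tell, the half-width of the remaining bracket (xMid) has to drop below roughly 0.005, and the froot check has to pass as well, before "return true;" is ever hit.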
Any suggestions? Thanks,
Dominic