I'm trying to write a program in MATLAB that checks how well the definition of the derivative,

(f(x+h) - f(x)) / h ≈ f'(x),

holds when h is small enough. So far I have this:
function errList = diffConsistency(f, df, x, iMax, h0)
% Compare the forward-difference quotient against the analytic derivative
% for a sequence of shrinking step sizes h.
h = h0;
errList = zeros(1, iMax);               % collect one error per step size
for i = 1:iMax
    leftSide  = (f(x+h) - f(x)) / h;    % finite-difference approximation
    rightSide = df(x);                  % analytic derivative
    errList(i) = abs(leftSide - rightSide);
    h = h / 10;                         % shrink the step size
end
end
I then use f = @(x) sin(x) and df = @(x) cos(x). I'm new to using function handles, so this might be completely wrong. iMax is set to 10, h0 = 1, and x = rand(10).
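For reference, this is roughly how I call it (assuming the function above is saved as diffConsistency.m on the path):

f  = @(x) sin(x);    % function to differentiate
df = @(x) cos(x);    % its analytic derivative
x  = rand(10);       % test point(s) -- not sure this is what I want
errList = diffConsistency(f, df, x, 10, 1)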
Could anyone check whether this is even remotely correct, especially the use of the function handles inside diffConsistency and the use of rand? Should I define x differently? Are leftSide and rightSide correct? Etc.
Any feedback would help. Thanks in advance.