Not sure if I should have posted this here or on the Maths Stack Exchange, so sorry if it's the wrong place. I'm very new to MATLAB and to programming in general, and I'm having some trouble trying to solve an ODE problem using finite difference methods for an assignment.
My finite difference equation is:
z(t+dt) = (dt^2*(γ^2*h*sin(γ*t) - β*z(t)) - z(t-dt)*(1 - dt*α) + 2*z(t))/(1 + dt*α)
where t is a 51x1 array of the time increments. Basically I want to calculate z(t) for t values from 0 to 1 in increments of 0.02, and I have the initial conditions z(0) = 0 and z(Δt) = 0.
My current code (not everything, just the bit that's giving me trouble):
dt = 0.02
t = [0:dt:T]';
z(0) = 0
z(dt)= 0
for i = t
    z(i+dt) = (dt^2*(gamma^2.*h.*sin(gamma*t)-beta*z(i)) - z(i-dt)*(1-dt*alpha)+2*z(i))/(1 + dt*alpha)
end
Alpha, beta and gamma are all constants in this case; they're defined earlier in the code.
I keep getting the error "Subscript indices must either be real positive integers or logicals." I understand that MATLAB arrays begin with element 1 and not 0, so trying to access element 0 will give an error.
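As far as I can tell, even a throwaway test like this (nothing to do with my actual code) gives the same message:

x = [10 20 30];
x(0)    % "Subscript indices must either be real positive integers or logicals."

So I'm guessing something in my loop ends up trying to index z at 0, or at a non-integer like 0.02.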
I'm not sure whether the error is in how I've entered my finite difference equation or in the initial conditions. By writing i = t, is the for loop running over those values of t, or over the element indices of the array? E.g. when i = 0, is it trying to access element 0 of the array, or is it just setting the i variable in the equation to 0 like I want it to?
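In case it helps show what I'm trying to do, here's a version I was considering where n is just the array index and t(n) = (n-1)*dt is the actual time value. I'm not at all sure this is the right translation of my equation, so please treat it as a guess:

% alpha, beta, gamma and h are the constants defined earlier in my script
dt = 0.02;
T = 1;
t = (0:dt:T)';              % 51x1 column of time values
N = numel(t);
z = zeros(N,1);
z(1) = 0;                   % z at t = 0
z(2) = 0;                   % z at t = dt
for n = 2:N-1
    % n is the array index, t(n) is the corresponding time value
    z(n+1) = (dt^2*(gamma^2*h*sin(gamma*t(n)) - beta*z(n)) ...
              - z(n-1)*(1 - dt*alpha) + 2*z(n)) / (1 + dt*alpha);
end

Is indexing by an integer counter like that the right way to handle it, or am I misunderstanding how the for loop and the array indices interact?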
Any help would be greatly appreciated.
Thank you!