This is the code I have so far to compute how long it takes, on average, for MATLAB to perform Gaussian elimination (via backslash) on a matrix of size N=200:
Ntr=50; % Number of trials
N=200; % Matrix size
times=zeros(Ntr,1); % Vector of timing data for Ntr trials
for i=1:Ntr
    % Form a random matrix A and right-hand side b (normally distributed)
    A=randn(N,N);
    b=randn(N,1);
    % Apply backslash and record the time taken
    tic;
    x=A\b;
    times(i)=toc;
end
N % Display the matrix size
mean_time=mean(times) % Display the average time over Ntr trials
How can I modify this code so that it computes this for various values of N, such as N=200, 500, 1000, 3000, etc.? I tried wrapping it in a for loop, but randn only accepts scalar size arguments, so passing in a vector of N values doesn't work. The end result I am looking for is a loglog plot of the N values against the average time taken, something like the sketch below. Any help would be appreciated!
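For reference, this is roughly the structure I have in mind: an outer loop over a vector of sizes, reusing my timing code inside it (Nvec and mean_times are just placeholder names I made up, and I'm not sure this is the right approach):

Nvec=[200 500 1000 3000]; % Matrix sizes to test
Ntr=50; % Number of trials per size
mean_times=zeros(size(Nvec)); % Average time for each matrix size
for k=1:length(Nvec)
    N=Nvec(k); % Current matrix size (a scalar, so randn(N,N) is fine)
    times=zeros(Ntr,1);
    for i=1:Ntr
        % Form a random matrix A and right-hand side b (normally distributed)
        A=randn(N,N);
        b=randn(N,1);
        % Apply backslash and record the time taken
        tic;
        x=A\b;
        times(i)=toc;
    end
    mean_times(k)=mean(times); % Average over the Ntr trials for this N
end
% Log-log plot of matrix size against average solve time
loglog(Nvec,mean_times,'o-');
xlabel('N'); ylabel('Average time (s)');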