Simply put: imagine I have 100 trained neural networks and a test image (or samples, tensor, vector, etc.). I want to compute the output of all my trained nets on the test sample. One simple way is to use a for loop and calculate the network responses one by one, like so:
import numpy as np

def show_reconstructions(model, X_test):
    reconstructions = model.predict(X_test)
    return reconstructions

NormDiff = []
for i in range(100):
    Rec = show_reconstructions(Nets[i], test_samples)
    Diff = Rec - test_samples
    NormDiff.append(np.linalg.norm(Diff))  # norm of the reconstruction error
All my network objects are stored in a list named Nets, and in the end I need the NormDiff variable: a vector of length 100 containing the norm of the difference between the test samples and each network's output (essentially a reconstruction error).
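For context, Nets is filled roughly like this (just a sketch; I use Keras-style models with a .predict() method, and load_model and the file names below are only placeholders for however the models are really loaded):

from tensorflow.keras.models import load_model  # assumption: Keras-style models with .predict()

# hypothetical file names; in reality the 100 nets come from my own training script
Nets = [load_model(f"net_{i}.h5") for i in range(100)]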
My question is, simply:
How can I remove the for loop and compute all of the network outputs on the test input at once, in order to save time? (A real-time application is assumed.)
Obviously the output of one network is completely independent of the other networks' outputs, so this task seems like it could be done in parallel. But I am new to Python and don't know how to do that.
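For example, I imagine something along these lines, using a thread pool from the standard library (only a sketch of the kind of parallel call I have in mind; I have not verified that threads are even the right tool for model.predict):

from concurrent.futures import ThreadPoolExecutor
import numpy as np

def norm_diff(net):
    # one network's reconstruction error on the shared test samples
    rec = net.predict(test_samples)
    return np.linalg.norm(rec - test_samples)

# run the 100 independent predictions concurrently instead of one by one
with ThreadPoolExecutor() as pool:
    NormDiff = list(pool.map(norm_diff, Nets))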
I tried to do this with the numba library, but it does not accept network objects as input arguments (neither in a list nor in a dict).
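What I tried looked roughly like the following (a reconstructed sketch of my failed attempt; the function name is just illustrative). numba raises a typing error because it cannot handle the model objects:

from numba import njit
import numpy as np

@njit  # fails: numba cannot type a list of arbitrary network/model objects
def all_norm_diffs(nets, test_samples):
    out = np.empty(len(nets))
    for i in range(len(nets)):
        rec = nets[i].predict(test_samples)  # calling .predict on a Python object is not supported in nopython mode
        out[i] = np.linalg.norm(rec - test_samples)
    return out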