
I am exploring MATLAB's Neural Network Toolbox GUI (NNtool), and I am running into a problem with incompatible dimensions of the target matrix. The exact error message is:

com.mathworks.jmi.MatlabException: Insufficient number of outputs from right hand side of equal sign to satisfy assignment.

Let me explain in detail. I have an image containing an elliptical shape, and I want to estimate the ellipse parameters using a neural network. For that purpose, I have training data with all the target values. I give the image as input (I first read the image, convert it with mat2gray(), and then import it into NNtool), and then I set the target matrix. My target matrix contains two values, since my neural network will output two values. I have tried formatting the target both ways, [0.5 0.9] and [0.5; 0.9], but I still get the same error.

I have also tried keeping the number of columns the same for the input and target matrices. I made my input matrix 2304-by-1 (I have a 48*48 image, which gives 2304 pixels) and my target matrix 2-by-1, but again the same error occurs. While searching, I read that this is some sort of insufficient-memory error. I am not sure whether that is correct. Is that the case?
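
For reference, this is roughly what I do before importing into NNtool (the file name below is just a placeholder), so the input and target end up with the same number of columns:

    % Read one 48x48 training image and scale it to [0, 1]
    img = mat2gray(imread('train_image.png'));   % 48x48 double

    % NNtool expects one column per sample, so flatten to a single column
    inputs = reshape(img, [], 1);                % 2304x1

    % The two ellipse parameters for this sample, also as a column
    targets = [0.5; 0.9];                        % 2x1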

For this neural network, I have to train on 40,000 images, each of size 48*48. How can I input this many images into NNtool?
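
In case it matters, this is the sort of loop I would use to assemble everything into a single input matrix, one column per image (the file names and the source of the target values are placeholders, and I have not verified that NNtool can cope with a matrix this size):

    nImages = 40000;
    inputs  = zeros(48*48, nImages);          % 2304 x 40000, one column per image
    targets = zeros(2, nImages);              % 2 x 40000, two ellipse parameters each

    for k = 1:nImages
        img = mat2gray(imread(sprintf('img_%05d.png', k)));  % placeholder file names
        inputs(:, k)  = reshape(img, [], 1);
        targets(:, k) = trueParams(:, k);     % placeholder: known ellipse parameters
    end
    % Import 'inputs' and 'targets' into NNtool, or pass them to train() directly.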

  • Please post the exact code and your MATLAB version here; otherwise there's no way to verify what you are doing. Thanks! – memyself Sep 23 '11 at 15:26
  • This is MATLAB R2009a, 64-bit, running on Windows 7. As far as code is concerned, there is no code at all; it is nntool, so there is hardly any code involved other than the input matrix and target matrix that get imported into nntool. – Nadeem Sep 23 '11 at 15:30
  • Input = 48*48 matrix, target = [0.5; 0.8] (also tried without the semicolon; didn't work). I then import these two into nntool; the rest is all done with the GUI. – Nadeem Sep 23 '11 at 15:32
  • You don't say anything about what kind of network you created with nntool, therefore it's impossible to replicate your problem. – memyself Sep 23 '11 at 15:46
  • Sorry for the incomplete information; here you go. Network type = feed-forward backprop; training function = TRAINLM; number of layers = 2; transfer function = TANSIG. – Nadeem Sep 23 '11 at 15:55
  • I have just tried increasing the memory allocated to MATLAB, but that didn't work either. :( – Nadeem Sep 23 '11 at 16:14
  • I agree with you that it has nothing to do with it, but this link made me wonder about it: http://www.mathworks.com/support/solutions/en/data/1-BBJCDC/index.html – Nadeem Sep 23 '11 at 18:21

1 Answer


From http://www.mathworks.ch/support/solutions/en/data/1-BBJCDC/index.html

This enhancement has been incorporated in Release 2010b (R2010b). For previous product releases, read below for any possible workarounds:

The error message "Error in ==> nntool at 681 [errmsg,errid] = me.message;" is due to an out of memory error that happened earlier during the call to TRAIN in a TRY CATCH block. The ability to show the standard out of memory error message is not available in NNTOOL in Neural Network Toolbox 6.0.3 (R2009b).

As a workaround, reduce the huge number of inputs, since they generate huge internal temporary matrices when the training steps are calculated.
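
One way this could look in practice (a rough, untested sketch; the file names, target values, and hidden layer size are placeholders, and imresize assumes the Image Processing Toolbox) is to downsample the images and drive the toolbox from the command line rather than the GUI, so any out-of-memory error is reported directly:

    % Shrink each 48x48 image to 24x24 before flattening, cutting the
    % number of network inputs from 2304 to 576.
    files   = {'img1.png', 'img2.png'};       % placeholder file names
    inputs  = zeros(24*24, numel(files));
    for k = 1:numel(files)
        img = imresize(mat2gray(imread(files{k})), [24 24]);
        inputs(:, k) = reshape(img, [], 1);
    end
    targets = [0.5 0.3; 0.9 0.7];             % placeholder ellipse parameters, 2 x N

    % The network described in the comments above (feed-forward, tansig
    % hidden layer, trained with trainlm), created and trained without nntool:
    net = newff(inputs, targets, 10, {'tansig' 'purelin'}, 'trainlm');
    net = train(net, inputs, targets);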

  • Well, all I can reduce is the number of hidden neurons; I can't reduce the image dimensions, as that would screw up my experiment. Just as an experiment, I also tried a 24*24 image, but no luck. – Nadeem Sep 23 '11 at 19:58
  • Did you try using just 10 images? If you really always run into memory problems, then there's not much we can do about it. You can only reduce the number of images, the image size, or the neural network dimensions. – memyself Sep 24 '11 at 07:47
  • It is just a single 48*48 image that I am feeding into this network. – Nadeem Sep 25 '11 at 13:47