
My problem is that I want to apply a linear projection, followed by batch normalization and ReLU, but I don't know PyTorch well enough to implement it.

My input data consists of features for 1024 data points, each of dimension 10x10, so the tensor shape is (16, 1024, 10, 10), where 16 is the batch size. I want to project each 10x10 feature matrix to a vector of length, say, 32, so my output is (16, 1024, 32). How do I tackle this? I found this question, which looks like what I need, but I get an error saying 4-dimensional input is not supported in bmm.

1 Answer


torch.bmm only supports 3D inputs. It would be nice if the functionality were extended to broadcast over additional leading dimensions, but currently that isn't the case. Since you eventually flatten the 10x10 dimensions anyway, you can perform the matrix multiplication on a 3D view of the tensor in question. I'll call the vectorizing matrix F.

# get a 3D view of the input: (16, 1024, 10, 10) -> (16, 1024, 100)
input_3D = input.view(input.shape[0], input.shape[1], -1)

# the vectorizing matrix F should be of size [input_3D.shape[-1], <desired output size>]
# in this case, F is of size [100, 32]

# since torch.bmm expects both arguments to be 3-dimensional, expand F along the batch dimension
F = F.unsqueeze(0).expand(input_3D.shape[0], 100, 32)

output = torch.bmm(input_3D, F) # output has shape [16, 1024, 32] in your case

... # subsequent steps (normalization, activation function, etc.)
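As an aside, torch.matmul does broadcast over leading batch dimensions, so you can skip the expand step entirely by multiplying the flattened view with a plain 2D matrix. A minimal sketch of that alternative (tensor names and sizes follow the question; the random values are just for illustration):

```python
import torch

x = torch.randn(16, 1024, 10, 10)  # (batch, points, 10, 10)
W = torch.randn(100, 32)           # projection matrix: 100 -> 32

# flatten each 10x10 patch to a 100-vector, then let matmul broadcast W
out = torch.matmul(x.view(16, 1024, -1), W)
print(out.shape)  # torch.Size([16, 1024, 32])
```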

Depending on how you expect to implement this, you may need to rearrange the elements of F to achieve the desired transformation.
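To cover the "subsequent steps" as well, here is one way to sketch the full projection + batch norm + ReLU pipeline. Note that nn.BatchNorm1d expects the channel dimension at position 1, so the (batch, points, channels) output is transposed before and after normalization (shapes and the random F are illustrative, matching the question's sizes):

```python
import torch
import torch.nn as nn

batch, points, h, w, out_dim = 16, 1024, 10, 10, 32
x = torch.randn(batch, points, h, w)
F = torch.randn(h * w, out_dim)  # vectorizing matrix, random for illustration

# linear projection: (16, 1024, 10, 10) -> (16, 1024, 100) -> (16, 1024, 32)
x_3d = x.view(batch, points, -1)
out = torch.bmm(x_3d, F.unsqueeze(0).expand(batch, -1, -1))

# BatchNorm1d normalizes over dim 1, so move channels there and back
bn = nn.BatchNorm1d(out_dim)
out = bn(out.transpose(1, 2)).transpose(1, 2)
out = torch.relu(out)
print(out.shape)  # torch.Size([16, 1024, 32])
```

If you want the projection to be learnable, nn.Linear(100, 32) applied to the flattened view does the same multiplication (it operates on the last dimension of any input) and registers F as a parameter.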

DerekG