I am setting up a Recurrent Neural Network where I have a time series of feature vectors (N x D), in which each of the N rows corresponds to an "event". The evolution of the coordinates of the events is given in another N x 3 matrix. Instead of just feeding the network an N x (D + 3) matrix, I want to build a graph of events in order to make use of Graph Convolutions as well. Thus, for a time series of length t, I am dealing with inputs of dimensionality t x N x D as well as t x N x 3, since both the features and the coordinates of the individual "events" evolve over time.
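For concreteness, these are the shapes involved (a minimal sketch; the sizes are made up purely to illustrate):

import tensorflow as tf

t, N, D = 32, 128, 16  # made-up sizes, only to illustrate the shapes

features = tf.random.normal([t, N, D])     # per-event feature vectors over time
coordinates = tf.random.normal([t, N, 3])  # per-event coordinates over time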
For the case of single-event classification, I have already built a method to construct the dense adjacency matrix from the coordinates of the N events. This involves computing the pairwise distances between the N coordinates and applying a Gaussian kernel.
To be more precise, I obtain the distance matrix according to this question's answer. I cannot, however, figure out how to generalize this approach to a setting with t different N x 3 coordinate matrices without iterating over them (which would be too costly).
This is the TensorFlow code I am currently using to obtain the distance matrix from the coordinates of a single time step.
# coordinates: N x 3 (a single time step); the result holds squared Euclidean distances
coordinate_norms = tf.reduce_sum(coordinates * coordinates, 1)  # ||x_i||^2, shape (N,)
coordinate_norms = tf.reshape(coordinate_norms, [-1, 1])        # column vector, shape (N, 1)
# ||x_i - x_j||^2 = ||x_i||^2 - 2 x_i . x_j + ||x_j||^2, shape (N, N)
distances = coordinate_norms - 2 * tf.matmul(coordinates, tf.transpose(coordinates)) + tf.transpose(coordinate_norms)
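From these squared distances I build the dense adjacency matrix with the Gaussian kernel mentioned above (a sketch; sigma is a bandwidth hyperparameter I choose myself):

sigma = 1.0  # kernel bandwidth, a hyperparameter I choose
adjacency = tf.exp(-distances / (2.0 * sigma ** 2))  # N x N dense adjacency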
If I simply use the approach here, the tensor dimensions are incorrect.
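For illustration, this is the shape bookkeeping I expect a batched version to need (a sketch of my own guess, assuming tf.matmul batches over the leading time dimension and that broadcasting takes care of the norm terms; I have not verified that this is correct or efficient):

# coordinates: t x N x 3
coordinate_norms = tf.reduce_sum(coordinates * coordinates, axis=2)  # t x N
norms_col = tf.expand_dims(coordinate_norms, 2)                      # t x N x 1
norms_row = tf.expand_dims(coordinate_norms, 1)                      # t x 1 x N
cross = tf.matmul(coordinates, coordinates, transpose_b=True)        # t x N x N
distances = norms_col - 2.0 * cross + norms_row                      # t x N x N squared distances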