
A similar question has been asked here. But my question is different.

I am trying to build an RNN autoencoder like the one shown below.

[figure: RNN autoencoder]

Basically, the encoder and the decoder are both DNNs, and the RNN takes all the encoding results as a time series. Weights are shared between the corresponding encoder and decoder layers.

Problem: I know how to build an autoencoder, but I don't know how to implement the RNN on the embedding layer.

Here is my autoencoder code:

import numpy as np
from keras.layers import Input, Dense, Dropout
from keras.models import Model
from keras import optimizers

def autoencoder(adj):
    h, w = adj.shape
    kwargs = dict(
        use_bias=True,
        kernel_initializer='glorot_normal',
        kernel_regularizer=None,
        bias_initializer='zeros',
        bias_regularizer=None,
        trainable=True,
    )
    # each row of the adjacency matrix is one training sample
    data = Input(shape=(w,), dtype=np.float32, name='data')
    noisy_data = Dropout(rate=0.5, name='drop0')(data)
    encoded = Dense(256, activation='relu', name='encoded1', **kwargs)(noisy_data)
    encoded = Dense(128, activation='relu', name='encoded2', **kwargs)(encoded)

    encoder = Model([data], encoded)
    encoded1 = encoder.get_layer('encoded1')
    encoded2 = encoder.get_layer('encoded2')

    ### Need an RNN (LSTM) here to take the h encoded results as a time series

    # DenseTied is my custom layer (sketched below) that reuses the
    # transposed kernels of the corresponding encoder layers
    decoded = DenseTied(256, tie_to=encoded2, transpose=True, activation='relu', name='decoded2')(encoded)
    decoded = DenseTied(w, tie_to=encoded1, transpose=True, activation='relu', name='decoded1')(decoded)

    # compile the autoencoder (mbce is a custom loss defined elsewhere)
    adam = optimizers.Adam(lr=0.001, decay=0.0)
    autoencoder = Model(inputs=[data], outputs=[decoded])
    autoencoder.compile(optimizer=adam, loss=mbce)

    return encoder, autoencoder
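
For reference, DenseTied is not a built-in Keras layer; it is my custom layer that borrows the (optionally transposed) kernel of the encoder layer it is tied to, so only the bias is trained here. A minimal sketch of the tied-weights pattern I am using (bias handling and shape checks simplified):

from keras import activations
from keras.layers import Layer
import keras.backend as K

class DenseTied(Layer):
    # Dense layer whose kernel is borrowed (optionally transposed)
    # from another, already-built Dense layer
    def __init__(self, units, tie_to=None, transpose=False, activation=None, **kwargs):
        super(DenseTied, self).__init__(**kwargs)
        self.units = units
        self.tie_to = tie_to            # the Dense layer to share weights with
        self.transpose = transpose
        self.activation = activations.get(activation)

    def build(self, input_shape):
        # only the bias is created here; the kernel belongs to the tied layer
        self.bias = self.add_weight(name='bias', shape=(self.units,),
                                    initializer='zeros')
        super(DenseTied, self).build(input_shape)

    def call(self, inputs):
        kernel = self.tie_to.kernel
        if self.transpose:
            kernel = K.transpose(kernel)
        return self.activation(K.dot(inputs, kernel) + self.bias)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)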
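
To make the question concrete, here is roughly the shape I imagine for the missing RNN part, treating the h rows of the adjacency matrix as timesteps of a single sequence. This is an untested sketch: it uses plain Dense layers wrapped in TimeDistributed, so the DenseTied weight sharing from my code above is not wired in yet.

from keras.layers import Input, Dense, LSTM, TimeDistributed
from keras.models import Model

def rnn_autoencoder(h, w):
    # the whole adjacency matrix as one sequence: h timesteps of w-dim rows
    seq = Input(shape=(h, w), name='seq_data')

    # the same DNN encoder applied to every row (weights shared across timesteps)
    x = TimeDistributed(Dense(256, activation='relu'), name='encoded1')(seq)
    x = TimeDistributed(Dense(128, activation='relu'), name='encoded2')(x)

    # the RNN consumes the h encodings as a time series
    x = LSTM(128, return_sequences=True, name='rnn')(x)

    # mirror-image decoder applied per timestep
    x = TimeDistributed(Dense(256, activation='relu'), name='decoded2')(x)
    out = TimeDistributed(Dense(w, activation='relu'), name='decoded1')(x)

    return Model(seq, out)

What I cannot figure out is how to combine this TimeDistributed/LSTM wiring with the DenseTied weight sharing used in my autoencoder code.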