
I used the TensorFlow script word2vec_basic.py and saved the model with tf.train.Saver: saver = tf.train.Saver(); save_path = saver.save(sess, "./w2v/model.ckpt")
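
For reference, the save step is roughly the following (TF 1.x; sess is the session created in the script):

    # Minimal sketch of the save step described above (TF 1.x).
    # `sess` is the tf.Session built in word2vec_basic.py.
    import tensorflow as tf

    saver = tf.train.Saver()
    save_path = saver.save(sess, "./w2v/model.ckpt")
    print("Checkpoint written to", save_path)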

I can visualize the embedding with TensorBoard successfully, but the points are labeled with the indexes of the words in the vocabulary. How can I get the words themselves to appear in the embedding instead of their vocabulary indexes?


1 Answer


I used this answer: linking-tensorboard-embedding-metadata-to-checkpoint

The problem was that I called TensorBoard with --logdir pointing at "./w2v/model.ckpt"; it should be called with the directory "w2v/" only.
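
A minimal sketch of the metadata-linking step from that answer, assuming TF 1.x and the variables sess, embeddings, vocabulary_size and reverse_dictionary defined in word2vec_basic.py (the metadata file name is arbitrary):

    import os
    import tensorflow as tf
    from tensorflow.contrib.tensorboard.plugins import projector

    log_dir = 'w2v'  # directory TensorBoard will be pointed at

    # 1) Write one word per line, in vocabulary order, so row i of the
    #    embedding matrix is labeled with word i instead of the index i.
    with open(os.path.join(log_dir, 'metadata.tsv'), 'w') as f:
        for i in range(vocabulary_size):
            f.write(reverse_dictionary[i] + '\n')

    # 2) Tell the projector which tensor the metadata belongs to.
    config = projector.ProjectorConfig()
    emb = config.embeddings.add()
    emb.tensor_name = embeddings.name      # the embedding tf.Variable from the script
    emb.metadata_path = 'metadata.tsv'     # resolved relative to log_dir

    writer = tf.summary.FileWriter(log_dir, sess.graph)
    projector.visualize_embeddings(writer, config)

    # 3) Save the checkpoint into the same directory as the metadata.
    saver = tf.train.Saver()
    saver.save(sess, os.path.join(log_dir, 'model.ckpt'))

Then launch TensorBoard against the directory, not the checkpoint file:

    tensorboard --logdir w2v/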
