
I used TensorFlow to train an LSTM language model; the code is from here.

According to the article here, it seems the model works better if I use pre-trained word2vec vectors:

Using word embeddings such as word2vec and GloVe is a popular method to improve the accuracy of your model. Instead of using one-hot vectors to represent our words, the low-dimensional vectors learned using word2vec or GloVe carry semantic meaning – similar words have similar vectors. Using these vectors is a form of pre-training.
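To make sure I understand the quoted point, here is a toy sketch (plain Python, made-up numbers, not from the article or the tutorial) of the difference between one-hot vectors and dense word2vec-style vectors:

```python
# Toy illustration: one-hot vs. dense word vectors (numbers are invented).

# One-hot: one dimension per word, so every pair of distinct words
# looks equally unrelated (dot product 0).
one_hot = {
    "king":  [1, 0, 0],
    "queen": [0, 1, 0],
    "apple": [0, 0, 1],
}

# Dense word2vec-style vectors: similar words get similar vectors.
dense = {
    "king":  [0.9, 0.8],
    "queen": [0.85, 0.75],
    "apple": [-0.7, 0.1],
}

def dot(a, b):
    """Dot product as a crude similarity measure."""
    return sum(x * y for x, y in zip(a, b))

print(dot(one_hot["king"], one_hot["queen"]))  # 0 -> no similarity signal
# dense vectors do carry a similarity signal:
print(dot(dense["king"], dense["queen"]) > dot(dense["king"], dense["apple"]))  # True
```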

So, I want to use word2vec to redo the training, but I am a little bit confused about how to do this.

Here is the embedding code:

with tf.device("/cpu:0"):
  # embedding matrix, initialized randomly and trained from scratch
  embedding = tf.get_variable(
      "embedding", [vocab_size, size], dtype=data_type())
  # map each word id in the input batch to its embedding vector
  inputs = tf.nn.embedding_lookup(embedding, input_.input_data)
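
If I understand correctly, the embedding matrix is just a lookup table with one vector per word id, and `tf.nn.embedding_lookup` selects rows from it, so using pre-trained word2vec should amount to filling that table with the pre-trained vectors instead of randomly initialized ones. A toy sketch of my understanding (plain Python, names are mine, not the tutorial's):

```python
# A minimal model of what embedding lookup does: the embedding matrix
# is a table of vectors, one row per word id, and the lookup selects
# the rows for the given word ids.
vocab = ["the", "cat", "sat"]   # toy vocabulary: word -> id by position
embedding = [                   # toy 2-d "pre-trained" vectors
    [0.1, 0.2],  # the
    [0.3, 0.4],  # cat
    [0.5, 0.6],  # sat
]

def lookup(embedding, ids):
    """Mimics tf.nn.embedding_lookup: pick one row per word id."""
    return [embedding[i] for i in ids]

inputs = lookup(embedding, [1, 0, 2])  # ids for "cat the sat"
print(inputs)  # [[0.3, 0.4], [0.1, 0.2], [0.5, 0.6]]
```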

How can I change this code to use pre-trained word2vec?

