
Could anyone let me know if there is any way of getting sentence embeddings from meta-llama/Llama-2-13b-chat-hf on Hugging Face?

Model link: https://huggingface.co/meta-llama/Llama-2-13b-chat-hf

I tried using the transformers.AutoModel class from Hugging Face to get the embeddings, but the results don't look as expected. The implementation I followed is linked below. Reference: https://github.com/Muennighoff/sgpt#asymmetric-semantic-search-be
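For reference, this is roughly what I did (a minimal sketch of SGPT-style position-weighted mean pooling over the last hidden state; the pooling details are my approximation of the referenced repo, not its exact code):

import torch
from transformers import AutoModel, AutoTokenizer

model_name = "meta-llama/Llama-2-13b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

def embed(sentence: str) -> torch.Tensor:
    inputs = tokenizer(sentence, return_tensors="pt").to(model.device)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, hidden_dim)
    # Position-weighted mean pooling: later tokens get higher weight, as in SGPT
    seq_len = hidden.shape[1]
    weights = torch.arange(1, seq_len + 1, device=hidden.device, dtype=hidden.dtype)
    weights = weights / weights.sum()
    return (hidden[0] * weights.unsqueeze(-1)).sum(dim=0)

emb = embed("your sentence")
print(emb.shape)  # (5120,) for the 13B model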

  • You _can_, certainly, but it's not a good choice for the job. A specialized model designed specifically for embeddings will provide much more compact (and thus, efficient-to-compare) results. – Charles Duffy Aug 18 '23 at 02:01

1 Answer


You can get sentence embeddings from Llama 2. Take a look at the 'llama.cpp' project repo: https://github.com/ggerganov/llama.cpp/.

You can use its 'embedding' example (embedding.cpp) to generate a sentence embedding:

./embedding -m models/7B/ggml-model-q4_0.bin -p "your sentence"

https://github.com/ggerganov/llama.cpp/blob/master/examples/embedding/embedding.cpp.
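If you'd rather call it from Python, the llama-cpp-python bindings expose the same embedding functionality (a rough sketch; the model path is a placeholder for your own converted/quantized model file):

from llama_cpp import Llama

# embedding=True loads the model for embedding extraction rather than generation
llm = Llama(model_path="models/7B/ggml-model-q4_0.bin", embedding=True)

emb = llm.embed("your sentence")  # returns a list of floats
print(len(emb))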

As Charles Duffy also mentioned in the comments, there are specialized models designed specifically for sentence embeddings, e.g. "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks": https://www.sbert.net/.
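For comparison, a dedicated embedding model takes only a few lines with the sentence-transformers library (the model name here is just a common default, my choice rather than a specific recommendation):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(["your sentence", "another sentence"])
print(embeddings.shape)  # (2, 384)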

You can see more discussion on the effectiveness of LLaMA-based sentence embeddings in the thread "Embedding doesn't seem to work?": https://github.com/ggerganov/llama.cpp/issues/899.

Deepak Kumar