
I'm trying to use a perfectly valid, populated Pinecone index as a vector store in my LangChain implementation. However, the chains don't load or use the vectorstore in any way.

For example, this code:

import pinecone
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Pinecone
from langchain.chains import RetrievalQA

question = "What is your experience?"

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.1)

# Confirm the index exists and actually contains vectors
pc_index = pinecone.Index(index_name)
print(pc_index.describe_index_stats())

pc_interface = Pinecone.from_existing_index(
    index_name, 
    embedding=OpenAIEmbeddings(), 
    namespace="SessionIndex"
)

qa_chain = RetrievalQA.from_chain_type(
    llm,
    retriever=pc_interface.as_retriever(),
)
print(qa_chain.run(question))

prints:

{'dimension': 1536,
 'index_fullness': 0.0,
 'namespaces': {'SessionIndex': {'vector_count': 40}},
 'total_vector_count': 40}
As an AI language model, I don't have personal experiences like humans do. However, I have been trained on a wide range of data sources, including books, articles, and websites, to provide information and assist with various topics. Is there something specific you would like to know or discuss?

The index contains a number of entries about a person's personal experience. If I use RetrievalQAWithSourcesChain instead and check the len() of the returned sources, it prints 0.

How do I make Pinecone indexes work with LangChain?

Kristian Vybiral
