
I am writing a little app in JavaScript using the LangChain library. I have the following snippet:

/* LangChain Imports */
import { OpenAI } from "langchain/llms/openai";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";

// ========================================================================================= //
// ============= Use LangChain to send a request to the OpenAI API ======================== //
// ========================================================================================= //

  const openAILLMOptions = {
    modelName: chatModel.value,
    openAIApiKey: decryptedString,
    temperature: parseFloat(temperatureValue.value),
    topP: parseFloat(topP.value),
    maxTokens: parseInt(maxTokens.value),
    stop: stopSequences.value.length > 0 ? stopSequences.value : null,
    streaming: true,
  };

  const model = new OpenAI(openAILLMOptions);
  const memory = new BufferMemory();
  const chain = new ConversationChain({ llm: model, memory: memory });

  try {
    const response = await chain.call({ input: content.value, signal: signal }, undefined,
      [
        {
          handleLLMNewToken(token) {
            process.stdout.write(token);
          },
        },
      ]
    );

    // handle the response

  } catch (error) {
    // handle errors (e.g. an aborted request)
    console.error(error);
  }

This does not work (I have tried both with the token parameter typed via TypeScript and without typing). I have scoured various forums, and they either implement streaming with Python or their solution is not relevant to this problem. To summarize: I can successfully pull the response from OpenAI via the LangChain ConversationChain() call, but I can't stream the response. Any guidance or solutions are most welcome.
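For comparison, here is a minimal streaming sketch using constructor-level callbacks, as I understand them from the LangChain JS streaming docs. The model name, API key, and the appendToChat() helper are placeholders of mine, not values from my app (process.stdout only exists in Node, so a browser app needs its own output function):

/* Minimal streaming sketch; model name, API key, and appendToChat() are placeholders. */
import { OpenAI } from "langchain/llms/openai";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";

const model = new OpenAI({
  modelName: "gpt-3.5-turbo-instruct", // placeholder model name
  openAIApiKey: "sk-...",              // placeholder key
  streaming: true,
  callbacks: [
    {
      handleLLMNewToken(token) {
        // fires once per generated token while streaming is enabled
        appendToChat(token); // placeholder UI helper; in Node this could be process.stdout.write(token)
      },
    },
  ],
});

const chain = new ConversationChain({
  llm: model,
  memory: new BufferMemory(),
});

const response = await chain.call({ input: "Hello, stream this back to me." });
// response.response still holds the full completion once streaming finishes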

Alan
