
This is my code:

import os
from dotenv import load_dotenv,find_dotenv
load_dotenv(find_dotenv())

print(os.environ.get("OPEN_AI_KEY"))

from langchain.llms import OpenAI
llm=OpenAI(model_name="text-davinci-003",temperature=0.7,max_tokens=512)
print(llm)

When I execute the above code I get this error:

ValidationError: 1 validation error for OpenAI
__root__
  Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass  `openai_api_key` as a named parameter. (type=value_error)

The docs say:

If you'd prefer not to set an environment variable you can pass the key in directly via the openai_api_key named parameter when initiating the OpenAI LLM class:

But I have already set it, and it prints correctly.


When I create the llm by passing the key as a named parameter:

llm=OpenAI(openai_api_key="PASSINGCORRECTKEY", model_name="text-davinci-003",temperature=0.7,max_tokens=512)
llm("Tell me a joke")

then I get this error:

raise ValueError(
             "Argument `prompt` is expected to be a string. Instead found "
            f"{type(prompt)}. If you want to run the LLM on multiple prompts, use "
             "`generate` instead."
         )
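
For reference, here is a minimal sketch of the two call styles that error message distinguishes, assuming the legacy langchain.llms.OpenAI wrapper (the prompts are placeholders):

from langchain.llms import OpenAI

llm = OpenAI(model_name="text-davinci-003", temperature=0.7, max_tokens=512)

# a single prompt is passed as a plain string
print(llm("Tell me a joke"))

# several prompts go through generate() with a list of strings
result = llm.generate(["Tell me a joke", "Tell me a riddle"])
print(result.generations)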

UPDATE

The env variable was initially set as OPEN_AI_KEY since I copied and pasted it from one of my other projects, which calls the chat/completions API. I changed the env variable to OPENAI_API_KEY; now I get this error:

AuthenticationError: Incorrect API key provided: org-Wz3J****************2XK6. You can find your API key at https://platform.openai.com/account/api-keys.

But the same API key works when I call the "https://api.openai.com/v1/chat/completions" endpoint.

Yilmaz

3 Answers


You may need to store the OpenAI key and then pass it to the llm variable you have here, or just rename your environment variable to OPENAI_API_KEY.

A possible example of passing a key directly is this:

import os
from dotenv import load_dotenv, find_dotenv
load_dotenv(find_dotenv())

prompt = "Your Prompt Here"
OpenAI_key = os.environ.get("OPEN_AI_KEY")  # read the key from your .env file
print(OpenAI_key)

from langchain.llms import OpenAI
# pass the key explicitly so the wrapper does not depend on the OPENAI_API_KEY env variable
llm = OpenAI(model_name="text-davinci-003", temperature=0.7, max_tokens=512, openai_api_key=OpenAI_key)
print(llm(prompt))

It should work now.


Your env variable must have the key name OPENAI_API_KEY. This should solve your problem:

OPEN_AI_KEY => OPENAI_API_KEY
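
A minimal sketch of the fix, assuming the renamed key sits in a .env file loaded with python-dotenv and the prompt text is just a placeholder:

# the .env file should contain: OPENAI_API_KEY="sk-..."
from dotenv import load_dotenv, find_dotenv
from langchain.llms import OpenAI

load_dotenv(find_dotenv())

# with the variable named OPENAI_API_KEY, LangChain picks it up automatically,
# so no openai_api_key argument is needed
llm = OpenAI(model_name="text-davinci-003", temperature=0.7, max_tokens=512)
print(llm("Tell me a joke"))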

krisograbek

You should add a .env file and put the OPENAI_API_KEY in it:

OPENAI_API_KEY="xxxx"
Xiaomin Wu