
So I created a Lambda function for a script that allows a user to pass a query to the Amazon Titan LLM on Amazon Bedrock. Here is the content of the main.py file in my deployment package.

from langchain.llms.bedrock import Bedrock
import boto3
from langchain.retrievers import AmazonKendraRetriever
from langchain.chains import RetrievalQA
from langchain.prompts import PromptTemplate
import json
from botocore.exceptions import ClientError

def get_secret():
    secret_name = "kendraRagApp"

    # Create a Secrets Manager client
    session = boto3.session.Session()
    client = session.client(
        service_name='secretsmanager',
    )

    try:
        get_secret_value_response = client.get_secret_value(
            SecretId=secret_name
        )
    except ClientError as e:
        raise e

    # Decrypts secret using the associated KMS key.
    secret = get_secret_value_response['SecretString']
    return secret

def qa(query):
    secrets = json.loads(get_secret())
    kendra_index_id = secrets['kendra_index_id']

    llm = Bedrock(model_id="amazon.titan-tg1-large", region_name='us-east-1', credentials_profile_name='default')
    llm.model_kwargs = {"maxTokenCount": 4096}

    retriever = AmazonKendraRetriever(index_id=kendra_index_id)
    
    prompt_template = """
    {context}
    {question} If you are unable to find the relevant article, respond 'I can't generate the needed content based on the context provided.'
    """
    
    PROMPT = PromptTemplate(
        template=prompt_template, input_variables=["context", "question"]
    )
    
    chain = RetrievalQA.from_chain_type(
        llm=llm,
        retriever=retriever,
        verbose=True,
        chain_type_kwargs={
            "prompt": PROMPT
        }
    )
    
    return chain(query)

def handler(event, context):
    query = event['query']
    response = qa(query)
    if response.get("result"):
        return {
            'statusCode': 200,
            'body': response["result"]
        }
    else:
        return {
            'statusCode': 400,
            'body': "Could not answer the query based on the context available"
        }
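For reference, the handler reads `event['query']`, so a minimal test event (the query text here is just an example) looks like:

```json
{
  "query": "What is the refund policy?"
}
```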

The Lambda function was created successfully, but when I try to invoke it, I get the following validation error. Apparently, Bedrock could not load my credentials for authentication:

{
  "errorMessage": "1 validation error for Bedrock\n__root__\n  Could not load credentials to authenticate with AWS client. Please check that credentials in the specified profile name are valid. (type=value_error)",
  "errorType": "ValidationError",
  "requestId": "b772f236-f582-4308-8af5-b5a418d4327f",
  "stackTrace": [
    "  File \"/var/task/main.py\", line 62, in handler\n    response = qa(query)\n",
    "  File \"/var/task/main.py\", line 32, in qa\n    llm = Bedrock(model_id=\"amazon.titan-tg1-large\", region_name='us-east-1',) #client=BEDROCK_CLIENT)\n",
    "  File \"/var/task/langchain/load/serializable.py\", line 74, in __init__\n    super().__init__(**kwargs)\n",
    "  File \"pydantic/main.py\", line 341, in pydantic.main.BaseModel.__init__\n    raise validation_error\n"
  ]
}

I have looked at the Bedrock class documentation, but couldn't find enough information on how to pass my credentials to the Bedrock class. Mind you, my code runs without issues from my SageMaker notebook (I guess because authentication is handled automatically there). I will appreciate any useful help. Thanks.

Edit: omitting the credentials_profile_name parameter when calling the Bedrock class does not fix it. Also, calling the Lambda function from a local environment with authentication set up does not resolve the issue either.

2 Answers


The likely issue is that you haven't configured AWS credentials on the machine you're running the code on. Since you pass credentials_profile_name='default' into the Bedrock constructor, it tries to load the credentials from the local default profile.

SageMaker notebooks do this automatically, but on most other machines you have to do this yourself.
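Outside of SageMaker, this usually means creating a default profile, for example via `aws configure`, which writes something like the following (key values shown as placeholders):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = ...

# ~/.aws/config
[default]
region = us-east-1
```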

That said, you don't have to provide any specific credentials to Bedrock; it automatically uses boto3.Session() internally.

This means that if you have configured a boto3 session with the proper credentials, you don't need to pass credentials_profile_name='default' into the constructor.

If the boto3 Session has the required permissions, it should be sufficient to replace:

llm = Bedrock(model_id="amazon.titan-tg1-large", region_name='us-east-1', credentials_profile_name='default')

with:

llm = Bedrock(model_id="amazon.titan-tg1-large", region_name='us-east-1')
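Note that inside Lambda, the credentials come from the function's execution role, so that role also needs permission to invoke the model. A minimal policy sketch (assuming the bedrock:InvokeModel action; the code would additionally need Kendra and Secrets Manager access) might look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "*"
    }
  ]
}
```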
Dennis Traub
  • Thank you Dennis. My AWS credentials are configured on my local machine. In my case, I was trying to invoke the function from a SageMaker notebook; even when I tried creating a test event in the Lambda console, it fails with this same error. I had ensured that the role assigned to my Lambda function has permission to access the Bedrock service. I also tried adding my AWS access key ID and secret access key as environment variables in the Lambda console, but this didn't help either. Does this error have to do with how the Bedrock class handles authentication, or with Lambda? – Mustapha Unubi Momoh Aug 04 '23 at 14:55
  • If you don't pass `credentials_profile_name` into `Bedrock`, it will leverage the boto3 credentials. Remove the credential argument from `llm = Bedrock(...)` and try again. I've updated my answer. – Dennis Traub Aug 04 '23 at 15:42
  • Thank you Dennis. In my previous attempts, `credentials_profile_name` was not passed to `Bedrock`, so I have observed that whether passed or not, I get the same error. One other approach I tried was to pass my credentials when creating the bedrock client as shown here: `BEDROCK_CLIENT = boto3.client('bedrock', region_name='us-east-1', aws_access_key_id='randomrandomrandom',aws_secret_access_key='randomrandomrandom')`, then, `llm = Bedrock(model_id="amazon.titan-tg1-large", region_name='us-east-1', client=BEDROCK_CLIENT)` – Mustapha Unubi Momoh Aug 04 '23 at 18:39
  • however, I get a new error complaining that `bedrock` is an unknown service, even though the correct versions of boto3 and botocore (i.e. 1.26.162) that support bedrock are installed in the deployment package – Mustapha Unubi Momoh Aug 04 '23 at 18:47

Try passing the bedrock client like this:

    llm1 = Bedrock(
        model_id="anthropic.claude-v1",
        model_kwargs={
            "temperature": 1,
        },
        region_name="us-east-1",
        client=bedrock_client,
    )

This link provides some more information on this issue.

  • I mentioned this in my last comment above. This approach does not fix the issue either; instead I get a new error that `bedrock` is an unknown service. Mind you, both of the boto wheels in the deployment package that I used to create the Lambda function support bedrock. – Mustapha Unubi Momoh Aug 31 '23 at 18:24