
Connect Amazon Bedrock agents to cross-account knowledge bases

by admin
November 8, 2025
in Artificial Intelligence


Organizations need seamless access to their structured data repositories to power intelligent AI agents. However, when these sources span multiple AWS accounts, integration challenges can arise. This post explores a practical solution for connecting Amazon Bedrock agents to knowledge bases in Amazon Redshift clusters residing in different AWS accounts.

The challenge

Organizations that build AI agents using Amazon Bedrock can keep their structured data in Amazon Redshift clusters. When these data repositories exist in separate AWS accounts from their AI agents, they face a significant limitation: Amazon Bedrock Knowledge Bases doesn't natively support cross-account Redshift integration.

This creates a challenge for enterprises with multi-account architectures that want to:

  • Use existing structured data in Redshift for their AI agents.
  • Maintain separation of concerns across different AWS accounts.
  • Avoid duplicating data across accounts.
  • Ensure proper security and access controls.

Solution overview

Our solution enables cross-account knowledge base integration through a secure, serverless architecture that maintains strict access controls while allowing AI agents to query structured data. The approach uses AWS Lambda as an intermediary to facilitate secure cross-account data access.

Cross-Account Amazon Bedrock knowledge base architecture

The action flow, as shown in the preceding diagram (the CLI sketch after this list mirrors steps 2-4):

  1. Users enter their natural language question in the Amazon Bedrock agent, which is configured in the agent account.
  2. The Amazon Bedrock agent invokes a Lambda function through action groups, which provides access to the Amazon Bedrock knowledge base configured in the agent-kb account.
  3. The action group Lambda function running in the agent account assumes an IAM role created in the agent-kb account to connect to the knowledge base in the agent-kb account.
  4. The Amazon Bedrock knowledge base in the agent-kb account uses an IAM role created in the same account to access the Amazon Redshift data warehouse and query data in the data warehouse.
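The following minimal AWS CLI sketch mirrors what the action group Lambda function does in steps 2-4: assume the knowledge base access role across accounts, then query the knowledge base with the temporary credentials. The role ARN and knowledge base ID are the example values used throughout this post, not literals.

  # Assume the knowledge base access role in the agent-kb account (step 3)
  CREDS=$(aws sts assume-role \
    --role-arn arn:aws:iam::999999999999:role/bedrock_kb_access_role \
    --role-session-name kb-query \
    --profile agent \
    --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
    --output text)
  read -r AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN <<< "$CREDS"
  export AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN

  # Query the structured knowledge base with the temporary credentials (step 4)
  aws bedrock-agent-runtime retrieve \
    --knowledge-base-id XXXXXXXXXX \
    --retrieval-query '{"text": "Who are the top 5 customers in Saudi Arabia?"}' \
    --region us-west-2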

The solution consists of the following key components:

  1. An Amazon Bedrock agent in the agent account that handles user interactions.
  2. An Amazon Redshift Serverless workgroup in a VPC and private subnet in the agent-kb account, containing the structured data.
  3. An Amazon Bedrock knowledge base that uses the Amazon Redshift Serverless workgroup as its structured data source.
  4. A Lambda function in the agent account.
  5. An action group configuration that connects the agent in the agent account to the Lambda function.
  6. IAM roles and policies that enable secure cross-account access; a sketch of the central trust relationship follows this list.
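The cross-account link rests on a trust relationship: the access role in the agent-kb account must trust the Lambda execution role in the agent account. The following is a rough sketch of that trust policy, using the example account number and role names from this post; the setup script in the walkthrough creates the actual roles and policies for you.

  # Trust policy for bedrock_kb_access_role in the agent-kb account:
  # only the agent account's Lambda execution role may assume it
  aws iam create-role \
    --role-name bedrock_kb_access_role \
    --profile agent-kb \
    --assume-role-policy-document '{
      "Version": "2012-10-17",
      "Statement": [{
        "Effect": "Allow",
        "Principal": {
          "AWS": "arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role"
        },
        "Action": "sts:AssumeRole"
      }]
    }'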

Prerequisites

This solution requires the following:

  1. Two AWS accounts. Create an AWS account if you do not already have one. The specific permissions required in each account are set up in subsequent steps.
  2. Install the AWS CLI (2.24.22 is the current version at the time of writing).
  3. Set up authentication for the AWS CLI using IAM user credentials for each account.
  4. Make sure you have jq installed; jq is a lightweight command-line JSON processor. On a Mac, for example, you can install it with the command brew install jq (jq-1.7.1-apple is the current version).
  5. Navigate to the Amazon Bedrock console and make sure you enable access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account and to the us.amazon.nova-pro-v1:0 model in the agent account, in the us-west-2, US West (Oregon) AWS Region.
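Before going further, it helps to confirm that both CLI profiles and jq work. The following quick check assumes you named the profiles agent and agent-kb, as this post does:

  # Print the 12-digit account ID behind each profile; a valid response
  # confirms the credentials work and that jq can parse the output
  aws sts get-caller-identity --profile agent | jq -r .Account
  aws sts get-caller-identity --profile agent-kb | jq -r .Account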

Assumptions

Let's call the AWS account profile that has the Amazon Bedrock agent the agent profile. Similarly, the AWS account profile that has the Amazon Bedrock knowledge base with Amazon Redshift Serverless and the structured data source will be called agent-kb. We'll use the us-west-2, US West (Oregon) AWS Region, but feel free to choose another AWS Region as necessary (the prerequisites apply to whichever AWS Region you choose to deploy this solution in). We'll use the meta.llama3-1-70b-instruct-v1:0 model for agent-kb; it's available on demand in us-west-2. You're free to choose other models with cross-Region inference, but that would mean changing the roles and policies accordingly and enabling model access in all the Regions where they're available. Based on our model choice for this solution, the AWS Region must be us-west-2. For the agent, we'll use an Amazon Bedrock agents-optimized model such as us.amazon.nova-pro-v1:0.

Implementation walkthrough

The following is a step-by-step implementation guide. Make sure to perform all steps in the same AWS Region in both accounts.

These steps deploy and test an end-to-end solution from scratch; if you are already running some of these components, you can skip the corresponding steps.

    1. Make a note of the AWS account numbers of the agent and agent-kb accounts. The implementation steps refer to them as follows:
      Profile | AWS account | Description
      agent | 111122223333 | Account for the Bedrock agent
      agent-kb | 999999999999 | Account for the Bedrock knowledge base

      Note: These steps use example profile names and account numbers; replace them with actual values before running.

    2. Create the Amazon Redshift Serverless workgroup in the agent-kb account:
      1. Sign in to the agent-kb account.
      2. Follow the workshop link to create the Amazon Redshift Serverless workgroup in a private subnet.
      3. Make a note of the namespace, workgroup, and other details, and follow the rest of the hands-on workshop instructions.
    3. Set up your data warehouse in the agent-kb account.
    4. Create your AI knowledge base in the agent-kb account. Make a note of the knowledge base ID.
    5. Train your AI assistant in the agent-kb account.
    6. Test natural language queries in the agent-kb account. You can find the code in the aws-samples GitHub repository: sample-for-amazon-bedrock-agent-connect-cross-account-kb (see the clone command below).
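The scripts used in the following steps come from that repository. One way to get everything locally is to clone it; the URL below assumes the standard aws-samples naming for the repository mentioned above:

      # Clone the sample repository and move into it
      git clone https://github.com/aws-samples/sample-for-amazon-bedrock-agent-connect-cross-account-kb.git
      cd sample-for-amazon-bedrock-agent-connect-cross-account-kb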
    7. Create the necessary roles and policies in both accounts. Run the script create_bedrock_agent_kb_roles_policies.sh with the following input parameters:
      Input parameter | Value | Description
      --agent-kb-profile | agent-kb | The agent knowledge base profile that you set up with the AWS CLI using aws_access_key_id and aws_secret_access_key, as mentioned in the prerequisites.
      --lambda-role | lambda_bedrock_kb_query_role | The IAM role in the agent account that the Bedrock agent action group Lambda assumes to connect to Redshift cross-account.
      --kb-access-role | bedrock_kb_access_role | The IAM role in the agent-kb account that lambda_bedrock_kb_query_role in the agent account assumes to connect to Redshift cross-account.
      --kb-access-policy | bedrock_kb_access_policy | The IAM policy attached to the IAM role bedrock_kb_access_role.
      --lambda-policy | lambda_bedrock_kb_query_policy | The IAM policy attached to the IAM role lambda_bedrock_kb_query_role.
      --knowledge-base-id | XXXXXXXXXX | Replace with the actual knowledge base ID created in Step 4.
      --agent-account | 111122223333 | Replace with the 12-digit AWS account number where the Bedrock agent is running (agent account).
      --agent-kb-account | 999999999999 | Replace with the 12-digit AWS account number where the Bedrock knowledge base is running (agent-kb account).
    8. Download the script (create_bedrock_agent_kb_roles_policies.sh) from the aws-samples GitHub repository.
    9. Open Terminal on a Mac or a similar bash shell on other platforms.
    10. Locate and change directory to the download location, and give the script executable permissions:
      cd /my/location
      chmod +x create_bedrock_agent_kb_roles_policies.sh

    11. If you are still not clear on the script usage or inputs, you can run the script with the --help option and it will display the usage:
      ./create_bedrock_agent_kb_roles_policies.sh --help
    12. Run the script with the appropriate input parameters as described in the preceding table:
      ./create_bedrock_agent_kb_roles_policies.sh --agent-profile agent \
        --agent-kb-profile agent-kb \
        --lambda-role lambda_bedrock_kb_query_role \
        --kb-access-role bedrock_kb_access_role \
        --kb-access-policy bedrock_kb_access_policy \
        --lambda-policy lambda_bedrock_kb_query_policy \
        --knowledge-base-id XXXXXXXXXX \
        --agent-account 111122223333 \
        --agent-kb-account 999999999999

    13. On successful execution, the script shows a summary of the IAM roles and policies created in both accounts.
    14. Sign in to both the agent and agent-kb accounts to verify that the IAM roles and policies were created. You can also fetch the ARNs from the CLI, as shown after this list.
          • For the agent account: make a note of the ARN of lambda_bedrock_kb_query_role; it will be the value of the CloudFormation stack parameter AgentLambdaExecutionRoleArn in the next step.
          • For the agent-kb account: make a note of the ARN of bedrock_kb_access_role; it will be the value of the CloudFormation stack parameter TargetRoleArn in the next step.
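Fetching the two ARNs from the CLI is convenient for pasting into the stack parameters in the next step:

      # Print the role ARNs needed as CloudFormation parameters
      aws iam get-role --role-name lambda_bedrock_kb_query_role \
        --profile agent --query Role.Arn --output text
      aws iam get-role --role-name bedrock_kb_access_role \
        --profile agent-kb --query Role.Arn --output text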
    15. Run the AWS CloudFormation template to create a Bedrock agent:
            1. Download the CloudFormation template cloudformation_bedrock_agent_kb_query_cross_account.yaml from the aws-samples GitHub repository.
            2. Sign in to the agent account, navigate to the CloudFormation console, and verify that you are in the us-west-2 (Oregon) Region. Choose Create stack and then With new resources (standard).
            3. In the Specify template section, choose Upload a template file, then Choose file, and select the file from (1). Then choose Next.
            4. Enter the following stack details and choose Next.
              Parameter | Value | Description
              Stack name | bedrock-agent-connect-kb-cross-account-agent | You can choose any name
              AgentFoundationModelId | us.amazon.nova-pro-v1:0 | Do not change
              AgentLambdaExecutionRoleArn | arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role | Replace with your agent account number
              BedrockAgentDescription | Agent to query inventory data from Redshift Serverless database | Keep the default
              BedrockAgentInstructions | You are an assistant that helps users query inventory data from our Redshift Serverless database using the action group. | Do not change
              BedrockAgentName | bedrock_kb_query_cross_account | Keep the default
              KBFoundationModelId | meta.llama3-1-70b-instruct-v1:0 | Do not change
              KnowledgeBaseId | XXXXXXXXXX | Knowledge base ID from Step 4
              TargetRoleArn | arn:aws:iam::999999999999:role/bedrock_kb_access_role | Replace with your agent-kb account number

            5. Complete the acknowledgement and choose Next.
            6. Scroll down through the page and choose Submit.
            7. You will see the CloudFormation stack being created, as shown by the status CREATE_IN_PROGRESS.
            8. It will take a few minutes, and you will see the status change to CREATE_COMPLETE, indicating that all resources were created. Choose the Outputs tab to make a note of the resources that were created.
              In summary, the CloudFormation template does the following in the agent account:
                  • Creates a Bedrock agent
                  • Creates an action group
                  • Creates a Lambda function that is invoked by the Bedrock action group
                  • Defines the OpenAPI schema
                  • Creates the necessary roles and permissions for the Bedrock agent
                  • Finally, prepares the Bedrock agent so that it is ready to test
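If you prefer the CLI over the console, an equivalent deployment could look like the following sketch. The template path is the file downloaded in (1), the two ARNs are the values noted in Step 14, and the remaining parameters keep their template defaults:

      # Deploy the stack from the CLI; CAPABILITY_NAMED_IAM is required
      # because the template creates IAM roles
      aws cloudformation create-stack \
        --stack-name bedrock-agent-connect-kb-cross-account-agent \
        --template-body file://cloudformation_bedrock_agent_kb_query_cross_account.yaml \
        --parameters \
          ParameterKey=AgentLambdaExecutionRoleArn,ParameterValue=arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role \
          ParameterKey=KnowledgeBaseId,ParameterValue=XXXXXXXXXX \
          ParameterKey=TargetRoleArn,ParameterValue=arn:aws:iam::999999999999:role/bedrock_kb_access_role \
        --capabilities CAPABILITY_NAMED_IAM \
        --region us-west-2 --profile agent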
    16. Check for model access in Oregon (us-west-2):
            1. Verify access to the Nova Pro (us.amazon.nova-pro-v1:0) model in the agent account. Navigate to the Amazon Bedrock console, choose Model access under Configure and learn, and search for Model name: Nova Pro to verify access. If access is not enabled, enable it.
            2. Verify access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account. This should already be enabled, because we set up the knowledge base earlier.
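You can also smoke-test model access from the CLI instead of the console; a successful response to a throwaway prompt confirms the model is enabled (the profile name is the one assumed throughout this post):

      # A one-line access check against the agent account's model
      aws bedrock-runtime converse \
        --model-id us.amazon.nova-pro-v1:0 \
        --messages '[{"role":"user","content":[{"text":"ping"}]}]' \
        --region us-west-2 --profile agent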
    17. Run the agent. Sign in to the agent account, navigate to the Amazon Bedrock console, and choose Agents under Build.
    18. Choose the name of the agent and choose Test. You can try the questions mentioned on the workshop's Stage 4: Test Natural Language Queries page, or invoke the agent from the CLI as shown after this list. For example:
            1. Who are the top 5 customers in Saudi Arabia?
            2. Who are the top parts suppliers in the United States by volume?
            3. What is the total revenue by region for the year 1998?
            4. Which products have the highest profit margins?
            5. Show me orders with the highest priority from the last quarter of 1997.
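To exercise the agent outside the console, you can call it with the CLI. The agent ID comes from the CloudFormation stack outputs, and TSTALIASID is the built-in draft test alias; both are shown here as placeholders:

      # Invoke the agent with a test question; the answer is written to out.txt
      aws bedrock-agent-runtime invoke-agent \
        --agent-id AGENT_ID \
        --agent-alias-id TSTALIASID \
        --session-id test-session-1 \
        --input-text "Who are the top 5 customers in Saudi Arabia?" \
        --region us-west-2 --profile agent \
        out.txt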

    19. Choose Show trace to analyze the agent traces.

Some recommended best practices:

      • Phrase your question to be more specific.
      • Use terminology that matches your table descriptions.
      • Try questions similar to your curated examples.
      • Verify that your question relates to data that exists in the TPC-H dataset.
      • Use Amazon Bedrock Guardrails to add configurable safeguards to questions and responses.

Clean up resources

To avoid unnecessary charges, it is recommended that you clean up any resources you no longer need:

      1. Navigate to the CloudFormation console in both the agent and agent-kb accounts, search for the stack, and choose Delete.
      2. S3 buckets must be deleted separately (see the snippet after this list).
      3. To delete the roles and policies created in both accounts, download the script delete-bedrock-agent-kb-roles-policies.sh from the aws-samples GitHub repository.
        1. Open Terminal on a Mac or a similar bash shell on other platforms.
        2. Locate and change directory to the download location, and give the script executable permissions:
        cd /my/location
        chmod +x delete-bedrock-agent-kb-roles-policies.sh

      4. If you are still not clear on the script usage or inputs, you can run the script with the --help option and it will display the usage:
        ./delete-bedrock-agent-kb-roles-policies.sh --help
      5. Run the script delete-bedrock-agent-kb-roles-policies.sh with the same values for the same input parameters as in Step 7, when you ran the create_bedrock_agent_kb_roles_policies.sh script. Note: enter the correct account numbers for --agent-account and --agent-kb-account before running.
        ./delete-bedrock-agent-kb-roles-policies.sh --agent-profile agent \
          --agent-kb-profile agent-kb \
          --lambda-role lambda_bedrock_kb_query_role \
          --kb-access-role bedrock_kb_access_role \
          --kb-access-policy bedrock_kb_access_policy \
          --lambda-policy lambda_bedrock_kb_query_policy \
          --agent-account 111122223333 \
          --agent-kb-account 999999999999

        The script will ask for confirmation; type yes and press Enter.
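For the S3 buckets mentioned in this list, you can empty and then remove each one from the CLI once you have confirmed its contents are no longer needed. BUCKET_NAME below is a placeholder for whichever bucket the workshop created:

      # Empty the bucket first, then remove it; the two explicit steps
      # make the deletion easy to review before it runs
      aws s3 rm s3://BUCKET_NAME --recursive --profile agent-kb
      aws s3 rb s3://BUCKET_NAME --profile agent-kb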

Summary

This solution demonstrates how the Amazon Bedrock agent in the agent account can query the Amazon Bedrock knowledge base in the agent-kb account.

Conclusion

This solution uses Amazon Bedrock Knowledge Bases for structured data to create a more integrated approach to cross-account data access. The knowledge base in the agent-kb account connects directly to Amazon Redshift Serverless in a private VPC. The Amazon Bedrock agent in the agent account invokes an AWS Lambda function as part of its action group to make a cross-account connection and retrieve responses from the structured knowledge base.

This architecture offers several advantages:

      • Uses Amazon Bedrock Knowledge Bases capabilities for structured data
      • Provides a more seamless integration between the agent and the data source
      • Maintains proper security boundaries between accounts
      • Reduces the complexity of direct database access code

As Amazon Bedrock continues to evolve, you can take advantage of future enhancements to knowledge base functionality while maintaining your multi-account architecture.


About the Authors

Kunal Ghosh is an expert in AWS technologies. He is passionate about building efficient and effective solutions on AWS, especially involving generative AI, analytics, data science, and machine learning. Besides family time, he likes reading, swimming, biking, and watching movies, and he is a foodie.

Arghya Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers adopt and use the AWS Cloud. He focuses on big data, data lakes, streaming and batch analytics services, and generative AI technologies.

Indranil Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers in the hi-tech and semiconductor sectors solve complex business problems using the AWS Cloud. His special interests are in the areas of legacy modernization and migration, building analytics platforms, and helping customers adopt cutting-edge technologies such as generative AI.

Vinayak Datar is a Sr. Solutions Manager based in the Bay Area, helping enterprise customers accelerate their AWS Cloud journey. He specializes in helping customers take ideas from concepts to working prototypes to production using AWS generative AI services.
