
Automate enterprise workflows by integrating Salesforce Agentforce with Amazon Bedrock Agents



AI agents are rapidly transforming enterprise operations. Although a single agent can perform specific tasks effectively, complex business processes often span multiple systems, requiring data retrieval, analysis, decision-making, and action execution across different systems. With multi-agent collaboration, specialized AI agents can work together to automate intricate workflows.

This post explores a practical collaboration, integrating Salesforce Agentforce with Amazon Bedrock Agents and Amazon Redshift, to automate enterprise workflows.

Multi-agent collaboration in Enterprise AI

Enterprise environments today are complex, featuring diverse technologies across multiple systems. Salesforce and AWS provide distinct advantages to customers. Many organizations already maintain significant infrastructure on AWS, including data, AI, and various enterprise applications such as ERP, finance, supply chain, HRMS, and workforce management systems. Agentforce delivers powerful AI-driven agent capabilities that are grounded in enterprise context and data. While Salesforce provides a rich source of trusted enterprise data, customers increasingly need agents that can access and act on information across multiple systems. By integrating AWS-powered AI services into Agentforce, organizations can orchestrate intelligent agents that operate across Salesforce and AWS, unlocking the strengths of both.

Agentforce and Amazon Bedrock Agents can work together in flexible ways, leveraging the unique strengths of both platforms to deliver smarter, more comprehensive AI workflows. Example collaboration models include:

  • Agentforce as the primary orchestrator:
    • Manages end-to-end customer-facing workflows
    • Delegates specialized tasks to Amazon Bedrock Agents as needed through custom actions
    • Coordinates access to external data and services across systems

This integration creates a more powerful solution that maximizes the benefits of both Salesforce and AWS, so you can achieve better business outcomes through enhanced AI capabilities and cross-system functionality.

Agentforce overview

Agentforce brings digital labor to every employee, department, and business process, augmenting teams and elevating customer experiences. It works seamlessly with your existing applications, data, and business logic to take meaningful action across the enterprise. And because it's built on the trusted Salesforce platform, your data stays secure, governed, and in your control. With Agentforce, you can:

  • Deploy prebuilt agents designed for specific roles, industries, or use cases
  • Enable agents to take action with existing workflows, code, and APIs
  • Connect your agents to business data securely
  • Deliver accurate and grounded results through the Atlas Reasoning Engine

Amazon Bedrock Agents and Amazon Bedrock Knowledge Bases overview

Amazon Bedrock is a fully managed AWS service offering access to high-performing foundation models (FMs) from various AI companies through a single API. In this post, we discuss the following features:

  • Amazon Bedrock Agents – Managed AI agents use FMs to understand user requests, break down complex tasks into steps, maintain conversation context, and orchestrate actions. They can interact with company systems and data sources through APIs (configured through action groups) and access information through knowledge bases. You provide instructions in natural language, select an FM, and configure data sources and tools (APIs), and Amazon Bedrock handles the orchestration.
  • Amazon Bedrock Knowledge Bases – This capability enables agents to perform Retrieval Augmented Generation (RAG) using your organization's private data sources. You connect the knowledge base to your data hosted in AWS, such as in Amazon Simple Storage Service (Amazon S3) or Amazon Redshift, and it automatically handles the vectorization and retrieval process. When asked a question or given a task, the agent can query the knowledge base to find relevant information, providing more accurate, context-aware responses and decisions without needing to retrain the underlying FM.

Agentforce and Amazon Bedrock Agent integration patterns

Agentforce can call Amazon Bedrock agents in various ways, allowing the flexibility to build different architectures. The following diagram illustrates synchronous and asynchronous patterns.

For a synchronous or request-reply interaction, Agentforce uses custom agent actions facilitated by External Services, Apex invocable methods, or Flow to call an Amazon Bedrock agent. The authentication to AWS is facilitated using named credentials. Named credentials are designed to securely manage authentication details for external services integrated with Salesforce. They alleviate the need to hardcode sensitive information like user names and passwords, minimizing the risk of exposure and potential data breaches. This separation of credentials from the application code can significantly improve security posture. Named credentials streamline integration by providing a centralized and consistent method for handling authentication, reducing complexity and potential errors. You can use Salesforce Private Connect to provide a secure private connection with AWS using AWS PrivateLink. Refer to Private Integration Between Salesforce and Amazon API Gateway for additional details.

Detailed workflow diagram showing how Agentforce agents connect to AWS Bedrock through External Services, Topics, and OpenAPI Schema integration

For asynchronous calls, Agentforce uses Salesforce Event Relay and Flow with Amazon EventBridge to call an Amazon Bedrock agent.

Architectural diagram illustrating Agentforce and AWS Multi Agent Experiences using Event Relay for asynchronous integration

In this post, we discuss the synchronous call pattern. We encourage you to explore Salesforce Event Relay with EventBridge to build event-driven agentic AI workflows. Agentforce also offers the Agent API, which makes it straightforward to call an Agentforce agent from an Amazon Bedrock agent, using EventBridge API destinations, for bi-directional agentic AI workflows.

Solution overview

To illustrate the multi-agent collaboration between Agentforce and AWS, we use the following architecture, which provides the Agentforce agent with access to Internet of Things (IoT) sensor data and handles potentially faulty sensor readings using a multi-agent approach.

Comprehensive architecture diagram illustrating Agentforce workflow from Salesforce CRM through AWS services, including Lambda, Bedrock Agent, and security controls

The example workflow consists of the following steps:

  1. Coral Cloud has equipped their rooms with smart air conditioners and temperature sensors. These IoT devices capture essential information such as room temperature and error codes and store it in Coral Cloud's AWS database in Amazon Redshift.
  2. The Agentforce agent calls an Amazon Bedrock agent through the Agent Wrapper API with questions such as "What's the temperature in room 123" to answer customer questions related to the comfort of the room. This API is implemented as an AWS Lambda function, acting as the entry point into the AWS Cloud.
  3. The Amazon Bedrock agent, upon receiving the request, needs context. It queries its configured knowledge base by generating the required SQL query.
  4. The knowledge base is linked to a Redshift database containing historical sensor data and contextual information (like the sensor's thresholds and maintenance history). It retrieves relevant information based on the agent's query and responds with an answer.
  5. With the initial data and the context from the knowledge base, the Amazon Bedrock agent uses its underlying FM and natural language instructions to determine the appropriate action. In this scenario, detecting an error prompts it to create a case when it receives faulty readings from a sensor.
  6. The action group contains the Agentforce Agent Wrapper Lambda function. The Amazon Bedrock agent securely passes the required details (like which sensor or room needs a case) to this function.
  7. The Agentforce Agent Wrapper Lambda function acts as an adapter. It translates the request from the Amazon Bedrock agent into the specific format required by the Agentforce service's API or interface.
  8. The Lambda function calls Agentforce, instructing it to create a case associated with the contact or account linked to the sensor that sent the faulty reading.
  9. Agentforce uses its internal logic (agent, topics, and actions) to create or escalate the case within Salesforce.
This workflow demonstrates how Amazon Bedrock Agents orchestrates tasks, using Amazon Bedrock Knowledge Bases for context and action groups (through Lambda) to interact with Agentforce to complete the end-to-end process.

Prerequisites

Before building this architecture, make sure you have the following:

  • AWS account – An active AWS account with permissions to use Amazon Bedrock, Lambda, Amazon Redshift, AWS Identity and Access Management (IAM), and API Gateway.
  • Amazon Bedrock access – Access to Amazon Bedrock Agents and to Anthropic's Claude 3.5 Haiku v1 enabled in your chosen AWS Region.
  • Redshift resources – An operational Redshift cluster or Amazon Redshift Serverless endpoint. The relevant tables containing sensor data (historical readings, sensor thresholds, and maintenance history) must be created and populated.
  • Agentforce system – Access to and understanding of the Agentforce system, including how to configure it. You can sign up for a developer edition with Agentforce and Data Cloud.
  • Lambda knowledge – Familiarity with creating, deploying, and managing Lambda functions (using Python).
  • IAM roles and policies – Understanding of how to create IAM roles with the necessary permissions for Amazon Bedrock Agents, Lambda functions (to call Amazon Bedrock, Amazon Redshift, and the Agentforce API), and Amazon Bedrock Knowledge Bases.

Prepare Amazon Redshift data

Make sure your data is structured and available in your Redshift instance. Note the database name, credentials, and table and column names.
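If you want to stage sample data quickly, the following is a minimal sketch that seeds a sensor readings table through the Amazon Redshift Data API with boto3. The cluster identifier, database user, and the schema, table, and column names are illustrative assumptions; adapt them to your own environment:

import boto3

redshift_data = boto3.client("redshift-data", region_name="us-east-1")

# Create a schema and table and insert one sample row (names are illustrative)
response = redshift_data.batch_execute_statement(
    ClusterIdentifier="sensor-cluster",  # replace with your cluster
    Database="dev",
    DbUser="awsuser",  # replace with your database user
    Sqls=[
        "CREATE SCHEMA IF NOT EXISTS knowledgebase;",
        """CREATE TABLE IF NOT EXISTS knowledgebase.sensor_readings (
            device_id    VARCHAR(32),
            room_number  VARCHAR(16),
            temperature  DECIMAL(5,2),
            error_code   VARCHAR(16),
            reading_time TIMESTAMP
        );""",
        "INSERT INTO knowledgebase.sensor_readings VALUES ('CITDEV003', '123', 72.5, NULL, GETDATE());",
    ],
)
print(response["Id"])  # poll describe_statement with this ID to check completion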

Create IAM roles

For this post, we create two IAM roles; a scripted alternative is shown after the second trust policy:

  • custom_AmazonBedrockExecutionRoleForAgents:
    • Attach the following AWS managed policies to the role:
      • AmazonBedrockFullAccess
      • AmazonRedshiftDataFullAccess
    • In the trust relationship, provide the following trust policy (provide your AWS account ID):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AmazonBedrockAgentBedrockFoundationModelPolicyProd",
            "Effect": "Allow",
            "Principal": {
                "Service": "bedrock.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "YOUR_ACCOUNT_ID"
                }
            }
        }
    ]
}

  • custom_AWSLambdaExecutionRole:
    • Attach the following AWS managed policies to the role:
      • AmazonBedrockFullAccess
      • AWSLambdaBasicExecutionRole
    • In the trust relationship, provide the following trust policy (provide your AWS account ID):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "lambda.amazonaws.com"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "YOUR_ACCOUNT_ID"
                }
            }
        }
    ]
}
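If you prefer to script the role setup, the following is a minimal boto3 sketch that creates both roles with the trust policies shown above and attaches the managed policies; YOUR_ACCOUNT_ID is a placeholder:

import boto3
import json

iam = boto3.client("iam")

# Role name -> (trusted service, managed policy ARNs); mirrors the console steps above
roles = {
    "custom_AmazonBedrockExecutionRoleForAgents": (
        "bedrock.amazonaws.com",
        [
            "arn:aws:iam::aws:policy/AmazonBedrockFullAccess",
            "arn:aws:iam::aws:policy/AmazonRedshiftDataFullAccess",
        ],
    ),
    "custom_AWSLambdaExecutionRole": (
        "lambda.amazonaws.com",
        [
            "arn:aws:iam::aws:policy/AmazonBedrockFullAccess",
            "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
        ],
    ),
}

for role_name, (service, policy_arns) in roles.items():
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": service},
            "Action": "sts:AssumeRole",
            "Condition": {"StringEquals": {"aws:SourceAccount": "YOUR_ACCOUNT_ID"}},
        }],
    }
    # Create the role with its trust policy, then attach the managed policies
    iam.create_role(RoleName=role_name, AssumeRolePolicyDocument=json.dumps(trust_policy))
    for arn in policy_arns:
        iam.attach_role_policy(RoleName=role_name, PolicyArn=arn)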

Create an Amazon Bedrock knowledge base

Complete the following steps to create an Amazon Bedrock knowledge base:

  1. On the Amazon Bedrock console, choose Knowledge Bases in the navigation pane.
  2. Choose Create, and then Knowledge Base with structured data store.

Knowledge base selection menu showing three database storage options

  3. On the Provide Knowledge Base details page, provide the following information:
    1. Enter a name and optional description.
    2. For Query engine, select Amazon Redshift.
    3. For IAM permissions, select Use an existing service role and choose custom_AmazonBedrockExecutionRoleForAgents.
    4. Choose Next.

Knowledge base dropdown menu with three storage type options

  4. On the Configure query engine page, provide the following information:
    1. For Query engine connection details, select Redshift provisioned and choose your cluster.
    2. For Authentication, select IAM Role.
    3. For Storage configuration, select Amazon Redshift database and choose your database from the Redshift database list.
    4. Provide table and column descriptions. The following is an example.

Configure query engine

Table names and descriptions

  5. Choose Create Knowledge Base.
  6. After you create the knowledge base, open the Redshift query editor and grant permissions for the role to access the Redshift tables by running the following queries:
CREATE USER "IAMR:custom_AmazonBedrockExecutionRoleForAgents" WITH PASSWORD DISABLE; 

GRANT SELECT ON ALL TABLES IN SCHEMA dev.knowledgebase TO "IAMR:custom_AmazonBedrockExecutionRoleForAgents"; 

GRANT USAGE ON SCHEMA dev.knowledgebase TO "IAMR:custom_AmazonBedrockExecutionRoleForAgents";

For more information, refer to Set up your query engine and permissions for creating a knowledge base with structured data store.

  7. Choose Sync to sync the query engine.

Make sure the status shows as Complete before moving to the next steps.

Status shows complete

  8. When the sync is complete, choose Test Knowledge Base.
  9. Select Retrieval and response generation: data sources and model, and choose Claude 3.5 Haiku as the model.
  10. Enter a question about your data and make sure you get a valid answer.

Test knowledge base with a question
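You can also exercise the knowledge base programmatically. The following is a minimal sketch using the boto3 retrieve_and_generate API; the knowledge base ID is a placeholder, and the model ARN assumes Claude 3.5 Haiku in us-east-1:

import boto3

bedrock_agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Placeholders: replace with your knowledge base ID and Region-appropriate model ARN
response = bedrock_agent_runtime.retrieve_and_generate(
    input={"text": "What is the latest temperature reading for device CITDEV003?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "YOUR_KB_ID",
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-5-haiku-20241022-v1:0",
        },
    },
)
print(response["output"]["text"])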

Create an Amazon Bedrock agent

Complete the following steps to create an Amazon Bedrock agent:

  1. On the Amazon Bedrock console, choose Agents in the navigation pane.
  2. Choose Create agent.
  3. On the Agent details page, provide the following information:
    1. Enter a name and optional description.
    2. For Agent resource role, select Use an existing service role and choose custom_AmazonBedrockExecutionRoleForAgents.

  4. Provide detailed instructions for your agent. The following is an example:
You are an IoT device monitoring and alerting agent.
You have access to the structured data containing reading, maintenance, and threshold data for IoT devices.
You answer questions about device readings, maintenance schedules, and thresholds.
You can also create a case via Agentforce.
When you receive comma-separated values, parse them as device_id, temperature, voltage, connectivity, and error_code.
First check if the temperature is less than the min temperature or greater than the max temperature, and if connectivity is greater than the connectivity threshold for the product associated with the device id.
If there is an error code, send information to Agentforce to create a case. The information sent to Agentforce should include device readings such as device id and error code.
It should also include the threshold values related to the product associated with the device, such as min temperature, max temperature, and connectivity.
In response to your call to Agentforce, just return the summary of the information provided with all the attributes provided.
Do not omit any information in the response. Do not include the word escalated in the agent response.

  5. Choose Save to save the agent.
  6. Add the knowledge base you created in the previous step to this agent.
  7. Provide detailed instructions about the knowledge base for the agent.
  8. Choose Save and then choose Prepare the agent.
  9. Test the agent by asking a question (in the following example, we ask about sensor readings).
  10. Choose Create alias.
  11. On the Create alias page, provide the following information:
    1. Enter an alias name and optional description.
    2. For Associate version, select Create a new version and associate it to this alias.
    3. For Select throughput, select On-demand.
    4. Choose Create alias.
  12. Note down the agent ID and alias ID, which you'll use in subsequent steps.

Bedrock agent identifier

Create a Lambda function

Complete the following steps to create a Lambda function to receive requests from Agentforce:

  1. On the Lambda console, choose Functions in the navigation pane.
  2. Choose Create function.
  3. Configure the function with the following logic to receive requests through API Gateway and call the Amazon Bedrock agent:
import boto3
import uuid
import json
import pprint
import traceback
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

bedrock_agent_runtime_client = boto3.client(
    service_name="bedrock-agent-runtime",
    region_name="REGION_NAME",  # replace with the Region name from your account
)

def lambda_handler(event, context):
    # API Gateway proxy integration passes the request payload in event['body']
    logger.info(event)
    body = event['body']
    input_text = json.loads(body)['inputText']
    agent_id = 'XXXXXXXX'  # replace with the agent ID from your account
    agent_alias_id = 'XXXXXXX'  # replace with the alias ID from your account
    response = call_agent(input_text, agent_id, agent_alias_id)
    logger.info("Agent response: %s", response)

    return {
        'headers': {
            'Content-Type': 'application/json',
            'Access-Control-Allow-Headers': '*',
            'Access-Control-Allow-Origin': '*',
            'Access-Control-Allow-Methods': '*'
        },
        'statusCode': 200,
        'body': json.dumps({"outputText": response})
    }

def call_agent(input_text, agent_id, agent_alias_id):
    session_id = str(uuid.uuid1())  # random session identifier
    try:
        agent_response = bedrock_agent_runtime_client.invoke_agent(
            inputText=input_text,
            agentId=agent_id,
            agentAliasId=agent_alias_id,
            sessionId=session_id,
            enableTrace=False,
            endSession=False
        )
        logger.info("Agent raw response:")
        pprint.pprint(agent_response)
        if 'completion' not in agent_response:
            raise ValueError("Missing 'completion' in agent response")
        # The completion is an event stream; concatenate the returned text chunks
        answer = ""
        for event in agent_response['completion']:
            chunk = event.get('chunk')
            if chunk:
                answer += chunk.get("bytes").decode()
        return answer
    except Exception as e:
        logger.error(traceback.format_exc())
        return f"Error: {str(e)}"

  4. Assign the required IAM permissions by attaching custom_AWSLambdaExecutionRole as the function's execution role.
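Before wiring up API Gateway, you can sanity-check the handler locally with a minimal stand-in for the proxy integration event. This is a sketch under the assumption that the only field the handler reads is body:

import json

# Minimal stand-in for an API Gateway proxy event (illustrative only)
test_event = {
    "body": json.dumps({"inputText": "What is the temperature in room 123?"})
}

result = lambda_handler(test_event, None)
print(result["statusCode"], result["body"])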

Create a REST API

Complete the following steps to create a REST API in API Gateway:

  1. On the API Gateway console, create a REST API with proxy integration.

REST API for proxy integration with AWS Lambda

  2. Enable API key required to protect the API from unauthenticated access.

Enable API key required

  3. Configure the usage plan and API key. For more details, see Set up API keys for REST APIs in API Gateway.
  4. Deploy the API.
  5. Note down the invoke URL to use in subsequent steps.

API Gateway invoke URL
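At this point, you can test the full path from the public endpoint through Lambda to the Amazon Bedrock agent. The following is a minimal sketch using the Python requests library; the invoke URL, resource path (/proxy), and API key are placeholders for the values from your deployment:

import requests

# Placeholders: replace with your invoke URL and API key
url = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/proxy"
headers = {
    "x-api-key": "YOUR_API_KEY",
    "Content-Type": "application/json",
}

resp = requests.post(url, json={"inputText": "What is the temperature in room 123?"}, headers=headers)
resp.raise_for_status()
print(resp.json()["outputText"])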

Create named credentials in Salesforce

Now that you've created an Amazon Bedrock agent with an API Gateway endpoint and Lambda wrapper, let's complete the configuration on the Salesforce side. Complete the following steps:

  1. Log in to Salesforce.
  2. Navigate to Setup, Security, Named Credentials.
  3. On the External Credentials tab, choose New.

Named credentials configuration

  4. Provide the following information:
    1. Enter a label and name.
    2. For Authentication Protocol, choose Custom.
    3. Choose Save.

External credentials configuration

  5. Open the External Credentials entry to provide additional details:
    1. Under Principals, create a new principal and provide the parameter name and value.

External credentials principal

    2. Under Custom Headers, create a new entry and provide a name and value.
    3. Choose Save.

Custom header external credentials

Now you can grant the agent user access to these credentials.

  6. Navigate to Setup, Users, User Profile, Enabled External Credential Principal Access, and add the external credential principal you created to the allow list.

Add permissions to user profile

  7. Choose New to create a named credentials entry.
  8. Provide details such as the label, name, the URL of the API Gateway endpoint, and the authentication protocol, then choose Save.

External service connect to named credential

You can optionally use Salesforce Private Connect with PrivateLink to provide a secure private connection to AWS. This allows critical data to flow from the Salesforce environment to AWS without using the public internet.

Add an external service in Salesforce

Complete the following steps to add an external service in Salesforce:

  1. In Salesforce, navigate to Setup, Integrations, External Services and choose Add an External Service.
  2. For Select an API source, choose From API Specification.

Add external service

  3. On the Edit an External Service page, provide the following information:
    1. Enter a name and optional description.
    2. For Service Schema, choose Add from local.
    3. For Select a Named Credential, choose the named credential you created.

Add named credential to external service

  4. Add an OpenAPI specification for the API Gateway endpoint. See the following example:
openapi: 3.0.0
info:
  title: Bedrock Agent Wrapper API
  version: 1.0.0
  description: Bedrock Agent Wrapper API
paths:
  /proxy:
    post:
      operationId: call-bedrock-agent
      summary: Call Bedrock Agent
      description: Call Bedrock Agent
      requestBody:
        description: input
        required: true
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/input'
      responses:
        '200':
          description: Successful response
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/output'
        '500':
          description: Server error
components:
  schemas:
    input:
      type: object
      properties:
        inputText:
          type: string
        agentId:
          type: string
        agentAlias:
          type: string
    output:
      type: object
      properties:
        outputText:
          type: string
  5. Choose Save and Next.
  6. Enable the operation to make it available for Agentforce to invoke.
  7. Choose Finish.

Create an Agentforce agent action to use the external service

Complete the following steps to create an Agentforce agent action:

  1. In Salesforce, navigate to Setup, Agentforce, Einstein Generative AI, Agentforce Studio, Agentforce Assets.
  2. On the Actions tab, choose New Agent Action.
  3. Under Connect to an existing action, provide the following information:
    1. For Reference Action Type, choose API.
    2. For Reference Action Category, choose External Services.
    3. For Reference Action, choose the Call Bedrock Agent action that you configured.
    4. Enter an agent action label and API name.
    5. Choose Next.

New agent action

  4. Provide the following information to complete the agent action configuration:
    1. For Agent Action Instructions, enter Call Bedrock Agent to get information about device readings, sensor readings, maintenance, or threshold information.
    2. For Loading Text, enter Calling Bedrock Agent.
    3. Under Input, for Body, enter Provide the input in the input Text field.
    4. Under Outputs, for 200, enter Successful response.

Configure agent action

  5. Save the agent action.

Configure the Agentforce agent to use the agent action

Complete the following steps to configure the Agentforce agent to use the agent action:

  1. In Salesforce, navigate to Setup, Agentforce, Einstein Generative AI, Agentforce Studio, Agentforce Agents and open the agent in Agent Builder.
  2. Create a new topic.
  3. On the Topic Configuration tab, provide the following information:
    1. For Name, enter Device Information.
    2. For Classification Description, enter This topic handles inquiries related to device and sensor information, including readings, maintenance, and thresholds.
    3. For Scope, enter Your job is only to provide information about device readings, sensor readings, device maintenance, sensor maintenance, and thresholds. Don't attempt to address issues outside of providing device information.
    4. For Instructions, enter the following:

If a user asks for device readings or sensor readings, provide the information.
If a user asks for device maintenance or sensor maintenance, provide the information.
When searching for device information, include the device or sensor id and any relevant keywords in your search query.

  4. On the This Topic's Actions tab, choose New and Add from Asset Library.
  5. Choose the Call Bedrock Agent action.
  6. Activate the agent and enter a question, such as "What's the latest reading for the sensor with device id CITDEV003."

The agent will indicate that it's calling the Amazon Bedrock agent, as shown in the following screenshot.

The agent will fetch the information using the Amazon Bedrock agent from the associated knowledge base.

Clean up

To avoid additional costs, delete the resources that you created when you no longer need them:

  1. Delete the Amazon Bedrock knowledge base:
    1. On the Amazon Bedrock console, choose Knowledge Bases in the navigation pane.
    2. Select the knowledge base you created and choose Delete.
  2. Delete the Amazon Bedrock agent:
    1. On the Amazon Bedrock console, choose Agents in the navigation pane.
    2. Select the agent you created and choose Delete.
  3. Delete the Lambda function:
    1. On the Lambda console, choose Functions in the navigation pane.
    2. Select the function you created and choose Delete.
  4. Delete the REST API:
    1. On the API Gateway console, choose APIs in the navigation pane.
    2. Select the REST API you created and choose Delete.

Conclusion

In this post, we described an architecture that demonstrates the power of combining AI services on AWS with Agentforce. By using Amazon Bedrock Agents and Amazon Bedrock Knowledge Bases for contextual understanding through RAG, and Lambda functions and API Gateway to bridge interactions with Agentforce, businesses can build sophisticated, automated workflows. As AI capabilities continue to grow, such collaborative multi-agent systems will become increasingly central to enterprise automation strategies. In an upcoming post, we'll show you how to build the asynchronous integration pattern from Agentforce to Amazon Bedrock using Salesforce Event Relay.

To get started, see Become an Agentblazer Innovator and refer to How Amazon Bedrock Agents works.


About the authors

Yogesh Dhimate is a Sr. Partner Solutions Architect at AWS, leading the technology partnership with Salesforce. Prior to joining AWS, Yogesh worked with leading companies, including Salesforce, driving their industry solution initiatives. With over 20 years of experience in product management and solutions architecture, Yogesh brings a unique perspective in cloud computing and artificial intelligence.

Kranthi Pullagurla has over 20 years' experience across application integration and cloud migrations on multiple cloud providers. He works with AWS Partners to build solutions on AWS that our joint customers can use. Prior to joining AWS, Kranthi was a strategic advisor at MuleSoft (now Salesforce). Kranthi has experience advising C-level customer executives on their digital transformation journey in the cloud.

Shitij Agarwal is a Partner Solutions Architect at AWS. He creates joint solutions with strategic ISV partners to deliver value to customers. When not at work, he is busy exploring New York City and the hiking trails that surround it, and going on bike rides.

Ross Belmont is a Senior Director of Product Management at Salesforce covering Platform Data Services. He has more than 15 years of experience with the Salesforce ecosystem.

Sharda Rao is a Senior Director of Product Management at Salesforce covering Agentforce go-to-market strategy.

Hunter Reh is an AI Architect at Salesforce and a passionate builder who has developed over 100 agents since the launch of Agentforce. Outside of work, he enjoys exploring new trails on his bike or getting lost in a great book.
