This post is co-authored by Daryl Martis and Darvish Shadravan from Salesforce.
This is the fourth post in a series discussing the integration of Salesforce Data Cloud and Amazon SageMaker.
In Part 1 and Part 2, we show how the Salesforce Data Cloud and Einstein Studio integration with SageMaker allows businesses to access their Salesforce data securely using SageMaker's tools to build, train, and deploy models to endpoints hosted on SageMaker. SageMaker endpoints can be registered with Salesforce Data Cloud to activate predictions in Salesforce. In Part 3, we demonstrate how business analysts and citizen data scientists can create machine learning (ML) models, without code, in Amazon SageMaker Canvas and deploy trained models for integration with Salesforce Einstein Studio to create powerful business applications.
In this post, we show how native integrations between Salesforce and Amazon Web Services (AWS) enable you to Bring Your Own Large Language Models (BYO LLMs) from your AWS account to power generative artificial intelligence (AI) applications in Salesforce. Requests and responses between Salesforce and Amazon Bedrock pass through the Einstein Trust Layer, which promotes responsible AI use across Salesforce.
We demonstrate the BYO LLM integration by using Anthropic's Claude model on Amazon Bedrock to summarize a list of open service cases and opportunities on an account record page, as shown in the following figure.
Partner quote
"We continue to expand on our strong collaboration with AWS with our BYO LLM integration with Amazon Bedrock, empowering our customers with more model choices and allowing them to create AI-powered solutions and Copilots customized for their specific business needs. Our open and flexible AI environment, grounded with customer data, positions us well to be leaders in AI-driven solutions in the CRM space."
–Kaushal Kurapati, Senior Vice President of Product for AI at Salesforce
Amazon Bedrock
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can quickly experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Because Amazon Bedrock is serverless, you don't have to manage infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with.
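To make the single-API point concrete, the following is a minimal sketch (ours, not part of the original post) that uses the AWS SDK for Python (boto3) to list the foundation models available to your account; the Region is an illustrative assumption, and model availability varies by Region.

```python
import boto3

# Bedrock control-plane client; the Region is an assumption for illustration
bedrock = boto3.client("bedrock", region_name="us-east-1")

# List the foundation models your account can see in this Region
response = bedrock.list_foundation_models()
for model in response["modelSummaries"]:
    print(f'{model["providerName"]}: {model["modelId"]}')
```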
Salesforce Data Cloud and Einstein Model Builder
Salesforce Data Cloud is a data platform that unifies your company's data, giving every team a 360-degree view of the customer to drive automation and analytics, personalize engagement, and power trusted AI. Data Cloud creates a holistic customer view by turning volumes of disconnected data into a single, trusted model that's simple to access and understand. With data harmonized within Salesforce Data Cloud, customers can put their data to work to build predictions and generative AI-powered business processes across sales, support, and marketing.
With Einstein Model Builder, customers can build their own models using Salesforce's low-code model builder experience or integrate their own custom-built models into the Salesforce platform. Einstein Model Builder's BYO LLM experience provides the capability to register custom generative AI models from external environments such as Amazon Bedrock and Salesforce Data Cloud.
After custom Amazon Bedrock models are registered in Einstein Model Builder, the models are connected through the Einstein Trust Layer, a robust set of features and guardrails that protect the privacy and security of data, improve the safety and accuracy of AI results, and promote the responsible use of AI across Salesforce. Registered models can then be used in Prompt Builder, a newly launched, low-code prompt engineering tool that allows Salesforce admins to build, test, and fine-tune trusted AI prompts that can be used across the Salesforce platform. These prompts can be integrated with Salesforce capabilities such as Flows, Invocable Actions, and Apex.
Solution overview
With the Salesforce Einstein Model Builder BYO LLM feature, you can invoke Amazon Bedrock models in your AWS account. At the time of this writing, Salesforce supports Anthropic Claude 3 models on Amazon Bedrock for BYO LLM. For this post, we use the Anthropic Claude 3 Sonnet model. To learn more about inference with Claude 3, refer to Anthropic Claude models in the Amazon Bedrock documentation.
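As a reference point for the kind of call that is made against Amazon Bedrock, here is a minimal sketch under our own assumptions (the Region, model ID, prompt text, and parameter values are illustrative) of invoking Claude 3 Sonnet with the Anthropic Messages API request format:

```python
import json
import boto3

# Bedrock runtime client; Region and model ID are illustrative assumptions
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")
model_id = "anthropic.claude-3-sonnet-20240229-v1:0"

# Claude 3 models on Amazon Bedrock use the Anthropic Messages API request body
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "temperature": 0.2,  # lower values make the output more deterministic
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Summarize the open service cases for this account: ..."}
            ],
        }
    ],
}

response = bedrock_runtime.invoke_model(
    modelId=model_id,
    contentType="application/json",
    accept="application/json",
    body=json.dumps(body),
)

# The response body contains a list of content blocks; print the generated text
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```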
For your implementation, you can use the model of your choice. Refer to Bring Your Own Large Language Model in Einstein 1 Studio for the models supported with Salesforce Einstein Model Builder.
The following image shows a high-level architecture of how you can integrate the LLM from your AWS account into Salesforce Prompt Builder.
In this post, we show how to build generative AI-powered Salesforce applications with Amazon Bedrock. The following are the high-level steps involved:
- Grant Amazon Bedrock invoke model permission to an AWS Identity and Access Management (IAM) user
- Register the Amazon Bedrock model in Salesforce Einstein Model Builder
- Integrate the prompt template with the field in the Lightning App Builder
Prerequisites
Before deploying this solution, make sure you meet the following prerequisites:
- Have access to Salesforce Data Cloud and meet the requirements for using BYO LLM.
- Have Amazon Bedrock set up. If this is the first time you are accessing Anthropic Claude models on Amazon Bedrock, you need to request access. You must have sufficient permissions to request access to models through the console. To request model access, sign in to the Amazon Bedrock console and choose Model access at the bottom of the left navigation pane.
Solution walkthrough
To build generative AI-powered Salesforce applications with Amazon Bedrock, implement the following steps.
Grant Amazon Bedrock invoke model permission to an IAM user
Salesforce Einstein Studio requires an access key and a secret to access the Amazon Bedrock API. Follow the instructions to set up an IAM user and access keys. The IAM user must have Amazon Bedrock invoke model permission to access the model. Complete the following steps (a scripted alternative using the AWS SDK for Python is sketched after these steps):
- On the IAM console, choose Users in the navigation pane. On the right side of the console, choose Add permissions and Create inline policy.
- On the Specify permissions screen, in the Service dropdown menu, select Bedrock.
- Under Actions allowed, enter invoke. Under Read, select InvokeModel. Select All under Resources. Choose Next.
- On the Review and create screen, under Policy name, enter BedrockInvokeModelPolicy. Choose Create policy.
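If you prefer to script this instead of using the console, the following sketch (our own alternative, not part of the original walkthrough) attaches an equivalent inline policy with boto3 and creates the access key pair that Einstein Studio asks for. The user name is a placeholder, and scoping Resource to specific model ARNs is a tighter alternative to the All resources choice above.

```python
import json
import boto3

iam = boto3.client("iam")

# Placeholder user name -- replace with the IAM user Salesforce will use
user_name = "einstein-studio-byo-llm"

# Equivalent of the console steps: allow bedrock:InvokeModel on all resources
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "bedrock:InvokeModel",
            "Resource": "*",
        }
    ],
}

iam.put_user_policy(
    UserName=user_name,
    PolicyName="BedrockInvokeModelPolicy",
    PolicyDocument=json.dumps(policy_document),
)

# Create the access key pair that Einstein Studio asks for during model registration
access_key = iam.create_access_key(UserName=user_name)["AccessKey"]
print(access_key["AccessKeyId"])  # store the SecretAccessKey securely; it is shown only once
```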
Register the Amazon Bedrock model in Einstein Model Builder
- On the Salesforce Data Cloud console, on the Einstein Studio tab, choose Add Foundation Model.
- Choose Connect to Amazon Bedrock.
- For Endpoint information, enter the endpoint name, your AWS account access key, and your secret key. Enter the Region and model information. Choose Connect.
- Now, create the configuration for the model endpoint you created in the previous steps. Provide inference parameters such as temperature to control how deterministic the LLM's output is. Enter a sample prompt to verify the response.
- Next, save this new model configuration. Enter a name for the saved LLM model and choose Create Model.
- After the model creation is successful, choose Close and proceed to create the prompt template.
- Select the model name to open the model configuration.
- Select Create Prompt Template to launch the prompt builder.
- Select Field Generation as the prompt template type, enter a template name, set Object to Account, and set Object Field to PB Case and Oppty Summary. This associates the template with a custom field on the account record object to summarize the cases.
For this demo, a rich text field named PB Case and Oppty Summary was created and added to the Salesforce Account page layout according to the Add a Field Generation Prompt Template to a Lightning Record Page instructions.
- Provide the prompt and the input variables or objects for data grounding, and select the model (an illustrative template is sketched after this step). Refer to Prompt Builder to learn more.
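For illustration only, a Field Generation prompt template for this use case might look like the following. The merge-field and related-list names shown here are assumptions; in practice you insert grounding resources with the resource picker in Prompt Builder rather than typing them by hand.

```text
You are a helpful assistant for account teams.
Summarize the open service cases and open opportunities for the account
{!$Input:Account.Name} in three short bullet points, highlighting anything
that needs immediate attention.

Open cases: {!$RelatedList:Account.Cases.Records}
Open opportunities: {!$RelatedList:Account.Opportunities.Records}
```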
Integrate the prompt template with the field in the Lightning App Builder
- On the Salesforce console, use the search bar to find Lightning App Builder. Build or edit an existing page to integrate the prompt template with the field, as shown in the following screenshot. Refer to Add a Field Generation Prompt Template to a Lightning Record Page for detailed instructions.
- Navigate to the Account page and click the PB Case and Oppty Summary field enabled for chat completion to launch the Einstein generative AI assistant and summarize the account case data.
Cleanup
Complete the following steps to clean up your resources.
Amazon Bedrock offers on-demand inference pricing, so there are no additional charges for a continued model subscription. To remove model access, refer to the steps in Remove model access.
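If you scripted the IAM setup earlier, a matching cleanup sketch (same placeholder user name and assumptions as before) removes the access keys and the inline policy so the credentials registered in Einstein Studio are no longer valid:

```python
import boto3

iam = boto3.client("iam")
user_name = "einstein-studio-byo-llm"  # placeholder from the earlier sketch

# Delete the access keys that were registered in Einstein Studio
for key in iam.list_access_keys(UserName=user_name)["AccessKeyMetadata"]:
    iam.delete_access_key(UserName=user_name, AccessKeyId=key["AccessKeyId"])

# Remove the inline invoke-model policy from the user
iam.delete_user_policy(UserName=user_name, PolicyName="BedrockInvokeModelPolicy")
```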
Conclusion
In this post, we demonstrated how to use your own LLM in Amazon Bedrock to power Salesforce applications. We used summarization of open service cases on an account object as an example to showcase the implementation steps.
Amazon Bedrock is a fully managed service that makes high-performing FMs from leading AI companies and Amazon available for your use through a unified API. You can choose from a wide range of FMs to find the model that is best suited for your use case.
Salesforce Einstein Model Builder allows you to register your Amazon Bedrock model and use it in Prompt Builder to create prompts grounded in your data. These prompts can then be integrated with Salesforce capabilities such as Flows, Invocable Actions, and Apex. You can then build custom generative AI applications with Claude 3 that are grounded in the Salesforce user experience. Amazon Bedrock requests from Salesforce flow through the Einstein Trust Layer, which supports responsible AI use with features such as dynamic grounding, zero data retention, and toxicity detection while maintaining safety and security standards.
AWS and Salesforce are excited for our mutual customers to harness this integration and build generative AI-powered applications. To learn more and start building, refer to the following resources.
About the Authors
Daryl Martis is the Director of Product for Einstein Studio at Salesforce Data Cloud. He has over 10 years of experience in planning, building, launching, and managing world-class solutions for enterprise customers, including AI/ML and cloud solutions. He previously worked in the financial services industry in New York City. Follow him on LinkedIn.
Darvish Shadravan is a Director of Product Management in the AI Cloud at Salesforce. He focuses on building AI/ML solutions for CRM, and is the product owner for the Bring Your Own LLM feature. You can connect with him on LinkedIn.
Rachna Chadha is a Principal Solutions Architect AI/ML in Strategic Accounts at AWS. Rachna is an optimist who believes that the ethical and responsible use of AI can improve society in the future and bring economic and social prosperity. In her spare time, Rachna likes spending time with her family, hiking, and listening to music.
Ravi Bhattiprolu is a Sr. Partner Solutions Architect at AWS. Ravi works with strategic partners Salesforce and Tableau to deliver innovative and well-architected products and solutions that help joint customers achieve their business objectives.
Ife Stewart is a Principal Solutions Architect in the Strategic ISV segment at AWS. She has been engaged with Salesforce Data Cloud over the last 2 years to help build integrated customer experiences across Salesforce and AWS. Ife has over 10 years of experience in technology. She is an advocate for diversity and inclusion in the technology field.
Mike Patterson is a Senior Customer Solutions Manager in the Strategic ISV segment at AWS. He has partnered with Salesforce Data Cloud to align business objectives with innovative AWS solutions to achieve impactful customer experiences. In Mike's spare time, he enjoys spending time with his family, sports, and outdoor activities.
Dharmendra Kumar Rai (DK Rai) is a Sr. Data Architect, Data Lake & AI/ML, serving strategic customers. He works closely with customers to understand how AWS can help them solve problems, especially in the AI/ML and analytics space. DK has many years of experience in building data-intensive solutions across a range of industry verticals, including high-tech, FinTech, insurance, and consumer-facing applications.