          Add a Foundation Model

          Adding a foundation model in AI Models (formerly Einstein Studio) establishes an endpoint connection between the external model provider and Salesforce. After you connect the model, create a new model configuration and evaluate your model’s response to custom prompts.

          Bring Your Own LLM (BYOLLM) supports Amazon Bedrock, Azure OpenAI, OpenAI, and Vertex AI from Google as foundation model providers. To connect the Einstein AI Platform to any language model hosted on a major cloud platform or hosted by you, use the Connect to your LLM option, which relies on the LLM Open Connector.
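The LLM Open Connector is modeled on the OpenAI chat completions API, so an endpoint behind it is expected to accept a request body of roughly this shape. Here is a minimal sketch in Python; the model name and prompt are illustrative placeholders, not values from this article:

```python
import json

# Minimal sketch of an OpenAI-style chat completions request body.
# "my-custom-model" and the prompt text are hypothetical placeholders.
payload = {
    "model": "my-custom-model",
    "messages": [
        {"role": "user", "content": "Summarize the customer's last case."}
    ],
    "temperature": 0.7,
}

# A connector client would send this as JSON to the endpoint's
# /chat/completions path.
body = json.dumps(payload)
```

The exact set of supported parameters depends on your model provider; consult the LLM Open Connector specification for the fields your endpoint must honor.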

          To add a foundation model, Einstein Generative AI must be enabled in your org. For more information, see Set Up Einstein Generative AI in Salesforce Help.

          To connect to a model endpoint, you need its URL and authentication information. Find these details on the provider’s dashboard.

          For the LLM Open Connector, the endpoint URL must end with /chat/completions. If the URL path doesn’t, the Open Connector automatically appends /chat/completions to it.
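The appending rule can be sketched as a small helper. This is only an illustration of the documented behavior (append /chat/completions when the path lacks it), not Salesforce’s actual implementation:

```python
from urllib.parse import urlparse, urlunparse

def normalize_endpoint(url: str) -> str:
    """Append /chat/completions to the URL path if it's missing.

    Illustrative sketch of the rule described above, not the
    Open Connector's internal logic.
    """
    parts = urlparse(url)
    path = parts.path.rstrip("/")
    if not path.endswith("/chat/completions"):
        path += "/chat/completions"
    return urlunparse(parts._replace(path=path))

print(normalize_endpoint("https://llm.example.com/v1"))
# https://llm.example.com/v1/chat/completions
```

A URL that already ends with /chat/completions passes through unchanged, so the helper is safe to apply unconditionally.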

          1. In AI Models, go to the Generative tab.
          2. Click Add Foundation Model.
            (Image: AI Models (formerly Einstein Studio) Model Builder displaying the option to add a foundation model.)
          3. Select the type of foundation model that you want to connect, and click Next.
            (Image: Model Builder displaying foundation models that you can add.)
          4. Connect to the model endpoint.
            1. Enter the endpoint name and URL.
              Note: To connect to a remote model endpoint, the standard HTTPS port 443 is required.
            2. Enter your authentication details.
              If you’re bringing your own Azure OpenAI model, you can select OAuth and choose a named credential. To learn more, see Salesforce Help: Create Named Credentials and External Credentials.
            3. Enter the model information.
              If you’re connecting an Azure OpenAI model, enter the Azure deployment. You can find this information on the Azure OpenAI dashboard in Deployments. If you’re connecting an OpenAI fine-tuned model, select Yes when prompted during setup.
              For details about the supported models, see Large Language Model Support.
            4. Click Save & Test.
            5. Enter the exact name of the model that you want to connect.
            6. Click Connect.
            7. Enter a name for your model, and then click Name and Connect.
            8. Select the model version.

              You may see references to “recitation” in your model’s generated text response to a prompt. For more details, see Google Gemini API Reference.

            9. Select the model type.
              For Amazon Bedrock, use Anthropic. For a list of models, see Large Language Model Support.
          5. To use your foundation model, go to the Foundation Model tab and configure it.