Agentforce and Einstein Generative AI
          Batch Models

          Use Prompt Template Batch Processing to generate large quantities of responses for prompt templates asynchronously.

For more information on batch processing, see Flow Core Action: Prompt Template Actions in Salesforce Help and Prompt Template Batch Processing in the Agentforce Developer Guide.

          Native Batch Supported Models

Certain Salesforce-managed models support batch processing natively through the model provider. These models support native batch processing:

          • GPT 4.1
          • GPT 4.1 Mini
          • GPT 4 Omni
          • GPT 4 Omni Mini
          • GPT 5
          • GPT 5 Mini
          • OpenAI GPT 4 Omni Mini
          • Bedrock Claude Haiku 4.5
          • Bedrock Claude Sonnet 4.5

          Native Batch Limits

When using models that support native batch processing, the limit is 50,000 items per day. Via Apex, each job is limited to 10,000 items, and you can start at most 5 job runs in 24 hours. For more information on processing batch requests via Apex, see Prompt Template Batch Processing.
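The three limits above fit together: 5 job runs of 10,000 items each equals the 50,000-item daily maximum. As a rough illustration (not a Salesforce API — `plan_jobs` and the constant names are hypothetical), a sketch of splitting a workload into jobs that respect those documented limits:

```python
# Illustrative sketch only: these constants mirror the documented limits,
# and plan_jobs is a hypothetical helper, not part of any Salesforce SDK.
MAX_ITEMS_PER_JOB = 10_000   # per-job limit when submitting via Apex
MAX_JOBS_PER_DAY = 5         # job runs allowed per 24 hours
MAX_ITEMS_PER_DAY = 50_000   # daily item limit for native batch models

def plan_jobs(total_items: int) -> list[int]:
    """Split a workload into per-job item counts, or raise if a limit is exceeded."""
    if total_items > MAX_ITEMS_PER_DAY:
        raise ValueError(f"{total_items} items exceeds the {MAX_ITEMS_PER_DAY}/day limit")
    jobs = []
    remaining = total_items
    while remaining > 0:
        size = min(remaining, MAX_ITEMS_PER_JOB)
        jobs.append(size)
        remaining -= size
    if len(jobs) > MAX_JOBS_PER_DAY:
        raise ValueError(f"{len(jobs)} jobs exceeds the {MAX_JOBS_PER_DAY}/24h limit")
    return jobs

print(plan_jobs(23_500))  # → [10000, 10000, 3500]
```

A workload of exactly 50,000 items fills all 5 daily job runs at once, so anything larger must wait for the next 24-hour window.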

          Batch Model Routing Behavior

Important: Batch processing doesn't follow geo-aware routing rules and doesn't provide in-region enforcement.

Routing behavior differs depending on the model provider used for native batch processing.

          Azure Batch Model Routing Behavior

          Azure native model provider batch processing uses global model endpoints. For these models, batch requests may be processed in regions outside your org's region, even if non-batch LLM requests are restricted to in-region Azure endpoints.

          For more information on Azure's global model endpoints, see Models by deployment type.

          Bedrock Batch Model Routing Behavior

Native provider batch requests to Anthropic models on Bedrock may route to regions outside of your org's region. Batch requests from orgs outside of the United States, Australia, Sweden, Germany, Italy, Ireland, or Japan may be processed in another region.

          Salesforce Help | Article