          Troubleshoot APEX_CPU_TIME_LIMIT_EXCEEDED Errors in Flows

          When a transaction consumes too much CPU time, Salesforce throws an APEX_CPU_TIME_LIMIT_EXCEEDED error. Flows share this limit with every other automation in the same transaction, including Apex triggers.

          Required Editions

          View supported editions.
          User Permissions Needed
          To open, edit, create, activate, or deactivate a flow using all flow types, elements, and features available in Flow Builder, including Einstein and Agentforce for Flow: Manage Flow
          To view Setup and access debug logs: View Setup and Configuration
          To view, retain, and delete debug logs, and set trace flags: View All Data

          Salesforce enforces a single CPU time limit of 10,000 milliseconds (10 seconds) per synchronous transaction. Every piece of automation in that transaction, such as Apex triggers, flows, workflow rules, and processes, draws from the same budget. If other automation consumes most of the budget first, even an optimized flow can fail. Execution order depends on flow type. For the full order, see Triggers and Order of Execution.

          • Before-save flows run earlier in the order than after-save flows.
          • Apex triggers run before or after flows depending on their trigger type.
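
          To see how much of the shared budget other automation has already consumed, an Apex trigger in the same transaction can log used versus allotted CPU time. This is a minimal sketch, not part of this article's steps; the trigger name and object are hypothetical.

          trigger AccountCpuCheck on Account (before update) {
              // Limits.getCpuTime() returns the CPU milliseconds used so far in this transaction.
              // Limits.getLimitCpuTime() returns the transaction's total CPU allowance.
              System.debug('CPU time used: ' + Limits.getCpuTime()
                  + ' ms of ' + Limits.getLimitCpuTime() + ' ms');
          }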

          To troubleshoot and optimize flows that hit CPU time limits:

          • Understand the common patterns that cause CPU limit errors
          • Identify which elements consume the most CPU time
          • Apply optimization techniques to reduce CPU consumption
          • Implement prevention guidelines to avoid CPU limit errors

          Problems, Solutions, and Prevention Techniques for CPU Limit Errors

          Identify common CPU limit problems, apply solutions, and follow prevention techniques to avoid APEX_CPU_TIME_LIMIT_EXCEEDED errors.

          This table provides a reference for troubleshooting CPU limit errors. Each row describes a common problem, the solution to fix it, and techniques to prevent it in future flows. Start by determining whether any of these problems apply to your flow. If none apply, another automation in the same transaction—such as an Apex trigger, another flow, or a workflow rule—draws from the same CPU budget and can be the culprit. To identify what's consuming CPU across the entire transaction, review the Apex debug logs. For more details, see Debug Logs.

          Data Manipulation Language (DML) Operations Inside Loops

          Performing Create Records, Update Records, or Delete Records operations inside a loop path consumes CPU time with each iteration. Processing multiple records one at a time can quickly exhaust the limit.

          Example: A flow loops through 100 opportunities and uses a Create Records element inside the loop to create a task for each opportunity, resulting in 100 separate DML operations.

          Use Collection-Based Data Manipulation Language (DML) Operations

          Inside the loop, use Assignment elements to add records to a record collection variable. After the loop completes, use a single Create Records, Update Records, or Delete Records element to process the entire collection at once. This approach is called bulkification.

          Example: Loop through opportunities and use an Assignment element to build a task for each opportunity. Then use another Assignment element to add each task to a collection variable. After the loop, use a Create Records element to create all tasks at once.

          For more information, see Flow Bulkification in Transactions.

          Never place DML operations inside loops. Always design flows to collect records in a collection variable during the loop, then perform DML after the loop completes.
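
          The same bulkification principle, expressed in Apex for readers who also maintain triggers: build the records in a list inside the loop, then perform one DML statement after it. This is a minimal sketch; the object, fields, and filter are illustrative.

          List<Task> tasksToCreate = new List<Task>();
          for (Opportunity opp : [SELECT Id, OwnerId FROM Opportunity WHERE IsClosed = false]) {
              // Build each task in memory (the Assignment elements in the flow example)
              tasksToCreate.add(new Task(
                  WhatId = opp.Id,
                  OwnerId = opp.OwnerId,
                  Subject = 'Follow up'
              ));
          }
          // One DML operation after the loop (the single Create Records element)
          insert tasksToCreate;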

          Multiple Queries Inside Loops

          Get Records elements inside a loop consume significant CPU time, especially when querying large objects or using complex filters.

          Example: A flow loops through accounts and uses Get Records inside the loop to get related contacts for each account, resulting in one query per account.

          Query Data Before Loops

          Before the loop, use a single Get Records element with the get related records functionality to retrieve all necessary data with appropriate filters. During the loop, work with the collected data instead of querying on each iteration. If a single Get Records element can't retrieve everything, use one Get Records element to get the primary records and another Get Records element to get the secondary records. Filter the secondary records using a record field, the In operator, and the first Get Records collection. For example, Account ID > In > Accounts from Get Accounts.

          Example: First get all related contacts with a Get Records element that filters by the account IDs, then reference the contacts collection during the loop.

          Avoid queries inside loops. Get all necessary data before entering the loop.
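
          The equivalent pattern in Apex queries the related records once, before the loop, and reads them from a map inside it. This is a minimal sketch; the objects, fields, and filter are illustrative.

          List<Account> accounts = [SELECT Id FROM Account WHERE Industry = 'Technology'];

          // One query for all related contacts, filtered by the account IDs (the In operator pattern)
          Map<Id, List<Contact>> contactsByAccount = new Map<Id, List<Contact>>();
          for (Contact c : [SELECT Id, AccountId FROM Contact WHERE AccountId IN :accounts]) {
              if (!contactsByAccount.containsKey(c.AccountId)) {
                  contactsByAccount.put(c.AccountId, new List<Contact>());
              }
              contactsByAccount.get(c.AccountId).add(c);
          }

          // Inside the loop, use the data already in memory instead of querying again
          for (Account acct : accounts) {
              List<Contact> relatedContacts = contactsByAccount.get(acct.Id);
              // ... process relatedContacts for this account
          }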

          Complex Formulas in Loops

          Assignment elements that run complex formula calculations on each loop iteration accumulate CPU time, particularly with string manipulation, date calculations, or nested functions.

          Example: A flow loops through 500 records, and each iteration performs multiple formula calculations to derive field values.

          Simplify Formulas

          Break complex formulas into simpler steps. Calculate values that don't change outside the loop. Avoid nested functions when possible. Consider using formula fields on the object instead of flow formulas.

          For operations on entire collections—such as filtering, mapping, or sorting—a Transform element performs more efficiently than a loop. To get all related records in a single query instead of one query per loop iteration, use a Get Records element with the IN operator.

          For more information, see Transform Element.

          Simplify formulas and move calculations that don't change outside the loop. Test formula performance with realistic data volumes.
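
          The same idea in Apex: values that don't change per record are calculated once, before the loop, and only per-record work stays inside it. This is a minimal sketch; the object, fields, and filter are illustrative.

          Date cutoff = Date.today().addDays(-30);   // invariant: calculated once, outside the loop
          String ownerName = UserInfo.getName();     // invariant: calculated once, outside the loop

          List<Case> casesToUpdate = [SELECT Id, CaseNumber, CreatedDate FROM Case WHERE Status = 'New'];
          for (Case c : casesToUpdate) {
              // Only per-record work remains inside the loop
              c.Subject = ownerName + ' - ' + c.CaseNumber;
              c.IsEscalated = c.CreatedDate.date() < cutoff;
          }
          update casesToUpdate;                      // single DML statement after the loop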

          Processing Large Record Collections

          Scheduled flows or batch operations that process thousands of records in a single transaction, even with proper bulkification, can consume too much CPU time.

          Example: A scheduled flow gets 5000 account records and performs complex transformations on each record's data before updating them.

          Use Alternative Approaches

          For large data volumes, consider:

          • Breaking the work into multiple scheduled flows that each process a subset of records
          • Using Apex batch processing for extreme data volumes (a minimal sketch follows below)
          • Processing records incrementally over time
          • Using Platform Events to distribute processing across multiple transactions

          Filter data early to reduce the number of records processed. Use Get Records filters to get only the records you need. Test with realistic data volumes before deploying to production.
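
          For the Apex batch processing option mentioned above, a skeleton Batchable class looks like the following. Each execute() call handles one chunk of records in its own transaction with its own CPU budget. This is a minimal sketch; the class name, object, fields, and filter are illustrative.

          public class AccountCleanupBatch implements Database.Batchable<SObject> {

              public Database.QueryLocator start(Database.BatchableContext bc) {
                  // Select only the records and fields the job needs
                  return Database.getQueryLocator(
                      'SELECT Id, Description FROM Account WHERE Industry = \'Technology\''
                  );
              }

              public void execute(Database.BatchableContext bc, List<Account> scope) {
                  for (Account acct : scope) {
                      acct.Description = 'Reviewed on ' + String.valueOf(Date.today());
                  }
                  update scope;   // one DML statement per chunk, after the loop
              }

              public void finish(Database.BatchableContext bc) {
                  // Optional post-processing or notification
              }
          }

          // Run the job in chunks of 200 records per transaction:
          // Database.executeBatch(new AccountCleanupBatch(), 200);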

          Multiple Flows in a Transaction

          When one flow triggers other flows (through record changes or subflows), the cumulative CPU time of all flows in the transaction counts toward the limit.

          Example: A record-triggered flow on Account updates runs multiple subflows and also triggers other record-triggered flows on related objects.

          Reduce Flow Chains

          Consolidate related automation into fewer flows. Review your flow trigger criteria to ensure flows run only when necessary. Consider using entry conditions to limit when record-triggered flows run.

          Review the cumulative impact of all automation (flows, processes, workflows, triggers) that can run in the same transaction. Monitor and optimize the entire automation chain.

          Bulk Data Loads

          Record-triggered flows that run during bulk data imports or mass updates require efficient processing of all records. Inefficient flows hit CPU limits during bulk operations even if they work fine for individual records.

          Example: A user imports 200 account records. The before-save record-triggered flow performs multiple lookups and calculations for each account.

          Optimize for Bulk Operations

          • Use collection-based DML and queries
          • Consider moving non-critical operations to an asynchronous path of an after-save flow (asynchronous paths run in a separate transaction with a higher CPU limit)
          • Use Transform elements (Filter, Map, Sort), which operate on entire collections more efficiently than loops

          For more information, see Transform Element.

          Always assume record-triggered flows process multiple records simultaneously. Test with bulk data loads (by using Data Loader or mass updates) before deploying to production. A flow that works perfectly for individual records can fail during bulk operations if it isn't designed for them.
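
          One way to test the bulk path before deployment is an Apex test that inserts 200 records in a single DML statement, so any record-triggered flow on the object runs against a full batch, the way a Data Loader job would invoke it. This is a minimal sketch; the class name, object, and assertion are illustrative.

          @IsTest
          private class AccountBulkLoadTest {
              @IsTest
              static void recordTriggeredFlowHandlesBulkInsert() {
                  List<Account> accounts = new List<Account>();
                  for (Integer i = 0; i < 200; i++) {
                      accounts.add(new Account(Name = 'Bulk Test ' + i));
                  }

                  Test.startTest();
                  // Single bulk DML; record-triggered flows on Account fire against the whole batch
                  insert accounts;
                  Test.stopTest();

                  // If the automation exceeds the CPU limit, the insert above throws and the test fails
                  System.assertEquals(200, [SELECT COUNT() FROM Account WHERE Name LIKE 'Bulk Test %']);
              }
          }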

          General Prevention Guidelines

          • Monitor flow performance: Review Apex debug logs regularly to identify flows approaching CPU limits, even if they haven't failed yet. Regular monitoring helps you optimize flows before they cause production issues.
          • Test with realistic data: Test flows with realistic data volumes to uncover performance issues before activation. Debug mode typically tests with one record, which doesn't reveal bulk operation issues.
          • Document optimization decisions: Use element descriptions to note where you applied bulkification or other optimizations. This documentation helps future maintainers understand the design and prevents the accidental introduction of performance issues.
          • Start simple and optimize: Build flows in small increments, testing performance at each step. Optimizing a working flow is easier than fixing a complex broken one.

          Identify CPU-Intensive Elements in a Flow

          Use Apex debug logs to pinpoint which elements in your flow consume the most CPU time.

          1. Go to Setup and enter Debug Logs in the Quick Find box.
          2. Set up debug logging for the user who experiences the error, or run the flow in Debug mode.
          3. Reproduce the error by running the flow.
          4. Open the generated debug log.
          5. Search for these key events in the debug log.
            • FLOW_CREATE_INTERVIEW_BEGIN - Shows when each flow starts
            • FLOW_ELEMENT_LIMIT_USAGE - Shows CPU time consumption for each flow element
            • CUMULATIVE_LIMIT_USAGE - Shows running totals of CPU time
          6. Identify elements with high CPU time values.
            Common culprits include:
            • Loop elements with large collections.
            • Get Records elements with complex filters.
            • DML operations with multiple records (Create Records, Update Records, Delete Records).
            • Assignment elements with complex formulas.

          You now know which elements are consuming the most CPU time. Use this information to apply targeted optimizations.

          For more information about debug logs, see Working with Logs in the Developer Console.

           