          Audit Trail

          Track the use of generative AI in your Salesforce org and ensure that AI usage complies with your security, privacy, regulatory, and AI governance policies.

          Required Editions

          Available in: Enterprise, Performance, and Unlimited Editions with an Einstein for Sales, Einstein for Platform, Einstein for Service, Einstein 1 Service, or Einstein GPT Service add-on. To purchase add-ons, contact your Salesforce account executive.

          Generative AI audit data (also known as audit trail) includes data about Einstein Trust Layer features, such as data masking and toxicity scores.

          Audit trail data, along with feedback data, is stored in Data 360. Use Data 360 reports to see how the Einstein Trust Layer protects your sensitive data from exposure to external large language models (LLMs). The Einstein Trust Layer also verifies the safety and accuracy of the responses generated by the LLM. For example, if data masking is enabled for your org, generative AI audit data includes the masked prompt text that was sent to the external LLM. Similarly, you can also see the response generated by the LLM and the full unmasked response that was served to the user.
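
          The masking flow described above can be sketched conceptually: sensitive values are replaced with placeholder tokens before the prompt leaves the org, and the placeholders are restored in the response before it reaches the user. This is a minimal illustration of the general technique only, not Salesforce's implementation; the patterns, token format, and function names are assumptions for the example.

```python
import re

# Hypothetical detection patterns for this sketch; the Einstein Trust
# Layer's actual entity detection is not shown here.
MASK_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.\w+"),
    "PHONE": re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
}

def mask(prompt):
    """Replace sensitive values with placeholder tokens before the prompt
    is sent to an external LLM. Returns the masked prompt plus a mapping
    used later to restore the original values."""
    mapping = {}
    masked = prompt
    for label, pattern in MASK_PATTERNS.items():
        for i, value in enumerate(pattern.findall(masked)):
            token = f"[{label}_{i}]"
            mapping[token] = value
            masked = masked.replace(value, token)
    return masked, mapping

def unmask(response, mapping):
    """Restore the original values in the LLM response served to the user."""
    for token, value in mapping.items():
        response = response.replace(token, value)
    return response

masked, mapping = mask("Contact jane@example.com at 555-123-4567.")
# masked == "Contact [EMAIL_0] at [PHONE_0]."
restored = unmask(masked, mapping)
# restored == "Contact jane@example.com at 555-123-4567."
```

          In the audit trail, both sides of this exchange are recorded: the masked prompt that was sent out and the unmasked response that was served back.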

          To access the data stored in Data 360, turn on Einstein generative AI data collection and storage, and install the report package.
