
Data Cloud Integration

GPTfy now supports seamless integration with Salesforce Data Cloud, enabling you to leverage AI capabilities across your unified data platform. This feature allows GPTfy to access and process data from Data Cloud objects, whether your Data Cloud contains data from your org only, or aggregates data from multiple connected source orgs.

What You Can Do:

  • Query Data Cloud objects in your org and use them with GPTfy prompts
  • Access unified data aggregated from multiple connected source orgs into your Data Cloud
  • Process Data Cloud objects with the same security layers and anonymization as standard Salesforce objects

How Data Cloud Integration Works

GPTfy works with Salesforce Data Cloud in two configurations:

Configuration | Description | Data Source
Single Org | Data Cloud is enabled in your org and contains data from the same org only | Data originates from your current org
Hub-and-Spoke | Data Cloud is enabled in your org (hub) and aggregates data from multiple connected source orgs | Data flows from connected source orgs into your Data Cloud

Prerequisites

Before configuring Data Cloud integration in GPTfy, ensure the following requirements are met:

1. User Permissions

The user configuring the integration must have:

  • Data Cloud Architect permission set assigned
  • Access to Data Cloud objects
  • Administrative access to GPTfy

2. Connected App Setup

Create a Connected App in Salesforce for secure authentication:

  1. Navigate to Setup → App Manager → New Connected App
  2. Enable OAuth Settings
  3. Configure the appropriate OAuth Scopes:
    • api
    • refresh_token
    • offline_access
  4. Save and note the Consumer Key (you'll need this as the Client ID)

💡 Pro Tip: Keep your Consumer Key and Private Key secure. These credentials provide access to your Data Cloud data.
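
For reference, the OAuth 2.0 JWT Bearer flow that this Connected App enables can be exercised outside of GPTfy to confirm that the Consumer Key and private key work together. The snippet below is a minimal sketch in Python, assuming the PyJWT (with cryptography) and requests libraries; the Consumer Key, username, key file, and domain are placeholders for your own values.

```python
# Minimal sketch of the OAuth 2.0 JWT Bearer flow enabled by the Connected App.
# All values below are placeholders; substitute your own Consumer Key, username,
# private key file, and My Domain URL.
import time
import jwt        # PyJWT (requires the cryptography package for RS256)
import requests

CONSUMER_KEY = "3MVG9...ABC123"                    # Consumer Key from the Connected App
USERNAME = "admin@mycompany.com"                   # Admin user with Data Cloud access
DOMAIN = "https://mycompany.my.salesforce.com"     # Setup -> My Domain URL

with open("server.key") as f:                      # Same private key you upload to Files
    private_key = f.read()

# Build and sign the JWT assertion
claims = {
    "iss": CONSUMER_KEY,
    "sub": USERNAME,
    "aud": "https://login.salesforce.com",
    "exp": int(time.time()) + 300,
}
assertion = jwt.encode(claims, private_key, algorithm="RS256")

# Exchange the signed assertion for a Salesforce access token
resp = requests.post(
    f"{DOMAIN}/services/oauth2/token",
    data={
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": assertion,
    },
)
resp.raise_for_status()
access_token = resp.json()["access_token"]
print("JWT Bearer flow succeeded")
```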

3. Named Credentials

Create two Named Credentials for authentication:

Named Credential 1: Salesforce Authentication

  • Purpose: Generates JWT tokens for Salesforce authentication
  • Type: Named Principal
  • Authentication Protocol: OAuth 2.0
  • Identity Type: Named Principal
  • URL: Your Salesforce instance URL

Named Credential 2: Data Cloud Access

  • Purpose: Fetches data from Data Cloud objects in your org
  • Type: Named Principal
  • Authentication Protocol: OAuth 2.0
  • Identity Type: Named Principal
  • URL: Your Data Cloud instance URL

📖 Learn More: How to create Named Credentials in Salesforce
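
Behind the second Named Credential, Data Cloud calls are authorized with a Data Cloud access token obtained by exchanging the core Salesforce token. The sketch below continues the previous example and shows that exchange as the author understands Salesforce's Data Cloud (CDP) token API; treat the endpoint and parameters as assumptions to verify against your org's documentation.

```python
# Sketch: exchange the core Salesforce access token for a Data Cloud token.
# Continues the previous snippet (DOMAIN, access_token). The /services/a360/token
# endpoint and grant type reflect Salesforce's Data Cloud (CDP) token exchange API;
# verify them against current Salesforce documentation.
import requests

resp = requests.post(
    f"{DOMAIN}/services/a360/token",
    data={
        "grant_type": "urn:salesforce:grant-type:external:cdp",
        "subject_token": access_token,
        "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
    },
)
resp.raise_for_status()
payload = resp.json()
dc_token = payload["access_token"]        # Token accepted by the Data Cloud instance
dc_instance = payload["instance_url"]     # Host behind Named Credential 2's URL
```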

4. Private Key File

Upload your private key file (.key or .pem format) to Salesforce:

  1. Go to Files or Documents
  2. Upload your private key file
  3. Note the Record ID of the uploaded file (you'll need this in configuration)
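
If you prefer not to copy the Record ID from the browser URL, it can also be looked up with a SOQL query against ContentDocument, assuming the key was uploaded to Files. This small sketch uses the standard REST query endpoint and reuses the access token from the Connected App example; the file title "gptfy_private_key" is a placeholder for whatever you named your upload.

```python
# Sketch: look up the ContentDocument Id (069...) of the uploaded private key file.
# Reuses DOMAIN and access_token from the JWT Bearer example; the file title below
# is a placeholder for the name you gave your upload.
import requests

soql = "SELECT Id, Title FROM ContentDocument WHERE Title = 'gptfy_private_key'"
resp = requests.get(
    f"{DOMAIN}/services/data/v59.0/query",
    headers={"Authorization": f"Bearer {access_token}"},
    params={"q": soql},
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record["Id"], record["Title"])   # Use the Id as the Private Key File Id
```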

Configuration Steps

Step 1: Enable Data Cloud Feature in GPTfy

  1. Navigate to GPTfy Cockpit
  2. Go to AI Settings → Preferences tab
  3. Scroll to the Data Cloud Configuration section

Step 2: Configure Data Cloud Settings


Fill in the following fields to establish the connection:

Field | Description | Example
Enable for Data Cloud | Toggle this ON to activate Data Cloud integration | ✓ Enabled
Salesforce Domain URL | Your current org's domain URL from Setup → My Domain | https://mycompany.my.salesforce.com
Salesforce Client Id | Consumer Key from the Connected App you created | 3MVG9...ABC123
Salesforce Admin User Name | Username of the admin user with Data Cloud access | admin@mycompany.com
Salesforce Private Key File Id | Record ID of the uploaded private key file | 0695g000000ABCDEF
Salesforce Auth Named Credential | API name of the Named Credential for JWT token generation | Salesforce_Auth_NC
Data Cloud Named Credential | API name of the Named Credential for Data Cloud access | DataCloud_Access_NC

Step 3: Save Configuration

  1. Click Save to store your Data Cloud configuration
  2. GPTfy will validate the connection in the background

⚠️ Important: Ensure all fields are filled correctly. Incorrect values will prevent GPTfy from connecting to Data Cloud.


Using Data Cloud Objects in Prompts

Once Data Cloud is configured, you can use Data Cloud objects just like standard Salesforce objects in your Data Context Mapping and prompts.

Step 1: Create or Edit a Prompt

  1. Navigate to GPTfy Cockpit → Prompts
  2. Click New to create a new prompt or select an existing one

Step 2: Configure Prompt Command

  1. Enter a descriptive Prompt Name
  2. Specify the Prompt Command that will trigger the AI action

Step 3: Configure Data Context Mapping

  1. Create a new Data Context Mapping or select an existing one
  2. In the Target Object dropdown, you'll now see Data Cloud objects listed alongside standard Salesforce objects
  3. Select the Data Cloud object you want (e.g., Unified_Individual__dlm, Customer_360__dlm)
  4. Configure Field Mappings to select the Data Cloud object fields you need, then save the mappings

💡 Pro Tip: Data Cloud objects typically end with the __dlm suffix (Data Lake Model).
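
If you want to preview a Data Cloud object's contents before mapping it, one option is to query it directly with the Data Cloud Query API. The sketch below reuses dc_token and dc_instance from the token exchange example; the endpoint, the Unified_Individual__dlm object, and the field names are assumptions that will vary by org.

```python
# Sketch: preview rows from a Data Cloud object via the Data Cloud Query API (v2).
# Reuses dc_token and dc_instance from the token exchange sketch. The object and
# field names are examples only; your Data Cloud model will differ.
import requests

sql = "SELECT ssot__Id__c, ssot__FirstName__c FROM Unified_Individual__dlm LIMIT 5"
resp = requests.post(
    f"https://{dc_instance}/api/v2/query",
    headers={
        "Authorization": f"Bearer {dc_token}",
        "Content-Type": "application/json",
    },
    json={"sql": sql},
)
resp.raise_for_status()
print(resp.json().get("data", []))    # Rows are returned as arrays of field values
```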

Step 4: Activate the Prompt

  1. Click Activate to make the prompt available
  2. The prompt is now ready to use with Data Cloud objects

Step 5: Run the Prompt

  1. Navigate to a Salesforce record
  2. Execute the prompt to process data from Data Cloud
  3. GPTfy will:
    • Authenticate to Data Cloud using your Named Credentials
    • Fetch the data from the specified Data Cloud object (including data from connected source orgs if using hub-and-spoke)
    • Apply any configured security layers for anonymization
    • Send the processed data to the AI model
    • Return the AI-generated response