Introduction
In today’s fast-paced digital landscape, businesses generate massive amounts of data from various sources. To make the most of this data, organizations must integrate it seamlessly into their systems for analytics, customer insights, and automation. Salesforce Data Cloud’s Ingestion API simplifies the process of ingesting data from external sources into Salesforce Data Cloud, supporting both real-time streaming updates and periodic bulk uploads.
In this guide, we’ll cover everything you need to know about Salesforce Data Cloud Ingestion API, including how to set it up, how to stream data in real-time, and how to perform bulk ingestion efficiently.
What is Salesforce Data Cloud Ingestion API?
The Salesforce Data Cloud Ingestion API is a REST API designed to push data from external systems into Data Cloud. It supports two interaction patterns:
- Streaming Ingestion – Sends incremental updates to datasets in real time.
- Bulk Ingestion – Uploads large datasets periodically using CSV files.
The same data stream can accept data from both streaming and bulk ingestion, making it highly flexible for various business needs.
Get Started with Ingestion API
Before using the Ingestion API in Data Cloud, complete the prerequisites, set up authentication, and understand the limits that apply to bulk ingestion and streaming ingestion.
Prerequisites
- Set up an Ingestion API connector to define the endpoints and payload to ingest data.
- Create an Ingestion API data stream to configure ingestion jobs and expose the API for external consumption.
- Contact your admin for the configured endpoint details.
Steps to Set Up and Use the Ingestion API
Setting up and using the Ingestion API involves multiple steps, requiring collaboration between an admin, developer, and data specialist. Let’s break down the setup process.
Step 1: Set Up the Ingestion API Connector
The Ingestion API Connector acts as a bridge between external data sources and Salesforce Data Cloud.
How to Set Up the Connector
Go to Data Cloud Setup:
- Navigate to Salesforce Data Cloud Setup in your Salesforce org.
Create a New Ingestion API Connector:
- Under Salesforce Integrations, select Ingestion API.
- Click New, enter a name, and click Save.
Upload a Schema File:
- On the connector’s details page, upload a schema file (in OpenAPI (OAS) YAML format).
- Click Upload Schema, select the file, and click Open.
- Validate the schema and click Save.
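The schema file describes each object and its fields in OpenAPI (OAS) YAML. As a rough sketch, a minimal schema for a hypothetical orders object might look like the following (the object and field names are illustrative, not values from your org):

```yaml
openapi: 3.0.3
components:
  schemas:
    orders:
      type: object
      properties:
        id:
          type: integer
        contact_name:
          type: string
        created_date:
          type: string
          format: date-time
        modified_date:
          type: string
          format: date-time
```

Each schema object becomes a selectable object when you create the data stream in the next step.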
Step 2: Create an Ingestion API Data Stream
Once the connector is set up, create a data stream to define how data flows into Salesforce Data Cloud.
Steps to Create a Data Stream
Go to Data Streams in Data Cloud Setup.
- Click New and select Ingestion API.
- Choose the configured Ingestion API connector from the dropdown.
- Select the schema objects to include and click Next.
Configure the data stream:
- Primary Key: Field that uniquely identifies each record.
- Category: Defines the object type (Profile, Engagement, etc.).
- Record Modified Field: Tracks the last modified date (optional but recommended).
Click Deploy.
Step 3: Authentication & API Testing Using Postman
Authentication
Before interacting with the Ingestion API, authentication must be set up to securely communicate with Salesforce Data Cloud.
Set Up a Connected App for Authentication
A connected app allows secure authentication via OAuth. Follow these steps:
Create a Connected App in Salesforce.
Enable OAuth Settings and select the necessary scopes:
- cdp_ingest_api – Access and manage your Data Cloud Ingestion API data.
- api – Access and manage your data.
- refresh_token, offline_access – Perform requests on your behalf anytime.
Save the connected app and retrieve the client ID and client secret.
Once authentication is set up, developers can send API requests using Postman.
- Use Postman to send authentication requests to generate access tokens.
- Test API endpoints before integrating them into production workflows.
- Validate successful authentication and response codes (200 OK).
Acquire a Salesforce Access Token
After configuring the connected app, request a Salesforce access token by sending a request to the authentication endpoint:
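Assuming the connected app has the OAuth 2.0 client-credentials flow enabled (JWT bearer is another option), a minimal Python sketch looks like this; the My Domain host, client ID, and secret are placeholders:

```python
# Sketch: get a Salesforce access token via the OAuth 2.0
# client-credentials flow (assumes that flow is enabled on the
# connected app). Host and credentials below are placeholders.
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://MY_DOMAIN.my.salesforce.com/services/oauth2/token"

def build_token_request(client_id: str, client_secret: str) -> bytes:
    """Form-encoded body for the OAuth token endpoint."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()

def fetch_access_token(client_id: str, client_secret: str) -> str:
    """POST the credentials and return the access_token field."""
    req = urllib.request.Request(
        TOKEN_URL, data=build_token_request(client_id, client_secret))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

In Postman, the equivalent is a POST to the same `/services/oauth2/token` endpoint with the form fields above.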
Exchange Salesforce Access Token for Data Cloud Access Token
Use the Salesforce access token to get a Data Cloud access token for invoking the Ingestion API.
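The exchange is a token-exchange POST to the `/services/a360/token` endpoint on your org. A sketch, with the host as a placeholder and the grant and token-type URNs taken from the documented exchange flow:

```python
# Sketch: exchange the Salesforce (core) access token for a
# Data Cloud access token. The host is a placeholder.
import json
import urllib.parse
import urllib.request

EXCHANGE_URL = "https://MY_DOMAIN.my.salesforce.com/services/a360/token"

def build_exchange_body(core_token: str) -> bytes:
    """Form-encoded token-exchange request body."""
    return urllib.parse.urlencode({
        "grant_type": "urn:salesforce:grant-type:external:cdp",
        "subject_token": core_token,
        "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
    }).encode()

def exchange_token(core_token: str) -> dict:
    """Return the Data Cloud token and the tenant-specific base URL."""
    req = urllib.request.Request(EXCHANGE_URL, data=build_exchange_body(core_token))
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    # instance_url is the tenant endpoint used for all Ingestion API calls.
    return {"token": payload["access_token"], "instance": payload["instance_url"]}
```

All subsequent Ingestion API requests use the returned Data Cloud token as a Bearer token against the returned tenant endpoint.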
Step 4: Using the Ingestion API (Streaming & Bulk Ingestion)
After authentication, developers can send data into Salesforce Data Cloud using Postman, cURL, or API scripts.
Streaming Ingestion: Sending Data in Real-Time
The Streaming Ingestion API accepts incremental record updates and processes them in micro-batches, typically about every three minutes. This makes it a good fit for near-real-time data updates.
Example JSON Payload
{
  "data": [
    {
      "id": 1,
      "contact_name": "Joe Smith",
      "created_date": "2024-03-10T12:00:00Z",
      "tax_exempt": false,
      "ship_address": "123 Main Street",
      "total": 5000,
      "tax_rate": 10,
      "modified_date": "2024-03-11T14:00:00Z"
    }
  ]
}
Making an API Request
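As a sketch of the request itself, using the streaming path `/api/v1/ingest/sources/{sourceApiName}/{objectName}` with the Data Cloud token as a Bearer header (the tenant URL, source name, and object name are placeholders for values from your own connector):

```python
# Sketch: POST records to the streaming ingest endpoint.
# tenant_url, source/object names, and the token are placeholders.
import json
import urllib.request

def build_stream_payload(records: list) -> str:
    """Wrap records in the {"data": [...]} envelope the API expects."""
    return json.dumps({"data": records})

def stream_records(tenant_url, dc_token, source_name, object_name, records):
    url = f"{tenant_url}/api/v1/ingest/sources/{source_name}/{object_name}"
    req = urllib.request.Request(
        url,
        data=build_stream_payload(records).encode(),
        headers={"Authorization": f"Bearer {dc_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 202 means the records are queued for ingestion
```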
📌 Response:
{ "accepted": true }
🔹 A 202 Accepted response confirms that data is queued for ingestion.
Bulk Ingestion: Uploading Large Data Sets
With the Data Cloud Ingestion API, you can upsert or delete large data sets. Prepare a CSV file for the data you want to upload, create a job, upload job data, and let Salesforce handle the rest.
Bulk Jobs and Operations
A bulk job typically goes through the following stages:
- Create a job – Define the object type and the operation (upsert or delete).
- Upload data – Submit CSV files containing the data.
- Close the job – Signal that the data is ready for processing.
- Monitor progress – Track job completion and review any errors.
- Delete the job – Remove job data after completion.
Prepare CSV Files
- The first row of the CSV file must list field names that match the Data Cloud object.
- All records in the CSV must belong to the same object.
- Files must be UTF-8 encoded and must not exceed 150 MB each.
- Empty field values are set to null.
Creating a Bulk Job
Request
Define the object type and the operation (upsert or delete).
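A sketch of the create-job call, a POST to `/api/v1/ingest/jobs` (the object and source names are placeholders for your own connector's values):

```python
# Sketch: create a bulk ingestion job. Object name, source name,
# tenant URL, and token are placeholders.
import json
import urllib.request

def build_create_job(object_name: str, source_name: str,
                     operation: str = "upsert") -> str:
    """JSON body for POST {tenant}/api/v1/ingest/jobs."""
    assert operation in ("upsert", "delete")
    return json.dumps({
        "object": object_name,
        "sourceName": source_name,
        "operation": operation,
    })

def create_job(tenant_url, dc_token, object_name, source_name,
               operation="upsert") -> str:
    req = urllib.request.Request(
        f"{tenant_url}/api/v1/ingest/jobs",
        data=build_create_job(object_name, source_name, operation).encode(),
        headers={"Authorization": f"Bearer {dc_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["id"]  # job id used by the later calls
```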
Upload Job Data
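Job data is uploaded as a CSV batch with a PUT to the job's `batches` resource. A sketch, including a small helper that renders records as UTF-8 CSV with the required header row of field names:

```python
# Sketch: upload CSV data to an open bulk job. The job id comes from
# the create-job response; tenant URL and token are placeholders.
import csv
import io
import urllib.request

def records_to_csv(field_names: list, rows: list) -> str:
    """Render rows as CSV; the first row must list the field names."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=field_names)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def upload_csv(tenant_url, dc_token, job_id, csv_text):
    req = urllib.request.Request(
        f"{tenant_url}/api/v1/ingest/jobs/{job_id}/batches",
        data=csv_text.encode("utf-8"),
        headers={"Authorization": f"Bearer {dc_token}",
                 "Content-Type": "text/csv"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 202 means the batch was accepted
```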
Closing or Aborting a Job
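Closing or aborting is a state change on the job resource, a PATCH setting the state to UploadComplete (process the data) or Aborted (discard the job). A sketch, with the tenant URL and job id as placeholders:

```python
# Sketch: close or abort a bulk job by PATCHing its state.
import json
import urllib.request

def build_state_change(state: str) -> str:
    """Body for PATCH .../ingest/jobs/{id}: UploadComplete or Aborted."""
    assert state in ("UploadComplete", "Aborted")
    return json.dumps({"state": state})

def set_job_state(tenant_url, dc_token, job_id, state):
    req = urllib.request.Request(
        f"{tenant_url}/api/v1/ingest/jobs/{job_id}",
        data=build_state_change(state).encode(),
        headers={"Authorization": f"Bearer {dc_token}",
                 "Content-Type": "application/json"},
        method="PATCH",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```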
Retrieving Job Information
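A GET on the same job resource returns its current state so you can poll for completion; a DELETE on that URL removes the job afterward. A sketch (state names per the bulk job lifecycle; tenant URL and job id are placeholders):

```python
# Sketch: poll a bulk job for its current state.
import json
import urllib.request

def job_url(tenant_url: str, job_id: str) -> str:
    return f"{tenant_url}/api/v1/ingest/jobs/{job_id}"

def get_job_state(tenant_url, dc_token, job_id) -> str:
    req = urllib.request.Request(
        job_url(tenant_url, job_id),
        headers={"Authorization": f"Bearer {dc_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        info = json.load(resp)
    # A job typically moves UploadComplete -> InProgress -> JobComplete
    # (or Failed). Send DELETE to the same URL to remove a finished job.
    return info["state"]
```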
By following these steps, you can efficiently ingest large datasets into Salesforce Data Cloud.
Data ingested successfully.
Conclusion
The Salesforce Data Cloud Ingestion API is a powerful tool for integrating external data into Salesforce Data Cloud. By following this guide, you can efficiently authenticate, set up data streams, and perform both real-time streaming and bulk ingestion.
By leveraging Postman for API testing, organizations can validate their ingestion setup before moving to production. With proper authentication and data stream configurations, businesses can unlock real-time customer insights and automation in Salesforce Data Cloud.