The Customer.io source is in beta. Please read the sources overview for more information on using beta-labeled sources.
This tutorial provides steps for creating a Customer.io source connection and dataflow using the Adobe Experience Platform user interface.
This tutorial requires a working understanding of the following components of Experience Platform:
The following section provides information on prerequisites to complete before you can create a Customer.io source connection.
Before creating a Customer.io source connection, you must provide a source schema. You can use the sample JSON below.
```json
{
  "event_id": "01E4C4CT6YDC7Y5M7FE1GWWPQJ",
  "object_type": "customer",
  "metric": "subscribed",
  "timestamp": 1613063089,
  "data": {
    "customer_id": "42",
    "email_address": "test@example.com",
    "identifiers": {
      "id": "42",
      "email": "test@example.com",
      "cio_id": "d9c106000001"
    }
  }
}
```
You must also create a Platform schema to use for your source. See the tutorial on creating a Platform schema for comprehensive steps.
In the Platform UI, select Sources from the left navigation to access the Sources workspace and see a catalog of sources available in Experience Platform.
Use the Categories menu to filter sources by category. Alternatively, enter a source name in the search bar to find a specific source from the catalog.
Go to the Marketing automation category to see the Customer.io source card. To begin, select Add data.
The Select data step appears, providing an interface for you to select the data that you want to bring to Platform.
Select Upload files to upload a JSON file from your local system. Alternatively, you can drag and drop the JSON file you want to upload into the Drag and drop files panel.
Once your file uploads, the preview interface updates to display a preview of the schema you uploaded. The preview interface allows you to inspect the contents and structure of a file. You can also use the Search field utility to access specific items from within your schema.
When finished, select Next.
The Dataflow detail step appears, providing you with options to use an existing dataset or establish a new dataset for your dataflow, as well as an opportunity to provide a name and description for your dataflow. During this step, you can also configure settings for Profile ingestion, error diagnostics, partial ingestion, and alerts.
When finished, select Next.
The Mapping step appears, providing you with an interface to map the source fields from your source schema to their appropriate target XDM fields in the target schema.
Platform provides intelligent recommendations for auto-mapped fields based on the target schema or dataset that you selected. You can manually adjust mapping rules to suit your use cases. Based on your needs, you can choose to map fields directly, or use data prep functions to transform source data to derive computed or calculated values. For comprehensive steps on using the mapper interface and calculated fields, see the Data Prep UI guide.
All the mappings listed below are mandatory and must be set up before proceeding to the Review step.
Target Field | Description |
---|---|
object_type | The object type. Refer to the Customer.io events documentation for supported types. |
id | The object's identifier. |
email | The email address associated with the object. |
event_id | The unique identifier of the event. |
cio_id | The Customer.io identifier for the event. |
metric | The event type. Refer to the Customer.io events documentation for supported types. |
timestamp | The timestamp when the event occurred. |
Do not map cio_id when running the Customer.io webhook in test mode, as Customer.io does not send the associated fields.
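Based on the sample source schema above, the source-to-target pairs might look like the following sketch. The source paths follow the sample JSON, while the target paths are hypothetical placeholders; your actual targets depend on the XDM schema you selected for your dataset.

```python
# A hypothetical source-to-XDM mapping for the sample Customer.io event above.
# Source paths (keys) follow the sample JSON in this tutorial; target paths
# (values) are placeholders and must match your own XDM schema.
mappings = {
    "event_id": "_yourtenant.event_id",
    "object_type": "_yourtenant.object_type",
    "metric": "_yourtenant.metric",
    "timestamp": "_yourtenant.timestamp",
    "data.identifiers.id": "_yourtenant.id",
    "data.identifiers.email": "_yourtenant.email",
    "data.identifiers.cio_id": "_yourtenant.cio_id",
}
```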
Once your source data is successfully mapped, select Next.
The Review step appears, allowing you to review your new dataflow before it is created. Details are grouped within the following categories:
Once you have reviewed your dataflow, select Finish and allow some time for the dataflow to be created.
With your streaming dataflow created, you can now retrieve your streaming endpoint URL. This endpoint will be used to subscribe to your webhook, allowing your streaming source to communicate with Experience Platform.
In order to construct the URL used to configure the webhook on Customer.io, you must retrieve your dataflow ID and your streaming endpoint URL.
To retrieve your Dataflow ID and Streaming endpoint, go to the Dataflow activity page of the dataflow that you just created and copy the details from the bottom of the Properties panel.
Once you have retrieved your streaming endpoint and dataflow ID, build a URL based on the following pattern: `{STREAMING_ENDPOINT}?x-adobe-flow-id={DATAFLOW_ID}`. For example, a constructed webhook URL may look like: `https://dcs.adobedc.net/collection/febc116d22ba0ea2868e9c93b199375302afb8a589617700991bb8f3f0341ad7?x-adobe-flow-id=439b3fc4-3042-4a3a-b5e0-a494898d3fb0`
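As a quick illustration, the sketch below assembles the webhook URL from these two values. The endpoint and dataflow ID shown are the example placeholders from this tutorial; substitute the details copied from your own dataflow's Properties panel.

```python
# Build the Customer.io webhook URL from your streaming endpoint and dataflow ID.
# Both values below are the example placeholders from this tutorial.
streaming_endpoint = "https://dcs.adobedc.net/collection/febc116d22ba0ea2868e9c93b199375302afb8a589617700991bb8f3f0341ad7"
dataflow_id = "439b3fc4-3042-4a3a-b5e0-a494898d3fb0"

webhook_url = f"{streaming_endpoint}?x-adobe-flow-id={dataflow_id}"
print(webhook_url)
```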
With your webhook URL created, you can now set up your reporting webhook using the Customer.io user interface. For steps on setting up reporting webhooks, please read the Customer.io guide on setting up webhooks.
In the Customer.io user interface, input your webhook URL in the WEBHOOK ENDPOINT field.
You can subscribe to a variety of different events for your reporting webhook. Each event's message is ingested to Platform when the trigger criteria for a Customer.io action event are met. For more information on the different events, please refer to the Customer.io events documentation.
By following this tutorial you have successfully configured a streaming dataflow to bring your Customer.io data to Experience Platform. To monitor the data that is being ingested, refer to the guide on monitoring streaming dataflows using Platform UI.
The sections below provide additional resources that you can refer to when using the Customer.io source.
For information on guardrails, please refer to the Customer.io timeouts and failures page.
To validate that you have correctly set up the source and Customer.io messages are being ingested, follow the steps below:
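One way to exercise the flow end to end is to simulate a webhook delivery by posting the sample event from this tutorial to your webhook URL. The sketch below uses Python's requests library and the hypothetical URL constructed earlier, and assumes your streaming connection accepts unauthenticated HTTP POST requests; in production, Customer.io sends these requests for you.

```python
import requests

# Hypothetical webhook URL constructed earlier in this tutorial; replace with
# your own streaming endpoint and dataflow ID.
webhook_url = (
    "https://dcs.adobedc.net/collection/"
    "febc116d22ba0ea2868e9c93b199375302afb8a589617700991bb8f3f0341ad7"
    "?x-adobe-flow-id=439b3fc4-3042-4a3a-b5e0-a494898d3fb0"
)

# The sample Customer.io event from the beginning of this tutorial.
sample_event = {
    "event_id": "01E4C4CT6YDC7Y5M7FE1GWWPQJ",
    "object_type": "customer",
    "metric": "subscribed",
    "timestamp": 1613063089,
    "data": {
        "customer_id": "42",
        "email_address": "test@example.com",
        "identifiers": {
            "id": "42",
            "email": "test@example.com",
            "cio_id": "d9c106000001",
        },
    },
}

# Simulate a webhook delivery; a 2xx response suggests the endpoint accepted
# the payload for ingestion.
response = requests.post(webhook_url, json=sample_event, timeout=10)
print(response.status_code, response.text)
```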