The Pendo source is in beta. Please read the sources overview for more information on using beta-labeled sources.
This tutorial provides steps for creating a Pendo source connection and dataflow using the Adobe Experience Platform user interface.
This tutorial requires a working understanding of the following components of Experience Platform:
The following section provides information on prerequisites to complete before you can create a Pendo source connection.
Before creating a Pendo source connection, you must first provide a source schema. You can use the example JSON below.
{
  "accountId": "58f79ee324d3f",
  "timestamp": 1673372516,
  "visitorId": "test@test.com",
  "uniqueId": "166e50cdf40930fe1367e4d44795c9c74d88b83a",
  "properties": {
    "guideProperties": {
      "name": "Guide Conversion Test"
    }
  }
}
For more information, read the Pendo guide on webhooks.
You must also create a Platform schema to use for your source. See the tutorial on creating a Platform schema for comprehensive steps.
In the Platform UI, select Sources from the left navigation to access the Sources workspace and see a catalog of sources available in Experience Platform.
Use the Categories menu to filter sources by category. Alternatively, enter a source name in the search bar to find a specific source from the catalog.
Go to the Analytics category to see the Pendo source card. To begin, select Add data.
The Select data step appears, providing an interface for you to select the data that you want to bring to Platform.
Select Upload files to upload a JSON file from your local system. Alternatively, you can drag and drop the JSON file you want to upload into the Drag and drop files panel.
Once your file uploads, the preview interface updates to display a preview of the schema you uploaded. The preview interface allows you to inspect the contents and structure of a file. You can also use the Search field utility to access specific items from within your schema.
When finished, select Next.
The Dataflow detail step appears, providing you with options to use an existing dataset or establish a new dataset for your dataflow, as well as an opportunity to provide a name and description for your dataflow. During this step, you can also configure settings for Profile ingestion, error diagnostics, partial ingestion, and alerts.
When finished, select Next.
The Mapping step appears, providing you with an interface to map the source fields from your source schema to their appropriate target XDM fields in the target schema.
Platform provides intelligent recommendations for auto-mapped fields based on the target schema or dataset that you selected. You can manually adjust mapping rules to suit your use cases. Based on your needs, you can choose to map fields directly, or use data prep functions to transform source data to derive computed or calculated values. For comprehensive steps on using the mapper interface and calculated fields, see the Data Prep UI guide.
The mappings listed below are mandatory and must be set up before proceeding to the Review stage.
| Target Field | Description |
| --- | --- |
| uniqueId | The Pendo identifier for the event. |
Once your source data is successfully mapped, select Next.
The Review step appears, allowing you to review your new dataflow before it is created. Details are grouped within the following categories:
Once you have reviewed your dataflow, select Finish and allow some time for the dataflow to be created.
With your streaming dataflow created, you can now retrieve your streaming endpoint URL. This endpoint will be used to subscribe to your webhook, allowing your streaming source to communicate with Experience Platform.
To construct the URL used to configure the webhook in Pendo, you must retrieve your dataflow ID and streaming endpoint.
To retrieve your Dataflow ID and Streaming endpoint, go to the Dataflow activity page of the dataflow that you just created and copy the details from the bottom of the Properties panel.
Once you have retrieved your streaming endpoint and dataflow ID, build a URL based on the following pattern: {STREAMING_ENDPOINT}?x-adobe-flow-id={DATAFLOW_ID}. For example, a constructed webhook URL may look like: https://dcs.adobedc.net/collection/0c61859cc71939a0caf01123f91b2fc52589018800ad46b6c76c2dff3595ee95?x-adobe-flow-id={DATAFLOW_ID}
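If you prefer to assemble the URL programmatically, the following is a minimal sketch of the pattern above. The streaming endpoint and dataflow ID values shown are placeholders that you must replace with the details copied from the Properties panel of your dataflow.

```python
# Minimal sketch: compose the Pendo webhook URL from your streaming endpoint
# and dataflow ID. Both values below are placeholders; replace them with the
# details copied from the Properties panel of your dataflow.
streaming_endpoint = "{STREAMING_ENDPOINT}"  # placeholder
dataflow_id = "{DATAFLOW_ID}"                # placeholder

webhook_url = f"{streaming_endpoint}?x-adobe-flow-id={dataflow_id}"
print(webhook_url)
```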
Next, log in to your Pendo account and create a webhook. For steps on how to create a webhook using the Pendo user interface, refer to the Pendo guide on creating a webhook.
Once your webhook is created, navigate to the settings page of your Pendo webhook and input your webhook URL in the URL field.
You can subscribe to a variety of event categories to determine the kind of events you want to send from your Pendo instance to Platform. For more information on the different events, refer to the Pendo documentation.
By following this tutorial, you have successfully configured a streaming dataflow to bring your Pendo data to Experience Platform. To monitor the data that is being ingested, refer to the guide on monitoring streaming dataflows using the Platform UI.
The sections below provide additional resources that you can refer to when using the Pendo source.
To validate that you have correctly set up the source and that Pendo messages are being ingested, follow the steps below:
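For example, one way to exercise the endpoint outside of Pendo is to send the sample event payload from earlier in this tutorial directly to your webhook URL, then confirm that a new record appears in the dataflow run metrics. The sketch below assumes the streaming endpoint accepts a raw JSON POST; the webhook URL value is a placeholder.

```python
import json
import urllib.request

# Placeholder webhook URL; replace with the URL you constructed from your
# streaming endpoint and dataflow ID.
webhook_url = "{STREAMING_ENDPOINT}?x-adobe-flow-id={DATAFLOW_ID}"

# Sample Pendo event payload shown earlier in this tutorial.
payload = {
    "accountId": "58f79ee324d3f",
    "timestamp": 1673372516,
    "visitorId": "test@test.com",
    "uniqueId": "166e50cdf40930fe1367e4d44795c9c74d88b83a",
    "properties": {"guideProperties": {"name": "Guide Conversion Test"}},
}

request = urllib.request.Request(
    webhook_url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# A 2xx response indicates that the endpoint accepted the event; the record
# should then appear in the dataflow run metrics after a short delay.
with urllib.request.urlopen(request) as response:
    print(response.status)
```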
When checking a dataflow run, you might encounter the following error message: The message can't be validated ... uniqueID:expected minLength:1, actual 0].
To fix this error, you must verify that the uniqueId mapping has been set up. For additional guidance, refer to the Mapping section.
For more information, visit the Pendo Help Center.