The Stripe source is in beta. Read the terms and conditions in the sources overview for more information on using beta-labeled sources.
Read the following tutorial to learn how to ingest payments data from your Stripe account to Adobe Experience Platform using the user interface.
This tutorial requires a working understanding of the following components of Experience Platform:
Read the Stripe overview for information on how to retrieve your authentication credentials.
In the Platform UI, select Sources from the left navigation to access the Sources workspace. You can select the appropriate category from the catalog on the left-hand side of your screen. Alternatively, you can find the specific source you wish to work with using the search option.
Under the Payments category, select Stripe, and then select Set up.
Sources in the sources catalog display the Set up option when a given source does not yet have an authenticated account. Once an authenticated account exists, this option changes to Add data.
The Connect Stripe account page appears. On this page, you can use either new or existing credentials.
To create a new account, select New account and provide a name, an optional description, and your credentials.
When finished, select Connect to source and then allow some time for the new connection to establish.
| Credential | Description |
| --- | --- |
| Access token | Your Stripe access token. For information on how to retrieve your access token, read the Stripe authentication guide. |
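Before entering your access token in the UI, you may want to confirm that it works. The sketch below is illustrative only: it assumes the standard Stripe REST API and uses only the Python standard library. The `fetch_sample` helper and the placeholder token are hypothetical examples, not part of the Experience Platform workflow.

```python
# Minimal sketch: verify a Stripe access token by requesting a single
# record from an endpoint, using only the standard library.
import json
import urllib.request

STRIPE_API_BASE = "https://api.stripe.com/v1"

def auth_headers(access_token: str) -> dict:
    # Stripe authenticates every request with a bearer token.
    return {"Authorization": f"Bearer {access_token}"}

def fetch_sample(access_token: str, endpoint: str = "charges") -> dict:
    # Ask for one record; an HTTP error here usually indicates a bad token.
    req = urllib.request.Request(
        f"{STRIPE_API_BASE}/{endpoint}?limit=1",
        headers=auth_headers(access_token),
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires a real token; "sk_test_placeholder" is a placeholder):
# sample = fetch_sample("sk_test_placeholder")
# print(sorted(sample["data"][0].keys()))
```

If the request returns HTTP 200, the same token should authenticate successfully when you select Connect to source.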
To use an existing account, select Existing account and then select the account that you want to use from the existing account catalog.
Select Next to proceed.
Now that you have access to your account, you must identify the appropriate path to the Stripe data that you want to ingest. Select Resource path, and then select the endpoint from which you want to ingest data. The available Stripe endpoints are:
Once you select an endpoint, the interface updates to a preview screen that displays the data structure of the selected Stripe endpoint. Select Next to proceed.
Next, you must provide information on your dataset and your dataflow.
A dataset is a storage and management construct for a collection of data, typically a table, that contains a schema (columns) and fields (rows). Data that is successfully ingested into Experience Platform is stored within the data lake as datasets. During this step, you can create a new dataset or use an existing dataset.
To use a new dataset, select New dataset and then provide a name and an optional description for your dataset. You must also select an Experience Data Model (XDM) schema that your dataset adheres to.
| New dataset details | Description |
| --- | --- |
| Output dataset name | The name of your new dataset. |
| Description | (Optional) A brief explanation of the new dataset. |
| Schema | A dropdown list of schemas that exist in your organization. You can also create your own schema prior to the source configuration process. For more information, read the guide on creating an XDM schema in the UI. |
If you already have an existing dataset, select Existing dataset and then use the Advanced search option to view a window of all datasets in your organization, including their respective details, such as whether they are enabled for ingestion to Real-Time Customer Profile or not.
If your dataset is enabled for Real-Time Customer Profile, then during this step, you can toggle Profile dataset to enable your data for Profile ingestion. You can also use this step to enable Error diagnostics and Partial ingestion.
Once your dataset is configured, you must then provide details on your dataflow, including a name, an optional description, and alert configurations.
| Dataflow configurations | Description |
| --- | --- |
| Dataflow name | The name of the dataflow. By default, this will use the name of the file that is being imported. |
| Description | (Optional) A brief description of your dataflow. |
| Alerts | Experience Platform can produce event-based alerts that users can subscribe to. These options all require a running dataflow to trigger them. For more information, read the alerts overview. |
When finished, select Next to proceed.
The Mapping step appears. Use the mapping interface to map your source data to the appropriate schema fields before ingesting that data into Experience Platform. For an extensive guide on how to use the mapping interface, read the Data Prep UI guide.
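Conceptually, the mapping step pairs each source field with a dot-delimited target path in your XDM schema. The sketch below is a minimal illustration of that idea only; the Stripe field names and XDM paths shown are hypothetical examples, not a real schema, and the mapping interface itself requires no code.

```python
# Illustrative only: pair flat source fields with dot-delimited target
# paths, the way a mapping row pairs a source field with a schema field.
# All field names and paths below are hypothetical.
FIELD_MAP = {
    "id": "payments.transactionID",
    "amount": "payments.amount",
    "currency": "payments.currencyCode",
}

def map_record(source: dict, field_map: dict) -> dict:
    """Build a nested target record from a flat source record."""
    target: dict = {}
    for src_key, dest_path in field_map.items():
        if src_key not in source:
            continue  # unmapped or missing fields are simply skipped
        node = target
        parts = dest_path.split(".")
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = source[src_key]
    return target

record = {"id": "ch_123", "amount": 1099, "currency": "usd"}
mapped = map_record(record, FIELD_MAP)
# → {'payments': {'transactionID': 'ch_123', 'amount': 1099, 'currencyCode': 'usd'}}
```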
Next, use the scheduling interface to create an ingestion schedule for your dataflow.
Select the frequency dropdown to configure your dataflow’s ingestion frequency.
You can also select the calendar icon and use a pop-up calendar to configure your ingestion start time.
| Scheduling configuration | Description |
| --- | --- |
| Frequency | Configure frequency to indicate how often the dataflow should run. You can set your frequency to: |
| Interval | Once you select a frequency, you can then configure the interval setting to establish the time frame between every ingestion. For example, if you set your frequency to day and configure the interval to 15, then your dataflow will run every 15 days. You cannot set the interval to zero. The minimum accepted interval value for each frequency is as follows: |
| Start Time | The timestamp for the projected run, presented in UTC time zone. |
| Backfill | Backfill determines what data is initially ingested. If backfill is enabled, all current files in the specified path will be ingested during the first scheduled ingestion. If backfill is disabled, only the files that are loaded in between the first run of ingestion and the start time will be ingested. Files loaded prior to the start time will not be ingested. |
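The frequency and interval settings combine to determine when runs occur, as in the table's example of a daily frequency with an interval of 15. A minimal sketch of that projection logic, using hypothetical helper names and illustrative frequency units:

```python
# Sketch of how frequency x interval project run times from a UTC start.
# The helper and unit table are illustrative assumptions, not Platform code.
from datetime import datetime, timedelta

FREQUENCY_UNITS = {
    "minute": timedelta(minutes=1),
    "hour": timedelta(hours=1),
    "day": timedelta(days=1),
    "week": timedelta(weeks=1),
}

def next_runs(start: datetime, frequency: str, interval: int, count: int = 3):
    """Project the first few scheduled runs from the start time (UTC)."""
    if interval <= 0:
        raise ValueError("The interval cannot be zero.")
    step = FREQUENCY_UNITS[frequency] * interval
    return [start + step * i for i in range(count)]

# The table's example: frequency "day" with interval 15 runs every 15 days.
runs = next_runs(datetime(2024, 1, 1), "day", 15)
# → runs on Jan 1, Jan 16, and Jan 31
```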
Once you have configured your dataflow’s ingestion schedule, select Next.
The final step in the dataflow creation process is to review your dataflow before executing it. Use the Review step to review the details of your new dataflow before it runs. Details are grouped in the following categories:
Once you have reviewed your dataflow, select Finish and allow some time for the dataflow to be created.
By following this tutorial, you have successfully created a dataflow to bring payments data from your Stripe source to Experience Platform. For additional resources, visit the documentation outlined below.
Once your dataflow has been created, you can monitor the data that is being ingested through it to view information on ingestion rates, success, and errors. For more information on how to monitor dataflows, visit the tutorial on monitoring accounts and dataflows in the UI.
To update configurations for your dataflow's scheduling, mapping, and general information, visit the tutorial on updating sources dataflows in the UI.
You can delete dataflows that are no longer necessary or were incorrectly created using the Delete function available in the Dataflows workspace. For more information on how to delete dataflows, visit the tutorial on deleting dataflows in the UI.