Release month | Update type | Description |
---|---|---|
January 2024 | Functionality and documentation update | The Amazon S3 destination connector now supports a new assumed role authentication type. Read more about it in the authentication section. |
July 2023 | Functionality and documentation update | With the July 2023 Experience Platform release, the Amazon S3 destination provides new functionality, as listed below: |
This section describes which types of audiences you can export to this destination.
Audience origin | Supported | Description |
---|---|---|
Segmentation Service | ✓ | Audiences generated through the Experience Platform Segmentation Service. |
Custom uploads | ✓ | Audiences imported into Experience Platform from CSV files. |
Refer to the table below for information about the destination export type and frequency.
Item | Type | Notes |
---|---|---|
Export type | Profile-based | You are exporting all members of a segment, together with the desired schema fields (for example: email address, phone number, last name), as chosen in the select profile attributes screen of the destination activation workflow. |
Export frequency | Batch | Batch destinations export files to downstream platforms in increments of three, six, eight, twelve, or twenty-four hours. Read more about batch file-based destinations. |
This destination supports dataset exports. For complete information on how to set up dataset exports, read the tutorials:
When exporting audience data, Platform creates a `.csv`, `.parquet`, or `.json` file in the storage location that you provided. For more information about the files, see the supported file formats for export section in the audience activation tutorial.
When exporting datasets, Platform creates a `.parquet` or `.json` file in the storage location that you provided. For more information about the files, see the verify successful dataset export section in the export datasets tutorial.
To connect to the destination, you need the View Destinations and Manage Destinations access control permissions. Read the access control overview or contact your product administrator to obtain the required permissions.
To connect to this destination, follow the steps described in the destination configuration tutorial. In the destination configuration workflow, fill in the fields listed in the two sections below.
To authenticate to the destination, fill in the required fields and select Connect to destination. The Amazon S3 destination supports two authentication methods:
Use this authentication method when you want to input your Amazon S3 access key and secret key to allow Experience Platform to export data to your Amazon S3 properties.
Amazon S3 access key and Amazon S3 secret key: In Amazon S3, generate an `access key` - `secret access key` pair to grant Platform access to your Amazon S3 account. Learn more in the Amazon Web Services documentation.
Encryption key: Optionally, you can attach your RSA-formatted public key to add encryption to your exported files. View an example of a correctly formatted encryption key in the image below.
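For reference, an RSA public key in PEM format follows this general shape (the Base64 body below is truncated and purely illustrative):

```
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAAOC...
-----END PUBLIC KEY-----
```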
Use this authentication type if you prefer not to share account keys and secret keys with Adobe. Instead, Experience Platform connects to your Amazon S3 location using role-based access.
To do this, you need to create an assumed user for Adobe in the AWS console, with the required permissions to write to your Amazon S3 buckets. Create a Trusted entity in AWS with the Adobe account 670664943635. For more information, refer to the AWS documentation on creating roles.
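As a sketch, the trust relationship on that role could look like the following standard AWS trust policy. The exact statement your organization requires may differ; the account ID is the Adobe account mentioned above:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::670664943635:root"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```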
`arn:aws:iam::800873819705:role/destinations-role-customer`

To configure details for the destination, fill in the required and optional fields below. An asterisk next to a field in the UI indicates that the field is required.
`manifest-<<destinationId>>-<<dataflowRunId>>.json`. View a sample manifest file. The manifest file includes the following fields:
- `flowRunId`: The dataflow run which generated the exported file.
- `scheduledTime`: The time in UTC when the file was exported.
- `exportResults.sinkPath`: The path in your storage location where the exported file is deposited.
- `exportResults.name`: The name of the exported file.
- `size`: The size of the exported file, in bytes.

In the connect destination workflow, you can create a custom folder in your Amazon S3 storage per exported audience file. Read Use macros to create a folder in your storage location for instructions.
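To illustrate the manifest structure described above, the sketch below parses a hypothetical manifest file. The field values, and the assumption that `sinkPath`, `name`, and `size` nest inside an `exportResults` array, are illustrative rather than an authoritative schema:

```python
import json

# Hypothetical manifest contents; field names follow the list above,
# all values are made up for illustration.
manifest_text = """{
  "flowRunId": "f9c2a3e0-1111-2222-3333-444455556666",
  "scheduledTime": "2024-01-15T08:00:00Z",
  "exportResults": [
    {
      "sinkPath": "/exports/my-audience",
      "name": "my-audience.csv",
      "size": 1048576
    }
  ]
}"""

manifest = json.loads(manifest_text)

# Summarize each exported file recorded in the manifest.
for result in manifest["exportResults"]:
    print(f'{result["name"]}: {result["size"]} bytes at {result["sinkPath"]}')
```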
You can enable alerts to receive notifications on the status of the dataflow to your destination. Select an alert from the list to subscribe to notifications on the status of your dataflow. For more information on alerts, see the guide on subscribing to destination alerts using the UI.
When you are finished providing details for your destination connection, select Next.
To successfully connect and export data to your Amazon S3 storage location, create an Identity and Access Management (IAM) user for Platform in Amazon S3 and assign permissions for the following actions:
- `s3:DeleteObject`
- `s3:GetBucketLocation`
- `s3:GetObject`
- `s3:ListBucket`
- `s3:PutObject`
- `s3:ListMultipartUploadParts`
When configuring the IAM role as a customer, make sure that the permission policy associated with the role grants the required actions on the target folder in the bucket, and the `s3:ListBucket` action on the root of the bucket. View below an example of the minimum permissions policy for this authentication type:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:GetBucketLocation",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": "arn:aws:s3:::bucket/folder/*"
    },
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": "arn:aws:s3:::bucket"
    }
  ]
}
```
See Activate audience data to batch profile export destinations for instructions on activating audiences to this destination.
To verify that data has been exported successfully, check your Amazon S3 storage and make sure that the exported files contain the expected profile populations.
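As a simple illustration of that check, the sketch below counts the data rows in a hypothetical downloaded `.csv` export, assuming a header row; the row count should match the audience's profile count. The file contents and column names are made up for illustration:

```python
import csv
import io

# Hypothetical contents of a downloaded audience export file.
exported_csv = """email,phone_number,last_name
jdoe@example.com,555-0100,Doe
asmith@example.com,555-0101,Smith
"""

# Count the data rows (profiles) in the export, skipping the header.
rows = list(csv.DictReader(io.StringIO(exported_csv)))
print(f"Exported profiles: {len(rows)}")  # → Exported profiles: 2
```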
Refer to the IP address allowlist article if you need to add Adobe IPs to an allowlist.