This page is specific to the Data Landing Zone source connector in Experience Platform. For information on connecting to the Data Landing Zone destination connector, refer to the Data Landing Zone destination documentation page.
Data Landing Zone is an Azure Blob storage interface provisioned by Adobe Experience Platform, granting you access to a secure, cloud-based file storage facility for bringing files into Experience Platform. Every Experience Platform customer is provisioned with one Data Landing Zone container per sandbox, and the total data volume across all containers is limited to the total data provided with your Platform Products and Services license. You can read and write files to your container through Azure Storage Explorer or a command-line interface.
Data Landing Zone supports SAS-based authentication, and its data is protected with standard Azure Blob storage security mechanisms at rest and in transit. SAS-based authentication allows you to securely access your Data Landing Zone container over a public internet connection. No network changes are required, which means you do not need to configure any allow lists or cross-region setups for your network. Experience Platform enforces a strict seven-day expiration time on all files and folders uploaded to a Data Landing Zone container: after seven days, they are deleted.
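As a quick illustration of that retention window, the deletion time for an uploaded file can be computed from its upload time. This is a sketch only, assuming expiry is measured from the moment of upload:

```python
from datetime import datetime, timedelta, timezone

# Data Landing Zone retention: files and folders are deleted seven days
# after they are uploaded.
RETENTION = timedelta(days=7)

def deletion_time(uploaded_at: datetime) -> datetime:
    """Return the time at which an uploaded file will be removed."""
    return uploaded_at + RETENTION

uploaded = datetime(2024, 1, 1, 9, 30, tzinfo=timezone.utc)
print(deletion_time(uploaded))  # 2024-01-08 09:30:00+00:00
```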
Follow the steps below to learn how you can set up your Data Landing Zone account for Experience Platform on Azure.
If you want to access Data Landing Zone from Azure Data Factory, then you must create a linked service for Data Landing Zone using the SAS credentials provided by Experience Platform. Once you have created your linked service, you can then explore your Data Landing Zone by selecting the container path instead of the default root path.
The following is a list of constraints that you must account for when naming your cloud storage files or directories.

- Directory and file names cannot end with a forward slash (`/`). If provided, it will be automatically removed.
- The following reserved URL characters must be properly escaped: `! ' ( ) ; @ & = + $ , % # [ ]`
- The following characters are not allowed: `" \ / : | < > * ?`
- Characters such as `\uE000`, while valid in NTFS filenames, are not valid Unicode characters. In addition, some ASCII or Unicode characters, like control characters (such as `0x00` to `0x1F`, `\u0081`, and so on), are also not allowed. For rules governing Unicode strings in HTTP/1.1, see RFC 2616, Section 2.2: Basic Rules and RFC 3987.

You can use Azure Storage Explorer to manage the contents of your Data Landing Zone container.
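The naming constraints can be checked before upload. The following is a minimal sketch; the helper name and character sets are illustrative, mirroring the constraint list, and this is not an official validation routine:

```python
ILLEGAL_CHARS = set('"\\/:|<>*?')  # characters that are not allowed in names
CONTROL_CHAR_LIMIT = 0x1F          # 0x00 to 0x1F are disallowed control characters

def validate_name(name: str) -> str:
    """Strip a trailing slash (the service removes it automatically anyway)
    and reject characters the naming rules disallow."""
    name = name.rstrip("/")
    for ch in name:
        if ch in ILLEGAL_CHARS:
            raise ValueError(f"Illegal character {ch!r} in name")
        if ord(ch) <= CONTROL_CHAR_LIMIT or ch == "\u0081":
            raise ValueError(f"Control character {ch!r} is not allowed")
    return name

print(validate_name("daily_upload.csv/"))  # daily_upload.csv
```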
In the Azure Storage Explorer UI, select the connection icon in the left navigation. The Select Resource window appears, presenting the resources you can connect to. Select Blob container to connect to your Data Landing Zone.
Next, select Shared access signature URL (SAS) as your connection method, and then select Next.
After selecting your connection method, you must next provide a display name and the Blob container SAS URL that corresponds with your Data Landing Zone container.
You can retrieve your Data Landing Zone credentials from the sources catalog in the Platform UI.
Provide your Data Landing Zone SAS URL and then select Next.
The Summary window appears, providing you with an overview of your settings, including information on your Blob endpoint and permissions. When ready, select Connect.
A successful connection updates your Azure Storage Explorer UI with your Data Landing Zone container.
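The SAS URL you provided bundles the Blob endpoint and the SAS token into a single string. If you need them separately (for example, for the REST API call shown later on this page), you can split the URL at its query string. The URL below is a hypothetical placeholder:

```python
from urllib.parse import urlsplit

# Hypothetical SAS URL of the shape provided in the Platform UI; the query
# string after '?' is the SAS token.
sas_url = "https://account.blob.core.windows.net/container?sv=2020-08-04&sig=abc%3D"

parts = urlsplit(sas_url)
blob_endpoint = f"{parts.scheme}://{parts.netloc}{parts.path}"
sas_token = f"?{parts.query}"

print(blob_endpoint)  # https://account.blob.core.windows.net/container
print(sas_token)      # ?sv=2020-08-04&sig=abc%3D
```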
With your Data Landing Zone container connected to Azure Storage Explorer, you can now start uploading files to your Data Landing Zone container. To upload, select Upload and then select Upload Files.
Once you have selected the file you want to upload, you must then identify the Blob type that you want to upload it as and your desired destination directory. When finished, select Upload.
| Blob types | Description |
| --- | --- |
| Block Blob | Block Blobs are optimized for uploading large amounts of data in an efficient manner. Block Blobs are the default option for Data Landing Zone. |
| Append Blob | Append Blobs are optimized for appending data to the end of a file. |
You can also use your device's command-line interface to upload files to your Data Landing Zone.
The following example uses Bash and cURL to upload a file to a Data Landing Zone with the Azure Blob Storage REST API:
```shell
# Set Azure Blob-related settings
DATE_NOW=$(date -Ru | sed 's/\+0000/GMT/')
AZ_VERSION="2018-03-28"
AZ_BLOB_URL="<URL TO BLOB ACCOUNT>"
AZ_BLOB_CONTAINER="<BLOB CONTAINER NAME>"
AZ_BLOB_TARGET="${AZ_BLOB_URL}/${AZ_BLOB_CONTAINER}"
AZ_SAS_TOKEN="<SAS TOKEN, STARTING WITH ? AND ENDING WITH %3D>"

# Path to the file we wish to upload
FILE_PATH="</PATH/TO/FILE>"
FILE_NAME=$(basename "$FILE_PATH")

# Execute HTTP PUT to upload file (remove '-v' flag to suppress verbose output)
curl -v -X PUT \
  -H "Content-Type: application/octet-stream" \
  -H "x-ms-date: ${DATE_NOW}" \
  -H "x-ms-version: ${AZ_VERSION}" \
  -H "x-ms-blob-type: BlockBlob" \
  --data-binary "@${FILE_PATH}" "${AZ_BLOB_TARGET}/${FILE_NAME}${AZ_SAS_TOKEN}"
```
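The `DATE_NOW` line above produces the RFC 1123 timestamp that Azure expects in the `x-ms-date` header. If you are scripting the same call from Python instead of Bash, the equivalent value can be computed as follows:

```python
from datetime import datetime, timezone

# RFC 1123 timestamp for the x-ms-date header,
# for example "Wed, 01 Jan 2025 00:00:00 GMT"
x_ms_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
print(x_ms_date)
```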
The following example uses Microsoft’s Python v12 SDK to upload a file to a Data Landing Zone:
While the example below uses the full SAS URI to connect to an Azure Blob container, you can use other methods and operations to authenticate. See this Microsoft document on Python v12 SDK for more information.
```python
import os

from azure.storage.blob import ContainerClient

try:
    # Set Azure Blob-related settings
    sasUri = "<SAS URI>"
    srcFilePath = "<FULL PATH TO FILE>"
    srcFileName = os.path.basename(srcFilePath)

    # Connect to container using SAS URI
    containerClient = ContainerClient.from_container_url(sasUri)

    # Upload file to Data Landing Zone with overwrite enabled
    with open(srcFilePath, "rb") as fileToUpload:
        containerClient.upload_blob(srcFileName, fileToUpload, overwrite=True)
except Exception as ex:
    print(f"Exception: {ex}")
```
The following example uses Microsoft’s AzCopy utility to upload a file to a Data Landing Zone:
While the example below uses the `copy` command, you can use other commands and options to upload a file to your Data Landing Zone using AzCopy. See this Microsoft AzCopy document for more information.
```shell
set sasUri=<FULL SAS URI, PROPERLY ESCAPED>
set srcFilePath=<PATH TO LOCAL FILE(S); WORKS WITH WILDCARD PATTERNS>

azcopy copy "%srcFilePath%" "%sasUri%" --overwrite=true --recursive=true
```
This section applies to implementations of Experience Platform running on Amazon Web Services (AWS). Experience Platform running on AWS is currently available to a limited number of customers. To learn more about the supported Experience Platform infrastructure, see the Experience Platform multi-cloud overview.
Follow the steps below to learn how you can set up your Data Landing Zone account for Experience Platform on Amazon Web Services (AWS).
Use the `aws configure` command to set up your CLI with access keys and a session token.

```shell
aws configure
```
When prompted, enter the following values:

- AWS Access Key ID: `{YOUR_ACCESS_KEY_ID}`
- AWS Secret Access Key: `{YOUR_SECRET_ACCESS_KEY}`
- Default region name: `{YOUR_REGION}` (for example, `us-west-2`)
- Default output format: `json`
Next, set the session token:

```shell
aws configure set aws_session_token your-session-token
```
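As an alternative to `aws configure`, the AWS CLI also reads its credentials from environment variables. If you drive the CLI from a Python script (for example, via `subprocess`), you can set these variables for child processes; the values below are placeholders:

```python
import os

# Equivalent configuration through environment variables, which the AWS CLI
# reads automatically (placeholder values shown).
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-access-key"
os.environ["AWS_SESSION_TOKEN"] = "your-session-token"
os.environ["AWS_DEFAULT_REGION"] = "us-west-2"
```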
To upload a file to your Data Landing Zone, use the following template:

```shell
aws s3 cp local-file-path s3://bucketName/dlzFolder/remote-file-name
```

Example:

```shell
aws s3 cp example.txt s3://bucketName/dlzFolder/example.txt
```
To download a file from your Data Landing Zone, use the following template:

```shell
aws s3 cp s3://bucketName/dlzFolder/remote-file local-file-path
```

Example:

```shell
aws s3 cp s3://bucketName/dlzFolder/example.txt example.txt
```
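When many files move through the same container path, it can help to assemble these commands programmatically. A small sketch follows; the helper names are illustrative and not part of any SDK:

```python
# Hypothetical helpers that assemble the `aws s3 cp` invocations shown above.
def s3_upload_cmd(local_path: str, bucket: str, folder: str, remote_name: str) -> str:
    """Build the upload command: local file -> Data Landing Zone folder."""
    return f"aws s3 cp {local_path} s3://{bucket}/{folder}/{remote_name}"

def s3_download_cmd(bucket: str, folder: str, remote_name: str, local_path: str) -> str:
    """Build the download command: Data Landing Zone folder -> local file."""
    return f"aws s3 cp s3://{bucket}/{folder}/{remote_name} {local_path}"

print(s3_upload_cmd("example.txt", "bucketName", "dlzFolder", "example.txt"))
# aws s3 cp example.txt s3://bucketName/dlzFolder/example.txt
```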
First, you must obtain the following:

- `awsAccessKeyId`
- `awsSecretAccessKey`
- `awsSessionToken`
Next, use the extracted credentials to create a session and generate a sign-in token using the AWS Federation endpoint:
```python
import json

import requests

# Example DLZ response with credentials
response_json = '''{
    "credentials": {
        "awsAccessKeyId": "your-access-key",
        "awsSecretAccessKey": "your-secret-key",
        "awsSessionToken": "your-session-token"
    }
}'''

# Parse credentials
response_data = json.loads(response_json)
aws_access_key_id = response_data['credentials']['awsAccessKeyId']
aws_secret_access_key = response_data['credentials']['awsSecretAccessKey']
aws_session_token = response_data['credentials']['awsSessionToken']

# Create session dictionary
session = {
    'sessionId': aws_access_key_id,
    'sessionKey': aws_secret_access_key,
    'sessionToken': aws_session_token
}

# Generate the sign-in token
signin_token_url = "https://signin.aws.amazon.com/federation"
signin_token_payload = {
    "Action": "getSigninToken",
    "Session": json.dumps(session)
}
signin_token_response = requests.post(signin_token_url, data=signin_token_payload)
signin_token = signin_token_response.json()['SigninToken']
```
Once you have the sign-in token, you can then build the URL that logs you in to the AWS Console and points directly to the desired Amazon S3 bucket.
```python
from urllib.parse import quote

# Define the S3 bucket and folder path you want to access
bucket_name = "your-bucket-name"
bucket_path = "your-bucket-folder"

# Construct the destination URL
destination_url = f"https://s3.console.aws.amazon.com/s3/buckets/{bucket_name}?prefix={bucket_path}/&tab=objects"

# Create the final sign-in URL
signin_url = f"https://signin.aws.amazon.com/federation?Action=login&Issuer=YourAppName&Destination={quote(destination_url)}&SigninToken={signin_token}"

print(f"Sign-in URL: {signin_url}")
```
Finally, navigate to the generated URL to directly log in to the AWS Console with your Data Landing Zone credentials, which provides access to a specific folder within an Amazon S3 bucket. The sign-in URL will take you directly to that folder, ensuring that you only see and manage permitted data.
To connect to the source, you need the View Sources and Manage Sources access control permissions. For more information, read the access control overview or contact your product administrator to obtain the required permissions.
Private links are currently not supported when connecting to Experience Platform using Data Landing Zone. The only supported access methods are those listed on this page.
The documentation below provides information on how to bring data from your Data Landing Zone container to Adobe Experience Platform using APIs or the user interface.