Intelligently re-engage your customers: Luma examples

Last update: 2024-03-07
  • Created for: Admin

Learn how Adobe adapted the Intelligent Re-engagement use case to work with the Luma demo site, building on the foundation implementation documented in the Data Architect and Data Engineer tutorial and the Platform Web SDK tutorial.

Implementation

 Transcript

So how can you execute the Intelligent Re-engagement use case at your company? I’m Daniel Wright, technical marketing engineer, and in this series of videos, we’ll walk you through an example implementation and execution using our fictional brand, Luma. This video is for the developers doing the implementation work, and the other videos will show marketers how to do their part in Real-Time Customer Data Platform and Journey Optimizer. So let’s dive in. The Adobe Experience Platform components that we’ll use in our implementation are schemas, identities, datasets, data ingestion, and profiles. Let’s make sure we understand the business requirements and then transition into our data model and schemas.

We have three scenarios. First is abandoned browse, in which a customer browses a piece of content but doesn’t take the next step, and you want to re-engage them. We’re going to showcase this in a retail scenario, but you can apply it to almost any industry. Maybe you’re a financial institution and somebody reads about your auto loans but doesn’t apply for one. Next is abandoned cart, in which a customer adds a product to their shopping cart but doesn’t complete the purchase. Finally, we have an order confirmation scenario, which is sending a message after a purchase or other conversion event.

So what data do we need to pull this off? I’m going to break this problem down into three themes: identification, qualification, and messaging. Let’s start with identification. Who is this user, so we can message them? They need to be what we call a known user. We need to be able to connect online behavior to an email address or mobile phone number. Typically, you would get this information by having an account creation process, and they occasionally log in to your website or mobile app. For in-store purchases, it’s a good idea to have an incentive to self-identify, say by entering loyalty program details at checkout. Qualification is about what makes them eligible. In our use case, there are different events used in these scenarios, like viewing a product, adding it to the shopping cart, or purchasing something. Accompanying these events, you need things like timestamps, because the passage of time is another important element of qualification. Finally, we have messaging. What message do you want to show them? If you want to show them the actual products they browsed, abandoned, or purchased, then you need to collect SKUs, product names, images, and things like that. Also, what channel will you use to message them? What are their communication preferences? Now that we know the scenarios and the data points they require, we can build an entity relationship diagram of our data model.

Here are the three main schemas mentioned in the use case: customer attributes, which uses the XDM Individual Profile class, since these are attributes of the customer; customer digital transactions, which uses the XDM ExperienceEvent class, since this captures actions that a customer has taken on the website or mobile app over a period of time; and customer offline transactions, which also uses the XDM ExperienceEvent class, again because these are actions that customers have taken. Here’s the thing: you don’t need to use these exact schemas, you just need to collect the right data for the use case. For example, here’s our data model for Luma. I like to have separate schemas for each data source. It might look more complex, but I find it more intuitive. Your data model is probably going to be even more elaborate, because you’re going to onboard data from additional sources and need additional fields for additional use cases. The important thing is that you’re collecting the right data in the right format. By the right format, I mean use the ExperienceEvent class for event data, like the product views and purchases, and use the XDM Individual Profile class for attribute data, like email addresses and consent preferences. Use the right data types and field properties to help ensure data integrity. For the most part, I’m using the same field groups that were outlined in the use case document.

Now let’s talk about identities and profiles. We want to use data from our website and mobile app, in-store purchases, and then systems with attribute data about our customers, like CRM or loyalty. We’re going to stitch the data from these sources together to build real-time customer profiles. Identity fields and the identity graph are what make this possible. Let’s take a look at how we collect these with Luma, starting with the CRM system. Our CRM system uses a CRM ID as the primary identity, and you can see how it’s labeled as such in the schema. Note that my email and phone number fields are also in this schema, and I don’t have to use them as identities. Our loyalty system uses a loyalty ID as its primary identity, but it also uses the CRM ID as a secondary identity. This allows us to connect profiles between the CRM and loyalty systems. Note that these identity fields are custom fields added through a custom field group, and I’ve also created custom identity namespaces for them. Our offline purchases schema, which captures those in-store orders, also uses the loyalty ID as the primary identity.

Anonymous in-store purchases, which wouldn’t have a loyalty ID, aren’t helpful in the use case. In our use case, we use the in-store purchases to send or suppress messages, so if we can’t tie an in-store purchase back to a known user, we don’t need that data for our scenarios anyway. Our consent and test profile schemas also use the CRM ID as the primary identity.

Now let’s look at the web and mobile schema. Note that I’m not using any fields labeled as identities. Instead, I’m using the identity map, which is automatically part of every ExperienceEvent schema. When you use the identity map, you specify the identity namespace and whether or not the identity is primary when you send the data to Platform. I’ll explain this more when we get to the data ingestion portion.

So, by having a well-considered data model with good identity selection, identity graphs can form.

Let’s take a look at one. I’ll go to the Identities screen and search for one of my CRM IDs. I can turn on the visualization so we can see how the loyalty ID, CRM ID, and device IDs, called Experience Cloud IDs or ECIDs, are all graphed together for this user.

So, we’ve reviewed the schemas and identities. Now let’s take a look at our datasets. Datasets are no big deal; they just take a few seconds to create, and I use one for each schema mentioned earlier. I do like to use separate datasets for my web and mobile data.

To build profiles, you need to take one minor step and enable both schemas and datasets for Profile, which is done through these toggles. You need to have identity fields to enable Profile, and with a schema using the identity map, there is an additional dialog you go through to confirm that you’ll be passing your identities that way.

Once you have schemas and datasets enabled for Profile and ingest data, you can see profiles in the interface. So let’s click through to the Profile viewer, where we can see that this profile contains attributes from CRM, loyalty, and consent, as well as event data from the website and mobile app. Now let’s pivot to data ingestion, and we’ll start with the website and mobile app. We’ve implemented with the Experience Platform Web and Mobile SDKs, which can send data to Platform as well as to other Adobe applications and third parties. But this isn’t actually a requirement: if you use AppMeasurement for your web analytics data, you can use the Adobe Analytics source connector to pull that data into Platform. And if you use a non-Adobe analytics vendor on your website and mobile app, you can ingest that data into Platform too, using source connectors or APIs.

When someone logs into the Luma website, they become a known user, and we can pass their authenticated customer ID to Adobe. If I inspect a network call, you can see I’m passing the Luma CRM ID as the primary identity in the hit. I used Tags to implement the Web SDK, and there is a special identity map data element type that you can use to set these authenticated IDs.
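For reference, if you were calling the Web SDK directly rather than through a Tags data element, an authenticated identity passed in the identity map might look something like the sketch below. The namespace symbol `lumaCRMId` and the ID value are illustrative placeholders; use the custom namespace you created in Platform.

```javascript
// Minimal sketch: `alloy` is the global command function created by the
// Web SDK base code. The namespace and ID value below are illustrative.
alloy("sendEvent", {
  xdm: {
    eventType: "web.webpagedetails.pageViews",
    identityMap: {
      lumaCRMId: [
        {
          id: "112ca06ed53d3db37e4cea49cc45b71e", // authenticated CRM ID
          authenticatedState: "authenticated",
          primary: true
        }
      ]
    }
  }
});
```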

There’s another identity, a device ID called the Experience Cloud ID or ECID, which gets added to the identity map on the Platform Edge Network before the data is sent to Platform. So I won’t see it in the call, but it will be there when the data gets into Platform, and the ECID is used as the primary identity when the user is not logged in. The mobile app does the same thing. We also collect our events through the SDK: the product views, adds to cart, and purchases. There are two common ways of doing this. The first is to use the eventType field. Note here I have eventType set to commerce.productViews. I also have commerce.productViews.value set to 1. That’s another way of signaling that a product view has occurred. I can only send one eventType per call, but with that .value approach, you can indicate that multiple events have occurred within a single call. It’s just important that your marketing team knows which approach you’re using, so they can build their journeys and audiences correctly.
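To make the two approaches concrete, here is a hedged sketch of the same product view expressed both ways. The SKU value is illustrative, and the second call simply shows that multiple .value measures can ride along in one event.

```javascript
// Approach 1: signal the event with eventType (one eventType per call).
alloy("sendEvent", {
  xdm: {
    eventType: "commerce.productViews",
    commerce: { productViews: { value: 1 } },
    productListItems: [{ SKU: "LL062" }] // illustrative SKU
  }
});

// Approach 2: signal one or more events with the .value measures,
// e.g. a product view and an add-to-cart indicated in a single call.
alloy("sendEvent", {
  xdm: {
    commerce: {
      productViews: { value: 1 },
      productListAdds: { value: 1 }
    },
    productListItems: [{ SKU: "LL062" }]
  }
});
```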

The SDK will automatically add timestamps and IDs to every event, so you don’t need to worry about those. Other events come from in-store purchases; those can be streamed or batched into Platform using the available source connectors in our catalog or by using the API. I used the API to ingest my sample data. This data should map to the same XDM fields as your web and mobile data so that you can easily identify purchases across all sales channels.
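As a rough sketch of what batch ingestion via the API generally involves (not the exact calls used in the video): create a batch against the target dataset, upload a file of XDM records into it, then mark the batch complete. The dataset ID, file name, credentials, and sandbox name below are placeholders, and you should verify the endpoints and expected file format against Adobe’s current Batch Ingestion API reference.

```javascript
// Rough sketch of the Batch Ingestion API flow (Node 18+, global fetch).
const BASE = "https://platform.adobe.io/data/foundation/import";
const headers = {
  "Authorization": `Bearer ${process.env.ACCESS_TOKEN ?? ""}`,
  "x-api-key": process.env.API_KEY ?? "",
  "x-gw-ims-org-id": process.env.IMS_ORG_ID ?? "",
  "x-sandbox-name": "prod",                 // placeholder sandbox
  "Content-Type": "application/json"
};

async function ingestOfflinePurchases(datasetId, records) {
  // 1. Create a batch that targets the offline purchases dataset
  const createRes = await fetch(`${BASE}/batches`, {
    method: "POST",
    headers,
    body: JSON.stringify({ datasetId, inputFormat: { format: "json" } })
  });
  const { id: batchId } = await createRes.json();

  // 2. Upload a JSON file of XDM records into the batch
  await fetch(`${BASE}/batches/${batchId}/datasets/${datasetId}/files/purchases.json`, {
    method: "PUT",
    headers: { ...headers, "Content-Type": "application/octet-stream" },
    body: JSON.stringify(records)
  });

  // 3. Signal completion so Platform processes the batch
  await fetch(`${BASE}/batches/${batchId}?action=COMPLETE`, { method: "POST", headers });
}
```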

What about our messaging requirements? How do we collect the product details in case we want to display things like product names and images in our re-engagement messages? I’m collecting the product SKUs and names here in the Web SDK implementation; you can see them in this productListItems array. The image URLs I don’t collect client-side. If we go back to my ERD, note that I have a separate product catalog schema that uses a custom product catalog class. I use what’s called a schema relationship to map the SKUs from my events to this schema, so it’s basically a lookup table, and all I really need to collect in my events is the SKU.
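For reference, a hedged sketch of how those product details might appear in the event payload is shown below; the SKU and name values are illustrative. Because the product catalog schema is joined to events on SKU through the schema relationship, downstream messages can look up names and image URLs without collecting them client-side.

```javascript
// Illustrative productListItems payload on a product view event.
// Only SKU is strictly needed for the catalog lookup; name is a convenience.
alloy("sendEvent", {
  xdm: {
    eventType: "commerce.productViews",
    commerce: { productViews: { value: 1 } },
    productListItems: [
      {
        SKU: "LL062",                       // key used by the schema relationship
        name: "Sprite Yoga Companion Kit"   // illustrative product name
      }
    ]
  }
});
```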

Now let’s move on to our customer attribute data. CRM data can be onboarded from the CRM source connectors available in the catalog, through cloud storage connectors, or via API. I’m using the API to batch ingest sample data.

Consent preferences can also be onboarded from the consent and preferences source connectors in the catalog, through cloud storage connectors, or via API. Again, I’m just using the API to batch ingest my sample data.
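As a point of reference, a record in the consent dataset might look roughly like the sketch below, assuming the standard Consents and Preferences field group is used; the tenant namespace `_lumaretail`, the CRM ID, and the consent values are illustrative.

```javascript
// Illustrative consent record keyed on the CRM ID (the schema's primary identity).
const consentRecord = {
  _lumaretail: {                                   // illustrative tenant namespace
    crmId: "112ca06ed53d3db37e4cea49cc45b71e"
  },
  consents: {
    collect: { val: "y" },                         // OK to collect data
    marketing: {
      email: { val: "y" },                         // OK to email
      sms: { val: "n" },                           // no SMS
      push: { val: "p" }                           // pending
    }
  }
};
```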

Loyalty system data can be onboarded using cloud storage connectors or the API; again, I’m using the API. As for the test profiles, since these are more of an internal configuration for the marketer, they’re usually just dragged into the UI on the dataset screen. So that should cover data ingestion and our messaging requirements. And that concludes our implementation video, showing how we built our schemas, chose and populated our identities, built datasets, ingested data, and constructed real-time customer profiles. As you can see, we took the design pattern from the use case document, broke down the problem into the areas of identification, qualification, and messaging requirements, and used an entity relationship diagram to help us focus on the use case and make sure we were capturing everything we needed to accomplish it. I hope you are able to use this example to implement the use case at your business.

Journey Configuration

Audience and Destination Configuration

 Transcript

Hi, it’s Daniel. In this video, I’m going to show you how we set up the activation of our paid media campaign for the Intelligent Re-engagement abandoned browse scenario. We set up this scenario on our retail demo brand, Luma. First, let’s review the scenario diagram. Eligibility for the paid media campaign begins when a customer views a product on either our website or mobile app. After the product view, we want to give the customer a little time. Maybe they’re going to make a purchase without a nudge. Maybe it’s a purchase decision which takes a few days, and they’re still actively engaged with our brand. We don’t want to waste advertising budget on active customers. So we’ll wait for three days, and if they haven’t engaged with our brand again, we’ll enter them into our paid media campaign. We’ll also send them a message from our Adobe Journey Optimizer journey, which is covered in a separate video. We show them the ad for three days, and then that’s it. Whether they engage or not, we want our paid media campaign to end. To execute this, we need an audience to detect the lack of brand engagement and a destination to which we’re going to send this audience.

Let’s start with the audience. I’m going to show you the end result and then walk you through how we got there. We ultimately built three audiences: two which look for specific behaviors and a third which combines them. Let’s dive in. We started by tackling this portion of the diagram, looking for a product view and then excluding people who engaged with the brand, engagement with the brand being defined as anyone who bought something or even just came back to the mobile app or website. This is what our final definition looks like. If you’re new to building audiences, this plain-language description is incredibly helpful for understanding how the audience behaves. Now I’ll show you how we actually built the audience. I’ll start by dragging the product view event onto the canvas. Next, we add our exclusions: excluding people who made a purchase, excluding people who launched the mobile app, and finally excluding anybody who visited the website. I’ll explain why I didn’t just use the page views event in a minute. Also, it’s very important to note that I use the AND condition in the exclusion to make sure none of these events occur.

When choosing how to define these events in the audience builder, we need to know a little bit about how events are collected in the source systems gathering this data, and this might be different for your implementation. We pass eventType commerce.productViews on product pages on our website and mobile app, which is why I was able to drag that product view event into the audience builder. Those easy-to-grab events are dependent on the use of eventType. In our Web SDK implementation, we don’t pass eventType web.webpagedetails.pageViews on every page load. On this product page, remember, I’m passing commerce.productViews, and there’s a limit of one eventType per call. We do pass web.webPageDetails.pageViews.value = 1 on every page, which is why I use that in my audience definition.

OK, let’s resume building our audience. Looking at our diagram again, we see there are some time considerations. We don’t want the visitor to qualify for this audience unless they’ve been disengaged for at least three days after the product view. Also, after six days, we want them to fall out of the audience and no longer be in the paid media campaign.
There are a lot of places to add time constraints in the audience builder, and they each have a different impact. There’s one here, here, and under here. For this use case, I added it to the product view event: I add a rolling range of three to six days ago. The description down here is your friend for seeing in plain language whether the time constraint makes sense. Now, one thing we realized is that we don’t want to be too strict about banning brand engagement immediately after the first product view. For example, if someone looked at one product and then right afterwards looked at some other products, went to the home page, or really went anywhere else on the website or mobile app, we wouldn’t want to kick those types of people out. So this is where our other time consideration comes in. We only want to look for the absence of brand engagement beginning an hour after that initial product view, which gives a little buffer for the visitor to complete that visit or session. We add that buffer here, before the exclusion. So that’s our first audience. We save it as a batch audience, which is entirely suitable because of the time horizon.

Now, because of that one-hour buffer, we need to be careful. We just created a blind spot, and we don’t want people who make a purchase or add a product to their cart during that hour to qualify for our paid media campaign. We don’t want to spend ad dollars on people who just bought something from us, and people who added something to their cart but didn’t purchase we’d prefer to save for our abandoned cart scenario. So we built out our next audience to look for product views in that same window of three to six days ago, followed by no purchase or add-to-cart within one hour. At first, we tried to build this logic into our original audience, but there is a restriction on sequencing exclusions. Now we can build out the third audience, which just looks for anyone who qualified for both of the other two audiences.

Now that we have our audience, we can activate it to our advertising destination. The configuration is going to be different based on what destination you use. All advertising destinations require you to use something as an identifier. For people who’ve never logged in to the website, a destination like Google DV360 might be useful. To do this successfully, you’d need to implement a sync container on your website and mobile app to synchronize identifiers. For people who have authenticated, you can use other destinations like Google Customer Match and then use a hashed email, phone number, or mobile device ID as the identifier. If you haven’t yet configured a destination, here’s a quick overview of the process. You find the destination you want to use in the catalog and configure it. Many destinations require more authentication and account details than this one. Once the destination is configured, you can add audiences to it. Advertising destinations only share audience qualification; they don’t share profile attributes, so we can’t share the details of the last product the customer viewed.

Now, I have a bunch of test scenarios I use to validate the behavior of this audience. But what you should see when you test on your own website and mobile app is this: immediately after browsing the website or mobile app, you should see the product view event captured in your profile. On the third or fourth day after the product view, you should see that you qualify for the audience, and you should continue to qualify for the next few days.
Remember, these are all batch audiences and will evaluate once a day. You can view the timestamp of the last audience evaluation by opening the audience in edit mode. So if you viewed a product immediately after the last evaluation, it might take a little longer for you to qualify for the audience than somebody who viewed a product a couple of hours before the audience evaluated. The audience qualification leads to the person’s qualification status being shared to the configured destination, and assuming you’ve configured your paid media campaign, they will qualify for that. After six or seven days, the person should fall out of the audiences and then stop seeing the ad campaign. That’s it. I hope this helps you implement the abandoned browse scenario at your company.
