Why can't the variants run on different pages of my site?
Monetate designed the Automated Personalization algorithm to select the best variant to maximize a customer's chance of completing the experience goal. For the algorithm to make this decision, the individual must have equal eligibility for each variant at the time the Personalization Engine makes its decision.
Consider the following scenario. An Automated Personalization experience includes variant A, which makes a change on the home page, and variant B, which makes a change on the search results page. A visitor arrives on your site and lands on the home page. The engine decides variant B is the best option to display based on everything it has learned. The visitor is on the home page, but the change is made on the search results page. There is no guarantee that the visitor will ever reach the search results page, so the algorithm may learn from the behavior of someone who never saw the actual change. To take full advantage of Automated Personalization experiences, the engine needs to learn behaviors from visitors who actively engage with the changes.
Can I edit context selections after I activate an experience?
Once you activate an Automated Personalization experience, you cannot change the context for the experience. The Automated Personalization algorithm collects specific information about each visitor and then finds the relationships among the information to make decisions. If the model begins collecting a new piece of context, it not only must collect that specific information, but it also must relearn how it relates to the rest of the context.
If a visitor is placed in the holdout in their first session, will they remain in the holdout in later sessions?
Yes, assignment within the holdout works exactly as it does for standard A/B testing. Once a Monetate ID (MID) is assigned to the holdout and a variant within that holdout, the visitor sees that same variant in all subsequent sessions until the MID expires or the experience ends.
If a visitor is placed in the holdout for one Automated Personalization experience, will they be in the holdout for all Automated Personalization experiences?
Visitors are randomly assigned to the holdout group of an Automated Personalization experience on a per-experience basis, balancing the likelihood of seeing a personalized experience versus a holdout experience. This means a visitor may be in the holdout of one Automated Personalization experience and not in the holdout of another. If the engine assigns someone to the holdout for an Automated Personalization experience, that person remains in the holdout for the entire length of that experience but is still eligible for intelligent assignment in other Automated Personalization experiences.
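Monetate does not publish the assignment mechanism, but sticky, per-experience bucketing of this kind is commonly implemented by hashing the visitor ID together with the experience ID. The sketch below is an illustration of that general technique, not Monetate's actual implementation; the function name, holdout percentage, and hashing scheme are all assumptions.

```python
import hashlib

def assign_bucket(mid: str, experience_id: str, holdout_pct: float = 0.1) -> str:
    """Illustrative sketch: deterministically bucket a visitor for one experience.

    Hashing the MID together with the experience ID means the same visitor
    always gets the same answer for a given experience, yet can land in the
    holdout of one experience and not another.
    """
    digest = hashlib.sha256(f"{mid}:{experience_id}".encode()).hexdigest()
    # Map the first 8 hex digits of the hash to a uniform value in [0, 1).
    fraction = int(digest[:8], 16) / 0x100000000
    return "holdout" if fraction < holdout_pct else "personalized"
```

Because the output depends only on the two IDs, a returning visitor keeps the same bucket in every session until the MID expires, matching the behavior described above.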
If I traditionally run two experiences for mobile and desktop, do I need to do the same for Automated Personalization experiences?
Monetate recommends that you run a single Automated Personalization experience across all devices if you can. When an Automated Personalization experience targets only mobile devices, the behaviors it learns are not shared with the equivalent desktop experience. Automated Personalization experiences that incorporate all device types allow the engine to apply what it learned from mobile visitors to desktop visitors. Because the engine uses device type as context, there is no need to create two separate experiences unless you have a business reason to run them separately.
If I use Revenue per session as a goal metric, how does the Personalization Engine handle outliers in the holdout or Automated Personalization experience?
The engine optimizes for Revenue per session by breaking the metric into two parts: conversion rate and order value. Each has its own merit for consideration in the engine. Optimizing conversion rate creates a strong signal to the bandit—visitors either convert or don't. When it comes to order value, the amounts reported can vary greatly. While the engine looks at overall revenue, taking the order values at face value opens the engine up to skewing by outliers. Therefore, by breaking out the order value and transforming it, the engine can find a signal within all the outliers. The combined signal from conversion rate and the transformed order value provides the best of both worlds and allows the engine to optimize for conversions while considering order value.
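Monetate does not disclose the exact transformation it applies, but a logarithmic transform is a standard way to damp outliers in order values. The sketch below uses hypothetical session data to show why the two-part signal is more stable than raw revenue per session: one extreme order drags the raw mean far from typical orders, while the log-scale mean barely moves.

```python
import math

# Hypothetical session outcomes: 0 means no purchase; positive values are order totals.
session_values = [0, 0, 55, 0, 48, 0, 0, 62, 0, 5000]  # one extreme outlier

# Part 1: conversion rate is a clean binary signal (converted or not).
conversion_rate = sum(v > 0 for v in session_values) / len(session_values)

# Part 2: order value, log-transformed so one huge order cannot dominate.
orders = [v for v in session_values if v > 0]
raw_mean = sum(orders) / len(orders)                       # ~1291, dominated by the outlier
log_mean = sum(math.log1p(v) for v in orders) / len(orders)  # stays near typical orders
```

On this data the raw mean order value is roughly twenty times a typical order, while on the log scale the outlier contributes only about twice as much as a typical order, so the engine can still read a signal from the other sessions.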
Why shouldn't I throw as many creatives as I can into an Automated Personalization experience?
Consider an experience with a baseline variant and variant A, which has a single creative. The engine has only two options from which to choose. The engine chooses variant A or the baseline based on how individuals previously performed and ultimately selects variant A. Next, a new visitor comes to your website. The engine considers how new visitors performed against each variant and determines they performed better with the baseline. Now consider that the new visitor is from England. The engine factors that context into its decision, so it must also learn how visitors from England perform with each variant. When you take this scenario and factor in a large number of creatives, the time to move into a state of optimization increases exponentially and the opportunity cost also increases. Your experience is configured for success when it has the minimum number of polarizing changes. Many similar creatives can keep the engine in a state of learning longer than necessary because it must sort through all the noise from the similar variants.
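A back-of-the-envelope sketch can make the cost concrete. Assume, purely for illustration, that the engine needs a fixed amount of evidence for every variant-and-context combination before its estimates are trustworthy; the function below and its numbers are hypothetical, not Monetate's actual model, but they show how the traffic needed to exit the learning phase grows with every creative you add.

```python
def impressions_to_learn(num_variants: int, num_context_segments: int,
                         obs_per_cell: int = 100) -> int:
    """Hypothetical model: each variant x context-segment cell needs its own evidence."""
    return num_variants * num_context_segments * obs_per_cell

# Two polarizing variants across ten context segments: manageable.
small = impressions_to_learn(2, 10)    # 2000 impressions

# Twelve similar creatives across the same segments: six times the traffic,
# spent distinguishing variants that barely differ.
large = impressions_to_learn(12, 10)   # 12000 impressions
```

The real relationship is worse than this linear sketch suggests once context interactions multiply, which is why a few bold, differentiated variants reach optimization faster than many near-duplicates.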
Are Automated Personalization experiences only effective for creative-based experiences?
Automated Personalization experiences are an effective strategy when you want to present different alternatives to your visitors. You can accomplish this goal with creatives, but you can also make meaningful changes at other points in the visitor journey. The key is to create bold, differentiated options for visitors that ultimately help them achieve your goal and improve their relationship with your brand.
What should I know about variant-level performance?
Automated Personalization experiences assign individuals to variants that increase their likelihood of performing the goal metric. Not every individual who arrives on your site has the same chance of performing the goal metric. Some may have a naturally low tendency. If these individuals all appear similar to the engine, they are likely placed in the same variant. When you then look at the performance of this variant, it appears to be the worst variant, but that is a direct result of which people were assigned to it. It is the combined effort of all variants that creates the lift in your goal metric, not any variant in isolation.
What is the maximum number of variants I can add to an Automated Personalization experience?
You can add a maximum of 10 variants to an Automated Personalization experience.