Active tests not showing

A/B Testing is great

Sitecore’s Experience Optimization content testing functionality is a powerful tool in the Sitecore XP arsenal.  When coupled with a solid engagement value strategy, marketers can get deep insights into what actually works (rather than what we think may work).

…when it works

We recently hit an issue in an upgraded (8.2 -> 9.0.1) installation that would not display newly created tests correctly. The process of creating the test went smoothly, and the test would actually be running… but it would not display in the “Active Tests” tab of Experience Optimization. Drafts and Historical tests all displayed as expected.

I just created a new test… but no active tests 😦

Testing through our environments showed that things were working locally, but every upstream deployment reproduced the issue.

All tests get indexed by your search provider upon creation. As part of the upgrade we’d also switched search providers to Azure Search (locally we use Solr), so I thought I’d investigate the [sitecore-testing-index]. All looked ok ¯\_(ツ)_/¯ . Documents were being added for each test we created, with what seemed to be appropriate data. There was also a fairly cryptic INFO message in the logs when attempting to load the Active tests:

9300 02:31:53 INFO AzureSearch Query [sitecore_testing_index]: &search=This_Is_Equal_ConstNode_Return_Nothing

After hitting up colleagues and the Sitecore community Slack we were still at a loss, so we raised a Sitecore support ticket. After an open dialogue and many config files later, we had a solution!

The solution

This was a config issue with Azure Search. One of the Sitecore fields to be added to the [sitecore-testing-index] is a flag indicating whether the test is running (“__is running“). The default configs fail to add this field to the index correctly. Easily fixed… it’s a one-line change (isn’t it always?).

In the Content Testing configs for 9.0.1 (and 9.0.2) provided in the Sitecore package downloads (for Azure App Service), you’ll see the following:

<field fieldName="__is_running" cloudFieldName="is_running____" boost="1f" type="System.Boolean" settingType="Sitecore.ContentSearch.Azure.CloudSearchFieldConfiguration, Sitecore.ContentSearch.Azure" />

This code needs to change to the following (I recommend applying it as a config patch, like the sketch below, to avoid any future deployment/upgrade issues):

<field fieldName="__is running" cloudFieldName="is_running_____b" boost="1f" type="System.Boolean" settingType="Sitecore.ContentSearch.Azure.CloudSearchFieldConfiguration, Sitecore.ContentSearch.Azure" />

The changes are subtle, but important! Firstly, it references the field name correctly (with a space). Secondly, it gives the cloud field name a _b suffix to further indicate it is a Boolean.
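If you’d rather not hand-edit the stock file, a patch along these lines should do it. This is a minimal sketch: the element path below must mirror wherever the broken field definition lives in your Content Testing Azure index configuration, so verify it against your install before use.

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:search="http://www.sitecore.net/xmlconfig/search/">
  <sitecore search:require="Azure">
    <contentSearch>
      <indexConfigurations>
        <!-- Mirror the path to the broken <field> definition from your
             Content Testing Azure index configuration here -->
        <cloudIndexConfiguration>
          <fieldMap>
            <fieldNames>
              <!-- Swap the broken definition for the corrected one -->
              <field patch:instead="*[@fieldName='__is_running']"
                     fieldName="__is running"
                     cloudFieldName="is_running_____b"
                     boost="1f"
                     type="System.Boolean"
                     settingType="Sitecore.ContentSearch.Azure.CloudSearchFieldConfiguration, Sitecore.ContentSearch.Azure" />
            </fieldNames>
          </fieldMap>
        </cloudIndexConfiguration>
      </indexConfigurations>
    </contentSearch>
  </sitecore>
</configuration>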

After making this change and then rebuilding the [sitecore-testing-index] in the control panel, marketers and analytics teams rejoiced as they were able to view all of their glorious active tests.

 

Marketing Automation stored procedures and tables missing

Seen this message in the exception logs of your shiny new Azure app service instance of Sitecore 9?

Could not find stored procedure ‘xdb_ma_pool.AutomationPool_Stall’

ARRRRRGH. Missing Stored Procedures?!?! Who’s been dropping stuff in my DB?! It’s newly deployed… how can this be?! It worked on my dev machine! Never fear, there’s an explanation and an easy fix.

No… you haven’t been dropping DB objects in your sleep. This just appears to be an oversight in the Marketing Automation SQL dacpac in the Sitecore 9 Azure App Service web deploy packages. In fact, there are a few other objects missing too:

Stored Procedures:

  • AutomationPool_Stall
  • ContactWorkerAffinity_ReleaseLease
  • ContactWorkerAffinity_TakeLease

Table:

  • StalledAutomationPool

This has been confirmed by Sitecore support and they’ve published a KB article: https://kb.sitecore.net/articles/065636. The KB now contains a SQL script to create the missing objects. Running this script against the Marketing Automation DB in your Azure environment should create the missing objects and resolve the issue. This is confirmed as an issue on 9.0.1 and 9.0.2.

We’ll be adding this as a post-install script to all our ARM template deployments to avoid any manual steps in future deployments.

 

Sitecore dashboards in Azure portal

Sitecore 9 in Azure PaaS offers a robust ecosystem that allows you to monitor the performance of your application and infrastructure. Metrics are made available via the built-in metrics of the PaaS environment and metrics logged to Application Insights. Sitecore also captures custom metrics relevant to Sitecore operations and pushes them to App Insights. By default there is a lot of data being captured, so much so that it’s worth trimming any metrics you don’t require. Check out Sitecore’s post-deployment recommendations for App Insights.

Application Insights is amazingly powerful, allowing you to build queries (in the Kusto query language), but with power comes great responsibility. Querying to trawl logs is one thing, but at a glance I want to see the health of my Sitecore instance. The Azure portal offers a customisable dashboard system so that you can be greeted with graphs, metrics and labels, then quickly switch between other dashboards (possibly for other projects or environments). Graphs and charts can easily be customised in the Metrics blade then quickly added to your dashboard by selecting “Pin to dashboard”. Dashboards can also be shared with other users in your Azure subscription. Microsoft has provided some solid documentation on creating, customising and sharing dashboards.

Usually I’ll try to get as much relevant data as possible into two sections: one for public facing (e.g. CD metrics, uptime, response times, etc.) and one for Sitecore “behind the scenes” (CM/Processing and Reporting metrics). There are some key metrics I like to keep an eye on. These are usually indicators that something might be wrong and it’s time to investigate the cause by drilling down or hitting the logs.

SQL

  • DTU Percentage
DTU % utilization for all databases

App Services & App Service plans

  • CPU Percentage
  • Memory Percentage
  • Requests
  • Avg response time
CPU and Memory of CD App Service Plans

App Insights

  • Exceptions (sum, split by role)
  • Sitecore.Caching / Cache Misses /s
  • Failed requests (exposes all HTTP responses > 400)
  • Availability statistics
  • Live Stream (just an overview and link)
Time to look into why we’re seeing those exceptions!

Adding all of these pretty much fills a decent-sized dashboard, but there are some other metrics that are good indicators of health too. It may be worth baselining them and adding (email or webhook) alerts for when they exceed your thresholds, as they may not be front and centre (you can do this for almost all metrics!).

  • Sitecore.Analytics/ Aggregation Live Interactions Processed /s
  • Sitecore.Analytics/ Aggregation Contacts Processed /s
  • Search latency
  • Search queries /s

The list of metrics Sitecore logs by default can be found in App_Config/Include/zzz/Sitecore.Cloud.ApplicationInsights.config. Take a look and see what may be relevant to your solution for each role.

From here you can continue to add and tweak your dashboards that best suit your solution.

xDB index rebuild with Azure Search

As part of a Sitecore migration (8.2 to 9.0.1) we had a requirement to migrate a sizeable xDB implementation. The xDB data migration tool makes this super easy: it reads straight out of your existing MongoDB and pushes it to xConnect. As new contacts and interactions are pushed to the 9.0.1 xDB (in our case, Azure SQL DB), the indexer service picks up the changes and updates the appropriate index. Sweet!

So what happens if you want to change your search provider from Solr to Azure Search? It stands to reason that you should just be able to hook up Azure Search in xConnect in a similar way to the core role indexes (check out Azure Search on Sitecore 9) with the appropriate configs and connection strings, then rebuild the search index. One problem: there are some serious caveats on the xDB index rebuild for Azure Search. They are alluded to in the Rebuild the xDB Search Index documentation (Update May 2018: the doco has been updated and is much clearer now!), but may need some further clarification. Important points:

  • A full xDB index rebuild is not supported on Azure Search as of 9.0.1. That means that if you want to rebuild the xDB search index with historical data, it’s not supported out of the box. Note though that the indexer will rebuild the index for everything still available in the change tracking log in the SQL DB (by default this is a 5-day retention), but anything before that will not be included.
  • This may not be a deal-breaker for a migration as your data will be indexed as the xDB migration tool pulls everything across. Just plan your migration accordingly with the limitations in mind.
  • This will affect anyone looking to switch from Solr to Azure Search on 9.0.1.
  • There is probably some potential to code this yourself!
  • Sitecore support have it as a feature planned for 9.0.2, so it may be best to hold tight for now and get support out of the box.
  • I’m hoping the doco gets updated to be a bit more clear as a few in the community have been thrown by this.  (Update May 2018: it has been!)

Sitecore 9: Azure search and index all fields

Index all fields

Out of the box, Sitecore 9 configs will index all fields in each of the basic indexes (core, master, web, etc.). This is a great strategy to ensure that everything you add to your templates gets indexed down the line. However, this also leads to a couple of problems:

  • Exceeding the 1000 field maximum on Azure Search indexes
  • Over indexing

1000 field limit

Seen this error pop up in your logs when using the Azure search provider?

ManagedPoolThread #3 09:59:00 ERROR [Index=sitecore_master_index] Commit failed
Exception: System.AggregateException
Message: One or more errors occurred.
Source: mscorlib
at System.Threading.Tasks.Task.ThrowIfExceptional(Boolean includeTaskCanceledExceptions)
at System.Threading.Tasks.Task.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
at System.Threading.Tasks.Parallel.ForWorker[TLocal](Int32 fromInclusive, Int32 toExclusive, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Func`4 bodyWithLocal, Func`1 localInit, Action`1 localFinally)
at System.Threading.Tasks.Parallel.ForEachWorker[TSource,TLocal](IEnumerable`1 source, ParallelOptions parallelOptions, Action`1 body, Action`2 bodyWithState, Action`3 bodyWithStateAndIndex, Func`4 bodyWithStateAndLocal, Func`5 bodyWithEverything, Func`1 localInit, Action`1 localFinally)
at System.Threading.Tasks.Parallel.ForEach[TSource](IEnumerable`1 source, Action`1 body)
at Sitecore.ContentSearch.Azure.Http.CompositeSearchService.PostDocuments(ICloudBatch batch)
at Sitecore.ContentSearch.Azure.CloudSearchUpdateContext.Commit()
Nested Exception
Exception: Sitecore.ContentSearch.Azure.Http.Exceptions.BadRequestException
Message: Error in the request URI, headers, or body
Source: Sitecore.ContentSearch.Azure
at Sitecore.ContentSearch.Azure.Http.SearchServiceClient.EnsureSuccessStatusCode(HttpResponseMessage response)
at Sitecore.ContentSearch.Azure.Http.SearchServiceClient.UpdateIndex(IndexDefinition indexDefinition)
at Sitecore.ContentSearch.Azure.Schema.SearchServiceSchemaSynchronizer.SyncRemoteService(IndexDefinition sourceIndexDefinition, IEnumerable`1 incomingFields)
at Sitecore.ContentSearch.Azure.Schema.SearchServiceSchemaSynchronizer.c__DisplayClass17_0.b__0()
at Sitecore.ContentSearch.Azure.Utils.Retryer.RetryPolicy.Execute(Action action)
at Sitecore.ContentSearch.Azure.Http.SearchService.PostDocumentsImpl(ICloudBatch batch)
at Sitecore.ContentSearch.Azure.Http.SearchService.PostDocuments(ICloudBatch batch)
at Sitecore.ContentSearch.Azure.Http.CompositeSearchService.c__DisplayClass15_0.b__0(ISearchService searchService)
at System.Threading.Tasks.Parallel.c__DisplayClass17_0`1.b__1()
at System.Threading.Tasks.Task.InnerInvokeWithArg(Task childTask)
at System.Threading.Tasks.Task.c__DisplayClass176_0.b__0(Object )

Nested Exception
Exception: Sitecore.ContentSearch.Azure.Http.Exceptions.AzureSearchServiceRESTCallException
Message: {"error":{"code":"","message":"The request is invalid. Details: definition : Invalid index: The index contains 1033 field(s). An index can have at most 1000 fields.\r\n"}}

Uh oh, this doesn’t happen on Solr!

Azure search has a hard limit of 1000 fields per index 😢

This is confirmed by checking the limits and quotas for an S1 instance of the Azure Search service:

Azure search index limits
Source: https://docs.microsoft.com/en-us/azure/search/search-limits-quotas-capacity

An out-of-the-box Sitecore 9.0.1 instance will create some indexes with more than 900 fields… so it works, but it allows very little headroom for adding your own fields. As the default settings on the basic indexes index all fields, any customisation (e.g. adding a couple of fields to templates for items that get indexed) pretty much leads to exceeding this limit. You’ll then start having a bad time when trying to add/modify items or rebuild the index.

After confirmation from Sitecore support: the default index configs do explicitly include a list of the fields required to support basic functionality, so you can change the indexAllFields flag in the default documentOptions to false without breaking things, ensuring your custom fields are not automatically added to the index. Patch in this change with a config patch like so:

<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:set="http://www.sitecore.net/xmlconfig/set/" xmlns:search="http://www.sitecore.net/xmlconfig/search/" xmlns:role="http://www.sitecore.net/xmlconfig/role/">
  <sitecore role:require="Standalone or ContentManagement or ContentDelivery" search:require="Azure">
    <contentSearch>
      <indexConfigurations>
        <defaultCloudIndexConfiguration type="Sitecore.ContentSearch.Azure.CloudIndexConfiguration, Sitecore.ContentSearch.Azure">
          <documentOptions type="Sitecore.ContentSearch.Azure.CloudSearchDocumentBuilderOptions,Sitecore.ContentSearch.Azure">
            <indexAllFields>false</indexAllFields>
          </documentOptions>
        </defaultCloudIndexConfiguration>
      </indexConfigurations>
    </contentSearch>
  </sitecore>
</configuration>

Rebuilding your index should now complete successfully.

If your solution does need additional fields, be picky. Only index what is needed! Change the default indexAllFields flag so that basic functionality still works, then explicitly include the fields your solution requires (see the sketch below), or add a lean custom index for them. Keep it under 1000 fields!
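A rough sketch of that inclusion, building on the patch above. The field element names and GUIDs are placeholders for your own template fields, not anything that ships with Sitecore:

<documentOptions type="Sitecore.ContentSearch.Azure.CloudSearchDocumentBuilderOptions,Sitecore.ContentSearch.Azure">
  <indexAllFields>false</indexAllFields>
  <!-- Placeholder field IDs: substitute the IDs of the template fields your queries actually use -->
  <include hint="list:AddIncludedField">
    <ProductTitle>{11111111-1111-1111-1111-111111111111}</ProductTitle>
    <ProductCategory>{22222222-2222-2222-2222-222222222222}</ProductCategory>
  </include>
</documentOptions>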

This will keep your indexes lean and mean as your project grows. *Hint: it will also fix the concern below!

Over indexing

While it’s handy to have everything right there in your search indexes, it can degrade performance as your content grows. To keep this in check, only index what you need:

  • Only crawl the parts of the tree you need to search (see the sketch below)
  • Only index the fields you need
  • Use custom indexes to help keep things organised and performant
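To illustrate the first point, a minimal custom index definition that only crawls a specific subtree might look like the sketch below. The index id and root path are invented for the example, and I’ve omitted the update strategies and other params you’d copy from the stock index definitions:

<index id="custom_products_index" type="Sitecore.ContentSearch.Azure.CloudSearchProviderIndex, Sitecore.ContentSearch.Azure">
  <param desc="name">$(id)</param>
  <param desc="connectionStringName">cloud.search</param>
  <configuration ref="contentSearch/indexConfigurations/defaultCloudIndexConfiguration" />
  <!-- Crawl only the part of the tree this index actually needs -->
  <locations hint="list:AddCrawler">
    <crawler type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
      <Database>web</Database>
      <Root>/sitecore/content/Home/Products</Root>
    </crawler>
  </locations>
</index>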

 

Azure Search on Sitecore 9

Hello, is it me you’re looking for?

Sitecore has implemented Azure Search and Solr search providers for 9+ XP installs. They both have their pros and cons, but the best way to evaluate and compare them is by getting your hands dirty and breaking some things on your local machine.

Azure Search is Microsoft’s PaaS search offering. It is supported by Sitecore and is the out-of-the-box search provider for Sitecore 9+ destined for PaaS environments. The Sitecore Quickstart ARM templates for all cloud XP installs include the appropriate service infrastructure and configs.

Solr is the recommended option for XP on-prem deployments. It is battle tested and fully featured, but has a difficult time fitting in with the Azure PaaS offerings many organisations are investing in.

When it comes to your dev box, you’ll likely have installed an on-prem package which uses Solr as the default search provider. Using Azure Search for your dev instance full time may be cost prohibitive, as you need a Standard instance at minimum. But as an inquisitive dev, it’s easy to have a play with it locally to create and consume custom indexes, or just to get a feel for the differences. Note there are some limitations with the Azure Search provider, as outlined in the Sitecore Azure Search documentation. There are workarounds for most of these, so it really comes down to your implementation and being aware of the limitations when you’ve got the headphones on and are in “code mode”.

Azure Search on your dev machine

If you’re game and have a few Azure credits up your sleeve (remember you get a fair whack with your MSDN subscription), then you can get up and running quickly on any 9+ install.

NB: You’ll need a working Sitecore 9+ install (that is using Solr) up and running. If you don’t already have one you can install via SIFLess, for fast installs that won’t take all night long.  If you have an existing customised install there are a number of considerations to keep in mind, which I’ll address in a follow up post.

Create the Azure Search Service

You’ll need to create a search service in the Azure portal.

  1. New Service > Azure Search Service
  2. Give it a URL; it just needs to be unique
  3. Select the appropriate subscription and resource group for your account
  4. Pick the location closest to you to minimise latency on queries (NB: Azure Search is not available in all regions, and prices/offerings do vary!)
  5. Select a Standard pricing tier (you’ll require a minimum of 15 indexes)
  6. Create it and wait a short while for the service to provision
  7. Make a note of the search service URL (on the Overview blade) and API key (on the Keys blade) for the following steps

Create a search service

Modify the configs

The Sitecore 9+ configs allow for an easy switch between providers, assuming you haven’t modified the OOTB configs:

Add a new connection string called “cloud.search”

Using the Search URL and API key from the Azure portal, add a connection string to your ConnectionStrings.config like the below.

<connectionStrings>
  ...
  <add name="cloud.search" connectionString="serviceUrl=https://YOURSEARCHURL;apiVersion=2015-02-28-preview;apiKey=YOURAPIKEY" />
  ...
</connectionStrings>
Change the search definition app setting in your web.config

Sitecore 9 allows for rule-based configuration, so switching providers is just a matter of changing the search definition in app settings.

<appSettings>
  ...
  <add key="search:define" value="Azure" />
  ...
</appSettings>

Rebuild the indexes

Your instance should now be “talking” to your Azure Search service, but none of the appropriate data will be populated in the indexes. Rebuild all of your indexes by:

  • Logging into your Sitecore instance
  • Going to Control Panel > Indexing Manager
  • Rebuilding the indexes

That’s about it. You’re dancing on the ceiling with searches running in the cloud.  Do some tests.  Have a play, maybe even implement a custom index for your site search.  Just remember to delete your Azure search instance in the portal once you’re done and save those valuable credits!

Machine Learning in Sitecore – Is it real?

BINGO! It’s all the buzz, it’s cool. It’s machine learning. But is there real-world use for Machine Learning in organizations that aren’t an Amazon or a Microsoft? Honestly, I don’t know, but the potential looks amazing, particularly for organizations that have a lot of data to leverage.

The organization I work for is one such company. In fact, one of the most difficult issues we faced was deciding what data we could (and should) use. We devised a Proof of Concept (PoC) that would explore the capabilities of integrating Machine Learning with Sitecore XP and give us measurable outcomes as to its success. With Cortex being announced at Symposium last year, the promise of a Sitecore-delivered solution is on the horizon, however there is little detail on the specifics. Running our PoC gives us a precursor and hopefully contributes to the business case for implementing Cortex or another solution in the future.

Our overall goal was to:

  • Prove we can integrate some Machine Learning technology into Sitecore
  • Create metrics that will allow us to measure, optimise and verify outcomes
  • Get it to market quickly as a PoC
  • Ensure data security given we were dealing with sensitive information

As of right now, our PoC has been in production running with real users for a few months.  We’re still learning and iterating to optimise the outcomes.  Unfortunately I’m limited by a few factors in sharing specific code examples publicly, but this is how we approached it.

Solution Design

We put together a small panel from internal and partner team members to quickly design a solution that would meet our goals outlined above.  Consulting with business and technical stakeholders, we mapped out what we thought was the best path forward.

The concept was to create a recommendation engine that could be integrated into Sitecore, allowing authors to add a component to the page displaying personalised content.  The content would be in the form of a recommendation featuring products the ML results predicted may be of interest to the end user.

There were two main data sets available to us to build a “profile” of existing users, which we could use to train and test the ML models.

Firstly, basic demographic information, e.g. gender, age band, postcode (in AU ours are at a suburb/regional level). These data points, combined with some overarching categorizations provided by the Australian Bureau of Statistics, gave us a solid but well-anonymised profile of the user. This data set contained well over a million subjects to train and test models with.

Secondly, the product holding data. This mapped out which users currently held which products. To simplify the project we ran this on a subset of 15 products, all of which are subscription-style products rather than physical goods.

Without delving too far into the details (next sections!), we planned to implement a flow that looked a little like this:


  1. Extract data (albeit manually) from our Enterprise data warehouse.
  2. Upload to Azure ML Studio
  3. Run training and testing against the ML model (to determine accuracy)
  4. Decide on a model that offered the best results through statistical relevance and human sanity checks
  5. Expose the results
  6. Create a Sitecore rendering that could consume the results service
  7. Run content tests against known control variants
  8. Re-assess and optimise test content on a regular basis.

Machine Learning with Azure ML Studio

Up front: I am by no means a Data Scientist. This project absolutely needed some expertise in that area, so we looked to our partner to provide insights. We particularly needed extra expertise surrounding:

  • Preparing data sets
  • Algorithm selection
  • Evaluation of results
  • General data manipulation techniques

While Azure ML Studio does make it easy for non-Data Scientists to get started, I found there was much to learn, and I gained some valuable insights into what (not) to do in certain situations. That said, they do offer some great documentation to get started.

Hold on tight… here we go. When preparing the data sets we wanted to ensure that the distribution of the training set was reflective of the actual data set. It stands to reason that a training data set that closely represents the characteristics of the actual data will yield more accurate results. Running a t-test on the data sets gives an indication of the means, variances and a p-value, which indicates the possibility of a random variable falling within the expected norm (i.e. is your data reasonably consistent, and does the test data fit well?). This was run across the demographic distributions as well as individual products, looking for significant variations. What I did learn here is that data set preparation involves a lot of trial and error, and copying and pasting (note to self: order new C and V keys). Rinse, repeat: create the data sets, re-run the tests, compare. Eventually we landed on a training data set we were confident had a reflective distribution of the full set.
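For reference, this is just the textbook two-sample t-statistic rather than anything ML Studio specific. When comparing the means of the training set and the full set, it is:

t = \frac{\bar{x}_1 - \bar{x}_2}{\sqrt{\frac{s_1^2}{n_1} + \frac{s_2^2}{n_2}}}

where \bar{x}, s^2 and n are the mean, variance and size of each data set. The p-value derived from t is the probability of seeing a difference in means at least this large if the two sets really do share the same distribution, so in this case a high p-value is what you’re hoping for.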

While Azure ML Studio users can write and maintain their own algorithms, there are also a bunch available out of the box, which may put you just a drag and drop away from a successful model. We assessed a number of algorithms, but for our purposes, and given the tight time frame, we settled on the “Matchbox recommender”, which is commonly used in recommendation engines.

Microsoft has developed a large-scale recommender system based on a probabilistic model (Bayesian) called Matchbox. This model can learn about a user’s preferences through observations made on how they rate items, such as movies, content, or other products. Based on those observations, it recommends new items to the users when requested.

The inputs would be the demographic data as “users”, the product holdings as “ratings” for the product items and of course some product metadata for each line.  Using these inputs, we were able to create a trained model upon which we could perform predictive experiments.  Hooray!

Matchbox recommender training flow

Well… almost hooray. We still needed to confirm that our results were reflective of something that a) is statistically accurate and b) passes a human “sniff test”. For a), luckily ML Studio has an “Evaluate Recommender” module, to which you can feed the scored results (from your predictive model) and a test data set for comparison. Once run, you can right-click on the output port of the recommender module to visualise the evaluation results. This gives a Normalized Discounted Cumulative Gain (NDCG) value, which is a measure of ranking quality.

The evaluation metric Normalized Discounted Cumulative Gain (NDCG) is estimated from the ground truth ratings given in the test set. Its value ranges from 0.0 to 1.0, where 1.0 represents the most ideal ranking of the entities.

So, get close to 1… and you’re good to go!
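ML Studio doesn’t document the exact variant it uses, but the standard definition gives the idea:

\mathrm{DCG}@k = \sum_{i=1}^{k} \frac{rel_i}{\log_2(i + 1)}, \qquad \mathrm{NDCG}@k = \frac{\mathrm{DCG}@k}{\mathrm{IDCG}@k}

where rel_i is the ground-truth rating of the item your model ranked at position i, and IDCG@k is the DCG of the ideal (best possible) ordering. Dividing by the ideal normalises the score into the 0.0 to 1.0 range quoted above.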

Predictive experiment flow

During development we explored two ways of exposing the data to the service developed in Sitecore:

  • Creating a secure web service endpoint in ML Studio that would accept the input parameters, then respond with a scored result set. Unfortunately, we found these requests to be slower than expected.
  • Using an input of the full data set, outputting results for all users that could be imported into Sitecore. This would allow us to provide fast access to “snapshots” of recommendations, but with some manual overhead and the possibility of stale data if it wasn’t regularly updated.

Both had pros and cons, but given this was a time boxed Proof of Concept we wanted to ensure there was no performance impact. So we implemented the latter option.

Measuring success

Being able to measure the success of the initiative was one of the core goals of the PoC.  We needed unequivocal evidence that the strategy improved or did not improve key metrics on the site.  For this we wanted to ensure a few things:

  • We had control data sets in place, so we have a baseline for comparison.
  • We were measuring goals and engagement values relevant to the exercise
  • Metric indicators for engagement were recorded in different ways (e.g. time on site in Google Analytics and Sitecore’s trailing visit engagement value). We really only had one shot to run this PoC, so we wanted to cover as many bases as possible and compare trends across the board.

To ensure that we had a baseline and were able to compare the ML results against the norm, we implemented a recommendation rendering with similar layout & design, but three separate data sources:

  1. The ML results in a personalised context to the user viewing the page
  2. An array of products curated in the CMS by content authors
  3. A random selection of products

We could then use the content testing features in Sitecore XP to deliver an A/B/n test on selected placements. This allows for comparative analysis on Sitecore goals, Path Analyzer and engagement value. We also enabled some dynamic metric gathering in Google Analytics to back up the data collected in Sitecore and give us some very specific page-level stats.

Configuring A/B/n content tests using different datasources

 

A/B/n tests were scheduled, with some checkpoints to stop tests, analyse, optimise content and test hypotheses that we thought might enhance the user experience based on the data. We were looking at things like adjusting CTAs, imagery, layout and supporting copy. In all cases it was imperative that all variants had similar changes at the same time, to keep from skewing any test results.

Integration with Sitecore

Now, with a scored data set to work with and a clear idea of the other data sets required in the recommendations, we needed to map all of this into a format selected content authors could manage and optimise (as above). We already had a product tile component that content authors could configure to display any number of curated products. To keep things familiar and nicely componentised, we quickly extended the rendering to use a different “service” for retrieving data depending on its data source settings. We added settings to the Sitecore templates that flagged which data source “type” was being used and which data service to use. This also allowed us to implement custom business logic in the data service that retrieved the ML results. Just because a result may be statistically relevant doesn’t necessarily mean the business would want to encourage some purchases (e.g. recommending a subscription of lesser value than one the user already had). We had to account for these situations given this was going to have real-world business impact.

This approach allowed us to leverage existing knowledge as content authors could set up everything including the tests in Experience Editor, while still allowing for the flexibility required to meet the goals and custom business rules.

A/B/n Content tests in Experience Editor

Results

So this is the crux of it, eh? Did it work? Well, sorry, but we can’t draw any conclusions one way or the other yet. It’s just too early. We will continue to analyse and optimise the results, after which I’m sure our analytics team will delve into them further to identify trends and perhaps things we could have done better.

Once the PoC is complete, we’ll be tearing it down (Noooooo! But that is the nature of a PoC). Depending on the outcomes, there may be a fully fledged version, or perhaps we’ll have more information on other solutions that may be better suited (I’m looking at you, Cortex). We just don’t know until that time comes ¯\_(ツ)_/¯.

That said, using ML for this sort of marketing tooling looks very promising and we can definitely cross off the goals we set out to achieve.

Update:

Great success! The PoC period was extended to gather some more results and confirm initial indicators.  The overall results were phenomenal.  So much so that the project is now the subject of a case study published by Sitecore.  A big shout out to all that helped on the project.