Cloning presentation details and associated datasources with Sitecore SXA

A recent investigation into a client requirement on a Sitecore SXA project led me to discover a piece of value-added functionality that can be re-purposed for other SXA projects and perhaps for the wider Sitecore community. The version of Sitecore SXA in use at the time was 1.7.1.

The primary requirement was to allow the client's content authors to easily clone an existing page from the default language (English) to the other configured language versions. This meant taking the following into consideration:

  1. All page property fields are copied over.
  2. The layout definition, comprised of the building blocks of SXA components together with their associated datasources (both local and reusable) and any further linked references, is copied over. (Note: some nested SXA components, such as tabs, have associated presentation of their own that also needs to be copied.)

Some TL;DR context follows. If it bores you, skip straight to the Solution section.

To provide some context, this project involved a rollout of a total of six country sites, all of which utilised a set of reusable SXA components alongside customised components. As part of the digital transformation initiative, the client's marketing team needed to be trained to assemble new SXA pages quickly and make content updates prior to a full country site rollout. This involved not just ensuring a seamless operations transfer to the client, but also enabling the client to assemble pages quickly and efficiently across different languages.

During this time, the client discovered the pain point of having to manually replicate the layout and component definition from the default English language across the other language versions, for example Thai, Traditional Chinese (Hong Kong) and Indonesian. Imagine the client having to copy a TabGroup composite component over to the Chinese version and compose each TabItem by copying the content, one by one, into each tab. This involved several clicks to associate the added components with a new language version of their datasources, then clicking save, waiting for the page to refresh and repeating the same steps: a lengthy, time-consuming and error-prone task. It took the client almost half a day to update one page across all the other languages. The task was further compounded by the Experience Editor's save-and-reload time when applying changes to the page (I nearly pulled my hair out over how long it took Experience Editor to reload a page with mid-weight content and components).

The Solution

The answer was to build a customisation delivering the primary functionality mentioned above.

This can easily be broken down into a few simple steps:

  1. Create a custom Experience Editor button, which I have called “Clone Layout Definition”, sitting under the Experience Accelerator ribbon tab.
  2. Create a backing JavaScript file to invoke your pipeline processor.
  3. Create a C# custom pipeline processor class to perform the required functionality on the context item page.

Note: when customising your processor for Experience Editor button functionality, ensure that you catch server-side errors and log them accordingly, as EE will simply emit an ‘Error occured on the server’ message if it hits one during processing. In my example I did not have time to add this, but you can add it to your own implementation. BrainJocks explains well how to go about this.
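As a hedged sketch (not part of my original code), the guard could look something like the following inside the processor class shown later in this post. I am assuming here that `AbortMessage` on `PipelineProcessorResponseValue` is used to surface a message back to the author, and the logging call uses Sitecore's standard `Sitecore.Diagnostics.Log` API:

```csharp
public override PipelineProcessorResponseValue ProcessRequest()
{
    try
    {
        // ... the actual cloning work goes here ...
        return new PipelineProcessorResponseValue { Value = "OK" };
    }
    catch (Exception ex)
    {
        // Log the real exception server side so it is not swallowed by the
        // Experience Editor's generic error dialog.
        Sitecore.Diagnostics.Log.Error("Clone Layout Definition failed", ex, this);

        // Surface a readable message back to the content author.
        return new PipelineProcessorResponseValue
        {
            AbortMessage = "Clone Layout Definition failed - check the Sitecore logs."
        };
    }
}
```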

Create a custom experience button

It was pretty easy to find useful articles on adding a custom button to the Experience Editor, so I will not repeat the concepts here; instead, refer to this article by BrainJocks, which provides a detailed explanation of customising an Experience Editor button in a ribbon tab.

The snippet below illustrates the processor configuration.

<?xml version="1.0"?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:role="http://www.sitecore.net/xmlconfig/role/">
  <sitecore>
    <sitecore.experienceeditor.speak.requests>
      <setting name="ExperienceEditor.XA.Foundation.Extensions.ExecuteLayoutDefinitionUpdate" value="XA.Foundation.Extensions.Commands.Requests.ExecuteLayoutDefinitionUpdate, XA.Foundation.Extensions.Commands" />
    </sitecore.experienceeditor.speak.requests>
  </sitecore>
</configuration>

Create the custom JavaScript file

This is the backing JavaScript file, which hooks the Experience Editor's client-side event up to the custom C# processor code.

define(["sitecore", "/-/speak/v1/ExperienceEditor/ExperienceEditor.js"], function (Sitecore, ExperienceEditor) {
    Sitecore.Commands.CopyLayoutDefinitionsToLanguageVersions = {
        canExecute: function (context) {
            return true;
        },
        execute: function (context) {
            if (!confirm("Caution: You are about to overwrite the current page layout definition and its associated content from English over to the current site's language versions. Click OK to proceed.")) {
                return;
            }

            ExperienceEditor.modifiedHandling(true, function () {
                var requestContext = context.app.clone(context.currentContext);

                context.app.disableButtonClickEvents();

                ExperienceEditor.PipelinesUtil.generateRequestProcessor("ExperienceEditor.XA.Foundation.Extensions.ExecuteLayoutDefinitionUpdate", function (response) {
                    // Re-enable the ribbon and notify the author once the server has responded.
                    context.app.enableButtonClickEvents();
                    alert("Layout definition and associated contents copied.");
                }, requestContext).execute(context);
            });
        }
    };
});

Create the custom C# processor pipeline class

Nothing beats understanding what is under the hood via Sitecore's source code. My favourite tool is ILSpy (check it out here), which lets me peek into Sitecore's assemblies and see what they are actually doing. I decided to keep the idea simple: clone the existing page-level field properties together with the presentation layout definition. The presentation layout definition essentially means the sum of all the components (including nested components) on the Final Layout, with their associated datasources.

To get this functionality working, it helps to know some useful Sitecore APIs already defined in its assemblies, particularly the ones that find associated datasources and reference links to other Sitecore items.

Below is the code detailing the entire cloning process described above.

// Using directives were not included in the original listing; the Sitecore namespaces
// below are the likely ones. IsDerived, TargetItem, Templates and ItemUtility come from
// the solution's own (or SXA's) extension libraries.
using System;
using System.Collections.Generic;
using System.Linq;
using Sitecore;
using Sitecore.Data.Fields;
using Sitecore.Data.Items;
using Sitecore.Diagnostics;
using Sitecore.ExperienceEditor.Speak.Server.Contexts;
using Sitecore.ExperienceEditor.Speak.Server.Requests;
using Sitecore.ExperienceEditor.Speak.Server.Responses;
using Sitecore.Globalization;
using Sitecore.Layouts;

namespace XA.Foundation.Extensions.Commands.Requests
{
    public class ExecuteLayoutDefinitionUpdate : PipelineProcessorRequest<ItemContext>
    {
        private static readonly string DefaultLanguage = "en";

        public override PipelineProcessorResponseValue ProcessRequest()
        {
            Item sourceItem = base.RequestContext.Item;
            Assert.IsNotNull(sourceItem, "sourceItem");

            if (sourceItem.Visualization.Layout == null)
            {
                return null;
            }

            if (!HasLocalDatasourceFolder(sourceItem))
            {
                return null;
            }

            Language.TryParse(DefaultLanguage, out Language defaultLang);

            using (new LanguageSwitcher(defaultLang))
            {
                // Collect the languages (other than the default) that already have
                // at least one version of the page.
                var otherLanguages = sourceItem.Languages.Where(l => l.Name != DefaultLanguage);
                IList<Language> validLanguages = new List<Language>();
                foreach (Language otherLanguage in otherLanguages)
                {
                    var srcLangItem = sourceItem.Versions.GetLatestVersion(otherLanguage);

                    if (srcLangItem != null && srcLangItem.Versions.Count > 0)
                    {
                        validLanguages.Add(otherLanguage);
                    }
                }

                foreach (Language validLanguage in validLanguages)
                {
                    Item targetLangItem = sourceItem.Versions.GetLatestVersion(validLanguage);

                    if (targetLangItem.Versions.Count > 0)
                    {
                        UpdatePagePropertyFields(sourceItem, targetLangItem);
                    }
                }

                var partialDesignItems = GetBasicPartialDesigns(sourceItem);

                var pdDsItems = GetPartialDesignDataSourceItems(partialDesignItems, base.RequestContext.DeviceItem, defaultLang);

                foreach (Item pdDsItem in pdDsItems)
                {
                    List<Field> sourceLangFields = new List<Field>();

                    GetFieldsInAllVersionsSelfAndDescendants(pdDsItem, sourceLangFields);

                    foreach (Language validLanguage in validLanguages)
                    {
                        Item dsItemTargetLang = pdDsItem.Versions.GetLatestVersion(validLanguage);

                        if (dsItemTargetLang.Versions.Count == 0)
                        {
                            AddItemVersionsSelfAndDescendants(dsItemTargetLang, sourceLangFields);
                        }
                        else
                        {
                            UpdateItemVersionsSelfAndDescendants(dsItemTargetLang, sourceLangFields);
                        }

                        UpdateCompositeLayoutWithAssociatedDataSourcesSelfAndDescendants(
                            pdDsItem,
                            base.RequestContext.DeviceItem,
                            defaultLang,
                            validLanguage);
                    }
                }

                foreach (Language validLanguage in validLanguages)
                {
                    foreach (Item partialDesignItem in partialDesignItems)
                    {
                        var targetPartialDesignItem = partialDesignItem.Versions.GetLatestVersion(validLanguage);
                        if (targetPartialDesignItem != null && targetPartialDesignItem.Versions.Count > 0)
                        {
                            UpdateFinalLayout(partialDesignItem, targetPartialDesignItem);
                        }
                    }
                }

                List<Item> dsItems = GetPageLocalDataSourceItems(sourceItem, base.RequestContext.DeviceItem, defaultLang);

                foreach (Item dsItem in dsItems)
                {
                    List<Field> sourceLangFields = new List<Field>();

                    GetFieldsInAllVersionsSelfAndDescendants(dsItem, sourceLangFields);

                    foreach (Language validLanguage in validLanguages)
                    {
                        Item dsItemTargetLang = dsItem.Versions.GetLatestVersion(validLanguage);

                        if (dsItemTargetLang.Versions.Count == 0)
                        {
                            AddItemVersionsSelfAndDescendants(dsItemTargetLang, sourceLangFields);
                        }
                        else
                        {
                            UpdateItemVersionsSelfAndDescendants(dsItemTargetLang, sourceLangFields);
                        }

                        UpdateCompositeLayoutWithAssociatedDataSourcesSelfAndDescendants(
                            dsItem,
                            base.RequestContext.DeviceItem,
                            defaultLang,
                            validLanguage);
                    }
                }

                foreach (Language validLanguage in validLanguages)
                {
                    var targetItem = sourceItem.Versions.GetLatestVersion(validLanguage);
                    if (targetItem != null && targetItem.Versions.Count > 0)
                    {
                        UpdateFinalLayout(sourceItem, targetItem);
                    }
                }
            }

            return new PipelineProcessorResponseValue
            {
                Value = "",
            };
        }

        private static void UpdatePagePropertyFields(Item sourceItem, Item targetItem)
        {
            IEnumerable<Field> sourceFields = GetFieldsInAllVersions(sourceItem);
            UpdateTargetItemFields(targetItem, sourceFields);
        }

        private static List<Item> GetBasicPartialDesigns(Item sourceItem)
        {
            var pageDesignItem = sourceItem.TargetItem(Templates.Designable.Fields.PageDesign.ID);

            if (pageDesignItem == null)
            {
                return new List<Item>();
            }

            var pdItems = ((MultilistField)pageDesignItem.Fields[Templates.PageDesign.Fields.PartialDesigns.ID]).GetItems();

            return pdItems.Where(pdItem => pdItem.Name.ToLower().Contains("basic")).ToList();
        }

        private static List<Item> GetPartialDesignDataSourceItems(List<Item> basicPartialDesignItems, DeviceItem deviceItem, Language defaultLanguage)
        {
            List<Item> basicPdDsItems = new List<Item>();

            foreach (Item basicPdItem in basicPartialDesignItems)
            {
                basicPdDsItems.AddRange(GetDataSourceItems(basicPdItem, deviceItem, defaultLanguage));
            }

            return basicPdDsItems;
        }

        private static List<Item> GetPageLocalDataSourceItems(Item sourceItem, DeviceItem deviceItem, Language defaultLanguage)
        {
            return GetDataSourceItems(sourceItem, deviceItem, defaultLanguage);
        }

        private static List<Item> GetCompositeComponentDataSourceItems(Item sourceItem, DeviceItem deviceItem, Language defaultLanguage)
        {
            return GetDataSourceItems(sourceItem, deviceItem, defaultLanguage);
        }

        private static List<Item> GetDataSourceItems(Item sourceItem, DeviceItem deviceItem, Language defaultLanguage)
        {
            List<Item> dsItems = new List<Item>();

            // Local (page data) datasources, layout-defined datasources, and any other
            // items referenced from the shared or final layout fields.
            dsItems.AddRange(GetLocalDataSourceItems(sourceItem));
            dsItems.AddRange(ItemUtility.GetItemsFromLayoutDefinedDatasources(sourceItem, deviceItem, defaultLanguage));
            dsItems.AddRange(ItemUtility.GetItemReferences(sourceItem)
                .Where(r => r.SourceFieldID == FieldIDs.LayoutField || r.SourceFieldID == FieldIDs.FinalLayoutField)
                .Select(x => x.GetTargetItem())
                .Where(x => x != null && x.Visualization.Layout != null)
                .Distinct(new SitecoreItemNameComparer()));

            return dsItems;
        }

        private static List<Item> GetLocalDataSourceItems(Item sourceItem)
        {
            List<Item> dsItems = new List<Item>();

            var pageDataFolder = sourceItem.Children.FirstOrDefault(x => x.TemplateID == Templates.LocalDataSource_PageData.ID);
            if (pageDataFolder != null)
            {
                foreach (Item childDsItem in pageDataFolder.GetChildren())
                {
                    AddLocalDataSourceItemToList(childDsItem, dsItems);
                }
            }

            return dsItems;
        }

        private static void AddLocalDataSourceItemToList(Item item, List<Item> dsItems)
        {
            dsItems.Add(item);

            foreach (Item childDsItem in item.GetChildren())
            {
                AddLocalDataSourceItemToList(childDsItem, dsItems);
            }
        }

        private static void UpdateCompositeLayoutWithAssociatedDataSourcesSelfAndDescendants(Item sourceDsItem, DeviceItem deviceItem, Language sourceLanguage, Language targetLanguage)
        {
            bool isComposite = sourceDsItem.IsDerived(Templates.Composites.Datasource.Tabs.TabItem.ID)
                || sourceDsItem.IsDerived(Templates.Composites.Datasource.Accordion.AccordionItem.ID)
                || sourceDsItem.IsDerived(Templates.Composites.Datasource.Tabs.ID)
                || sourceDsItem.IsDerived(Templates.Composites.Datasource.Accordion.ID);

            if (isComposite && sourceDsItem.Visualization.Layout != null)
            {
                var targetDsItem = sourceDsItem.Versions.GetLatestVersion(targetLanguage);
                if (targetDsItem != null)
                {
                    if (targetDsItem.Versions.Count == 0)
                    {
                        var newTargetLangDsItem = targetDsItem.Versions.AddVersion();
                        UpdateFinalLayout(sourceDsItem, newTargetLangDsItem);
                    }
                    else
                    {
                        UpdateFinalLayout(sourceDsItem, targetDsItem);
                    }
                }

                var layoutDsItems = GetCompositeComponentDataSourceItems(sourceDsItem, deviceItem, sourceLanguage);

                foreach (Item layoutDsItem in layoutDsItems)
                {
                    List<Field> sourceLangFields = new List<Field>();
                    GetFieldsInAllVersionsSelfAndDescendants(layoutDsItem, sourceLangFields);

                    Item dsItemTargetLang = layoutDsItem.Versions.GetLatestVersion(targetLanguage);

                    if (dsItemTargetLang.Versions.Count == 0)
                    {
                        AddItemVersionsSelfAndDescendants(dsItemTargetLang, sourceLangFields);
                    }
                    else
                    {
                        UpdateItemVersionsSelfAndDescendants(dsItemTargetLang, sourceLangFields);
                    }
                }
            }

            foreach (Item childSourceDsItem in sourceDsItem.GetChildren())
            {
                UpdateCompositeLayoutWithAssociatedDataSourcesSelfAndDescendants(childSourceDsItem, deviceItem, sourceLanguage, targetLanguage);
            }
        }

        private static void AddItemVersionsSelfAndDescendants(Item targetItem, IEnumerable<Field> sourceLangFields)
        {
            var newTargetLang = targetItem.Versions.AddVersion();
            UpdateTargetItemFields(newTargetLang, sourceLangFields);

            foreach (Item childTargetItem in newTargetLang.Children)
            {
                AddItemVersionsSelfAndDescendants(childTargetItem, sourceLangFields);
            }
        }

        private static void UpdateItemVersionsSelfAndDescendants(Item targetItem, IEnumerable<Field> sourceLangFields)
        {
            UpdateTargetItemFields(targetItem, sourceLangFields);

            foreach (Item childItem in targetItem.Children)
            {
                UpdateItemVersionsSelfAndDescendants(childItem, sourceLangFields);
            }
        }

        private static void UpdateTargetItemFields(Item targetItem, IEnumerable<Field> sourceLangFields)
        {
            var targetLangFields = GetFieldsInAllVersions(targetItem);

            foreach (Field field in targetLangFields)
            {
                using (new EditContext(field.Item, false, false))
                {
                    var sourceLangField = sourceLangFields.FirstOrDefault(x => x.Item.ID == field.Item.ID && x.Name == field.Name);
                    if (sourceLangField != null)
                    {
                        field.Value = sourceLangField.Value;
                    }
                }
            }
        }

        private static void GetFieldsInAllVersionsSelfAndDescendants(Item sourceItem, List<Field> allSourceFields)
        {
            allSourceFields.AddRange(GetFieldsInAllVersions(sourceItem));

            foreach (Item childItem in sourceItem.Children)
            {
                GetFieldsInAllVersionsSelfAndDescendants(childItem, allSourceFields);
            }
        }

        private static IEnumerable<Field> GetFieldsInAllVersions(Item item)
        {
            item.Fields.ReadAll();
            // Skip system fields (names prefixed with "__") and project every
            // remaining field across all versions of the item.
            var fields = item.Fields.Where(f => !f.Name.StartsWith("__")).ToArray();
            var itemVersions = item.Versions.GetVersions();
            return fields.SelectMany(field => itemVersions, (field, itemVersion) => itemVersion.Fields[field.ID]);
        }

        private static void UpdateFinalLayout(Item sourceItem, Item targetItem)
        {
            var sourceFinalLayoutField = new LayoutField(sourceItem.Fields[FieldIDs.FinalLayoutField]);

            if (string.IsNullOrEmpty(sourceFinalLayoutField.Value))
            {
                Log.Warn($"Source Final Layout Field - Could not find layout on: {sourceItem.Name}", typeof(ExecuteLayoutDefinitionUpdate));
                return;
            }

            var sourceFinalLayoutDefinition = LayoutDefinition.Parse(sourceFinalLayoutField.Value);

            using (new EditContext(targetItem, false, false))
            {
                var targetFinalLayoutField = new LayoutField(targetItem.Fields[FieldIDs.FinalLayoutField]);
                targetFinalLayoutField.Value = sourceFinalLayoutDefinition.ToXml();
                targetItem.Editing.AcceptChanges();
            }
        }

        public static bool HasLocalDatasourceFolder(Item item)
        {
            if (item == null)
            {
                throw new ArgumentNullException(nameof(item));
            }

            return item.Children.Any(x => x.IsDerived(Templates.LocalDataSource_PageData.ID));
        }
    }

    public class SitecoreItemNameComparer : IEqualityComparer<Item>
    {
        public bool Equals(Item x, Item y)
        {
            return x != null && y != null && x.Name.Equals(y.Name);
        }

        public int GetHashCode(Item obj)
        {
            // Hash on the item name so Distinct() behaves consistently with Equals.
            return obj?.Name?.GetHashCode() ?? 0;
        }
    }
}

The entire code will be made available on GitHub soon.

For the experienced Sitecore developer, the concept of a layout definition (both Shared and Final layout) should be familiar. If not, I suggest going back to the basics of Sitecore layout and presentation details and the differences between Shared and Final layout. Ankit Joshi provides a good explanation of the differences here: https://ankitjoshi2409.wordpress.com/2017/02/06/sitecore-shared-vs-final-layouts/

Please feel free to leave any feedback or comments.

I really hope this module will benefit your SXA projects as well. It saved heaps of time from a content-authoring perspective and, more importantly, puts the customer first.


Upgrading Sitecore 6.6 to Sitecore 8 – SolrCloud (Part 1 of 2) – Sitecore and Solr Cloud Integration

Many thanks for staying patient with me on this post. I have finally been able to find some time to write this article, and I am excited to share my experience designing and implementing a production-level SolrCloud configuration for one of my customers.

A lot has been said about Sitecore's supportability of SolrCloud. Sitecore has indicated experimental support for SolrCloud starting with v8.2. Although experimental, this is still a strong indication that Sitecore is pushing towards official support. Reassuringly, I was able to successfully configure and set up SolrCloud with Sitecore XP 8.0 Update-4 running in a production environment for my client. I would also like to credit Sitecore Support, who were helpful in determining workarounds for one or two issues encountered along the way.

This article attempts to describe my experience with the setup and configuration of SolrCloud on a simulated Azure environment.

In the first part of this series, I will outline the checklist of prerequisite items needed for the SolrCloud setup on Azure VMs using an Infrastructure-as-a-Service (IaaS) approach. The next part explains how to set up an internal load balancer, which acts as a distribution point for Solr HTTP queries across the cluster of Azure VMs running Solr.

What is SolrCloud?

SolrCloud is a multi-server (cluster) setup of multiple Solr instances providing an automatic HADR solution which can scale to large volumes of content. It provides redundancy for your index storage to avoid a single point of failure. With its automatic HADR features, customers do not have to worry about manually promoting another member to primary should the current primary encounter a power outage, network partition, server downtime and so on. For those who are unfamiliar with SolrCloud, I strongly suggest reading the following recommended articles:

  1. https://cwiki.apache.org/confluence/display/solr/Getting+Started+with+SolrCloud
  2. https://support.lucidworks.com/hc/en-us/articles/201298317-What-is-SolrCloud-And-how-does-it-compare-to-master-slave-

In my illustration, I will demonstrate a 3-member Solr instance setup for a minimal automatic HADR configuration.

Note*: A Solr member instance is a Solr Server that participates in a quorum or election process managed by Zookeeper as part of the automatic HADR mechanisms.

What are the prerequisite downloads?

To get an Azure VM set up with a single running Solr member instance, the following items are required, set up in this sequence:

  1. Windows Server 2012 R2 VM servers
  2. Separate drive partition(s) for the Solr installation files and the log files, to prevent resource contention between the transaction processing of rolling log files and the indexing process, e.g. E:/ for Solr and Zookeeper log files, F:/ for the Solr installation
  3. Java Server JRE (http://www.oracle.com/technetwork/java/javase/downloads/index.html) – jdk1.8.0_60
  4. Apache Zookeeper (https://zookeeper.apache.org) – Zookeeper 3.4.6
  5. Apache Solr (http://lucene.apache.org/solr) – Solr 5.3.0

Technical Architecture

  1. 3 Solr services, one hosted on each of 3 individual Azure VMs.
  2. 3 Zookeeper services hosted on the same 3 Azure VMs; because Zookeeper consumes fairly minimal resources, it can safely be hosted within the same VMs.
  3. A SolrCloud collection setup with a single shard and a replication factor of 3, for minimal automatic HADR.

Installation and Configuration Steps

  1. Download Zookeeper release version 3.4.6 (stable) from http://www.eu.apache.org/dist/zookeeper/. Zookeeper is a high-performance coordination service that manages the Solr configuration for your cluster in a single managed location. For more info on the installation and configuration steps, visit http://zookeeper.apache.org/doc/r3.4.6/
  2. Extract Zookeeper using a utility such as WinRAR or 7-Zip. Make sure to extract it to a separate drive from your system drive (C:/), as in the screenshot below. (Screenshot: solr-zookeeper-on-dedicated-drive)
  3. Edit zoo.cfg (it lives in E:\zookeeper-3.4.6\conf) and adjust the necessary parameters, for example the Zookeeper client port, the peer connection port (2888) and the leadership election port (3888).
  4. Still in zoo.cfg, find dataDir and set it to the zookeeper\data folder (E:/zookeeper-3.4.6/data). Create the folder if it doesn't exist. Note that you need to use forward slashes (/) to separate path segments here.
  5. Append a line for each server to the zoo.cfg file. In my case I've chosen to use the statically assigned IP addresses 10.0.0.4, 10.0.0.5 and 10.0.0.6 for the three VMs to be created:
    1. server.1=10.0.0.4:2888:3888
    2. server.2=10.0.0.5:2888:3888
    3. server.3=10.0.0.6:2888:3888

  6. Recommendation: it is a good idea to store log files on a separate drive for optimum performance. Zookeeper recommends a dedicated transaction log directory in order to achieve lower latency on updates. By default, transaction logs are put in the same directory as the data snapshots and the myid file; the dataLogDir parameter points the transaction logs at a different directory.
  7. Recommendation: it is also a good idea to get a full understanding of what Zookeeper does and how to set it up; refer to http://zookeeper.apache.org/doc/r3.4.6/zookeeperStarted.html#ch_GettingStarted
  8. Repeat steps 1-5 on each of the other servers that will host Zookeeper.
  9. Download Solr 5.3.x from http://lucene.apache.org/solr/downloads.html.
  10. Extract the Solr installation zip package to the same drive where Zookeeper is installed. This is where your physical Solr cores (indexes) will live.
  11. Repeat steps 9-10 on each of the other servers that will host Solr.
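Putting the zoo.cfg steps together, the file for server.1 might look like the following; the tickTime/initLimit/syncLimit values are illustrative defaults, and the paths and IPs should match your own environment:

```
# zoo.cfg (server.1) - illustrative values
tickTime=2000
initLimit=10
syncLimit=5
clientPort=2181
dataDir=E:/zookeeper-3.4.6/data
dataLogDir=E:/zookeeper-logs
server.1=10.0.0.4:2888:3888
server.2=10.0.0.5:2888:3888
server.3=10.0.0.6:2888:3888
```

Each member also needs a myid file inside its dataDir containing only its own server number (1, 2 or 3), so the instance knows which server.N entry it is.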

Running a Solr Member Instance

Once the installation steps are complete, you are almost ready to run your Solr member instance.

As a reminder:

Note 1*: A Solr member instance is a Solr server that participates in a quorum or election process managed by Zookeeper as part of the automatic HADR mechanisms.

Note 2*: You can only run a Solr member instance that is part of a ZK quorum while at least one Zookeeper service is running.

Below are the steps to run a Solr Member Instance:

  1. Ensure Zookeeper is running by running zkServer.cmd prior to starting Solr. In this case, the Zookeeper service runs on the same server as Solr.
  2. Below is an example startup command for a Solr service member in SolrCloud mode that is part of a Zookeeper quorum:

set SOLRDIR=solr-5.3.0\bin

START %Z_TIP%%SOLRDIR%\solr.cmd start -p 8983 -f -c -z "10.0.0.4:2181,10.0.0.5:2182,10.0.0.6:2183" -noprompt

3. Once started, this Solr member instance runs within the Zookeeper ensemble; that is SolrCloud.

4. Repeat steps 1-2 for the remaining two Solr member instances.

Once all Solr member instances are up and running, you will be able to browse to the SolrCloud administration panel dashboard.

(Screenshot: SolrCloud_Capture)

If you have reached this step, you will have successfully set up a vanilla SolrCloud installation.

Schema xml Generation

Sitecore requires Solr's schema.xml to be generated to include dynamic fields that map to Sitecore's system fields, so that proper indexing can take place on Solr. A few steps are required to do this.

You will need to modify the original schema.xml file to prepare it for Sitecore’s Solr Schema Generator (available under Sitecore’s Control Panel) to generate the correct and final schema.xml to be loaded within your cores. For detailed steps to perform this, refer to https://kb.sitecore.net/articles/227897.

Once the schema.xml is generated, and if you are planning to use a single language for your Sitecore platform, you may replace the following in the final schema.xml.

From:

<dynamicField name="*_t_ar" type="text_ar" indexed="true" stored="true" />
<dynamicField name="*_t_bg" type="text_bg" indexed="true" stored="true" />
<dynamicField name="*_t_ca" type="text_ca" indexed="true" stored="true" />
<dynamicField name="*_t_cz" type="text_cz" indexed="true" stored="true" />
<dynamicField name="*_t_da" type="text_da" indexed="true" stored="true" />
<dynamicField name="*_t_de" type="text_de" indexed="true" stored="true" />
<dynamicField name="*_t_el" type="text_el" indexed="true" stored="true" />
<dynamicField name="*_t_es" type="text_es" indexed="true" stored="true" />
<dynamicField name="*_t_eu" type="text_eu" indexed="true" stored="true" />
<dynamicField name="*_t_fa" type="text_fa" indexed="true" stored="true" />
<dynamicField name="*_t_fi" type="text_fi" indexed="true" stored="true" />
<dynamicField name="*_t_fr" type="text_fr" indexed="true" stored="true" />
<dynamicField name="*_t_ga" type="text_ga" indexed="true" stored="true" />
<dynamicField name="*_t_gl" type="text_gl" indexed="true" stored="true" />
<dynamicField name="*_t_hi" type="text_hi" indexed="true" stored="true" />
<dynamicField name="*_t_hu" type="text_hu" indexed="true" stored="true" />
<dynamicField name="*_t_hy" type="text_hy" indexed="true" stored="true" />
<dynamicField name="*_t_id" type="text_id" indexed="true" stored="true" />
<dynamicField name="*_t_it" type="text_it" indexed="true" stored="true" />
<dynamicField name="*_t_ja" type="text_ja" indexed="true" stored="true" />
<dynamicField name="*_t_lv" type="text_lv" indexed="true" stored="true" />
<dynamicField name="*_t_nl" type="text_nl" indexed="true" stored="true" />
<dynamicField name="*_t_no" type="text_no" indexed="true" stored="true" />
<dynamicField name="*_t_pt" type="text_pt" indexed="true" stored="true" />
<dynamicField name="*_t_ro" type="text_ro" indexed="true" stored="true" />
<dynamicField name="*_t_ru" type="text_ru" indexed="true" stored="true" />
<dynamicField name="*_t_sv" type="text_sv" indexed="true" stored="true" />
<dynamicField name="*_t_th" type="text_th" indexed="true" stored="true" />
<dynamicField name="*_t_tr" type="text_tr" indexed="true" stored="true" />

To:

<dynamicField name="*_t_ar" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_bg" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_ca" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_cz" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_da" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_de" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_el" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_es" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_eu" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_fa" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_fi" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_fr" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_ga" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_gl" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_hi" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_hu" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_hy" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_id" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_it" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_ja" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_lv" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_nl" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_no" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_pt" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_ro" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_ru" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_sv" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_th" type="text_general" indexed="true" stored="true" />
<dynamicField name="*_t_tr" type="text_general" indexed="true" stored="true" />

The Sitecore-generated schema.xml must replace the original schema.xml in the configuration folder. This configuration will be used as the shared configuration uploaded to Zookeeper. Uploading the configuration to Zookeeper is discussed in the “Upload Configuration to Zookeeper” section later in this article.
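Rather than hand-editing thirty dynamicField lines, the swap can be scripted. Below is a sketch using sed against a single sample line; to edit the real file you would run the same expression with `sed -E -i` against your schema.xml (path omitted, as it depends on your core layout):

```shell
# Rewrite a language-specific *_t_XX dynamic field type to text_general.
echo '<dynamicField name="*_t_de" type="text_de" indexed="true" stored="true" />' \
  | sed -E 's/(name="[*]_t_[a-z]{2}" type=")text_[a-z_]+/\1text_general/'
```

The capture group anchors the substitution to the `*_t_XX` dynamic fields only, so other field types in the schema are left untouched.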

Solr and Sitecore Integration

Next, you will need to integrate Solr with Sitecore, which involves installing a valid Sitecore Solr package containing a set of Sitecore and Solr .NET assemblies and configuration files to be included within your Sitecore website root. The package can be downloaded from Sitecore’s official documentation website.

Before you install the package, I strongly suggest taking a backup of your Sitecore website using Sitecore Instance Manager (SIM). This protects you in case you accidentally skip a step during the installation process, or apply a configuration change incorrectly and forget about it along the way. Follow the necessary steps to enable the Solr index configurations and disable the Lucene config files. There is also a step to modify Global.asax so that the Sitecore application can initialize the Solr index and content search configurations on application start-up. To avoid re-inventing instructional steps, this article will not describe the detailed steps to integrate Solr with Sitecore, but you must complete the integration before proceeding to the next part of the series.

For more information on integrating Solr with Sitecore, refer to the community article: https://sitecore-community.github.io/docs/search/solr/Configuring-Solr-for-use-with-Sitecore-8/
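The enable/disable step mentioned above amounts to renaming config files so that only one search provider is active. Below is a minimal sketch of that idea; the file names shown are illustrative (exact names vary between Sitecore versions), so review each rename before applying it inside your website's App_Config/Include folder.

```shell
# Sketch: flip the search provider configs in the current folder. Enables the
# Solr configs (strips ".disabled") and parks the Lucene ones. File names in
# the comments are examples only; verify against your Sitecore version.
toggle_search_provider() {
  shopt -s nullglob                  # loops simply do nothing if no file matches
  for f in *Solr*.config.disabled; do
    mv "$f" "${f%.disabled}"         # e.g. Sitecore.ContentSearch.Solr.Indexes.config.disabled -> .config
  done
  for f in *Lucene*.config; do
    mv "$f" "$f.disabled"            # keep the Lucene configs around, but inactive
  done
}
```

Run toggle_search_provider from the Include folder (for example via Git Bash on the Windows host), or simply apply the same renames by hand.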

Identify Solr Cores

Identify the Solr cores that need to be set up. For Sitecore XP 8.0 Update-4, the following cores are required:

  1. sitecore_master_index
  2. sitecore_web_index
  3. sitecore_core_index
  4. sitecore_suggested_test_index
  5. sitecore_testing_index
  6. fxm (both sitecore_fxm_master and sitecore_fxm_web read from this single core; in SolrCloud, a core is referred to as a collection)
  7. sitecore_list_index
  8. social_messages_master
  9. social_messages_web
  10. sitecore_analytics_index
  11. sitecore_marketing_asset_index_master
  12. sitecore_marketing_asset_index_web

Upload Configuration to Zookeeper

Unlike standalone Solr, where each index carries its own configuration, a SolrCloud collection reads its configuration from a single shared, managed location. This requires uploading the configuration to ZooKeeper; it is then attached to each collection created across the Solr members of the quorum.

Tip: It is recommended to use a single managed configuration for all of your collections.

In one of your Solr VMs, perform the following in order:

  1. Make a copy of the basic_configs folder in E:\solr-5.3.0\server\solr\configsets and rename the new folder to sitecore_config.
  2. Copy the generated schema.xml into the sitecore_config folder.
  3. cd to E:\solr-5.3.0\server\scripts\cloud-scripts\
  4. Run zkcli.cmd to upload the configuration to ZooKeeper:
    zkcli.cmd -zkhost localhost:2181 -cmd upconfig -confdir E:\solr-5.3.0\server\solr\configsets\sitecore_config -confname sitecore_common_config
  5. Once step 4 completes, the configuration is uploaded to ZooKeeper and shared with the Solr members of the cluster.

Create the Collections

In SolrCloud, a collection is the logical index that is made up of physical cores. For example, sitecore_master_index is a logical collection comprising the following physical indexes:

  1. sitecore_master_index_shard1_replica1
  2. sitecore_master_index_shard1_replica2
  3. sitecore_master_index_shard1_replica3

Once all Solr members are up and running, the next step is to create all 12 logical collections in SolrCloud.

From a command line, perform the following in order:

cd D:\Solr-5.3.0-Instance\bin
./solr.cmd create_collection -c sitecore_core_index -d sitecore_config -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c sitecore_master_index -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c sitecore_web_index -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c sitecore_analytics_index -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c sitecore_marketing_asset_index_master -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c sitecore_marketing_asset_index_web -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c sitecore_testing_index -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c sitecore_suggested_test_index -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c fxm -n sitecore_common_config -shards 1 -replicationFactor 3; 
./solr.cmd create_collection -c sitecore_list_index -n sitecore_common_config -shards 1 -replicationFactor 3;
./solr.cmd create_collection -c social_messages_master -n sitecore_common_config -shards 1 -replicationFactor 3

The first command above creates the collection and uploads the configuration specified in the sitecore_config directory to a zookeeper directory called sitecore_common_config.

The commands after the first utilise the ZooKeeper configuration uploaded in the first command. All collections share this same configuration.

Each create collection command creates the physical index folders across the three Solr member instances. The command issues an HTTP request to the Solr server, which executes the collection creation and ensures all the appropriate index folders and configuration information are created accordingly. A success message is displayed once the command completes successfully.
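The eleven near-identical create_collection commands above can also be generated with a small loop, which helps avoid copy-paste typos. A sketch, using the collection names from this article; adjust -shards and -replicationFactor to your own quorum:

```shell
# Generate the create_collection command lines for every Sitecore collection.
# Prints them for review; pipe into a .cmd file or execute them one by one.
generate_create_commands() {
  for c in sitecore_core_index sitecore_master_index sitecore_web_index \
           sitecore_analytics_index sitecore_marketing_asset_index_master \
           sitecore_marketing_asset_index_web sitecore_testing_index \
           sitecore_suggested_test_index fxm sitecore_list_index \
           social_messages_master; do
    echo "solr.cmd create_collection -c $c -n sitecore_common_config -shards 1 -replicationFactor 3"
  done
}
generate_create_commands
```

Note that the first collection you create should also pass -d sitecore_config so the configuration is uploaded; the rest reuse the named ZooKeeper configuration.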

In summary, you should by now have a fully working SolrCloud service. To validate that the SolrCloud service’s collections are attached to Sitecore, log in to your Sitecore CMS as an Administrator and attempt to rebuild each index. For example, when testing the rebuild of the Master index, monitor the directory of the physical index folder for index files recreated with the latest timestamps. Repeat this for the other indexes. This validates a successful, fully working SolrCloud installation, configuration and setup.
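Beyond watching the index files on disk, the Solr Collections API offers a quick cluster-side check. A fragment (it assumes a SolrCloud node listening on the default port 8983; adjust the host to your environment):

```shell
# Query the cluster state; each replica of each collection should
# report a state of "active" once the rebuild has completed.
curl "http://localhost:8983/solr/admin/collections?action=CLUSTERSTATUS&wt=json"
```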

In the next part of this article, I will demonstrate how you can use an automated approach to spin up your SolrCloud service using a Windows Scheduled Task.

Posted in .NET, Sitecore

Sitecore 8 Upgrade: Lessons Learned

In my first Sitecore 8 upgrade, described in my Upgrade from Sitecore 6.6 to Sitecore 8 series, I learnt several valuable lessons along the way. I cannot spell out every one of them, so I have decided to share a summary of the high-level points. I hope you will be kind enough to add to this list; after all, as Sitecorians, sharing is caring.

Lesson 1: Address your customer requirements well.

Although we addressed all the key success criteria for my customer, I feel we sometimes fail to address requirements well when we do not understand what is needed to resolve the underlying problem.

The key thing is to understand your customer’s environment. Next, ensure that you are confident in your proposed solution. This comes with much planning and analysis of the system’s problem points and how the upgrade can address them.

If you are new to Sitecore, I strongly recommend reading Sitecore MVP blogs and visiting the Sitecore Community forums to speed up your learning. Having said that, nothing beats going through the original Sitecore documentation on SDN and docs.sitecore.net to understand the basics and fundamentals.

Communicate with your client often to gather information during the analysis phase. In the first series, I explained the must-ask questions for your customer when performing an upgrade. Use these as a guideline to help you make good decisions about your proposed solution.

Build a presentation deck describing the scope of your work and how it will address your customer’s requirements, whether performance issues, new features, or something else. This better aligns your understanding and your customer’s when you present the deck, and it prompts a Q&A session afterwards to clarify requirements further.

Lesson 2: Never gold-plate your customer requirements.

Keep things simple. Ensure requirements are met to baseline standards. This can save you long hours of development whilst minimising the risk of failing to deliver to your customer on time and on budget.

Putting your customer’s main needs first was the driving factor for the project’s success!

Lesson 3: If your customers are not familiar with Sitecore’s new CMS UI search, show it to them early, before project kick-off.

Sitecore 6.x does not feature any ability to search content or media using keywords, tags or facets.

Therefore, when required, demo Sitecore 8 early in the phase to show how its users will find the new search features. This will be their daily bread and butter, so they must like the system and be comfortable using it. This is especially important to prevent surprises later, during User Acceptance Testing or training sessions, when users may turn out to be unhappy with certain features. Get agreement much earlier, negotiate on the product’s basic features, and put this down in the requirements.

We received just two change requests pertaining to user experience after the User Acceptance stage. The above saved my team a lot of effort.

Lesson 4: My index rebuild is timing out and taking an extremely long time. Thank you, dear Sitecore logs.

Turn off Analytics if you do not need it.

Thanks to our best friend, the Sitecore logs, I managed to pinpoint the root cause of the issue.

If you are not working with Analytics, turn it off. Yes, that is simple to say; however, in Sitecore 8.0 it is not the flip of a switch (i.e. it is not just a matter of disabling one setting in your .config). More information will be posted in a separate article about this.
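As a first step (though, as noted above, not the whole story), the master switch can be patched off with a standard include file. A sketch, assuming the Analytics.Enabled setting name used by Sitecore 8.0; the file name is illustrative:

```xml
<!-- App_Config/Include/z.DisableAnalytics.config (hypothetical file name) -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="Analytics.Enabled">
        <patch:attribute name="value">false</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>
```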

Turning off Analytics reduced index rebuild time by two hours.

The reason we needed to turn it off was that a full index rebuild of the Master database took many hours and timed out at the final step of index optimisation. This optimisation operation is performed by Solr prior to finalising and preparing its index for query readiness.

Turn off Item Cloning

This can be done by setting the ItemCloning.Enabled config property to false. It is a golden configuration that not many people know about, but before you turn it off, be sure you understand what it does. Disabling this property can substantially reduce memory consumption and the number of SQL calls to the Master database. Turning off this setting reduced our index rebuild time by a further four hours.
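For reference, a minimal sketch of the relevant line as it would appear in an include patch (the setting name is as given above; placement under an App_Config/Include patch file is the usual convention):

```xml
<!-- placed under <sitecore><settings> in an include patch file -->
<setting name="ItemCloning.Enabled" value="false" />
```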

Increase your DI Container Timeout

We used SolrCloud as our search provider with Sitecore. Using Solr meant we needed to choose a DI container. Increasing the timeout on our DI container extended the time allowed to rebuild and optimise the full Master index. By default, Sitecore sets this to a 30-minute timeout. The timeout value can be set by overriding the Application_Start method in Global.asax.

Lesson 5: Use console applications to run long-running processes.

This is especially true for the data migration implementation. We chose to use console applications to develop our migrator agents, which migrated thousands of news stories, articles and media definition items to designated Sitecore buckets.

Although Sitecore does feature background jobs, in our case we were primarily concerned with web timeout operations. Using a console app separated all our long-running activities from the web context and gave us assurance against timeout issues. This was especially true for large batches of data, particularly media definition items carrying large binary data. Moreover, building a console application gave us better control over code customisation and ensured a smooth, reliable migration process.

The above approach helped us to complete all data migration activities before the specified deadline.

Lesson 6: Make a backup of your data just before you execute your migration process.

I can’t emphasise this enough, especially for when you mess up the data. We had an issue where we did not have a point-in-time backup to restore to after our Media Migrator Agent accidentally migrated Sitecore’s default system images into our designated custom media folder bucket. We then had to restore to that morning’s data and repeat a few complex activities before we could run the agent again.

You may even need to repeat the migration process several times to test different scenarios. Back up your data, just in case!

Lesson 7: Data migration may slow down tremendously after a few hours.

This was because Sitecore creates an event record for each content or definition item that is moved during migration. In turn, this creates heavy write operations against the Master database, which causes the migration process to slow down progressively.

Periodically clean your EventQueue and PublishQueue tables to prevent the publishing queue from backing up. Perform an IIS restart to ensure that all processes and blocking threads are flushed before resuming the migration agent.
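A hedged sketch of such a cleanup. The table and column names are from a Sitecore 8 Master database, but the one-day retention window, server and database names are illustrative; back up first, verify against your version, and run only during a maintenance window:

```shell
# Trim event and publish queue rows older than one day (example retention).
sqlcmd -S . -d Sitecore_Master -Q "DELETE FROM [EventQueue] WHERE [Created] < DATEADD(DAY, -1, GETUTCDATE())"
sqlcmd -S . -d Sitecore_Master -Q "DELETE FROM [PublishQueue] WHERE [Date] < DATEADD(DAY, -1, GETUTCDATE())"
```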

Lesson 8: Sitecore’s Content Management server eats memory like a beast

While performing a full index rebuild, we noticed that the IIS instance memory for our Sitecore solution grew as the rebuild progressed. Turning off some of the extra activities in our Sitecore configuration reduced memory consumption slightly; having said that, Sitecore still consumes memory as it caches the indexed items.

The solution was to increase memory capacity on the CM server to accommodate the index rebuild.

Lesson 9: Provision more CPU for your Solr servers if using SolrCloud

We implemented SolrCloud for our customer, with each Solr server holding an exact copy of the out-of-the-box indexes and our custom indexes. Three Solr servers were set up as a minimal quorum for leader election among the participating nodes (also known as followers), should one of the servers go down in an unexpected outage or service interruption. This setup is the minimum required for high data availability.

While monitoring the Solr servers during the first week after go-live, we noticed CPU levels peaking intermittently during large traffic spikes and during intervals when large batches of articles were created at particular times of the day. This caused poor loading of the home landing page at those periods.

We realised that the Solr services on the three servers were consuming high CPU to replicate their indexes across the followers; Solr replication is an expensive operation. The resolution was to increase CPU capacity to accommodate the replication workload.

I hope this article helps you with your upgrade. Do note that every upgrade has its own requirements and traps; the above is solely a guide, and what I can share from my experience so far.

If there is anything you would like to add or remark on, please feel free to comment and share on my blog. It will be equally exciting to hear what others have learnt in their Sitecore upgrade projects.

Posted in .NET, Sitecore

Upgrading Sitecore 6.6 to Sitecore 8 (Part 1 of 2)

After a few months of being preoccupied with a prior project and a presales assignment, I am finally able to spend some time sharing my experience as Technical Lead on a Sitecore rebuild and migration for our client, over a six-month period ending in early December 2015. Our client is a prestigious news press and digital media company that uses Sitecore as the platform and repository for thousands of news articles and stories. I hope this article is as enjoyable to read as it was to write and recap.

The single most important thing when working on a Sitecore upgrade is planning. I cannot emphasize it enough. This means listing every possible scenario in every category and area of the rebuild and migration. The next thing is communication. I was given the privilege of asking honest questions of my customer during the planning phase around requirements.

Our customer’s primary concern with the system was poor performance. This included how we could better streamline the Editors’ (the content authors’) daily tasks and improve their experience and navigation in the back office. For instance, our users did not enjoy clicking through many levels to reach an article content item, and the thousands of articles accumulated over the years further degraded the user experience through slow loading of content items. In addition, uploading media items took a long time to process and complete. More is covered below under Customer Requirements.

To take things further, I was engaged to put together a presentation for the customer’s Head of Technology on why they needed to upgrade to Sitecore 8. This meant putting their top-priority concerns into a deck and proposing a scope of delivery items addressable with Sitecore 8’s features. I hope the below can help any Sitecore developers or consultants involved in presales or in the actual delivery of a Sitecore upgrade.

Customer Requirements

Every customer has unique requirements and circumstances, and it is extremely important to understand how these can impact them in the short and long term. After attending several preliminary face-to-face meetings with my customer, I set out to understand their concerns and pain points with the existing system. Below are the key pain points:

  1. Poor CM performance, with content authors experiencing “kill page” prompts when navigating multiple levels down to their stories.
  2. Poor publishing performance, which often led to publishing activity being queued. This is a news press company that publishes hundreds of stories every 10 to 15 minutes. This contributes to issue 3.
  3. Slow upload of media images to Sitecore, resulting in failed uploads and further exacerbating CM performance.
  4. High data volume, with thousands of articles and millions of media items stored over the past 10 to 12 years. This also contributes to issue 3.
  5. Poor media, stories and articles taxonomy, which frustrated content authors who had to click through many levels to reach their desired content or media item.
  6. Inconsistency between the indexes on the Content Delivery servers in the Production environment. The customer was using Lucene to index large volumes of content across the Content Delivery servers, which resulted in inconsistent news and article information on the front-end site between servers.
  7. Inability to implement content personalisation, owing to the lack of foundational setup in the current HTML structure and its datasources.

The Upgrade Questions

The next question to ask ourselves is: why do you need to upgrade?

Following up on the issues highlighted above, I put together a list of bullet points on the areas I considered important to highlight to the customer when pushing for an upgrade.

  1. Does a minor/major upgrade address their performance issues immediately and in the long term?
  2. Do we need changes to the infrastructure?
  3. Are there limitations in the customer’s current environment (onsite network limitations, physical hardware, etc.)? If so, how can these affect the platform?
  4. Consider Sitecore’s Product Lifecycle and how long mainstream support will be available for the customer (https://kb.sitecore.net/articles/641167).
  5. Where possible, provide them with Sitecore’s compatibility matrix to compare offerings between Sitecore 8 and Sitecore 7.2, so the customer can understand what they gain from Sitecore 8 and map those features to the issues highlighted in their requirements.
  6. What are the customer’s data and security policies around cloud hosting options?

In terms of the Digital Marketing System, my customer had not enabled this feature due to the performance issues already experienced with the CM. Nevertheless, the customer was keen to realise their digital transformation road map in a second phase, treating the first phase as an opportunity to rebuild and prepare the platform for digital readiness. Hence, it was safe to speak at a high level about Sitecore 8.0’s new Experience Database (xDB) and its offerings. This was a golden point in persuading the customer to adopt the upgrade and use the first phase as a baseline phase, ensuring the critical performance requirements were met.

Planning points

Next, I have gathered a must-have list for any developer prior to planning a Sitecore upgrade with their customer. The questions below are must-ask questions from a CMS and DMS point of view.

Firstly CMS:

  1. How much data is stored in the CMS, such as content items and media definition items?
  2. Are there any customised items such as the following:
    • Custom events, modules, pipelines, processors and commands.
    • Custom agent tasks or schedulers.
    • Custom workflow definitions, items or commands.
  3. What are the various events, pipelines, processors and commands in use?
  4. Are there any Sitecore-owned modules in use? If so, first check that each is compatible with Sitecore 8.
  5. Are there any shared source modules from the Sitecore Marketplace in use? If so, first check that each is compatible with Sitecore 8.
  6. Custom configuration files found in App_Config/Include.
  7. Are there any current third-party integrations or touch points?
  8. Are there any Sitecore Support DLLs issued by Sitecore or obtained from its Knowledge Base site? If so, first check that each is compatible with Sitecore 8.

DMS:

  1. Has the customer adopted DMS on their existing platform? If yes, how do we plan to migrate their analytics data from Sitecore 6.6 to 8? Note that from Sitecore 7.5 onwards, Sitecore introduced the Experience Platform, hosting visitor data, tracking, profiles, personas, campaigns, etc. in MongoDB, a NoSQL document database that supports schemaless data. For more information on MongoDB, see https://docs.mongodb.org/getting-started/shell/introduction/
  2. Are the components of the current Sitecore system configured to support Page Editor mode? Note: the Page Editor experience is a prerequisite for digital marketers to configure content personalisation at a later date.
  3. Regardless of whether the customer has adopted Sitecore’s engagement platform (in this case, DMS), you must spell out the option of hosting the analytics infrastructure on-premise or in the cloud. As mentioned, some companies have strict data security policies around cloud hosting and may prefer to host the xDB infrastructure on premise, whereas others prefer to cut short the MongoDB learning curve and delegate the actual work (processing and aggregating analytics data, auto-scaling, etc.) to the Cloud xDB. List the options for the customer. Posing these options upfront lets them make a considered decision and manage any risks.

The next thing to determine is which version of Sitecore 8 suits your project. Many people will be quick to think that the latest version of Sitecore 8 is the safest choice. Thanks to some advice from a good solution architect friend at Sitecore, this is not always the case. Always check the feature release version of the product; not every latest feature release is stable yet. Service packs are definitely recommended, as Sitecore tends to include fixes for major performance issues and security vulnerabilities. I prefer staying one version below the latest update, and if there is a service pack available from Sitecore’s 8 release downloads, I use the service pack over the feature release.

In the next part of this series, I will demonstrate some of our approaches to addressing these performance pain points, described in summary and accompanied by a technical architecture diagram for ease of understanding.

To refer to key learning points of the upgrade, you can read my experience article here.

Please stay tuned.

Posted in Sitecore

Sitecore Azure 7.2 – My First Sitecore Azure Evaluation Story

Recently, I had the chance to run a pilot project for a customer to demonstrate the capabilities of automating their Sitecore deployment with Sitecore Azure 7.2. Sitecore Azure is a module developed to ease the deployment of your Sitecore solution to the Azure Cloud Service, typically known as Platform-as-a-Service (PaaS). This article shows the basic ideas behind configuring your setup to support the Sitecore Azure module for easy, automated deployment of your Sitecore solution to the cloud. I will demonstrate a typical deployment scenario of a staging delivery farm, and thereafter how it can be promoted to a production delivery farm.

With some valuable lessons learnt along the way, I hope this article benefits other readers so that they do not face the same hard and bumpy challenges. If you have an opinion to share, you are most welcome to comment on this article.

One note: circumstances will vary with your situation and requirements. It is recommended to verify which Sitecore Azure module version is compatible with and suitable for your requirements; refer to the official Sitecore Azure compatibility matrix. Once you have decided on the correct module version, install the Sitecore Azure module on the CM using the UpdateInstallationWizard page.

I will not explain the detailed installation steps for the module, as this information is readily available in the SDN documentation, Getting Started with Sitecore Azure.

Installing the Management Certificate

Communication between Azure and Sitecore Azure requires a secured channel for transmitting data from Sitecore Azure to the cloud service. You first need to set up a management certificate so that Sitecore Azure is authorised to send requests to Azure via the Azure REST Management API.

I proceeded with option A): downloading the .publishsettings file from my current Azure subscription (https://manage.windowsazure.com/publishsettings/index) and then uploading this management certificate (which holds both the private and public certificates for Azure) via the Sitecore Azure prompts. Note that you will be prompted to upload a management certificate after you first upload your Sitecore environment file. This environment file is unique to your Sitecore solution and is usually obtained from Sitecore within one or two days of a request being made.

Capture2

Once this is done, Sitecore Azure will be able to submit API service request calls to Azure to manage your Sitecore deployments.

Note that Sitecore has addressed two bugs around uploading the management certificate. It is important to apply the KB fixes below.

  1. Error installing a management certificate for Sitecore Azure without the appropriate application pool permissions (https://kb.sitecore.net/articles/823113)
  2. Error installing a management certificate in Sitecore Azure when using the 2.0 format of the publish settings file (https://kb.sitecore.net/articles/075335)

As an optional setup, you can also manually configure SSL in Sitecore Azure using this KB article (https://kb.sitecore.net/articles/661557)

Creating a Staging Delivery Farm

I then created a delivery farm to host my CD servers in PaaS, in the staging slot. Notice that the environment type is set to ‘Evaluation [Staging]’. CD servers in PaaS are typically called ‘web roles’. In my case, I created five delivery farms (three Production and two Staging).

Sitecore Azure provides an integrated user interface to visualise the various deployment spots across the globe. I recommend choosing a data centre closest to your area to minimise cost.

Capture4

The deployment topology used in my case is the ‘Delivery’ deployment topology, meaning only my CD servers are hosted in PaaS while my CM remains hosted on-premise. See the figure above, with only delivery farms listed.

When you first add a delivery farm in a given data centre region, Sitecore Azure brings up the New Deployment dialog. Choose the number of instances and VM size appropriate for your environment. I chose the default settings, since this is a staging environment used only for evaluation purposes. By default, Sitecore Azure sets the number of instances to 2 and the VM size to Small.

For production purposes, Microsoft recommends a minimum of 2 instances to satisfy the SLA of 99.95 percent availability.

Capture6


After you add a delivery farm, Sitecore Azure creates the delivery farm item in the Sitecore content tree. For example, after I add DeliveryFarm05, Sitecore creates DeliveryFarm05 as a farm item with its necessary child items at /sitecore/system/Modules/Azure/[Environment-Name]/[Region Name].

Do not click Start Deployment yet. Click More Options, which brings you to the staging deployment content item that was created. In this example, it navigates to /sitecore/system/Modules/Azure/SomeProject-Evaluation/Southeast Asia/Delivery05/Role01/Staging. You will see something like the figure below.

Capture7


By default, the delivery farms are set to share the same Production and Staging databases. Leave this as is.

Capture5


Database references are set to point at existing databases. This is my preference: I do not want Sitecore Azure to create the Sitecore databases for me, as that consumes extra deployment time. Instead, I configure it to point at the existing Sitecore databases (core, master, web, analytics and wffm).

Ensure that the Connection Strings Patch field is also set with the same database connection strings as specified in your database set (set01). See the figure below, and refer to this KB article for more information: https://kb.sitecore.net/articles/440060
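For illustration, the patch field typically carries a plain connectionStrings fragment along these lines. The server, catalog names and credentials below are placeholders, not values from this project; verify the exact expected format against the KB article referenced above:

```xml
<connectionStrings>
  <add name="core" connectionString="Data Source={server};Initial Catalog=Sitecore_Core;User ID={user};Password={password}" />
  <add name="master" connectionString="Data Source={server};Initial Catalog=Sitecore_Master;User ID={user};Password={password}" />
  <add name="web" connectionString="Data Source={server};Initial Catalog=Sitecore_Web;User ID={user};Password={password}" />
</connectionStrings>
```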

Capture8

The next step is to ensure that your service configuration and service definition fields are configured correctly, particularly StorageName, Microsoft.WindowsAzure.Plugins.Caching.ConfigStoreConnectionString and Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString. Ensure SqlServerName is left blank. These settings create the Sitecore logs in Azure as tables upon deployment of the delivery farm. For more info on how Sitecore Azure creates IIS logs, WADLogs, etc., refer to this KB article: https://kb.sitecore.net/articles/400950

Capture9

Once the above is completed, you are ready to Start Deployment. Sitecore Azure will create a brand new staging slot for DeliveryFarm05 if one has not been created yet. Whilst deploying, Sitecore Azure performs the following steps in order:

1. Resolves the databases

2. Builds the package

3. Applies the service configuration changes

4. Executes the package

5. Creates the package

6. Uploads the package to Azure

7. Deploys the package to Azure

8. Runs any startup tasks specified in the service definition file

9. Finally, runs the deployment and starts up the Web Roles

The deployment time will vary depending on factors such as the size of the solution deployed and network bandwidth.

The Sitecore Azure dialog displays status messages indicating the progress of the deployment, all the way up to 100.00% when all Web Roles have finished starting up.

Once deployment is done, you will see the green globe for Delivery05 in the Sitecore Azure dialog, as below. This means the Delivery Farm has started successfully and is running.

Capture12

Turning to your Azure Management Portal, where Sitecore Azure is linked to your account subscription via the Management Certificate trust, you will be able to view the Staging Deployment slot. For example, the Delivery05 slot is created as below.

Capture13

Microsoft Azure permits up to 100 farms per region. In real life you will not deploy anywhere near that many into a production setting, so please be prudent about this choice. Note that the more farms you have, the higher the costs Microsoft will bill you.

Promoting a Staging Delivery Farm to a Production Delivery Farm

Once staging is up and running and the evaluation process is complete, you can easily promote the staging delivery farm to a production delivery farm. Sitecore Azure offers the Swap command, which swaps a staging and production deployment by exchanging the virtual IP addresses of the two deployments. This is known as a VIP swap. A VIP swap incurs no downtime and is extremely fast. When a VIP swap is performed, the production deployment takes over the previous VIP of staging; the staging deployment slot becomes empty and the production deployment slot is replaced by the staging deployment content. This can be seen in the Azure Management Portal once the VIP swap is done. This blog post explains it well: http://blog.toddysm.com/2010/06/update-upgrade-and-vip-swap-for-windows-azure-servicewhat-are-the-differences.html

To perform the VIP swap in Sitecore Azure, click the Swap command entry. Note: this is essentially the same Swap command available in the Azure Management Portal; Sitecore Azure simply provides a convenient shortcut for Sitecore admins to perform this action from the dialog popup.

Capture14

Once done, you will see that the Production Deployment slot is now filled with what was previously in Staging. Swapping typically takes only a short moment to complete, making promotion much faster and easier. When the swap completes, you have successfully promoted your site to Production!

Typically, the status flow will look like the below:

2/1/2015, 4:27:11 PM Deployment was swapped to opposite slot
2/1/2015, 4:27:03 PM …..SaCd05Role01PSc412Staging [S] Sitecore.Azure.Pipelines.SwapAzureDeployment.SwapAzureDeployment [done]. Progress: 100.00%
2/1/2015, 4:26:45 PM …..SaCd05Role01SSc412Staging [S] Sitecore.Azure.Pipelines.SwapAzureDeployment.SwapAzureDeployment [start]

See figure below.

Production Deployment Slot:

Capture15

Staging Deployment Slot:

Capture16

Hope this helps those who are encountering Sitecore Azure for the first time. Feel free to drop me questions or share your opinions.

Overall, I found the experience with Sitecore Azure very seamless; it automated a lot of the tedious deployment work required for a Sitecore solution, be it small or large scale. It introduces a very streamlined approach to deploying code files and configurations to the Azure cloud. Although there are other ways to deploy a Sitecore solution manually to Azure PaaS, Sitecore Azure provides a nice visual of the status of your deployments in each delivery farm you deploy to.

That said, there are further areas to investigate, such as startup task capabilities (automating third-party antivirus installation, instrumenting application performance monitoring on the Web Roles, etc.). It remains to be seen how Microsoft will provide useful managed services around these areas as recommendations for our customers moving forward with Azure Cloud Services.

For some useful references on Sitecore Azure, see the links below, which proved very helpful in my own research.

1. http://www.awareweb.com/awareblog/9-22-14-azuretips

2. http://www.awareweb.com/awareblog/11-26-13-sitecoreazure

Posted in .NET, Sitecore | Tagged | 3 Comments

Sitecore CT3 ClayTablet – Part 1

Recently, I had the opportunity to install, configure and integrate the Sitecore Clay Tablet Translation Connector (also known as CT3) to implement a multilingual solution for one of our Sitecore CMS client projects. In this article I will refer to it as CT3 rather than by its long name. I realised that there aren’t many blogs out there that provide the basic installation steps to get the CT3 Connector going, so I decided to put together a short step-by-step guide to the pre-installation requirements. But before I dive in, a quick nutshell of the CT3 Connector: this is the connector Clay Tablet has developed to connect the Sitecore CMS to the Clay Tablet Platform, which receives and routes content from the CMS to translation providers and back. We can think of the connector as the bridge sitting between Sitecore and Clay Tablet.

Firstly, you will need the following.

1. Obtain the CT3 translation account license keys from Clay Tablet. There is usually one license key per translation provider. The license key comes as an .xml file, such as source.xml. A target.xml file is also supplied by Clay Tablet to indicate the Clay Tablet platform as the destination where content will be sent. You need not modify source.xml or target.xml.

Note: You cannot duplicate the license keys on two or more different environments. Doing so will cause unexpected behaviour with the CT3 connector, resulting in lost translated content from the translation providers, orphaned projects, etc.

2. Obtain the CT3 installer package from Clay Tablet. This is a Sitecore installation package (.zip file) which you will install via the Installation Wizard. Don’t do this step yet; we have not reached the installation stage.

3. Obtain the CT3 database zip file from Clay Tablet, which includes “CT3Translation.mdf” and “CT3Translation_log.mdf”. You will need to attach the CT3 database on the same database server where your Sitecore (web, master and core) databases are located. Make sure the CT3 database is online.

4. Configure the CT3 database connection string:

You need to add one more connection string named “CT3Translation” (do not use any other name) for the CT3 database. The “User ID”, “Password” and “Data Source” values are usually the same as those used for the other connection strings, and “Database=CT3Translation”, where “CT3Translation” is the name of the CT3 database attached in step 3.
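As a sketch, the entry in ConnectionStrings.config would look something like the following. The server name and credentials are placeholders:

```xml
<!-- Sketch only: server, user and password are placeholders -->
<add name="CT3Translation"
     connectionString="Data Source=yourSqlServer;Database=CT3Translation;User ID=yourUser;Password=yourPassword" />
```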

5. Prepare the required folders and set up permissions. Create a folder called “CT3” (I recommend matching this case exactly) and then create two subfolders called “Accounts” and “data” inside the newly created “CT3” folder. Then grant the Windows account and IIS identity read permissions on “CT3/Accounts”, and full permissions on “CT3/data”. Place the two account license keys from Clay Tablet into “CT3/Accounts”.

Note: The CT3 folder must be placed under the Sitecore data folder (not the Sitecore website root folder). You can check your web.config file and search for this line:

<sc.variable name="dataFolder" value="/App_Data" />
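The folder structure from step 5 can be sketched as below. The data folder path here is illustrative; use whatever your dataFolder variable points to, and on Windows grant the permissions with icacls or the folder Security dialog:

```shell
# Create the CT3 folder structure under the Sitecore data folder.
# DATA_FOLDER is a placeholder for the folder the dataFolder variable points to.
DATA_FOLDER=./App_Data
mkdir -p "$DATA_FOLDER/CT3/Accounts" "$DATA_FOLDER/CT3/data"
# Copy the source.xml and target.xml license keys into CT3/Accounts, then grant
# the IIS identity read access on CT3/Accounts and full control on CT3/data.
```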

6. Install the package using the “Installation Wizard” under “Development Tools” on the Sitecore Desktop. You will need Administrator privileges to perform the installation. Wait for the installation to complete. I recommend using Internet Explorer for package installations. According to the Sitecore Clay Tablet installation document, the package installs one important file called CT3_LanguagesCodes.txt. If you do not see this file (which I have experienced before), do not be alarmed; you can simply open the package .zip file and copy out that .txt file.

For reference, the CT3_LanguagesCodes.txt file lists all the region language codes that are valid for Sitecore Clay Tablet. It may be used as a reference in the configuration step.

7. Once installed, you will see the new “CT3Translation” tab in the ribbon bar in the Sitecore Content Editor.

test

But hold your horses, we are not done yet: there is 20 percent more to go. We still need to perform a few parameter configurations.

In my upcoming part 2 article, I will talk about parameter configurations and CT3 workflows.

Posted in .NET, Sitecore | Tagged , | Leave a comment

Setting up Jenkins as a CI Server for .NET 3.5/4.0/4.5 projects using BitBucket

Recently, I joined a team at Tribal DDB Melbourne and convinced my team of developers to adopt an approach to streamline deployments of our various client projects. I set out to look at Jenkins (previously named Hudson), a CI product that originated at Sun Microsystems, to implement a continuous build approach for the developer teams. After a few days of googling articles and blog posts, I decided to come up with my own simple set of steps to get one up and running, along with the gotchas and traps. Considering not many blogs have documented gotchas with building projects targeting the .NET Framework 4.5, I have decided to include them here.

Step 1:
Download the latest version of Jenkins and run the installer, which automatically steps you through the wizard. The installer will install the Windows service with the user set as the Local System account. Ensure that the service is running: browse to Services and look up Jenkins; it should say ‘Started’. (https://wiki.jenkins-ci.org/display/JENKINS/Installing+Jenkins+as+a+Windows+service)

Step 2:
Restart your machine to ensure that the installation is performed completely.

Step 3:
Ensure that the Jenkins page loads. Fire up a browser and enter the following URL: http://[your-machine-name]:8080. You should see the landing page of your Jenkins home page.

Step 4:
We need to install a few tools on our CI server to support Git, since my team already uses BitBucket as the source control repository for their projects. This requires a few plugins, such as the Git Client and Git Server plugins, the Jenkins Git Plugin and the MSBuild Plugin, all available from the Jenkins > Manage Plugins page. The MSBuild Plugin allows you to build .NET projects by specifying the location of MSBuild.exe on the Jenkins configuration page (https://wiki.jenkins-ci.org/display/JENKINS/MSBuild+Plugin), whilst the Jenkins Git Plugin allows communication between Jenkins and the BitBucket server. Next, install the latest versions of Git Bash and GitExtensions onto your server. After all this, you will have a ‘Git Bash’ icon on your screen.

Step 5:
Installing the plugins in step 4 is not enough to set up communication between BitBucket and Jenkins. We also need to set up authentication so that Jenkins is allowed to perform Git commands against the BitBucket server. Because Jenkins has no means of prompting the user for a password each time it intends to clone or pull from BitBucket, we need to set up SSH keys on our CI server and in BitBucket. Read up here for more on SSH and setting up SSH for Git: https://confluence.atlassian.com/display/BITBUCKET/Set+up+SSH+for+Git. In a nutshell, you will set up a private key on your CI server and a public key on your BitBucket account. Note that your BitBucket account will be the admin account for Jenkins, so it is important to remember its credentials.

Step 6:
Add the SSH keys to your CI server. Leave the passphrase blank for BitBucket to authenticate; otherwise you will be left with a hanging “Fetching upstream changes from origin” message when attempting to run the build for the project. You can set up authentication later by managing users, which will be sufficient to restrict people to particular project builds. This step had me confused as to why the build hung with that message; leaving the passphrase blank solved the problem, as Jenkins could then push and pull from BitBucket.
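A minimal sketch of generating a passphrase-less key pair for Jenkins follows. The file name and comment are illustrative; the public key is what you paste into your BitBucket account's SSH keys settings:

```shell
# Generate an RSA key pair with an empty passphrase (-N "") so Jenkins
# can authenticate to BitBucket non-interactively.
ssh-keygen -t rsa -b 4096 -N "" -f ./jenkins_bitbucket -C "jenkins-ci"

# The public half is what goes into BitBucket:
cat ./jenkins_bitbucket.pub
```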

Step 7:
Create a test project in Jenkins and set up the necessary details and configuration so that it points to your project. A good test is to check whether Jenkins can clone your project successfully into its workspace. Once this works, you are ready to set up the build for your .NET solution or .csproj.

Step 8:
Next, it is time to set up your MSBuild scripts. I found a good place to start is an article by Mustafa (http://www.infoq.com/articles/MSBuild-1). I never knew how easy it was to get started with MSBuild. Mustafa also explains well how to set up a proper Jenkins job, along with build triggering from a BitBucket push, etc.
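As a sketch, a minimal MSBuild script for such a job could look like the following. The solution name and default configuration are placeholders:

```xml
<!-- build.proj: a minimal sketch; MySolution.sln and Release are placeholders -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Build">
  <PropertyGroup>
    <!-- Default to Release unless a configuration is passed in -->
    <Configuration Condition="'$(Configuration)' == ''">Release</Configuration>
  </PropertyGroup>
  <Target Name="Build">
    <!-- Build the solution with the chosen configuration -->
    <MSBuild Projects="MySolution.sln" Properties="Configuration=$(Configuration)" />
  </Target>
</Project>
```

You would then point the MSBuild build step in your Jenkins job at this file and pass any extra properties via its command-line arguments field.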

Some of the recommended plugins:

Artifact Deployer Plugin
This plugin allows you to choose which directories you would like to deploy to the target destination (usually where your websites are located, such as Inetpub).
https://wiki.jenkins-ci.org/display/JENKINS/ArtifactDeployer+Plugin

If you have anything else to share, feel free to drop me a comment and let me know how you go with setting up CI with BitBucket.

Posted in .NET | Tagged , , , , | 2 Comments