Sitecore 8 Upgrade: Lessons Learned

In my first Sitecore 8 upgrade engagement, described in my Upgrade from Sitecore 6.6 to Sitecore 8 series, I learnt several valuable lessons along the way. I will not be able to spell all of those out for you, so I have decided to share a summary of the high-level points. I hope you will be kind enough to add to this list. After all, as Sitecorians, sharing is caring.

Lesson 1: Address your customer's requirements well.

Having addressed all the key success criteria for my customer, I feel that we sometimes fail to address requirements well when we do not understand what is needed to resolve the underlying problem.

The key thing is to understand your customer's environment. Next, ensure that you are confident in your proposed solution. This comes with much planning and analysis of the system's problem points and of how the upgrade can address them.

If you are new to Sitecore, I strongly recommend reading Sitecore MVP blogs and visiting the Sitecore Community forums to speed up your learning. Having said that, nothing beats going through the original Sitecore documentation on SDN and the Sitecore documentation site to understand the basics and fundamentals.

Communicate with your client often to gather information during the analysis phase. In the first series, I explained the must-ask questions for your customer when performing an upgrade. Use these as a guideline to help you make good decisions on your proposed solution.

Build a presentation deck that describes the scope of your work and how it will address your customer's requirements, whether those concern performance issues, new features, etc. This aligns your understanding of the subject matter with your customer's when you present the deck, and it encourages a Q&A session after the presentation to clarify requirements further.

Lesson 2: Never gold-plate your customer's requirements.

Keep things simple and ensure requirements are met to baseline standards. This can save you long hours of development while minimising the risk of failing to deliver to your customer on time and on budget.

Putting your customer's main needs first was the driving factor for the project's success!

Lesson 3: If your customers are not familiar with Sitecore's new CMS search UI, show it to them before project kick-off.

Sitecore 6.x does not offer the ability to search content or media using keywords, tags or facets.

Therefore, where required, demo Sitecore 8 early in the project to show how comfortable the new search features will be for their users. This will be their daily bread and butter, so they must like the system and be comfortable using it. This is especially important to prevent surprises later, during User Acceptance Testing or training sessions, when users may turn out to be unhappy with certain features. Get agreement much earlier on, negotiate around the product's standard features, and put this down in the requirements.

We received just two change requests pertaining to user experience after the User Acceptance stage. The above saved my team a lot of effort.

Lesson 4: My index rebuild is timing out and taking an extremely long time. Thank you, dear Sitecore logs.

Turn off Analytics if you do not need it.

Thanks to our best friend, the Sitecore logs, I managed to pinpoint the root cause of the issue.

If you are not working with Analytics, turn it off. Yes, that is simple to say; however, in Sitecore 8.0 it is not the flip of a switch (i.e. it is not just a matter of disabling a single setting in your .config). More info will be posted in a dedicated article.

Turning off Analytics reduced index rebuild time by two hours.

The reason we needed to turn it off was that a full index rebuild of the Master database took long hours and timed out at the final step, index optimisation. This optimisation operation is performed by Solr prior to finalising and preparing its index for queries.
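
As a starting point only (since, as noted, more is involved than a single switch), the main switches in Sitecore 8.0 live in the analytics include files. The patch below is a sketch; the setting names assume Sitecore 8.0, so verify them against your exact revision before relying on this.

```xml
<!-- App_Config/Include/zzz.DisableAnalytics.config (illustrative patch file;
     setting names assume Sitecore 8.0 -- confirm against your revision) -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="Xdb.Enabled">
        <patch:attribute name="value">false</patch:attribute>
      </setting>
      <setting name="Xdb.Tracking.Enabled">
        <patch:attribute name="value">false</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>
```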

Turn off Item Cloning

This can be done by setting the ItemCloning.Enabled configuration property to false. It is a golden configuration that not many people know about. Before you turn it off, though, be sure you understand what the setting does and confirm that your solution does not rely on item clones. Disabling this property can reduce memory consumption substantially and reduce the number of SQL calls to the Master database. Turning off this setting reduced our index rebuild time by a further four hours.
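
A patch file along these lines can be used to disable it (a sketch; as with any setting, confirm it exists under this name in your version before deploying):

```xml
<!-- App_Config/Include/zzz.DisableItemCloning.config (illustrative) -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <settings>
      <setting name="ItemCloning.Enabled">
        <patch:attribute name="value">false</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>
```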

Increase your DI Container Timeout

We used SolrCloud as our search provider with Sitecore. Using Solr meant we needed to choose a DI container. Increasing the timeout on our DI container extended the time allowed to rebuild and optimise the full Master index. By default, Sitecore sets this to a 30-minute timeout. The timeout value can be set by overriding the Application_Start method in the Global.asax.

Lesson 5: Use console applications to run long-running processes.

This is especially true for the data migration implementation. We chose console applications to develop our migrator agents, which migrated thousands of news, article and media definition items into designated Sitecore buckets.

Sitecore does feature background jobs; however, in our case we were primarily concerned with web timeouts. Using a console app separates all long-running activities from the web context and gave us assurance against timeout issues. This matters most for large batches of data, especially media definition items carrying large binary data. Moreover, building a console application gave us better control over code customisation and ensured a smooth, reliable migration process.
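
A minimal sketch of such a migrator agent is shown below. It assumes a console application that references the Sitecore kernel and runs in a context where the Sitecore configuration and Master connection string are available; the content paths and the idea of moving children one by one into a bucket folder are illustrative only, not our actual agent.

```csharp
// Illustrative migrator agent sketch (paths are hypothetical).
using System;
using Sitecore.Collections;
using Sitecore.Data;
using Sitecore.Data.Items;
using Sitecore.SecurityModel;

class MigratorAgent
{
    static void Main()
    {
        Database master = Database.GetDatabase("master");
        Item source = master.GetItem("/sitecore/content/Home/Articles");
        Item bucket = master.GetItem("/sitecore/content/Home/ArticleBucket");

        // SecurityDisabler bypasses Sitecore security checks for the migration run.
        using (new SecurityDisabler())
        {
            // Iterate backwards by index, since items are being moved out of the list.
            ChildList children = source.GetChildren();
            for (int i = children.Count - 1; i >= 0; i--)
            {
                Item article = children[i];
                article.MoveTo(bucket); // raises item:moved events and writes to the EventQueue
                Console.WriteLine("Moved {0}", article.Paths.FullPath);
            }
        }
    }
}
```

Because each move raises events (see the lesson on EventQueue growth below), it pays to batch the work and monitor the Master database while the agent runs.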

The above approach helped us to complete all data migration activities before the specified deadline.

Lesson 6: Make a backup of your data just before you execute your migration process.

I can't emphasise this enough, especially when things go wrong with the data. We had an issue where we did not have a point-in-time backup to restore after our media migrator agent accidentally migrated Sitecore's default system images into our designated custom media folder bucket. We then had to restore that morning's data and repeat a few complex activities before we could run the agent again.

There will even be occasions when you need to repeat the migration process several times to test different scenarios. Back up your data, just in case!
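
For SQL Server-backed Sitecore databases, a point-in-time safety net can be as simple as the following (the database name and backup path are placeholders, not our actual values):

```sql
-- Take a full backup of the Master database before running a migration agent.
BACKUP DATABASE [Sitecore_Master]
TO DISK = N'D:\Backups\Sitecore_Master_PreMigration.bak'
WITH INIT, STATS = 10;
```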

Lesson 7: Data migration may slow down tremendously after a few hours.

This is because Sitecore creates an event record for each content or definition item moved during migration. This in turn creates heavy write operations on the Master database, which causes migration throughput to degrade progressively.

Execute periodic cleaning of your EventQueue and PublishQueue tables to prevent them from growing unchecked and clogging publishing. Perform an IIS restart to ensure that all processes and blocking threads are flushed prior to resuming the migration agent.
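
As an illustration, a periodic cleanup against the Master database might look like this. The one-day retention window is an assumption; verify the column names against your Sitecore version and run it in a maintenance window.

```sql
-- Purge queue entries older than one day (illustrative retention window).
DELETE FROM [EventQueue]   WHERE [Created] < DATEADD(DAY, -1, GETUTCDATE());
DELETE FROM [PublishQueue] WHERE [Date]    < DATEADD(DAY, -1, GETUTCDATE());
```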

Lesson 8: Sitecore’s Content Management server eats memory like a beast

While performing the full index rebuild, we noticed that the IIS memory usage of our Sitecore instance grew as the rebuild progressed. Turning off some of the extra activities in our Sitecore configuration reduced memory consumption slightly. Having said that, Sitecore still consumes memory as it caches the items being indexed.

The solution was to increase memory capacity on the CM server to accommodate the index rebuild.

Lesson 9: Provision more CPU for your Solr servers if using SolrCloud

We implemented SolrCloud for our customer, where each Solr server holds an exact copy of the out-of-the-box indexes and our custom indexes. Three Solr servers were set up as a minimal quorum so that leader election among the participating nodes (the followers) can occur should one of the servers go down during an unexpected outage or service interruption. This setup is the minimum required for high availability of the data.

While monitoring the Solr servers during the first week after go-live, we saw CPU levels peaking intermittently, both during large traffic spikes and during intervals when large batches of articles were created at certain times of the day. This caused intermittently poor page loading on the home landing page.

We realised that the Solr services on the three servers consumed high CPU while replicating their indexes across the followers; Solr replication is an expensive operation. The resolution was to increase CPU capacity to accommodate the replication workload.

I hope this article helps you achieve your upgrade. Do note that every upgrade has its own requirements and traps; the above is just a guide and what I can share with you so far. If there is anything I got wrong, please feel free to comment and share on my blog. It will be equally exciting to hear what others have learnt in their Sitecore upgrade projects.


Posted in .NET, Sitecore | Tagged | Leave a comment

Upgrading Sitecore 6.6 to Sitecore 8 (Part 1 of 2)

After a few months of being preoccupied with a prior project and a presales assignment, I am finally able to spend some time here sharing my experience as Technical Lead on a Sitecore rebuild and migration project for our client, which ran for six months until early December 2015. Our client is a prestigious news press and digital media company that uses Sitecore as the platform and repository for thousands of news articles and stories. I hope this article will be as enjoyable a read as it was to write and to recap my experience.

The single most important thing to consider when working on a Sitecore upgrade is planning. I cannot emphasise it enough. This means listing every possible scenario in every category and area of the rebuild and migration. The next thing is communication: I was given the privilege of asking my customer honest questions about requirements during the planning phase.

Our customer's primary concern with the system was poor performance. This included how we could better streamline the editors' (the content authors') daily tasks and improve the editor experience and navigation in the back office. For instance, our users did not enjoy clicking through too many levels to get to an article content item. Thousands of articles accumulated over the years further degraded the user experience through slow loading of content items. In addition, uploading media items took a long time to process and complete. More is covered below under Customer Requirements.

To take things further, I was engaged to deliver a presentation to the customer's Head of Technology on why they needed to upgrade to Sitecore 8. This meant putting their top-priority concerns into a deck and proposing a scope of delivery items addressable with Sitecore 8's features. I hope the material below can help Sitecore developers and consultants involved in either presales or actual delivery of a Sitecore upgrade.

Customer Requirements

Every customer has unique requirements and circumstances, and it is extremely important to understand how these impact them in the short to long term. After attending several preliminary face-to-face meetings with my customer, I set out to understand more of their concerns and pain points with the existing system. Below are the key pain points:

  1. Poor CM performance, with content authors experiencing "kill page" prompts when navigating multiple levels down to reach their stories.
  2. Poor publishing performance, which often led to the publishing activity being queued. This is a news press company that publishes hundreds of story items every 10 to 15 minutes. This contributed to issue 3.
  3. Slow upload of media images to Sitecore, resulting in failed uploads and further exacerbating CM performance.
  4. High data volumes, with thousands of articles and millions of media items stored over the past 10 to 12 years. This also contributed to issue 3.
  5. Poor media, story and article taxonomy, frustrating content authors who had to click through many levels to reach their desired content or media item.
  6. Inconsistency between the indexes on their Content Delivery servers in the production environment. The customer was using Lucene to index large volumes of content across the Content Delivery servers, which resulted in inconsistent news and article information on the front-end site between servers.
  7. Inability to implement content personalisation, due to a lack of foundational setup in the current HTML structure and its data sources.

The Upgrade Questions

The next question to ask ourselves is: why do you need to upgrade?

Following on from the issues highlighted above, I put together a bullet list of the areas I considered important to highlight to the customer in pushing for an upgrade.

  1. Does a minor/major upgrade address their performance issues immediately and in the long term?
  2. Do we need changes to the infrastructure?
  3. Are there limitations in the customer's current environment (on-site network limitations, physical hardware, etc.)? If so, how can these affect the platform?
  4. Consider Sitecore's product lifecycle and how long mainstream support will be available for the customer.
  5. Where possible, provide them with Sitecore's compatibility matrix covering the offerings of Sitecore 8 versus Sitecore 7.2, so that the customer can understand what they gain from Sitecore 8 and map those features to the issues highlighted in their requirements.
  6. What are the customer's data and security policies around cloud hosting options?

In terms of the Digital Marketing System, my customer had not enabled this feature because of the performance problems already experienced with the CM. Nevertheless, they were keen to realise their digital transformation road map in a second phase and to treat the first phase as an opportunity to rebuild and prepare the platform for digital readiness. Hence it was safe to speak at a high level about Sitecore 8.0's new Experience Database (xDB) and its offerings. This was a golden point in persuading our customer to adopt the upgrade and to use the first phase as a baseline phase, ensuring the critical performance-related requirements were met.

Planning points

Next, I gathered a must-have list for any developer prior to planning a Sitecore upgrade with their customer. The questions below are must-ask questions from a CMS and DMS point of view.

Firstly, the CMS:

  1. How much data is stored in the CMS, such as content items and media definition items?
  2. Are there any customised items such as the following:
    • Custom events, modules, pipelines, processors and commands.
    • Custom agent tasks or schedulers.
    • Custom workflow definitions, items or commands.
  3. What are the various events, pipelines, processors and commands in use?
  4. Are there any Sitecore-owned modules in use? If so, first check that each is compatible with Sitecore 8.
  5. Are there any shared-source modules from the Sitecore Marketplace? If so, first check that each is compatible with Sitecore 8.
  6. Custom configuration files found in App_Config/Include.
  7. Are there any third-party integrations or touch points?
  8. Are there any Sitecore Support DLLs issued by Sitecore or obtained from their Knowledge Base site? If so, first check that each is compatible with Sitecore 8.


Next, the DMS:

  1. Has the customer adopted DMS on their existing platform? If yes, how do we plan to migrate their analytics data from Sitecore 6.6 to 8? Note that from Sitecore 7.5 onwards, Sitecore introduced the Experience Database (xDB) to host visitor data, tracking, profiles, personalisation, campaigns, etc. in MongoDB, a NoSQL document database that supports schemaless data.
  2. Are the components of the current Sitecore system configured to support Page Editor mode? Note: the Page Editor experience is a prerequisite for digital marketers to configure content personalisation at a later date.
  3. Regardless of whether the customer has adopted Sitecore's engagement platform (in my customer's case, DMS), you must spell out the option of hosting the analytics infrastructure on-premise or in the cloud. As mentioned, some companies have strict data security policies around cloud hosting and may prefer to host the xDB infrastructure on-premise, whereas others prefer to shorten the MongoDB learning curve and delegate the actual work (processing and aggregating analytics data, auto-scaling, etc.) to the cloud xDB. List the options for the customer. It is important to pose these options upfront so that they can make a considered decision and manage any risks.

The next thing you need to know is which version of Sitecore 8 suits your project. Many people are quick to assume that the latest version of Sitecore 8 is the safest choice. Thanks to advice from a good solution architect friend at Sitecore, I learnt that this is not always the case. Always check the feature release version of the product; not every new feature release is stable yet. Service packs are definitely recommended, as Sitecore tends to use them to fix major performance issues and security vulnerabilities. I prefer staying one version below the latest update, and if there is a service pack available from Sitecore 8's release downloads, I use the service pack rather than the feature release.

In the next part of this series, I aim to demonstrate some of our approaches to addressing these performance pain points, described in summary form and accompanied by a technical architecture diagram for ease of understanding.

To refer to key learning points of the upgrade, you can read my experience article here.

Please stay tuned.

Posted in Sitecore | Tagged | Leave a comment

Sitecore Azure 7.2 – My First Sitecore Azure Evaluation Story

Recently, I had the chance to run a pilot project for a customer to demonstrate the capabilities of automating their Sitecore deployment with Sitecore Azure 7.2. Sitecore Azure is a module developed to ease deployment of your Sitecore solution to Azure Cloud Services, typically known as Platform-as-a-Service (PaaS). This article attempts to show you the basics of configuring your setup to support the Sitecore Azure module for easy, automated deployment of your Sitecore solution to the cloud. I will demonstrate this with a typical deployment scenario: a staging delivery farm, and later how it can be promoted to a production delivery farm.

With some valuable experience learnt along the way, I hope this article benefits other readers so that they do not face the same hard and bumpy challenges. If you have an opinion to share, you are most welcome to comment on this article.

One note: circumstances will vary with your situation and requirements. It is recommended to verify which Sitecore Azure module version is compatible with and suitable for your requirements; refer to the official Sitecore Azure compatibility table. Once you have decided on the correct module version, install the Sitecore Azure module onto the CM using the UpdateInstallationWizard page.

I will not explain the detailed installation steps of the module, as this information is readily available in the SDN documentation, Getting Started with Sitecore Azure.

Installing the Management Certificate

Communication between Sitecore Azure and Azure requires a secure channel. You will first need to set up a management certificate so that Sitecore Azure is authorised to send requests to Azure via the Azure REST Management API.

I proceeded with option A: downloading the .publishsettings file from my current Azure subscription and then uploading this management certificate (which holds both the private and public certificate for Azure) via the Sitecore Azure prompts. Note that you will be prompted to upload a management certificate the first time after you have uploaded your Sitecore environment file. This environment file is unique to your Sitecore solution and is usually obtained from Sitecore within one or two days of a request being made.


Once this is done, Sitecore Azure will be able to submit API service request calls to Azure to manage your Sitecore deployments.

One note: Sitecore has addressed two bugs around uploading the management certificate. It is important to apply the KB fixes below.

  1. Error installing a management certificate for Sitecore Azure without the appropriate application pool permissions.
  2. Error installing a management certificate in Sitecore Azure when using the 2.0 format of the publish settings file.

As an optional step, you can also manually configure SSL in Sitecore Azure using the relevant KB article.

Creating a Staging Delivery Farm

I then created a delivery farm to host my CD servers in PaaS in the staging slot. Notice that the environment type is set to 'Evaluation [Staging]'. CD servers in PaaS are typically called 'Web Roles'. In my case, I created five delivery farms (three production and two staging).

Sitecore Azure provides an integrated user interface to visualise the various deployment locations across the globe. I recommend choosing a data centre closest to your area to minimise cost.


The deployment topology used in my case is the "Delivery" topology, meaning that only my CD servers are hosted in PaaS while my CM remains on-premise. See the figure above, with only delivery farms listed.

When you first add a delivery farm in a given data centre region, Sitecore Azure brings up the New Deployment dialog. Choose the number of instances and VM size appropriate for your environment. I chose the default settings, since this is a staging environment used only for evaluation purposes. By default, Sitecore Azure sets the number of instances to 2 and the VM size to Small.

For production purposes, Microsoft recommends a minimum of 2 instances to satisfy the SLA of 99.95 percent availability.

After you add a delivery farm, Sitecore Azure creates the delivery farm item in the Sitecore content tree. For example, after I add DeliveryFarm05, Sitecore creates DeliveryFarm05 as a farm item with its necessary child items at /sitecore/system/Modules/Azure/[Environment Name]/[Region Name].

Do not click Start Deployment yet. Click More Options, which brings you to the staging deployment content item that was created. In this example, it navigates to /sitecore/system/Modules/Azure/SomeProject-Evaluation/Southeast Asia/Delivery05/Role01/Staging. You will see an example like the figure below.

By default, the Delivery Farms will be set to share the same Production and Staging databases. Leave this as-is.

The database references are set to point to existing databases. This is my preference: I do not want Sitecore Azure to create the Sitecore databases for me, as this consumes extra deployment time. Instead, I configure it to point to the existing Sitecore databases (core, master, web, analytics and wffm).

Ensure that the Connection Strings Patch field is also set with the same database connection strings as specified in your database set (set01). See the figure below, and refer to the relevant KB article for more information.


The next step is to ensure that your service configuration and service definition fields are configured correctly, particularly StorageName, Microsoft.WindowsAzure.Plugins.Caching.ConfigStoreConnectionString and Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString. Ensure SqlServerName is left blank. These settings cause the Sitecore logs to be created as tables in Azure upon deployment of the delivery farm. For more information on how Sitecore Azure creates IIS logs, WADLogs, etc., refer to the relevant KB article.
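
For illustration, the caching and diagnostics settings in the generated service configuration take roughly the shape below. The storage account name and key are placeholders, not real values.

```xml
<!-- ServiceConfiguration.cscfg fragment (illustrative placeholder values) -->
<ConfigurationSettings>
  <Setting name="Microsoft.WindowsAzure.Plugins.Caching.ConfigStoreConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=PLACEHOLDER" />
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=PLACEHOLDER" />
</ConfigurationSettings>
```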

Once the above is complete, you are ready to click Start Deployment. Sitecore Azure will create a brand-new staging slot for DeliveryFarm05 if one has not been created yet. While deploying, Sitecore Azure performs the following in order:

1. Resolves databases.

2. Builds the package.

3. Applies the service configuration changes.

4. Executes the package.

5. Creates the package.

6. Uploads the package to Azure.

7. Deploys the package to Azure.

8. Runs any startup tasks specified in the service definition file.

9. Finally, runs the deployment and starts up the Web Roles.

The deployment time will vary depending on factors such as the size of the solution deployed and network bandwidth.

The Sitecore Azure dialog displays status messages indicating the progress of the deployment, all the way up to 100.00% when all Web Roles have finished starting up.

Once deployment is done, you will see the green globe of Delivery05 in the Sitecore Azure dialog, as below. This means the delivery farm has started successfully and is running.

Turning to your Azure Management Portal, where Sitecore Azure is linked to your account subscription via the management certificate trust, you will be able to view the staging deployment slot. For example, the Delivery05 slot is created as below.


Microsoft Azure permits up to 100 farms per region. In real life, you will not deploy anywhere near that many into a production setting, so please be prudent about this choice. Note that the more farms you have, the higher the cost Microsoft bills you.

Promoting a Staging Delivery Farm to a Production Delivery Farm

Once staging is up and running and the evaluation is concluded, you can easily promote the staging delivery farm to a production delivery farm. Sitecore Azure offers the Swap command to swap a staging and production deployment by exchanging the virtual IP addresses of the two deployments. This is known as a VIP swap. A VIP swap incurs no downtime and is extremely fast. When a VIP swap is performed, the production deployment takes over the previous VIP of staging; the staging deployment slot becomes empty and the production slot is replaced by the staging deployment's content. This can be seen in the Azure Management Portal once the VIP swap is done.

To perform the VIP swap in Sitecore Azure, click the Swap command entry. Note: this is essentially the same Swap command available in the Azure Management Portal; Sitecore Azure provides a convenient shortcut for Sitecore admins to perform this action from the dialog popup.


Once done, you will see that the production deployment slot is now filled with what was previously in staging. The swap typically takes only a short moment to complete, making promotion much faster and easier. When the swap completes, you have successfully promoted your site to production!

Typically the below will be the status flow:

2/1/2015, 4:27:11 PM Deployment was swapped to opposite slot
2/1/2015, 4:27:03 PM …..SaCd05Role01PSc412Staging [S] Sitecore.Azure.Pipelines.SwapAzureDeployment.SwapAzureDeployment [done]. Progress: 100.00%
2/1/2015, 4:26:45 PM …..SaCd05Role01SSc412Staging [S] Sitecore.Azure.Pipelines.SwapAzureDeployment.SwapAzureDeployment [start]

See figure below.

Production Deployment Slot:

Staging Deployment Slot:


I hope this helps those who are encountering Sitecore Azure for the first time. Feel free to drop me questions or share your opinions.

Overall, I found the experience with Sitecore Azure very seamless; it automated a lot of the tedious deployment work required for a Sitecore solution, be it small or large scale. It introduces a very streamlined approach to deploying code files and configuration to the Azure cloud. Although there are other ways of deploying a Sitecore solution to Azure PaaS manually, Sitecore Azure provides a nice visual of the status of your deployments in each delivery farm you deploy to.

That said, there are further areas to investigate, such as startup-task capabilities: automating third-party antivirus installation, instrumenting application performance monitoring on the web roles, and so on. It remains to be seen what useful managed services Microsoft can provide around these areas to recommend to our customers as Azure Cloud Services evolve.

Posted in .NET, Sitecore | Tagged | 2 Comments

Sitecore CT3 ClayTablet – Part 1

Recently, I had the opportunity to install, configure and integrate the Sitecore Clay Tablet Translation Connector (also known as CT3) to implement a multilingual solution for one of our Sitecore CMS client projects. In this article I am going to refer to it as CT3 rather than its long name. I realised that there aren't many blogs out there providing the basic installation steps to get the CT3 connector going, so I decided to put together a short step-by-step guide to the pre-installation requirements. But before I dive in, a quick nutshell of the CT3 connector: it is the connector that Clay Tablet developed to connect the Sitecore CMS to the Clay Tablet platform, which receives and routes content from the CMS to translation providers and back. We can think of the connector as the bridge sitting between the Sitecore CMS and Clay Tablet.

Firstly, you will need the list of things below.

1. Obtain the CT3 translation account license keys from Clay Tablet. There is usually one license key per translation provider. The license key comes as an .xml file, such as source.xml. A target.xml file is also supplied by Clay Tablet to indicate the Clay Tablet platform as the destination where content will be sent. You need not modify either source.xml or target.xml.

Note: You cannot duplicate the license keys across two or more different environments. Doing so will cause unexpected behaviour in the CT3 connector, resulting in lost translated content from the translation providers, orphaned projects, etc.

2. Obtain the CT3 installer package from Clay Tablet. This is a Sitecore installation package (a .zip file) that you will install via the Installation Wizard. Don't do this step yet, as we have not reached the installation stage.

3. Obtain the CT3 database zip file from Clay Tablet, which includes “CT3Translation.mdf” and “CT3Translation_log.mdf”. You will need to attach the CT3 database on the same database server where your Sitecore (web, master and core) databases are located. Make sure the CT3 database is online.

4. Configure the CT3 Database connection string:

You need to add one more connection string, named "CT3Translation" (do not use any other name), for the CT3 database. The "User ID", "Password" and "Data Source" values are usually the same as those used in the other connection strings, and "Database=CT3Translation" refers to the CT3 database you attached in step 3.
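For reference, the entry in ConnectionStrings.config might look like the following minimal sketch; the server name and credentials are placeholders for your own environment:

```xml
<add name="CT3Translation"
     connectionString="user id=YourSqlUser;password=YourSqlPassword;Data Source=YourSqlServer;Database=CT3Translation" />
```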

5. Prepare the required folders and set up permissions. Create a folder called "CT3" (match this casing exactly), then create two subfolders called "Accounts" and "data" inside it. Grant the Windows account and IIS identity read permissions on "CT3/Accounts", and full permissions on "CT3/data". Place the two account license keys from Clay Tablet into "CT3/Accounts".

Note: The CT3 folder must be placed under the Sitecore Data folder (not the Sitecore website root folder). You can check your web.config file and search for this line:

<sc.variable name="dataFolder" value="/App_Data" />
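The folder layout from step 5 can be sketched with the commands below, run from the Sitecore Data folder. The icacls lines are Windows-only and the app-pool name "SitecoreAppPool" is a placeholder for your actual IIS identity:

```shell
# Create the CT3 folder structure under the Sitecore Data folder
mkdir -p CT3/Accounts CT3/data

# Windows-only: grant read on Accounts and full control on data to the IIS identity
# (replace "SitecoreAppPool" with your own app-pool or service account):
# icacls CT3\Accounts /grant "IIS AppPool\SitecoreAppPool":(OI)(CI)R
# icacls CT3\data /grant "IIS AppPool\SitecoreAppPool":(OI)(CI)F
```

The two license key files (source.xml and target.xml) then go into CT3/Accounts.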

6. Install the package using the "Installation Wizard" under "Development Tools" on the Sitecore Desktop. You will need Administrator privileges to perform the installation. Wait for the installation to complete. I recommend using Internet Explorer for package installations. According to the Sitecore Clay Tablet installation document, the package installs one important file called CT3_LanguagesCodes.txt. If you do not see this file (which I have experienced before), do not be alarmed; you can simply open up the installation package and copy out that .txt file.

For info, the CT3_LanguagesCodes.txt file lists all the region language codes that are valid for Sitecore Clay Tablet. You may use it as a reference in the configuration step.

7. Once installed, you will see a new "CT3Translation" tab in the ribbon of the Sitecore Content Editor.


But hold your horses; we are not done yet, with about 20 percent more to go. We still need to perform a few parameter configurations.

In the upcoming part 2 of this article, I will talk about parameter configurations and CT3 workflows.

Posted in .NET, Sitecore

Setting up Jenkins as a CI Server for .NET 3.5/4.0/4.5 projects using BitBucket

Recently, I joined a team at Tribal DDB Melbourne, where I convinced my team of developers to adopt an approach to streamline deployments of the various projects for our clients. I set out to look at Jenkins (previously named Hudson), an open-source CI product that originated at Sun Microsystems, to implement a continuous-build approach for the developer teams. After a few days of googling articles and blogs, I decided to come up with my own simple set of steps to get one up and running, along with the gotchas and traps. Considering that not many blogs have documented the gotchas of building projects targeting the .NET Framework 4.5, I have decided to include them here.

Step 1:
Download the latest version of Jenkins. Run the installer, which steps you through the wizard automatically and installs the Windows service under the Local System account. Ensure that the service is running: browse to Services, look up Jenkins, and check that its status says 'Started'.

Step 2:
Restart your machine to ensure that the installation completes fully.

Step 3:
Ensure that the Jenkins page loads. Fire up a browser and enter http://[your-machine-name]:8080 in the URL bar. You should see the landing page of your Jenkins home page.

Step 4:
We need to install a few tools on our CI server to support Git, since my team already uses BitBucket as the source-control repository for its projects. This requires a few plugins, such as the Git Client and Git Server plugins, the Jenkins Git plugin and the Jenkins MSBuild plugin, all available on the Jenkins > Manage Plugins page. The MSBuild plugin allows you to build .NET projects by specifying the location of MSBuild.exe on the Jenkins configuration page, whilst the Git plugins allow communication between Jenkins and the BitBucket server. Next, install the latest versions of Git Bash and Git Extensions onto your server. After all this, you will have a 'Git Bash' icon on your screen.

Step 5:
Installing the plugins in step 4 is not enough to set up communication between BitBucket and Jenkins. We also need to set up authentication so that Jenkins is allowed to run Git commands against the BitBucket server. Because Jenkins has no means of prompting a user to enter a password each time it intends to clone or pull from BitBucket, we need to set up SSH keys on our CI server and in BitBucket. Read up on SSH and on setting up SSH for Git for more info. In a nutshell, you will set up a private key on your CI server and a public key on your BitBucket account. Note that your BitBucket account will be the admin account for Jenkins, so it is important to remember its credentials.

Step 6:
Add the SSH keys to your CI server. Leave the passphrase blank so that BitBucket can authenticate non-interactively; otherwise you will be left with a hanging "Fetching upstream changes from origin" when attempting to run the build for the project. You can set up authentication later by managing users, which is sufficient for restricting people to particular project builds. This step had me confused about why the build hung as above; leaving the passphrase blank solved the problem, and Jenkins is now able to push and pull from BitBucket.
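As a sketch of the key generation step (the file name "jenkins_ci_key" and the comment are placeholders, not names from the original setup):

```shell
# Generate an RSA key pair with an empty passphrase (-N "") so Jenkins can
# authenticate against BitBucket without an interactive password prompt
ssh-keygen -q -t rsa -b 2048 -N "" -C "jenkins-ci" -f ./jenkins_ci_key

# The public half (jenkins_ci_key.pub) is what you paste into the
# SSH Keys section of your BitBucket account settings
cat ./jenkins_ci_key.pub
```

The private key then goes into the .ssh folder of the account the Jenkins service runs under.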

Step 7:
Create a test project in Jenkins and set up the necessary details and configuration so that it points to your repository correctly. A good test is to check whether Jenkins can clone your project successfully into its workspace. Once this works, you are ready to start setting up the build for your .NET solution or .csproj.

Step 8:
Next, it is time to set up your MSBuild scripts. I found that a good place to start is an article by Mustafa; I never knew how easy it was to get started with MSBuild. Mustafa also explains well how to set up a proper Jenkins job, along with triggering builds from a BitBucket push, etc.
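As a minimal sketch of what a Jenkins MSBuild step can invoke (the file name build.proj and the solution name are placeholders for your own project):

```xml
<!-- build.proj : minimal MSBuild script for a Jenkins MSBuild build step -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Build">
  <Target Name="Build">
    <!-- Builds the solution in Release configuration -->
    <MSBuild Projects="MySolution.sln" Properties="Configuration=Release" />
  </Target>
</Project>
```

In the Jenkins job configuration, point the MSBuild build step at this file (or directly at the .sln) and select the MSBuild.exe version that matches your target framework.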

Some of the recommended plugins:

Artifact Deployer Plugin
This plugin allows you to choose which directories you would like to deploy to the target destination (usually where your websites are located, such as Inetpub).

If you have anything else to share, feel free to drop me a comment and let me know how you go with setting up CI with BitBucket.

Posted in .NET

Melbourne Summer Cycle 2011

A calm, cooling and sunny day. A perfect start to the summer / autumn adventure of cycling. My mate, Sam and I took off on our bikes early for the Melbourne Summer Cycle 2011, a charity ride event organised in conjunction with raising funds and support for people living with multiple sclerosis.

This is usually the first ride event of the year, which Sam and I treat as a short warm-up to the major events. The ride goes for just 40 km: short and sweet for a charity ride. Still, we feel it is a good tour as we begin training hard for the upcoming cycling events this year.

The weather was just perfect and we made an early start at about 7.30 am, though it took us a while to ride out from the heart of the city to Port Melbourne.

More pictures to arrive as I am still digging in my collection. The one thing I love about riding is the expression of freedom, determination and appreciation for what we have in life. No matter where you ride in Melbourne, there’s always mother nature’s beauty to appreciate.

Posted in .NET

Posting a selected value from a dropdownlist with ASP.NET MVC

Recently I came across a quirk when posting the selected value of a drop-down list element in my HTML markup to an action method in my controller.

I set up my action method to receive a class object called TrainingPackageDetailsModel. This action method is invoked upon an HTTP POST.

Using a strongly typed view-model approach to populate my view template, I set up a SelectList from an IEnumerable<string> populated with a list (an array of strings) of training package type names. For simplicity's sake, the SelectList was initialized with {"TrainingPackageType1", "TrainingPackageType2", "TrainingPackageType3"}. I then set my SelectList to populate a property in TrainingPackageDetailsModel called SelectedTrainingPackageType, of type string. These values are set at SelectList construction (instantiation) time.

By doing this, I was expecting ASP.NET MVC to be clever enough to give each SelectList element matching text and value: "TrainingPackageType1" would have the value "TrainingPackageType1", "TrainingPackageType2" the value "TrainingPackageType2", and so on.

Compiling and running this code in Firefox 3.5 and calling TryUpdateModel on the TrainingPackageDetailsModel after the HTTP POST occurred, my SelectedTrainingPackageType property was successfully populated with the value I selected in the drop-down list. Keep in mind, I am using IEnumerable<string> to initialize the text of my drop-down list in my markup.

As I was developing a website which needed to be compatible with both IE and Firefox, I had to run this same piece of code in IE7. To my amazement, with IE7 TryUpdateModel fails to populate the SelectedTrainingPackageType property with the value I selected in the drop-down list.

So I ventured out a bit to view the page source in both Firefox and IE7, and voila: Firefox populates both the text and the value attributes of the option tags nested inside the rendered select element, with text "TrainingPackageType1" and value "TrainingPackageType1", and likewise for the remaining values.

Rendering the same element in IE7, I noticed that there were NO values at all populated for the value attributes of the option tags nested within the select element. This may be a quirk of how IE7 renders the markup for an Html.DropDownList bound to an IEnumerable<string>.

To get around this issue of posting a selected value to an action method when you are using a strongly typed object, I recommend creating a class that encapsulates two properties, say Text and Value. Then pass a list of these objects to the SelectList and name the value and text properties accordingly at construction time. This is good practice, and one I actually learnt from the NerdDinner tutorial when I first began learning ASP.NET MVC. Changing my code to use a list of classes encapsulating Text and Value properties, as opposed to IEnumerable<string>, eventually solved my aching problem on both IE and Firefox.

Feel free to test this out in other browsers too. I am confident this is a solid approach to the problem across browsers, and it follows the good practices the guys at Microsoft illustrate in the NerdDinner tutorial.

I wasted two hours on this problem.

I will post my code snippets very soon to address this scenario, so that developers can clearly understand the problem at hand.
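In the meantime, here is a minimal sketch of the approach; the class and property names are my own illustration rather than the original project's code, and it assumes the standard System.Web.Mvc SelectList:

```csharp
using System.Collections.Generic;
using System.Web.Mvc;

// Small wrapper class so each rendered <option> gets an explicit value attribute
public class TrainingPackageOption
{
    public string Text { get; set; }
    public string Value { get; set; }
}

public static class TrainingPackageLists
{
    // Build the SelectList from wrapper objects, naming the value and text
    // properties explicitly at construction time
    public static SelectList Build()
    {
        var options = new List<TrainingPackageOption>
        {
            new TrainingPackageOption { Text = "TrainingPackageType1", Value = "TrainingPackageType1" },
            new TrainingPackageOption { Text = "TrainingPackageType2", Value = "TrainingPackageType2" },
            new TrainingPackageOption { Text = "TrainingPackageType3", Value = "TrainingPackageType3" }
        };
        // "Value" and "Text" tell SelectList which properties supply the
        // option's value attribute and display text respectively
        return new SelectList(options, "Value", "Text");
    }
}
```

An Html.DropDownList bound to this SelectList should then emit option tags with explicit value attributes, so the posted value no longer depends on any browser falling back to the option text.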

Posted in .NET