Sitecore 10 Docker installation – Missing database

After using Docker Compose to start Sitecore 10, the traefik and cm containers appeared unhealthy when viewing the containers with the docker ps command.
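If you have a lot of containers running, docker ps also supports a health filter that narrows the output to just the failing ones; a small sketch:

```shell
# List only unhealthy containers, showing name and status.
unhealthy() {
  docker ps --filter "health=unhealthy" --format "{{.Names}}: {{.Status}}"
}
```

Calling unhealthy should then list just the failing containers, such as traefik and cm in this case.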

After some investigation I noticed the following error: it turned out that the Sitecore.Master database was not attached at all, despite all of the other databases being in place.

Exception: System.Data.SqlClient.SqlException
Message: Cannot open database "Sitecore.Master" requested by the login. The login failed.

This is easy to spot if you use the docker logs command.

docker logs --tail 100 {container id}

The mdf and ldf files were in place in the container, but they appeared to be corrupt. This is not necessarily a problem, because the getting-started guide comes with all of the databases, which are held in getting-started/mssql-data.

My initial attempts led me to the following error, which would not allow me to copy files to the container:

Error response from daemon: filesystem operations against a running Hyper-V container are not supported

This was resolved by stopping the container before copying the files and starting it again afterwards:

docker stop {container id}

docker cp C:\sitecore\docker\getting-started\mssql-data\Sitecore.Master_Primary.mdf  sitecore-xp0_mssql_1:/data

docker cp C:\sitecore\docker\getting-started\mssql-data\Sitecore.Master_Primary.ldf  sitecore-xp0_mssql_1:/data

docker start {container id}

Once the files are copied, you can check they are in place by using the following command to open an interactive PowerShell session:

docker exec -it {container id} powershell

Unfortunately, after attempting to attach the database in SQL Server Management Studio, I kept receiving the following error:

Unable to open the physical file "C:\test\Sitecore.Master_Primary.mdf". Operating system error 5: "5(Access is denied.)".
CREATE DATABASE failed. Some file names listed could not be created. Check related errors. (Microsoft SQL Server, Error: 5120)

To resolve this you need to use C:\data, the directory which contains the other databases.
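For reference, the attach can also be scripted from the host rather than clicked through SSMS. This is only a sketch: the sa password is a placeholder and it assumes the sqlcmd tool is available inside the mssql container:

```shell
# Attach Sitecore.Master from the files previously copied into the
# container's C:\data directory (placeholders: container name, sa password).
attach_master() {
  docker exec "$1" sqlcmd -S localhost -U sa -P "$2" -Q \
    "CREATE DATABASE [Sitecore.Master] ON (FILENAME = 'C:\data\Sitecore.Master_Primary.mdf'), (FILENAME = 'C:\data\Sitecore.Master_Primary.ldf') FOR ATTACH"
}
```

For example: attach_master sitecore-xp0_mssql_1 '<your-sa-password>'.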

Sitecore 10 CLI Issues


Whilst attempting to log in to the CLI for the first time, I came across an issue where I received the following message:

Couldn't resolve a root configuration file (sitecore.json) in the current or any parent directory. Looks like the command may have been executed outside a Sitecore project?

To solve this you simply need to initialise the project using the following command. This creates the sitecore.json file in the base directory, which contains the serialisation properties:

sitecore init
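Once initialised, sitecore.json looks something like the sketch below; the exact contents vary by CLI version, and the module glob is an assumption based on a typical project layout:

```json
{
  "$schema": "./.sitecore/schemas/RootConfigurationFile.schema.json",
  "modules": [
    "src/*/*.module.json"
  ],
  "serialization": {
    "defaultMaxRelativeItemPathLength": 100,
    "defaultModuleRelativeSerializationPath": "serialization"
  }
}
```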

Module JSON Format

Error converting value "items" to type 'Sitecore.DevEx.Serialization.Client.Configuration.SerializationModuleConfiguration'. Path '', line 1, position 7.
  in C:\inetpub\wwwroot\\test/test.module.json

Invalid Format

	{
	  "items": {
	    "includes": [
	      {
	        "name": "content",
	        "path": "/sitecore/content/home"
	      }
	    ]
	  }
	}

Correct Format

	{
	  "namespace": "Project.Test",
	  "items": {
	    "includes": [
	      {
	        "name": "content",
	        "path": "/sitecore/content/home"
	      }
	    ]
	  }
	}

Azure CDN from Verizon Performance with Sitecore

Recently I had to investigate a performance problem where images from Sitecore served through an Azure CDN were being delivered very slowly. Very small images of 7kb in size were taking 500ms – 1.5s, and it was causing the website to perform very poorly.

This turned out to be more than just a Sitecore issue, and this blog article discusses some of the problems which you are likely to come across when dealing with a content delivery network.

Hit or Miss

Content delivery networks generally have a concept of hit and miss, whereby a hit delivers the content directly to the user from the edge and a miss goes back to the origin. With an Azure CDN it is possible to identify whether a request was a hit or a miss by looking at the response headers.

The response header x-cache: HIT is added to the response if the CDN has achieved a hit; if it is missing then the request has gone back to the origin, as there is no corresponding MISS identifier header. These are easily visible through the network tools in Chrome.

One thing to note about this header is that it is not part of the HTTP header standards, which is why it has the x- prefix.
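You can also check this from the command line with curl rather than Chrome; a small sketch (the URL is a placeholder for one of your CDN assets):

```shell
# Print the x-cache header for a URL. If the header is absent, the
# request went back to the origin -- there is no explicit MISS value.
check_cdn() {
  curl -sI "$1" | grep -i '^x-cache' || echo "no x-cache header (served from origin)"
}
```

For example: check_cdn https://yourcdn.azureedge.net/-/media/logo.png.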

When having issues with any CDN vendor, one of the best ways to receive support is to provide a HAR file, which contains all of the diagnostic data. This can be generated in Chrome using the Network tab: once the traffic has loaded on the page, right click and select Save all as HAR with content, which will generate a JSON file.

If you want to view this data in a useful format to analyse the problems, there are great online HAR viewer tools available.

PoP Locations

Content delivery networks depend on having point-of-presence (PoP) locations in your near vicinity. They also require a connection with your ISP; if they do not have one, your ISP will use the next PoP location. Each provider varies, so it is good to find out from a prospective vendor which PoP locations they offer.

Query-string Caching

Each unique query string has its own cache key, which is stored within the CDN. This cache key is then replicated through the various geographical locations used by the content delivery network provider. By adding query-string keys you are adding more variation and complexity, which is a very common cause of items responding with a miss. Reducing this complexity will definitely lead to a better hit rate.

Content can obviously change quite frequently on a Sitecore website, and updated images will require purging. One way around this is:

  • Custom Media Provider

Compression Settings

Most IIS-based websites now have dynamic compression enabled, which allows the server to serve files using gzip compression. Unfortunately, even if this is enabled it won't be replicated on the CDN, and it also needs to be configured there. Note that compression will only work for items which are cacheable, so you will notice that uncacheable pages do not show as using gzip compression.

This can be achieved in the CDN settings under Settings > Compression, which gives you the ability to specify the file types you actually want to compress.

Image Uncacheable Viewer

Depending on the provider, you can analyse the status of the CDN cache to a certain degree. Through the Supplemental Management Portal in the Azure Portal you can find out whether images are being cached as expected.

Using an Azure CDN from Verizon account, this can be located under:

Analytics > Edge Performance Analytics > Http Large Object > CacheStatusUncacheable.

This will list all of the items on your website which are not cached. The list itself will generally consist of popular pages and items that may not require caching.

The default setting in Sitecore for caching images is private, but CDNs require it to be public. Public marks the response as cacheable, whereas private means the cache is specific to one user/instance. Using private will cause traffic to route via the CDN and return to the origin.

Cache-Control: private, max-age=604800

By default the MediaResponse.Cacheability setting in Sitecore.config is set to private. This can easily be changed in the Sitecore configuration using a simple patch file:

<?xml version="1.0" encoding="UTF-8"?>
<configuration xmlns:p="http://www.sitecore.net/xmlconfig/" xmlns:role="http://www.sitecore.net/xmlconfig/role/" xmlns:s="http://www.sitecore.net/xmlconfig/set/">
  <sitecore>
    <settings>
      <setting name="MediaResponse.Cacheability" s:value="public" />
    </settings>
  </sitecore>
</configuration>


Sitecore Suggest No suggester named default was configured

Recently I was testing out Solr's suggester functionality through the Sitecore API. There is a slight gap in the documentation, and I kept receiving the error No suggester named default was configured.

<?xml version="1.0" encoding="UTF-8"?>
<response>
   <lst name="responseHeader">
      <int name="status">400</int>
      <int name="QTime">22</int>
   </lst>
   <lst name="error">
      <lst name="metadata">
         <str name="error-class">org.apache.solr.common.SolrException</str>
         <str name="root-error-class">org.apache.solr.common.SolrException</str>
      </lst>
      <str name="msg">No suggester named default was configured</str>
      <int name="code">400</int>
   </lst>
</response>


This error is caused because the request handler does not know which dictionary to use. It can be resolved in two ways:


Option 1: Specify the dictionary in the request handler. Once this change has been made, you must reload the Solr core:

<requestHandler name="/suggest" class="solr.SearchHandler" startup="lazy">
   <lst name="defaults">
      <str name="suggest.dictionary">mySuggester</str>
      <str name="suggest">true</str>
      <str name="suggest.count">10</str>
   </lst>
   <arr name="components">
      <str>suggest</str>
   </arr>
</requestHandler>


Option 2: Add the dictionary into the query parameters:

using (IProviderSearchContext context = index.CreateSearchContext())
{
    SolrSuggestQuery q = searchtext;
    var options = new SuggestHandlerQueryOptions
    {
        Parameters = new SuggestParameters
        {
            Count = 3,
            Build = true,
            Dictionary = "mySuggester"
        }
    };

    var result = context.Suggest(q, options);
    var suggestions = result.Suggestions["mySuggester"].Suggestions.Select(x => x.Term);
}
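With either option in place you can sanity-check the suggester directly against Solr; a sketch of the request URL (host and core name are placeholders for your own):

```shell
# Build the suggest request; mySuggester matches the dictionary configured above.
SOLR_HOST="localhost:8983"            # placeholder
CORE="sitecore_master_index"          # placeholder
URL="http://$SOLR_HOST/solr/$CORE/suggest?suggest=true&suggest.dictionary=mySuggester&suggest.q=sit"
echo "$URL"
```

Requesting this URL with curl should return a suggest block rather than the 400 error.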


Solr Unknown Field error

When attempting to rebuild the sitecore_master_index on a project, I faced an issue regarding an unknown field. This is related to the language support provided through Solr's managed schemas, which was introduced with Sitecore 9. Unfortunately the language support is not extensive, and Populate Solr Managed Schema does not necessarily generate all of the languages that you need.

Job started: Index_Update_IndexName=sitecore_master_index|#Exception: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> SolrNet.Exceptions.SolrConnectionException: <?xml version="1.0" encoding="UTF-8"?>
<lst name="responseHeader"><int name="status">400</int><int name="QTime">7</int></lst><lst name="error"><lst name="metadata"><str name="error-class">org.apache.solr.common.SolrException</str><str name="root-error-class">org.apache.solr.common.SolrException</str></lst><str name="msg">ERROR: [doc=sitecore://master/{f621b08b-ac40-428e-af9a-a6d8be8640b0}?lang=zh-sg&amp;ver=1&amp;ndx=sitecore_master_index] unknown field 'description_t_zh'</str><int name="code">400</int></lst>
---> System.Net.WebException: The remote server returned an error: (400) Bad Request.
at System.Net.HttpWebRequest.GetResponse()
at HttpWebAdapters.Adapters.HttpWebRequestAdapter.GetResponse()
at SolrNet.Impl.SolrConnection.GetResponse(IHttpWebRequest request)

Rather than the old implementation of using schema.xml, Sitecore 9 now uses Solr's managed schema, which means it is more reliant on the Solr RESTful API. This API can be used to make changes, and it is how the Populate Solr Managed Schema tool works. The managed schema is still held in a file under server/solr for each core, but it is recommended that you do not edit it directly.

This can be viewed by selecting a Solr core through Files > Managed Schema


Unfortunately there is not full coverage, and a number of languages are not generated by the Populate Solr Managed Schema tool. In our instance zh-sg (Chinese, Singapore) was not supported, and we needed to solve this by adding a dynamic field:

<dynamicField name="*_t_zh" type="text_general" indexed="true" stored="true" />


This results in an entry like the one above being held in the managed schema.

This can be achieved by selecting the sitecore_master_index core, then using the menu to select Schema. Once loaded, select Add Dynamic Field and fill out the details as follows. Once this has been saved you can attempt to rebuild the index again.

  • name: *_t_zh
  • field type: text_general
  • stored: checked
  • indexed: checked
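The same dynamic field can also be added without the admin UI by calling Solr's Schema API; a sketch where the host and core name are placeholders:

```shell
# POST an add-dynamic-field command to the core's managed schema.
add_zh_field() {
  curl -X POST -H 'Content-Type: application/json' \
    "http://$1/solr/$2/schema" \
    -d '{"add-dynamic-field": {"name": "*_t_zh", "type": "text_general", "indexed": true, "stored": true}}'
}
```

For example: add_zh_field localhost:8983 sitecore_master_index.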


After contacting Sitecore about this issue, they registered it as a known issue and suggested extending the populate processor Sitecore.ContentSearch.SolrProvider.Pipelines.PopulateSolrSchema.PopulateFields in future scenarios to introduce this as part of the managed schema process.



Experience Profile Contacts Error

Sitecore 9 issue faced when loading the Experience Profile screen:

[Sitecore Services]: HTTP GET
URL https://dev.local/sitecore/api/ao/v1/contacts/search?&pageSize=20&pageNumber=1&sort=visitCount desc&match=*&searchfromdatefilter=30%2F12%2F2018&searchtodatefilter=29%2F01%2F2019&searchchannelfilters=null&searchcampaignfilters=null&searchoutcomefilters=null&searchgoalfilters=null&searchprofilefilters=null&searchdevicefilters=null
Exception System.NullReferenceException: Object reference not set to an instance of an object.
at Sitecore.Cintel.Endpoint.Plumbing.NegotiateLanguageFilter.OnActionExecuted(HttpActionExecutedContext actionExecutedContext)
at System.Web.Http.Filters.ActionFilterAttribute.OnActionExecutedAsync(HttpActionExecutedContext actionExecutedContext, CancellationToken cancellationToken)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Web.Http.Filters.ActionFilterAttribute.<CallOnActionExecutedAsync>d__6.MoveNext()
To resolve this, check the following:

  • Ensure the Newtonsoft.Json.dll version is the same as in the vanilla Sitecore package
  • Ensure the Solr connection strings are formed correctly. For example, they should not have hashes in them:


Correct:

<add name="solrCore" connectionString="https://localsolr:8984/solr/xp_xdb" />


Incorrect:

<add name="solrCore" connectionString="https://localsolr:8984/solr/#/xp_xdb" />
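A quick way to catch the bad form across your config is to grep the connection strings for a hash fragment; a small sketch (the config path passed in is up to you):

```shell
# Any match points at the Solr admin UI fragment (/#/) rather than a core.
find_bad_solr_conn() {
  grep -n 'solr/#' "$1" || echo "no hash fragments found"
}
```

For example: find_bad_solr_conn App_Config\ConnectionStrings.config.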


Sitecore Experience Accelerator (SXA) Setup and basic guide

Detailed in this blog is a basic guide on setting up SXA, creating a basic site and adding a component.


Initial support for SXA started with Sitecore 8.1. At the time of writing it was not evident from the downloads page which SXA version was for which version of Sitecore; it only becomes apparent when installing the package. This information is in the installation guide, though.


When installing the module, please make sure that you have the relevant version of Sitecore PowerShell Extensions (SPE) installed for your version of SXA.



SXA uses the concept of a Tenant to define site groups. Under this you can create a number of sites which can represent brands etc. The Tenant template is created on the Content node.

As the module is Helix based you can then choose your Feature set.


New Site

As mentioned previously, you can create sites for different brands or organisations. To create a new site, right click on the tenant created previously and select the Site template.


A new window will pop up and give you a number of options:

  • Features
  • Themes
  • Grid – Bootstrap, Foundation etc

Complete the name and press OK to proceed.


SXA Site Manager

Once the site has been created it should load automatically, but you can also access it through the menu. Click on the site and use the option at the top right to specify the site properties.

Complete the “Target Hostname” field, then create an entry for this hostname in the hosts file and in the IIS bindings for the site.





Add Component

As mentioned previously, the SXA module is based on Helix principles. This will become more evident once you start adding components.

To add a component, select Experience Editor (Publish tab) on the Home page to load the accelerator editor. Basic placeholders are already in place, and the components list is visible on the right-hand side.

Add a component as standard and you will be presented with the renderings window. In this instance we will add a Carousel.


Create a new folder by clicking the Create button next to Carousels (Current Site).



Select Carousel


Select the Carousel itself to add it to the page.

You can now edit the component as you would any standard component developed through Sitecore.


To modify the core features of the Carousel, edit the parameter template properties set through Edit Component Properties.



To review the content added, you can view it in the Helix-based structure in the content tree.







Dynamics CRM to Cosmos DB

Recently I was looking at storing data held in Dynamics CRM in a storage environment which has better performance and doesn't require manipulating the Dynamics CRM Web API. There are a number of ways to achieve this, but it can also be painful. Microsoft Azure provides a number of tools, some of which are listed below:

  • Event Hubs plus Stream Analytics
  • Azure Data Factory
  • Azure Logic Apps / Microsoft Flow

Azure Event Hubs

Initially I looked at integrating this process through the combination of Azure Event Hubs and Stream Analytics, which acts as the link between the data feed itself and the output. It can take data from an Event Hub and feed it to a number of destinations such as Cosmos DB, Power BI etc.

The connection can be set up through the Plugin Registration Tool in Dynamics 365, where you can add a number of endpoints such as Event Hubs, Service Bus etc. Once applied against a primary entity, you can test the connection by changing items. One thing to note when testing is that there is a checkbox located at the bottom left corner of the Register New Step screen, “Delete Async Operation if Status Code = Successful”. If the operation is successful then it will not be logged, so it can be worth un-checking this for initial testing purposes.

Pulling the data through the Event Hub and manipulating it through Stream Analytics seemed like a good idea initially, but what is fed through is hundreds of lines of key/value pairs which you then have to manipulate into a format acceptable to Cosmos DB:

[
  { "key": "modifiedonbehalfby", "value": null },
  { "key": "name", "value": "Test 1234" },
  { "key": "modifiedon", "value": "\/Date(1528470685000)\/" }
]


Azure Data Factory

At the time of writing this article, the V2 preview had just come out, which made the process a lot more difficult. Unfortunately I had a number of issues with it, and it proved pretty troublesome.


Azure Logic Apps / Microsoft Flow

Azure Logic Apps turned out to be a fantastic way of moving data from Microsoft Dynamics into Cosmos DB. The connectors available use the Web API to check whether records have been updated for an Organisation, Contact etc., manipulate the data and pass it through to Cosmos DB.

This can be achieved using the following steps

  1. Dynamics CRM Connector – When a record is created or updated
  2. Compose Action (Manipulate the data into the JSON format required for Cosmos DB)
  3. Cosmos DB Connector – Create or update a document


Compose action:

Ensure that id is one of the fields when using Compose, or you will receive errors like those below:

{
  "code": "BadRequest",
  "message": "Message: {\"Errors\":[\"One of the specified inputs is invalid\"]}\r\nActivityId: 100c177f-572b-4ef6-ae52-e800179dba95, Request URI: /apps/1763df43-522c-4d5c-a297-e65d4be9a9f3/services/e78ab862-8664-4155-b331-492a5a16c3d1/partitions/e8bc4a0d-0ad0-45b2-8086-8c0a7e8d3c54/replicas/131722482047483029p, RequestStats: , SDK: Microsoft.Azure.Documents.Common/"
}

{
  "code": "BadRequest",
  "message": "Partition key provided either doesn't correspond to definition in the collection or doesn't match partition key field values specified in the document.\r\nActivityId: ea348b1c-6ff0-403d-90ec-824d91fdfb6f, Microsoft.Azure.Documents.Common/"
}
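A minimal Compose body that avoids both errors might look like the sketch below. The entity fields are hypothetical; the key point is that the output document includes an id (and the partition key field, if your collection defines one):

```json
{
  "id": "@{triggerBody()?['accountid']}",
  "name": "@{triggerBody()?['name']}",
  "modifiedon": "@{triggerBody()?['modifiedon']}"
}
```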

One thing to note, though, is the cost; it is worth looking at Flow, which is generally aimed more at business users but may prove more cost-effective.


Sitecore 9 XConnect List Manager Import Error

Today I attempted to import a list into List Manager using Sitecore 9 Update 1. The import went fine, but none of the contacts would show up.

In the logs I received the following errors, which didn't appear to provide much information:

INFO  [ListManagement]: Starting contacts importing
ERROR Request of 'importing' is failed. Retry will not be performed because the issue is probably caused not during xDB operation... (Message: Object reference not set to an instance of an object.)
Exception: System.NullReferenceException
Message: Object reference not set to an instance of an object.
Source: Sitecore.ListManagement.XConnect
at Sitecore.ListManagement.XConnect.Import.XConnectContactImporter.<>c__DisplayClass3_0.b__1(IXdbContext client)
at Sitecore.ListManagement.XConnect.XdbRequestPerformer.RequestWithRetry(Action`1 action, String actionMessage)

11824 17:19:19 ERROR [ListManagement]: Failed to finish importing. The error has been occurred: Object reference not set to an instance of an object.
Exception: System.NullReferenceException
Message: Object reference not set to an instance of an object.
Source: Sitecore.ListManagement.XConnect
at Sitecore.ListManagement.XConnect.Import.XConnectContactImporter.<>c__DisplayClass3_0.b__1(IXdbContext client)
at Sitecore.ListManagement.XConnect.XdbRequestPerformer.RequestWithRetry(Action`1 action, String actionMessage)

This error didn't seem very conclusive, so I attempted to see whether xConnect was working correctly.

To check whether xConnect was working correctly, I exposed the OData feed by opening xconnectsite/App_Config/AppSettings.config and commenting out this line:

<add key="validateCertificateThumbprint" value="1F24CBC38F9239DEDCF166A82E1C0B2A47D12844" />

I then went to look at the exposed contacts via https://sc91.xconnect.local/odata/Contacts and came across the following error:

"Message":"Operation failed: Store Error: Login failed for user 'collectionuser'.. The error occurred while attempting to perform the underlying storage operation during 'Microsoft.Azure.SqlDatabase.ElasticScale.ShardManagement.StoreException: Error occurred while performing store operation. See the inner SqlException for details. ---> System.Data.SqlClient.SqlException: Login failed for user 'collectionuser'.\r\n at 

Resolving this login issue for the 'collectionuser' account then allowed me to import the contacts.