Dynamics CRM to Cosmos DB

Recently I was looking at storing data held in Dynamics CRM in a storage environment that has better performance and doesn’t require manipulating the Dynamics CRM Web API directly. There are a number of ways to achieve this, but some of them can be painful. Microsoft Azure provides a number of tools, some of which are listed below:

  • Event Hubs plus Stream Analytics
  • Azure Data Factory
  • Azure Logic Apps / Microsoft Flow

Azure Event Hubs

Initially I looked at building this process with a combination of Azure Event Hubs and Stream Analytics, which acts as the link between the data feed and its destination. Stream Analytics can take data from an Event Hub and feed it to a number of outputs such as Cosmos DB, Power BI etc.

The connection can be set up through the Plugin Registration Tool for Dynamics 365, where you can register service endpoints such as Event Hubs, Service Bus etc. Once a step is applied against a Primary Entity you can test the connection by changing records. One thing to note when testing is the checkbox located at the bottom left corner of the Register New Step screen labelled “Delete Async Operation if Status Code = Successful”. If the operation is successful it will not be logged, so it can be worth un-checking this for initial testing purposes.

Pulling the data through the Event Hub and manipulating it in Stream Analytics seemed like a good idea initially, but what is fed through is hundreds of lines of key/value pairs which you then have to transform into a format acceptable to Cosmos DB, for example:

[
  { "key": "modifiedonbehalfby", "value": null },
  { "key": "name", "value": "Test 1234" },
  { "key": "modifiedon", "value": "\/Date(1528470685000)\/" }
]
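The shape of the required transformation is easier to see in isolation. The sketch below (plain Python, with the payload shape assumed from the fragment above) flattens the key/value pairs into a single flat document and converts the legacy `\/Date(ms)\/` format into ISO 8601:

```python
import json
import re
from datetime import datetime, timezone

def flatten_attributes(attributes):
    """Flatten a list of {"key": ..., "value": ...} pairs (as emitted by
    the Dynamics execution context) into one flat document for Cosmos DB."""
    doc = {}
    for pair in attributes:
        value = pair["value"]
        # Convert the legacy JSON date format "/Date(1528470685000)/"
        # (milliseconds since the Unix epoch) into an ISO 8601 string.
        if isinstance(value, str):
            match = re.fullmatch(r"/Date\((\d+)\)/", value)
            if match:
                value = datetime.fromtimestamp(
                    int(match.group(1)) / 1000, tz=timezone.utc
                ).isoformat()
        doc[pair["key"]] = value
    return doc

attributes = [
    {"key": "modifiedonbehalfby", "value": None},
    {"key": "name", "value": "Test 1234"},
    {"key": "modifiedon", "value": "/Date(1528470685000)/"},
]
print(json.dumps(flatten_attributes(attributes), indent=2))
```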




Azure Data Factory

At the time of writing, V2 of Azure Data Factory had just come out in preview, which made the process a lot more difficult. Unfortunately I ran into a number of issues with it and it proved pretty troublesome.


Azure Logic Apps / Microsoft Flow

Azure Logic Apps turned out to be a fantastic way of moving data from Microsoft Dynamics into Cosmos DB. The connectors available use the Web API to check whether records have been created or updated for an Organisation, Contact etc., manipulate the data and pass it through to Cosmos DB.

This can be achieved using the following steps:

  1. Dynamics CRM Connector – When a record is created or updated
  2. Compose Action (Manipulate the data into the JSON format required for Cosmos DB)
  3. Cosmos DB Connector – Create or update a document
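Outside of Logic Apps, the same three-step flow can be sketched in plain Python. The field names and trigger payload below are illustrative, and a dict keyed on `id` stands in for the Cosmos DB container to show the create-or-update semantics:

```python
def compose(record):
    """Step 2: shape the CRM record into the document Cosmos DB expects.
    Cosmos DB requires an 'id' property, so the entity's primary key
    (e.g. accountid for Account) is mapped onto it."""
    return {
        "id": record["accountid"],
        "name": record["name"],
        "modifiedon": record["modifiedon"],
    }

def upsert(container, doc):
    """Step 3: create the document if its id is new, otherwise replace it."""
    container[doc["id"]] = doc

# Step 1 stands in for the "When a record is created or updated" trigger.
container = {}
record = {"accountid": "account-guid", "name": "Test 1234",
          "modifiedon": "2018-06-08T15:11:25Z"}
upsert(container, compose(record))

record["name"] = "Test 1234 (renamed)"
upsert(container, compose(record))  # same id: updates rather than duplicates
```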


Compose action:

Ensure that id is one of the fields when using Compose, or you will receive errors like the ones below:

  "code": "BadRequest",
  "message": "Message: {\"Errors\":[\"One of the specified inputs is invalid\"]}\r\nActivityId: 100c177f-572b-4ef6-ae52-e800179dba95, Request URI: /apps/1763df43-522c-4d5c-a297-e65d4be9a9f3/services/e78ab862-8664-4155-b331-492a5a16c3d1/partitions/e8bc4a0d-0ad0-45b2-8086-8c0a7e8d3c54/replicas/131722482047483029p, RequestStats: , SDK: Microsoft.Azure.Documents.Common/"
  "code": "BadRequest",
  "message": "Partition key provided either doesn't correspond to definition in the collection or doesn't match partition key field values specified in the document.\r\nActivityId: ea348b1c-6ff0-403d-90ec-824d91fdfb6f, Microsoft.Azure.Documents.Common/"
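A Compose action input that avoids both errors might look like the following; the field names (`accountid`, `name`, `modifiedon`) are illustrative, and the point is that the entity's primary key GUID is mapped onto the mandatory `id` property (and onto your partition key field, if the collection is partitioned):

```json
{
  "id": "@{triggerBody()?['accountid']}",
  "name": "@{triggerBody()?['name']}",
  "modifiedon": "@{triggerBody()?['modifiedon']}"
}
```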

One thing to note though is the cost; it is worth looking at Flow, which is generally aimed more at business users but may prove more cost effective.





2 Replies to “Dynamics CRM to Cosmos DB”

  1. Can you help by providing the template for the Logic App?
    What are you doing in the Compose Action? Are Compose Action and Parse JSON two different steps or one? We did them separately and got an error in Parse JSON that it expected a document and received an object.

    Also, our CRM table does not have an id column; how do we map its primary key to the id in the JSON we create?

    • You just need to use Compose. I have just added an example of the Compose action you can use to map the data to JSON. This will allow you to map the fields that you need.
