Activity Checklist – D365 Customer Insights

Customer Insights ( https://dynamics.microsoft.com/en-us/ai/customer-insights/ ) is a powerful tool that helps us collate customer data from various data sources, within the client's environment or even outside it, and create a unified customer profile, which can then be viewed by all the internal users of CRM. Having information displayed beyond the boundaries of what is held in the CRM system makes it a really powerful tool that can substantially increase the ROI of the chosen CRM platform.

The objective of this blog is to list the activities that need input from the business, which in turn will enable a successful implementation and configuration of the tool and help reap the maximum benefit from the platform. Below is a table listing the activities (which can be used as a checklist) and the inputs required. Hopefully this will help identify the inputs required right at the beginning, thus helping the implementation team plan the project much more efficiently.

Virtual Agent and Customer Service Insights – joining the dots in customer service

In this blog I am going to look at the ID&V (identification and verification) process adopted in customer contact centres. Whenever a customer calls in to report a problem, raise a query or make a complaint, irrespective of the nature of the service, one aspect always remains a common factor – identification of the customer prior to registering the service request.

Over time, countless hours are spent identifying customers, irrespective of the nature of the service being provided. This is a lot of valuable time lost that could have been spent on the actual customer issue, allowing customer service agents to provide a higher quality of service and better turnaround times.

As per a study conducted by IBM, businesses spent $1.3 trillion on 265 billion customer service calls in 2017, figures that have been rising steadily ever since. The study also found that chatbots can help businesses save on customer service costs by speeding up response times, freeing up agents for more challenging work, and answering up to 80% of routine questions.

If we try and do some number crunching: let's say an average call centre with 100 agents takes 1,500 customer calls a day. If the average call lasts about 20 minutes, and 30% of that duration is spent verifying the customer, the total time spent on customer verification amounts to 150 man-hours a day. Even if only 50% of customers initiate the conversation over the digital channel supported by a chatbot, that amounts to 75 man-hours per day saved on customer verification alone (I am yet to calculate the time saved in resolving simple customer queries). Over a month that amounts to a saving of approximately 2,250 man-hours, which could instead be spent on the actual issues and complaints reported by customers, leaving the mundane task of verification to the automated channels.
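For anyone who wants to play with these assumptions, the same back-of-the-envelope calculation can be expressed in a few lines of PowerShell (all the figures are the illustrative ones used above, not benchmarks):

```powershell
# Back-of-the-envelope model of the verification time saved by a chatbot
$callsPerDay        = 1500    # calls taken by a 100-agent centre
$avgCallMinutes     = 20      # average call duration
$verificationShare  = 0.30    # portion of each call spent on ID&V
$botDeflectionShare = 0.50    # customers who verify via the chatbot instead

$verificationHoursPerDay = ($callsPerDay * $avgCallMinutes * $verificationShare) / 60
$savedHoursPerDay        = $verificationHoursPerDay * $botDeflectionShare
$savedHoursPerMonth      = $savedHoursPerDay * 30

"Verification effort : $verificationHoursPerDay man-hours/day"   # 150
"Saved by the bot    : $savedHoursPerDay man-hours/day"          # 75
"Saved per month     : $savedHoursPerMonth man-hours"            # 2250
```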

The process of creating a fluid chatbot conversation with added intelligence becomes even more effortless if we can put the ability to configure the end-to-end conversation, enriched with data and insights, into the hands of subject matter experts as opposed to developers. This is exactly what has now been made possible with Microsoft's introduction of Dynamics 365 Virtual Agent for Customer Service. The tool provides a simple "drag and drop" interface that enables SMEs to create conversations on the common topics most frequently queried by their customers, thus putting more power into the hands of "citizen developers". The tool integrates with Microsoft Flow, which in turn enables dynamic content to be brought into the conversation via out-of-the-box or custom connectors connecting to the various back-office systems. Note: custom connectors are the only scenario where an additional helping hand is needed from developers.

The virtual agent is complemented very well by the new Customer Service Insights tool, which helps SMEs understand the trending topics their customers are most concerned about in their line of business. This in turn allows the business SMEs to create additional conversations based on the topics highlighted by the tool. This is where the concept of the "digital feedback loop" can be seen in action, enabling an ongoing process of continuous improvement in the quality of customer service, supported by technology.

The image below illustrates the digital feedback loop in action:

Azure Cognitive Services, Power Platform and the Insurance sector – Part 2

In the previous blog in this series, we looked at the overall process of reporting an insurance claim at a high level. As promised, this installment of the blog series will concentrate on the first part of the process, i.e. FNOL (first notification of loss) by the policy holder, and how we can look to apply Azure Cognitive Services and related technologies (MS Flow, Power Platform et al.) to add value to the business process.

The first notification of loss can be made either by the policy holder directly or by the appointed insurance claim representative on behalf of the policy holder.

In this instance, while trying to move more and more towards a model of self-service supported by "digitization", let's assume that the policy holder directly notifies the insurance company via an app the insurer provides, reporting the loss, supplying the relevant details and uploading associated images of the accident taken with the phone camera. This app could be built in PowerApps, with the uploaded images added to document management, in this case SharePoint. The images are in parallel submitted to the Azure Cognitive Services Computer Vision API, which analyses and interprets the image(s) and returns a textual description of what they show. The image of the licence plate can be interpreted by the Azure OCR API, and the extracted text can then be used to look up the corresponding account record in D365 CE and automatically create a case against the account!
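To make the two image calls concrete, below is a minimal sketch using PowerShell's Invoke-RestMethod against the Computer Vision REST API. The region, API version, key and file paths are assumptions for illustration; in the actual solution these calls are made declaratively by the out-of-the-box Flow connector.

```powershell
# Assumed: a Computer Vision resource in West Europe and its subscription key
$visionBase = "https://westeurope.api.cognitive.microsoft.com/vision/v2.0"
$headers    = @{ "Ocp-Apim-Subscription-Key" = "<your-subscription-key>" }

# 1. Describe the crash photo - returns a natural-language caption
$describe = Invoke-RestMethod -Uri "$visionBase/describe?maxCandidates=1" `
    -Method Post -Headers $headers `
    -InFile "C:\claims\crash.jpg" -ContentType "application/octet-stream"
$caption = $describe.description.captions[0].text   # e.g. "a car parked on the side of a road"

# 2. OCR the registration plate photo - returns recognised text by region/line/word
$ocr = Invoke-RestMethod -Uri "$visionBase/ocr?detectOrientation=true" `
    -Method Post -Headers $headers `
    -InFile "C:\claims\plate.jpg" -ContentType "application/octet-stream"
$plate = ($ocr.regions.lines.words.text) -join ""   # flatten the recognised words
```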

The whole exercise of interpreting the images via a combination of cognitive services enables:

  1. Insurance claim advisers to understand the details of the incident, in addition to the narrative provided by the policy holder and those involved, via the textual analysis of the crash image.
  2. The description text extracted from the crash image to be used by reporting systems or other in-house systems for further trend reporting and analysis, by making it part of the overall internal "big data" held within the insurance company.
  3. Speed in customer service – a case is created automatically based on the registration plate image analysis using the Azure OCR API, even before a detailed conversation is had with the customer concerned.

The overall architecture would be as below:

  1. A Microsoft PowerApps based mobile app is used to capture the images via the smartphone camera.
  2. Microsoft Flow is then triggered in the background, which in turn will:
    1. Supply the crash image to the Azure Computer Vision API (Describe Image Content)
    2. Upload the crash image to the company's SharePoint
    3. Supply the registration plate image to the Computer Vision API (OCR Text)
    4. Upload the registration plate image to the company's SharePoint
    5. Use the registration text returned from the OCR API to search for the matching account in D365 CE
    6. Create a new case against the returned account based on the additional details supplied via the app (with the supplied images held in SharePoint linked to the newly created case) – steps 5 and 6 are sketched below
  3. The Flow can potentially also be used to supply the output from the cognitive services directly to Power BI (for further analysis or as input to any machine learning algorithms in the background).
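Steps 5 and 6 boil down to two Dynamics 365 Web API calls. As a hedged sketch (the org URL, the bearer token and the custom field new_registrationplate are illustrative assumptions; the Flow itself does the same thing declaratively via the out-of-the-box D365 connector):

```powershell
$orgUrl  = "https://contoso.crm.dynamics.com"    # assumed org URL
$apiHead = @{ Authorization = "Bearer <access-token>"; Accept = "application/json" }

# Step 5: look up the account whose (assumed) registration plate field matches the OCR result
$query = "`$select=accountid&`$filter=new_registrationplate eq '$plate'"
$match = Invoke-RestMethod -Uri "$orgUrl/api/data/v9.1/accounts?$query" -Headers $apiHead
$accountId = $match.value[0].accountid

# Step 6: create a case (incident) bound to that account
$case = @{
    title                           = "FNOL reported via mobile app - plate $plate"
    description                     = $caption    # caption returned by the Describe call
    "customerid_account@odata.bind" = "/accounts($accountId)"
} | ConvertTo-Json
Invoke-RestMethod -Uri "$orgUrl/api/data/v9.1/incidents" -Method Post `
    -Headers $apiHead -Body $case -ContentType "application/json"
```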

The screen for the mobile app, created using the canvas PowerApps platform, is as below (note: I do not claim to be a UI expert or a PowerApps pro at this stage, so please excuse the odd colour scheme or "lack of finesse" in the UI design :))

The associated flow that runs in the background, once the customer submits the details and associated images, is as below:

The most important thing to note: everything above, from the mobile app to the Flow-based logic running in the background and culminating in the creation of the case in D365, has been carried out without writing a single line of code (if the expressions in PowerApps do not qualify as code)! I have so far kept true to the promise of keeping this as "no code" as possible, and I hope to continue the trend as we try to automate the process further down the line.

Azure Cognitive Services, Power Platform and the Insurance sector – Part 1

With the ever-growing list of Azure Cognitive Services, combined with Azure Machine Learning, potential use cases across different industry sectors exist in abundance. One such sector, and the focus of this blog, is the insurance sector.

With the help of this blog series I will try and explore the application of Azure Cognitive Services, the MS Power Platform and any other supporting technologies in the MS ecosystem within the insurance sector, concentrating specifically on the claims process. The steps that will be followed to complete this exercise are as below:

  1. Explain the generic end-to-end business process that is followed from reporting the FNOL (first notice of loss) right up until the fulfillment of the payment of the claim, using auto insurance as the context.
  2. Break down the specific steps in the process and understand how we can apply cognitive services to help improve / speed up the step in question.
  3. Create an end to end POC based on the architecture specified in step 2.

Despite the fact that I come from a development background, my objective will be to keep the overall solution as "no code" as possible and see how far we can get, thus proving the point that the toolset at our disposal is not limited to developers but can also be used by "citizen developers" (see the blog by Steve Mordue), which I believe is going to be the case in the majority of scenarios going forward.

So let's get started with the first step, and understand the end-to-end process for filing a claim and receiving a payment.

Without claiming to be an expert on the insurance claim process, I will try to explain the process in brief (supported by the graphic above):

  1. The customer / policy holder provides the first notification of loss immediately after the accident. The information provided is further supplemented by any video or photographic evidence.
  2. After the insurance company has been provided with all the information, the claim proceeds to the next stage and the policy holder is assigned a claim representative. The representative examines evidence for injury claims, provides assistance with the claim process and clarifies any outstanding queries.
  3. Based on the nature and value of the damage (and any other parameters), a relevant adjustor from the insurance company is appointed.
  4. The adjustor is responsible for inspecting and assessing the damage and issuing a cost estimate.
  5. Based on the cost estimate issued, if the car is deemed not to be beyond economic repair, the relevant repairs are carried out by the appointed garage. If the car is beyond economic repair, a settlement payment for the agreed amount is issued to the policy holder.

The scenario explained above sets the scene for the architecture and building blocks that we will put in place, in the next part of this blog, to add fully functional cognitive capabilities to the process.

Microsoft Flow – Enterprise Ready?

Now that we have transitioned from the old world of XRM into the brave new world of the Power Platform, it becomes imperative to understand the complete toolset at our disposal in great detail.

One of the most important and (IMHO) underestimated tools in the Power Platform toolbox is Microsoft Flow. Flow is touted as one of the three pillars of the Power Platform, and it is hard to ignore the amount of power the tool brings to the table.

However, when talking to enterprise customers with a sophisticated IT landscape underpinning their business, there are quite a few apprehensions (rightly so) about a technology that is supposed to be at the disposal of "citizen developers", and the mayhem it can potentially cause without appropriate governance rules in place.

So as part of this blog (or a blog series, if I get around to creating one), I will try to answer, in a Q&A format, some of the common questions and challenges that come up when trying to land this piece of technology in an enterprise environment.

Can I group, isolate and segregate the various flows being created in my organisation?

A Microsoft Flow subscription comes with a separate administration interface, https://admin.flow.microsoft.com/, which is the admin center for Flow.

This admin center allows administrators to create and manage what are called environments for Flow. Environments provide an isolation boundary for all resources, hence giving us the ability to isolate apps and flows from each other. Additionally, DLP (data loss prevention) policies can be applied to individual environments, which prevents connectors talking to each other where you do not want them to. As an example, if you do not want D365 online and Twitter in the same flow, you could create a DLP policy that would prevent exactly that.
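As a sketch of how that example policy could be scripted rather than clicked together, using the Microsoft.PowerApps.Administration.PowerShell module (cmdlet names and parameters as I recall them, so treat this as illustrative rather than definitive):

```powershell
# Assumed: the Microsoft.PowerApps.Administration.PowerShell module is installed
Add-PowerAppsAccount    # sign in as a tenant admin

# Create a DLP policy scoped to a single environment
$policy = New-AdminDlpPolicy -DisplayName "Keep D365 away from social" `
    -EnvironmentName "<environment-guid>"

# Move the D365 connector into the Business data group; Twitter stays in the
# default No business data group, so the two can never be used in the same flow
Add-ConnectorToBusinessDataGroup -PolicyName $policy.PolicyName `
    -ConnectorName "shared_dynamicscrmonline"
```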

Can I mimic a typical ALM scenario with dev, test and prod instances to gain more control over the release cycle of the flows being created?

Using the concept of environments explained above, we can mimic such a scenario.

As we can see above, the default environment has been renamed to Dev, and additional non-default environments have been created to mimic Test and Prod. The Dev environment can be used by users to experiment and create new flows, whereas the Test and Prod environments can be used in a more controlled manner, with additional policy controls applied and limited publishing rights given to users. This prevents users from pushing changes to the Test or Prod environments without going through the rigour of a change control process, while retaining the flexibility to experiment in the Dev environment.
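The same set-up can also be scripted. A minimal sketch with the admin cmdlets (the display names and location are illustrative, and the parameter set may differ between module versions):

```powershell
# Assumed: signed in via Add-PowerAppsAccount as a tenant admin
Get-AdminPowerAppEnvironment    # list the existing environments, including the default

# Create the controlled environments alongside the (renamed) default Dev one
New-AdminPowerAppEnvironment -DisplayName "Test" -Location europe
New-AdminPowerAppEnvironment -DisplayName "Prod" -Location europe
```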

This set-up can now be used to separate connections to SOX and non-SOX systems. Any SOX system can be connected to within the Test or Prod environments, whereas any non-SOX system can be connected to in the default / Dev environment.

Are there any tools available that can aid the maintenance of the flows created in the various environments?

Other than the standard PowerShell cmdlets, Microsoft has introduced a number of native flow management connectors that can be used to carry out various maintenance activities. A quick search for the templates results in the list below:

If we look at one specific template as an example:

It produces a list of the new PowerApps, flows and connectors that have been introduced into the tenant within a configurable window; the list of components is then collated and emailed to a configured recipient.

This could potentially be extended further to add flow approvals, enabling an admin to approve or reject flows; the rejected ones can then be deleted after a specific time period, after notifying the original creator.
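A hedged sketch of that clean-up step using the standard admin cmdlets (the approval and rejection tracking is assumed to live elsewhere; here the rejected flows are identified purely by a naming convention, which is an assumption for illustration):

```powershell
# Assumed: signed in via Add-PowerAppsAccount as a tenant admin
$devEnv = "<dev-environment-guid>"

# Enumerate every flow in the Dev environment
$flows = Get-AdminFlow -EnvironmentName $devEnv

foreach ($flow in $flows) {
    # Illustrative rule: remove flows an admin has marked as rejected,
    # e.g. via a "REJECTED" display-name prefix agreed as a team convention
    if ($flow.DisplayName -like "REJECTED*") {
        Remove-AdminFlow -EnvironmentName $devEnv -FlowName $flow.FlowName
    }
}
```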

Can we get more visibility and insight into how Flow is being used by users within the organisation?

To satisfy this requirement, Microsoft has released Microsoft Flow Admin Analytics, based on customer feedback. This admin interface provides a detailed breakdown of the usage of Flow with the help of various prebuilt Power BI reports, as explained in the link:
https://us.flow.microsoft.com/en-us/blog/admin-analytics/