Many discussions around moving SAP to Azure, AWS, Alibaba or GCP circle around IaaS capabilities, cost of running production and maintenance procedures. However, in my opinion moving the system containing the most valuable data a company owns close to an environment that thrives on innovation and rapid adoption of new trends is much more interesting. One of those trends is Machine Learning. Therefore, at the heart of today’s blog post is the Text Analytics API provided by Azure Cognitive Services.
So, let’s dig into a fancy, fully up-and-running analytics scenario that you can roll out yourself.
I am going to show you how easy it is to enrich your precious SAP data with meaningful inputs from social media and feed the results back to a CRM system. A complete process that creates immediate additional value with your SAP data.
Down below you will find the overall reference architecture of the solution. The actual implementation is based on Azure, SAP’s cloud ERP demo system ES5, SAP Cloud Platform, Twitter and the Dynamics 365 CRM solution.
Fig. 1 architecture overview
Data Ingestion is the foundation (upper part of fig.1)
Our implementation reads the “famous” Sample Flight Data dataset from SAP ERP (ES5) on a scheduled basis via OData.
Credentials are provided in a secure way via Azure Key Vault. For productive scenarios you would be looking at isolated environments and VNet peering options to avoid Basic Auth. But in our case we are limited by the exposed functionality of the ES5 demo environment.
The result is stored in schema-less Azure Table Storage.
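To make the ingestion step more tangible, here is a minimal Python sketch of what happens under the hood: read the flight collection via OData and map each record to a Table Storage entity. The service URL and the OData property names are assumptions based on the typical Gateway flight sample, so check your system’s metadata for the actual names; in production the LogicApp connector and Key Vault handle this for you.

```python
import json
import urllib.request

# Hypothetical OData endpoint for the ES5 flight sample; the actual
# service path depends on your demo system setup.
ODATA_URL = ("https://sapes5.sapdevcenter.com/sap/opu/odata/iwbep/"
             "RMTSAMPLEFLIGHT/FlightCollection?$format=json&$top=50")

def fetch_flights(opener: urllib.request.OpenerDirector) -> list[dict]:
    """Read the flight collection as JSON; the opener carries the Basic Auth
    credentials (in production you would pull these from Azure Key Vault)."""
    with opener.open(ODATA_URL) as resp:
        return json.load(resp)["d"]["results"]

def to_table_entity(flight: dict) -> dict:
    """Map one OData flight record to an Azure Table Storage entity.

    Table Storage needs a PartitionKey and a RowKey; partitioning by carrier
    keeps all flights of one airline together for cheap queries. The property
    names (carrid, connid, fldate, ...) are assumed from the flight sample.
    """
    return {
        "PartitionKey": flight["carrid"],
        "RowKey": f'{flight["connid"]}-{flight["fldate"]}',
        "Price": flight.get("PRICE"),
        "Currency": flight.get("CURRENCY"),
    }
```

The entity mapping is the part worth getting right: querying later by airline is what makes the per-carrier sentiment buckets cheap to build.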
In parallel we listen for tweets on our Twitter account that relate to the flight carrier information from ES5. You have the option to search for keywords and/or specific Twitter users, for example. We decided to listen for airline-related content on our Twitter demo account only, to have more control over the data traffic.
Fig.2 Screenshot from demo Twitter account
The Twitter posts are analysed for their sentiment via Azure Cognitive Services Text Analytics and categorized by airline depending on the search results. So, if you see “What’s happening at ContosoAir? Delays again?”, this will likely be scored as negative and be put in the ContosoAir bucket.
Eventually the results from the Twitter feed analysis are also written to Azure Table Storage.
Now you have already enriched your SAP flight data with sentiment scores from Twitter in a raw table format. Great, isn’t it?
Complex Sentiment Analysis with a simple LogicApp building block
Some more details on the analysis: the Twitter feed is fed into the Text Analytics API by a LogicApp building block with out-of-the-box integration. Have a look at the screenshot from the LogicApp down below (fig.3). You could achieve the same thing with a pure REST API integration and client libraries, but that requires some coding effort. Find more details here.
Fig.3 Screenshot from LogicApp on the config for the Azure Text Analytics API
The Text Analytics API wants to know which language the provided text was written in. Luckily, Twitter already does that for us, so you simply assign the fields from the payload accordingly. The API uses pre-trained machine learning models to perform the sentiment scoring. There are plans to allow submitting your own training data. More details on the topic here.
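If you prefer the pure REST route instead of the LogicApp building block, a minimal Python sketch could look like the following. The endpoint and key are placeholders for your own Cognitive Services resource, and the payload shape follows the v2.1-era sentiment endpoint, which returns one score between 0 and 1 per document.

```python
import json
import urllib.request

# Placeholders -- substitute your own Cognitive Services resource and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
API_KEY = "<your-key>"

def build_documents(tweets: list[dict]) -> dict:
    """Build the request body for the Text Analytics sentiment endpoint.

    The API wants the language of each text; Twitter already delivers it
    in the tweet's `lang` field, so we just map the payload fields over.
    """
    return {
        "documents": [
            {"id": str(i), "language": t.get("lang", "en"), "text": t["text"]}
            for i, t in enumerate(tweets)
        ]
    }

def score_sentiment(tweets: list[dict]) -> list[float]:
    """POST the documents to the sentiment endpoint; scores are 0..1,
    where values close to 0 are negative and close to 1 are positive."""
    req = urllib.request.Request(
        f"{ENDPOINT}/text/analytics/v2.1/sentiment",
        data=json.dumps(build_documents(tweets)).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Ocp-Apim-Subscription-Key": API_KEY},
    )
    with urllib.request.urlopen(req) as resp:
        return [d["score"] for d in json.load(resp)["documents"]]
```

This is exactly the field mapping the LogicApp screenshot in fig.3 does visually: tweet text plus Twitter’s language code in, a numeric score out.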
Shiny Reporting with mobile access
Now that the raw data is prepared and enriched with the sentiment scoring, it is ready to be consumed. We decided to provide a web-based PowerBI dashboard for data discovery use cases and a UI5 app on SAP Cloud Platform for mobile-friendly consumption. Both options are integrated with the SaaS CRM solution Dynamics 365. Since this is all API-based, you can just as easily do the same thing for SAP Sales Cloud (C4C portfolio) or Salesforce, for instance.
Fig.4 Screenshot from UI5 app overview page
The overview page of the UI5 app provides the average sentiment score by airline. Master data originates from SAP ERP (ES5) and the scores from Azure Text Analytics API.
Fig.5 Screenshot from UI5 app detail page
You can investigate the scoring of individual Twitter messages on the details page. Furthermore, you can create CRM activities manually if you want. Our prototype creates CRM tasks automatically only for scores below 0.4. So, if you want to follow up on something outside of that scope, you can do that with the app. We feel that with every automation it should always be possible for humans to weigh in with their own thoughts. That’s why we provided this additional option.
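The automatic-vs-manual decision boils down to a simple threshold check. A sketch, using the 0.4 threshold from our prototype:

```python
THRESHOLD = 0.4  # sentiment scores below this trigger an automatic CRM task

def needs_auto_task(score: float, threshold: float = THRESHOLD) -> bool:
    """Decide whether a tweet's sentiment score warrants an automatic CRM task.

    Everything at or above the threshold is left to a human: the UI5 app
    still lets an account representative create a task manually.
    """
    return score < threshold
```

Keeping the threshold a single, visible parameter makes it easy to tune once you have seen real score distributions for your accounts.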
PowerBI is a great option to quickly create a dashboard or do some data discovery. The proposed architecture offers that out-of-the-box. Of course, there are other options with Tableau and the like. For this prototype I picked what I had available with a feasible feature set.
Fig.6 Screenshot from PowerBI web report
Obviously, this first draft needs some more love.
CRM Process and closing the loop
With the sentiment scoring and anomaly detection on the Twitter feed you get actionable data that contains potentially interesting info for your account representatives. Imagine you are responsible for ContosoAir and the system discovers a spike in bad or good scores. You would want to get notified quickly and probably derive some sort of action from it, right? To do that we integrated with the cloud-based CRM solution Dynamics 365.
As I mentioned before, you could do the same thing with various other CRM systems. We chose this one due to the easy availability of a trial demo instance.
We modelled a contact in Dynamics 365 to match our Twitter profile. That way we can link the Twitter feed, the scoring results and the account in the CRM system.
Fig.7 Contact in Dynamics365 to match the twitter profile
On top of that we designed a workflow that sends notifications via E-Mail and Microsoft Teams to inform the Account Representative. Cool, isn’t it?
Fig.8 LogicApp flow to create CRM task and post Teams message
Fig.9 Screenshot of task for bad twitter sentiment score
As the reference architecture shows (fig.1), this CRM task was created automatically due to its very bad score of 0.09 (the defined threshold is below 0.4). You can check fig.5 to verify. We call this basic anomaly detection.
Once the task in Dynamics 365 is created, the system posts a message to Microsoft Teams. This could just as easily be an email or a Jira ticket: you name the system. If there is an API, you can integrate it.
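The Teams notification itself needs nothing more than an incoming webhook on the target channel. A Python sketch of that step (the webhook URL is a placeholder; in our implementation the LogicApp connector does this for us):

```python
import json
import urllib.request

# Hypothetical incoming-webhook URL created on the target Teams channel.
WEBHOOK_URL = "https://outlook.office.com/webhook/<your-webhook-id>"

def build_teams_message(airline: str, score: float, task_url: str) -> dict:
    """Compose the notification, including the back-reference to the CRM task."""
    return {
        "text": (f"Negative tweet detected for {airline} "
                 f"(sentiment score {score:.2f}). "
                 f"Follow up here: {task_url}")
    }

def notify_teams(airline: str, score: float, task_url: str) -> None:
    """POST the message to the channel's incoming webhook."""
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=json.dumps(build_teams_message(airline, score, task_url)).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

Including the task URL in the message body is what gives you the click-through back into Dynamics described below.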
Fig.10 Screenshot of Teams message for bad twitter sentiment score
There is even a back-reference to the CRM task (notice the link at the end of the message). Simply click and go to Dynamics to check on more details. Of course, SSO is key here. But that is a topic for another post.
That’s it for data loading, analysis, storage, reporting, creating CRM activities based on the info and notifying account representatives. But what about the software delivery process? Let’s focus on the reporting piece for now.
Continuous integration and multi-cloud release
For a real-world scenario you need continuous integration and continuous deployment (CI/CD) capabilities to deliver your software solution. Wow, I almost added “in an agile way”. That would have been way too many buzzwords for one sentence. As I said in the beginning, it is important to be able to prototype, implement changes and ship new features rapidly. The same is true for getting feedback from your user base quickly.
The way to achieve that in a managed way is CI/CD. In our case I configured a GitHub project for source control and Azure Pipelines for the CI/CD aspect. In our reference architecture (see fig.1) one pipeline branch delivers to SAP Cloud Platform and another to Azure App Service for Containers. For the latter I built a Dockerfile to package the UI5 app and run it in a containerized environment. For more details have a look at the GitHub repos.
Fig.11 Screenshots of Azure Build-Pipeline
For simplicity reasons I haven’t configured my Cloud Foundry trial with Azure Pipelines yet. However, you can check out my earlier blogs on the topic. You can find up-and-running prototypes there:
Both options offer capabilities to configure smart release strategies such as Blue/Green or Canary deployments.
Thoughts on Scalability
The architecture described above relies on Azure Logic Apps for all the integration. But Logic Apps are meant for transaction-based requests. Therefore, as soon as you enter the sphere of bulk requests, you are better off with streaming solutions. A path that I investigated was Azure Event Hub with Stream Analytics jobs. You can find the specifics here. I decided against it for the simplicity of the demo. So, if you consider running a scenario at a broader scale, you need to think Azure Data Lake, streaming jobs and event distribution solutions. Feel free to reach out on that matter.
Building on what I just mentioned: In my example I am reading from an individual Twitter feed. But you might want to scan the whole Twitter network with a specific time window. To achieve that you can leverage the Twitter API functionality too.
The LogicApp connector, for instance, allows you to search for hashtags, words, topics or lists of users, and to start at a specific tweet id. So, be smart about your data reading logic. A sliding time window method would apply for most cases, especially because the API enforces read limits and caps the number of items per response. Find more info here and here.
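A sliding-window read can be sketched as follows: keep a since_id cursor, only fetch tweets newer than the last one seen, and stop after a few pages to respect the read limits. The `search` callable stands in for whatever Twitter client you use; its one-argument signature here is a hypothetical simplification.

```python
def next_since_id(tweets: list[dict], current_since_id: int) -> int:
    """Advance the since_id cursor to the newest tweet id seen so far.

    The Twitter search API accepts a since_id parameter so that the next
    poll only returns tweets newer than that id.
    """
    ids = [int(t["id"]) for t in tweets]
    return max(ids, default=current_since_id)

def poll(search, since_id: int = 0, pages: int = 3) -> list[dict]:
    """Sliding-window read: `search(since_id)` returns one batch of tweets.

    Stop after a bounded number of pages (or an empty batch) so a single
    run never exhausts the API's rate limits.
    """
    collected: list[dict] = []
    for _ in range(pages):
        batch = search(since_id)
        if not batch:
            break
        collected.extend(batch)
        since_id = next_since_id(batch, since_id)
    return collected
```

Persisting the last since_id between runs (e.g. in Table Storage alongside the scores) is what turns this into a proper incremental load.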
Some context and an issue close to my developer heart
You could argue that integrating all of this with on-premises systems would work too, but you would heavily miss out on flexibility. With hyperscalers you just spin up new components as you go. Public preview versions of new “goodies” are only one example of how to keep an edge on trending topics.
In terms of the SAP backend, you could even activate otherwise irreversible SAP modules on demand without second thoughts. Just roll back if you are not happy with the result, without keeping your Basis guys busy.
It is about the freedom to try new approaches, ideas, versions of products/services or systems. Once you embrace the MVP-philosophy (minimum viable product) you get the opportunity to learn as early as possible if your idea strikes a nerve or if you should go back to the drawing board.
From my point of view playing with ideas is essential for the creative process and to eventually create something valuable.
Uh, that was quite a ride. I hope you enjoyed it at least half as much as we enjoyed building it. We showcased how you can enrich your SAP ERP data with interesting information from social media platforms such as Twitter, wrap meaningful reporting around it and create CRM actions with direct business impact for your account representatives.
Apart from that, I made the case for a new way of tech thinking. The creative process of bringing new ideas to life with software really thrives in hyperscaler environments. I often quote the “pets vs. cattle” paradigm on this.
Let’s embrace what developers and architects do in local sandbox environments on their devices anyway. That way, new ideas become available to end users much more quickly.
This whole project, for example, was brought to life in a matter of days. So, roll up your sleeves and get your hands dirty with some ideas.
For some inspiration and further reading have a look at the cool demo scenarios for SAP and O365 from my colleagues Holger Bruchelt and @roman.broich1. Roman also has a nice IoT implementation available in GitHub.
Find my GitHub repository with details on the configuration here: https://github.com/MartinPankraz/SAP-X-Fiori
Azure DevOps project here: https://dev.azure.com/mukurtul/Project%20X
A video will follow soon.
#Kudos to my colleagues Vadim Demchenko, Michael Kaufmann and Mustafa Kurtulus on collaborating on this adventure with me.
As always feel free to leave or ask lots of follow-up questions.
Opinions expressed are solely my own and do not express the views or opinions of my employer.