Power BI Insights for C-Suite from Microsoft MegaExpert Belinda Allen – Part 2

In this second part of our series with Microsoft Certified Power BI MVP and executive strategist Belinda Allen, she shares her insight on how Power BI can elevate your pharma company and deliver greater efficiency for C-level executives.

We asked her two questions about Power BI; here's what she had to say:

What are some of the biggest advantages that come with using Power BI?

Not only does the product come with a low cost of entry, but it is also easy to use for the person consuming reports, and it's very easy to share data. Along with that comes built-in integration with the rest of the Microsoft stack. Because Microsoft offers sister technologies to Power BI, such as machine learning and AI, C-level executives can take advantage of those areas and bring them into Power BI.

Power BI also has Bing technology built into it, along with a natural-language query feature called Q&A that lets us ask questions of our data and get the answers back as a visual (a chart, a graph, a table, etc.). That's just insanely powerful, and it feels like a natural progression because it's how we're already used to searching. Another advantage is the quick analysis feature of the Power BI service, which goes through the data and looks for trends and for outliers to those trends.

Now, it doesn't know what the data is, and it doesn't care. It just takes the data, looks for trends and outliers, and then provides you with a report of various visuals based on what was found. I'll admit, if you're familiar with your data, you'll look at the visuals and not be surprised by the trends and outliers, but there's often a golden nugget of information in there that you didn't expect to see and had never even thought about before.

I think one of the biggest advantages is being able to leverage other Microsoft products with Power BI. Another is the ability to take advantage of Microsoft's Power Platform: Power BI, Power Apps, and Power Automate (formerly named Flow). With the Power Platform, instead of doing all the work in various places, you can do everything in one integrated location.

One of the things I love about it is that you can literally do it from your iPad. So, if someone is out of the office, they can log in and approve, reject, or make adjustments to reports. You can build custom apps that let pharma sales reps retrieve data from a CRM system such as Dynamics 365, and the app can also write back to Dynamics 365, getting information out to the reps while they are on the road.

Also, from a marketing point of view, you can monitor tweets. For example, any tweet that uses a hashtag you specify can be stored in an Excel file in OneDrive; that way all the tweets are stored and you can run sentiment analysis on them.

What are some challenges that Power BI can help large pharma companies overcome?

One of the biggest things is that you can shorten the chain of command needed to get information. You don't have to ask different people to have that information funneled back up to you. Being able to access information very quickly is invaluable.

Another is the ability to attach comments to reports that can be accessed from any device and location. You can ask questions in real time from your device while out of the office and have the answers delivered to you immediately. Shortening the chain of command for requesting and retaining information is one of the biggest ways to generate efficiency, because you're getting it right away.

Another advantage Power BI offers C-level executives is that this efficiency pushes line-level employees to enter data in a timely manner. It's an electronic tattletale, if you will. If we want to know what is going on in real time, we must have everything entered on time, because we can only analyze what's already in the data.

It pushes people to do their work more quickly and stay current with it, which provides faster information in both the short and long run. Pharma sales reps can use the cloud when they are out of the office and do a quick analysis of reports while they are selling to a physician. Being able to quickly analyze trends and other data right before walking in to speak with a physician is commercially invaluable to pharma companies.

There's also the ability to analyze the status of payments. Power BI includes the approval of reports, and that's where access to the sister product, Power Apps, can help as well. The main advantage of Power BI and Power Apps together is in requesting information, provided the information is already in the system. Instead of C-level executives having to go to other people for reports, the reports, including the relevant data, can be shared with them instantly.

A lot of companies used to have challenges "closing the books" because employees would let work stack up on their desks. Now, if you're accessing data as it's collected, you're asking questions about that data in real time as it sits in the system, and that changes the whole world for the people who must get the data into the system.

Real-time data is a major way Power BI can reduce costs, because you can find out what's going on in the moment. And with a flow in Power Automate, it doesn't have to be just an email that goes out: once it sees that an alert has been activated in Power BI, it can do other things as well. You can have it update something inside a CRM system, update a to-do list activity, and more!

It can do these things in conjunction with each other, all going off the alert that was activated in Power BI. So it doesn't have to be one or the other; it can be an entire chain of events that occurs from that alert. If costs are going up dramatically, you can see that, find out what's causing it, and monitor those trends as you go along.

Because the monitoring of communications becomes an automated process, it's much more efficient and much timelier. It also covers things that happen over the weekend when nobody is working, since real-time data collection is ongoing constantly. And because employees are no longer needed to monitor communications, you can redirect those same employees to more pressing tasks that can help generate more sales.

We are proud to be a partner with Belinda to provide her insight into the C-level application of Power BI strategies, changing the methodology of business in the life sciences. Stay tuned for the next release of insight from Belinda. 

Belinda Allen is a Microsoft Certified Power BI MVP and Gold Level Trainer; she is also the Business Intelligence Program Manager for PowerGP Online. She has excelled in assisting partners and customers to implement and create BI methodologies, enabling businesses like yours to make high-quality decisions based on real-time and accurate information. You can find out more at http://www.saci.com/.

If you would like to learn about our Data Analytics solution which includes Power BI technology and how it can help your Pharma company, click below!

C-Level Analytics Strategies with Power BI – Insight from Microsoft MegaExpert: Belinda Allen

We sat down with Microsoft Certified Power BI MVP and executive strategist Belinda Allen, who shared her insight on how the enterprise application of Power BI is revolutionizing business strategies.

This is the first part of a series with Belinda, focusing on tangible, actionable Power BI integration at the enterprise and executive levels.

In each part of this series, we share two questions we asked Belinda:

What is your definition of the CEO-level value proposition of using Power BI?

“With the implementation of Power BI, C-level executives have access to real-time analytics and know what is happening in the moment without having to ask anyone for updates and without having to guess on their own. More importantly, they are no longer looking only at what happened in the past; they can now dig into what is happening in the present to make the best decisions for the future.

We live in a world where disparate systems are a reality, and they will probably remain a reality as line-level managers leverage data. So being able to create reports that integrate data from multiple sources into one common place, and to look at it in real time, is something that prior to Power BI dashboards either wasn't possible or took a long time, with people preparing reports and sending them to C-level executives.

The ability to look at information and make decisions quickly, to look for trends, to track data, and to start using other resources within the Microsoft family that have intelligence and machine learning built in is beyond invaluable for the CEO.

One of the biggest advantages of using Power BI is that there's a low cost of entry and a high return on investment. With that low cost of entry comes a product that is relatively easy to use for the person consuming reports, and it's very easy to share data. It's all about the efficiency of getting data; nobody wants to sit and wait to find out what's going on. We all expect everything instantly in today's technology era, and Power BI can deliver that.”

How does Power BI feed the strategies that help the executive management of an organization make enterprise decisions?

“Being able to use sentiment analysis along with financial numbers is becoming more and more important. Sentiment analysis monitors what people are saying on social media and in reviews left on websites such as Yelp, and whether they're showing that they're happy with your company or not. We live in the here and now, so if you have a bad experience with a company, you're most likely going to tweet about it, and companies can now capitalize on that information with tools that perform sentiment analysis.

The same is true when they're looking at their CRM systems and monitoring what's going on. Times have definitely changed; you can now look for a correlation between complaints about your company and decreasing revenue. Let's say you have a ton of complaints about product "B"; you may want to see whether there have been new sales of that particular product and whether there is any correlation there. We don't know what we don't know! As human beings, we have a tendency to look for data that supports our theories.

So, if we can do analysis that takes us out of building something that just supports our theories (which is what we've always done with tools such as Excel, creating files to prove a theory about why we're headed in the direction we're headed), we can instead look at what the data is telling us. Being able to be more fluid and to go with the trends using Power BI and sentiment analysis is just invaluable.

Microsoft's Power Platform is also very beneficial; it consists of three products: Power BI, Power Apps, and Flow. Power Apps is a mini-app-building tool that lets you create an app to use on your desktop or on your phone. It can be something very complex that a developer would build or something more straightforward, and you build it out using formulas similar to the way you work in an Excel spreadsheet. So, if you're comfortable with Excel, you can build your own apps inside Power Apps. I bring this up because you can then take something built in Power Apps and put it inside your Power BI report, or take Power BI visuals and put them inside your Power App.

Then you can take advantage of a third product called Flow. It's an integration or workflow tool, a way to bridge multiple applications together. Let's say you are in Dynamics 365 and somebody has a problem with your company. After a case has been resolved in Dynamics 365, Microsoft Flow can automatically send out a form. If someone responds that they weren't satisfied, new questions can appear so you can drill down on the issue. And if it's an open-ended question, which I completely recommend, then whether the answer was positive or negative can be tracked and analyzed, so you can really get to the heart of the matter.

In days past, we would have to read all of the comments and answers to get an idea of what's going on. Now, Microsoft machine learning can go through all of that text automatically and score it based on shifts in tone, all-capital letters, and other signs of negativity. You can have this tool go through social media comments, YouTube comments, blog comments, and so on. Or, let's say one of your employees creates an alert using Flow: if that alert gets activated, I want it to send an email to the CEO, the head of sales, and maybe three other specific people, and I want it to include a link to a specific report or visual, and you can do that automatically with Flow. You can even set parameters so that if the number being analyzed is less than a certain value, the alert is only sent to certain people, and if it's greater, it goes to everybody.”

We are proud to be a partner with Belinda to provide her insight into the C-level application of Power BI strategies, changing the methodology of business in the life sciences. Stay tuned for the next release of insight from Belinda. 

Belinda Allen is a Microsoft Certified Power BI MVP and Gold Level Trainer; she is also the Business Intelligence Program Manager for PowerGP Online. She has excelled in assisting partners and customers to implement and create BI methodologies, enabling businesses like yours to make high-quality decisions based on real-time and accurate information. You can find out more at http://www.saci.com/.

If you would like to learn about our Data Analytics solution which includes Power BI technology and how it can help your Pharma company, click below!

How to Use Azure IoT Hub, Azure Functions, and Azure Cosmos DB — Walk-through

IoT is well-known throughout the corporate world. For starters, this isn't another IoT article about home automation; it's a simple walk-through of adding a serverless back-end to your existing IoT system.

We'll touch on how Azure IoT Hub can be used as an event hub, or messaging system, for "Things." We'll also cover how these incoming messages can be routed to the related database using Azure Functions.

No physical devices will be used. We'll simulate our PC as a device and use Node.js to connect to the IoT Hub and stream data. Then we'll see how the data streamed to the IoT Hub can be routed to the DB after some logic, using an Azure Function written in C#.

Finally, we'll see how the integration comes together as we connect all the components.

Architecture

Fig 1: Architecture basic

The figure above shows the basic architecture and flow of the system: data travels from a device to the database through a serverless back-end. Note that we'll be using Cosmos DB's MongoDB API to access the database.
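
As a rough sketch of that flow, using the component names created later in this walk-through (the arrows just indicate the direction of the data):

simulatedDevice (Node.js, MQTT)  ->  Azure IoT Hub "spektro"  ->  Azure Function "spectroIotTrigger" (C#, IoT Hub trigger)  ->  Cosmos DB "spektrodb" (MongoDB API, database "database_spektro")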

Azure IoT Hub

Azure IoT Hub is essentially a managed IoT messaging platform. Using it, we are going to send messages from our simulated device.

Technically, the device section of any IoT system is referred to as an edge device. From our simulated edge device, we are going to send data to the Azure IoT Hub using the MQTT protocol.

First, let's go over how to set up the Azure IoT Hub. Head over to https://portal.azure.com and create a new IoT Hub.

Fig 2: Azure IoT Hub — Screenshot

Once created, we will add a new device and name it something relevant. In this case, the name of the IoT hub is “spektro” and the name of the device is “simulatedDevice”.

To create a device, click on the IoT Hub you just created and head to "IoT devices" under the "Explorers" tab.

Fig 4: Screenshot — IoT Devices

After creating the device, the next step is to write the code that will simulate the device and send some raw data. For the code, we will stick to Node.js. Use the npm package manager to install the module:

npm install azure-iot-device-mqtt

After a successful installation, we can use the above dependency. Another important parameter we need is the connection string, which acts as a key for our simulated device.

For this, click on the device ID you just created. In this case, it was "simulatedDevice". Once you click on it, you will land on the device details page.

Fig 5: Screenshot — Device details

From this page, copy the "Connection string — primary key"; it is what allows the device to authenticate and communicate with the hub.
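
For reference, a device connection string follows the standard IoT Hub format shown below; the key here is just a placeholder, so use the value copied from the portal:

HostName=spektro.azure-devices.net;DeviceId=simulatedDevice;SharedAccessKey=<device-primary-key>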

Simulated Device — Node.js

As mentioned earlier, the dependency needs to be installed. Here we’ll see how the dependency is used to stream data. Our data, in this case, will be a JSON string containing the following parameters.

DeviceID and Data

"Device id" identifies the device, and "Data" is just a random value between 1 and 100.

'use strict';

// Azure IoT device SDK (MQTT transport)
var clientFromConnectionString = require('azure-iot-device-mqtt').clientFromConnectionString;
var Message = require('azure-iot-device').Message;

function azcall()
{
	// Device connection string copied from the portal ("Connection string — primary key")
	var connectionString = 'YOUR CONNECTION STRING';

	var client = clientFromConnectionString(connectionString);

	// Generic callback that logs the result of an SDK operation
	function printResultFor(op)
	{
		return function printResult(err, res)
		{
			if (err) console.log(op + ' error: ' + err.toString());
			if (res) console.log(op + ' status: ' + res.constructor.name);
		};
	}

	var connectCallback = function (err)
	{
		if (err)
		{
			console.log('Could not connect: ' + err);
		}
		else
		{
			console.log('Client connected');
			pubData();
			// Build and publish one JSON message to the IoT Hub
			function pubData()
			{
				var rand = Math.floor((Math.random() * 100) + 1);
				var data = JSON.stringify({ "device_id": "Simulated Device", "Data": rand });
				var message = new Message(data);
				console.log("Sending message: " + message.getData());
				client.sendEvent(message, printResultFor('send'));
			}
		}
	};
	client.open(connectCallback);
}

// Send a new message every 1.5 seconds
setInterval(azcall, 1500);

In the code above, the function azcall() is called every 1.5 seconds to stream messages. It's a simple use of the Azure IoT Hub device SDK, which uses the MQTT protocol internally. In pubData(), we publish a JSON string to the Azure IoT Hub. Note that each call opens a fresh client connection, which keeps the sample simple; a real device would typically open the connection once and reuse it.

The code is based on the sample provided by the Microsoft team.

https://github.com/Azure-Samples/azure-iot-samples-node

Before executing the code, there's one small step: we need a way to monitor the messages arriving at the Azure IoT Hub.

Microsoft provides a utility called Device Explorer Twin that lets us monitor the messages. It's a Windows C# app and can be downloaded from the GitHub link below.

https://github.com/Azure/azure-iot-sdk-csharp/releases/download/2018-3-13/SetupDeviceExplorer.msi

To use it, we need the connection string of the IoT Hub, not the device. So head back to the Azure portal and click on "Shared access policies".

Fig 6: Shared access policies.

Click on shared access policies to open the policies panel. Here we are interested in “iothubowner”. Click on it to open the policy details.

Fig 7: IoT Hub — Connection String primary key

Copy the “Connection string — primary key”.

Now, fire up your device explorer tool and paste the connection string in the IoT Hub Connection string input panel.

Fig 8: IoT Hub device utility

Then, click on update. After the update is successfully finished, click on Data.

Fig 9: Monitoring messages

The device you created in the portal earlier should appear in the Device ID drop-down. Select your device and click on Monitor.

Now, head back to the folder where your Node.js code was written and execute it from any IDE or terminal; in this case, we used Windows PowerShell.
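
Assuming the simulator code was saved as simulatedDevice.js (the filename is just an example), it can be started from PowerShell with:

node simulatedDevice.js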

Fig 10: Powershell window

At the same time, maximize the Device Explorer tool and you'll see the incoming messages, as shown in Fig 11.

Fig 11: Incoming message stream to the IoT Hub

If everything is done correctly, the messages should appear here, which means they are being received by the IoT Hub. Now that we can send messages to the IoT Hub, we can write a serverless API to route these messages to Cosmos DB using the MongoDB API. First, we'll set up Cosmos DB so we know exactly where to route the data.

Azure Cosmos DB

Click the link below for a detailed description of Cosmos DB; it's a very informative write-up.

https://docs.microsoft.com/en-us/azure/cosmos-db/introduction

Create a Cosmos DB account from the Azure portal as a new resource and select MongoDB in the API drop-down.

Fig 12: Cosmos DB

After the deployment is complete, head to the DB you created; for reference, the name of the DB account in our case is "spektrodb". Cosmos DB is document-based and unstructured, so we can create new documents at run-time. For now, we just need to create a database named "database_spektro".

Fig 13: MongoDB database create.

For now, we'll leave it this way and just copy the connection string, as it will be required in our Azure Function. Head to the DB you just created and click on Quick start.

Fig 14: Cosmos DB

Save the data shown there, primarily the connection string under the .NET section, somewhere locally. You may also want to have a look at the code there, as it's similar to what we'll use in the Azure Function.
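
For reference, a Cosmos DB MongoDB API connection string generally has the shape shown below; the account name matches the DB account and the key is a placeholder, so use the exact string from the Quick start page:

mongodb://spektrodb:<primary-key>@spektrodb.documents.azure.com:10255/?ssl=true&replicaSet=globaldb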

Azure Functions

An Azure Function is like a micro-service that runs whenever there is a trigger or a request. Here, we want to trigger an Azure Function whenever there is an incoming message in the Azure IoT Hub.

Before that, let's create the function app first. Head to the Azure portal and create a new function app; in this case, the name is "spektrofunc".

Fig 15: Azure function

Click on a new function.

Fig 16: Azure function -1

Next, click on "Custom function" and navigate to "IoT Hub (Event Hub)".

Fig 17: Azure function -2

After that, a configuration window will open where we configure our IoT Hub, "spektro". You can name the function anything; we named it "spectroIotTrigger". Under "Event Hub connection", click on "new".

Fig 18: Azure function -3

Here, click on IoT hub and the name of the IoT hub should appear.

Fig 19: Azure function -4

Click on select and finally on create.

Now, your function is ready for you to write code in C#.
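
Behind the scenes, the portal stores this trigger configuration in a function.json file. A rough sketch of that binding is shown below; the path and connection values are placeholders, as the portal generates the real ones for you:

{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "name": "myIoTHubMessage",
      "direction": "in",
      "path": "spektro",
      "connection": "IoTHubConnectionString",
      "consumerGroup": "$Default"
    }
  ]
}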

Fig 20: Azure function -5

Before writing our own code, let's see how the provided sample works. To test the sample code, click on "Test"; the "Logs" panel is where messages and errors are displayed.

Click on run and the test message should be displayed.

Fig 21: Azure function -6

We can also test with our Node.js device simulator. Just run the simulator in the background, and the messages will be shown in the log window.

Fig 22: Azure function -7

Now we can see the incoming messages in the Azure Function. The next task is to handle these incoming messages and publish the data to Cosmos DB.

Cosmos DB integration in Azure function

Before jumping into the Cosmos integration, we must include the dependencies. Adding dependencies in Azure Functions is a bit tricky. Click on "View files", just above the "Test" button, and add a new file named "project.json".

In this file, we manually declare the dependencies. For Cosmos DB with the MongoDB API, we need the MongoDB driver; for handling JSON, we need Newtonsoft.Json. You can also declare the .NET framework version.

Let’s declare them using the following JSON.

{
  "frameworks": {
    "net46":{
      "dependencies": {
        "Newtonsoft.Json": "10.0.3",
        "System.ServiceModel.Primitives":"4.4.0",
        "MongoDB.Bson": "2.4.0",
        "MongoDB.Driver": "2.4.0",
        "MongoDB.Driver.Core": "2.4.0"
      }
    }
   }
}

Now, click on Save and then browse to the run.csx file.

Include all the necessary “using statements”.

using System;
using System.Runtime.Serialization;
using System.ServiceModel.Description;
using MongoDB.Bson.IO;
using MongoDB.Bson;
using MongoDB;
using MongoDB.Driver;
using System.Security.Authentication;
using System.Text;
using Newtonsoft.Json;

Now, save and run the file. If there isn't any compilation error, the dependencies have been installed successfully.

Let's write the code for decoding and pushing the data. Our target is to create a new collection named after the device ID; the documents inside it will then contain the parameter and its value.

In this case, it will be displayed as shown: {"Data":"12"}

using System;
using System.Runtime.Serialization;
using System.ServiceModel.Description;
using System.Security.Authentication;
using System.Text;
using System.Threading.Tasks;
using MongoDB.Bson.IO;
using MongoDB.Bson;
using MongoDB;
using MongoDB.Driver;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public static void Run(string myIoTHubMessage, TraceWriter log)
{
    log.Info($"C# IoT Hub trigger function processed a message: {myIoTHubMessage}");

    // Parse the incoming JSON message from the IoT Hub
    string deviceId = "", data = "";
    var raw_obj = JObject.Parse(myIoTHubMessage);
    deviceId = (string)raw_obj["device_id"];
    data = (string)raw_obj["Data"];

    // Push the decoded values to Cosmos DB
    Cosmos cosmos = new Cosmos(deviceId, data);
    cosmos.pushData();
}

// Helper class that writes a document to Cosmos DB through the MongoDB API
public class Cosmos
{
    string deviceId = "", data = "";
    public BsonDocument document = new BsonDocument();

    public Cosmos(string deviceId, string data)
    {
        this.deviceId = deviceId;
        this.data = data;
    }

    public void pushData()
    {
        MainAsync().Wait();
    }

    public async Task MainAsync()
    {
        // Cosmos DB (MongoDB API) connection string copied from the Quick start page
        string connectionString = @"";

        MongoClientSettings settings = MongoClientSettings.FromUrl(new MongoUrl(connectionString));
        settings.SslSettings = new SslSettings() { EnabledSslProtocols = SslProtocols.Tls12 };
        var mongoClient = new MongoClient(settings);

        // Use the database created earlier and a collection named after the device ID
        IMongoDatabase db = mongoClient.GetDatabase("database_spektro");
        var icollection = db.GetCollection<BsonDocument>(deviceId);

        document.Add("Data", data);
        await icollection.InsertOneAsync(document);
    }
}

The code above is the Azure Function written in C#. It decodes the JSON received as a string message from the Azure IoT Hub, and a helper class handles the push to MongoDB.

We pass the parameters through a constructor and make an async call to push the data to Cosmos DB using the MongoDB API. (You can use any name for the DB, as long as it matches what you created.) There are two options for testing the code.

Recall the “Test” area where there was some sample message.

Fig 23: Azure function screenshot for testing

In the test area, you can replace the sample message with a JSON payload and click on Run. This executes the code with that sample input; check the logs for any errors.
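
For example, a payload shaped like the one our simulator sends (the values here are arbitrary) works as a test input:

{ "device_id": "Simulated Device", "Data": 42 }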

Now, let's check this with the simulated device code we wrote earlier in Node.js. Head to the local folder and start the Node.js code. Once it starts executing, go to your Cosmos DB and click on "Data Explorer".

Fig 24: Cosmos DB — Data explorer

Click on the database that was created, then on the collection name, which is basically the device ID (ours was "Simulated Device", as set in the Node.js code). After you click on Documents, the JSON documents appear; click on any one to check out the data.
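
As an optional check outside the portal, you could also query the collection from the mongo shell, assuming you connect with the Cosmos DB connection string saved earlier; the collection name matches the device_id sent by the simulator:

db.getCollection("Simulated Device").find().pretty()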

Conclusion

We have seen how to simulate an IoT device using Node.js and send messages to the IoT Hub, and how to route those messages into the respective collections in Cosmos DB based on the device ID in the streamed data. This has a lot of use cases.

Take, for example, telematics data from a car or any machine that must be sent to the DB at a regular interval, data from a remote weather station, or any other IoT scenario where data is involved.

Now that you know the flow of data, why not comment with a use case that fits this flow? The part we didn't cover is the use of rule engines, which can be programmed in the Azure Function. That's another topic for another article!

Curious about how Azure can impact your business? Get in touch with us today for a free consultation below on how the various Azure platforms can help you grow your business.