Blog: Update for Microsoft Azure Stream Analytics – IoT

Last week Microsoft updated Azure Stream Analytics with some great new features. In this blog post I explain the new features I am already using in our projects.

Using custom code in your Microsoft Stream Analytics Query

In our projects we sometimes face challenges with the data we receive in Stream Analytics from the Azure IoT Hub (ultimately from the sensor, of course…). For example, we often work with the Sigfox IoT network. Over this network a device can send small data packages to the Azure IoT Hub, usually in a HEX format. Normally you want to transform this HEX data into an integer. (Yes, you could do that afterwards, but I want to do it directly in the Stream Analytics query.) One of the newly launched features is support for custom code with JavaScript user-defined functions (UDFs). This is great, because I can now create a function in Stream Analytics and transform my HEX value to an INT right in the query. Here is how you do this:

  1. First of all, go to your Stream Analytics job, select ‘Functions’ and add a new one:
Add a function in Azure Stream Analytics

2. After that a window opens in which you can write your JavaScript. In this example you see a hex-to-int function:

Create UDF Javascript in Stream Analytics – Hex2Int sample

3. Now you can use your JavaScript function in your Stream Analytics query. Just type udf. followed by your function name, and you can transform your HEX value into an integer.

Using UDF Javascript in your query of Stream Analytics
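To give an impression of such a UDF, below is a minimal sketch of the hex-to-int function as you could write it in the function editor (the alias under which you register it and the field names further down are my own assumptions, not taken from the screenshots):

```javascript
// Minimal sketch of a Stream Analytics JavaScript UDF that converts
// a hexadecimal string (e.g. "1A") into an integer. In the Stream
// Analytics editor the body is written against a single entry function.
function main(hexValue) {
    // parseInt with radix 16 interprets the input string as hexadecimal
    return parseInt(hexValue, 16);
}
```

If you register this function under the alias hex2Int, the query from step 3 could look like `SELECT udf.hex2Int(rawData) AS sensorValue FROM [input]` (field and input names assumed).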


Visual Studio integration for Azure Stream Analytics

One challenge I have when creating queries in Azure Stream Analytics (yes, I am not the best at SQL…) is testing my queries against the data input. It was costing me a lot of time. Over the last few weeks I was part of a private preview of the Azure Stream Analytics integration in Visual Studio. That integration is now in public preview (download here). With this add-in you can export (or import) your Stream Analytics projects to Visual Studio and test everything locally! Currently only Visual Studio 2015 is supported.

  1. First of all, go to Server Explorer, navigate to Stream Analytics and hit the export button.
Export in Visual Studio a Stream Analytics project

2. Then your complete Stream Analytics project is exported to Visual Studio 2015, where you get some great functionality:

Exported Azure Stream Analytics project in Visual Studio


  • Syntax highlighting in your queries
  • You can write your query and test it directly against test data! Just hit F5!
  • You can write your custom JavaScript code
  • And of course you have direct integration with TFS / source control
Using local data to test your new queries

Low-latency dashboards in Power BI

Another new update is low-latency support for the Power BI output of Azure Stream Analytics (Power BI streaming datasets). The latency is now much better than it was a few weeks ago, so you get faster insight into your data!

Low latency in Power BI as output in Stream Analytics


Other new functionalities:

  • Native support for geospatial functions: you can now define geographical areas in your Azure Stream Analytics job.
  • Job diagnostics logs: there is now integration with Azure Monitor to better monitor your Stream Analytics jobs!


Power BI app update with Q&A on your dataset!

Today I received another update of the Power BI app on my iPhone. I was really happy with this item in the release notes:

  • Q&A support / ask your data in Preview

When you open a report in Power BI, you now see a chat button at the bottom (beside the favorite button):


When you click on the button, you get a Q&A window:


Here you can ask your question. The great thing is that Power BI already suggests sample questions about the dataset. Below is an example question I asked the data; I get back a report with a graph!


New Windows 10 IoT Core insider build

Today Microsoft released a new build of Windows 10 IoT Core. A great new feature is that Cortana is now enabled! Here is the full list of new functionality and fixes:

New in this build:

  • Cortana feature has been enabled.
  • The Dragonboard BSP in the provided FFU has been updated to the new build.
  • The Windows Device Portal (WDP/Web Management) has been extended to add a quick run portal for IoT Samples.
  • A fix was made to the Class Extensions for Hardware Notification (hwnclx) and USB Function (usbfnclx) packages so that they would be included in the default IoT Core images.
  • Changes were made to IoTShell to enable waiting for PPKG provisioned package installation to complete.
  • Updates were made to the GPIO Interrupt Buffer API.
  • Changes were made to Applyupdate.exe to add the blockrebooton/blockrebootoff flags.
  • A fix was made to the power state API to ensure the wakeup timer is cancelled upon exiting from connected standby.
  • Universal Write Filter (UWF) has been added as an option to the Windows Imaging and Configuration Designer (ICD).
  • The BluetoothLE stack has been updated to address the issues seen when calling GattDeviceService.GetCharacteristics.
  • Issues with NanoRDP connecting have been addressed.

Known Issues:

  • The package version for some inbox applications may not match the installed version.
  • Store applications are not being serviced when in use or set as the default application.
  • NanoRDP does not render correctly on some platforms.
  • When multiple audio devices are present on the board audio routing changes may not persist across boots.
  • The MinnowBoard Max firmware 0.93 has a known issue which can lead to network connectivity failure.


How to use the Particle Photon with the Microsoft Azure IoT Hub

I use Particle Photon devices a lot in prototype projects. It’s a small WiFi board (like the ESP 8x series).


I used several libraries on the Particle cloud to connect to the Azure IoT Hub. These libraries were built on the Azure IoT SDK. But a few weeks ago Particle released an Azure IoT Hub integration in the Particle Cloud. It’s still a beta feature, but as a user you can send data from the Particle Photon to the Particle Cloud, and when the data has a specific name, the Particle Cloud will send it on to the Azure IoT Hub.

Take the following steps to get this working.

1. Login to the Particle Cloud

2. Click on the integrations button (the last button) and select ‘New Integration’


3. Select the Azure IoT Hub


4. Configure the settings of your hub


5. Save and continue, and the Particle Cloud is connected to the Microsoft Azure IoT Hub. Now you can write the code on the Particle Photon to send data to the Azure IoT Hub.

Below is an example you can add to your Particle Photon device.

void loop() {
  // Get some data
  String data = String(10);
  // Trigger the integration by publishing under the configured event name
  Particle.publish("datafield", data, PRIVATE);
  // Wait 60 seconds before publishing again
  delay(60000);
}
Run the code and the messages will flow from the Particle Cloud:


and then on to the Microsoft Azure IoT Hub! How simple is that!

Smartdata flow with IoT Hub, Stream Analytics, Power BI data alerts and Microsoft Flow

Several weeks ago a new feature was released in Power BI, called Data Alerts. With Data Alerts you can create alerts on data in your Power BI report. That data can of course flow in from the Microsoft IoT Hub via Stream Analytics. I will give you an example scenario: in the following report I count the number of messages we receive from the Microsoft IoT Hub at a client. If that number exceeds a threshold of 2300, I need to receive an e-mail. And that is where Microsoft Flow comes in.

If you click on the three dots at the top right of the card in Power BI, you will now get some options:


Now you see an alert button in the middle. When you click on the alert button, you get a new screen:


In this screen you can manage the new alert. I have set a threshold of 2300; above that value I get an alert. Hit ‘Save and Close’ and the alert is saved.

Now the cool stuff begins: three weeks ago Microsoft Flow added support for Power BI data alerts. When you navigate to Microsoft Flow, you can create a new flow and search for Power BI:


Click on ‘Power BI – When a data driven alert is triggered’ and you will see your alerts, including the one we just made:


Then you can create an action for when the value exceeds the threshold of 2300. And here comes the power of Flow. In this example I receive an e-mail, but of course you can also update Microsoft CRM or other (Microsoft) services from Microsoft Flow. See here for a complete list.


Hit the save button and wait for the mail 🙂 Below is the mail I received when the value exceeded 2300:


I really like this new functionality. Now you can easily create (data) flows with IoT solutions without any programming!



Using the Microsoft Cognitive Services with Microsoft Flow and SharePoint Online Lists

This week I am attending the MVP Summit in Redmond. This morning I had a chat with SharePointAppie about the possibilities of Microsoft Flow and the Cognitive Services. We came up with the following scenario: we want to collect tweets with the hashtag #MVPSummit and put them in a SharePoint Online list. We want to see whether a tweet about the MVP Summit is positive or negative and write that score back to the SharePoint list.

The solution in broad strokes:

  • Get items from Twitter with Microsoft Flow
  • Save them in a SharePoint list
  • Create a flow that triggers on new list items in that list
  • Post the text to the Microsoft Cognitive Services Text Analytics / Sentiment API via HTTP
  • The sentiment service returns a score between 0 (negative) and 1 (positive)
  • Update the item with the sentiment score

It sounds straightforward, but when it came to posting to and getting information back from the Cognitive Services with Microsoft Flow, we ran into some issues.

Here are the issues we had:

First of all: posting to the service. We need authentication before we can post the tweet. Below are the settings of the HTTP POST action. Just add your subscription key and you can connect.
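As a sketch of what that HTTP action sends, this is roughly the request shape for the Text Analytics v2.0 sentiment endpoint (the region in the URL is an example, and the helper function name and key value are my own illustration, not something Flow generates):

```javascript
// Sketch of the HTTP request posted to the Text Analytics
// sentiment endpoint (v2.0). Region/endpoint and key are placeholders.
const endpoint =
  "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment";

function buildSentimentRequest(tweetText, subscriptionKey) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Authentication happens via this subscription-key header
      "Ocp-Apim-Subscription-Key": subscriptionKey
    },
    // The service expects a batch of documents, each with its own id
    body: JSON.stringify({
      documents: [{ language: "en", id: "1", text: tweetText }]
    })
  };
}
```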



The second part was more ‘complex’ and required a lot of testing and debugging; the solution, of course, was simple…

We need to get the score out of the answer, instead of storing the whole document. After a lot of testing we created the following settings in Microsoft Flow:


After putting the above into the Microsoft Flow property, Flow rewrites the ‘code’ to the following (last item):


After that we get the sentiment score of a tweet from the Cognitive Services directly back into the SharePoint list. There is one catch: you need to add a space before the “documents.score” element in the above screenshot, otherwise it will not work…
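To make clear what that expression actually picks out, here is a small sketch of the v2.0 sentiment response and the score extraction (the sample score value is illustrative, and extractScore is my own helper name, not a Flow function):

```javascript
// Sketch of the sentiment response body and how the score is
// extracted from it - the same path the Flow expression references.
const sampleResponse =
  '{"documents":[{"score":0.92,"id":"1"}],"errors":[]}';

function extractScore(responseBody) {
  const parsed = JSON.parse(responseBody);
  // The score of the first (and only) document in the batch
  return parsed.documents[0].score;
}
```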

Below are the scores in the list; the closer to 1, the more positive the tweet: