I work as a Solution Architect with a focus on the education and healthcare markets. I worked as a vendor for Microsoft as an Education Technology Advisor for six years.
You can contact me: email@example.com
Last week Microsoft finally updated Azure Stream Analytics with some great new features. In this blog post I explain the new features we now use in our projects.
Using custom code in your Azure Stream Analytics query
First, go to your Azure Stream Analytics job, select 'Functions' and add a new one:
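The custom code you add here is a JavaScript user-defined function (UDF). As a minimal sketch of what such a function can look like (the conversion logic below is just an illustration, not from my project):

```javascript
// Illustrative Azure Stream Analytics JavaScript UDF: converts a
// temperature reading from Celsius to Fahrenheit. The portal asks
// you to supply a single JavaScript function like this one.
function main(celsius) {
    if (celsius === null) {
        return null;
    }
    return celsius * 9 / 5 + 32;
}
```

If you give the function the alias `toFahrenheit`, you can then call it from your query as `SELECT udf.toFahrenheit(temperature) FROM input` (alias and column name assumed for this example).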
Visual Studio integration for Azure Stream Analytics
One challenge I have with creating queries in Azure Stream Analytics (yes, I am not the best with SQL...) is testing my queries against the data input. It was costing me a lot of time. Over the last weeks I was part of a private preview of the Azure Stream Analytics integration in Visual Studio, which is now in public preview (download here). With this add-in you can export (or import) your Azure Stream Analytics projects to Visual Studio and test everything locally! Currently only Visual Studio 2015 is supported.
1. First, go to your Server Explorer, navigate to Stream Analytics and hit the export button.
2. Your complete Stream Analytics project is then exported to Visual Studio 2015. Here you get some great functionality:
Syntax Highlighting in your queries
You can write your query and test it directly against test data, just hit F5!
You have, of course, direct integration with TFS / source control
Low-latency dashboards in Power BI
Another new update is low-latency support for the Power BI output of Azure Stream Analytics (Power BI streaming datasets). The latency is now much better than it was a few weeks ago, so you get faster insights into your data!
Other new functionalities:
Native support for geospatial functions: you can now define geographical areas in your Azure Stream Analytics job.
Job diagnostics logs: there is now an integration with Azure Monitor to better monitor your Stream Analytics jobs!
I use Particle Photon devices a lot in prototype projects. It's a small Wi-Fi board (like the ESP 8x series).
I used several libraries on the Particle Cloud to connect to Azure IoT Hub. These libraries were built on the Azure IoT SDK. But a few weeks ago Particle released an Azure IoT connector on the Particle Cloud. It's still a beta feature, but as a user you can send data from the Particle Photon to the Particle Cloud, and when the data has a specific name, the Particle Cloud will send it on to the Azure IoT Hub.
Several weeks ago a new feature was released in Power BI, called Data Alerts. With Data Alerts you can create alerts on data in your Power BI report. That data can of course flow from the Azure IoT Hub through Stream Analytics. Let me give an example scenario: in the following report I count the number of messages we receive from the Azure IoT Hub at a client. If the count exceeds the threshold of 2300, I want to receive an e-mail. And that is where Microsoft Flow comes in.
If you click on the three dots at the top right of the card in Power BI, you now get some options:
Now you see an alert button in the middle. When you click it, a new screen appears:
In this screen you can manage the new alert. I have set a threshold of 2300 to get an alert when the count rises above it. Hit 'Save and close' and the alert is saved.
Now the cool stuff begins: three weeks ago Microsoft Flow added support for Power BI alerts. When you navigate to Microsoft Flow, you can create a new flow and search for Power BI:
Click on 'Power BI – When a data driven alert is triggered' and you will see your alerts, including the alert we just made:
Then you can create an action for when the value exceeds the threshold of 2300. And that is where the power of Flow comes in. In this example I will receive an e-mail, but of course you can also update Microsoft CRM or other (MS) services from Microsoft Flow. See here for a complete list.
Hit the save button and wait for the mail 🙂 Below is the mail I received when the value exceeded 2300:
I really like this new functionality. You can now easily create (data) flows for IoT solutions without any programming!
This week I am attending the MVP Summit in Redmond. This morning I had a chat with SharePointAppie about the possibilities of Microsoft Flow and Cognitive Services. We came up with the following scenario: we want to get tweets with the Twitter hashtag #MVPSummit and put them in a SharePoint Online list. We want to see whether a tweet about the MVP Summit is positive or negative and write that score back to the SharePoint list.
Get items from Twitter with Microsoft Flow
Save them in a SharePoint list
Create a flow that watches for new list items in that list
Post them to the Microsoft Cognitive Services Text Analytics / Sentiment API via HTTP
The sentiment service returns a score between 0 (negative) and 1 (positive)
Update the item with the sentiment score
It is straightforward, but when we came to posting to and getting information back from Cognitive Services with Microsoft Flow, we ran into some issues.
Here are some of the issues we had:
First of all: posting to the service. We need authentication before we can post the tweet. Below are the settings of the HTTP post; just add your subscription key and you can connect.
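As a sketch of what that HTTP action sends (the v2.0 endpoint and the westus region are assumptions here; check your own Cognitive Services resource, and put your subscription key in the `Ocp-Apim-Subscription-Key` header):

```javascript
// Builds the HTTP request that Flow posts to the Text Analytics
// sentiment endpoint. The service expects a batch of documents,
// even when you only send a single tweet.
var ENDPOINT = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment";

function buildSentimentRequest(subscriptionKey, tweetText) {
    return {
        url: ENDPOINT,
        method: "POST",
        headers: {
            "Ocp-Apim-Subscription-Key": subscriptionKey,
            "Content-Type": "application/json"
        },
        body: JSON.stringify({
            documents: [{ language: "en", id: "1", text: tweetText }]
        })
    };
}
```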
The second part was more 'complex' and we did a lot of testing/debugging; the solution, of course, was simple...
We needed to get the score out of the answer, instead of storing the whole document. After a lot of testing we created the following settings in Microsoft Flow:
After putting the above into the Microsoft Flow property, Flow updates the 'code' to the following (last item):
After that we now get the score of a tweet from the Cognitive Services directly back into the SharePoint list. There is one catch: you need to add a space before the "documents.score" element in the screenshot above, otherwise it will not work...
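In plain code, the extraction Flow performs here is equivalent to the following (the sample response below is illustrative; the real response puts the value at `documents[0].score`):

```javascript
// The sentiment endpoint answers with a JSON document; we only
// want the score, not the whole response body.
var sampleResponse = JSON.stringify({
    documents: [{ score: 0.96, id: "1" }],
    errors: []
});

function extractScore(responseBody) {
    var parsed = JSON.parse(responseBody);
    return parsed.documents[0].score;
}
```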
Below are the scores in the list; the closer to 1, the more positive the tweet: