Blog: Update for Microsoft Azure Stream Analytics – IoT

Last week Microsoft finally updated Microsoft Azure Stream Analytics with some great new features. In this blog post I explain a few of the new features I now use in our projects.

Using custom code in your Microsoft Stream Analytics Query

In our projects we sometimes have challenges with the data we receive in Stream Analytics from the Microsoft Azure IoT Hub (and ultimately from the sensor, of course…). For example, we sometimes work with the Sigfox IoT network. With this network a device can send small data packages to the Azure IoT Hub, and most of the time the payload arrives in a hex format. Normally you want to transform this hex data to an integer. (Yes, you can do it afterwards, but I want to do this directly in the Stream Analytics query.) One of the features just launched is support for custom code with JavaScript user-defined functions (UDFs). This is great, because I can create a function in Stream Analytics and transform my hex value to an int right in the query. Here is how you do this:

1. First of all, go to your Stream Analytics job, select ‘Functions’ and add a new one:
azuresafunction
Add a function in Azure Stream Analytics

2. After that you will get a window to create your JavaScript. In this example you see a hex-to-int function (a sketch of such a function is shown after these steps):

azuresaudf
Creating a JavaScript UDF in Stream Analytics – Hex2Int sample

3. Now you can use your JavaScript function in your Stream Analytics query. Just type udf. followed by your function name, and you can transform your hex value to an integer.

azuresausingfunction
Using the JavaScript UDF in your Stream Analytics query
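For reference, here is a minimal sketch of what such a hex-to-int UDF could look like. This is my own example (assuming the standard main entry point of a Stream Analytics JavaScript UDF and a function alias of hexToInt), not the exact code from the screenshot:

// Converts a hex string like "1A2B" to an integer.
// Stream Analytics calls main() with the value you pass in from the query.
function main(hexValue) {
    // Return null for missing input so the query can handle it gracefully
    if (hexValue === null || hexValue === undefined) {
        return null;
    }
    // parseInt with base 16 turns the hex string into a number
    return parseInt(hexValue, 16);
}

In the query you then call it with the udf. prefix, for example udf.hexToInt(yourHexField), where yourHexField is just a placeholder for the field that contains the hex payload.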

Visual Studio integration for Azure Stream Analytics

One challenge I have with creating queries in Azure Stream Analytics (yes, I am not the best with SQL…) is testing my queries against the input data. It was costing me a lot of time. Over the last weeks I was part of a private preview of the integration of Azure Stream Analytics in Visual Studio, which is now in public preview (download here). With this add-in you can export (or import) your Stream Analytics projects to Visual Studio and test everything locally! Currently only Visual Studio 2015 is supported.

1. First of all, go to your Server Explorer, navigate to Stream Analytics and hit the export button.
vssaintegration
Exporting a Stream Analytics project in Visual Studio

2. Then your complete Stream Analytics project is exported to Visual Studio 2015. Here you get some great functionality.

vssaintegration2
Exported Azure Stream Analytics project in Visual Studio

Functionalities:

  • Syntax highlighting in your queries
  • You can write your query and test it directly against test data! Just hit F5! (A sample local test input is shown below.)
  • You can write your custom JavaScript code
  • You have, of course, direct integration with TFS / source control
vssaintegration3
Using local data to test your new queries
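As an illustration, local test data could be a simple JSON file like the hypothetical sample below. The field names are just an example; the payload field holds the raw hex value that the UDF from the previous section would convert:

[
  { "deviceId": "sigfox-001", "payload": "1A2B" },
  { "deviceId": "sigfox-002", "payload": "00FF" }
]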

Low-latency dashboards in Power BI

Another new update is low-latency support for the Microsoft Power BI output of Azure Stream Analytics (Power BI streaming datasets). The latency is now much better than it was a few weeks ago, so you get faster insights into your data!

powerbilowlatency
Low latency in Power BI as output in Stream Analytics

Others

Other new functionalities:

  • Native support for geospatial functions; you can now define geographical areas in your Azure Stream Analytics job.
  • Job diagnostics logs; there is now an integration with Azure monitoring to better monitor your Stream Analytics jobs!

How to use the Particle Photon with the Microsoft Azure IoT Hub

I use Particle Photon devices a lot in prototype projects. It’s a small WiFi board (like the ESP 8x series).

photon_vector2_600

I used to use several libraries from the Particle Cloud to talk to the Azure IoT Hub. These libraries were built on the Azure IoT SDK. A few weeks ago, however, Particle released an Azure IoT Hub integration in the Particle Cloud. It’s still a beta feature, but as a user you can send data from the Particle Photon to the Particle Cloud, and when the event has the specific name you configured, the Particle Cloud will forward the data to the Azure IoT Hub.

Take the following steps to get this working.

1. Log in to the Particle Cloud

2. Click on the integrations button (the last button) and select ‘New Integration’

particlecloudazureiothub.png

3. Select the Azure IoT Hub

selectazureiothub

4. Configure the settings of your hub

saveandonctinuazureiothub.png

5. Save and continue, and the Particle Cloud is connected to the Microsoft Azure IoT Hub. Now you can write the code on the Particle Photon to send data to the Azure IoT Hub.

Below is an example you can add to your Particle Photon device. Make sure the event name you publish matches the event name you configured in the integration.

// setup() runs once at startup; nothing is needed here for this example
void setup() {
}

void loop() {
  // Get some data (a fixed value of 10 in this example)
  String data = String(10);
  // Trigger the integration; the event name ("datafield" here) must match
  // the event name you configured for the Azure IoT Hub integration
  Particle.publish("datafield", data, PRIVATE);
  // Wait 60 seconds before publishing again
  delay(60000);
}

Run the code and the messages will flow through the Particle Cloud:

dataflow

and then on to the Microsoft Azure IoT Hub. How simple is that!

SmartBuildings factsheet

The last few months I was busy with a SmartBuilding solution for some of our education clients, to save square meters and to give students an app to find free rooms. Later I will create several posts about the technical solution, but I have already created a white paper (sorry, it’s in Dutch) about this topic.

You can download it here from our website.