Crossing the boundaries: Integrating Azure Event Hubs and Stream Analytics programmatically

IIoT, the Industrial Internet of Things, is an area concerned with interconnected systems and devices where data has immense significance. It is not just about connecting things to the internet, but about empowering them with computing applications and technologies. I will not go into much detail on this, but you can find more information here.

To give a glimpse of the above, I would like to show practically how we can get data/events from interconnected devices and process them using Azure Event Hubs and Stream Analytics. These are powerful, feature-rich services in the analytics domain: Event Hubs ingests events from any system, and a Stream Analytics job processes them for further analysis.

In this blog post we will see, hands-on, how to send live events to an event hub and process them using Stream Analytics.

What is covered?

  • A quick overview of the solution: design, approach, etc.
  • What Azure Event Hubs is and how to configure it
  • What Azure Stream Analytics is and how to configure it
  • A sample Java application that sends events to an event hub; consider this application the device which sends the data
  • An Azure Stream Analytics job that receives the events for further processing
  • How to use Service Bus Explorer to view the event data

What is not covered?

  • Power BI setup

Overview:

There are certain use cases where events need to be sent onward for further action or processing. For example, factories contain many pieces of equipment and sensors whose data needs to be recorded as events; time-bound actions must then be performed on those events to extract meaningful data and drive business decisions. Let's take a sample application.

Here is the reference architecture for this simple application.


Workflow using Azure event hub and stream analytics

The architecture above depicts an IoT application which sends events/alerts to an Event Hubs namespace in which an event hub has been created; this acts as the ingestion point for the real-time events. There is also a blob storage account containing reference data, which needs to be processed alongside the real-time data to produce meaningful, actionable results. Azure Stream Analytics streams the events from these sources and processes them in real time. Once done, the output is sent to Power BI for visualization. That's it!

This is just a sample application where latency isn't a concern, but the same pattern poses real challenges with high-velocity, high-volume data.

For this blog post, I have created a sample Spring Boot Java application/microservice which sends data to the event hub; an ASA job then streams that data, processes it, and sends the output to Power BI.

Let's now see how to create the Azure event hub and the ASA job.

Initial setup:

  • Create an Event Hubs namespace and an event hub
  • Create an Azure Stream Analytics job and a storage account
  • Create a Spring Boot Java app
  • Download and install Service Bus Explorer

Azure Event Hubs:

Create the Event Hubs namespace using the Azure portal, the Azure CLI, or ARM templates. In the Azure portal, the namespace and its event hubs look like this.

Once the event hub is created, it looks like the screenshot below.
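If you prefer the CLI over the portal, the namespace and event hub can be created with a few commands. A minimal sketch follows; the resource group, location, and resource names are placeholders to replace with your own.

```shell
# Create a resource group to hold the resources (names are placeholders)
az group create --name iiot-demo-rg --location eastus

# Create the Event Hubs namespace
az eventhubs namespace create \
  --name iiot-demo-ns \
  --resource-group iiot-demo-rg \
  --location eastus \
  --sku Standard

# Create an event hub inside the namespace
az eventhubs eventhub create \
  --name patient-events \
  --namespace-name iiot-demo-ns \
  --resource-group iiot-demo-rg \
  --partition-count 2
```

The connection string needed later by the Java app can then be read from the namespace's shared access policies in the portal.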

Azure stream analytics:

OK, good. Now let's create the Azure Stream Analytics job. In the Azure portal, the ASA job looks like the screenshot below.

In the Inputs section of the ASA job, configure an input alias that points at the event hub, as per the architecture above. The Inputs section looks like the screenshot below.

In the Query section of the ASA job, write a query that fetches the data from the event hub input; ASA processes it and sends the output to Power BI. It looks like the screenshot below.
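As a sketch of such a query: assuming the input alias is named patient-input and the Power BI output alias patient-output (both hypothetical names), and column names matching the JSON payload sent later in this post, a minimal ASA query could look like this. ASA queries use a SQL-like language:

```sql
-- Read events from the event hub input and push them to the Power BI output.
-- 'patient-input' and 'patient-output' are the alias names configured in the
-- Inputs and Outputs sections of the job.
SELECT
    id,
    name,
    content,
    System.Timestamp() AS EventTime
INTO
    [patient-output]
FROM
    [patient-input]
```

Joins against the blob-storage reference data mentioned in the architecture would be added to this query with a regular JOIN on a reference-data input alias.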

Sample Java code:

Good. Now you have the Event Hubs namespace and event hub created, along with the ASA job and its query. Next, create the sample Java app and send some events to the event hub. Below is the sample controller code.

package com.eduwebmonster.patientsystem;

import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

import com.azure.messaging.eventhubs.EventData;
import com.azure.messaging.eventhubs.EventDataBatch;
import com.azure.messaging.eventhubs.EventHubClientBuilder;
import com.azure.messaging.eventhubs.EventHubProducerClient;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

@Controller
public class PatientController {

    // Replace with the connection string of your Event Hubs namespace
    // and the name of your event hub.
    private static final String connectionString = "<your connection string here>";
    private static final String eventHubName = "<your event hub name>";

    private final ObjectMapper objectMapper = new ObjectMapper();

    @GetMapping("/patient")
    public String patientForm(Model model) {
        model.addAttribute("patient", new PatientDataModel());
        return "patient";
    }

    @RequestMapping(value = "/patient", method = RequestMethod.POST,
            params = "action=Send data to event hub")
    public String patientSubmit(@ModelAttribute PatientDataModel patient, Model model)
            throws JsonProcessingException {

        // Serialize the submitted form data to JSON before sending it as an event.
        String patientJson = objectMapper.writeValueAsString(patient);
        model.addAttribute("patient", patient);
        model.addAttribute("patientjson", patientJson);

        // Create a producer, add the event to a batch, and send it.
        EventHubProducerClient producer = new EventHubClientBuilder()
                .connectionString(connectionString, eventHubName)
                .buildProducerClient();
        try {
            EventDataBatch batch = producer.createBatch();
            batch.tryAdd(new EventData(patientJson));
            producer.send(batch);
        } finally {
            producer.close();
        }

        return "result";
    }
}

The code above uses the Azure Event Hubs SDK to connect to the Event Hubs namespace via the connection string and to send sample events. For the UI, I am using Thymeleaf as the template engine.
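For completeness, the controller assumes the Event Hubs SDK is on the classpath; in a Maven project that is a single dependency in pom.xml. The version below is only an example, so use the latest 5.x release:

```xml
<dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-messaging-eventhubs</artifactId>
    <version>5.15.0</version>
</dependency>
```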

Once the controller code above runs, it sends the data to the event hub in JSON format. It looks like the following.

 {"id":222,"content":"this is sample event. sending to event hub ","name":"john bose"}
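The controller binds the submitted form to a PatientDataModel, which is not shown in the post. A minimal sketch of such a model could look like the following; the field names match the JSON payload above, and the toJson() helper is purely illustrative, since the real application serializes with Jackson's ObjectMapper.

```java
// Minimal sketch of the form-backing model assumed by the controller.
// The real application uses Jackson for serialization; toJson() below is
// only illustrative and mirrors the payload shown above.
public class PatientDataModel {

    private int id;
    private String content;
    private String name;

    public PatientDataModel() { }

    public int getId() { return id; }
    public void setId(int id) { this.id = id; }

    public String getContent() { return content; }
    public void setContent(String content) { this.content = content; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    // Hand-rolled JSON matching the field order of the sample payload.
    public String toJson() {
        return String.format("{\"id\":%d,\"content\":\"%s\",\"name\":\"%s\"}",
                id, content, name);
    }
}
```

Spring's @ModelAttribute binding only needs the no-arg constructor and the getters/setters shown here.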

Fine. So far we have set up the Azure event hub and the Stream Analytics job and sent a sample event using the Java code. Let's have a look at the event using Service Bus Explorer. You can download it here. Once downloaded, just run ServiceBusExplorer.exe, select File -> Connect, and you will see the screen below.

Now enter the connection string of the Event Hubs namespace and click OK. Once connected, create a listener by right-clicking the $Default consumer group. You can then see the events we sent from the Java code, as shown below.

That's it. The screen above confirms that our workflow runs end to end and that the events are visible. Good job.

This brings us to the end of this blog post. I hope it helps you get started with Azure Event Hubs and Azure Stream Analytics, test the waters, and move on to more complex use cases.

What next?

There is plenty of documentation to get started:

Conclusion:

Complex real-time, time-series data poses many challenges: the volume is huge, and processing it in a timely manner is difficult. However, cloud-based services like Azure Event Hubs and Stream Analytics make it much easier to work with such high-velocity data and extract value from it.

In this blog post, we used a sample application to send events and processed them using Event Hubs and a Stream Analytics SQL query.

These are powerful, off-the-shelf services that reduce the complexity of processing event data without requiring a large investment in infrastructure.
