After an Event Hub has received an event, I want to design a "consumer" that will pass the event object through a pipeline of custom processing functions ...the last module would write the post-processed event to an Azure Table or an Azure SQL Database. I want these pipelines to be easily extended, composable, and performant ...kind of like a PowerShell pipeline for Event Hub events.
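To make the idea concrete, here is a rough sketch of the kind of composition I have in mind (Python used purely for illustration; the stage names, the event shape, and the writer stub are all made up, and the Event Hub receive loop that would feed events in is omitted):

import json
from functools import reduce

def pipeline(*stages):
    """Compose single-argument stage functions, applied left to right."""
    def run(event):
        return reduce(lambda acc, stage: stage(acc), stages, event)
    return run

# Hypothetical stages: each takes an event dict and returns a (possibly modified) dict.
def parse_body(event):
    event["body"] = json.loads(event["raw"])
    return event

def add_metadata(event):
    event["pipeline"] = "demo"
    return event

def write_event(event):
    # Final stage: stand-in for a module that persists to an Azure Table or Azure SQL Database.
    print("persisting:", event["body"])
    return event

process = pipeline(parse_body, add_metadata, write_event)
process({"raw": '{"deviceId": 42, "temp": 21.5}'})

Extending or recomposing a pipeline would then just be a matter of changing the list of functions passed to pipeline(...); the receiver (e.g. an EventProcessorHost in .NET) would call process once per event.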
Azure Data Factory does part of this, but as far as I've been able to determine, Data Factory doesn't have native support for reading events from an Event Hub ...you would need to write a custom Event Hub receiver/consumer that first writes the events to some sort of Azure Storage (which seems wasteful when the events have already been persisted in the Event Hub). Another option would be to write a custom DotNetActivity for Data Factory, but I don't want to do that when Data Factory should have its own native support for reading from an Event Hub.
Stream Analytics would be another option. The service understands Event Hub data, but it seems like overkill, and I haven't been able to determine whether it supports anything similar to custom DotNetActivities.
Any ideas for an Event Hub post-processing pipeline that is lightweight, easily extended, easily composable, and performant?
Best regards,
Michael Herman (Toronto)
Xpert Search Agents for Microsoft web sites: http://www.parallelspace.net/MicrodeX