Azure Event Hubs

Event Hubs is a modern big data streaming platform and event ingestion service that can seamlessly integrate with other Azure and Microsoft services. The service can process millions of events per second with low latency. The data sent to an event hub (Event Hubs instance) can be transformed and stored by using any real-time analytics providers or batching or storage adapters.

azure-event-hubs

Deployments: 26 · Made by Massdriver

Use Cases

Data is valuable only when there’s an easy way to process it and get timely insights from your data sources. Event Hubs provides a distributed stream-processing platform with low latency and seamless integration with data and analytics services inside and outside Azure, so you can build your complete big data pipeline.

Real-time and batch processing

Ingest, buffer, store, and process your stream in real time to get actionable insights. Event Hubs uses a partitioned consumer model, enabling multiple applications to process the stream concurrently and letting you control the speed of processing.
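
The partition guarantee behind this model can be sketched in a few lines. This is illustrative only: Event Hubs uses its own internal hash function, but the contract is the same, so equal partition keys always land on the same partition and preserve ordering there.

```python
import hashlib

def assign_partition(partition_key: str, partition_count: int) -> int:
    """Illustrative stand-in for Event Hubs' internal key hashing:
    equal keys always map to the same partition."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % partition_count

# Events sharing a key (e.g. a device ID) are ordered within one partition.
p1 = assign_partition("device-42", 4)
p2 = assign_partition("device-42", 4)
print(p1 == p2)  # True: same key, same partition
```

Because each consumer in a consumer group reads its own partitions independently, multiple applications can process the same stream concurrently at their own pace.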

Capture event data

Capture your data in near-real time in Azure Data Lake Storage for long-term retention or micro-batch processing. You can achieve this on the same stream you use for deriving real-time analytics. Setting up capture of event data is fast, there are no administrative costs to run it, and it scales automatically with Event Hubs throughput units. Event Hubs lets you focus on data processing rather than on data capture.
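
Capture writes are triggered by whichever limit is reached first: a size window or a time window (the `capture.capture_buildup` and `capture.capture_interval` variables below). A minimal sketch of that trigger logic, using a hypothetical helper:

```python
def should_capture(buffered_bytes: int, seconds_since_last: int,
                   size_window_mib: int = 300, time_window_s: int = 300) -> bool:
    """Hypothetical helper: a capture operation fires when either the
    size window (10-500 MiB) or the time window (60-900 s) is reached,
    whichever comes first."""
    return (buffered_bytes >= size_window_mib * 1024 * 1024
            or seconds_since_last >= time_window_s)

# A burst of data triggers capture by size before the interval elapses:
print(should_capture(400 * 1024 * 1024, 10))  # True
# A quiet stream still gets captured once the interval elapses:
print(should_capture(1024, 900))              # True
```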

Event Hubs for Apache Kafka

Event Hubs for Apache Kafka enables Apache Kafka (1.0 and later) clients and applications to talk to Event Hubs. You don’t need to set up, configure, and manage your own Kafka and ZooKeeper clusters, or use a Kafka-as-a-Service offering that isn’t native to Azure.
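
In practice, pointing an existing Kafka client at Event Hubs is mostly configuration. A sketch of the typical settings, with placeholder values for the namespace and connection string (substitute your own):

```python
# Placeholders -- substitute your own namespace and connection string,
# and keep the connection string secret.
EVENTHUBS_NAMESPACE = "my-namespace"
CONNECTION_STRING = "Endpoint=sb://my-namespace.servicebus.windows.net/;..."

# Typical Kafka client settings for the Event Hubs Kafka endpoint (port 9093):
kafka_config = {
    "bootstrap.servers": f"{EVENTHUBS_NAMESPACE}.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "$ConnectionString",  # literal string, not a variable
    "sasl.password": CONNECTION_STRING,
}
```

The same dictionary can be passed to a Kafka client library (e.g. confluent-kafka's `Producer`); no topic or application code changes are needed beyond the endpoint and auth settings.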

Configuration Presets

Development

Production

Design

Our bundle includes the following design choices to help simplify your deployment:

1 to 1 ratio of Event Hubs to Event Hub Namespaces

To simplify deployment, each Event Hub is created in its own Event Hubs namespace. This configuration incurs no additional cost.

Best Practices

The bundle includes a number of best practices that require no additional work on your part.

Scalable

With Event Hubs, you can start with data streams in megabytes, and grow to gigabytes or terabytes. The Auto-inflate feature is one of the many options available to scale the number of throughput units or processing units to meet your usage needs.
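
Sizing follows from Event Hubs' documented quotas: one throughput unit covers up to 1 MB/s or 1,000 events/s of ingress, whichever is reached first. A hypothetical helper for estimating how many units a workload needs, up to the 40-unit ceiling used by this bundle:

```python
import math

def required_throughput_units(ingress_mb_per_s: float,
                              ingress_events_per_s: float,
                              max_tu: int = 40) -> int:
    """One throughput unit covers up to 1 MB/s OR 1,000 events/s of ingress;
    auto-inflate can scale the allocation up to max_tu."""
    by_bytes = math.ceil(ingress_mb_per_s / 1.0)
    by_events = math.ceil(ingress_events_per_s / 1000.0)
    return min(max(by_bytes, by_events, 1), max_tu)

# 2.5 MB/s of small events: the byte rate dominates, so 3 units suffice.
print(required_throughput_units(2.5, 500))   # 3
# 5,000 tiny events/s: the event rate dominates, so 5 units are needed.
print(required_throughput_units(1, 5000))    # 5
```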

TLS 1.2 Encryption

Communication between a client application and an Azure Event Hubs namespace is encrypted using Transport Layer Security (TLS). TLS is a standard cryptographic protocol that ensures privacy and data integrity between clients and services over the Internet. We’ve enforced TLS 1.2 as the minimum version for all communication between Event Hubs and client applications.
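
On the client side, the counterpart to this enforcement is refusing anything older than TLS 1.2 when opening connections. With Python's standard library, for example:

```python
import ssl

# Build a default client context, then pin the minimum protocol version
# so handshakes with anything older than TLS 1.2 are rejected.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

print(context.minimum_version.name)  # TLSv1_2
```

This context can then be handed to any socket or HTTP client that accepts an `ssl.SSLContext`, matching the minimum version the namespace enforces.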

Security

SAS authentication disabled

Shared Access Signature (SAS) authentication is disabled for additional security. SAS authentication is a legacy authentication mechanism that is not recommended for secure applications. We recommend using Azure Active Directory (AAD) authentication instead. Learn more about Authentication and authorization in Azure Event Hubs.

Observability

Massdriver provides you with visibility into the health of your systems. By default, Event Hubs are created with alarms connected to Massdriver to alert you when performance drops below a key threshold or fails completely. You will be notified when memory or CPU usage exceeds 90%, or when server errors exceed 5 in a 5-minute period. All metric values are customizable to fit your application’s needs.
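
The default thresholds described above can be summarized as a simple evaluation rule. A hypothetical helper, mirroring those defaults:

```python
def breached_alarms(cpu_pct: float, memory_pct: float,
                    server_errors_5m: int) -> list[str]:
    """Hypothetical helper mirroring the bundle's default alert thresholds:
    CPU or memory above 90%, or more than 5 server errors in 5 minutes."""
    alarms = []
    if cpu_pct > 90:
        alarms.append("cpu")
    if memory_pct > 90:
        alarms.append("memory")
    if server_errors_5m > 5:
        alarms.append("server_errors")
    return alarms

print(breached_alarms(95, 50, 0))  # ['cpu']
print(breached_alarms(10, 10, 6))  # ['server_errors']
```

In the real deployment these thresholds live in the `monitoring` variables and can be tuned per application.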

Trade-offs

We do not currently support the following:

  • Event Hubs dedicated clusters
  • Connecting Event Hubs to blob storage for data capture
  • Disabling data capture
  • Event Hubs basic tier
  • Event Hubs VNet integration (restricts metrics, logs, data capture, and app access)
  • Schema registry support

Suggest a feature for this bundle on our roadmap or fork the bundle to publish your own version.

Variable | Type | Description
capture.arvo_encoding | string | The Avro encoding used for captured data.
capture.capture_buildup | integer | The amount of data built up in the Event Hub before a capture operation occurs. Minimum of 10 MiB, maximum of 500 MiB.
capture.capture_interval | integer | The time interval, in seconds, at which capture to Azure Data Lake occurs. Minimum of 60, maximum of 900.
hub.message_retention | integer | The number of days to retain events for this Event Hub. Must be between 1 and 7.
hub.partition_count | integer | The number of partitions for the Event Hub.
hub.sku | string | The pricing tier for the Event Hubs namespace; each tier offers different features and capabilities. Cannot be changed after deployment.
hub.throughput_units | integer | The number of throughput units allocated for the Event Hub. Minimum of 1, maximum of 40.
hub.zone_redundant | boolean | Enable zone redundancy for the Event Hub. Cannot be changed after deployment.
monitoring.mode | string | Enable and customize Event Hubs metric alarms.