Microsoft Azure Event Hubs protocol FAQ

Use these frequently asked questions and answers to help you understand the Microsoft Azure Event Hubs protocol.

Why do I need a storage account to connect to an event hub?

You must have a storage account for the Microsoft Azure Event Hubs protocol to manage the lease and partitions of an event hub. For more information, see the Event processor host documentation (https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-event-processor-host).

Why does the Microsoft Azure Event Hubs protocol use the storage account?

The Microsoft Azure Event Hubs protocol uses the storage account to track partition ownership. This protocol creates blob files in the Azure storage account in the <Event Hub Name> → <Consumer group Name> directory. Each blob file relates to a numbered partition that is managed by the event hub. For more information, see the Event processor host documentation (https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-event-processor-host).

How much data does the storage account need to store?

The amount of data that needs to be stored in the storage account is approximately the number of partitions multiplied by ~150 bytes.
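This estimate can be sketched as a one-line calculation (the ~150-byte per-partition figure is the approximation stated above, not an exact value):

```python
# Rough storage estimate for the partition-tracking blobs.
# The ~150 bytes per partition is an approximation, per the FAQ above.
def estimate_checkpoint_storage_bytes(partition_count: int,
                                      bytes_per_partition: int = 150) -> int:
    """Return the approximate storage needed to track all partitions."""
    return partition_count * bytes_per_partition

# Example: a 32-partition event hub needs roughly 4,800 bytes.
print(estimate_checkpoint_storage_bytes(32))  # → 4800
```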

Does my storage account need to contain events?

No. Microsoft provides an option to store events in the storage account, but the protocol does not use this option.

What does a blob file that is created by the Microsoft Azure Event Hubs protocol look like?

The following example shows what is stored in a blob file that is created by the protocol:
{"offset":"@latest","sequenceNumber":0,"partitionId":"3","epoch":8,"owner":"","token":""}
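The blob content is plain JSON, so you can inspect a lease blob with any JSON parser. A small sketch using the sample above (the interpretation of an empty owner as an unclaimed partition is an assumption based on typical Event Processor Host behavior):

```python
import json

# Sample lease-blob content, copied from the FAQ above.
blob_content = ('{"offset":"@latest","sequenceNumber":0,'
                '"partitionId":"3","epoch":8,"owner":"","token":""}')

lease = json.loads(blob_content)
print(lease["partitionId"])  # → 3
# An empty "owner" field suggests no host currently holds this partition's lease.
print(lease["owner"] == "")  # → True
```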

Can I use the same storage account with other event hubs?

There are no restrictions on how many event hubs can store data in a storage account. You can use the same storage account for all log sources in the same QRadar® environment. This creates a single location for all event hub partition management folders and files.

What do I do if the protocol isn't collecting events?

If the protocol appears to be working and the protocol testing tools pass all of the tests, but you don't see events, follow these steps to confirm whether events are being posted.
  1. Confirm that there are events for the event hub to collect. If the Azure side configuration is not correct, the event hub might not collect the events.
  2. If the Use as a Gateway Log Source option is enabled, do a payload search for events that the event hub log source collects. If you are not sure what the events should look like, go to step 4.
  3. If the Use as a Gateway Log Source option is enabled and the protocol is not collecting events, test the same log source with the gateway disabled. Setting Use as a Gateway Log Source to disabled forces all collected events to use the log source that is connected to the protocol. If events arrive when Use as a Gateway Log Source is disabled, but not when it is enabled, there might be an issue with the log source identifier options, or Traffic Analysis can't automatically match the events to a DSM.
  4. If you identified in step 2 or step 3 that the events are not coming in under the expected log source, there might be an issue with the event hub log source's logsourceidentifierpattern. For issues related to the event hub log source identifier pattern, you might need to contact Support.
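The idea behind a log source identifier pattern can be illustrated with a small sketch. This is a hypothetical example only: the pattern, the payload, and the field name are made up, and QRadar's actual logsourceidentifierpattern handling is internal to the protocol.

```python
import re

# Hypothetical pattern: extract an identifier field from a JSON payload.
# This is an illustration of regex-based identifier matching, not QRadar's
# actual implementation or configuration syntax.
pattern = re.compile(r'"resourceId":\s*"([^"]+)"')
payload = '{"resourceId": "myVM01", "operationName": "write"}'

match = pattern.search(payload)
identifier = match.group(1) if match else None
print(identifier)  # → myVM01
```

If no group in the pattern matches the payload, the events cannot be routed to the expected log source, which is the symptom described in step 4.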

Why do I need to open different ports for two different IPs?

You need to open different ports for two different IPs because the Microsoft Azure Event Hubs protocol communicates with both the event hub host and the storage account host.

The event hub connection uses the Advanced Message Queuing Protocol (AMQP) on ports 5671 and 5672. The storage account uses HTTPS on port 443. Because the storage account and the event hub have different IPs, you must open the required ports for both.
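The required outbound rules can be summarized in a short sketch. The host names are placeholders following the standard Azure endpoint patterns; substitute your own namespace and storage account names:

```python
# Outbound ports required by the protocol, per the FAQ above.
# "<namespace>" and "<storageaccount>" are placeholders for your own names.
REQUIRED_PORTS = {
    "<namespace>.servicebus.windows.net": [5671, 5672],  # AMQP to the event hub
    "<storageaccount>.blob.core.windows.net": [443],     # HTTPS to the storage account
}

for host, ports in REQUIRED_PORTS.items():
    for port in ports:
        print(f"allow outbound TCP to {host}:{port}")
```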

Can I collect <Service/Product> events by using the Microsoft Azure Event Hubs protocol?

The Microsoft Azure Event Hubs protocol collects all events that are sent to the event hub, but not all events are parsed by a supported DSM. For a list of supported DSMs, see QRadar supported DSMs.

What does the Format Azure Linux Events To Syslog option do?

This option takes the Azure Linux® event, which is wrapped in JSON with metadata, and converts it to a standard syslog format. Unless you specifically need the metadata on the payload, enable this option. When this option is disabled, the payloads do not parse with Linux DSMs.
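The unwrapping can be sketched as follows. This is a hypothetical illustration of the idea, not the protocol's actual implementation: the envelope field names used here ("time", "category", "properties", "Msg") are assumptions, not a documented schema.

```python
import json

# Hypothetical sketch: pull the original syslog message out of the JSON
# envelope and drop the surrounding metadata. Field names are assumptions.
def unwrap_linux_event(raw: str) -> str:
    event = json.loads(raw)
    return event["properties"]["Msg"]

sample = json.dumps({
    "time": "2023-01-01T00:00:00Z",
    "category": "Syslog",
    "properties": {"Msg": "sshd[1234]: Accepted publickey for azureuser"},
})
print(unwrap_linux_event(sample))  # → sshd[1234]: Accepted publickey for azureuser
```

With the metadata stripped, the payload resembles a standard syslog line, which is the form Linux DSMs expect to parse.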