Azure Flow Log Aggregation (SOF-ELK)

I recently completed the SANS FOR572 | Advanced Network Forensics: Threat Hunting, Analysis, and Incident Response course delivered by Phil Hagen, and then tested and became certified in GIAC Network Forensic Analysis (GNFA). During my study time I wanted to dig deeper into the SOF-ELK distribution. What is SOF-ELK, you say? It is the Security Operations and Forensics Elasticsearch, Logstash, and Kibana virtual machine, which can be used for log aggregation and network forensic analysis. Please check it out here.

Currently, SOF-ELK will ingest syslog, httpd, passivedns, netflow, and zeek (bro) data so you can perform more detailed analysis of what is occurring on your network. I wanted to support this endeavor by making the ingestion of Azure Flow Logs into this solution a reality. What follows is my journey and the solution that makes it happen.

What is NetFlow?

NetFlow is a feature introduced by Cisco in 1996 to collect basic IP traffic statistics at the ingress and egress of a network interface. In its simplest form, it gathers a timestamp, source and destination addresses, protocol, ports, number of packets, and total bytes transferred.
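To make those fields concrete, here is a minimal sketch of a flow record as a Python data structure. The field names are illustrative only, not the actual NetFlow v5/v9 on-wire format.

```python
from dataclasses import dataclass

@dataclass
class FlowRecord:
    """A minimal NetFlow-style record (illustrative names, not the real wire format)."""
    first_seen: int   # flow start timestamp (epoch seconds)
    src_addr: str     # source IP address
    dst_addr: str     # destination IP address
    protocol: int     # IP protocol number (6 = TCP, 17 = UDP)
    src_port: int
    dst_port: int
    packets: int      # packet count for the flow
    bytes: int        # total bytes transferred

# Example: a single outbound HTTPS flow
flow = FlowRecord(1700000000, "10.0.0.4", "203.0.113.10", 6, 49152, 443, 12, 4096)
```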

Please check out this page for further history and details.

What are Azure Flow Logs?

Azure Flow Logs are similar to NetFlow in that they record the source and destination addresses, ports, protocol, number of packets, and total bytes transferred. They also record whether or not the packet was allowed through the Network Security Group (NSG), which you can think of as a firewall on a network interface.

When enabled, these logs are stored in JSON format in a storage account within the resource group, so it can be a challenge to get the logs out and into a format suitable for ingestion into SOF-ELK.
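To show what the script has to work with, here is a trimmed example of that JSON, based on Azure's documented version 1 NSG flow log schema (the values themselves are made up for illustration). Each entry in flowTuples is a single comma-separated flow: timestamp, source IP, destination IP, source port, destination port, protocol (T/U), direction (I/O), and decision (A/D).

```python
import json

# A trimmed NSG flow log record; structure follows the version 1 schema,
# values are invented for illustration.
sample = json.loads("""
{
  "records": [{
    "time": "2023-11-14T22:00:00.000Z",
    "category": "NetworkSecurityGroupFlowEvent",
    "properties": {
      "Version": 1,
      "flows": [{
        "rule": "DefaultRule_DenyAllInBound",
        "flows": [{
          "mac": "000D3AF87856",
          "flowTuples": ["1700000000,203.0.113.10,10.0.0.4,35370,23,T,I,D"]
        }]
      }]
    }
  }]
}
""")

# Pull out and split the first flow tuple
tup = sample["records"][0]["properties"]["flows"][0]["flows"][0]["flowTuples"][0]
ts, src, dst, sport, dport, proto, direction, decision = tup.split(",")
print(src, dst, decision)  # 203.0.113.10 10.0.0.4 D
```

Note the nesting: flows are grouped first by the NSG rule that matched, then by network interface (MAC), which is part of why a conversion step is needed before SOF-ELK can use the data.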

Luckily, I created a PowerShell script that will do this for you. Well, sort of: it doesn't retrieve the logs, but it will convert them into a format that SOF-ELK can ingest.

For more information on Azure Flow Logs, including how to enable and obtain them, check out this page.


You can download the script from my GitHub repository here.

Once you have obtained the JSON Azure Flow Log, run the following to create the proper ingestible format for the log. Keep in mind that I had to insert a lot of "zero" values to meet the format, because the corresponding data was not applicable.

Convert-AZNetflow2csv.ps1 -r <Azure Flow Log>.json -w <SOF-ELK Format>.txt
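The actual conversion lives in the PowerShell script above; the following is only a hedged Python sketch of the same idea, to show roughly what happens to each flow tuple. The column layout here is illustrative, and the "zero" placeholders stand in for the packet and byte counts that version 1 Azure flow tuples do not carry; the exact field order SOF-ELK expects should be taken from the real script.

```python
from datetime import datetime, timezone

# Azure encodes protocol as T (TCP) or U (UDP); netflow wants the number.
PROTO = {"T": 6, "U": 17}

def tuple_to_netflow(flow_tuple: str) -> str:
    """Convert one Azure flow tuple into a NetFlow-style CSV line (illustrative layout)."""
    ts, src, dst, sport, dport, proto, _direction, _decision = flow_tuple.split(",")
    when = datetime.fromtimestamp(int(ts), tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
    # Version 1 tuples carry no duration or packet/byte counts, so pad with zeros.
    return f"{when},0,{src},{dst},{sport},{dport},{PROTO[proto]},0,0"

line = tuple_to_netflow("1700000000,203.0.113.10,10.0.0.4,35370,23,T,I,D")
print(line)  # 2023-11-14 22:13:20,0,203.0.113.10,10.0.0.4,35370,23,6,0,0
```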

Once you have your output file, just drop it into the /logstash/nfarch/ directory for ingestion.