Perform network intrusion detection with Network Watcher and open source tools

Packet captures are a key component for implementing network intrusion detection systems (IDS) and performing Network Security Monitoring (NSM). There are several open source IDS tools that process packet captures and look for signatures of possible network intrusions and malicious activity. Using the packet captures provided by Network Watcher, you can analyze your network for any harmful intrusions or vulnerabilities.

One such open source tool is Suricata, an IDS engine that uses rulesets to monitor network traffic and triggers alerts whenever suspicious events occur. Suricata offers a multi-threaded engine, meaning it can perform network traffic analysis with increased speed and efficiency. For more details about Suricata and its capabilities, visit their website at https://suricata-ids.org/.

Scenario

This article explains how to set up your environment to perform network intrusion detection using Network Watcher, Suricata, and the Elastic Stack. Network Watcher provides you with the packet captures used to perform network intrusion detection. Suricata processes the packet captures and triggers alerts based on packets that match its given ruleset of threats. These alerts are stored in a log file on your local machine. Using the Elastic Stack, the logs generated by Suricata can be indexed and used to create a Kibana dashboard, providing you with a visual representation of the logs and a means to quickly gain insights into potential network vulnerabilities.

simple web application scenario

Both open source tools can be set up on an Azure VM, allowing you to perform this analysis within your own Azure network environment.

Steps

Install Suricata

The following commands install Suricata from the OISF stable PPA on Ubuntu. For all other methods of installation, visit https://suricata.readthedocs.io/en/suricata-5.0.2/quickstart.html#installation

  1. In the command-line terminal of your VM run the following commands:

    sudo add-apt-repository ppa:oisf/suricata-stable
    sudo apt-get update
    sudo apt-get install suricata
    
  2. To verify your installation, run the command suricata -h to see the full list of commands.

Download the Emerging Threats ruleset

At this stage, we do not have any rules for Suricata to run. You can create your own rules if there are specific threats to your network you would like to detect, or you can use developed rulesets from a number of providers, such as Emerging Threats, or the VRT rules from Snort. We use the freely available Emerging Threats ruleset here:

Download the ruleset and copy the rules into the Suricata configuration directory:

wget https://rules.emergingthreats.net/open/suricata/emerging.rules.tar.gz
tar zxf emerging.rules.tar.gz
sudo cp -r rules /etc/suricata/
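To sanity-check that the ruleset landed where Suricata expects it, a small helper like the following can count the rule files and active rules in a directory. This is a sketch of our own; `count_rules` is a hypothetical function name, not part of Suricata:

```shell
# count_rules: quick sanity check that a ruleset was copied into place.
# Reports the number of .rules files and of individual "alert" rules in the
# given directory (hypothetical helper; not part of Suricata itself).
count_rules() {
  dir="$1"
  files=$(ls "$dir"/*.rules 2>/dev/null | wc -l | tr -d ' ')
  # Active rules start with "alert"; comments and blank lines are skipped.
  # (This is a heuristic: rules using other actions, such as "drop", are
  # not counted.)
  rules=$(cat "$dir"/*.rules 2>/dev/null | grep -c '^alert')
  echo "$files rule files, $rules alert rules"
}
```

For the layout above you would run `count_rules /etc/suricata/rules`.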

Process packet captures with Suricata

To process packet captures using Suricata, run the following command:

sudo suricata -c /etc/suricata/suricata.yaml -r <location_of_pcapfile>

To check the resulting alerts, read the fast.log file:

tail -f /var/log/suricata/fast.log
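Each fast.log line wraps the triggered signature between `[**]` markers, preceded by a `[gid:sid:rev]` triple. A short pipeline, sketched here as a helper function of our own naming, summarizes the most frequent signatures in a capture run:

```shell
# top_alerts: summarize the ten most frequent alert signatures in a Suricata
# fast.log (hypothetical helper). Splits each line on the "[**]" markers,
# strips the leading [gid:sid:rev] triple, then counts occurrences.
top_alerts() {
  awk -F'\\[\\*\\*\\]' '{ gsub(/^ *\[[0-9:]+\] */, "", $2); print $2 }' "$1" \
    | sort | uniq -c | sort -rn | head -10
}
```

With the default log location, run `top_alerts /var/log/suricata/fast.log`.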

Set up the Elastic Stack

While the logs that Suricata produces contain valuable information about what's happening on our network, these log files aren't the easiest to read and understand. By connecting Suricata with the Elastic Stack, we can create a Kibana dashboard that allows us to search, graph, analyze, and derive insights from our logs.

Install Elasticsearch

  1. Elastic Stack versions 5.0 and above require Java 8. Run the command java -version to check your version. If you do not have Java installed, refer to the documentation on Azure-supported JDKs.

  2. Download the correct binary package for your system:

    curl -L -O https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.2.0.deb
    sudo dpkg -i elasticsearch-5.2.0.deb
    sudo /etc/init.d/elasticsearch start
    

    Other installation methods can be found at Elasticsearch Installation

  3. Verify that Elasticsearch is running with the command:

    curl http://127.0.0.1:9200
    

    You should see a response similar to this:

    {
      "name" : "Angela Del Toro",
      "cluster_name" : "elasticsearch",
      "version" : {
        "number" : "5.2.0",
        "build_hash" : "8ff36d139e16f8720f2947ef62c8167a888992fe",
        "build_timestamp" : "2016-01-27T13:32:39Z",
        "build_snapshot" : false,
        "lucene_version" : "6.1.0"
      },
      "tagline" : "You Know, for Search"
    }
    

For further instructions on installing Elasticsearch, refer to the Installation page.
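If you only need the version number out of that JSON response, for example in a provisioning script, a small grep/sed helper avoids a jq dependency. The function name here is our own:

```shell
# es_version: extract the version "number" field from the JSON that
# Elasticsearch serves on its root endpoint (hypothetical helper).
es_version() {
  grep -o '"number" *: *"[^"]*"' | head -1 | sed 's/.*"\([^"]*\)"$/\1/'
}
```

For the installation above, `curl -s http://127.0.0.1:9200 | es_version` should print `5.2.0`.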

Install Logstash

  1. To install Logstash run the following commands:

    curl -L -O https://artifacts.elastic.co/downloads/logstash/logstash-5.2.0.deb
    sudo dpkg -i logstash-5.2.0.deb
    
  2. Next, we need to configure Logstash to read from Suricata's eve.json output file. Create a logstash.conf file using:

    sudo touch /etc/logstash/conf.d/logstash.conf
    
  3. Add the following content to the file (make sure that the path to the eve.json file is correct):

    input {
      file {
        path => ["/var/log/suricata/eve.json"]
        codec => "json"
        type => "SuricataIDPS"
      }
    }

    filter {
      if [type] == "SuricataIDPS" {
        date {
          match => [ "timestamp", "ISO8601" ]
        }
        ruby {
          code => "
            if event.get('[event_type]') == 'fileinfo'
              event.set('[fileinfo][type]', event.get('[fileinfo][magic]').to_s.split(',')[0])
            end
          "
        }
        ruby {
          code => "
            if event.get('[event_type]') == 'alert'
              sp = event.get('[alert][signature]').to_s.split(' group ')
              if (sp.length == 2) and /\A\d+\z/.match(sp[1])
                event.set('[alert][signature]', sp[0])
              end
            end
          "
        }
      }

      if [src_ip] {
        geoip {
          source => "src_ip"
          target => "geoip"
          #database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
          add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
          add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
        }
        mutate {
          convert => [ "[geoip][coordinates]", "float" ]
        }
        if ![geoip][ip] {
          if [dest_ip] {
            geoip {
              source => "dest_ip"
              target => "geoip"
              #database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
              add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
              add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
            }
            mutate {
              convert => [ "[geoip][coordinates]", "float" ]
            }
          }
        }
      }
    }

    output {
      elasticsearch {
        hosts => "localhost"
      }
    }
    
  4. Make sure to give the correct permissions to the eve.json file so that Logstash can ingest the file.

    sudo chmod 775 /var/log/suricata/eve.json
    
  5. To start Logstash run the command:

    sudo /etc/init.d/logstash start
    

For further instructions on installing Logstash, refer to the official documentation
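Once Logstash is running, you can confirm that events from eve.json are reaching Elasticsearch by listing the daily logstash-* indices the pipeline writes to. A small filter helper (our own naming) makes the _cat/indices output easier to scan:

```shell
# list_logstash_indices: reduce Elasticsearch's _cat/indices output to the
# logstash-YYYY.MM.DD index names the pipeline above writes to
# (hypothetical helper; pipe the API response into it).
list_logstash_indices() {
  grep -o 'logstash-[0-9.]*' | sort -u
}
```

For example: `curl -s 'http://127.0.0.1:9200/_cat/indices?v' | list_logstash_indices`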

Install Kibana

  1. Run the following commands to install Kibana:

    curl -L -O https://artifacts.elastic.co/downloads/kibana/kibana-5.2.0-linux-x86_64.tar.gz
    tar xzvf kibana-5.2.0-linux-x86_64.tar.gz
    
    
  2. To run Kibana use the commands:

    cd kibana-5.2.0-linux-x86_64/
    ./bin/kibana
    
  3. To view your Kibana web interface, navigate to http://localhost:5601

  4. For this scenario, the index pattern used for the Suricata logs is "logstash-*"

  5. If you want to view the Kibana dashboard remotely, create an inbound NSG rule allowing access to port 5601.

Create a Kibana dashboard

For this article, we have provided a sample dashboard for you to view trends and details in your alerts.

  1. Download the dashboard file here, the visualization file here, and the saved search file here.

  2. Under the Management tab of Kibana, navigate to Saved Objects and import all three files. Then from the Dashboard tab you can open and load the sample dashboard.

You can also create your own visualizations and dashboards tailored to the metrics that interest you. Read more about creating Kibana visualizations in Kibana's official documentation.


Visualize IDS alert logs

The sample dashboard provides several visualizations of the Suricata alert logs:

  1. Alerts by GeoIP – a map showing the distribution of alerts by their country/region of origin based on geographic location (determined by IP)


  2. Top 10 Alerts – a summary of the 10 most frequently triggered alerts and their descriptions. Clicking an individual alert filters the dashboard down to the information pertaining to that specific alert.


  3. Number of Alerts – the total count of alerts triggered by the ruleset


  4. Top 20 Source/Destination IPs/Ports - pie charts showing the top 20 IPs and ports that alerts were triggered on. You can filter down on specific IPs/ports to see how many and what kind of alerts are being triggered.


  5. Alert Summary – a table summarizing specific details of each individual alert. You can customize this table to show other parameters of interest for each alert.


For more documentation on creating custom visualizations and dashboards, see Kibana's official documentation.

Conclusion

By combining packet captures provided by Network Watcher and open source IDS tools such as Suricata, you can perform network intrusion detection for a wide range of threats. These dashboards allow you to quickly spot trends and anomalies within your network, as well as dig into the data to discover root causes of alerts such as malicious user agents or vulnerable ports. With this extracted data, you can make informed decisions on how to react to and protect your network from any harmful intrusion attempts, and create rules to prevent future intrusions to your network.

Next steps

Learn how to trigger packet captures based on alerts by visiting Use packet capture to do proactive network monitoring with Azure Functions

Learn how to visualize your NSG flow logs with Power BI by visiting Visualize NSG flow logs with Power BI