Part 5. Intelligent SIEM Logging

Seamlessly parse your security logs with Graylog!

PART ONE: Backend Storage

PART TWO: Log Ingestion

PART THREE: Log Analysis

PART FOUR: Wazuh Agent Install

Video Walkthrough

Intro

In PART THREE we started our Graylog input and configured Fluent Bit on our Wazuh Manager to forward the /var/ossec/logs/alerts/alerts.json file to Graylog. Without this step, Graylog would not receive our security logs, nor would any data be stored in our Wazuh-Indexer.

Please follow PART THREE before progressing with this tutorial.

Taking a look at our Graylog Input, we see that it is receiving data from our Wazuh Manager.

Graylog Receiving Data

Let’s view our received messages and look at the data coming in.

Select Received Messages

Select a message to expand it out and view all of the metadata for that specific event:

Graylog Message

Gross! You can see that our message is not parsed out into key-value pairs; all of the data is written to the message field. This makes it difficult for us to build dashboards and alerts, or to slice and dice our data. Our log ingestion engine must be able to parse keys and their values, or we are going to have a difficult time detecting, visualizing, and responding to security events.
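To illustrate the problem, here is what the message field holds: one long JSON string. The structure below follows Wazuh's alerts.json layout, but the specific values (rule, agent, location) are hypothetical:

```
{"timestamp":"2022-06-15T10:21:33.000+0000","rule":{"level":5,"description":"sshd: authentication failed.","id":"5760"},"agent":{"id":"001","name":"ubuntu-server"},"location":"/var/log/auth.log"}
```

Until this blob is broken out into individual fields, you can only full-text search it; you cannot filter or aggregate on rule level, agent name, and so on.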

Graylog Extractors

Wouldn’t it be nice to be able to search for all blocked packets from a given source IP, or to get a quick terms analysis of recently failed SSH login usernames? That’s hard to do when all you have is a single long text message.

WELCOME GRAYLOG EXTRACTORS

Extractors allow you to instruct Graylog nodes on how to extract data from any text in a received message (regardless of the format, and even if it’s an already extracted field) into message fields. Full text searches provide a great deal of possibilities for analysis, but the real power of log analytics is unveiled when you can run queries like http_response_code:>=500 AND user_id:9001 to get all internal server errors triggered by a specific user.
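Once our Wazuh alerts are parsed into fields, the same style of query works against them. For example (these assume nested keys are flattened with an underscore separator, and the agent name and IP are hypothetical):

```
rule_level:>=10 AND agent_name:ubuntu-server
data_srcip:192.168.1.50 AND rule_groups:sshd
```

The first finds all high-severity alerts from one agent; the second finds SSH-related alerts from a single source IP.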

More can be read up on Graylog’s Extractors here: Graylog Docs

JSON EXTRACTOR

The JSON extractor is the perfect extractor for our Wazuh logs. By default, the Wazuh Manager writes to the alerts.json file in a single-line JSON format, and we are also setting our OUTPUT format to json_lines in our fluent-bit.conf file.
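As a reminder, the relevant OUTPUT section of fluent-bit.conf from PART THREE looks roughly like this. The Match tag, Host, and Port here are placeholders; use the values from your own setup:

```
[OUTPUT]
    Name    tcp
    Match   wazuh
    Host    graylog.example.local
    Port    5555
    Format  json_lines
```

The json_lines format emits one complete JSON document per line, which is exactly what the JSON extractor expects to receive.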

The beauty of the JSON extractor is that it will do the heavy lifting for us. We simply need to provide it a few details and Graylog will handle the rest!
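Conceptually, the extractor takes the nested JSON in the message field and flattens it into individual message fields. A minimal Python sketch of that behavior, assuming an underscore key separator (the alert content is a trimmed, hypothetical example):

```python
import json

def flatten(obj, prefix="", sep="_"):
    """Recursively flatten nested JSON objects into key/value pairs,
    joining nested keys with the separator (e.g. rule -> level becomes rule_level)."""
    fields = {}
    for key, value in obj.items():
        name = f"{prefix}{sep}{key}" if prefix else key
        if isinstance(value, dict):
            fields.update(flatten(value, name, sep))
        else:
            fields[name] = value
    return fields

# A hypothetical single-line Wazuh alert, heavily trimmed:
message = '{"rule": {"level": 5, "id": "5760"}, "agent": {"name": "ubuntu-server"}}'
fields = flatten(json.loads(message))
# fields -> {"rule_level": 5, "rule_id": "5760", "agent_name": "ubuntu-server"}
```

This is only an illustration of the idea; Graylog performs the equivalent work for us based on the extractor settings we choose in the next step.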

Configuring the JSON Extractor

Let’s now instruct Graylog to parse through our received messages with the JSON extractor.

  1. Select anywhere in the message field (not on the title but the data itself) and select Create extractor
Select Create Extractor

2. Select JSON

JSON Extractor Type

3. Set the below configuration settings:

JSON Extractor Settings

4. Select Try to see some magic :)

Parsed out data

Isn’t it beautiful? Graylog will now parse through our received messages and write out our key-value pairs.

Give your extractor a name and select Create Extractor

Create Extractor

Head back over to your ingested messages and see them now being parsed correctly!

Parsed Messages

Creating Index

An Index is how our Wazuh-Indexer stores our ingested logs. We need to create an index that will store all of our Wazuh Alerts. Graylog also gives us the ability to configure index settings such as the number of shards, replicas, etc.

More can be read up on indices in PART ONE

  1. Select System / Indices

2. Select Create index set

3. Configure your Index settings. Below is just an example, you should customize to fit your needs.

Index Configuration
Index Configuration Continued

Highlights

  • Index Prefix — Name of the index that is used to store the data in our Wazuh-Indexer
  • Rotation Strategy — How often the index will rotate. For example, our first index created will be named wazuh-alerts-socfortress_0. Once that index hits a size of 10GB, Graylog will rotate to the next index, wazuh-alerts-socfortress_1.
  • Retention Strategy — How long an index will remain in our Wazuh-Indexer. For example, I have a rotation strategy of 10GB and a retention strategy of 10 indices. This means that I will hold at most 100GB (10 x 10) of total Wazuh Alerts data. Once 10 indices exist, Graylog will delete the oldest index, wazuh-alerts-socfortress_0, to make room for wazuh-alerts-socfortress_10. Keep in mind that this data is permanently deleted.
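The rotation and retention behavior described above can be sketched in a few lines of Python. The index naming and limits mirror the example settings; this is an illustration of the bookkeeping, not how Graylog is actually implemented:

```python
# Simulate size-based rotation with a max-count retention strategy.
MAX_INDICES = 10  # retention: keep at most 10 indices

indices = []       # active indices, oldest first
next_suffix = 0

def rotate(indices, next_suffix):
    """Open the next index; drop the oldest one once the retention limit is exceeded."""
    indices.append(f"wazuh-alerts-socfortress_{next_suffix}")
    if len(indices) > MAX_INDICES:
        indices.pop(0)  # the oldest index's data is permanently deleted
    return indices, next_suffix + 1

for _ in range(12):  # 12 rotations create indices _0 through _11
    indices, next_suffix = rotate(indices, next_suffix)

# indices now holds wazuh-alerts-socfortress_2 .. _11 (_0 and _1 were deleted)
```

The takeaway: retention is a hard cap, so size your rotation and retention settings against the disk you actually have.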

Creating the Stream

Graylog streams are a mechanism that routes messages into categories in real time while they are being processed. We can define rules in Graylog to route messages into certain streams.

Streams allow us to route received messages to the correct index. Creating multiple indices and multiple streams gives us the ability to provide a multi-tenant solution!

  1. Select Streams on the top menu and select Create Stream
Creating a Stream

2. Give your Stream a name and set it to your newly created Index. Select Remove matches from 'All messages' stream; we only want the data to go to our one index.

Stream Configuration

3. Select Save.

New Stream Created

4. Head over to our Inputs and select Add static field

Adding a static field

5. Add a log_type field with a value of wazuh, or use whatever field name you’d like.

Static Field Name

This will add the key-value pair of log_type:wazuh to every log ingested by our Wazuh Events Fluent Bit — TCP Input. We can now use this field as a Stream rule to route all Wazuh Alerts to the correct Index.
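The matching the stream rule performs is simple field comparison. A rough Python sketch (the stream and field names follow this tutorial; the message contents are illustrative):

```python
def route(message, rules):
    """Return the streams whose conditions all match the message's fields
    (exact-match rules only, which is what we use here)."""
    return [
        stream
        for stream, conditions in rules.items()
        if all(message.get(field) == value for field, value in conditions.items())
    ]

# One stream, matching on the static field added to our input:
rules = {"Wazuh Alerts": {"log_type": "wazuh"}}

# Every message from our Fluent Bit TCP input carries the static field:
msg = {"log_type": "wazuh", "rule_level": 5, "agent_name": "ubuntu-server"}
route(msg, rules)                       # -> ["Wazuh Alerts"]
route({"log_type": "pfsense"}, rules)   # -> []
```

Because every stream can point at its own index set, this same pattern scales to routing firewall or cloud logs to their own indices with additional static fields and rules.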

log_type key value pair

6. Select Stream and Manage Rules

Manage Stream Rules

7. Select Add stream rule

Stream Rule

8. Select I'm done!

9. Start the stream

Starting of Stream

Select the Stream to view the ingested messages:

Select your message and view the Index that our Wazuh logs are now being written to:

Stored in Index

Conclusion

Throughout this post we learned how to parse our received Wazuh alerts, create unique indices, and route our logs to the correct index. The ability to slice, dice, and route our logs to fit our needs is crucial for any SOC team, MSP, etc. And it doesn’t stop here! You can implement this same approach to ingest firewall logs, AWS/GCP/Azure logs, other 3rd party logs, and more. You now have the power, so go take back control of your logs! Happy Defending 😄.

Need Help?

The functionality discussed in this post, and so much more, are available via SOCFortress’s Professional Services. Let SOCFortress help you and your team keep your infrastructure secure.

Website: https://www.socfortress.co/

Professional Services: https://www.socfortress.co/ps.html
