Understanding Wazuh Decoders

Intro

Not all logs are built the same. Depending on the source of the logs you are looking to collect, you may need to build a custom Wazuh decoder. Wazuh provides an out-of-the-box set of decoders and rules, but oftentimes users are left needing to create their own decoders and rules to handle a unique use case. Thankfully, Wazuh gives us the ability to create our own decoders and rules, powered by a flexible regex library.

Building The Decoder

We can use Wazuh to build decoders that will match on ANYTHING. This flexibility allows us to ingest any type of log into Wazuh, which is in turn written into Elasticsearch and viewable within Kibana. Take, for example, the log below:

Medium: SOCFortress is an awesome company, check them out at https://www.socfortress.co

Sample events like this one can be replayed against the ruleset with the wazuh-logtest utility on the manager:

/var/ossec/bin/wazuh-logtest
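The tool is interactive: pasting a sample event at its prompt walks it through the pre-decoding, decoding, and rule-matching phases. A minimal session sketch (illustrative; the exact prompts and output vary slightly between Wazuh versions):

# at the wazuh-logtest prompt, paste the event:
Medium: SOCFortress is an awesome company, check them out at https://www.socfortress.co
# the tool then prints its Phase 1 (pre-decoding), Phase 2 (decoding), and Phase 3 (rule matching) results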

Parent Decoder

When designing new decoders and rules, you should start from event samples. In the example log above you can see that the text Medium: is always present at the beginning of every message, so it can be used in the prematch of the root decoder. Again, we can use regex to build out our root decoder. The following decoder will attempt to match every incoming log against the expression defined (^Medium:) and decide whether the child decoders should be considered for triggering or not.

<decoder name="medium">
<prematch>^Medium:</prematch>
</decoder>
nano /var/ossec/etc/decoders/local_decoder.xml

Child Decoder

Now we need to build a child decoder that will allow us to parse out fields containing dynamic values and store them in Elasticsearch. The decoder below extracts the values for the fields company and website. The regex option finds the substrings of interest and extracts them through the () operator. The order option defines what the parenthesized groups contain and the order in which they were captured.

<decoder name="medium_child">
<parent>medium</parent>
<regex offset="after_parent">^\s(\.+) is an awesome company, check them out at (https://\.+)</regex>
<order>company,website</order>
</decoder>
nano /var/ossec/etc/decoders/local_decoder.xml
Medium: OpenSecure is an awesome company, check them out at https://www.opensecure.co
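Pasting that event at the wazuh-logtest prompt should show the dynamic fields being populated in the decoding phase, roughly like this (illustrative output; exact formatting differs between Wazuh versions):

**Phase 2: Completed decoding.
    company: 'OpenSecure'
    website: 'https://www.opensecure.co'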

Creating Rules

Creating decoders is only half the battle; in order for the Wazuh Manager to write these logs to Elasticsearch, we need to create rules. Let’s create a rule that matches when the decoded company field contains SOCFortress:

nano /var/ossec/etc/rules/local_rules.xml

<group name="medium,socfortress">
  <rule id="100021" level="5">
    <decoded_as>medium</decoded_as>
    <field name="company">SOCFortress</field>
    <description>Go check out $(company) at $(website)!</description>
  </rule>
</group>
We can add a second rule to the same file that matches when the company field contains OpenSecure:

nano /var/ossec/etc/rules/local_rules.xml

<group name="medium,socfortress">
  <rule id="100022" level="5">
    <decoded_as>medium</decoded_as>
    <field name="company">OpenSecure</field>
    <description>Go check out $(company) at $(website)!</description>
  </rule>
</group>

Running the OpenSecure sample event through wazuh-logtest should now trigger rule 100022:

Medium: OpenSecure is an awesome company, check them out at https://www.opensecure.co
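Once wazuh-logtest shows the expected match, the manager needs to be restarted for the new decoders and rules to take effect on live traffic, and the resulting alerts can be confirmed on disk before checking Kibana. A minimal sketch, assuming a systemd-based install:

systemctl restart wazuh-manager
# alerts generated by the new rules land in the manager's alert files:
tail -f /var/ossec/logs/alerts/alerts.json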

pfSense Firewall

Let’s build a decoder and rule set with a real-world example. Our pfSense firewall is configured to send its syslog output to the Wazuh Manager, and the Wazuh Manager is receiving logs like the following:

1 2022-04-22T13:46:00.769161-05:00 router.localdomain filterlog 48244 - - 107,,,1000005911,mvneta0,match,pass,out,4,0x0,,64,62124,0,none,17,udp,78,104.181.152.45,205.251.194.94,13998,53,58
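For reference, the manager-side piece of this setup is a syslog listener in /var/ossec/etc/ossec.conf. A minimal sketch (the port, protocol, and firewall IP below are assumptions; adjust them to your environment):

<remote>
  <connection>syslog</connection>
  <port>514</port>
  <protocol>udp</protocol>
  <allowed-ips>192.168.1.1</allowed-ips>
</remote>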

Parent Decoder

Looking at the raw log, we see a 1 followed by the current date and timestamp: 2022-04-22T13:46:00.769161-05:00. We can use that consistent pattern to create a parent decoder:

<decoder name="pfsense">
<prematch>^\d \d\d\d\d-\d\d-\d\dT\d\d:\d\d:\d\d.\d\d\d\d\d\d-\d\d:\d\d</prematch>
</decoder>

Child Decoder

Our child decoder will now match on the rest of the log and parse out the router, firewallaction, direction, protocol, srcip, and dstip fields:

<decoder name="pfsense_router">
<parent>pfsense</parent>
<regex offset="after_parent">^(\.+) filterlog \d\d\d\d\d - - \d\d\d,,,\d\d\d\d\d\d\d\d\d\d,\w\w\w\w\w\w\d,\w\w\w\w\w,(\w\w\w\w),(\.+),\d,\dx\d,,\d\d,\d\d\d\d\d,\d,\w\w\w\w,\d\d,(\.+),\d\d,(\.+),(\.+),</regex>
<order>router,firewallaction,direction,protocol,srcip,dstip</order>
</decoder>
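Feeding the sample firewall event back through wazuh-logtest should now show all six fields extracted in the decoding phase, roughly like this (illustrative output; exact formatting differs between Wazuh versions):

**Phase 2: Completed decoding.
    router: 'router.localdomain'
    firewallaction: 'pass'
    direction: 'out'
    protocol: 'udp'
    srcip: '104.181.152.45'
    dstip: '205.251.194.94'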

Rule

We can now create rules to match on any of these parsed-out fields. Below we create a rule that matches on all traffic that was passed by the firewall.

nano /var/ossec/etc/rules/local_rules.xml

<group name="pfsense,syslog">
  <rule id="100023" level="5">
    <decoded_as>pfsense</decoded_as>
    <field name="firewallaction">pass</field>
    <description>Traffic from $(srcip) to $(dstip) passed.</description>
  </rule>
</group>
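Because every parsed field is now available to the rule engine, more specific rules can be layered on top of this one. As an illustration (the rule ID 100024, the level, and the watched destination IP are arbitrary choices, not part of the ruleset above), a child rule could raise the severity for passed traffic to a particular host:

<group name="pfsense,syslog">
  <rule id="100024" level="8">
    <if_sid>100023</if_sid>
    <field name="dstip">205.251.194.94</field>
    <description>Passed traffic from $(srcip) to watched host $(dstip).</description>
  </rule>
</group>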

Conclusion

The ability to create custom Wazuh decoders lets us collect, parse, and store any type of log. While the learning curve can be a little steep, I hope this post helps clarify how we can use Wazuh to build custom decoders and rules.

Need Help?

The functionality discussed in this post, and so much more, is available via the SOCFortress platform. Let SOCFortress help you and your team keep your infrastructure secure.
