Thursday, July 8, 2021

Security Onion 2.3.60 Heavy Node Hotfix

If you don't have any heavy nodes in your deployment, then you can ignore this blog post and hotfix. If you do have heavy nodes, please continue reading.

We were recently notified by GitHub user Sahale that heavy nodes are incorrectly sending logs to the manager:
https://github.com/Security-Onion-Solutions/securityonion/discussions/4697

We've put together a hotfix to correct heavy node pipelines. If you have any heavy nodes in your deployment, you will want to install this hotfix via the normal soup process. (If you weren't already on version 2.3.60, then soup will upgrade to 2.3.60 and install the hotfix.) If you're in an airgap environment, please see https://docs.securityonion.net/en/2.3/airgap.html#security-onion-hotfixes.
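
For reference, soup is run from the manager; an illustrative invocation after connecting via SSH would simply be:

sudo soup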

Once your deployment is fully updated, you should then do the following on each heavy node.

Restart services:

sudo salt-call state.apply ssl queue=True

sudo so-redis-restart

sudo so-elasticsearch-restart

sudo so-filebeat-restart # this may take a few minutes as module pipelines are loaded

sudo so-logstash-restart
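
Once the restarts complete, you can optionally confirm that the containers came back up cleanly. Assuming the standard so-status tool is present on your heavy node, a quick check would be:

sudo so-status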

Check that the Logstash Redis output and input now reflect Logstash pushing to and pulling from Redis on the heavy node (look for the heavy node's hostname):

sudo grep host /opt/so/conf/logstash/pipelines/manager/9999_output_redis.conf

sudo grep host /opt/so/conf/logstash/pipelines/search/0900_input_redis.conf
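
Assuming a hypothetical heavy node named heavynode, each grep should return a host setting that references the heavy node rather than the manager. The exact formatting may vary slightly between versions, but it should look something like:

host => "heavynode"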

Ensure certificates for Redis and Elasticsearch reflect the correct `CN` value (heavy node host name):

sudo openssl x509 -in /etc/pki/elasticsearch.crt -text -noout | grep -E 'Subject.*CN'

sudo openssl x509 -in /etc/pki/redis.crt -text -noout | grep -E 'Subject.*CN'
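
Continuing the hypothetical heavy node named heavynode, each certificate's Subject line should contain a CN matching that hostname. The spacing may differ depending on your openssl version, but it should resemble:

Subject: CN = heavynode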

Ensure that Logstash has no issues connecting to Redis. Check /opt/so/log/logstash/logstash.log for SSL/TLS/bad certificate errors, or errors regarding missing pipelines. For example:

sudo tail -f /opt/so/log/logstash/logstash.log
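
If you'd rather filter the log than watch it scroll by, an illustrative search (adjust the patterns as needed) for the error types mentioned above might be:

sudo grep -iE 'ssl|tls|bad certificate|pipeline' /opt/so/log/logstash/logstash.log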

Ensure Filebeat module pipelines are loaded:

sudo so-elasticsearch-pipelines-list | grep filebeat
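
Filebeat module ingest pipelines are normally named after the Filebeat version, module, and fileset, so the output should include entries along these lines (version numbers will differ in your deployment):

filebeat-7.12.1-zeek-conn-pipeline
filebeat-7.12.1-suricata-eve-pipeline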

Ensure data is flowing through the Redis input and output of the manager and search pipelines (you should see non-zero counts that increase over time):

sudo so-logstash-pipeline-stats manager | grep -C5 redis

sudo so-logstash-pipeline-stats search | grep -C5 redis
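
These commands report Logstash pipeline statistics, so around the redis plugin entries you should see event counters that are non-zero and climbing. As an illustrative example (field names follow the Logstash stats API; the exact layout may vary):

"name" : "redis",
"events" : { "in" : 123456, "out" : 123456 }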

Ensure data is being written to the local Elasticsearch indices (you should see indices such as so-zeek, so-suricata, and so-ossec):

sudo so-elasticsearch-query _cat/indices
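
Index names in 2.3 include the data source and date, so (as a hypothetical example) the output should contain rows resembling the following (additional columns omitted):

green open so-zeek-2021.07.08
green open so-suricata-2021.07.08
green open so-ossec-2021.07.08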

Questions or Problems

If you have questions or problems, please see our community support forum guidelines:
https://docs.securityonion.net/en/2.3/community-support.html

You can then find the community support forum at:

https://securityonion.net/discuss
