I have to say that losing the multiline filter is a problem, as I don't see any other solution for my current issue:
I have Docker containers writing out their logs via the json-file driver (anything else and it's either a lot of junk in the syslog or just hard to separate logs from multiple containers). But since Filebeat collects them as JSON, it's too late to do multiline at the input; I have to do it as a filter. Unless I'm missing something important?
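For what it's worth, here's roughly what the filter-based approach looks like for that case -- a minimal sketch assuming the docker json-file line gets decoded and the event carries some container-identifying field (`container_id` and the continuation pattern below are placeholders, not anything Filebeat guarantees):

```
filter {
  # Decode the docker json-file line (may be unnecessary if Filebeat
  # already decoded it); the raw log text ends up in the "log" field.
  json {
    source => "message"
  }
  # Join continuation lines per container rather than per host/path/type,
  # so interleaved containers on one input don't get merged together.
  # "container_id" is a placeholder -- use whatever field actually
  # identifies the container in your events.
  multiline {
    source          => "log"
    pattern         => "^\s"    # placeholder: continuation lines start with whitespace
    what            => "previous"
    stream_identity => "%{host}.%{container_id}"
  }
}
```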
Hi,
The documentation states that this filter is being deprecated in favor of the multiline codec. However, the multiline codec doesn't support a custom stream_identity and therefore can't do transaction-id style streams over a single input (a great example is Cisco ACS AAA logs).
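For reference, the filter's stream_identity option (which defaults to `%{host}.%{path}.%{type}`) is what makes this possible. A minimal sketch of the transaction-id approach, assuming a grok has already pulled a transaction id out of each line -- the grok pattern, field names, and the "first segment" marker are illustrative, not real Cisco ACS parsing:

```
filter {
  # Extract the per-message transaction id from each segment first
  # (illustrative pattern -- real ACS lines need a proper grok).
  grok {
    match => { "message" => "%{NUMBER:txn_id} %{NUMBER:segment} %{NUMBER:total}" }
  }
  # Key the multiline stream on the transaction id so segments from
  # different transactions interleaved on one input stay separate.
  # Lines that are NOT a first segment get appended to the previous
  # event in their stream.
  multiline {
    pattern         => " 1 "    # placeholder for a "first segment" marker
    negate          => true
    what            => "previous"
    stream_identity => "%{host}.%{txn_id}"
  }
}
```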
From the Wayback Machine, here are some examples of people trying to do this:
https://logstash.jira.com/browse/LOGSTASH-1785
It also shows up in Google searches and in other open issues/JIRAs.
If you add a regex/grok option to the multiline codec to support a custom stream_identity, that would probably suffice for most use cases -- although it's not as "clean" as the filter implementation, which operates against fully formed events.
There's an interesting reference to "sub-stream patterns" here, which is a similar idea:
logstash-plugins/logstash-codec-multiline#22
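To make the ask concrete, here's a hypothetical sketch of how such an option might look on the codec. To be clear, `stream_identity_pattern` does not exist in logstash-codec-multiline; it's just one shape the regex idea could take:

```
input {
  tcp {
    port  => 5514
    codec => multiline {
      pattern => "^\s"
      what    => "previous"
      # HYPOTHETICAL option, not part of the real codec: a regex whose
      # capture group keys the sub-stream, so lines sharing a transaction
      # id are grouped even when interleaved on one connection.
      stream_identity_pattern => "txn_id=(\d+)"
    }
  }
}
```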
Is there a way to do transaction-id style multiline in 5.0.0?
Thanks,
John