unpacking json to toplevel overrides @metadata #15323
Comments
Preferable solution: a) the json filter doesn't remove `@metadata`. Minimal solution: c) if the behaviour is to stay, there is a clear warning in the json filter docs that using this filter in top-level mode (without `target`) removes `@metadata`.
Is it possible that the json filter can call the Ruby event API instead? Anyway, there are 6 options (assuming you're using Logstash 8.x). I think only one solution works for me, which is #2, but you may be able to use #3 or #4. Hope the table below helps anybody else making the choice.
To add to this: ingest pipelines sometimes expect `event.original` and sometimes expect `message`. Some copy `message` to `event.original` and remove `message`; some copy `event.original` to `message` but still use `message`.
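For instance, a pipeline that wants `event.original` populated from `message` could be fed with something like this (a sketch; `copy` is a standard mutate option, the field layout is an assumption):

```
filter {
  mutate {
    # keep a verbatim copy of the raw line for pipelines that expect [event][original]
    copy => { "message" => "[event][original]" }
  }
}
```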
Problem I faced in a rather simple setup:
- filebeat scrapes some apache and nginx logs (using the standard modules) and outputs them to kafka
- logstash reads from kafka, unpacks the json, makes some (irrelevant here) changes, and saves to elasticsearch
It turned out that, in spite of using `decorate_events => "basic"`, `[@metadata][kafka]` is not available. And it took me quite a lot of time to find out why: it looks like the json filter, while unpacking, removed the `@metadata` block.

Relevant parts of the config (my actual config is more complicated, but the other elements are irrelevant here):
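Roughly like this (a sketch, not my literal config; broker, topic, and host names are placeholders):

```
input {
  kafka {
    bootstrap_servers => "kafka:9092"   # placeholder
    topics => ["filebeat"]              # placeholder
    decorate_events => "basic"          # should populate [@metadata][kafka]
  }
}

filter {
  json {
    source => "message"                 # unpack filebeat's JSON to the top level
  }
  mutate {
    # relies on the kafka decoration still being present at this point
    add_field => { "[kafka][topic]" => "%{[@metadata][kafka][topic]}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]   # placeholder
  }
}
```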
I expected to see `kafka.topic` in the results, but this field was simply missing.

It turned out that replacing the first filter (with something along the lines of the sketch below) helped¹, so it looks like the json filter removed `@metadata` for some reason.
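For illustration, a replacement of that shape could look like this (a sketch, not the exact filter from my config): it unpacks the JSON through the Ruby event API, setting the parsed keys one by one, so `@metadata` is left alone.

```
filter {
  ruby {
    code => '
      require "json"
      begin
        # set each parsed key individually instead of replacing the whole event,
        # so [@metadata][kafka] survives
        JSON.parse(event.get("message")).each { |k, v| event.set(k, v) }
      rescue StandardError
        event.tag("_jsonparsefailure")
      end
    '
  }
}
```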
This reason is even more unclear as it seems to me that no fields appeared under `@metadata` (so it is not even the case of "there was `@metadata.something` in the filebeat output, and that is why json replaced the block"), at least as far as I can tell after some staring at the rubydebug output.

Once the problem is known, it is rather easy to work around (for example as above), but for the unaware it is very confusing.
¹ Of course, swapping the filter order would likely help too, but in my case the actual processing of the kafka metadata was different and needed both the kafka metadata and the unpacked fields.
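(The swapped order would amount to something like this sketch, with the metadata copied into a regular field before the json filter runs; field names are placeholders:)

```
filter {
  mutate {
    # capture the kafka decoration before json unpacking touches @metadata
    add_field => { "[kafka][topic]" => "%{[@metadata][kafka][topic]}" }
  }
  json {
    source => "message"
  }
}
```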