
writing to elastic isn't working #41

Open
eran10 opened this issue Jan 13, 2020 · 8 comments
@eran10

eran10 commented Jan 13, 2020

Hi, I would like to send my Pino logs to Elasticsearch. We are using Elasticsearch 6 with the latest versions of pino and pino-elasticsearch, and I configure Pino as below:

const pino = require('pino');
const pinoElastic = require('pino-elasticsearch');

const streamToElastic = pinoElastic({
    index: `log-test-%{DATE}`,
    type: 'info',
    consistency: 'one',
    node: 'http://myuser:mypass@localhost:9200',
    'trace-level': 'info',
    'es-version': 6,
    'bulk-size': 1,
    ecs: true
});

and then

const logger = pino(pinoOptions, streamToElastic);
logger.info('test');

The app runs fine, but no logs are printed to the console and none are sent to Elasticsearch, with no errors at all.
Am I missing something?

@mcollina
Member

You should use https://github.com/pinojs/pino-multi-stream to print to stdout as well.

As for the reason logs are not popping up in Elastic... I don't know. cc @delvedor
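To make the suggestion above concrete, here is a minimal sketch of wiring both stdout and the Elasticsearch stream together with pino-multi-stream's `multistream()` helper. The option values are placeholders taken from the original snippet, not a verified working config:

```javascript
// Sketch: log to stdout AND Elasticsearch at once via pino-multi-stream.
const pino = require('pino');
const { multistream } = require('pino-multi-stream');
const pinoElastic = require('pino-elasticsearch');

// Same kind of stream as in the original snippet (placeholder options).
const streamToElastic = pinoElastic({
  index: 'log-test-%{DATE}',
  node: 'http://localhost:9200'
});

// Every log line is written to each stream in the array.
const logger = pino({ level: 'info' }, multistream([
  { stream: process.stdout },
  { stream: streamToElastic }
]));

logger.info('test');
```

This keeps console output visible while debugging why nothing reaches Elasticsearch.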

@eran10
Author

eran10 commented Jan 20, 2020

Thanks, I will check.

@DavidPVaz

@eran10, have you had any developments regarding this issue?

@delvedor
Collaborator

I can share an update: we have worked on improving the ECS support, and if you are using Pino v6 you can now use @elastic/ecs-pino-format instead of enabling the ecs option here.
@mcollina we should probably deprecate it :)
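For reference, the replacement suggested above looks roughly like this. This is a sketch based on the package's documented usage around that time; check the README of your installed @elastic/ecs-pino-format version for the exact API:

```javascript
// Sketch: let @elastic/ecs-pino-format shape log records as ECS JSON,
// instead of passing `ecs: true` to pino-elasticsearch.
const pino = require('pino');
const ecsFormat = require('@elastic/ecs-pino-format');
const pinoElastic = require('pino-elasticsearch');

const streamToElastic = pinoElastic({
  index: 'log-test-%{DATE}',
  node: 'http://localhost:9200'   // placeholder address
});

// ecsFormat() returns pino options that emit ECS-compliant fields
// (@timestamp, log.level, ecs.version, ...).
const logger = pino(ecsFormat(), streamToElastic);
logger.info('test');
```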

@JoHuang

JoHuang commented Jun 8, 2020

I found that if I write more logs, it sends some of them, so I think it might be caused by flushBytes.

If I set flushBytes to 10, all logs appear.

This might be an issue where the last few logs cannot be flushed before the Node.js app terminates.

@delvedor
Collaborator

delvedor commented Jun 8, 2020

> I found that if I write more logs, it sends some of them, so I think it might be caused by flushBytes.
> If I set flushBytes to 10, all logs appear.

This is the correct behavior :) By default we collect 5 MB of logs before sending them, to avoid overloading Elasticsearch. You can easily change that limit with the --flush-bytes option.

> This might be an issue where the last few logs cannot be flushed before the Node.js app terminates.

It should not; as soon as the process ends, the bulk indexer does a final flush.

Anyhow, in the next version of the bulk indexer there will be a flush timeout option as well :)
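For testing, the flush threshold mentioned above can be lowered in the options passed to pino-elasticsearch. Note this is a sketch: the exact option name has varied across releases (`--flush-bytes` on the CLI; `flushBytes` or `'bulk-size'` programmatically, depending on version), so check the README of the release you have installed:

```javascript
// Sketch: lower the buffering threshold so small test runs get indexed
// promptly instead of waiting for the default 5 MB batch.
const pinoElastic = require('pino-elasticsearch');

const streamToElastic = pinoElastic({
  index: 'log-test-%{DATE}',
  node: 'http://localhost:9200',  // placeholder address
  flushBytes: 10                  // flush after ~10 buffered bytes (testing only)
});
```

Such a tiny threshold defeats the batching that protects Elasticsearch from overload, so it should never be used in production.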

@JoHuang

JoHuang commented Jun 8, 2020

> This might be an issue where the last few logs cannot be flushed before the Node.js app terminates.

> It should not; as soon as the process ends, the bulk indexer does a final flush.

I didn't see this behavior.
How do I trigger it?
Or could you point me to the code responsible for this behavior?
Thanks!

@delvedor
Collaborator

delvedor commented Jun 9, 2020

If you run the main process and pipe it into this transport, it works automatically, since when the stream from the main process ends, the transport ends as well.

node example.js | ./cli.js

If you pass this library directly to the pino options and then kill the process, there is no guarantee that all the logs will be sent, as the process will be destroyed.
As I was saying, the next version will support a flush interval.
We can also think about a force-flush method.
