A writable stream for bulk indexing records in Elasticsearch.

Records written to the stream must have the following format:
```js
{
  index: 'name-of-index',
  type: 'recordType',
  id: 'recordId',
  parent: 'parentRecordType', // optional
  body: {
    name: 'Foo Bar'
  }
}
```
The `highWaterMark` option set on the stream defines how many items will be buffered before a bulk indexing operation is performed. The stream will also write all buffered items when it is closed, before emitting the `finish` event.

It's also possible to pass the `flushTimeout` option to indicate that the items currently in the buffer should be flushed after the given number of milliseconds if the `highWaterMark` hasn't been reached.
A bunyan, winston or similar logger instance that has methods like `debug`, `error` and `info` may be passed as `options.logger` to the constructor.
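Any object exposing those methods will do; for example, a minimal console-backed logger (a sketch, not part of the library's API):

```js
// Minimal logger with the methods the stream expects; a bunyan or
// winston instance works the same way. Pass it as `options.logger`.
var logger = {
  debug: function(msg) { console.log('[debug]', msg); },
  info: function(msg) { console.log('[info]', msg); },
  error: function(msg) { console.error('[error]', msg); }
};

// e.g. new ElasticsearchBulkIndexStream(client, {logger: logger});
```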
```js
var ElasticsearchBulkIndexStream = require('elasticsearch-bulk-index-stream');

var stream = new ElasticsearchBulkIndexStream(elasticsearchClient, {
  highWaterMark: 256,
  flushTimeout: 500
});

someInputStream
  .pipe(stream)
  .on('error', function(error) {
    // Handle error
  })
  .on('finish', function() {
    // Clean up Elasticsearch client?
  });
```
See api.md.
Elasticsearch readable and writable streams. The main difference between the bulk writer in `elasticsearch-streams` and this library is that this library requires the `index` and `type` of the data being written to be present on each record, instead of being set in a callback when the records are written.
`elasticsearch-streams` also implements its own event named `close` to indicate that all the data has been written to Elasticsearch. This will break modules like `pump` that depend on the `finish` event.
MIT