27 Oct 2017, 19:08
Mike Yao (1 post)

There is an example in Chapter 6, Part II, about “Inserting Elasticsearch Documents in Bulk”. It uses a stream because the JSON file is large; the data is sent to Elasticsearch in chunks via the stream.

const fs = require('fs');
const request = require('request');
// options and file are defined earlier in the book's example
const req = request.post(options);
const stream = fs.createReadStream(file);
stream.pipe(req);

And here is a quote from the Elasticsearch documentation:

If using the HTTP API, make sure that the client does not send HTTP chunks, as this will slow things down.

So my understanding is that the example code should fail on a broken JSON chunk, but it works. I have gone through the API docs and tried to understand more, but I still cannot find the reason.
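To make the concern concrete, here is a minimal sketch (not from the book; bulk.json is just a placeholder file name) showing what I mean by a broken chunk: the read stream emits 'data' events at byte boundaries, not at the newline boundaries of the bulk JSON, so a single chunk can end in the middle of a JSON line.

const fs = require('fs');

// Read a hypothetical bulk file; chunk sizes follow the stream's
// highWaterMark (64 KB by default), not the JSON line structure.
const stream = fs.createReadStream('bulk.json');
stream.on('data', (chunk) => {
  console.log('chunk bytes:', chunk.length);
  // The last bytes of a chunk can fall mid-line, i.e. mid-JSON-object.
});
stream.on('end', () => console.log('done'));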

Thank you so much! - Mike Yao
