stream: Readable iterator unhandled error when piping

  • Version: 12.4
  • Platform: Ubuntu 18.04

When consuming a piped stream with `for await` (the stream's async iterator), an error on the source stream is not handled correctly.

const fs = require('fs');
const { PassThrough } = require('stream');

async function print() {
	const read = fs.createReadStream('file'); // errors, e.g. because the file does not exist
	const iterator = read.pipe(new PassThrough());

	for await (const k of iterator) {
		console.log(k);
	}
}

print()
  .then(() => console.log('done')) // never called
  .catch(console.log); // never called

In the example above, the `.catch` never catches the stream's error and the script crashes.

I know that I should handle the error on each stream, but if I do:

read.on('error', console.log);

With that handler attached, `print` neither resolves nor rejects; the iteration simply hangs. The only solution I've found is to re-emit the error on the piped stream:

async function print() {
	const read = fs.createReadStream('file');
	const stream = new PassThrough();

	read.on('error', (err) => stream.emit('error', err));

	const iterator = read.pipe(stream);

	for await (const k of iterator) {
		console.log(k);
	}
}

With multiple pipes this gets very ugly, since every stream in the chain needs its error forwarded by hand. I don't know whether this is the intended behaviour, but it makes piped streams hard and awkward to use with async iterators.