
Conversation

@refractalize

I've actually replaced the implementation with the one found in Node's stream documentation. Hope this doesn't hurt anybody's egos ;)

Importantly, however, this makes the stream emit an error (via stream.on('error', ...)) if it encounters a line in the input that isn't valid JSON (excluding whitespace-only lines, of course).

We fix this by waiting for a \n before attempting to parse JSON, and finally by waiting for the end event to parse any remaining text (in case the file doesn't end with \n). The end event corresponds to the _flush() method on the Transform stream.

The module supports async and non-async modes, as before. (Although this needs some documentation... I'm not really sure in what situations you'd need one or the other.)
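
For reference, here's a minimal sketch of that approach, not the actual patch; the class name and details are illustrative. It buffers chunks, splits on \n, parses complete lines, and lets _flush() handle any trailing text:

```js
var Transform = require('stream').Transform;
var StringDecoder = require('string_decoder').StringDecoder;
var util = require('util');

// Illustrative Transform stream: emits one parsed object per JSON line.
function LineJSONStream() {
  Transform.call(this, { objectMode: true });
  this._buffer = '';
  this._decoder = new StringDecoder('utf8');
}
util.inherits(LineJSONStream, Transform);

// Buffer incoming chunks; only parse once a full line (ending in \n) is available.
LineJSONStream.prototype._transform = function(chunk, encoding, cb) {
  this._buffer += this._decoder.write(chunk);
  var lines = this._buffer.split('\n');
  this._buffer = lines.pop(); // keep the trailing partial line for later
  for (var i = 0; i < lines.length; i++) {
    if (lines[i].trim() === '') continue; // skip whitespace-only lines
    try {
      this.push(JSON.parse(lines[i]));
    } catch (err) {
      return cb(err); // surfaces as stream.on('error', ...)
    }
  }
  cb();
};

// _flush() runs at the end of input: parse whatever is left over
// if the file didn't end with \n.
LineJSONStream.prototype._flush = function(cb) {
  var rest = this._buffer + this._decoder.end();
  if (rest.trim() === '') return cb();
  try {
    this.push(JSON.parse(rest));
  } catch (err) {
    return cb(err);
  }
  cb();
};
```

Usage would look something like process.stdin.pipe(new LineJSONStream()).on('data', ...).on('error', ...).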

}

JSONStream.prototype._transform = function(chunk, encoding, cb) {
  this._buffer += this._decoder.write(chunk);
@refractalize (Author)


@jcrugzz so you reckon this bit should do buffer concat, not string concat?
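
For comparison, a rough illustration of the two alternatives under discussion; the helper names are illustrative, not from the patch:

```js
var StringDecoder = require('string_decoder').StringDecoder;

// String concat: decode each Buffer chunk as it arrives and append to a string.
function stringConcat(chunks) {
  var decoder = new StringDecoder('utf8');
  var text = '';
  chunks.forEach(function (chunk) {
    text += decoder.write(chunk);
  });
  return text + decoder.end();
}

// Buffer concat: keep the raw Buffers and decode once at the end,
// avoiding a string concatenation per chunk.
function bufferConcat(chunks) {
  return Buffer.concat(chunks).toString('utf8');
}
```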
