readable stream does not seem to support readableLength property #2552

Open
markddrake opened this issue Feb 16, 2022 · 14 comments

@markddrake

The value of the readableLength property appears to be 'undefined'. Is this expected behavior? I would have expected 0 or an integer less than the highWaterMark.
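
For illustration, a minimal repro along these lines shows the problem (a sketch only; the connection options and query are placeholders, and the .stream() call follows the usage documented in the mysql README):

// Hypothetical minimal repro: the streamed query is expected to expose
// readableLength (0 before any rows are buffered), but it reads as undefined.
const mysql = require('mysql');

const connection = mysql.createConnection({ /* connection options */ });

const stream = connection.query('SELECT 1 AS x').stream();
console.log(stream.readableLength); // undefined, expected 0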

@dougwilson
Member

Hi @markddrake, very strange. The way streams work is typically managed by Node.js itself. If this is something that this module needs to implement, would you be willing to make a pull request with a fix? From the Node.js docs I would expect it to have at least some value, as this module will .push a row into the stream as it parses it from the query.

@markddrake
Author

I did a quick hunt (30 secs) for the .stream() implementation but didn't find it. I was really trying to see if / how destroy() / _destroy() was implemented. Some pointers on where to look would be appreciated.

@dougwilson
Member

No problem. Here is a direct link to the implementation:

Query.prototype.stream = function(options) {
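
For context, a wrapper like this typically builds an object-mode Readable around the query's row events, roughly along the following lines (a sketch only, not the module's exact code; Query and Readable are the names already in scope in that file, and 'result' / 'error' / 'end' are the documented query events):

// Rough sketch of a query-to-Readable adapter, not the actual implementation.
Query.prototype.stream = function stream(options) {
  var query    = this;
  var readable = new Readable(Object.assign({ objectMode: true }, options || {}));

  readable._read = function () {
    // resume the underlying connection so more rows get parsed
  };

  query.on('result', function (row) {
    if (!readable.push(row)) {
      // back-pressure: pause the underlying connection until _read fires again
    }
  });

  query.on('error', function (err) { readable.emit('error', err); });
  query.on('end',   function ()    { readable.push(null); });

  return readable;
};

Whether the resulting stream exposes readableLength depends on which Readable class is in scope in that file.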

@markddrake
Author

markddrake commented Feb 16, 2022

I wonder if this is a side effect of

var Readable        = require('readable-stream');

rather than (switching to modern practice)

import { Readable } from 'stream' 

Not sure which version of 'stream' the 'readable-stream' module is mimicking.

readable.readableLength was added in: v9.4.0

I also don't see an implementation of _destroy() (introduced in v8.0.0)

I was not aware of the issues around 'stream' that 'readable-stream' was meant to address. I wonder if it's as simple as switching the source of the 'Readable' class.
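
A quick way to see the difference (a sketch; it assumes Node >= 9.4.0, where core streams gained readableLength, and a readable-stream install on the 2.x line):

// Core 'stream' Readable exposes readableLength (0 while nothing is buffered)...
const core = new (require('stream').Readable)({ read() {} });
console.log(core.readableLength); // 0

// ...whereas a 2.x readable-stream Readable does not define the getter,
// so the same property reads as undefined.
const userland = new (require('readable-stream'))({ read() {} });
console.log(userland.readableLength); // undefined on the 2.x line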

@dougwilson
Member

dougwilson commented Feb 16, 2022

From my understanding it is just a copy in userland, allowing the newer stream implementations to be used in older Node.js versions (i.e. backported functionality). Whether changing it fixes the issue is the real question, though.

@markddrake
Author

More importantly, does changing it fix the issue without breaking a 's**tload' of other things :)

@markddrake
Author

markddrake commented Feb 16, 2022

A quick hack says yes, it does fix the issue with readableLength.

Given the following code:

this.yadamuLogger.trace([this.constructor.name,'PIPELINE ERROR',this.tableName,this.COPY_METRICS.SOURCE_DATABASE_VENDOR,this.COPY_METRICS.TARGET_DATABASE_VENDOR],`${
this.currentPipeline.map((s) => { return `${s.constructor.name}:[${s.readableLength},${s.writableLength}]` }).join(',')
}`)

I get the following output with the current implementation

[TRACE][MySQLWriter][PIPELINE ERROR][ColdRoomTemperatures_Archive][MySQL][MySQL]: Readable:[undefined,undefined],MySQLParser:[16,10],MySQLOutputManager:[16,16],MySQLWriter:[undefined,0]

and this output with the 'hack' in place:

[TRACE][MySQLWriter][PIPELINE ERROR][ColdRoomTemperatures_Archive][MySQL][MySQL]: Readable:[16,undefined],MySQLParser:[16,10],MySQLOutputManager:[16,16],MySQLWriter:[undefined,0]

So that does imply that readable-stream is not picking up the later version of stream.

@dougwilson
Member

Are the two outputs supposed to be different? They both look the same to me, showing undefined, right? Am I missing something?

@markddrake
Author

markddrake commented Feb 16, 2022

Sorry, that was a cut-and-paste error - updated. Basically this is a snapshot of the readableLength and writableLength values for each member of a pipeline when I deliberately break the pipeline by killing the database connection used by the MySQLWriter class.

@dougwilson
Member

Ok, gotcha. In parallel I was looking at readable-stream and it looks like readableLength was added in the 3.0 release.

@markddrake
Author

Does that version also add destroy()/_destroy()? I have a project (YADAMU) that does parallel copies of data between different databases. I need the ability to terminate all active copies if one of them fails. I do this by invoking destroy() on the reader of each active pipeline; it seems to work fine with most of the databases I support but doesn't seem to work with MySQL. If destroy() is not implemented, that would probably explain it (and indicate that I need to double-check my error handling :))
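
The pattern in question looks roughly like this (a hypothetical sketch; the pipeline bookkeeping and property names are illustrative, not YADAMU code):

// When one parallel copy fails, tear down the readable end of every other
// active pipeline so all the copies stop.
function abortSiblings(activePipelines, failedPipeline, cause) {
  for (const p of activePipelines) {
    if (p === failedPipeline) continue;
    // p.reader is the Readable at the head of the pipeline (illustrative name)
    p.reader.destroy(new Error(`Aborting: sibling copy failed (${cause.message})`));
  }
}

This only does anything useful if the Readable returned by .stream() actually implements destroy()/_destroy(), which is exactly the point in question.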

@dougwilson
Member

That I'm not sure about, but you're always welcome to contribute any fixes you think make sense.

@markddrake
Author

markddrake commented Feb 16, 2022

Looks like I currently have readable-stream 2.3.7 (presumably as a result of mysql's dependency). I just forced an install of readable-stream@latest and got 3.6.0 by the look of it. However, when I revert my change to query.js and re-run the test I'm back to undefined by the look of it:

[TRACE][MySQLWriter][PIPELINE ERROR][ColdRoomTemperatures_Archive][MySQL][MySQL]: Readable:[undefined,undefined],MySQLParser:[16,13],MySQLOutputManager:[16,16],MySQLWriter:[undefined,0]

Not sure if that's a valid test?

It also looks like the package has not been updated for 17 months. I think there have been a lot of changes to the core streams implementation in that time.
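
One thing worth checking (a suggestion, not something verified here): npm resolves nested dependencies per package, so installing readable-stream@3 at the top level does not necessarily replace the 2.x copy that mysql resolves from its own dependency tree. Listing the installed copies shows which one is actually in play:

npm ls readable-stream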

@markddrake
Author

@dougwilson Doug

Not sure what to propose as the fix here. Changing the import to 'stream' seems to resolve the issue. Explicitly upgrading the 'readable-stream' package to 3.6.7 does not appear to resolve it. The logic behind using 'readable-stream' rather than 'stream' directly seems to make sense, but the package does not appear to have been upgraded for over 18 months.

If I propose switching mysql to use the Readable supplied by 'stream' directly, that could cause issues for consumers who are running older versions of Node. Unfortunately, based on many years of experience, I suspect there are many more consumers sticking to older versions than consumers who track the latest one.

How difficult would it be to change the require to use 'stream' and run the regression tests to see what, if any, side effects that change would have?
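
For concreteness, the swap being discussed is roughly a one-line change (a sketch; the exact file isn't shown here, and require('stream').Readable is used because the userland package's default export is the Readable constructor):

// before: userland backport, currently resolving to the 2.x line
var Readable = require('readable-stream');

// after: Node core streams (readableLength since Node 9.4.0, _destroy() since 8.0.0)
var Readable = require('stream').Readable;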

Any advice much appreciated.
