
Prevent app crash when postgres prepared query keys become out of sync with Rails #41356

Merged
merged 1 commit into rails:main from fix_postgres_prepared_statement on Feb 11, 2021

Conversation

MGPalmer

@MGPalmer MGPalmer commented Feb 6, 2021

Summary

Increment the @counter for prepared PostgreSQL statements before running the query, so that if the query gets interrupted without Rails' knowledge we don't end up with a crashed app stuck in a state of perpetual failure to prepare the next query.
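For illustration, here is a rough, hypothetical sketch of the idea (simplified names, not the actual adapter code or the exact diff):

```ruby
# Simplified sketch of the statement pool (illustrative class name, not the
# adapter's exact code). The point of the change: reserve a fresh key by
# incrementing the counter *before* the query runs, so an interrupted query
# can never cause the same key to be handed out twice.
class StatementPoolSketch
  def initialize
    @counter = 0
    @cache   = {} # sql => statement key that Rails believes is prepared
  end

  # Previously the key was built without mutating the counter ("a#{@counter + 1}")
  # and the counter was only bumped after a successful prepare; an exception in
  # between meant the next call reused the same key and PostgreSQL raised
  # `prepared statement "aN" already exists`.
  def next_key
    "a#{@counter += 1}" # increment eagerly, before the query is sent
  end

  def [](sql)
    @cache[sql]
  end

  def []=(sql, key)
    @cache[sql] = key
  end
end

pool = StatementPoolSketch.new
pool.next_key # => "a1"
pool.next_key # => "a2" -- a retry after an interruption gets a new key
```

The last two lines just show that every call now hands out a distinct key, even if the previous query never completed.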

Other information

Based on PR #41034 (thanks, wbharding); I only streamlined the change and added a test.
Relates to #25827. At my current project, we got hit pretty hard by the 'prepared statement "ax" already exists' error. While I understand the reluctance to "fix" this in Rails, and we've already taken measures to stop the error at the root, I feel that not letting connections get completely stuck in a perpetual error state is a good move in any case.

Any suggestions on improving the test are welcome! This is my first time digging into the Rails (test) code base.

@rafaelfranca
Member

Can you squash the last three commits into one? Also, can you please rebase?

@MGPalmer MGPalmer force-pushed the fix_postgres_prepared_statement branch from a6e240c to c727fe4 on February 9, 2021 09:57
@MGPalmer
Author

MGPalmer commented Feb 9, 2021

Can you squash the last three commits into one? Also, can you please rebase?

@rafaelfranca ah yes my bad, done.

…he query.

If the next query or the prepared statement itself gets interrupted, this prevents the database session from getting stuck perpetually retrying to recreate the same prepared statement.
@MGPalmer MGPalmer force-pushed the fix_postgres_prepared_statement branch from c727fe4 to 2416da9 on February 11, 2021 11:41
@rafaelfranca rafaelfranca merged commit 5b0fe37 into rails:main Feb 11, 2021
rafaelfranca added a commit that referenced this pull request Feb 11, 2021
Closes #41356
Closes #41034
@MGPalmer
Author

\o/

@dv
Contributor

dv commented Mar 17, 2021

@rafaelfranca can you explain why this PR does not interfere with your argument as stated here #25827 (comment)?

We don't want to claim that we expect or intend to function in such a hostile environment.

I feel really sad that my efforts in the linked PR were ignored for 5 years, even with lots of supporters, over what was in my opinion petty bikeshedding, yet this PR, which does essentially the same thing in a different way, got the go-ahead really quickly. This decision feels to me like it was politically motivated more than anything.

It certainly does not encourage me to contribute to Rails any longer.

@rafaelfranca
Member

can you explain why this PR does not interfere with your argument as stated here #25827 (comment)?

Yes. This PR had a different perspective on the problem and on the cases in which the system could be failing.

Before, the argument was about rack-timeout and Timeout.timeout, which I still think we should not be dealing with.

Now it was about @connection.get_last_result https://github.com/rails/rails/pull/41034/files#diff-2468c670eb10c24bd2823e42708489a336d6f21c6efc7e3c4a574166fa77bb22R753 raising an exception. It was no longer about a willingly introduced hostile environment, but about a system failure.
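To make that failure sequence concrete, a hypothetical illustration (assumed connection settings and an illustrative key name, not the adapter's real code path):

```ruby
require "pg"

conn = PG.connect(dbname: "app_db") # assumed connection settings
key  = "a1"                         # the key Rails intends to hand out next

begin
  conn.prepare(key, "SELECT $1::int AS id")
  # If an exception surfaces here (e.g. while reading the result of the next
  # query), the PREPARE has already reached the server, but Rails never gets
  # to record that the key was consumed.
  raise PG::ConnectionBad, "simulated interruption before Rails bumps its counter"
rescue PG::Error => e
  warn "interrupted: #{e.message}"
end

# Without the eager increment, the retry reuses the same name and the server
# answers: ERROR: prepared statement "a1" already exists
begin
  conn.prepare(key, "SELECT $1::int AS id")
rescue PG::Error => e
  warn e.message
end
```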

got the go-ahead really quickly

Compared with the timeframe of your PR, yes, it is true that it was quick, but I had been considering this change since #41034, although at first I was still skeptical about it.

This decision feels to me like it was politically motivated more than anything.

Before your PR, #17607 was opened and rejected. If I had merged your PR instead of #17607, would that have been a politically motivated decision?

New arguments are introduced to discussions, the code evolves, we evolve, our decisions evolve.

I'm sorry that you feel that way; it wasn't my intention to discourage you from contributing to the framework.

@jose-airspace

jose-airspace commented May 17, 2021

@rafaelfranca

I still don't understand the hostile environment argument. How is making a timeout not break an entire app "accounting for a hostile environment"? If you have code that prevents all timeouts, please share it with the Rails community. How is one exception raised in the context of a single request breaking an entire application acceptable? By your logic we should just have a rack timeout exit the entire app. Why do we even raise exceptions and roll back transactions if exceptions should never happen or be accounted for? None of this makes sense.

The fact that you can DDoS a Rails app by making requests hang and time out until the prepared statement issue happens seems like a security issue, IMO.

I think it's stubborn and short-sighted to skip over a simple and elegant solution from @dv.

rails locked as resolved and limited conversation to collaborators on May 17, 2021