
[Cache] Fix Memory leak #44002


Merged
merged 1 commit into from
Nov 12, 2021

Conversation

a1812
Contributor

@a1812 a1812 commented Nov 10, 2021

| Q | A |
| --- | --- |
| Branch? | 4.4 |
| Bug fix? | yes |
| New feature? | no |
| Deprecations? | no |
| Tickets | Fix #43918 |
| License | MIT |
| Doc PR | |

A rough solution, as an example of how to stop the leak.

@carsonbot

Hey!

To help keep things organized, we don't allow "Draft" pull requests. Could you please click the "ready for review" button or close this PR and open a new one when you are done?

Note that a pull request does not have to be "perfect" or "ready for merge" when you first open it. We just want it to be ready for a first review.

Cheers!

Carsonbot

@a1812 a1812 marked this pull request as ready for review November 10, 2021 20:55
@carsonbot carsonbot added this to the 5.3 milestone Nov 10, 2021
@nicolas-grekas
Member

@Jeroeny can you please try this patch and report back whether it fixes your issue?

@Jeroeny
Contributor

Jeroeny commented Nov 11, 2021

> @Jeroeny can you please try this patch and report back whether it fixes your issue?

It no longer seems to leak in the reproducer after applying these changes. 👍

@nicolas-grekas
Member

nicolas-grekas commented Nov 11, 2021

This property was added to save calls to validateKey() and to avoid re-hashing long keys.

Since #40317, it's possible to skip calling validateKey(), so I'm fine with removing the property for that part.
But for hashing, it would still be useful.

The property was added on the assumption that the number of distinct keys has an upper bound.
We could enforce that bound by limiting the size of the ids array instead, e.g. resetting it when it reaches 1000 items?
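The bounded-memoization idea proposed above can be sketched as follows. This is an illustrative Python sketch of the concept only: Symfony's actual adapter is written in PHP, and the names here (`KeyCache`, `get_id`, `MAX_IDS`) are hypothetical.

```python
class KeyCache:
    """Illustrative sketch of a memoized key -> id map with a hard size cap.

    Symfony's adapters keep a similar private map so that key validation
    and hashing run once per distinct key; the 1000-item cap is the bound
    proposed in this thread. All names here are hypothetical.
    """

    MAX_IDS = 1000  # proposed upper bound from the discussion

    def __init__(self, namespace=""):
        self.namespace = namespace
        self.ids = {}  # raw key -> computed id

    def get_id(self, key):
        if key in self.ids:
            return self.namespace + self.ids[key]
        # The expensive part that memoization avoids repeating:
        if not isinstance(key, str) or key == "":
            raise ValueError("cache key must be a non-empty string")
        digest = key  # a real adapter may hash overlong keys here
        if len(self.ids) >= self.MAX_IDS:
            self.ids.clear()  # reset instead of growing without bound
        self.ids[key] = digest
        return self.namespace + digest
```

Without the cap, a long-lived service that generates many distinct keys makes the map grow forever, which is the leak reported in #43918.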

@a1812
Contributor Author

a1812 commented Nov 11, 2021

> But for hashing, it would still be useful.
>
> The property was added on the assumption that the number of distinct keys has an upper bound. We could enforce that bound by limiting the size of the ids array instead, e.g. resetting it when it reaches 1000 items?

I agree with the idea of 1000 keys. What do you think, @Jeroeny?

@Jeroeny
Contributor

Jeroeny commented Nov 11, 2021

> But for hashing, it would still be useful.
>
> The property was added on the assumption that the number of distinct keys has an upper bound. We could enforce that bound by limiting the size of the ids array instead, e.g. resetting it when it reaches 1000 items?
>
> I agree with the idea of 1000 keys. What do you think, @Jeroeny?

Sounds good to me. Upon reaching the limit, do you want to purge the entire array, or slice it so that it stays capped at 1000 items (like the ArrayAdapter cache)?

@nicolas-grekas
Member

The logic in ArrayAdapter is too involved for this. I'd suggest halving the array instead:
array_splice($this->ids, 0, 500);
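For comparison, here is a sketch of what that halving does, in Python (illustrative only; a dict stands in for PHP's ordered array, and `trim_ids` is a hypothetical helper):

```python
def trim_ids(ids, limit=1000):
    """Drop the oldest half of a memoized id map once it reaches `limit`.

    Mirrors the suggested PHP one-liner array_splice($this->ids, 0, 500):
    Python dicts preserve insertion order, so the first entries are the
    oldest ones.
    """
    if len(ids) < limit:
        return ids
    return dict(list(ids.items())[limit // 2:])
```

Compared to clearing the whole map, halving keeps the most recently added keys warm while still bounding memory.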

@nicolas-grekas
Member

Thank you @a1812.

@nicolas-grekas nicolas-grekas merged commit a8b4e7f into symfony:4.4 Nov 12, 2021
@nicolas-grekas nicolas-grekas modified the milestones: 5.3, 4.4 Nov 12, 2021
@a1812 a1812 deleted the fix_bug_43918 branch November 12, 2021 10:21