
Also retry HTTP-based transient errors #1648


Closed
mitar wants to merge 1 commit into main from retry-http-transient-errors

Conversation

mitar
Contributor

@mitar mitar commented Oct 20, 2021

Currently the library can automatically retry on certain types of API errors, but sometimes an HTTP call fails at a lower level (broken connection, etc.). I think those failures should count as transient errors as well and be retried when retries are configured. This PR adds that.

Closes #1885
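The idea can be sketched outside the library like this. This is not the PR's actual diff; `http_request_with_retries`, its parameters, and the flaky callable are hypothetical names for illustration, and the built-in `ConnectionError` stands in for `requests.exceptions.ConnectionError` (which subclasses it) so the sketch has no third-party dependency:

```python
import time

def http_request_with_retries(make_request, max_retries=3, backoff_factor=0.5):
    """Call make_request(), retrying lower-level transient failures.

    API-level errors are already retried by the library; this adds the
    same treatment for connection-level failures, with exponential backoff.
    """
    for attempt in range(max_retries + 1):
        try:
            return make_request()
        except ConnectionError:
            if attempt == max_retries:
                raise  # retries exhausted; surface the transient error
            time.sleep(backoff_factor * (2 ** attempt))

# A flaky callable that breaks twice before succeeding.
calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("connection broken")
    return "ok"

result = http_request_with_retries(flaky, backoff_factor=0)
```

Here the third attempt succeeds, so `result` is `"ok"` after two retried failures.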

@codecov-commenter

codecov-commenter commented Oct 20, 2021

Codecov Report

Merging #1648 (9af09a2) into main (5a1678f) will decrease coverage by 0.10%.
The diff coverage is 30.00%.

@@            Coverage Diff             @@
##             main    #1648      +/-   ##
==========================================
- Coverage   91.90%   91.80%   -0.11%     
==========================================
  Files          74       74              
  Lines        4287     4331      +44     
==========================================
+ Hits         3940     3976      +36     
- Misses        347      355       +8     
Flag Coverage Δ
cli_func_v4 81.73% <30.00%> (-0.17%) ⬇️
py_func_v4 80.95% <30.00%> (+0.03%) ⬆️
unit 83.53% <30.00%> (-0.19%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
gitlab/client.py 88.25% <30.00%> (-1.55%) ⬇️
gitlab/v4/objects/labels.py 93.47% <0.00%> (ø)
gitlab/v4/objects/runners.py 98.18% <0.00%> (ø)
gitlab/exceptions.py 99.29% <0.00%> (+0.01%) ⬆️
gitlab/v4/objects/users.py 98.50% <0.00%> (+0.06%) ⬆️
gitlab/v4/objects/merge_requests.py 85.82% <0.00%> (+0.11%) ⬆️
gitlab/mixins.py 91.29% <0.00%> (+0.17%) ⬆️
gitlab/v4/objects/merge_request_approvals.py 93.58% <0.00%> (+0.53%) ⬆️

@mitar mitar force-pushed the retry-http-transient-errors branch from 2b68c53 to 29f178a on October 20, 2021 20:48
@mitar mitar force-pushed the retry-http-transient-errors branch from 29f178a to 9af09a2 on October 24, 2021 11:26
@nejch
Member

nejch commented Nov 2, 2021

Thanks for this @mitar. As this grows (I believe you also added the initial retry flag?), I'm wondering if we should switch to the https://urllib3.readthedocs.io/en/latest/reference/urllib3.util.html#urllib3.util.Retry utility for this. It might reduce our duplication and even clean up our custom 429 handling.

Would you like to give it a try here? If not we can do this in a follow-up, but it'd be great if we could at least have tests for this so that we can see it still behaves properly later. One example of similar tests is here, I think:
https://github.com/python-gitlab/python-gitlab/blob/master/tests/unit/test_gitlab_http_methods.py#L58-L123
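For reference, the urllib3 `Retry` approach mentioned above could look roughly like this. This is a sketch, not project code; the retry counts, backoff factor, and status list are illustrative choices, not values from the PR:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry both connection-level failures and selected status codes;
# respect_retry_after_header lets urllib3 honor Retry-After on 429,
# which could replace the library's custom 429 handling.
retry = Retry(
    total=3,
    backoff_factor=0.5,
    status_forcelist=[429, 500, 502, 503, 504],
    respect_retry_after_header=True,
)

session = requests.Session()
adapter = HTTPAdapter(max_retries=retry)
session.mount("https://", adapter)
session.mount("http://", adapter)
```

Mounting the adapter on a `Session` makes every request through that session inherit the retry policy, instead of wrapping each call site by hand.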

@github-actions

github-actions bot commented Feb 2, 2022

This Pull Request (PR) was marked stale because it has been open 90 days with no activity. Please remove the stale label or comment on this PR. Otherwise, it will be closed in 15 days.

@github-actions github-actions bot added the stale label Feb 2, 2022
@mitar
Contributor Author

mitar commented Feb 2, 2022

Unstale.

@nejch
Member

nejch commented Feb 4, 2022

@mitar did you still want to take a look at urllib3's retry class, or do you want to proceed this way?

@nejch
Member

nejch commented Feb 8, 2022

@mitar as we're getting more requests on this I'm happy to merge this as is and we can work on the rest later, could you just rebase/resolve conflicts please? 🙇

@Sineaggi

Sineaggi commented Feb 28, 2022

@mitar Would you mind if someone else took over this PR? Or will you have time to work on this soon?

@mitar
Contributor Author

mitar commented Feb 28, 2022

Hey, sorry, I got busy with other things. I can add anyone to the branch if you want to rebase/fix conflicts. I will try to do it as well, but I do not know when I will be able to circle back to this.

@mitar
Contributor Author

mitar commented Feb 28, 2022

(But sadly I definitely do not have time to add more tests or to port it to urllib3. I would suggest merging this and doing the follow-up work afterwards to take things further. Also, I have been running this code for some time now and it has been stable.)

@Sineaggi

Sineaggi commented Mar 2, 2022

I've rebased @mitar's branch here #1904

Added tests, and included the 52x HTTP response range in the retries.
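A predicate for what that covers might look like the following. The function name is hypothetical, and note that 52x already falls inside 5xx, so retrying the whole server-error range picks it up:

```python
def is_retryable(status: int) -> bool:
    # Retry rate limiting (429) and server errors (5xx), which
    # includes Cloudflare-style 52x responses such as 521/522.
    return status == 429 or 500 <= status < 600
```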

@nejch
Member

nejch commented Mar 10, 2022

Closing this in favor of #1904, as I see git commit authorship was preserved. Thanks for the work here @mitar.

@nejch nejch closed this Mar 10, 2022
Successfully merging this pull request may close these issues:

client.py doesn't retry a connection on a requests.exceptions.ConnectionError "RemoteDisconnected"
5 participants