Bitbucket importer should queue imports to avoid hitting rate limits
Summary
Some bulk imports to GitLab.com are failing because they are being rate limited.
Steps to reproduce
- Attempt to import many (hundreds of) projects from Bitbucket. The same likely applies to GitHub.
- Observe multiple projects fail.
- Check the logs to see why.
Example Project
See ticket (internal) https://gitlab.zendesk.com/agent/tickets/173040
What is the current bug behavior?
Imports fail because they hit our rate limiter, and they fail without telling the user why.
What is the expected correct behavior?
The system should queue the imports so they stay under the rate limit.
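A minimal sketch of what such queueing could look like, assuming a simple sliding-window throttle in front of the importer's API calls (the class name, limits, and API are illustrative, not the actual importer code):

```ruby
# Hypothetical throttle that spaces out import requests so the importer
# stays under a provider limit of `limit` requests per `interval` seconds.
class ImportThrottle
  def initialize(limit:, interval:)
    @limit = limit
    @interval = interval
    @timestamps = []
  end

  # Returns how long the caller should wait before issuing the next
  # request; 0 means it may proceed immediately.
  def delay_before_next
    now = Process.clock_gettime(Process::CLOCK_MONOTONIC)
    # Drop requests that have aged out of the sliding window.
    @timestamps.reject! { |t| now - t >= @interval }
    return 0 if @timestamps.size < @limit

    @interval - (now - @timestamps.first)
  end

  def record_request!
    @timestamps << Process.clock_gettime(Process::CLOCK_MONOTONIC)
  end
end
```

A worker could call `delay_before_next` before each API request and re-enqueue itself (rather than sleeping) when a positive delay is returned, so the import is effectively queued instead of failing.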
Relevant logs and/or screenshots
Kibana log for this particular case: https://log.gprd.gitlab.net/goto/7a7aad00dd778acfe1fcb25d0572dae3
Output of checks
GitLab.com, GitLab Enterprise Edition 13.4.0-pre 768c5860
Proposals
Start by propagating the error to the user, similar to #26466 (comment 316595595).
Then, implement something like what we did for the GitHub importer, or consider one of the following:
- Auto-retry? Retrying may not help if the user keeps hitting the limit.
- Would it be possible to pause the import when we detect this error and resume automatically at the given time? The message could explain that the Bitbucket API rate limit was reached and that the import will resume at that time.
- Or, alternatively, let the user set a limit/throttle so our importer slows down and stays under the Bitbucket limit.
- Or, preferably, try to auto-adjust our request rate using data from the Bitbucket HTTP responses, which would avoid triggering the error at all.
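For the pause/resume and auto-adjust proposals, a hedged sketch of how the importer might decide when to resume, assuming the standard HTTP 429 convention with a `Retry-After` header (actual Bitbucket Cloud responses may differ, and the helper name is hypothetical):

```ruby
require 'time'

# Hypothetical helper: given a response status and headers, return the
# number of seconds the importer should pause, or nil if no pause is
# needed. Handles both forms of Retry-After (seconds or an HTTP date).
def rate_limit_pause(status:, headers:)
  return nil unless status == 429

  retry_after = headers['Retry-After']
  return 60 if retry_after.nil? # no hint given; fall back to a conservative default

  if retry_after =~ /\A\d+\z/
    retry_after.to_i
  else
    # Retry-After as an HTTP date: pause until that moment.
    [Time.httpdate(retry_after) - Time.now, 0].max.ceil
  end
end
```

The importer could surface this value in the user-facing message ("rate limit reached, resuming in N seconds") and schedule the remaining work accordingly instead of failing the project outright.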
Edited by Haris Delalić