# Destroying a project can fail if there are a large number of notes

## Summary
Observed on gstg: https://gitlab.com/gitlab-com/gl-infra/infrastructure/issues/8596
A project with around 67K notes (all on merge requests, although I doubt that part matters) reliably fails to be destroyed from the web UI.
## Steps to reproduce
- Create a project with > 67,000 notes
- Run `ProjectDestroyWorker.new.perform(<project_id>, <authorised_user_id>)`
## What is the current bug behavior?
The worker failed to complete with the following error:

```
PG::QueryCanceled: ERROR: canceling statement due to statement timeout
CONTEXT: SQL statement "DELETE FROM ONLY "public"."notes" WHERE $1 OPERATOR(pg_catalog.=) "project_id""
```
## What is the expected correct behavior?
`ProjectDestroyWorker` should successfully remove a project with many notes attached.
## Output of checks
This bug happens on staging.gitlab.com (gstg)
## Possible fixes
Ideally, we'd be doing this at the database level with an `on_delete: :cascade` construct. We would still have to handle attachments, but since we know we're removing the whole project, that might not be such an issue.
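A database-level cascade could look roughly like the SQL below. This is a sketch only: the constraint name `fk_notes_project_id` is hypothetical, and the real schema may not currently have a foreign key on `notes.project_id` at all.

```sql
-- Drop the (hypothetically named) existing FK and recreate it with
-- ON DELETE CASCADE, so deleting a project row deletes its notes
-- inside the database instead of via application-level DELETEs.
ALTER TABLE notes
  DROP CONSTRAINT IF EXISTS fk_notes_project_id;

ALTER TABLE notes
  ADD CONSTRAINT fk_notes_project_id
  FOREIGN KEY (project_id) REFERENCES projects (id)
  ON DELETE CASCADE;
```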
Failing that, we could consider removing the notes in batches. Are there any other affected relations that might need similar treatment?
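The batched-removal idea can be sketched in plain Ruby. `FakeRelation` below is a hypothetical stand-in for the scoped ActiveRecord relation (in Rails this would be something like `project.notes.limit(n).delete_all` in a loop); only the loop shape matters here.

```ruby
BATCH_SIZE = 1_000

# Hypothetical stand-in for a scoped ActiveRecord relation; it models
# only "delete up to `limit` rows and report how many were deleted".
class FakeRelation
  def initialize(count)
    @count = count
  end

  # Mimics relation.limit(limit).delete_all
  def delete_batch(limit)
    deleted = [@count, limit].min
    @count -= deleted
    deleted
  end
end

# Delete in small batches so no single DELETE statement runs long
# enough to hit the statement timeout; returns total rows removed.
def destroy_notes_in_batches(relation, batch_size: BATCH_SIZE)
  total = 0
  loop do
    deleted = relation.delete_batch(batch_size)
    break if deleted.zero?
    total += deleted
  end
  total
end
```

Each iteration issues one short DELETE, so every statement stays well under the timeout even for a project with 67K notes.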
I guess this could equally be ~"Category:Project Management"