Updating a Wiki page can result in a job payload exceeding the limit
When updating a wiki page, we build a payload for webhooks from the page object. The payload includes the entire content of the page, which can cause it to exceed the 5MB (compressed) limit we impose on Sidekiq jobs.
The hook data is built in `Gitlab::HookData::WikiPageBuilder`, which places the full page content in the `WikiPage` hash.
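A minimal sketch of the failure mode and one possible mitigation. The constant, hash shape, and helper below are illustrative only, not GitLab's actual implementation; the point is that a large, low-compressibility page body blows past a compressed payload limit, while truncating (or dropping) the content before enqueueing keeps the job under it:

```ruby
require 'json'
require 'zlib'
require 'securerandom'

# Hypothetical stand-in for the 5MB compressed limit on Sidekiq jobs.
COMPRESSED_LIMIT = 5 * 1024 * 1024

# Approximate the compressed size of a job payload.
def compressed_size(payload)
  Zlib::Deflate.deflate(JSON.generate(payload)).bytesize
end

# Simulate a wiki page with a large, poorly compressible body
# (random hex carries ~4 bits of entropy per character).
big_content = SecureRandom.hex(8 * 1024 * 1024) # 16MB string

hook_data = {
  object_kind: 'wiki_page',
  object_attributes: { title: 'Home', content: big_content }
}

# One possible mitigation: truncate the content before building the job.
truncated = hook_data.merge(
  object_attributes: hook_data[:object_attributes].merge(
    content: big_content[0, 1024]
  )
)

puts compressed_size(hook_data) > COMPRESSED_LIMIT   # full page exceeds the limit
puts compressed_size(truncated) <= COMPRESSED_LIMIT  # truncated payload fits
```

Other options with the same shape would be omitting the content field from the hook data entirely, or fetching it lazily inside the worker instead of serializing it into the job arguments.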
https://sentry.gitlab.net/gitlab/gitlabcom/issues/3372526/?referrer=gitlab_plugin
```
Gitlab::SidekiqMiddleware::SizeLimiter::ExceedLimitError: Integrations::ExecuteWorker job exceeds payload size limit
lib/gitlab/application_context.rb:103:in `block in use'
  Labkit::Context.with_context(to_lazy_hash) { yield }
lib/gitlab/application_context.rb:103:in `use'
  Labkit::Context.with_context(to_lazy_hash) { yield }
lib/gitlab/application_context.rb:48:in `with_context'
  application_context.use(&block)
app/models/integration.rb:580:in `async_execute'
  Integrations::ExecuteWorker.perform_async(id, data)
app/models/project.rb:1669:in `block (2 levels) in execute_integrations'
  integration.async_execute(data)
...
(18 additional frame(s) were not displayed)
```