Add post migration for populating Historical Statistics

What does this MR do?

Related to #217811 (closed)

This change adds a post-deployment migration to populate the Vulnerability Historical Statistics table with data from the last 90 days.

Projects

SELECT DISTINCT "vulnerabilities"."project_id"
FROM "vulnerabilities"
ORDER BY "vulnerabilities"."project_id" ASC;
 Unique  (cost=0.43..62711.49 rows=4941 width=8) (actual time=0.159..989.705 rows=6021 loops=1)
   Buffers: shared hit=38073
   ->  Index Only Scan using index_vulnerabilities_on_project_id on public.vulnerabilities  (cost=0.43..56352.26 rows=2543689 width=8) (actual time=0.157..668.205 rows=2579347 loops=1)
         Heap Fetches: 24184
         Buffers: shared hit=38073

Queries

Currently we have ~6082 projects that have at least one vulnerability. With a batch size of 50, we will have around 122 batches executed with a 2-minute delay between them, so the migration should take ~4h.
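As a quick sanity check, the batch count and overall runtime above follow directly from those figures (plain Ruby, numbers taken from this MR):

```ruby
# Back-of-the-envelope arithmetic for the migration schedule,
# using the figures quoted in this MR.
projects_with_vulnerabilities = 6082
batch_size                    = 50
delay_between_batches_sec     = 2 * 60 # 2-minute interval

batches   = (projects_with_vulnerabilities / batch_size.to_f).ceil
total_sec = batches * delay_between_batches_sec

puts batches                       # 122 batches
puts (total_sec / 3600.0).round(1) # ~4.1 hours
```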

Each batch needs around 30s just to perform the queries that read the statistics, and around 25s to write the statistics back to the database.

Query to fetch projects with vulnerabilities (runs 6083 / 50 ≈ 122 times; 122 × 13ms ≈ 1.6s in total)

SELECT DISTINCT "vulnerabilities"."project_id"
FROM "vulnerabilities"
ORDER BY "vulnerabilities"."project_id" ASC
LIMIT 50;

https://explain.depesz.com/s/DMOe (~13ms)
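The query above fetches only the first batch; iterating over all distinct project ids additionally needs a `project_id > last_id` predicate on each subsequent batch. A minimal plain-Ruby sketch of that keyset-style batching (the in-memory data and helper are illustrative, not the migration's actual code):

```ruby
# Stand-in for vulnerabilities.project_id values (duplicates included,
# as several vulnerabilities can belong to the same project).
rows = [1, 1, 2, 3, 3, 3, 5, 8, 8, 9]

# Mirrors SELECT DISTINCT ... ORDER BY ... LIMIT, plus the
# `project_id > last_id` keyset predicate needed to advance.
def next_batch(rows, after:, limit:)
  rows.uniq.sort.select { |id| id > after }.first(limit)
end

batches = []
last_id = 0
loop do
  batch = next_batch(rows, after: last_id, limit: 3)
  break if batch.empty?

  batches << batch
  last_id = batch.last # resume after the last project id seen
end

batches # => [[1, 2, 3], [5, 8, 9]]
```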

Query to fetch statistics (9 × 50ms for a single project => ~450ms; 50 × 450ms => ~23s for a single batch)

SELECT DATE(calendar.entry) AS day,
  severity,
  COUNT(*)
FROM generate_series(
    DATE '2020-07-20',
    DATE '2020-07-29',
    INTERVAL '1 day'
  ) as calendar(entry)
  INNER JOIN vulnerabilities ON vulnerabilities.created_at <= calendar.entry
WHERE "vulnerabilities"."project_id" = 278964
  AND (
    (
      vulnerabilities.dismissed_at IS NULL
      OR vulnerabilities.dismissed_at > calendar.entry
    )
    AND (
      vulnerabilities.resolved_at IS NULL
      OR vulnerabilities.resolved_at > calendar.entry
    )
  )
GROUP BY "day",
  "severity";

https://explain.dalibo.com/plan/FLW (~55ms)
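The statistics query counts, for each day in the window, the vulnerabilities that were created on or before that day and were neither dismissed nor resolved by it. The same logic restated as a plain-Ruby sketch (the sample data is made up for illustration):

```ruby
require 'date'

# A vulnerability is "open" on a given day when it was created on or
# before that day and not yet dismissed or resolved by it — the same
# condition the SQL expresses via generate_series and the
# dismissed_at/resolved_at predicates.
Vuln = Struct.new(:severity, :created_at, :dismissed_at, :resolved_at)

vulns = [
  Vuln.new(:high,   Date.new(2020, 7, 20), nil,                   nil),
  Vuln.new(:low,    Date.new(2020, 7, 21), Date.new(2020, 7, 23), nil),
  Vuln.new(:medium, Date.new(2020, 7, 22), nil,                   Date.new(2020, 7, 25))
]

window = Date.new(2020, 7, 20)..Date.new(2020, 7, 25)

# Group by (day, severity), like GROUP BY "day", "severity" above.
counts = window.each_with_object({}) do |day, acc|
  vulns.each do |v|
    next unless v.created_at <= day
    next if v.dismissed_at && v.dismissed_at <= day # dismissed by this day
    next if v.resolved_at && v.resolved_at <= day   # resolved by this day

    acc[[day, v.severity]] = acc.fetch([day, v.severity], 0) + 1
  end
end

p counts[[Date.new(2020, 7, 22), :medium]] # => 1
```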

Insert statistics (50 × 350ms per batch => ~18s)

INSERT INTO "vulnerability_historical_statistics" (
    "letter_grade",
    "created_at",
    "updated_at",
    "project_id",
    "total",
    "critical",
    "high",
    "medium",
    "low",
    "unknown",
    "info",
    "date"
  )
VALUES (
    1,
    '2020-07-30 07:23:26.813637',
    '2020-07-30 07:23:26.813637',
    10,
    1,
    0,
    0,
    0,
    1,
    0,
    0,
    '2020-05-01'
  ),
  ...,
  (
    1,
    '2020-07-30 07:23:26.813637',
    '2020-07-30 07:23:26.813637',
    10,
    91,
    0,
    0,
    0,
    91,
    0,
    0,
    '2020-07-30'
  ) ON CONFLICT DO NOTHING
RETURNING "id"

https://explain.depesz.com/s/Eael (~350ms)
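Each inserted row also carries a letter_grade derived from the severity counts (the sample rows above, containing only low findings, store grade 1). A hedged sketch of such a mapping — the exact enum values and thresholds are assumptions for illustration, not taken from this MR:

```ruby
# Hypothetical letter-grade mapping from severity counts to the integer
# enum stored in letter_grade. The mapping is an assumption for
# illustration and may differ from the actual implementation.
LETTER_GRADES = { a: 0, b: 1, c: 2, d: 3, f: 4 }.freeze

def letter_grade(counts)
  grade =
    if counts.fetch(:critical, 0) > 0                               then :f
    elsif counts.fetch(:high, 0) > 0 || counts.fetch(:unknown, 0) > 0 then :d
    elsif counts.fetch(:medium, 0) > 0                              then :c
    elsif counts.fetch(:low, 0) > 0                                 then :b
    else :a
    end

  LETTER_GRADES[grade]
end

letter_grade(low: 91)    # => 1, matching the sample row above
letter_grade(critical: 2) # => 4
```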

To summarize, in a single batch we are:

  • fetching statistics for the projects in the batch (50 × 450ms => ~23s)
  • inserting statistics into the vulnerability_historical_statistics table (50 × 350ms => ~18s)

A single batch should therefore take around 40 seconds, so we set an interval of 2 minutes between batches.
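Spacing the batches amounts to scheduling each one with a growing delay; the sketch below shows only that spacing, and the scheduler shape is a hypothetical placeholder, not GitLab's actual background-migration API:

```ruby
# Hypothetical schedule: each batch of 50 project ids runs 2 minutes
# after the previous one. The structure is illustrative only.
DELAY_SECONDS = 2 * 60

project_ids = (1..300).to_a # stand-in for projects with vulnerabilities

schedule = project_ids.each_slice(50).with_index.map do |batch, index|
  { delay: index * DELAY_SECONDS, project_ids: batch }
end

schedule.first[:delay] # => 0   (first batch runs immediately)
schedule.last[:delay]  # => 600 (6th batch runs 10 minutes in)
```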

Does this MR meet the acceptance criteria?

Conformity

Availability and Testing

Security

If this MR contains changes to processing or storing of credentials or tokens, authorization and authentication methods and other items described in the security review guidelines:

  • [-] Label as security and @ mention @gitlab-com/gl-security/appsec
  • [-] The MR includes necessary changes to maintain consistency between UI, API, email, or other methods
  • [-] Security reports checked/validated by a reviewer from the AppSec team
Edited by Yannis Roussos
