
Add the Manage::Import total GMAU metric

Kassio Borges requested to merge kassio/manage-import-gmau-metrics into master

What does this MR do?

Add a total counter of unique users that used any GitLab import. Because each import type is handled very differently, several different queries are needed to compute this total. The details of each query are below.

These metrics were previously reported individually, but that usage is being deprecated: !51495 (merged).
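
For context, here is a minimal sketch of how the total could be assembled, assuming it builds on the existing distinct_count usage-data helper. The per-type method names are hypothetical and only exist to tie each term to one of the queries detailed below (see the sketch next to each query):

# Hypothetical sketch only, not the exact implementation in this MR.
# Each term is a batched distinct count of users over one import-related table.
# time_period is assumed to be a created_at range hash covering a 28-day window,
# matching the queries below. Summing per-table counts can count a user more than
# once if they used several import types; whether the MR de-duplicates across
# tables is not shown here.
def unique_users_all_imports(time_period)
  distinct_count_of_project_imports(time_period) +
    distinct_count_of_bulk_imports(time_period) +
    distinct_count_of_jira_issue_imports(time_period) +
    distinct_count_of_csv_issue_imports(time_period) +
    distinct_count_of_group_imports(time_period)
end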

unique_users_all_imports => project_imports

Query:

SELECT
    COUNT(DISTINCT "projects"."creator_id")
FROM
    "projects"
WHERE
    "projects"."created_at" BETWEEN '2020-12-26 14:23:16.821847'
    AND '2021-01-23 14:23:16.822019'
    AND "projects"."import_type" IS NOT NULL
    AND "projects"."creator_id" >= 0
    AND "projects"."creator_id" < 1

Plan:

 Aggregate  (cost=3.46..3.47 rows=1 width=8) (actual time=8.389..8.391 rows=1 loops=1)
   Buffers: shared hit=6 read=6
   I/O Timings: read=8.155
   ->  Index Only Scan using index_projects_on_creator_id_import_type_and_created_at_partial on public.projects  (cost=0.43..3.45 rows=1 width=4) (actual time=5.503..5.504 rows=0 loops=1)
         Index Cond: ((projects.creator_id >= 0) AND (projects.creator_id < 1) AND (projects.created_at >= '2020-12-26 14:23:16.821847+00'::timestamp with time zone) AND (projects.created_at <= '2021-01-23 14:23:16.822019+00'::timestamp with time zone))
         Heap Fetches: 0
         Buffers: shared read=3
         I/O Timings: read=5.460

Recommendations: Looks good

Statistics:

Time: 9.185 ms
  - planning: 0.747 ms
  - execution: 8.438 ms
    - I/O read: 8.155 ms
    - I/O write: N/A

Shared buffers:
  - hits: 6 (~48.00 KiB) from the buffer pool
  - reads: 6 (~48.00 KiB) from the OS file cache, including disk I/O
  - dirtied: 0
  - writes: 0

unique_users_all_imports => bulk_imports

Query:

SELECT
    COUNT(DISTINCT "bulk_imports"."user_id")
FROM
    "bulk_imports"
WHERE
    "bulk_imports"."created_at" BETWEEN '2020-12-26 14:23:16.821847'
    AND '2021-01-23 14:23:16.822019'
    AND "bulk_imports"."user_id" >= 1
    AND "bulk_imports"."user_id" < 2

Plan:

 Aggregate  (cost=3.17..3.18 rows=1 width=8) (actual time=0.014..0.015 rows=1 loops=1)
   Buffers: shared hit=1
   ->  Index Scan using index_bulk_imports_on_user_id on public.bulk_imports  (cost=0.14..3.17 rows=1 width=4) (actual time=0.005..0.005 rows=0 loops=1)
         Index Cond: ((bulk_imports.user_id >= 1) AND (bulk_imports.user_id < 2))
         Filter: ((bulk_imports.created_at >= '2020-12-26 14:23:16.821847+00'::timestamp with time zone) AND (bulk_imports.created_at <= '2021-01-23 14:23:16.822019+00'::timestamp with time zone))
         Rows Removed by Filter: 0
         Buffers: shared hit=1

Recommendations: Looks good

Statistics:

Time: 0.193 ms
  - planning: 0.149 ms
  - execution: 0.044 ms
    - I/O read: N/A
    - I/O write: N/A

Shared buffers:
  - hits: 1 (~8.00 KiB) from the buffer pool
  - reads: 0 from the OS file cache, including disk I/O
  - dirtied: 0
  - writes: 0

unique_users_all_imports => jira_issue_imports

Query:

SELECT
    COUNT(DISTINCT "jira_imports"."user_id")
FROM
    "jira_imports"
WHERE
    "jira_imports"."created_at" BETWEEN '2020-12-26 14:23:16.821847'
    AND '2021-01-23 14:23:16.822019'
    AND "jira_imports"."user_id" >= 0
    AND "jira_imports"."user_id" < 1

Plan:

 Aggregate  (cost=3.30..3.31 rows=1 width=8) (actual time=0.075..0.076 rows=1 loops=1)
   Buffers: shared hit=14
   ->  Index Scan using index_jira_imports_on_user_id on public.jira_imports  (cost=0.28..3.30 rows=1 width=8) (actual time=0.029..0.029 rows=0 loops=1)
         Index Cond: ((jira_imports.user_id >= 0) AND (jira_imports.user_id < 1))
         Filter: ((jira_imports.created_at >= '2020-12-26 14:23:16.821847+00'::timestamp with time zone) AND (jira_imports.created_at <= '2021-01-23 14:23:16.822019+00'::timestamp with time zone))
         Rows Removed by Filter: 0
         Buffers: shared hit=5

Recommendations: Looks good

Statistics:

Time: 0.414 ms
  - planning: 0.302 ms
  - execution: 0.112 ms
    - I/O read: N/A
    - I/O write: N/A

Shared buffers:
  - hits: 14 (~112.00 KiB) from the buffer pool
  - reads: 0 from the OS file cache, including disk I/O
  - dirtied: 0
  - writes: 0

unique_users_all_imports => csv_issue_imports

Query:

SELECT
    COUNT(DISTINCT "csv_issue_imports"."user_id")
FROM
    "csv_issue_imports"
WHERE
    "csv_issue_imports"."created_at" BETWEEN '2020-12-26 14:23:16.821847'
    AND '2021-01-23 14:23:16.822019'
    AND "csv_issue_imports"."user_id" >= 0
    AND "csv_issue_imports"."user_id" < 1

Plan:

 Aggregate  (cost=3.31..3.32 rows=1 width=8) (actual time=0.331..0.331 rows=1 loops=1)
   Buffers: shared hit=14
   ->  Index Scan using index_csv_issue_imports_on_user_id on public.csv_issue_imports  (cost=0.28..3.30 rows=1 width=8) (actual time=0.097..0.097 rows=0 loops=1)
         Index Cond: ((csv_issue_imports.user_id >= 0) AND (csv_issue_imports.user_id < 1))
         Filter: ((csv_issue_imports.created_at >= '2020-12-26 14:23:16.821847+00'::timestamp with time zone) AND (csv_issue_imports.created_at <= '2021-01-23 14:23:16.822019+00'::timestamp with time zone))
         Rows Removed by Filter: 0
         Buffers: shared hit=5

Recommendations: Looks good

Statistics:

Time: 0.610 ms
  - planning: 0.247 ms
  - execution: 0.363 ms
    - I/O read: N/A
    - I/O write: N/A

Shared buffers:
  - hits: 14 (~112.00 KiB) from the buffer pool
  - reads: 0 from the OS file cache, including disk I/O
  - dirtied: 0
  - writes: 0

unique_users_all_imports => group_imports

Query:

SELECT
    COUNT(DISTINCT "group_import_states"."user_id")
FROM
    "group_import_states"
WHERE
    "group_import_states"."created_at" BETWEEN '2020-12-26 14:23:16.821847'
    AND '2021-01-23 14:23:16.822019'
    AND "group_import_states"."user_id" >= 0
    AND "group_import_states"."user_id" < 1

Plan:

 Aggregate  (cost=3.30..3.31 rows=1 width=8) (actual time=2.366..2.367 rows=1 loops=1)
   Buffers: shared read=2
   I/O Timings: read=2.226
   ->  Index Scan using index_group_import_states_on_user_id on public.group_import_states  (cost=0.28..3.30 rows=1 width=8) (actual time=2.327..2.327 rows=0 loops=1)
         Index Cond: ((group_import_states.user_id >= 0) AND (group_import_states.user_id < 1))
         Filter: ((group_import_states.created_at >= '2020-12-26 14:23:16.821847+00'::timestamp with time zone) AND (group_import_states.created_at <= '2021-01-23 14:23:16.822019+00'::timestamp with time zone))
         Rows Removed by Filter: 0
         Buffers: shared read=2
         I/O Timings: read=2.226

Recommendations: Looks good

Statistics:

Time: 2.619 ms
  - planning: 0.197 ms
  - execution: 2.422 ms
    - I/O read: 2.226 ms
    - I/O write: N/A

Shared buffers:
  - hits: 0 from the buffer pool
  - reads: 2 (~16.00 KiB) from the OS file cache, including disk I/O
  - dirtied: 0
  - writes: 0

Screenshots (strongly suggested)

Does this MR meet the acceptance criteria?

Conformity

Availability and Testing

Security

If this MR contains changes to processing or storing of credentials or tokens, authorization and authentication methods and other items described in the security review guidelines:

  • Label as security and @ mention @gitlab-com/gl-security/appsec
  • The MR includes necessary changes to maintain consistency between UI, API, email, or other methods
  • Security reports checked/validated by a reviewer from the AppSec team