Add performance marks and measures (User Timing API) to MR Diffs for relevant moments
## Problem to solve

We are currently unable to track the timing of events beyond the generic ones (Example).
We'd like to be able to tell whether a change has affected the performance of any given part of the lifecycle of the MR Diffs app.
Per the discussion on gitlab-org/frontend/rfcs#51 (closed), it's clear we will go forward with this; we're now waiting for a final decision on the how, that is, the naming and how to write the marks consistently.
We should start by taking stock and identifying the relevant moments whose timings we'd like to track. A follow-up issue will then add all of the marks identified here to the app.
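As a rough illustration of what "adding a mark" would look like, here is a minimal sketch using the User Timing API. The `markDiffsEvent` helper and the `mr-diffs-` prefix and event names are placeholders, not the naming that rfcs#51 will decide on.

```javascript
// Hypothetical helper for recording MR Diffs lifecycle moments.
// The prefix and names are illustrative only.
function markDiffsEvent(name) {
  // Guard for environments without the User Timing API
  if (typeof performance !== 'undefined' && typeof performance.mark === 'function') {
    performance.mark(`mr-diffs-${name}`);
  }
}

// Example call sites inside the app lifecycle (names are placeholders):
markDiffsEvent('app-mounted');
markDiffsEvent('first-file-shown');
```

Centralizing the call in a helper would make it easy to enforce whatever naming convention is eventually agreed on.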
## User experience goal

Have a clear list of marks to be added to the MR Diffs app.
## Proposal

Contribute to the discussion on this issue to help compile a table of the meaningful marks and measurements we'd like timings for.
### Marks

| Keyword | Description |
|---|---|
### Measurements

| Title | Mark Start | Mark End | Description |
|---|---|---|---|
With measurements we get not only absolute timings, but also the deltas between any two marks.
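A sketch of how a measurement yields that delta (mark and measure names are placeholders pending the naming decision):

```javascript
// Two marks bracket the work being timed (names are illustrative).
performance.mark('mr-diffs-start');
// ... the work being timed happens here ...
performance.mark('mr-diffs-end');

// The measure records the delta between the two marks.
performance.measure('mr-diffs-total', 'mr-diffs-start', 'mr-diffs-end');
const [total] = performance.getEntriesByName('mr-diffs-total');
// total.duration is the elapsed time in milliseconds between the marks
```

Reading the entry back via `getEntriesByName` (rather than relying on the return value of `performance.measure`) keeps the sketch compatible with older browsers, where `measure` returns `undefined`.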
## Further details

For more details on the context of this effort, see gitlab-org/frontend/rfcs#51 (closed).
## Availability & Testing

We haven't yet assessed how we are going to tackle testing here. Contributions welcome.
## What does success look like, and how can we measure that?

By being able to tell whether an MR had a meaningful impact on the time between the file tree being rendered and the first diff file being shown.
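That specific success metric could be captured with a pair of marks and one measure. The names below are hypothetical; the real ones depend on the naming decision in rfcs#51.

```javascript
// Hypothetical marks for the two moments named in the success metric:
performance.mark('mr-diffs-tree-rendered');
// ... the first diff file renders ...
performance.mark('mr-diffs-first-file-shown');

// The measure gives the time between the two moments; comparing its
// duration before and after an MR would show whether the MR affected
// this part of the lifecycle.
performance.measure(
  'mr-diffs-tree-to-first-file',
  'mr-diffs-tree-rendered',
  'mr-diffs-first-file-shown',
);
const [treeToFirstFile] = performance.getEntriesByName('mr-diffs-tree-to-first-file');
// treeToFirstFile.duration is the delta in milliseconds
```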
## Is this a cross-stage feature?

No, this is focused only on the MR Diffs app.