Migrate remote_development_agent_configs table
Issue: Switch from remote_development_agent_configs ta... (#480135 - closed)
What does this MR do and why?
Switches from the `remote_development_agent_configs` table to `workspaces_agent_configs`.
- Migrates from the `remote_development_agent_configs` table to `workspaces_agent_configs`
- Updates all API and code references, but leaves the existing table and GraphQL API functional, to support zero-downtime deploys with no ActiveRecord or client-side errors.
- See Switch from remote_development_agent_configs ta... (#480135 - closed) for all further details and acceptance criteria.
Note on table rename and zero-downtime deployments
This rename is high priority because Category:Workspaces has a large amount of current and upcoming work involving this table, and we want to get the rename out of the way first.
Therefore, we are not using the normal two-release rename process (involving a view), but instead are using an alternate process which can be done in a single release (by copying all data to new table, leaving the old table in place, and pointing all code to the new table).
We also know, based on Service Ping metrics and direct conversations with enterprise users, that this table has a very low volume of records (likely one or zero in most cases), so there are no concerns with doing the work in regular migrations.
This approach has already been pre-approved by @jon_jenkins in the database group, and by @ahegyi as a database maintainer (who will also be the database maintainer on this MR).
See the Technical Requirements section in the issue for more details on this approach.
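The copy step of this single-release approach can be sketched as follows. This is a hedged illustration, not the actual migration in this MR: it assumes the two tables have identical column order (so `SELECT *` lines up) and that `id` is the primary key; see the migrations themselves for the authoritative implementation.

```sql
-- Sketch only: copy all rows from the old table into the new one,
-- keeping primary keys so existing foreign keys remain valid.
-- ON CONFLICT makes the copy idempotent if it is re-run.
INSERT INTO workspaces_agent_configs
SELECT * FROM remote_development_agent_configs
ON CONFLICT (id) DO NOTHING;

-- Advance the new table's id sequence past the copied primary keys,
-- so future inserts do not collide with copied rows.
SELECT setval(
  pg_get_serial_sequence('workspaces_agent_configs', 'id'),
  COALESCE((SELECT MAX(id) FROM workspaces_agent_configs), 1)
);
```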
MR acceptance checklist
Please evaluate this MR against the MR acceptance checklist. It helps you analyze changes to reduce risks in quality, performance, reliability, security, and maintainability.
Screenshots or screen recordings
UI testing screenshot
Cluster agent dropdown still functional:
Agent config update testing screenshot
Update of the `workspaces_quota` value in the agent config file to `100` successfully written to the new table (both with the old table existing and with the old table deleted):
How to set up and validate locally
Delete the old table to prove it is not being used
You can check out MR Delete remote_development_agent_configs table (!164032 - merged) and run it to delete the old `remote_development_agent_configs` table, then run the following validations, in order to prove it is not used.
However, note that this will DELETE THE DATA IN THIS TABLE in your development database. If you do it, you can restore the data with:

```sql
INSERT INTO remote_development_agent_configs SELECT * FROM workspaces_agent_configs;
```

...or else create an insert statement with the proper data (you can have AI generate it for you based on a copy-paste of a `\d ...` and a `SELECT *` from the existing table).
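After restoring, a quick sanity check is to confirm both tables report the same row count (column-by-column comparison is overkill at this table's volume):

```sql
-- Both counts should match after the restore round-trips.
SELECT
  (SELECT COUNT(*) FROM remote_development_agent_configs) AS old_count,
  (SELECT COUNT(*) FROM workspaces_agent_configs)         AS new_count;
```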
Validate agent updates still work
- In local dev environment, ensure you have an agent set up and enabled for remote development: https://gitlab.com/gitlab-org/workspaces/gitlab-workspaces-docs/-/blob/main/doc/local-development-environment-setup.md
- Run `gdk psql`
- Query `workspaces_quota` in the new table: `select id, cluster_agent_id, workspaces_quota from workspaces_agent_configs where id = 12;` and note the value (should be the default of `-1`)
- Make an update to the agent config for that row's agent, e.g. set `workspaces_quota` to `100`: https://docs.gitlab.com/ee/user/workspace/gitlab_agent_configuration.html#workspaces_quota
- Run the query again and see that the value got updated to `100` (see screenshot above for an example).
Repeat the above both with and without the old table existing, to confirm that the presence of the old table (as will be the case for one release) does not have any impact.
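When repeating with the old table still present, you can additionally compare the two tables side by side. Assuming no code writes to the old table anymore (the premise of this MR), the old table should retain the stale value while the new table shows the update. The id `12` is just the example from the steps above:

```sql
-- The 'new' row should show the updated quota (100); the 'old' row,
-- if the old table still exists, should keep the pre-update value.
SELECT 'old' AS source, workspaces_quota
  FROM remote_development_agent_configs WHERE id = 12
UNION ALL
SELECT 'new' AS source, workspaces_quota
  FROM workspaces_agent_configs WHERE id = 12;
```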
Validate workspace creation and reconcile still work
- In local dev environment, ensure you have an agent set up and enabled for remote development: https://gitlab.com/gitlab-org/workspaces/gitlab-workspaces-docs/-/blob/main/doc/local-development-environment-setup.md
- Do a full-lifecycle test of creating, using, and terminating a workspace.
- You can use `scripts/remote_development/run-e2e-tests.sh` for this too; just make sure you don't have any hung workspaces spinning, or the QA test won't work.
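To check for hung workspaces before running the e2e script, something like the following query can help. The column names here (`actual_state`, `desired_state`) are my assumption about the `workspaces` table schema; adjust against `\d workspaces` if they differ:

```sql
-- Anything not yet Terminated may interfere with the QA test.
SELECT id, name, desired_state, actual_state
FROM workspaces
WHERE actual_state <> 'Terminated';
```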
Validate GraphQL API works
- Ensure new API endpoints work
- Ensure old API endpoints work
- Test both with old table existing and deleted
Example queries (set the `groupPath` variable, or `namespace` for the namespace queries, to `gitlab-org`):
```graphql
query getGroupClusterAgentsOLD($groupPath: ID!) {
  group(fullPath: $groupPath) {
    id
    clusterAgents(hasRemoteDevelopmentEnabled: true) {
      nodes {
        id
        name
        project {
          id
          nameWithNamespace
        }
        remoteDevelopmentAgentConfig {
          id
          createdAt
          dnsZone
          enabled
          gitlabWorkspacesProxyNamespace
          maxHoursBeforeTerminationLimit
          defaultMaxHoursBeforeTermination
          networkPolicyEnabled
          projectId
          updatedAt
          workspacesPerUserQuota
          workspacesQuota
        }
      }
    }
  }
}
```
```graphql
query getGroupClusterAgentsNEW($groupPath: ID!) {
  group(fullPath: $groupPath) {
    id
    clusterAgents(hasRemoteDevelopmentEnabled: true) {
      nodes {
        id
        name
        project {
          id
          nameWithNamespace
        }
        workspacesAgentConfig {
          id
          createdAt
          dnsZone
          enabled
          gitlabWorkspacesProxyNamespace
          maxHoursBeforeTerminationLimit
          defaultMaxHoursBeforeTermination
          networkPolicyEnabled
          projectId
          updatedAt
          workspacesPerUserQuota
          workspacesQuota
        }
      }
    }
  }
}
```
```graphql
query getRemoteDevelopmentClusterAgentsOLD($namespace: ID!) {
  namespace(fullPath: $namespace) {
    id
    remoteDevelopmentClusterAgents(filter: AVAILABLE) {
      nodes {
        id
        name
        project {
          id
          nameWithNamespace
        }
        remoteDevelopmentAgentConfig {
          id
          createdAt
          dnsZone
          enabled
          gitlabWorkspacesProxyNamespace
          maxHoursBeforeTerminationLimit
          defaultMaxHoursBeforeTermination
          networkPolicyEnabled
          projectId
          updatedAt
          workspacesPerUserQuota
          workspacesQuota
        }
      }
    }
  }
}
```
```graphql
query getRemoteDevelopmentClusterAgentsNEW($namespace: ID!) {
  namespace(fullPath: $namespace) {
    id
    remoteDevelopmentClusterAgents(filter: AVAILABLE) {
      nodes {
        id
        name
        project {
          id
          nameWithNamespace
        }
        workspacesAgentConfig {
          id
          createdAt
          dnsZone
          enabled
          gitlabWorkspacesProxyNamespace
          maxHoursBeforeTerminationLimit
          defaultMaxHoursBeforeTermination
          networkPolicyEnabled
          projectId
          updatedAt
          workspacesPerUserQuota
          workspacesQuota
        }
      }
    }
  }
}
```
Additional notes
Command to roll back all migrations in this MR (adjust the command to roll back only selected ones):

```shell
bin/spring rails db:migrate:down:main db:migrate:down:ci VERSION=20240825000005 && \
  bin/spring rails db:migrate:down:main db:migrate:down:ci VERSION=20240825000004 && \
  bin/spring rails db:migrate:down:main db:migrate:down:ci VERSION=20240825000003 && \
  bin/spring rails db:migrate:down:main db:migrate:down:ci VERSION=20240825000002 && \
  bin/spring rails db:migrate:down:main db:migrate:down:ci VERSION=20240825000001
```
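To confirm the rollback took effect, you can check `schema_migrations` for the versions listed above (run against each database if your GDK has separate `main` and `ci` databases). This is a generic Rails check, not something specific to this MR:

```sql
-- Should return zero rows after a full rollback of this MR's migrations.
SELECT version
FROM schema_migrations
WHERE version IN (
  '20240825000001', '20240825000002', '20240825000003',
  '20240825000004', '20240825000005'
);
```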