Add Explain Vulnerability AI action to GraphQL API
## What does this MR do and why?
This change introduces an AI completion action for generating an explanation of a vulnerability using the OpenAI chat service.
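The generated explanation reaches subscribers as a JSON payload. A minimal sketch of consuming such a payload, with field names (`id`, `model_name`, `response_body`) taken from the console output in the validation steps below; the values themselves are illustrative:

```ruby
require "json"

# Example payload shaped like the aiCompletionResponse broadcast seen in the
# validation steps below; the id and explanation text are placeholders.
raw = <<~JSON
  {
    "id": "acc55795-b110-468e-a5b8-c2eafb2b9b91",
    "model_name": "Vulnerability",
    "response_body": "An explanation of the vulnerability..."
  }
JSON

payload = JSON.parse(raw)
payload["model_name"]    # => "Vulnerability"
payload["response_body"] # free-text explanation generated by the model
```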
## Screenshots or screen recordings

## How to set up and validate locally
If you do not already have an OpenAI API key, you will need to set one up:
- Request a key via https://forms.gle/LPLXtgVjuGL4JCYFA
- Follow the sign-up process in the resulting invite email
- Once signed in, go to 'View API keys' in the top right user menu
- Select 'Create Secret Key' and copy the generated key
- In a Rails console:

  ```ruby
  Gitlab::CurrentSettings.update(openai_api_key: "<your-key>")
  ```
Following that:
- Enable the feature flags (consult https://gitlab.com/gitlab-org/gitlab/-/issues/405161 for the latest list):

  ```ruby
  Feature.enable(:explain_vulnerability)
  Feature.enable(:openai_experimentation)
  Feature.enable(:ai_experimentation_api)
  ```
- Import a test project:
- Pick a specific SAST vulnerability with a location:

  ```ruby
  location: { file: "", start_line: 0 }
  ```
- Execute from a Rails console:

  ```ruby
  ::Llm::CompletionWorker.new.perform(<user_id>, <vulnerability_id>, "Vulnerability", :explain_vulnerability, {})
  ```

  On success, the completion is broadcast over ActionCable (output truncated):

  ```
  [ActionCable] Broadcasting to graphql-event::aiCompletionResponse:resourceId:Z2lkOi8vZ2l0bGFiL1Z1bG5lcmFiaWxpdHkvODY2NDY:userId:Z2lkOi8vZ2l0bGFiL1VzZXIvMQ: "{\"id\":\"acc55795-b110-468e-a5b8-c2eafb2b9b91\",\"model_name\":\"Vulnerability\",\"response_body\":\"As an AI language model, I do not have access to the actual code or context of the vulnerability mentioned in the README.md file. Therefore, I cannot provide a detailed explanation of how the vu...
  ```
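All three feature flags gate the action together. As an illustration only, here is a standalone stand-in for the flag API showing that every flag must be enabled before the action is available (the combined gating logic and the helper name are assumptions, not GitLab's actual `Feature` implementation):

```ruby
require "set"

# Minimal stand-in for GitLab's Feature flag API (illustrative only).
module Feature
  @enabled = Set.new
  def self.enable(name)   = @enabled.add(name)
  def self.enabled?(name) = @enabled.include?(name)
end

# Flag names from this MR; the all-must-be-enabled check is an assumption.
REQUIRED_FLAGS = %i[explain_vulnerability openai_experimentation ai_experimentation_api]

def explain_vulnerability_available?
  REQUIRED_FLAGS.all? { |flag| Feature.enabled?(flag) }
end

Feature.enable(:explain_vulnerability)
Feature.enable(:openai_experimentation)
explain_vulnerability_available?  # => false (one flag still missing)

Feature.enable(:ai_experimentation_api)
explain_vulnerability_available?  # => true
```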
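The `Broadcasting to graphql-event::...` line in the output above embeds Base64-encoded GlobalID strings for the resource and the user. A sketch of how that topic string can be reconstructed (`ai_completion_topic` is a hypothetical helper for illustration; GitLab derives the topic internally):

```ruby
require "base64"

# Build the subscription topic from GlobalID strings, matching the
# "graphql-event::aiCompletionResponse:..." line in the console output.
# (Illustrative helper, not GitLab's internal code.)
def ai_completion_topic(resource_gid, user_gid)
  encode = ->(gid) { Base64.urlsafe_encode64(gid, padding: false) }
  "graphql-event::aiCompletionResponse:" \
    "resourceId:#{encode.call(resource_gid)}:userId:#{encode.call(user_gid)}"
end

topic = ai_completion_topic("gid://gitlab/Vulnerability/86646", "gid://gitlab/User/1")
# => "graphql-event::aiCompletionResponse:resourceId:Z2lkOi8vZ2l0bGFiL1Z1bG5lcmFiaWxpdHkvODY2NDY:userId:Z2lkOi8vZ2l0bGFiL1VzZXIvMQ"
```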
## MR acceptance checklist
This checklist encourages us to confirm any changes have been analyzed to reduce risks in quality, performance, reliability, security, and maintainability.
- I have evaluated the MR acceptance checklist for this MR.
Edited by Malcolm Locke