
Abstract prompts per provider

Alexandru Croitor requested to merge abstract_ai_prompts into master

What does this MR do and why?

This MR abstracts prompt building per AI provider.

Each tool needs to define a provider-specific prompt class that is responsible for building the prompt for the given provider, as sketched below.
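
For illustration, a minimal sketch of what such a provider-specific prompt class could look like. The module nesting, class names, and prompt wording here are assumptions for the example, not necessarily what this MR implements:

# Illustrative sketch only: names and prompt text are assumptions.
module Gitlab
  module Llm
    module Chain
      module Tools
        module SummarizeComments
          module Prompts
            class Anthropic
              # Builds the prompt in Anthropic's Human/Assistant format.
              def self.prompt(variables)
                { prompt: "\n\nHuman: Summarize these comments:\n#{variables[:notes_content]}\n\nAssistant:" }
              end
            end

            class VertexAi
              # Builds a plain instruction prompt for Vertex AI text models.
              def self.prompt(variables)
                { prompt: "Summarize these comments:\n#{variables[:notes_content]}" }
              end
            end
          end
        end
      end
    end
  end
end

A tool (or the zero-shot executor) could then pick the prompt class that matches the ai_request held in the context, for example via a hash keyed by the request class. Again, that mapping is only an assumption for illustration.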

Screenshots or screen recordings

To test this for Anthropic from the Rails console:

context = Gitlab::Llm::Chain::GitlabContext.new(
  container: Issue.find(612).namespace,
  resource: Issue.find(612),
  current_user: User.first,
  ai_request: ::Gitlab::Llm::Chain::Requests::Anthropic.new(User.first)
)

q = Gitlab::Llm::Chain::Agents::ZeroShot::Executor.new(
  tools: [::Gitlab::Llm::Chain::Tools::IssueIdentifier, ::Gitlab::Llm::Chain::Tools::SummarizeComments],
  user_input: "Summarize notes on issue ##{Issue.find(612).iid}?",
  context: context
).execute

To test this for Vertex AI from the Rails console:

context = Gitlab::Llm::Chain::GitlabContext.new(
  container: Issue.find(612).namespace,
  resource: Issue.find(612),
  current_user: User.first,
  ai_request: ::Gitlab::Llm::Chain::Requests::VertexAi.new(User.first)
)

q = Gitlab::Llm::Chain::Agents::ZeroShot::Executor.new(
  tools: [::Gitlab::Llm::Chain::Tools::IssueIdentifier, ::Gitlab::Llm::Chain::Tools::SummarizeComments],
  user_input: 'Summarize notes on issue #1?',
  context: context
).execute

NOTE: Make sure you have Anthropic and Vertex AI credentials set up.
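
For reference, credentials can typically be set from the Rails console. The setting names below are assumptions and may differ by GitLab version; check the AI features developer documentation for the exact attributes:

# Assumed setting names; verify against your GitLab version's documentation.
Gitlab::CurrentSettings.update(anthropic_api_key: '<your Anthropic API key>')
Gitlab::CurrentSettings.update(
  vertex_ai_project: '<your GCP project>',
  vertex_ai_credentials: File.read('<path to service account JSON>')
)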


How to set up and validate locally

Numbered steps to set up and validate the change are strongly suggested.

MR acceptance checklist

This checklist encourages us to confirm any changes have been analyzed to reduce risks in quality, performance, reliability, security, and maintainability.

