
feat(agents): code completions for custom models via agents

Igor Drozdov requested to merge id-agent-for-code-completion into main

What does this merge request do and why?

It's related to Prompt Migration to AI Gateway (gitlab-org&14259). In order to start migrating the prompts for code completions, we need to prepare the AI Gateway to execute code completions via agents.

In this MR:

  • When an empty prompt is sent to v2/code/completions and the prompt version is 2, an agent is created
  • The agent looks for the code_suggestions/completions/<model-name> definition, which is why this MR depends on feat(agents): change prompt lookup for agent re... (!1095 - merged)
  • The agent is called with prefix and suffix params, which are injected into the template; either the ainvoke or astream method is used, depending on the stream param (see the sketch after this list)
  • The agent call is wrapped into CodeCompletions class in order to preserve all other behavior that is currently executed when the endpoint is called (for example, instrumentation that tracks the number of calls per language).
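
Roughly, the agent dispatch described above can be pictured as the sketch below. This is a minimal illustration with stand-in objects: the FakeAgent class, the AGENTS lookup table, and the exact argument names are assumptions made for the example, not the actual AI Gateway code.

import asyncio
from dataclasses import dataclass


@dataclass
class FakeAgent:
    name: str

    async def ainvoke(self, inputs: dict) -> str:
        # The real agent renders prefix/suffix into its prompt template and
        # calls the model; here we only echo the inputs.
        return f"{self.name}: {inputs['prefix']} ... {inputs['suffix']}"

    async def astream(self, inputs: dict):
        # Streamed variant of the same call.
        yield await self.ainvoke(inputs)


AGENTS = {"code_suggestions/completions/codegemma": FakeAgent("codegemma")}


async def complete(prompt, prompt_version, model_name, prefix, suffix, stream):
    if prompt == "" and prompt_version == 2:
        # The agent definition is looked up by code_suggestions/completions/<model-name>
        agent = AGENTS[f"code_suggestions/completions/{model_name}"]
        inputs = {"prefix": prefix, "suffix": suffix}
        return agent.astream(inputs) if stream else await agent.ainvoke(inputs)
    # Any other request keeps the existing (non-agent) code path.
    raise NotImplementedError


print(asyncio.run(complete("", 2, "codegemma", "class Product", "end", stream=False)))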

Test

In http://localhost:5052/docs#/completions/completions_v2_code_completions_post, the following request can be sent when Ollama with the codegemma model is running locally (codegemma is an instruct model by default, but the point here is only to demonstrate that the prompt is sent correctly):

{
  "project_path": "string",
  "project_id": 0,
  "current_file": {
    "file_name": "string",
    "language_identifier": "string",
    "content_above_cursor": "class Product",
    "content_below_cursor": "end"
  },
  "model_provider": "litellm",
  "model_endpoint": "http://localhost:11434/v1",
  "model_api_key": "string",
  "model_name": "codegemma",
  "telemetry": [],
  "stream": false,
  "choices_count": 0,
  "context": [],
  "prompt": "",
  "prompt_version": 2
}
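
The same request can also be sent outside the Swagger UI, for example with a short Python script (a sketch; it assumes the local gateway does not require any extra auth headers):

import requests

# The payload from the JSON example above, sent directly to the endpoint
# that backs the Swagger page.
payload = {
    "project_path": "string",
    "project_id": 0,
    "current_file": {
        "file_name": "string",
        "language_identifier": "string",
        "content_above_cursor": "class Product",
        "content_below_cursor": "end",
    },
    "model_provider": "litellm",
    "model_endpoint": "http://localhost:11434/v1",
    "model_api_key": "string",
    "model_name": "codegemma",
    "telemetry": [],
    "stream": False,
    "choices_count": 0,
    "context": [],
    "prompt": "",
    "prompt_version": 2,
}

response = requests.post("http://localhost:5052/v2/code/completions", json=payload)
print(response.status_code, response.json())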