
[WIP] feat: Add support for Google Gemini CLI as an AI Engine#16585

Closed
Copilot wants to merge 1 commit into main from
copilot/fix-9919-1036865607-6e95057e-09fb-44fc-bdd5-16b152107344

Conversation

Contributor

Copilot AI commented Feb 18, 2026

Thanks for assigning this issue to me. I'm starting to work on it and will keep this PR's description up to date as I form a plan and make progress.

Original prompt

This section details the original issue you should resolve

<issue_title>feat: Add support for Google Gemini CLI as an AI Engine</issue_title>
<issue_description>### Is your feature request related to a problem? Please describe.
Currently, the repository supports various AI engines, but it lacks support for Google Gemini.

Gemini (specifically 1.5 Pro and Flash) offers massive context windows and strong reasoning capabilities that are highly beneficial for agentic workflows, particularly when analyzing large codebases or performing complex reasoning tasks.

Describe the solution you'd like

I propose adding gemini-cli as a supported engine provider. The integration should leverage the CLI's headless mode, which is specifically designed for programmatic usage and automation.

Implementation Plan:

  1. Execution: Use the headless mode with JSON output to get structured responses suitable for parsing.

    gemini --prompt "Your prompt here" --output-format json
  2. Authentication: Support standard Gemini authentication methods. The simplest integration path for CI/CD and headless environments is the API Key:

    • Environment Variable: GEMINI_API_KEY
    • Alternative: Google Cloud Application Default Credentials (ADC) for Vertex AI users.
  3. Response Handling: The CLI returns a structured JSON object that the agent can parse. The wrapper needs to handle the following schema:

    {
      "response": "The AI's actual text response...",
      "stats": {
        "models": { ... },
        "tools": { ... }
      }
    }
  4. Streaming (Optional but recommended): The CLI also supports streaming JSON events via --output-format stream-json, which could be used to provide real-time feedback in the UI if supported by the architecture.
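The plan above could be sketched as a thin wrapper around the CLI. This is a minimal sketch, not a definitive implementation: the `gemini` binary, its `--prompt`/`--output-format` flags, the `GEMINI_API_KEY` variable, and the response schema are taken from the issue, while the function names (`run_gemini`, `parse_gemini_output`) and the returned dict shape are illustrative assumptions:

```python
import json
import os
import subprocess


def parse_gemini_output(raw: str) -> dict:
    """Parse the structured JSON the CLI prints in headless mode.

    Expected schema (from the issue):
    {"response": "...", "stats": {"models": {...}, "tools": {...}}}
    """
    payload = json.loads(raw)
    return {
        "text": payload.get("response", ""),
        "stats": payload.get("stats", {}),
    }


def run_gemini(prompt: str) -> dict:
    """Invoke the Gemini CLI in headless mode and parse its JSON output.

    Assumes GEMINI_API_KEY is set in the environment (Vertex AI users
    would rely on Application Default Credentials instead) and that the
    `gemini` binary is on PATH.
    """
    if not os.environ.get("GEMINI_API_KEY"):
        raise RuntimeError("GEMINI_API_KEY is not set (or configure ADC for Vertex AI)")
    result = subprocess.run(
        ["gemini", "--prompt", prompt, "--output-format", "json"],
        capture_output=True,
        text=True,
        check=True,  # surface a non-zero exit as an exception
    )
    return parse_gemini_output(result.stdout)
```

Keeping the JSON parsing separate from process invocation makes the schema handling testable without the CLI installed; a streaming variant would swap the flag for `--output-format stream-json` and parse events line by line instead of a single document.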

Describe alternatives you've considered

  • Direct API SDK: We could use the Google AI SDK directly (e.g., in Go or Node). However, using the gemini-cli wrapper aligns with the tool-based nature of this repository and provides a unified interface for both Standard Gemini and Vertex AI without extra configuration.

Additional context

Documentation References:

Comments on the Issue (you are @copilot in this section)

@pelikhan Does gemini support LLM gateways?
@pelikhan We need to be able to gateway out of the agent container back into the gemini endpoints.
@pelikhan /scout do research on Gemini-cli to gather information about implementing it as an agentic engine in AW.
  • headless execution
  • authentication
  • LLM gateway support
  • connection to Google Gemini endpoints directly from the gateway
@pelikhan We are doing cleanup to prepare for more engines.
@pelikhan Use port 1003 for Gemini.
@pelikhan Use subagent https://github.com/github/gh-aw/blob/main/.github/agents/custom-engine-implementation.agent.md to implement this.
@pelikhan Mark it as experimental.




Development

Successfully merging this pull request may close these issues.

feat: Add support for Google Gemini CLI as an AI Engine

2 participants