
Conversation

@mcbodge (Contributor) commented on Jan 15, 2026

Summary

This PR adds support for the OpenCode CLI provider in the ModelSelector component, enabling users to access GitHub Copilot models and models from other providers authenticated through the OpenCode CLI.

Changes

apps/ui/src/components/views/board-view/shared/model-selector.tsx

  • Added OpenCode as a fourth provider option alongside Claude, Cursor CLI, and Codex CLI
  • Implemented OpenCode model fetching from the store on component mount (a rough sketch follows this list)
  • Added UI for displaying OpenCode models with proper loading, error, and empty states
  • Models dynamically fetched via OpenCode CLI are displayed with tier badges (Free, Balanced, Premium)
  • Added provider availability check for OpenCode CLI authentication status
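
For reviewers skimming without the diff, here is a minimal sketch of the fetch-on-mount shape described above. It is illustrative only: fetchOpenCodeModels and the OpenCodeModel type stand in for the project's actual store API, which this sketch does not reproduce.

```ts
// Illustrative sketch only. fetchOpenCodeModels and OpenCodeModel are hypothetical
// stand-ins for the store API used in model-selector.tsx.
import { useEffect, useState } from "react";

export interface OpenCodeModel {
  id: string; // e.g. "opencode/claude-sonnet-4"
  label: string;
}

// Hypothetical store action that asks the OpenCode CLI for its model list.
declare function fetchOpenCodeModels(): Promise<OpenCodeModel[]>;

export function useOpenCodeModels() {
  const [models, setModels] = useState<OpenCodeModel[]>([]);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState<string | null>(null);

  useEffect(() => {
    let cancelled = false;
    fetchOpenCodeModels()
      .then((result) => {
        // An empty result drives the "empty" UI state.
        if (!cancelled) setModels(result);
      })
      .catch((err: unknown) => {
        if (!cancelled) setError(err instanceof Error ? err.message : String(err));
      })
      .finally(() => {
        if (!cancelled) setLoading(false);
      });
    return () => {
      cancelled = true; // don't set state after unmount
    };
  }, []);

  return { models, loading, error };
}
```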

Features

  • Provider Selection: Users can now select OpenCode CLI as their AI provider
  • Dynamic Model Discovery: Models are automatically fetched from the OpenCode CLI, including GitHub Copilot models (claude-sonnet-4, gpt-5.2-codex, etc.)
  • Model Tiers: Models are badged by tier (Free, Balanced, Premium) for easy identification (see the sketch after this list)
  • Error Handling: Proper error states with retry functionality if model fetching fails
  • Loading States: Shows loading spinner while models are being fetched
  • Authentication Warning: Displays warning when OpenCode CLI is not installed or authenticated
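
As a rough illustration of the tier badging above, a helper along these lines maps CLI-reported tiers to the badges shown in the selector. The tier values and type names are assumptions, not the component's actual data model.

```ts
// Illustrative only: RawCliModel, ModelTier, and the badge table are assumed shapes,
// not the actual types used by the ModelSelector.
type ModelTier = "free" | "balanced" | "premium";

interface RawCliModel {
  id: string;
  name: string;
  tier: ModelTier;
}

interface ModelOption {
  id: string;
  label: string;
  badge: "Free" | "Balanced" | "Premium";
}

// One lookup table keeps the tier-to-badge mapping in a single place.
const TIER_BADGES: Record<ModelTier, ModelOption["badge"]> = {
  free: "Free",
  balanced: "Balanced",
  premium: "Premium",
};

// Turn the models reported by the CLI into the entries the selector renders.
export function transformModels(models: RawCliModel[]): ModelOption[] {
  return models.map((m) => ({ id: m.id, label: m.name, badge: TIER_BADGES[m.tier] }));
}

// transformModels([{ id: "opencode/claude-sonnet-4", name: "Claude Sonnet 4", tier: "premium" }])
// => [{ id: "opencode/claude-sonnet-4", label: "Claude Sonnet 4", badge: "Premium" }]
```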

Testing

  1. Ensure OpenCode CLI is installed (npx opencode-ai@latest --version); a programmatic check is sketched after these steps
  2. Authenticate with a provider (e.g., npx opencode-ai@latest auth login and select GitHub Copilot)
  3. Open the model selector in the feature creation dialog
  4. Verify OpenCode CLI appears as a provider option
  5. Select OpenCode CLI and verify models are loaded and displayed
  6. Verify models can be selected and used for feature generation
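
Steps 1 and 2 can also be probed programmatically, which is roughly what the provider availability check needs to do. The sketch below is an assumption about how such a probe could look in a Node context; it is not the code in this PR.

```ts
// Hypothetical availability probe; not the implementation used in this PR.
import { execFile } from "node:child_process";

// Resolve true if `npx opencode-ai@latest --version` runs and prints something.
export function isOpenCodeCliAvailable(): Promise<boolean> {
  return new Promise((resolve) => {
    execFile("npx", ["opencode-ai@latest", "--version"], (error, stdout) => {
      if (error) {
        resolve(false); // not installed, not on PATH, or exited non-zero
        return;
      }
      resolve(stdout.trim().length > 0);
    });
  });
}

// Usage: gate the OpenCode provider option (and the authentication warning) on this check.
// isOpenCodeCliAvailable().then((ok) => console.log(ok ? "OpenCode CLI found" : "OpenCode CLI missing"));
```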

Screenshots

image

Checklist

  • Code follows project style guidelines
  • Changes are consistent with existing CLI provider implementations
  • No breaking changes to existing functionality

@gemini-code-assist (Contributor)

Summary of Changes

Hello @mcbodge, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request expands the application's model selection capabilities by integrating the OpenCode CLI as a new provider. The integration lets users access a broader range of AI models, including GitHub Copilot models, directly through the ModelSelector component.

Highlights

  • OpenCode CLI Provider Integration: The ModelSelector component now supports OpenCode CLI as a new AI provider, allowing users to select models authenticated via OpenCode CLI, including GitHub Copilot models.
  • Dynamic Model Discovery and Display: OpenCode models are dynamically fetched from the OpenCode CLI on component mount and displayed within the UI, complete with loading, error, and empty states.
  • Model Tier Badging: Dynamically fetched OpenCode models are visually differentiated with 'Free', 'Balanced', and 'Premium' tier badges for easy identification.
  • Provider Availability and Error Handling: The UI includes checks for OpenCode CLI installation and authentication status, displaying warnings if unavailable and providing retry functionality for model fetching errors.
  • Default Model Selection: Logic has been added to automatically select a default OpenCode model when the OpenCode provider is chosen.
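
To make the last highlight concrete, the selection fallback presumably looks something like the sketch below. DEFAULT_OPENCODE_MODEL reuses the 'opencode/big-pickle' string called out in the review later in the thread; the Provider union and function name are assumptions, not the component's real identifiers.

```ts
// Sketch of the default-selection idea; names are assumed, the default string comes
// from the review comment later in this thread ("opencode/big-pickle").
const DEFAULT_OPENCODE_MODEL = "opencode/big-pickle";

type Provider = "claude" | "cursor-cli" | "codex-cli" | "opencode";

// When switching to OpenCode without an OpenCode model selected, fall back to the default.
export function resolveModelForProvider(provider: Provider, currentModel: string | null): string | null {
  if (provider === "opencode" && (!currentModel || !currentModel.startsWith("opencode/"))) {
    return DEFAULT_OPENCODE_MODEL;
  }
  return currentModel;
}

// resolveModelForProvider("opencode", null) => "opencode/big-pickle"
```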


Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

Feature | Command | Description
--- | --- | ---
Code Review | /gemini review | Performs a code review for the current pull request in its current state.
Pull Request Summary | /gemini summary | Provides a summary of the current pull request in its current state.
Comment | @gemini-code-assist | Responds in comments when explicitly tagged, both in pull request comments and review comments.
Help | /gemini help | Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check it and use code with caution.

@coderabbitai bot commented on Jan 15, 2026

Important

Review skipped

Auto reviews are disabled on base/target branches other than the default branch.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting reviews.review_status to false in the CodeRabbit configuration file.

Note

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.



Comment @coderabbitai help to get the list of available commands and usage tips.

@gemini-code-assist bot (Contributor) left a comment


Code Review

This pull request successfully adds support for the OpenCode CLI provider to the ModelSelector. The implementation is consistent with existing providers. My feedback primarily focuses on opportunities to refactor duplicated code. Several sections, including model fetching logic, UI for provider buttons, and the model list display, are nearly identical to those for other providers. Extracting this repeated logic into reusable hooks and components would significantly improve the maintainability and scalability of the ModelSelector component. I've also pointed out the use of a hardcoded string that should be replaced with a shared constant.

- Replace magic string 'opencode/big-pickle' with DEFAULT_OPENCODE_MODEL constant
- Extract useProviderModels custom hook to reduce duplicate useEffect logic
- Extract transformModels helper function with badge mapping objects
- Create reusable DynamicModelList component for OpenCode and Codex model lists
- Improve maintainability and prepare for easier addition of new providers
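
To illustrate the DynamicModelList suggestion above, a reusable list component might look roughly like this. Prop names and markup are hypothetical, intended only to show one way the shared loading, error, empty, and list states could be factored out of the per-provider sections.

```tsx
// Illustrative sketch of a shared DynamicModelList; props and markup are assumptions.
import React from "react";

interface ModelItem {
  id: string;
  label: string;
  badge: string; // e.g. "Free" | "Balanced" | "Premium"
}

interface DynamicModelListProps {
  models: ModelItem[];
  loading: boolean;
  error: string | null;
  selectedId: string | null;
  onSelect: (id: string) => void;
  onRetry: () => void;
}

export function DynamicModelList({
  models,
  loading,
  error,
  selectedId,
  onSelect,
  onRetry,
}: DynamicModelListProps) {
  if (loading) return <div>Loading models…</div>;

  if (error) {
    return (
      <div>
        <span>{error}</span>
        <button onClick={onRetry}>Retry</button>
      </div>
    );
  }

  if (models.length === 0) return <div>No models available.</div>;

  return (
    <ul>
      {models.map((m) => (
        <li key={m.id}>
          <button aria-pressed={m.id === selectedId} onClick={() => onSelect(m.id)}>
            {m.label} <span>{m.badge}</span>
          </button>
        </li>
      ))}
    </ul>
  );
}
```
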
@mcbodge marked this pull request as draft on January 16, 2026 at 16:01