feat(ai): add OpenAI and custom API provider support #1424
@@ -256,17 +256,52 @@ gosec -exclude-generated ./...
 ```

 ### Auto fixing vulnerabilities

 gosec can suggest fixes based on AI recommendation. It will call an AI API to receive a suggestion for a security finding.

 You can enable this feature by providing the following command line arguments:
-- `ai-api-provider`: the name of the AI API provider, currently only `gemini` is supported.
-- `ai-api-key` or set the environment variable `GOSEC_AI_API_KEY`: the key to access the AI API.
-  For gemini, you can create an API key following [these instructions](https://ai.google.dev/gemini-api/docs/api-key).
-- `ai-endpoint`: the endpoint of the AI provider; this is an optional argument.
+- `ai-api-provider`: the name of the AI API provider. Supported providers:
+  - **Gemini**: `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-2.5-flash-lite`, `gemini-2.0-flash`, `gemini-2.0-flash-lite` (default)
+  - **Claude**: `claude-sonnet-4-0` (default), `claude-opus-4-0`, `claude-opus-4-1`, `claude-sonnet-3-7`
+  - **OpenAI**: `gpt-4o` (default), `gpt-4o-mini`
+  - **Custom OpenAI-compatible**: any custom model name (requires `ai-base-url`)
+- `ai-api-key` or set the environment variable `GOSEC_AI_API_KEY`: the key to access the AI API
+  - For Gemini, you can create an API key following [these instructions](https://ai.google.dev/gemini-api/docs/api-key)
+  - For Claude, get your API key from the [Anthropic Console](https://console.anthropic.com/)
+  - For OpenAI, get your API key from the [OpenAI Platform](https://platform.openai.com/api-keys)
+- `ai-base-url`: (optional) custom base URL for OpenAI-compatible APIs (e.g., Azure OpenAI, LocalAI, Ollama)
+- `ai-skip-ssl`: (optional) skip SSL certificate verification for AI API (useful for self-signed certificates)
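As a usage sketch, the flags above combine like this (the key, model names, and local URL are placeholders, not defaults; the `gosec` commands are echoed so the sketch runs without gosec installed — drop the `echo` to execute them for real):

```shell
# Illustrative only: flag usage as documented above.
export GOSEC_AI_API_KEY="sk-example-key"   # or pass -ai-api-key instead

# Hosted provider, selected by model name:
echo gosec -ai-api-provider gpt-4o ./...

# Self-hosted OpenAI-compatible server (e.g. Ollama) with a custom model:
echo gosec -ai-api-provider llama3 -ai-base-url http://localhost:11434/v1 ./...
```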
Suggested change:

-- `ai-skip-ssl`: (optional) skip SSL certificate verification for AI API (useful for self-signed certificates)
+- `ai-skip-ssl`: (optional) **Disables SSL/TLS certificate verification for AI API calls.**
+  > **Warning:** Disabling certificate verification makes connections vulnerable to man-in-the-middle attacks and can expose your `ai-api-key` and scan contents. This option should **never** be used in production or with sensitive data. For self-signed or internal endpoints, prefer configuring a custom CA bundle or trusting your internal CA using the appropriate system/environment settings, so connections remain authenticated and secure.
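As that warning suggests, the safer alternative to skipping verification is to trust the internal CA explicitly. A minimal Go sketch, assuming a self-signed endpoint; the `buildClient` helper and its CA-path parameter are hypothetical, not part of gosec:

```go
package main

import (
	"crypto/tls"
	"crypto/x509"
	"fmt"
	"net/http"
	"os"
)

// buildClient returns an HTTP client that trusts an extra internal CA
// on top of the system roots, instead of disabling verification.
func buildClient(caPath string) (*http.Client, error) {
	pool, err := x509.SystemCertPool()
	if err != nil {
		pool = x509.NewCertPool() // fall back to an empty pool
	}
	if caPath != "" {
		pem, err := os.ReadFile(caPath)
		if err != nil {
			return nil, err
		}
		if !pool.AppendCertsFromPEM(pem) {
			return nil, fmt.Errorf("no certificates found in %s", caPath)
		}
	}
	return &http.Client{
		Transport: &http.Transport{
			// Verification stays enabled; only the trust anchors change.
			TLSClientConfig: &tls.Config{RootCAs: pool},
		},
	}, nil
}

func main() {
	client, err := buildClient("") // no extra CA: system roots only
	fmt.Println(client != nil, err)
}
```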
autofix/ai.go
@@ -13,7 +13,8 @@ import (
 const (
 	AIProviderFlagHelp = `AI API provider to generate auto fixes to issues. Valid options are:
 	- gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite, gemini-2.0-flash, gemini-2.0-flash-lite (gemini, default);
-	- claude-sonnet-4-0 (claude, default), claude-opus-4-0, claude-opus-4-1, claude-sonnet-3-7`
+	- claude-sonnet-4-0 (claude, default), claude-opus-4-0, claude-opus-4-1, claude-sonnet-3-7;
+	- gpt-4o (openai, default), gpt-4o-mini`

 	AIPrompt = `Provide a brief explanation and a solution to fix this security issue
 	in Go programming language: %q.
@@ -27,21 +28,35 @@ type GenAIClient interface {
 }

 // GenerateSolution generates a solution for the given issues using the specified AI provider
-func GenerateSolution(model, aiAPIKey string, issues []*issue.Issue) (err error) {
+func GenerateSolution(model, aiAPIKey, baseURL string, skipSSL bool, issues []*issue.Issue) (err error) {
 	var client GenAIClient

 	switch {
 	case strings.HasPrefix(model, "claude"):
 		client, err = NewClaudeClient(model, aiAPIKey)
 	case strings.HasPrefix(model, "gemini"):
 		client, err = NewGeminiClient(model, aiAPIKey)
+	case strings.HasPrefix(model, "gpt"):
+		config := OpenAIConfig{
+			Model:   model,
+			APIKey:  aiAPIKey,
+			BaseURL: baseURL,
+			SkipSSL: skipSSL,
+		}
+		client, err = NewOpenAIClient(config)
+	default:
+		// Default to OpenAI-compatible API for custom models
+		config := OpenAIConfig{
+			Model:   model,
+			APIKey:  aiAPIKey,
+			BaseURL: baseURL,
+			SkipSSL: skipSSL,
+		}
+		client, err = NewOpenAIClient(config)
 	}

-	switch {
-	case err != nil:
+	if err != nil {
 		return fmt.Errorf("initializing AI client: %w", err)
-	case client == nil:
-		return fmt.Errorf("unsupported AI backend: %s", model)
 	}

 	return generateSolution(client, issues)
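The dispatch logic above can be sketched in isolation. `providerFor` here is a hypothetical helper, not a gosec function; it only mirrors the prefix routing, under which any unrecognized model name falls through to the OpenAI-compatible path so custom models (paired with `ai-base-url`) still work:

```go
package main

import (
	"fmt"
	"strings"
)

// providerFor mirrors the prefix-based routing in GenerateSolution:
// known prefixes pick a dedicated client, everything else is treated
// as an OpenAI-compatible custom model.
func providerFor(model string) string {
	switch {
	case strings.HasPrefix(model, "claude"):
		return "claude"
	case strings.HasPrefix(model, "gemini"):
		return "gemini"
	case strings.HasPrefix(model, "gpt"):
		return "openai"
	default:
		return "openai-compatible"
	}
}

func main() {
	fmt.Println(providerFor("gemini-2.0-flash-lite")) // gemini
	fmt.Println(providerFor("gpt-4o-mini"))           // openai
	fmt.Println(providerFor("llama3"))                // openai-compatible
}
```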
autofix/openai.go
| @@ -0,0 +1,120 @@ | ||||||||||||||||
package autofix

import (
	"context"
	"crypto/tls"
	"errors"
	"fmt"
	"net/http"

	"github.com/openai/openai-go/v3"
	"github.com/openai/openai-go/v3/option"
)

const (
	ModelGPT4o           = openai.ChatModelGPT4o
	ModelGPT4oMini       = openai.ChatModelGPT4oMini
	DefaultOpenAIBaseURL = "https://api.openai.com/v1"
**Copilot AI** commented on Dec 10, 2025:

Temperature default value of 0 is problematic. In the condition `if temperature == 0`, a user-provided value of 0.0 (which is a valid temperature for deterministic outputs) would be overridden with 0.7.

Consider using a pointer or a sentinel value to distinguish between "not set" and "explicitly set to 0":

```go
temperature := config.Temperature
if temperature == 0 {
	temperature = 0.7
}
```

Or better, document that 0 is not a valid input and will be replaced with the default, or use a negative sentinel value like -1 to indicate "use default".
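The pointer approach this comment suggests can be sketched like so; the `OpenAIConfig` here is a simplified stand-in for the PR's struct, with only the field relevant to the issue:

```go
package main

import "fmt"

// Simplified stand-in for the PR's config: a *float64 distinguishes
// "not set" (nil, use the default) from an explicit 0.0.
type OpenAIConfig struct {
	Temperature *float64
}

func effectiveTemperature(c OpenAIConfig) float64 {
	if c.Temperature == nil {
		return 0.7 // default applies only when the caller left it unset
	}
	return *c.Temperature // an explicit 0.0 is honored
}

func main() {
	zero := 0.0
	fmt.Println(effectiveTemperature(OpenAIConfig{}))                   // 0.7
	fmt.Println(effectiveTemperature(OpenAIConfig{Temperature: &zero})) // 0
}
```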
**Copilot AI** commented on Dec 10, 2025:

The `maxTokens` and `temperature` fields are set in the `openaiWrapper` struct but never used in the `GenerateSolution` method. The comment on lines 92-93 suggests they should be used, but the implementation doesn't include them in the API call parameters.

You should add these parameters to the request:

```go
params := openai.ChatCompletionNewParams{
	Model: o.model,
	Messages: []openai.ChatCompletionMessageParamUnion{
		openai.UserMessage(prompt),
	},
	MaxTokens:   openai.Int(o.maxTokens),
	Temperature: openai.Float(o.temperature),
}
```

This ensures consistency with the Claude client, which explicitly sets `MaxTokens: 1024`.
Suggested change:

-	}
-	// Set optional parameters if available
-	// Using WithMaxTokens and WithTemperature methods if they exist in v3
+	MaxTokens:   openai.Int(o.maxTokens),
+	Temperature: openai.Float(o.temperature),
+	}
Check failure on line 118 in autofix/openai.go (GitHub Actions / test (1.25.4, latest)): unnecessary conversion (unconvert)
Missing Claude models in the documentation. Line 266 lists only 4 Claude models, but the code in `autofix/ai.go` line 16 mentions 6 models: `claude-sonnet-4-0`, `claude-sonnet-4-5`, `claude-opus-4-0`, `claude-opus-4-1`, `claude-haiku-4-5`, and `claude-sonnet-3-7`.

Update line 266 to include all supported models: