@SeoFood SeoFood commented Jan 15, 2026

Summary

  • Adds a new global setting that allows users to configure the language in which AI agents respond
  • Includes 12 predefined language templates (English, German, Spanish, French, Portuguese, Italian, Dutch, Polish, Russian, Japanese, Chinese, Korean)
  • Language instruction is prepended to all system prompts when enabled

Features

  • Language Dropdown: Select from predefined templates that load sensible default instructions
  • Editable Instruction: Customize the instruction to fit specific needs
  • Toggle: Enable/disable the language instruction globally
  • Reset Button: Restore default template for selected language

Technical Changes

  • New LanguageInstruction interface in @automaker/types
  • Language templates in @automaker/prompts
  • prependLanguageInstruction() utility in merge.ts
  • Settings helper integration for automatic prompt injection
  • New UI section in Settings → Response Language
  • Store and settings sync integration
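The pieces above can be sketched as follows. The field names on LanguageInstruction and the exact merge behavior are illustrative assumptions; the PR only confirms that the instruction is prepended to system prompts when enabled:

```typescript
// Hypothetical shape of the setting; field names are assumed for illustration.
interface LanguageInstruction {
  enabled: boolean;     // the global toggle in Settings → Response Language
  language: string;     // e.g. "German"
  instruction: string;  // the editable instruction text
}

// Conditionally prepend the instruction to a system prompt. When the setting
// is missing, disabled, or blank, the prompt is returned unchanged.
function prependLanguageInstruction(
  prompt: string,
  languageInstruction?: LanguageInstruction
): string {
  if (!languageInstruction?.enabled || !languageInstruction.instruction.trim()) {
    return prompt;
  }
  return `${languageInstruction.instruction}\n\n${prompt}`;
}
```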

Test plan

  • Start dev server with npm run dev:web
  • Navigate to Settings → Response Language
  • Select a language (e.g., German)
  • Enable the toggle
  • Create a feature and verify agent responds in German
  • Open chat and verify responses are in German
  • Modify the instruction and verify changes take effect
  • Reset to default and verify template is restored

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features

    • "Response Language" settings: choose from 12 languages, enable/disable per-language instructions, edit custom instructions, and reset to defaults.
    • Prompts now prepend the configured language instruction so AI outputs follow your selected language/style.
  • UI

    • New Settings section, navigation item, and controls for Response Language.
  • Chores

    • Settings persist, migrate, and sync across sessions.


Add a new global setting that allows users to configure the language
in which AI agents respond. Features include:

- Language dropdown with 12 predefined templates (English, German,
  Spanish, French, Portuguese, Italian, Dutch, Polish, Russian,
  Japanese, Chinese, Korean)
- Editable instruction text that gets prepended to all system prompts
- Toggle to enable/disable the language instruction
- Reset button to restore default template for selected language

The language instruction is automatically applied to:
- Agent chat responses
- Backlog planning prompts
- Enhancement prompts (improve, technical, simplify, etc.)

Technical changes:
- New LanguageInstruction interface in @automaker/types
- Language templates in @automaker/prompts
- prependLanguageInstruction() utility in merge.ts
- Settings helper integration for prompt injection
- New UI section in Settings -> Response Language
- Store and settings sync integration

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

coderabbitai bot commented Jan 15, 2026

📝 Walkthrough


Adds a LanguageInstruction type, language templates, UI for editing and persisting a global language instruction, a prepend utility in the prompts library, and server integrations to apply the stored language instruction to system prompts before model calls.

Changes

Cohort / File(s) Summary
Types
`libs/types/src/settings.ts`, `libs/types/src/index.ts`
Add LanguageInstruction interface and optional languageInstruction on GlobalSettings; export the new type.
Prompts library — templates & API
`libs/prompts/src/language-templates.ts`, `libs/prompts/src/index.ts`
Add language templates and helpers (LANGUAGE_TEMPLATES, getLanguageTemplate, getDefaultLanguageInstruction, getAvailableLanguages, LanguageTemplate) and re-export them.
Prompts library — merge
`libs/prompts/src/merge.ts`, `libs/prompts/src/index.ts`
Add and export prependLanguageInstruction(prompt, languageInstruction) to conditionally prepend enabled instructions.
Server settings helpers
`apps/server/src/lib/settings-helpers.ts`
Add getLanguageInstruction() and apply prependLanguageInstruction when building merged prompts; getPromptCustomization() now includes languageInstruction in its return.
Server routes — prompt assembly
`apps/server/src/routes/...` (e.g. generate-features-from-spec.ts, generate-spec.ts, describe-file.ts, describe-image.ts, features/routes/generate-title.ts, suggestions/generate-suggestions.ts, github/routes/validate-issue.ts, worktree/routes/generate-commit-message.ts)
Load languageInstruction via settingsService and use prependLanguageInstruction to build final prompts; some route constructors accept an optional settingsService.
Server route wiring
`apps/server/src/index.ts`, `apps/server/src/routes/features/index.ts`
createFeaturesRoutes and related route wiring updated to accept an optional settingsService parameter and pass it to handlers.
UI — store & migration
`apps/ui/src/store/app-store.ts`, `apps/ui/src/hooks/use-settings-migration.ts`
Add languageInstruction to app state, implement setLanguageInstruction action that persists and syncs, and propagate the field in migration/hydration/update flows.
UI — settings view & navigation
`apps/ui/src/components/views/settings-view.tsx`, `apps/ui/src/components/views/settings-view/config/navigation.ts`, `apps/ui/src/components/views/settings-view/hooks/use-settings-view.ts`
Add 'language' settings view id and navigation item ("Response Language"); expose languageInstruction and setLanguageInstruction to settings view.
UI — language section component
`apps/ui/src/components/views/settings-view/language/language-section.tsx`
New LanguageSection component with language selector, enable toggle, editable instruction textarea, reset-to-default, and informational banner.
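The templates API listed in the prompts-library cohort could look roughly like the sketch below. The record keys, LanguageTemplate fields, default wording, and fallback behavior are all illustrative assumptions, since the actual template contents aren't shown in this PR:

```typescript
// Illustrative sketch of the templates API from libs/prompts; wording and
// key format are placeholders, not the real values.
interface LanguageTemplate {
  language: string;
  defaultInstruction: string;
}

const LANGUAGE_TEMPLATES: Record<string, LanguageTemplate> = {
  en: { language: "English", defaultInstruction: "Respond in English." },
  de: { language: "German", defaultInstruction: "Antworte auf Deutsch." },
  // ...the remaining 10 templates (Spanish, French, Portuguese, Italian,
  // Dutch, Polish, Russian, Japanese, Chinese, Korean) would follow.
};

function getAvailableLanguages(): string[] {
  return Object.keys(LANGUAGE_TEMPLATES);
}

function getLanguageTemplate(code: string): LanguageTemplate | undefined {
  return LANGUAGE_TEMPLATES[code];
}

function getDefaultLanguageInstruction(code: string): string {
  // Assumed fallback: unknown codes resolve to the English default.
  return (getLanguageTemplate(code) ?? LANGUAGE_TEMPLATES.en).defaultInstruction;
}
```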

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant UI as Client UI
  participant Store as App Store
  participant Server as API Server
  participant Settings as SettingsService
  participant Prompts as Prompts Library
  participant Model as LLM

  UI->>Store: setLanguageInstruction(...)
  Store->>Server: syncSettingsToServer()
  Server->>Settings: read GlobalSettings
  Settings-->>Server: GlobalSettings (includes languageInstruction)
  Server->>Prompts: prependLanguageInstruction(basePrompt, languageInstruction)
  Prompts-->>Server: finalPrompt
  Server->>Model: send(finalPrompt)
  Model-->>Server: response
  Server-->>UI: result
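The server-side leg of the diagram (read settings, prepend, send) amounts to something like this sketch. The SettingsService interface is stubbed to a minimal shape, so treat all names other than prependLanguageInstruction as assumptions:

```typescript
// Minimal sketch of reading GlobalSettings and building the final prompt
// before the model call. The service interface here is a stub.
interface LanguageInstruction { enabled: boolean; language: string; instruction: string; }

interface SettingsServiceLike {
  getGlobalSettings(): Promise<{ languageInstruction?: LanguageInstruction }>;
}

function prependLanguageInstruction(prompt: string, li?: LanguageInstruction): string {
  return li?.enabled ? `${li.instruction}\n\n${prompt}` : prompt;
}

async function buildFinalPrompt(
  basePrompt: string,
  settings: SettingsServiceLike
): Promise<string> {
  const { languageInstruction } = await settings.getGlobalSettings();
  // The returned value is what would be sent to the model.
  return prependLanguageInstruction(basePrompt, languageInstruction);
}
```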

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Suggested labels

Enhancement

Poem

🐰 I tuck a phrase before the prompt’s start,
A gentle language nudge from rabbit-heart.
Templates bloom and server paths agree,
The model speaks in tones we choose to be. 🥕

🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 71.43%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.
✅ Passed checks (2 passed)
  • Description Check: ✅ Passed (check skipped; CodeRabbit's high-level summary is enabled).
  • Title Check: ✅ Passed. The title 'feat: add global language instruction setting for AI responses' accurately describes the main feature being added and directly corresponds to the core changes in this changeset.




📜 Recent review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between dbe139a and 8856550.

📒 Files selected for processing (3)
  • apps/server/src/lib/settings-helpers.ts
  • apps/server/src/routes/worktree/routes/generate-commit-message.ts
  • apps/server/src/services/ideation-service.ts
🧰 Additional context used
📓 Path-based instructions (3)
**/*.{ts,tsx,js,jsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Always import from shared packages (@automaker/*), never from old relative paths

Files:

  • apps/server/src/routes/worktree/routes/generate-commit-message.ts
  • apps/server/src/services/ideation-service.ts
  • apps/server/src/lib/settings-helpers.ts
apps/server/src/**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use createEventEmitter() from lib/events.ts for all server operations to emit events that stream to frontend via WebSocket

Files:

  • apps/server/src/routes/worktree/routes/generate-commit-message.ts
  • apps/server/src/services/ideation-service.ts
  • apps/server/src/lib/settings-helpers.ts
**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use resolveModelString() from @automaker/model-resolver to convert model aliases (haiku, sonnet, opus) to full model names

Files:

  • apps/server/src/routes/worktree/routes/generate-commit-message.ts
  • apps/server/src/services/ideation-service.ts
  • apps/server/src/lib/settings-helpers.ts
🧠 Learnings (1)
📚 Learning: 2025-12-28T05:07:48.147Z
Learnt from: CR
Repo: AutoMaker-Org/automaker PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-28T05:07:48.147Z
Learning: Store project-specific rules in `.automaker/context/` and load them into agent prompts via `loadContextFiles()` from @automaker/utils

Applied to files:

  • apps/server/src/services/ideation-service.ts
  • apps/server/src/lib/settings-helpers.ts
🧬 Code graph analysis (2)
apps/server/src/services/ideation-service.ts (1)
apps/server/src/lib/settings-helpers.ts (1)
  • getLanguageInstruction (323-337)
apps/server/src/lib/settings-helpers.ts (4)
libs/types/src/settings.ts (1)
  • LanguageInstruction (288-295)
libs/types/src/prompts.ts (1)
  • PromptCustomization (113-128)
libs/prompts/src/index.ts (5)
  • mergeAutoModePrompts (51-51)
  • mergeAgentPrompts (52-52)
  • mergeBacklogPlanPrompts (53-53)
  • mergeEnhancementPrompts (54-54)
  • prependLanguageInstruction (57-57)
libs/prompts/src/merge.ts (5)
  • mergeAutoModePrompts (60-86)
  • mergeAgentPrompts (92-96)
  • mergeBacklogPlanPrompts (102-110)
  • mergeEnhancementPrompts (116-139)
  • prependLanguageInstruction (46-54)
🔇 Additional comments (8)
apps/server/src/lib/settings-helpers.ts (3)

8-20: LGTM! Imports correctly use shared packages.

The imports follow the coding guideline to import from @automaker/types and @automaker/prompts rather than relative paths.


250-313: Well-structured language instruction application to system prompts.

The implementation correctly applies prependLanguageInstruction to all system prompts while intentionally leaving userPromptTemplate (line 288) unwrapped. This is the right approach since language instructions should guide the AI's response style via the system prompt, not modify user-provided content.


316-337: LGTM! Clean helper with proper error handling.

The getLanguageInstruction helper follows the same defensive pattern as other helpers in this file, returning undefined on failure for graceful degradation. This allows callers to safely use the result without additional null checks.
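The defensive pattern described here, degrading to undefined on any failure, can be sketched as follows (the service shape is assumed, not taken from the real SettingsService):

```typescript
// Sketch of a defensive settings read: any failure degrades to undefined
// so callers can pass the result straight to prependLanguageInstruction.
interface LanguageInstruction { enabled: boolean; language: string; instruction: string; }

interface SettingsServiceLike {
  getGlobalSettings(): Promise<{ languageInstruction?: LanguageInstruction }>;
}

async function getLanguageInstruction(
  settingsService?: SettingsServiceLike
): Promise<LanguageInstruction | undefined> {
  if (!settingsService) return undefined; // optional dependency not wired up
  try {
    const settings = await settingsService.getGlobalSettings();
    return settings.languageInstruction;
  } catch {
    return undefined; // read failure: fall back to no language instruction
  }
}
```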

apps/server/src/routes/worktree/routes/generate-commit-message.ts (2)

17-17: LGTM! Import correctly uses shared package.

The import follows the coding guideline to use @automaker/prompts for the prependLanguageInstruction utility.


53-63: LGTM! Language instruction correctly applied to commit message generation.

The implementation properly extracts languageInstruction from settings and applies it to the system prompt. The prependLanguageInstruction function handles the undefined case gracefully (returns the original prompt unchanged).

apps/server/src/services/ideation-service.ts (3)

44-45: LGTM! Imports correctly use shared packages and local helpers.

The imports follow the coding guideline: prependLanguageInstruction from @automaker/prompts and getLanguageInstruction from the local settings-helpers module.


200-209: LGTM! Clean language instruction integration in sendMessage.

The implementation correctly separates concerns: first building the base system prompt with context and category, then applying the language instruction wrapper. The getLanguageInstruction helper handles the optional settingsService gracefully.


654-664: LGTM! Consistent language instruction pattern in generateSuggestions.

The implementation follows the same pattern as sendMessage, ensuring consistency across the ideation service. Both chat messages and structured suggestions will now respect the user's language preference.




@gemini-code-assist
Contributor

Summary of Changes

Hello @SeoFood, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a significant new feature that empowers users to control the language in which AI agents respond. By providing a global setting, users can now ensure AI outputs align with their preferred language, enhancing accessibility and user experience. The implementation includes a comprehensive UI for easy configuration and robust backend logic to seamlessly inject language instructions into AI system prompts, ensuring consistent linguistic behavior across various AI interactions.

Highlights

  • Global Language Setting for AI Responses: Implemented a new global setting that allows users to specify the language in which AI agents respond, enhancing user experience and localization.
  • Predefined Language Templates: Introduced 12 predefined language templates (e.g., English, German, Spanish, Japanese) with sensible default instructions, which can also be customized.
  • Dedicated UI for Language Configuration: Developed a new 'Response Language' section within the Settings view, providing a user-friendly interface for language selection, instruction customization, and enabling/disabling the feature.
  • Automatic Prompt Instruction Integration: Integrated a mechanism to automatically prepend the chosen language instruction to all relevant AI system prompts, ensuring consistent linguistic behavior across various AI interactions.
  • New LanguageInstruction Interface and Utilities: Added a new LanguageInstruction interface to define the structure of language settings and introduced utility functions for managing language templates and modifying prompts.


Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page, here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

  • Code Review: /gemini review performs a code review for the current pull request in its current state.
  • Pull Request Summary: /gemini summary provides a summary of the current pull request in its current state.
  • Comment: @gemini-code-assist responds in comments when explicitly tagged, both in pull request comments and review comments.
  • Help: /gemini help displays a list of available commands.

Customization

To customize Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counter productive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving your feedback about your experience with Gemini Code Assist for Github and other Google products, sign up here.


Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a well-implemented global language instruction setting for AI responses. The changes are comprehensive, covering type definitions, prompt merging logic, settings persistence, and a user-friendly UI component. The feature enhances the user experience by allowing customization of AI response languages while maintaining technical accuracy. The code is clean, well-structured, and follows good practices for modularity and maintainability.

@Shironex added labels on Jan 15, 2026: Testers-Requested (request for others to test an enhancement or bug fix), Do Not Merge (should not be merged yet), and type: feature (a new capability or functionality that does not exist yet).
Extends the global language instruction setting to additional endpoints
that were generating content in English regardless of user preference:

- generate-title.ts: Auto-generated feature titles
- generate-suggestions.ts: Feature suggestions
- describe-file.ts: Context file descriptions
- describe-image.ts: Context image descriptions
- generate-features-from-spec.ts: Features from app spec
- generate-spec.ts: App specification generation
- validate-issue.ts: Fix import path for prependLanguageInstruction

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

SeoFood commented Jan 15, 2026

Additional Changes

Extended the language instruction setting to all AI query endpoints that were previously generating content in English regardless of user preference:

Fixed Endpoints

  • generate-title.ts - Auto-generated feature titles now respect language setting
  • generate-suggestions.ts - Feature suggestions now generated in user's language
  • describe-file.ts - Context file descriptions in user's language
  • describe-image.ts - Context image descriptions in user's language
  • generate-features-from-spec.ts - Features from app spec in user's language
  • generate-spec.ts - App specification generation in user's language

Bug Fix

  • validate-issue.ts - Fixed import path for prependLanguageInstruction

This ensures that when a user sets their language to German (or any other language), all AI-generated content respects that preference.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `apps/server/src/routes/github/routes/validate-issue.ts` (around lines 46-50): the imports getLanguageInstruction and prependLanguageInstruction are unused. Either remove them, or apply language instruction support to the issue validation flow: fetch the instruction via getLanguageInstruction(settingsService), wrap the base ISSUE_VALIDATION_SYSTEM_PROMPT with prependLanguageInstruction(basePrompt, languageInstruction), and pass the resulting prompt into streamingQuery where finalPrompt/ISSUE_VALIDATION_SYSTEM_PROMPT is currently used.
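Applied, the suggested fix would look roughly like the sketch below. The real streamingQuery signature and the prompt constant aren't shown in this review, so both are stubbed and only the wiring pattern should be taken literally:

```typescript
// Hedged sketch of the suggested validate-issue.ts fix; everything except
// the wiring pattern is a stand-in for the real code.
const ISSUE_VALIDATION_SYSTEM_PROMPT = "Validate the GitHub issue."; // placeholder

interface LanguageInstruction { enabled: boolean; language: string; instruction: string; }

function prependLanguageInstruction(prompt: string, li?: LanguageInstruction): string {
  return li?.enabled ? `${li.instruction}\n\n${prompt}` : prompt;
}

// Stub standing in for the real streaming model call.
async function streamingQuery(opts: { systemPrompt: string }): Promise<string> {
  return opts.systemPrompt;
}

async function validateIssue(languageInstruction?: LanguageInstruction): Promise<string> {
  const finalPrompt = prependLanguageInstruction(
    ISSUE_VALIDATION_SYSTEM_PROMPT,
    languageInstruction
  );
  return streamingQuery({ systemPrompt: finalPrompt });
}
```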
🧹 Nitpick comments (1)
apps/server/src/routes/features/routes/generate-title.ts (1)

69-77: Use systemPrompt option in simpleQuery for clearer prompt separation.

The simpleQuery function supports a dedicated systemPrompt parameter (defined in SimpleQueryOptions interface). Separating system and user prompts improves clarity and aligns with how provider APIs handle them differently. Current approach concatenates everything into prompt, which works but mixes concerns.

♻️ Optional refactor to use systemPrompt option
       // Use simpleQuery - provider abstraction handles all the streaming/extraction
       const result = await simpleQuery({
-        prompt: `${effectiveSystemPrompt}\n\n${userPrompt}`,
+        prompt: userPrompt,
+        systemPrompt: effectiveSystemPrompt,
         model: CLAUDE_MODEL_MAP.haiku,
         cwd: process.cwd(),
         maxTurns: 1,
         allowedTools: [],
       });
📜 Review details

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3dc799d and e71989a.

📒 Files selected for processing (9)
  • apps/server/src/index.ts
  • apps/server/src/routes/app-spec/generate-features-from-spec.ts
  • apps/server/src/routes/app-spec/generate-spec.ts
  • apps/server/src/routes/context/routes/describe-file.ts
  • apps/server/src/routes/context/routes/describe-image.ts
  • apps/server/src/routes/features/index.ts
  • apps/server/src/routes/features/routes/generate-title.ts
  • apps/server/src/routes/github/routes/validate-issue.ts
  • apps/server/src/routes/suggestions/generate-suggestions.ts
🧰 Additional context used
📓 Path-based instructions (3)
**/*.{ts,tsx,js,jsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Always import from shared packages (@automaker/*), never from old relative paths

Files:

  • apps/server/src/routes/context/routes/describe-image.ts
  • apps/server/src/routes/app-spec/generate-spec.ts
  • apps/server/src/routes/suggestions/generate-suggestions.ts
  • apps/server/src/routes/github/routes/validate-issue.ts
  • apps/server/src/routes/features/index.ts
  • apps/server/src/routes/features/routes/generate-title.ts
  • apps/server/src/index.ts
  • apps/server/src/routes/app-spec/generate-features-from-spec.ts
  • apps/server/src/routes/context/routes/describe-file.ts
apps/server/src/**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use createEventEmitter() from lib/events.ts for all server operations to emit events that stream to frontend via WebSocket

Files:

  • apps/server/src/routes/context/routes/describe-image.ts
  • apps/server/src/routes/app-spec/generate-spec.ts
  • apps/server/src/routes/suggestions/generate-suggestions.ts
  • apps/server/src/routes/github/routes/validate-issue.ts
  • apps/server/src/routes/features/index.ts
  • apps/server/src/routes/features/routes/generate-title.ts
  • apps/server/src/index.ts
  • apps/server/src/routes/app-spec/generate-features-from-spec.ts
  • apps/server/src/routes/context/routes/describe-file.ts
**/*.{ts,tsx}

📄 CodeRabbit inference engine (CLAUDE.md)

Use resolveModelString() from @automaker/model-resolver to convert model aliases (haiku, sonnet, opus) to full model names

Files:

  • apps/server/src/routes/context/routes/describe-image.ts
  • apps/server/src/routes/app-spec/generate-spec.ts
  • apps/server/src/routes/suggestions/generate-suggestions.ts
  • apps/server/src/routes/github/routes/validate-issue.ts
  • apps/server/src/routes/features/index.ts
  • apps/server/src/routes/features/routes/generate-title.ts
  • apps/server/src/index.ts
  • apps/server/src/routes/app-spec/generate-features-from-spec.ts
  • apps/server/src/routes/context/routes/describe-file.ts
🧠 Learnings (1)
📚 Learning: 2025-12-28T05:07:48.147Z
Learnt from: CR
Repo: AutoMaker-Org/automaker PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-28T05:07:48.147Z
Learning: Store project-specific rules in `.automaker/context/` and load them into agent prompts via `loadContextFiles()` from @automaker/utils

Applied to files:

  • apps/server/src/routes/app-spec/generate-spec.ts
  • apps/server/src/routes/suggestions/generate-suggestions.ts
🧬 Code graph analysis (8)
apps/server/src/routes/context/routes/describe-image.ts (1)
apps/server/src/lib/settings-helpers.ts (1)
  • getLanguageInstruction (299-313)
apps/server/src/routes/app-spec/generate-spec.ts (1)
apps/server/src/lib/settings-helpers.ts (1)
  • getLanguageInstruction (299-313)
apps/server/src/routes/suggestions/generate-suggestions.ts (1)
apps/server/src/lib/settings-helpers.ts (1)
  • getLanguageInstruction (299-313)
apps/server/src/routes/features/index.ts (2)
apps/server/src/services/settings-service.ts (1)
  • SettingsService (106-782)
apps/server/src/routes/features/routes/generate-title.ts (1)
  • createGenerateTitleHandler (41-114)
apps/server/src/routes/features/routes/generate-title.ts (2)
apps/server/src/lib/settings-helpers.ts (1)
  • getLanguageInstruction (299-313)
apps/server/src/providers/simple-query-service.ts (1)
  • simpleQuery (105-167)
apps/server/src/index.ts (1)
apps/server/src/routes/features/index.ts (1)
  • createFeaturesRoutes (19-45)
apps/server/src/routes/app-spec/generate-features-from-spec.ts (1)
apps/server/src/lib/settings-helpers.ts (1)
  • getLanguageInstruction (299-313)
apps/server/src/routes/context/routes/describe-file.ts (1)
apps/server/src/lib/settings-helpers.ts (1)
  • getLanguageInstruction (299-313)
🔇 Additional comments (15)
apps/server/src/index.ts (1)

222-222: LGTM!

The addition of settingsService to createFeaturesRoutes follows the established pattern used by other routes (e.g., createEnhancePromptRoutes, createWorktreeRoutes, createSuggestionsRoutes) and enables language instruction support for the features API.

apps/server/src/routes/app-spec/generate-spec.ts (2)

24-25: LGTM!

Imports follow the coding guideline to use shared packages (@automaker/prompts).


70-85: LGTM!

Clean implementation of language instruction integration:

  1. Base prompt is clearly defined
  2. Language instruction is loaded asynchronously from settings
  3. Final prompt is composed using prependLanguageInstruction

This pattern is consistent with other routes in this PR.

apps/server/src/routes/context/routes/describe-image.ts (2)

22-26: LGTM!

Imports are properly structured and follow the coding guideline to use shared packages (@automaker/prompts).


286-294: LGTM!

Language instruction integration follows the established pattern. The instructionText is correctly used in both the Cursor text prompt path (line 304) and the Claude multi-part prompt path (line 309).

apps/server/src/routes/app-spec/generate-features-from-spec.ts (2)

17-18: LGTM!

Imports follow the coding guideline to use shared packages (@automaker/prompts).


57-92: LGTM!

Clean implementation that follows the same pattern as other routes:

  1. Define basePrompt with the feature generation instructions
  2. Load language instruction via getLanguageInstruction(settingsService)
  3. Create final prompt using prependLanguageInstruction

The implementation is consistent across the codebase.

apps/server/src/routes/suggestions/generate-suggestions.ts (2)

18-19: LGTM!

Imports correctly use shared packages (@automaker/prompts) and local settings helpers as required by coding guidelines.


205-212: LGTM!

The language instruction integration follows the established pattern:

  1. Load instruction asynchronously via getLanguageInstruction
  2. Prepend to base prompt using prependLanguageInstruction
  3. Use the enhanced basePrompt for both structured and non-structured output paths

This ensures consistent language handling regardless of model type.

apps/server/src/routes/features/index.ts (2)

7-7: LGTM!

Type-only import for SettingsService is appropriate and follows best practices.


19-22: LGTM!

The optional settingsService parameter maintains backward compatibility while enabling language instruction support. Only generate-title receives it, which is correct since other routes handle CRUD operations without AI text generation.

Also applies to: 42-42

apps/server/src/routes/context/routes/describe-file.ts (2)

22-26: LGTM!

Imports are properly organized:

  • Settings-related functions grouped together from settings-helpers.js
  • prependLanguageInstruction correctly imported from @automaker/prompts

139-150: LGTM!

Clean integration of language instruction:

  1. Renamed to basePrompt for clarity
  2. Language instruction loaded asynchronously
  3. Final prompt created via prependLanguageInstruction

The prompt is correctly passed to simpleQuery at line 174.

apps/server/src/routes/features/routes/generate-title.ts (2)

11-14: LGTM!

Imports correctly use shared packages as per coding guidelines.


41-43: LGTM!

Optional parameter maintains backward compatibility while enabling language instruction support.


SeoFood and others added 2 commits January 15, 2026 20:58
Use the language instruction setting for GitHub issue validation
so that validation responses respect the user's language preference.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add language instruction to IdeationService for chat and suggestions
- Add language instruction to commit message generation
- Apply language instruction to all Auto Mode prompts in settings-helpers

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

Labels

  • Do Not Merge (use this label if something should not be merged)
  • Testers-Requested (request for others to test an enhancement or bug fix/etc.)
  • type: feature (a new capability or functionality that does not exist yet)


2 participants