From 49dd3cfb2326222b7e6f1890d411692c7b22d014 Mon Sep 17 00:00:00 2001
From: Haile <118998054+blackmammoth@users.noreply.github.com>
Date: Tue, 21 Apr 2026 15:38:51 +0300
Subject: [PATCH] Refactor provider runtimes for sessions, auth, and MCP
 management (#666)

* feat: implement MCP provider registry and service

- Add provider registry to manage LLM providers (Claude, Codex, Cursor, Gemini).
- Create provider routes for MCP server operations (list, upsert, delete, run).
- Implement MCP service for handling server operations and validations.
- Introduce abstract provider class and MCP provider base for shared functionality.
- Add tests for MCP server operations across different providers and scopes.
- Define shared interfaces and types for MCP functionality.
- Implement utility functions for handling JSON config files and API responses.

* chore: remove dead code related to MCP server

* refactor: put /api/providers in index.js and remove /providers prefix from provider.routes.ts

* refactor(settings): move MCP server management into provider module

Extract MCP server settings out of the settings controller and agents tab
into a dedicated frontend MCP module. The settings UI now delegates MCP
rendering and behavior to a single module that only needs the selected
provider and current projects.
Changes:
- Add `src/components/mcp` as the single frontend MCP module
- Move MCP server list rendering into `McpServers`
- Move MCP add/edit modal into `McpServerFormModal`
- Move MCP API/state logic into `useMcpServers`
- Move MCP form state/validation logic into `useMcpServerForm`
- Add provider-specific MCP constants, types, and formatting helpers
- Use the unified `/api/providers/:provider/mcp/servers` API for all providers
- Support MCP management for Claude, Cursor, Codex, and Gemini
- Remove old settings-owned Claude/Codex MCP modal components
- Remove old provider-specific `McpServersContent` branching from settings
- Strip MCP server state, fetch, save, delete, and modal ownership from `useSettingsController`
- Simplify agents settings props so MCP only receives `selectedProvider` and `currentProjects`
- Keep Claude working-directory unsupported while preserving cwd support for Cursor, Codex, and Gemini
- Add progressive MCP loading:
  - render user/global scope first
  - load project/local scopes in the background
  - append project results as they resolve
- Cache MCP lists briefly to avoid slow tab-switch refetches
- Ignore stale async responses after provider switches

Verification:
- `npx eslint src/components/mcp`
- `npm run typecheck`
- `npm run build:client`

* fix(mcp): form with multiline text handling for args, env, headers, and envVars

* feat(mcp): add global MCP server creation flow

Add a separate global MCP add path in the settings MCP module so users can
create one shared MCP server configuration across Claude, Cursor, Codex,
and Gemini from the same screen.

The provider-specific add flow is still kept next to it because these two
actions have different intent. A global MCP server must be constrained to
the subset of configuration that every provider can accept, while a
provider-specific server can still use that provider's own supported
scopes, transports, and fields.
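The stale-response guard described above can be sketched roughly as follows. The names (`McpLoader`, `loadFor`) are illustrative, not the actual hook API: the idea is that each provider switch bumps a token, and any async result that comes back carrying an older token is discarded instead of rendered.

```typescript
// Illustrative sketch of the stale-response guard; McpLoader/loadFor are
// hypothetical names, not the real useMcpServers internals.
type McpServer = { name: string };

class McpLoader {
  private requestToken = 0;
  servers: McpServer[] = [];

  async loadFor(provider: string, fetchServers: (p: string) => Promise<McpServer[]>) {
    const token = ++this.requestToken; // invalidate any in-flight load
    const result = await fetchServers(provider);
    if (token !== this.requestToken) return; // a newer provider switch won; drop this result
    this.servers = result;
  }
}
```

The same token check also covers the progressive project-scope loads: background appends that resolve after a provider switch simply see a newer token and bail out.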
Naming the buttons as "Add Global MCP Server" and "Add MCP Server" makes
that distinction explicit without forcing users to infer it from the
selected tab. This also moves the explanatory copy to button hover text to
keep the MCP toolbar compact while still documenting the difference
between global and provider-only adds at the point of action.

Implementation details:
- Add global MCP form mode with shared user/project scopes and stdio/http transports.
- Submit global creates through `/api/providers/mcp/servers/global`.
- Reuse the existing MCP form modal with configurable scopes, transports, labels, and descriptions instead of duplicating form logic.
- Disable provider-only fields for the global flow because those fields cannot be safely written to every provider.
- Clear the MCP server cache globally after a global add because every provider tab may have changed.
- Surface partial global add failures with provider-specific error messages.

Validation:
- npx eslint src/components/mcp/view/McpServers.tsx
- npm run typecheck
- npm run build:client

* feat: implement platform-specific provider visibility for cursor agent

* refactor(providers): centralize message handling in provider module

Move provider-specific normalizeMessage and fetchHistory logic out of the
legacy server/providers adapters and into the refactored provider classes
so callers can depend on the main provider contract instead of parallel
adapter plumbing.

Add a providers service to resolve concrete providers through the registry
and delegate message normalization/history loading from realtime handlers
and the unified messages route.

Add shared TypeScript message/history types and normalized message helpers
so provider implementations and callers use the same contract.

Remove the old adapter registry/files now that Claude, Codex, Cursor, and
Gemini implement the required behavior directly.
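Surfacing partial global add failures might look like the following sketch. The per-provider result shape (`provider`/`ok`/`error`) is an assumption for illustration, not the actual contract of the global add endpoint.

```typescript
// Hypothetical per-provider result shape for a global MCP add; the real
// response contract of /api/providers/mcp/servers/global may differ.
type ProviderResult = { provider: string; ok: boolean; error?: string };

function summarizeGlobalAdd(results: ProviderResult[]): string[] {
  // Collect provider-specific error messages for any provider that failed,
  // so the UI can report "codex: ..." rather than one opaque failure.
  return results
    .filter((r) => !r.ok)
    .map((r) => `${r.provider}: ${r.error ?? "unknown error"}`);
}
```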
* refactor(providers): move auth status checks into provider runtimes

Move provider authentication status logic out of the CLI auth route so
auth checks live with the provider implementations that understand each
provider's install and credential model.

Add provider-specific auth runtime classes for Claude, Codex, Cursor, and
Gemini, and expose them through the shared provider contract as
`provider.auth`.

Add a provider auth service that resolves providers through the registry
and delegates status checks via `auth.getStatus()`.

Keep the existing `/api/cli//status` endpoints, but make them thin route
adapters over the new provider auth service.

This removes duplicated route-local credential parsing and makes auth
status a first-class provider capability beside MCP and message handling.

* refactor(providers): clarify provider auth and MCP naming

Rename provider auth/MCP contracts to remove the overloaded Runtime suffix
so the shared interfaces read as stable provider capabilities instead of
execution implementation details.

Add a consistent provider-first auth class naming convention by renaming
ClaudeAuthProvider, CodexAuthProvider, CursorAuthProvider, and
GeminiAuthProvider to ClaudeProviderAuth, CodexProviderAuth,
CursorProviderAuth, and GeminiProviderAuth.

This keeps the provider module API easier to scan and aligns auth naming
with the main provider ownership model.

* refactor(providers): move session message delegation into sessions service

Move provider-backed session history and message normalization calls out
of the generic providers service so the service name reflects the behavior
it owns.

Add a dedicated sessions service for listing session-capable providers,
normalizing live provider events, and fetching persisted session history
through the provider registry.

Update realtime handlers and the unified messages route to depend on
`sessionsService` instead of `providersService`.
This separates session message operations from other provider concerns
such as auth and MCP, keeping the provider services easier to navigate as
the module grows.

* refactor(providers): move auth status routes under provider API

Move provider authentication status endpoints out of the legacy `/api/cli`
route namespace so auth status is exposed through the same provider module
that owns provider auth and MCP behavior.

Add `GET /api/providers/:provider/auth/status` to the provider router and
route it through the provider auth service.

Remove the old `cli-auth` route file and `/api/cli` mount now that
provider auth status is handled by the unified provider API.

Update the frontend provider auth endpoint map to call the new
provider-scoped routes and rename the endpoint constant to reflect that it
is no longer CLI specific.

* chore(api): remove unused backend endpoints after MCP audit

Remove legacy backend routes that no longer have frontend or internal
callers, including the old Claude/Codex MCP APIs, unused Cursor and Codex
helper endpoints, stale TaskMaster detection/next/initialize routes, and
unused command/project helpers.

This reduces duplicated MCP behavior now handled by the provider-based MCP
API, shrinks the exposed backend surface, and removes probe/service code
that only existed for deleted endpoints.

Add an MCP settings API audit document to capture the route-usage analysis
and explain why the legacy MCP endpoints were considered safe to remove.
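The registry-plus-service shape that the auth commits describe can be sketched as follows. `providerAuthService`, the registry, and `auth.getStatus()` are named in the commit text, but the bodies and the status shape are illustrative, not the real implementation.

```typescript
// Illustrative sketch of resolving a provider through a registry and
// delegating the auth status check, per the route-layer refactor above.
// The status fields are an assumed shape for illustration.
interface ProviderAuth {
  getStatus(): Promise<{ installed: boolean; authenticated: boolean }>;
}
interface Provider {
  name: string;
  auth: ProviderAuth;
}

class ProviderRegistry {
  private providers = new Map<string, Provider>();
  register(p: Provider) {
    this.providers.set(p.name, p);
  }
  resolve(name: string): Provider {
    const p = this.providers.get(name);
    if (!p) throw new Error(`Unknown provider: ${name}`);
    return p;
  }
}

const providerAuthService = {
  registry: new ProviderRegistry(),
  getStatus(name: string) {
    // The route handler stays a thin adapter: resolve, then delegate.
    return this.registry.resolve(name).auth.getStatus();
  },
};
```

This keeps `GET /api/providers/:provider/auth/status` a one-liner per provider: the route only maps the URL parameter onto the service call.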
* refactor(providers): remove debug logging from Claude authentication status checks

* refactor(cursor): lazy-load better-sqlite3 and remove unused type definitions

* refactor(cursor): remove SSE from CursorMcpProvider constructor and error message

* refactor(auth): standardize API response structure and remove unused error handling

* refactor: make providers use dedicated session handling classes

* refactor: remove legacy provider selection UI and logic

* fix(server/providers): harden and correct session history normalization/pagination

Address correctness and safety issues in provider session adapters while
preserving existing normalized message shapes.

Claude sessions:
- Ensure user text content parts generate unique normalized message ids.
- Replace duplicate `${baseId}_text` ids with index-suffixed ids to avoid collisions when one user message contains multiple text segments.

Cursor sessions:
- Add session id sanitization before constructing SQLite paths to prevent path traversal via crafted session ids.
- Enforce containment by resolving the computed DB path and asserting it stays under ~/.cursor/chats/.
- Refactor blob parsing to a two-pass flow: first build blobMap and collect JSON blobs, then parse binary parent refs against the fully populated map.
- Fix pagination semantics so limit=0 returns an empty page instead of full history, with consistent total/hasMore/offset/limit metadata.

Gemini sessions:
- Honor FetchHistoryOptions pagination by reading limit/offset and slicing normalized history accordingly.
- Return consistent hasMore/offset/limit metadata for paged responses.

Validation:
- eslint passed for touched files.
- server TypeScript check passed (tsc --noEmit -p server/tsconfig.json).
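The containment check described for Cursor sessions can be sketched like this. The `store.db` file name is an assumption for illustration; only the resolve-then-assert containment pattern comes from the commit text.

```typescript
import path from "node:path";
import os from "node:os";

// Illustrative sketch of the path-traversal guard: resolve the computed
// SQLite path and reject anything that escapes ~/.cursor/chats/.
// The "store.db" file name is hypothetical.
function resolveSessionDbPath(sessionId: string): string {
  const root = path.join(os.homedir(), ".cursor", "chats");
  const resolved = path.resolve(root, sessionId, "store.db");
  if (!resolved.startsWith(root + path.sep)) {
    throw new Error("session id escapes the chats directory");
  }
  return resolved;
}
```

A crafted id like `../../etc` resolves above the chats root, fails the prefix check, and is rejected before any database handle is opened.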
---------

---
 eslint.config.js | 32 +-
 package-lock.json | 111 ++++
 package.json | 3 +
 server/claude-sdk.js | 10 +-
 server/cursor-cli.js | 14 +-
 server/gemini-cli.js | 10 +-
 server/gemini-response-handler.js | 4 +-
 server/index.js | 62 +-
 .../list/claude/claude-auth.provider.ts | 123 ++++
 .../list/claude/claude-mcp.provider.ts | 135 +++++
 .../list/claude/claude-sessions.provider.ts | 306 ++++++++++
 .../providers/list/claude/claude.provider.ts | 15 +
 .../list/codex/codex-auth.provider.ts | 100 ++++
 .../list/codex/codex-mcp.provider.ts | 135 +++++
 .../list/codex/codex-sessions.provider.ts | 319 ++++++++++
 .../providers/list/codex/codex.provider.ts | 15 +
 .../list/cursor/cursor-auth.provider.ts | 143 +++++
 .../list/cursor/cursor-mcp.provider.ts | 108 ++++
 .../list/cursor/cursor-sessions.provider.ts | 421 +++++++++++++
 .../providers/list/cursor/cursor.provider.ts | 15 +
 .../list/gemini/gemini-auth.provider.ts | 151 +++++
 .../list/gemini/gemini-mcp.provider.ts | 110 ++++
 .../list/gemini/gemini-sessions.provider.ts | 227 +++++++
 .../providers/list/gemini/gemini.provider.ts | 15 +
 server/modules/providers/provider.registry.ts | 36 ++
 server/modules/providers/provider.routes.ts | 217 +++++++
 .../modules/providers/services/mcp.service.ts | 94 +++
 .../services/provider-auth.service.ts | 26 +
 .../providers/services/sessions.service.ts | 45 ++
 .../shared/base/abstract.provider.ts | 20 +
 .../providers/shared/mcp/mcp.provider.ts | 151 +++++
 server/modules/providers/tests/mcp.test.ts | 293 ++++++++++
 server/openai-codex.js | 10 +-
 server/providers/claude/adapter.js | 278 ---------
 server/providers/claude/status.js | 136 -----
 server/providers/codex/adapter.js | 248 --------
 server/providers/codex/status.js | 78 ---
 server/providers/cursor/adapter.js | 348 -----------
 server/providers/cursor/status.js | 128 ----
 server/providers/gemini/adapter.js | 186 ------
 server/providers/gemini/status.js | 111 ----
 server/providers/registry.js | 67 ---
 server/providers/types.js | 132 -----
 server/providers/utils.js | 29 -
 server/routes/cli-auth.js | 27 -
 server/routes/codex.js | 314 +---------
 server/routes/commands.js | 49 --
 server/routes/cursor.js | 542 +----------------
 server/routes/mcp-utils.js | 21 +-
 server/routes/mcp.js | 552 ------------------
 server/routes/messages.js | 12 +-
 server/routes/settings.js | 10 +
 server/routes/taskmaster.js | 492 ----------------
 server/shared/interfaces.ts | 54 ++
 server/shared/types.ts | 172 ++++++
 server/shared/utils.ts | 193 ++++++
 server/utils/mcp-detector.js | 51 --
 src/components/chat/hooks/useChatMessages.ts | 2 +-
 .../ProviderSelectionEmptyState.tsx | 59 +-
 src/components/mcp/constants.ts | 58 ++
 src/components/mcp/hooks/useMcpServerForm.ts | 248 ++++++++
 src/components/mcp/hooks/useMcpServers.ts | 535 +++++++++++++++++
 src/components/mcp/index.ts | 1 +
 src/components/mcp/types.ts | 90 +++
 src/components/mcp/utils/mcpFormatting.ts | 184 ++++++
 src/components/mcp/view/McpServers.tsx | 281 +++++++++
 .../mcp/view/modals/McpServerFormModal.tsx | 434 ++++++++++++++
 .../hooks/useProviderAuthStatus.ts | 13 +-
 src/components/provider-auth/types.ts | 10 +-
 .../settings/constants/constants.ts | 47 +-
 .../settings/hooks/useSettingsController.ts | 479 +--------------
 src/components/settings/types/types.ts | 77 ---
 src/components/settings/view/Settings.tsx | 51 +-
 .../view/modals/ClaudeMcpFormModal.tsx | 478 ---------------
 .../view/modals/CodexMcpFormModal.tsx | 177 ------
 .../agents-settings/AgentsSettingsTab.tsx | 52 +-
 .../sections/AgentCategoryContentSection.tsx | 58 +-
 .../sections/AgentSelectorSection.tsx | 5 +-
 .../sections/content/McpServersContent.tsx | 391 -------------
 .../view/tabs/agents-settings/types.ts | 33 +-
 src/hooks/useServerPlatform.ts | 40 ++
 src/utils/api.js | 5 -
 82 files changed, 5834 insertions(+), 5680 deletions(-)
 create mode 100644 server/modules/providers/list/claude/claude-auth.provider.ts
 create mode 100644 server/modules/providers/list/claude/claude-mcp.provider.ts
 create mode 100644 server/modules/providers/list/claude/claude-sessions.provider.ts
 create mode 100644 server/modules/providers/list/claude/claude.provider.ts
 create mode 100644 server/modules/providers/list/codex/codex-auth.provider.ts
 create mode 100644 server/modules/providers/list/codex/codex-mcp.provider.ts
 create mode 100644 server/modules/providers/list/codex/codex-sessions.provider.ts
 create mode 100644 server/modules/providers/list/codex/codex.provider.ts
 create mode 100644 server/modules/providers/list/cursor/cursor-auth.provider.ts
 create mode 100644 server/modules/providers/list/cursor/cursor-mcp.provider.ts
 create mode 100644 server/modules/providers/list/cursor/cursor-sessions.provider.ts
 create mode 100644 server/modules/providers/list/cursor/cursor.provider.ts
 create mode 100644 server/modules/providers/list/gemini/gemini-auth.provider.ts
 create mode 100644 server/modules/providers/list/gemini/gemini-mcp.provider.ts
 create mode 100644 server/modules/providers/list/gemini/gemini-sessions.provider.ts
 create mode 100644 server/modules/providers/list/gemini/gemini.provider.ts
 create mode 100644 server/modules/providers/provider.registry.ts
 create mode 100644 server/modules/providers/provider.routes.ts
 create mode 100644 server/modules/providers/services/mcp.service.ts
 create mode 100644 server/modules/providers/services/provider-auth.service.ts
 create mode 100644 server/modules/providers/services/sessions.service.ts
 create mode 100644 server/modules/providers/shared/base/abstract.provider.ts
 create mode 100644 server/modules/providers/shared/mcp/mcp.provider.ts
 create mode 100644 server/modules/providers/tests/mcp.test.ts
 delete mode 100644 server/providers/claude/adapter.js
 delete mode 100644 server/providers/claude/status.js
 delete mode 100644 server/providers/codex/adapter.js
 delete mode 100644 server/providers/codex/status.js
 delete mode 100644 server/providers/cursor/adapter.js
 delete mode 100644 server/providers/cursor/status.js
 delete mode 100644 server/providers/gemini/adapter.js
 delete mode 100644 server/providers/gemini/status.js
 delete mode 100644 server/providers/registry.js
 delete mode 100644 server/providers/types.js
 delete mode 100644 server/providers/utils.js
 delete mode 100644 server/routes/cli-auth.js
 delete mode 100644 server/routes/mcp.js
 create mode 100644 server/shared/interfaces.ts
 create mode 100644 server/shared/types.ts
 create mode 100644 server/shared/utils.ts
 create mode 100644 src/components/mcp/constants.ts
 create mode 100644 src/components/mcp/hooks/useMcpServerForm.ts
 create mode 100644 src/components/mcp/hooks/useMcpServers.ts
 create mode 100644 src/components/mcp/index.ts
 create mode 100644 src/components/mcp/types.ts
 create mode 100644 src/components/mcp/utils/mcpFormatting.ts
 create mode 100644 src/components/mcp/view/McpServers.tsx
 create mode 100644 src/components/mcp/view/modals/McpServerFormModal.tsx
 delete mode 100644 src/components/settings/view/modals/ClaudeMcpFormModal.tsx
 delete mode 100644 src/components/settings/view/modals/CodexMcpFormModal.tsx
 delete mode 100644 src/components/settings/view/tabs/agents-settings/sections/content/McpServersContent.tsx
 create mode 100644 src/hooks/useServerPlatform.ts

diff --git a/eslint.config.js b/eslint.config.js
index 3ef25d92..742f0c2b 100644
--- a/eslint.config.js
+++ b/eslint.config.js
@@ -148,9 +148,27 @@ export default tseslint.config(
       ],
       "boundaries/elements": [
         {
-          type: "backend-shared-types", // shared backend type contract that modules may consume without creating runtime coupling
-          pattern: ["server/shared/types.{js,ts}"], // support the current shared types path
-          mode: "file", // treat the types file itself as the boundary element instead of the whole folder
+          type: "backend-shared-type-contract", // shared backend type/interface contracts that modules may consume without creating runtime coupling
+          pattern: [
+            "server/shared/types.{js,ts}",
+            "server/shared/interfaces.{js,ts}",
+          ], // keep backend modules on explicit shared contract files for erased imports only
+          mode: "file", // treat each shared contract file itself as the boundary element instead of the whole folder
+        },
+        {
+          type: "backend-shared-utils", // shared backend runtime helpers that modules may import directly
+          pattern: ["server/shared/utils.{js,ts}"], // classify the shared utils file so modules can depend on it explicitly
+          mode: "file",
+        },
+        {
+          type: "backend-legacy-runtime", // legacy runtime persistence modules used while providers migrate into server/modules
+          pattern: [
+            "server/projects.js",
+            "server/sessionManager.js",
+            "server/database/*.{js,ts}",
+            "server/utils/runtime-paths.js",
+          ], // provider history loading still resolves session data through these legacy runtime/database files
+          mode: "file",
         },
         {
           type: "backend-module", // logical element name used by boundaries rules below
@@ -196,13 +214,13 @@ export default tseslint.config(
           checkInternals: false, // do not apply these cross-module rules to imports inside the same module
           rules: [
             {
-              from: { type: "backend-module" }, // modules may depend on the shared types contract only as erased type-only imports
-              to: { type: "backend-shared-types" },
+              from: { type: "backend-module" }, // modules may depend on shared type/interface contracts only as erased type-only imports
+              to: { type: "backend-shared-type-contract" },
               disallow: {
                 dependency: { kind: ["value", "typeof"] },
-              }, // block runtime imports so shared types stay a compile-time contract instead of a hidden shared module
+              }, // block runtime imports so shared contracts stay compile-time only instead of becoming hidden shared modules
               message:
-                "Backend modules may only use `import type` when importing from server/shared/types.ts (or server/types.ts).",
+                "Backend modules may only use `import type` when importing from server/shared/types.ts or server/shared/interfaces.ts.",
             },
             {
               to: { type: "backend-module" }, // when importing anything that belongs to another backend module
diff --git a/package-lock.json b/package-lock.json
index 7ea371ab..b8f5cfb2 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -74,6 +74,9 @@
         "@commitlint/config-conventional": "^20.5.0",
         "@eslint/js": "^9.39.3",
         "@release-it/conventional-changelog": "^10.0.5",
+        "@types/better-sqlite3": "^7.6.13",
+        "@types/cross-spawn": "^6.0.6",
+        "@types/express": "^5.0.6",
         "@types/node": "^22.19.7",
         "@types/react": "^18.2.43",
         "@types/react-dom": "^18.2.17",
@@ -3705,6 +3708,47 @@
         "@babel/types": "^7.20.7"
       }
     },
+    "node_modules/@types/better-sqlite3": {
+      "version": "7.6.13",
+      "resolved": "https://registry.npmjs.org/@types/better-sqlite3/-/better-sqlite3-7.6.13.tgz",
+      "integrity": "sha512-NMv9ASNARoKksWtsq/SHakpYAYnhBrQgGD8zkLYk/jaK8jUGn08CfEdTRgYhMypUQAfzSP8W6gNLe0q19/t4VA==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/node": "*"
+      }
+    },
+    "node_modules/@types/body-parser": {
+      "version": "1.19.6",
+      "resolved": "https://registry.npmjs.org/@types/body-parser/-/body-parser-1.19.6.tgz",
+      "integrity": "sha512-HLFeCYgz89uk22N5Qg3dvGvsv46B8GLvKKo1zKG4NybA8U2DiEO3w9lqGg29t/tfLRJpJ6iQxnVw4OnB7MoM9g==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/connect": "*",
+        "@types/node": "*"
+      }
+    },
+    "node_modules/@types/connect": {
+      "version": "3.4.38",
+      "resolved": "https://registry.npmjs.org/@types/connect/-/connect-3.4.38.tgz",
+      "integrity": "sha512-K6uROf1LD88uDQqJCktA4yzL1YYAK6NgfsI0v/mTgyPKWsX1CnJ0XPSDhViejru1GcRkLWb8RlzFYJRqGUbaug==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/node": "*"
+      }
+    },
+    "node_modules/@types/cross-spawn": {
+      "version": "6.0.6",
+      "resolved": "https://registry.npmjs.org/@types/cross-spawn/-/cross-spawn-6.0.6.tgz",
+      "integrity": "sha512-fXRhhUkG4H3TQk5dBhQ7m/JDdSNHKwR2BBia62lhwEIq9xGiQKLxd6LymNhn47SjXhsUEPmxi+PKw2OkW4LLjA==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/node": "*"
+      }
+    },
     "node_modules/@types/debug": {
       "version": "4.1.12",
       "resolved": "https://registry.npmjs.org/@types/debug/-/debug-4.1.12.tgz",
@@ -3729,6 +3773,31 @@
         "@types/estree": "*"
       }
     },
+    "node_modules/@types/express": {
+      "version": "5.0.6",
+      "resolved": "https://registry.npmjs.org/@types/express/-/express-5.0.6.tgz",
+      "integrity": "sha512-sKYVuV7Sv9fbPIt/442koC7+IIwK5olP1KWeD88e/idgoJqDm3JV/YUiPwkoKK92ylff2MGxSz1CSjsXelx0YA==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/body-parser": "*",
+        "@types/express-serve-static-core": "^5.0.0",
+        "@types/serve-static": "^2"
+      }
+    },
+    "node_modules/@types/express-serve-static-core": {
+      "version": "5.1.1",
+      "resolved": "https://registry.npmjs.org/@types/express-serve-static-core/-/express-serve-static-core-5.1.1.tgz",
+      "integrity": "sha512-v4zIMr/cX7/d2BpAEX3KNKL/JrT1s43s96lLvvdTmza1oEvDudCqK9aF/djc/SWgy8Yh0h30TZx5VpzqFCxk5A==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/node": "*",
+        "@types/qs": "*",
+        "@types/range-parser": "*",
+        "@types/send": "*"
+      }
+    },
     "node_modules/@types/hast": {
       "version": "3.0.4",
       "resolved": "https://registry.npmjs.org/@types/hast/-/hast-3.0.4.tgz",
@@ -3738,6 +3807,13 @@
         "@types/unist": "*"
       }
     },
+    "node_modules/@types/http-errors": {
+      "version": "2.0.5",
+      "resolved": "https://registry.npmjs.org/@types/http-errors/-/http-errors-2.0.5.tgz",
+      "integrity": "sha512-r8Tayk8HJnX0FztbZN7oVqGccWgw98T/0neJphO91KkmOzug1KkofZURD4UaD5uH8AqcFLfdPErnBod0u71/qg==",
+      "dev": true,
+      "license": "MIT"
+    },
     "node_modules/@types/json-schema": {
       "version": "7.0.15",
       "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz",
@@ -3796,6 +3872,20 @@
       "integrity": "sha512-F6bEyamV9jKGAFBEmlQnesRPGOQqS2+Uwi0Em15xenOxHaf2hv6L8YCVn3rPdPJOiJfPiCnLIRyvwVaqMY3MIw==",
       "license": "MIT"
     },
+    "node_modules/@types/qs": {
+      "version": "6.15.0",
+      "resolved": "https://registry.npmjs.org/@types/qs/-/qs-6.15.0.tgz",
+      "integrity": "sha512-JawvT8iBVWpzTrz3EGw9BTQFg3BQNmwERdKE22vlTxawwtbyUSlMppvZYKLZzB5zgACXdXxbD3m1bXaMqP/9ow==",
+      "dev": true,
+      "license": "MIT"
+    },
+    "node_modules/@types/range-parser": {
+      "version": "1.2.7",
+      "resolved": "https://registry.npmjs.org/@types/range-parser/-/range-parser-1.2.7.tgz",
+      "integrity": "sha512-hKormJbkJqzQGhziax5PItDUTMAM9uE2XXQmM37dyd4hVM+5aVl7oVxMVUiVQn2oCQFN/LKCZdvSM0pFRqbSmQ==",
+      "dev": true,
+      "license": "MIT"
+    },
     "node_modules/@types/react": {
       "version": "18.3.23",
       "resolved": "https://registry.npmjs.org/@types/react/-/react-18.3.23.tgz",
@@ -3816,6 +3906,27 @@
         "@types/react": "^18.0.0"
       }
     },
+    "node_modules/@types/send": {
+      "version": "1.2.1",
+      "resolved": "https://registry.npmjs.org/@types/send/-/send-1.2.1.tgz",
+      "integrity": "sha512-arsCikDvlU99zl1g69TcAB3mzZPpxgw0UQnaHeC1Nwb015xp8bknZv5rIfri9xTOcMuaVgvabfIRA7PSZVuZIQ==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/node": "*"
+      }
+    },
+    "node_modules/@types/serve-static": {
+      "version": "2.2.0",
+      "resolved": "https://registry.npmjs.org/@types/serve-static/-/serve-static-2.2.0.tgz",
+      "integrity": "sha512-8mam4H1NHLtu7nmtalF7eyBH14QyOASmcxHhSfEoRyr0nP/YdoesEtU+uSRvMe96TW/HPTtkoKqQLl53N7UXMQ==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/http-errors": "*",
+        "@types/node": "*"
+      }
+    },
     "node_modules/@types/unist": {
       "version": "3.0.3",
       "resolved": "https://registry.npmjs.org/@types/unist/-/unist-3.0.3.tgz",
diff --git a/package.json b/package.json
index 5be1fd99..e2f0c960 100644
--- a/package.json
+++ b/package.json
@@ -126,6 +126,9 @@
     "@commitlint/config-conventional": "^20.5.0",
     "@eslint/js": "^9.39.3",
     "@release-it/conventional-changelog": "^10.0.5",
+    "@types/better-sqlite3": "^7.6.13",
+    "@types/cross-spawn": "^6.0.6",
+    "@types/express": "^5.0.6",
    "@types/node": "^22.19.7",
    "@types/react": "^18.2.43",
    "@types/react-dom": "^18.2.17",
diff --git a/server/claude-sdk.js b/server/claude-sdk.js
index 1489962b..aa8b44aa 100644
--- a/server/claude-sdk.js
+++ b/server/claude-sdk.js
@@ -24,9 +24,9 @@ import {
   notifyRunStopped,
   notifyUserIfEnabled
 } from './services/notification-orchestrator.js';
-import { claudeAdapter } from './providers/claude/adapter.js';
-import { createNormalizedMessage } from './providers/types.js';
-import { getStatusChecker } from './providers/registry.js';
+import { sessionsService } from './modules/providers/services/sessions.service.js';
+import { providerAuthService } from './modules/providers/services/provider-auth.service.js';
+import { createNormalizedMessage } from './shared/utils.js';

 const activeSessions = new Map();
 const pendingToolApprovals = new Map();
@@ -654,7 +654,7 @@ async function queryClaudeSDK(command, options = {}, ws) {

       const sid = capturedSessionId || sessionId || null;
       // Use adapter to normalize SDK events into NormalizedMessage[]
-      const normalized = claudeAdapter.normalizeMessage(transformedMessage, sid);
+      const normalized = sessionsService.normalizeMessage('claude', transformedMessage, sid);
       for (const msg of normalized) {
         // Preserve parentToolUseId from SDK wrapper for subagent tool grouping
         if (transformedMessage.parentToolUseId && !msg.parentToolUseId) {
@@ -707,7 +707,7 @@ async function queryClaudeSDK(command, options = {}, ws) {
     await cleanupTempFiles(tempImagePaths, tempDir);

     // Check if Claude CLI is installed for a clearer error message
-    const installed = getStatusChecker('claude')?.checkInstalled() ?? true;
+    const installed = await providerAuthService.isProviderInstalled('claude');
     const errorContent = !installed
       ? 'Claude Code is not installed. Please install it first: https://docs.anthropic.com/en/docs/claude-code'
       : error.message;
diff --git a/server/cursor-cli.js b/server/cursor-cli.js
index f6193369..66af16ef 100644
--- a/server/cursor-cli.js
+++ b/server/cursor-cli.js
@@ -1,9 +1,9 @@
 import { spawn } from 'child_process';
 import crossSpawn from 'cross-spawn';
 import { notifyRunFailed, notifyRunStopped } from './services/notification-orchestrator.js';
-import { cursorAdapter } from './providers/cursor/adapter.js';
-import { createNormalizedMessage } from './providers/types.js';
-import { getStatusChecker } from './providers/registry.js';
+import { sessionsService } from './modules/providers/services/sessions.service.js';
+import { providerAuthService } from './modules/providers/services/provider-auth.service.js';
+import { createNormalizedMessage } from './shared/utils.js';

 // Use cross-spawn on Windows for better command execution
 const spawnFunction = process.platform === 'win32' ? crossSpawn : spawn;
@@ -190,7 +190,7 @@ async function spawnCursor(command, options = {}, ws) {
           case 'assistant':
             // Accumulate assistant message chunks
             if (response.message && response.message.content && response.message.content.length > 0) {
-              const normalized = cursorAdapter.normalizeMessage(response, capturedSessionId || sessionId || null);
+              const normalized = sessionsService.normalizeMessage('cursor', response, capturedSessionId || sessionId || null);
               for (const msg of normalized) ws.send(msg);
             }
             break;
@@ -220,7 +220,7 @@ async function spawnCursor(command, options = {}, ws) {
       }

       // If not JSON, send as stream delta via adapter
-      const normalized = cursorAdapter.normalizeMessage(line, capturedSessionId || sessionId || null);
+      const normalized = sessionsService.normalizeMessage('cursor', line, capturedSessionId || sessionId || null);
       for (const msg of normalized) ws.send(msg);
     }
   };
@@ -288,7 +288,7 @@ async function spawnCursor(command, options = {}, ws) {
     });

     // Handle process errors
-    cursorProcess.on('error', (error) => {
+    cursorProcess.on('error', async (error) => {
       console.error('Cursor CLI process error:', error);

       // Clean up process reference on error
@@ -296,7 +296,7 @@ async function spawnCursor(command, options = {}, ws) {
       activeCursorProcesses.delete(finalSessionId);

       // Check if Cursor CLI is installed for a clearer error message
-      const installed = getStatusChecker('cursor')?.checkInstalled() ?? true;
+      const installed = await providerAuthService.isProviderInstalled('cursor');
       const errorContent = !installed
         ? 'Cursor CLI is not installed. Please install it from https://cursor.com'
         : error.message;
diff --git a/server/gemini-cli.js b/server/gemini-cli.js
index 62aa5307..2e68a938 100644
--- a/server/gemini-cli.js
+++ b/server/gemini-cli.js
@@ -9,8 +9,8 @@ import os from 'os';
 import sessionManager from './sessionManager.js';
 import GeminiResponseHandler from './gemini-response-handler.js';
 import { notifyRunFailed, notifyRunStopped } from './services/notification-orchestrator.js';
-import { createNormalizedMessage } from './providers/types.js';
-import { getStatusChecker } from './providers/registry.js';
+import { providerAuthService } from './modules/providers/services/provider-auth.service.js';
+import { createNormalizedMessage } from './shared/utils.js';

 let activeGeminiProcesses = new Map(); // Track active processes by session ID
@@ -383,7 +383,7 @@ async function spawnGemini(command, options = {}, ws) {
       } else {
         // code 127 = shell "command not found" — check installation
         if (code === 127) {
-          const installed = getStatusChecker('gemini')?.checkInstalled() ?? true;
+          const installed = await providerAuthService.isProviderInstalled('gemini');
           if (!installed) {
             const socketSessionId = typeof ws.getSessionId === 'function' ? ws.getSessionId() : finalSessionId;
             ws.send(createNormalizedMessage({ kind: 'error', content: 'Gemini CLI is not installed. Please install it first: https://github.com/google-gemini/gemini-cli', sessionId: socketSessionId, provider: 'gemini' }));
@@ -399,13 +399,13 @@ async function spawnGemini(command, options = {}, ws) {
     });

     // Handle process errors
-    geminiProcess.on('error', (error) => {
+    geminiProcess.on('error', async (error) => {
       // Clean up process reference on error
       const finalSessionId = capturedSessionId || sessionId || processKey;
       activeGeminiProcesses.delete(finalSessionId);

       // Check if Gemini CLI is installed for a clearer error message
-      const installed = getStatusChecker('gemini')?.checkInstalled() ?? true;
+      const installed = await providerAuthService.isProviderInstalled('gemini');
       const errorContent = !installed
         ? 'Gemini CLI is not installed. Please install it first: https://github.com/google-gemini/gemini-cli'
         : error.message;
diff --git a/server/gemini-response-handler.js b/server/gemini-response-handler.js
index 9da1f5cc..b0a17485 100644
--- a/server/gemini-response-handler.js
+++ b/server/gemini-response-handler.js
@@ -1,5 +1,5 @@
 // Gemini Response Handler - JSON Stream processing
-import { geminiAdapter } from './providers/gemini/adapter.js';
+import { sessionsService } from './modules/providers/services/sessions.service.js';

 class GeminiResponseHandler {
   constructor(ws, options = {}) {
@@ -56,7 +56,7 @@ class GeminiResponseHandler {
     }

     // Normalize via adapter and send all resulting messages
-    const normalized = geminiAdapter.normalizeMessage(event, sid);
+    const normalized = sessionsService.normalizeMessage('gemini', event, sid);
     for (const msg of normalized) {
       this.ws.send(msg);
     }
diff --git a/server/index.js b/server/index.js
index 86ded977..62d85130 100755
--- a/server/index.js
+++ b/server/index.js
@@ -5,6 +5,9 @@ import fs from 'fs';
 import path from 'path';
 import { findAppRoot, getModuleDir } from './utils/runtime-paths.js';
+import { AppError, createNormalizedMessage } from '@/shared/utils.js';
+
+
 const __dirname = getModuleDir(import.meta.url);

 // The server source runs from /server, while the compiled output runs from /dist-server/server.
 // Resolving the app root once keeps every repo-level lookup below aligned across both layouts.
@@ -23,10 +26,9 @@ import cors from 'cors';
 import { promises as fsPromises } from 'fs';
 import { spawn } from 'child_process';
 import pty from 'node-pty';
-import fetch from 'node-fetch';
 import mime from 'mime-types';

-import { getProjects, getSessions, renameProject, deleteSession, deleteProject, addProjectManually, extractProjectDirectory, clearProjectDirectoryCache, searchConversations } from './projects.js';
+import { getProjects, getSessions, renameProject, deleteSession, deleteProject, extractProjectDirectory, clearProjectDirectoryCache, searchConversations } from './projects.js';
 import { queryClaudeSDK, abortClaudeSDKSession, isClaudeSDKSessionActive, getActiveClaudeSDKSessions, resolveToolApproval, getPendingApprovalsForSession, reconnectSessionWriter } from './claude-sdk.js';
 import { spawnCursor, abortCursorSession, isCursorSessionActive, getActiveCursorSessions } from './cursor-cli.js';
 import { queryCodex, abortCodexSession, isCodexSessionActive, getActiveCodexSessions } from './openai-codex.js';
@@ -34,7 +36,6 @@ import { spawnGemini, abortGeminiSession, isGeminiSessionActive, getActiveGemini
 import sessionManager from './sessionManager.js';
 import gitRoutes from './routes/git.js';
 import authRoutes from './routes/auth.js';
-import mcpRoutes from './routes/mcp.js';
 import cursorRoutes from './routes/cursor.js';
 import taskmasterRoutes from './routes/taskmaster.js';
 import mcpUtilsRoutes from './routes/mcp-utils.js';
@@ -42,13 +43,12 @@ import commandsRoutes from './routes/commands.js';
 import settingsRoutes from './routes/settings.js';
 import agentRoutes from './routes/agent.js';
 import projectsRoutes, { WORKSPACES_ROOT, validateWorkspacePath } from './routes/projects.js';
-import cliAuthRoutes from './routes/cli-auth.js';
 import userRoutes from './routes/user.js';
 import codexRoutes from './routes/codex.js';
 import geminiRoutes from './routes/gemini.js';
 import pluginsRoutes from './routes/plugins.js';
 import messagesRoutes from './routes/messages.js';
-import { createNormalizedMessage } from './providers/types.js';
+import providerRoutes from './modules/providers/provider.routes.js';
 import { startEnabledPluginServers, stopAllPlugins, getPluginPort } from './utils/plugin-process-manager.js';
 import { initializeDatabase, sessionNamesDb, applyCustomSessionNames } from './database/db.js';
 import { configureWebPush } from './services/vapid-keys.js';
@@ -286,9 +286,6 @@ app.use('/api/projects', authenticateToken, projectsRoutes);
 // Git API Routes (protected)
 app.use('/api/git', authenticateToken, gitRoutes);

-// MCP API Routes (protected)
-app.use('/api/mcp', authenticateToken, mcpRoutes);
-
 // Cursor API Routes (protected)
 app.use('/api/cursor', authenticateToken, cursorRoutes);

@@ -304,9 +301,6 @@ app.use('/api/commands', authenticateToken, commandsRoutes);
 // Settings API Routes (protected)
 app.use('/api/settings', authenticateToken, settingsRoutes);

-// CLI Authentication API Routes (protected)
-app.use('/api/cli', authenticateToken, cliAuthRoutes);
-
 // User API Routes (protected)
 app.use('/api/user', authenticateToken, userRoutes);

@@ -322,6 +316,9 @@ app.use('/api/plugins', authenticateToken, pluginsRoutes);
 // Unified session messages route (protected)
 app.use('/api/sessions', authenticateToken, messagesRoutes);

+// Unified provider MCP routes (protected)
+app.use('/api/providers', authenticateToken, providerRoutes);
+
 // Agent API Routes (uses API key authentication)
 app.use('/api/agent', agentRoutes);

@@ -509,23 +506,6 @@ app.delete('/api/projects/:projectName', authenticateToken, async (req, res) =>
   }
 });

-// Create project endpoint
-app.post('/api/projects/create', authenticateToken, async (req, res) => {
-  try {
-    const { path: projectPath } = req.body;
-
-    if (!projectPath ||
!projectPath.trim()) { - return res.status(400).json({ error: 'Project path is required' }); - } - - const project = await addProjectManually(projectPath.trim()); - res.json({ success: true, project }); - } catch (error) { - console.error('Error creating project:', error); - res.status(500).json({ error: error.message }); - } -}); - // Search conversations content (SSE streaming) app.get('/api/search/conversations', authenticateToken, async (req, res) => { const query = typeof req.query.q === 'string' ? req.query.q.trim() : ''; @@ -1378,7 +1358,7 @@ wss.on('connection', (ws, request) => { /** * WebSocket Writer - Wrapper for WebSocket to match SSEStreamWriter interface * - * Provider files use `createNormalizedMessage()` from `providers/types.js` and + * Provider files use `createNormalizedMessage()` from `shared/utils.js` and * adapter `normalizeMessage()` to produce unified NormalizedMessage events. * The writer simply serialises and sends. */ @@ -2213,6 +2193,30 @@ app.get('*', (req, res) => { } }); +// global error middleware must be last +app.use((err, req, res, next) => { + if (err instanceof AppError) { + return res.status(err.statusCode).json({ + success: false, + error: { + code: err.code, + message: err.message, + details: err.details, + }, + }); + } + + console.error(err); + + return res.status(500).json({ + success: false, + error: { + code: 'INTERNAL_ERROR', + message: 'Internal server error', + }, + }); +}); + // Helper function to convert permissions to rwx format function permToRwx(perm) { const r = perm & 4 ? 
'r' : '-'; diff --git a/server/modules/providers/list/claude/claude-auth.provider.ts b/server/modules/providers/list/claude/claude-auth.provider.ts new file mode 100644 index 00000000..1194ae1d --- /dev/null +++ b/server/modules/providers/list/claude/claude-auth.provider.ts @@ -0,0 +1,123 @@ +import { readFile } from 'node:fs/promises'; +import os from 'node:os'; +import path from 'node:path'; + +import spawn from 'cross-spawn'; + +import type { IProviderAuth } from '@/shared/interfaces.js'; +import type { ProviderAuthStatus } from '@/shared/types.js'; +import { readObjectRecord, readOptionalString } from '@/shared/utils.js'; + +type ClaudeCredentialsStatus = { + authenticated: boolean; + email: string | null; + method: string | null; + error?: string; +}; + +export class ClaudeProviderAuth implements IProviderAuth { + /** + * Checks whether the Claude Code CLI is available on this host. + */ + private checkInstalled(): boolean { + const cliPath = process.env.CLAUDE_CLI_PATH || 'claude'; + try { + spawn.sync(cliPath, ['--version'], { stdio: 'ignore', timeout: 5000 }); + return true; + } catch { + return false; + } + } + + /** + * Returns Claude installation and credential status using Claude Code's auth priority. + */ + async getStatus(): Promise { + const installed = this.checkInstalled(); + + if (!installed) { + return { + installed, + provider: 'claude', + authenticated: false, + email: null, + method: null, + error: 'Claude Code CLI is not installed', + }; + } + + const credentials = await this.checkCredentials(); + + return { + installed, + provider: 'claude', + authenticated: credentials.authenticated, + email: credentials.authenticated ? credentials.email || 'Authenticated' : credentials.email, + method: credentials.method, + error: credentials.authenticated ? undefined : credentials.error || 'Not authenticated', + }; + } + + /** + * Reads Claude settings env values that the CLI can use even when the server process env is empty. 
+ */ + private async loadSettingsEnv(): Promise> { + try { + const settingsPath = path.join(os.homedir(), '.claude', 'settings.json'); + const content = await readFile(settingsPath, 'utf8'); + const settings = readObjectRecord(JSON.parse(content)); + return readObjectRecord(settings?.env) ?? {}; + } catch { + return {}; + } + } + + /** + * Checks Claude credentials in the same priority order used by Claude Code. + */ + private async checkCredentials(): Promise { + if (process.env.ANTHROPIC_API_KEY?.trim()) { + return { authenticated: true, email: 'API Key Auth', method: 'api_key' }; + } + + const settingsEnv = await this.loadSettingsEnv(); + if (readOptionalString(settingsEnv.ANTHROPIC_API_KEY)) { + return { authenticated: true, email: 'API Key Auth', method: 'api_key' }; + } + + if (readOptionalString(settingsEnv.ANTHROPIC_AUTH_TOKEN)) { + return { authenticated: true, email: 'Configured via settings.json', method: 'api_key' }; + } + + try { + const credPath = path.join(os.homedir(), '.claude', '.credentials.json'); + const content = await readFile(credPath, 'utf8'); + const creds = readObjectRecord(JSON.parse(content)) ?? {}; + const oauth = readObjectRecord(creds.claudeAiOauth); + const accessToken = readOptionalString(oauth?.accessToken); + + if (accessToken) { + const expiresAt = typeof oauth?.expiresAt === 'number' ? oauth.expiresAt : undefined; + const email = readOptionalString(creds.email) ?? readOptionalString(creds.user) ?? null; + if (!expiresAt || Date.now() < expiresAt) { + return { + authenticated: true, + email, + method: 'credentials_file', + }; + } + + return { + authenticated: false, + email, + method: 'credentials_file', + error: 'OAuth token has expired. 
Please re-authenticate with claude login', + }; + } + + return { authenticated: false, email: null, method: null }; + } catch { + return { authenticated: false, email: null, method: null }; + } + } +} diff --git a/server/modules/providers/list/claude/claude-mcp.provider.ts b/server/modules/providers/list/claude/claude-mcp.provider.ts new file mode 100644 index 00000000..fb4b4ac5 --- /dev/null +++ b/server/modules/providers/list/claude/claude-mcp.provider.ts @@ -0,0 +1,135 @@ +import os from 'node:os'; +import path from 'node:path'; + +import { McpProvider } from '@/modules/providers/shared/mcp/mcp.provider.js'; +import type { McpScope, ProviderMcpServer, UpsertProviderMcpServerInput } from '@/shared/types.js'; +import { + AppError, + readJsonConfig, + readObjectRecord, + readOptionalString, + readStringArray, + readStringRecord, + writeJsonConfig, +} from '@/shared/utils.js'; + +export class ClaudeMcpProvider extends McpProvider { + constructor() { + super('claude', ['user', 'local', 'project'], ['stdio', 'http', 'sse']); + } + + protected async readScopedServers(scope: McpScope, workspacePath: string): Promise> { + if (scope === 'project') { + const filePath = path.join(workspacePath, '.mcp.json'); + const config = await readJsonConfig(filePath); + return readObjectRecord(config.mcpServers) ?? {}; + } + + const filePath = path.join(os.homedir(), '.claude.json'); + const config = await readJsonConfig(filePath); + if (scope === 'user') { + return readObjectRecord(config.mcpServers) ?? {}; + } + + const projects = readObjectRecord(config.projects) ?? {}; + const projectConfig = readObjectRecord(projects[workspacePath]) ?? {}; + return readObjectRecord(projectConfig.mcpServers) ?? 
{}; + } + + protected async writeScopedServers( + scope: McpScope, + workspacePath: string, + servers: Record, + ): Promise { + if (scope === 'project') { + const filePath = path.join(workspacePath, '.mcp.json'); + const config = await readJsonConfig(filePath); + config.mcpServers = servers; + await writeJsonConfig(filePath, config); + return; + } + + const filePath = path.join(os.homedir(), '.claude.json'); + const config = await readJsonConfig(filePath); + if (scope === 'user') { + config.mcpServers = servers; + await writeJsonConfig(filePath, config); + return; + } + + const projects = readObjectRecord(config.projects) ?? {}; + const projectConfig = readObjectRecord(projects[workspacePath]) ?? {}; + projectConfig.mcpServers = servers; + projects[workspacePath] = projectConfig; + config.projects = projects; + await writeJsonConfig(filePath, config); + } + + protected buildServerConfig(input: UpsertProviderMcpServerInput): Record { + if (input.transport === 'stdio') { + if (!input.command?.trim()) { + throw new AppError('command is required for stdio MCP servers.', { + code: 'MCP_COMMAND_REQUIRED', + statusCode: 400, + }); + } + + return { + type: 'stdio', + command: input.command, + args: input.args ?? [], + env: input.env ?? {}, + }; + } + + if (!input.url?.trim()) { + throw new AppError('url is required for http/sse MCP servers.', { + code: 'MCP_URL_REQUIRED', + statusCode: 400, + }); + } + + return { + type: input.transport, + url: input.url, + headers: input.headers ?? 
{}, + }; + } + + protected normalizeServerConfig( + scope: McpScope, + name: string, + rawConfig: unknown, + ): ProviderMcpServer | null { + if (!rawConfig || typeof rawConfig !== 'object') { + return null; + } + + const config = rawConfig as Record; + if (typeof config.command === 'string') { + return { + provider: 'claude', + name, + scope, + transport: 'stdio', + command: config.command, + args: readStringArray(config.args), + env: readStringRecord(config.env), + }; + } + + if (typeof config.url === 'string') { + const transport = readOptionalString(config.type) === 'sse' ? 'sse' : 'http'; + return { + provider: 'claude', + name, + scope, + transport, + url: config.url, + headers: readStringRecord(config.headers), + }; + } + + return null; + } +} diff --git a/server/modules/providers/list/claude/claude-sessions.provider.ts b/server/modules/providers/list/claude/claude-sessions.provider.ts new file mode 100644 index 00000000..72bbe07e --- /dev/null +++ b/server/modules/providers/list/claude/claude-sessions.provider.ts @@ -0,0 +1,306 @@ +import { getSessionMessages } from '@/projects.js'; +import type { IProviderSessions } from '@/shared/interfaces.js'; +import type { AnyRecord, FetchHistoryOptions, FetchHistoryResult, NormalizedMessage } from '@/shared/types.js'; +import { createNormalizedMessage, generateMessageId, readObjectRecord } from '@/shared/utils.js'; + +const PROVIDER = 'claude'; + +type ClaudeToolResult = { + content: unknown; + isError: boolean; + subagentTools?: unknown; + toolUseResult?: unknown; +}; + +type ClaudeHistoryResult = + | AnyRecord[] + | { + messages?: AnyRecord[]; + total?: number; + hasMore?: boolean; + }; + +const loadClaudeSessionMessages = getSessionMessages as unknown as ( + projectName: string, + sessionId: string, + limit: number | null, + offset: number, +) => Promise; + +/** + * Claude writes internal command and system reminder entries into history. 
+ * Those are useful for the CLI but should not appear in the user-facing chat. + */ +const INTERNAL_CONTENT_PREFIXES = [ + '<command-name>', + '<command-message>', + '<command-args>', + '<local-command-stdout>', + '<system-reminder>', + 'Caveat:', + 'This session is being continued from a previous', + '[Request interrupted', +] as const; + +function isInternalContent(content: string): boolean { + return INTERNAL_CONTENT_PREFIXES.some((prefix) => content.startsWith(prefix)); +} + +export class ClaudeSessionsProvider implements IProviderSessions { + /** + * Normalizes one Claude JSONL entry or live SDK stream event into the shared + * message shape consumed by REST and WebSocket clients. + */ + normalizeMessage(rawMessage: unknown, sessionId: string | null): NormalizedMessage[] { + const raw = readObjectRecord(rawMessage); + if (!raw) { + return []; + } + + if (raw.type === 'content_block_delta' && raw.delta?.text) { + return [createNormalizedMessage({ kind: 'stream_delta', content: raw.delta.text, sessionId, provider: PROVIDER })]; + } + if (raw.type === 'content_block_stop') { + return [createNormalizedMessage({ kind: 'stream_end', sessionId, provider: PROVIDER })]; + } + + const messages: NormalizedMessage[] = []; + const ts = raw.timestamp || new Date().toISOString(); + const baseId = raw.uuid || generateMessageId('claude'); + + if (raw.message?.role === 'user' && raw.message?.content) { + if (Array.isArray(raw.message.content)) { + for (let partIndex = 0; partIndex < raw.message.content.length; partIndex++) { + const part = raw.message.content[partIndex]; + if (part.type === 'tool_result') { + messages.push(createNormalizedMessage({ + id: `${baseId}_tr_${part.tool_use_id}`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'tool_result', + toolId: part.tool_use_id, + content: typeof part.content === 'string' ?
part.content : JSON.stringify(part.content), + isError: Boolean(part.is_error), + subagentTools: raw.subagentTools, + toolUseResult: raw.toolUseResult, + })); + } else if (part.type === 'text') { + const text = part.text || ''; + if (text && !isInternalContent(text)) { + messages.push(createNormalizedMessage({ + id: `${baseId}_text_${partIndex}`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'text', + role: 'user', + content: text, + })); + } + } + } + + if (messages.length === 0) { + const textParts = raw.message.content + .filter((part: AnyRecord) => part.type === 'text') + .map((part: AnyRecord) => part.text) + .filter(Boolean) + .join('\n'); + if (textParts && !isInternalContent(textParts)) { + messages.push(createNormalizedMessage({ + id: `${baseId}_text`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'text', + role: 'user', + content: textParts, + })); + } + } + } else if (typeof raw.message.content === 'string') { + const text = raw.message.content; + if (text && !isInternalContent(text)) { + messages.push(createNormalizedMessage({ + id: baseId, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'text', + role: 'user', + content: text, + })); + } + } + return messages; + } + + if (raw.type === 'thinking' && raw.message?.content) { + messages.push(createNormalizedMessage({ + id: baseId, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'thinking', + content: raw.message.content, + })); + return messages; + } + + if (raw.type === 'tool_use' && raw.toolName) { + messages.push(createNormalizedMessage({ + id: baseId, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'tool_use', + toolName: raw.toolName, + toolInput: raw.toolInput, + toolId: raw.toolCallId || baseId, + })); + return messages; + } + + if (raw.type === 'tool_result') { + messages.push(createNormalizedMessage({ + id: baseId, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'tool_result', + toolId: raw.toolCallId || '', + content: 
raw.output || '', + isError: false, + })); + return messages; + } + + if (raw.message?.role === 'assistant' && raw.message?.content) { + if (Array.isArray(raw.message.content)) { + let partIndex = 0; + for (const part of raw.message.content) { + if (part.type === 'text' && part.text) { + messages.push(createNormalizedMessage({ + id: `${baseId}_${partIndex}`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'text', + role: 'assistant', + content: part.text, + })); + } else if (part.type === 'tool_use') { + messages.push(createNormalizedMessage({ + id: `${baseId}_${partIndex}`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'tool_use', + toolName: part.name, + toolInput: part.input, + toolId: part.id, + })); + } else if (part.type === 'thinking' && part.thinking) { + messages.push(createNormalizedMessage({ + id: `${baseId}_${partIndex}`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'thinking', + content: part.thinking, + })); + } + partIndex++; + } + } else if (typeof raw.message.content === 'string') { + messages.push(createNormalizedMessage({ + id: baseId, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'text', + role: 'assistant', + content: raw.message.content, + })); + } + return messages; + } + + return messages; + } + + /** + * Loads Claude JSONL history for a project/session and returns normalized + * messages, preserving the existing pagination behavior from projects.js. + */ + async fetchHistory( + sessionId: string, + options: FetchHistoryOptions = {}, + ): Promise { + const { projectName, limit = null, offset = 0 } = options; + if (!projectName) { + return { messages: [], total: 0, hasMore: false, offset: 0, limit: null }; + } + + let result: ClaudeHistoryResult; + try { + result = await loadClaudeSessionMessages(projectName, sessionId, limit, offset); + } catch (error) { + const message = error instanceof Error ? 
error.message : String(error); + console.warn(`[ClaudeProvider] Failed to load session ${sessionId}:`, message); + return { messages: [], total: 0, hasMore: false, offset: 0, limit: null }; + } + + const rawMessages = Array.isArray(result) ? result : (result.messages || []); + const total = Array.isArray(result) ? rawMessages.length : (result.total || 0); + const hasMore = Array.isArray(result) ? false : Boolean(result.hasMore); + + const toolResultMap = new Map(); + for (const raw of rawMessages) { + if (raw.message?.role === 'user' && Array.isArray(raw.message?.content)) { + for (const part of raw.message.content) { + if (part.type === 'tool_result' && part.tool_use_id) { + toolResultMap.set(part.tool_use_id, { + content: part.content, + isError: Boolean(part.is_error), + subagentTools: raw.subagentTools, + toolUseResult: raw.toolUseResult, + }); + } + } + } + } + + const normalized: NormalizedMessage[] = []; + for (const raw of rawMessages) { + normalized.push(...this.normalizeMessage(raw, sessionId)); + } + + for (const msg of normalized) { + if (msg.kind === 'tool_use' && msg.toolId && toolResultMap.has(msg.toolId)) { + const toolResult = toolResultMap.get(msg.toolId); + if (!toolResult) { + continue; + } + + msg.toolResult = { + content: typeof toolResult.content === 'string' + ? 
toolResult.content + : JSON.stringify(toolResult.content), + isError: toolResult.isError, + toolUseResult: toolResult.toolUseResult, + }; + msg.subagentTools = toolResult.subagentTools; + } + } + + return { + messages: normalized, + total, + hasMore, + offset, + limit, + }; + } +} diff --git a/server/modules/providers/list/claude/claude.provider.ts b/server/modules/providers/list/claude/claude.provider.ts new file mode 100644 index 00000000..675d82dd --- /dev/null +++ b/server/modules/providers/list/claude/claude.provider.ts @@ -0,0 +1,15 @@ +import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js'; +import { ClaudeProviderAuth } from '@/modules/providers/list/claude/claude-auth.provider.js'; +import { ClaudeMcpProvider } from '@/modules/providers/list/claude/claude-mcp.provider.js'; +import { ClaudeSessionsProvider } from '@/modules/providers/list/claude/claude-sessions.provider.js'; +import type { IProviderAuth, IProviderSessions } from '@/shared/interfaces.js'; + +export class ClaudeProvider extends AbstractProvider { + readonly mcp = new ClaudeMcpProvider(); + readonly auth: IProviderAuth = new ClaudeProviderAuth(); + readonly sessions: IProviderSessions = new ClaudeSessionsProvider(); + + constructor() { + super('claude'); + } +} diff --git a/server/modules/providers/list/codex/codex-auth.provider.ts b/server/modules/providers/list/codex/codex-auth.provider.ts new file mode 100644 index 00000000..e938e70d --- /dev/null +++ b/server/modules/providers/list/codex/codex-auth.provider.ts @@ -0,0 +1,100 @@ +import { readFile } from 'node:fs/promises'; +import os from 'node:os'; +import path from 'node:path'; + +import spawn from 'cross-spawn'; + +import type { IProviderAuth } from '@/shared/interfaces.js'; +import type { ProviderAuthStatus } from '@/shared/types.js'; +import { readObjectRecord, readOptionalString } from '@/shared/utils.js'; + +type CodexCredentialsStatus = { + authenticated: boolean; + email: string | null; + method: 
string | null; + error?: string; +}; + +export class CodexProviderAuth implements IProviderAuth { + /** + * Checks whether Codex is available to the server runtime. + */ + private checkInstalled(): boolean { + try { + spawn.sync('codex', ['--version'], { stdio: 'ignore', timeout: 5000 }); + return true; + } catch { + return false; + } + } + + /** + * Returns Codex SDK availability and credential status. + */ + async getStatus(): Promise { + const installed = this.checkInstalled(); + const credentials = await this.checkCredentials(); + + return { + installed, + provider: 'codex', + authenticated: credentials.authenticated, + email: credentials.email, + method: credentials.method, + error: credentials.authenticated ? undefined : credentials.error || 'Not authenticated', + }; + } + + /** + * Reads Codex auth.json and checks OAuth tokens or an API key fallback. + */ + private async checkCredentials(): Promise { + try { + const authPath = path.join(os.homedir(), '.codex', 'auth.json'); + const content = await readFile(authPath, 'utf8'); + const auth = readObjectRecord(JSON.parse(content)) ?? {}; + const tokens = readObjectRecord(auth.tokens) ?? {}; + const idToken = readOptionalString(tokens.id_token); + const accessToken = readOptionalString(tokens.access_token); + + if (idToken || accessToken) { + return { + authenticated: true, + email: idToken ? this.readEmailFromIdToken(idToken) : 'Authenticated', + method: 'credentials_file', + }; + } + + if (readOptionalString(auth.OPENAI_API_KEY)) { + return { authenticated: true, email: 'API Key Auth', method: 'api_key' }; + } + + return { authenticated: false, email: null, method: null, error: 'No valid tokens found' }; + } catch (error) { + const code = (error as NodeJS.ErrnoException).code; + return { + authenticated: false, + email: null, + method: null, + error: code === 'ENOENT' ? 'Codex not configured' : error instanceof Error ? 
error.message : 'Failed to read Codex auth', + }; + } + } + + /** + * Extracts the user email from a Codex id_token when a readable JWT payload exists. + */ + private readEmailFromIdToken(idToken: string): string { + try { + const parts = idToken.split('.'); + if (parts.length >= 2) { + const payload = readObjectRecord(JSON.parse(Buffer.from(parts[1], 'base64url').toString('utf8'))); + return readOptionalString(payload?.email) ?? readOptionalString(payload?.user) ?? 'Authenticated'; + } + } catch { + // Fall back to a generic authenticated marker if the token payload is not readable. + } + + return 'Authenticated'; + } +} diff --git a/server/modules/providers/list/codex/codex-mcp.provider.ts b/server/modules/providers/list/codex/codex-mcp.provider.ts new file mode 100644 index 00000000..1aeef5d1 --- /dev/null +++ b/server/modules/providers/list/codex/codex-mcp.provider.ts @@ -0,0 +1,135 @@ +import { mkdir, readFile, writeFile } from 'node:fs/promises'; +import os from 'node:os'; +import path from 'node:path'; + +import TOML from '@iarna/toml'; + +import { McpProvider } from '@/modules/providers/shared/mcp/mcp.provider.js'; +import type { McpScope, ProviderMcpServer, UpsertProviderMcpServerInput } from '@/shared/types.js'; +import { + AppError, + readObjectRecord, + readOptionalString, + readStringArray, + readStringRecord, +} from '@/shared/utils.js'; + +const readTomlConfig = async (filePath: string): Promise> => { + try { + const content = await readFile(filePath, 'utf8'); + const parsed = TOML.parse(content) as Record; + return readObjectRecord(parsed) ?? 
{}; + } catch (error) { + const code = (error as NodeJS.ErrnoException).code; + if (code === 'ENOENT') { + return {}; + } + throw error; + } +}; + +const writeTomlConfig = async (filePath: string, data: Record): Promise => { + await mkdir(path.dirname(filePath), { recursive: true }); + const toml = TOML.stringify(data as never); + await writeFile(filePath, toml, 'utf8'); +}; + +export class CodexMcpProvider extends McpProvider { + constructor() { + super('codex', ['user', 'project'], ['stdio', 'http']); + } + + protected async readScopedServers(scope: McpScope, workspacePath: string): Promise> { + const filePath = scope === 'user' + ? path.join(os.homedir(), '.codex', 'config.toml') + : path.join(workspacePath, '.codex', 'config.toml'); + const config = await readTomlConfig(filePath); + return readObjectRecord(config.mcp_servers) ?? {}; + } + + protected async writeScopedServers( + scope: McpScope, + workspacePath: string, + servers: Record, + ): Promise { + const filePath = scope === 'user' + ? path.join(os.homedir(), '.codex', 'config.toml') + : path.join(workspacePath, '.codex', 'config.toml'); + const config = await readTomlConfig(filePath); + config.mcp_servers = servers; + await writeTomlConfig(filePath, config); + } + + protected buildServerConfig(input: UpsertProviderMcpServerInput): Record { + if (input.transport === 'stdio') { + if (!input.command?.trim()) { + throw new AppError('command is required for stdio MCP servers.', { + code: 'MCP_COMMAND_REQUIRED', + statusCode: 400, + }); + } + + return { + command: input.command, + args: input.args ?? [], + env: input.env ?? {}, + env_vars: input.envVars ?? [], + cwd: input.cwd, + }; + } + + if (!input.url?.trim()) { + throw new AppError('url is required for http MCP servers.', { + code: 'MCP_URL_REQUIRED', + statusCode: 400, + }); + } + + return { + url: input.url, + bearer_token_env_var: input.bearerTokenEnvVar, + http_headers: input.headers ?? {}, + env_http_headers: input.envHttpHeaders ?? 
{}, + }; + } + + protected normalizeServerConfig( + scope: McpScope, + name: string, + rawConfig: unknown, + ): ProviderMcpServer | null { + if (!rawConfig || typeof rawConfig !== 'object') { + return null; + } + + const config = rawConfig as Record; + if (typeof config.command === 'string') { + return { + provider: 'codex', + name, + scope, + transport: 'stdio', + command: config.command, + args: readStringArray(config.args), + env: readStringRecord(config.env), + cwd: readOptionalString(config.cwd), + envVars: readStringArray(config.env_vars), + }; + } + + if (typeof config.url === 'string') { + return { + provider: 'codex', + name, + scope, + transport: 'http', + url: config.url, + headers: readStringRecord(config.http_headers), + bearerTokenEnvVar: readOptionalString(config.bearer_token_env_var), + envHttpHeaders: readStringRecord(config.env_http_headers), + }; + } + + return null; + } +} diff --git a/server/modules/providers/list/codex/codex-sessions.provider.ts b/server/modules/providers/list/codex/codex-sessions.provider.ts new file mode 100644 index 00000000..1ea986f7 --- /dev/null +++ b/server/modules/providers/list/codex/codex-sessions.provider.ts @@ -0,0 +1,319 @@ +import { getCodexSessionMessages } from '@/projects.js'; +import type { IProviderSessions } from '@/shared/interfaces.js'; +import type { AnyRecord, FetchHistoryOptions, FetchHistoryResult, NormalizedMessage } from '@/shared/types.js'; +import { createNormalizedMessage, generateMessageId, readObjectRecord } from '@/shared/utils.js'; + +const PROVIDER = 'codex'; + +type CodexHistoryResult = + | AnyRecord[] + | { + messages?: AnyRecord[]; + total?: number; + hasMore?: boolean; + tokenUsage?: unknown; + }; + +const loadCodexSessionMessages = getCodexSessionMessages as unknown as ( + sessionId: string, + limit: number | null, + offset: number, +) => Promise; + +export class CodexSessionsProvider implements IProviderSessions { + /** + * Normalizes a persisted Codex JSONL entry. 
+   *
+   * Live Codex SDK events are transformed before they reach normalizeMessage(),
+   * while history entries already use a compact message/tool shape from projects.js.
+   */
+  private normalizeHistoryEntry(raw: AnyRecord, sessionId: string | null): NormalizedMessage[] {
+    const ts = raw.timestamp || new Date().toISOString();
+    const baseId = raw.uuid || generateMessageId('codex');
+
+    if (raw.message?.role === 'user') {
+      const content = typeof raw.message.content === 'string'
+        ? raw.message.content
+        : Array.isArray(raw.message.content)
+          ? raw.message.content
+            .map((part: string | AnyRecord) => typeof part === 'string' ? part : part?.text || '')
+            .filter(Boolean)
+            .join('\n')
+          : String(raw.message.content || '');
+      if (!content.trim()) {
+        return [];
+      }
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'text',
+        role: 'user',
+        content,
+      })];
+    }
+
+    if (raw.message?.role === 'assistant') {
+      const content = typeof raw.message.content === 'string'
+        ? raw.message.content
+        : Array.isArray(raw.message.content)
+          ? raw.message.content
+            .map((part: string | AnyRecord) => typeof part === 'string' ? part : part?.text || '')
+            .filter(Boolean)
+            .join('\n')
+          : '';
+      if (!content.trim()) {
+        return [];
+      }
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'text',
+        role: 'assistant',
+        content,
+      })];
+    }
+
+    if (raw.type === 'thinking' || raw.isReasoning) {
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'thinking',
+        content: raw.message?.content || '',
+      })];
+    }
+
+    if (raw.type === 'tool_use' || raw.toolName) {
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'tool_use',
+        toolName: raw.toolName || 'Unknown',
+        toolInput: raw.toolInput,
+        toolId: raw.toolCallId || baseId,
+      })];
+    }
+
+    if (raw.type === 'tool_result') {
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'tool_result',
+        toolId: raw.toolCallId || '',
+        content: raw.output || '',
+        isError: Boolean(raw.isError),
+      })];
+    }
+
+    return [];
+  }
+
+  /**
+   * Normalizes either a Codex history entry or a transformed live SDK event.
+   */
+  normalizeMessage(rawMessage: unknown, sessionId: string | null): NormalizedMessage[] {
+    const raw = readObjectRecord(rawMessage);
+    if (!raw) {
+      return [];
+    }
+
+    if (raw.message?.role) {
+      return this.normalizeHistoryEntry(raw, sessionId);
+    }
+
+    const ts = raw.timestamp || new Date().toISOString();
+    const baseId = raw.uuid || generateMessageId('codex');
+
+    if (raw.type === 'item') {
+      switch (raw.itemType) {
+        case 'agent_message':
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'text',
+            role: 'assistant',
+            content: raw.message?.content || '',
+          })];
+        case 'reasoning':
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'thinking',
+            content: raw.message?.content || '',
+          })];
+        case 'command_execution':
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'tool_use',
+            toolName: 'Bash',
+            toolInput: { command: raw.command },
+            toolId: baseId,
+            output: raw.output,
+            exitCode: raw.exitCode,
+            status: raw.status,
+          })];
+        case 'file_change':
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'tool_use',
+            toolName: 'FileChanges',
+            toolInput: raw.changes,
+            toolId: baseId,
+            status: raw.status,
+          })];
+        case 'mcp_tool_call':
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'tool_use',
+            toolName: raw.tool || 'MCP',
+            toolInput: raw.arguments,
+            toolId: baseId,
+            server: raw.server,
+            result: raw.result,
+            error: raw.error,
+            status: raw.status,
+          })];
+        case 'web_search':
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'tool_use',
+            toolName: 'WebSearch',
+            toolInput: { query: raw.query },
+            toolId: baseId,
+          })];
+        case 'todo_list':
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'tool_use',
+            toolName: 'TodoList',
+            toolInput: { items: raw.items },
+            toolId: baseId,
+          })];
+        case 'error':
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'error',
+            content: raw.message?.content || 'Unknown error',
+          })];
+        default:
+          return [createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'tool_use',
+            toolName: raw.itemType || 'Unknown',
+            toolInput: raw.item || raw,
+            toolId: baseId,
+          })];
+      }
+    }
+
+    if (raw.type === 'turn_complete') {
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'complete',
+      })];
+    }
+    if (raw.type === 'turn_failed') {
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'error',
+        content: raw.error?.message || 'Turn failed',
+      })];
+    }
+
+    return [];
+  }
+
+  /**
+   * Loads Codex JSONL history and keeps token usage metadata when projects.js
+   * provides it.
+   */
+  async fetchHistory(
+    sessionId: string,
+    options: FetchHistoryOptions = {},
+  ): Promise<FetchHistoryResult> {
+    const { limit = null, offset = 0 } = options;
+
+    let result: CodexHistoryResult;
+    try {
+      result = await loadCodexSessionMessages(sessionId, limit, offset);
+    } catch (error) {
+      const message = error instanceof Error ? error.message : String(error);
+      console.warn(`[CodexProvider] Failed to load session ${sessionId}:`, message);
+      return { messages: [], total: 0, hasMore: false, offset: 0, limit: null };
+    }
+
+    const rawMessages = Array.isArray(result) ? result : (result.messages || []);
+    const total = Array.isArray(result) ? rawMessages.length : (result.total || 0);
+    const hasMore = Array.isArray(result) ? false : Boolean(result.hasMore);
+    const tokenUsage = Array.isArray(result) ? undefined : result.tokenUsage;
+
+    const normalized: NormalizedMessage[] = [];
+    for (const raw of rawMessages) {
+      normalized.push(...this.normalizeHistoryEntry(raw, sessionId));
+    }
+
+    const toolResultMap = new Map();
+    for (const msg of normalized) {
+      if (msg.kind === 'tool_result' && msg.toolId) {
+        toolResultMap.set(msg.toolId, msg);
+      }
+    }
+    for (const msg of normalized) {
+      if (msg.kind === 'tool_use' && msg.toolId && toolResultMap.has(msg.toolId)) {
+        const toolResult = toolResultMap.get(msg.toolId);
+        if (toolResult) {
+          msg.toolResult = { content: toolResult.content, isError: toolResult.isError };
+        }
+      }
+    }
+
+    return {
+      messages: normalized,
+      total,
+      hasMore,
+      offset,
+      limit,
+      tokenUsage,
+    };
+  }
+}
diff --git a/server/modules/providers/list/codex/codex.provider.ts b/server/modules/providers/list/codex/codex.provider.ts
new file mode 100644
index 00000000..fe1b9eb5
--- /dev/null
+++ b/server/modules/providers/list/codex/codex.provider.ts
@@ -0,0 +1,15 @@
+import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js';
+import { CodexProviderAuth } from '@/modules/providers/list/codex/codex-auth.provider.js';
+import { CodexMcpProvider } from '@/modules/providers/list/codex/codex-mcp.provider.js';
+import { CodexSessionsProvider } from '@/modules/providers/list/codex/codex-sessions.provider.js';
+import type { IProviderAuth, IProviderSessions } from '@/shared/interfaces.js';
+
+export class CodexProvider extends AbstractProvider {
+  readonly mcp = new CodexMcpProvider();
+  readonly auth: IProviderAuth = new CodexProviderAuth();
+  readonly sessions: IProviderSessions = new CodexSessionsProvider();
+
+  constructor() {
+    super('codex');
+  }
+}
diff --git a/server/modules/providers/list/cursor/cursor-auth.provider.ts b/server/modules/providers/list/cursor/cursor-auth.provider.ts
new file mode 100644
index 00000000..7cc035a9
--- /dev/null
+++ b/server/modules/providers/list/cursor/cursor-auth.provider.ts
@@ -0,0 +1,143 @@
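The codex `fetchHistory()` above attaches each `tool_result` back onto its originating `tool_use` message with a two-pass `Map` scan. A minimal standalone sketch of that pairing step, using a simplified stand-in for `NormalizedMessage` (the `Msg` type and `pairToolResults` name here are illustrative, not part of the patch):

```typescript
// Simplified stand-in for NormalizedMessage; the real type carries more fields
// (provider, timestamp, sessionId, etc.).
type Msg = {
  kind: 'tool_use' | 'tool_result';
  toolId: string;
  content?: string;
  isError?: boolean;
  toolResult?: { content?: string; isError?: boolean };
};

// Two-pass pairing: first index tool_result messages by toolId, then attach
// each result to the matching tool_use entry, mirroring the loop in fetchHistory().
function pairToolResults(messages: Msg[]): Msg[] {
  const resultsById = new Map<string, Msg>();
  for (const msg of messages) {
    if (msg.kind === 'tool_result' && msg.toolId) {
      resultsById.set(msg.toolId, msg);
    }
  }
  for (const msg of messages) {
    if (msg.kind === 'tool_use' && msg.toolId) {
      const result = resultsById.get(msg.toolId);
      if (result) {
        msg.toolResult = { content: result.content, isError: result.isError };
      }
    }
  }
  return messages;
}
```

A tool_use with no matching result simply keeps `toolResult` undefined, which is what lets the UI render in-flight tools differently from completed ones.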
+import spawn from 'cross-spawn';
+
+import type { IProviderAuth } from '@/shared/interfaces.js';
+import type { ProviderAuthStatus } from '@/shared/types.js';
+
+type CursorLoginStatus = {
+  authenticated: boolean;
+  email: string | null;
+  method: string | null;
+  error?: string;
+};
+
+export class CursorProviderAuth implements IProviderAuth {
+  /**
+   * Checks whether the cursor-agent CLI is available on this host.
+   */
+  private checkInstalled(): boolean {
+    try {
+      spawn.sync('cursor-agent', ['--version'], { stdio: 'ignore', timeout: 5000 });
+      return true;
+    } catch {
+      return false;
+    }
+  }
+
+  /**
+   * Returns Cursor CLI installation and login status.
+   */
+  async getStatus(): Promise<ProviderAuthStatus> {
+    const installed = this.checkInstalled();
+
+    if (!installed) {
+      return {
+        installed,
+        provider: 'cursor',
+        authenticated: false,
+        email: null,
+        method: null,
+        error: 'Cursor CLI is not installed',
+      };
+    }
+
+    const login = await this.checkCursorLogin();
+
+    return {
+      installed,
+      provider: 'cursor',
+      authenticated: login.authenticated,
+      email: login.email,
+      method: login.method,
+      error: login.authenticated ? undefined : login.error || 'Not logged in',
+    };
+  }
+
+  /**
+   * Runs cursor-agent status and parses the login marker from stdout.
+   */
+  private checkCursorLogin(): Promise<CursorLoginStatus> {
+    return new Promise((resolve) => {
+      let processCompleted = false;
+      let childProcess: ReturnType<typeof spawn> | undefined;
+
+      const timeout = setTimeout(() => {
+        if (!processCompleted) {
+          processCompleted = true;
+          childProcess?.kill();
+          resolve({
+            authenticated: false,
+            email: null,
+            method: null,
+            error: 'Command timeout',
+          });
+        }
+      }, 5000);
+
+      try {
+        childProcess = spawn('cursor-agent', ['status']);
+      } catch {
+        clearTimeout(timeout);
+        processCompleted = true;
+        resolve({
+          authenticated: false,
+          email: null,
+          method: null,
+          error: 'Cursor CLI not found or not installed',
+        });
+        return;
+      }
+
+      let stdout = '';
+      let stderr = '';
+
+      childProcess.stdout?.on('data', (data: Buffer) => {
+        stdout += data.toString();
+      });
+
+      childProcess.stderr?.on('data', (data: Buffer) => {
+        stderr += data.toString();
+      });
+
+      childProcess.on('close', (code) => {
+        if (processCompleted) {
+          return;
+        }
+        processCompleted = true;
+        clearTimeout(timeout);
+
+        if (code === 0) {
+          const emailMatch = stdout.match(/Logged in as ([a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,})/i);
+          if (emailMatch?.[1]) {
+            resolve({ authenticated: true, email: emailMatch[1], method: 'cli' });
+            return;
+          }
+
+          if (stdout.includes('Logged in')) {
+            resolve({ authenticated: true, email: 'Logged in', method: 'cli' });
+            return;
+          }
+
+          resolve({ authenticated: false, email: null, method: null, error: 'Not logged in' });
+          return;
+        }
+
+        resolve({ authenticated: false, email: null, method: null, error: stderr || 'Not logged in' });
+      });
+
+      childProcess.on('error', () => {
+        if (processCompleted) {
+          return;
+        }
+        processCompleted = true;
+        clearTimeout(timeout);
+
+        resolve({
+          authenticated: false,
+          email: null,
+          method: null,
+          error: 'Cursor CLI not found or not installed',
+        });
+      });
+    });
+  }
+}
diff --git a/server/modules/providers/list/cursor/cursor-mcp.provider.ts b/server/modules/providers/list/cursor/cursor-mcp.provider.ts
new file mode 100644
index 00000000..007add53
--- /dev/null
+++ b/server/modules/providers/list/cursor/cursor-mcp.provider.ts
@@ -0,0 +1,108 @@
+import os from 'node:os';
+import path from 'node:path';
+
+import { McpProvider } from '@/modules/providers/shared/mcp/mcp.provider.js';
+import type { McpScope, ProviderMcpServer, UpsertProviderMcpServerInput } from '@/shared/types.js';
+import {
+  AppError,
+  readJsonConfig,
+  readObjectRecord,
+  readOptionalString,
+  readStringArray,
+  readStringRecord,
+  writeJsonConfig,
+} from '@/shared/utils.js';
+
+export class CursorMcpProvider extends McpProvider {
+  constructor() {
+    super('cursor', ['user', 'project'], ['stdio', 'http']);
+  }
+
+  protected async readScopedServers(scope: McpScope, workspacePath: string): Promise<Record<string, unknown>> {
+    const filePath = scope === 'user'
+      ? path.join(os.homedir(), '.cursor', 'mcp.json')
+      : path.join(workspacePath, '.cursor', 'mcp.json');
+    const config = await readJsonConfig(filePath);
+    return readObjectRecord(config.mcpServers) ?? {};
+  }
+
+  protected async writeScopedServers(
+    scope: McpScope,
+    workspacePath: string,
+    servers: Record<string, unknown>,
+  ): Promise<void> {
+    const filePath = scope === 'user'
+      ? path.join(os.homedir(), '.cursor', 'mcp.json')
+      : path.join(workspacePath, '.cursor', 'mcp.json');
+    const config = await readJsonConfig(filePath);
+    config.mcpServers = servers;
+    await writeJsonConfig(filePath, config);
+  }
+
+  protected buildServerConfig(input: UpsertProviderMcpServerInput): Record<string, unknown> {
+    if (input.transport === 'stdio') {
+      if (!input.command?.trim()) {
+        throw new AppError('command is required for stdio MCP servers.', {
+          code: 'MCP_COMMAND_REQUIRED',
+          statusCode: 400,
+        });
+      }
+
+      return {
+        command: input.command,
+        args: input.args ?? [],
+        env: input.env ?? {},
+        cwd: input.cwd,
+      };
+    }
+
+    if (!input.url?.trim()) {
+      throw new AppError('url is required for http MCP servers.', {
+        code: 'MCP_URL_REQUIRED',
+        statusCode: 400,
+      });
+    }
+
+    return {
+      url: input.url,
+      headers: input.headers ?? {},
+    };
+  }
+
+  protected normalizeServerConfig(
+    scope: McpScope,
+    name: string,
+    rawConfig: unknown,
+  ): ProviderMcpServer | null {
+    if (!rawConfig || typeof rawConfig !== 'object') {
+      return null;
+    }
+
+    const config = rawConfig as Record<string, unknown>;
+    if (typeof config.command === 'string') {
+      return {
+        provider: 'cursor',
+        name,
+        scope,
+        transport: 'stdio',
+        command: config.command,
+        args: readStringArray(config.args),
+        env: readStringRecord(config.env),
+        cwd: readOptionalString(config.cwd),
+      };
+    }
+
+    if (typeof config.url === 'string') {
+      return {
+        provider: 'cursor',
+        name,
+        scope,
+        transport: 'http',
+        url: config.url,
+        headers: readStringRecord(config.headers),
+      };
+    }
+
+    return null;
+  }
+}
diff --git a/server/modules/providers/list/cursor/cursor-sessions.provider.ts b/server/modules/providers/list/cursor/cursor-sessions.provider.ts
new file mode 100644
index 00000000..e276ba8c
--- /dev/null
+++ b/server/modules/providers/list/cursor/cursor-sessions.provider.ts
@@ -0,0 +1,421 @@
+import crypto from 'node:crypto';
+import os from 'node:os';
+import path from 'node:path';
+
+import type { IProviderSessions } from '@/shared/interfaces.js';
+import type { AnyRecord, FetchHistoryOptions, FetchHistoryResult, NormalizedMessage } from '@/shared/types.js';
+import { createNormalizedMessage, generateMessageId, readObjectRecord } from '@/shared/utils.js';
+
+const PROVIDER = 'cursor';
+
+type CursorDbBlob = {
+  rowid: number;
+  id: string;
+  data?: Buffer;
+};
+
+type CursorJsonBlob = CursorDbBlob & {
+  parsed: AnyRecord;
+};
+
+type CursorMessageBlob = {
+  id: string;
+  sequence: number;
+  rowid: number;
+  content: AnyRecord;
+};
+
+function sanitizeCursorSessionId(sessionId: string): string {
+  const normalized = sessionId.trim();
+  if (!normalized) {
+    throw new Error('Cursor session id is required.');
+  }
+
+  if (
+    normalized.includes('..')
+    || normalized.includes(path.posix.sep)
+    || normalized.includes(path.win32.sep)
+    || normalized !== path.basename(normalized)
+  ) {
+    throw new Error(`Invalid cursor session id "${sessionId}".`);
+  }
+
+  return normalized;
+}
+
+export class CursorSessionsProvider implements IProviderSessions {
+  /**
+   * Loads Cursor's SQLite blob DAG and returns message blobs in conversation
+   * order. Cursor history is stored as content-addressed blobs rather than JSONL.
+   */
+  private async loadCursorBlobs(sessionId: string, projectPath: string): Promise<CursorMessageBlob[]> {
+    // Lazy-import better-sqlite3 so the module doesn't fail if it's unavailable
+    const { default: Database } = await import('better-sqlite3');
+
+    const cwdId = crypto.createHash('md5').update(projectPath || process.cwd()).digest('hex');
+    const safeSessionId = sanitizeCursorSessionId(sessionId);
+    const baseChatsPath = path.join(os.homedir(), '.cursor', 'chats', cwdId);
+    const storeDbPath = path.join(baseChatsPath, safeSessionId, 'store.db');
+    const resolvedBaseChatsPath = path.resolve(baseChatsPath);
+    const resolvedStoreDbPath = path.resolve(storeDbPath);
+    const relativeStorePath = path.relative(resolvedBaseChatsPath, resolvedStoreDbPath);
+    if (relativeStorePath.startsWith('..') || path.isAbsolute(relativeStorePath)) {
+      throw new Error(`Invalid cursor session path for "${sessionId}".`);
+    }
+
+    const db = new Database(resolvedStoreDbPath, { readonly: true, fileMustExist: true });
+
+    try {
+      const allBlobs = db.prepare<[], CursorDbBlob>('SELECT rowid, id, data FROM blobs').all();
+
+      const blobMap = new Map();
+      const parentRefs = new Map();
+      const childRefs = new Map();
+      const jsonBlobs: CursorJsonBlob[] = [];
+
+      for (const blob of allBlobs) {
+        blobMap.set(blob.id, blob);
+
+        if (blob.data && blob.data[0] === 0x7B) {
+          try {
+            const parsed = JSON.parse(blob.data.toString('utf8')) as AnyRecord;
+            jsonBlobs.push({ ...blob, parsed });
+          } catch {
+            // Cursor can include binary or partial blobs; only JSON blobs become messages.
+          }
+        }
+      }
+
+      for (const blob of allBlobs) {
+        if (!blob.data || blob.data[0] === 0x7B) {
+          continue;
+        }
+
+        const parents: string[] = [];
+        let i = 0;
+        while (i < blob.data.length - 33) {
+          if (blob.data[i] === 0x0A && blob.data[i + 1] === 0x20) {
+            const parentHash = blob.data.slice(i + 2, i + 34).toString('hex');
+            if (blobMap.has(parentHash)) {
+              parents.push(parentHash);
+            }
+            i += 34;
+          } else {
+            i++;
+          }
+        }
+
+        if (parents.length > 0) {
+          parentRefs.set(blob.id, parents);
+          for (const parentId of parents) {
+            if (!childRefs.has(parentId)) {
+              childRefs.set(parentId, []);
+            }
+            childRefs.get(parentId)?.push(blob.id);
+          }
+        }
+      }
+
+      const visited = new Set();
+      const sorted: CursorDbBlob[] = [];
+      const visit = (nodeId: string): void => {
+        if (visited.has(nodeId)) {
+          return;
+        }
+        visited.add(nodeId);
+        for (const parentId of parentRefs.get(nodeId) || []) {
+          visit(parentId);
+        }
+        const blob = blobMap.get(nodeId);
+        if (blob) {
+          sorted.push(blob);
+        }
+      };
+
+      for (const blob of allBlobs) {
+        if (!parentRefs.has(blob.id)) {
+          visit(blob.id);
+        }
+      }
+      for (const blob of allBlobs) {
+        visit(blob.id);
+      }
+
+      const messageOrder = new Map();
+      let orderIndex = 0;
+      for (const blob of sorted) {
+        if (blob.data && blob.data[0] !== 0x7B) {
+          for (const jsonBlob of jsonBlobs) {
+            try {
+              const idBytes = Buffer.from(jsonBlob.id, 'hex');
+              if (blob.data.includes(idBytes) && !messageOrder.has(jsonBlob.id)) {
+                messageOrder.set(jsonBlob.id, orderIndex++);
+              }
+            } catch {
+              // Ignore malformed blob ids that cannot be decoded as hex.
+            }
+          }
+        }
+      }
+
+      const sortedJsonBlobs = jsonBlobs.sort((a, b) => {
+        const aOrder = messageOrder.get(a.id) ?? Number.MAX_SAFE_INTEGER;
+        const bOrder = messageOrder.get(b.id) ?? Number.MAX_SAFE_INTEGER;
+        return aOrder !== bOrder ? aOrder - bOrder : a.rowid - b.rowid;
+      });
+
+      const messages: CursorMessageBlob[] = [];
+      for (let idx = 0; idx < sortedJsonBlobs.length; idx++) {
+        const blob = sortedJsonBlobs[idx];
+        const parsed = blob.parsed;
+        const role = parsed?.role || parsed?.message?.role;
+        if (role === 'system') {
+          continue;
+        }
+        messages.push({
+          id: blob.id,
+          sequence: idx + 1,
+          rowid: blob.rowid,
+          content: parsed,
+        });
+      }
+
+      return messages;
+    } finally {
+      db.close();
+    }
+  }
+
+  /**
+   * Normalizes live Cursor CLI NDJSON events. Persisted Cursor history is
+   * normalized from SQLite blobs in fetchHistory().
+   */
+  normalizeMessage(rawMessage: unknown, sessionId: string | null): NormalizedMessage[] {
+    const raw = readObjectRecord(rawMessage);
+    if (raw?.type === 'assistant' && raw.message?.content?.[0]?.text) {
+      return [createNormalizedMessage({
+        kind: 'stream_delta',
+        content: raw.message.content[0].text,
+        sessionId,
+        provider: PROVIDER,
+      })];
+    }
+
+    if (typeof rawMessage === 'string' && rawMessage.trim()) {
+      return [createNormalizedMessage({
+        kind: 'stream_delta',
+        content: rawMessage,
+        sessionId,
+        provider: PROVIDER,
+      })];
+    }
+
+    return [];
+  }
+
+  /**
+   * Fetches and paginates Cursor session history from its project-scoped store.db.
+   */
+  async fetchHistory(
+    sessionId: string,
+    options: FetchHistoryOptions = {},
+  ): Promise<FetchHistoryResult> {
+    const { projectPath = '', limit = null, offset = 0 } = options;
+
+    try {
+      const blobs = await this.loadCursorBlobs(sessionId, projectPath);
+      const allNormalized = this.normalizeCursorBlobs(blobs, sessionId);
+      const total = allNormalized.length;
+
+      if (limit !== null) {
+        const start = offset;
+        const page = limit === 0
+          ? []
+          : allNormalized.slice(start, start + limit);
+        const hasMore = limit === 0
+          ? start < total
+          : start + limit < total;
+        return {
+          messages: page,
+          total,
+          hasMore,
+          offset,
+          limit,
+        };
+      }
+
+      return {
+        messages: allNormalized,
+        total,
+        hasMore: false,
+        offset: 0,
+        limit: null,
+      };
+    } catch (error) {
+      const message = error instanceof Error ? error.message : String(error);
+      console.warn(`[CursorProvider] Failed to load session ${sessionId}:`, message);
+      return { messages: [], total: 0, hasMore: false, offset: 0, limit: null };
+    }
+  }
+
+  /**
+   * Converts Cursor SQLite message blobs into normalized messages and attaches
+   * matching tool results to their tool_use entries.
+   */
+  private normalizeCursorBlobs(blobs: CursorMessageBlob[], sessionId: string | null): NormalizedMessage[] {
+    const messages: NormalizedMessage[] = [];
+    const toolUseMap = new Map();
+    const baseTime = Date.now();
+
+    for (let i = 0; i < blobs.length; i++) {
+      const blob = blobs[i];
+      const content = blob.content;
+      const ts = new Date(baseTime + (blob.sequence ?? i) * 100).toISOString();
+      const baseId = blob.id || generateMessageId('cursor');
+
+      try {
+        if (!content?.role || !content?.content) {
+          if (content?.message?.role && content?.message?.content) {
+            if (content.message.role === 'system') {
+              continue;
+            }
+            const role = content.message.role === 'user' ? 'user' : 'assistant';
+            let text = '';
+            if (Array.isArray(content.message.content)) {
+              text = content.message.content
+                .map((part: string | AnyRecord) => typeof part === 'string' ? part : part?.text || '')
+                .filter(Boolean)
+                .join('\n');
+            } else if (typeof content.message.content === 'string') {
+              text = content.message.content;
+            }
+            if (text?.trim()) {
+              messages.push(createNormalizedMessage({
+                id: baseId,
+                sessionId,
+                timestamp: ts,
+                provider: PROVIDER,
+                kind: 'text',
+                role,
+                content: text,
+                sequence: blob.sequence,
+                rowid: blob.rowid,
+              }));
+            }
+          }
+          continue;
+        }
+
+        if (content.role === 'system') {
+          continue;
+        }
+
+        if (content.role === 'tool') {
+          const toolItems = Array.isArray(content.content) ? content.content : [];
+          for (const item of toolItems) {
+            if (item?.type !== 'tool-result') {
+              continue;
+            }
+            const toolCallId = item.toolCallId || content.id;
+            messages.push(createNormalizedMessage({
+              id: `${baseId}_tr`,
+              sessionId,
+              timestamp: ts,
+              provider: PROVIDER,
+              kind: 'tool_result',
+              toolId: toolCallId,
+              content: item.result || '',
+              isError: false,
+            }));
+          }
+          continue;
+        }
+
+        const role = content.role === 'user' ? 'user' : 'assistant';
+
+        if (Array.isArray(content.content)) {
+          for (let partIdx = 0; partIdx < content.content.length; partIdx++) {
+            const part = content.content[partIdx];
+
+            if (part?.type === 'text' && part?.text) {
+              messages.push(createNormalizedMessage({
+                id: `${baseId}_${partIdx}`,
+                sessionId,
+                timestamp: ts,
+                provider: PROVIDER,
+                kind: 'text',
+                role,
+                content: part.text,
+                sequence: blob.sequence,
+                rowid: blob.rowid,
+              }));
+            } else if (part?.type === 'reasoning' && part?.text) {
+              messages.push(createNormalizedMessage({
+                id: `${baseId}_${partIdx}`,
+                sessionId,
+                timestamp: ts,
+                provider: PROVIDER,
+                kind: 'thinking',
+                content: part.text,
+              }));
+            } else if (part?.type === 'tool-call' || part?.type === 'tool_use') {
+              const rawToolName = part.toolName || part.name || 'Unknown Tool';
+              const toolName = rawToolName === 'ApplyPatch' ? 'Edit' : rawToolName;
+              const toolId = part.toolCallId || part.id || `tool_${i}_${partIdx}`;
+              const message = createNormalizedMessage({
+                id: `${baseId}_${partIdx}`,
+                sessionId,
+                timestamp: ts,
+                provider: PROVIDER,
+                kind: 'tool_use',
+                toolName,
+                toolInput: part.args || part.input,
+                toolId,
+              });
+              messages.push(message);
+              toolUseMap.set(toolId, message);
+            }
+          }
+        } else if (typeof content.content === 'string' && content.content.trim()) {
+          messages.push(createNormalizedMessage({
+            id: baseId,
+            sessionId,
+            timestamp: ts,
+            provider: PROVIDER,
+            kind: 'text',
+            role,
+            content: content.content,
+            sequence: blob.sequence,
+            rowid: blob.rowid,
+          }));
+        }
+      } catch (error) {
+        console.warn('Error normalizing cursor blob:', error);
+      }
+    }
+
+    for (const msg of messages) {
+      if (msg.kind === 'tool_result' && msg.toolId && toolUseMap.has(msg.toolId)) {
+        const toolUse = toolUseMap.get(msg.toolId);
+        if (toolUse) {
+          toolUse.toolResult = {
+            content: msg.content,
+            isError: msg.isError,
+          };
+        }
+      }
+    }
+
+    messages.sort((a, b) => {
+      if (a.sequence !== undefined && b.sequence !== undefined) {
+        return a.sequence - b.sequence;
+      }
+      if (a.rowid !== undefined && b.rowid !== undefined) {
+        return a.rowid - b.rowid;
+      }
+      return new Date(a.timestamp).getTime() - new Date(b.timestamp).getTime();
+    });
+
+    return messages;
+  }
+}
diff --git a/server/modules/providers/list/cursor/cursor.provider.ts b/server/modules/providers/list/cursor/cursor.provider.ts
new file mode 100644
index 00000000..7e834a10
--- /dev/null
+++ b/server/modules/providers/list/cursor/cursor.provider.ts
@@ -0,0 +1,15 @@
+import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js';
+import { CursorProviderAuth } from '@/modules/providers/list/cursor/cursor-auth.provider.js';
+import { CursorMcpProvider } from '@/modules/providers/list/cursor/cursor-mcp.provider.js';
+import { CursorSessionsProvider } from '@/modules/providers/list/cursor/cursor-sessions.provider.js';
+import type { IProviderAuth, IProviderSessions } from '@/shared/interfaces.js';
+
+export class CursorProvider extends AbstractProvider {
+  readonly mcp = new CursorMcpProvider();
+  readonly auth: IProviderAuth = new CursorProviderAuth();
+  readonly sessions: IProviderSessions = new CursorSessionsProvider();
+
+  constructor() {
+    super('cursor');
+  }
+}
diff --git a/server/modules/providers/list/gemini/gemini-auth.provider.ts b/server/modules/providers/list/gemini/gemini-auth.provider.ts
new file mode 100644
index 00000000..60b0749e
--- /dev/null
+++ b/server/modules/providers/list/gemini/gemini-auth.provider.ts
@@ -0,0 +1,151 @@
+import { readFile } from 'node:fs/promises';
+import os from 'node:os';
+import path from 'node:path';
+
+import spawn from 'cross-spawn';
+
+import type { IProviderAuth } from '@/shared/interfaces.js';
+import type { ProviderAuthStatus } from '@/shared/types.js';
+import { readObjectRecord, readOptionalString } from '@/shared/utils.js';
+
+type GeminiCredentialsStatus = {
+  authenticated: boolean;
+  email: string | null;
+  method: string | null;
+  error?: string;
+};
+
+export class GeminiProviderAuth implements IProviderAuth {
+  /**
+   * Checks whether the Gemini CLI is available on this host.
+   */
+  private checkInstalled(): boolean {
+    const cliPath = process.env.GEMINI_PATH || 'gemini';
+    try {
+      spawn.sync(cliPath, ['--version'], { stdio: 'ignore', timeout: 5000 });
+      return true;
+    } catch {
+      return false;
+    }
+  }
+
+  /**
+   * Returns Gemini CLI installation and credential status.
+   */
+  async getStatus(): Promise<ProviderAuthStatus> {
+    const installed = this.checkInstalled();
+
+    if (!installed) {
+      return {
+        installed,
+        provider: 'gemini',
+        authenticated: false,
+        email: null,
+        method: null,
+        error: 'Gemini CLI is not installed',
+      };
+    }
+
+    const credentials = await this.checkCredentials();
+
+    return {
+      installed,
+      provider: 'gemini',
+      authenticated: credentials.authenticated,
+      email: credentials.email,
+      method: credentials.method,
+      error: credentials.authenticated ? undefined : credentials.error || 'Not authenticated',
+    };
+  }
+
+  /**
+   * Checks Gemini credentials from API key env vars or local OAuth credential files.
+   */
+  private async checkCredentials(): Promise<GeminiCredentialsStatus> {
+    if (process.env.GEMINI_API_KEY?.trim()) {
+      return { authenticated: true, email: 'API Key Auth', method: 'api_key' };
+    }
+
+    try {
+      const credsPath = path.join(os.homedir(), '.gemini', 'oauth_creds.json');
+      const content = await readFile(credsPath, 'utf8');
+      const creds = readObjectRecord(JSON.parse(content)) ?? {};
+      const accessToken = readOptionalString(creds.access_token);
+
+      if (!accessToken) {
+        return {
+          authenticated: false,
+          email: null,
+          method: null,
+          error: 'No valid tokens found in oauth_creds',
+        };
+      }
+
+      const refreshToken = readOptionalString(creds.refresh_token);
+      const tokenInfo = await this.getTokenInfoEmail(accessToken);
+      if (tokenInfo.valid) {
+        return {
+          authenticated: true,
+          email: tokenInfo.email || 'OAuth Session',
+          method: 'credentials_file',
+        };
+      }
+
+      if (!refreshToken) {
+        return {
+          authenticated: false,
+          email: null,
+          method: 'credentials_file',
+          error: 'Access token invalid and no refresh token found',
+        };
+      }
+
+      return {
+        authenticated: true,
+        email: await this.getActiveAccountEmail() || 'OAuth Session',
+        method: 'credentials_file',
+      };
+    } catch {
+      return {
+        authenticated: false,
+        email: null,
+        method: null,
+        error: 'Gemini CLI not configured',
+      };
+    }
+  }
+
+  /**
+   * Validates a Gemini OAuth access token and returns an email when Google reports one.
+   */
+  private async getTokenInfoEmail(accessToken: string): Promise<{ valid: boolean; email: string | null }> {
+    try {
+      const tokenRes = await fetch(`https://oauth2.googleapis.com/tokeninfo?access_token=${accessToken}`);
+      if (!tokenRes.ok) {
+        return { valid: false, email: null };
+      }
+
+      const tokenInfo = readObjectRecord(await tokenRes.json());
+      return {
+        valid: true,
+        email: readOptionalString(tokenInfo?.email) ?? null,
+      };
+    } catch {
+      return { valid: false, email: null };
+    }
+  }
+
+  /**
+   * Reads Gemini's active local Google account as an offline fallback for display.
+   */
+  private async getActiveAccountEmail(): Promise<string | null> {
+    try {
+      const accPath = path.join(os.homedir(), '.gemini', 'google_accounts.json');
+      const accContent = await readFile(accPath, 'utf8');
+      const accounts = readObjectRecord(JSON.parse(accContent));
+      return readOptionalString(accounts?.active) ?? null;
+    } catch {
+      return null;
+    }
+  }
+}
diff --git a/server/modules/providers/list/gemini/gemini-mcp.provider.ts b/server/modules/providers/list/gemini/gemini-mcp.provider.ts
new file mode 100644
index 00000000..b86b8f2d
--- /dev/null
+++ b/server/modules/providers/list/gemini/gemini-mcp.provider.ts
@@ -0,0 +1,110 @@
+import os from 'node:os';
+import path from 'node:path';
+
+import { McpProvider } from '@/modules/providers/shared/mcp/mcp.provider.js';
+import type { McpScope, ProviderMcpServer, UpsertProviderMcpServerInput } from '@/shared/types.js';
+import {
+  AppError,
+  readJsonConfig,
+  readObjectRecord,
+  readOptionalString,
+  readStringArray,
+  readStringRecord,
+  writeJsonConfig,
+} from '@/shared/utils.js';
+
+export class GeminiMcpProvider extends McpProvider {
+  constructor() {
+    super('gemini', ['user', 'project'], ['stdio', 'http', 'sse']);
+  }
+
+  protected async readScopedServers(scope: McpScope, workspacePath: string): Promise<Record<string, unknown>> {
+    const filePath = scope === 'user'
+      ? path.join(os.homedir(), '.gemini', 'settings.json')
+      : path.join(workspacePath, '.gemini', 'settings.json');
+    const config = await readJsonConfig(filePath);
+    return readObjectRecord(config.mcpServers) ?? {};
+  }
+
+  protected async writeScopedServers(
+    scope: McpScope,
+    workspacePath: string,
+    servers: Record<string, unknown>,
+  ): Promise<void> {
+    const filePath = scope === 'user'
+      ? path.join(os.homedir(), '.gemini', 'settings.json')
+      : path.join(workspacePath, '.gemini', 'settings.json');
+    const config = await readJsonConfig(filePath);
+    config.mcpServers = servers;
+    await writeJsonConfig(filePath, config);
+  }
+
+  protected buildServerConfig(input: UpsertProviderMcpServerInput): Record<string, unknown> {
+    if (input.transport === 'stdio') {
+      if (!input.command?.trim()) {
+        throw new AppError('command is required for stdio MCP servers.', {
+          code: 'MCP_COMMAND_REQUIRED',
+          statusCode: 400,
+        });
+      }
+
+      return {
+        command: input.command,
+        args: input.args ?? [],
+        env: input.env ?? {},
+        cwd: input.cwd,
+      };
+    }
+
+    if (!input.url?.trim()) {
+      throw new AppError('url is required for http/sse MCP servers.', {
+        code: 'MCP_URL_REQUIRED',
+        statusCode: 400,
+      });
+    }
+
+    return {
+      type: input.transport,
+      url: input.url,
+      headers: input.headers ?? {},
+    };
+  }
+
+  protected normalizeServerConfig(
+    scope: McpScope,
+    name: string,
+    rawConfig: unknown,
+  ): ProviderMcpServer | null {
+    if (!rawConfig || typeof rawConfig !== 'object') {
+      return null;
+    }
+
+    const config = rawConfig as Record<string, unknown>;
+    if (typeof config.command === 'string') {
+      return {
+        provider: 'gemini',
+        name,
+        scope,
+        transport: 'stdio',
+        command: config.command,
+        args: readStringArray(config.args),
+        env: readStringRecord(config.env),
+        cwd: readOptionalString(config.cwd),
+      };
+    }
+
+    if (typeof config.url === 'string') {
+      const transport = readOptionalString(config.type) === 'sse' ? 'sse' : 'http';
+      return {
+        provider: 'gemini',
+        name,
+        scope,
+        transport,
+        url: config.url,
+        headers: readStringRecord(config.headers),
+      };
+    }
+
+    return null;
+  }
+}
diff --git a/server/modules/providers/list/gemini/gemini-sessions.provider.ts b/server/modules/providers/list/gemini/gemini-sessions.provider.ts
new file mode 100644
index 00000000..7d5b5f1a
--- /dev/null
+++ b/server/modules/providers/list/gemini/gemini-sessions.provider.ts
@@ -0,0 +1,227 @@
+import sessionManager from '@/sessionManager.js';
+import { getGeminiCliSessionMessages } from '@/projects.js';
+import type { IProviderSessions } from '@/shared/interfaces.js';
+import type { AnyRecord, FetchHistoryOptions, FetchHistoryResult, NormalizedMessage } from '@/shared/types.js';
+import { createNormalizedMessage, generateMessageId, readObjectRecord } from '@/shared/utils.js';
+
+const PROVIDER = 'gemini';
+
+export class GeminiSessionsProvider implements IProviderSessions {
+  /**
+   * Normalizes live Gemini stream-json events into the shared message shape.
+   *
+   * Gemini history uses a different session file shape, so fetchHistory handles
+   * that separately after loading raw persisted messages.
+   */
+  normalizeMessage(rawMessage: unknown, sessionId: string | null): NormalizedMessage[] {
+    const raw = readObjectRecord(rawMessage);
+    if (!raw) {
+      return [];
+    }
+
+    const ts = raw.timestamp || new Date().toISOString();
+    const baseId = raw.uuid || generateMessageId('gemini');
+
+    if (raw.type === 'message' && raw.role === 'assistant') {
+      const content = raw.content || '';
+      const messages: NormalizedMessage[] = [];
+      if (content) {
+        messages.push(createNormalizedMessage({
+          id: baseId,
+          sessionId,
+          timestamp: ts,
+          provider: PROVIDER,
+          kind: 'stream_delta',
+          content,
+        }));
+      }
+      if (raw.delta !== true) {
+        messages.push(createNormalizedMessage({
+          sessionId,
+          timestamp: ts,
+          provider: PROVIDER,
+          kind: 'stream_end',
+        }));
+      }
+      return messages;
+    }
+
+    if (raw.type === 'tool_use') {
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'tool_use',
+        toolName: raw.tool_name,
+        toolInput: raw.parameters || {},
+        toolId: raw.tool_id || baseId,
+      })];
+    }
+
+    if (raw.type === 'tool_result') {
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'tool_result',
+        toolId: raw.tool_id || '',
+        content: raw.output === undefined ? '' : String(raw.output),
+        isError: raw.status === 'error',
+      })];
+    }
+
+    if (raw.type === 'result') {
+      const messages = [createNormalizedMessage({
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'stream_end',
+      })];
+      if (raw.stats?.total_tokens) {
+        messages.push(createNormalizedMessage({
+          sessionId,
+          timestamp: ts,
+          provider: PROVIDER,
+          kind: 'status',
+          text: 'Complete',
+          tokens: raw.stats.total_tokens,
+          canInterrupt: false,
+        }));
+      }
+      return messages;
+    }
+
+    if (raw.type === 'error') {
+      return [createNormalizedMessage({
+        id: baseId,
+        sessionId,
+        timestamp: ts,
+        provider: PROVIDER,
+        kind: 'error',
+        content: raw.error || raw.message || 'Unknown Gemini streaming error',
+      })];
+    }
+
+    return [];
+  }
+
+  /**
+   * Loads Gemini history from the in-memory session manager first, then falls
+   * back to Gemini CLI session files on disk.
+   */
+  async fetchHistory(
+    sessionId: string,
+    options: FetchHistoryOptions = {},
+  ): Promise<FetchHistoryResult> {
+    const { limit = null, offset = 0 } = options;
+
+    let rawMessages: AnyRecord[];
+    try {
+      rawMessages = sessionManager.getSessionMessages(sessionId) as AnyRecord[];
+
+      if (rawMessages.length === 0) {
+        rawMessages = await getGeminiCliSessionMessages(sessionId) as AnyRecord[];
+      }
+    } catch (error) {
+      const message = error instanceof Error ? error.message : String(error);
+      console.warn(`[GeminiProvider] Failed to load session ${sessionId}:`, message);
+      return { messages: [], total: 0, hasMore: false, offset: 0, limit: null };
+    }
+
+    const normalized: NormalizedMessage[] = [];
+    for (let i = 0; i < rawMessages.length; i++) {
+      const raw = rawMessages[i];
+      const ts = raw.timestamp || new Date().toISOString();
+      const baseId = raw.uuid || generateMessageId('gemini');
+
+      const role = raw.message?.role || raw.role;
+      const content = raw.message?.content || raw.content;
+
+      if (!role || !content) {
+        continue;
+      }
+
+      const normalizedRole = role === 'user' ?
'user' : 'assistant'; + + if (Array.isArray(content)) { + for (let partIdx = 0; partIdx < content.length; partIdx++) { + const part = content[partIdx]; + if (part.type === 'text' && part.text) { + normalized.push(createNormalizedMessage({ + id: `${baseId}_${partIdx}`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'text', + role: normalizedRole, + content: part.text, + })); + } else if (part.type === 'tool_use') { + normalized.push(createNormalizedMessage({ + id: `${baseId}_${partIdx}`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'tool_use', + toolName: part.name, + toolInput: part.input, + toolId: part.id || generateMessageId('gemini_tool'), + })); + } else if (part.type === 'tool_result') { + normalized.push(createNormalizedMessage({ + id: `${baseId}_${partIdx}`, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'tool_result', + toolId: part.tool_use_id || '', + content: part.content === undefined ? '' : String(part.content), + isError: Boolean(part.is_error), + })); + } + } + } else if (typeof content === 'string' && content.trim()) { + normalized.push(createNormalizedMessage({ + id: baseId, + sessionId, + timestamp: ts, + provider: PROVIDER, + kind: 'text', + role: normalizedRole, + content, + })); + } + } + + const toolResultMap = new Map(); + for (const msg of normalized) { + if (msg.kind === 'tool_result' && msg.toolId) { + toolResultMap.set(msg.toolId, msg); + } + } + for (const msg of normalized) { + if (msg.kind === 'tool_use' && msg.toolId && toolResultMap.has(msg.toolId)) { + const toolResult = toolResultMap.get(msg.toolId); + if (toolResult) { + msg.toolResult = { content: toolResult.content, isError: toolResult.isError }; + } + } + } + + const start = Math.max(0, offset); + const pageLimit = limit === null ? null : Math.max(0, limit); + const messages = pageLimit === null + ? 
normalized.slice(start) + : normalized.slice(start, start + pageLimit); + + return { + messages, + total: normalized.length, + hasMore: pageLimit === null ? false : start + pageLimit < normalized.length, + offset: start, + limit: pageLimit, + }; + } +} diff --git a/server/modules/providers/list/gemini/gemini.provider.ts b/server/modules/providers/list/gemini/gemini.provider.ts new file mode 100644 index 00000000..d968b7c0 --- /dev/null +++ b/server/modules/providers/list/gemini/gemini.provider.ts @@ -0,0 +1,15 @@ +import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js'; +import { GeminiProviderAuth } from '@/modules/providers/list/gemini/gemini-auth.provider.js'; +import { GeminiMcpProvider } from '@/modules/providers/list/gemini/gemini-mcp.provider.js'; +import { GeminiSessionsProvider } from '@/modules/providers/list/gemini/gemini-sessions.provider.js'; +import type { IProviderAuth, IProviderSessions } from '@/shared/interfaces.js'; + +export class GeminiProvider extends AbstractProvider { + readonly mcp = new GeminiMcpProvider(); + readonly auth: IProviderAuth = new GeminiProviderAuth(); + readonly sessions: IProviderSessions = new GeminiSessionsProvider(); + + constructor() { + super('gemini'); + } +} diff --git a/server/modules/providers/provider.registry.ts b/server/modules/providers/provider.registry.ts new file mode 100644 index 00000000..2f959b22 --- /dev/null +++ b/server/modules/providers/provider.registry.ts @@ -0,0 +1,36 @@ +import { ClaudeProvider } from '@/modules/providers/list/claude/claude.provider.js'; +import { CodexProvider } from '@/modules/providers/list/codex/codex.provider.js'; +import { CursorProvider } from '@/modules/providers/list/cursor/cursor.provider.js'; +import { GeminiProvider } from '@/modules/providers/list/gemini/gemini.provider.js'; +import type { IProvider } from '@/shared/interfaces.js'; +import type { LLMProvider } from '@/shared/types.js'; +import { AppError } from '@/shared/utils.js'; + 
+const providers: Record = { + claude: new ClaudeProvider(), + codex: new CodexProvider(), + cursor: new CursorProvider(), + gemini: new GeminiProvider(), +}; + +/** + * Central registry for resolving concrete provider implementations by id. + */ +export const providerRegistry = { + listProviders(): IProvider[] { + return Object.values(providers); + }, + + resolveProvider(provider: string): IProvider { + const key = provider as LLMProvider; + const resolvedProvider = providers[key]; + if (!resolvedProvider) { + throw new AppError(`Unsupported provider "${provider}".`, { + code: 'UNSUPPORTED_PROVIDER', + statusCode: 400, + }); + } + + return resolvedProvider; + }, +}; diff --git a/server/modules/providers/provider.routes.ts b/server/modules/providers/provider.routes.ts new file mode 100644 index 00000000..895aba84 --- /dev/null +++ b/server/modules/providers/provider.routes.ts @@ -0,0 +1,217 @@ +import express, { type Request, type Response } from 'express'; + +import { providerAuthService } from '@/modules/providers/services/provider-auth.service.js'; +import { providerMcpService } from '@/modules/providers/services/mcp.service.js'; +import type { LLMProvider, McpScope, McpTransport, UpsertProviderMcpServerInput } from '@/shared/types.js'; +import { AppError, asyncHandler, createApiSuccessResponse } from '@/shared/utils.js'; + +const router = express.Router(); + +const readPathParam = (value: unknown, name: string): string => { + if (typeof value === 'string') { + return value; + } + + if (Array.isArray(value) && typeof value[0] === 'string') { + return value[0]; + } + + throw new AppError(`${name} path parameter is invalid.`, { + code: 'INVALID_PATH_PARAMETER', + statusCode: 400, + }); +}; + +const normalizeProviderParam = (value: unknown): string => + readPathParam(value, 'provider').trim().toLowerCase(); + +const readOptionalQueryString = (value: unknown): string | undefined => { + if (typeof value !== 'string') { + return undefined; + } + + const normalized = 
value.trim(); + return normalized.length > 0 ? normalized : undefined; +}; + +const parseMcpScope = (value: unknown): McpScope | undefined => { + if (value === undefined) { + return undefined; + } + + const normalized = readOptionalQueryString(value); + if (!normalized) { + return undefined; + } + + if (normalized === 'user' || normalized === 'local' || normalized === 'project') { + return normalized; + } + + throw new AppError(`Unsupported MCP scope "${normalized}".`, { + code: 'INVALID_MCP_SCOPE', + statusCode: 400, + }); +}; + +const parseMcpTransport = (value: unknown): McpTransport => { + const normalized = readOptionalQueryString(value); + if (!normalized) { + throw new AppError('transport is required.', { + code: 'MCP_TRANSPORT_REQUIRED', + statusCode: 400, + }); + } + + if (normalized === 'stdio' || normalized === 'http' || normalized === 'sse') { + return normalized; + } + + throw new AppError(`Unsupported MCP transport "${normalized}".`, { + code: 'INVALID_MCP_TRANSPORT', + statusCode: 400, + }); +}; + +const parseMcpUpsertPayload = (payload: unknown): UpsertProviderMcpServerInput => { + if (!payload || typeof payload !== 'object') { + throw new AppError('Request body must be an object.', { + code: 'INVALID_REQUEST_BODY', + statusCode: 400, + }); + } + + const body = payload as Record; + const name = readOptionalQueryString(body.name); + if (!name) { + throw new AppError('name is required.', { + code: 'MCP_NAME_REQUIRED', + statusCode: 400, + }); + } + + const transport = parseMcpTransport(body.transport); + const scope = parseMcpScope(body.scope); + const workspacePath = readOptionalQueryString(body.workspacePath); + + return { + name, + transport, + scope, + workspacePath, + command: readOptionalQueryString(body.command), + args: Array.isArray(body.args) ? body.args.filter((entry): entry is string => typeof entry === 'string') : undefined, + env: typeof body.env === 'object' && body.env !== null + ? 
Object.fromEntries( + Object.entries(body.env as Record).filter( + (entry): entry is [string, string] => typeof entry[1] === 'string', + ), + ) + : undefined, + cwd: readOptionalQueryString(body.cwd), + url: readOptionalQueryString(body.url), + headers: typeof body.headers === 'object' && body.headers !== null + ? Object.fromEntries( + Object.entries(body.headers as Record).filter( + (entry): entry is [string, string] => typeof entry[1] === 'string', + ), + ) + : undefined, + envVars: Array.isArray(body.envVars) + ? body.envVars.filter((entry): entry is string => typeof entry === 'string') + : undefined, + bearerTokenEnvVar: readOptionalQueryString(body.bearerTokenEnvVar), + envHttpHeaders: typeof body.envHttpHeaders === 'object' && body.envHttpHeaders !== null + ? Object.fromEntries( + Object.entries(body.envHttpHeaders as Record).filter( + (entry): entry is [string, string] => typeof entry[1] === 'string', + ), + ) + : undefined, + }; +}; + +const parseProvider = (value: unknown): LLMProvider => { + const normalized = normalizeProviderParam(value); + if (normalized === 'claude' || normalized === 'codex' || normalized === 'cursor' || normalized === 'gemini') { + return normalized; + } + + throw new AppError(`Unsupported provider "${normalized}".`, { + code: 'UNSUPPORTED_PROVIDER', + statusCode: 400, + }); +}; + +router.get( + '/:provider/auth/status', + asyncHandler(async (req: Request, res: Response) => { + const provider = parseProvider(req.params.provider); + const status = await providerAuthService.getProviderAuthStatus(provider); + res.json(createApiSuccessResponse(status)); + }), +); + +router.get( + '/:provider/mcp/servers', + asyncHandler(async (req: Request, res: Response) => { + const provider = parseProvider(req.params.provider); + const workspacePath = readOptionalQueryString(req.query.workspacePath); + const scope = parseMcpScope(req.query.scope); + + if (scope) { + const servers = await providerMcpService.listProviderMcpServersForScope(provider, 
scope, { workspacePath }); + res.json(createApiSuccessResponse({ provider, scope, servers })); + return; + } + + const groupedServers = await providerMcpService.listProviderMcpServers(provider, { workspacePath }); + res.json(createApiSuccessResponse({ provider, scopes: groupedServers })); + }), +); + +router.post( + '/:provider/mcp/servers', + asyncHandler(async (req: Request, res: Response) => { + const provider = parseProvider(req.params.provider); + const payload = parseMcpUpsertPayload(req.body); + const server = await providerMcpService.upsertProviderMcpServer(provider, payload); + res.status(201).json(createApiSuccessResponse({ server })); + }), +); + +router.delete( + '/:provider/mcp/servers/:name', + asyncHandler(async (req: Request, res: Response) => { + const provider = parseProvider(req.params.provider); + const scope = parseMcpScope(req.query.scope); + const workspacePath = readOptionalQueryString(req.query.workspacePath); + const result = await providerMcpService.removeProviderMcpServer(provider, { + name: readPathParam(req.params.name, 'name'), + scope, + workspacePath, + }); + res.json(createApiSuccessResponse(result)); + }), +); + +router.post( + '/mcp/servers/global', + asyncHandler(async (req: Request, res: Response) => { + const payload = parseMcpUpsertPayload(req.body); + if (payload.scope === 'local') { + throw new AppError('Global MCP add supports only "user" or "project" scopes.', { + code: 'INVALID_GLOBAL_MCP_SCOPE', + statusCode: 400, + }); + } + + const results = await providerMcpService.addMcpServerToAllProviders({ + ...payload, + scope: payload.scope === 'user' ? 
'user' : 'project', + }); + res.status(201).json(createApiSuccessResponse({ results })); + }), +); + +export default router; diff --git a/server/modules/providers/services/mcp.service.ts b/server/modules/providers/services/mcp.service.ts new file mode 100644 index 00000000..bffb52de --- /dev/null +++ b/server/modules/providers/services/mcp.service.ts @@ -0,0 +1,94 @@ +import os from 'node:os'; + +import { providerRegistry } from '@/modules/providers/provider.registry.js'; +import type { LLMProvider, McpScope, ProviderMcpServer, UpsertProviderMcpServerInput } from '@/shared/types.js'; +import { AppError } from '@/shared/utils.js'; + +/** Cursor MCP is not supported on Windows hosts (no Cursor CLI integration). */ +function includeProviderInGlobalMcp(providerId: LLMProvider): boolean { + if (providerId === 'cursor' && os.platform() === 'win32') { + return false; + } + + return true; +} + + +export const providerMcpService = { + /** + * Lists MCP servers for one provider grouped by supported scopes. + */ + async listProviderMcpServers( + providerName: string, + options?: { workspacePath?: string }, + ): Promise> { + const provider = providerRegistry.resolveProvider(providerName); + return provider.mcp.listServers(options); + }, + + /** + * Lists MCP servers for one provider scope. + */ + async listProviderMcpServersForScope( + providerName: string, + scope: McpScope, + options?: { workspacePath?: string }, + ): Promise { + const provider = providerRegistry.resolveProvider(providerName); + return provider.mcp.listServersForScope(scope, options); + }, + + /** + * Adds or updates one provider MCP server. + */ + async upsertProviderMcpServer( + providerName: string, + input: UpsertProviderMcpServerInput, + ): Promise { + const provider = providerRegistry.resolveProvider(providerName); + return provider.mcp.upsertServer(input); + }, + + /** + * Removes one provider MCP server. 
+ */ + async removeProviderMcpServer( + providerName: string, + input: { name: string; scope?: McpScope; workspacePath?: string }, + ): Promise<{ removed: boolean; provider: LLMProvider; name: string; scope: McpScope }> { + const provider = providerRegistry.resolveProvider(providerName); + return provider.mcp.removeServer(input); + }, + + /** + * Adds one HTTP/stdio MCP server to every provider. + */ + async addMcpServerToAllProviders( + input: Omit & { scope?: Exclude }, + ): Promise> { + if (input.transport !== 'stdio' && input.transport !== 'http') { + throw new AppError('Global MCP add supports only "stdio" and "http".', { + code: 'INVALID_GLOBAL_MCP_TRANSPORT', + statusCode: 400, + }); + } + + const scope = input.scope ?? 'project'; + const results: Array<{ provider: LLMProvider; created: boolean; error?: string }> = []; + const providers = providerRegistry.listProviders().filter((p) => includeProviderInGlobalMcp(p.id)); + for (const provider of providers) { + try { + await provider.mcp.upsertServer({ ...input, scope }); + results.push({ provider: provider.id, created: true }); + } catch (error) { + results.push({ + provider: provider.id, + created: false, + error: error instanceof Error ? error.message : 'Unknown error', + }); + } + } + + return results; + }, +}; diff --git a/server/modules/providers/services/provider-auth.service.ts b/server/modules/providers/services/provider-auth.service.ts new file mode 100644 index 00000000..e763aaed --- /dev/null +++ b/server/modules/providers/services/provider-auth.service.ts @@ -0,0 +1,26 @@ +import { providerRegistry } from '@/modules/providers/provider.registry.js'; +import type { LLMProvider, ProviderAuthStatus } from '@/shared/types.js'; + +export const providerAuthService = { + /** + * Resolves a provider and returns its installation/authentication status. 
+ */ + async getProviderAuthStatus(providerName: string): Promise { + const provider = providerRegistry.resolveProvider(providerName); + return provider.auth.getStatus(); + }, + + /** + * Returns whether a provider runtime appears installed. + * Falls back to true if status lookup itself fails so callers preserve the + * original runtime error instead of replacing it with a status-check failure. + */ + async isProviderInstalled(providerName: LLMProvider): Promise { + try { + const status = await this.getProviderAuthStatus(providerName); + return status.installed; + } catch { + return true; + } + }, +}; diff --git a/server/modules/providers/services/sessions.service.ts b/server/modules/providers/services/sessions.service.ts new file mode 100644 index 00000000..adff6e8f --- /dev/null +++ b/server/modules/providers/services/sessions.service.ts @@ -0,0 +1,45 @@ +import { providerRegistry } from '@/modules/providers/provider.registry.js'; +import type { + FetchHistoryOptions, + FetchHistoryResult, + LLMProvider, + NormalizedMessage, +} from '@/shared/types.js'; + +/** + * Application service for provider-backed session message operations. + * + * Callers pass a provider id and this service resolves the concrete provider + * class, keeping normalization/history call sites decoupled from implementation + * file layout. + */ +export const sessionsService = { + /** + * Lists provider ids that can load session history and normalize live messages. + */ + listProviderIds(): LLMProvider[] { + return providerRegistry.listProviders().map((provider) => provider.id); + }, + + /** + * Normalizes one provider-native event into frontend session message events. + */ + normalizeMessage( + providerName: string, + raw: unknown, + sessionId: string | null, + ): NormalizedMessage[] { + return providerRegistry.resolveProvider(providerName).sessions.normalizeMessage(raw, sessionId); + }, + + /** + * Fetches normalized persisted session history for one provider/session pair. 
+ */ + fetchHistory( + providerName: string, + sessionId: string, + options?: FetchHistoryOptions, + ): Promise { + return providerRegistry.resolveProvider(providerName).sessions.fetchHistory(sessionId, options); + }, +}; diff --git a/server/modules/providers/shared/base/abstract.provider.ts b/server/modules/providers/shared/base/abstract.provider.ts new file mode 100644 index 00000000..4a591baf --- /dev/null +++ b/server/modules/providers/shared/base/abstract.provider.ts @@ -0,0 +1,20 @@ +import type { IProvider, IProviderAuth, IProviderMcp, IProviderSessions } from '@/shared/interfaces.js'; +import type { LLMProvider } from '@/shared/types.js'; + +/** + * Shared provider base. + * + * Concrete providers must expose auth/MCP handlers and implement message + * normalization/history loading because those behaviors depend on native + * SDK/CLI formats. + */ +export abstract class AbstractProvider implements IProvider { + readonly id: LLMProvider; + abstract readonly mcp: IProviderMcp; + abstract readonly auth: IProviderAuth; + abstract readonly sessions: IProviderSessions; + + protected constructor(id: LLMProvider) { + this.id = id; + } +} diff --git a/server/modules/providers/shared/mcp/mcp.provider.ts b/server/modules/providers/shared/mcp/mcp.provider.ts new file mode 100644 index 00000000..96cc7f25 --- /dev/null +++ b/server/modules/providers/shared/mcp/mcp.provider.ts @@ -0,0 +1,151 @@ +import path from 'node:path'; + +import type { IProviderMcp } from '@/shared/interfaces.js'; +import type { LLMProvider, McpScope, McpTransport, ProviderMcpServer, UpsertProviderMcpServerInput } from '@/shared/types.js'; +import { AppError } from '@/shared/utils.js'; + +const resolveWorkspacePath = (workspacePath?: string): string => + path.resolve(workspacePath ?? 
process.cwd()); + +const normalizeServerName = (name: string): string => { + const normalized = name.trim(); + if (!normalized) { + throw new AppError('MCP server name is required.', { + code: 'MCP_SERVER_NAME_REQUIRED', + statusCode: 400, + }); + } + + return normalized; +}; + +/** + * Shared MCP provider for provider-specific config readers/writers. + */ +export abstract class McpProvider implements IProviderMcp { + protected readonly provider: LLMProvider; + protected readonly supportedScopes: McpScope[]; + protected readonly supportedTransports: McpTransport[]; + + protected constructor( + provider: LLMProvider, + supportedScopes: McpScope[], + supportedTransports: McpTransport[], + ) { + this.provider = provider; + this.supportedScopes = supportedScopes; + this.supportedTransports = supportedTransports; + } + + async listServers(options?: { workspacePath?: string }): Promise> { + const grouped: Record = { + user: [], + local: [], + project: [], + }; + + for (const scope of this.supportedScopes) { + grouped[scope] = await this.listServersForScope(scope, options); + } + + return grouped; + } + + async listServersForScope( + scope: McpScope, + options?: { workspacePath?: string }, + ): Promise { + if (!this.supportedScopes.includes(scope)) { + return []; + } + + const workspacePath = resolveWorkspacePath(options?.workspacePath); + const scopedServers = await this.readScopedServers(scope, workspacePath); + return Object.entries(scopedServers) + .map(([name, rawConfig]) => this.normalizeServerConfig(scope, name, rawConfig)) + .filter((entry): entry is ProviderMcpServer => entry !== null); + } + + async upsertServer(input: UpsertProviderMcpServerInput): Promise { + const scope = input.scope ?? 
'project'; + this.assertScopeAndTransport(scope, input.transport); + + const workspacePath = resolveWorkspacePath(input.workspacePath); + const normalizedName = normalizeServerName(input.name); + const scopedServers = await this.readScopedServers(scope, workspacePath); + scopedServers[normalizedName] = this.buildServerConfig(input); + await this.writeScopedServers(scope, workspacePath, scopedServers); + + return { + provider: this.provider, + name: normalizedName, + scope, + transport: input.transport, + command: input.command, + args: input.args, + env: input.env, + cwd: input.cwd, + url: input.url, + headers: input.headers, + envVars: input.envVars, + bearerTokenEnvVar: input.bearerTokenEnvVar, + envHttpHeaders: input.envHttpHeaders, + }; + } + + async removeServer( + input: { name: string; scope?: McpScope; workspacePath?: string }, + ): Promise<{ removed: boolean; provider: LLMProvider; name: string; scope: McpScope }> { + const scope = input.scope ?? 'project'; + this.assertScope(scope); + + const workspacePath = resolveWorkspacePath(input.workspacePath); + const normalizedName = normalizeServerName(input.name); + const scopedServers = await this.readScopedServers(scope, workspacePath); + const removed = Object.prototype.hasOwnProperty.call(scopedServers, normalizedName); + if (removed) { + delete scopedServers[normalizedName]; + await this.writeScopedServers(scope, workspacePath, scopedServers); + } + + return { removed, provider: this.provider, name: normalizedName, scope }; + } + + protected abstract readScopedServers( + scope: McpScope, + workspacePath: string, + ): Promise>; + + protected abstract writeScopedServers( + scope: McpScope, + workspacePath: string, + servers: Record, + ): Promise; + + protected abstract buildServerConfig(input: UpsertProviderMcpServerInput): Record; + + protected abstract normalizeServerConfig( + scope: McpScope, + name: string, + rawConfig: unknown, + ): ProviderMcpServer | null; + + protected assertScope(scope: McpScope): 
void { + if (!this.supportedScopes.includes(scope)) { + throw new AppError(`Provider "${this.provider}" does not support "${scope}" MCP scope.`, { + code: 'MCP_SCOPE_NOT_SUPPORTED', + statusCode: 400, + }); + } + } + + protected assertScopeAndTransport(scope: McpScope, transport: McpTransport): void { + this.assertScope(scope); + if (!this.supportedTransports.includes(transport)) { + throw new AppError(`Provider "${this.provider}" does not support "${transport}" MCP transport.`, { + code: 'MCP_TRANSPORT_NOT_SUPPORTED', + statusCode: 400, + }); + } + } +} diff --git a/server/modules/providers/tests/mcp.test.ts b/server/modules/providers/tests/mcp.test.ts new file mode 100644 index 00000000..a64914d6 --- /dev/null +++ b/server/modules/providers/tests/mcp.test.ts @@ -0,0 +1,293 @@ +import assert from 'node:assert/strict'; +import fs from 'node:fs/promises'; +import os from 'node:os'; +import path from 'node:path'; +import test from 'node:test'; + +import TOML from '@iarna/toml'; + +import { providerMcpService } from '@/modules/providers/services/mcp.service.js'; +import { AppError } from '@/shared/utils.js'; + +const patchHomeDir = (nextHomeDir: string) => { + const original = os.homedir; + (os as any).homedir = () => nextHomeDir; + return () => { + (os as any).homedir = original; + }; +}; + +const readJson = async (filePath: string): Promise> => { + const content = await fs.readFile(filePath, 'utf8'); + return JSON.parse(content) as Record; +}; + +/** + * This test covers Claude MCP support for all scopes (user/local/project) and all transports (stdio/http/sse), + * including add, update/list, and remove operations. 
+ */ +test('providerMcpService handles claude MCP scopes/transports with file-backed persistence', { concurrency: false }, async () => { + const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), 'llm-mcp-claude-')); + const workspacePath = path.join(tempRoot, 'workspace'); + await fs.mkdir(workspacePath, { recursive: true }); + + const restoreHomeDir = patchHomeDir(tempRoot); + try { + await providerMcpService.upsertProviderMcpServer('claude', { + name: 'claude-user-stdio', + scope: 'user', + transport: 'stdio', + command: 'npx', + args: ['-y', 'my-server'], + env: { API_KEY: 'secret' }, + }); + + await providerMcpService.upsertProviderMcpServer('claude', { + name: 'claude-local-http', + scope: 'local', + transport: 'http', + url: 'https://example.com/mcp', + headers: { Authorization: 'Bearer token' }, + workspacePath, + }); + + await providerMcpService.upsertProviderMcpServer('claude', { + name: 'claude-project-sse', + scope: 'project', + transport: 'sse', + url: 'https://example.com/sse', + headers: { 'X-API-Key': 'abc' }, + workspacePath, + }); + + const grouped = await providerMcpService.listProviderMcpServers('claude', { workspacePath }); + assert.ok(grouped.user.some((server) => server.name === 'claude-user-stdio' && server.transport === 'stdio')); + assert.ok(grouped.local.some((server) => server.name === 'claude-local-http' && server.transport === 'http')); + assert.ok(grouped.project.some((server) => server.name === 'claude-project-sse' && server.transport === 'sse')); + + // update behavior is the same upsert route with same name + await providerMcpService.upsertProviderMcpServer('claude', { + name: 'claude-project-sse', + scope: 'project', + transport: 'sse', + url: 'https://example.com/sse-updated', + headers: { 'X-API-Key': 'updated' }, + workspacePath, + }); + + const projectConfig = await readJson(path.join(workspacePath, '.mcp.json')); + const projectServers = projectConfig.mcpServers as Record; + const projectServer = 
projectServers['claude-project-sse'] as Record; + assert.equal(projectServer.url, 'https://example.com/sse-updated'); + + const removeResult = await providerMcpService.removeProviderMcpServer('claude', { + name: 'claude-local-http', + scope: 'local', + workspacePath, + }); + assert.equal(removeResult.removed, true); + } finally { + restoreHomeDir(); + await fs.rm(tempRoot, { recursive: true, force: true }); + } +}); + +/** + * This test covers Codex MCP support for user/project scopes, stdio/http formats, + * and validation for unsupported scope/transport combinations. + */ +test('providerMcpService handles codex MCP TOML config and capability validation', { concurrency: false }, async () => { + const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), 'llm-mcp-codex-')); + const workspacePath = path.join(tempRoot, 'workspace'); + await fs.mkdir(workspacePath, { recursive: true }); + + const restoreHomeDir = patchHomeDir(tempRoot); + try { + await providerMcpService.upsertProviderMcpServer('codex', { + name: 'codex-user-stdio', + scope: 'user', + transport: 'stdio', + command: 'python', + args: ['server.py'], + env: { API_KEY: 'x' }, + envVars: ['API_KEY'], + cwd: '/tmp', + }); + + await providerMcpService.upsertProviderMcpServer('codex', { + name: 'codex-project-http', + scope: 'project', + transport: 'http', + url: 'https://codex.example.com/mcp', + headers: { 'X-Custom-Header': 'value' }, + envHttpHeaders: { 'X-API-Key': 'MY_API_KEY_ENV' }, + bearerTokenEnvVar: 'MY_API_TOKEN', + workspacePath, + }); + + const userTomlPath = path.join(tempRoot, '.codex', 'config.toml'); + const userConfig = TOML.parse(await fs.readFile(userTomlPath, 'utf8')) as Record; + const userServers = userConfig.mcp_servers as Record; + const userStdio = userServers['codex-user-stdio'] as Record; + assert.equal(userStdio.command, 'python'); + + const projectTomlPath = path.join(workspacePath, '.codex', 'config.toml'); + const projectConfig = TOML.parse(await fs.readFile(projectTomlPath, 
'utf8')) as Record; + const projectServers = projectConfig.mcp_servers as Record; + const projectHttp = projectServers['codex-project-http'] as Record; + assert.equal(projectHttp.url, 'https://codex.example.com/mcp'); + + await assert.rejects( + providerMcpService.upsertProviderMcpServer('codex', { + name: 'codex-local', + scope: 'local', + transport: 'stdio', + command: 'node', + }), + (error: unknown) => + error instanceof AppError && + error.code === 'MCP_SCOPE_NOT_SUPPORTED' && + error.statusCode === 400, + ); + + await assert.rejects( + providerMcpService.upsertProviderMcpServer('codex', { + name: 'codex-sse', + scope: 'project', + transport: 'sse', + url: 'https://example.com/sse', + workspacePath, + }), + (error: unknown) => + error instanceof AppError && + error.code === 'MCP_TRANSPORT_NOT_SUPPORTED' && + error.statusCode === 400, + ); + } finally { + restoreHomeDir(); + await fs.rm(tempRoot, { recursive: true, force: true }); + } +}); + +/** + * This test covers Gemini/Cursor MCP JSON formats and user/project scope persistence. 
+ */ +test('providerMcpService handles gemini and cursor MCP JSON config formats', { concurrency: false }, async () => { + const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), 'llm-mcp-gc-')); + const workspacePath = path.join(tempRoot, 'workspace'); + await fs.mkdir(workspacePath, { recursive: true }); + + const restoreHomeDir = patchHomeDir(tempRoot); + try { + await providerMcpService.upsertProviderMcpServer('gemini', { + name: 'gemini-stdio', + scope: 'user', + transport: 'stdio', + command: 'node', + args: ['server.js'], + env: { TOKEN: '$TOKEN' }, + cwd: './server', + }); + + await providerMcpService.upsertProviderMcpServer('gemini', { + name: 'gemini-http', + scope: 'project', + transport: 'http', + url: 'https://gemini.example.com/mcp', + headers: { Authorization: 'Bearer token' }, + workspacePath, + }); + + await providerMcpService.upsertProviderMcpServer('cursor', { + name: 'cursor-stdio', + scope: 'project', + transport: 'stdio', + command: 'npx', + args: ['-y', 'mcp-server'], + env: { API_KEY: 'value' }, + workspacePath, + }); + + await providerMcpService.upsertProviderMcpServer('cursor', { + name: 'cursor-http', + scope: 'user', + transport: 'http', + url: 'http://localhost:3333/mcp', + headers: { API_KEY: 'value' }, + }); + + const geminiUserConfig = await readJson(path.join(tempRoot, '.gemini', 'settings.json')); + const geminiUserServer = (geminiUserConfig.mcpServers as Record<string, unknown>)['gemini-stdio'] as Record<string, unknown>; + assert.equal(geminiUserServer.command, 'node'); + assert.equal(geminiUserServer.type, undefined); + + const geminiProjectConfig = await readJson(path.join(workspacePath, '.gemini', 'settings.json')); + const geminiProjectServer = (geminiProjectConfig.mcpServers as Record<string, unknown>)['gemini-http'] as Record<string, unknown>; + assert.equal(geminiProjectServer.type, 'http'); + + const cursorUserConfig = await readJson(path.join(tempRoot, '.cursor', 'mcp.json')); + const cursorHttpServer = (cursorUserConfig.mcpServers as Record<string, unknown>)['cursor-http'] as Record<string, unknown>; +
assert.equal(cursorHttpServer.url, 'http://localhost:3333/mcp'); + assert.equal(cursorHttpServer.type, undefined); + } finally { + restoreHomeDir(); + await fs.rm(tempRoot, { recursive: true, force: true }); + } +}); + +/** + * This test covers the global MCP adder requirement: only http/stdio are allowed and + * one payload is written to all providers. + */ +test('providerMcpService global adder writes to all providers and rejects unsupported transports', { concurrency: false }, async () => { + const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), 'llm-mcp-global-')); + const workspacePath = path.join(tempRoot, 'workspace'); + await fs.mkdir(workspacePath, { recursive: true }); + + const restoreHomeDir = patchHomeDir(tempRoot); + try { + const globalResult = await providerMcpService.addMcpServerToAllProviders({ + name: 'global-http', + scope: 'project', + transport: 'http', + url: 'https://global.example.com/mcp', + workspacePath, + }); + + const expectCursorGlobal = process.platform !== 'win32'; + assert.equal(globalResult.length, expectCursorGlobal ? 
4 : 3); + assert.ok(globalResult.every((entry) => entry.created === true)); + + const claudeProject = await readJson(path.join(workspacePath, '.mcp.json')); + assert.ok((claudeProject.mcpServers as Record<string, unknown>)['global-http']); + + const codexProject = TOML.parse(await fs.readFile(path.join(workspacePath, '.codex', 'config.toml'), 'utf8')) as Record<string, unknown>; + assert.ok((codexProject.mcp_servers as Record<string, unknown>)['global-http']); + + const geminiProject = await readJson(path.join(workspacePath, '.gemini', 'settings.json')); + assert.ok((geminiProject.mcpServers as Record<string, unknown>)['global-http']); + + if (expectCursorGlobal) { + const cursorProject = await readJson(path.join(workspacePath, '.cursor', 'mcp.json')); + assert.ok((cursorProject.mcpServers as Record<string, unknown>)['global-http']); + } + + await assert.rejects( + providerMcpService.addMcpServerToAllProviders({ + name: 'global-sse', + scope: 'project', + transport: 'sse', + url: 'https://example.com/sse', + workspacePath, + }), + (error: unknown) => + error instanceof AppError && + error.code === 'INVALID_GLOBAL_MCP_TRANSPORT' && + error.statusCode === 400, + ); + } finally { + restoreHomeDir(); + await fs.rm(tempRoot, { recursive: true, force: true }); + } +}); + diff --git a/server/openai-codex.js b/server/openai-codex.js index 99a8e435..5a7a9007 100644 --- a/server/openai-codex.js +++ b/server/openai-codex.js @@ -15,9 +15,9 @@ import { Codex } from '@openai/codex-sdk'; import { notifyRunFailed, notifyRunStopped } from './services/notification-orchestrator.js'; -import { codexAdapter } from './providers/codex/adapter.js'; -import { createNormalizedMessage } from './providers/types.js'; -import { getStatusChecker } from './providers/registry.js'; +import { sessionsService } from './modules/providers/services/sessions.service.js'; +import { providerAuthService } from './modules/providers/services/provider-auth.service.js'; +import { createNormalizedMessage } from './shared/utils.js'; // Track active sessions const activeCodexSessions = new Map(); 
@@ -265,7 +265,7 @@ export async function queryCodex(command, options = {}, ws) { const transformed = transformCodexEvent(event); // Normalize the transformed event into NormalizedMessage(s) via adapter - const normalizedMsgs = codexAdapter.normalizeMessage(transformed, currentSessionId); + const normalizedMsgs = sessionsService.normalizeMessage('codex', transformed, currentSessionId); for (const msg of normalizedMsgs) { sendMessage(ws, msg); } @@ -311,7 +311,7 @@ export async function queryCodex(command, options = {}, ws) { console.error('[Codex] Error:', error); // Check if Codex SDK is available for a clearer error message - const installed = getStatusChecker('codex')?.checkInstalled() ?? true; + const installed = await providerAuthService.isProviderInstalled('codex'); const errorContent = !installed ? 'Codex CLI is not configured. Please set up authentication first.' : error.message; diff --git a/server/providers/claude/adapter.js b/server/providers/claude/adapter.js deleted file mode 100644 index d5f850ba..00000000 --- a/server/providers/claude/adapter.js +++ /dev/null @@ -1,278 +0,0 @@ -/** - * Claude provider adapter. - * - * Normalizes Claude SDK session history into NormalizedMessage format. - * @module adapters/claude - */ - -import { getSessionMessages } from '../../projects.js'; -import { createNormalizedMessage, generateMessageId } from '../types.js'; -import { isInternalContent } from '../utils.js'; - -const PROVIDER = 'claude'; - -/** - * Normalize a raw JSONL message or realtime SDK event into NormalizedMessage(s). - * Handles both history entries (JSONL `{ message: { role, content } }`) and - * realtime streaming events (`content_block_delta`, `content_block_stop`, etc.). 
- * @param {object} raw - A single entry from JSONL or a live SDK event - * @param {string} sessionId - * @returns {import('../types.js').NormalizedMessage[]} - */ -export function normalizeMessage(raw, sessionId) { - // ── Streaming events (realtime) ────────────────────────────────────────── - if (raw.type === 'content_block_delta' && raw.delta?.text) { - return [createNormalizedMessage({ kind: 'stream_delta', content: raw.delta.text, sessionId, provider: PROVIDER })]; - } - if (raw.type === 'content_block_stop') { - return [createNormalizedMessage({ kind: 'stream_end', sessionId, provider: PROVIDER })]; - } - - // ── History / full-message events ──────────────────────────────────────── - const messages = []; - const ts = raw.timestamp || new Date().toISOString(); - const baseId = raw.uuid || generateMessageId('claude'); - - // User message - if (raw.message?.role === 'user' && raw.message?.content) { - if (Array.isArray(raw.message.content)) { - // Handle tool_result parts - for (const part of raw.message.content) { - if (part.type === 'tool_result') { - messages.push(createNormalizedMessage({ - id: `${baseId}_tr_${part.tool_use_id}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_result', - toolId: part.tool_use_id, - content: typeof part.content === 'string' ? 
part.content : JSON.stringify(part.content), - isError: Boolean(part.is_error), - subagentTools: raw.subagentTools, - toolUseResult: raw.toolUseResult, - })); - } else if (part.type === 'text') { - // Regular text parts from user - const text = part.text || ''; - if (text && !isInternalContent(text)) { - messages.push(createNormalizedMessage({ - id: `${baseId}_text`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: 'user', - content: text, - })); - } - } - } - - // If no text parts were found, check if it's a pure user message - if (messages.length === 0) { - const textParts = raw.message.content - .filter(p => p.type === 'text') - .map(p => p.text) - .filter(Boolean) - .join('\n'); - if (textParts && !isInternalContent(textParts)) { - messages.push(createNormalizedMessage({ - id: `${baseId}_text`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: 'user', - content: textParts, - })); - } - } - } else if (typeof raw.message.content === 'string') { - const text = raw.message.content; - if (text && !isInternalContent(text)) { - messages.push(createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: 'user', - content: text, - })); - } - } - return messages; - } - - // Thinking message - if (raw.type === 'thinking' && raw.message?.content) { - messages.push(createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'thinking', - content: raw.message.content, - })); - return messages; - } - - // Tool use result (codex-style in Claude) - if (raw.type === 'tool_use' && raw.toolName) { - messages.push(createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_use', - toolName: raw.toolName, - toolInput: raw.toolInput, - toolId: raw.toolCallId || baseId, - })); - return messages; - } - - if (raw.type === 'tool_result') { - messages.push(createNormalizedMessage({ - id: baseId, - 
sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_result', - toolId: raw.toolCallId || '', - content: raw.output || '', - isError: false, - })); - return messages; - } - - // Assistant message - if (raw.message?.role === 'assistant' && raw.message?.content) { - if (Array.isArray(raw.message.content)) { - let partIndex = 0; - for (const part of raw.message.content) { - if (part.type === 'text' && part.text) { - messages.push(createNormalizedMessage({ - id: `${baseId}_${partIndex}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: 'assistant', - content: part.text, - })); - } else if (part.type === 'tool_use') { - messages.push(createNormalizedMessage({ - id: `${baseId}_${partIndex}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_use', - toolName: part.name, - toolInput: part.input, - toolId: part.id, - })); - } else if (part.type === 'thinking' && part.thinking) { - messages.push(createNormalizedMessage({ - id: `${baseId}_${partIndex}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'thinking', - content: part.thinking, - })); - } - partIndex++; - } - } else if (typeof raw.message.content === 'string') { - messages.push(createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: 'assistant', - content: raw.message.content, - })); - } - return messages; - } - - return messages; -} - -/** - * @type {import('../types.js').ProviderAdapter} - */ -export const claudeAdapter = { - normalizeMessage, - - /** - * Fetch session history from JSONL files, returning normalized messages. 
- */ - async fetchHistory(sessionId, opts = {}) { - const { projectName, limit = null, offset = 0 } = opts; - if (!projectName) { - return { messages: [], total: 0, hasMore: false, offset: 0, limit: null }; - } - - let result; - try { - result = await getSessionMessages(projectName, sessionId, limit, offset); - } catch (error) { - console.warn(`[ClaudeAdapter] Failed to load session ${sessionId}:`, error.message); - return { messages: [], total: 0, hasMore: false, offset: 0, limit: null }; - } - - // getSessionMessages returns either an array (no limit) or { messages, total, hasMore } - const rawMessages = Array.isArray(result) ? result : (result.messages || []); - const total = Array.isArray(result) ? rawMessages.length : (result.total || 0); - const hasMore = Array.isArray(result) ? false : Boolean(result.hasMore); - - // First pass: collect tool results for attachment to tool_use messages - const toolResultMap = new Map(); - for (const raw of rawMessages) { - if (raw.message?.role === 'user' && Array.isArray(raw.message?.content)) { - for (const part of raw.message.content) { - if (part.type === 'tool_result') { - toolResultMap.set(part.tool_use_id, { - content: part.content, - isError: Boolean(part.is_error), - timestamp: raw.timestamp, - subagentTools: raw.subagentTools, - toolUseResult: raw.toolUseResult, - }); - } - } - } - } - - // Second pass: normalize all messages - const normalized = []; - for (const raw of rawMessages) { - const entries = normalizeMessage(raw, sessionId); - normalized.push(...entries); - } - - // Attach tool results to their corresponding tool_use messages - for (const msg of normalized) { - if (msg.kind === 'tool_use' && msg.toolId && toolResultMap.has(msg.toolId)) { - const tr = toolResultMap.get(msg.toolId); - msg.toolResult = { - content: typeof tr.content === 'string' ? 
tr.content : JSON.stringify(tr.content), - isError: tr.isError, - toolUseResult: tr.toolUseResult, - }; - msg.subagentTools = tr.subagentTools; - } - } - - return { - messages: normalized, - total, - hasMore, - offset, - limit, - }; - }, -}; diff --git a/server/providers/claude/status.js b/server/providers/claude/status.js deleted file mode 100644 index c0d7d231..00000000 --- a/server/providers/claude/status.js +++ /dev/null @@ -1,136 +0,0 @@ -/** - * Claude Provider Status - * - * Checks whether Claude Code CLI is installed and whether the user - * has valid authentication credentials. - * - * @module providers/claude/status - */ - -import { execFileSync } from 'child_process'; -import { promises as fs } from 'fs'; -import path from 'path'; -import os from 'os'; - -/** - * Check if Claude Code CLI is installed and available. - * Uses CLAUDE_CLI_PATH env var if set, otherwise looks for 'claude' in PATH. - * @returns {boolean} - */ -export function checkInstalled() { - const cliPath = process.env.CLAUDE_CLI_PATH || 'claude'; - try { - execFileSync(cliPath, ['--version'], { stdio: 'ignore', timeout: 5000 }); - return true; - } catch { - return false; - } -} - -/** - * Full status check: installation + authentication. 
- * @returns {Promise} - */ -export async function checkStatus() { - const installed = checkInstalled(); - - if (!installed) { - return { - installed, - authenticated: false, - email: null, - method: null, - error: 'Claude Code CLI is not installed' - }; - } - - const credentialsResult = await checkCredentials(); - - if (credentialsResult.authenticated) { - return { - installed, - authenticated: true, - email: credentialsResult.email || 'Authenticated', - method: credentialsResult.method || null, - error: null - }; - } - - return { - installed, - authenticated: false, - email: credentialsResult.email || null, - method: credentialsResult.method || null, - error: credentialsResult.error || 'Not authenticated' - }; -} - -// ─── Internal helpers ─────────────────────────────────────────────────────── - -async function loadSettingsEnv() { - try { - const settingsPath = path.join(os.homedir(), '.claude', 'settings.json'); - const content = await fs.readFile(settingsPath, 'utf8'); - const settings = JSON.parse(content); - - if (settings?.env && typeof settings.env === 'object') { - return settings.env; - } - } catch { - // Ignore missing or malformed settings. - } - - return {}; -} - -/** - * Checks Claude authentication credentials. 
- * - * Priority 1: ANTHROPIC_API_KEY environment variable - * Priority 1b: ~/.claude/settings.json env values - * Priority 2: ~/.claude/.credentials.json OAuth tokens - */ -async function checkCredentials() { - if (process.env.ANTHROPIC_API_KEY && process.env.ANTHROPIC_API_KEY.trim()) { - return { authenticated: true, email: 'API Key Auth', method: 'api_key' }; - } - - const settingsEnv = await loadSettingsEnv(); - - if (typeof settingsEnv.ANTHROPIC_API_KEY === 'string' && settingsEnv.ANTHROPIC_API_KEY.trim()) { - return { authenticated: true, email: 'API Key Auth', method: 'api_key' }; - } - - if (typeof settingsEnv.ANTHROPIC_AUTH_TOKEN === 'string' && settingsEnv.ANTHROPIC_AUTH_TOKEN.trim()) { - return { authenticated: true, email: 'Configured via settings.json', method: 'api_key' }; - } - - try { - const credPath = path.join(os.homedir(), '.claude', '.credentials.json'); - const content = await fs.readFile(credPath, 'utf8'); - const creds = JSON.parse(content); - - const oauth = creds.claudeAiOauth; - if (oauth && oauth.accessToken) { - const isExpired = oauth.expiresAt && Date.now() >= oauth.expiresAt; - if (!isExpired) { - return { - authenticated: true, - email: creds.email || creds.user || null, - method: 'credentials_file' - }; - } - - return { - authenticated: false, - email: creds.email || creds.user || null, - method: 'credentials_file', - error: 'OAuth token has expired. Please re-authenticate with claude login' - }; - } - - return { authenticated: false, email: null, method: null }; - } catch { - return { authenticated: false, email: null, method: null }; - } -} diff --git a/server/providers/codex/adapter.js b/server/providers/codex/adapter.js deleted file mode 100644 index c9cae00f..00000000 --- a/server/providers/codex/adapter.js +++ /dev/null @@ -1,248 +0,0 @@ -/** - * Codex (OpenAI) provider adapter. - * - * Normalizes Codex SDK session history into NormalizedMessage format. 
- * @module adapters/codex - */ - -import { getCodexSessionMessages } from '../../projects.js'; -import { createNormalizedMessage, generateMessageId } from '../types.js'; - -const PROVIDER = 'codex'; - -/** - * Normalize a raw Codex JSONL message into NormalizedMessage(s). - * @param {object} raw - A single parsed message from Codex JSONL - * @param {string} sessionId - * @returns {import('../types.js').NormalizedMessage[]} - */ -function normalizeCodexHistoryEntry(raw, sessionId) { - const ts = raw.timestamp || new Date().toISOString(); - const baseId = raw.uuid || generateMessageId('codex'); - - // User message - if (raw.message?.role === 'user') { - const content = typeof raw.message.content === 'string' - ? raw.message.content - : Array.isArray(raw.message.content) - ? raw.message.content.map(p => typeof p === 'string' ? p : p?.text || '').filter(Boolean).join('\n') - : String(raw.message.content || ''); - if (!content.trim()) return []; - return [createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: 'user', - content, - })]; - } - - // Assistant message - if (raw.message?.role === 'assistant') { - const content = typeof raw.message.content === 'string' - ? raw.message.content - : Array.isArray(raw.message.content) - ? raw.message.content.map(p => typeof p === 'string' ? 
p : p?.text || '').filter(Boolean).join('\n') - : ''; - if (!content.trim()) return []; - return [createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: 'assistant', - content, - })]; - } - - // Thinking/reasoning - if (raw.type === 'thinking' || raw.isReasoning) { - return [createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'thinking', - content: raw.message?.content || '', - })]; - } - - // Tool use - if (raw.type === 'tool_use' || raw.toolName) { - return [createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_use', - toolName: raw.toolName || 'Unknown', - toolInput: raw.toolInput, - toolId: raw.toolCallId || baseId, - })]; - } - - // Tool result - if (raw.type === 'tool_result') { - return [createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_result', - toolId: raw.toolCallId || '', - content: raw.output || '', - isError: Boolean(raw.isError), - })]; - } - - return []; -} - -/** - * Normalize a raw Codex event (history JSONL or transformed SDK event) into NormalizedMessage(s). 
- * @param {object} raw - A history entry (has raw.message.role) or transformed SDK event (has raw.type) - * @param {string} sessionId - * @returns {import('../types.js').NormalizedMessage[]} - */ -export function normalizeMessage(raw, sessionId) { - // History format: has message.role - if (raw.message?.role) { - return normalizeCodexHistoryEntry(raw, sessionId); - } - - const ts = raw.timestamp || new Date().toISOString(); - const baseId = raw.uuid || generateMessageId('codex'); - - // SDK event format (output of transformCodexEvent) - if (raw.type === 'item') { - switch (raw.itemType) { - case 'agent_message': - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'text', role: 'assistant', content: raw.message?.content || '', - })]; - case 'reasoning': - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'thinking', content: raw.message?.content || '', - })]; - case 'command_execution': - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'tool_use', toolName: 'Bash', toolInput: { command: raw.command }, - toolId: baseId, - output: raw.output, exitCode: raw.exitCode, status: raw.status, - })]; - case 'file_change': - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'tool_use', toolName: 'FileChanges', toolInput: raw.changes, - toolId: baseId, status: raw.status, - })]; - case 'mcp_tool_call': - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'tool_use', toolName: raw.tool || 'MCP', toolInput: raw.arguments, - toolId: baseId, server: raw.server, result: raw.result, - error: raw.error, status: raw.status, - })]; - case 'web_search': - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'tool_use', toolName: 'WebSearch', toolInput: { query: raw.query }, - 
toolId: baseId, - })]; - case 'todo_list': - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'tool_use', toolName: 'TodoList', toolInput: { items: raw.items }, - toolId: baseId, - })]; - case 'error': - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'error', content: raw.message?.content || 'Unknown error', - })]; - default: - // Unknown item type — pass through as generic tool_use - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'tool_use', toolName: raw.itemType || 'Unknown', - toolInput: raw.item || raw, toolId: baseId, - })]; - } - } - - if (raw.type === 'turn_complete') { - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'complete', - })]; - } - if (raw.type === 'turn_failed') { - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'error', content: raw.error?.message || 'Turn failed', - })]; - } - - return []; -} - -/** - * @type {import('../types.js').ProviderAdapter} - */ -export const codexAdapter = { - normalizeMessage, - /** - * Fetch session history from Codex JSONL files. - */ - async fetchHistory(sessionId, opts = {}) { - const { limit = null, offset = 0 } = opts; - - let result; - try { - result = await getCodexSessionMessages(sessionId, limit, offset); - } catch (error) { - console.warn(`[CodexAdapter] Failed to load session ${sessionId}:`, error.message); - return { messages: [], total: 0, hasMore: false, offset: 0, limit: null }; - } - - const rawMessages = Array.isArray(result) ? result : (result.messages || []); - const total = Array.isArray(result) ? rawMessages.length : (result.total || 0); - const hasMore = Array.isArray(result) ? 
false : Boolean(result.hasMore); - const tokenUsage = result.tokenUsage || null; - - const normalized = []; - for (const raw of rawMessages) { - const entries = normalizeCodexHistoryEntry(raw, sessionId); - normalized.push(...entries); - } - - // Attach tool results to tool_use messages - const toolResultMap = new Map(); - for (const msg of normalized) { - if (msg.kind === 'tool_result' && msg.toolId) { - toolResultMap.set(msg.toolId, msg); - } - } - for (const msg of normalized) { - if (msg.kind === 'tool_use' && msg.toolId && toolResultMap.has(msg.toolId)) { - const tr = toolResultMap.get(msg.toolId); - msg.toolResult = { content: tr.content, isError: tr.isError }; - } - } - - return { - messages: normalized, - total, - hasMore, - offset, - limit, - tokenUsage, - }; - }, -}; diff --git a/server/providers/codex/status.js b/server/providers/codex/status.js deleted file mode 100644 index cf1c273f..00000000 --- a/server/providers/codex/status.js +++ /dev/null @@ -1,78 +0,0 @@ -/** - * Codex Provider Status - * - * Checks whether the user has valid Codex authentication credentials. - * Codex uses an SDK that makes direct API calls (no external binary), - * so installation check always returns true if the server is running. - * - * @module providers/codex/status - */ - -import { promises as fs } from 'fs'; -import path from 'path'; -import os from 'os'; - -/** - * Check if Codex is installed. - * Codex SDK is bundled with this application — no external binary needed. - * @returns {boolean} - */ -export function checkInstalled() { - return true; -} - -/** - * Full status check: installation + authentication. 
- * @returns {Promise} - */ -export async function checkStatus() { - const installed = checkInstalled(); - const result = await checkCredentials(); - - return { - installed, - authenticated: result.authenticated, - email: result.email || null, - error: result.error || null - }; -} - -// ─── Internal helpers ─────────────────────────────────────────────────────── - -async function checkCredentials() { - try { - const authPath = path.join(os.homedir(), '.codex', 'auth.json'); - const content = await fs.readFile(authPath, 'utf8'); - const auth = JSON.parse(content); - - const tokens = auth.tokens || {}; - - if (tokens.id_token || tokens.access_token) { - let email = 'Authenticated'; - if (tokens.id_token) { - try { - const parts = tokens.id_token.split('.'); - if (parts.length >= 2) { - const payload = JSON.parse(Buffer.from(parts[1], 'base64url').toString('utf8')); - email = payload.email || payload.user || 'Authenticated'; - } - } catch { - email = 'Authenticated'; - } - } - - return { authenticated: true, email }; - } - - if (auth.OPENAI_API_KEY) { - return { authenticated: true, email: 'API Key Auth' }; - } - - return { authenticated: false, email: null, error: 'No valid tokens found' }; - } catch (error) { - if (error.code === 'ENOENT') { - return { authenticated: false, email: null, error: 'Codex not configured' }; - } - return { authenticated: false, email: null, error: error.message }; - } -} diff --git a/server/providers/cursor/adapter.js b/server/providers/cursor/adapter.js deleted file mode 100644 index ef94ea1d..00000000 --- a/server/providers/cursor/adapter.js +++ /dev/null @@ -1,348 +0,0 @@ -/** - * Cursor provider adapter. - * - * Normalizes Cursor CLI session history into NormalizedMessage format. 
- * @module adapters/cursor - */ - -import path from 'path'; -import os from 'os'; -import crypto from 'crypto'; -import { createNormalizedMessage, generateMessageId } from '../types.js'; - -const PROVIDER = 'cursor'; - -/** - * Load raw blobs from Cursor's SQLite store.db, parse the DAG structure, - * and return sorted message blobs in chronological order. - * @param {string} sessionId - * @param {string} projectPath - Absolute project path (used to compute cwdId hash) - * @returns {Promise>} - */ -async function loadCursorBlobs(sessionId, projectPath) { - // Lazy-import better-sqlite3 so the module doesn't fail if it's unavailable - const { default: Database } = await import('better-sqlite3'); - - const cwdId = crypto.createHash('md5').update(projectPath || process.cwd()).digest('hex'); - const storeDbPath = path.join(os.homedir(), '.cursor', 'chats', cwdId, sessionId, 'store.db'); - - const db = new Database(storeDbPath, { readonly: true, fileMustExist: true }); - - try { - const allBlobs = db.prepare('SELECT rowid, id, data FROM blobs').all(); - - const blobMap = new Map(); - const parentRefs = new Map(); - const childRefs = new Map(); - const jsonBlobs = []; - - for (const blob of allBlobs) { - blobMap.set(blob.id, blob); - - if (blob.data && blob.data[0] === 0x7B) { - try { - const parsed = JSON.parse(blob.data.toString('utf8')); - jsonBlobs.push({ ...blob, parsed }); - } catch { - // skip unparseable blobs - } - } else if (blob.data) { - const parents = []; - let i = 0; - while (i < blob.data.length - 33) { - if (blob.data[i] === 0x0A && blob.data[i + 1] === 0x20) { - const parentHash = blob.data.slice(i + 2, i + 34).toString('hex'); - if (blobMap.has(parentHash)) { - parents.push(parentHash); - } - i += 34; - } else { - i++; - } - } - if (parents.length > 0) { - parentRefs.set(blob.id, parents); - for (const parentId of parents) { - if (!childRefs.has(parentId)) childRefs.set(parentId, []); - childRefs.get(parentId).push(blob.id); - } - } - } - } - - // 
Topological sort (DFS) - const visited = new Set(); - const sorted = []; - function visit(nodeId) { - if (visited.has(nodeId)) return; - visited.add(nodeId); - for (const pid of (parentRefs.get(nodeId) || [])) visit(pid); - const b = blobMap.get(nodeId); - if (b) sorted.push(b); - } - for (const blob of allBlobs) { - if (!parentRefs.has(blob.id)) visit(blob.id); - } - for (const blob of allBlobs) visit(blob.id); - - // Order JSON blobs by DAG appearance - const messageOrder = new Map(); - let orderIndex = 0; - for (const blob of sorted) { - if (blob.data && blob.data[0] !== 0x7B) { - for (const jb of jsonBlobs) { - try { - const idBytes = Buffer.from(jb.id, 'hex'); - if (blob.data.includes(idBytes) && !messageOrder.has(jb.id)) { - messageOrder.set(jb.id, orderIndex++); - } - } catch { /* skip */ } - } - } - } - - const sortedJsonBlobs = jsonBlobs.sort((a, b) => { - const oa = messageOrder.get(a.id) ?? Number.MAX_SAFE_INTEGER; - const ob = messageOrder.get(b.id) ?? Number.MAX_SAFE_INTEGER; - return oa !== ob ? oa - ob : a.rowid - b.rowid; - }); - - const messages = []; - for (let idx = 0; idx < sortedJsonBlobs.length; idx++) { - const blob = sortedJsonBlobs[idx]; - const parsed = blob.parsed; - if (!parsed) continue; - const role = parsed?.role || parsed?.message?.role; - if (role === 'system') continue; - messages.push({ - id: blob.id, - sequence: idx + 1, - rowid: blob.rowid, - content: parsed, - }); - } - - return messages; - } finally { - db.close(); - } -} - -/** - * Normalize a realtime NDJSON event from Cursor CLI into NormalizedMessage(s). - * History uses normalizeCursorBlobs (SQLite DAG), this handles streaming NDJSON. 
- * @param {object|string} raw - A parsed NDJSON event or a raw text line - * @param {string} sessionId - * @returns {import('../types.js').NormalizedMessage[]} - */ -export function normalizeMessage(raw, sessionId) { - // Structured assistant message with content array - if (raw && typeof raw === 'object' && raw.type === 'assistant' && raw.message?.content?.[0]?.text) { - return [createNormalizedMessage({ kind: 'stream_delta', content: raw.message.content[0].text, sessionId, provider: PROVIDER })]; - } - // Plain string line (non-JSON output) - if (typeof raw === 'string' && raw.trim()) { - return [createNormalizedMessage({ kind: 'stream_delta', content: raw, sessionId, provider: PROVIDER })]; - } - return []; -} - -/** - * @type {import('../types.js').ProviderAdapter} - */ -export const cursorAdapter = { - normalizeMessage, - /** - * Fetch session history for Cursor from SQLite store.db. - */ - async fetchHistory(sessionId, opts = {}) { - const { projectPath = '', limit = null, offset = 0 } = opts; - - try { - const blobs = await loadCursorBlobs(sessionId, projectPath); - const allNormalized = cursorAdapter.normalizeCursorBlobs(blobs, sessionId); - - // Apply pagination - if (limit !== null && limit > 0) { - const start = offset; - const page = allNormalized.slice(start, start + limit); - return { - messages: page, - total: allNormalized.length, - hasMore: start + limit < allNormalized.length, - offset, - limit, - }; - } - - return { - messages: allNormalized, - total: allNormalized.length, - hasMore: false, - offset: 0, - limit: null, - }; - } catch (error) { - // DB doesn't exist or is unreadable — return empty - console.warn(`[CursorAdapter] Failed to load session ${sessionId}:`, error.message); - return { messages: [], total: 0, hasMore: false, offset: 0, limit: null }; - } - }, - - /** - * Normalize raw Cursor blob messages into NormalizedMessage[]. 
- * @param {any[]} blobs - Raw cursor blobs from store.db ({id, sequence, rowid, content}) - * @param {string} sessionId - * @returns {import('../types.js').NormalizedMessage[]} - */ - normalizeCursorBlobs(blobs, sessionId) { - const messages = []; - const toolUseMap = new Map(); - - // Use a fixed base timestamp so messages have stable, monotonically-increasing - // timestamps based on their sequence number rather than wall-clock time. - const baseTime = Date.now(); - - for (let i = 0; i < blobs.length; i++) { - const blob = blobs[i]; - const content = blob.content; - const ts = new Date(baseTime + (blob.sequence ?? i) * 100).toISOString(); - const baseId = blob.id || generateMessageId('cursor'); - - try { - if (!content?.role || !content?.content) { - // Try nested message format - if (content?.message?.role && content?.message?.content) { - if (content.message.role === 'system') continue; - const role = content.message.role === 'user' ? 'user' : 'assistant'; - let text = ''; - if (Array.isArray(content.message.content)) { - text = content.message.content - .map(p => typeof p === 'string' ? p : p?.text || '') - .filter(Boolean) - .join('\n'); - } else if (typeof content.message.content === 'string') { - text = content.message.content; - } - if (text?.trim()) { - messages.push(createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role, - content: text, - sequence: blob.sequence, - rowid: blob.rowid, - })); - } - } - continue; - } - - if (content.role === 'system') continue; - - // Tool results - if (content.role === 'tool') { - const toolItems = Array.isArray(content.content) ? 
content.content : []; - for (const item of toolItems) { - if (item?.type !== 'tool-result') continue; - const toolCallId = item.toolCallId || content.id; - messages.push(createNormalizedMessage({ - id: `${baseId}_tr`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_result', - toolId: toolCallId, - content: item.result || '', - isError: false, - })); - } - continue; - } - - const role = content.role === 'user' ? 'user' : 'assistant'; - - if (Array.isArray(content.content)) { - for (let partIdx = 0; partIdx < content.content.length; partIdx++) { - const part = content.content[partIdx]; - - if (part?.type === 'text' && part?.text) { - messages.push(createNormalizedMessage({ - id: `${baseId}_${partIdx}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role, - content: part.text, - sequence: blob.sequence, - rowid: blob.rowid, - })); - } else if (part?.type === 'reasoning' && part?.text) { - messages.push(createNormalizedMessage({ - id: `${baseId}_${partIdx}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'thinking', - content: part.text, - })); - } else if (part?.type === 'tool-call' || part?.type === 'tool_use') { - const toolName = (part.toolName || part.name || 'Unknown Tool') === 'ApplyPatch' - ? 
'Edit' : (part.toolName || part.name || 'Unknown Tool'); - const toolId = part.toolCallId || part.id || `tool_${i}_${partIdx}`; - messages.push(createNormalizedMessage({ - id: `${baseId}_${partIdx}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_use', - toolName, - toolInput: part.args || part.input, - toolId, - })); - toolUseMap.set(toolId, messages[messages.length - 1]); - } - } - } else if (typeof content.content === 'string' && content.content.trim()) { - messages.push(createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role, - content: content.content, - sequence: blob.sequence, - rowid: blob.rowid, - })); - } - } catch (error) { - console.warn('Error normalizing cursor blob:', error); - } - } - - // Attach tool results to tool_use messages - for (const msg of messages) { - if (msg.kind === 'tool_result' && msg.toolId && toolUseMap.has(msg.toolId)) { - const toolUse = toolUseMap.get(msg.toolId); - toolUse.toolResult = { - content: msg.content, - isError: msg.isError, - }; - } - } - - // Sort by sequence/rowid - messages.sort((a, b) => { - if (a.sequence !== undefined && b.sequence !== undefined) return a.sequence - b.sequence; - if (a.rowid !== undefined && b.rowid !== undefined) return a.rowid - b.rowid; - return new Date(a.timestamp).getTime() - new Date(b.timestamp).getTime(); - }); - - return messages; - }, -}; diff --git a/server/providers/cursor/status.js b/server/providers/cursor/status.js deleted file mode 100644 index 127e35b7..00000000 --- a/server/providers/cursor/status.js +++ /dev/null @@ -1,128 +0,0 @@ -/** - * Cursor Provider Status - * - * Checks whether cursor-agent CLI is installed and whether the user - * is logged in. - * - * @module providers/cursor/status - */ - -import { execFileSync, spawn } from 'child_process'; - -/** - * Check if cursor-agent CLI is installed. 
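The two-pass join that attaches `tool_result` messages to their `tool_use` by `toolId` (used by `normalizeCursorBlobs` above) can be sketched on made-up messages:

```javascript
// Pass 1 indexes tool_use messages by toolId; pass 2 copies each
// tool_result onto the matching tool_use. Data is invented.
const messages = [
  { kind: 'tool_use', toolId: 't1', toolName: 'Read' },
  { kind: 'text', role: 'assistant', content: 'Reading the file...' },
  { kind: 'tool_result', toolId: 't1', content: 'file contents', isError: false },
];

const toolUseMap = new Map();
for (const msg of messages) {
  if (msg.kind === 'tool_use') toolUseMap.set(msg.toolId, msg);
}
for (const msg of messages) {
  if (msg.kind === 'tool_result' && msg.toolId && toolUseMap.has(msg.toolId)) {
    const toolUse = toolUseMap.get(msg.toolId);
    toolUse.toolResult = { content: msg.content, isError: msg.isError };
  }
}
// messages[0].toolResult.content is now 'file contents'
```

A result whose `toolId` never appeared as a `tool_use` is simply left standalone, which matches the defensive `has()` check.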
- * @returns {boolean} - */ -export function checkInstalled() { - try { - execFileSync('cursor-agent', ['--version'], { stdio: 'ignore', timeout: 5000 }); - return true; - } catch { - return false; - } -} - -/** - * Full status check: installation + authentication. - * @returns {Promise} - */ -export async function checkStatus() { - const installed = checkInstalled(); - - if (!installed) { - return { - installed, - authenticated: false, - email: null, - error: 'Cursor CLI is not installed' - }; - } - - const result = await checkCursorLogin(); - - return { - installed, - authenticated: result.authenticated, - email: result.email || null, - error: result.error || null - }; -} - -// ─── Internal helpers ─────────────────────────────────────────────────────── - -function checkCursorLogin() { - return new Promise((resolve) => { - let processCompleted = false; - - const timeout = setTimeout(() => { - if (!processCompleted) { - processCompleted = true; - if (childProcess) { - childProcess.kill(); - } - resolve({ - authenticated: false, - email: null, - error: 'Command timeout' - }); - } - }, 5000); - - let childProcess; - try { - childProcess = spawn('cursor-agent', ['status']); - } catch { - clearTimeout(timeout); - processCompleted = true; - resolve({ - authenticated: false, - email: null, - error: 'Cursor CLI not found or not installed' - }); - return; - } - - let stdout = ''; - let stderr = ''; - - childProcess.stdout.on('data', (data) => { - stdout += data.toString(); - }); - - childProcess.stderr.on('data', (data) => { - stderr += data.toString(); - }); - - childProcess.on('close', (code) => { - if (processCompleted) return; - processCompleted = true; - clearTimeout(timeout); - - if (code === 0) { - const emailMatch = stdout.match(/Logged in as ([a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,})/i); - - if (emailMatch) { - resolve({ authenticated: true, email: emailMatch[1] }); - } else if (stdout.includes('Logged in')) { - resolve({ authenticated: true, email: 'Logged 
in' }); - } else { - resolve({ authenticated: false, email: null, error: 'Not logged in' }); - } - } else { - resolve({ authenticated: false, email: null, error: stderr || 'Not logged in' }); - } - }); - - childProcess.on('error', () => { - if (processCompleted) return; - processCompleted = true; - clearTimeout(timeout); - - resolve({ - authenticated: false, - email: null, - error: 'Cursor CLI not found or not installed' - }); - }); - }); -} diff --git a/server/providers/gemini/adapter.js b/server/providers/gemini/adapter.js deleted file mode 100644 index df303c36..00000000 --- a/server/providers/gemini/adapter.js +++ /dev/null @@ -1,186 +0,0 @@ -/** - * Gemini provider adapter. - * - * Normalizes Gemini CLI session history into NormalizedMessage format. - * @module adapters/gemini - */ - -import sessionManager from '../../sessionManager.js'; -import { getGeminiCliSessionMessages } from '../../projects.js'; -import { createNormalizedMessage, generateMessageId } from '../types.js'; - -const PROVIDER = 'gemini'; - -/** - * Normalize a realtime NDJSON event from Gemini CLI into NormalizedMessage(s). - * Handles: message (delta/final), tool_use, tool_result, result, error. 
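The stdout parsing inside `checkCursorLogin` is a pure step that can be pulled out and exercised directly. The sample outputs below are invented to match the regex, not captured from the real CLI, and `parseLoginStatus` is a hypothetical helper name:

```javascript
// Same email extraction as checkCursorLogin's close handler.
function parseLoginStatus(stdout) {
  const emailMatch = stdout.match(
    /Logged in as ([a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,})/i
  );
  if (emailMatch) return { authenticated: true, email: emailMatch[1] };
  // Logged in, but the CLI did not echo an email address.
  if (stdout.includes('Logged in')) return { authenticated: true, email: 'Logged in' };
  return { authenticated: false, email: null, error: 'Not logged in' };
}

const ok = parseLoginStatus('Logged in as dev@example.com');
const anon = parseLoginStatus('Logged in');
const no = parseLoginStatus('Not logged in. Run cursor-agent login.');
```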
- * @param {object} raw - A parsed NDJSON event - * @param {string} sessionId - * @returns {import('../types.js').NormalizedMessage[]} - */ -export function normalizeMessage(raw, sessionId) { - const ts = raw.timestamp || new Date().toISOString(); - const baseId = raw.uuid || generateMessageId('gemini'); - - if (raw.type === 'message' && raw.role === 'assistant') { - const content = raw.content || ''; - const msgs = []; - if (content) { - msgs.push(createNormalizedMessage({ id: baseId, sessionId, timestamp: ts, provider: PROVIDER, kind: 'stream_delta', content })); - } - // If not a delta, also send stream_end - if (raw.delta !== true) { - msgs.push(createNormalizedMessage({ sessionId, timestamp: ts, provider: PROVIDER, kind: 'stream_end' })); - } - return msgs; - } - - if (raw.type === 'tool_use') { - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'tool_use', toolName: raw.tool_name, toolInput: raw.parameters || {}, - toolId: raw.tool_id || baseId, - })]; - } - - if (raw.type === 'tool_result') { - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'tool_result', toolId: raw.tool_id || '', - content: raw.output === undefined ? 
'' : String(raw.output), - isError: raw.status === 'error', - })]; - } - - if (raw.type === 'result') { - const msgs = [createNormalizedMessage({ sessionId, timestamp: ts, provider: PROVIDER, kind: 'stream_end' })]; - if (raw.stats?.total_tokens) { - msgs.push(createNormalizedMessage({ - sessionId, timestamp: ts, provider: PROVIDER, - kind: 'status', text: 'Complete', tokens: raw.stats.total_tokens, canInterrupt: false, - })); - } - return msgs; - } - - if (raw.type === 'error') { - return [createNormalizedMessage({ - id: baseId, sessionId, timestamp: ts, provider: PROVIDER, - kind: 'error', content: raw.error || raw.message || 'Unknown Gemini streaming error', - })]; - } - - return []; -} - -/** - * @type {import('../types.js').ProviderAdapter} - */ -export const geminiAdapter = { - normalizeMessage, - /** - * Fetch session history for Gemini. - * First tries in-memory session manager, then falls back to CLI sessions on disk. - */ - async fetchHistory(sessionId, opts = {}) { - let rawMessages; - try { - rawMessages = sessionManager.getSessionMessages(sessionId); - - // Fallback to Gemini CLI sessions on disk - if (rawMessages.length === 0) { - rawMessages = await getGeminiCliSessionMessages(sessionId); - } - } catch (error) { - console.warn(`[GeminiAdapter] Failed to load session ${sessionId}:`, error.message); - return { messages: [], total: 0, hasMore: false, offset: 0, limit: null }; - } - - const normalized = []; - for (let i = 0; i < rawMessages.length; i++) { - const raw = rawMessages[i]; - const ts = raw.timestamp || new Date().toISOString(); - const baseId = raw.uuid || generateMessageId('gemini'); - - // sessionManager format: { type: 'message', message: { role, content }, timestamp } - // CLI format: { role: 'user'|'gemini'|'assistant', content: string|array } - const role = raw.message?.role || raw.role; - const content = raw.message?.content || raw.content; - - if (!role || !content) continue; - - const normalizedRole = (role === 'user') ? 
'user' : 'assistant'; - - if (Array.isArray(content)) { - for (let partIdx = 0; partIdx < content.length; partIdx++) { - const part = content[partIdx]; - if (part.type === 'text' && part.text) { - normalized.push(createNormalizedMessage({ - id: `${baseId}_${partIdx}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: normalizedRole, - content: part.text, - })); - } else if (part.type === 'tool_use') { - normalized.push(createNormalizedMessage({ - id: `${baseId}_${partIdx}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_use', - toolName: part.name, - toolInput: part.input, - toolId: part.id || generateMessageId('gemini_tool'), - })); - } else if (part.type === 'tool_result') { - normalized.push(createNormalizedMessage({ - id: `${baseId}_${partIdx}`, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'tool_result', - toolId: part.tool_use_id || '', - content: part.content === undefined ? '' : String(part.content), - isError: Boolean(part.is_error), - })); - } - } - } else if (typeof content === 'string' && content.trim()) { - normalized.push(createNormalizedMessage({ - id: baseId, - sessionId, - timestamp: ts, - provider: PROVIDER, - kind: 'text', - role: normalizedRole, - content, - })); - } - } - - // Attach tool results to tool_use messages - const toolResultMap = new Map(); - for (const msg of normalized) { - if (msg.kind === 'tool_result' && msg.toolId) { - toolResultMap.set(msg.toolId, msg); - } - } - for (const msg of normalized) { - if (msg.kind === 'tool_use' && msg.toolId && toolResultMap.has(msg.toolId)) { - const tr = toolResultMap.get(msg.toolId); - msg.toolResult = { content: tr.content, isError: tr.isError }; - } - } - - return { - messages: normalized, - total: normalized.length, - hasMore: false, - offset: 0, - limit: null, - }; - }, -}; diff --git a/server/providers/gemini/status.js b/server/providers/gemini/status.js deleted file mode 100644 index 385f889f..00000000 --- 
a/server/providers/gemini/status.js +++ /dev/null @@ -1,111 +0,0 @@ -/** - * Gemini Provider Status - * - * Checks whether Gemini CLI is installed and whether the user - * has valid authentication credentials. - * - * @module providers/gemini/status - */ - -import { execFileSync } from 'child_process'; -import { promises as fs } from 'fs'; -import path from 'path'; -import os from 'os'; - -/** - * Check if Gemini CLI is installed. - * Uses GEMINI_PATH env var if set, otherwise looks for 'gemini' in PATH. - * @returns {boolean} - */ -export function checkInstalled() { - const cliPath = process.env.GEMINI_PATH || 'gemini'; - try { - execFileSync(cliPath, ['--version'], { stdio: 'ignore', timeout: 5000 }); - return true; - } catch { - return false; - } -} - -/** - * Full status check: installation + authentication. - * @returns {Promise} - */ -export async function checkStatus() { - const installed = checkInstalled(); - - if (!installed) { - return { - installed, - authenticated: false, - email: null, - error: 'Gemini CLI is not installed' - }; - } - - const result = await checkCredentials(); - - return { - installed, - authenticated: result.authenticated, - email: result.email || null, - error: result.error || null - }; -} - -// ─── Internal helpers ─────────────────────────────────────────────────────── - -async function checkCredentials() { - if (process.env.GEMINI_API_KEY && process.env.GEMINI_API_KEY.trim()) { - return { authenticated: true, email: 'API Key Auth' }; - } - - try { - const credsPath = path.join(os.homedir(), '.gemini', 'oauth_creds.json'); - const content = await fs.readFile(credsPath, 'utf8'); - const creds = JSON.parse(content); - - if (creds.access_token) { - let email = 'OAuth Session'; - - try { - const tokenRes = await fetch(`https://oauth2.googleapis.com/tokeninfo?access_token=${creds.access_token}`); - if (tokenRes.ok) { - const tokenInfo = await tokenRes.json(); - if (tokenInfo.email) { - email = tokenInfo.email; - } - } else if 
(!creds.refresh_token) { - return { - authenticated: false, - email: null, - error: 'Access token invalid and no refresh token found' - }; - } else { - // Token might be expired but we have a refresh token, so CLI will refresh it - email = await getActiveAccountEmail() || email; - } - } catch { - // Network error, fallback to checking local accounts file - email = await getActiveAccountEmail() || email; - } - - return { authenticated: true, email }; - } - - return { authenticated: false, email: null, error: 'No valid tokens found in oauth_creds' }; - } catch { - return { authenticated: false, email: null, error: 'Gemini CLI not configured' }; - } -} - -async function getActiveAccountEmail() { - try { - const accPath = path.join(os.homedir(), '.gemini', 'google_accounts.json'); - const accContent = await fs.readFile(accPath, 'utf8'); - const accounts = JSON.parse(accContent); - return accounts.active || null; - } catch { - return null; - } -} diff --git a/server/providers/registry.js b/server/providers/registry.js deleted file mode 100644 index 4f62b60b..00000000 --- a/server/providers/registry.js +++ /dev/null @@ -1,67 +0,0 @@ -/** - * Provider Registry - * - * Centralizes provider adapter and status checker lookup. All code that needs - * a provider adapter or status checker should go through this registry instead - * of importing individual modules directly. 
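The Gemini credential check resolves auth in a fixed order: API-key env var first, then OAuth credentials on disk, with the active-accounts file as an email fallback. A pure sketch of that order, omitting the tokeninfo network probe and file I/O (inputs are plain objects, and `resolveGeminiAuth` is a hypothetical name):

```javascript
// Ordered credential resolution, modelled on checkCredentials above.
function resolveGeminiAuth({ apiKey, oauthCreds, activeAccount } = {}) {
  // 1. GEMINI_API_KEY wins outright.
  if (apiKey && apiKey.trim()) return { authenticated: true, email: 'API Key Auth' };
  // 2. OAuth creds with an access token; prefer the active account's email.
  if (oauthCreds && oauthCreds.access_token) {
    return { authenticated: true, email: activeAccount || 'OAuth Session' };
  }
  // 3. Nothing usable.
  return { authenticated: false, email: null, error: 'No valid tokens found' };
}

const viaKey = resolveGeminiAuth({ apiKey: 'AIza-example' });
const viaOauth = resolveGeminiAuth({
  oauthCreds: { access_token: 'tok' },
  activeAccount: 'me@example.com',
});
const unauth = resolveGeminiAuth({});
```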
- * - * @module providers/registry - */ - -import { claudeAdapter } from './claude/adapter.js'; -import { cursorAdapter } from './cursor/adapter.js'; -import { codexAdapter } from './codex/adapter.js'; -import { geminiAdapter } from './gemini/adapter.js'; - -import * as claudeStatus from './claude/status.js'; -import * as cursorStatus from './cursor/status.js'; -import * as codexStatus from './codex/status.js'; -import * as geminiStatus from './gemini/status.js'; - -/** - * @typedef {import('./types.js').ProviderAdapter} ProviderAdapter - * @typedef {import('./types.js').SessionProvider} SessionProvider - */ - -/** @type {Map} */ -const providers = new Map(); - -/** @type {Map boolean, checkStatus: () => Promise }>} */ -const statusCheckers = new Map(); - -// Register built-in providers -providers.set('claude', claudeAdapter); -providers.set('cursor', cursorAdapter); -providers.set('codex', codexAdapter); -providers.set('gemini', geminiAdapter); - -statusCheckers.set('claude', claudeStatus); -statusCheckers.set('cursor', cursorStatus); -statusCheckers.set('codex', codexStatus); -statusCheckers.set('gemini', geminiStatus); - -/** - * Get a provider adapter by name. - * @param {string} name - Provider name (e.g., 'claude', 'cursor', 'codex', 'gemini') - * @returns {ProviderAdapter | undefined} - */ -export function getProvider(name) { - return providers.get(name); -} - -/** - * Get a provider status checker by name. - * @param {string} name - Provider name - * @returns {{ checkInstalled: () => boolean, checkStatus: () => Promise } | undefined} - */ -export function getStatusChecker(name) { - return statusCheckers.get(name); -} - -/** - * Get all registered provider names. 
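The registry pattern above is just a `Map` keyed by provider name; a minimal sketch with stub adapters (not the real ones from this patch) shows the lookup contract, including the `undefined` result for an unknown provider:

```javascript
// Stub adapters standing in for claudeAdapter, cursorAdapter, etc.
const providers = new Map();
providers.set('claude', { fetchHistory: async () => ({ messages: [] }) });
providers.set('cursor', { fetchHistory: async () => ({ messages: [] }) });

function getProvider(name) {
  return providers.get(name); // undefined for unregistered names
}
function getAllProviders() {
  return Array.from(providers.keys()); // insertion order is preserved
}

const names = getAllProviders();
const known = getProvider('claude');
const unknown = getProvider('nope');
```

Callers are expected to handle the `undefined` case (for example, a route returning 404 for an unknown `:provider` param) rather than the registry throwing.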
- * @returns {string[]} - */ -export function getAllProviders() { - return Array.from(providers.keys()); -} diff --git a/server/providers/types.js b/server/providers/types.js deleted file mode 100644 index 9867b077..00000000 --- a/server/providers/types.js +++ /dev/null @@ -1,132 +0,0 @@ -/** - * Provider Types & Interface - * - * Defines the normalized message format and the provider adapter interface. - * All providers normalize their native formats into NormalizedMessage - * before sending over REST or WebSocket. - * - * @module providers/types - */ - -// ─── Session Provider ──────────────────────────────────────────────────────── - -/** - * @typedef {'claude' | 'cursor' | 'codex' | 'gemini'} SessionProvider - */ - -// ─── Message Kind ──────────────────────────────────────────────────────────── - -/** - * @typedef {'text' | 'tool_use' | 'tool_result' | 'thinking' | 'stream_delta' | 'stream_end' - * | 'error' | 'complete' | 'status' | 'permission_request' | 'permission_cancelled' - * | 'session_created' | 'interactive_prompt' | 'task_notification'} MessageKind - */ - -// ─── NormalizedMessage ─────────────────────────────────────────────────────── - -/** - * @typedef {Object} NormalizedMessage - * @property {string} id - Unique message id (for dedup between server + realtime) - * @property {string} sessionId - * @property {string} timestamp - ISO 8601 - * @property {SessionProvider} provider - * @property {MessageKind} kind - * - * Additional fields depending on kind: - * - text: role ('user'|'assistant'), content, images? - * - tool_use: toolName, toolInput, toolId - * - tool_result: toolId, content, isError - * - thinking: content - * - stream_delta: content - * - stream_end: (no extra fields) - * - error: content - * - complete: (no extra fields) - * - status: text, tokens?, canInterrupt? - * - permission_request: requestId, toolName, input, context? 
- * - permission_cancelled: requestId - * - session_created: newSessionId - * - interactive_prompt: content - * - task_notification: status, summary - */ - -// ─── Fetch History ─────────────────────────────────────────────────────────── - -/** - * @typedef {Object} FetchHistoryOptions - * @property {string} [projectName] - Project name (required for Claude) - * @property {string} [projectPath] - Absolute project path (required for Cursor cwdId hash) - * @property {number|null} [limit] - Page size (null = all messages) - * @property {number} [offset] - Pagination offset (default: 0) - */ - -/** - * @typedef {Object} FetchHistoryResult - * @property {NormalizedMessage[]} messages - Normalized messages - * @property {number} total - Total number of messages in the session - * @property {boolean} hasMore - Whether more messages exist before the current page - * @property {number} offset - Current offset - * @property {number|null} limit - Page size used - * @property {object} [tokenUsage] - Token usage data (provider-specific) - */ - -// ─── Provider Status ──────────────────────────────────────────────────────── - -/** - * Result of a provider status check (installation + authentication). - * - * @typedef {Object} ProviderStatus - * @property {boolean} installed - Whether the provider's CLI/SDK is available - * @property {boolean} authenticated - Whether valid credentials exist - * @property {string|null} email - User email or auth method identifier - * @property {string|null} [method] - Auth method (e.g. 'api_key', 'credentials_file') - * @property {string|null} [error] - Error message if not installed or not authenticated - */ - -// ─── Provider Adapter Interface ────────────────────────────────────────────── - -/** - * Every provider adapter MUST implement this interface. 
- * - * @typedef {Object} ProviderAdapter - * - * @property {(sessionId: string, opts?: FetchHistoryOptions) => Promise} fetchHistory - * Read persisted session messages from disk/database and return them as NormalizedMessage[]. - * The backend calls this from the unified GET /api/sessions/:id/messages endpoint. - * - * Provider implementations: - * - Claude: reads ~/.claude/projects/{projectName}/*.jsonl - * - Cursor: reads from SQLite store.db (via normalizeCursorBlobs helper) - * - Codex: reads ~/.codex/sessions/*.jsonl - * - Gemini: reads from in-memory sessionManager or ~/.gemini/tmp/ JSON files - * - * @property {(raw: any, sessionId: string) => NormalizedMessage[]} normalizeMessage - * Normalize a provider-specific event (JSONL entry or live SDK event) into NormalizedMessage[]. - * Used by provider files to convert both history and realtime events. - */ - -// ─── Runtime Helpers ───────────────────────────────────────────────────────── - -/** - * Generate a unique message ID. - * Uses crypto.randomUUID() to avoid collisions across server restarts and workers. - * @param {string} [prefix='msg'] - Optional prefix - * @returns {string} - */ -export function generateMessageId(prefix = 'msg') { - return `${prefix}_${crypto.randomUUID()}`; -} - -/** - * Create a NormalizedMessage with common fields pre-filled. - * @param {Partial & {kind: MessageKind, provider: SessionProvider}} fields - * @returns {NormalizedMessage} - */ -export function createNormalizedMessage(fields) { - return { - ...fields, - id: fields.id || generateMessageId(fields.kind), - sessionId: fields.sessionId || '', - timestamp: fields.timestamp || new Date().toISOString(), - provider: fields.provider, - }; -} diff --git a/server/providers/utils.js b/server/providers/utils.js deleted file mode 100644 index 1ec1382f..00000000 --- a/server/providers/utils.js +++ /dev/null @@ -1,29 +0,0 @@ -/** - * Shared provider utilities. 
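How `createNormalizedMessage` fills defaults can be demonstrated with the same logic, swapping in a deterministic id generator (a stand-in for `crypto.randomUUID()`) so the behavior is checkable:

```javascript
// Stand-in id generator; the real helper uses crypto.randomUUID().
function generateMessageId(prefix = 'msg') {
  return `${prefix}_${Math.random().toString(36).slice(2)}`;
}

function createNormalizedMessage(fields) {
  return {
    ...fields,
    id: fields.id || generateMessageId(fields.kind),      // id defaults to '<kind>_<random>'
    sessionId: fields.sessionId || '',
    timestamp: fields.timestamp || new Date().toISOString(),
    provider: fields.provider,
  };
}

const msg = createNormalizedMessage({ kind: 'text', provider: 'claude', role: 'user', content: 'hi' });
const explicit = createNormalizedMessage({ id: 'fixed_1', kind: 'error', provider: 'codex', content: 'boom' });
```

Note that caller-supplied `id`, `sessionId`, and `timestamp` always win over the generated defaults because the spread runs first.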
- * - * @module providers/utils - */ - -/** - * Prefixes that indicate internal/system content which should be hidden from the UI. - * @type {readonly string[]} - */ -export const INTERNAL_CONTENT_PREFIXES = Object.freeze([ - '', - '', - '', - '', - '', - 'Caveat:', - 'This session is being continued from a previous', - '[Request interrupted', -]); - -/** - * Check if user text content is internal/system that should be skipped. - * @param {string} content - * @returns {boolean} - */ -export function isInternalContent(content) { - return INTERNAL_CONTENT_PREFIXES.some(prefix => content.startsWith(prefix)); -} diff --git a/server/routes/cli-auth.js b/server/routes/cli-auth.js deleted file mode 100644 index 4183e83f..00000000 --- a/server/routes/cli-auth.js +++ /dev/null @@ -1,27 +0,0 @@ -/** - * CLI Auth Routes - * - * Thin router that delegates to per-provider status checkers - * registered in the provider registry. - * - * @module routes/cli-auth - */ - -import express from 'express'; -import { getAllProviders, getStatusChecker } from '../providers/registry.js'; - -const router = express.Router(); - -for (const provider of getAllProviders()) { - router.get(`/${provider}/status`, async (req, res) => { - try { - const checker = getStatusChecker(provider); - res.json(await checker.checkStatus()); - } catch (error) { - console.error(`Error checking ${provider} status:`, error); - res.status(500).json({ authenticated: false, error: error.message }); - } - }); -} - -export default router; diff --git a/server/routes/codex.js b/server/routes/codex.js index 3855548e..06630414 100644 --- a/server/routes/codex.js +++ b/server/routes/codex.js @@ -1,73 +1,9 @@ import express from 'express'; -import { spawn } from 'child_process'; -import { promises as fs } from 'fs'; -import path from 'path'; -import os from 'os'; -import TOML from '@iarna/toml'; -import { getCodexSessions, deleteCodexSession } from '../projects.js'; -import { applyCustomSessionNames, sessionNamesDb } from 
'../database/db.js'; +import { deleteCodexSession } from '../projects.js'; +import { sessionNamesDb } from '../database/db.js'; const router = express.Router(); -function createCliResponder(res) { - let responded = false; - return (status, payload) => { - if (responded || res.headersSent) { - return; - } - responded = true; - res.status(status).json(payload); - }; -} - -router.get('/config', async (req, res) => { - try { - const configPath = path.join(os.homedir(), '.codex', 'config.toml'); - const content = await fs.readFile(configPath, 'utf8'); - const config = TOML.parse(content); - - res.json({ - success: true, - config: { - model: config.model || null, - mcpServers: config.mcp_servers || {}, - approvalMode: config.approval_mode || 'suggest' - } - }); - } catch (error) { - if (error.code === 'ENOENT') { - res.json({ - success: true, - config: { - model: null, - mcpServers: {}, - approvalMode: 'suggest' - } - }); - } else { - console.error('Error reading Codex config:', error); - res.status(500).json({ success: false, error: error.message }); - } - } -}); - -router.get('/sessions', async (req, res) => { - try { - const { projectPath } = req.query; - - if (!projectPath) { - return res.status(400).json({ success: false, error: 'projectPath query parameter required' }); - } - - const sessions = await getCodexSessions(projectPath); - applyCustomSessionNames(sessions, 'codex'); - res.json({ success: true, sessions }); - } catch (error) { - console.error('Error fetching Codex sessions:', error); - res.status(500).json({ success: false, error: error.message }); - } -}); - router.delete('/sessions/:sessionId', async (req, res) => { try { const { sessionId } = req.params; @@ -80,250 +16,4 @@ router.delete('/sessions/:sessionId', async (req, res) => { } }); -// MCP Server Management Routes - -router.get('/mcp/cli/list', async (req, res) => { - try { - const respond = createCliResponder(res); - const proc = spawn('codex', ['mcp', 'list'], { stdio: ['pipe', 'pipe', 'pipe'] 
}); - - let stdout = ''; - let stderr = ''; - - proc.stdout?.on('data', (data) => { stdout += data.toString(); }); - proc.stderr?.on('data', (data) => { stderr += data.toString(); }); - - proc.on('close', (code) => { - if (code === 0) { - respond(200, { success: true, output: stdout, servers: parseCodexListOutput(stdout) }); - } else { - respond(500, { error: 'Codex CLI command failed', details: stderr || `Exited with code ${code}` }); - } - }); - - proc.on('error', (error) => { - const isMissing = error?.code === 'ENOENT'; - respond(isMissing ? 503 : 500, { - error: isMissing ? 'Codex CLI not installed' : 'Failed to run Codex CLI', - details: error.message, - code: error.code - }); - }); - } catch (error) { - res.status(500).json({ error: 'Failed to list MCP servers', details: error.message }); - } -}); - -router.post('/mcp/cli/add', async (req, res) => { - try { - const { name, command, args = [], env = {} } = req.body; - - if (!name || !command) { - return res.status(400).json({ error: 'name and command are required' }); - } - - // Build: codex mcp add [-e KEY=VAL]... -- [args...] 
-    let cliArgs = ['mcp', 'add', name];
-
-    Object.entries(env).forEach(([key, value]) => {
-      cliArgs.push('-e', `${key}=${value}`);
-    });
-
-    cliArgs.push('--', command);
-
-    if (args && args.length > 0) {
-      cliArgs.push(...args);
-    }
-
-    const respond = createCliResponder(res);
-    const proc = spawn('codex', cliArgs, { stdio: ['pipe', 'pipe', 'pipe'] });
-
-    let stdout = '';
-    let stderr = '';
-
-    proc.stdout?.on('data', (data) => { stdout += data.toString(); });
-    proc.stderr?.on('data', (data) => { stderr += data.toString(); });
-
-    proc.on('close', (code) => {
-      if (code === 0) {
-        respond(200, { success: true, output: stdout, message: `MCP server "${name}" added successfully` });
-      } else {
-        respond(400, { error: 'Codex CLI command failed', details: stderr || `Exited with code ${code}` });
-      }
-    });
-
-    proc.on('error', (error) => {
-      const isMissing = error?.code === 'ENOENT';
-      respond(isMissing ? 503 : 500, {
-        error: isMissing ? 'Codex CLI not installed' : 'Failed to run Codex CLI',
-        details: error.message,
-        code: error.code
-      });
-    });
-  } catch (error) {
-    res.status(500).json({ error: 'Failed to add MCP server', details: error.message });
-  }
-});
-
-router.delete('/mcp/cli/remove/:name', async (req, res) => {
-  try {
-    const { name } = req.params;
-
-    const respond = createCliResponder(res);
-    const proc = spawn('codex', ['mcp', 'remove', name], { stdio: ['pipe', 'pipe', 'pipe'] });
-
-    let stdout = '';
-    let stderr = '';
-
-    proc.stdout?.on('data', (data) => { stdout += data.toString(); });
-    proc.stderr?.on('data', (data) => { stderr += data.toString(); });
-
-    proc.on('close', (code) => {
-      if (code === 0) {
-        respond(200, { success: true, output: stdout, message: `MCP server "${name}" removed successfully` });
-      } else {
-        respond(400, { error: 'Codex CLI command failed', details: stderr || `Exited with code ${code}` });
-      }
-    });
-
-    proc.on('error', (error) => {
-      const isMissing = error?.code === 'ENOENT';
-      respond(isMissing ? 503 : 500, {
-        error: isMissing ? 'Codex CLI not installed' : 'Failed to run Codex CLI',
-        details: error.message,
-        code: error.code
-      });
-    });
-  } catch (error) {
-    res.status(500).json({ error: 'Failed to remove MCP server', details: error.message });
-  }
-});
-
-router.get('/mcp/cli/get/:name', async (req, res) => {
-  try {
-    const { name } = req.params;
-
-    const respond = createCliResponder(res);
-    const proc = spawn('codex', ['mcp', 'get', name], { stdio: ['pipe', 'pipe', 'pipe'] });
-
-    let stdout = '';
-    let stderr = '';
-
-    proc.stdout?.on('data', (data) => { stdout += data.toString(); });
-    proc.stderr?.on('data', (data) => { stderr += data.toString(); });
-
-    proc.on('close', (code) => {
-      if (code === 0) {
-        respond(200, { success: true, output: stdout, server: parseCodexGetOutput(stdout) });
-      } else {
-        respond(404, { error: 'Codex CLI command failed', details: stderr || `Exited with code ${code}` });
-      }
-    });
-
-    proc.on('error', (error) => {
-      const isMissing = error?.code === 'ENOENT';
-      respond(isMissing ? 503 : 500, {
-        error: isMissing ? 'Codex CLI not installed' : 'Failed to run Codex CLI',
-        details: error.message,
-        code: error.code
-      });
-    });
-  } catch (error) {
-    res.status(500).json({ error: 'Failed to get MCP server details', details: error.message });
-  }
-});
-
-router.get('/mcp/config/read', async (req, res) => {
-  try {
-    const configPath = path.join(os.homedir(), '.codex', 'config.toml');
-
-    let configData = null;
-
-    try {
-      const fileContent = await fs.readFile(configPath, 'utf8');
-      configData = TOML.parse(fileContent);
-    } catch (error) {
-      // Config file doesn't exist
-    }
-
-    if (!configData) {
-      return res.json({ success: true, configPath, servers: [] });
-    }
-
-    const servers = [];
-
-    if (configData.mcp_servers && typeof configData.mcp_servers === 'object') {
-      for (const [name, config] of Object.entries(configData.mcp_servers)) {
-        servers.push({
-          id: name,
-          name: name,
-          type: 'stdio',
-          scope: 'user',
-          config: {
-            command: config.command || '',
-            args: config.args || [],
-            env: config.env || {}
-          },
-          raw: config
-        });
-      }
-    }
-
-    res.json({ success: true, configPath, servers });
-  } catch (error) {
-    res.status(500).json({ error: 'Failed to read Codex configuration', details: error.message });
-  }
-});
-
-function parseCodexListOutput(output) {
-  const servers = [];
-  const lines = output.split('\n').filter(line => line.trim());
-
-  for (const line of lines) {
-    if (line.includes(':')) {
-      const colonIndex = line.indexOf(':');
-      const name = line.substring(0, colonIndex).trim();
-
-      if (!name) continue;
-
-      const rest = line.substring(colonIndex + 1).trim();
-      let description = rest;
-      let status = 'unknown';
-
-      if (rest.includes('✓') || rest.includes('✗')) {
-        const statusMatch = rest.match(/(.*?)\s*-\s*([✓✗].*)$/);
-        if (statusMatch) {
-          description = statusMatch[1].trim();
-          status = statusMatch[2].includes('✓') ? 'connected' : 'failed';
-        }
-      }
-
-      servers.push({ name, type: 'stdio', status, description });
-    }
-  }
-
-  return servers;
-}
-
-function parseCodexGetOutput(output) {
-  try {
-    const jsonMatch = output.match(/\{[\s\S]*\}/);
-    if (jsonMatch) {
-      return JSON.parse(jsonMatch[0]);
-    }
-
-    const server = { raw_output: output };
-    const lines = output.split('\n');
-
-    for (const line of lines) {
-      if (line.includes('Name:')) server.name = line.split(':')[1]?.trim();
-      else if (line.includes('Type:')) server.type = line.split(':')[1]?.trim();
-      else if (line.includes('Command:')) server.command = line.split(':')[1]?.trim();
-    }
-
-    return server;
-  } catch (error) {
-    return { raw_output: output, parse_error: error.message };
-  }
-}
-
 export default router;
diff --git a/server/routes/commands.js b/server/routes/commands.js
index 4ce3c4c0..4b791564 100644
--- a/server/routes/commands.js
+++ b/server/routes/commands.js
@@ -451,55 +451,6 @@ router.post('/list', async (req, res) => {
   }
 });
 
-/**
- * POST /api/commands/load
- * Load a specific command file and return its content and metadata
- */
-router.post('/load', async (req, res) => {
-  try {
-    const { commandPath } = req.body;
-
-    if (!commandPath) {
-      return res.status(400).json({
-        error: 'Command path is required'
-      });
-    }
-
-    // Security: Prevent path traversal
-    const resolvedPath = path.resolve(commandPath);
-    if (!resolvedPath.startsWith(path.resolve(os.homedir())) &&
-        !resolvedPath.includes('.claude/commands')) {
-      return res.status(403).json({
-        error: 'Access denied',
-        message: 'Command must be in .claude/commands directory'
-      });
-    }
-
-    // Read and parse the command file
-    const content = await fs.readFile(commandPath, 'utf8');
-    const { data: metadata, content: commandContent } = parseFrontmatter(content);
-
-    res.json({
-      path: commandPath,
-      metadata,
-      content: commandContent
-    });
-  } catch (error) {
-    if (error.code === 'ENOENT') {
-      return res.status(404).json({
-        error: 'Command not found',
-        message: `Command file not found: ${req.body.commandPath}`
-      });
-    }
-
-    console.error('Error loading command:', error);
-    res.status(500).json({
-      error: 'Failed to load command',
-      message: error.message
-    });
-  }
-});
-
 /**
  * POST /api/commands/execute
  * Execute a command with argument replacement
diff --git a/server/routes/cursor.js b/server/routes/cursor.js
index 3fc0270a..5fbe98ce 100644
--- a/server/routes/cursor.js
+++ b/server/routes/cursor.js
@@ -2,563 +2,51 @@ import express from 'express';
 import { promises as fs } from 'fs';
 import path from 'path';
 import os from 'os';
-import Database from 'better-sqlite3';
-import crypto from 'crypto';
 import { CURSOR_MODELS } from '../../shared/modelConstants.js';
-import { applyCustomSessionNames } from '../database/db.js';
 
 const router = express.Router();
 
-// GET /api/cursor/config - Read Cursor CLI configuration
+// GET /api/cursor/config - Read Cursor CLI configuration.
 router.get('/config', async (req, res) => {
   try {
     const configPath = path.join(os.homedir(), '.cursor', 'cli-config.json');
-    
+
     try {
       const configContent = await fs.readFile(configPath, 'utf8');
       const config = JSON.parse(configContent);
-      
+
       res.json({
         success: true,
-        config: config,
-        path: configPath
+        config,
+        path: configPath,
       });
     } catch (error) {
-      // Config doesn't exist or is invalid
+      // Config doesn't exist or is invalid, so return the UI default shape.
       console.log('Cursor config not found or invalid:', error.message);
-      
-      // Return default config
+
       res.json({
         success: true,
         config: {
           version: 1,
           model: {
             modelId: CURSOR_MODELS.DEFAULT,
-            displayName: "GPT-5"
+            displayName: 'GPT-5',
           },
           permissions: {
             allow: [],
-            deny: []
-          }
+            deny: [],
+          },
         },
-        isDefault: true
+        isDefault: true,
       });
     }
   } catch (error) {
     console.error('Error reading Cursor config:', error);
-    res.status(500).json({
-      error: 'Failed to read Cursor configuration',
-      details: error.message
+    res.status(500).json({
+      error: 'Failed to read Cursor configuration',
+      details: error.message,
     });
   }
 });
 
-// POST /api/cursor/config - Update Cursor CLI configuration
-router.post('/config', async (req, res) => {
-  try {
-    const { permissions, model } = req.body;
-    const configPath = path.join(os.homedir(), '.cursor', 'cli-config.json');
-
-    // Read existing config or create default
-    let config = {
-      version: 1,
-      editor: {
-        vimMode: false
-      },
-      hasChangedDefaultModel: false,
-      privacyCache: {
-        ghostMode: false,
-        privacyMode: 3,
-        updatedAt: Date.now()
-      }
-    };
-
-    try {
-      const existing = await fs.readFile(configPath, 'utf8');
-      config = JSON.parse(existing);
-    } catch (error) {
-      // Config doesn't exist, use defaults
-      console.log('Creating new Cursor config');
-    }
-
-    // Update permissions if provided
-    if (permissions) {
-      config.permissions = {
-        allow: permissions.allow || [],
-        deny: permissions.deny || []
-      };
-    }
-
-    // Update model if provided
-    if (model) {
-      config.model = model;
-      config.hasChangedDefaultModel = true;
-    }
-
-    // Ensure directory exists
-    const configDir = path.dirname(configPath);
-    await fs.mkdir(configDir, { recursive: true });
-
-    // Write updated config
-    await fs.writeFile(configPath, JSON.stringify(config, null, 2));
-
-    res.json({
-      success: true,
-      config: config,
-      message: 'Cursor configuration updated successfully'
-    });
-  } catch (error) {
-    console.error('Error updating Cursor config:', error);
-    res.status(500).json({
-      error: 'Failed to update Cursor configuration',
-      details: error.message
-    });
-  }
-});
-
-// GET /api/cursor/mcp - Read Cursor MCP servers configuration
-router.get('/mcp', async (req, res) => {
-  try {
-    const mcpPath = path.join(os.homedir(), '.cursor', 'mcp.json');
-
-    try {
-      const mcpContent = await fs.readFile(mcpPath, 'utf8');
-      const mcpConfig = JSON.parse(mcpContent);
-
-      // Convert to UI-friendly format
-      const servers = [];
-      if (mcpConfig.mcpServers && typeof mcpConfig.mcpServers === 'object') {
-        for (const [name, config] of Object.entries(mcpConfig.mcpServers)) {
-          const server = {
-            id: name,
-            name: name,
-            type: 'stdio',
-            scope: 'cursor',
-            config: {},
-            raw: config
-          };
-
-          // Determine transport type and extract config
-          if (config.command) {
-            server.type = 'stdio';
-            server.config.command = config.command;
-            server.config.args = config.args || [];
-            server.config.env = config.env || {};
-          } else if (config.url) {
-            server.type = config.transport || 'http';
-            server.config.url = config.url;
-            server.config.headers = config.headers || {};
-          }
-
-          servers.push(server);
-        }
-      }
-
-      res.json({
-        success: true,
-        servers: servers,
-        path: mcpPath
-      });
-    } catch (error) {
-      // MCP config doesn't exist
-      console.log('Cursor MCP config not found:', error.message);
-      res.json({
-        success: true,
-        servers: [],
-        isDefault: true
-      });
-    }
-  } catch (error) {
-    console.error('Error reading Cursor MCP config:', error);
-    res.status(500).json({
-      error: 'Failed to read Cursor MCP configuration',
-      details: error.message
-    });
-  }
-});
-
-// POST /api/cursor/mcp/add - Add MCP server to Cursor configuration
-router.post('/mcp/add', async (req, res) => {
-  try {
-    const { name, type = 'stdio', command, args = [], url, headers = {}, env = {} } = req.body;
-    const mcpPath = path.join(os.homedir(), '.cursor', 'mcp.json');
-
-    console.log(`➕ Adding MCP server to Cursor config: ${name}`);
-
-    // Read existing config or create new
-    let mcpConfig = { mcpServers: {} };
-
-    try {
-      const existing = await fs.readFile(mcpPath, 'utf8');
-      mcpConfig = JSON.parse(existing);
-      if (!mcpConfig.mcpServers) {
-        mcpConfig.mcpServers = {};
-      }
-    } catch (error) {
-      console.log('Creating new Cursor MCP config');
-    }
-
-    // Build server config based on type
-    let serverConfig = {};
-
-    if (type === 'stdio') {
-      serverConfig = {
-        command: command,
-        args: args,
-        env: env
-      };
-    } else if (type === 'http' || type === 'sse') {
-      serverConfig = {
-        url: url,
-        transport: type,
-        headers: headers
-      };
-    }
-
-    // Add server to config
-    mcpConfig.mcpServers[name] = serverConfig;
-
-    // Ensure directory exists
-    const mcpDir = path.dirname(mcpPath);
-    await fs.mkdir(mcpDir, { recursive: true });
-
-    // Write updated config
-    await fs.writeFile(mcpPath, JSON.stringify(mcpConfig, null, 2));
-
-    res.json({
-      success: true,
-      message: `MCP server "${name}" added to Cursor configuration`,
-      config: mcpConfig
-    });
-  } catch (error) {
-    console.error('Error adding MCP server to Cursor:', error);
-    res.status(500).json({
-      error: 'Failed to add MCP server',
-      details: error.message
-    });
-  }
-});
-
-// DELETE /api/cursor/mcp/:name - Remove MCP server from Cursor configuration
-router.delete('/mcp/:name', async (req, res) => {
-  try {
-    const { name } = req.params;
-    const mcpPath = path.join(os.homedir(), '.cursor', 'mcp.json');
-
-    console.log(`🗑️ Removing MCP server from Cursor config: ${name}`);
-
-    // Read existing config
-    let mcpConfig = { mcpServers: {} };
-
-    try {
-      const existing = await fs.readFile(mcpPath, 'utf8');
-      mcpConfig = JSON.parse(existing);
-    } catch (error) {
-      return res.status(404).json({
-        error: 'Cursor MCP configuration not found'
-      });
-    }
-
-    // Check if server exists
-    if (!mcpConfig.mcpServers || !mcpConfig.mcpServers[name]) {
-      return res.status(404).json({
-        error: `MCP server "${name}" not found in Cursor configuration`
-      });
-    }
-
-    // Remove server from config
-    delete mcpConfig.mcpServers[name];
-
-    // Write updated config
-    await fs.writeFile(mcpPath, JSON.stringify(mcpConfig, null, 2));
-
-    res.json({
-      success: true,
-      message: `MCP server "${name}" removed from Cursor configuration`,
-      config: mcpConfig
-    });
-  } catch (error) {
-    console.error('Error removing MCP server from Cursor:', error);
-    res.status(500).json({
-      error: 'Failed to remove MCP server',
-      details: error.message
-    });
-  }
-});
-
-// POST /api/cursor/mcp/add-json - Add MCP server using JSON format
-router.post('/mcp/add-json', async (req, res) => {
-  try {
-    const { name, jsonConfig } = req.body;
-    const mcpPath = path.join(os.homedir(), '.cursor', 'mcp.json');
-
-    console.log(`➕ Adding MCP server to Cursor config via JSON: ${name}`);
-
-    // Validate and parse JSON config
-    let parsedConfig;
-    try {
-      parsedConfig = typeof jsonConfig === 'string' ? JSON.parse(jsonConfig) : jsonConfig;
-    } catch (parseError) {
-      return res.status(400).json({
-        error: 'Invalid JSON configuration',
-        details: parseError.message
-      });
-    }
-
-    // Read existing config or create new
-    let mcpConfig = { mcpServers: {} };
-
-    try {
-      const existing = await fs.readFile(mcpPath, 'utf8');
-      mcpConfig = JSON.parse(existing);
-      if (!mcpConfig.mcpServers) {
-        mcpConfig.mcpServers = {};
-      }
-    } catch (error) {
-      console.log('Creating new Cursor MCP config');
-    }
-
-    // Add server to config
-    mcpConfig.mcpServers[name] = parsedConfig;
-
-    // Ensure directory exists
-    const mcpDir = path.dirname(mcpPath);
-    await fs.mkdir(mcpDir, { recursive: true });
-
-    // Write updated config
-    await fs.writeFile(mcpPath, JSON.stringify(mcpConfig, null, 2));
-
-    res.json({
-      success: true,
-      message: `MCP server "${name}" added to Cursor configuration via JSON`,
-      config: mcpConfig
-    });
-  } catch (error) {
-    console.error('Error adding MCP server to Cursor via JSON:', error);
-    res.status(500).json({
-      error: 'Failed to add MCP server',
-      details: error.message
-    });
-  }
-});
-
-// GET /api/cursor/sessions - Get Cursor sessions from SQLite database
-router.get('/sessions', async (req, res) => {
-  try {
-    const { projectPath } = req.query;
-
-    // Calculate cwdID hash for the project path (Cursor uses MD5 hash)
-    const cwdId = crypto.createHash('md5').update(projectPath || process.cwd()).digest('hex');
-    const cursorChatsPath = path.join(os.homedir(), '.cursor', 'chats', cwdId);
-
-
-    // Check if the directory exists
-    try {
-      await fs.access(cursorChatsPath);
-    } catch (error) {
-      // No sessions for this project
-      return res.json({
-        success: true,
-        sessions: [],
-        cwdId: cwdId,
-        path: cursorChatsPath
-      });
-    }
-
-    // List all session directories
-    const sessionDirs = await fs.readdir(cursorChatsPath);
-    const sessions = [];
-
-    for (const sessionId of sessionDirs) {
-      const sessionPath = path.join(cursorChatsPath, sessionId);
-      const storeDbPath = path.join(sessionPath, 'store.db');
-      let dbStatMtimeMs = null;
-
-      try {
-        // Check if store.db exists
-        await fs.access(storeDbPath);
-
-        // Capture store.db mtime as a reliable fallback timestamp (last activity)
-        try {
-          const stat = await fs.stat(storeDbPath);
-          dbStatMtimeMs = stat.mtimeMs;
-        } catch (_) {}
-
-        // Open SQLite database
-        const db = new Database(storeDbPath, { readonly: true, fileMustExist: true });
-
-        // Get metadata from meta table
-        const metaRows = db.prepare('SELECT key, value FROM meta').all();
-
-        let sessionData = {
-          id: sessionId,
-          name: 'Untitled Session',
-          createdAt: null,
-          mode: null,
-          projectPath: projectPath,
-          lastMessage: null,
-          messageCount: 0
-        };
-
-        // Parse meta table entries
-        for (const row of metaRows) {
-          if (row.value) {
-            try {
-              // Try to decode as hex-encoded JSON
-              const hexMatch = row.value.toString().match(/^[0-9a-fA-F]+$/);
-              if (hexMatch) {
-                const jsonStr = Buffer.from(row.value, 'hex').toString('utf8');
-                const data = JSON.parse(jsonStr);
-
-                if (row.key === 'agent') {
-                  sessionData.name = data.name || sessionData.name;
-                  // Normalize createdAt to ISO string in milliseconds
-                  let createdAt = data.createdAt;
-                  if (typeof createdAt === 'number') {
-                    if (createdAt < 1e12) {
-                      createdAt = createdAt * 1000; // seconds -> ms
-                    }
-                    sessionData.createdAt = new Date(createdAt).toISOString();
-                  } else if (typeof createdAt === 'string') {
-                    const n = Number(createdAt);
-                    if (!Number.isNaN(n)) {
-                      const ms = n < 1e12 ? n * 1000 : n;
-                      sessionData.createdAt = new Date(ms).toISOString();
-                    } else {
-                      // Assume it's already an ISO/date string
-                      const d = new Date(createdAt);
-                      sessionData.createdAt = isNaN(d.getTime()) ? null : d.toISOString();
-                    }
-                  } else {
-                    sessionData.createdAt = sessionData.createdAt || null;
-                  }
-                  sessionData.mode = data.mode;
-                  sessionData.agentId = data.agentId;
-                  sessionData.latestRootBlobId = data.latestRootBlobId;
-                }
-              } else {
-                // If not hex, use raw value for simple keys
-                if (row.key === 'name') {
-                  sessionData.name = row.value.toString();
-                }
-              }
-            } catch (e) {
-              console.log(`Could not parse meta value for key ${row.key}:`, e.message);
-            }
-          }
-        }
-
-        // Get message count from JSON blobs only (actual messages, not DAG structure)
-        try {
-          const blobCount = db.prepare(`SELECT COUNT(*) as count FROM blobs WHERE substr(data, 1, 1) = X'7B'`).get();
-          sessionData.messageCount = blobCount.count;
-
-          // Get the most recent JSON blob for preview (actual message, not DAG structure)
-          const lastBlob = db.prepare(`SELECT data FROM blobs WHERE substr(data, 1, 1) = X'7B' ORDER BY rowid DESC LIMIT 1`).get();
-
-          if (lastBlob && lastBlob.data) {
-            try {
-              // Try to extract readable preview from blob (may contain binary with embedded JSON)
-              const raw = lastBlob.data.toString('utf8');
-              let preview = '';
-              // Attempt direct JSON parse
-              try {
-                const parsed = JSON.parse(raw);
-                if (parsed?.content) {
-                  if (Array.isArray(parsed.content)) {
-                    const firstText = parsed.content.find(p => p?.type === 'text' && p.text)?.text || '';
-                    preview = firstText;
-                  } else if (typeof parsed.content === 'string') {
-                    preview = parsed.content;
-                  }
-                }
-              } catch (_) {}
-              if (!preview) {
-                // Strip non-printable and try to find JSON chunk
-                const cleaned = raw.replace(/[^\x09\x0A\x0D\x20-\x7E]/g, '');
-                const s = cleaned;
-                const start = s.indexOf('{');
-                const end = s.lastIndexOf('}');
-                if (start !== -1 && end > start) {
-                  const jsonStr = s.slice(start, end + 1);
-                  try {
-                    const parsed = JSON.parse(jsonStr);
-                    if (parsed?.content) {
-                      if (Array.isArray(parsed.content)) {
-                        const firstText = parsed.content.find(p => p?.type === 'text' && p.text)?.text || '';
-                        preview = firstText;
-                      } else if (typeof parsed.content === 'string') {
-                        preview = parsed.content;
-                      }
-                    }
-                  } catch (_) {
-                    preview = s;
-                  }
-                } else {
-                  preview = s;
-                }
-              }
-              if (preview && preview.length > 0) {
-                sessionData.lastMessage = preview.substring(0, 100) + (preview.length > 100 ? '...' : '');
-              }
-            } catch (e) {
-              console.log('Could not parse blob data:', e.message);
-            }
-          }
-        } catch (e) {
-          console.log('Could not read blobs:', e.message);
-        }
-
-        db.close();
-
-        // Finalize createdAt: use parsed meta value when valid, else fall back to store.db mtime
-        if (!sessionData.createdAt) {
-          if (dbStatMtimeMs && Number.isFinite(dbStatMtimeMs)) {
-            sessionData.createdAt = new Date(dbStatMtimeMs).toISOString();
-          }
-        }
-
-        sessions.push(sessionData);
-
-      } catch (error) {
-        console.log(`Could not read session ${sessionId}:`, error.message);
-      }
-    }
-
-    // Fallback: ensure createdAt is a valid ISO string (use session directory mtime as last resort)
-    for (const s of sessions) {
-      if (!s.createdAt) {
-        try {
-          const sessionDir = path.join(cursorChatsPath, s.id);
-          const st = await fs.stat(sessionDir);
-          s.createdAt = new Date(st.mtimeMs).toISOString();
-        } catch {
-          s.createdAt = new Date().toISOString();
-        }
-      }
-    }
-    // Sort sessions by creation date (newest first)
-    sessions.sort((a, b) => {
-      if (!a.createdAt) return 1;
-      if (!b.createdAt) return -1;
-      return new Date(b.createdAt) - new Date(a.createdAt);
-    });
-
-    applyCustomSessionNames(sessions, 'cursor');
-
-    res.json({
-      success: true,
-      sessions: sessions,
-      cwdId: cwdId,
-      path: cursorChatsPath
-    });
-
-  } catch (error) {
-    console.error('Error reading Cursor sessions:', error);
-    res.status(500).json({
-      error: 'Failed to read Cursor sessions',
-      details: error.message
-    });
-  }
-});
-export default router;
\ No newline at end of file
+export default router;
diff --git a/server/routes/mcp-utils.js b/server/routes/mcp-utils.js
index 8b3cd292..52312d2e 100644
--- a/server/routes/mcp-utils.js
+++ b/server/routes/mcp-utils.js
@@ -7,7 +7,7 @@
  */
 
 import express from 'express';
-import { detectTaskMasterMCPServer, getAllMCPServers } from '../utils/mcp-detector.js';
+import { detectTaskMasterMCPServer } from '../utils/mcp-detector.js';
 
 const router = express.Router();
 
@@ -28,21 +28,4 @@ router.get('/taskmaster-server', async (req, res) => {
   }
 });
 
-/**
- * GET /api/mcp-utils/all-servers
- * Get all configured MCP servers
- */
-router.get('/all-servers', async (req, res) => {
-  try {
-    const result = await getAllMCPServers();
-    res.json(result);
-  } catch (error) {
-    console.error('MCP servers detection error:', error);
-    res.status(500).json({
-      error: 'Failed to get MCP servers',
-      message: error.message
-    });
-  }
-});
-
-export default router;
\ No newline at end of file
+export default router;
diff --git a/server/routes/mcp.js b/server/routes/mcp.js
deleted file mode 100644
index 080be6ab..00000000
--- a/server/routes/mcp.js
+++ /dev/null
@@ -1,552 +0,0 @@
-import express from 'express';
-import { promises as fs } from 'fs';
-import path from 'path';
-import os from 'os';
-import { fileURLToPath } from 'url';
-import { dirname } from 'path';
-import { spawn } from 'child_process';
-
-const router = express.Router();
-const __filename = fileURLToPath(import.meta.url);
-const __dirname = dirname(__filename);
-
-// Claude CLI command routes
-
-// GET /api/mcp/cli/list - List MCP servers using Claude CLI
-router.get('/cli/list', async (req, res) => {
-  try {
-    console.log('📋 Listing MCP servers using Claude CLI');
-
-    const { spawn } = await import('child_process');
-    const { promisify } = await import('util');
-    const exec = promisify(spawn);
-
-    const process = spawn('claude', ['mcp', 'list'], {
-      stdio: ['pipe', 'pipe', 'pipe']
-    });
-
-    let stdout = '';
-    let stderr = '';
-
-    process.stdout.on('data', (data) => {
-      stdout += data.toString();
-    });
-
-    process.stderr.on('data', (data) => {
-      stderr += data.toString();
-    });
-
-    process.on('close', (code) => {
-      if (code === 0) {
-        res.json({ success: true, output: stdout, servers: parseClaudeListOutput(stdout) });
-      } else {
-        console.error('Claude CLI error:', stderr);
-        res.status(500).json({ error: 'Claude CLI command failed', details: stderr });
-      }
-    });
-
-    process.on('error', (error) => {
-      console.error('Error running Claude CLI:', error);
-      res.status(500).json({ error: 'Failed to run Claude CLI', details: error.message });
-    });
-  } catch (error) {
-    console.error('Error listing MCP servers via CLI:', error);
-    res.status(500).json({ error: 'Failed to list MCP servers', details: error.message });
-  }
-});
-
-// POST /api/mcp/cli/add - Add MCP server using Claude CLI
-router.post('/cli/add', async (req, res) => {
-  try {
-    const { name, type = 'stdio', command, args = [], url, headers = {}, env = {}, scope = 'user', projectPath } = req.body;
-
-    console.log(`➕ Adding MCP server using Claude CLI (${scope} scope):`, name);
-
-    const { spawn } = await import('child_process');
-
-    let cliArgs = ['mcp', 'add'];
-
-    // Add scope flag
-    cliArgs.push('--scope', scope);
-
-    if (type === 'http') {
-      cliArgs.push('--transport', 'http', name, url);
-      // Add headers if provided
-      Object.entries(headers).forEach(([key, value]) => {
-        cliArgs.push('--header', `${key}: ${value}`);
-      });
-    } else if (type === 'sse') {
-      cliArgs.push('--transport', 'sse', name, url);
-      // Add headers if provided
-      Object.entries(headers).forEach(([key, value]) => {
-        cliArgs.push('--header', `${key}: ${value}`);
-      });
-    } else {
-      // stdio (default): claude mcp add --scope user [args...]
-      cliArgs.push(name);
-      // Add environment variables
-      Object.entries(env).forEach(([key, value]) => {
-        cliArgs.push('-e', `${key}=${value}`);
-      });
-      cliArgs.push(command);
-      if (args && args.length > 0) {
-        cliArgs.push(...args);
-      }
-    }
-
-    console.log('🔧 Running Claude CLI command:', 'claude', cliArgs.join(' '));
-
-    // For local scope, we need to run the command in the project directory
-    const spawnOptions = {
-      stdio: ['pipe', 'pipe', 'pipe']
-    };
-
-    if (scope === 'local' && projectPath) {
-      spawnOptions.cwd = projectPath;
-      console.log('📁 Running in project directory:', projectPath);
-    }
-
-    const process = spawn('claude', cliArgs, spawnOptions);
-
-    let stdout = '';
-    let stderr = '';
-
-    process.stdout.on('data', (data) => {
-      stdout += data.toString();
-    });
-
-    process.stderr.on('data', (data) => {
-      stderr += data.toString();
-    });
-
-    process.on('close', (code) => {
-      if (code === 0) {
-        res.json({ success: true, output: stdout, message: `MCP server "${name}" added successfully` });
-      } else {
-        console.error('Claude CLI error:', stderr);
-        res.status(400).json({ error: 'Claude CLI command failed', details: stderr });
-      }
-    });
-
-    process.on('error', (error) => {
-      console.error('Error running Claude CLI:', error);
-      res.status(500).json({ error: 'Failed to run Claude CLI', details: error.message });
-    });
-  } catch (error) {
-    console.error('Error adding MCP server via CLI:', error);
-    res.status(500).json({ error: 'Failed to add MCP server', details: error.message });
-  }
-});
-
-// POST /api/mcp/cli/add-json - Add MCP server using JSON format
-router.post('/cli/add-json', async (req, res) => {
-  try {
-    const { name, jsonConfig, scope = 'user', projectPath } = req.body;
-
-    console.log('➕ Adding MCP server using JSON format:', name);
-
-    // Validate and parse JSON config
-    let parsedConfig;
-    try {
-      parsedConfig = typeof jsonConfig === 'string' ? JSON.parse(jsonConfig) : jsonConfig;
-    } catch (parseError) {
-      return res.status(400).json({
-        error: 'Invalid JSON configuration',
-        details: parseError.message
-      });
-    }
-
-    // Validate required fields
-    if (!parsedConfig.type) {
-      return res.status(400).json({
-        error: 'Invalid configuration',
-        details: 'Missing required field: type'
-      });
-    }
-
-    if (parsedConfig.type === 'stdio' && !parsedConfig.command) {
-      return res.status(400).json({
-        error: 'Invalid configuration',
-        details: 'stdio type requires a command field'
-      });
-    }
-
-    if ((parsedConfig.type === 'http' || parsedConfig.type === 'sse') && !parsedConfig.url) {
-      return res.status(400).json({
-        error: 'Invalid configuration',
-        details: `${parsedConfig.type} type requires a url field`
-      });
-    }
-
-    const { spawn } = await import('child_process');
-
-    // Build the command: claude mcp add-json --scope ''
-    const cliArgs = ['mcp', 'add-json', '--scope', scope, name];
-
-    // Add the JSON config as a properly formatted string
-    const jsonString = JSON.stringify(parsedConfig);
-    cliArgs.push(jsonString);
-
-    console.log('🔧 Running Claude CLI command:', 'claude', cliArgs[0], cliArgs[1], cliArgs[2], cliArgs[3], cliArgs[4], jsonString);
-
-    // For local scope, we need to run the command in the project directory
-    const spawnOptions = {
-      stdio: ['pipe', 'pipe', 'pipe']
-    };
-
-    if (scope === 'local' && projectPath) {
-      spawnOptions.cwd = projectPath;
-      console.log('📁 Running in project directory:', projectPath);
-    }
-
-    const process = spawn('claude', cliArgs, spawnOptions);
-
-    let stdout = '';
-    let stderr = '';
-
-    process.stdout.on('data', (data) => {
-      stdout += data.toString();
-    });
-
-    process.stderr.on('data', (data) => {
-      stderr += data.toString();
-    });
-
-    process.on('close', (code) => {
-      if (code === 0) {
-        res.json({ success: true, output: stdout, message: `MCP server "${name}" added successfully via JSON` });
-      } else {
-        console.error('Claude CLI error:', stderr);
-        res.status(400).json({ error: 'Claude CLI command failed', details: stderr });
-      }
-    });
-
-    process.on('error', (error) => {
-      console.error('Error running Claude CLI:', error);
-      res.status(500).json({ error: 'Failed to run Claude CLI', details: error.message });
-    });
-  } catch (error) {
-    console.error('Error adding MCP server via JSON:', error);
-    res.status(500).json({ error: 'Failed to add MCP server', details: error.message });
-  }
-});
-
-// DELETE /api/mcp/cli/remove/:name - Remove MCP server using Claude CLI
-router.delete('/cli/remove/:name', async (req, res) => {
-  try {
-    const { name } = req.params;
-    const { scope } = req.query; // Get scope from query params
-
-    // Handle the ID format (remove scope prefix if present)
-    let actualName = name;
-    let actualScope = scope;
-
-    // If the name includes a scope prefix like "local:test", extract it
-    if (name.includes(':')) {
-      const [prefix, serverName] = name.split(':');
-      actualName = serverName;
-      actualScope = actualScope || prefix; // Use prefix as scope if not provided in query
-    }
-
-    console.log('🗑️ Removing MCP server using Claude CLI:', actualName, 'scope:', actualScope);
-
-    const { spawn } = await import('child_process');
-
-    // Build command args based on scope
-    let cliArgs = ['mcp', 'remove'];
-
-    // Add scope flag if it's local scope
-    if (actualScope === 'local') {
-      cliArgs.push('--scope', 'local');
-    } else if (actualScope === 'user' || !actualScope) {
-      // User scope is default, but we can be explicit
-      cliArgs.push('--scope', 'user');
-    }
-
-    cliArgs.push(actualName);
-
-    console.log('🔧 Running Claude CLI command:', 'claude', cliArgs.join(' '));
-
-    const process = spawn('claude', cliArgs, {
-      stdio: ['pipe', 'pipe', 'pipe']
-    });
-
-    let stdout = '';
-    let stderr = '';
-
-    process.stdout.on('data', (data) => {
-      stdout += data.toString();
-    });
-
-    process.stderr.on('data', (data) => {
-      stderr += data.toString();
-    });
-
-    process.on('close', (code) => {
-      if (code === 0) {
-        res.json({ success: true, output: stdout, message: `MCP server "${name}" removed successfully` });
-      } else {
-        console.error('Claude CLI error:', stderr);
-        res.status(400).json({ error: 'Claude CLI command failed', details: stderr });
-      }
-    });
-
-    process.on('error', (error) => {
-      console.error('Error running Claude CLI:', error);
-      res.status(500).json({ error: 'Failed to run Claude CLI', details: error.message });
-    });
-  } catch (error) {
-    console.error('Error removing MCP server via CLI:', error);
-    res.status(500).json({ error: 'Failed to remove MCP server', details: error.message });
-  }
-});
-
-// GET /api/mcp/cli/get/:name - Get MCP server details using Claude CLI
-router.get('/cli/get/:name', async (req, res) => {
-  try {
-    const { name } = req.params;
-
-    console.log('📄 Getting MCP server details using Claude CLI:', name);
-
-    const { spawn } = await import('child_process');
-
-    const process = spawn('claude', ['mcp', 'get', name], {
-      stdio: ['pipe', 'pipe', 'pipe']
-    });
-
-    let stdout = '';
-    let stderr = '';
-
-    process.stdout.on('data', (data) => {
-      stdout += data.toString();
-    });
-
-    process.stderr.on('data', (data) => {
-      stderr += data.toString();
-    });
-
-    process.on('close', (code) => {
-      if (code === 0) {
-        res.json({ success: true, output: stdout, server: parseClaudeGetOutput(stdout) });
-      } else {
-        console.error('Claude CLI error:', stderr);
-        res.status(404).json({ error: 'Claude CLI command failed', details: stderr });
-      }
-    });
-
-    process.on('error', (error) => {
-      console.error('Error running Claude CLI:', error);
-      res.status(500).json({ error: 'Failed to run Claude CLI', details: error.message });
-    });
-  } catch (error) {
-    console.error('Error getting MCP server details via CLI:', error);
-    res.status(500).json({ error: 'Failed to get MCP server details', details: error.message });
-  }
-});
-
-// GET /api/mcp/config/read - Read MCP servers directly from Claude config files
-router.get('/config/read', async (req, res) => {
-  try {
-    console.log('📖 Reading MCP servers from Claude config files');
-
-    const homeDir = os.homedir();
-    const configPaths = [
-      path.join(homeDir, '.claude.json'),
-      path.join(homeDir, '.claude', 'settings.json')
-    ];
-
-    let configData = null;
-    let configPath = null;
-
-    // Try to read from either config file
-    for (const filepath of configPaths) {
-      try {
-        const fileContent = await fs.readFile(filepath, 'utf8');
-        configData = JSON.parse(fileContent);
-        configPath = filepath;
-        console.log(`✅ Found Claude config at: ${filepath}`);
-        break;
-      } catch (error) {
-        // File doesn't exist or is not valid JSON, try next
-        console.log(`ℹ️ Config not found or invalid at: ${filepath}`);
-      }
-    }
-
-    if (!configData) {
-      return res.json({
-        success: false,
-        message: 'No Claude configuration file found',
-        servers: []
-      });
-    }
-
-    // Extract MCP servers from the config
-    const servers = [];
-
-    // Check for user-scoped MCP servers (at root level)
-    if (configData.mcpServers && typeof configData.mcpServers === 'object' && Object.keys(configData.mcpServers).length > 0) {
-      console.log('🔍 Found user-scoped MCP servers:', Object.keys(configData.mcpServers));
-      for (const [name, config] of Object.entries(configData.mcpServers)) {
-        const server = {
-          id: name,
-          name: name,
-          type: 'stdio', // Default type
-          scope: 'user', // User scope - available across all projects
-          config: {},
-          raw: config // Include raw config for full details
-        };
-
-        // Determine transport type and extract config
-        if (config.command) {
-          server.type = 'stdio';
-          server.config.command = config.command;
-          server.config.args = config.args || [];
-          server.config.env = config.env || {};
-        } else if (config.url) {
-          server.type = config.transport || 'http';
-          server.config.url = config.url;
-          server.config.headers = config.headers || {};
-        }
-
-        servers.push(server);
-      }
-    }
-
-    // Check for local-scoped MCP servers (project-specific)
-    const currentProjectPath = process.cwd();
-
-    // Check under 'projects' key
-    if (configData.projects && configData.projects[currentProjectPath]) {
-      const projectConfig = configData.projects[currentProjectPath];
-      if (projectConfig.mcpServers && typeof projectConfig.mcpServers === 'object' && Object.keys(projectConfig.mcpServers).length > 0) {
-        console.log(`🔍 Found local-scoped MCP servers for ${currentProjectPath}:`, Object.keys(projectConfig.mcpServers));
-        for (const [name, config] of Object.entries(projectConfig.mcpServers)) {
-          const server = {
-            id: `local:${name}`, // Prefix with scope for uniqueness
-            name: name, // Keep original name
-            type: 'stdio', // Default type
-            scope: 'local', // Local scope - only for this project
-            projectPath: currentProjectPath,
-            config: {},
-            raw: config // Include raw config for full details
-          };
-
-          // Determine transport type and extract config
-          if (config.command) {
-            server.type = 'stdio';
-            server.config.command = config.command;
-            server.config.args = config.args || [];
-            server.config.env = config.env || {};
-          } else if (config.url) {
-            server.type = config.transport || 'http';
-            server.config.url = config.url;
-            server.config.headers = config.headers || {};
-          }
-
-          servers.push(server);
-        }
-      }
-    }
-
-    console.log(`📋 Found ${servers.length} MCP servers in config`);
-
-    res.json({
-      success: true,
-      configPath: configPath,
-      servers: servers
-    });
-  } catch (error) {
-    console.error('Error reading Claude config:', error);
-    res.status(500).json({
-      error: 'Failed to read Claude configuration',
-      details: error.message
-    });
-  }
-});
-
-// Helper functions to parse Claude CLI output
-function parseClaudeListOutput(output) {
-  const servers = [];
-  const lines = output.split('\n').filter(line => line.trim());
-
-  for (const line of lines) {
-    // Skip the header line
-    if (line.includes('Checking MCP server health')) continue;
-
-    // Parse lines like "test: test test - ✗ Failed to connect"
-    // or "server-name: command or description - ✓ Connected"
-    if (line.includes(':')) {
-      const colonIndex = line.indexOf(':');
-      const name = line.substring(0, colonIndex).trim();
-
-      // Skip empty names
-      if (!name) continue;
-
-      // Extract the rest after the name
-      const rest = line.substring(colonIndex + 1).trim();
-
-      // Try to extract description and status
-      let description = rest;
-      let status = 'unknown';
-      let type = 'stdio'; // default type
-
-      // Check for status indicators
-      if (rest.includes('✓') || rest.includes('✗')) {
-        const statusMatch = rest.match(/(.*?)\s*-\s*([✓✗].*)$/);
-        if (statusMatch) {
-          description = statusMatch[1].trim();
-          status = statusMatch[2].includes('✓') ? 'connected' : 'failed';
-        }
-      }
-
-      // Try to determine type from description
-      if (description.startsWith('http://') || description.startsWith('https://')) {
-        type = 'http';
-      }
-
-      servers.push({
-        name,
-        type,
-        status: status || 'active',
-        description
-      });
-    }
-  }
-
-  console.log('🔍 Parsed Claude CLI servers:', servers);
-  return servers;
-}
-
-function parseClaudeGetOutput(output) {
-  // Parse the output from 'claude mcp get ' command
-  // This is a simple parser - might need adjustment based on actual output format
-  try {
-    // Try to extract JSON if present
-    const jsonMatch = output.match(/\{[\s\S]*\}/);
-    if (jsonMatch) {
-      return JSON.parse(jsonMatch[0]);
-    }
-
-    // Otherwise, parse as text
-    const server = { raw_output: output };
-    const lines = output.split('\n');
-
-    for (const line of lines) {
-      if (line.includes('Name:')) {
-        server.name = line.split(':')[1]?.trim();
-      } else if (line.includes('Type:')) {
-        server.type = line.split(':')[1]?.trim();
-      } else if (line.includes('Command:')) {
-        server.command = line.split(':')[1]?.trim();
-      } else if (line.includes('URL:')) {
-        server.url = line.split(':')[1]?.trim();
-      }
-    }
-
-    return server;
-  } catch (error) {
-    return { raw_output: output, parse_error: error.message };
-  }
-}
-
-export default router;
\ No newline at end of file
diff
--git a/server/routes/messages.js b/server/routes/messages.js index 8eb14b37..81444d56 100644 --- a/server/routes/messages.js +++ b/server/routes/messages.js @@ -10,7 +10,7 @@ */ import express from 'express'; -import { getProvider, getAllProviders } from '../providers/registry.js'; +import { sessionsService } from '../modules/providers/services/sessions.service.js'; const router = express.Router(); @@ -29,7 +29,7 @@ const router = express.Router(); router.get('/:sessionId/messages', async (req, res) => { try { const { sessionId } = req.params; - const provider = req.query.provider || 'claude'; + const provider = String(req.query.provider || 'claude').trim().toLowerCase(); const projectName = req.query.projectName || ''; const projectPath = req.query.projectPath || ''; const limitParam = req.query.limit; @@ -38,13 +38,13 @@ router.get('/:sessionId/messages', async (req, res) => { : null; const offset = parseInt(req.query.offset || '0', 10); - const adapter = getProvider(provider); - if (!adapter) { - const available = getAllProviders().join(', '); + const availableProviders = sessionsService.listProviderIds(); + if (!availableProviders.includes(provider)) { + const available = availableProviders.join(', '); return res.status(400).json({ error: `Unknown provider: ${provider}. Available: ${available}` }); } - const result = await adapter.fetchHistory(sessionId, { + const result = await sessionsService.fetchHistory(provider, sessionId, { projectName, projectPath, limit, diff --git a/server/routes/settings.js b/server/routes/settings.js index 7eee2454..e2ce0885 100644 --- a/server/routes/settings.js +++ b/server/routes/settings.js @@ -273,4 +273,14 @@ router.post('/push/unsubscribe', async (req, res) => { } }); +// Host OS for UI (e.g. hide Cursor agent when the backend runs on Windows). 
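The messages-route change above normalizes the `provider` query parameter before validating it against the registered provider ids. A minimal sketch of that lookup, with a hard-coded id list standing in for `sessionsService.listProviderIds()` (which resolves ids from the provider registry):

```javascript
// Stand-in for sessionsService.listProviderIds(); the real service
// derives this list from the provider registry.
const KNOWN_PROVIDERS = ['claude', 'codex', 'gemini', 'cursor'];

function listProviderIds() {
  return [...KNOWN_PROVIDERS];
}

// Normalize the raw query value the same way the route does:
// default to 'claude', trim whitespace, lowercase, then check membership.
function resolveProvider(rawQueryValue) {
  const provider = String(rawQueryValue || 'claude').trim().toLowerCase();
  const available = listProviderIds();
  if (!available.includes(provider)) {
    return {
      ok: false,
      error: `Unknown provider: ${provider}. Available: ${available.join(', ')}`,
    };
  }
  return { ok: true, provider };
}
```

Normalizing before the membership check means casing and stray whitespace from the client (`?provider=%20Claude`) no longer produce a spurious 400.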
+router.get('/server-env', async (req, res) => { + try { + res.json({ platform: process.platform }); + } catch (error) { + console.error('Error reading server environment:', error); + res.status(500).json({ error: 'Failed to read server environment' }); + } +}); + export default router; diff --git a/server/routes/taskmaster.js b/server/routes/taskmaster.js index 632d99d5..54f7153a 100644 --- a/server/routes/taskmaster.js +++ b/server/routes/taskmaster.js @@ -13,16 +13,10 @@ import fs from 'fs'; import path from 'path'; import { promises as fsPromises } from 'fs'; import { spawn } from 'child_process'; -import { fileURLToPath } from 'url'; -import { dirname } from 'path'; -import os from 'os'; import { extractProjectDirectory } from '../projects.js'; import { detectTaskMasterMCPServer } from '../utils/mcp-detector.js'; import { broadcastTaskMasterProjectUpdate, broadcastTaskMasterTasksUpdate } from '../utils/taskmaster-websocket.js'; -const __filename = fileURLToPath(import.meta.url); -const __dirname = dirname(__filename); - const router = express.Router(); /** @@ -100,140 +94,6 @@ async function checkTaskMasterInstallation() { }); } -/** - * Detect .taskmaster folder presence in a given project directory - * @param {string} projectPath - Absolute path to project directory - * @returns {Promise} Detection result with status and metadata - */ -async function detectTaskMasterFolder(projectPath) { - try { - const taskMasterPath = path.join(projectPath, '.taskmaster'); - - // Check if .taskmaster directory exists - try { - const stats = await fsPromises.stat(taskMasterPath); - if (!stats.isDirectory()) { - return { - hasTaskmaster: false, - reason: '.taskmaster exists but is not a directory' - }; - } - } catch (error) { - if (error.code === 'ENOENT') { - return { - hasTaskmaster: false, - reason: '.taskmaster directory not found' - }; - } - throw error; - } - - // Check for key TaskMaster files - const keyFiles = [ - 'tasks/tasks.json', - 'config.json' - ]; - - const 
fileStatus = {}; - let hasEssentialFiles = true; - - for (const file of keyFiles) { - const filePath = path.join(taskMasterPath, file); - try { - await fsPromises.access(filePath, fs.constants.R_OK); - fileStatus[file] = true; - } catch (error) { - fileStatus[file] = false; - if (file === 'tasks/tasks.json') { - hasEssentialFiles = false; - } - } - } - - // Parse tasks.json if it exists for metadata - let taskMetadata = null; - if (fileStatus['tasks/tasks.json']) { - try { - const tasksPath = path.join(taskMasterPath, 'tasks/tasks.json'); - const tasksContent = await fsPromises.readFile(tasksPath, 'utf8'); - const tasksData = JSON.parse(tasksContent); - - // Handle both tagged and legacy formats - let tasks = []; - if (tasksData.tasks) { - // Legacy format - tasks = tasksData.tasks; - } else { - // Tagged format - get tasks from all tags - Object.values(tasksData).forEach(tagData => { - if (tagData.tasks) { - tasks = tasks.concat(tagData.tasks); - } - }); - } - - // Calculate task statistics - const stats = tasks.reduce((acc, task) => { - acc.total++; - acc[task.status] = (acc[task.status] || 0) + 1; - - // Count subtasks - if (task.subtasks) { - task.subtasks.forEach(subtask => { - acc.subtotalTasks++; - acc.subtasks = acc.subtasks || {}; - acc.subtasks[subtask.status] = (acc.subtasks[subtask.status] || 0) + 1; - }); - } - - return acc; - }, { - total: 0, - subtotalTasks: 0, - pending: 0, - 'in-progress': 0, - done: 0, - review: 0, - deferred: 0, - cancelled: 0, - subtasks: {} - }); - - taskMetadata = { - taskCount: stats.total, - subtaskCount: stats.subtotalTasks, - completed: stats.done || 0, - pending: stats.pending || 0, - inProgress: stats['in-progress'] || 0, - review: stats.review || 0, - completionPercentage: stats.total > 0 ? 
Math.round((stats.done / stats.total) * 100) : 0, - lastModified: (await fsPromises.stat(tasksPath)).mtime.toISOString() - }; - } catch (parseError) { - console.warn('Failed to parse tasks.json:', parseError.message); - taskMetadata = { error: 'Failed to parse tasks.json' }; - } - } - - return { - hasTaskmaster: true, - hasEssentialFiles, - files: fileStatus, - metadata: taskMetadata, - path: taskMasterPath - }; - - } catch (error) { - console.error('Error detecting TaskMaster folder:', error); - return { - hasTaskmaster: false, - reason: `Error checking directory: ${error.message}` - }; - } -} - -// MCP detection is now handled by the centralized utility - // API Routes /** @@ -271,298 +131,6 @@ router.get('/installation-status', async (req, res) => { } }); -/** - * GET /api/taskmaster/detect/:projectName - * Detect TaskMaster configuration for a specific project - */ -router.get('/detect/:projectName', async (req, res) => { - try { - const { projectName } = req.params; - - // Use the existing extractProjectDirectory function to get actual project path - let projectPath; - try { - projectPath = await extractProjectDirectory(projectName); - } catch (error) { - console.error('Error extracting project directory:', error); - return res.status(404).json({ - error: 'Project path not found', - projectName, - message: error.message - }); - } - - // Verify the project path exists - try { - await fsPromises.access(projectPath, fs.constants.R_OK); - } catch (error) { - return res.status(404).json({ - error: 'Project path not accessible', - projectPath, - projectName, - message: error.message - }); - } - - // Run detection in parallel - const [taskMasterResult, mcpResult] = await Promise.all([ - detectTaskMasterFolder(projectPath), - detectTaskMasterMCPServer() - ]); - - // Determine overall status - let status = 'not-configured'; - if (taskMasterResult.hasTaskmaster && taskMasterResult.hasEssentialFiles) { - if (mcpResult.hasMCPServer && mcpResult.isConfigured) { - status = 
'fully-configured'; - } else { - status = 'taskmaster-only'; - } - } else if (mcpResult.hasMCPServer && mcpResult.isConfigured) { - status = 'mcp-only'; - } - - const responseData = { - projectName, - projectPath, - status, - taskmaster: taskMasterResult, - mcp: mcpResult, - timestamp: new Date().toISOString() - }; - - res.json(responseData); - - } catch (error) { - console.error('TaskMaster detection error:', error); - res.status(500).json({ - error: 'Failed to detect TaskMaster configuration', - message: error.message - }); - } -}); - -/** - * GET /api/taskmaster/detect-all - * Detect TaskMaster configuration for all known projects - * This endpoint works with the existing projects system - */ -router.get('/detect-all', async (req, res) => { - try { - // Import getProjects from the projects module - const { getProjects } = await import('../projects.js'); - const projects = await getProjects(); - - // Run detection for all projects in parallel - const detectionPromises = projects.map(async (project) => { - try { - // Use the project's fullPath if available, otherwise extract the directory - let projectPath; - if (project.fullPath) { - projectPath = project.fullPath; - } else { - try { - projectPath = await extractProjectDirectory(project.name); - } catch (error) { - throw new Error(`Failed to extract project directory: ${error.message}`); - } - } - - const [taskMasterResult, mcpResult] = await Promise.all([ - detectTaskMasterFolder(projectPath), - detectTaskMasterMCPServer() - ]); - - // Determine status - let status = 'not-configured'; - if (taskMasterResult.hasTaskmaster && taskMasterResult.hasEssentialFiles) { - if (mcpResult.hasMCPServer && mcpResult.isConfigured) { - status = 'fully-configured'; - } else { - status = 'taskmaster-only'; - } - } else if (mcpResult.hasMCPServer && mcpResult.isConfigured) { - status = 'mcp-only'; - } - - return { - projectName: project.name, - displayName: project.displayName, - projectPath, - status, - taskmaster: 
taskMasterResult, - mcp: mcpResult - }; - } catch (error) { - return { - projectName: project.name, - displayName: project.displayName, - status: 'error', - error: error.message - }; - } - }); - - const results = await Promise.all(detectionPromises); - - res.json({ - projects: results, - summary: { - total: results.length, - fullyConfigured: results.filter(p => p.status === 'fully-configured').length, - taskmasterOnly: results.filter(p => p.status === 'taskmaster-only').length, - mcpOnly: results.filter(p => p.status === 'mcp-only').length, - notConfigured: results.filter(p => p.status === 'not-configured').length, - errors: results.filter(p => p.status === 'error').length - }, - timestamp: new Date().toISOString() - }); - - } catch (error) { - console.error('Bulk TaskMaster detection error:', error); - res.status(500).json({ - error: 'Failed to detect TaskMaster configuration for projects', - message: error.message - }); - } -}); - -/** - * POST /api/taskmaster/initialize/:projectName - * Initialize TaskMaster in a project (placeholder for future CLI integration) - */ -router.post('/initialize/:projectName', async (req, res) => { - try { - const { projectName } = req.params; - const { rules } = req.body; // Optional rule profiles - - // This will be implemented in a later subtask with CLI integration - res.status(501).json({ - error: 'TaskMaster initialization not yet implemented', - message: 'This endpoint will execute task-master init via CLI in a future update', - projectName, - rules - }); - - } catch (error) { - console.error('TaskMaster initialization error:', error); - res.status(500).json({ - error: 'Failed to initialize TaskMaster', - message: error.message - }); - } -}); - -/** - * GET /api/taskmaster/next/:projectName - * Get the next recommended task using task-master CLI - */ -router.get('/next/:projectName', async (req, res) => { - try { - const { projectName } = req.params; - - // Get project path - let projectPath; - try { - projectPath = await 
extractProjectDirectory(projectName); - } catch (error) { - return res.status(404).json({ - error: 'Project not found', - message: `Project "${projectName}" does not exist` - }); - } - - // Try to execute task-master next command - try { - const { spawn } = await import('child_process'); - - const nextTaskCommand = spawn('task-master', ['next'], { - cwd: projectPath, - stdio: ['pipe', 'pipe', 'pipe'] - }); - - let stdout = ''; - let stderr = ''; - - nextTaskCommand.stdout.on('data', (data) => { - stdout += data.toString(); - }); - - nextTaskCommand.stderr.on('data', (data) => { - stderr += data.toString(); - }); - - await new Promise((resolve, reject) => { - nextTaskCommand.on('close', (code) => { - if (code === 0) { - resolve(); - } else { - reject(new Error(`task-master next failed with code ${code}: ${stderr}`)); - } - }); - - nextTaskCommand.on('error', (error) => { - reject(error); - }); - }); - - // Parse the output - task-master next usually returns JSON - let nextTaskData = null; - if (stdout.trim()) { - try { - nextTaskData = JSON.parse(stdout); - } catch (parseError) { - // If not JSON, treat as plain text - nextTaskData = { message: stdout.trim() }; - } - } - - res.json({ - projectName, - projectPath, - nextTask: nextTaskData, - timestamp: new Date().toISOString() - }); - - } catch (cliError) { - console.warn('Failed to execute task-master CLI:', cliError.message); - - // Fallback to loading tasks and finding next one locally - // Use localhost to bypass proxy for internal server-to-server calls - const tasksResponse = await fetch(`http://localhost:${process.env.SERVER_PORT || process.env.PORT || '3001'}/api/taskmaster/tasks/${encodeURIComponent(projectName)}`, { - headers: { - 'Authorization': req.headers.authorization - } - }); - - if (tasksResponse.ok) { - const tasksData = await tasksResponse.json(); - const nextTask = tasksData.tasks?.find(task => - task.status === 'pending' || task.status === 'in-progress' - ) || null; - - res.json({ - projectName, 
- projectPath, - nextTask, - fallback: true, - message: 'Used fallback method (CLI not available)', - timestamp: new Date().toISOString() - }); - } else { - throw new Error('Failed to load tasks via fallback method'); - } - } - - } catch (error) { - console.error('TaskMaster next task error:', error); - res.status(500).json({ - error: 'Failed to get next task', - message: error.message - }); - } -}); - /** * GET /api/taskmaster/tasks/:projectName * Load actual tasks from .taskmaster/tasks/tasks.json @@ -904,66 +472,6 @@ router.get('/prd/:projectName/:fileName', async (req, res) => { } }); -/** - * DELETE /api/taskmaster/prd/:projectName/:fileName - * Delete a specific PRD file - */ -router.delete('/prd/:projectName/:fileName', async (req, res) => { - try { - const { projectName, fileName } = req.params; - - // Get project path - let projectPath; - try { - projectPath = await extractProjectDirectory(projectName); - } catch (error) { - return res.status(404).json({ - error: 'Project not found', - message: `Project "${projectName}" does not exist` - }); - } - - const filePath = path.join(projectPath, '.taskmaster', 'docs', fileName); - - // Check if file exists - try { - await fsPromises.access(filePath, fs.constants.F_OK); - } catch (error) { - return res.status(404).json({ - error: 'PRD file not found', - message: `File "${fileName}" does not exist` - }); - } - - // Delete the file - try { - await fsPromises.unlink(filePath); - - res.json({ - projectName, - projectPath, - fileName, - message: 'PRD file deleted successfully', - timestamp: new Date().toISOString() - }); - - } catch (deleteError) { - console.error('Failed to delete PRD file:', deleteError); - return res.status(500).json({ - error: 'Failed to delete PRD file', - message: deleteError.message - }); - } - - } catch (error) { - console.error('PRD delete error:', error); - res.status(500).json({ - error: 'Failed to delete PRD file', - message: error.message - }); - } -}); - /** * POST 
/api/taskmaster/init/:projectName * Initialize TaskMaster in a project diff --git a/server/shared/interfaces.ts b/server/shared/interfaces.ts new file mode 100644 index 00000000..954b38a3 --- /dev/null +++ b/server/shared/interfaces.ts @@ -0,0 +1,54 @@ +import type { + FetchHistoryOptions, + FetchHistoryResult, + LLMProvider, + McpScope, + NormalizedMessage, + ProviderAuthStatus, + ProviderMcpServer, + UpsertProviderMcpServerInput, +} from '@/shared/types.js'; + +/** + * Main provider contract for CLI and SDK integrations. + * + * Each concrete provider owns its MCP/auth handlers plus the provider-specific + * logic for converting native events/history into the app's normalized shape. + */ +export interface IProvider { + readonly id: LLMProvider; + readonly mcp: IProviderMcp; + readonly auth: IProviderAuth; + readonly sessions: IProviderSessions; +} + + +/** + * Auth contract for one provider. + */ +export interface IProviderAuth { + /** + * Checks whether the provider is installed and has usable credentials. + */ + getStatus(): Promise; +} + +/** + * MCP contract for one provider. + */ +export interface IProviderMcp { + listServers(options?: { workspacePath?: string }): Promise>; + listServersForScope(scope: McpScope, options?: { workspacePath?: string }): Promise; + upsertServer(input: UpsertProviderMcpServerInput): Promise; + removeServer( + input: { name: string; scope?: McpScope; workspacePath?: string }, + ): Promise<{ removed: boolean; provider: LLMProvider; name: string; scope: McpScope }>; +} + +/** + * Session/history contract for one provider. 
+ */ +export interface IProviderSessions { + normalizeMessage(raw: unknown, sessionId: string | null): NormalizedMessage[]; + fetchHistory(sessionId: string, options?: FetchHistoryOptions): Promise; +} diff --git a/server/shared/types.ts b/server/shared/types.ts new file mode 100644 index 00000000..7fe545c5 --- /dev/null +++ b/server/shared/types.ts @@ -0,0 +1,172 @@ +// -------------- HTTP API response shapes for the server, shared across modules -------------- + +export type ApiSuccessShape = { + success: true; + data: TData; +}; + +export type AnyRecord = Record; + +// --------------------------------------------------------------------------------------------- + +export type LLMProvider = 'claude' | 'codex' | 'gemini' | 'cursor'; + +// --------------------------------------------------------------------------------------------- + +export type MessageKind = + | 'text' + | 'tool_use' + | 'tool_result' + | 'thinking' + | 'stream_delta' + | 'stream_end' + | 'error' + | 'complete' + | 'status' + | 'permission_request' + | 'permission_cancelled' + | 'session_created' + | 'interactive_prompt' + | 'task_notification'; + +/** + * Provider-neutral message event emitted over REST and realtime transports. + * + * Providers all produce their own native SDK/CLI event shapes, so this type keeps + * the common envelope strict while allowing provider-specific details to ride + * along as optional properties. 
+ */ +export type NormalizedMessage = { + id: string; + sessionId: string; + timestamp: string; + provider: LLMProvider; + kind: MessageKind; + role?: 'user' | 'assistant'; + content?: string; + images?: unknown; + toolName?: string; + toolInput?: unknown; + toolId?: string; + toolResult?: { + content?: string; + isError?: boolean; + toolUseResult?: unknown; + }; + isError?: boolean; + text?: string; + tokens?: number; + canInterrupt?: boolean; + requestId?: string; + input?: unknown; + context?: unknown; + reason?: string; + newSessionId?: string; + status?: string; + summary?: string; + tokenBudget?: unknown; + subagentTools?: unknown; + toolUseResult?: unknown; + sequence?: number; + rowid?: number; + [key: string]: unknown; +}; + +/** + * Pagination and provider lookup options for reading persisted session history. + */ +export type FetchHistoryOptions = { + /** Claude project folder name. Required by Claude history lookup. */ + projectName?: string; + /** Absolute workspace path. Required by Cursor to compute its chat hash. */ + projectPath?: string; + /** Page size. `null` means all messages. */ + limit?: number | null; + /** Pagination offset from the newest messages. */ + offset?: number; +}; + +/** + * Provider-neutral history result returned by the unified messages endpoint. + */ +export type FetchHistoryResult = { + messages: NormalizedMessage[]; + total: number; + hasMore: boolean; + offset: number; + limit: number | null; + tokenUsage?: unknown; +}; + +// --------------------------------------------------------------------------------------------- + +export type AppErrorOptions = { + code?: string; + statusCode?: number; + details?: unknown; +}; + +// -------------------- MCP related shared types -------------------- +export type McpScope = 'user' | 'local' | 'project'; + +export type McpTransport = 'stdio' | 'http' | 'sse'; + +/** + * Provider MCP server descriptor normalized for frontend consumption. 
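A provider adapter that fills the `FetchHistoryResult` envelope has to page from the newest messages backwards, per the `FetchHistoryOptions` doc comment. The helper below is hypothetical (no such function exists in the patch) and illustrates one way to satisfy that contract under the assumption that `offset` counts back from the newest message:

```javascript
// Hypothetical pager producing the FetchHistoryResult shape:
// { messages, total, hasMore, offset, limit }.
// `messages` is assumed to be ordered oldest-first; `offset` is counted
// from the newest end, and `limit: null` means "all messages".
function pageHistory(messages, { limit = null, offset = 0 } = {}) {
  const total = messages.length;
  const end = total - offset;                                  // exclusive upper bound
  const start = limit === null ? 0 : Math.max(0, end - limit); // inclusive lower bound
  const page = messages.slice(Math.max(0, start), Math.max(0, end));
  return { messages: page, total, hasMore: start > 0, offset, limit };
}
```

With `limit: 2, offset: 0` this returns the two newest messages and reports `hasMore: true` whenever older messages remain before the page.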
+ */ +export type ProviderMcpServer = { + provider: LLMProvider; + name: string; + scope: McpScope; + transport: McpTransport; + command?: string; + args?: string[]; + env?: Record; + cwd?: string; + url?: string; + headers?: Record; + envVars?: string[]; + bearerTokenEnvVar?: string; + envHttpHeaders?: Record; +}; + +/** + * Shared payload shape for MCP server create/update operations. + */ +export type UpsertProviderMcpServerInput = { + name: string; + scope?: McpScope; + transport: McpTransport; + workspacePath?: string; + command?: string; + args?: string[]; + env?: Record; + cwd?: string; + url?: string; + headers?: Record; + envVars?: string[]; + bearerTokenEnvVar?: string; + envHttpHeaders?: Record; +}; + +// --------------------------------------------------------------------------------------------- + +// -------------------- Provider auth status types -------------------- +/** + * Result of a provider status check (installation + authentication). + * + * installed - Whether the provider's CLI/SDK is available + * provider - Provider id the status belongs to + * authenticated - Whether valid credentials exist + * email - User email or auth method identifier + * method - Auth method (e.g. 
'api_key', 'credentials_file') + * [error] - Error message if not installed or not authenticated + */ +export type ProviderAuthStatus = { + installed: boolean; + provider: LLMProvider; + authenticated: boolean; + email: string | null; + method: string | null; + error?: string; +}; diff --git a/server/shared/utils.ts b/server/shared/utils.ts new file mode 100644 index 00000000..de6aed56 --- /dev/null +++ b/server/shared/utils.ts @@ -0,0 +1,193 @@ + +import { randomUUID } from 'node:crypto'; +import { mkdir, readFile, writeFile } from 'node:fs/promises'; +import path from 'node:path'; + +import type { NextFunction, Request, RequestHandler, Response } from 'express'; + +import type { + AnyRecord, + ApiSuccessShape, + AppErrorOptions, + NormalizedMessage, +} from '@/shared/types.js'; + +type NormalizedMessageInput = + { + kind: NormalizedMessage['kind']; + provider: NormalizedMessage['provider']; + id?: string | null; + sessionId?: string | null; + timestamp?: string | null; + } & Record; + +export function createApiSuccessResponse( + data: TData, +): ApiSuccessShape { + return { + success: true, + data, + }; +} + +export function asyncHandler( + handler: (req: Request, res: Response, next: NextFunction) => Promise +): RequestHandler { + return (req, res, next) => { + void Promise.resolve(handler(req, res, next)).catch(next); + }; +} + +// --------- Global app error class for consistent error handling across the server --------- +export class AppError extends Error { + readonly code: string; + readonly statusCode: number; + readonly details?: unknown; + + constructor(message: string, options: AppErrorOptions = {}) { + super(message); + this.name = 'AppError'; + this.code = options.code ?? 'INTERNAL_ERROR'; + this.statusCode = options.statusCode ?? 
500; + this.details = options.details; + } +} + +// ------------------------------------------------------------------------------------------- + +// ------------------------ Normalized provider message helpers ------------------------ +/** + * Generates a stable unique id for normalized provider messages. + */ +export function generateMessageId(prefix = 'msg'): string { + return `${prefix}_${randomUUID()}`; +} + +/** + * Creates a normalized provider message and fills the shared envelope fields. + * + * Provider adapters and live SDK handlers pass through provider-specific fields, + * while this helper guarantees every emitted event has an id, session id, + * timestamp, and provider marker. + */ +export function createNormalizedMessage(fields: NormalizedMessageInput): NormalizedMessage { + return { + ...fields, + id: fields.id || generateMessageId(fields.kind), + sessionId: fields.sessionId || '', + timestamp: fields.timestamp || new Date().toISOString(), + provider: fields.provider, + }; +} + +// ------------------------------------------------------------------------------------------- + +// ------------------------ The following are mainly for provider MCP runtimes ------------------------ +/** + * Safely narrows an unknown value to a plain object record. + * + * This deliberately rejects arrays, `null`, and primitive values so callers can + * treat the returned value as a JSON-style object map without repeating the same + * defensive shape checks at every config read site. + */ +export const readObjectRecord = (value: any): AnyRecord | null => { + if (!value || typeof value !== 'object' || Array.isArray(value)) { + return null; + } + + return value as AnyRecord; +}; + +/** + * Reads an optional string from unknown input and normalizes empty or whitespace-only + * values to `undefined`. 
+ * + * This is useful when parsing config files where a field may be missing, present + * with the wrong type, or present as an empty string that should be treated as + * "not configured". + */ +export const readOptionalString = (value: unknown): string | undefined => { + if (typeof value !== 'string') { + return undefined; + } + + const normalized = value.trim(); + return normalized.length > 0 ? normalized : undefined; +}; + +/** + * Reads an optional string array from unknown input. + * + * Non-array values are ignored, and any array entries that are not strings are + * filtered out. This lets provider config readers consume loosely shaped JSON/TOML + * data without failing on incidental invalid members. + */ +export const readStringArray = (value: unknown): string[] | undefined => { + if (!Array.isArray(value)) { + return undefined; + } + + return value.filter((entry): entry is string => typeof entry === 'string'); +}; + +/** + * Reads an optional string-to-string map from unknown input. + * + * The function first ensures the source value is a plain object, then keeps only + * keys whose values are strings. If no valid entries remain, it returns `undefined` + * so callers can distinguish "no usable map" from an empty object that was + * intentionally authored downstream. + */ +export const readStringRecord = (value: unknown): Record | undefined => { + const record = readObjectRecord(value); + if (!record) { + return undefined; + } + + const normalized: Record = {}; + for (const [key, entry] of Object.entries(record)) { + if (typeof entry === 'string') { + normalized[key] = entry; + } + } + + return Object.keys(normalized).length > 0 ? normalized : undefined; +}; + +/** + * Reads a JSON config file and guarantees a plain object result. + * + * Missing files are treated as an empty config object so provider-specific MCP + * readers can operate against first-run environments without special-case file + * existence checks. 
If the file exists but contains invalid JSON, the parse error + * is preserved and rethrown. + */ +export const readJsonConfig = async (filePath: string): Promise> => { + try { + const content = await readFile(filePath, 'utf8'); + const parsed = JSON.parse(content) as Record; + return readObjectRecord(parsed) ?? {}; + } catch (error) { + const code = (error as NodeJS.ErrnoException).code; + if (code === 'ENOENT') { + return {}; + } + + throw error; + } +}; + +/** + * Writes a JSON config file with stable, human-readable formatting. + * + * The parent directory is created automatically so callers can persist config into + * provider-specific folders without pre-creating the directory tree. Output always + * ends with a trailing newline to keep the file diff-friendly. + */ +export const writeJsonConfig = async (filePath: string, data: Record): Promise => { + await mkdir(path.dirname(filePath), { recursive: true }); + await writeFile(filePath, `${JSON.stringify(data, null, 2)}\n`, 'utf8'); +}; + +// ------------------------------------------------------------------------------------------- + diff --git a/server/utils/mcp-detector.js b/server/utils/mcp-detector.js index 4439353f..0d9241ae 100644 --- a/server/utils/mcp-detector.js +++ b/server/utils/mcp-detector.js @@ -145,54 +145,3 @@ export async function detectTaskMasterMCPServer() { } } -/** - * Get all configured MCP servers (not just TaskMaster) - * @returns {Promise} All MCP servers configuration - */ -export async function getAllMCPServers() { - try { - const homeDir = os.homedir(); - const configPaths = [ - path.join(homeDir, '.claude.json'), - path.join(homeDir, '.claude', 'settings.json') - ]; - - let configData = null; - let configPath = null; - - // Try to read from either config file - for (const filepath of configPaths) { - try { - const fileContent = await fsPromises.readFile(filepath, 'utf8'); - configData = JSON.parse(fileContent); - configPath = filepath; - break; - } catch (error) { - continue; - } - 
} - - if (!configData) { - return { - hasConfig: false, - servers: {}, - projectServers: {} - }; - } - - return { - hasConfig: true, - configPath, - servers: configData.mcpServers || {}, - projectServers: configData.projects || {} - }; - } catch (error) { - console.error('Error getting all MCP servers:', error); - return { - hasConfig: false, - error: error.message, - servers: {}, - projectServers: {} - }; - } -} \ No newline at end of file diff --git a/src/components/chat/hooks/useChatMessages.ts b/src/components/chat/hooks/useChatMessages.ts index 039b4062..8f417de5 100644 --- a/src/components/chat/hooks/useChatMessages.ts +++ b/src/components/chat/hooks/useChatMessages.ts @@ -12,7 +12,7 @@ import { decodeHtmlEntities, unescapeWithMathProtection, formatUsageLimitText } * that the existing UI components expect. * * Internal/system content (e.g. , ) is already - * filtered server-side by the Claude adapter (server/providers/utils.js). + * filtered server-side by the Claude provider module. 
 */
export function normalizedToChatMessages(messages: NormalizedMessage[]): ChatMessage[] {
  const converted: ChatMessage[] = [];
diff --git a/src/components/chat/view/subcomponents/ProviderSelectionEmptyState.tsx b/src/components/chat/view/subcomponents/ProviderSelectionEmptyState.tsx
index c4f52ede..04569e60 100644
--- a/src/components/chat/view/subcomponents/ProviderSelectionEmptyState.tsx
+++ b/src/components/chat/view/subcomponents/ProviderSelectionEmptyState.tsx
@@ -1,7 +1,8 @@
-import React, { useCallback, useMemo, useState } from "react";
+import React, { useCallback, useEffect, useMemo, useState } from "react";
 import { Check, ChevronDown } from "lucide-react";
 import { useTranslation } from "react-i18next";
+import { useServerPlatform } from "../../../../hooks/useServerPlatform";
 import SessionProviderLogo from "../../../llm-logo-provider/SessionProviderLogo";
 import {
   CLAUDE_MODELS,
@@ -45,11 +46,11 @@ type ProviderSelectionEmptyStateProps = {
   setInput: React.Dispatch<React.SetStateAction<string>>;
 };
 
-interface ProviderGroup {
+type ProviderGroup = {
   id: LLMProvider;
   name: string;
   models: { value: string; label: string }[];
-}
+};
 
 const PROVIDER_GROUPS: ProviderGroup[] = [
   { id: "claude", name: "Anthropic", models: CLAUDE_MODELS.OPTIONS },
@@ -105,7 +106,21 @@ export default function ProviderSelectionEmptyState({
   setInput,
 }: ProviderSelectionEmptyStateProps) {
   const { t } = useTranslation("chat");
+  const { isWindowsServer } = useServerPlatform();
   const [dialogOpen, setDialogOpen] = useState(false);
+
+  const visibleProviderGroups = useMemo(
+    () => (isWindowsServer ?
PROVIDER_GROUPS.filter((p) => p.id !== "cursor") : PROVIDER_GROUPS), + [isWindowsServer], + ); + + useEffect(() => { + if (isWindowsServer && provider === "cursor") { + setProvider("claude"); + localStorage.setItem("selected-provider", "claude"); + } + }, [isWindowsServer, provider, setProvider]); + const nextTaskPrompt = t("tasks.nextTaskPrompt", { defaultValue: "Start the next task", }); @@ -126,13 +141,8 @@ export default function ProviderSelectionEmptyState({ return found?.label || currentModel; }, [provider, currentModel]); - const handleModelSelect = useCallback( + const setModelForProvider = useCallback( (providerId: LLMProvider, modelValue: string) => { - // Set provider - setProvider(providerId); - localStorage.setItem("selected-provider", providerId); - - // Set model for the correct provider if (providerId === "claude") { setClaudeModel(modelValue); localStorage.setItem("claude-model", modelValue); @@ -146,19 +156,25 @@ export default function ProviderSelectionEmptyState({ setCursorModel(modelValue); localStorage.setItem("cursor-model", modelValue); } + }, + [setClaudeModel, setCursorModel, setCodexModel, setGeminiModel], + ); + const handleModelSelect = useCallback( + (providerId: LLMProvider, modelValue: string) => { + setProvider(providerId); + localStorage.setItem("selected-provider", providerId); + setModelForProvider(providerId, modelValue); setDialogOpen(false); setTimeout(() => textareaRef.current?.focus(), 100); }, - [setProvider, setClaudeModel, setCursorModel, setCodexModel, setGeminiModel, textareaRef], + [setProvider, setModelForProvider, textareaRef], ); - /* ── New session — provider + model picker ── */ if (!selectedSession && !currentSessionId) { return (
- {/* Heading */}

{t("providerSelection.title")} @@ -168,7 +184,6 @@ export default function ProviderSelectionEmptyState({

- {/* Model selector trigger — hero card style */} Model Selector - + - {t("providerSelection.noModelsFound", { defaultValue: "No models found." })} + {t("providerSelection.noModelsFound", { + defaultValue: "No models found.", + })} - {PROVIDER_GROUPS.map((group) => ( + {visibleProviderGroups.map((group) => ( {group.models.map((model) => { - const isSelected = - provider === group.id && currentModel === model.value; + const isSelected = provider === group.id && currentModel === model.value; return ( - {/* Ready prompt */}

{ { @@ -263,7 +282,6 @@ export default function ProviderSelectionEmptyState({ }

- {/* Task banner */} {provider && tasksEnabled && isTaskMasterInstalled && (
diff --git a/src/components/mcp/constants.ts b/src/components/mcp/constants.ts
new file mode 100644
index 00000000..4b1a949c
--- /dev/null
+++ b/src/components/mcp/constants.ts
@@ -0,0 +1,58 @@
+import type { McpFormState, McpProvider, McpScope, McpTransport } from './types';
+
+export const MCP_PROVIDER_NAMES: Record<McpProvider, string> = {
+  claude: 'Claude',
+  cursor: 'Cursor',
+  codex: 'Codex',
+  gemini: 'Gemini',
+};
+
+export const MCP_SUPPORTED_SCOPES: Record<McpProvider, McpScope[]> = {
+  claude: ['user', 'project', 'local'],
+  cursor: ['user', 'project'],
+  codex: ['user', 'project'],
+  gemini: ['user', 'project'],
+};
+
+export const MCP_SUPPORTED_TRANSPORTS: Record<McpProvider, McpTransport[]> = {
+  claude: ['stdio', 'http', 'sse'],
+  cursor: ['stdio', 'http'],
+  codex: ['stdio', 'http'],
+  gemini: ['stdio', 'http', 'sse'],
+};
+
+export const MCP_GLOBAL_SUPPORTED_SCOPES: McpScope[] = ['user', 'project'];
+
+export const MCP_GLOBAL_SUPPORTED_TRANSPORTS: McpTransport[] = ['stdio', 'http'];
+
+export const MCP_PROVIDER_BUTTON_CLASSES: Record<McpProvider, string> = {
+  claude: 'bg-purple-600 text-white hover:bg-purple-700',
+  cursor: 'bg-purple-600 text-white hover:bg-purple-700',
+  codex: 'bg-gray-800 text-white hover:bg-gray-900 dark:bg-gray-700 dark:hover:bg-gray-600',
+  gemini: 'bg-blue-600 text-white hover:bg-blue-700',
+};
+
+export const MCP_SUPPORTS_WORKING_DIRECTORY: Record<McpProvider, boolean> = {
+  claude: false,
+  cursor: false,
+  codex: true,
+  gemini: true,
+};
+
+export const DEFAULT_MCP_FORM: McpFormState = {
+  name: '',
+  scope: 'user',
+  workspacePath: '',
+  transport: 'stdio',
+  command: '',
+  args: [],
+  env: {},
+  cwd: '',
+  url: '',
+  headers: {},
+  envVars: [],
+  bearerTokenEnvVar: '',
+  envHttpHeaders: {},
+  importMode: 'form',
+  jsonInput: '',
+};
diff --git a/src/components/mcp/hooks/useMcpServerForm.ts b/src/components/mcp/hooks/useMcpServerForm.ts
new file mode 100644
index 00000000..52809cbe
--- /dev/null
+++ b/src/components/mcp/hooks/useMcpServerForm.ts
@@ -0,0 +1,248 @@
+import { type FormEvent, useEffect, useMemo, useState } from 'react';
+import { useTranslation } from 'react-i18next';
+
+import { DEFAULT_MCP_FORM, MCP_SUPPORTED_SCOPES, MCP_SUPPORTED_TRANSPORTS } from '../constants';
+import type { McpFormState, McpProject, McpProvider, McpScope, McpTransport, ProviderMcpServer } from '../types';
+import {
+  formatKeyValueLines,
+  getErrorMessage,
+  getProjectPath,
+  isMcpTransport,
+  parseKeyValueLines,
+  parseListLines,
+} from '../utils/mcpFormatting';
+
+type UseMcpServerFormArgs = {
+  provider: McpProvider;
+  isOpen: boolean;
+  editingServer: ProviderMcpServer | null;
+  currentProjects: McpProject[];
+  supportedScopes?: McpScope[];
+  supportedTransports?: McpTransport[];
+  unsupportedTransportMessage?: (transport: McpTransport) => string;
+  onSubmit: (formData: McpFormState, editingServer: ProviderMcpServer | null) => Promise<void>;
+};
+
+type MultilineFieldText = {
+  args: string;
+  env: string;
+  headers: string;
+  envVars: string;
+  envHttpHeaders: string;
+};
+
+const cloneDefaultForm = (
+  provider: McpProvider,
+  supportedScopes = MCP_SUPPORTED_SCOPES[provider],
+  supportedTransports = MCP_SUPPORTED_TRANSPORTS[provider],
+): McpFormState => ({
+  ...DEFAULT_MCP_FORM,
+  scope: supportedScopes[0],
+  transport: supportedTransports[0],
+  args: [],
+  env: {},
+  headers: {},
+  envVars: [],
+  envHttpHeaders: {},
+});
+
+const createFormStateFromServer = (
+  provider: McpProvider,
+  server: ProviderMcpServer,
+  supportedScopes?: McpScope[],
+  supportedTransports?: McpTransport[],
+): McpFormState => ({
+  ...cloneDefaultForm(provider, supportedScopes, supportedTransports),
+  name: server.name,
+  scope: server.scope,
+  workspacePath: server.workspacePath || '',
+  transport: server.transport,
+  command: server.command || '',
+  args: server.args || [],
+  env: server.env || {},
+  cwd: server.cwd || '',
+  url: server.url || '',
+  headers: server.headers || {},
+  envVars: server.envVars || [],
+  bearerTokenEnvVar: server.bearerTokenEnvVar || '',
+  envHttpHeaders: server.envHttpHeaders || {},
+});
+
+const
createMultilineTextFromForm = (formData: McpFormState): MultilineFieldText => ({ + args: formData.args.join('\n'), + env: formatKeyValueLines(formData.env), + headers: formatKeyValueLines(formData.headers), + envVars: formData.envVars.join('\n'), + envHttpHeaders: formatKeyValueLines(formData.envHttpHeaders), +}); + +const normalizeScope = (supportedScopes: McpScope[], value: McpScope): McpScope => ( + supportedScopes.includes(value) ? value : supportedScopes[0] +); + +const normalizeTransport = (supportedTransports: McpTransport[], value: McpTransport): McpTransport => ( + supportedTransports.includes(value) ? value : supportedTransports[0] +); + +export function useMcpServerForm({ + provider, + isOpen, + editingServer, + currentProjects, + supportedScopes = MCP_SUPPORTED_SCOPES[provider], + supportedTransports = MCP_SUPPORTED_TRANSPORTS[provider], + unsupportedTransportMessage, + onSubmit, +}: UseMcpServerFormArgs) { + const { t } = useTranslation('settings'); + const [formData, setFormData] = useState(() => ( + cloneDefaultForm(provider, supportedScopes, supportedTransports) + )); + const [multilineText, setMultilineText] = useState(() => ( + createMultilineTextFromForm(cloneDefaultForm(provider, supportedScopes, supportedTransports)) + )); + const [jsonValidationError, setJsonValidationError] = useState(''); + const [isSubmitting, setIsSubmitting] = useState(false); + + const isEditing = Boolean(editingServer); + + useEffect(() => { + if (!isOpen) { + return; + } + + setJsonValidationError(''); + if (editingServer) { + const nextFormData = createFormStateFromServer(provider, editingServer, supportedScopes, supportedTransports); + setFormData(nextFormData); + setMultilineText(createMultilineTextFromForm(nextFormData)); + return; + } + + const nextFormData = cloneDefaultForm(provider, supportedScopes, supportedTransports); + setFormData(nextFormData); + setMultilineText(createMultilineTextFromForm(nextFormData)); + }, [editingServer, isOpen, provider, 
supportedScopes, supportedTransports]);
+
+  const projectOptions = useMemo(() => (
+    currentProjects
+      .map((project) => ({
+        value: getProjectPath(project),
+        label: project.displayName || project.name,
+      }))
+      .filter((project) => project.value)
+  ), [currentProjects]);
+
+  const updateForm = <K extends keyof McpFormState>(key: K, value: McpFormState[K]) => {
+    setFormData((prev) => ({ ...prev, [key]: value }));
+  };
+
+  const updateScope = (scope: McpScope) => {
+    setFormData((prev) => ({
+      ...prev,
+      scope: normalizeScope(supportedScopes, scope),
+      workspacePath: scope === 'user' ? '' : prev.workspacePath,
+    }));
+  };
+
+  const updateTransport = (transport: McpTransport) => {
+    setFormData((prev) => ({ ...prev, transport: normalizeTransport(supportedTransports, transport) }));
+  };
+
+  const validateJsonInput = (value: string) => {
+    if (!value.trim()) {
+      setJsonValidationError('');
+      return;
+    }
+
+    try {
+      const parsed = JSON.parse(value) as { type?: unknown; transport?: unknown; command?: unknown; url?: unknown };
+      const transportInput = parsed.transport || parsed.type;
+      if (!isMcpTransport(transportInput)) {
+        setJsonValidationError(t('mcpForm.validation.missingType'));
+      } else if (!supportedTransports.includes(transportInput)) {
+        setJsonValidationError(
+          unsupportedTransportMessage?.(transportInput) ??
`${provider} does not support ${transportInput} MCP servers`,
+        );
+      } else if (transportInput === 'stdio' && !parsed.command) {
+        setJsonValidationError(t('mcpForm.validation.stdioRequiresCommand'));
+      } else if ((transportInput === 'http' || transportInput === 'sse') && !parsed.url) {
+        setJsonValidationError(t('mcpForm.validation.httpRequiresUrl', { type: transportInput }));
+      } else {
+        setJsonValidationError('');
+      }
+    } catch {
+      setJsonValidationError(t('mcpForm.validation.invalidJson'));
+    }
+  };
+
+  const updateJsonInput = (value: string) => {
+    setFormData((prev) => ({ ...prev, jsonInput: value }));
+    validateJsonInput(value);
+  };
+
+  const updateMultilineText = <K extends keyof MultilineFieldText>(key: K, value: MultilineFieldText[K]) => {
+    setMultilineText((prev) => ({ ...prev, [key]: value }));
+  };
+
+  const createSubmitFormData = (): McpFormState => ({
+    ...formData,
+    args: parseListLines(multilineText.args),
+    env: parseKeyValueLines(multilineText.env),
+    headers: parseKeyValueLines(multilineText.headers),
+    envVars: parseListLines(multilineText.envVars),
+    envHttpHeaders: parseKeyValueLines(multilineText.envHttpHeaders),
+  });
+
+  const canSubmit = useMemo(() => {
+    if (!formData.name.trim()) {
+      return false;
+    }
+
+    if (formData.scope !== 'user' && !formData.workspacePath.trim()) {
+      return false;
+    }
+
+    if (formData.importMode === 'json') {
+      return Boolean(formData.jsonInput.trim()) && !jsonValidationError;
+    }
+
+    if (formData.transport === 'stdio') {
+      return Boolean(formData.command.trim());
+    }
+
+    return Boolean(formData.url.trim());
+  }, [formData, jsonValidationError]);
+
+  const handleSubmit = async (event: FormEvent) => {
+    event.preventDefault();
+    setIsSubmitting(true);
+
+    try {
+      // Textareas keep raw strings while editing so users can create blank
+      // lines or partial KEY=value entries without the form rewriting them.
+      await onSubmit(createSubmitFormData(), editingServer);
+    } catch (error) {
+      alert(`Error: ${getErrorMessage(error)}`);
+    } finally {
+      setIsSubmitting(false);
+    }
+  };
+
+  return {
+    formData,
+    setFormData,
+    multilineText,
+    projectOptions,
+    isEditing,
+    isSubmitting,
+    jsonValidationError,
+    canSubmit,
+    updateForm,
+    updateScope,
+    updateTransport,
+    updateJsonInput,
+    updateMultilineText,
+    handleSubmit,
+  };
+}
diff --git a/src/components/mcp/hooks/useMcpServers.ts b/src/components/mcp/hooks/useMcpServers.ts
new file mode 100644
index 00000000..57ed81cc
--- /dev/null
+++ b/src/components/mcp/hooks/useMcpServers.ts
@@ -0,0 +1,535 @@
+import { useCallback, useEffect, useMemo, useRef, useState } from 'react';
+
+import { authenticatedFetch } from '../../../utils/api';
+import { MCP_GLOBAL_SUPPORTED_TRANSPORTS, MCP_PROVIDER_NAMES, MCP_SUPPORTED_SCOPES } from '../constants';
+import type {
+  ApiResponse,
+  GlobalMcpServerResult,
+  McpFormState,
+  McpProject,
+  McpProvider,
+  McpScope,
+  McpTransport,
+  ProviderMcpServer,
+  UpsertProviderMcpServerPayload,
+} from '../types';
+import {
+  createMcpPayloadFromForm,
+  getErrorMessage,
+  getProjectPath,
+  isMcpScope,
+  isMcpTransport,
+} from '../utils/mcpFormatting';
+
+type ProviderMcpServerResponse = {
+  provider: McpProvider;
+  scope: McpScope;
+  servers: Array<Partial<ProviderMcpServer>>;
+};
+
+type GlobalMcpServerResponse = {
+  results: GlobalMcpServerResult[];
+};
+
+type ProjectTarget = {
+  name: string;
+  displayName: string;
+  path: string;
+};
+
+type McpServersCacheEntry = {
+  servers: ProviderMcpServer[];
+  updatedAt: number;
+};
+
+type ScopedProjectRequest = {
+  scope: McpScope;
+  project: ProjectTarget;
+};
+
+const MCP_CACHE_TTL_MS = 30_000;
+const mcpServersCache = new Map<string, McpServersCacheEntry>();
+
+// Settings users often switch between provider tabs repeatedly. A short module
+// cache prevents those tab switches from refetching every project config file.
+
+const toResponseJson = async <T>(response: Response): Promise<T> => response.json() as Promise<T>;
+
+const getApiErrorMessage = (payload: unknown, fallback: string): string => {
+  if (!payload || typeof payload !== 'object') {
+    return fallback;
+  }
+
+  const record = payload as Record<string, unknown>;
+  const error = record.error;
+  if (error && typeof error === 'object') {
+    const message = (error as Record<string, unknown>).message;
+    if (typeof message === 'string' && message.trim()) {
+      return message;
+    }
+  }
+
+  if (typeof error === 'string' && error.trim()) {
+    return error;
+  }
+
+  const details = record.details;
+  if (typeof details === 'string' && details.trim()) {
+    return details;
+  }
+
+  return fallback;
+};
+
+const normalizeTransport = (value: unknown, fallback: McpTransport = 'stdio'): McpTransport => (
+  isMcpTransport(value) ? value : fallback
+);
+
+const normalizeScope = (value: unknown, fallback: McpScope): McpScope => (
+  isMcpScope(value) ? value : fallback
+);
+
+const normalizeServer = (
+  provider: McpProvider,
+  scope: McpScope,
+  server: Partial<ProviderMcpServer>,
+  project?: ProjectTarget,
+): ProviderMcpServer => {
+  const transport = normalizeTransport(server.transport, server.url ? 'http' : 'stdio');
+  return {
+    provider,
+    name: String(server.name ?? ''),
+    scope: normalizeScope(server.scope, scope),
+    transport,
+    command: server.command,
+    args: server.args ?? [],
+    env: server.env ?? {},
+    cwd: server.cwd,
+    url: server.url,
+    headers: server.headers ?? {},
+    envVars: server.envVars ?? [],
+    bearerTokenEnvVar: server.bearerTokenEnvVar,
+    envHttpHeaders: server.envHttpHeaders ??
{},
+    workspacePath: project?.path || server.workspacePath,
+    projectName: project?.name || server.projectName,
+    projectDisplayName: project?.displayName || server.projectDisplayName,
+  };
+};
+
+const createProjectTargets = (projects: McpProject[]): ProjectTarget[] => {
+  const seen = new Set<string>();
+  return projects.reduce<ProjectTarget[]>((acc, project) => {
+    const projectPath = getProjectPath(project);
+    if (!projectPath || seen.has(projectPath)) {
+      return acc;
+    }
+
+    seen.add(projectPath);
+    acc.push({
+      name: project.name,
+      displayName: project.displayName || project.name,
+      path: projectPath,
+    });
+    return acc;
+  }, []);
+};
+
+const fetchProviderScopeServers = async (
+  provider: McpProvider,
+  scope: McpScope,
+  project?: ProjectTarget,
+): Promise<ProviderMcpServer[]> => {
+  const params = new URLSearchParams({ scope });
+  if (project?.path) {
+    params.set('workspacePath', project.path);
+  }
+
+  const response = await authenticatedFetch(`/api/providers/${provider}/mcp/servers?${params.toString()}`);
+  const data = await toResponseJson<ApiResponse<ProviderMcpServerResponse>>(response);
+
+  if (!response.ok || !data.success) {
+    throw new Error(getApiErrorMessage(data, `Failed to load ${provider} MCP servers`));
+  }
+
+  return (data.data.servers || []).map((server) => normalizeServer(provider, scope, server, project));
+};
+
+const deleteProviderServer = async (
+  provider: McpProvider,
+  server: ProviderMcpServer,
+): Promise<void> => {
+  const params = new URLSearchParams({ scope: server.scope });
+  if (server.workspacePath) {
+    params.set('workspacePath', server.workspacePath);
+  }
+
+  const response = await authenticatedFetch(
+    `/api/providers/${provider}/mcp/servers/${encodeURIComponent(server.name)}?${params.toString()}`,
+    { method: 'DELETE' },
+  );
+  const data = await toResponseJson<ApiResponse<unknown>>(response);
+
+  if (!response.ok || !data.success) {
+    throw new Error(getApiErrorMessage(data, 'Failed to delete MCP server'));
+  }
+};
+
+const saveProviderServer = async (
+  provider: McpProvider,
+  payload: UpsertProviderMcpServerPayload,
+):
Promise<void> => {
+  const response = await authenticatedFetch(`/api/providers/${provider}/mcp/servers`, {
+    method: 'POST',
+    body: JSON.stringify(payload),
+  });
+  const data = await toResponseJson<ApiResponse<unknown>>(response);
+
+  if (!response.ok || !data.success) {
+    throw new Error(getApiErrorMessage(data, 'Failed to save MCP server'));
+  }
+};
+
+const saveGlobalServer = async (
+  payload: UpsertProviderMcpServerPayload,
+): Promise<GlobalMcpServerResult[]> => {
+  const response = await authenticatedFetch('/api/providers/mcp/servers/global', {
+    method: 'POST',
+    body: JSON.stringify(payload),
+  });
+  const data = await toResponseJson<ApiResponse<GlobalMcpServerResponse>>(response);
+
+  if (!response.ok || !data.success) {
+    throw new Error(getApiErrorMessage(data, 'Failed to save MCP server to all providers'));
+  }
+
+  return data.data.results || [];
+};
+
+const didServerIdentityChange = (
+  editingServer: ProviderMcpServer,
+  payload: UpsertProviderMcpServerPayload,
+): boolean => (
+  editingServer.name !== payload.name
+  || editingServer.scope !== payload.scope
+  || (editingServer.workspacePath || '') !== (payload.workspacePath || '')
+);
+
+const getServerIdentity = (server: ProviderMcpServer): string => (
+  `${server.provider}:${server.scope}:${server.workspacePath || 'global'}:${server.name}`
+);
+
+const getCacheKey = (provider: McpProvider, projects: ProjectTarget[]): string => {
+  const projectKey = projects.map((project) => project.path).sort().join('|');
+  return `${provider}:${projectKey}`;
+};
+
+const formatGlobalAddFailures = (failures: GlobalMcpServerResult[]): string => (
+  failures
+    .map((failure) => `${MCP_PROVIDER_NAMES[failure.provider]}: ${failure.error || 'Unknown error'}`)
+    .join('; ')
+);
+
+const sortServers = (servers: ProviderMcpServer[]): ProviderMcpServer[] => {
+  const scopeOrder: Record<McpScope, number> = {
+    user: 0,
+    project: 1,
+    local: 2,
+  };
+
+  return [...servers].sort((left, right) => {
+    const scopeDelta = scopeOrder[left.scope] - scopeOrder[right.scope];
+    if (scopeDelta !== 0) {
+      return scopeDelta;
+    }
+
+    const
projectDelta = (left.projectDisplayName || '').localeCompare(right.projectDisplayName || '');
+    if (projectDelta !== 0) {
+      return projectDelta;
+    }
+
+    return left.name.localeCompare(right.name);
+  });
+};
+
+const mergeServers = (
+  existingServers: ProviderMcpServer[],
+  incomingServers: ProviderMcpServer[],
+): ProviderMcpServer[] => {
+  const serversById = new Map<string, ProviderMcpServer>();
+  existingServers.forEach((server) => {
+    serversById.set(getServerIdentity(server), server);
+  });
+  incomingServers.forEach((server) => {
+    serversById.set(getServerIdentity(server), server);
+  });
+
+  return sortServers([...serversById.values()]);
+};
+
+const replaceScopedServers = (
+  existingServers: ProviderMcpServer[],
+  incomingServers: ProviderMcpServer[],
+  scope: McpScope,
+  workspacePath?: string,
+): ProviderMcpServer[] => {
+  const remainingServers = existingServers.filter((server) => (
+    server.scope !== scope || (server.workspacePath || '') !== (workspacePath || '')
+  ));
+
+  return mergeServers(remainingServers, incomingServers);
+};
+
+type UseMcpServersArgs = {
+  selectedProvider: McpProvider;
+  currentProjects: McpProject[];
+};
+
+export function useMcpServers({ selectedProvider, currentProjects }: UseMcpServersArgs) {
+  const [servers, setServers] = useState<ProviderMcpServer[]>([]);
+  const [isLoading, setIsLoading] = useState(false);
+  const [loadError, setLoadError] = useState<string | null>(null);
+  const [deleteError, setDeleteError] = useState<string | null>(null);
+  const [saveStatus, setSaveStatus] = useState<'success' | 'error' | null>(null);
+  const [isLoadingProjectScopes, setIsLoadingProjectScopes] = useState(false);
+  const [isFormOpen, setIsFormOpen] = useState(false);
+  const [isGlobalFormOpen, setIsGlobalFormOpen] = useState(false);
+  const [editingServer, setEditingServer] = useState<ProviderMcpServer | null>(null);
+  const activeLoadIdRef = useRef(0);
+
+  const projectTargets = useMemo(() => createProjectTargets(currentProjects), [currentProjects]);
+  const cacheKey = useMemo(() => getCacheKey(selectedProvider, projectTargets),
[projectTargets, selectedProvider]); + + const refreshServers = useCallback(async (options: { force?: boolean } = {}) => { + const loadId = activeLoadIdRef.current + 1; + activeLoadIdRef.current = loadId; + + const cachedEntry = mcpServersCache.get(cacheKey); + const canUseCache = !options.force && cachedEntry && Date.now() - cachedEntry.updatedAt < MCP_CACHE_TTL_MS; + if (canUseCache) { + setServers(cachedEntry.servers); + setIsLoading(false); + setIsLoadingProjectScopes(false); + setLoadError(null); + return; + } + + if (cachedEntry && !options.force) { + setServers(cachedEntry.servers); + } else { + setServers([]); + } + + setIsLoading(!cachedEntry); + setIsLoadingProjectScopes(false); + setLoadError(null); + + const supportedScopes = MCP_SUPPORTED_SCOPES[selectedProvider]; + let nextServers: ProviderMcpServer[] = cachedEntry && !options.force ? cachedEntry.servers : []; + let firstError: string | null = null; + + // Load the global/user scope first so the visible list can paint quickly. + // Project and local scopes can involve many project config files, so they + // are appended below as background requests instead of blocking this render. 
+ if (supportedScopes.includes('user')) { + try { + const userServers = await fetchProviderScopeServers(selectedProvider, 'user'); + if (activeLoadIdRef.current !== loadId) { + return; + } + + nextServers = replaceScopedServers(nextServers, userServers, 'user'); + setServers(sortServers(nextServers)); + } catch (error) { + firstError = getErrorMessage(error); + } + } + + if (activeLoadIdRef.current !== loadId) { + return; + } + + setIsLoading(false); + + const projectScopeRequests: ScopedProjectRequest[] = []; + projectTargets.forEach((project) => { + if (supportedScopes.includes('project')) { + projectScopeRequests.push({ scope: 'project', project }); + } + + if (supportedScopes.includes('local')) { + projectScopeRequests.push({ scope: 'local', project }); + } + }); + + if (projectScopeRequests.length === 0) { + const finalServers = sortServers(nextServers); + mcpServersCache.set(cacheKey, { servers: finalServers, updatedAt: Date.now() }); + setLoadError(firstError); + return; + } + + setIsLoadingProjectScopes(true); + + // Update the UI as each project scope resolves. This avoids waiting for the + // slowest project before showing servers from faster config files. 
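The `activeLoadIdRef` checks threaded through `refreshServers` above implement a stale-response guard: each refresh takes a new id, and a result is applied only if no newer refresh started in the meantime. The pattern in isolation (hypothetical standalone names, synchronous for clarity):

```typescript
// Stale-response guard sketch: begin() issues a monotonically increasing load
// id; apply() drops any result whose load has been superseded by a newer one.
function makeRefreshGuard() {
  let activeLoadId = 0;
  return {
    begin(): number {
      activeLoadId += 1;
      return activeLoadId;
    },
    apply(loadId: number, result: string, sink: string[]): boolean {
      if (activeLoadId !== loadId) {
        return false; // stale: a newer refresh started after this one
      }
      sink.push(result);
      return true;
    },
  };
}
```

This is why switching provider tabs mid-load cannot paint an older provider's servers: the in-flight responses from the previous tab fail the id check and are discarded.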
+ await Promise.all(projectScopeRequests.map(async ({ scope, project }) => { + try { + const scopedServers = await fetchProviderScopeServers(selectedProvider, scope, project); + if (activeLoadIdRef.current !== loadId) { + return; + } + + nextServers = replaceScopedServers(nextServers, scopedServers, scope, project.path); + setServers(nextServers); + } catch (error) { + firstError = firstError || getErrorMessage(error); + } + })); + + if (activeLoadIdRef.current !== loadId) { + return; + } + + const finalServers = sortServers(nextServers); + mcpServersCache.set(cacheKey, { servers: finalServers, updatedAt: Date.now() }); + setServers(finalServers); + setLoadError(firstError); + setIsLoadingProjectScopes(false); + }, [cacheKey, projectTargets, selectedProvider]); + + const openForm = useCallback((server?: ProviderMcpServer) => { + setEditingServer(server || null); + setIsFormOpen(true); + }, []); + + const closeForm = useCallback(() => { + setIsFormOpen(false); + setEditingServer(null); + }, []); + + const openGlobalForm = useCallback(() => { + setIsGlobalFormOpen(true); + }, []); + + const closeGlobalForm = useCallback(() => { + setIsGlobalFormOpen(false); + }, []); + + const submitForm = useCallback( + async (formData: McpFormState, serverBeingEdited: ProviderMcpServer | null) => { + const payload = createMcpPayloadFromForm(selectedProvider, formData); + if (payload.scope !== 'user' && !payload.workspacePath) { + throw new Error('Select a project for project-scoped MCP servers'); + } + + await saveProviderServer(selectedProvider, payload); + + if (serverBeingEdited && didServerIdentityChange(serverBeingEdited, payload)) { + await deleteProviderServer(selectedProvider, serverBeingEdited); + } + + mcpServersCache.delete(cacheKey); + await refreshServers({ force: true }); + setSaveStatus('success'); + closeForm(); + }, + [cacheKey, closeForm, refreshServers, selectedProvider], + ); + + const submitGlobalForm = useCallback( + async (formData: McpFormState) => { + const 
payload = createMcpPayloadFromForm(selectedProvider, formData, { + supportedTransports: MCP_GLOBAL_SUPPORTED_TRANSPORTS, + supportsWorkingDirectory: false, + includeProviderSpecificFields: false, + unsupportedTransportMessage: (transport) => + `Add MCP Server supports only stdio and http across all providers, not ${transport}.`, + }); + + if (payload.scope === 'local') { + throw new Error('Add MCP Server supports only user or project scope across all providers.'); + } + + if (payload.scope !== 'user' && !payload.workspacePath) { + throw new Error('Select a project for project-scoped MCP servers'); + } + + // The global endpoint updates every provider, so clear every provider + // cache entry instead of only the currently visible provider tab. + const results = await saveGlobalServer(payload); + mcpServersCache.clear(); + await refreshServers({ force: true }); + + const failures = results.filter((result) => !result.created); + if (failures.length > 0) { + setSaveStatus('error'); + throw new Error(`Failed to add MCP server to all providers. 
${formatGlobalAddFailures(failures)}`); + } + + setSaveStatus('success'); + closeGlobalForm(); + }, + [closeGlobalForm, refreshServers, selectedProvider], + ); + + const deleteServer = useCallback( + async (server: ProviderMcpServer) => { + if (!window.confirm('Are you sure you want to delete this MCP server?')) { + return; + } + + setDeleteError(null); + try { + await deleteProviderServer(selectedProvider, server); + mcpServersCache.delete(cacheKey); + await refreshServers({ force: true }); + setSaveStatus('success'); + } catch (error) { + setDeleteError(getErrorMessage(error)); + setSaveStatus('error'); + } + }, + [cacheKey, refreshServers, selectedProvider], + ); + + useEffect(() => { + void refreshServers(); + }, [refreshServers]); + + useEffect(() => { + setIsFormOpen(false); + setIsGlobalFormOpen(false); + setEditingServer(null); + setDeleteError(null); + setSaveStatus(null); + }, [selectedProvider]); + + useEffect(() => { + if (saveStatus === null) { + return; + } + + const timer = window.setTimeout(() => setSaveStatus(null), 2000); + return () => window.clearTimeout(timer); + }, [saveStatus]); + + return { + servers, + isLoading, + isLoadingProjectScopes, + loadError, + deleteError, + saveStatus, + isFormOpen, + isGlobalFormOpen, + editingServer, + openForm, + openGlobalForm, + closeForm, + closeGlobalForm, + submitForm, + submitGlobalForm, + deleteServer, + refreshServers, + }; +} diff --git a/src/components/mcp/index.ts b/src/components/mcp/index.ts new file mode 100644 index 00000000..33439517 --- /dev/null +++ b/src/components/mcp/index.ts @@ -0,0 +1 @@ +export { default as McpServers } from './view/McpServers'; \ No newline at end of file diff --git a/src/components/mcp/types.ts b/src/components/mcp/types.ts new file mode 100644 index 00000000..810258e9 --- /dev/null +++ b/src/components/mcp/types.ts @@ -0,0 +1,90 @@ +import type { LLMProvider } from '../../types/app'; + +export type McpProvider = LLMProvider; +export type McpScope = 'user' | 'local' | 
'project';
+export type McpTransport = 'stdio' | 'http' | 'sse';
+export type McpImportMode = 'form' | 'json';
+export type McpFormMode = 'provider' | 'global';
+export type KeyValueMap = Record<string, string>;
+
+export type McpProject = {
+  name: string;
+  displayName?: string;
+  fullPath?: string;
+  path?: string;
+};
+
+export type ProviderMcpServer = {
+  provider: McpProvider;
+  name: string;
+  scope: McpScope;
+  transport: McpTransport;
+  command?: string;
+  args?: string[];
+  env?: KeyValueMap;
+  cwd?: string;
+  url?: string;
+  headers?: KeyValueMap;
+  envVars?: string[];
+  bearerTokenEnvVar?: string;
+  envHttpHeaders?: KeyValueMap;
+  workspacePath?: string;
+  projectName?: string;
+  projectDisplayName?: string;
+};
+
+export type McpFormState = {
+  name: string;
+  scope: McpScope;
+  workspacePath: string;
+  transport: McpTransport;
+  command: string;
+  args: string[];
+  env: KeyValueMap;
+  cwd: string;
+  url: string;
+  headers: KeyValueMap;
+  envVars: string[];
+  bearerTokenEnvVar: string;
+  envHttpHeaders: KeyValueMap;
+  importMode: McpImportMode;
+  jsonInput: string;
+};
+
+export type UpsertProviderMcpServerPayload = {
+  name: string;
+  scope: McpScope;
+  transport: McpTransport;
+  workspacePath?: string;
+  command?: string;
+  args?: string[];
+  env?: KeyValueMap;
+  cwd?: string;
+  url?: string;
+  headers?: KeyValueMap;
+  envVars?: string[];
+  bearerTokenEnvVar?: string;
+  envHttpHeaders?: KeyValueMap;
+};
+
+export type GlobalMcpServerResult = {
+  provider: McpProvider;
+  created: boolean;
+  error?: string;
+};
+
+export type ApiSuccessResponse<T> = {
+  success: true;
+  data: T;
+};
+
+export type ApiErrorResponse = {
+  success: false;
+  error?: {
+    code?: string;
+    message?: string;
+    details?: unknown;
+  };
+};
+
+export type ApiResponse<T> = ApiSuccessResponse<T> | ApiErrorResponse;
diff --git a/src/components/mcp/utils/mcpFormatting.ts b/src/components/mcp/utils/mcpFormatting.ts
new file mode 100644
index 00000000..4184c234
--- /dev/null
+++
b/src/components/mcp/utils/mcpFormatting.ts @@ -0,0 +1,184 @@ +import { MCP_SUPPORTED_TRANSPORTS, MCP_SUPPORTS_WORKING_DIRECTORY } from '../constants'; +import type { + KeyValueMap, + McpFormState, + McpProvider, + McpScope, + McpTransport, + UpsertProviderMcpServerPayload, +} from '../types'; + +type CreateMcpPayloadOptions = { + supportedTransports?: McpTransport[]; + supportsWorkingDirectory?: boolean; + includeProviderSpecificFields?: boolean; + unsupportedTransportMessage?: (transport: McpTransport) => string; +}; + +const isRecord = (value: unknown): value is Record<string, unknown> => ( + Boolean(value) && typeof value === 'object' && !Array.isArray(value) +); + +const readString = (value: unknown): string | undefined => ( + typeof value === 'string' && value.trim() ? value.trim() : undefined +); + +const readStringArray = (value: unknown): string[] | undefined => ( + Array.isArray(value) ? value.filter((entry): entry is string => typeof entry === 'string') : undefined +); + +const readStringRecord = (value: unknown): KeyValueMap | undefined => { + if (!isRecord(value)) { + return undefined; + } + + const normalized: KeyValueMap = {}; + Object.entries(value).forEach(([key, entry]) => { + if (typeof entry === 'string') { + normalized[key] = entry; + } + }); + + return Object.keys(normalized).length > 0 ?
normalized : undefined; +}; + +export const formatKeyValueLines = (value: KeyValueMap): string => ( + Object.entries(value).map(([key, entry]) => `${key}=${entry}`).join('\n') +); + +export const parseKeyValueLines = (value: string): KeyValueMap => { + const normalized: KeyValueMap = {}; + value.split('\n').forEach((line) => { + const [key, ...valueParts] = line.split('='); + if (key?.trim()) { + normalized[key.trim()] = valueParts.join('=').trim(); + } + }); + return normalized; +}; + +export const parseListLines = (value: string): string[] => ( + value.split('\n').map((entry) => entry.trim()).filter(Boolean) +); + +export const maskSecret = (value: unknown): string => { + const normalizedValue = String(value ?? ''); + if (normalizedValue.length <= 4) { + return '****'; + } + + return `${normalizedValue.slice(0, 2)}****${normalizedValue.slice(-2)}`; +}; + +export const isMcpScope = (value: unknown): value is McpScope => ( + value === 'user' || value === 'local' || value === 'project' +); + +export const isMcpTransport = (value: unknown): value is McpTransport => ( + value === 'stdio' || value === 'http' || value === 'sse' +); + +export const getProjectPath = (project: { fullPath?: string; path?: string }): string => ( + project.fullPath || project.path || '' +); + +export const getErrorMessage = (error: unknown): string => ( + error instanceof Error ? error.message : 'Unknown error' +); + +const assertSupportedTransport = ( + provider: McpProvider, + transport: McpTransport, + options?: CreateMcpPayloadOptions, +) => { + const supportedTransports = options?.supportedTransports ?? MCP_SUPPORTED_TRANSPORTS[provider]; + if (supportedTransports.includes(transport)) { + return; + } + + throw new Error( + options?.unsupportedTransportMessage?.(transport) ?? 
`${provider} does not support ${transport} MCP servers`, + ); +}; + +export const parseJsonMcpPayload = ( + provider: McpProvider, + formData: McpFormState, + options?: CreateMcpPayloadOptions, +): UpsertProviderMcpServerPayload => { + const parsed = JSON.parse(formData.jsonInput) as unknown; + if (!isRecord(parsed)) { + throw new Error('JSON configuration must be an object'); + } + + const transportInput = readString(parsed.transport) ?? readString(parsed.type); + const transport = isMcpTransport(transportInput) ? transportInput : undefined; + if (!transport) { + throw new Error('Missing required field: type'); + } + + assertSupportedTransport(provider, transport, options); + + if (transport === 'stdio' && !readString(parsed.command)) { + throw new Error('stdio type requires a command field'); + } + + if ((transport === 'http' || transport === 'sse') && !readString(parsed.url)) { + throw new Error(`${transport} type requires a url field`); + } + + return { + name: formData.name.trim(), + scope: formData.scope, + workspacePath: formData.scope === 'user' ? undefined : formData.workspacePath, + transport, + command: readString(parsed.command), + args: readStringArray(parsed.args) ?? [], + env: readStringRecord(parsed.env) ?? {}, + cwd: (options?.supportsWorkingDirectory ?? MCP_SUPPORTS_WORKING_DIRECTORY[provider]) + ? readString(parsed.cwd) + : undefined, + url: readString(parsed.url), + headers: readStringRecord(parsed.headers ?? parsed.http_headers) ?? {}, + envVars: (options?.includeProviderSpecificFields ?? provider === 'codex') + ? readStringArray(parsed.envVars ?? parsed.env_vars) ?? [] + : undefined, + bearerTokenEnvVar: (options?.includeProviderSpecificFields ?? provider === 'codex') + ? readString(parsed.bearerTokenEnvVar ?? parsed.bearer_token_env_var) + : undefined, + envHttpHeaders: (options?.includeProviderSpecificFields ?? provider === 'codex') + ? readStringRecord(parsed.envHttpHeaders ?? parsed.env_http_headers) ?? 
{} + : undefined, + }; +}; + +export const createMcpPayloadFromForm = ( + provider: McpProvider, + formData: McpFormState, + options?: CreateMcpPayloadOptions, +): UpsertProviderMcpServerPayload => { + if (formData.importMode === 'json') { + return parseJsonMcpPayload(provider, formData, options); + } + + assertSupportedTransport(provider, formData.transport, options); + + const supportsWorkingDirectory = options?.supportsWorkingDirectory ?? MCP_SUPPORTS_WORKING_DIRECTORY[provider]; + const includeProviderSpecificFields = options?.includeProviderSpecificFields ?? provider === 'codex'; + + return { + name: formData.name.trim(), + scope: formData.scope, + workspacePath: formData.scope === 'user' ? undefined : formData.workspacePath, + transport: formData.transport, + command: formData.transport === 'stdio' ? formData.command.trim() : undefined, + args: formData.transport === 'stdio' ? formData.args : undefined, + env: formData.env, + cwd: supportsWorkingDirectory ? formData.cwd.trim() || undefined : undefined, + url: formData.transport !== 'stdio' ? formData.url.trim() : undefined, + headers: formData.transport !== 'stdio' ? formData.headers : undefined, + envVars: includeProviderSpecificFields ? formData.envVars : undefined, + bearerTokenEnvVar: includeProviderSpecificFields ? formData.bearerTokenEnvVar.trim() || undefined : undefined, + envHttpHeaders: includeProviderSpecificFields ? 
formData.envHttpHeaders : undefined, + }; +}; diff --git a/src/components/mcp/view/McpServers.tsx b/src/components/mcp/view/McpServers.tsx new file mode 100644 index 00000000..8ec9d03e --- /dev/null +++ b/src/components/mcp/view/McpServers.tsx @@ -0,0 +1,281 @@ +import { Edit3, ExternalLink, Globe, Lock, Plus, Server, Terminal, Trash2, Users, Zap } from 'lucide-react'; +import { useTranslation } from 'react-i18next'; + +import type { McpProject, McpProvider, McpScope, ProviderMcpServer } from '../types'; +import { IS_PLATFORM } from '../../../constants/config'; +import { Badge, Button } from '../../../shared/view/ui'; +import { + MCP_GLOBAL_SUPPORTED_SCOPES, + MCP_GLOBAL_SUPPORTED_TRANSPORTS, + MCP_PROVIDER_BUTTON_CLASSES, + MCP_PROVIDER_NAMES, +} from '../constants'; +import { useMcpServers } from '../hooks/useMcpServers'; +import { maskSecret } from '../utils/mcpFormatting'; + +import McpServerFormModal from './modals/McpServerFormModal'; + +type McpServersProps = { + selectedProvider: McpProvider; + currentProjects: McpProject[]; +}; + +const getTransportIcon = (transport: string | undefined) => { + if (transport === 'stdio') { + return <Terminal />; + } + + if (transport === 'sse') { + return <Zap />; + } + + if (transport === 'http') { + return <Globe />; + } + + return <Server />; +}; + +const getScopeLabel = (scope: McpScope): string => { + if (scope === 'user') { + return 'user'; + } + + if (scope === 'local') { + return 'local'; + } + + return 'project'; +}; + +const getServerKey = (server: ProviderMcpServer): string => ( + `${server.provider}:${server.scope}:${server.workspacePath || 'global'}:${server.name}` +); + +function ConfigLine({ label, children }: { label: string; children: string }) { + if (!children) { + return null; + } + + return ( +
+ {label}:{' '} + {children} +
+ ); +} + +function TeamMcpFeatureCard() { + return ( +
+
+
+ +
+
+
+

Team MCP Configs

+ +
+

+ Share MCP server configurations across your team. Everyone stays in sync automatically. +

+ + Available with CloudCLI Pro + + +
+
+
+ ); +} + +export default function McpServers({ selectedProvider, currentProjects }: McpServersProps) { + const { t } = useTranslation('settings'); + const { + servers, + isLoading, + isLoadingProjectScopes, + loadError, + deleteError, + saveStatus, + isFormOpen, + isGlobalFormOpen, + editingServer, + openForm, + openGlobalForm, + closeForm, + closeGlobalForm, + submitForm, + submitGlobalForm, + deleteServer, + } = useMcpServers({ selectedProvider, currentProjects }); + + const providerName = MCP_PROVIDER_NAMES[selectedProvider]; + const description = t(`mcpServers.description.${selectedProvider}`, { + defaultValue: `Model Context Protocol servers provide additional tools and data sources to ${providerName}`, + }); + const globalButtonLabel = 'Add Global MCP Server'; + const providerButtonLabel = `Add ${providerName} MCP Server`; + const globalAddDescription = 'Add Global MCP Server writes one common stdio or HTTP server to Claude, Cursor, Codex, and Gemini.'; + const providerAddDescription = `${providerButtonLabel} only changes ${providerName}.`; + const globalModalDescription = 'Adds this MCP server to every provider: Claude, Cursor, Codex, and Gemini. ' + + 'Only stdio and HTTP transports are supported because the same config must work across all providers.'; + + return ( +
+
+ +

{t('mcpServers.title')}

+
+

{description}

+ +
+
+ + +
+
+ {saveStatus === 'success' && ( + {t('saveStatus.success')} + )} + {isLoadingProjectScopes && ( + Refreshing project scopes... + )} +
+
+ + {(loadError || deleteError) && ( +
+ {deleteError || loadError} +
+ )} + +
+ {isLoading && servers.length === 0 && ( +
Loading MCP servers...
+ )} + + {servers.map((server) => ( +
+
+
+
+ {getTransportIcon(server.transport)} + {server.name} + + {server.transport || 'stdio'} + + + {getScopeLabel(server.scope)} + + {server.projectDisplayName && ( + + {server.projectDisplayName} + + )} +
+ +
+ {server.command || ''} + {server.url || ''} + {(server.args || []).join(' ')} + {server.cwd || ''} + {server.env && Object.keys(server.env).length > 0 && ( + + {Object.entries(server.env).map(([key, value]) => `${key}=${maskSecret(value)}`).join(', ')} + + )} + {server.envVars && server.envVars.length > 0 && ( + {server.envVars.join(', ')} + )} +
+
+ +
+ + +
+
+
+ ))} + + {!isLoading && !isLoadingProjectScopes && servers.length === 0 && ( +
{t('mcpServers.empty')}
+ )} +
+ + {selectedProvider === 'codex' && ( +
+

{t('mcpServers.help.title')}

+

{t('mcpServers.help.description')}

+
+ )} + + {selectedProvider === 'claude' && !IS_PLATFORM && <TeamMcpFeatureCard />} + + + + submitGlobalForm(formData)} + />
+ ); +} diff --git a/src/components/mcp/view/modals/McpServerFormModal.tsx b/src/components/mcp/view/modals/McpServerFormModal.tsx new file mode 100644 index 00000000..afffa512 --- /dev/null +++ b/src/components/mcp/view/modals/McpServerFormModal.tsx @@ -0,0 +1,434 @@ +import { FolderOpen, Globe, X } from 'lucide-react'; +import { useTranslation } from 'react-i18next'; + +import { Button, Input } from '../../../../shared/view/ui'; +import { + MCP_PROVIDER_NAMES, + MCP_SUPPORTED_SCOPES, + MCP_SUPPORTED_TRANSPORTS, + MCP_SUPPORTS_WORKING_DIRECTORY, +} from '../../constants'; +import { useMcpServerForm } from '../../hooks/useMcpServerForm'; +import type { + McpFormMode, + McpFormState, + McpProject, + McpProvider, + McpScope, + McpTransport, + ProviderMcpServer, +} from '../../types'; + +type McpServerFormModalProps = { + provider: McpProvider; + mode?: McpFormMode; + isOpen: boolean; + editingServer: ProviderMcpServer | null; + currentProjects: McpProject[]; + title?: string; + description?: string; + submitLabel?: string; + supportedScopes?: McpScope[]; + supportedTransports?: McpTransport[]; + onClose: () => void; + onSubmit: (formData: McpFormState, editingServer: ProviderMcpServer | null) => Promise<void>; +}; + +const getScopeLabel = (scope: McpScope, mode: McpFormMode): string => { + if (scope === 'user') { + return mode === 'global' ? 'User (All Providers)' : 'User (Global)'; + } + + if (scope === 'local') { + return 'Claude Local'; + } + + return mode === 'global' ? 'Project (All Providers)' : 'Project'; +}; + +const getScopeDescription = (scope: McpScope, mode: McpFormMode): string => { + if (scope === 'user') { + return mode === 'global' + ? 'Writes to each provider user config and is available across projects on this machine' + : 'Available across all projects on your machine'; + } + + if (scope === 'local') { + return 'Stored in Claude user settings for the selected project'; + } + + return mode === 'global' + ?
'Writes to the selected project workspace for every provider' + : 'Stored in the selected project workspace'; +}; + +export default function McpServerFormModal({ + provider, + mode = 'provider', + isOpen, + editingServer, + currentProjects, + title, + description, + submitLabel, + supportedScopes, + supportedTransports, + onClose, + onSubmit, +}: McpServerFormModalProps) { + const { t } = useTranslation('settings'); + const isGlobalMode = mode === 'global'; + const availableScopes = supportedScopes ?? MCP_SUPPORTED_SCOPES[provider]; + const availableTransports = supportedTransports ?? MCP_SUPPORTED_TRANSPORTS[provider]; + const { + formData, + multilineText, + projectOptions, + isEditing, + isSubmitting, + jsonValidationError, + canSubmit, + updateForm, + updateScope, + updateTransport, + updateJsonInput, + updateMultilineText, + handleSubmit, + } = useMcpServerForm({ + provider, + isOpen, + editingServer, + currentProjects, + supportedScopes: availableScopes, + supportedTransports: availableTransports, + unsupportedTransportMessage: isGlobalMode + ? (transport) => `Add MCP Server supports only stdio and http across all providers, not ${transport}.` + : undefined, + onSubmit, + }); + + if (!isOpen) { + return null; + } + + const providerName = MCP_PROVIDER_NAMES[provider]; + const modalTitle = title ?? (isEditing ? t('mcpForm.title.edit') : t('mcpForm.title.add')); + const addButtonLabel = submitLabel ?? `${t('mcpForm.actions.addServer')} to ${providerName}`; + const showProjectSelector = formData.scope !== 'user'; + const supportsHttpHeaders = formData.transport === 'http' || formData.transport === 'sse'; + const supportsWorkingDirectory = !isGlobalMode && MCP_SUPPORTS_WORKING_DIRECTORY[provider]; + const showCodexOnlyFields = provider === 'codex' && !isGlobalMode; + + return ( +
+
+
+

{modalTitle}

+ +
+ +
+ {description && ( +
+ {description} +
+ )} + + {!isEditing && ( +
+ + +
+ )} + + {isEditing && ( +
+ +
+ {formData.scope === 'user' ? : } + {getScopeLabel(formData.scope, mode)} + {formData.workspacePath && ( + - {formData.workspacePath} + )} +
+

{t('mcpForm.scope.cannotChange')}

+
+ )} + + {!isEditing && ( +
+
+ +
+ {availableScopes.map((scope) => ( + + ))} +
+

{getScopeDescription(formData.scope, mode)}

+
+ + {showProjectSelector && ( +
+ + + {formData.workspacePath && ( +

+ {t('mcpForm.projectPath', { path: formData.workspacePath })} +

+ )} +
+ )} +
+ )} + +
+
+ + updateForm('name', event.target.value)} + placeholder={t('mcpForm.placeholders.serverName')} + required + /> +
+ + {formData.importMode === 'form' && ( +
+ + +
+ )} +
+ + {formData.importMode === 'json' && ( +
+ +