Compare commits


25 Commits

Author SHA1 Message Date
Haileyesus
7e92de7cb7 feat(providers): add comprehensive guide for provider module setup and usage 2026-05-12 20:18:09 +03:00
Haileyesus
bacca8d62b fix(security): centralize safe frontmatter parsing
Move frontmatter parsing into server/shared/frontmatter.ts so every backend caller
uses the same gray-matter configuration instead of importing gray-matter directly.

The goal is to keep executable JS and JSON frontmatter engines disabled for
all markdown discovered from the filesystem, not only command routes.

Provider skills and shared skill metadata now go through parseFrontMatter too.
That closes the gap where plugin or provider markdown could regain default
gray-matter behavior simply because it lived outside the original command path.

Classify the new parser in backend boundaries so modules can depend on the
safe shared API without reaching into legacy utility paths.
2026-05-12 20:05:52 +03:00
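The centralized parser this commit describes can be sketched as a single shared entry point. The real server/shared/frontmatter.ts wraps gray-matter with its executable engines disabled; the dependency-free stand-in below (the delimiter handling and the engine check are illustrative assumptions, not the repository's actual code) shows the shape of the idea:

```typescript
// Hypothetical stand-in for server/shared/frontmatter.ts: one shared entry
// point so no backend caller configures the parser (or its engines) itself.
export interface ParsedMarkdown {
  data: Record<string, string>; // frontmatter keys (inert string values only)
  content: string;              // markdown body without the frontmatter block
}

export function parseFrontMatter(markdown: string): ParsedMarkdown {
  // Reject executable-engine hints outright, mirroring the goal of keeping
  // JS and JSON frontmatter engines disabled for all filesystem markdown.
  if (/^---(js|javascript|json)\b/.test(markdown)) {
    throw new Error("executable frontmatter engines are disabled");
  }
  const match = /^---\n([\s\S]*?)\n---(\n|$)/.exec(markdown);
  if (!match) return { data: {}, content: markdown };
  const data: Record<string, string> = {};
  for (const line of match[1].split("\n")) {
    const idx = line.indexOf(":");
    if (idx > 0) data[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
  }
  return { data, content: markdown.slice(match[0].length) };
}
```

Callers import this one function instead of reaching for gray-matter directly, so a new markdown source (plugin, provider, or skill) cannot silently regain default engine behavior.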
Haileyesus
aabf331e91 fix(providers): guard invalid skill command namespaces
Claude plugin ids come from local settings and installed plugin metadata.

Invalid ids such as empty strings or @ should not become command namespaces.

Skip plugin folders when no safe plugin name can be derived.

This prevents malformed slash commands like /:command from reaching the UI.

Add regression coverage for empty and @ plugin ids.

Keyboard selection in the slash menu should match mouse selection.

Only skills are inserted into the composer because they are provider invocations.

Built-in and custom commands execute directly and close the menu on success or failure.
2026-05-11 18:58:55 +03:00
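A minimal sketch of the guard, assuming a hypothetical derivePluginNamespace helper (the real derivation logic is not shown in this log):

```typescript
// Hypothetical guard: derive a namespace-safe plugin name, or null when no
// safe name exists (empty string, bare "@", etc.). Callers skip the plugin
// folder on null, so malformed slash commands like /:command never form.
export function derivePluginNamespace(pluginId: string): string | null {
  // Strip a leading scope marker like "@scope/" down to the bare name.
  const name = pluginId.replace(/^@[^/]*\//, "").trim();
  // Reject ids that cannot form a meaningful namespace segment.
  if (!/^[a-zA-Z0-9][a-zA-Z0-9_-]*$/.test(name)) return null;
  return name;
}
```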
Haileyesus
053e43447a feat(providers): surface skills in slash command menu
Provider skills were hidden behind provider-specific filesystem rules.

That made the backend and UI unable to offer one discovery path for skills.

Add a normalized skills contract, provider service, and provider skills API.

Keep provider-specific lookup rules inside adapters so routes and UI stay generic.

Claude needs plugin handling because enabled plugins resolve through installed_plugins.json.

Plugin folders can expose commands or skills, so Claude scans both forms.

Claude plugin commands are namespaced to avoid collisions with user and project skills.

Codex, Gemini, and Cursor adapters map their expected skill roots into the same contract.

The slash menu now shows skills beside built-in and custom commands for discovery.

The menu avoids mid-message activation, duplicate rows, loose namespace matches, and input overlap.

Provider tests cover discovery locations and Claude plugin edge cases.
2026-05-11 18:05:24 +03:00
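One plausible shape for the normalized contract described above; the Skill fields and the toSlashCommand helper are assumptions for illustration, not the repository's actual types:

```typescript
// Hypothetical normalized skills contract: every adapter maps its
// provider-specific discovery rules into the same record, so routes and the
// slash menu stay provider-agnostic.
export interface Skill {
  id: string;          // stable identifier across providers
  name: string;        // display name shown in the slash menu
  namespace?: string;  // plugin namespace for Claude plugin commands
  provider: "claude" | "codex" | "gemini" | "cursor";
  sourcePath: string;  // filesystem root the adapter discovered it under
}

export function toSlashCommand(skill: Skill): string {
  // Namespacing avoids collisions between plugin, user, and project skills.
  return skill.namespace
    ? `/${skill.namespace}:${skill.name}`
    : `/${skill.name}`;
}
```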
Haile
039696c2de Fix/websocket streaming issues (#748) 2026-05-08 22:51:03 +03:00
Haile
beb0a50413 fix: enhance regex to correctly parse wrapper file paths for claude.exe (#741) 2026-05-04 17:54:02 +02:00
Haile
e89d2da5df Fix New session issues and websocket issues (#738)
* fix: reset-state-on-new-session-click

* fix(chat): preserve continuity while session ids settle

New conversations were crossing a short but important consistency gap.
The route could already point at a newly created session id while the
projects payload had not refreshed yet, and realtime/optimistic messages
could still be keyed under a provisional id. In that window the UI could
stop reading the active session store, briefly render the conversation as
missing, and then repopulate it a moment later.

That same gap also made duplication more likely. Optimistic local user
messages could survive long enough to appear beside the persisted copy,
and finalized assistant streaming rows could sit directly next to the
server-backed assistant message with the same content before realtime state
was cleared. The result was a chat view that felt unstable exactly when a
new session was being created.

This commit makes session-id reconciliation a first-class part of the chat
flow instead of assuming every layer will agree immediately. The session
store now understands canonical session aliases and can migrate one
conversation from a provisional id to the real id without dropping its
in-memory state. The route navigation path can replace the provisional URL
entry instead of stacking it in history, and the project/session selection
logic keeps a synthetic selected session alive long enough for the sidebar
and project payloads to catch up.

The practical goal is to keep one visible conversation throughout the whole
creation lifecycle: no dead window between websocket events and project
refresh, no stale provisional URL after the real id is known, and no extra
optimistic/local bubbles when server history catches up.

* fix(cli): resolve executable path for Claude CLI on Windows

* fix(session-synchronizer): improve session name extraction for Claude and Codex
2026-05-04 13:54:07 +03:00
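The aliasing mechanism described in the second commit message can be sketched as a small store that migrates in-memory state from a provisional id to the canonical one; the class and method names here are hypothetical:

```typescript
// Hypothetical sketch of canonical session aliasing: one conversation
// survives the move from a provisional id to the server-assigned id
// without dropping its in-memory messages.
type Message = { role: string; text: string };

export class SessionStore {
  private messages = new Map<string, Message[]>();
  private aliases = new Map<string, string>(); // provisional -> canonical

  private resolve(id: string): string {
    return this.aliases.get(id) ?? id;
  }

  append(sessionId: string, msg: Message): void {
    const id = this.resolve(sessionId);
    const list = this.messages.get(id) ?? [];
    list.push(msg);
    this.messages.set(id, list);
  }

  // Called once the server reveals the real id for a provisional session.
  adoptCanonicalId(provisionalId: string, canonicalId: string): void {
    const existing = this.messages.get(provisionalId);
    if (existing) {
      const target = this.messages.get(canonicalId) ?? [];
      this.messages.set(canonicalId, target.concat(existing));
      this.messages.delete(provisionalId);
    }
    this.aliases.set(provisionalId, canonicalId);
  }

  get(sessionId: string): Message[] {
    return this.messages.get(this.resolve(sessionId)) ?? [];
  }
}
```

Reads and writes keyed under either id land on the same conversation, which is the "no dead window" property the commit aims for.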
simosmik
392c73b693 fix: add clarification on auto mode 2026-04-30 15:28:00 +00:00
viper151
5e7c4c5f8c chore(release): v1.31.5 2026-04-30 14:12:48 +00:00
simosmik
3f71d4932b feat: add auto mode to claude code 2026-04-30 14:09:51 +00:00
viper151
80561ee9e9 chore(release): v1.31.4 2026-04-30 12:47:35 +00:00
simosmik
658421c1c4 fix: bump codex sdk to latest version 2026-04-30 12:45:30 +00:00
viper151
881465aa71 chore(release): v1.31.3 2026-04-30 11:50:31 +00:00
Simos Mikelatos
9f2afebc66 Create command palette and add new features for search and actions (#728)
* refactor(ui): replace in-repo Command primitive with cmdk wrapper

* feat(command-palette): add global Cmd+K palette with v1 actions

* feat(command-palette): add session, file, and commit search sources

* refactor: add provider names to model constants

* feat(command-palette): add settings, navigation, message search, and ⌘K hints

* feat(command-palette): add git fetch/pull/push and branch switch actions

* refactor(command-palette): consolidate fetch source hooks behind useApiSource

* refactor(command-palette): extract useCommandKey and SETTINGS_MAIN_TABS metadata

* refactor(command-palette): extract groups into declarative registry

* refactor(command-palette): wire openFile through PaletteOpsContext

* refactor: migrate openSettings and refreshProjects from window.* to PaletteOpsContext

* refactor(command-palette): inline groups and delete registry indirection

* refactor(command-palette): return items array directly from source hooks

* refactor(palette-ops): flatten Handle wrapper into ref-based registry

* refactor: inline useCommandKey as MOD_KEY constant in two call sites

* feat: introduce pages and fix bug on branch switching

* fix: small labels

* fix: coderabbit issues

* fix: coderabbit comments

* Update src/components/chat/view/subcomponents/ProviderSelectionEmptyState.tsx

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2026-04-30 14:48:48 +03:00
viper151
df3d5de8c1 chore(release): v1.31.2 2026-04-30 07:02:47 +00:00
Simos Mikelatos
b44c93d884 Bump version from 1.31.0 to 1.31.1 2026-04-30 09:01:23 +02:00
Simos Mikelatos
a1c6d667a4 Bump version from 1.31.0 to 1.31.1 2026-04-30 09:01:01 +02:00
simosmik
0753c04783 fix: migrations for new sqlite schema 2026-04-30 06:56:43 +00:00
Simos Mikelatos
e1275e6d3c Add CloudCLI Scheduler plugin description to README 2026-04-30 08:32:54 +02:00
viper151
ccb8b83692 chore(release): v1.31.0 2026-04-30 06:31:49 +00:00
Simos Mikelatos
641731b3ef Update modelConstants.js 2026-04-30 08:23:35 +02:00
Simos Mikelatos
d4bdc667cc Reorder sonnet model in CLAUDE_MODELS 2026-04-30 08:23:21 +02:00
Simos Mikelatos
ce724e6e3f Add GPT-5.5 model to CODEX_MODELS 2026-04-30 08:22:26 +02:00
Rkkooo
b4a39c7297 fix(/status): use CLAUDE_MODELS.DEFAULT instead of stale 'claude-sonnet-4.5' fallback (#723)
The /status slash command's response object falls back to a hardcoded
'claude-sonnet-4.5' when no session model is set in context. Two issues:

1. 'claude-sonnet-4.5' is not in CLAUDE_MODELS.OPTIONS (the dropdown), so
   the displayed model can be a value the user can't actually select.
2. CLAUDE_MODELS.DEFAULT is currently "opus" — meaning a fresh session
   started with no model picked actually runs on opus, not sonnet-4.5.
   The /status command therefore mis-reports the running model.

Tracking the centralized DEFAULT keeps the displayed value coherent with
what claude-sdk.js actually spawns (line 208 already uses
`options.model || CLAUDE_MODELS.DEFAULT`).

Co-authored-by: Enrico Masella <enrico.masella@gmail.com>
Co-authored-by: Simos Mikelatos <simosmik@gmail.com>
2026-04-30 08:20:14 +02:00
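The fix reduces to tracking the centralized default at display time, mirroring the spawn-side fallback the message quotes; the values below are illustrative:

```typescript
// Sketch of the fix: report the centralized default rather than a hardcoded
// string that may not exist in the options list. Model names are assumptions.
const CLAUDE_MODELS = {
  DEFAULT: "opus",
  OPTIONS: ["opus", "sonnet"],
} as const;

export function statusModel(sessionModel?: string): string {
  // Mirrors the spawn-side `options.model || CLAUDE_MODELS.DEFAULT`, so
  // /status reports the model that will actually run.
  return sessionModel || CLAUDE_MODELS.DEFAULT;
}
```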
Haile
44edf94f3a Refactor provider/session architecture to be DB-driven, modular, and sessionId-first across backend and frontend (#715)
* refactor: remove unused exports

* refactor: remove unused fields from project and session objects

* refactor: rename session_names table and related code to sessions for clarity and consistency

* refactor(database): move db into typescript

- Implemented githubTokensDb for managing GitHub tokens with CRUD operations.
- Created notificationPreferencesDb to handle user notification preferences.
- Added projectsDb for project path management and related operations.
- Introduced pushSubscriptionsDb for managing browser push subscriptions.
- Developed scanStateDb to track the last scanned timestamp.
- Established sessionsDb for session management with CRUD functionalities.
- Created userDb for user management, including authentication and onboarding.
- Implemented vapidKeysDb for storing and managing VAPID keys.

feat(database): define schema for new database tables

- Added SQL schema definitions for users, API keys, user credentials, notification preferences, VAPID keys, push subscriptions, projects, sessions, scan state, and app configuration.
- Included necessary indexes for performance optimization.

refactor(shared): enhance type definitions and utility functions

- Updated shared types and interfaces for improved clarity and consistency.
- Added new types for credential management and provider-specific operations.
- Refined utility functions for better error handling and message normalization.

* feat: added session indexer logic

* perf(projects): lazy-load TaskMaster metadata per selected project

Why:

- /api/projects is a hot path (initial load, sidebar refresh, websocket sync).

- Scanning .taskmaster for every project on each call added avoidable fs I/O and payload size.

- TaskMaster metadata is only needed after selecting a specific project.

- Moving it to a project-scoped endpoint makes loading cost match user intent.

- The UI now hydrates TaskMaster state on selection and keeps it across refresh events.

- This prevents status flicker/regression while still removing global scan overhead.

- Selection fetches are sequence-guarded to block stale async responses on fast switching.

- isManuallyAdded was removed from responses to keep the public project contract minimal.

- Project dumps now use incrementing snapshot files to preserve history for debugging.

What changed:

- Added GET /api/projects/:projectName/taskmaster and getProjectTaskMaster().

- Removed TaskMaster detection from bulk getProjects().

- Added api.projectTaskmaster(...) plus selection-time hydration in frontend contexts.

- Merged cached taskmaster values into refreshed project lists for continuity.

- Removed isManuallyAdded from manual project payloads.

* refactor: update import paths for database modules and remove legacy db.js and schema.js files

* refactor(projects): identify projects by DB projectId instead of folder-derived name

GET /api/projects used to scan ~/.claude/projects/ on every request, derive
each project's identity from the encoded folder name, and re-parse JSONL
files to build session lists. Using the folder-derived name as the project
identifier leaked the Claude CLI's on-disk encoding into every API route,
forced every downstream endpoint to re-resolve a real path via JSONL
'cwd' inspection, and made the project list endpoint O(projects x sessions)
on disk I/O.

This change switches the entire API surface to identify projects by the
stable primary key from the 'projects' table and drives the listing
straight from the DB:

- Add projectsDb.getProjectPathById as the canonical projectId -> path
  resolver so routes no longer need to touch the filesystem to figure out
  where a project lives.

- Rewrite getProjects so it reads the project list from the 'projects'
  table and the per-project session list from the 'sessions' table (one
  SELECT per project). No filesystem scanning happens for this endpoint
  anymore, which removes the dependency on ~/.claude/projects existing,
  on Cursor's MD5-hashed chat folders being discoverable, and on Codex's
  JSONL history being on disk. Per the migration spec each session now
  exposes 'summary' sourced from sessions.custom_name, 'messageCount' = 0
  (message counting is not implemented), and sessionMeta.hasMore is
  pinned to false since this endpoint doesn't drive session pagination.

- Introduce id-based wrappers (getSessionsById, renameProjectById,
  deleteSessionById, deleteProjectById, getProjectTaskMasterById) so
  every caller can pass projectId and resolve the real path through the
  DB. renameProjectById also writes to projects.custom_project_name so
  the DB-driven getProjects response reflects renames immediately; it
  keeps project-config.json in sync for any legacy reader that still
  consults the JSON file.

- Migrate every /api/projects/:projectName route in server/index.js,
  server/routes/taskmaster.js, and server/routes/messages.js to
  :projectId, and change server/routes/git.js so the 'project'
  query/body parameter carries a projectId that is resolved through the
  DB before any git command runs. TaskMaster WebSocket broadcasts emit
  'projectId' for the same reason so the frontend can match
  notifications against its current selection without another lookup.

- Delete helpers that existed only to feed the old getProjects path
  (getCursorSessions, getGeminiCliSessions, getProjectTaskMaster) along
  with their unused imports (better-sqlite3's Database,
  applyCustomSessionNames). The legacy folder-name helpers (getSessions,
  renameProject, deleteSession, deleteProject, extractProjectDirectory)
  are kept as internal implementation details of the id-based wrappers
  and of destructive cleanup / conversation search, but they are no
  longer re-exported.

- searchConversations still walks JSONL to produce match snippets (that
  data doesn't live in the DB), but it now includes the resolved
  projectId in each result so the sidebar can cross-reference hits with
  its already loaded project list without a second round-trip.

Frontend migration:

- Project.name is replaced by Project.projectId in src/types/app.ts, and
  ProjectSession.__projectName becomes __projectId so session tagging
  and sidebar state keys stay aligned with the backend identifier.
  Settings continues to use SettingsProject.name for legacy consumers,
  but it is populated from projectId by normalizeProjectForSettings.

- All places that previously indexed per-project state by project.name
  (sidebar expanded/starred/loading/deletingProjects sets,
  additionalSessions map, projectHasMoreOverrides, starredProjects
  localStorage, command history and draft-input localStorage,
  TaskMaster caches) now key on projectId so state survives
  display-name edits and is consistent across the app.

- src/utils/api.js renames every endpoint parameter to projectId, the
  unified messages endpoint takes projectId in its query string, and
  useSessionStore forwards projectId on fetchFromServer / fetchMore /
  refreshFromServer. Git panel, file tree, code editor, PRD editor,
  plugins context, MCP server flows and TaskMaster hooks are all
  updated to pass projectId.

- DEFAULT_PROJECT_FOR_EMPTY_SHELL is updated to carry a 'default'
  projectId sentinel so the empty-shell placeholder still satisfies the
  Project contract.

Bug fix bundled in:

- sessionsDb.setName no longer bumps updated_at when a row already
  exists. Renaming is a label change, not activity, so there is no
  reason for it to reset 'last activity' in the sidebar. It also no
  longer relies on SQLite's CURRENT_TIMESTAMP, which stores a naive
  'YYYY-MM-DD HH:MM:SS' value that JavaScript parses as local time and
  caused renamed sessions to appear shifted backwards by the client's
  UTC offset. When an INSERT actually happens it now writes ISO-8601
  UTC with a 'Z' suffix.

- buildSessionsByProviderFromDb normalizes any legacy naive timestamps
  in the sessions table to ISO-8601 UTC on the way out so rows written
  before this change also render correctly on the client.
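The timestamp normalization can be sketched as follows; the helper name is hypothetical, but the reinterpretation matches the description above (naive 'YYYY-MM-DD HH:MM:SS' values become explicit UTC):

```typescript
// Sketch of the timestamp fix: SQLite's CURRENT_TIMESTAMP stores a naive
// "YYYY-MM-DD HH:MM:SS" string that JavaScript parses as *local* time;
// normalizing to ISO-8601 with a "Z" suffix makes clients parse it as UTC.
export function normalizeSqliteTimestamp(value: string): string {
  // Already ISO-8601 UTC? Leave it alone.
  if (value.endsWith("Z")) return value;
  // Legacy naive value: reinterpret as UTC, not local time.
  return value.replace(" ", "T") + "Z";
}
```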

Other cleanup:

- Removed the filesystem-first project-discovery comment block at the
  top of server/projects.js and replaced it with a short note that
  describes the new DB-driven flow and lists the few remaining
  filesystem-dependent helpers (message reads, search, destructive
  delete, manual project registration).

- server/modules/providers/index.ts is added as a small barrel so the
  providers module exposes a stable public surface.

Made-with: Cursor

* refactor(projects): reorganize project-related logic into dedicated modules

* refactor(projects): rename getProjects with getProjectsWithSessions

* refactor: update import path for getProjectsWithSessions to include file extension

* refactor: use updated session watcher
In addition, for projects_updated websocket response, send the sessionId instead

* refactor(websocket): move websocket logic to its own module

* refactor(sessions-watcher): remove redundant logging after session sync completion

* refactor(index.js): reorganize code structure

* refactor(index.js): fix import order

* refactor: remove unnecessary GitHub cloning logic from create-workspace endpoint

* refactor: modularize project services, and wizard create/clone flow

Restructure project creation, listing, GitHub clone progress, and TaskMaster
details behind a dedicated TypeScript module under server/modules/projects/,
and align the client wizard with a single path-based flow.

Server / routing
- Remove server/routes/projects.js and mount server/modules/projects/
  projects.routes.ts at /api/projects (still behind authenticateToken).
- Drop duplicate handlers from server/index.js for GET /api/projects and
  GET /api/projects/:projectId/taskmaster; those live on the new router.
- Import WORKSPACES_ROOT and validateWorkspacePath from shared utils in
  index.js instead of the deleted projects route module.

Projects router (projects.routes.ts)
- GET /: list projects with sessions (existing snapshot behavior).
- POST /create-project: validate body, reject legacy workspaceType and
  mixed clone fields, delegate to createProject service, return distinct
  success copy when an archived path is reactivated.
- GET /clone-progress: Server-Sent Events for clone progress/complete/error;
  requires authenticated user id for token resolution; wires startCloneProject.
- GET /:projectId/taskmaster: delegates to getProjectTaskMaster.

Services (new)
- project-management.service.ts: path validation, workspace directory
  creation, persistence via projectsDb.createProjectPath, mapping to API
  project shape; surfaces AppError for validation, conflict, and not-found
  cases; optional dependency injection for tests.
- project-clone.service.ts: validates workspace, resolves GitHub auth
  (stored token or inline token), runs git clone with progress callbacks,
  registers project via createProject on success; sanitizes errors and
  supports cancellation; injectable dependencies for tests.
- projects-has-taskmaster.service.ts: moves TaskMaster detection and
  normalization out of server/projects.js; resolve-by-id and public
  getProjectTaskMaster with structured AppError responses.

Persistence and shared types
- projectsDb.createProjectPath now returns CreateProjectPathResult
  (created | reactivated_archived | active_conflict) using INSERT … ON
  CONFLICT with selective update when the row is archived; normalizes
  display name from path or custom name; repository row typing moves to
  shared ProjectRepositoryRow.
- getProjectPaths() returns only non-archived rows (isArchived = 0).
- shared/types.ts: ProjectRepositoryRow, CreateProjectPathResult/outcome,
  WorkspacePathValidationResult.
- shared/utils.ts: WORKSPACES_ROOT, forbidden path lists, validateWorkspacePath,
  asyncHandler for Express async routes.

Legacy cleanup
- server/projects.js: remove detectTaskMasterFolder, normalizeTaskMasterInfo,
  and getProjectTaskMasterById (logic lives in the new service).
- server/routes/agent.js: register external API project paths with
  projectsDb.createProjectPath instead of addProjectManually try/catch;
  treat active_conflict as an existing registration and continue.

Tests
- Add Node test suites for project-management, project-clone, and
  projects-has-taskmaster services; update projects.service test import
  for renamed projects-with-sessions-fetch.service.ts.

Rename
- projects.service.ts → projects-with-sessions-fetch.service.ts;
  re-export from modules/projects/index.ts.

Client (project creation wizard)
- Remove StepTypeSelection and workspaceType from form state and types;
  wizard is two steps (configure path/GitHub auth, then review).
- createWorkspaceRequest → createProjectRequest; clone vs create-only
  inferred from githubUrl (pathUtils / isCloneWorkflow).
- Adjust step indices, WizardProgress, StepConfiguration/Review,
  WorkspacePathField, and src/utils/api.js as needed for the new API.

Docs
- Minor websocket README touch-up.

Net: ~1.6k insertions / ~0.9k deletions across 29 files; behavior is
centralized in typed services with explicit HTTP errors and test seams.

* refactor: remove loading sessions logic from sidebar

* refactor: move project rename to module

* refactor: move project deletion to module

* refactor: move project star state from localStorage to backend

* refactor: implement optimistic UI for project star state management

* feat: optimistic update for session watcher

* fix(projects-state): stop websocket message reprocessing loop

The websocket projects effect in useProjectsState could re-handle the same
latestMessage after local state writes triggered re-renders.

Under bursty websocket traffic, this created an update feedback cycle
that surfaced as 'Maximum update depth exceeded', often from Sidebar.

What changed:
- Added lastHandledMessageRef so each latestMessage object is handled once.
- Added an early return guard when the current message was already handled.
- Made projects updates idempotent by comparing previous and merged payloads
  before calling setProjects.

Result:
- Breaks the effect -> state update -> effect re-entry cycle.
- Reduces redundant renders during rapid projects_updated traffic while
  preserving normal project/session synchronization.

* refactor: optimize project auto-expand logic

* refactor: move projects provider specific logic into respective session providers

* refactor: move rename and delete sessions to modules

* refactor: move fetching messages to module

* fix: remove unused var

Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>

* Potential fix for pull request finding 'Useless assignment to local variable'

Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>

* refactor(projects/sidebar): remove temp snapshot side-effects and simplify session metadata UX

Why this change was needed:
- Project listing had an implicit side effect: every fetch wrote a debug snapshot under `.tmp/project-dumps`.
  That added unnecessary disk I/O to a hot path, introduced hidden runtime behavior, and created maintenance
  overhead for code that was not part of product functionality.
- Keeping snapshot-specific exports/tests around made the projects module API broader than needed and coupled
  tests to temporary/debug behavior instead of user-visible behavior.
- Codex sessions could remain stuck with a placeholder name (`Untitled Codex Session`) even after a real title
  became available from newer sync data, which degraded session discoverability in the UI.
- Sidebar session rows showed duplicated provider branding and long-form relative times, which added visual noise
  and reduced scan speed when many sessions are listed.

What changed:
- Removed temporary projects snapshot dumping from `projects-with-sessions-fetch.service.ts`:
  - deleted snapshot types/helpers and file-write flow
  - removed the write call from `getProjectsWithSessions`
- Removed snapshot-related surface area from `projects/index.ts`.
- Removed the snapshot-focused test `projects.service.test.ts` that only validated removed debug behavior.
- Updated `codex-session-synchronizer.provider.ts` to upgrade session names when an existing session still has
  the placeholder title but a real parsed name is now available.
- Updated `SidebarSessionItem.tsx`:
  - removed duplicate provider logo rendering in each session row
  - moved age indicator to the right side
  - made age indicator fade on hover to prioritize action controls
  - switched to compact relative time format (`<1m`, `Xm`, `Xhr`, `Xd`) for faster list scanning

Outcome:
- Lower overhead and fewer hidden side effects in project fetches.
- Cleaner module boundaries in projects.
- Better Codex session naming consistency after sync.
- Cleaner sidebar density and clearer hover/action behavior.
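The compact format can be sketched as a small formatter; the exact thresholds are assumptions:

```typescript
// Sketch of the compact relative-time format described above
// (`<1m`, `Xm`, `Xhr`, `Xd`); the bucket boundaries are assumed.
export function compactAge(thenMs: number, nowMs: number): string {
  const mins = Math.floor((nowMs - thenMs) / 60_000);
  if (mins < 1) return "<1m";
  if (mins < 60) return `${mins}m`;
  const hours = Math.floor(mins / 60);
  if (hours < 24) return `${hours}hr`;
  return `${Math.floor(hours / 24)}d`;
}
```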

* refactor: implement pagination for project sessions loading

* refactor: move search to module

* fix: search performance

* refactor: add handling for internal Codex metadata in conversation search

* fix(migrations,projects,clone): normalize legacy schema before writes and harden conflict detection

Why

- Legacy installs can have a sessions table shape that predates provider/custom_name columns. Running migrateLegacySessionNames first caused its INSERT OR REPLACE INTO sessions (...) to target columns that may not exist and fail during startup migration.

- Some upgraded databases had projects.project_id as plain TEXT instead of a real PRIMARY KEY. That breaks assumptions used by id-based lookups and can allow invalid/duplicate identity semantics over time.

- projectsDb.createProjectPath inferred outcomes from row.isArchived, but the upsert path always returns the post-update row with isArchived=0, so archived-reactivation and fresh-create could be misclassified.

- git clone accepted user-controlled URLs directly in argv position, so inputs beginning with - could be interpreted as options instead of a repository argument.

What

- Added rebuildProjectsTableWithPrimaryKeySchema in migrations: detect table shape via getTableInfo('projects'), verify project_id has pk=1, and rebuild when missing.

- Rebuild flow now creates a canonical projects__new table (project_id TEXT PRIMARY KEY), copies rows with transformation, backfills empty ids via SQLITE_UUID_SQL, deduplicates conflicting ids/paths, then swaps tables inside a transaction.

- Replaced the prior addColumnToTableIfNotExists(...) + UPDATE project_id sequence with PK-aware detection/rebuild logic so legacy DBs converge to the required schema.

- Reordered migration sequence to run rebuildSessionsTableWithProjectSchema before migrateLegacySessionNames, ensuring sessions is normalized before legacy session_names merge writes execute.

- Updated projectsDb.createProjectPath to generate an attemptedId before insert, pass it into the prepared statement, and classify outcomes by comparing the returned row.project_id to attemptedId (created vs reactivated_archived), with no-row remaining active_conflict.

- Hardened clone execution by inserting -- before clone URL in git argv and rejecting normalized GitHub URLs that start with - in startCloneProject.

Tests

- Added integration coverage for projectsDb.createProjectPath branches: fresh insert, archived reactivation, and active conflict.

- Added clone service test for option-prefixed githubUrl rejection (INVALID_GITHUB_URL).
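The clone hardening described above reduces to two checks; buildCloneArgs is a hypothetical helper illustrating them:

```typescript
// Hypothetical sketch of the hardening: reject option-shaped URLs up front
// and terminate option parsing with `--` so a user-controlled URL can never
// be interpreted as a git flag.
export function buildCloneArgs(githubUrl: string, targetDir: string): string[] {
  if (githubUrl.startsWith("-")) {
    throw new Error("INVALID_GITHUB_URL");
  }
  // `--` tells git that everything after it is a positional argument.
  return ["clone", "--progress", "--", githubUrl, targetDir];
}
```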

* refactor(session-synchronizer): update last scanned timestamp based on synchronization results

* refactor: improve session limit and offset validation in provider routes

* refactor: normalize project paths across database and service modules

* refactor(database): make session id the primary key in sessions table

* fix(codex): preserve reasoning entries as thinking blocks

Codex history normalization was downgrading reasoning into plain assistant text
because of branch ordering, not because the raw data was missing.

Why this mattered:
- Codex reasoning JSONL entries are intentionally mapped to history items with
  type thinking, but they also carry message.role assistant.
- normalizeHistoryEntry evaluated the assistant-role branch before the
  thinking branch.
- As a result, reasoning content matched the assistant-text path first and was
  emitted as kind text instead of kind thinking.
- This collapses semantic intent, so UI and downstream features that rely on
  thinking blocks (separate rendering, filtering, and interpretation of model
  thought process vs final answer) receive the wrong message kind.

What changed:
- Prioritized thinking detection (raw.type === thinking or raw.isReasoning)
  before role-based assistant normalization.
- Kept a non-empty content guard for thinking payloads to avoid emitting empty
  artifacts.

Impact:
- Reasoning entries from persisted Codex JSONL now remain thinking blocks
  end-to-end.
- Regular assistant text normalization behavior remains unchanged.
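A sketch of the reordered branches, using simplified entry types (the field names mirror those quoted in the message; the rest is assumption):

```typescript
// Hypothetical sketch of the branch-ordering fix: a reasoning entry carries
// both type "thinking" and role "assistant", so the thinking check must run
// before role-based normalization or the entry collapses into plain text.
type RawEntry = {
  type?: string;
  isReasoning?: boolean;
  message?: { role?: string; content?: string };
};
type HistoryItem = { kind: "thinking" | "text"; content: string };

export function normalizeHistoryEntry(raw: RawEntry): HistoryItem | null {
  const content = raw.message?.content ?? "";
  // Thinking first, with a non-empty content guard against empty artifacts.
  if ((raw.type === "thinking" || raw.isReasoning) && content.length > 0) {
    return { kind: "thinking", content };
  }
  if (raw.message?.role === "assistant" && content.length > 0) {
    return { kind: "text", content };
  }
  return null;
}
```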

* refactor: remove dead code

* refactor: directly use getProjectPathById from projectsDb

* refactor: add gemini jsonl session support
2026-04-30 08:19:26 +02:00
212 changed files with 17584 additions and 6953 deletions


@@ -3,6 +3,32 @@
All notable changes to CloudCLI UI will be documented in this file.
## [1.31.5](https://github.com/siteboon/claudecodeui/compare/v1.31.4...v1.31.5) (2026-04-30)
### New Features
* add auto mode to claude code ([3f71d49](https://github.com/siteboon/claudecodeui/commit/3f71d4932b05dfedcdf816e2a3d7d0cd69c4f566))
## [1.31.4](https://github.com/siteboon/claudecodeui/compare/v1.31.3...v1.31.4) (2026-04-30)
### Bug Fixes
* bump codex sdk to latest version ([658421c](https://github.com/siteboon/claudecodeui/commit/658421c1c44ec4eb58b69ec7b1844a9fba11a3f3))
## [1.31.3](https://github.com/siteboon/claudecodeui/compare/v1.31.2...v1.31.3) (2026-04-30)
## [1.31.2](https://github.com/siteboon/claudecodeui/compare/v1.31.0...v1.31.2) (2026-04-30)
### Bug Fixes
* migrations for new sqlite schema ([0753c04](https://github.com/siteboon/claudecodeui/commit/0753c047837dab17b86ae4453027e30b465870f8))
## [1.31.0](https://github.com/siteboon/claudecodeui/compare/v1.30.0...v1.31.0) (2026-04-30)
### Bug Fixes
* **/status:** use CLAUDE_MODELS.DEFAULT instead of stale 'claude-sonnet-4.5' fallback ([#723](https://github.com/siteboon/claudecodeui/issues/723)) ([b4a39c7](https://github.com/siteboon/claudecodeui/commit/b4a39c729710a6294c62eb742e99e05f3e3914e9))
## [1.30.0](https://github.com/siteboon/claudecodeui/compare/v1.29.5...v1.30.0) (2026-04-21)
### New Features


@@ -164,7 +164,7 @@ CloudCLI has a plugin system that lets you add custom tabs with their own fronte
|---|---|
| **[Project Stats](https://github.com/cloudcli-ai/cloudcli-plugin-starter)** | Shows file counts, lines of code, file-type breakdown, largest files, and recently modified files for your current project |
| **[Web Terminal](https://github.com/cloudcli-ai/cloudcli-plugin-terminal)** | Full xterm.js terminal with multi-tab support|
| **[CloudCLI Scheduler](https://github.com/grostim/cloudcli-cron)** | Create workspace-scoped scheduled prompts and execute them through a local CLI such as Codex, Claude Code, or Gemini CLI|
### Build Your Own
**[Plugin Starter Template →](https://github.com/cloudcli-ai/cloudcli-plugin-starter)** — fork this repo to create your own plugin. It includes a working example with frontend rendering, live context updates, and RPC communication to a backend server.

View File

@@ -157,7 +157,11 @@ export default tseslint.config(
},
{
type: "backend-shared-utils", // shared backend runtime helpers that modules may import directly
pattern: ["server/shared/utils.{js,ts}"], // classify the shared utils file so modules can depend on it explicitly
pattern: [
"server/shared/utils.{js,ts}",
"server/shared/frontmatter.ts",
"server/shared/claude-cli-path.ts",
], // classify shared utility files so modules can depend on them explicitly
mode: "file",
},
{
@@ -165,9 +169,8 @@ export default tseslint.config(
pattern: [
"server/projects.js",
"server/sessionManager.js",
"server/database/*.{js,ts}",
"server/utils/runtime-paths.js",
], // provider history loading still resolves session data through these legacy runtime/database files
], // provider history loading still resolves session data through these legacy runtime files
mode: "file",
},
{

package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "@cloudcli-ai/cloudcli",
"version": "1.30.0",
"version": "1.31.5",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "@cloudcli-ai/cloudcli",
"version": "1.30.0",
"version": "1.31.5",
"hasInstallScript": true,
"license": "AGPL-3.0-or-later",
"dependencies": {
@@ -21,10 +21,11 @@
"@codemirror/theme-one-dark": "^6.1.2",
"@iarna/toml": "^2.2.5",
"@octokit/rest": "^22.0.0",
"@openai/codex-sdk": "^0.101.0",
"@openai/codex-sdk": "^0.125.0",
"@replit/codemirror-minimap": "^0.5.2",
"@tailwindcss/typography": "^0.5.16",
"@uiw/react-codemirror": "^4.23.13",
"@vscode/ripgrep": "^1.17.1",
"@xterm/addon-clipboard": "^0.1.0",
"@xterm/addon-fit": "^0.10.0",
"@xterm/addon-web-links": "^0.11.0",
@@ -35,6 +36,7 @@
"chokidar": "^4.0.3",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"cmdk": "^1.1.1",
"cors": "^2.8.5",
"cross-spawn": "^7.0.3",
"express": "^4.18.2",
@@ -80,6 +82,7 @@
"@types/node": "^22.19.7",
"@types/react": "^18.2.43",
"@types/react-dom": "^18.2.17",
"@types/ws": "^8.18.1",
"@vitejs/plugin-react": "^4.6.0",
"auto-changelog": "^2.5.0",
"autoprefixer": "^10.4.16",
@@ -3304,9 +3307,9 @@
}
},
"node_modules/@openai/codex": {
"version": "0.101.0",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.101.0.tgz",
"integrity": "sha512-H874q5K5I3chrT588BaddMr7GNvRYypc8C1MKWytNUF2PgxWMko2g/2DgKbt5OdajZKMsWdbsPywu34KQGf5Qw==",
"version": "0.125.0",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.125.0.tgz",
"integrity": "sha512-GiE9wlgL95u/5BRirY5d3EaRLU1tu7Y1R09R8lCHHVmcQdSmhS809FdPDWH3gIYHS7ZriAPqXwJ3aLA0WKl40Q==",
"license": "Apache-2.0",
"bin": {
"codex": "bin/codex.js"
@@ -3315,19 +3318,19 @@
"node": ">=16"
},
"optionalDependencies": {
"@openai/codex-darwin-arm64": "npm:@openai/codex@0.101.0-darwin-arm64",
"@openai/codex-darwin-x64": "npm:@openai/codex@0.101.0-darwin-x64",
"@openai/codex-linux-arm64": "npm:@openai/codex@0.101.0-linux-arm64",
"@openai/codex-linux-x64": "npm:@openai/codex@0.101.0-linux-x64",
"@openai/codex-win32-arm64": "npm:@openai/codex@0.101.0-win32-arm64",
"@openai/codex-win32-x64": "npm:@openai/codex@0.101.0-win32-x64"
"@openai/codex-darwin-arm64": "npm:@openai/codex@0.125.0-darwin-arm64",
"@openai/codex-darwin-x64": "npm:@openai/codex@0.125.0-darwin-x64",
"@openai/codex-linux-arm64": "npm:@openai/codex@0.125.0-linux-arm64",
"@openai/codex-linux-x64": "npm:@openai/codex@0.125.0-linux-x64",
"@openai/codex-win32-arm64": "npm:@openai/codex@0.125.0-win32-arm64",
"@openai/codex-win32-x64": "npm:@openai/codex@0.125.0-win32-x64"
}
},
"node_modules/@openai/codex-darwin-arm64": {
"name": "@openai/codex",
"version": "0.101.0-darwin-arm64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.101.0-darwin-arm64.tgz",
"integrity": "sha512-unk4rTRQQ9o0w2Upu35IsJHpoZHJ+tU/myn6LNhUjcP9FrjLnEcAQJ6WIMtdTYVPja1PGhFSO0DNxV79GMvehw==",
"version": "0.125.0-darwin-arm64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.125.0-darwin-arm64.tgz",
"integrity": "sha512-Gn2fHiSO0XgyHp1OSd5DWUTm66Bv9UEuipW5pVEj1E+hWZCOrdqnYttllKFWtRGj5yiKefNX3JIxONgh/ZwlOQ==",
"cpu": [
"arm64"
],
@@ -3342,9 +3345,9 @@
},
"node_modules/@openai/codex-darwin-x64": {
"name": "@openai/codex",
"version": "0.101.0-darwin-x64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.101.0-darwin-x64.tgz",
"integrity": "sha512-+KFi1IapCQGd3vLQp2lI4xI3hu2QffDZYt7Fhfw6NxEFOKhHnTamRtQ5yI8jYQcYF+pQfYF2fyiuXLM1lITLQw==",
"version": "0.125.0-darwin-x64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.125.0-darwin-x64.tgz",
"integrity": "sha512-TZ5Lek2X/UXTI9LXFxzarvQaJeuTrqVh4POc7soO/8RclVnCxADnCf15sivxLd5eiFW4t0myGoeVoM4lciRiRg==",
"cpu": [
"x64"
],
@@ -3359,9 +3362,9 @@
},
"node_modules/@openai/codex-linux-arm64": {
"name": "@openai/codex",
"version": "0.101.0-linux-arm64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.101.0-linux-arm64.tgz",
"integrity": "sha512-RkDnQeq7M6ZBtD+8i+I5ewjjOf02BcJq6r1kN4RBewfAQBsz6B73Ns3OrI2bHVRsuPtAf8Cf1S4xg/eFZT2Omg==",
"version": "0.125.0-linux-arm64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.125.0-linux-arm64.tgz",
"integrity": "sha512-pPnJoJD6rZ2Iin0zNt/up36bO2/EOp2B+1/rPHu/lSq3PJbT3Fmnfut2kJy5LylXb7bGA2XQbtqOogZzIbnlkA==",
"cpu": [
"arm64"
],
@@ -3376,9 +3379,9 @@
},
"node_modules/@openai/codex-linux-x64": {
"name": "@openai/codex",
"version": "0.101.0-linux-x64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.101.0-linux-x64.tgz",
"integrity": "sha512-SJeEdQ4ReEU3nvtceZ1uY3me6oWoB3djr3GnZmAUCEUuYEWD1kRGprAyJB1N0B+8zhSv0SU2e9sX5t3aCV4AwQ==",
"version": "0.125.0-linux-x64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.125.0-linux-x64.tgz",
"integrity": "sha512-K2NTTEeBpz/G+N2x17UGWfauRt3So+ir4f+U/60l5PPnYEJB/w3YZrlXo2G9og8Dm9BqtoBAjoPV74sRv9tWWQ==",
"cpu": [
"x64"
],
@@ -3392,12 +3395,12 @@
}
},
"node_modules/@openai/codex-sdk": {
"version": "0.101.0",
"resolved": "https://registry.npmjs.org/@openai/codex-sdk/-/codex-sdk-0.101.0.tgz",
"integrity": "sha512-Lrar2pDvGUX64itSbMNKuNBzxh72UwKokY4TPuXJRURwGX0qyDi80n7DiVivC40BwFsQWNs6behSo/9Mr6PoLw==",
"version": "0.125.0",
"resolved": "https://registry.npmjs.org/@openai/codex-sdk/-/codex-sdk-0.125.0.tgz",
"integrity": "sha512-1xCIHdSbQVF880nJ2aVWdPIsWZbSpKODwuP9y/gvtChDYhYfYEW0DKp2H8ZlctkzIjlzS/WzYmP6ZZPHIvs2Dg==",
"license": "Apache-2.0",
"dependencies": {
"@openai/codex": "0.101.0"
"@openai/codex": "0.125.0"
},
"engines": {
"node": ">=18"
@@ -3405,9 +3408,9 @@
},
"node_modules/@openai/codex-win32-arm64": {
"name": "@openai/codex",
"version": "0.101.0-win32-arm64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.101.0-win32-arm64.tgz",
"integrity": "sha512-WQ8QsychjHyvlr+vCSTMbd2/yrBIZre5tRuM79eZi973BJz0CSEiFsNSGg5fvpnJuiHHawZ/8HWeir7nlatamQ==",
"version": "0.125.0-win32-arm64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.125.0-win32-arm64.tgz",
"integrity": "sha512-zxoUakw9oIHIFrAyk400XkkLBJFA6nOym0NDq6sQ/jhdcYraKqNSRCII2nsBwZHk+/4zgUvuk52iuutgysY/rQ==",
"cpu": [
"arm64"
],
@@ -3422,9 +3425,9 @@
},
"node_modules/@openai/codex-win32-x64": {
"name": "@openai/codex",
"version": "0.101.0-win32-x64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.101.0-win32-x64.tgz",
"integrity": "sha512-H+7h9x0fYrJRUZZHCA62Dzb/CS5Scl1sUw1aamfmHJzzorX+uTFOgGsibzqFpHTd6nRM4q8//fCdSxe5wUpOQQ==",
"version": "0.125.0-win32-x64",
"resolved": "https://registry.npmjs.org/@openai/codex/-/codex-0.125.0-win32-x64.tgz",
"integrity": "sha512-ofpOK+OWH5QFuUZ9pTM0d/PcXUXiIP5z5DpRcE9MlucJoyOl4Zy4Nu3NcuHF4YzCkZMQb6x3j0tjDEPHKqNQzw==",
"cpu": [
"x64"
],
@@ -3461,6 +3464,447 @@
"node": ">=14"
}
},
"node_modules/@radix-ui/primitive": {
"version": "1.1.3",
"resolved": "https://registry.npmjs.org/@radix-ui/primitive/-/primitive-1.1.3.tgz",
"integrity": "sha512-JTF99U/6XIjCBo0wqkU5sK10glYe27MRRsfwoiq5zzOEZLHU3A3KCMa5X/azekYRCJ0HlwI0crAXS/5dEHTzDg==",
"license": "MIT"
},
"node_modules/@radix-ui/react-compose-refs": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/@radix-ui/react-compose-refs/-/react-compose-refs-1.1.2.tgz",
"integrity": "sha512-z4eqJvfiNnFMHIIvXP3CY57y2WJs5g2v3X0zm9mEJkrkNv4rDxu+sg9Jh8EkXyeqBkB7SOcboo9dMVqhyrACIg==",
"license": "MIT",
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-context": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/@radix-ui/react-context/-/react-context-1.1.2.tgz",
"integrity": "sha512-jCi/QKUM2r1Ju5a3J64TH2A5SpKAgh0LpknyqdQ4m6DCV0xJ2HG1xARRwNGPQfi1SLdLWZ1OJz6F4OMBBNiGJA==",
"license": "MIT",
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-dialog": {
"version": "1.1.15",
"resolved": "https://registry.npmjs.org/@radix-ui/react-dialog/-/react-dialog-1.1.15.tgz",
"integrity": "sha512-TCglVRtzlffRNxRMEyR36DGBLJpeusFcgMVD9PZEzAKnUs1lKCgX5u9BmC2Yg+LL9MgZDugFFs1Vl+Jp4t/PGw==",
"license": "MIT",
"dependencies": {
"@radix-ui/primitive": "1.1.3",
"@radix-ui/react-compose-refs": "1.1.2",
"@radix-ui/react-context": "1.1.2",
"@radix-ui/react-dismissable-layer": "1.1.11",
"@radix-ui/react-focus-guards": "1.1.3",
"@radix-ui/react-focus-scope": "1.1.7",
"@radix-ui/react-id": "1.1.1",
"@radix-ui/react-portal": "1.1.9",
"@radix-ui/react-presence": "1.1.5",
"@radix-ui/react-primitive": "2.1.3",
"@radix-ui/react-slot": "1.2.3",
"@radix-ui/react-use-controllable-state": "1.2.2",
"aria-hidden": "^1.2.4",
"react-remove-scroll": "^2.6.3"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-dialog/node_modules/@radix-ui/react-primitive": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/@radix-ui/react-primitive/-/react-primitive-2.1.3.tgz",
"integrity": "sha512-m9gTwRkhy2lvCPe6QJp4d3G1TYEUHn/FzJUtq9MjH46an1wJU+GdoGC5VLof8RX8Ft/DlpshApkhswDLZzHIcQ==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-slot": "1.2.3"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-dismissable-layer": {
"version": "1.1.11",
"resolved": "https://registry.npmjs.org/@radix-ui/react-dismissable-layer/-/react-dismissable-layer-1.1.11.tgz",
"integrity": "sha512-Nqcp+t5cTB8BinFkZgXiMJniQH0PsUt2k51FUhbdfeKvc4ACcG2uQniY/8+h1Yv6Kza4Q7lD7PQV0z0oicE0Mg==",
"license": "MIT",
"dependencies": {
"@radix-ui/primitive": "1.1.3",
"@radix-ui/react-compose-refs": "1.1.2",
"@radix-ui/react-primitive": "2.1.3",
"@radix-ui/react-use-callback-ref": "1.1.1",
"@radix-ui/react-use-escape-keydown": "1.1.1"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-dismissable-layer/node_modules/@radix-ui/react-primitive": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/@radix-ui/react-primitive/-/react-primitive-2.1.3.tgz",
"integrity": "sha512-m9gTwRkhy2lvCPe6QJp4d3G1TYEUHn/FzJUtq9MjH46an1wJU+GdoGC5VLof8RX8Ft/DlpshApkhswDLZzHIcQ==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-slot": "1.2.3"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-focus-guards": {
"version": "1.1.3",
"resolved": "https://registry.npmjs.org/@radix-ui/react-focus-guards/-/react-focus-guards-1.1.3.tgz",
"integrity": "sha512-0rFg/Rj2Q62NCm62jZw0QX7a3sz6QCQU0LpZdNrJX8byRGaGVTqbrW9jAoIAHyMQqsNpeZ81YgSizOt5WXq0Pw==",
"license": "MIT",
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-focus-scope": {
"version": "1.1.7",
"resolved": "https://registry.npmjs.org/@radix-ui/react-focus-scope/-/react-focus-scope-1.1.7.tgz",
"integrity": "sha512-t2ODlkXBQyn7jkl6TNaw/MtVEVvIGelJDCG41Okq/KwUsJBwQ4XVZsHAVUkK4mBv3ewiAS3PGuUWuY2BoK4ZUw==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-compose-refs": "1.1.2",
"@radix-ui/react-primitive": "2.1.3",
"@radix-ui/react-use-callback-ref": "1.1.1"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-focus-scope/node_modules/@radix-ui/react-primitive": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/@radix-ui/react-primitive/-/react-primitive-2.1.3.tgz",
"integrity": "sha512-m9gTwRkhy2lvCPe6QJp4d3G1TYEUHn/FzJUtq9MjH46an1wJU+GdoGC5VLof8RX8Ft/DlpshApkhswDLZzHIcQ==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-slot": "1.2.3"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-id": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/@radix-ui/react-id/-/react-id-1.1.1.tgz",
"integrity": "sha512-kGkGegYIdQsOb4XjsfM97rXsiHaBwco+hFI66oO4s9LU+PLAC5oJ7khdOVFxkhsmlbpUqDAvXw11CluXP+jkHg==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-use-layout-effect": "1.1.1"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-portal": {
"version": "1.1.9",
"resolved": "https://registry.npmjs.org/@radix-ui/react-portal/-/react-portal-1.1.9.tgz",
"integrity": "sha512-bpIxvq03if6UNwXZ+HTK71JLh4APvnXntDc6XOX8UVq4XQOVl7lwok0AvIl+b8zgCw3fSaVTZMpAPPagXbKmHQ==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-primitive": "2.1.3",
"@radix-ui/react-use-layout-effect": "1.1.1"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-portal/node_modules/@radix-ui/react-primitive": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/@radix-ui/react-primitive/-/react-primitive-2.1.3.tgz",
"integrity": "sha512-m9gTwRkhy2lvCPe6QJp4d3G1TYEUHn/FzJUtq9MjH46an1wJU+GdoGC5VLof8RX8Ft/DlpshApkhswDLZzHIcQ==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-slot": "1.2.3"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-presence": {
"version": "1.1.5",
"resolved": "https://registry.npmjs.org/@radix-ui/react-presence/-/react-presence-1.1.5.tgz",
"integrity": "sha512-/jfEwNDdQVBCNvjkGit4h6pMOzq8bHkopq458dPt2lMjx+eBQUohZNG9A7DtO/O5ukSbxuaNGXMjHicgwy6rQQ==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-compose-refs": "1.1.2",
"@radix-ui/react-use-layout-effect": "1.1.1"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-primitive": {
"version": "2.1.4",
"resolved": "https://registry.npmjs.org/@radix-ui/react-primitive/-/react-primitive-2.1.4.tgz",
"integrity": "sha512-9hQc4+GNVtJAIEPEqlYqW5RiYdrr8ea5XQ0ZOnD6fgru+83kqT15mq2OCcbe8KnjRZl5vF3ks69AKz3kh1jrhg==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-slot": "1.2.4"
},
"peerDependencies": {
"@types/react": "*",
"@types/react-dom": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc",
"react-dom": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
},
"@types/react-dom": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-primitive/node_modules/@radix-ui/react-slot": {
"version": "1.2.4",
"resolved": "https://registry.npmjs.org/@radix-ui/react-slot/-/react-slot-1.2.4.tgz",
"integrity": "sha512-Jl+bCv8HxKnlTLVrcDE8zTMJ09R9/ukw4qBs/oZClOfoQk/cOTbDn+NceXfV7j09YPVQUryJPHurafcSg6EVKA==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-compose-refs": "1.1.2"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-slot": {
"version": "1.2.3",
"resolved": "https://registry.npmjs.org/@radix-ui/react-slot/-/react-slot-1.2.3.tgz",
"integrity": "sha512-aeNmHnBxbi2St0au6VBVC7JXFlhLlOnvIIlePNniyUNAClzmtAUEY8/pBiK3iHjufOlwA+c20/8jngo7xcrg8A==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-compose-refs": "1.1.2"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-use-callback-ref": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/@radix-ui/react-use-callback-ref/-/react-use-callback-ref-1.1.1.tgz",
"integrity": "sha512-FkBMwD+qbGQeMu1cOHnuGB6x4yzPjho8ap5WtbEJ26umhgqVXbhekKUQO+hZEL1vU92a3wHwdp0HAcqAUF5iDg==",
"license": "MIT",
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-use-controllable-state": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/@radix-ui/react-use-controllable-state/-/react-use-controllable-state-1.2.2.tgz",
"integrity": "sha512-BjasUjixPFdS+NKkypcyyN5Pmg83Olst0+c6vGov0diwTEo6mgdqVR6hxcEgFuh4QrAs7Rc+9KuGJ9TVCj0Zzg==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-use-effect-event": "0.0.2",
"@radix-ui/react-use-layout-effect": "1.1.1"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-use-effect-event": {
"version": "0.0.2",
"resolved": "https://registry.npmjs.org/@radix-ui/react-use-effect-event/-/react-use-effect-event-0.0.2.tgz",
"integrity": "sha512-Qp8WbZOBe+blgpuUT+lw2xheLP8q0oatc9UpmiemEICxGvFLYmHm9QowVZGHtJlGbS6A6yJ3iViad/2cVjnOiA==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-use-layout-effect": "1.1.1"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-use-escape-keydown": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/@radix-ui/react-use-escape-keydown/-/react-use-escape-keydown-1.1.1.tgz",
"integrity": "sha512-Il0+boE7w/XebUHyBjroE+DbByORGR9KKmITzbR7MyQ4akpORYP/ZmbhAr0DG7RmmBqoOnZdy2QlvajJ2QA59g==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-use-callback-ref": "1.1.1"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@radix-ui/react-use-layout-effect": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/@radix-ui/react-use-layout-effect/-/react-use-layout-effect-1.1.1.tgz",
"integrity": "sha512-RbJRS4UWQFkzHTTwVymMTUv8EqYhOp8dOOviLj2ugtTiXRaRQS7GLGxZTLL1jWhMeoSCf5zmcZkqTl9IiYfXcQ==",
"license": "MIT",
"peerDependencies": {
"@types/react": "*",
"react": "^16.8 || ^17.0 || ^18.0 || ^19.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/@release-it/conventional-changelog": {
"version": "10.0.5",
"resolved": "https://registry.npmjs.org/@release-it/conventional-changelog/-/conventional-changelog-10.0.5.tgz",
@@ -4109,7 +4553,7 @@
"version": "18.3.7",
"resolved": "https://registry.npmjs.org/@types/react-dom/-/react-dom-18.3.7.tgz",
"integrity": "sha512-MEe3UeoENYVFXzoXEWsvcpg6ZvlrFNlOQ7EOsvhI3CfAXwzPfO8Qwuxd40nepsYKqyyVQnTdEfv68q91yLcKrQ==",
"dev": true,
"devOptional": true,
"license": "MIT",
"peerDependencies": {
"@types/react": "^18.0.0"
@@ -4142,6 +4586,16 @@
"integrity": "sha512-ko/gIFJRv177XgZsZcBwnqJN5x/Gien8qNOn0D5bQU/zAzVf9Zt3BlcUiLqhV9y4ARk0GbT3tnUiPNgnTXzc/Q==",
"license": "MIT"
},
"node_modules/@types/ws": {
"version": "8.18.1",
"resolved": "https://registry.npmjs.org/@types/ws/-/ws-8.18.1.tgz",
"integrity": "sha512-ThVF6DCVhA8kUGy+aazFQ4kXQ7E1Ty7A3ypFOe0IcJV8O/M511G99AW24irKrW56Wt44yG9+ij8FaqoBGkuBXg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/node": "*"
}
},
"node_modules/@typescript-eslint/eslint-plugin": {
"version": "8.56.1",
"resolved": "https://registry.npmjs.org/@typescript-eslint/eslint-plugin/-/eslint-plugin-8.56.1.tgz",
@@ -4786,6 +5240,18 @@
"vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0"
}
},
"node_modules/@vscode/ripgrep": {
"version": "1.17.1",
"resolved": "https://registry.npmjs.org/@vscode/ripgrep/-/ripgrep-1.17.1.tgz",
"integrity": "sha512-xTs7DGyAO3IsJYOCTBP8LnTvPiYVKEuyv8s0xyJDBXfs8rhBfqnZPvb6xDT+RnwWzcXqW27xLS/aGrkjX7lNWw==",
"hasInstallScript": true,
"license": "MIT",
"dependencies": {
"https-proxy-agent": "^7.0.2",
"proxy-from-env": "^1.1.0",
"yauzl": "^2.9.2"
}
},
"node_modules/@xterm/addon-clipboard": {
"version": "0.1.0",
"resolved": "https://registry.npmjs.org/@xterm/addon-clipboard/-/addon-clipboard-0.1.0.tgz",
@@ -5060,6 +5526,18 @@
"sprintf-js": "~1.0.2"
}
},
"node_modules/aria-hidden": {
"version": "1.2.6",
"resolved": "https://registry.npmjs.org/aria-hidden/-/aria-hidden-1.2.6.tgz",
"integrity": "sha512-ik3ZgC9dY/lYVVM++OISsaYDeg1tb0VtP5uL3ouh1koGOaUMDPpbFIei4JkFimWUFPn90sbMNMXQAIVOlnYKJA==",
"license": "MIT",
"dependencies": {
"tslib": "^2.0.0"
},
"engines": {
"node": ">=10"
}
},
"node_modules/array-buffer-byte-length": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/array-buffer-byte-length/-/array-buffer-byte-length-1.0.2.tgz",
@@ -5618,6 +6096,15 @@
"ieee754": "^1.1.13"
}
},
"node_modules/buffer-crc32": {
"version": "0.2.13",
"resolved": "https://registry.npmjs.org/buffer-crc32/-/buffer-crc32-0.2.13.tgz",
"integrity": "sha512-VO9Ht/+p3SN7SKWqcrgEzjGbRSJYTx+Q1pTQC0wrWqHx0vpJraQ6GtHx8tvcg1rlK1byhU5gccxgOgj7B0TDkQ==",
"license": "MIT",
"engines": {
"node": "*"
}
},
"node_modules/buffer-equal-constant-time": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/buffer-equal-constant-time/-/buffer-equal-constant-time-1.0.1.tgz",
@@ -6153,6 +6640,22 @@
"node": ">=6"
}
},
"node_modules/cmdk": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/cmdk/-/cmdk-1.1.1.tgz",
"integrity": "sha512-Vsv7kFaXm+ptHDMZ7izaRsP70GgrW9NBNGswt9OZaVBLlE0SNpDq8eu/VGXyF9r7M0azK3Wy7OlYXsuyYLFzHg==",
"license": "MIT",
"dependencies": {
"@radix-ui/react-compose-refs": "^1.1.1",
"@radix-ui/react-dialog": "^1.1.6",
"@radix-ui/react-id": "^1.1.0",
"@radix-ui/react-primitive": "^2.0.2"
},
"peerDependencies": {
"react": "^18 || ^19 || ^19.0.0-rc",
"react-dom": "^18 || ^19 || ^19.0.0-rc"
}
},
"node_modules/codemirror": {
"version": "6.0.2",
"resolved": "https://registry.npmjs.org/codemirror/-/codemirror-6.0.2.tgz",
@@ -6925,6 +7428,12 @@
"node": ">=8"
}
},
"node_modules/detect-node-es": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/detect-node-es/-/detect-node-es-1.1.0.tgz",
"integrity": "sha512-ypdmJU/TbBby2Dxibuv7ZLW3Bs1QEmM7nHjEANfohJLvE0XVujisn1qPJcZxg+qDucsr+bP6fLD1rPS3AhJ7EQ==",
"license": "MIT"
},
"node_modules/devlop": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/devlop/-/devlop-1.1.0.tgz",
@@ -8277,6 +8786,15 @@
"walk-up-path": "^4.0.0"
}
},
"node_modules/fd-slicer": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/fd-slicer/-/fd-slicer-1.1.0.tgz",
"integrity": "sha512-cE1qsB/VwyQozZ+q1dGxR8LBYNZeofhEdUNGSMbQD3Gw2lAzX9Zb3uIU6Ebc/Fmyjo9AWWfnn0AUCHqtevs/8g==",
"license": "MIT",
"dependencies": {
"pend": "~1.2.0"
}
},
"node_modules/file-entry-cache": {
"version": "8.0.0",
"resolved": "https://registry.npmjs.org/file-entry-cache/-/file-entry-cache-8.0.0.tgz",
@@ -8632,6 +9150,15 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/get-nonce": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/get-nonce/-/get-nonce-1.0.1.tgz",
"integrity": "sha512-FJhYRoDaiatfEkUK8HKlicmu/3SGFD51q3itKDGoSTysQJBnfOcxU5GxnhE1E6soB76MbT0MBtnKJuXyAx+96Q==",
"license": "MIT",
"engines": {
"node": ">=6"
}
},
"node_modules/get-proto": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
@@ -13381,6 +13908,12 @@
"dev": true,
"license": "MIT"
},
"node_modules/pend": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/pend/-/pend-1.2.0.tgz",
"integrity": "sha512-F3asv42UuXchdzt+xXqfW1OGlVBe+mxa2mqI0pg5yAHZPvFmY3Y6drSf/GQ1A86WgWEN9Kzh/WrgKa6iGcHXLg==",
"license": "MIT"
},
"node_modules/perfect-debounce": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/perfect-debounce/-/perfect-debounce-2.0.0.tgz",
@@ -13774,7 +14307,6 @@
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
"integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
"dev": true,
"license": "MIT"
},
"node_modules/pump": {
@@ -14016,6 +14548,53 @@
"node": ">=0.10.0"
}
},
"node_modules/react-remove-scroll": {
"version": "2.7.2",
"resolved": "https://registry.npmjs.org/react-remove-scroll/-/react-remove-scroll-2.7.2.tgz",
"integrity": "sha512-Iqb9NjCCTt6Hf+vOdNIZGdTiH1QSqr27H/Ek9sv/a97gfueI/5h1s3yRi1nngzMUaOOToin5dI1dXKdXiF+u0Q==",
"license": "MIT",
"dependencies": {
"react-remove-scroll-bar": "^2.3.7",
"react-style-singleton": "^2.2.3",
"tslib": "^2.1.0",
"use-callback-ref": "^1.3.3",
"use-sidecar": "^1.1.3"
},
"engines": {
"node": ">=10"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/react-remove-scroll-bar": {
"version": "2.3.8",
"resolved": "https://registry.npmjs.org/react-remove-scroll-bar/-/react-remove-scroll-bar-2.3.8.tgz",
"integrity": "sha512-9r+yi9+mgU33AKcj6IbT9oRCO78WriSj6t/cF8DWBZJ9aOGPOTEDvdUDz1FwKim7QXWwmHqtdHnRJfhAxEG46Q==",
"license": "MIT",
"dependencies": {
"react-style-singleton": "^2.2.2",
"tslib": "^2.0.0"
},
"engines": {
"node": ">=10"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/react-router": {
"version": "6.30.1",
"resolved": "https://registry.npmjs.org/react-router/-/react-router-6.30.1.tgz",
@@ -14048,6 +14627,28 @@
"react-dom": ">=16.8"
}
},
"node_modules/react-style-singleton": {
"version": "2.2.3",
"resolved": "https://registry.npmjs.org/react-style-singleton/-/react-style-singleton-2.2.3.tgz",
"integrity": "sha512-b6jSvxvVnyptAiLjbkWLE/lOnR4lfTtDAl+eUC7RZy+QQWc6wRzIV2CE6xBuMmDxc2qIihtDCZD5NPOFl7fRBQ==",
"license": "MIT",
"dependencies": {
"get-nonce": "^1.0.0",
"tslib": "^2.0.0"
},
"engines": {
"node": ">=10"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/react-syntax-highlighter": {
"version": "15.6.6",
"resolved": "https://registry.npmjs.org/react-syntax-highlighter/-/react-syntax-highlighter-15.6.6.tgz",
@@ -17545,6 +18146,49 @@
"node": "^12.20.0 || ^14.13.1 || >=16.0.0"
}
},
"node_modules/use-callback-ref": {
"version": "1.3.3",
"resolved": "https://registry.npmjs.org/use-callback-ref/-/use-callback-ref-1.3.3.tgz",
"integrity": "sha512-jQL3lRnocaFtu3V00JToYz/4QkNWswxijDaCVNZRiRTO3HQDLsdu1ZtmIUvV4yPp+rvWm5j0y0TG/S61cuijTg==",
"license": "MIT",
"dependencies": {
"tslib": "^2.0.0"
},
"engines": {
"node": ">=10"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/use-sidecar": {
"version": "1.1.3",
"resolved": "https://registry.npmjs.org/use-sidecar/-/use-sidecar-1.1.3.tgz",
"integrity": "sha512-Fedw0aZvkhynoPYlA5WXrMCAMm+nSWdZt6lzJQ7Ok8S6Q+VsHmHpRWndVRJ8Be0ZbkfPc5LRYH+5XrzXcEeLRQ==",
"license": "MIT",
"dependencies": {
"detect-node-es": "^1.1.0",
"tslib": "^2.0.0"
},
"engines": {
"node": ">=10"
},
"peerDependencies": {
"@types/react": "*",
"react": "^16.8.0 || ^17.0.0 || ^18.0.0 || ^19.0.0 || ^19.0.0-rc"
},
"peerDependenciesMeta": {
"@types/react": {
"optional": true
}
}
},
"node_modules/use-sync-external-store": {
"version": "1.6.0",
"resolved": "https://registry.npmmirror.com/use-sync-external-store/-/use-sync-external-store-1.6.0.tgz",
@@ -18225,6 +18869,16 @@
"node": ">=8"
}
},
"node_modules/yauzl": {
"version": "2.10.0",
"resolved": "https://registry.npmjs.org/yauzl/-/yauzl-2.10.0.tgz",
"integrity": "sha512-p4a9I6X6nu6IhoGmBqAcbJy1mlC4j27vEPZX9F4L4/vZT3Lyq1VkFHw/V/PUcB9Buo+DG3iHkT0x3Qya58zc3g==",
"license": "MIT",
"dependencies": {
"buffer-crc32": "~0.2.3",
"fd-slicer": "~1.1.0"
}
},
"node_modules/yocto-queue": {
"version": "0.1.0",
"resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-0.1.0.tgz",

View File

@@ -1,6 +1,6 @@
{
"name": "@cloudcli-ai/cloudcli",
"version": "1.30.0",
"version": "1.31.5",
"description": "A web-based UI for Claude Code CLI",
"type": "module",
"main": "dist-server/server/index.js",
@@ -76,10 +76,11 @@
"@codemirror/theme-one-dark": "^6.1.2",
"@iarna/toml": "^2.2.5",
"@octokit/rest": "^22.0.0",
"@openai/codex-sdk": "^0.101.0",
"@openai/codex-sdk": "^0.125.0",
"@replit/codemirror-minimap": "^0.5.2",
"@tailwindcss/typography": "^0.5.16",
"@uiw/react-codemirror": "^4.23.13",
"@vscode/ripgrep": "^1.17.1",
"@xterm/addon-clipboard": "^0.1.0",
"@xterm/addon-fit": "^0.10.0",
"@xterm/addon-web-links": "^0.11.0",
@@ -90,6 +91,7 @@
"chokidar": "^4.0.3",
"class-variance-authority": "^0.7.1",
"clsx": "^2.1.1",
"cmdk": "^1.1.1",
"cors": "^2.8.5",
"cross-spawn": "^7.0.3",
"express": "^4.18.2",
@@ -132,6 +134,7 @@
"@types/node": "^22.19.7",
"@types/react": "^18.2.43",
"@types/react-dom": "^18.2.17",
"@types/ws": "^8.18.1",
"@vitejs/plugin-react": "^4.6.0",
"auto-changelog": "^2.5.0",
"autoprefixer": "^10.4.16",

View File

@@ -18,6 +18,7 @@ import { promises as fs } from 'fs';
import path from 'path';
import os from 'os';
import { CLAUDE_MODELS } from '../shared/modelConstants.js';
import { resolveClaudeCodeExecutablePath } from './shared/claude-cli-path.js';
import {
createNotificationEvent,
notifyRunFailed,
@@ -153,11 +154,9 @@ function mapCliOptionsToSDK(options = {}) {
// Since SDK 0.2.113, options.env replaces process.env instead of overlaying it.
sdkOptions.env = { ...process.env };
// Use CLAUDE_CLI_PATH if explicitly set, otherwise fall back to 'claude' on PATH.
// The SDK 0.2.113+ looks for a bundled native binary optional dep by default;
// this fallback ensures users who installed via the official installer still work
// even when npm prune --production has removed those optional deps.
sdkOptions.pathToClaudeCodeExecutable = process.env.CLAUDE_CLI_PATH || 'claude';
// Resolve the executable eagerly on Windows because the SDK uses raw child_process.spawn,
// which does not reliably follow npm's shell wrappers like cross-spawn does.
sdkOptions.pathToClaudeCodeExecutable = resolveClaudeCodeExecutablePath(process.env.CLAUDE_CLI_PATH);
// Map working directory
if (cwd) {
@@ -527,6 +526,12 @@ async function queryClaudeSDK(command, options = {}, ws) {
}]
};
// Caveat: in 'auto' and 'bypassPermissions' modes the SDK resolves approval
// at the permission-mode step and skips this callback, so interactive tools
// (AskUserQuestion, ExitPlanMode) won't reach the UI — the classifier/bypass
// auto-approves them and the model acts on a generated answer. Move these
// tools to a PreToolUse hook (runs before the mode check) if we need them
// to work in those modes.
sdkOptions.canUseTool = async (toolName, input, context) => {
const requiresInteraction = TOOLS_REQUIRING_INTERACTION.has(toolName);

View File

@@ -150,7 +150,6 @@ async function spawnCursor(command, options = {}, ws) {
try {
const response = JSON.parse(line);
console.log('Parsed JSON response:', response);
// Handle different message types
switch (response.type) {
@@ -159,7 +158,6 @@ async function spawnCursor(command, options = {}, ws) {
// Capture session ID
if (response.session_id && !capturedSessionId) {
capturedSessionId = response.session_id;
console.log('Captured session ID:', capturedSessionId);
// Update process key with captured session ID
if (processKey !== capturedSessionId) {
@@ -197,7 +195,6 @@ async function spawnCursor(command, options = {}, ws) {
case 'result': {
// Session complete — send stream end + lifecycle complete with result payload
console.log('Cursor session result:', response);
const resultText = typeof response.result === 'string' ? response.result : '';
ws.send(createNormalizedMessage({
kind: 'complete',
@@ -213,8 +210,6 @@ async function spawnCursor(command, options = {}, ws) {
// Unknown message types — ignore.
}
} catch (parseError) {
console.log('Non-JSON response:', line);
if (shouldSuppressForTrustRetry(line)) {
return;
}
@@ -228,7 +223,6 @@ async function spawnCursor(command, options = {}, ws) {
// Handle stdout (streaming JSON responses)
cursorProcess.stdout.on('data', (data) => {
const rawOutput = data.toString();
console.log('Cursor CLI stdout:', rawOutput);
// Stream chunks can split JSON objects across packets; keep trailing partial line.
stdoutLineBuffer += rawOutput;
@@ -254,8 +248,6 @@ async function spawnCursor(command, options = {}, ws) {
// Handle process completion
cursorProcess.on('close', async (code) => {
console.log(`Cursor CLI process exited with code ${code}`);
const finalSessionId = capturedSessionId || sessionId || processKey;
activeCursorProcesses.delete(finalSessionId);

View File

@@ -1,593 +0,0 @@
import Database from 'better-sqlite3';
import path from 'path';
import fs from 'fs';
import crypto from 'crypto';
import { findAppRoot, getModuleDir } from '../utils/runtime-paths.js';
import {
APP_CONFIG_TABLE_SQL,
USER_NOTIFICATION_PREFERENCES_TABLE_SQL,
VAPID_KEYS_TABLE_SQL,
PUSH_SUBSCRIPTIONS_TABLE_SQL,
SESSION_NAMES_TABLE_SQL,
SESSION_NAMES_LOOKUP_INDEX_SQL,
DATABASE_SCHEMA_SQL
} from './schema.js';
const __dirname = getModuleDir(import.meta.url);
// The compiled backend lives under dist-server/server/database, but the install root we log
// should still point at the project/app root. Resolving it here avoids build-layout drift.
const APP_ROOT = findAppRoot(__dirname);
// ANSI color codes for terminal output
const colors = {
reset: '\x1b[0m',
bright: '\x1b[1m',
cyan: '\x1b[36m',
dim: '\x1b[2m',
};
const c = {
info: (text) => `${colors.cyan}${text}${colors.reset}`,
bright: (text) => `${colors.bright}${text}${colors.reset}`,
dim: (text) => `${colors.dim}${text}${colors.reset}`,
};
// Use DATABASE_PATH environment variable if set, otherwise use default location
const DB_PATH = process.env.DATABASE_PATH || path.join(__dirname, 'auth.db');
// Ensure database directory exists if custom path is provided
if (process.env.DATABASE_PATH) {
const dbDir = path.dirname(DB_PATH);
try {
if (!fs.existsSync(dbDir)) {
fs.mkdirSync(dbDir, { recursive: true });
console.log(`Created database directory: ${dbDir}`);
}
} catch (error) {
console.error(`Failed to create database directory ${dbDir}:`, error.message);
throw error;
}
}
// As part of 1.19.2, auth.db moved to a new default location. The code below migrates an existing legacy database from the install directory to the new location.
const LEGACY_DB_PATH = path.join(__dirname, 'auth.db');
if (DB_PATH !== LEGACY_DB_PATH && !fs.existsSync(DB_PATH) && fs.existsSync(LEGACY_DB_PATH)) {
try {
fs.copyFileSync(LEGACY_DB_PATH, DB_PATH);
console.log(`[MIGRATION] Copied database from ${LEGACY_DB_PATH} to ${DB_PATH}`);
for (const suffix of ['-wal', '-shm']) {
if (fs.existsSync(LEGACY_DB_PATH + suffix)) {
fs.copyFileSync(LEGACY_DB_PATH + suffix, DB_PATH + suffix);
}
}
} catch (err) {
console.warn(`[MIGRATION] Could not copy legacy database: ${err.message}`);
}
}
// Create database connection
const db = new Database(DB_PATH);
// app_config must exist before any other module imports (auth.js reads the JWT secret at load time).
// runMigrations() also creates this table, but it runs too late for existing installations
// where auth.js is imported before initializeDatabase() is called.
db.exec(APP_CONFIG_TABLE_SQL);
// Show app installation path prominently
const appInstallPath = APP_ROOT;
console.log('');
console.log(c.dim('═'.repeat(60)));
console.log(`${c.info('[INFO]')} App Installation: ${c.bright(appInstallPath)}`);
console.log(`${c.info('[INFO]')} Database: ${c.dim(path.relative(appInstallPath, DB_PATH))}`);
if (process.env.DATABASE_PATH) {
console.log(` ${c.dim('(Using custom DATABASE_PATH from environment)')}`);
}
console.log(c.dim('═'.repeat(60)));
console.log('');
const runMigrations = () => {
try {
const tableInfo = db.prepare("PRAGMA table_info(users)").all();
const columnNames = tableInfo.map(col => col.name);
if (!columnNames.includes('git_name')) {
console.log('Running migration: Adding git_name column');
db.exec('ALTER TABLE users ADD COLUMN git_name TEXT');
}
if (!columnNames.includes('git_email')) {
console.log('Running migration: Adding git_email column');
db.exec('ALTER TABLE users ADD COLUMN git_email TEXT');
}
if (!columnNames.includes('has_completed_onboarding')) {
console.log('Running migration: Adding has_completed_onboarding column');
db.exec('ALTER TABLE users ADD COLUMN has_completed_onboarding BOOLEAN DEFAULT 0');
}
db.exec(USER_NOTIFICATION_PREFERENCES_TABLE_SQL);
db.exec(VAPID_KEYS_TABLE_SQL);
db.exec(PUSH_SUBSCRIPTIONS_TABLE_SQL);
db.exec(APP_CONFIG_TABLE_SQL);
db.exec(SESSION_NAMES_TABLE_SQL);
db.exec(SESSION_NAMES_LOOKUP_INDEX_SQL);
console.log('Database migrations completed successfully');
} catch (error) {
console.error('Error running migrations:', error.message);
throw error;
}
};
// Initialize database with schema
const initializeDatabase = async () => {
try {
db.exec(DATABASE_SCHEMA_SQL);
console.log('Database initialized successfully');
runMigrations();
} catch (error) {
console.error('Error initializing database:', error.message);
throw error;
}
};
// User database operations
const userDb = {
// Check if any users exist
hasUsers: () => {
try {
const row = db.prepare('SELECT COUNT(*) as count FROM users').get();
return row.count > 0;
} catch (err) {
throw err;
}
},
// Create a new user
createUser: (username, passwordHash) => {
try {
const stmt = db.prepare('INSERT INTO users (username, password_hash) VALUES (?, ?)');
const result = stmt.run(username, passwordHash);
return { id: result.lastInsertRowid, username };
} catch (err) {
throw err;
}
},
// Get user by username
getUserByUsername: (username) => {
try {
const row = db.prepare('SELECT * FROM users WHERE username = ? AND is_active = 1').get(username);
return row;
} catch (err) {
throw err;
}
},
// Update last login time (non-fatal — logged but not thrown)
updateLastLogin: (userId) => {
try {
db.prepare('UPDATE users SET last_login = CURRENT_TIMESTAMP WHERE id = ?').run(userId);
} catch (err) {
console.warn('Failed to update last login:', err.message);
}
},
// Get user by ID
getUserById: (userId) => {
try {
const row = db.prepare('SELECT id, username, created_at, last_login FROM users WHERE id = ? AND is_active = 1').get(userId);
return row;
} catch (err) {
throw err;
}
},
getFirstUser: () => {
try {
const row = db.prepare('SELECT id, username, created_at, last_login FROM users WHERE is_active = 1 LIMIT 1').get();
return row;
} catch (err) {
throw err;
}
},
updateGitConfig: (userId, gitName, gitEmail) => {
try {
const stmt = db.prepare('UPDATE users SET git_name = ?, git_email = ? WHERE id = ?');
stmt.run(gitName, gitEmail, userId);
} catch (err) {
throw err;
}
},
getGitConfig: (userId) => {
try {
const row = db.prepare('SELECT git_name, git_email FROM users WHERE id = ?').get(userId);
return row;
} catch (err) {
throw err;
}
},
completeOnboarding: (userId) => {
try {
const stmt = db.prepare('UPDATE users SET has_completed_onboarding = 1 WHERE id = ?');
stmt.run(userId);
} catch (err) {
throw err;
}
},
hasCompletedOnboarding: (userId) => {
try {
const row = db.prepare('SELECT has_completed_onboarding FROM users WHERE id = ?').get(userId);
return row?.has_completed_onboarding === 1;
} catch (err) {
throw err;
}
}
};
// API Keys database operations
const apiKeysDb = {
// Generate a new API key
generateApiKey: () => {
return 'ck_' + crypto.randomBytes(32).toString('hex');
},
// Create a new API key
createApiKey: (userId, keyName) => {
try {
const apiKey = apiKeysDb.generateApiKey();
const stmt = db.prepare('INSERT INTO api_keys (user_id, key_name, api_key) VALUES (?, ?, ?)');
const result = stmt.run(userId, keyName, apiKey);
return { id: result.lastInsertRowid, keyName, apiKey };
} catch (err) {
throw err;
}
},
// Get all API keys for a user
getApiKeys: (userId) => {
try {
const rows = db.prepare('SELECT id, key_name, api_key, created_at, last_used, is_active FROM api_keys WHERE user_id = ? ORDER BY created_at DESC').all(userId);
return rows;
} catch (err) {
throw err;
}
},
// Validate API key and get user
validateApiKey: (apiKey) => {
try {
const row = db.prepare(`
SELECT u.id, u.username, ak.id as api_key_id
FROM api_keys ak
JOIN users u ON ak.user_id = u.id
WHERE ak.api_key = ? AND ak.is_active = 1 AND u.is_active = 1
`).get(apiKey);
if (row) {
// Update last_used timestamp
db.prepare('UPDATE api_keys SET last_used = CURRENT_TIMESTAMP WHERE id = ?').run(row.api_key_id);
}
return row;
} catch (err) {
throw err;
}
},
// Delete an API key
deleteApiKey: (userId, apiKeyId) => {
try {
const stmt = db.prepare('DELETE FROM api_keys WHERE id = ? AND user_id = ?');
const result = stmt.run(apiKeyId, userId);
return result.changes > 0;
} catch (err) {
throw err;
}
},
// Toggle API key active status
toggleApiKey: (userId, apiKeyId, isActive) => {
try {
const stmt = db.prepare('UPDATE api_keys SET is_active = ? WHERE id = ? AND user_id = ?');
const result = stmt.run(isActive ? 1 : 0, apiKeyId, userId);
return result.changes > 0;
} catch (err) {
throw err;
}
}
};
// User credentials database operations (for GitHub tokens, GitLab tokens, etc.)
const credentialsDb = {
// Create a new credential
createCredential: (userId, credentialName, credentialType, credentialValue, description = null) => {
try {
const stmt = db.prepare('INSERT INTO user_credentials (user_id, credential_name, credential_type, credential_value, description) VALUES (?, ?, ?, ?, ?)');
const result = stmt.run(userId, credentialName, credentialType, credentialValue, description);
return { id: result.lastInsertRowid, credentialName, credentialType };
} catch (err) {
throw err;
}
},
// Get all credentials for a user, optionally filtered by type
getCredentials: (userId, credentialType = null) => {
try {
let query = 'SELECT id, credential_name, credential_type, description, created_at, is_active FROM user_credentials WHERE user_id = ?';
const params = [userId];
if (credentialType) {
query += ' AND credential_type = ?';
params.push(credentialType);
}
query += ' ORDER BY created_at DESC';
const rows = db.prepare(query).all(...params);
return rows;
} catch (err) {
throw err;
}
},
// Get active credential value for a user by type (returns most recent active)
getActiveCredential: (userId, credentialType) => {
try {
const row = db.prepare('SELECT credential_value FROM user_credentials WHERE user_id = ? AND credential_type = ? AND is_active = 1 ORDER BY created_at DESC LIMIT 1').get(userId, credentialType);
return row?.credential_value || null;
} catch (err) {
throw err;
}
},
// Delete a credential
deleteCredential: (userId, credentialId) => {
try {
const stmt = db.prepare('DELETE FROM user_credentials WHERE id = ? AND user_id = ?');
const result = stmt.run(credentialId, userId);
return result.changes > 0;
} catch (err) {
throw err;
}
},
// Toggle credential active status
toggleCredential: (userId, credentialId, isActive) => {
try {
const stmt = db.prepare('UPDATE user_credentials SET is_active = ? WHERE id = ? AND user_id = ?');
const result = stmt.run(isActive ? 1 : 0, credentialId, userId);
return result.changes > 0;
} catch (err) {
throw err;
}
}
};
const DEFAULT_NOTIFICATION_PREFERENCES = {
channels: {
inApp: false,
webPush: false
},
events: {
actionRequired: true,
stop: true,
error: true
}
};
const normalizeNotificationPreferences = (value) => {
const source = value && typeof value === 'object' ? value : {};
return {
channels: {
inApp: source.channels?.inApp === true,
webPush: source.channels?.webPush === true
},
events: {
actionRequired: source.events?.actionRequired !== false,
stop: source.events?.stop !== false,
error: source.events?.error !== false
}
};
};
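The normalizer above treats channels as opt-in (only an explicit `true` enables them) and events as opt-out (anything other than an explicit `false` keeps them on). A standalone copy, duplicated here purely for illustration, makes that asymmetry concrete:

```javascript
// Standalone copy of normalizeNotificationPreferences: channels opt-in,
// events opt-out, and non-object input falls back to all defaults.
const normalize = (value) => {
  const source = value && typeof value === 'object' ? value : {};
  return {
    channels: {
      inApp: source.channels?.inApp === true,
      webPush: source.channels?.webPush === true
    },
    events: {
      actionRequired: source.events?.actionRequired !== false,
      stop: source.events?.stop !== false,
      error: source.events?.error !== false
    }
  };
};

const prefs = normalize({ channels: { inApp: true }, events: { stop: false } });
// prefs.channels → { inApp: true, webPush: false }
// prefs.events   → { actionRequired: true, stop: false, error: true }
```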
const notificationPreferencesDb = {
getPreferences: (userId) => {
try {
const row = db.prepare('SELECT preferences_json FROM user_notification_preferences WHERE user_id = ?').get(userId);
if (!row) {
const defaults = normalizeNotificationPreferences(DEFAULT_NOTIFICATION_PREFERENCES);
db.prepare(
'INSERT INTO user_notification_preferences (user_id, preferences_json, updated_at) VALUES (?, ?, CURRENT_TIMESTAMP)'
).run(userId, JSON.stringify(defaults));
return defaults;
}
let parsed;
try {
parsed = JSON.parse(row.preferences_json);
} catch {
parsed = DEFAULT_NOTIFICATION_PREFERENCES;
}
return normalizeNotificationPreferences(parsed);
} catch (err) {
throw err;
}
},
updatePreferences: (userId, preferences) => {
try {
const normalized = normalizeNotificationPreferences(preferences);
db.prepare(
`INSERT INTO user_notification_preferences (user_id, preferences_json, updated_at)
VALUES (?, ?, CURRENT_TIMESTAMP)
ON CONFLICT(user_id) DO UPDATE SET
preferences_json = excluded.preferences_json,
updated_at = CURRENT_TIMESTAMP`
).run(userId, JSON.stringify(normalized));
return normalized;
} catch (err) {
throw err;
}
}
};
const pushSubscriptionsDb = {
saveSubscription: (userId, endpoint, keysP256dh, keysAuth) => {
try {
db.prepare(
`INSERT INTO push_subscriptions (user_id, endpoint, keys_p256dh, keys_auth)
VALUES (?, ?, ?, ?)
ON CONFLICT(endpoint) DO UPDATE SET
user_id = excluded.user_id,
keys_p256dh = excluded.keys_p256dh,
keys_auth = excluded.keys_auth`
).run(userId, endpoint, keysP256dh, keysAuth);
} catch (err) {
throw err;
}
},
getSubscriptions: (userId) => {
try {
return db.prepare('SELECT endpoint, keys_p256dh, keys_auth FROM push_subscriptions WHERE user_id = ?').all(userId);
} catch (err) {
throw err;
}
},
removeSubscription: (endpoint) => {
try {
db.prepare('DELETE FROM push_subscriptions WHERE endpoint = ?').run(endpoint);
} catch (err) {
throw err;
}
},
removeAllForUser: (userId) => {
try {
db.prepare('DELETE FROM push_subscriptions WHERE user_id = ?').run(userId);
} catch (err) {
throw err;
}
}
};
// Session custom names database operations
const sessionNamesDb = {
// Set (insert or update) a custom session name
setName: (sessionId, provider, customName) => {
db.prepare(`
INSERT INTO session_names (session_id, provider, custom_name)
VALUES (?, ?, ?)
ON CONFLICT(session_id, provider)
DO UPDATE SET custom_name = excluded.custom_name, updated_at = CURRENT_TIMESTAMP
`).run(sessionId, provider, customName);
},
// Get a single custom session name
getName: (sessionId, provider) => {
const row = db.prepare(
'SELECT custom_name FROM session_names WHERE session_id = ? AND provider = ?'
).get(sessionId, provider);
return row?.custom_name || null;
},
// Batch lookup — returns Map<sessionId, customName>
getNames: (sessionIds, provider) => {
if (!sessionIds.length) return new Map();
const placeholders = sessionIds.map(() => '?').join(',');
const rows = db.prepare(
`SELECT session_id, custom_name FROM session_names
WHERE session_id IN (${placeholders}) AND provider = ?`
).all(...sessionIds, provider);
return new Map(rows.map(r => [r.session_id, r.custom_name]));
},
// Delete a custom session name
deleteName: (sessionId, provider) => {
return db.prepare(
'DELETE FROM session_names WHERE session_id = ? AND provider = ?'
).run(sessionId, provider).changes > 0;
},
};
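The batched `getNames` lookup above builds one positional placeholder per session ID and binds the provider as the final parameter after the spread IDs. A minimal sketch of just that placeholder construction:

```javascript
// Sketch of the IN-clause construction used by getNames: one '?' per ID,
// with the provider bound as the trailing parameter.
const sessionIds = ['a1', 'b2', 'c3'];
const placeholders = sessionIds.map(() => '?').join(',');
const sql = `SELECT session_id, custom_name FROM session_names
  WHERE session_id IN (${placeholders}) AND provider = ?`;
// placeholders === '?,?,?'
// bind order: ...sessionIds, then provider
```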
// Apply custom session names from the database (overrides CLI-generated summaries)
function applyCustomSessionNames(sessions, provider) {
if (!sessions?.length) return;
try {
const ids = sessions.map(s => s.id);
const customNames = sessionNamesDb.getNames(ids, provider);
for (const session of sessions) {
const custom = customNames.get(session.id);
if (custom) session.summary = custom;
}
} catch (error) {
console.warn(`[DB] Failed to apply custom session names for ${provider}:`, error.message);
}
}
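The override step above can be shown in isolation; this sketch replaces the database lookup with an in-memory `Map` (the real code calls `sessionNamesDb.getNames`) and applies the same mutate-in-place loop:

```javascript
// Sketch of the custom-name override only: sessions with a matching entry
// get their summary replaced; others keep the CLI-generated summary.
const sessions = [
  { id: 'a1', summary: 'CLI-generated summary' },
  { id: 'b2', summary: 'Another summary' }
];
const customNames = new Map([['a1', 'My renamed session']]); // stand-in for getNames()
for (const session of sessions) {
  const custom = customNames.get(session.id);
  if (custom) session.summary = custom;
}
// sessions[0].summary === 'My renamed session'
// sessions[1].summary === 'Another summary' (unchanged)
```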
// App config database operations
const appConfigDb = {
get: (key) => {
try {
const row = db.prepare('SELECT value FROM app_config WHERE key = ?').get(key);
return row?.value || null;
} catch (err) {
return null;
}
},
set: (key, value) => {
db.prepare(
'INSERT INTO app_config (key, value) VALUES (?, ?) ON CONFLICT(key) DO UPDATE SET value = excluded.value'
).run(key, value);
},
getOrCreateJwtSecret: () => {
let secret = appConfigDb.get('jwt_secret');
if (!secret) {
secret = crypto.randomBytes(64).toString('hex');
appConfigDb.set('jwt_secret', secret);
}
return secret;
}
};
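`getOrCreateJwtSecret` above is a get-or-create pattern: generate once, persist, then always return the stored value. This sketch swaps the `app_config` table for an in-memory `Map` and the `crypto.randomBytes(64)` call for a counter-based stand-in, so only the pattern itself is shown:

```javascript
// In-memory stand-in for app_config; the real code persists the secret in
// SQLite and generates it with crypto.randomBytes(64).toString('hex').
let generated = 0;
const randomSecret = () => `secret-${++generated}`; // stand-in generator
const store = new Map();
const configDb = {
  get: (key) => store.get(key) ?? null,
  set: (key, value) => { store.set(key, value); },
  getOrCreateJwtSecret: () => {
    let secret = configDb.get('jwt_secret');
    if (!secret) {
      secret = randomSecret();
      configDb.set('jwt_secret', secret);
    }
    return secret;
  }
};

const first = configDb.getOrCreateJwtSecret();
const second = configDb.getOrCreateJwtSecret();
// first === second — the secret is generated once, then reused
```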
// Backward compatibility - keep old names pointing to new system
const githubTokensDb = {
createGithubToken: (userId, tokenName, githubToken, description = null) => {
return credentialsDb.createCredential(userId, tokenName, 'github_token', githubToken, description);
},
getGithubTokens: (userId) => {
return credentialsDb.getCredentials(userId, 'github_token');
},
getActiveGithubToken: (userId) => {
return credentialsDb.getActiveCredential(userId, 'github_token');
},
deleteGithubToken: (userId, tokenId) => {
return credentialsDb.deleteCredential(userId, tokenId);
},
toggleGithubToken: (userId, tokenId, isActive) => {
return credentialsDb.toggleCredential(userId, tokenId, isActive);
}
};
export {
db,
initializeDatabase,
userDb,
apiKeysDb,
credentialsDb,
notificationPreferencesDb,
pushSubscriptionsDb,
sessionNamesDb,
applyCustomSessionNames,
appConfigDb,
githubTokensDb // Backward compatibility
};

View File

@@ -1,102 +0,0 @@
export const APP_CONFIG_TABLE_SQL = `CREATE TABLE IF NOT EXISTS app_config (
key TEXT PRIMARY KEY,
value TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);`;
export const USER_NOTIFICATION_PREFERENCES_TABLE_SQL = `CREATE TABLE IF NOT EXISTS user_notification_preferences (
user_id INTEGER PRIMARY KEY,
preferences_json TEXT NOT NULL,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);`;
export const VAPID_KEYS_TABLE_SQL = `CREATE TABLE IF NOT EXISTS vapid_keys (
id INTEGER PRIMARY KEY AUTOINCREMENT,
public_key TEXT NOT NULL,
private_key TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);`;
export const PUSH_SUBSCRIPTIONS_TABLE_SQL = `CREATE TABLE IF NOT EXISTS push_subscriptions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
endpoint TEXT NOT NULL UNIQUE,
keys_p256dh TEXT NOT NULL,
keys_auth TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);`;
export const SESSION_NAMES_TABLE_SQL = `CREATE TABLE IF NOT EXISTS session_names (
id INTEGER PRIMARY KEY AUTOINCREMENT,
session_id TEXT NOT NULL,
provider TEXT NOT NULL DEFAULT 'claude',
custom_name TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
UNIQUE(session_id, provider)
);`;
export const SESSION_NAMES_LOOKUP_INDEX_SQL = `CREATE INDEX IF NOT EXISTS idx_session_names_lookup ON session_names(session_id, provider);`;
export const DATABASE_SCHEMA_SQL = `PRAGMA foreign_keys = ON;
CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
username TEXT UNIQUE NOT NULL,
password_hash TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
last_login DATETIME,
is_active BOOLEAN DEFAULT 1,
git_name TEXT,
git_email TEXT,
has_completed_onboarding BOOLEAN DEFAULT 0
);
CREATE INDEX IF NOT EXISTS idx_users_username ON users(username);
CREATE INDEX IF NOT EXISTS idx_users_active ON users(is_active);
CREATE TABLE IF NOT EXISTS api_keys (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
key_name TEXT NOT NULL,
api_key TEXT UNIQUE NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
last_used DATETIME,
is_active BOOLEAN DEFAULT 1,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_api_keys_key ON api_keys(api_key);
CREATE INDEX IF NOT EXISTS idx_api_keys_user_id ON api_keys(user_id);
CREATE INDEX IF NOT EXISTS idx_api_keys_active ON api_keys(is_active);
CREATE TABLE IF NOT EXISTS user_credentials (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
credential_name TEXT NOT NULL,
credential_type TEXT NOT NULL,
credential_value TEXT NOT NULL,
description TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
is_active BOOLEAN DEFAULT 1,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
CREATE INDEX IF NOT EXISTS idx_user_credentials_user_id ON user_credentials(user_id);
CREATE INDEX IF NOT EXISTS idx_user_credentials_type ON user_credentials(credential_type);
CREATE INDEX IF NOT EXISTS idx_user_credentials_active ON user_credentials(is_active);
${USER_NOTIFICATION_PREFERENCES_TABLE_SQL}
${VAPID_KEYS_TABLE_SQL}
${PUSH_SUBSCRIPTIONS_TABLE_SQL}
${SESSION_NAMES_TABLE_SQL}
${SESSION_NAMES_LOOKUP_INDEX_SQL}
${APP_CONFIG_TABLE_SQL}
`;

View File

@@ -1,19 +1,123 @@
import { spawn } from 'child_process';
import { promises as fs } from 'fs';
import os from 'os';
import path from 'path';
import crossSpawn from 'cross-spawn';
// Use cross-spawn on Windows for correct .cmd resolution (same pattern as cursor-cli.js)
const spawnFunction = process.platform === 'win32' ? crossSpawn : spawn;
import { promises as fs } from 'fs';
import path from 'path';
import os from 'os';
import sessionManager from './sessionManager.js';
import GeminiResponseHandler from './gemini-response-handler.js';
import { notifyRunFailed, notifyRunStopped } from './services/notification-orchestrator.js';
import { providerAuthService } from './modules/providers/services/provider-auth.service.js';
import { createNormalizedMessage } from './shared/utils.js';
// Use cross-spawn on Windows for correct .cmd resolution (same pattern as cursor-cli.js)
const spawnFunction = process.platform === 'win32' ? crossSpawn : spawn;
let activeGeminiProcesses = new Map(); // Track active processes by session ID
function mapGeminiExitCodeToMessage(exitCode) {
switch (exitCode) {
case 42:
return 'Gemini rejected the request input (exit code 42).';
case 44:
return 'Gemini sandbox error (exit code 44). Check local sandbox/container settings.';
case 52:
return 'Gemini configuration error (exit code 52). Check your Gemini settings files for invalid JSON/config.';
case 53:
return 'Gemini conversation turn limit reached (exit code 53). Start a new Gemini session.';
default:
return null;
}
}
const GEMINI_AUTH_ENV_KEYS = [
'GEMINI_API_KEY',
'GOOGLE_API_KEY',
'GOOGLE_CLOUD_PROJECT',
'GOOGLE_CLOUD_PROJECT_ID',
'GOOGLE_CLOUD_LOCATION',
'GOOGLE_APPLICATION_CREDENTIALS'
];
function parseEnvFileContent(content) {
const parsed = {};
for (const rawLine of content.split(/\r?\n/)) {
const line = rawLine.trim();
if (!line || line.startsWith('#')) {
continue;
}
const exportPrefix = 'export ';
const normalizedLine = line.startsWith(exportPrefix) ? line.slice(exportPrefix.length).trim() : line;
const separatorIndex = normalizedLine.indexOf('=');
if (separatorIndex <= 0) {
continue;
}
const key = normalizedLine.slice(0, separatorIndex).trim();
if (!key) {
continue;
}
let value = normalizedLine.slice(separatorIndex + 1).trim();
const hasDoubleQuotes = value.startsWith('"') && value.endsWith('"');
const hasSingleQuotes = value.startsWith('\'') && value.endsWith('\'');
if (hasDoubleQuotes || hasSingleQuotes) {
value = value.slice(1, -1);
} else {
// Support inline comments in unquoted values: KEY=value # comment
value = value.replace(/\s+#.*$/, '').trim();
}
parsed[key] = value;
}
return parsed;
}
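The `.env` parser above handles `export` prefixes, quoted values, and inline comments on unquoted values. A condensed standalone copy (duplicated here for illustration, not a new API) shows each case:

```javascript
// Condensed copy of parseEnvFileContent: skips blanks/comments, strips an
// optional 'export ' prefix, unquotes values, and trims inline comments
// from unquoted values.
function parseEnvFileContent(content) {
  const parsed = {};
  for (const rawLine of content.split(/\r?\n/)) {
    const line = rawLine.trim();
    if (!line || line.startsWith('#')) continue;
    const normalized = line.startsWith('export ') ? line.slice(7).trim() : line;
    const sep = normalized.indexOf('=');
    if (sep <= 0) continue;
    const key = normalized.slice(0, sep).trim();
    if (!key) continue;
    let value = normalized.slice(sep + 1).trim();
    const quoted = (value.startsWith('"') && value.endsWith('"')) ||
                   (value.startsWith('\'') && value.endsWith('\''));
    if (quoted) value = value.slice(1, -1);
    else value = value.replace(/\s+#.*$/, '').trim(); // inline comment
    parsed[key] = value;
  }
  return parsed;
}

const env = parseEnvFileContent([
  '# auth settings',
  'export GEMINI_API_KEY="abc123"',
  'GOOGLE_CLOUD_PROJECT=my-proj # inline comment',
  'not a key value pair'
].join('\n'));
// env → { GEMINI_API_KEY: 'abc123', GOOGLE_CLOUD_PROJECT: 'my-proj' }
```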
async function loadGeminiUserLevelEnv() {
const geminiCliHome = (process.env.GEMINI_CLI_HOME || '').trim() || os.homedir();
const envCandidates = [
path.join(geminiCliHome, '.gemini', '.env'),
path.join(geminiCliHome, '.env')
];
for (const envPath of envCandidates) {
try {
await fs.access(envPath);
const content = await fs.readFile(envPath, 'utf8');
return parseEnvFileContent(content);
} catch {
// Keep scanning for the next candidate.
}
}
return {};
}
async function buildGeminiProcessEnv() {
const processEnv = { ...process.env };
if (processEnv.GEMINI_API_KEY || processEnv.GOOGLE_API_KEY || processEnv.GOOGLE_APPLICATION_CREDENTIALS) {
return processEnv;
}
// Gemini CLI docs recommend ~/.gemini/.env for persistent headless auth settings.
// When the server process was launched without shell profile variables, we still
// want the spawned CLI process to inherit those user-level credentials.
const userEnv = await loadGeminiUserLevelEnv();
for (const key of GEMINI_AUTH_ENV_KEYS) {
if (!processEnv[key] && userEnv[key]) {
processEnv[key] = userEnv[key];
}
}
return processEnv;
}
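The merge loop above gives process-environment values precedence over values read from `~/.gemini/.env`. A simplified sketch of that precedence (omitting the early return the real function takes when a primary credential is already set):

```javascript
// Sketch of the auth-env merge: a key from the user-level .env file is only
// used when the process environment does not already define it.
const GEMINI_AUTH_ENV_KEYS = ['GEMINI_API_KEY', 'GOOGLE_CLOUD_PROJECT'];

function mergeAuthEnv(processEnv, userEnv) {
  const merged = { ...processEnv };
  for (const key of GEMINI_AUTH_ENV_KEYS) {
    if (!merged[key] && userEnv[key]) merged[key] = userEnv[key];
  }
  return merged;
}

const merged = mergeAuthEnv(
  { GEMINI_API_KEY: 'from-shell' },
  { GEMINI_API_KEY: 'from-dotenv', GOOGLE_CLOUD_PROJECT: 'proj-from-dotenv' }
);
// merged.GEMINI_API_KEY === 'from-shell' (process env wins)
// merged.GOOGLE_CLOUD_PROJECT === 'proj-from-dotenv' (filled from .env)
```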
async function spawnGemini(command, options = {}, ws) {
const { sessionId, projectPath, cwd, toolsSettings, permissionMode, images, sessionSummary } = options;
let capturedSessionId = sessionId; // Track session ID throughout the process
@@ -100,6 +204,11 @@ async function spawnGemini(command, options = {}, ws) {
args.push('--debug');
}
// This integration runs Gemini in headless mode and cannot answer trust prompts.
// Skip folder-trust interactivity so authenticated runs don't fail with
// FatalUntrustedWorkspaceError in previously unseen directories.
args.push('--skip-trust');
// Add MCP config flag only if MCP servers are configured
try {
const geminiConfigPath = path.join(os.homedir(), '.gemini.json');
@@ -154,9 +263,6 @@ async function spawnGemini(command, options = {}, ws) {
// Try to find gemini in PATH first, then fall back to environment variable
const geminiPath = process.env.GEMINI_PATH || 'gemini';
console.log('Spawning Gemini CLI:', geminiPath, args.join(' '));
console.log('Working directory:', workingDir);
let spawnCmd = geminiPath;
let spawnArgs = args;
@@ -168,11 +274,13 @@ async function spawnGemini(command, options = {}, ws) {
spawnArgs = ['-c', 'exec "$0" "$@"', geminiPath, ...args];
}
const spawnEnv = await buildGeminiProcessEnv();
return new Promise((resolve, reject) => {
const geminiProcess = spawnFunction(spawnCmd, spawnArgs, {
cwd: workingDir,
stdio: ['pipe', 'pipe', 'pipe'],
env: { ...process.env } // Inherit all environment variables
env: spawnEnv
});
let terminalNotificationSent = false;
let terminalFailureReason = null;
@@ -276,12 +384,43 @@ async function spawnGemini(command, options = {}, ws) {
}
},
onInit: (event) => {
if (capturedSessionId) {
const sess = sessionManager.getSession(capturedSessionId);
if (sess && !sess.cliSessionId) {
sess.cliSessionId = event.session_id;
sessionManager.saveSession(capturedSessionId);
const discoveredSessionId = event?.session_id;
if (!discoveredSessionId) {
return;
}
// New Gemini sessions announce their canonical ID asynchronously via the
// initial `init` stream event. Avoid synthetic IDs and only register
// the session once that real ID is known (same model used by Claude/Codex).
if (!capturedSessionId) {
capturedSessionId = discoveredSessionId;
sessionManager.createSession(capturedSessionId, cwd || process.cwd());
if (command) {
sessionManager.addMessage(capturedSessionId, 'user', command);
}
if (processKey !== capturedSessionId) {
activeGeminiProcesses.delete(processKey);
activeGeminiProcesses.set(capturedSessionId, geminiProcess);
}
geminiProcess.sessionId = capturedSessionId;
if (ws.setSessionId && typeof ws.setSessionId === 'function') {
ws.setSessionId(capturedSessionId);
}
if (!sessionId && !sessionCreatedSent) {
sessionCreatedSent = true;
ws.send(createNormalizedMessage({ kind: 'session_created', newSessionId: capturedSessionId, sessionId: capturedSessionId, provider: 'gemini' }));
}
}
const sess = sessionManager.getSession(capturedSessionId);
if (sess && !sess.cliSessionId) {
sess.cliSessionId = discoveredSessionId;
sessionManager.saveSession(capturedSessionId);
}
}
});
@@ -292,30 +431,6 @@ async function spawnGemini(command, options = {}, ws) {
const rawOutput = data.toString();
startTimeout(); // Re-arm the timeout
// For new sessions, create a session ID FIRST
if (!sessionId && !sessionCreatedSent && !capturedSessionId) {
capturedSessionId = `gemini_${Date.now()}`;
sessionCreatedSent = true;
// Create session in session manager
sessionManager.createSession(capturedSessionId, cwd || process.cwd());
// Save the user message now that we have a session ID
if (command) {
sessionManager.addMessage(capturedSessionId, 'user', command);
}
// Update process key with captured session ID
if (processKey !== capturedSessionId) {
activeGeminiProcesses.delete(processKey);
activeGeminiProcesses.set(capturedSessionId, geminiProcess);
}
ws.setSessionId && typeof ws.setSessionId === 'function' && ws.setSessionId(capturedSessionId);
ws.send(createNormalizedMessage({ kind: 'session_created', newSessionId: capturedSessionId, sessionId: capturedSessionId, provider: 'gemini' }));
}
if (responseHandler) {
responseHandler.processData(rawOutput);
} else if (rawOutput) {
@@ -381,12 +496,38 @@ async function spawnGemini(command, options = {}, ws) {
notifyTerminalState({ code });
resolve();
} else {
const socketSessionId = typeof ws.getSessionId === 'function' ? ws.getSessionId() : finalSessionId;
// code 127 = shell "command not found" - check installation
if (code === 127) {
const installed = await providerAuthService.isProviderInstalled('gemini');
if (!installed) {
terminalFailureReason = 'Gemini CLI is not installed. Please install it first: https://github.com/google-gemini/gemini-cli';
ws.send(createNormalizedMessage({ kind: 'error', content: terminalFailureReason, sessionId: socketSessionId, provider: 'gemini' }));
}
} else if (code === 41) {
// Gemini CLI documents exit code 41 as FatalAuthenticationError.
// Surface an actionable auth error instead of a generic exit-code message.
let authErrorSuffix = '';
try {
const authStatus = await providerAuthService.getProviderAuthStatus('gemini');
if (!authStatus?.authenticated && authStatus?.error) {
authErrorSuffix = ` Details: ${authStatus.error}`;
}
} catch {
// Keep base remediation text when auth status lookup fails.
}
terminalFailureReason =
'Gemini authentication failed (exit code 41). '
+ 'Run `gemini` in a terminal to choose an auth method, or configure a valid `GEMINI_API_KEY`.'
+ authErrorSuffix;
ws.send(createNormalizedMessage({ kind: 'error', content: terminalFailureReason, sessionId: socketSessionId, provider: 'gemini' }));
} else {
const mappedError = mapGeminiExitCodeToMessage(code);
if (mappedError) {
terminalFailureReason = mappedError;
ws.send(createNormalizedMessage({ kind: 'error', content: terminalFailureReason, sessionId: socketSessionId, provider: 'gemini' }));
}
}
@@ -394,7 +535,14 @@ async function spawnGemini(command, options = {}, ws) {
code,
error: code === null ? 'Gemini CLI process was terminated or timed out' : null
});
reject(
new Error(
terminalFailureReason
|| (code === null
? 'Gemini CLI process was terminated or timed out'
: `Gemini CLI exited with code ${code}`)
)
);
}
});
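The exit-code 41 branch above assembles its error from a fixed remediation string plus an optional details suffix taken from the provider auth status. The assembly can be isolated as a small pure function (a sketch mirroring the logic in the handler, not the actual implementation):

```typescript
// Sketch of the exit-41 message assembly: base remediation text, plus a
// " Details: ..." suffix only when the auth-status lookup reports an error
// for an unauthenticated provider.
function buildGeminiAuthError(authStatus?: { authenticated?: boolean; error?: string }): string {
  let suffix = '';
  if (authStatus && !authStatus.authenticated && authStatus.error) {
    suffix = ` Details: ${authStatus.error}`;
  }
  return (
    'Gemini authentication failed (exit code 41). ' +
    'Run `gemini` in a terminal to choose an auth method, or configure a valid `GEMINI_API_KEY`.' +
    suffix
  );
}
```

Keeping the suffix optional means a failed auth-status lookup (the `catch {}` above) still yields the base remediation text.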

File diff suppressed because it is too large

View File

@@ -1,5 +1,5 @@
import jwt from 'jsonwebtoken';
import { userDb, appConfigDb } from '../modules/database/index.js';
import { IS_PLATFORM } from '../constants/config.js';
// Use env var if set, otherwise auto-generate a unique secret per installation

View File

@@ -0,0 +1,143 @@
/**
* Database connection management.
*
* Owns the single SQLite connection used across all repositories.
* Handles path resolution, directory creation, legacy database migration,
* and eager app_config bootstrap so the auth middleware can read the
* JWT secret before the full schema is applied.
*
* Consumers should never create their own Database instance — they use
* `getConnection()` to obtain the shared singleton.
*/
import Database from 'better-sqlite3';
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import { APP_CONFIG_TABLE_SCHEMA_SQL } from '@/modules/database/schema.js';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// ---------------------------------------------------------------------------
// Path resolution
// ---------------------------------------------------------------------------
/**
* Resolves the database file path from environment or falls back
* to the legacy location inside the server/database/ folder.
*
* Priority:
* 1. DATABASE_PATH environment variable (set by cli.js or load-env-vars.js)
* 2. Legacy path: server/database/auth.db
*/
function resolveDatabasePath(): string {
// process.env.DATABASE_PATH is set by load-env-vars.js to either the .env value or a default(~/.cloudcli/auth.db) in the user's home directory.
return process.env.DATABASE_PATH || resolveLegacyDatabasePath();
}
/**
* Resolves the legacy database path (always inside server/database/).
* Used for the one-time migration to the new external location.
*/
function resolveLegacyDatabasePath(): string {
const serverDir = path.resolve(__dirname, '..', '..', '..');
return path.join(serverDir, 'database', 'auth.db');
}
// ---------------------------------------------------------------------------
// Directory & migration helpers
// ---------------------------------------------------------------------------
function ensureDatabaseDirectory(dbPath: string): void {
const dir = path.dirname(dbPath);
if (!fs.existsSync(dir)) {
fs.mkdirSync(dir, { recursive: true });
console.log('Created database directory:', dir);
}
}
/**
* If the database was moved to an external location (e.g. ~/.cloudcli/)
* but the user still has a legacy auth.db inside the install directory,
* copy it to the new location as a one-time migration.
*/
function migrateLegacyDatabase(targetPath: string): void {
const legacyPath = resolveLegacyDatabasePath();
if (targetPath === legacyPath) return;
if (fs.existsSync(targetPath)) return;
if (!fs.existsSync(legacyPath)) return;
try {
fs.copyFileSync(legacyPath, targetPath);
console.log('Migrated legacy database', { from: legacyPath, to: targetPath });
// copy the write-ahead log and shared memory files (auth.db-wal, auth.db-shm) if they exist, to preserve any uncommitted transactions
for (const suffix of ['-wal', '-shm']) {
const src = legacyPath + suffix;
if (fs.existsSync(src)) {
fs.copyFileSync(src, targetPath + suffix);
}
}
} catch (err: any) {
console.error('Could not migrate legacy database', { error: err.message });
}
}
// ---------------------------------------------------------------------------
// Singleton connection
// ---------------------------------------------------------------------------
let instance: Database.Database | null = null;
/**
* Returns the shared database connection, creating it on first call.
*
* The first invocation:
* 1. Resolves the target database path
* 2. Ensures the parent directory exists
* 3. Migrates from the legacy install-directory path if needed
* 4. Opens the SQLite connection
* 5. Eagerly creates the app_config table (auth reads JWT secret at import time)
* 6. Logs the database location
*/
export function getConnection(): Database.Database {
if (instance) return instance;
const dbPath = resolveDatabasePath();
ensureDatabaseDirectory(dbPath);
migrateLegacyDatabase(dbPath);
instance = new Database(dbPath);
// app_config must exist immediately — the auth middleware reads
// the JWT secret at module-load time, before initializeDatabase() runs.
instance.exec(APP_CONFIG_TABLE_SCHEMA_SQL);
return instance;
}
/**
* Returns the resolved database file path without opening a connection.
* Useful for diagnostics and CLI status commands.
*/
export function getDatabasePath(): string {
return resolveDatabasePath();
}
/**
* Closes the database connection and clears the singleton.
* Primarily used for graceful shutdown or testing.
*/
export function closeConnection(): void {
if (instance) {
instance.close();
instance = null;
console.log('Database connection closed');
}
}
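`getConnection()` and `closeConnection()` above implement a lazy singleton over the better-sqlite3 connection: every caller shares one instance, and closing clears it so the next call reopens. A dependency-free stand-in illustrating that contract (`FakeDb` is a hypothetical placeholder for `Database.Database`):

```typescript
// Minimal stand-in for the lazy-singleton connection contract above.
// FakeDb is a placeholder for better-sqlite3's Database.Database type.
type FakeDb = { closed: boolean; close(): void };

let instance: FakeDb | null = null;

function getConnection(): FakeDb {
  if (instance) return instance; // all callers share one connection
  instance = { closed: false, close() { this.closed = true; } };
  return instance;
}

function closeConnection(): void {
  if (instance) {
    instance.close();
    instance = null; // next getConnection() creates a fresh instance
  }
}
```

This is why consumers must never construct their own `Database`: only the shared instance gets the eager `app_config` bootstrap.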

View File

@@ -0,0 +1,12 @@
export { initializeDatabase } from '@/modules/database/init-db.js';
export { apiKeysDb } from '@/modules/database/repositories/api-keys.js';
export { appConfigDb } from '@/modules/database/repositories/app-config.js';
export { credentialsDb } from '@/modules/database/repositories/credentials.js';
export { githubTokensDb } from '@/modules/database/repositories/github-tokens.js';
export { notificationPreferencesDb } from '@/modules/database/repositories/notification-preferences.js';
export { projectsDb } from '@/modules/database/repositories/projects.db.js';
export { pushSubscriptionsDb } from '@/modules/database/repositories/push-subscriptions.js';
export { scanStateDb } from '@/modules/database/repositories/scan-state.db.js';
export { sessionsDb } from '@/modules/database/repositories/sessions.db.js';
export { userDb } from '@/modules/database/repositories/users.js';
export { vapidKeysDb } from '@/modules/database/repositories/vapid-keys.js';

View File

@@ -0,0 +1,17 @@
import { getConnection } from "@/modules/database/connection.js";
import { runMigrations } from "@/modules/database/migrations.js";
import { INIT_SCHEMA_SQL } from "@/modules/database/schema.js";
// Initialize database with schema
export const initializeDatabase = async () => {
try {
const db = getConnection();
db.exec(INIT_SCHEMA_SQL);
console.log('Database schema applied');
runMigrations(db);
} catch (err) {
const message = err instanceof Error ? err.message : String(err);
    console.error('Database initialization failed', { error: message });
throw err;
}
};

View File

@@ -0,0 +1,455 @@
import type { Database } from 'better-sqlite3';
import {
APP_CONFIG_TABLE_SCHEMA_SQL,
LAST_SCANNED_AT_SQL,
PROJECTS_TABLE_SCHEMA_SQL,
PUSH_SUBSCRIPTIONS_TABLE_SCHEMA_SQL,
SESSIONS_TABLE_SCHEMA_SQL,
USER_NOTIFICATION_PREFERENCES_TABLE_SCHEMA_SQL,
VAPID_KEYS_TABLE_SCHEMA_SQL,
} from '@/modules/database/schema.js';
const SQLITE_UUID_SQL = `
lower(hex(randomblob(4))) || '-' ||
lower(hex(randomblob(2))) || '-' ||
lower(hex(randomblob(2))) || '-' ||
lower(hex(randomblob(2))) || '-' ||
lower(hex(randomblob(6)))
`;
type TableInfoRow = {
name: string;
pk: number;
};
const addColumnToTableIfNotExists = (
db: Database,
tableName: string,
columnNames: string[],
columnName: string,
columnType: string
) => {
if (!columnNames.includes(columnName)) {
console.log(`Running migration: Adding ${columnName} column to ${tableName} table`);
db.exec(`ALTER TABLE ${tableName} ADD COLUMN ${columnName} ${columnType}`);
}
};
const tableExists = (db: Database, tableName: string): boolean =>
Boolean(
db
.prepare("SELECT name FROM sqlite_master WHERE type = 'table' AND name = ?")
.get(tableName)
);
const getTableInfo = (db: Database, tableName: string): TableInfoRow[] =>
db.prepare(`PRAGMA table_info(${tableName})`).all() as TableInfoRow[];
const migrateLegacySessionNames = (db: Database): void => {
const hasLegacySessionNamesTable = tableExists(db, 'session_names');
const hasSessionsTable = tableExists(db, 'sessions');
if (!hasLegacySessionNamesTable) {
return;
}
if (hasSessionsTable) {
console.log('Running migration: Merging session_names into sessions');
db.exec(`
INSERT INTO sessions (session_id, provider, custom_name, created_at, updated_at)
SELECT
session_id,
COALESCE(provider, 'claude'),
custom_name,
COALESCE(created_at, CURRENT_TIMESTAMP),
COALESCE(updated_at, CURRENT_TIMESTAMP)
FROM session_names
WHERE true
ON CONFLICT(session_id) DO UPDATE SET
provider = excluded.provider,
custom_name = COALESCE(excluded.custom_name, sessions.custom_name),
created_at = COALESCE(sessions.created_at, excluded.created_at),
updated_at = COALESCE(excluded.updated_at, sessions.updated_at)
`);
db.exec('DROP TABLE session_names');
return;
}
console.log('Running migration: Renaming session_names table to sessions');
db.exec('ALTER TABLE session_names RENAME TO sessions');
};
const migrateLegacyWorkspaceTableIntoProjects = (db: Database): void => {
db.exec(PROJECTS_TABLE_SCHEMA_SQL);
if (!tableExists(db, 'workspace_original_paths')) {
return;
}
console.log('Running migration: Migrating workspace_original_paths data into projects');
db.exec(`
INSERT INTO projects (project_id, project_path, custom_project_name, isStarred, isArchived)
SELECT
CASE
WHEN workspace_id IS NULL OR trim(workspace_id) = ''
THEN ${SQLITE_UUID_SQL}
ELSE workspace_id
END,
workspace_path,
custom_workspace_name,
COALESCE(isStarred, 0),
0
FROM workspace_original_paths
WHERE workspace_path IS NOT NULL AND trim(workspace_path) <> ''
ON CONFLICT(project_path) DO UPDATE SET
custom_project_name = COALESCE(projects.custom_project_name, excluded.custom_project_name),
isStarred = COALESCE(projects.isStarred, excluded.isStarred)
`);
};
const rebuildProjectsTableWithPrimaryKeySchema = (db: Database): void => {
const hasProjectsTable = tableExists(db, 'projects');
if (!hasProjectsTable) {
db.exec(PROJECTS_TABLE_SCHEMA_SQL);
return;
}
const projectsTableInfo = getTableInfo(db, 'projects');
const columnNames = projectsTableInfo.map((column) => column.name);
const hasProjectIdPrimaryKey = projectsTableInfo.some(
(column) => column.name === 'project_id' && column.pk === 1,
);
if (hasProjectIdPrimaryKey) {
addColumnToTableIfNotExists(db, 'projects', columnNames, 'custom_project_name', 'TEXT DEFAULT NULL');
addColumnToTableIfNotExists(db, 'projects', columnNames, 'isStarred', 'BOOLEAN DEFAULT 0');
addColumnToTableIfNotExists(db, 'projects', columnNames, 'isArchived', 'BOOLEAN DEFAULT 0');
db.exec(`
UPDATE projects
SET project_id = ${SQLITE_UUID_SQL}
WHERE project_id IS NULL OR trim(project_id) = ''
`);
return;
}
console.log('Running migration: Rebuilding projects table to enforce project_id primary key');
const projectPathExpression = columnNames.includes('project_path')
? 'project_path'
: columnNames.includes('workspace_path')
? 'workspace_path'
: 'NULL';
const customProjectNameExpression = columnNames.includes('custom_project_name')
? 'custom_project_name'
: columnNames.includes('custom_workspace_name')
? 'custom_workspace_name'
: 'NULL';
const isStarredExpression = columnNames.includes('isStarred') ? 'COALESCE(isStarred, 0)' : '0';
const isArchivedExpression = columnNames.includes('isArchived') ? 'COALESCE(isArchived, 0)' : '0';
const projectIdExpression = columnNames.includes('project_id')
? `CASE
WHEN project_id IS NULL OR trim(project_id) = ''
THEN ${SQLITE_UUID_SQL}
ELSE project_id
END`
: SQLITE_UUID_SQL;
db.exec('PRAGMA foreign_keys = OFF');
try {
db.exec('BEGIN TRANSACTION');
db.exec('DROP TABLE IF EXISTS projects__new');
db.exec(`
CREATE TABLE projects__new (
project_id TEXT PRIMARY KEY NOT NULL,
project_path TEXT NOT NULL UNIQUE,
custom_project_name TEXT DEFAULT NULL,
isStarred BOOLEAN DEFAULT 0,
isArchived BOOLEAN DEFAULT 0
)
`);
db.exec(`
WITH source_rows AS (
SELECT
${projectPathExpression} AS project_path,
${customProjectNameExpression} AS custom_project_name,
${isStarredExpression} AS isStarred,
${isArchivedExpression} AS isArchived,
${projectIdExpression} AS candidate_project_id,
rowid AS source_rowid
FROM projects
WHERE ${projectPathExpression} IS NOT NULL AND trim(${projectPathExpression}) <> ''
),
deduped_paths AS (
SELECT
project_path,
custom_project_name,
isStarred,
isArchived,
candidate_project_id,
source_rowid,
ROW_NUMBER() OVER (PARTITION BY project_path ORDER BY source_rowid) AS project_path_rank
FROM source_rows
),
prepared_rows AS (
SELECT
CASE
WHEN ROW_NUMBER() OVER (PARTITION BY candidate_project_id ORDER BY source_rowid) = 1
THEN candidate_project_id
ELSE ${SQLITE_UUID_SQL}
END AS project_id,
project_path,
custom_project_name,
isStarred,
isArchived
FROM deduped_paths
WHERE project_path_rank = 1
)
INSERT INTO projects__new (
project_id,
project_path,
custom_project_name,
isStarred,
isArchived
)
SELECT
project_id,
project_path,
custom_project_name,
isStarred,
isArchived
FROM prepared_rows
`);
db.exec('DROP TABLE projects');
db.exec('ALTER TABLE projects__new RENAME TO projects');
db.exec('COMMIT');
} catch (migrationError) {
db.exec('ROLLBACK');
throw migrationError;
} finally {
db.exec('PRAGMA foreign_keys = ON');
}
};
const rebuildSessionsTableWithProjectSchema = (db: Database): void => {
const hasSessions = tableExists(db, 'sessions');
if (!hasSessions) {
db.exec(SESSIONS_TABLE_SCHEMA_SQL);
return;
}
const sessionsTableInfo = getTableInfo(db, 'sessions');
const columnNames = sessionsTableInfo.map((column) => column.name);
const primaryKeyColumns = sessionsTableInfo
.filter((column) => column.pk > 0)
.sort((a, b) => a.pk - b.pk)
.map((column) => column.name);
const shouldRebuild =
!columnNames.includes('project_path') ||
primaryKeyColumns.length !== 1 ||
primaryKeyColumns[0] !== 'session_id' ||
!columnNames.includes('provider');
if (!shouldRebuild) {
addColumnToTableIfNotExists(db, 'sessions', columnNames, 'jsonl_path', 'TEXT');
addColumnToTableIfNotExists(db, 'sessions', columnNames, 'isArchived', 'BOOLEAN DEFAULT 0');
addColumnToTableIfNotExists(db, 'sessions', columnNames, 'created_at', 'DATETIME');
addColumnToTableIfNotExists(db, 'sessions', columnNames, 'updated_at', 'DATETIME');
db.exec('UPDATE sessions SET isArchived = COALESCE(isArchived, 0)');
db.exec('UPDATE sessions SET created_at = COALESCE(created_at, CURRENT_TIMESTAMP)');
db.exec('UPDATE sessions SET updated_at = COALESCE(updated_at, CURRENT_TIMESTAMP)');
return;
}
console.log('Running migration: Rebuilding sessions table to project-based schema');
const projectPathExpression = columnNames.includes('project_path')
? 'project_path'
: columnNames.includes('workspace_path')
? 'workspace_path'
: 'NULL';
const providerExpression = columnNames.includes('provider')
? "COALESCE(provider, 'claude')"
: "'claude'";
const customNameExpression = columnNames.includes('custom_name')
? 'custom_name'
: 'NULL';
const jsonlPathExpression = columnNames.includes('jsonl_path')
? 'jsonl_path'
: 'NULL';
const isArchivedExpression = columnNames.includes('isArchived')
? 'COALESCE(isArchived, 0)'
: '0';
const createdAtExpression = columnNames.includes('created_at')
? 'COALESCE(created_at, CURRENT_TIMESTAMP)'
: 'CURRENT_TIMESTAMP';
const updatedAtExpression = columnNames.includes('updated_at')
? 'COALESCE(updated_at, CURRENT_TIMESTAMP)'
: 'CURRENT_TIMESTAMP';
db.exec('PRAGMA foreign_keys = OFF');
try {
db.exec('BEGIN TRANSACTION');
db.exec('DROP TABLE IF EXISTS sessions__new');
db.exec(`
CREATE TABLE sessions__new (
session_id TEXT NOT NULL,
provider TEXT NOT NULL DEFAULT 'claude',
custom_name TEXT,
project_path TEXT,
jsonl_path TEXT,
isArchived BOOLEAN DEFAULT 0,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (session_id),
FOREIGN KEY (project_path) REFERENCES projects(project_path)
ON DELETE SET NULL
ON UPDATE CASCADE
)
`);
db.exec(`
WITH source_rows AS (
SELECT
session_id,
${providerExpression} AS provider,
${customNameExpression} AS custom_name,
${projectPathExpression} AS project_path,
${jsonlPathExpression} AS jsonl_path,
${isArchivedExpression} AS isArchived,
${createdAtExpression} AS created_at,
${updatedAtExpression} AS updated_at,
rowid AS source_rowid
FROM sessions
WHERE session_id IS NOT NULL AND trim(session_id) <> ''
),
ranked_rows AS (
SELECT
session_id,
provider,
custom_name,
project_path,
jsonl_path,
isArchived,
created_at,
updated_at,
ROW_NUMBER() OVER (
PARTITION BY session_id
ORDER BY datetime(COALESCE(updated_at, created_at)) DESC, source_rowid DESC
) AS session_rank
FROM source_rows
)
INSERT INTO sessions__new (
session_id,
provider,
custom_name,
project_path,
jsonl_path,
isArchived,
created_at,
updated_at
)
SELECT
session_id,
provider,
custom_name,
project_path,
jsonl_path,
isArchived,
created_at,
updated_at
FROM ranked_rows
WHERE session_rank = 1
`);
db.exec('DROP TABLE sessions');
db.exec('ALTER TABLE sessions__new RENAME TO sessions');
db.exec('COMMIT');
} catch (migrationError) {
db.exec('ROLLBACK');
throw migrationError;
} finally {
db.exec('PRAGMA foreign_keys = ON');
}
};
const ensureProjectsForSessionPaths = (db: Database): void => {
if (!tableExists(db, 'sessions')) {
return;
}
db.exec(`
INSERT INTO projects (project_id, project_path, custom_project_name, isStarred, isArchived)
SELECT
${SQLITE_UUID_SQL},
project_path,
NULL,
0,
0
FROM sessions
WHERE project_path IS NOT NULL AND trim(project_path) <> ''
ON CONFLICT(project_path) DO NOTHING
`);
};
export const runMigrations = (db: Database) => {
try {
const usersTableInfo = db.prepare('PRAGMA table_info(users)').all() as { name: string }[];
const userColumnNames = usersTableInfo.map((column) => column.name);
addColumnToTableIfNotExists(db, 'users', userColumnNames, 'git_name', 'TEXT');
addColumnToTableIfNotExists(db, 'users', userColumnNames, 'git_email', 'TEXT');
addColumnToTableIfNotExists(
db,
'users',
userColumnNames,
'has_completed_onboarding',
'BOOLEAN DEFAULT 0'
);
db.exec(APP_CONFIG_TABLE_SCHEMA_SQL);
db.exec(USER_NOTIFICATION_PREFERENCES_TABLE_SCHEMA_SQL);
db.exec(VAPID_KEYS_TABLE_SCHEMA_SQL);
db.exec(PUSH_SUBSCRIPTIONS_TABLE_SCHEMA_SQL);
db.exec('CREATE INDEX IF NOT EXISTS idx_push_subscriptions_user_id ON push_subscriptions(user_id)');
db.exec(PROJECTS_TABLE_SCHEMA_SQL);
rebuildProjectsTableWithPrimaryKeySchema(db);
migrateLegacyWorkspaceTableIntoProjects(db);
rebuildSessionsTableWithProjectSchema(db);
migrateLegacySessionNames(db);
ensureProjectsForSessionPaths(db);
db.exec('CREATE INDEX IF NOT EXISTS idx_session_ids_lookup ON sessions(session_id)');
db.exec('CREATE INDEX IF NOT EXISTS idx_sessions_project_path ON sessions(project_path)');
db.exec('CREATE INDEX IF NOT EXISTS idx_sessions_is_archived ON sessions(isArchived)');
db.exec('CREATE INDEX IF NOT EXISTS idx_projects_is_starred ON projects(isStarred)');
db.exec('CREATE INDEX IF NOT EXISTS idx_projects_is_archived ON projects(isArchived)');
db.exec('DROP INDEX IF EXISTS idx_session_names_lookup');
db.exec('DROP INDEX IF EXISTS idx_sessions_workspace_path');
db.exec('DROP INDEX IF EXISTS idx_workspace_original_paths_is_starred');
db.exec('DROP INDEX IF EXISTS idx_workspace_original_paths_workspace_id');
if (tableExists(db, 'workspace_original_paths')) {
console.log('Running migration: Dropping legacy workspace_original_paths table');
db.exec('DROP TABLE workspace_original_paths');
}
db.exec(LAST_SCANNED_AT_SQL);
console.log('Database migrations completed successfully');
} catch (error: any) {
console.error('Error running migrations:', error.message);
throw error;
}
};
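The `SQLITE_UUID_SQL` expression used throughout these migrations concatenates hex-encoded random blobs into a UUID-shaped string (8-4-4-4-12, lowercase). A Node equivalent can be handy when asserting migration output in tests; note that, unlike `crypto.randomUUID()`, neither version sets the RFC 4122 version/variant bits:

```typescript
import { randomBytes } from 'node:crypto';

// Mirrors SQLITE_UUID_SQL: lower(hex(randomblob(n))) chunks joined by '-'.
function sqliteStyleUuid(): string {
  const hex = (n: number) => randomBytes(n).toString('hex');
  return [hex(4), hex(2), hex(2), hex(2), hex(6)].join('-');
}
```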

View File

@@ -0,0 +1,119 @@
/**
* API keys repository.
*
* Manages API keys used for external/programmatic access to the backend.
* Keys are prefixed with `ck_` and tied to a user via foreign key.
*/
import crypto from 'crypto';
import { getConnection } from '@/modules/database/connection.js';
type ApiKeyRow = {
id: number;
key_name: string;
api_key: string;
created_at: string;
last_used: string | null;
is_active: number;
};
type CreateApiKeyResult = {
id: number | bigint;
keyName: string;
apiKey: string;
};
type ValidatedApiKeyUser = {
id: number;
username: string;
api_key_id: number;
};
// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------
/** Generates a cryptographically random API key with the `ck_` prefix. */
function generateApiKey(): string {
return 'ck_' + crypto.randomBytes(32).toString('hex');
}
// ---------------------------------------------------------------------------
// Queries
// ---------------------------------------------------------------------------
export const apiKeysDb = {
generateApiKey,
/** Creates a new API key for the given user and returns it for one-time display. */
createApiKey(userId: number, keyName: string): CreateApiKeyResult {
const db = getConnection();
const apiKey = generateApiKey();
const result = db
.prepare(
'INSERT INTO api_keys (user_id, key_name, api_key) VALUES (?, ?, ?)'
)
.run(userId, keyName, apiKey);
return { id: result.lastInsertRowid, keyName, apiKey };
},
/** Lists all API keys for a user, most recent first. */
getApiKeys(userId: number): ApiKeyRow[] {
const db = getConnection();
return db
.prepare(
'SELECT id, key_name, api_key, created_at, last_used, is_active FROM api_keys WHERE user_id = ? ORDER BY created_at DESC'
)
.all(userId) as ApiKeyRow[];
},
/**
* Validates an API key and resolves the owning user.
* If the key is valid, its `last_used` timestamp is updated as a side effect.
* Returns undefined when the key is invalid or the user is inactive.
*/
validateApiKey(apiKey: string): ValidatedApiKeyUser | undefined {
const db = getConnection();
const row = db
.prepare(
`SELECT u.id, u.username, ak.id as api_key_id
FROM api_keys ak
JOIN users u ON ak.user_id = u.id
WHERE ak.api_key = ? AND ak.is_active = 1 AND u.is_active = 1`
)
.get(apiKey) as ValidatedApiKeyUser | undefined;
if (row) {
db.prepare(
'UPDATE api_keys SET last_used = CURRENT_TIMESTAMP WHERE id = ?'
).run(row.api_key_id);
}
return row;
},
/** Permanently removes an API key. Returns true if a row was deleted. */
deleteApiKey(userId: number, apiKeyId: number): boolean {
const db = getConnection();
const result = db
.prepare('DELETE FROM api_keys WHERE id = ? AND user_id = ?')
.run(apiKeyId, userId);
return result.changes > 0;
},
/** Enables or disables an API key without deleting it. */
toggleApiKey(
userId: number,
apiKeyId: number,
isActive: boolean
): boolean {
const db = getConnection();
const result = db
.prepare(
'UPDATE api_keys SET is_active = ? WHERE id = ? AND user_id = ?'
)
.run(isActive ? 1 : 0, apiKeyId, userId);
return result.changes > 0;
},
};
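`generateApiKey` above always yields a `ck_` prefix followed by 64 hex characters (32 random bytes). A standalone copy of that construction, paired with a hypothetical structural check that a boundary layer could run before touching the database:

```typescript
import { randomBytes } from 'node:crypto';

// Same construction as apiKeysDb.generateApiKey: 'ck_' + 32 random bytes as hex.
function generateApiKey(): string {
  return 'ck_' + randomBytes(32).toString('hex');
}

// Hypothetical cheap shape check (not in the repository above): rejects
// obviously malformed keys without a database round-trip.
function looksLikeApiKey(key: string): boolean {
  return /^ck_[0-9a-f]{64}$/.test(key);
}
```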

View File

@@ -0,0 +1,53 @@
/**
* App config repository.
*
* Key-value store for application-level configuration that persists
* across restarts (JWT secret, feature flags, etc.). Values are always
* stored as strings; callers handle parsing.
*/
import crypto from 'crypto';
import { getConnection } from '@/modules/database/connection.js';
// ---------------------------------------------------------------------------
// Queries
// ---------------------------------------------------------------------------
export const appConfigDb = {
/** Returns the stored value for a config key, or null if missing. */
get(key: string): string | null {
try {
const db = getConnection();
const row = db
.prepare('SELECT value FROM app_config WHERE key = ?')
.get(key) as { value: string } | undefined;
return row?.value ?? null;
} catch {
// Swallow errors so early-startup reads (e.g. JWT secret) do not crash.
return null;
}
},
/** Inserts or updates a config key (upsert). */
set(key: string, value: string): void {
const db = getConnection();
db.prepare(
'INSERT INTO app_config (key, value) VALUES (?, ?) ON CONFLICT(key) DO UPDATE SET value = excluded.value'
).run(key, value);
},
/**
* Returns the JWT signing secret, generating and persisting one
* if it does not already exist. This ensures the secret survives
* server restarts while being created automatically on first boot.
*/
getOrCreateJwtSecret(): string {
let secret = appConfigDb.get('jwt_secret');
if (!secret) {
secret = crypto.randomBytes(64).toString('hex');
appConfigDb.set('jwt_secret', secret);
}
return secret;
},
};
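`getOrCreateJwtSecret` above is generate-once, read-forever: the secret is minted on first call, persisted, and every later call returns the stored value, which is what lets JWTs survive server restarts. The same idea against an in-memory map (a stand-in for the `app_config` table):

```typescript
import { randomBytes } from 'node:crypto';

// In-memory stand-in for the app_config key-value table.
const store = new Map<string, string>();

function getOrCreateJwtSecret(): string {
  let secret = store.get('jwt_secret') ?? null;
  if (!secret) {
    secret = randomBytes(64).toString('hex'); // 64 bytes -> 128 hex chars
    store.set('jwt_secret', secret); // persist so later calls reuse it
  }
  return secret;
}
```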

View File

@@ -0,0 +1,106 @@
/**
* User credentials repository.
*
* Manages external service tokens (GitHub, GitLab, Bitbucket, etc.)
* stored per-user. Each credential has a type discriminator so multiple
* credential kinds can coexist in the same table.
*/
import { getConnection } from '@/modules/database/connection.js';
import type {
CreateCredentialResult,
CredentialPublicRow,
} from '@/shared/types.js';
// ---------------------------------------------------------------------------
// Queries
// ---------------------------------------------------------------------------
export const credentialsDb = {
/** Stores a new credential and returns a safe (no raw value) result. */
createCredential(
userId: number,
credentialName: string,
credentialType: string,
credentialValue: string,
description: string | null = null
): CreateCredentialResult {
const db = getConnection();
const result = db
.prepare(
'INSERT INTO user_credentials (user_id, credential_name, credential_type, credential_value, description) VALUES (?, ?, ?, ?, ?)'
)
.run(userId, credentialName, credentialType, credentialValue, description);
return {
id: result.lastInsertRowid,
credentialName,
credentialType,
};
},
/**
* Lists credentials for a user (excluding raw values).
* Optionally filters by credential type (e.g. 'github_token').
*/
getCredentials(
userId: number,
credentialType: string | null = null
): CredentialPublicRow[] {
const db = getConnection();
if (credentialType) {
return db
.prepare(
'SELECT id, credential_name, credential_type, description, created_at, is_active FROM user_credentials WHERE user_id = ? AND credential_type = ? ORDER BY created_at DESC'
)
.all(userId, credentialType) as CredentialPublicRow[];
}
return db
.prepare(
'SELECT id, credential_name, credential_type, description, created_at, is_active FROM user_credentials WHERE user_id = ? ORDER BY created_at DESC'
)
.all(userId) as CredentialPublicRow[];
},
/**
* Returns the raw credential value for the most recent active
* credential of the given type, or null if none exists.
*/
getActiveCredential(
userId: number,
credentialType: string
): string | null {
const db = getConnection();
const row = db
.prepare(
'SELECT credential_value FROM user_credentials WHERE user_id = ? AND credential_type = ? AND is_active = 1 ORDER BY created_at DESC LIMIT 1'
)
.get(userId, credentialType) as { credential_value: string } | undefined;
return row?.credential_value ?? null;
},
/** Permanently removes a credential. Returns true if a row was deleted. */
deleteCredential(userId: number, credentialId: number): boolean {
const db = getConnection();
const result = db
.prepare('DELETE FROM user_credentials WHERE id = ? AND user_id = ?')
.run(credentialId, userId);
return result.changes > 0;
},
/** Enables or disables a credential without deleting it. */
toggleCredential(
userId: number,
credentialId: number,
isActive: boolean
): boolean {
const db = getConnection();
const result = db
.prepare(
'UPDATE user_credentials SET is_active = ? WHERE id = ? AND user_id = ?'
)
.run(isActive ? 1 : 0, credentialId, userId);
return result.changes > 0;
},
};

View File

@@ -0,0 +1,100 @@
/**
* GitHub tokens repository.
*
* Backward-compatible helper layer over generic credentials storage.
* Tokens are stored in `user_credentials` with `credential_type = 'github_token'`.
*/
import { getConnection } from '@/modules/database/connection.js';
import { credentialsDb } from '@/modules/database/repositories/credentials.js';
import type {
CredentialPublicRow,
CreateCredentialResult,
} from '@/shared/types.js';
const GITHUB_TOKEN_TYPE = 'github_token';
type CredentialRow = {
id: number;
user_id: number;
credential_name: string;
credential_type: string;
credential_value: string;
description: string | null;
created_at: string;
is_active: number;
};
type GithubTokenLookup = CredentialRow & {
github_token: string;
};
export const githubTokensDb = {
/** Creates a GitHub token credential entry. */
createGithubToken(
userId: number,
tokenName: string,
githubToken: string,
description: string | null = null
): CreateCredentialResult {
return credentialsDb.createCredential(
userId,
tokenName,
GITHUB_TOKEN_TYPE,
githubToken,
description
);
},
/** Returns all GitHub tokens (safe shape: no credential value). */
getGithubTokens(userId: number): CredentialPublicRow[] {
return credentialsDb.getCredentials(userId, GITHUB_TOKEN_TYPE);
},
/** Returns the most recent active GitHub token value for a user. */
getActiveGithubToken(userId: number): string | null {
return credentialsDb.getActiveCredential(userId, GITHUB_TOKEN_TYPE);
},
/**
* Returns a specific active GitHub token row by id/user, including
* a `github_token` compatibility field.
*/
getGithubTokenById(userId: number, tokenId: number): GithubTokenLookup | null {
const db = getConnection();
const row = db
.prepare(
`SELECT *
FROM user_credentials
WHERE id = ? AND user_id = ? AND credential_type = ? AND is_active = 1`
)
.get(tokenId, userId, GITHUB_TOKEN_TYPE) as CredentialRow | undefined;
if (!row) return null;
return {
...row,
github_token: row.credential_value,
};
},
/** Updates active state for a GitHub token. */
updateGithubToken(
userId: number,
tokenId: number,
isActive: boolean
): boolean {
return credentialsDb.toggleCredential(userId, tokenId, isActive);
},
/** Deletes a GitHub token. */
deleteGithubToken(userId: number, tokenId: number): boolean {
return credentialsDb.deleteCredential(userId, tokenId);
},
// Legacy alias used by existing routes
toggleGithubToken(userId: number, tokenId: number, isActive: boolean): boolean {
return githubTokensDb.updateGithubToken(userId, tokenId, isActive);
},
};

View File

@@ -0,0 +1,103 @@
/**
* Notification preferences repository.
*
* Stores per-user notification channel/event preferences as JSON.
*/
import { getConnection } from '@/modules/database/connection.js';
type NotificationPreferences = {
channels: {
inApp: boolean;
webPush: boolean;
};
events: {
actionRequired: boolean;
stop: boolean;
error: boolean;
};
};
const DEFAULT_NOTIFICATION_PREFERENCES: NotificationPreferences = {
channels: {
inApp: false,
webPush: false,
},
events: {
actionRequired: true,
stop: true,
error: true,
},
};
function normalizeNotificationPreferences(value: unknown): NotificationPreferences {
const source = value && typeof value === 'object' ? (value as Record<string, any>) : {};
return {
channels: {
inApp: source.channels?.inApp === true,
webPush: source.channels?.webPush === true,
},
events: {
actionRequired: source.events?.actionRequired !== false,
stop: source.events?.stop !== false,
error: source.events?.error !== false,
},
};
}
export const notificationPreferencesDb = {
/** Returns the normalized preferences for a user, creating defaults on first read. */
getNotificationPreferences(userId: number): NotificationPreferences {
const db = getConnection();
const row = db
.prepare(
'SELECT preferences_json FROM user_notification_preferences WHERE user_id = ?'
)
.get(userId) as { preferences_json: string } | undefined;
if (!row) {
const defaults = normalizeNotificationPreferences(DEFAULT_NOTIFICATION_PREFERENCES);
db.prepare(
'INSERT INTO user_notification_preferences (user_id, preferences_json, updated_at) VALUES (?, ?, CURRENT_TIMESTAMP)'
).run(userId, JSON.stringify(defaults));
return defaults;
}
let parsed: unknown;
try {
parsed = JSON.parse(row.preferences_json);
} catch {
parsed = DEFAULT_NOTIFICATION_PREFERENCES;
}
return normalizeNotificationPreferences(parsed);
},
/** Upserts normalized preferences for a user and returns the stored value. */
updateNotificationPreferences(
userId: number,
preferences: unknown
): NotificationPreferences {
const normalized = normalizeNotificationPreferences(preferences);
const db = getConnection();
db.prepare(
`INSERT INTO user_notification_preferences (user_id, preferences_json, updated_at)
VALUES (?, ?, CURRENT_TIMESTAMP)
ON CONFLICT(user_id) DO UPDATE SET
preferences_json = excluded.preferences_json,
updated_at = CURRENT_TIMESTAMP`
).run(userId, JSON.stringify(normalized));
return normalized;
},
// Legacy aliases used by existing services/routes
getPreferences(userId: number): NotificationPreferences {
return notificationPreferencesDb.getNotificationPreferences(userId);
},
updatePreferences(userId: number, preferences: unknown): NotificationPreferences {
return notificationPreferencesDb.updateNotificationPreferences(userId, preferences);
},
};

View File

@@ -0,0 +1,72 @@
import assert from 'node:assert/strict';
import { mkdtemp, rm } from 'node:fs/promises';
import { tmpdir } from 'node:os';
import path from 'node:path';
import test from 'node:test';
import { closeConnection } from '@/modules/database/connection.js';
import { initializeDatabase } from '@/modules/database/init-db.js';
import { projectsDb } from '@/modules/database/repositories/projects.db.js';
async function withIsolatedDatabase(runTest: () => void | Promise<void>): Promise<void> {
const previousDatabasePath = process.env.DATABASE_PATH;
const tempDirectory = await mkdtemp(path.join(tmpdir(), 'projects-db-'));
const databasePath = path.join(tempDirectory, 'auth.db');
closeConnection();
process.env.DATABASE_PATH = databasePath;
await initializeDatabase();
try {
await runTest();
} finally {
closeConnection();
if (previousDatabasePath === undefined) {
delete process.env.DATABASE_PATH;
} else {
process.env.DATABASE_PATH = previousDatabasePath;
}
await rm(tempDirectory, { recursive: true, force: true });
}
}
test('projectsDb.createProjectPath returns created for fresh paths', async () => {
await withIsolatedDatabase(() => {
const created = projectsDb.createProjectPath('/workspace/new-project');
assert.equal(created.outcome, 'created');
assert.ok(created.project);
assert.equal(created.project?.project_path, '/workspace/new-project');
assert.equal(created.project?.isArchived, 0);
});
});
test('projectsDb.createProjectPath returns reactivated_archived for archived duplicates', async () => {
await withIsolatedDatabase(() => {
const initial = projectsDb.createProjectPath('/workspace/archived-project', 'Archived Project');
assert.equal(initial.outcome, 'created');
assert.ok(initial.project);
projectsDb.updateProjectIsArchived('/workspace/archived-project', true);
const reused = projectsDb.createProjectPath('/workspace/archived-project', 'Renamed Project');
assert.equal(reused.outcome, 'reactivated_archived');
assert.ok(reused.project);
assert.equal(reused.project?.project_id, initial.project?.project_id);
assert.equal(reused.project?.isArchived, 0);
});
});
test('projectsDb.createProjectPath returns active_conflict for active duplicates', async () => {
await withIsolatedDatabase(() => {
const initial = projectsDb.createProjectPath('/workspace/active-project');
assert.equal(initial.outcome, 'created');
assert.ok(initial.project);
const conflict = projectsDb.createProjectPath('/workspace/active-project');
assert.equal(conflict.outcome, 'active_conflict');
assert.ok(conflict.project);
assert.equal(conflict.project?.project_id, initial.project?.project_id);
assert.equal(conflict.project?.isArchived, 0);
});
});

View File

@@ -0,0 +1,196 @@
import { randomUUID } from 'node:crypto';
import path from 'node:path';
import { getConnection } from '@/modules/database/connection.js';
import type { CreateProjectPathResult, ProjectRepositoryRow } from '@/shared/types.js';
import { normalizeProjectPath } from '@/shared/utils.js';
function normalizeProjectDisplayName(projectPath: string, customProjectName: string | null): string {
const trimmedCustomName = typeof customProjectName === 'string' ? customProjectName.trim() : '';
if (trimmedCustomName.length > 0) {
return trimmedCustomName;
}
const directoryName = path.basename(projectPath);
return directoryName || projectPath;
}
export const projectsDb = {
createProjectPath(projectPath: string, customProjectName: string | null = null): CreateProjectPathResult {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
const normalizedProjectName = normalizeProjectDisplayName(normalizedProjectPath, customProjectName);
const attemptedId = randomUUID();
const row = db.prepare(`
INSERT INTO projects (project_id, project_path, custom_project_name, isArchived)
VALUES (?, ?, ?, 0)
ON CONFLICT(project_path) DO UPDATE SET
isArchived = 0
WHERE projects.isArchived = 1
RETURNING project_id, project_path, custom_project_name, isStarred, isArchived
`).get(attemptedId, normalizedProjectPath, normalizedProjectName) as ProjectRepositoryRow | undefined;
if (row) {
return {
outcome: row.project_id === attemptedId ? 'created' : 'reactivated_archived',
project: row,
};
}
const existingProject = projectsDb.getProjectPath(normalizedProjectPath);
return {
outcome: 'active_conflict',
project: existingProject,
};
},
getProjectPath(projectPath: string): ProjectRepositoryRow | null {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
const row = db.prepare(`
SELECT project_id, project_path, custom_project_name, isStarred, isArchived
FROM projects
WHERE project_path = ?
`).get(normalizedProjectPath) as ProjectRepositoryRow | undefined;
return row ?? null;
},
getProjectById(projectId: string): ProjectRepositoryRow | null {
const db = getConnection();
const row = db.prepare(`
SELECT project_id, project_path, custom_project_name, isStarred, isArchived
FROM projects
WHERE project_id = ?
`).get(projectId) as ProjectRepositoryRow | undefined;
return row ?? null;
},
/**
* Resolve the absolute project directory from a database project_id.
*
* This is the canonical lookup used after the projectName → projectId migration:
* API routes receive the DB-assigned `projectId` and must resolve the real folder
* path through this helper before touching the filesystem. Returns `null` when the
* project row does not exist so callers can respond with a 404.
*/
getProjectPathById(projectId: string): string | null {
const db = getConnection();
const row = db.prepare(`
SELECT project_path
FROM projects
WHERE project_id = ?
`).get(projectId) as Pick<ProjectRepositoryRow, 'project_path'> | undefined;
return row?.project_path ?? null;
},
getProjectPaths(): ProjectRepositoryRow[] {
const db = getConnection();
return db.prepare(`
SELECT project_id, project_path, custom_project_name, isStarred, isArchived
FROM projects
WHERE isArchived = 0
`).all() as ProjectRepositoryRow[];
},
/**
* Archived rows are queried separately so archive-focused UIs can present
* hidden workspaces without reintroducing them into the active sidebar list.
*/
getArchivedProjectPaths(): ProjectRepositoryRow[] {
const db = getConnection();
return db.prepare(`
SELECT project_id, project_path, custom_project_name, isStarred, isArchived
FROM projects
WHERE isArchived = 1
`).all() as ProjectRepositoryRow[];
},
getCustomProjectName(projectPath: string): string | null {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
const row = db.prepare(`
SELECT custom_project_name
FROM projects
WHERE project_path = ?
`).get(normalizedProjectPath) as Pick<ProjectRepositoryRow, 'custom_project_name'> | undefined;
return row?.custom_project_name ?? null;
},
updateCustomProjectName(projectPath: string, customProjectName: string | null): void {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
db.prepare(`
INSERT INTO projects (project_id, project_path, custom_project_name)
VALUES (?, ?, ?)
ON CONFLICT(project_path) DO UPDATE SET custom_project_name = excluded.custom_project_name
`).run(randomUUID(), normalizedProjectPath, customProjectName);
},
updateCustomProjectNameById(projectId: string, customProjectName: string | null): void {
const db = getConnection();
db.prepare(`
UPDATE projects
SET custom_project_name = ?
WHERE project_id = ?
`).run(customProjectName, projectId);
},
updateProjectIsStarred(projectPath: string, isStarred: boolean): void {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
db.prepare(`
UPDATE projects
SET isStarred = ?
WHERE project_path = ?
`).run(isStarred ? 1 : 0, normalizedProjectPath);
},
updateProjectIsStarredById(projectId: string, isStarred: boolean): void {
const db = getConnection();
db.prepare(`
UPDATE projects
SET isStarred = ?
WHERE project_id = ?
`).run(isStarred ? 1 : 0, projectId);
},
updateProjectIsArchived(projectPath: string, isArchived: boolean): void {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
db.prepare(`
UPDATE projects
SET isArchived = ?
WHERE project_path = ?
`).run(isArchived ? 1 : 0, normalizedProjectPath);
},
updateProjectIsArchivedById(projectId: string, isArchived: boolean): void {
const db = getConnection();
db.prepare(`
UPDATE projects
SET isArchived = ?
WHERE project_id = ?
`).run(isArchived ? 1 : 0, projectId);
},
deleteProjectPath(projectPath: string): void {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
db.prepare(`
DELETE FROM projects
WHERE project_path = ?
`).run(normalizedProjectPath);
},
deleteProjectById(projectId: string): void {
const db = getConnection();
db.prepare(`
DELETE FROM projects
WHERE project_id = ?
`).run(projectId);
},
};

View File

@@ -0,0 +1,80 @@
/**
* Push subscriptions repository.
*
* Persists browser push subscription endpoints and keys per user.
*/
import { getConnection } from '@/modules/database/connection.js';
type PushSubscriptionLookupRow = {
endpoint: string;
keys_p256dh: string;
keys_auth: string;
};
export const pushSubscriptionsDb = {
/** Upserts a push subscription endpoint for a user. */
createPushSubscription(
userId: number,
endpoint: string,
keysP256dh: string,
keysAuth: string
): void {
const db = getConnection();
db.prepare(
`INSERT INTO push_subscriptions (user_id, endpoint, keys_p256dh, keys_auth)
VALUES (?, ?, ?, ?)
ON CONFLICT(endpoint) DO UPDATE SET
user_id = excluded.user_id,
keys_p256dh = excluded.keys_p256dh,
keys_auth = excluded.keys_auth`
).run(userId, endpoint, keysP256dh, keysAuth);
},
/** Returns all subscriptions for a user. */
getPushSubscriptions(userId: number): PushSubscriptionLookupRow[] {
const db = getConnection();
return db
.prepare(
'SELECT endpoint, keys_p256dh, keys_auth FROM push_subscriptions WHERE user_id = ?'
)
.all(userId) as PushSubscriptionLookupRow[];
},
/** Deletes one subscription by endpoint. */
deletePushSubscription(endpoint: string): void {
const db = getConnection();
db.prepare('DELETE FROM push_subscriptions WHERE endpoint = ?').run(endpoint);
},
/** Deletes all subscriptions for a user. */
deletePushSubscriptionsForUser(userId: number): void {
const db = getConnection();
db.prepare('DELETE FROM push_subscriptions WHERE user_id = ?').run(userId);
},
// Legacy aliases used by existing services/routes
saveSubscription(
userId: number,
endpoint: string,
keysP256dh: string,
keysAuth: string
): void {
pushSubscriptionsDb.createPushSubscription(
userId,
endpoint,
keysP256dh,
keysAuth
);
},
getSubscriptions(userId: number): PushSubscriptionLookupRow[] {
return pushSubscriptionsDb.getPushSubscriptions(userId);
},
removeSubscription(endpoint: string): void {
pushSubscriptionsDb.deletePushSubscription(endpoint);
},
removeAllForUser(userId: number): void {
pushSubscriptionsDb.deletePushSubscriptionsForUser(userId);
},
};

View File

@@ -0,0 +1,42 @@
import { getConnection } from '@/modules/database/connection.js';
type ScanStateRow = {
last_scanned_at: string;
};
export const scanStateDb = {
getLastScannedAt(): Date | null {
const db = getConnection();
const row = db
.prepare(`SELECT last_scanned_at FROM scan_state WHERE id = 1`)
.get() as ScanStateRow | undefined;
if (!row) {
return null; // Before any scan, the row is undefined.
}
let lastScannedDate: Date | null = null;
const lastScannedStr = row.last_scanned_at;
if (lastScannedStr) {
// SQLite CURRENT_TIMESTAMP returns UTC in "YYYY-MM-DD HH:MM:SS" format.
// Replace space with 'T' and append 'Z' to parse reliably in JS across all platforms.
lastScannedDate = new Date(lastScannedStr.replace(' ', 'T') + 'Z');
}
return lastScannedDate;
},
updateLastScannedAt(scannedAt: Date = new Date()) {
const db = getConnection();
const sqliteTimestamp = scannedAt.toISOString().slice(0, 19).replace('T', ' ');
db.prepare(`
INSERT INTO scan_state (id, last_scanned_at)
VALUES (1, ?)
ON CONFLICT (id)
DO UPDATE SET last_scanned_at = excluded.last_scanned_at
`).run(sqliteTimestamp);
}
};

View File

@@ -0,0 +1,72 @@
import assert from 'node:assert/strict';
import { mkdtemp, rm } from 'node:fs/promises';
import { tmpdir } from 'node:os';
import path from 'node:path';
import test from 'node:test';
import { closeConnection } from '@/modules/database/connection.js';
import { initializeDatabase } from '@/modules/database/init-db.js';
import { sessionsDb } from '@/modules/database/repositories/sessions.db.js';
async function withIsolatedDatabase(runTest: () => void | Promise<void>): Promise<void> {
const previousDatabasePath = process.env.DATABASE_PATH;
const tempDirectory = await mkdtemp(path.join(tmpdir(), 'sessions-db-'));
const databasePath = path.join(tempDirectory, 'auth.db');
closeConnection();
process.env.DATABASE_PATH = databasePath;
await initializeDatabase();
try {
await runTest();
} finally {
closeConnection();
if (previousDatabasePath === undefined) {
delete process.env.DATABASE_PATH;
} else {
process.env.DATABASE_PATH = previousDatabasePath;
}
await rm(tempDirectory, { recursive: true, force: true });
}
}
test('session archive queries hide archived rows from active project views', async () => {
await withIsolatedDatabase(() => {
sessionsDb.createSession('session-active', 'claude', '/workspace/demo-project', 'Active Session');
sessionsDb.createSession('session-archived', 'claude', '/workspace/demo-project', 'Archived Session');
sessionsDb.updateSessionIsArchived('session-archived', true);
const activeSessions = sessionsDb.getAllSessions();
const archivedSessions = sessionsDb.getArchivedSessions();
const activeProjectSessions = sessionsDb.getSessionsByProjectPath('/workspace/demo-project');
const allProjectSessions = sessionsDb.getSessionsByProjectPathIncludingArchived('/workspace/demo-project');
assert.deepEqual(activeSessions.map((session) => session.session_id), ['session-active']);
assert.deepEqual(archivedSessions.map((session) => session.session_id), ['session-archived']);
assert.deepEqual(activeProjectSessions.map((session) => session.session_id), ['session-active']);
assert.deepEqual(
allProjectSessions.map((session) => session.session_id).sort(),
['session-active', 'session-archived'],
);
assert.equal(sessionsDb.countSessionsByProjectPath('/workspace/demo-project'), 1);
});
});
test('createSession reactivates archived rows when the session becomes active again', async () => {
await withIsolatedDatabase(() => {
sessionsDb.createSession('session-reused', 'claude', '/workspace/demo-project', 'First Name');
sessionsDb.updateSessionIsArchived('session-reused', true);
sessionsDb.createSession('session-reused', 'claude', '/workspace/demo-project', 'Updated Name');
const activeSessions = sessionsDb.getAllSessions();
const archivedSessions = sessionsDb.getArchivedSessions();
const restoredSession = sessionsDb.getSessionById('session-reused');
assert.equal(activeSessions.length, 1);
assert.equal(activeSessions[0]?.session_id, 'session-reused');
assert.equal(activeSessions[0]?.custom_name, 'Updated Name');
assert.equal(archivedSessions.length, 0);
assert.equal(restoredSession?.isArchived, 0);
});
});

View File

@@ -0,0 +1,225 @@
import { getConnection } from '@/modules/database/connection.js';
import { projectsDb } from '@/modules/database/repositories/projects.db.js';
import { normalizeProjectPath } from '@/shared/utils.js';
type SessionRow = {
session_id: string;
provider: string;
project_path: string | null;
jsonl_path: string | null;
custom_name: string | null;
isArchived: number;
created_at: string;
updated_at: string;
};
type SessionMetadataLookupRow = Pick<
SessionRow,
'session_id' | 'provider' | 'project_path' | 'jsonl_path' | 'custom_name' | 'isArchived' | 'created_at' | 'updated_at'
>;
function normalizeTimestamp(value?: string): string | null {
if (!value) return null;
const parsed = new Date(value);
if (Number.isNaN(parsed.getTime())) {
return null;
}
return parsed.toISOString();
}
function normalizeProjectPathForProvider(provider: string, projectPath: string): string {
void provider;
return normalizeProjectPath(projectPath);
}
export const sessionsDb = {
createSession(
sessionId: string,
provider: string,
projectPath: string,
customName?: string,
createdAt?: string,
updatedAt?: string,
jsonlPath?: string | null
): string {
const db = getConnection();
const createdAtValue = normalizeTimestamp(createdAt);
const updatedAtValue = normalizeTimestamp(updatedAt);
const normalizedProjectPath = normalizeProjectPathForProvider(provider, projectPath);
// First, ensure the project path is recorded in the projects table,
// since it's a foreign key in the sessions table.
projectsDb.createProjectPath(normalizedProjectPath);
db.prepare(
`INSERT INTO sessions (session_id, provider, custom_name, project_path, jsonl_path, isArchived, created_at, updated_at)
VALUES (?, ?, ?, ?, ?, 0, COALESCE(?, CURRENT_TIMESTAMP), COALESCE(?, CURRENT_TIMESTAMP))
ON CONFLICT(session_id) DO UPDATE SET
provider = excluded.provider,
updated_at = excluded.updated_at,
project_path = excluded.project_path,
jsonl_path = excluded.jsonl_path,
isArchived = 0,
custom_name = COALESCE(excluded.custom_name, sessions.custom_name)`
).run(
sessionId,
provider,
customName ?? null,
normalizedProjectPath,
jsonlPath ?? null,
createdAtValue,
updatedAtValue
);
return sessionId;
},
updateSessionCustomName(sessionId: string, customName: string): void {
const db = getConnection();
db.prepare(
`UPDATE sessions
SET custom_name = ?
WHERE session_id = ?`
).run(customName, sessionId);
},
getSessionById(sessionId: string): SessionMetadataLookupRow | null {
const db = getConnection();
const row = db
.prepare(
`SELECT session_id, provider, project_path, jsonl_path, custom_name, isArchived, created_at, updated_at
FROM sessions
WHERE session_id = ?
ORDER BY updated_at DESC
LIMIT 1`
)
.get(sessionId) as SessionMetadataLookupRow | undefined;
return row ?? null;
},
getAllSessions(): SessionRow[] {
const db = getConnection();
return db
.prepare(
`SELECT session_id, provider, project_path, jsonl_path, custom_name, isArchived, created_at, updated_at
FROM sessions
WHERE isArchived = 0`
)
.all() as SessionRow[];
},
/**
* Archived rows are intentionally queried separately so the caller can render
* them in a dedicated view without reintroducing them into active session lists.
*/
getArchivedSessions(): SessionRow[] {
const db = getConnection();
return db
.prepare(
`SELECT session_id, provider, project_path, jsonl_path, custom_name, isArchived, created_at, updated_at
FROM sessions
WHERE isArchived = 1
ORDER BY datetime(COALESCE(updated_at, created_at)) DESC, session_id DESC`
)
.all() as SessionRow[];
},
getSessionsByProjectPath(projectPath: string): SessionRow[] {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
return db
.prepare(
`SELECT session_id, provider, project_path, jsonl_path, custom_name, isArchived, created_at, updated_at
FROM sessions
WHERE project_path = ?
AND isArchived = 0`
)
.all(normalizedProjectPath) as SessionRow[];
},
/**
* Permanent project deletion must see every session row for the path,
* including archived ones, so their transcript files can be cleaned up.
*/
getSessionsByProjectPathIncludingArchived(projectPath: string): SessionRow[] {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
return db
.prepare(
`SELECT session_id, provider, project_path, jsonl_path, custom_name, isArchived, created_at, updated_at
FROM sessions
WHERE project_path = ?`
)
.all(normalizedProjectPath) as SessionRow[];
},
getSessionsByProjectPathPage(projectPath: string, limit: number, offset: number): SessionRow[] {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
return db
.prepare(
`SELECT session_id, provider, project_path, jsonl_path, custom_name, isArchived, created_at, updated_at
FROM sessions
WHERE project_path = ?
AND isArchived = 0
ORDER BY datetime(COALESCE(updated_at, created_at)) DESC, session_id DESC
LIMIT ? OFFSET ?`
)
.all(normalizedProjectPath, limit, offset) as SessionRow[];
},
countSessionsByProjectPath(projectPath: string): number {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
const row = db
.prepare(
`SELECT COUNT(*) AS count
FROM sessions
WHERE project_path = ?
AND isArchived = 0`
)
.get(normalizedProjectPath) as { count: number } | undefined;
return Number(row?.count ?? 0);
},
deleteSessionsByProjectPath(projectPath: string): void {
const db = getConnection();
const normalizedProjectPath = normalizeProjectPath(projectPath);
db.prepare(`DELETE FROM sessions WHERE project_path = ?`).run(normalizedProjectPath);
},
getSessionName(sessionId: string, provider: string): string | null {
const db = getConnection();
const row = db
.prepare(
`SELECT custom_name
FROM sessions
WHERE session_id = ? AND provider = ?`
)
.get(sessionId, provider) as { custom_name: string | null } | undefined;
return row?.custom_name ?? null;
},
/**
* Soft-delete and restore both use the same flag update so callers keep the
* row, metadata, and file path intact while toggling visibility.
*/
updateSessionIsArchived(sessionId: string, isArchived: boolean): void {
const db = getConnection();
db.prepare(
`UPDATE sessions
SET isArchived = ?
WHERE session_id = ?`
).run(isArchived ? 1 : 0, sessionId);
},
deleteSessionById(sessionId: string): boolean {
const db = getConnection();
return db.prepare('DELETE FROM sessions WHERE session_id = ?').run(sessionId).changes > 0;
},
};

View File

@@ -0,0 +1,140 @@
/**
* User repository.
*
* Provides typed CRUD operations for the `users` table.
* This is a single-user system, but the schema supports multiple
* users for forward compatibility.
*/
import { getConnection } from '@/modules/database/connection.js';
type UserRow = {
id: number;
username: string;
password_hash: string;
created_at: string;
last_login: string | null;
is_active: number;
git_name: string | null;
git_email: string | null;
has_completed_onboarding: number;
};
type UserPublicRow = Pick<UserRow, 'id' | 'username' | 'created_at' | 'last_login'>;
type UserGitConfig = {
git_name: string | null;
git_email: string | null;
};
type CreateUserResult = {
id: number | bigint;
username: string;
};
// ---------------------------------------------------------------------------
// Queries
// ---------------------------------------------------------------------------
export const userDb = {
/** Returns true if at least one user exists in the database. */
hasUsers(): boolean {
const db = getConnection();
const row = db.prepare('SELECT COUNT(*) as count FROM users').get() as {
count: number;
};
return row.count > 0;
},
/** Inserts a new user and returns the created ID + username. */
createUser(username: string, passwordHash: string): CreateUserResult {
const db = getConnection();
const result = db
.prepare('INSERT INTO users (username, password_hash) VALUES (?, ?)')
.run(username, passwordHash);
return { id: result.lastInsertRowid, username };
},
/**
* Looks up an active user by username.
* Returns the full row (including password hash) for auth verification.
*/
getUserByUsername(username: string): UserRow | undefined {
const db = getConnection();
return db
.prepare('SELECT * FROM users WHERE username = ? AND is_active = 1')
.get(username) as UserRow | undefined;
},
/** Updates the last_login timestamp. Non-fatal — logs but does not throw. */
updateLastLogin(userId: number): void {
try {
const db = getConnection();
db.prepare(
'UPDATE users SET last_login = CURRENT_TIMESTAMP WHERE id = ?'
).run(userId);
} catch (err) {
const message = err instanceof Error ? err.message : String(err);
console.error('Failed to update last login', { error: message });
}
},
/** Returns public user fields by ID (no password hash). */
getUserById(userId: number): UserPublicRow | undefined {
const db = getConnection();
return db
.prepare(
'SELECT id, username, created_at, last_login FROM users WHERE id = ? AND is_active = 1'
)
.get(userId) as UserPublicRow | undefined;
},
/** Returns the first active user. Used for single-user mode lookups. */
getFirstUser(): UserPublicRow | undefined {
const db = getConnection();
return db
.prepare(
'SELECT id, username, created_at, last_login FROM users WHERE is_active = 1 LIMIT 1'
)
.get() as UserPublicRow | undefined;
},
/** Stores the user's preferred git name and email. */
updateGitConfig(
userId: number,
gitName: string,
gitEmail: string
): void {
const db = getConnection();
db.prepare('UPDATE users SET git_name = ?, git_email = ? WHERE id = ?').run(
gitName,
gitEmail,
userId
);
},
/** Retrieves the user's git identity (name + email). */
getGitConfig(userId: number): UserGitConfig | undefined {
const db = getConnection();
return db
.prepare('SELECT git_name, git_email FROM users WHERE id = ?')
.get(userId) as UserGitConfig | undefined;
},
/** Marks onboarding as complete for the given user. */
completeOnboarding(userId: number): void {
const db = getConnection();
db.prepare(
'UPDATE users SET has_completed_onboarding = 1 WHERE id = ?'
).run(userId);
},
/** Returns true if the user has finished the onboarding flow. */
hasCompletedOnboarding(userId: number): boolean {
const db = getConnection();
const row = db
.prepare('SELECT has_completed_onboarding FROM users WHERE id = ?')
.get(userId) as { has_completed_onboarding: number } | undefined;
return row?.has_completed_onboarding === 1;
},
};

View File

@@ -0,0 +1,57 @@
/**
* VAPID keys repository.
*
* Stores and retrieves the Web Push VAPID key pair.
*/
import { getConnection } from '@/modules/database/connection.js';
type VapidKeyRow = {
public_key: string;
private_key: string;
};
type VapidKeyPair = {
publicKey: string;
privateKey: string;
};
export const vapidKeysDb = {
/** Returns the latest stored VAPID key pair, or null when unset. */
getVapidKeys(): VapidKeyPair | null {
const db = getConnection();
const row = db
.prepare(
'SELECT public_key, private_key FROM vapid_keys ORDER BY id DESC LIMIT 1'
)
.get() as Pick<VapidKeyRow, 'public_key' | 'private_key'> | undefined;
if (!row) return null;
return {
publicKey: row.public_key,
privateKey: row.private_key,
};
},
/** Persists a new VAPID key pair. */
createVapidKeys(publicKey: string, privateKey: string): void {
const db = getConnection();
db.prepare(
'INSERT INTO vapid_keys (public_key, private_key) VALUES (?, ?)'
).run(publicKey, privateKey);
},
/** Replaces all existing keys with a fresh pair. */
updateVapidKeys(publicKey: string, privateKey: string): void {
const db = getConnection();
db.prepare('DELETE FROM vapid_keys').run();
vapidKeysDb.createVapidKeys(publicKey, privateKey);
},
/** Deletes all VAPID key rows. */
deleteVapidKeys(): void {
const db = getConnection();
db.prepare('DELETE FROM vapid_keys').run();
},
};

View File

@@ -0,0 +1,153 @@
const USER_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
username TEXT UNIQUE NOT NULL,
password_hash TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
last_login DATETIME,
is_active BOOLEAN DEFAULT 1,
git_name TEXT,
git_email TEXT,
has_completed_onboarding BOOLEAN DEFAULT 0
);
`;
export const API_KEYS_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS api_keys (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
key_name TEXT NOT NULL,
api_key TEXT UNIQUE NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
last_used DATETIME,
is_active BOOLEAN DEFAULT 1,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
`;
export const USER_CREDENTIALS_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS user_credentials (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
credential_name TEXT NOT NULL,
credential_type TEXT NOT NULL, -- 'github_token', 'gitlab_token', 'bitbucket_token', etc.
credential_value TEXT NOT NULL,
description TEXT,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
is_active BOOLEAN DEFAULT 1,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
`;
export const USER_NOTIFICATION_PREFERENCES_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS user_notification_preferences (
user_id INTEGER PRIMARY KEY,
preferences_json TEXT NOT NULL,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
`;
export const VAPID_KEYS_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS vapid_keys (
id INTEGER PRIMARY KEY AUTOINCREMENT,
public_key TEXT NOT NULL,
private_key TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);
`;
export const PUSH_SUBSCRIPTIONS_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS push_subscriptions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_id INTEGER NOT NULL,
endpoint TEXT NOT NULL UNIQUE,
keys_p256dh TEXT NOT NULL,
keys_auth TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
`;
export const PROJECTS_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS projects (
project_id TEXT PRIMARY KEY NOT NULL,
project_path TEXT NOT NULL UNIQUE,
custom_project_name TEXT DEFAULT NULL,
isStarred BOOLEAN DEFAULT 0,
isArchived BOOLEAN DEFAULT 0
);
`;
export const SESSIONS_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS sessions (
session_id TEXT NOT NULL,
provider TEXT NOT NULL DEFAULT 'claude',
custom_name TEXT,
project_path TEXT,
jsonl_path TEXT,
isArchived BOOLEAN DEFAULT 0,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (session_id),
FOREIGN KEY (project_path) REFERENCES projects(project_path)
ON DELETE SET NULL
ON UPDATE CASCADE
);
`;
export const LAST_SCANNED_AT_SQL = `
CREATE TABLE IF NOT EXISTS scan_state (
id INTEGER PRIMARY KEY CHECK (id = 1),
last_scanned_at TIMESTAMP NULL
);
`;
export const APP_CONFIG_TABLE_SCHEMA_SQL = `
CREATE TABLE IF NOT EXISTS app_config (
key TEXT PRIMARY KEY,
value TEXT NOT NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);
`;
export const INIT_SCHEMA_SQL = `
-- Initialize authentication database
PRAGMA foreign_keys = ON;
${USER_TABLE_SCHEMA_SQL}
-- Indexes for fast user lookups
CREATE INDEX IF NOT EXISTS idx_users_username ON users(username);
CREATE INDEX IF NOT EXISTS idx_users_active ON users(is_active);
${API_KEYS_TABLE_SCHEMA_SQL}
CREATE INDEX IF NOT EXISTS idx_api_keys_key ON api_keys(api_key);
CREATE INDEX IF NOT EXISTS idx_api_keys_user_id ON api_keys(user_id);
CREATE INDEX IF NOT EXISTS idx_api_keys_active ON api_keys(is_active);
${USER_CREDENTIALS_TABLE_SCHEMA_SQL}
CREATE INDEX IF NOT EXISTS idx_user_credentials_user_id ON user_credentials(user_id);
CREATE INDEX IF NOT EXISTS idx_user_credentials_type ON user_credentials(credential_type);
CREATE INDEX IF NOT EXISTS idx_user_credentials_active ON user_credentials(is_active);
${USER_NOTIFICATION_PREFERENCES_TABLE_SCHEMA_SQL}
CREATE INDEX IF NOT EXISTS idx_user_notification_preferences_user_id ON user_notification_preferences(user_id);
${VAPID_KEYS_TABLE_SCHEMA_SQL}
${PUSH_SUBSCRIPTIONS_TABLE_SCHEMA_SQL}
CREATE INDEX IF NOT EXISTS idx_push_subscriptions_user_id ON push_subscriptions(user_id);
${PROJECTS_TABLE_SCHEMA_SQL}
-- NOTE: These indexes are created in migrations after legacy table-shape repairs.
-- Creating them here can fail on upgraded installs where projects lacks those columns.
${SESSIONS_TABLE_SCHEMA_SQL}
CREATE INDEX IF NOT EXISTS idx_session_ids_lookup ON sessions(session_id);
-- NOTE: This index is created in migrations after sessions is rebuilt to include project_path.
-- Creating it here can fail on upgraded installs where the legacy sessions table has no project_path.
${LAST_SCANNED_AT_SQL}
${APP_CONFIG_TABLE_SCHEMA_SQL}
`;

View File

@@ -0,0 +1,6 @@
export {
generateDisplayName,
getProjectsWithSessions,
} from './services/projects-with-sessions-fetch.service.js';
export { updateProjectDisplayName } from './services/project-management.service.js';
export { deleteOrArchiveProject, deleteSessionJsonlFilesForProjectPath } from './services/project-delete.service.js';

View File

@@ -0,0 +1,264 @@
import express from 'express';
import { createProject, updateProjectDisplayName } from '@/modules/projects/services/project-management.service.js';
import { startCloneProject } from '@/modules/projects/services/project-clone.service.js';
import { getProjectTaskMaster } from '@/modules/projects/services/projects-has-taskmaster.service.js';
import { AppError, asyncHandler, createApiSuccessResponse } from '@/shared/utils.js';
import { getArchivedProjectsWithSessions, getProjectSessionsPage, getProjectsWithSessions } from '@/modules/projects/services/projects-with-sessions-fetch.service.js';
import { deleteOrArchiveProject, restoreArchivedProject } from '@/modules/projects/services/project-delete.service.js';
import { applyLegacyStarredProjectIds, toggleProjectStar } from '@/modules/projects/services/project-star.service.js';
const router = express.Router();
type AuthenticatedUser = {
id?: number | string;
};
function readQueryStringValue(value: unknown): string {
if (typeof value === 'string') {
return value;
}
if (Array.isArray(value) && typeof value[0] === 'string') {
return value[0];
}
return '';
}
function readOptionalNumericQueryValue(value: unknown): number | null {
const rawValue = readQueryStringValue(value).trim();
if (!rawValue) {
return null;
}
const parsedValue = Number.parseInt(rawValue, 10);
return Number.isNaN(parsedValue) ? null : parsedValue;
}
function parseNonNegativeIntQuery(value: unknown, name: string, fallback: number): number {
const rawValue = readQueryStringValue(value).trim();
if (!rawValue) {
return fallback;
}
const parsedValue = Number.parseInt(rawValue, 10);
if (Number.isNaN(parsedValue) || parsedValue < 0) {
throw new AppError(`${name} must be a non-negative integer`, {
code: 'INVALID_QUERY_PARAMETER',
statusCode: 400,
});
}
return parsedValue;
}
function resolveRouteErrorMessage(error: unknown): string {
if (error instanceof AppError) {
return error.message;
}
if (error instanceof Error && error.message) {
return error.message;
}
return 'Failed to clone repository';
}
router.get(
'/',
asyncHandler(async (_req, res) => {
const projects = await getProjectsWithSessions();
res.json(projects);
}),
);
router.get(
'/archived',
asyncHandler(async (_req, res) => {
const projects = await getArchivedProjectsWithSessions();
res.json(createApiSuccessResponse({ projects }));
}),
);
router.get(
'/:projectId/sessions',
asyncHandler(async (req, res) => {
const projectId = typeof req.params.projectId === 'string' ? req.params.projectId : '';
const limit = parseNonNegativeIntQuery(req.query.limit, 'limit', 20);
const offset = parseNonNegativeIntQuery(req.query.offset, 'offset', 0);
const sessionsPage = await getProjectSessionsPage(projectId, { limit, offset });
res.json(sessionsPage);
}),
);
router.post(
'/create-project',
asyncHandler(async (req, res) => {
const requestBody = req.body as Record<string, unknown>;
const projectPath = typeof requestBody.path === 'string' ? requestBody.path : '';
const customName = typeof requestBody.customName === 'string' ? requestBody.customName : null;
if (requestBody.workspaceType !== undefined) {
throw new AppError('workspaceType is no longer supported. Use the single create-project flow.', {
code: 'LEGACY_WORKSPACE_TYPE_UNSUPPORTED',
statusCode: 400,
});
}
if (requestBody.githubUrl || requestBody.githubTokenId || requestBody.newGithubToken) {
throw new AppError('Repository cloning is not supported on create-project', {
code: 'CLONE_NOT_SUPPORTED_ON_CREATE_PROJECT',
statusCode: 400,
details: 'Use /api/projects/clone-progress for cloning workflows',
});
}
const projectCreationResult = await createProject({
projectPath,
customName,
});
res.json({
success: true,
project: projectCreationResult.project,
message:
projectCreationResult.outcome === 'reactivated_archived'
? 'Archived project path reused successfully'
: 'Project created successfully',
});
}),
);
/**
* One-time (or idempotent) migration: apply legacy `localStorage` starred projectIds to the DB, then clear client storage.
*/
router.post(
'/migrate-legacy-stars',
asyncHandler(async (req, res) => {
const projectIds = Array.isArray((req.body as { projectIds?: unknown })?.projectIds)
? (req.body as { projectIds: unknown[] }).projectIds.map((x) => String(x))
: [];
const { updated } = applyLegacyStarredProjectIds(projectIds);
res.json({ success: true, updated });
}),
);
router.get('/clone-progress', async (req, res) => {
res.setHeader('Content-Type', 'text/event-stream');
res.setHeader('Cache-Control', 'no-cache');
res.setHeader('Connection', 'keep-alive');
res.flushHeaders();
const sendEvent = (type: string, data: Record<string, unknown>) => {
if (res.writableEnded) {
return;
}
res.write(`data: ${JSON.stringify({ type, ...data })}\n\n`);
};
let cloneOperation: Awaited<ReturnType<typeof startCloneProject>> | null = null;
const closeListener = () => {
cloneOperation?.cancel();
};
req.on('close', closeListener);
try {
const queryParams = req.query as Record<string, unknown>;
const workspacePath = readQueryStringValue(queryParams.path);
const githubUrl = readQueryStringValue(queryParams.githubUrl);
const githubTokenId = readOptionalNumericQueryValue(queryParams.githubTokenId);
const newGithubToken = readQueryStringValue(queryParams.newGithubToken) || null;
const authenticatedUser = (req as typeof req & { user?: AuthenticatedUser }).user;
const userId = authenticatedUser?.id;
if (userId === undefined || userId === null) {
throw new AppError('Authenticated user is required', {
code: 'AUTHENTICATION_REQUIRED',
statusCode: 401,
});
}
cloneOperation = await startCloneProject(
{
workspacePath,
githubUrl,
githubTokenId,
newGithubToken,
userId,
},
{
onProgress: (message) => {
sendEvent('progress', { message });
},
onComplete: ({ project, message }) => {
sendEvent('complete', { project, message });
},
},
);
await cloneOperation.waitForCompletion;
} catch (error) {
sendEvent('error', { message: resolveRouteErrorMessage(error) });
} finally {
req.off('close', closeListener);
if (!res.writableEnded) {
res.end();
}
}
});
router.get(
'/:projectId/taskmaster',
asyncHandler(async (req, res) => {
const projectId = typeof req.params.projectId === 'string' ? req.params.projectId : '';
const taskMasterDetails = await getProjectTaskMaster(projectId);
res.json(taskMasterDetails);
}),
);
router.put('/:projectId/rename', (req, res) => {
try {
const projectId = typeof req.params.projectId === 'string' ? req.params.projectId : '';
const { displayName } = req.body as { displayName?: unknown };
updateProjectDisplayName(projectId, displayName);
res.json({ success: true });
} catch (error) {
res.status(500).json({ error: error instanceof Error ? error.message : 'Failed to rename project' });
}
});
router.post(
'/:projectId/toggle-star',
asyncHandler(async (req, res) => {
const projectId = typeof req.params.projectId === 'string' ? req.params.projectId : '';
const { isStarred } = toggleProjectStar(projectId);
res.json({ success: true, isStarred });
}),
);
router.post(
'/:projectId/restore',
asyncHandler(async (req, res) => {
const projectId = typeof req.params.projectId === 'string' ? req.params.projectId : '';
restoreArchivedProject(projectId);
res.json(createApiSuccessResponse({ projectId, isArchived: false }));
}),
);
/**
* - `force` not set / false: archive project in DB only (`isArchived` = 1; hidden from active list).
* - `force=true`: remove DB row, delete session rows for that path, remove all `*.jsonl` under the Claude project dir.
*/
router.delete(
'/:projectId',
asyncHandler(async (req, res) => {
const projectId = typeof req.params.projectId === 'string' ? req.params.projectId : '';
const force = req.query.force === 'true';
await deleteOrArchiveProject(projectId, force);
res.json({ success: true });
}),
);
export default router;
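Express may hand a repeated query key to a route as a string array, which is why the helpers above normalize before parsing. A standalone sketch of the same rules (local names only; the route helpers themselves are not exported):

```typescript
// Sketch of the route's query normalization: the first string wins,
// blanks fall back, negatives are rejected.
function readQueryString(value: unknown): string {
  if (typeof value === 'string') return value;
  if (Array.isArray(value) && typeof value[0] === 'string') return value[0];
  return '';
}

function parseNonNegativeInt(value: unknown, fallback: number): number {
  const raw = readQueryString(value).trim();
  if (!raw) return fallback;
  const parsed = Number.parseInt(raw, 10);
  if (Number.isNaN(parsed) || parsed < 0) {
    throw new Error('must be a non-negative integer');
  }
  return parsed;
}

// ?limit=5&limit=9 arrives as ['5', '9']; the first entry wins.
console.log(parseNonNegativeInt(['5', '9'], 20)); // 5
console.log(parseNonNegativeInt(undefined, 20)); // 20
```

Rejecting negatives with a 400-style error instead of clamping keeps pagination bugs visible to the client rather than silently returning page zero.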

View File

@@ -0,0 +1,321 @@
import { spawn } from 'node:child_process';
import { access, mkdir, rm } from 'node:fs/promises';
import path from 'node:path';
import { githubTokensDb } from '@/modules/database/index.js';
import { createProject } from '@/modules/projects/services/project-management.service.js';
import type { WorkspacePathValidationResult } from '@/shared/types.js';
import { AppError, validateWorkspacePath } from '@/shared/utils.js';
type CloneProjectInput = {
workspacePath: string;
githubUrl: string;
githubTokenId?: number | null;
newGithubToken?: string | null;
userId: number | string;
};
type CloneCompletePayload = {
project: Record<string, unknown>;
message: string;
};
type CloneProjectEventHandlers = {
onProgress: (message: string) => void;
onComplete: (payload: CloneCompletePayload) => void;
};
type GitCloneProcess = {
stdout: NodeJS.ReadableStream | null;
stderr: NodeJS.ReadableStream | null;
on(event: 'close', listener: (code: number | null) => void): void;
on(event: 'error', listener: (error: NodeJS.ErrnoException) => void): void;
kill(): void;
};
type CloneProjectDependencies = {
validatePath: (requestedPath: string) => Promise<WorkspacePathValidationResult>;
ensureDirectory: (directoryPath: string) => Promise<void>;
pathExists: (targetPath: string) => Promise<boolean>;
removePath: (targetPath: string) => Promise<void>;
getGithubTokenById: (
tokenId: number,
userId: number,
) => Promise<{ github_token: string } | null>;
spawnGitClone: (cloneUrl: string, clonePath: string) => GitCloneProcess;
registerProject: (projectPath: string, customName: string) => Promise<{ project: Record<string, unknown> }>;
logError: (message: string, error: unknown) => void;
};
export type CloneProjectOperation = {
waitForCompletion: Promise<void>;
cancel: () => void;
};
async function defaultPathExists(targetPath: string): Promise<boolean> {
try {
await access(targetPath);
return true;
} catch (error) {
if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
return false;
}
throw error;
}
}
function sanitizeGitError(message: string, token: string | null): string {
if (!message || !token) {
return message;
}
const escapedToken = token.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
return message.replace(new RegExp(escapedToken, 'g'), '***');
}
function resolveCloneFailureMessage(lastError: string, sanitizedError: string): string {
if (lastError.includes('Authentication failed') || lastError.includes('could not read Username')) {
return 'Authentication failed. Please check your credentials.';
}
if (lastError.includes('Repository not found')) {
return 'Repository not found. Please check the URL and ensure you have access.';
}
if (lastError.includes('already exists')) {
return 'Directory already exists';
}
if (sanitizedError) {
return sanitizedError;
}
return 'Git clone failed';
}
function resolveErrorMessage(error: unknown): string {
if (error instanceof AppError) {
return error.message;
}
if (error instanceof Error && error.message) {
return error.message;
}
return 'Unexpected error';
}
const defaultDependencies: CloneProjectDependencies = {
validatePath: validateWorkspacePath,
ensureDirectory: async (directoryPath: string): Promise<void> => {
await mkdir(directoryPath, { recursive: true });
},
pathExists: defaultPathExists,
removePath: async (targetPath: string): Promise<void> => {
await rm(targetPath, { recursive: true, force: true });
},
getGithubTokenById: async (
tokenId: number,
userId: number,
): Promise<{ github_token: string } | null> => {
const tokenRow = githubTokensDb.getGithubTokenById(userId, tokenId) as
| { github_token: string }
| null;
return tokenRow;
},
spawnGitClone: (cloneUrl: string, clonePath: string): GitCloneProcess =>
spawn('git', ['clone', '--progress', '--', cloneUrl, clonePath], {
stdio: ['ignore', 'pipe', 'pipe'],
env: {
...process.env,
GIT_TERMINAL_PROMPT: '0',
},
}) as unknown as GitCloneProcess,
registerProject: async (
projectPath: string,
customName: string,
): Promise<{ project: Record<string, unknown> }> =>
createProject({
projectPath,
customName,
}) as Promise<{ project: Record<string, unknown> }>,
logError: (message: string, error: unknown): void => {
console.error(message, error);
},
};
export async function startCloneProject(
input: CloneProjectInput,
handlers: CloneProjectEventHandlers,
dependencies: CloneProjectDependencies = defaultDependencies,
): Promise<CloneProjectOperation> {
const normalizedWorkspacePath = input.workspacePath.trim();
const normalizedGithubUrl = input.githubUrl.trim();
if (!normalizedWorkspacePath) {
throw new AppError('workspacePath is required', {
code: 'WORKSPACE_PATH_REQUIRED',
statusCode: 400,
});
}
if (!normalizedGithubUrl) {
throw new AppError('githubUrl is required', {
code: 'GITHUB_URL_REQUIRED',
statusCode: 400,
});
}
if (normalizedGithubUrl.startsWith('-')) {
throw new AppError('Invalid githubUrl', {
code: 'INVALID_GITHUB_URL',
statusCode: 400,
});
}
const pathValidation = await dependencies.validatePath(normalizedWorkspacePath);
if (!pathValidation.valid || !pathValidation.resolvedPath) {
throw new AppError(pathValidation.error || 'Invalid workspace path', {
code: 'INVALID_PROJECT_PATH',
statusCode: 400,
});
}
const absolutePath = pathValidation.resolvedPath;
await dependencies.ensureDirectory(absolutePath);
let githubToken: string | null = null;
if (typeof input.githubTokenId === 'number') {
const numericUserId =
typeof input.userId === 'number' ? input.userId : Number.parseInt(String(input.userId), 10);
if (Number.isNaN(numericUserId)) {
throw new AppError('Authenticated user is required', {
code: 'AUTHENTICATION_REQUIRED',
statusCode: 401,
});
}
const token = await dependencies.getGithubTokenById(input.githubTokenId, numericUserId);
if (!token) {
throw new AppError('GitHub token not found', {
code: 'GITHUB_TOKEN_NOT_FOUND',
statusCode: 404,
});
}
githubToken = token.github_token;
} else if (input.newGithubToken && input.newGithubToken.trim().length > 0) {
githubToken = input.newGithubToken.trim();
}
const sanitizedGithubUrl = normalizedGithubUrl.replace(/\/+$/, '').replace(/\.git$/, '');
const repoName = sanitizedGithubUrl.split('/').pop() || 'repository';
const clonePath = path.join(absolutePath, repoName);
if (await dependencies.pathExists(clonePath)) {
throw new AppError(
`Directory "${repoName}" already exists. Please choose a different location or remove the existing directory.`,
{
code: 'CLONE_TARGET_ALREADY_EXISTS',
statusCode: 409,
},
);
}
let cloneUrl = normalizedGithubUrl;
if (githubToken) {
try {
const url = new URL(normalizedGithubUrl);
url.username = githubToken;
url.password = '';
cloneUrl = url.toString();
} catch {
// SSH URLs cannot be represented by URL constructor and are used as-is.
}
}
handlers.onProgress(`Cloning into '${repoName}'...`);
const gitProcess = dependencies.spawnGitClone(cloneUrl, clonePath);
let lastError = '';
gitProcess.stdout?.on('data', (data: Buffer | string) => {
const message = data.toString().trim();
if (message) {
handlers.onProgress(message);
}
});
gitProcess.stderr?.on('data', (data: Buffer | string) => {
const message = data.toString().trim();
lastError = message;
if (message) {
handlers.onProgress(message);
}
});
const waitForCompletion = new Promise<void>((resolve, reject) => {
gitProcess.on('close', async (code) => {
if (code === 0) {
try {
const createdProject = await dependencies.registerProject(clonePath, repoName);
handlers.onComplete({
project: createdProject.project,
message: 'Repository cloned successfully',
});
resolve();
} catch (error) {
reject(
new AppError(`Clone succeeded but failed to add project: ${resolveErrorMessage(error)}`, {
code: 'CLONE_PROJECT_REGISTRATION_FAILED',
statusCode: 500,
}),
);
}
return;
}
const sanitizedError = sanitizeGitError(lastError, githubToken);
const errorMessage = resolveCloneFailureMessage(lastError, sanitizedError);
try {
await dependencies.removePath(clonePath);
} catch (cleanupError) {
dependencies.logError('Failed to clean up after clone failure:', cleanupError);
}
reject(
new AppError(errorMessage, {
code: 'GIT_CLONE_FAILED',
statusCode: 500,
}),
);
});
gitProcess.on('error', (error) => {
if (error.code === 'ENOENT') {
reject(
new AppError('Git is not installed or not in PATH', {
code: 'GIT_NOT_FOUND',
statusCode: 500,
}),
);
return;
}
reject(
new AppError(error.message, {
code: 'GIT_EXECUTION_FAILED',
statusCode: 500,
}),
);
});
});
return {
waitForCompletion,
cancel: () => {
gitProcess.kill();
},
};
}
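The token handling above has two halves: the token is embedded as the URL username for HTTPS clones, and scrubbed from git's stderr before a message can reach the client. A minimal sketch of both steps (standalone reimplementations, not the module's exports):

```typescript
// Embed a token as the basic-auth username of an HTTPS clone URL.
// scp-style SSH URLs (git@host:org/repo.git) throw in `new URL` and
// pass through unchanged, matching the service above.
function withToken(githubUrl: string, token: string): string {
  try {
    const url = new URL(githubUrl);
    url.username = token;
    url.password = '';
    return url.toString();
  } catch {
    return githubUrl;
  }
}

// Redact the token anywhere it appears in a git error message,
// escaping regex metacharacters so arbitrary tokens match literally.
function sanitize(message: string, token: string | null): string {
  if (!message || !token) return message;
  const escaped = token.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
  return message.replace(new RegExp(escaped, 'g'), '***');
}

const cloneUrl = withToken('https://github.com/org/repo.git', 'ghp_secret');
console.log(sanitize(`fatal: unable to access '${cloneUrl}'`, 'ghp_secret'));
// the token surfaces only as *** in the forwarded message
```

Because git echoes the full remote URL (credentials included) on failure, redacting stderr before emitting SSE progress events is what keeps the token out of browser devtools and logs.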

View File

@@ -0,0 +1,90 @@
import { promises as fs } from 'node:fs';
import path from 'node:path';
import { projectsDb, sessionsDb } from '@/modules/database/index.js';
import { AppError } from '@/shared/utils.js';
function uniqueJsonlPathsFromSessions(
sessions: Array<{ jsonl_path: string | null }>,
): string[] {
const seen = new Set<string>();
const result: string[] = [];
for (const row of sessions) {
const raw = row.jsonl_path?.trim();
if (!raw) {
continue;
}
const absolute = path.isAbsolute(raw) ? path.normalize(raw) : path.resolve(raw);
if (seen.has(absolute)) {
continue;
}
seen.add(absolute);
result.push(absolute);
}
return result;
}
async function unlinkJsonlIfExists(filePath: string): Promise<void> {
try {
await fs.unlink(filePath);
} catch (error) {
const code = (error as NodeJS.ErrnoException).code;
if (code === 'ENOENT') {
return;
}
console.warn(`[project-delete] Failed to remove ${filePath}:`, (error as Error).message);
}
}
/**
* Loads all session rows for the project path and removes each distinct `jsonl_path` file on disk.
*/
export async function deleteSessionJsonlFilesForProjectPath(projectPath: string): Promise<void> {
const sessions = sessionsDb.getSessionsByProjectPathIncludingArchived(projectPath);
const paths = uniqueJsonlPathsFromSessions(sessions);
for (const filePath of paths) {
await unlinkJsonlIfExists(filePath);
}
}
/**
* - **Soft delete** (`force` false): set `isArchived` on the `projects` row (hide from the active list; DB only).
* - **Force** (`force` true): for each session row for that `project_path`, delete the file at `jsonl_path`
* (when set), then remove session rows and the `projects` row.
*/
export async function deleteOrArchiveProject(projectId: string, force: boolean): Promise<void> {
const row = projectsDb.getProjectById(projectId);
if (!row) {
throw new AppError(`Unknown projectId: ${projectId}`, {
code: 'PROJECT_NOT_FOUND',
statusCode: 404,
});
}
if (!force) {
projectsDb.updateProjectIsArchivedById(projectId, true);
return;
}
await deleteSessionJsonlFilesForProjectPath(row.project_path);
sessionsDb.deleteSessionsByProjectPath(row.project_path);
projectsDb.deleteProjectById(projectId);
}
/**
* Restores one archived project row back into the active project list.
*/
export function restoreArchivedProject(projectId: string): void {
const row = projectsDb.getProjectById(projectId);
if (!row) {
throw new AppError(`Unknown projectId: ${projectId}`, {
code: 'PROJECT_NOT_FOUND',
statusCode: 404,
});
}
projectsDb.updateProjectIsArchivedById(projectId, false);
}
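A quick check of the dedup rule in `uniqueJsonlPathsFromSessions`, reproduced standalone so it can run here: trim, resolve to an absolute normalized path, and drop repeats so the same file is never unlinked twice.

```typescript
import path from 'node:path';

// Standalone copy of the dedup rule above (the service function is
// module-private). Relative paths resolve against the working directory.
function uniqueJsonlPaths(rows: Array<{ jsonl_path: string | null }>): string[] {
  const seen = new Set<string>();
  const out: string[] = [];
  for (const row of rows) {
    const raw = row.jsonl_path?.trim();
    if (!raw) continue;
    const absolute = path.isAbsolute(raw) ? path.normalize(raw) : path.resolve(raw);
    if (seen.has(absolute)) continue;
    seen.add(absolute);
    out.push(absolute);
  }
  return out;
}

console.log(
  uniqueJsonlPaths([
    { jsonl_path: '/tmp/a.jsonl' },
    { jsonl_path: '/tmp//a.jsonl' }, // normalizes to the same file
    { jsonl_path: null },
    { jsonl_path: '   ' },
  ]),
); // → ['/tmp/a.jsonl'] on POSIX
```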

View File

@@ -0,0 +1,150 @@
import fs from 'node:fs/promises';
import path from 'node:path';
import { projectsDb } from '@/modules/database/index.js';
import type {
CreateProjectPathResult,
ProjectRepositoryRow,
WorkspacePathValidationResult,
} from '@/shared/types.js';
import { AppError, normalizeProjectPath, validateWorkspacePath } from '@/shared/utils.js';
type CreateProjectInput = {
projectPath: string;
customName?: string | null;
};
type CreateProjectDependencies = {
validatePath: (projectPath: string) => Promise<WorkspacePathValidationResult>;
ensureWorkspaceDirectory: (projectPath: string) => Promise<void>;
persistProjectPath: (projectPath: string, customName: string | null) => CreateProjectPathResult;
getProjectByPath: (projectPath: string) => ProjectRepositoryRow | null;
};
type ProjectApiView = {
projectId: string;
path: string;
fullPath: string;
displayName: string;
customName: string | null;
isArchived: boolean;
isStarred: boolean;
sessions: [];
cursorSessions: [];
codexSessions: [];
geminiSessions: [];
sessionMeta: {
hasMore: false;
total: 0;
};
};
type CreateProjectServiceResult = {
outcome: 'created' | 'reactivated_archived';
project: ProjectApiView;
};
const defaultDependencies: CreateProjectDependencies = {
validatePath: validateWorkspacePath,
ensureWorkspaceDirectory: async (projectPath: string): Promise<void> => {
await fs.mkdir(projectPath, { recursive: true });
const directoryStats = await fs.stat(projectPath);
if (!directoryStats.isDirectory()) {
throw new AppError('Path exists but is not a directory', {
code: 'PROJECT_PATH_NOT_DIRECTORY',
statusCode: 400,
});
}
},
persistProjectPath: (projectPath: string, customName: string | null): CreateProjectPathResult =>
projectsDb.createProjectPath(projectPath, customName),
getProjectByPath: (projectPath: string): ProjectRepositoryRow | null =>
projectsDb.getProjectPath(projectPath),
};
function resolveDisplayName(customName: string | null | undefined, projectPath: string): string {
const trimmedCustomName = typeof customName === 'string' ? customName.trim() : '';
if (trimmedCustomName.length > 0) {
return trimmedCustomName;
}
return path.basename(projectPath) || projectPath;
}
function mapProjectRowToApiView(projectRow: ProjectRepositoryRow): ProjectApiView {
return {
projectId: projectRow.project_id,
path: projectRow.project_path,
fullPath: projectRow.project_path,
displayName: resolveDisplayName(projectRow.custom_project_name, projectRow.project_path),
customName: projectRow.custom_project_name,
isArchived: Boolean(projectRow.isArchived),
isStarred: Boolean(projectRow.isStarred),
sessions: [],
cursorSessions: [],
codexSessions: [],
geminiSessions: [],
sessionMeta: {
hasMore: false,
total: 0,
},
};
}
export async function createProject(
input: CreateProjectInput,
dependencies: CreateProjectDependencies = defaultDependencies,
): Promise<CreateProjectServiceResult> {
const normalizedPath = normalizeProjectPath(input.projectPath || '');
if (!normalizedPath) {
throw new AppError('path is required', {
code: 'PROJECT_PATH_REQUIRED',
statusCode: 400,
});
}
const pathValidation = await dependencies.validatePath(normalizedPath);
if (!pathValidation.valid || !pathValidation.resolvedPath) {
throw new AppError('Invalid project path', {
code: 'INVALID_PROJECT_PATH',
statusCode: 400,
details: pathValidation.error ?? 'Path validation failed',
});
}
const resolvedProjectPath = normalizeProjectPath(pathValidation.resolvedPath);
await dependencies.ensureWorkspaceDirectory(resolvedProjectPath);
const normalizedCustomName = resolveDisplayName(input.customName ?? null, resolvedProjectPath);
const persistedProject = dependencies.persistProjectPath(resolvedProjectPath, normalizedCustomName);
if (persistedProject.outcome === 'active_conflict') {
throw new AppError('Project path already exists and is active', {
code: 'PROJECT_ALREADY_EXISTS',
statusCode: 409,
details: `Project path already exists: ${resolvedProjectPath}`,
});
}
const projectRow = persistedProject.project ?? dependencies.getProjectByPath(resolvedProjectPath);
if (!projectRow) {
throw new AppError('Failed to resolve project after creation', {
code: 'PROJECT_CREATE_FAILED',
statusCode: 500,
});
}
// Archived rows intentionally remain archived when their path is reused.
return {
outcome: persistedProject.outcome,
project: mapProjectRowToApiView(projectRow),
};
}
/**
* Sets `projects.custom_project_name` for the given `projectId` (or clears it when empty).
*/
export function updateProjectDisplayName(projectId: string, newDisplayName: unknown): void {
const trimmed = typeof newDisplayName === 'string' ? newDisplayName.trim() : '';
projectsDb.updateCustomProjectNameById(projectId, trimmed.length > 0 ? trimmed : null);
}
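The display-name fallback above is worth pinning down, since it runs both when mapping rows to the API view and when normalizing the `customName` input. A standalone sketch of the rule: a non-blank custom name wins, otherwise the path's basename, otherwise the path itself.

```typescript
import path from 'node:path';

// Standalone copy of resolveDisplayName's fallback chain.
function resolveDisplayName(customName: string | null, projectPath: string): string {
  const trimmed = typeof customName === 'string' ? customName.trim() : '';
  if (trimmed.length > 0) return trimmed;
  return path.basename(projectPath) || projectPath;
}

console.log(resolveDisplayName('  My App  ', '/workspace/demo')); // 'My App'
console.log(resolveDisplayName(null, '/workspace/demo')); // 'demo'
console.log(resolveDisplayName('', '/')); // '/' — basename('/') is empty
```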

View File

@@ -0,0 +1,78 @@
import { projectsDb } from '@/modules/database/index.js';
import { AppError } from '@/shared/utils.js';
type ToggleProjectStarResult = {
isStarred: boolean;
};
type ApplyLegacyStarredProjectIdsResult = {
updated: number;
};
function normalizeProjectId(projectId: string): string {
return projectId.trim();
}
function uniqueProjectIds(projectIds: string[]): string[] {
const uniqueIds = new Set<string>();
for (const projectId of projectIds) {
const normalizedProjectId = normalizeProjectId(projectId);
if (!normalizedProjectId) {
continue;
}
uniqueIds.add(normalizedProjectId);
}
return [...uniqueIds];
}
/**
* Applies legacy `localStorage` stars keyed by DB `projectId` onto `projects.isStarred`.
*
* The operation is idempotent: already-starred projects are ignored, unknown ids are skipped.
*/
export function applyLegacyStarredProjectIds(projectIds: string[]): ApplyLegacyStarredProjectIdsResult {
const normalizedProjectIds = uniqueProjectIds(projectIds);
let updated = 0;
for (const projectId of normalizedProjectIds) {
const project = projectsDb.getProjectById(projectId);
if (!project) {
continue;
}
if (project.isStarred) {
continue;
}
projectsDb.updateProjectIsStarredById(projectId, true);
updated += 1;
}
return { updated };
}
/**
* Flips `projects.isStarred` for one project and returns the new state.
*/
export function toggleProjectStar(projectId: string): ToggleProjectStarResult {
const normalizedProjectId = normalizeProjectId(projectId);
if (!normalizedProjectId) {
throw new AppError('projectId is required', {
code: 'PROJECT_ID_REQUIRED',
statusCode: 400,
});
}
const project = projectsDb.getProjectById(normalizedProjectId);
if (!project) {
throw new AppError('Project not found', {
code: 'PROJECT_NOT_FOUND',
statusCode: 404,
});
}
const nextStarredState = !project.isStarred;
projectsDb.updateProjectIsStarredById(normalizedProjectId, nextStarredState);
return { isStarred: nextStarredState };
}
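The idempotency of the legacy-star migration rests on the id normalization feeding it: trim each id, drop blanks, keep the first occurrence of duplicates. A standalone sketch of that step (the service helper is module-private):

```typescript
// Standalone copy of the id normalization used by the migration:
// a Set preserves insertion order while collapsing duplicates.
function uniqueProjectIds(projectIds: string[]): string[] {
  const unique = new Set<string>();
  for (const id of projectIds) {
    const normalized = id.trim();
    if (!normalized) continue;
    unique.add(normalized);
  }
  return [...unique];
}

console.log(uniqueProjectIds([' p1 ', 'p1', '', 'p2'])); // ['p1', 'p2']
```

Combined with the "skip already-starred, skip unknown" checks in `applyLegacyStarredProjectIds`, replaying the same payload can never double-count `updated`.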

View File

@@ -0,0 +1,248 @@
import { access, readFile, stat } from 'node:fs/promises';
import path from 'node:path';
import { projectsDb } from '@/modules/database/index.js';
import { AppError } from '@/shared/utils.js';
type TaskMasterTask = {
status?: string;
subtasks?: Array<{
status?: string;
}>;
};
type TaskMasterMetadata =
| {
taskCount: number;
subtaskCount: number;
completed: number;
pending: number;
inProgress: number;
review: number;
completionPercentage: number;
lastModified: string;
}
| {
error: string;
}
| null;
type TaskMasterDetectionResult = {
hasTaskmaster: boolean;
hasEssentialFiles?: boolean;
files?: Record<string, boolean>;
metadata?: TaskMasterMetadata;
path?: string;
reason?: string;
};
type NormalizedTaskMasterInfo = {
hasTaskmaster: boolean;
hasEssentialFiles: boolean;
metadata: TaskMasterMetadata;
status: 'configured' | 'not-configured';
};
type GetProjectTaskMasterByIdResult = {
projectId: string;
projectPath: string;
taskmaster: NormalizedTaskMasterInfo;
};
type GetProjectTaskMasterDependencies = {
resolveProjectPathById: (projectId: string) => string | null;
detectTaskMasterFolder: (projectPath: string) => Promise<TaskMasterDetectionResult>;
};
type GetProjectTaskMasterResolver = (projectId: string) => Promise<GetProjectTaskMasterByIdResult | null>;
function extractTasksFromJson(tasksData: unknown): TaskMasterTask[] {
if (!tasksData || typeof tasksData !== 'object') {
return [];
}
const legacyTasks = (tasksData as { tasks?: unknown }).tasks;
if (Array.isArray(legacyTasks)) {
return legacyTasks as TaskMasterTask[];
}
const taggedTaskCollections: TaskMasterTask[] = [];
for (const tagValue of Object.values(tasksData)) {
if (!tagValue || typeof tagValue !== 'object') {
continue;
}
const tagTasks = (tagValue as { tasks?: unknown }).tasks;
if (Array.isArray(tagTasks)) {
taggedTaskCollections.push(...(tagTasks as TaskMasterTask[]));
}
}
return taggedTaskCollections;
}
async function detectTaskMasterFolder(projectPath: string): Promise<TaskMasterDetectionResult> {
try {
const taskMasterPath = path.join(projectPath, '.taskmaster');
try {
const taskMasterStats = await stat(taskMasterPath);
if (!taskMasterStats.isDirectory()) {
return {
hasTaskmaster: false,
reason: '.taskmaster exists but is not a directory',
};
}
} catch (error) {
const fileError = error as NodeJS.ErrnoException;
if (fileError.code === 'ENOENT') {
return {
hasTaskmaster: false,
reason: '.taskmaster directory not found',
};
}
throw fileError;
}
const keyFiles = ['tasks/tasks.json', 'config.json'];
const fileStatus: Record<string, boolean> = {};
let hasEssentialFiles = true;
for (const fileName of keyFiles) {
const absoluteFilePath = path.join(taskMasterPath, fileName);
try {
await access(absoluteFilePath);
fileStatus[fileName] = true;
} catch {
fileStatus[fileName] = false;
if (fileName === 'tasks/tasks.json') {
hasEssentialFiles = false;
}
}
}
let taskMetadata: TaskMasterMetadata = null;
if (fileStatus['tasks/tasks.json']) {
const tasksPath = path.join(taskMasterPath, 'tasks/tasks.json');
try {
const tasksContent = await readFile(tasksPath, 'utf8');
const parsedTasksJson = JSON.parse(tasksContent) as unknown;
const tasks = extractTasksFromJson(parsedTasksJson);
const stats = tasks.reduce(
(accumulator, currentTask) => {
accumulator.total += 1;
const normalizedTaskStatus = currentTask.status || 'pending';
accumulator.byStatus[normalizedTaskStatus] = (accumulator.byStatus[normalizedTaskStatus] || 0) + 1;
if (Array.isArray(currentTask.subtasks)) {
for (const subtask of currentTask.subtasks) {
accumulator.subtotalTasks += 1;
const normalizedSubtaskStatus = subtask.status || 'pending';
accumulator.subtaskByStatus[normalizedSubtaskStatus] =
(accumulator.subtaskByStatus[normalizedSubtaskStatus] || 0) + 1;
}
}
return accumulator;
},
{
total: 0,
subtotalTasks: 0,
byStatus: {} as Record<string, number>,
subtaskByStatus: {} as Record<string, number>,
},
);
const tasksStat = await stat(tasksPath);
taskMetadata = {
taskCount: stats.total,
subtaskCount: stats.subtotalTasks,
completed: stats.byStatus.done || 0,
pending: stats.byStatus.pending || 0,
inProgress: stats.byStatus['in-progress'] || 0,
review: stats.byStatus.review || 0,
completionPercentage: stats.total > 0 ? Math.round(((stats.byStatus.done || 0) / stats.total) * 100) : 0,
lastModified: tasksStat.mtime.toISOString(),
};
} catch (parseError) {
console.warn('Failed to parse tasks.json:', (parseError as Error).message);
taskMetadata = {
error: 'Failed to parse tasks.json',
};
}
}
return {
hasTaskmaster: true,
hasEssentialFiles,
files: fileStatus,
metadata: taskMetadata,
path: taskMasterPath,
};
} catch (error) {
console.error('Error detecting TaskMaster folder:', error);
return {
hasTaskmaster: false,
reason: `Error checking directory: ${(error as Error).message}`,
};
}
}
function normalizeTaskMasterInfo(taskMasterResult: TaskMasterDetectionResult | null = null): NormalizedTaskMasterInfo {
const hasTaskmaster = Boolean(taskMasterResult?.hasTaskmaster);
const hasEssentialFiles = Boolean(taskMasterResult?.hasEssentialFiles);
return {
hasTaskmaster,
hasEssentialFiles,
metadata: taskMasterResult?.metadata ?? null,
status: hasTaskmaster && hasEssentialFiles ? 'configured' : 'not-configured',
};
}
const defaultDependencies: GetProjectTaskMasterDependencies = {
resolveProjectPathById: (projectId: string): string | null => projectsDb.getProjectPathById(projectId),
detectTaskMasterFolder,
};
export async function getProjectTaskMasterById(
projectId: string,
dependencies: GetProjectTaskMasterDependencies = defaultDependencies,
): Promise<GetProjectTaskMasterByIdResult | null> {
const projectPath = dependencies.resolveProjectPathById(projectId);
if (!projectPath) {
return null;
}
const taskMasterResult = await dependencies.detectTaskMasterFolder(projectPath);
return {
projectId,
projectPath,
taskmaster: normalizeTaskMasterInfo(taskMasterResult),
};
}
export async function getProjectTaskMaster(
projectId: string,
resolveById: GetProjectTaskMasterResolver = getProjectTaskMasterById,
): Promise<GetProjectTaskMasterByIdResult> {
const normalizedProjectId = projectId.trim();
if (!normalizedProjectId) {
throw new AppError('projectId is required', {
code: 'PROJECT_ID_REQUIRED',
statusCode: 400,
});
}
const taskMasterDetails = await resolveById(normalizedProjectId);
if (!taskMasterDetails) {
throw new AppError('Project not found', {
code: 'PROJECT_NOT_FOUND',
statusCode: 404,
});
}
return taskMasterDetails;
}


@@ -0,0 +1,349 @@
import fs from 'node:fs/promises';
import path from 'node:path';
import { projectsDb, sessionsDb } from '@/modules/database/index.js';
import { sessionSynchronizerService } from '@/modules/providers/index.js';
import { WS_OPEN_STATE, connectedClients } from '@/modules/websocket/index.js';
import type { RealtimeClientConnection } from '@/shared/types.js';
import { AppError } from '@/shared/utils.js';
type SessionSummary = {
id: string;
summary: string;
messageCount: number;
lastActivity: string;
};
type SessionsByProvider = Record<'claude' | 'cursor' | 'codex' | 'gemini', SessionSummary[]>;
type SessionRepositoryRow = {
provider: string;
session_id: string;
custom_name?: string | null;
updated_at?: string | null;
created_at?: string | null;
};
export type ProjectListItem = {
projectId: string;
path: string;
displayName: string;
fullPath: string;
isStarred: boolean;
sessions: SessionSummary[];
cursorSessions: SessionSummary[];
codexSessions: SessionSummary[];
geminiSessions: SessionSummary[];
sessionMeta: {
hasMore: boolean;
total: number;
};
};
export type ArchivedProjectListItem = ProjectListItem & {
isArchived: true;
};
type ProgressUpdate = {
phase: 'loading' | 'complete';
current: number;
total: number;
currentProject?: string;
};
type GetProjectsWithSessionsOptions = {
skipSynchronization?: boolean;
sessionsLimit?: number;
sessionsOffset?: number;
};
type SessionPaginationOptions = {
limit?: number;
offset?: number;
};
type ProjectSessionsPageResult = {
sessionsByProvider: SessionsByProvider;
total: number;
hasMore: boolean;
};
export type ProjectSessionsPageApiView = {
projectId: string;
sessions: SessionSummary[];
cursorSessions: SessionSummary[];
codexSessions: SessionSummary[];
geminiSessions: SessionSummary[];
sessionMeta: {
hasMore: boolean;
total: number;
};
};
const DEFAULT_PROJECT_SESSIONS_PAGE_SIZE = 20;
const MAX_PROJECT_SESSIONS_PAGE_SIZE = 200;
/**
* Generate better display name from path.
*/
export async function generateDisplayName(projectName: string, actualProjectDir: string | null = null): Promise<string> {
// Use actual project directory if provided, otherwise decode from project name.
const projectPath = actualProjectDir || projectName.replace(/-/g, '/');
// Try to read package.json from the project path.
try {
const packageJsonPath = path.join(projectPath, 'package.json');
const packageData = await fs.readFile(packageJsonPath, 'utf8');
const packageJson = JSON.parse(packageData) as { name?: string };
// Return the name from package.json if it exists.
if (packageJson.name) {
return packageJson.name;
}
} catch {
// Fall back to path-based naming if package.json doesn't exist or can't be read.
}
// If it starts with /, it's an absolute path.
if (projectPath.startsWith('/')) {
const parts = projectPath.split('/').filter(Boolean);
// Return only the last folder name.
return parts[parts.length - 1] || projectPath;
}
return projectPath;
}
function normalizeSessionPagination(options: SessionPaginationOptions = {}): { limit: number; offset: number } {
const rawLimit = Number.isFinite(options.limit) ? Math.floor(Number(options.limit)) : DEFAULT_PROJECT_SESSIONS_PAGE_SIZE;
const rawOffset = Number.isFinite(options.offset) ? Math.floor(Number(options.offset)) : 0;
return {
limit: Math.min(Math.max(1, rawLimit), MAX_PROJECT_SESSIONS_PAGE_SIZE),
offset: Math.max(0, rawOffset),
};
}
function mapSessionRowToSummary(row: SessionRepositoryRow): SessionSummary {
return {
id: row.session_id,
summary: row.custom_name || '',
messageCount: 0,
lastActivity: row.updated_at ?? row.created_at ?? new Date().toISOString(),
};
}
function bucketSessionRowsByProvider(rows: SessionRepositoryRow[]): SessionsByProvider {
const byProvider: SessionsByProvider = {
claude: [],
cursor: [],
codex: [],
gemini: [],
};
for (const row of rows) {
const provider = row.provider as keyof SessionsByProvider;
const bucket = byProvider[provider];
if (!bucket) {
continue;
}
bucket.push(mapSessionRowToSummary(row));
}
return byProvider;
}
function readProjectSessionsIncludingArchived(projectPath: string): ProjectSessionsPageResult {
const rows = sessionsDb.getSessionsByProjectPathIncludingArchived(projectPath) as SessionRepositoryRow[];
return {
sessionsByProvider: bucketSessionRowsByProvider(rows),
total: rows.length,
hasMore: false,
};
}
/**
* Reads one paginated project session slice from the DB and groups rows by provider.
*/
function readProjectSessionsPageByPath(
projectPath: string,
options: SessionPaginationOptions = {},
): ProjectSessionsPageResult {
const pagination = normalizeSessionPagination(options);
const rows = sessionsDb.getSessionsByProjectPathPage(
projectPath,
pagination.limit,
pagination.offset,
) as SessionRepositoryRow[];
const total = sessionsDb.countSessionsByProjectPath(projectPath);
return {
sessionsByProvider: bucketSessionRowsByProvider(rows),
total,
hasMore: pagination.offset + rows.length < total,
};
}
// Broadcast progress to all connected WebSocket clients
function broadcastProgress(progress: ProgressUpdate) {
const message = JSON.stringify({
type: 'loading_progress',
...progress,
});
connectedClients.forEach((client: RealtimeClientConnection) => {
if (client.readyState === WS_OPEN_STATE) {
client.send(message);
}
});
}
/**
* Reads all projects from DB and returns provider-bucketed session summaries.
*/
export async function getProjectsWithSessions(
options: GetProjectsWithSessionsOptions = {}
): Promise<ProjectListItem[]> {
if (!options.skipSynchronization) {
await sessionSynchronizerService.synchronizeSessions();
}
const projectRows = projectsDb.getProjectPaths() as Array<{
project_id: string;
project_path: string;
custom_project_name?: string | null;
isStarred?: number;
}>;
const totalProjects = projectRows.length;
const projects: ProjectListItem[] = [];
let processedProjects = 0;
for (const row of projectRows) {
processedProjects += 1;
const projectId = row.project_id;
const projectPath = row.project_path;
broadcastProgress({
phase: 'loading',
current: processedProjects,
total: totalProjects,
currentProject: projectPath,
});
const displayName =
row.custom_project_name && row.custom_project_name.trim().length > 0
? row.custom_project_name
: await generateDisplayName(path.basename(projectPath) || projectPath, projectPath);
const sessionsPage = readProjectSessionsPageByPath(projectPath, {
limit: options.sessionsLimit,
offset: options.sessionsOffset,
});
projects.push({
projectId,
path: projectPath,
displayName,
fullPath: projectPath,
isStarred: Boolean(row.isStarred),
sessions: sessionsPage.sessionsByProvider.claude,
cursorSessions: sessionsPage.sessionsByProvider.cursor,
codexSessions: sessionsPage.sessionsByProvider.codex,
geminiSessions: sessionsPage.sessionsByProvider.gemini,
sessionMeta: {
hasMore: sessionsPage.hasMore,
total: sessionsPage.total,
},
});
}
broadcastProgress({
phase: 'complete',
current: totalProjects,
total: totalProjects,
});
return projects;
}
/**
* Reads archived projects from DB and includes every session row for each
* project path, because an archived workspace should surface all preserved
* conversation history in the archive view regardless of each session's flag.
*/
export async function getArchivedProjectsWithSessions(
options: Pick<GetProjectsWithSessionsOptions, 'skipSynchronization'> = {},
): Promise<ArchivedProjectListItem[]> {
if (!options.skipSynchronization) {
await sessionSynchronizerService.synchronizeSessions();
}
const projectRows = projectsDb.getArchivedProjectPaths() as Array<{
project_id: string;
project_path: string;
custom_project_name?: string | null;
isStarred?: number;
}>;
const archivedProjects: ArchivedProjectListItem[] = [];
for (const row of projectRows) {
const displayName =
row.custom_project_name && row.custom_project_name.trim().length > 0
? row.custom_project_name
: await generateDisplayName(path.basename(row.project_path) || row.project_path, row.project_path);
const sessionsPage = readProjectSessionsIncludingArchived(row.project_path);
archivedProjects.push({
projectId: row.project_id,
path: row.project_path,
displayName,
fullPath: row.project_path,
isStarred: Boolean(row.isStarred),
isArchived: true,
sessions: sessionsPage.sessionsByProvider.claude,
cursorSessions: sessionsPage.sessionsByProvider.cursor,
codexSessions: sessionsPage.sessionsByProvider.codex,
geminiSessions: sessionsPage.sessionsByProvider.gemini,
sessionMeta: {
hasMore: sessionsPage.hasMore,
total: sessionsPage.total,
},
});
}
return archivedProjects;
}
/**
* Loads one paginated session slice for a specific project id.
*/
export async function getProjectSessionsPage(
projectId: string,
options: SessionPaginationOptions = {},
): Promise<ProjectSessionsPageApiView> {
const projectRow = projectsDb.getProjectById(projectId);
if (!projectRow) {
throw new AppError(`Project "${projectId}" was not found.`, {
code: 'PROJECT_NOT_FOUND',
statusCode: 404,
});
}
const sessionsPage = readProjectSessionsPageByPath(projectRow.project_path, options);
return {
projectId: projectRow.project_id,
sessions: sessionsPage.sessionsByProvider.claude,
cursorSessions: sessionsPage.sessionsByProvider.cursor,
codexSessions: sessionsPage.sessionsByProvider.codex,
geminiSessions: sessionsPage.sessionsByProvider.gemini,
sessionMeta: {
hasMore: sessionsPage.hasMore,
total: sessionsPage.total,
},
};
}


@@ -0,0 +1,183 @@
import assert from 'node:assert/strict';
import { EventEmitter } from 'node:events';
import path from 'node:path';
import { PassThrough } from 'node:stream';
import test from 'node:test';
import { startCloneProject } from '@/modules/projects/services/project-clone.service.js';
import { AppError } from '@/shared/utils.js';
type TestDependencies = Parameters<typeof startCloneProject>[2];
function buildDependencies(overrides: Partial<NonNullable<TestDependencies>> = {}): NonNullable<TestDependencies> {
return {
validatePath: async () => ({ valid: true, resolvedPath: '/workspace/root' }),
ensureDirectory: async () => undefined,
pathExists: async () => false,
removePath: async () => undefined,
getGithubTokenById: async () => ({ github_token: 'token-value' }),
spawnGitClone: () => {
throw new Error('spawnGitClone should be overridden in this test');
},
registerProject: async () => ({ project: { projectId: 'project-1' } }),
logError: () => undefined,
...overrides,
};
}
function createMockGitProcess() {
const emitter = new EventEmitter() as EventEmitter & {
stdout: PassThrough;
stderr: PassThrough;
kill: () => void;
};
emitter.stdout = new PassThrough();
emitter.stderr = new PassThrough();
emitter.kill = () => {
emitter.emit('close', null);
};
return emitter;
}
test('startCloneProject rejects when workspace path is missing', async () => {
await assert.rejects(
async () =>
startCloneProject(
{
workspacePath: '',
githubUrl: 'https://github.com/example/repo',
userId: 1,
},
{
onProgress: () => undefined,
onComplete: () => undefined,
},
buildDependencies(),
),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'WORKSPACE_PATH_REQUIRED');
return true;
},
);
});
test('startCloneProject rejects when github URL is missing', async () => {
await assert.rejects(
async () =>
startCloneProject(
{
workspacePath: '/workspace/root',
githubUrl: '',
userId: 1,
},
{
onProgress: () => undefined,
onComplete: () => undefined,
},
buildDependencies(),
),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'GITHUB_URL_REQUIRED');
return true;
},
);
});
test('startCloneProject rejects github URL values that begin with option prefixes', async () => {
await assert.rejects(
async () =>
startCloneProject(
{
workspacePath: '/workspace/root',
githubUrl: '--upload-pack=malicious',
userId: 1,
},
{
onProgress: () => undefined,
onComplete: () => undefined,
},
buildDependencies(),
),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'INVALID_GITHUB_URL');
return true;
},
);
});
test('startCloneProject rejects when selected github token does not exist', async () => {
await assert.rejects(
async () =>
startCloneProject(
{
workspacePath: '/workspace/root',
githubUrl: 'https://github.com/example/repo',
githubTokenId: 12,
userId: 1,
},
{
onProgress: () => undefined,
onComplete: () => undefined,
},
buildDependencies({
getGithubTokenById: async () => null,
}),
),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'GITHUB_TOKEN_NOT_FOUND');
return true;
},
);
});
test('startCloneProject completes and emits complete payload when git exits successfully', async () => {
const gitProcess = createMockGitProcess();
const progressMessages: string[] = [];
let completePayload: { project: Record<string, unknown>; message: string } | null = null;
let capturedProjectPath = '';
let capturedCustomName = '';
const operation = await startCloneProject(
{
workspacePath: '/workspace/root',
githubUrl: 'https://github.com/example/repo.git',
userId: 1,
},
{
onProgress: (message) => {
progressMessages.push(message);
},
onComplete: (payload: { project: Record<string, unknown>; message: string }) => {
completePayload = payload;
},
},
buildDependencies({
spawnGitClone: () => gitProcess as any,
registerProject: async (projectPath, customName) => {
capturedProjectPath = projectPath;
capturedCustomName = customName;
return { project: { projectId: 'project-1', path: projectPath } };
},
}),
);
gitProcess.emit('close', 0);
await operation.waitForCompletion;
assert.ok(progressMessages.some((message) => message.includes("Cloning into 'repo'")));
assert.equal(capturedCustomName, 'repo');
assert.equal(path.basename(capturedProjectPath), 'repo');
assert.notEqual(completePayload, null);
const resolvedCompletePayload = completePayload as unknown as {
project: Record<string, unknown>;
message: string;
};
assert.equal(resolvedCompletePayload.message, 'Repository cloned successfully');
assert.equal((resolvedCompletePayload.project.projectId as string) || '', 'project-1');
});


@@ -0,0 +1,117 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { createProject } from '@/modules/projects/services/project-management.service.js';
import { AppError } from '@/shared/utils.js';
const projectRow = {
project_id: 'project-1',
project_path: '/workspace/my-project',
custom_project_name: 'my-project',
isStarred: 0,
isArchived: 0,
};
test('createProject throws when project path is missing', async () => {
await assert.rejects(
async () => createProject({ projectPath: '' }),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'PROJECT_PATH_REQUIRED');
assert.equal(error.statusCode, 400);
return true;
},
);
});
test('createProject throws when path validation fails', async () => {
await assert.rejects(
async () =>
createProject(
{ projectPath: '/invalid/path' },
{
validatePath: async () => ({ valid: false, error: 'blocked path' }),
ensureWorkspaceDirectory: async () => undefined,
persistProjectPath: () => ({ outcome: 'created', project: projectRow }),
getProjectByPath: () => projectRow,
},
),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'INVALID_PROJECT_PATH');
assert.equal(error.statusCode, 400);
assert.equal(error.details, 'blocked path');
return true;
},
);
});
test('createProject throws conflict when active project path already exists', async () => {
await assert.rejects(
async () =>
createProject(
{ projectPath: '/workspace/my-project' },
{
validatePath: async () => ({ valid: true, resolvedPath: '/workspace/my-project' }),
ensureWorkspaceDirectory: async () => undefined,
persistProjectPath: () => ({ outcome: 'active_conflict', project: projectRow }),
getProjectByPath: () => projectRow,
},
),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'PROJECT_ALREADY_EXISTS');
assert.equal(error.statusCode, 409);
assert.equal(error.details, 'Project path already exists: /workspace/my-project');
return true;
},
);
});
test('createProject falls back to directory name when custom name is not provided', async () => {
let capturedCustomName: string | null = null;
const result = await createProject(
{ projectPath: '/workspace/my-project', customName: '' },
{
validatePath: async () => ({ valid: true, resolvedPath: '/workspace/my-project' }),
ensureWorkspaceDirectory: async () => undefined,
persistProjectPath: (_projectPath, customName) => {
capturedCustomName = customName;
return {
outcome: 'created',
project: {
...projectRow,
custom_project_name: customName,
},
};
},
getProjectByPath: () => projectRow,
},
);
assert.equal(capturedCustomName, 'my-project');
assert.equal(result.outcome, 'created');
assert.equal(result.project.displayName, 'my-project');
});
test('createProject returns archived reuse outcome when archived row is reused', async () => {
const result = await createProject(
{ projectPath: '/workspace/my-project' },
{
validatePath: async () => ({ valid: true, resolvedPath: '/workspace/my-project' }),
ensureWorkspaceDirectory: async () => undefined,
persistProjectPath: () => ({
outcome: 'reactivated_archived',
project: {
...projectRow,
isArchived: 1,
},
}),
getProjectByPath: () => projectRow,
},
);
assert.equal(result.outcome, 'reactivated_archived');
assert.equal(result.project.isArchived, true);
});


@@ -0,0 +1,123 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { projectsDb } from '@/modules/database/index.js';
import { applyLegacyStarredProjectIds, toggleProjectStar } from '@/modules/projects/services/project-star.service.js';
import { AppError } from '@/shared/utils.js';
type ProjectRow = {
project_id: string;
project_path: string;
custom_project_name: string | null;
isStarred: number;
isArchived: number;
};
test('toggleProjectStar throws when projectId is missing', () => {
assert.throws(
() => toggleProjectStar(' '),
(error: unknown) =>
error instanceof AppError
&& error.code === 'PROJECT_ID_REQUIRED'
&& error.statusCode === 400,
);
});
test('toggleProjectStar throws when project does not exist', () => {
const originalGetProjectById = projectsDb.getProjectById;
try {
projectsDb.getProjectById = () => null;
assert.throws(
() => toggleProjectStar('project-1'),
(error: unknown) =>
error instanceof AppError
&& error.code === 'PROJECT_NOT_FOUND'
&& error.statusCode === 404,
);
} finally {
projectsDb.getProjectById = originalGetProjectById;
}
});
test('toggleProjectStar flips star state and persists it', () => {
const originalGetProjectById = projectsDb.getProjectById;
const originalUpdateProjectIsStarredById = projectsDb.updateProjectIsStarredById;
let capturedProjectId = '';
let capturedState = false;
try {
projectsDb.getProjectById = () =>
({
project_id: 'project-1',
project_path: '/workspace/project-1',
custom_project_name: 'project-1',
isStarred: 0,
isArchived: 0,
}) as ProjectRow;
projectsDb.updateProjectIsStarredById = (projectId: string, isStarred: boolean) => {
capturedProjectId = projectId;
capturedState = isStarred;
};
const result = toggleProjectStar('project-1');
assert.equal(result.isStarred, true);
assert.equal(capturedProjectId, 'project-1');
assert.equal(capturedState, true);
} finally {
projectsDb.getProjectById = originalGetProjectById;
projectsDb.updateProjectIsStarredById = originalUpdateProjectIsStarredById;
}
});
test('applyLegacyStarredProjectIds stars only valid, unstarred projects', () => {
const originalGetProjectById = projectsDb.getProjectById;
const originalUpdateProjectIsStarredById = projectsDb.updateProjectIsStarredById;
const updatedProjectIds: string[] = [];
try {
projectsDb.getProjectById = (projectId: string) => {
if (projectId === 'project-a') {
return {
project_id: 'project-a',
project_path: '/workspace/project-a',
custom_project_name: 'A',
isStarred: 0,
isArchived: 0,
} as ProjectRow;
}
if (projectId === 'project-b') {
return {
project_id: 'project-b',
project_path: '/workspace/project-b',
custom_project_name: 'B',
isStarred: 1,
isArchived: 0,
} as ProjectRow;
}
return null;
};
projectsDb.updateProjectIsStarredById = (projectId: string) => {
updatedProjectIds.push(projectId);
};
const result = applyLegacyStarredProjectIds([
'project-a',
'project-b',
'missing-project',
'project-a',
'',
' ',
]);
assert.equal(result.updated, 1);
assert.deepEqual(updatedProjectIds, ['project-a']);
} finally {
projectsDb.getProjectById = originalGetProjectById;
projectsDb.updateProjectIsStarredById = originalUpdateProjectIsStarredById;
}
});


@@ -0,0 +1,105 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import {
getProjectTaskMaster,
getProjectTaskMasterById,
} from '@/modules/projects/services/projects-has-taskmaster.service.js';
import { AppError } from '@/shared/utils.js';
test('getProjectTaskMasterById returns null when project path is missing', async () => {
const result = await getProjectTaskMasterById('project-1', {
resolveProjectPathById: () => null,
detectTaskMasterFolder: async () => {
throw new Error('detectTaskMasterFolder should not be called when path is missing');
},
});
assert.equal(result, null);
});
test('getProjectTaskMasterById returns configured status when taskmaster exists with essential files', async () => {
const result = await getProjectTaskMasterById('project-1', {
resolveProjectPathById: () => '/workspace/project-1',
detectTaskMasterFolder: async () => ({
hasTaskmaster: true,
hasEssentialFiles: true,
metadata: {
taskCount: 3,
subtaskCount: 0,
completed: 1,
pending: 2,
inProgress: 0,
review: 0,
completionPercentage: 33,
lastModified: '2026-01-01T00:00:00.000Z',
},
}),
});
assert.ok(result);
assert.equal(result.projectId, 'project-1');
assert.equal(result.projectPath, '/workspace/project-1');
assert.equal(result.taskmaster.hasTaskmaster, true);
assert.equal(result.taskmaster.hasEssentialFiles, true);
assert.equal(result.taskmaster.status, 'configured');
assert.deepEqual(result.taskmaster.metadata, {
taskCount: 3,
subtaskCount: 0,
completed: 1,
pending: 2,
inProgress: 0,
review: 0,
completionPercentage: 33,
lastModified: '2026-01-01T00:00:00.000Z',
});
});
test('getProjectTaskMasterById returns not-configured status when taskmaster is missing', async () => {
const result = await getProjectTaskMasterById('project-1', {
resolveProjectPathById: () => '/workspace/project-1',
detectTaskMasterFolder: async () => ({
hasTaskmaster: false,
}),
});
assert.ok(result);
assert.equal(result.taskmaster.hasTaskmaster, false);
assert.equal(result.taskmaster.hasEssentialFiles, false);
assert.equal(result.taskmaster.status, 'not-configured');
assert.equal(result.taskmaster.metadata, null);
});
test('getProjectTaskMaster throws when project id is missing', async () => {
await assert.rejects(
async () =>
getProjectTaskMaster('', async () => ({
projectId: 'project-1',
projectPath: '/workspace/project-1',
taskmaster: {
hasTaskmaster: true,
hasEssentialFiles: true,
metadata: null,
status: 'configured',
},
})),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'PROJECT_ID_REQUIRED');
assert.equal(error.statusCode, 400);
return true;
},
);
});
test('getProjectTaskMaster throws when project does not exist', async () => {
await assert.rejects(
async () => getProjectTaskMaster('project-that-does-not-exist', async () => null),
(error: unknown) => {
assert.ok(error instanceof AppError);
assert.equal(error.code, 'PROJECT_NOT_FOUND');
assert.equal(error.statusCode, 404);
return true;
},
);
});


@@ -0,0 +1,346 @@
# Providers Module Guide
This file documents the current provider contract in `server/modules/providers`.
Keep it current whenever provider wiring, skill discovery, or session sync
behavior changes. The goal is that a human or AI agent can add a new provider
without guessing which files need to change.
## Current Provider Shape
Every provider wrapper exposes five facets:
- `auth`
- `mcp`
- `skills`
- `sessions`
- `sessionSynchronizer`
These correspond to the shared interfaces in `server/shared/interfaces.ts`:
- `IProviderAuth`
- `IProviderMcp`
- `IProviderSkills`
- `IProviderSessions`
- `IProviderSessionSynchronizer`
The services that consume them are:
- `providerAuthService`
- `providerMcpService`
- `providerSkillsService`
- `sessionsService`
- `sessionSynchronizerService`
Current provider ids in this repo are:
- `claude`
- `codex`
- `cursor`
- `gemini`
Those ids are mirrored in backend unions and frontend provider constants. When
adding a new provider, update every place that hardcodes this list.
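Put together, the contract can be sketched as follows. The facet and interface names match the repo, but the members on each facet are illustrative assumptions for brevity; the real interfaces in `server/shared/interfaces.ts` carry more detail.

```typescript
// Illustrative sketch of the five-facet provider contract. Member lists
// are assumptions; see server/shared/interfaces.ts for the real shapes.
type ProviderId = 'claude' | 'codex' | 'cursor' | 'gemini';

interface IProviderAuth {
  getStatus(): Promise<{ installed: boolean; authenticated: boolean }>;
}
interface IProviderMcp {
  listServers(scope: string): Promise<unknown[]>;
}
interface IProviderSkills {
  discoverSkills(workspacePath: string): Promise<unknown[]>;
}
interface IProviderSessions {
  fetchHistory(sessionId: string): Promise<unknown[]>;
}
interface IProviderSessionSynchronizer {
  synchronize(since?: Date): Promise<void>;
}

// Every provider wrapper exposes the same five readonly facets.
interface ProviderWrapper {
  readonly id: ProviderId;
  readonly auth: IProviderAuth;
  readonly mcp: IProviderMcp;
  readonly skills: IProviderSkills;
  readonly sessions: IProviderSessions;
  readonly sessionSynchronizer: IProviderSessionSynchronizer;
}

const providerIds: readonly ProviderId[] = ['claude', 'codex', 'cursor', 'gemini'];
```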
## Current File Layout
Each provider lives under its own folder in `server/modules/providers/list/`:
```text
server/modules/providers/list/<provider>/
<provider>.provider.ts
<provider>-auth.provider.ts
<provider>-mcp.provider.ts
<provider>-skills.provider.ts
<provider>-sessions.provider.ts
<provider>-session-synchronizer.provider.ts
```
The existing provider folders are `claude`, `codex`, `cursor`, and `gemini`.
## What Each Facet Does
| Facet | Responsibility | Base / Service |
| --- | --- | --- |
| `auth` | Report install/auth state for the provider runtime | `IProviderAuth` -> `providerAuthService` |
| `mcp` | Read, list, write, and remove provider-native MCP config | `McpProvider` -> `providerMcpService` |
| `skills` | Discover provider-native skill markdown files | `SkillsProvider` -> `providerSkillsService` |
| `sessions` | Normalize live events and fetch session history | `IProviderSessions` -> `sessionsService` |
| `sessionSynchronizer` | Scan transcript artifacts and upsert session metadata | `IProviderSessionSynchronizer` -> `sessionSynchronizerService` |
`sessions` and `sessionSynchronizer` are separate concerns:
- `sessions` handles runtime event normalization and history fetches.
- `sessionSynchronizer` handles file-backed session indexing into `sessionsDb`.
## How To Add A Provider
1. Add the provider id everywhere it is part of the contract.
- Update `server/shared/types.ts` `LLMProvider`.
- Update `src/types/app.ts` `LLMProvider` if the frontend should know about it.
- Update `server/modules/providers/provider.routes.ts`.
- Update `server/routes/agent.js` if the provider is launchable from the agent runtime.
- Update `server/index.js` if the provider needs runtime boot or shutdown wiring.
- Update `shared/modelConstants.js` if the provider appears in UI provider pickers.
- Update `src/components/chat/hooks/useChatProviderState.ts` and
`src/components/chat/view/subcomponents/ProviderSelectionEmptyState.tsx` if
the provider should be selectable in chat.
- Update `src/components/provider-auth/view/ProviderLoginModal.tsx` if the
provider has a login/setup flow.
2. Create the wrapper class.
- Add `server/modules/providers/list/<provider>/<provider>.provider.ts`.
- Extend `AbstractProvider`.
- Expose readonly `auth`, `mcp`, `skills`, `sessions`, and `sessionSynchronizer`.
- Call `super('<provider>')`.
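A minimal wrapper following these steps might look like the sketch below. `AbstractProvider` is stubbed locally so the example is self-contained, and the facet bodies are placeholder assumptions rather than the repo's real implementations.

```typescript
// Local stand-in for the real AbstractProvider base class.
abstract class AbstractProvider {
  protected constructor(public readonly id: string) {}
}

class ExampleProvider extends AbstractProvider {
  // Placeholder facet implementations; real providers delegate to their
  // dedicated <provider>-*.provider.ts modules.
  readonly auth = {
    getStatus: async () => ({ installed: false, authenticated: false }),
  };
  readonly mcp = { listServers: async (_scope: string) => [] as unknown[] };
  readonly skills = { discoverSkills: async (_workspacePath: string) => [] as unknown[] };
  readonly sessions = { fetchHistory: async (_sessionId: string) => [] as unknown[] };
  readonly sessionSynchronizer = { synchronize: async (_since?: Date) => undefined };

  constructor() {
    super('example'); // the provider id shared with routes and services
  }
}
```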
3. Implement auth.
- Return a full `ProviderAuthStatus`.
- Treat normal `not installed` / `not authenticated` states as data, not exceptions.
- Keep provider-specific credential discovery inside the auth provider.
- If the provider has no auth step, return a stable unauthenticated or not-installed status instead of omitting the facet.
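The "states as data, not exceptions" rule can be sketched with a small classifier. The `ProviderAuthStatus` shape here is an assumption for illustration; the real type lives in the shared backend types.

```typescript
// Assumed minimal shape; the repo's ProviderAuthStatus has more fields.
type ProviderAuthStatus = {
  installed: boolean;
  authenticated: boolean;
  error?: string;
};

// "Not installed" and "not authenticated" are ordinary data states the UI
// can render, so they are returned rather than thrown.
function classifyAuthState(binaryFound: boolean, credentialsFound: boolean): ProviderAuthStatus {
  if (!binaryFound) {
    return { installed: false, authenticated: false };
  }
  if (!credentialsFound) {
    return { installed: true, authenticated: false };
  }
  return { installed: true, authenticated: true };
}
```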
4. Implement MCP.
- Extend `McpProvider`.
- Pass the supported scopes and transports to `super(...)`.
- Implement the four required methods:
- `readScopedServers(...)`
- `writeScopedServers(...)`
- `buildServerConfig(...)`
- `normalizeServerConfig(...)`
- Use the shared validation and normalization behavior from `McpProvider`.
- Keep the provider-specific config format local to the provider implementation.
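The build/normalize pair is the heart of keeping a provider's format local. A minimal sketch, assuming hypothetical field names (`cmd`, `endpoint`) that are not the repo's actual MCP schema:

```typescript
// Illustrative shapes; field names are assumptions for this sketch.
type NormalizedServer = {
  name: string;
  transport: 'stdio' | 'http';
  command?: string;
  url?: string;
};
type ExampleLocalServer = { cmd?: string; endpoint?: string };

// buildServerConfig: normalized shape -> provider-local on-disk format.
function buildServerConfig(server: NormalizedServer): ExampleLocalServer {
  return server.transport === 'stdio' ? { cmd: server.command } : { endpoint: server.url };
}

// normalizeServerConfig: provider-local format -> normalized shape.
function normalizeServerConfig(name: string, local: ExampleLocalServer): NormalizedServer {
  return local.cmd !== undefined
    ? { name, transport: 'stdio', command: local.cmd }
    : { name, transport: 'http', url: local.endpoint };
}
```

Keeping these two functions as exact inverses means `readScopedServers` and `writeScopedServers` can round-trip config files without drift.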
Current MCP formats in this repo are:
| Provider | User / Project Storage | Supported Scopes | Supported Transports |
| --- | --- | --- | --- |
| Claude | `.mcp.json` in user / local / project locations | `user`, `local`, `project` | `stdio`, `http`, `sse` |
| Codex | `.codex/config.toml` | `user`, `project` | `stdio`, `http` |
| Cursor | `.cursor/mcp.json` | `user`, `project` | `stdio`, `http` |
| Gemini | `.gemini/settings.json` | `user`, `project` | `stdio`, `http` |
5. Implement skills.
- Extend `SkillsProvider`.
- Implement `getSkillSources(workspacePath)`.
- Return the actual discovery roots for the provider.
- Skills are discovered from `SKILL.md` files.
- `readProviderSkillMarkdownDefinition(...)` reads front matter `name` and `description`.
- If `name` is missing, the parent directory name is used as a fallback.
- Use `recursive: true` only when the provider stores skills in nested trees.
- Keep the emitted `command` string aligned with the provider's real skill syntax.
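A `getSkillSources` implementation for a hypothetical "example" provider could look like this; the `.example` roots and the `SkillSource` field names are assumptions for illustration, not the repo's real API.

```typescript
import os from 'node:os';
import path from 'node:path';

// Assumed source descriptor; the real contract lives in SkillsProvider.
type SkillSource = { root: string; prefix: string; recursive: boolean };

function getSkillSources(workspacePath: string): SkillSource[] {
  return [
    // User-level skills: flat directory of <skill>/SKILL.md folders.
    { root: path.join(os.homedir(), '.example', 'skills'), prefix: '/', recursive: false },
    // Workspace skills: may be nested, so scan recursively.
    { root: path.join(workspacePath, '.example', 'skills'), prefix: '/', recursive: true },
  ];
}
```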
Current skill discovery roots are:
| Provider | User Roots | Project / Repo Roots | Prefix | Notes |
| --- | --- | --- | --- | --- |
| Claude | `~/.claude/skills` | `<workspace>/.claude/skills` | `/` | Also discovers Claude plugin skills from enabled plugin installs. Command skills live under `commands/`; markdown skills live under `skills/` and are scanned recursively. |
| Codex | `~/.agents/skills`, `~/.codex/skills/.system`, `/etc/codex/skills` | `<workspace>/.agents/skills`, `path.dirname(workspacePath)/.agents/skills`, topmost git root `.agents/skills` | `$` | Overlapping roots are deduplicated before scanning. |
| Cursor | `~/.cursor/skills` | `<workspace>/.cursor/skills`, `<workspace>/.agents/skills` | `/` | Uses slash-style commands. |
| Gemini | `~/.gemini/skills`, `~/.agents/skills` | `<workspace>/.gemini/skills`, `<workspace>/.agents/skills` | `/` | Uses slash-style commands. |
Command forms currently used by the providers are:
- Claude user/project skills: `/skill-name`
- Claude plugin skills: `/plugin-name:skill-name`
- Codex skills: `$skill-name`
- Cursor skills: `/skill-name`
- Gemini skills: `/skill-name`
6. Implement sessions.
- Implement `normalizeMessage(raw, sessionId)` and `fetchHistory(sessionId, options)`.
- Use `createNormalizedMessage(...)` and `generateMessageId(...)` for emitted messages.
- Keep normalized message ids unique. If one raw event produces multiple text
parts, append a discriminator so ids do not collide.
- Keep pagination consistent:
- `limit: null` means unbounded/full history.
- `limit: 0` means an empty page.
- Always return `total`, `hasMore`, `offset`, and `limit` when paginating.
- Sanitize any filesystem-derived ids before using them in file or database paths.
- Do not assume a provider's history format matches another provider's format.
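The pagination and id rules above can be sketched as pure helpers. Names here are illustrative, not the repo's actual API; providers paginate from the newest end of history.

```typescript
// Illustrative pagination over already-normalized messages:
// `limit: null` means full history, `limit: 0` means an empty page.
function paginateNewestFirst<T>(messages: T[], limit: number | null, offset: number) {
  const total = messages.length;
  if (limit === null) {
    return { messages, total, hasMore: false, offset: 0, limit: null };
  }
  const start = Math.max(0, total - offset - limit);
  const end = Math.max(0, total - offset);
  return { messages: messages.slice(start, end), total, hasMore: start > 0, offset, limit };
}

// Appending a discriminator keeps ids unique when one raw event
// produces multiple text parts.
function partId(baseId: string, partIndex: number): string {
  return partIndex === 0 ? baseId : `${baseId}-part-${partIndex}`;
}
```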
7. Implement session synchronization.
- Implement `synchronize(since?: Date)` to scan provider artifacts and upsert
sessions into `sessionsDb`.
- Implement `synchronizeFile(filePath)` for single-file watcher updates.
- Use the existing helpers when they fit:
- `buildLookupMap(...)`
- `extractFirstValidJsonlData(...)`
- `findFilesRecursivelyCreatedAfter(...)`
- `normalizeSessionName(...)`
- `readFileTimestamps(...)`
- Make the sync resilient to partial, malformed, or missing provider files.
- The orchestration service runs all provider synchronizers and only advances
`scan_state.last_scanned_at` when every provider succeeds.
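The all-or-nothing watermark rule can be sketched like this. The service and state names are assumptions for illustration only; the point is that a failed provider keeps `last_scanned_at` pinned so it is re-scanned from the same point on the next run.

```typescript
type ProviderSync = {
  name: string;
  synchronize: (since: Date | null) => Promise<number>;
};

// Runs every synchronizer; the watermark only advances when all succeed.
async function runAllSynchronizers(
  providers: ProviderSync[],
  lastScannedAt: Date | null,
): Promise<{ nextScannedAt: Date | null; allSucceeded: boolean }> {
  const startedAt = new Date();
  let allSucceeded = true;
  for (const provider of providers) {
    try {
      await provider.synchronize(lastScannedAt);
    } catch {
      allSucceeded = false; // Keep going so healthy providers still sync.
    }
  }
  return { nextScannedAt: allSucceeded ? startedAt : lastScannedAt, allSucceeded };
}
```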
Current session sync roots are:
| Provider | Scan Roots | Metadata Helpers / Notes |
| --- | --- | --- |
| Claude | `~/.claude/projects/**/*.jsonl` | Uses `~/.claude/history.jsonl` for name lookup and the trailing `ai-title`, `last-prompt`, or `custom-title` entries for title recovery. |
| Codex | `~/.codex/sessions/**/*.jsonl` | Uses `~/.codex/session_index.jsonl` for title lookup and the last `task_complete` message for a fallback title. |
| Cursor | `~/.cursor/projects/**/*.jsonl` | Uses sibling `worker.log` to recover `workspacePath`, then derives the session title from the first user prompt. |
| Gemini | `~/.gemini/tmp/**/*.jsonl` | Current full scans only index temp JSONL chat artifacts. Single-file sync also accepts legacy `.json` files. |
8. Register the provider.
- Add the new provider class to `server/modules/providers/provider.registry.ts`.
- Update the provider parsing in `server/modules/providers/provider.routes.ts`.
- If the provider introduces a new service or lifecycle hook, export it from the module entrypoint that consumes providers.
9. Wire runtime and UI surfaces outside the providers module when needed.
If the provider can run live chat sessions, update the runtime entrypoints too:
- `server/routes/agent.js`
- `server/index.js`
If the provider is visible in the UI, update:
- `shared/modelConstants.js`
- `src/components/chat/hooks/useChatProviderState.ts`
- `src/components/chat/view/subcomponents/ProviderSelectionEmptyState.tsx`
- `src/components/provider-auth/view/ProviderLoginModal.tsx`
## Minimal Wrapper Template
```ts
import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js';
import { <Provider>ProviderAuth } from './<provider>-auth.provider.js';
import { <Provider>McpProvider } from './<provider>-mcp.provider.js';
import { <Provider>SkillsProvider } from './<provider>-skills.provider.js';
import { <Provider>SessionsProvider } from './<provider>-sessions.provider.js';
import { <Provider>SessionSynchronizer } from './<provider>-session-synchronizer.provider.js';
import type {
IProviderAuth,
IProviderMcp,
IProviderSessionSynchronizer,
IProviderSessions,
IProviderSkills,
} from '@/shared/interfaces.js';
export class <Provider>Provider extends AbstractProvider {
readonly auth: IProviderAuth = new <Provider>ProviderAuth();
readonly mcp: IProviderMcp = new <Provider>McpProvider();
readonly skills: IProviderSkills = new <Provider>SkillsProvider();
readonly sessions: IProviderSessions = new <Provider>SessionsProvider();
readonly sessionSynchronizer: IProviderSessionSynchronizer =
new <Provider>SessionSynchronizer();
constructor() {
super('<provider>');
}
}
```
## Minimal Skills Template
```ts
import path from 'node:path';
import { SkillsProvider } from '@/modules/providers/shared/skills/skills.provider.js';
import type { ProviderSkillSource } from '@/shared/types.js';
export class <Provider>SkillsProvider extends SkillsProvider {
constructor() {
super('<provider>');
}
protected async getSkillSources(workspacePath: string): Promise<ProviderSkillSource[]> {
return [
{
scope: 'project',
rootDir: path.join(workspacePath, '.<provider>', 'skills'),
commandPrefix: '/',
},
];
}
}
```
## Minimal Session Sync Template
```ts
import type { IProviderSessionSynchronizer } from '@/shared/interfaces.js';
export class <Provider>SessionSynchronizer implements IProviderSessionSynchronizer {
async synchronize(since?: Date): Promise<number> {
return 0;
}
async synchronizeFile(filePath: string): Promise<string | null> {
return null;
}
}
```
## AI Prompt Template
Use this prompt when asking an AI agent to add a provider:
```text
Add a new provider "<provider>" using the current provider module architecture.
Requirements:
1) Create:
- server/modules/providers/list/<provider>/<provider>.provider.ts
- server/modules/providers/list/<provider>/<provider>-auth.provider.ts
- server/modules/providers/list/<provider>/<provider>-mcp.provider.ts
- server/modules/providers/list/<provider>/<provider>-skills.provider.ts
- server/modules/providers/list/<provider>/<provider>-sessions.provider.ts
- server/modules/providers/list/<provider>/<provider>-session-synchronizer.provider.ts
2) Register in:
- server/modules/providers/provider.registry.ts
- server/modules/providers/provider.routes.ts
- server/shared/types.ts LLMProvider
- src/types/app.ts LLMProvider
3) Mirror the nearest existing provider implementation for file naming, style,
and error handling.
4) Implement skills support with SkillsProvider and the current skill roots.
5) Implement session synchronization if the provider stores transcript files.
6) Ensure sessions use unique ids, safe path handling, and correct pagination.
7) Keep `sessions` and `sessionSynchronizer` separate.
8) Run:
- npx eslint <touched files>
- npx tsc --noEmit -p server/tsconfig.json
```
## Validation
After adding or changing a provider, run the relevant checks:
```bash
npx eslint server/modules/providers/**/*.ts server/shared/types.ts server/shared/interfaces.ts
npx tsc --noEmit -p server/tsconfig.json
```
Useful tests in this repo:
- `server/modules/providers/tests/mcp.test.ts`
- `server/modules/providers/tests/skills.test.ts`
If you touch sessions or session synchronization, add or update focused tests
alongside the implementation.
## Common Mistakes
- Adding provider files but forgetting `provider.registry.ts` or
`provider.routes.ts`.
- Updating backend provider ids but not `src/types/app.ts` or the frontend
provider constants.
- Omitting `skills` or `sessionSynchronizer` from the wrapper.
- Returning duplicate normalized message ids for split content.
- Treating `limit === 0` as unbounded history.
- Building file paths from raw session ids without validation.
- Hardcoding a skill root without checking the provider's actual discovery rules.
- Forgetting that Claude plugin skills are discovered differently from normal
user/project skill folders.
- Assuming one provider's MCP config file format works for the others.

View File

@@ -0,0 +1,5 @@
export { sessionSynchronizerService } from './services/session-synchronizer.service.js';
export { providerSkillsService } from './services/skills.service.js';
export { initializeSessionsWatcher } from './services/sessions-watcher.service.js';
export { closeSessionsWatcher } from './services/sessions-watcher.service.js';

View File

@@ -4,6 +4,7 @@ import path from 'node:path';
import spawn from 'cross-spawn';
import { resolveClaudeCodeExecutablePath } from '@/shared/claude-cli-path.js';
import type { IProviderAuth } from '@/shared/interfaces.js';
import type { ProviderAuthStatus } from '@/shared/types.js';
import { readObjectRecord, readOptionalString } from '@/shared/utils.js';
@@ -20,13 +21,13 @@ export class ClaudeProviderAuth implements IProviderAuth {
* Checks whether the Claude Code CLI is available on this host.
*/
private checkInstalled(): boolean {
const cliPath = process.env.CLAUDE_CLI_PATH || 'claude';
try {
spawn.sync(cliPath, ['--version'], { stdio: 'ignore', timeout: 5000 });
return true;
} catch {
return false;
}
const cliPath = resolveClaudeCodeExecutablePath(process.env.CLAUDE_CLI_PATH);
try {
spawn.sync(cliPath, ['--version'], { stdio: 'ignore', timeout: 5000 });
return true;
} catch {
return false;
}
}
/**

View File

@@ -0,0 +1,176 @@
import os from 'node:os';
import path from 'node:path';
import { readFile } from 'node:fs/promises';
import { sessionsDb } from '@/modules/database/index.js';
import {
buildLookupMap,
extractFirstValidJsonlData,
findFilesRecursivelyCreatedAfter,
normalizeSessionName,
readFileTimestamps,
} from '@/shared/utils.js';
import type { IProviderSessionSynchronizer } from '@/shared/interfaces.js';
type ParsedSession = {
sessionId: string;
projectPath: string;
sessionName?: string;
};
/**
* Session indexer for Claude transcript artifacts.
*/
export class ClaudeSessionSynchronizer implements IProviderSessionSynchronizer {
private readonly provider = 'claude' as const;
private readonly claudeHome = path.join(os.homedir(), '.claude');
/**
* Scans ~/.claude/projects and upserts discovered sessions into DB.
*/
async synchronize(since?: Date): Promise<number> {
const nameMap = await buildLookupMap(path.join(this.claudeHome, 'history.jsonl'), 'sessionId', 'display');
const files = await findFilesRecursivelyCreatedAfter(
path.join(this.claudeHome, 'projects'),
'.jsonl',
since ?? null
);
let processed = 0;
for (const filePath of files) {
const parsed = await this.processSessionFile(filePath, nameMap);
if (!parsed) {
continue;
}
const timestamps = await readFileTimestamps(filePath);
sessionsDb.createSession(
parsed.sessionId,
this.provider,
parsed.projectPath,
parsed.sessionName,
timestamps.createdAt,
timestamps.updatedAt,
filePath
);
processed += 1;
}
return processed;
}
/**
* Parses and upserts one Claude session JSONL file.
*/
async synchronizeFile(filePath: string): Promise<string | null> {
if (!filePath.endsWith('.jsonl')) {
return null;
}
const nameMap = await buildLookupMap(path.join(this.claudeHome, 'history.jsonl'), 'sessionId', 'display');
const parsed = await this.processSessionFile(filePath, nameMap);
if (!parsed) {
return null;
}
const timestamps = await readFileTimestamps(filePath);
return sessionsDb.createSession(
parsed.sessionId,
this.provider,
parsed.projectPath,
parsed.sessionName,
timestamps.createdAt,
timestamps.updatedAt,
filePath
);
}
/**
* Extracts session metadata from one Claude JSONL session file.
*/
private async processSessionFile(
filePath: string,
nameMap: Map<string, string>
): Promise<ParsedSession | null> {
const parsed = await extractFirstValidJsonlData(filePath, (rawData) => {
const data = rawData as Record<string, unknown>;
const sessionId = typeof data.sessionId === 'string' ? data.sessionId : undefined;
const projectPath = typeof data.cwd === 'string' ? data.cwd : undefined;
if (!sessionId || !projectPath) {
return null;
}
return {
sessionId,
projectPath,
};
});
if (!parsed) {
return null;
}
const existingSession = sessionsDb.getSessionById(parsed.sessionId);
const existingSessionName = existingSession?.custom_name;
if (existingSessionName && existingSessionName !== 'Untitled Claude Session') {
return {
...parsed,
sessionName: normalizeSessionName(existingSessionName, 'Untitled Claude Session'),
};
}
let sessionName = nameMap.get(parsed.sessionId);
if (!sessionName) {
sessionName = await this.extractSessionAiTitleFromEnd(filePath, parsed.sessionId);
}
return {
...parsed,
sessionName: normalizeSessionName(sessionName, 'Untitled Claude Session'),
};
}
private async extractSessionAiTitleFromEnd(
filePath: string,
sessionId: string
): Promise<string | undefined> {
try {
const content = await readFile(filePath, 'utf8');
const lines = content.split(/\r?\n/);
for (let index = lines.length - 1; index >= 0; index -= 1) {
const line = lines[index]?.trim();
if (!line) {
continue;
}
let parsed: unknown;
try {
parsed = JSON.parse(line);
} catch {
continue;
}
const data = parsed as Record<string, unknown>;
const eventType = typeof data.type === 'string' ? data.type : undefined;
const eventSessionId = typeof data.sessionId === 'string' ? data.sessionId : undefined;
const aiTitle = typeof data.aiTitle === 'string' ? data.aiTitle : undefined;
const lastPrompt = typeof data.lastPrompt === 'string' ? data.lastPrompt : undefined;
const claudeRenamedTitle = typeof data.customTitle === 'string' ? data.customTitle : undefined;
if (
(eventType === 'ai-title' && eventSessionId === sessionId && aiTitle?.trim()) ||
(eventType === 'last-prompt' && eventSessionId === sessionId && lastPrompt?.trim()) ||
(eventType === 'custom-title' && eventSessionId === sessionId && claudeRenamedTitle?.trim())
) {
return aiTitle || lastPrompt || claudeRenamedTitle;
}
}
} catch {
// Ignore missing/unreadable files so sync can continue.
}
return undefined;
}
}

View File

@@ -1,7 +1,12 @@
import { getSessionMessages } from '@/projects.js';
import fs from 'node:fs';
import fsp from 'node:fs/promises';
import path from 'node:path';
import readline from 'node:readline';
import type { IProviderSessions } from '@/shared/interfaces.js';
import type { AnyRecord, FetchHistoryOptions, FetchHistoryResult, NormalizedMessage } from '@/shared/types.js';
import { createNormalizedMessage, generateMessageId, readObjectRecord } from '@/shared/utils.js';
import { sessionsDb } from '@/modules/database/index.js';
const PROVIDER = 'claude';
@@ -15,30 +20,198 @@ type ClaudeToolResult = {
type ClaudeHistoryResult =
| AnyRecord[]
| {
messages?: AnyRecord[];
total?: number;
hasMore?: boolean;
};
messages?: AnyRecord[];
total?: number;
hasMore?: boolean;
};
const loadClaudeSessionMessages = getSessionMessages as unknown as (
projectName: string,
type ClaudeHistoryMessagesResult =
| AnyRecord[]
| {
messages: AnyRecord[];
total: number;
hasMore: boolean;
offset?: number;
limit?: number | null;
};
async function parseAgentTools(filePath: string): Promise<AnyRecord[]> {
const tools: AnyRecord[] = [];
try {
const fileStream = fs.createReadStream(filePath);
const rl = readline.createInterface({
input: fileStream,
crlfDelay: Infinity,
});
for await (const line of rl) {
if (!line.trim()) {
continue;
}
try {
const entry = JSON.parse(line) as AnyRecord;
if (entry.message?.role === 'assistant' && Array.isArray(entry.message?.content)) {
for (const part of entry.message.content as AnyRecord[]) {
if (part.type === 'tool_use') {
tools.push({
toolId: part.id,
toolName: part.name,
toolInput: part.input,
timestamp: entry.timestamp,
});
}
}
}
if (entry.message?.role === 'user' && Array.isArray(entry.message?.content)) {
for (const part of entry.message.content as AnyRecord[]) {
if (part.type !== 'tool_result') {
continue;
}
const tool = tools.find((candidate) => candidate.toolId === part.tool_use_id);
if (!tool) {
continue;
}
tool.toolResult = {
content: typeof part.content === 'string'
? part.content
: Array.isArray(part.content)
? part.content
.map((contentPart: AnyRecord) => contentPart?.text || '')
.join('\n')
: JSON.stringify(part.content),
isError: Boolean(part.is_error),
};
}
}
} catch {
// Skip malformed lines that can happen during concurrent writes.
}
}
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.warn(`Error parsing agent file ${filePath}:`, message);
}
return tools;
}
async function getSessionMessages(
sessionId: string,
limit: number | null,
offset: number,
) => Promise<ClaudeHistoryResult>;
): Promise<ClaudeHistoryMessagesResult> {
try {
const jsonLPath = sessionsDb.getSessionById(sessionId)?.jsonl_path;
if (!jsonLPath) {
return { messages: [], total: 0, hasMore: false };
}
const projectDir = path.dirname(jsonLPath);
const files = await fsp.readdir(projectDir);
const agentFiles = files.filter((file) => file.endsWith('.jsonl') && file.startsWith('agent-'));
const messages: AnyRecord[] = [];
const agentToolsCache = new Map<string, AnyRecord[]>();
const fileStream = fs.createReadStream(jsonLPath);
const rl = readline.createInterface({
input: fileStream,
crlfDelay: Infinity,
});
for await (const line of rl) {
if (!line.trim()) {
continue;
}
try {
const entry = JSON.parse(line) as AnyRecord;
if (entry.sessionId === sessionId) {
messages.push(entry);
}
} catch {
// Skip malformed JSONL lines that can happen during concurrent writes.
}
}
const agentIds = new Set<string>();
for (const message of messages) {
const agentId = message.toolUseResult?.agentId;
if (agentId) {
agentIds.add(String(agentId));
}
}
for (const agentId of agentIds) {
const agentFileName = `agent-${agentId}.jsonl`;
if (!agentFiles.includes(agentFileName)) {
continue;
}
const agentFilePath = path.join(projectDir, agentFileName);
const tools = await parseAgentTools(agentFilePath);
agentToolsCache.set(agentId, tools);
}
for (const message of messages) {
const agentId = message.toolUseResult?.agentId;
if (!agentId) {
continue;
}
const agentTools = agentToolsCache.get(String(agentId));
if (agentTools && agentTools.length > 0) {
message.subagentTools = agentTools;
}
}
const sortedMessages = messages.sort(
(a, b) => new Date(a.timestamp || 0).getTime() - new Date(b.timestamp || 0).getTime(),
);
const total = sortedMessages.length;
if (limit === null) {
return sortedMessages;
}
const startIndex = Math.max(0, total - offset - limit);
const endIndex = total - offset;
const paginatedMessages = sortedMessages.slice(startIndex, endIndex);
const hasMore = startIndex > 0;
return {
messages: paginatedMessages,
total,
hasMore,
offset,
limit,
};
} catch (error) {
console.error(`Error reading messages for session ${sessionId}:`, error);
return limit === null ? [] : { messages: [], total: 0, hasMore: false };
}
}
/**
* Claude writes internal command and system reminder entries into history.
* Those are useful for the CLI but should not appear in the user-facing chat.
* Claude writes a mix of truly internal transcript rows and "UI-hidden" local
* command artifacts into the same JSONL stream.
*
* Important distinction:
* - system reminders / caveats / interruption banners should stay hidden
* - local command payloads (`<command-name>...`) and stdout wrappers
* (`<local-command-stdout>...`) should be remapped into normal chat messages
* instead of being discarded as internal content
*/
const INTERNAL_CONTENT_PREFIXES = [
'<command-name>',
'<command-message>',
'<command-args>',
'<local-command-stdout>',
'<system-reminder>',
'Caveat:',
'This session is being continued from a previous',
'[Request interrupted',
] as const;
@@ -46,6 +219,73 @@ function isInternalContent(content: string): boolean {
return INTERNAL_CONTENT_PREFIXES.some((prefix) => content.startsWith(prefix));
}
/**
* Claude wraps local slash-command metadata in lightweight XML-like tags inside
* a plain string payload. We intentionally parse only the small tag surface we
* care about instead of introducing a generic XML parser for untrusted history.
*/
function extractTaggedContent(content: string, tagName: string): string | null {
const escapedTagName = tagName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
const match = new RegExp(`<${escapedTagName}>([\\s\\S]*?)<\\/${escapedTagName}>`).exec(content);
return match ? match[1] : null;
}
type ClaudeLocalCommandPayload = {
commandName: string;
commandMessage: string;
commandArgs: string;
};
/**
* Converts Claude's hidden local command wrapper into structured metadata.
*
* The three tags often coexist in one string payload. Returning `null` lets the
* normal text path continue untouched for unrelated messages.
*/
function parseLocalCommandPayload(content: string): ClaudeLocalCommandPayload | null {
const commandName = extractTaggedContent(content, 'command-name');
const commandMessage = extractTaggedContent(content, 'command-message');
const commandArgs = extractTaggedContent(content, 'command-args');
if (commandName === null && commandMessage === null && commandArgs === null) {
return null;
}
return {
commandName: commandName ?? '',
commandMessage: commandMessage ?? '',
commandArgs: commandArgs ?? '',
};
}
/**
* Produces the short user-visible command string that should appear in chat.
*
* We prefer the slash-prefixed command name because that most closely matches
* what the user actually typed, and only fall back to the message body when the
* command name is unavailable in older transcript variants.
*/
function buildLocalCommandDisplayText(payload: ClaudeLocalCommandPayload): string {
const commandName = payload.commandName.trim();
const commandMessage = payload.commandMessage.trim();
const commandArgs = payload.commandArgs.trim();
const baseCommand = commandName || commandMessage;
if (!baseCommand) {
return '';
}
return commandArgs ? `${baseCommand} ${commandArgs}` : baseCommand;
}
/**
* Claude local-command stdout may contain ANSI styling codes because it was
* captured from the terminal. The web chat should receive readable plain text.
*/
function stripAnsiFormatting(text: string): string {
return text.replace(/\u001B\[[0-9;?]*[ -/]*[@-~]/g, '');
}
export class ClaudeSessionsProvider implements IProviderSessions {
/**
* Normalizes one Claude JSONL entry or live SDK stream event into the shared
@@ -68,7 +308,7 @@ export class ClaudeSessionsProvider implements IProviderSessions {
const ts = raw.timestamp || new Date().toISOString();
const baseId = raw.uuid || generateMessageId('claude');
if (raw.message?.role === 'user' && raw.message?.content) {
if (raw.message?.role === 'user' && raw.message?.content && raw.isMeta !== true) {
if (Array.isArray(raw.message.content)) {
for (let partIndex = 0; partIndex < raw.message.content.length; partIndex++) {
const part = raw.message.content[partIndex];
@@ -121,6 +361,80 @@ export class ClaudeSessionsProvider implements IProviderSessions {
}
} else if (typeof raw.message.content === 'string') {
const text = raw.message.content;
/**
* Claude stores compact summaries as synthetic "user" rows so the CLI
* can resume the next session turn with the summary in-context.
*
* For the web UI this is much more useful as assistant-authored summary
* text; otherwise it is both filtered by the generic internal-prefix
* check and visually mislabeled as a user message.
*/
if (raw.isCompactSummary === true && text.trim()) {
messages.push(createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'text',
role: 'assistant',
content: text,
isCompactSummary: true,
}));
return messages;
}
/**
* Local slash commands are serialized as tagged text even though they
* are semantically a user action. Expose the parsed fields to the
* frontend and emit a plain user-visible command string so the command
* no longer disappears from history.
*/
const localCommandPayload = parseLocalCommandPayload(text);
if (localCommandPayload) {
const displayText = buildLocalCommandDisplayText(localCommandPayload);
if (displayText) {
messages.push(createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'text',
role: 'user',
content: displayText,
commandName: localCommandPayload.commandName,
commandMessage: localCommandPayload.commandMessage,
commandArgs: localCommandPayload.commandArgs,
isLocalCommand: true,
}));
}
return messages;
}
/**
* Local command stdout is also written as a "user" row in Claude's
* transcript, but it is terminal output produced in response to the
* command. Re-label it as assistant text so the chat transcript matches
* the actual conversational flow seen by the user.
*/
const localCommandStdout = extractTaggedContent(text, 'local-command-stdout');
if (localCommandStdout !== null) {
const stdoutText = stripAnsiFormatting(localCommandStdout).trim();
if (stdoutText) {
messages.push(createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'text',
role: 'assistant',
content: stdoutText,
isLocalCommandStdout: true,
}));
}
return messages;
}
if (text && !isInternalContent(text)) {
messages.push(createNormalizedMessage({
id: baseId,
@@ -238,14 +552,13 @@ export class ClaudeSessionsProvider implements IProviderSessions {
sessionId: string,
options: FetchHistoryOptions = {},
): Promise<FetchHistoryResult> {
const { projectName, limit = null, offset = 0 } = options;
if (!projectName) {
return { messages: [], total: 0, hasMore: false, offset: 0, limit: null };
}
const { limit = null, offset = 0 } = options;
let result: ClaudeHistoryResult;
try {
result = await loadClaudeSessionMessages(projectName, sessionId, limit, offset);
// Load full history first so `total` reflects frontend-normalized messages,
// not raw JSONL records.
result = await getSessionMessages(sessionId, null, 0);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.warn(`[ClaudeProvider] Failed to load session ${sessionId}:`, message);
@@ -253,8 +566,6 @@ export class ClaudeSessionsProvider implements IProviderSessions {
}
const rawMessages = Array.isArray(result) ? result : (result.messages || []);
const total = Array.isArray(result) ? rawMessages.length : (result.total || 0);
const hasMore = Array.isArray(result) ? false : Boolean(result.hasMore);
const toolResultMap = new Map<string, ClaudeToolResult>();
for (const raw of rawMessages) {
@@ -295,12 +606,31 @@ export class ClaudeSessionsProvider implements IProviderSessions {
}
}
const totalNormalized = normalized.length;
let total = 0;
for (const msg of normalized) {
if (msg.kind !== 'tool_result') {
total += 1;
}
}
const normalizedOffset = Math.max(0, offset);
const normalizedLimit = limit === null ? null : Math.max(0, limit);
const messages = normalizedLimit === null
? normalized
: normalized.slice(
Math.max(0, totalNormalized - normalizedOffset - normalizedLimit),
Math.max(0, totalNormalized - normalizedOffset),
);
const hasMore = normalizedLimit === null
? false
: Math.max(0, totalNormalized - normalizedOffset - normalizedLimit) > 0;
return {
messages: normalized,
messages,
total,
hasMore,
offset,
limit,
offset: normalizedOffset,
limit: normalizedLimit,
};
}
}

View File

@@ -0,0 +1,257 @@
import { readFile, readdir, stat } from 'node:fs/promises';
import os from 'node:os';
import path from 'node:path';
import { SkillsProvider } from '@/modules/providers/shared/skills/skills.provider.js';
import { parseFrontMatter } from '@/shared/frontmatter.js';
import type {
ProviderSkill,
ProviderSkillListOptions,
ProviderSkillSource,
} from '@/shared/types.js';
import {
findProviderSkillMarkdownFiles,
readJsonConfig,
readObjectRecord,
readOptionalString,
readProviderSkillMarkdownDefinition,
} from '@/shared/utils.js';
const getClaudeHomePath = (): string => path.join(os.homedir(), '.claude');
const getClaudePluginName = (pluginId: string): string | null => {
const normalizedPluginId = pluginId.trim();
if (!normalizedPluginId || normalizedPluginId === '@') {
return null;
}
const [pluginName] = normalizedPluginId.split('@');
return readOptionalString(pluginName) ?? null;
};
const stripMarkdownExtension = (filename: string): string =>
filename.replace(/\.md$/i, '');
const pathExistsAsDirectory = async (directoryPath: string): Promise<boolean> => {
try {
const directoryStats = await stat(directoryPath);
return directoryStats.isDirectory();
} catch {
return false;
}
};
const listChildDirectories = async (directoryPath: string): Promise<string[]> => {
try {
const entries = await readdir(directoryPath, { withFileTypes: true });
return entries
.filter((entry) => entry.isDirectory())
.map((entry) => path.join(directoryPath, entry.name))
.sort((left, right) => left.localeCompare(right));
} catch {
return [];
}
};
const readClaudePluginName = async (
installPath: string,
pluginId: string,
): Promise<string | null> => {
try {
const pluginConfig = await readJsonConfig(
path.join(installPath, '.claude-plugin', 'plugin.json'),
);
// Older or partial plugin installs may not have plugin.json yet. Falling
// back keeps discovery useful without inventing a separate namespace.
return readOptionalString(pluginConfig.name) ?? getClaudePluginName(pluginId);
} catch {
return getClaudePluginName(pluginId);
}
};
export class ClaudeSkillsProvider extends SkillsProvider {
constructor() {
super('claude');
}
async listSkills(options?: ProviderSkillListOptions): Promise<ProviderSkill[]> {
return [
...(await super.listSkills(options)),
...(await this.listPluginSkills(getClaudeHomePath())),
];
}
protected async getSkillSources(workspacePath: string): Promise<ProviderSkillSource[]> {
const claudeHomePath = getClaudeHomePath();
return [
{
scope: 'user',
rootDir: path.join(claudeHomePath, 'skills'),
commandPrefix: '/',
},
{
scope: 'project',
rootDir: path.join(workspacePath, '.claude', 'skills'),
commandPrefix: '/',
},
];
}
private async listPluginSkills(claudeHomePath: string): Promise<ProviderSkill[]> {
const settings = await readJsonConfig(path.join(claudeHomePath, 'settings.json'));
const enabledPlugins = readObjectRecord(settings.enabledPlugins);
if (!enabledPlugins) {
return [];
}
const installedConfig = await readJsonConfig(
path.join(claudeHomePath, 'plugins', 'installed_plugins.json'),
);
const installedPlugins = readObjectRecord(installedConfig.plugins);
if (!installedPlugins) {
return [];
}
const skills: ProviderSkill[] = [];
const visitedPluginFolders = new Set<string>();
const pluginEntries = Object.entries(enabledPlugins)
.sort(([left], [right]) => left.localeCompare(right));
for (const [pluginId, enabled] of pluginEntries) {
if (enabled !== true) {
continue;
}
const installs = installedPlugins[pluginId];
if (!Array.isArray(installs)) {
continue;
}
for (const install of installs) {
const installRecord = readObjectRecord(install);
const installPath = readOptionalString(installRecord?.installPath);
if (!installPath) {
continue;
}
// Claude's installed path points at one version folder; the usable
// plugin payloads live in the direct child folders beside it.
const pluginFolders = await listChildDirectories(path.dirname(installPath));
for (const pluginFolder of pluginFolders) {
const pluginFolderKey = `${pluginId}:${path.resolve(pluginFolder)}`;
if (visitedPluginFolders.has(pluginFolderKey)) {
continue;
}
visitedPluginFolders.add(pluginFolderKey);
const pluginName = await readClaudePluginName(pluginFolder, pluginId);
if (!pluginName) {
continue;
}
const commandsPath = path.join(pluginFolder, 'commands');
if (await pathExistsAsDirectory(commandsPath)) {
skills.push(
...(await this.listPluginCommandSkills(commandsPath, pluginId, pluginName)),
);
continue;
}
const skillsPath = path.join(pluginFolder, 'skills');
if (!(await pathExistsAsDirectory(skillsPath))) {
continue;
}
skills.push(
...(await this.listPluginSkillMarkdowns(pluginFolder, pluginId, pluginName)),
);
}
}
}
return skills;
}
private async listPluginCommandSkills(
commandsPath: string,
pluginId: string,
pluginName: string,
): Promise<ProviderSkill[]> {
const skills: ProviderSkill[] = [];
try {
const entries = await readdir(commandsPath, { withFileTypes: true });
const commandFiles = entries
.filter((entry) => entry.isFile() && entry.name.toLowerCase().endsWith('.md'))
.sort((left, right) => left.name.localeCompare(right.name));
for (const commandFile of commandFiles) {
const sourcePath = path.join(commandsPath, commandFile.name);
try {
const definition = await this.readPluginCommandDefinition(sourcePath);
skills.push({
provider: this.provider,
name: definition.name,
description: definition.description,
command: `/${pluginName}:${definition.name}`,
scope: 'plugin',
sourcePath,
pluginName,
pluginId,
});
} catch {
// Malformed command markdown should not block sibling plugin commands.
}
}
} catch {
// Missing or unreadable command folders are treated as empty plugin command sets.
}
return skills;
}
private async readPluginCommandDefinition(
commandPath: string,
): Promise<{ name: string; description: string }> {
const content = await readFile(commandPath, 'utf8');
const parsed = parseFrontMatter(content);
const data = readObjectRecord(parsed.data) ?? {};
return {
name: stripMarkdownExtension(path.basename(commandPath)),
description: readOptionalString(data.description) ?? '',
};
}
private async listPluginSkillMarkdowns(
installPath: string,
pluginId: string,
pluginName: string,
): Promise<ProviderSkill[]> {
const skillFiles = await findProviderSkillMarkdownFiles(path.join(installPath, 'skills'), {
recursive: true,
});
const skills: ProviderSkill[] = [];
for (const skillPath of skillFiles) {
try {
const definition = await readProviderSkillMarkdownDefinition(skillPath);
skills.push({
provider: this.provider,
name: definition.name,
description: definition.description,
command: `/${pluginName}:${definition.name}`,
scope: 'plugin',
sourcePath: skillPath,
pluginName,
pluginId,
});
} catch {
// A bad plugin skill file should not block other installed plugin skills.
}
}
return skills;
}
}
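The plugin discovery above builds slash commands as `/${pluginName}:${name}`, and the commit description notes that invalid plugin ids such as `""` or `"@"` must not become command namespaces (otherwise malformed commands like `/:command` reach the UI). A minimal sketch of such a guard, assuming ids carry an optional `@` suffix; the helper name `derivePluginName` is illustrative, not the actual implementation:

```typescript
// Hypothetical sketch: derive a safe command namespace from a plugin id.
// Ids come from local settings and installed plugin metadata, so they may
// be empty or consist only of an "@" suffix separator.
function derivePluginName(pluginId: string): string | null {
  // Keep only the leading name segment before any "@" suffix (assumption).
  const name = pluginId.split('@')[0]?.trim() ?? '';
  // Reject empty results so "/:command" never becomes a namespace.
  return name.length > 0 ? name : null;
}
```

Folders whose id yields `null` here would simply be skipped, matching the regression cases for empty and `@` plugin ids.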

View File

@@ -1,13 +1,22 @@
import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js';
import { ClaudeProviderAuth } from '@/modules/providers/list/claude/claude-auth.provider.js';
import { ClaudeMcpProvider } from '@/modules/providers/list/claude/claude-mcp.provider.js';
import { ClaudeSessionSynchronizer } from '@/modules/providers/list/claude/claude-session-synchronizer.provider.js';
import { ClaudeSessionsProvider } from '@/modules/providers/list/claude/claude-sessions.provider.js';
import type { IProviderAuth, IProviderSessions } from '@/shared/interfaces.js';
import { ClaudeSkillsProvider } from '@/modules/providers/list/claude/claude-skills.provider.js';
import type {
IProviderAuth,
IProviderSessionSynchronizer,
IProviderSkills,
IProviderSessions,
} from '@/shared/interfaces.js';
export class ClaudeProvider extends AbstractProvider {
readonly mcp = new ClaudeMcpProvider();
readonly auth: IProviderAuth = new ClaudeProviderAuth();
readonly skills: IProviderSkills = new ClaudeSkillsProvider();
readonly sessions: IProviderSessions = new ClaudeSessionsProvider();
readonly sessionSynchronizer: IProviderSessionSynchronizer = new ClaudeSessionSynchronizer();
constructor() {
super('claude');

View File

@@ -0,0 +1,179 @@
import os from 'node:os';
import path from 'node:path';
import { readFile } from 'node:fs/promises';
import { sessionsDb } from '@/modules/database/index.js';
import {
buildLookupMap,
extractFirstValidJsonlData,
findFilesRecursivelyCreatedAfter,
normalizeSessionName,
readFileTimestamps,
} from '@/shared/utils.js';
import type { IProviderSessionSynchronizer } from '@/shared/interfaces.js';
type ParsedSession = {
sessionId: string;
projectPath: string;
sessionName?: string;
};
/**
* Session indexer for Codex transcript artifacts.
*/
export class CodexSessionSynchronizer implements IProviderSessionSynchronizer {
private readonly provider = 'codex' as const;
private readonly codexHome = path.join(os.homedir(), '.codex');
/**
* Scans ~/.codex/sessions and upserts discovered sessions into DB.
*/
async synchronize(since?: Date): Promise<number> {
const nameMap = await buildLookupMap(path.join(this.codexHome, 'session_index.jsonl'), 'id', 'thread_name');
const files = await findFilesRecursivelyCreatedAfter(
path.join(this.codexHome, 'sessions'),
'.jsonl',
since ?? null
);
let processed = 0;
for (const filePath of files) {
const parsed = await this.processSessionFile(filePath, nameMap);
if (!parsed) {
continue;
}
const existingSession = sessionsDb.getSessionById(parsed.sessionId);
if (existingSession) {
// If session name is untitled and we now have a name, update it
if (existingSession.custom_name === 'Untitled Codex Session' && parsed.sessionName && parsed.sessionName !== 'Untitled Codex Session') {
sessionsDb.updateSessionCustomName(parsed.sessionId, parsed.sessionName);
}
}
const timestamps = await readFileTimestamps(filePath);
sessionsDb.createSession(
parsed.sessionId,
this.provider,
parsed.projectPath,
parsed.sessionName,
timestamps.createdAt,
timestamps.updatedAt,
filePath
);
processed += 1;
}
return processed;
}
/**
* Parses and upserts one Codex session JSONL file.
*/
async synchronizeFile(filePath: string): Promise<string | null> {
if (!filePath.endsWith('.jsonl')) {
return null;
}
const nameMap = await buildLookupMap(path.join(this.codexHome, 'session_index.jsonl'), 'id', 'thread_name');
const parsed = await this.processSessionFile(filePath, nameMap);
if (!parsed) {
return null;
}
const timestamps = await readFileTimestamps(filePath);
return sessionsDb.createSession(
parsed.sessionId,
this.provider,
parsed.projectPath,
parsed.sessionName,
timestamps.createdAt,
timestamps.updatedAt,
filePath
);
}
/**
* Extracts session metadata from one Codex JSONL session file.
*/
private async processSessionFile(
filePath: string,
nameMap: Map<string, string>
): Promise<ParsedSession | null> {
const parsed = await extractFirstValidJsonlData(filePath, (rawData) => {
const data = rawData as Record<string, unknown>;
const payload = data.payload as Record<string, unknown> | undefined;
const sessionId = typeof payload?.id === 'string' ? payload.id : undefined;
const projectPath = typeof payload?.cwd === 'string' ? payload.cwd : undefined;
if (!sessionId || !projectPath) {
return null;
}
return {
sessionId,
projectPath,
};
});
if (!parsed) {
return null;
}
const existingSession = sessionsDb.getSessionById(parsed.sessionId);
const existingSessionName = existingSession?.custom_name;
if (existingSessionName && existingSessionName !== 'Untitled Codex Session') {
return {
...parsed,
sessionName: normalizeSessionName(existingSessionName, 'Untitled Codex Session'),
};
}
let sessionName = nameMap.get(parsed.sessionId);
if (!sessionName) {
sessionName = await this.extractLastAgentMessageFromEnd(filePath);
}
return {
...parsed,
sessionName: normalizeSessionName(sessionName, 'Untitled Codex Session'),
};
}
private async extractLastAgentMessageFromEnd(filePath: string): Promise<string | undefined> {
try {
const content = await readFile(filePath, 'utf8');
const lines = content.split(/\r?\n/);
for (let index = lines.length - 1; index >= 0; index -= 1) {
const line = lines[index]?.trim();
if (!line) {
continue;
}
let parsed: unknown;
try {
parsed = JSON.parse(line);
} catch {
continue;
}
const data = parsed as Record<string, unknown>;
const eventType = typeof data.type === 'string' ? data.type : undefined;
const payload = data.payload as Record<string, unknown> | undefined;
const payloadType = typeof payload?.type === 'string' ? payload.type : undefined;
const lastAgentMessage = typeof payload?.last_agent_message === 'string'
? payload.last_agent_message
: undefined;
if (eventType === 'event_msg' && payloadType === 'task_complete' && lastAgentMessage?.trim()) {
return lastAgentMessage;
}
}
} catch {
// Ignore missing/unreadable files so sync can continue.
}
return undefined;
}
}

View File

@@ -1,4 +1,7 @@
import { getCodexSessionMessages } from '@/projects.js';
import fsSync from 'node:fs';
import readline from 'node:readline';
import { sessionsDb } from '@/modules/database/index.js';
import type { IProviderSessions } from '@/shared/interfaces.js';
import type { AnyRecord, FetchHistoryOptions, FetchHistoryResult, NormalizedMessage } from '@/shared/types.js';
import { createNormalizedMessage, generateMessageId, readObjectRecord } from '@/shared/utils.js';
@@ -11,14 +14,250 @@ type CodexHistoryResult =
messages?: AnyRecord[];
total?: number;
hasMore?: boolean;
offset?: number;
limit?: number | null;
tokenUsage?: unknown;
};
const loadCodexSessionMessages = getCodexSessionMessages as unknown as (
function isVisibleCodexUserMessage(payload: AnyRecord | null | undefined): boolean {
if (!payload || payload.type !== 'user_message') {
return false;
}
if (payload.kind && payload.kind !== 'plain') {
return false;
}
return typeof payload.message === 'string' && payload.message.trim().length > 0;
}
function extractCodexTextContent(content: unknown): string {
if (!Array.isArray(content)) {
return typeof content === 'string' ? content : '';
}
return content
.map((item) => {
if (!item || typeof item !== 'object') {
return '';
}
const record = item as AnyRecord;
if (
(record.type === 'input_text' || record.type === 'output_text' || record.type === 'text')
&& typeof record.text === 'string'
) {
return record.text;
}
return '';
})
.filter(Boolean)
.join('\n');
}
async function getCodexSessionMessages(
sessionId: string,
limit: number | null,
offset: number,
) => Promise<CodexHistoryResult>;
limit: number | null = null,
offset = 0,
): Promise<CodexHistoryResult> {
try {
const sessionFilePath = sessionsDb.getSessionById(sessionId)?.jsonl_path;
if (!sessionFilePath) {
console.warn(`Codex session file not found for session ${sessionId}`);
return { messages: [], total: 0, hasMore: false };
}
const messages: AnyRecord[] = [];
let tokenUsage: AnyRecord | null = null;
const fileStream = fsSync.createReadStream(sessionFilePath);
const rl = readline.createInterface({
input: fileStream,
crlfDelay: Infinity,
});
for await (const line of rl) {
if (!line.trim()) {
continue;
}
try {
const entry = JSON.parse(line) as AnyRecord;
if (entry.type === 'event_msg' && entry.payload?.type === 'token_count' && entry.payload?.info) {
const info = entry.payload.info as AnyRecord;
if (info.total_token_usage) {
const usage = info.total_token_usage as AnyRecord;
tokenUsage = {
used: usage.total_tokens || 0,
total: info.model_context_window || 200000,
};
}
}
if (entry.type === 'event_msg' && isVisibleCodexUserMessage(entry.payload as AnyRecord)) {
messages.push({
type: 'user',
timestamp: entry.timestamp,
message: {
role: 'user',
content: entry.payload.message,
},
});
}
if (
entry.type === 'response_item' &&
entry.payload?.type === 'message' &&
entry.payload.role === 'assistant'
) {
const textContent = extractCodexTextContent(entry.payload.content);
if (textContent.trim()) {
messages.push({
type: 'assistant',
timestamp: entry.timestamp,
message: {
role: 'assistant',
content: textContent,
},
});
}
}
if (entry.type === 'response_item' && entry.payload?.type === 'reasoning') {
const summaryText = Array.isArray(entry.payload.summary)
? entry.payload.summary
.map((item: AnyRecord) => item?.text)
.filter(Boolean)
.join('\n')
: '';
if (summaryText.trim()) {
messages.push({
type: 'thinking',
timestamp: entry.timestamp,
message: {
role: 'assistant',
content: summaryText,
},
});
}
}
if (entry.type === 'response_item' && entry.payload?.type === 'function_call') {
let toolName = entry.payload.name;
let toolInput = entry.payload.arguments;
if (toolName === 'shell_command') {
toolName = 'Bash';
try {
const args = JSON.parse(entry.payload.arguments) as AnyRecord;
toolInput = JSON.stringify({ command: args.command });
} catch {
// Keep original arguments when parsing fails.
}
}
messages.push({
type: 'tool_use',
timestamp: entry.timestamp,
toolName,
toolInput,
toolCallId: entry.payload.call_id,
});
}
if (entry.type === 'response_item' && entry.payload?.type === 'function_call_output') {
messages.push({
type: 'tool_result',
timestamp: entry.timestamp,
toolCallId: entry.payload.call_id,
output: entry.payload.output,
});
}
if (entry.type === 'response_item' && entry.payload?.type === 'custom_tool_call') {
const toolName = entry.payload.name || 'custom_tool';
const input = entry.payload.input || '';
if (toolName === 'apply_patch') {
const fileMatch = String(input).match(/\*\*\* Update File: (.+)/);
const filePath = fileMatch ? fileMatch[1].trim() : 'unknown';
const lines = String(input).split('\n');
const oldLines: string[] = [];
const newLines: string[] = [];
for (const lineContent of lines) {
if (lineContent.startsWith('-') && !lineContent.startsWith('---')) {
oldLines.push(lineContent.slice(1));
} else if (lineContent.startsWith('+') && !lineContent.startsWith('+++')) {
newLines.push(lineContent.slice(1));
}
}
messages.push({
type: 'tool_use',
timestamp: entry.timestamp,
toolName: 'Edit',
toolInput: JSON.stringify({
file_path: filePath,
old_string: oldLines.join('\n'),
new_string: newLines.join('\n'),
}),
toolCallId: entry.payload.call_id,
});
} else {
messages.push({
type: 'tool_use',
timestamp: entry.timestamp,
toolName,
toolInput: input,
toolCallId: entry.payload.call_id,
});
}
}
if (entry.type === 'response_item' && entry.payload?.type === 'custom_tool_call_output') {
messages.push({
type: 'tool_result',
timestamp: entry.timestamp,
toolCallId: entry.payload.call_id,
output: entry.payload.output || '',
});
}
} catch {
// Skip malformed lines.
}
}
messages.sort(
(a, b) => new Date(a.timestamp || 0).getTime() - new Date(b.timestamp || 0).getTime(),
);
const total = messages.length;
if (limit !== null) {
const startIndex = Math.max(0, total - offset - limit);
const endIndex = total - offset;
const paginatedMessages = messages.slice(startIndex, endIndex);
const hasMore = startIndex > 0;
return {
messages: paginatedMessages,
total,
hasMore,
offset,
limit,
tokenUsage,
};
}
return { messages, tokenUsage };
} catch (error) {
console.error(`Error reading Codex session messages for ${sessionId}:`, error);
return { messages: [], total: 0, hasMore: false };
}
}
export class CodexSessionsProvider implements IProviderSessions {
/**
@@ -31,6 +270,23 @@ export class CodexSessionsProvider implements IProviderSessions {
const ts = raw.timestamp || new Date().toISOString();
const baseId = raw.uuid || generateMessageId('codex');
if (raw.type === 'thinking' || raw.isReasoning) {
const thinkingContent = typeof raw.message?.content === 'string'
? raw.message.content
: '';
if (!thinkingContent.trim()) {
return [];
}
return [createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'thinking',
content: thinkingContent,
})];
}
if (raw.message?.role === 'user') {
const content = typeof raw.message.content === 'string'
? raw.message.content
@@ -77,17 +333,6 @@ export class CodexSessionsProvider implements IProviderSessions {
})];
}
if (raw.type === 'thinking' || raw.isReasoning) {
return [createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'thinking',
content: raw.message?.content || '',
})];
}
if (raw.type === 'tool_use' || raw.toolName) {
return [createNormalizedMessage({
id: baseId,
@@ -275,7 +520,9 @@ export class CodexSessionsProvider implements IProviderSessions {
let result: CodexHistoryResult;
try {
result = await loadCodexSessionMessages(sessionId, limit, offset);
// Load full history first so `total` reflects frontend-normalized messages,
// not raw JSONL records.
result = await getCodexSessionMessages(sessionId, null, 0);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.warn(`[CodexProvider] Failed to load session ${sessionId}:`, message);
@@ -283,8 +530,6 @@ export class CodexSessionsProvider implements IProviderSessions {
}
const rawMessages = Array.isArray(result) ? result : (result.messages || []);
const total = Array.isArray(result) ? rawMessages.length : (result.total || 0);
const hasMore = Array.isArray(result) ? false : Boolean(result.hasMore);
const tokenUsage = Array.isArray(result) ? undefined : result.tokenUsage;
const normalized: NormalizedMessage[] = [];
@@ -307,12 +552,31 @@ export class CodexSessionsProvider implements IProviderSessions {
}
}
const totalNormalized = normalized.length;
let total = 0;
for (const msg of normalized) {
if (msg.kind !== 'tool_result') {
total += 1;
}
}
const normalizedOffset = Math.max(0, offset);
const normalizedLimit = limit === null ? null : Math.max(0, limit);
const messages = normalizedLimit === null
? normalized
: normalized.slice(
Math.max(0, totalNormalized - normalizedOffset - normalizedLimit),
Math.max(0, totalNormalized - normalizedOffset),
);
const hasMore = normalizedLimit === null
? false
: Math.max(0, totalNormalized - normalizedOffset - normalizedLimit) > 0;
return {
messages: normalized,
messages,
total,
hasMore,
offset,
limit,
offset: normalizedOffset,
limit: normalizedLimit,
tokenUsage,
};
}
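The pagination above windows from the tail of the history: `offset` counts back from the newest normalized message and `limit` caps the page size, so `hasMore` is true exactly when older messages remain below the window. The arithmetic can be isolated as a small sketch:

```typescript
// Sketch of the tail-window pagination used above: offset counts messages
// back from the newest entry, limit bounds the page, and hasMore signals
// that older messages exist before the window start.
function tailWindow(total: number, offset: number, limit: number) {
  const start = Math.max(0, total - offset - limit);
  const end = Math.max(0, total - offset);
  return { start, end, hasMore: start > 0 };
}
```

For example, with 10 messages, the first page (`offset 0`, `limit 3`) covers indices 7..10 and reports more history; paging past the oldest entries clamps to index 0 and reports none.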

View File

@@ -0,0 +1,100 @@
import fs from 'node:fs/promises';
import os from 'node:os';
import path from 'node:path';
import { SkillsProvider } from '@/modules/providers/shared/skills/skills.provider.js';
import type { ProviderSkillSource } from '@/shared/types.js';
const hasGitMarker = async (dirPath: string): Promise<boolean> => {
try {
const gitMarkerStats = await fs.stat(path.join(dirPath, '.git'));
return gitMarkerStats.isDirectory() || gitMarkerStats.isFile();
} catch {
return false;
}
};
const findTopmostGitRoot = async (startPath: string): Promise<string | null> => {
let currentPath = path.resolve(startPath);
let topmostGitRoot: string | null = null;
while (true) {
if (await hasGitMarker(currentPath)) {
topmostGitRoot = currentPath;
}
const parentPath = path.dirname(currentPath);
if (parentPath === currentPath) {
break;
}
currentPath = parentPath;
}
return topmostGitRoot;
};
const addUniqueSource = (
sources: ProviderSkillSource[],
seenRootDirs: Set<string>,
source: ProviderSkillSource,
): void => {
const normalizedRootDir = path.resolve(source.rootDir);
if (seenRootDirs.has(normalizedRootDir)) {
return;
}
seenRootDirs.add(normalizedRootDir);
sources.push({ ...source, rootDir: normalizedRootDir });
};
export class CodexSkillsProvider extends SkillsProvider {
constructor() {
super('codex');
}
protected async getSkillSources(workspacePath: string): Promise<ProviderSkillSource[]> {
const sources: ProviderSkillSource[] = [];
const seenRootDirs = new Set<string>();
const repoRoot = await findTopmostGitRoot(workspacePath);
addUniqueSource(sources, seenRootDirs, {
scope: 'repo',
rootDir: path.join(workspacePath, '.agents', 'skills'),
commandPrefix: '$',
});
if (repoRoot) {
// Codex checks repository skills at the launch folder, one folder above it,
// and the topmost git root; these can collapse to the same directory.
addUniqueSource(sources, seenRootDirs, {
scope: 'repo',
rootDir: path.join(path.dirname(workspacePath), '.agents', 'skills'),
commandPrefix: '$',
});
addUniqueSource(sources, seenRootDirs, {
scope: 'repo',
rootDir: path.join(repoRoot, '.agents', 'skills'),
commandPrefix: '$',
});
}
addUniqueSource(sources, seenRootDirs, {
scope: 'user',
rootDir: path.join(os.homedir(), '.agents', 'skills'),
commandPrefix: '$',
});
addUniqueSource(sources, seenRootDirs, {
scope: 'admin',
rootDir: path.join('/etc', 'codex', 'skills'),
commandPrefix: '$',
});
addUniqueSource(sources, seenRootDirs, {
scope: 'system',
rootDir: path.join(os.homedir(), '.codex', 'skills', '.system'),
commandPrefix: '$',
});
return sources;
}
}
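As the comment in `getSkillSources` notes, the launch folder, its parent, and the topmost git root can collapse to the same `.agents/skills` directory, which is why `addUniqueSource` keys on the resolved path. The dedupe step can be sketched on its own:

```typescript
import path from 'node:path';

// Sketch: keep only the first source per resolved root directory, as
// addUniqueSource does when repo-level lookups collapse to one folder.
function dedupeRoots(rootDirs: string[]): string[] {
  const seen = new Set<string>();
  const unique: string[] = [];
  for (const rootDir of rootDirs) {
    const resolved = path.resolve(rootDir);
    if (seen.has(resolved)) continue; // Already covered by an earlier source.
    seen.add(resolved);
    unique.push(resolved);
  }
  return unique;
}
```

Resolving before comparing is what makes `/repo/sub/../.agents/skills` and `/repo/.agents/skills` count as one source.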

View File

@@ -1,13 +1,22 @@
import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js';
import { CodexProviderAuth } from '@/modules/providers/list/codex/codex-auth.provider.js';
import { CodexMcpProvider } from '@/modules/providers/list/codex/codex-mcp.provider.js';
import { CodexSessionSynchronizer } from '@/modules/providers/list/codex/codex-session-synchronizer.provider.js';
import { CodexSessionsProvider } from '@/modules/providers/list/codex/codex-sessions.provider.js';
import type { IProviderAuth, IProviderSessions } from '@/shared/interfaces.js';
import { CodexSkillsProvider } from '@/modules/providers/list/codex/codex-skills.provider.js';
import type {
IProviderAuth,
IProviderSessionSynchronizer,
IProviderSkills,
IProviderSessions,
} from '@/shared/interfaces.js';
export class CodexProvider extends AbstractProvider {
readonly mcp = new CodexMcpProvider();
readonly auth: IProviderAuth = new CodexProviderAuth();
readonly skills: IProviderSkills = new CodexSkillsProvider();
readonly sessions: IProviderSessions = new CodexSessionsProvider();
readonly sessionSynchronizer: IProviderSessionSynchronizer = new CodexSessionSynchronizer();
constructor() {
super('codex');

View File

@@ -0,0 +1,153 @@
import crypto from 'node:crypto';
import fs from 'node:fs';
import fsp from 'node:fs/promises';
import os from 'node:os';
import path from 'node:path';
import readline from 'node:readline';
import { sessionsDb } from '@/modules/database/index.js';
import {
extractFirstValidJsonlData,
findFilesRecursivelyCreatedAfter,
normalizeSessionName,
readFileTimestamps,
} from '@/shared/utils.js';
import type { IProviderSessionSynchronizer } from '@/shared/interfaces.js';
type ParsedSession = {
sessionId: string;
projectPath: string;
sessionName?: string;
};
/**
* Returns directory entries or an empty list when the folder is missing.
*/
async function listDirectoryEntriesSafe(
directoryPath: string
): Promise<import('node:fs').Dirent[]> {
try {
return await fsp.readdir(directoryPath, { withFileTypes: true });
} catch {
return [];
}
}
/**
* Session indexer for Cursor transcript artifacts.
*/
export class CursorSessionSynchronizer implements IProviderSessionSynchronizer {
private readonly provider = 'cursor' as const;
private readonly cursorHome = path.join(os.homedir(), '.cursor');
/**
* Scans Cursor chats and upserts discovered sessions into DB.
*/
async synchronize(since?: Date): Promise<number> {
const projectsDir = path.join(this.cursorHome, 'projects');
let processed = 0;
const files = await findFilesRecursivelyCreatedAfter(projectsDir, '.jsonl', since ?? null);
for (const filePath of files) {
const parsed = await this.processSessionFile(filePath);
if (!parsed) {
continue;
}
const timestamps = await readFileTimestamps(filePath);
sessionsDb.createSession(
parsed.sessionId,
this.provider,
parsed.projectPath,
parsed.sessionName,
timestamps.createdAt,
timestamps.updatedAt,
filePath
);
processed += 1;
}
return processed;
}
/**
* Parses and upserts one Cursor session JSONL file.
*/
async synchronizeFile(filePath: string): Promise<string | null> {
if (!filePath.endsWith('.jsonl')) {
return null;
}
const parsed = await this.processSessionFile(filePath);
if (!parsed) {
return null;
}
const timestamps = await readFileTimestamps(filePath);
return sessionsDb.createSession(
parsed.sessionId,
this.provider,
parsed.projectPath,
parsed.sessionName,
timestamps.createdAt,
timestamps.updatedAt,
filePath
);
}
/**
* Extracts project path from Cursor worker.log.
*/
private async extractProjectPathFromWorkerLog(filePath: string): Promise<string | null> {
try {
const fileStream = fs.createReadStream(filePath, { encoding: 'utf8' });
const lineReader = readline.createInterface({ input: fileStream, crlfDelay: Infinity });
for await (const line of lineReader) {
const match = line.match(/workspacePath=(.*)$/);
const projectPath = match?.[1]?.trim();
if (projectPath) {
lineReader.close();
fileStream.close();
return projectPath;
}
}
} catch {
// A missing worker log is expected for partial or incomplete session data.
}
return null;
}
/**
* Extracts session metadata from one Cursor JSONL session file.
*/
private async processSessionFile(filePath: string): Promise<ParsedSession | null> {
const sessionId = path.basename(filePath, '.jsonl');
const grandparentDir = path.dirname(path.dirname(path.dirname(filePath)));
const workerLogPath = path.join(grandparentDir, 'worker.log');
const projectPath = await this.extractProjectPathFromWorkerLog(workerLogPath);
if (!projectPath) {
return null;
}
return extractFirstValidJsonlData(filePath, (rawData) => {
const data = rawData as Record<string, any>;
if (data.role !== 'user') {
return null;
}
const text = typeof data.message?.content?.[0]?.text === 'string' ? data.message.content[0].text : '';
const firstLine = text.replace(/<\/?user_query>/g, '').trim().split('\n')[0];
return {
sessionId,
projectPath,
sessionName: normalizeSessionName(firstLine, 'Untitled Cursor Session'),
};
});
}
}

View File

@@ -25,6 +25,167 @@ type CursorMessageBlob = {
content: AnyRecord;
};
function isInternalCursorText(value: unknown): boolean {
if (typeof value !== 'string') {
return false;
}
const normalized = value.trim();
return normalized.startsWith('<user_info>') || normalized.startsWith('<system_reminder>');
}
function isInternalCursorPart(part: unknown): boolean {
if (!part || typeof part !== 'object') {
return false;
}
const record = part as AnyRecord;
const type = typeof record.type === 'string' ? record.type : '';
if (type === 'user_info' || type === 'system_reminder') {
return true;
}
return isInternalCursorText(record.text);
}
function unwrapUserQueryText(value: string, role: 'user' | 'assistant'): string {
if (role !== 'user') {
return value;
}
const normalized = value.trimStart();
const openTag = '<user_query>';
const closeTag = '</user_query>';
if (!normalized.startsWith(openTag)) {
return value;
}
const afterOpen = normalized.slice(openTag.length);
const closeIndex = afterOpen.lastIndexOf(closeTag);
const inner = closeIndex >= 0 ? afterOpen.slice(0, closeIndex) : afterOpen;
return inner.trim();
}
function normalizeToolId(value: unknown): string | null {
if (typeof value !== 'string') {
return null;
}
const normalized = value.trim();
return normalized ? normalized : null;
}
function extractCursorToolResultContent(item: AnyRecord): string {
if (typeof item.result === 'string' && item.result.trim()) {
return item.result;
}
if (typeof item.output === 'string' && item.output.trim()) {
return item.output;
}
if (Array.isArray(item.experimental_content)) {
const experimentalText = item.experimental_content
.map((part: unknown) => {
if (typeof part === 'string') {
return part;
}
if (part && typeof part === 'object') {
const record = part as AnyRecord;
if (typeof record.text === 'string') {
return record.text;
}
}
return '';
})
.filter(Boolean)
.join('\n');
if (experimentalText.trim()) {
return experimentalText;
}
}
return typeof item.result === 'string' ? item.result : '';
}
function parseCursorToolInput(rawInput: unknown): unknown {
if (typeof rawInput !== 'string') {
return rawInput;
}
const trimmed = rawInput.trim();
if (!trimmed) {
return rawInput;
}
try {
return JSON.parse(trimmed);
} catch {
return rawInput;
}
}
function normalizeCursorToolInput(toolName: string, rawInput: unknown): unknown {
const parsed = parseCursorToolInput(rawInput);
if (!parsed || typeof parsed !== 'object' || Array.isArray(parsed)) {
return parsed;
}
const input = parsed as AnyRecord;
const normalized: AnyRecord = { ...input };
const filePath = input.file_path
?? input.filePath
?? input.path
?? input.file
?? input.filename;
if (typeof filePath === 'string' && filePath.trim()) {
normalized.file_path = filePath;
}
if (toolName === 'Write') {
const content = input.content
?? input.text
?? input.value
?? input.contents
?? input.fileContent
?? input.new_string
?? input.newString;
if (typeof content === 'string') {
normalized.content = content;
}
}
if (toolName === 'Edit') {
const oldString = input.old_string
?? input.oldString
?? input.old
?? '';
const newString = input.new_string
?? input.newString
?? input.new
?? input.content
?? '';
if (typeof oldString === 'string') {
normalized.old_string = oldString;
}
if (typeof newString === 'string') {
normalized.new_string = newString;
}
}
if (toolName === 'ApplyPatch') {
const patch = input.patch ?? input.diff ?? input.content;
if (typeof patch === 'string' && !normalized.patch) {
normalized.patch = patch;
}
}
return normalized;
}
function sanitizeCursorSessionId(sessionId: string): string {
const normalized = sessionId.trim();
if (!normalized) {
@@ -225,13 +386,14 @@ export class CursorSessionsProvider implements IProviderSessions {
try {
const blobs = await this.loadCursorBlobs(sessionId, projectPath);
const allNormalized = this.normalizeCursorBlobs(blobs, sessionId);
const total = allNormalized.length;
const renderableMessages = allNormalized.filter((msg) => msg.kind !== 'tool_result');
const total = renderableMessages.length;
if (limit !== null) {
const start = offset;
const page = limit === 0
? []
: allNormalized.slice(start, start + limit);
: renderableMessages.slice(start, start + limit);
const hasMore = limit === 0
? start < total
: start + limit < total;
@@ -245,7 +407,7 @@ export class CursorSessionsProvider implements IProviderSessions {
}
return {
messages: allNormalized,
messages: renderableMessages,
total,
hasMore: false,
offset: 0,
@@ -283,11 +445,24 @@ export class CursorSessionsProvider implements IProviderSessions {
let text = '';
if (Array.isArray(content.message.content)) {
text = content.message.content
.map((part: string | AnyRecord) => typeof part === 'string' ? part : part?.text || '')
.map((part: string | AnyRecord) => {
if (typeof part === 'string') {
if (isInternalCursorText(part)) {
return '';
}
return unwrapUserQueryText(part, role);
}
if (isInternalCursorPart(part)) {
return '';
}
return unwrapUserQueryText(part?.text || '', role);
})
.filter(Boolean)
.join('\n');
} else if (typeof content.message.content === 'string') {
text = content.message.content;
if (!isInternalCursorText(content.message.content)) {
text = unwrapUserQueryText(content.message.content, role);
}
}
if (text?.trim()) {
messages.push(createNormalizedMessage({
@@ -316,7 +491,14 @@ export class CursorSessionsProvider implements IProviderSessions {
if (item?.type !== 'tool-result') {
continue;
}
const toolCallId = item.toolCallId || content.id;
const cursorOptions = content.providerOptions?.cursor as AnyRecord | undefined;
const highLevelToolCallResult = cursorOptions?.highLevelToolCallResult;
const toolCallId = normalizeToolId(item.toolCallId)
|| normalizeToolId(item.tool_call_id)
|| normalizeToolId(highLevelToolCallResult?.toolCallId)
|| normalizeToolId(highLevelToolCallResult?.tool_call_id)
|| normalizeToolId(content.id)
|| '';
messages.push(createNormalizedMessage({
id: `${baseId}_tr`,
sessionId,
@@ -324,8 +506,9 @@ export class CursorSessionsProvider implements IProviderSessions {
provider: PROVIDER,
kind: 'tool_result',
toolId: toolCallId,
content: item.result || '',
isError: false,
content: extractCursorToolResultContent(item),
isError: Boolean(item.isError || item.is_error),
toolUseResult: highLevelToolCallResult,
}));
}
continue;
@@ -336,8 +519,15 @@ export class CursorSessionsProvider implements IProviderSessions {
if (Array.isArray(content.content)) {
for (let partIdx = 0; partIdx < content.content.length; partIdx++) {
const part = content.content[partIdx];
if (isInternalCursorPart(part)) {
continue;
}
if (part?.type === 'text' && part?.text) {
const normalizedPartText = unwrapUserQueryText(part.text, role);
if (!normalizedPartText) {
continue;
}
messages.push(createNormalizedMessage({
id: `${baseId}_${partIdx}`,
sessionId,
@@ -345,7 +535,7 @@ export class CursorSessionsProvider implements IProviderSessions {
provider: PROVIDER,
kind: 'text',
role,
content: part.text,
content: normalizedPartText,
sequence: blob.sequence,
rowid: blob.rowid,
}));
@@ -361,7 +551,11 @@ export class CursorSessionsProvider implements IProviderSessions {
} else if (part?.type === 'tool-call' || part?.type === 'tool_use') {
const rawToolName = part.toolName || part.name || 'Unknown Tool';
const toolName = rawToolName === 'ApplyPatch' ? 'Edit' : rawToolName;
const toolId = part.toolCallId || part.id || `tool_${i}_${partIdx}`;
const toolId = normalizeToolId(part.toolCallId)
|| normalizeToolId(part.tool_call_id)
|| normalizeToolId(part.id)
|| `tool_${i}_${partIdx}`;
const normalizedToolInput = normalizeCursorToolInput(rawToolName, part.args ?? part.input);
const message = createNormalizedMessage({
id: `${baseId}_${partIdx}`,
sessionId,
@@ -369,14 +563,22 @@ export class CursorSessionsProvider implements IProviderSessions {
provider: PROVIDER,
kind: 'tool_use',
toolName,
toolInput: part.args || part.input,
toolInput: normalizedToolInput,
toolId,
});
messages.push(message);
toolUseMap.set(toolId, message);
}
}
} else if (typeof content.content === 'string' && content.content.trim()) {
} else if (
typeof content.content === 'string'
&& content.content.trim()
&& !isInternalCursorText(content.content)
) {
const normalizedText = unwrapUserQueryText(content.content, role);
if (!normalizedText) {
continue;
}
messages.push(createNormalizedMessage({
id: baseId,
sessionId,
@@ -384,7 +586,7 @@ export class CursorSessionsProvider implements IProviderSessions {
provider: PROVIDER,
kind: 'text',
role,
content: content.content,
content: normalizedText,
sequence: blob.sequence,
rowid: blob.rowid,
}));
@@ -401,6 +603,7 @@ export class CursorSessionsProvider implements IProviderSessions {
toolUse.toolResult = {
content: msg.content,
isError: msg.isError,
toolUseResult: msg.toolUseResult,
};
}
}
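The `<user_query>` handling above applies only to user-role text, leaving assistant output untouched and tolerating a missing close tag. A self-contained restatement of that rule for illustration:

```typescript
// Sketch mirroring unwrapUserQueryText: strip a leading <user_query> wrapper
// from user messages; assistant text passes through unchanged.
function unwrapUserQuery(value: string, role: 'user' | 'assistant'): string {
  if (role !== 'user') return value;
  const normalized = value.trimStart();
  const openTag = '<user_query>';
  const closeTag = '</user_query>';
  if (!normalized.startsWith(openTag)) return value;
  const afterOpen = normalized.slice(openTag.length);
  // Tolerate transcripts where the close tag was truncated away.
  const closeIndex = afterOpen.lastIndexOf(closeTag);
  const inner = closeIndex >= 0 ? afterOpen.slice(0, closeIndex) : afterOpen;
  return inner.trim();
}
```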

View File

@@ -0,0 +1,31 @@
import os from 'node:os';
import path from 'node:path';
import { SkillsProvider } from '@/modules/providers/shared/skills/skills.provider.js';
import type { ProviderSkillSource } from '@/shared/types.js';
export class CursorSkillsProvider extends SkillsProvider {
constructor() {
super('cursor');
}
protected async getSkillSources(workspacePath: string): Promise<ProviderSkillSource[]> {
return [
{
scope: 'project',
rootDir: path.join(workspacePath, '.agents', 'skills'),
commandPrefix: '/',
},
{
scope: 'project',
rootDir: path.join(workspacePath, '.cursor', 'skills'),
commandPrefix: '/',
},
{
scope: 'user',
rootDir: path.join(os.homedir(), '.cursor', 'skills'),
commandPrefix: '/',
},
];
}
}

View File

@@ -1,13 +1,22 @@
import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js';
import { CursorProviderAuth } from '@/modules/providers/list/cursor/cursor-auth.provider.js';
import { CursorMcpProvider } from '@/modules/providers/list/cursor/cursor-mcp.provider.js';
import { CursorSessionSynchronizer } from '@/modules/providers/list/cursor/cursor-session-synchronizer.provider.js';
import { CursorSessionsProvider } from '@/modules/providers/list/cursor/cursor-sessions.provider.js';
import type { IProviderAuth, IProviderSessions } from '@/shared/interfaces.js';
import { CursorSkillsProvider } from '@/modules/providers/list/cursor/cursor-skills.provider.js';
import type {
IProviderAuth,
IProviderSessionSynchronizer,
IProviderSkills,
IProviderSessions,
} from '@/shared/interfaces.js';
export class CursorProvider extends AbstractProvider {
readonly mcp = new CursorMcpProvider();
readonly auth: IProviderAuth = new CursorProviderAuth();
readonly skills: IProviderSkills = new CursorSkillsProvider();
readonly sessions: IProviderSessions = new CursorSessionsProvider();
readonly sessionSynchronizer: IProviderSessionSynchronizer = new CursorSessionSynchronizer();
constructor() {
super('cursor');

View File

@@ -15,7 +15,24 @@ type GeminiCredentialsStatus = {
error?: string;
};
type GeminiAuthType =
| 'oauth-personal'
| 'gemini-api-key'
| 'vertex-ai'
| 'compute-default-credentials'
| 'gateway'
| 'cloud-shell'
| null;
export class GeminiProviderAuth implements IProviderAuth {
/**
* Gemini CLI can override its home root via GEMINI_CLI_HOME.
* Use the same resolution so status checks match runtime behavior.
*/
private getGeminiCliHome(): string {
return process.env.GEMINI_CLI_HOME?.trim() || os.homedir();
}
/**
* Checks whether the Gemini CLI is available on this host.
*/
@@ -58,6 +75,88 @@ export class GeminiProviderAuth implements IProviderAuth {
};
}
/**
* Parses dotenv-style key/value pairs.
*/
private parseEnvFile(content: string): Record<string, string> {
const parsed: Record<string, string> = {};
for (const rawLine of content.split(/\r?\n/)) {
const line = rawLine.trim();
if (!line || line.startsWith('#')) {
continue;
}
const normalizedLine = line.startsWith('export ')
? line.slice('export '.length).trim()
: line;
const separatorIndex = normalizedLine.indexOf('=');
if (separatorIndex <= 0) {
continue;
}
const key = normalizedLine.slice(0, separatorIndex).trim();
if (!key) {
continue;
}
let value = normalizedLine.slice(separatorIndex + 1).trim();
const quoted = (value.startsWith('"') && value.endsWith('"')) || (value.startsWith('\'') && value.endsWith('\''));
if (quoted) {
value = value.slice(1, -1);
} else {
value = value.replace(/\s+#.*$/, '').trim();
}
parsed[key] = value;
}
return parsed;
}
/**
* Loads user-level auth env in Gemini's "first file found" order.
*/
private async loadUserLevelAuthEnv(): Promise<Record<string, string>> {
const geminiCliHome = this.getGeminiCliHome();
const envCandidates = [
path.join(geminiCliHome, '.gemini', '.env'),
path.join(geminiCliHome, '.env'),
];
for (const envPath of envCandidates) {
try {
const content = await readFile(envPath, 'utf8');
return this.parseEnvFile(content);
} catch {
// Continue to the next fallback.
}
}
return {};
}
/**
* Reads Gemini's selected auth type from settings.json when available.
*/
private async readSelectedAuthType(): Promise<GeminiAuthType> {
try {
const settingsPath = path.join(this.getGeminiCliHome(), '.gemini', 'settings.json');
const content = await readFile(settingsPath, 'utf8');
const settings = readObjectRecord(JSON.parse(content));
const security = readObjectRecord(settings?.security);
const auth = readObjectRecord(security?.auth);
const selectedType = readOptionalString(auth?.selectedType);
if (!selectedType) {
return null;
}
return selectedType as GeminiAuthType;
} catch {
return null;
}
}
/**
* Checks Gemini credentials from API key env vars or local OAuth credential files.
*/
@@ -66,8 +165,46 @@ export class GeminiProviderAuth implements IProviderAuth {
return { authenticated: true, email: 'API Key Auth', method: 'api_key' };
}
const userEnv = await this.loadUserLevelAuthEnv();
if (readOptionalString(userEnv.GEMINI_API_KEY)) {
return { authenticated: true, email: 'API Key Auth', method: 'api_key' };
}
const selectedType = await this.readSelectedAuthType();
if (selectedType === 'vertex-ai') {
const hasGoogleApiKey = Boolean(
process.env.GOOGLE_API_KEY?.trim()
|| readOptionalString(userEnv.GOOGLE_API_KEY)
);
const hasProject = Boolean(
process.env.GOOGLE_CLOUD_PROJECT?.trim()
|| process.env.GOOGLE_CLOUD_PROJECT_ID?.trim()
|| readOptionalString(userEnv.GOOGLE_CLOUD_PROJECT)
|| readOptionalString(userEnv.GOOGLE_CLOUD_PROJECT_ID)
);
const hasLocation = Boolean(
process.env.GOOGLE_CLOUD_LOCATION?.trim()
|| readOptionalString(userEnv.GOOGLE_CLOUD_LOCATION)
);
const hasServiceAccount = Boolean(
process.env.GOOGLE_APPLICATION_CREDENTIALS?.trim()
|| readOptionalString(userEnv.GOOGLE_APPLICATION_CREDENTIALS)
);
if (hasGoogleApiKey || hasServiceAccount || (hasProject && hasLocation)) {
return { authenticated: true, email: 'Vertex AI Auth', method: 'vertex_ai' };
}
return {
authenticated: false,
email: null,
method: 'vertex_ai',
error: 'Gemini is set to Vertex AI, but required env vars are missing',
};
}
try {
const credsPath = path.join(this.getGeminiCliHome(), '.gemini', 'oauth_creds.json');
const content = await readFile(credsPath, 'utf8');
const creds = readObjectRecord(JSON.parse(content)) ?? {};
const accessToken = readOptionalString(creds.access_token);
@@ -106,6 +243,25 @@ export class GeminiProviderAuth implements IProviderAuth {
method: 'credentials_file',
};
} catch {
if (selectedType === 'gemini-api-key') {
return {
authenticated: false,
email: null,
method: 'api_key',
error: 'Gemini is set to "Use Gemini API key", but GEMINI_API_KEY is unavailable',
};
}
if (selectedType === 'oauth-personal') {
return {
authenticated: false,
email: null,
method: 'credentials_file',
error: 'Gemini is set to Google sign-in, but no cached OAuth credentials were found',
};
}
// If no explicit auth type was selected, surface the generic "not configured" error.
return {
authenticated: false,
email: null,
@@ -140,7 +296,7 @@ export class GeminiProviderAuth implements IProviderAuth {
*/
private async getActiveAccountEmail(): Promise<string | null> {
try {
const accPath = path.join(this.getGeminiCliHome(), '.gemini', 'google_accounts.json');
const accContent = await readFile(accPath, 'utf8');
const accounts = readObjectRecord(JSON.parse(accContent));
return readOptionalString(accounts?.active) ?? null;

View File

@@ -0,0 +1,405 @@
import crypto from 'node:crypto';
import os from 'node:os';
import path from 'node:path';
import { readFile } from 'node:fs/promises';
import { projectsDb, sessionsDb } from '@/modules/database/index.js';
import {
findFilesRecursivelyCreatedAfter,
normalizeProjectPath,
normalizeSessionName,
readFileTimestamps,
} from '@/shared/utils.js';
import type { IProviderSessionSynchronizer } from '@/shared/interfaces.js';
import type { AnyRecord } from '@/shared/types.js';
type ParsedSession = {
sessionId: string;
projectPath: string;
sessionName?: string;
};
type GeminiJsonlMetadata = {
sessionId: string;
projectPath?: string;
projectHash?: string;
firstUserMessage?: string;
};
/**
* Session indexer for Gemini transcript artifacts.
*/
export class GeminiSessionSynchronizer implements IProviderSessionSynchronizer {
private readonly provider = 'gemini' as const;
private readonly geminiHome = path.join(os.homedir(), '.gemini');
/**
* Scans Gemini legacy JSON and new JSONL artifacts and upserts sessions into DB.
*/
async synchronize(since?: Date): Promise<number> {
const projectHashLookup = this.buildProjectHashLookup();
// const legacySessionFiles = await findFilesRecursivelyCreatedAfter(
// path.join(this.geminiHome, 'sessions'),
// '.json',
// since ?? null
// );
// Gemini creates overlapping artifacts across `sessions/` and `tmp/`.
// We currently index only `tmp/*/chats/*.jsonl` because those files are the
// live transcript source and avoid duplicate session rows from mirrored files.
// const legacyTempFiles = await findFilesRecursivelyCreatedAfter(
// path.join(this.geminiHome, 'tmp'),
// '.json',
// since ?? null
// );
// const jsonlSessionFiles = await findFilesRecursivelyCreatedAfter(
// path.join(this.geminiHome, 'sessions'),
// '.jsonl',
// since ?? null
// );
const jsonlTempFiles = await findFilesRecursivelyCreatedAfter(
path.join(this.geminiHome, 'tmp'),
'.jsonl',
since ?? null
);
// Current strategy: index only temp chat JSONL artifacts.
const files = [
// ...legacySessionFiles,
// Intentionally disabled to avoid duplicate indexing from mirrored
// `sessions/*.json` and `sessions/*.jsonl` artifacts.
// ...legacyTempFiles,
// ...jsonlSessionFiles,
...jsonlTempFiles,
];
let processed = 0;
for (const filePath of files) {
if (this.shouldSkipTempArtifact(filePath)) {
continue;
}
const parsed = filePath.endsWith('.jsonl')
? await this.processJsonlSessionFile(filePath, projectHashLookup)
: await this.processLegacySessionFile(filePath);
if (!parsed) {
continue;
}
const timestamps = await readFileTimestamps(filePath);
sessionsDb.createSession(
parsed.sessionId,
this.provider,
parsed.projectPath,
parsed.sessionName,
timestamps.createdAt,
timestamps.updatedAt,
filePath
);
processed += 1;
}
return processed;
}
/**
* Parses and upserts one Gemini legacy JSON or JSONL artifact.
*/
async synchronizeFile(filePath: string): Promise<string | null> {
if (!filePath.endsWith('.json') && !filePath.endsWith('.jsonl')) {
return null;
}
if (this.shouldSkipTempArtifact(filePath)) {
return null;
}
const parsed = filePath.endsWith('.jsonl')
? await this.processJsonlSessionFile(filePath, this.buildProjectHashLookup())
: await this.processLegacySessionFile(filePath);
if (!parsed) {
return null;
}
const timestamps = await readFileTimestamps(filePath);
return sessionsDb.createSession(
parsed.sessionId,
this.provider,
parsed.projectPath,
parsed.sessionName,
timestamps.createdAt,
timestamps.updatedAt,
filePath
);
}
/**
* Extracts session metadata from one Gemini legacy JSON artifact.
*/
private async processLegacySessionFile(filePath: string): Promise<ParsedSession | null> {
try {
const content = await readFile(filePath, 'utf8');
const data = JSON.parse(content) as AnyRecord;
const sessionId =
typeof data.sessionId === 'string'
? data.sessionId
: typeof data.id === 'string'
? data.id
: undefined;
if (!sessionId) {
return null;
}
const workspaceProjectPath = await this.resolveProjectPathFromChatWorkspace(filePath);
const projectPath = typeof data.projectPath === 'string' && data.projectPath.trim().length > 0
? data.projectPath
: workspaceProjectPath;
if (!projectPath) {
return null;
}
const messages = Array.isArray(data.messages) ? data.messages : [];
const firstMessage = messages[0] as AnyRecord | undefined;
let rawName: string | undefined;
if (Array.isArray(firstMessage?.content) && typeof firstMessage.content[0]?.text === 'string') {
rawName = firstMessage.content[0].text;
} else if (typeof firstMessage?.content === 'string') {
rawName = firstMessage.content;
}
return {
sessionId,
projectPath,
sessionName: normalizeSessionName(rawName, 'New Gemini Chat'),
};
} catch {
return null;
}
}
/**
* Extracts session metadata from one Gemini JSONL artifact.
*/
private async processJsonlSessionFile(
filePath: string,
projectHashLookup: Map<string, string>
): Promise<ParsedSession | null> {
const metadata = await this.extractJsonlMetadata(filePath);
if (!metadata) {
return null;
}
let projectPath = typeof metadata.projectPath === 'string' ? metadata.projectPath.trim() : '';
if (!projectPath) {
const workspaceProjectPath = await this.resolveProjectPathFromChatWorkspace(filePath);
if (workspaceProjectPath) {
projectPath = workspaceProjectPath;
}
}
if (!projectPath && typeof metadata.projectHash === 'string') {
projectPath = projectHashLookup.get(metadata.projectHash.trim().toLowerCase()) ?? '';
}
if (!projectPath) {
return null;
}
// Once we resolve a project hash/path pair, keep it in-memory for this sync run.
if (typeof metadata.projectHash === 'string' && metadata.projectHash.trim()) {
projectHashLookup.set(metadata.projectHash.trim().toLowerCase(), projectPath);
}
return {
sessionId: metadata.sessionId,
projectPath,
sessionName: normalizeSessionName(metadata.firstUserMessage, 'New Gemini Chat'),
};
}
/**
* Reads first useful metadata from Gemini JSONL files.
*/
private async extractJsonlMetadata(filePath: string): Promise<GeminiJsonlMetadata | null> {
try {
const content = await readFile(filePath, 'utf8');
const lines = content.split('\n');
let sessionId: string | undefined;
let projectPath: string | undefined;
let projectHash: string | undefined;
let firstUserMessage: string | undefined;
for (const line of lines) {
const trimmed = line.trim();
if (!trimmed) {
continue;
}
let parsed: AnyRecord;
try {
parsed = JSON.parse(trimmed) as AnyRecord;
} catch {
continue;
}
if (!sessionId && typeof parsed.sessionId === 'string') {
sessionId = parsed.sessionId;
}
if (!projectPath && typeof parsed.projectPath === 'string') {
projectPath = parsed.projectPath;
}
if (!projectHash && typeof parsed.projectHash === 'string') {
projectHash = parsed.projectHash;
}
if (!firstUserMessage && parsed.type === 'user') {
firstUserMessage = this.extractGeminiTextContent(parsed.content);
}
if (sessionId && (projectPath || projectHash) && firstUserMessage) {
break;
}
}
if (!sessionId) {
return null;
}
return {
sessionId,
projectPath,
projectHash,
firstUserMessage,
};
} catch {
return null;
}
}
/**
* Tries to resolve project root from Gemini tmp chat workspaces.
*/
private async resolveProjectPathFromChatWorkspace(filePath: string): Promise<string> {
if (!filePath.includes(`${path.sep}chats${path.sep}`)) {
return '';
}
const chatsDir = path.dirname(filePath);
const workspaceDir = path.dirname(chatsDir);
const projectRootPath = path.join(workspaceDir, '.project_root');
try {
const rootContent = await readFile(projectRootPath, 'utf8');
return rootContent.trim();
} catch {
return '';
}
}
/**
* Builds a hash->path lookup for Gemini JSONL metadata that stores projectHash.
*/
private buildProjectHashLookup(): Map<string, string> {
const lookup = new Map<string, string>();
const knownPaths = new Set<string>();
for (const project of projectsDb.getProjectPaths()) {
if (typeof project.project_path === 'string' && project.project_path.trim()) {
knownPaths.add(project.project_path.trim());
}
}
for (const session of sessionsDb.getAllSessions()) {
if (session.provider === this.provider && typeof session.project_path === 'string' && session.project_path.trim()) {
knownPaths.add(session.project_path.trim());
}
}
for (const knownPath of knownPaths) {
this.addProjectHashCandidates(lookup, knownPath);
}
return lookup;
}
/**
* Adds likely Gemini hash variants for one project path.
*/
private addProjectHashCandidates(lookup: Map<string, string>, projectPath: string): void {
const trimmed = projectPath.trim();
if (!trimmed) {
return;
}
const normalized = normalizeProjectPath(trimmed);
const resolved = path.resolve(trimmed);
const resolvedNormalized = normalizeProjectPath(resolved);
const candidates = new Set<string>([
trimmed,
normalized,
resolved,
resolvedNormalized,
]);
if (process.platform === 'win32') {
for (const candidate of [...candidates]) {
candidates.add(candidate.toLowerCase());
}
}
for (const candidate of candidates) {
if (!candidate) {
continue;
}
const hash = this.sha256(candidate);
if (!lookup.has(hash)) {
lookup.set(hash, trimmed);
}
}
}
/**
* Returns first user text from Gemini content payload shapes.
*/
private extractGeminiTextContent(content: unknown): string | undefined {
if (typeof content === 'string' && content.trim().length > 0) {
return content;
}
if (!Array.isArray(content)) {
return undefined;
}
for (const part of content) {
if (typeof part === 'string' && part.trim().length > 0) {
return part;
}
if (part && typeof part === 'object' && typeof (part as AnyRecord).text === 'string') {
const text = (part as AnyRecord).text;
if (text.trim().length > 0) {
return text;
}
}
}
return undefined;
}
/**
* Keeps tmp scanning scoped to chat artifacts only.
*/
private shouldSkipTempArtifact(filePath: string): boolean {
return (
filePath.startsWith(path.join(this.geminiHome, 'tmp'))
&& !filePath.includes(`${path.sep}chats${path.sep}`)
);
}
private sha256(value: string): string {
return crypto.createHash('sha256').update(value).digest('hex');
}
}

View File

@@ -1,11 +1,249 @@
import fsSync from 'node:fs';
import fs from 'node:fs/promises';
import readline from 'node:readline';
import { sessionsDb } from '@/modules/database/index.js';
import type { IProviderSessions } from '@/shared/interfaces.js';
import type { AnyRecord, FetchHistoryOptions, FetchHistoryResult, NormalizedMessage } from '@/shared/types.js';
import { createNormalizedMessage, generateMessageId, readObjectRecord } from '@/shared/utils.js';
const PROVIDER = 'gemini';
type GeminiHistoryResult = {
messages: AnyRecord[];
tokenUsage?: unknown;
};
function mapGeminiRole(value: unknown): 'user' | 'assistant' | null {
if (value === 'user') {
return 'user';
}
if (value === 'gemini' || value === 'assistant') {
return 'assistant';
}
return null;
}
function extractGeminiTextContent(content: unknown): string {
if (typeof content === 'string') {
return content;
}
if (!Array.isArray(content)) {
return '';
}
return content
.map((part) => {
if (typeof part === 'string') {
return part;
}
if (!part || typeof part !== 'object') {
return '';
}
const record = part as AnyRecord;
if (typeof record.text === 'string') {
return record.text;
}
return '';
})
.filter(Boolean)
.join('\n');
}
function extractGeminiThoughts(thoughts: unknown): string {
if (!Array.isArray(thoughts)) {
return '';
}
return thoughts
.map((item) => {
if (!item || typeof item !== 'object') {
return '';
}
const record = item as AnyRecord;
const subject = typeof record.subject === 'string' ? record.subject.trim() : '';
const description = typeof record.description === 'string' ? record.description.trim() : '';
if (subject && description) {
return `${subject}: ${description}`;
}
return description || subject;
})
.filter(Boolean)
.join('\n');
}
function buildGeminiTokenUsage(tokens: unknown): AnyRecord | undefined {
if (!tokens || typeof tokens !== 'object') {
return undefined;
}
const record = tokens as AnyRecord;
const input = Number(record.input || 0);
const output = Number(record.output || 0);
const cached = Number(record.cached || 0);
const thoughts = Number(record.thoughts || 0);
const tool = Number(record.tool || 0);
const totalFromFields = input + output + cached + thoughts + tool;
const total = Number(record.total || totalFromFields || 0);
return {
used: total,
total: total,
breakdown: {
input,
output,
cached,
thoughts,
tool,
},
};
}
async function getGeminiLegacySessionMessages(sessionFilePath: string): Promise<GeminiHistoryResult> {
try {
const data = await fs.readFile(sessionFilePath, 'utf8');
const session = JSON.parse(data) as AnyRecord;
const sourceMessages = Array.isArray(session.messages) ? session.messages : [];
const messages: AnyRecord[] = [];
for (const msg of sourceMessages) {
const role = mapGeminiRole(msg.type ?? msg.role);
if (!role) {
continue;
}
messages.push({
type: 'message',
uuid: typeof msg.id === 'string' ? msg.id : undefined,
message: { role, content: msg.content },
timestamp: msg.timestamp || null,
});
}
return { messages };
} catch {
return { messages: [] };
}
}
async function getGeminiJsonlSessionMessages(sessionFilePath: string): Promise<GeminiHistoryResult> {
const messages: AnyRecord[] = [];
let tokenUsage: AnyRecord | undefined;
try {
const fileStream = fsSync.createReadStream(sessionFilePath);
const lineReader = readline.createInterface({
input: fileStream,
crlfDelay: Infinity,
});
for await (const line of lineReader) {
const trimmed = line.trim();
if (!trimmed) {
continue;
}
let entry: AnyRecord;
try {
entry = JSON.parse(trimmed) as AnyRecord;
} catch {
continue;
}
// Metadata/update lines (e.g. {$set:{lastUpdated:...}}) do not represent chat messages.
if (entry.$set) {
continue;
}
const role = mapGeminiRole(entry.type);
if (role) {
const textContent = extractGeminiTextContent(entry.content);
if (textContent.trim()) {
messages.push({
type: 'message',
uuid: typeof entry.id === 'string' ? entry.id : undefined,
message: { role, content: textContent },
timestamp: entry.timestamp || null,
});
}
const thinkingContent = extractGeminiThoughts(entry.thoughts);
if (thinkingContent.trim()) {
messages.push({
type: 'thinking',
uuid: typeof entry.id === 'string' ? `${entry.id}_thinking` : undefined,
message: { role: 'assistant', content: thinkingContent },
timestamp: entry.timestamp || null,
isReasoning: true,
});
}
if (role === 'assistant') {
const usage = buildGeminiTokenUsage(entry.tokens);
if (usage) {
tokenUsage = usage;
}
}
continue;
}
if (entry.type === 'tool_use') {
messages.push({
type: 'tool_use',
uuid: typeof entry.id === 'string' ? entry.id : undefined,
timestamp: entry.timestamp || null,
toolName: entry.tool_name || entry.name || 'Tool',
toolInput: entry.parameters ?? entry.input ?? entry.arguments ?? '',
toolCallId: entry.tool_id || entry.toolCallId || entry.id,
});
continue;
}
if (entry.type === 'tool_result') {
messages.push({
type: 'tool_result',
uuid: typeof entry.id === 'string' ? entry.id : undefined,
timestamp: entry.timestamp || null,
toolCallId: entry.tool_id || entry.toolCallId || entry.id || '',
output: entry.output ?? entry.result ?? '',
isError: Boolean(entry.error) || entry.status === 'error',
});
}
}
} catch {
return { messages: [] };
}
messages.sort(
(a, b) => new Date(a.timestamp || 0).getTime() - new Date(b.timestamp || 0).getTime(),
);
return { messages, tokenUsage };
}
async function getGeminiCliSessionMessages(sessionId: string): Promise<GeminiHistoryResult> {
const sessionFilePath = sessionsDb.getSessionById(sessionId)?.jsonl_path;
if (!sessionFilePath) {
return { messages: [] };
}
if (sessionFilePath.endsWith('.jsonl')) {
return getGeminiJsonlSessionMessages(sessionFilePath);
}
return getGeminiLegacySessionMessages(sessionFilePath);
}
export class GeminiSessionsProvider implements IProviderSessions {
/**
* Normalizes live Gemini stream-json events into the shared message shape.
@@ -108,8 +346,7 @@ export class GeminiSessionsProvider implements IProviderSessions {
}
/**
* Loads Gemini history from Gemini CLI session files on disk.
*/
async fetchHistory(
sessionId: string,
@@ -117,28 +354,73 @@ export class GeminiSessionsProvider implements IProviderSessions {
): Promise<FetchHistoryResult> {
const { limit = null, offset = 0 } = options;
let result: GeminiHistoryResult;
try {
result = await getGeminiCliSessionMessages(sessionId);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.warn(`[GeminiProvider] Failed to load session ${sessionId}:`, message);
return { messages: [], total: 0, hasMore: false, offset: 0, limit: null };
}
const rawMessages = result.messages;
const normalized: NormalizedMessage[] = [];
for (let i = 0; i < rawMessages.length; i++) {
const raw = rawMessages[i];
const ts = raw.timestamp || new Date().toISOString();
const baseId = raw.uuid || generateMessageId('gemini');
if (raw.type === 'thinking' || raw.isReasoning) {
const thinkingContent = typeof raw.message?.content === 'string'
? raw.message.content
: typeof raw.content === 'string'
? raw.content
: '';
if (thinkingContent.trim()) {
normalized.push(createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'thinking',
content: thinkingContent,
}));
}
continue;
}
if (raw.type === 'tool_use' || raw.toolName) {
normalized.push(createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'tool_use',
toolName: raw.toolName || 'Tool',
toolInput: raw.toolInput,
toolId: raw.toolCallId || baseId,
}));
continue;
}
if (raw.type === 'tool_result') {
normalized.push(createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'tool_result',
toolId: raw.toolCallId || '',
content: raw.output === undefined ? '' : String(raw.output),
isError: Boolean(raw.isError),
}));
continue;
}
const role = raw.message?.role || raw.role;
const content = raw.message?.content || raw.content;
if (!role || !content) {
continue;
}
@@ -147,8 +429,26 @@ export class GeminiSessionsProvider implements IProviderSessions {
if (Array.isArray(content)) {
for (let partIdx = 0; partIdx < content.length; partIdx++) {
const part = content[partIdx] as AnyRecord | string;
if (typeof part === 'string' && part.trim()) {
normalized.push(createNormalizedMessage({
id: `${baseId}_${partIdx}`,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'text',
role: normalizedRole,
content: part,
}));
continue;
}
if (!part || typeof part !== 'object') {
continue;
}
if ((part.type === 'text' || !part.type) && typeof part.text === 'string' && part.text.trim()) {
normalized.push(createNormalizedMessage({
id: `${baseId}_${partIdx}`,
sessionId,
@@ -192,6 +492,19 @@ export class GeminiSessionsProvider implements IProviderSessions {
role: normalizedRole,
content,
}));
} else {
const textContent = extractGeminiTextContent(content);
if (textContent.trim()) {
normalized.push(createNormalizedMessage({
id: baseId,
sessionId,
timestamp: ts,
provider: PROVIDER,
kind: 'text',
role: normalizedRole,
content: textContent,
}));
}
}
}
@@ -215,13 +528,20 @@ export class GeminiSessionsProvider implements IProviderSessions {
const messages = pageLimit === null
? normalized.slice(start)
: normalized.slice(start, start + pageLimit);
let total = 0;
for (const msg of normalized) {
if (msg.kind !== 'tool_result') {
total += 1;
}
}
return {
messages,
total,
hasMore: pageLimit === null ? false : start + pageLimit < normalized.length,
offset: start,
limit: pageLimit,
tokenUsage: result.tokenUsage,
};
}
}

View File

@@ -0,0 +1,36 @@
import os from 'node:os';
import path from 'node:path';
import { SkillsProvider } from '@/modules/providers/shared/skills/skills.provider.js';
import type { ProviderSkillSource } from '@/shared/types.js';
export class GeminiSkillsProvider extends SkillsProvider {
constructor() {
super('gemini');
}
protected async getSkillSources(workspacePath: string): Promise<ProviderSkillSource[]> {
return [
{
scope: 'user',
rootDir: path.join(os.homedir(), '.gemini', 'skills'),
commandPrefix: '/',
},
{
scope: 'user',
rootDir: path.join(os.homedir(), '.agents', 'skills'),
commandPrefix: '/',
},
{
scope: 'project',
rootDir: path.join(workspacePath, '.gemini', 'skills'),
commandPrefix: '/',
},
{
scope: 'project',
rootDir: path.join(workspacePath, '.agents', 'skills'),
commandPrefix: '/',
},
];
}
}

View File

@@ -1,13 +1,22 @@
import { AbstractProvider } from '@/modules/providers/shared/base/abstract.provider.js';
import { GeminiProviderAuth } from '@/modules/providers/list/gemini/gemini-auth.provider.js';
import { GeminiMcpProvider } from '@/modules/providers/list/gemini/gemini-mcp.provider.js';
import { GeminiSessionSynchronizer } from '@/modules/providers/list/gemini/gemini-session-synchronizer.provider.js';
import { GeminiSessionsProvider } from '@/modules/providers/list/gemini/gemini-sessions.provider.js';
import type { IProviderAuth, IProviderSessions } from '@/shared/interfaces.js';
import { GeminiSkillsProvider } from '@/modules/providers/list/gemini/gemini-skills.provider.js';
import type {
IProviderAuth,
IProviderSessionSynchronizer,
IProviderSkills,
IProviderSessions,
} from '@/shared/interfaces.js';
export class GeminiProvider extends AbstractProvider {
readonly mcp = new GeminiMcpProvider();
readonly auth: IProviderAuth = new GeminiProviderAuth();
readonly skills: IProviderSkills = new GeminiSkillsProvider();
readonly sessions: IProviderSessions = new GeminiSessionsProvider();
readonly sessionSynchronizer: IProviderSessionSynchronizer = new GeminiSessionSynchronizer();
constructor() {
super('gemini');

View File

@@ -2,6 +2,9 @@ import express, { type Request, type Response } from 'express';
import { providerAuthService } from '@/modules/providers/services/provider-auth.service.js';
import { providerMcpService } from '@/modules/providers/services/mcp.service.js';
import { providerSkillsService } from '@/modules/providers/services/skills.service.js';
import { sessionConversationsSearchService } from '@/modules/providers/services/session-conversations-search.service.js';
import { sessionsService } from '@/modules/providers/services/sessions.service.js';
import type { LLMProvider, McpScope, McpTransport, UpsertProviderMcpServerInput } from '@/shared/types.js';
import { AppError, asyncHandler, createApiSuccessResponse } from '@/shared/utils.js';
@@ -25,6 +28,20 @@ const readPathParam = (value: unknown, name: string): string => {
const normalizeProviderParam = (value: unknown): string =>
readPathParam(value, 'provider').trim().toLowerCase();
const SESSION_ID_PATTERN = /^[a-zA-Z0-9._-]{1,120}$/;
const parseSessionId = (value: unknown): string => {
const sessionId = readPathParam(value, 'sessionId').trim();
if (!SESSION_ID_PATTERN.test(sessionId)) {
throw new AppError('Invalid sessionId.', {
code: 'INVALID_SESSION_ID',
statusCode: 400,
});
}
return sessionId;
};
const readOptionalQueryString = (value: unknown): string | undefined => {
if (typeof value !== 'string') {
return undefined;
@@ -34,6 +51,29 @@ const readOptionalQueryString = (value: unknown): string | undefined => {
return normalized.length > 0 ? normalized : undefined;
};
const parseOptionalBooleanQuery = (value: unknown, name: string): boolean | undefined => {
if (value === undefined) {
return undefined;
}
const normalized = readOptionalQueryString(value);
if (!normalized) {
return undefined;
}
if (normalized === 'true') {
return true;
}
if (normalized === 'false') {
return false;
}
throw new AppError(`${name} must be "true" or "false".`, {
code: 'INVALID_QUERY_PARAMETER',
statusCode: 400,
});
};
const parseMcpScope = (value: unknown): McpScope | undefined => {
if (value === undefined) {
return undefined;
@@ -103,19 +143,19 @@ const parseMcpUpsertPayload = (payload: unknown): UpsertProviderMcpServerInput =
args: Array.isArray(body.args) ? body.args.filter((entry): entry is string => typeof entry === 'string') : undefined,
env: typeof body.env === 'object' && body.env !== null
? Object.fromEntries(
Object.entries(body.env as Record<string, unknown>).filter(
(entry): entry is [string, string] => typeof entry[1] === 'string',
),
)
: undefined,
cwd: readOptionalQueryString(body.cwd),
url: readOptionalQueryString(body.url),
headers: typeof body.headers === 'object' && body.headers !== null
? Object.fromEntries(
: undefined,
envVars: Array.isArray(body.envVars)
? body.envVars.filter((entry): entry is string => typeof entry === 'string')
@@ -123,10 +163,10 @@ const parseMcpUpsertPayload = (payload: unknown): UpsertProviderMcpServerInput =
bearerTokenEnvVar: readOptionalQueryString(body.bearerTokenEnvVar),
envHttpHeaders: typeof body.envHttpHeaders === 'object' && body.envHttpHeaders !== null
? Object.fromEntries(
Object.entries(body.envHttpHeaders as Record<string, unknown>).filter(
(entry): entry is [string, string] => typeof entry[1] === 'string',
),
)
: undefined,
};
};
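The same string-entry filtering recurs above for `env`, `headers`, and `envHttpHeaders`. A minimal sketch of that pattern, extracted into a standalone helper (the name `pickStringEntries` is illustrative, not part of the codebase):

```typescript
// Illustrative helper (not in the codebase): keep only string-valued entries
// from a loosely-typed record, mirroring how parseMcpUpsertPayload narrows
// env/headers objects received in a JSON body.
const pickStringEntries = (value: unknown): Record<string, string> | undefined =>
  typeof value === 'object' && value !== null
    ? Object.fromEntries(
        Object.entries(value as Record<string, unknown>).filter(
          (entry): entry is [string, string] => typeof entry[1] === 'string',
        ),
      )
    : undefined;

// Non-string values are dropped rather than coerced.
console.log(pickStringEntries({ PATH: '/usr/bin', DEBUG: true, PORT: 3000 }));
```

Dropping (instead of coercing) non-string values keeps the parsed payload strictly typed without rejecting the whole request for one bad entry.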
@@ -143,6 +183,62 @@ const parseProvider = (value: unknown): LLMProvider => {
});
};
const parseSessionRenameSummary = (payload: unknown): string => {
if (!payload || typeof payload !== 'object') {
throw new AppError('Request body must be an object.', {
code: 'INVALID_REQUEST_BODY',
statusCode: 400,
});
}
const body = payload as Record<string, unknown>;
const summary = typeof body.summary === 'string' ? body.summary.trim() : '';
if (!summary) {
throw new AppError('Summary is required.', {
code: 'INVALID_SESSION_SUMMARY',
statusCode: 400,
});
}
if (summary.length > 500) {
throw new AppError('Summary must not exceed 500 characters.', {
code: 'INVALID_SESSION_SUMMARY',
statusCode: 400,
});
}
return summary;
};
const parseSessionSearchQuery = (value: unknown): string => {
const query = readOptionalQueryString(value) ?? '';
if (query.length < 2) {
throw new AppError('Query must be at least 2 characters', {
code: 'INVALID_SEARCH_QUERY',
statusCode: 400,
});
}
return query;
};
const parseSessionSearchLimit = (value: unknown): number => {
const raw = readOptionalQueryString(value);
if (!raw) {
return 50;
}
const parsed = Number.parseInt(raw, 10);
if (Number.isNaN(parsed)) {
throw new AppError('limit must be a valid integer.', {
code: 'INVALID_QUERY_PARAMETER',
statusCode: 400,
});
}
return Math.max(1, Math.min(parsed, 100));
};
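The limit parser above defaults, validates, then clamps. A small sketch of that behavior in isolation (the helper name is illustrative):

```typescript
// Sketch of parseSessionSearchLimit's behavior: absent -> default 50,
// non-integer -> error, otherwise clamp into [1, 100].
const clampSearchLimit = (raw: string | undefined): number => {
  if (!raw) return 50;
  const parsed = Number.parseInt(raw, 10);
  if (Number.isNaN(parsed)) throw new Error('limit must be a valid integer.');
  return Math.max(1, Math.min(parsed, 100));
};

console.log(clampSearchLimit(undefined)); // 50: default when the param is absent
console.log(clampSearchLimit('9999'));    // 100: upper clamp
console.log(clampSearchLimit('0'));       // 1: lower clamp
```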
router.get(
'/:provider/auth/status',
asyncHandler(async (req: Request, res: Response) => {
@@ -152,6 +248,18 @@ router.get(
}),
);
// ----------------- Skills routes -----------------
router.get(
'/:provider/skills',
asyncHandler(async (req: Request, res: Response) => {
const provider = parseProvider(req.params.provider);
const workspacePath = readOptionalQueryString(req.query.workspacePath);
const skills = await providerSkillsService.listProviderSkills(provider, { workspacePath });
res.json(createApiSuccessResponse({ provider, skills }));
}),
);
// ----------------- MCP routes -----------------
router.get(
'/:provider/mcp/servers',
asyncHandler(async (req: Request, res: Response) => {
@@ -214,4 +322,137 @@ router.post(
}),
);
// ----------------- Session routes -----------------
router.get(
'/sessions/archived',
asyncHandler(async (_req: Request, res: Response) => {
const sessions = sessionsService.listArchivedSessions();
res.json(createApiSuccessResponse({ sessions }));
}),
);
router.delete(
'/sessions/:sessionId',
asyncHandler(async (req: Request, res: Response) => {
const sessionId = parseSessionId(req.params.sessionId);
const force = parseOptionalBooleanQuery(req.query.force, 'force') ?? false;
const deletedFromDisk = parseOptionalBooleanQuery(req.query.deletedFromDisk, 'deletedFromDisk') ?? force;
const result = await sessionsService.deleteOrArchiveSessionById(sessionId, {
force,
deletedFromDisk,
});
res.json(createApiSuccessResponse(result));
}),
);
router.post(
'/sessions/:sessionId/restore',
asyncHandler(async (req: Request, res: Response) => {
const sessionId = parseSessionId(req.params.sessionId);
const result = sessionsService.restoreSessionById(sessionId);
res.json(createApiSuccessResponse(result));
}),
);
router.put(
'/sessions/:sessionId',
asyncHandler(async (req: Request, res: Response) => {
const sessionId = parseSessionId(req.params.sessionId);
const summary = parseSessionRenameSummary(req.body);
const result = sessionsService.renameSessionById(sessionId, summary);
res.json(createApiSuccessResponse(result));
}),
);
router.get(
'/sessions/:sessionId/messages',
asyncHandler(async (req: Request, res: Response) => {
const sessionId = parseSessionId(req.params.sessionId);
const limitRaw = readOptionalQueryString(req.query.limit);
const offsetRaw = readOptionalQueryString(req.query.offset);
let limit: number | null = null;
if (limitRaw !== undefined) {
const parsedLimit = Number.parseInt(limitRaw, 10);
if (Number.isNaN(parsedLimit) || parsedLimit < 0) {
throw new AppError('limit must be a non-negative integer.', {
code: 'INVALID_QUERY_PARAMETER',
statusCode: 400,
});
}
limit = parsedLimit;
}
let offset = 0;
if (offsetRaw !== undefined) {
const parsedOffset = Number.parseInt(offsetRaw, 10);
if (Number.isNaN(parsedOffset) || parsedOffset < 0) {
throw new AppError('offset must be a non-negative integer.', {
code: 'INVALID_QUERY_PARAMETER',
statusCode: 400,
});
}
offset = parsedOffset;
}
const result = await sessionsService.fetchHistory(sessionId, {
limit,
offset,
});
res.json(result);
}),
);
router.get('/search/sessions', asyncHandler(async (req: Request, res: Response) => {
const query = parseSessionSearchQuery(req.query.q);
const limit = parseSessionSearchLimit(req.query.limit);
res.writeHead(200, {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
'X-Accel-Buffering': 'no',
});
let closed = false;
const abortController = new AbortController();
req.on('close', () => {
closed = true;
abortController.abort();
});
try {
await sessionConversationsSearchService.search({
query,
limit,
signal: abortController.signal,
onProgress: ({ projectResult, totalMatches, scannedProjects, totalProjects }) => {
if (closed) {
return;
}
if (projectResult) {
res.write(`event: result\ndata: ${JSON.stringify({ projectResult, totalMatches, scannedProjects, totalProjects })}\n\n`);
return;
}
res.write(`event: progress\ndata: ${JSON.stringify({ totalMatches, scannedProjects, totalProjects })}\n\n`);
},
});
if (!closed) {
res.write('event: done\ndata: {}\n\n');
}
} catch (error) {
console.error('Error searching conversations:', error);
if (!closed) {
res.write(`event: error\ndata: ${JSON.stringify({ error: 'Search failed' })}\n\n`);
}
} finally {
if (!closed) {
res.end();
}
}
}));
export default router;
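Each `res.write` in the search route above emits a standard server-sent-event frame. A sketch of that framing (the `sseFrame` helper is illustrative, not part of the codebase):

```typescript
// Illustrative: build the "event: <name>\ndata: <json>\n\n" frames that the
// /search/sessions route writes for progress, result, done, and error events.
const sseFrame = (event: string, data: unknown): string =>
  `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;

console.log(sseFrame('progress', { totalMatches: 3, scannedProjects: 1, totalProjects: 4 }));
console.log(sseFrame('done', {})); // same shape as the literal 'event: done\ndata: {}\n\n' above
```

The blank line (double `\n`) terminates each event, which is what lets a browser `EventSource` dispatch them incrementally.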

File diff suppressed because it is too large


@@ -0,0 +1,74 @@
import { scanStateDb } from '@/modules/database/index.js';
import { providerRegistry } from '@/modules/providers/provider.registry.js';
import type { LLMProvider } from '@/shared/types.js';
type SessionSynchronizeResult = {
processedByProvider: Record<LLMProvider, number>;
failures: string[];
};
/**
* Orchestrates provider-specific session indexers and indexed-session lifecycle operations.
*/
export const sessionSynchronizerService = {
/**
* Runs all provider synchronizers and updates scan_state.last_scanned_at.
*/
async synchronizeSessions(): Promise<SessionSynchronizeResult> {
const lastScanAt = scanStateDb.getLastScannedAt();
const scanBoundary = new Date();
const processedByProvider: Record<LLMProvider, number> = {
claude: 0,
codex: 0,
cursor: 0,
gemini: 0,
};
const failures: string[] = [];
const results = await Promise.allSettled(
providerRegistry.listProviders().map(async (provider) => ({
provider: provider.id,
processed: await provider.sessionSynchronizer.synchronize(lastScanAt ?? undefined),
}))
);
for (const result of results) {
if (result.status === 'fulfilled') {
processedByProvider[result.value.provider] = result.value.processed;
continue;
}
const reason = result.reason instanceof Error ? result.reason.message : String(result.reason);
failures.push(reason);
}
if (failures.length === 0) {
scanStateDb.updateLastScannedAt(scanBoundary);
} else {
console.warn(
`[Sessions] Skipping scan_state cursor advance because ${failures.length} provider sync(s) failed.`,
);
}
return {
processedByProvider,
failures,
};
},
/**
* Indexes one provider artifact file without running a full provider rescan.
*/
async synchronizeProviderFile(
provider: LLMProvider,
filePath: string
): Promise<{ provider: LLMProvider; indexed: boolean; sessionId: string | null }> {
const resolvedProvider = providerRegistry.resolveProvider(provider);
const sessionId = await resolvedProvider.sessionSynchronizer.synchronizeFile(filePath);
return {
provider,
indexed: Boolean(sessionId),
sessionId,
};
},
};
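The synchronizer above fans out with `Promise.allSettled` so one provider failure cannot hide the others' results. A self-contained sketch of that pattern, under the assumption that each sync resolves to a processed count (`synchronizeAll` is an illustrative name, not part of the codebase):

```typescript
// Sketch: run every provider sync via Promise.allSettled and collect failures
// instead of letting one rejection reject the whole batch.
const synchronizeAll = async (
  syncs: Array<() => Promise<number>>,
): Promise<{ processed: number[]; failures: string[] }> => {
  const results = await Promise.allSettled(syncs.map((sync) => sync()));
  const processed: number[] = [];
  const failures: string[] = [];
  for (const result of results) {
    if (result.status === 'fulfilled') {
      processed.push(result.value);
    } else {
      failures.push(result.reason instanceof Error ? result.reason.message : String(result.reason));
    }
  }
  // The real service only advances scan_state when failures.length === 0.
  return { processed, failures };
};

synchronizeAll([async () => 2, async () => { throw new Error('codex sync failed'); }])
  .then(({ processed, failures }) => console.log(processed, failures));
```

Holding back the `scan_state` cursor on any failure is what lets the next run retry the missed window instead of silently skipping it.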


@@ -0,0 +1,285 @@
import os from 'node:os';
import path from 'node:path';
import { promises as fsPromises } from 'node:fs';
import chokidar, { type FSWatcher } from 'chokidar';
import { sessionSynchronizerService } from '@/modules/providers/services/session-synchronizer.service.js';
import { WS_OPEN_STATE, connectedClients } from '@/modules/websocket/index.js';
import type { LLMProvider } from '@/shared/types.js';
import { getProjectsWithSessions } from '@/modules/projects/index.js';
type WatcherEventType = 'add' | 'change';
const PROVIDER_WATCH_PATHS: Array<{ provider: LLMProvider; rootPath: string }> = [
{
provider: 'claude',
rootPath: path.join(os.homedir(), '.claude', 'projects'),
},
{
provider: 'cursor',
rootPath: path.join(os.homedir(), '.cursor', 'projects'),
},
{
provider: 'codex',
rootPath: path.join(os.homedir(), '.codex', 'sessions'),
},
// {
// provider: 'gemini',
// rootPath: path.join(os.homedir(), '.gemini', 'sessions'),
// },
// Keep `sessions/` watcher disabled: Gemini also mirrors artifacts there,
// which causes duplicate synchronization events.
{
provider: 'gemini',
rootPath: path.join(os.homedir(), '.gemini', 'tmp'),
},
];
const WATCHER_IGNORED_PATTERNS = [
'**/node_modules/**',
'**/.git/**',
'**/dist/**',
'**/build/**',
'**/*.tmp',
'**/*.swp',
'**/.DS_Store',
];
const PROJECTS_UPDATE_DEBOUNCE_MS = 500;
const PROJECTS_UPDATE_MAX_WAIT_MS = 2_000;
const watchers: FSWatcher[] = [];
type PendingWatcherUpdate = {
providers: Set<LLMProvider>;
changeTypes: Set<WatcherEventType>;
updatedSessionIds: Set<string>;
};
let pendingWatcherUpdate: PendingWatcherUpdate | null = null;
let pendingWatcherUpdateStartedAt: number | null = null;
let pendingWatcherFlushTimer: ReturnType<typeof setTimeout> | null = null;
let watcherRefreshInFlight = false;
let watcherRescheduleAfterRefresh = false;
/**
* Filters watcher events to provider-specific session artifact file types.
*/
function isWatcherTargetFile(provider: LLMProvider, filePath: string): boolean {
if (provider === 'gemini') {
return filePath.endsWith('.json') || filePath.endsWith('.jsonl');
}
return filePath.endsWith('.jsonl');
}
function clearPendingWatcherFlushTimer(): void {
if (pendingWatcherFlushTimer) {
clearTimeout(pendingWatcherFlushTimer);
pendingWatcherFlushTimer = null;
}
}
function schedulePendingWatcherFlush(): void {
if (!pendingWatcherUpdate) {
return;
}
const now = Date.now();
if (pendingWatcherUpdateStartedAt === null) {
pendingWatcherUpdateStartedAt = now;
}
const elapsed = now - pendingWatcherUpdateStartedAt;
const remainingMaxWait = Math.max(0, PROJECTS_UPDATE_MAX_WAIT_MS - elapsed);
const delay = Math.min(PROJECTS_UPDATE_DEBOUNCE_MS, remainingMaxWait);
clearPendingWatcherFlushTimer();
pendingWatcherFlushTimer = setTimeout(() => {
void flushPendingWatcherUpdate();
}, delay);
}
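The delay math above implements a debounce with a hard ceiling: each new event re-arms the short debounce window, but the total wait since the first queued event never exceeds the max. A sketch of just that calculation (`nextFlushDelay` is an illustrative name):

```typescript
// Sketch of schedulePendingWatcherFlush's delay math: each event waits up to
// DEBOUNCE_MS, capped so the total wait since the first queued event never
// exceeds MAX_WAIT_MS.
const DEBOUNCE_MS = 500;
const MAX_WAIT_MS = 2_000;

const nextFlushDelay = (firstQueuedAt: number, now: number): number => {
  const remainingMaxWait = Math.max(0, MAX_WAIT_MS - (now - firstQueuedAt));
  return Math.min(DEBOUNCE_MS, remainingMaxWait);
};

console.log(nextFlushDelay(0, 0));    // 500: a fresh burst waits the full debounce
console.log(nextFlushDelay(0, 1800)); // 200: capped so total wait stays <= 2000ms
console.log(nextFlushDelay(0, 2500)); // 0: past the ceiling, flush immediately
```

Without the ceiling, a steady stream of watcher events could postpone the `projects_updated` broadcast indefinitely.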
function queuePendingWatcherUpdate(
eventType: WatcherEventType,
provider: LLMProvider,
updatedSessionId: string | null
): void {
if (!pendingWatcherUpdate) {
pendingWatcherUpdate = {
providers: new Set<LLMProvider>(),
changeTypes: new Set<WatcherEventType>(),
updatedSessionIds: new Set<string>(),
};
}
pendingWatcherUpdate.providers.add(provider);
pendingWatcherUpdate.changeTypes.add(eventType);
if (updatedSessionId) {
pendingWatcherUpdate.updatedSessionIds.add(updatedSessionId);
}
schedulePendingWatcherFlush();
}
async function flushPendingWatcherUpdate(): Promise<void> {
clearPendingWatcherFlushTimer();
if (!pendingWatcherUpdate) {
return;
}
if (watcherRefreshInFlight) {
watcherRescheduleAfterRefresh = true;
return;
}
const queuedUpdate = pendingWatcherUpdate;
pendingWatcherUpdate = null;
pendingWatcherUpdateStartedAt = null;
watcherRefreshInFlight = true;
try {
const updatedProjects = await getProjectsWithSessions({ skipSynchronization: true });
const changeTypes = Array.from(queuedUpdate.changeTypes);
const watchProviders = Array.from(queuedUpdate.providers);
const updatedSessionIds = Array.from(queuedUpdate.updatedSessionIds);
// Backward-compatible fields stay populated with the first queued values.
const updateMessage = JSON.stringify({
type: 'projects_updated',
projects: updatedProjects,
timestamp: new Date().toISOString(),
changeType: changeTypes[0] ?? 'change',
updatedSessionId: updatedSessionIds[0] ?? undefined,
watchProvider: watchProviders[0] ?? undefined,
changeTypes,
updatedSessionIds,
watchProviders,
batched: true,
});
connectedClients.forEach(client => {
if (client.readyState === WS_OPEN_STATE) {
client.send(updateMessage);
}
});
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.error('Session watcher refresh failed while broadcasting projects_updated', { error: message });
} finally {
watcherRefreshInFlight = false;
if (pendingWatcherUpdate || watcherRescheduleAfterRefresh) {
watcherRescheduleAfterRefresh = false;
schedulePendingWatcherFlush();
}
}
}
/**
* Handles file watcher updates and triggers provider file-level synchronization.
*/
async function onUpdate(
eventType: WatcherEventType,
filePath: string,
provider: LLMProvider
): Promise<void> {
if (!isWatcherTargetFile(provider, filePath)) {
return;
}
try {
const result = await sessionSynchronizerService.synchronizeProviderFile(provider, filePath);
if (!result.indexed) {
return;
}
console.log(`Session synchronization triggered by ${eventType} event for provider "${provider}"`, {
filePath,
sessionId: result.sessionId,
});
queuePendingWatcherUpdate(eventType, provider, result.sessionId);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.error(`Session watcher sync failed for provider "${provider}"`, {
eventType,
filePath,
error: message,
});
}
}
/**
* Starts provider filesystem watchers and performs initial DB synchronization.
*/
export async function initializeSessionsWatcher(): Promise<void> {
console.log('Setting up session watchers');
const initialSync = await sessionSynchronizerService.synchronizeSessions();
console.log('Initial session synchronization complete', {
processedByProvider: initialSync.processedByProvider,
failures: initialSync.failures,
});
for (const { provider, rootPath } of PROVIDER_WATCH_PATHS) {
try {
await fsPromises.mkdir(rootPath, { recursive: true });
const watcher = chokidar.watch(rootPath, {
ignored: WATCHER_IGNORED_PATTERNS,
persistent: true,
ignoreInitial: true,
followSymlinks: false,
depth: 6,
usePolling: true,
interval: 6_000,
binaryInterval: 6_000,
});
watcher
.on('add', (filePath: string) => {
void onUpdate('add', filePath, provider);
})
.on('change', (filePath: string) => {
void onUpdate('change', filePath, provider);
})
.on('error', (error: unknown) => {
const message = error instanceof Error ? error.message : String(error);
console.error(`Session watcher error for provider "${provider}"`, { error: message });
});
watchers.push(watcher);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.error(`Failed to initialize session watcher for provider "${provider}"`, {
rootPath,
error: message,
});
}
}
}
/**
* Stops all active provider session watchers.
*/
export async function closeSessionsWatcher(): Promise<void> {
clearPendingWatcherFlushTimer();
await Promise.all(
watchers.map(async (watcher) => {
try {
await watcher.close();
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.error('Failed to close session watcher', { error: message });
}
})
);
watchers.length = 0;
pendingWatcherUpdate = null;
pendingWatcherUpdateStartedAt = null;
watcherRefreshInFlight = false;
watcherRescheduleAfterRefresh = false;
}


@@ -1,3 +1,7 @@
import fsp from 'node:fs/promises';
import path from 'node:path';
import { projectsDb, sessionsDb } from '@/modules/database/index.js';
import { providerRegistry } from '@/modules/providers/provider.registry.js';
import type {
FetchHistoryOptions,
@@ -5,6 +9,58 @@ import type {
LLMProvider,
NormalizedMessage,
} from '@/shared/types.js';
import { AppError } from '@/shared/utils.js';
type ArchivedSessionListItem = {
sessionId: string;
provider: LLMProvider;
projectId: string | null;
projectPath: string | null;
projectDisplayName: string;
sessionTitle: string;
createdAt: string | null;
updatedAt: string | null;
lastActivity: string | null;
isProjectArchived: boolean;
};
/**
* Removes one file if it exists.
*/
async function removeFileIfExists(filePath: string): Promise<boolean> {
try {
await fsp.unlink(filePath);
return true;
} catch (error) {
const code = (error as NodeJS.ErrnoException).code;
if (code === 'ENOENT') {
return false;
}
throw error;
}
}
/**
* Archive rows need a stable project label even when the owning project is not
* part of the active sidebar payload. This lightweight resolver keeps the
* archive API self-contained while still matching the project's stored display
* name when one exists.
*/
function resolveProjectDisplayName(
projectPath: string | null,
customProjectName: string | null | undefined,
): string {
const trimmedCustomName = typeof customProjectName === 'string' ? customProjectName.trim() : '';
if (trimmedCustomName.length > 0) {
return trimmedCustomName;
}
if (!projectPath) {
return 'Unknown Project';
}
return path.basename(projectPath) || projectPath;
}
/**
* Application service for provider-backed session message operations.
@@ -33,13 +89,145 @@ export const sessionsService = {
},
/**
* Fetches persisted history by session id.
*
* Provider and provider-specific lookup hints are resolved from the indexed
* session metadata in the database.
*/
fetchHistory(
sessionId: string,
options: Pick<FetchHistoryOptions, 'limit' | 'offset'> = {},
): Promise<FetchHistoryResult> {
const session = sessionsDb.getSessionById(sessionId);
if (!session) {
throw new AppError(`Session "${sessionId}" was not found.`, {
code: 'SESSION_NOT_FOUND',
statusCode: 404,
});
}
const provider = session.provider as LLMProvider;
return providerRegistry.resolveProvider(provider).sessions.fetchHistory(sessionId, {
limit: options.limit ?? null,
offset: options.offset ?? 0,
projectPath: session.project_path ?? '',
});
},
/**
* Returns archived sessions with enough project metadata for the sidebar to
* group, filter, open, and restore them without a per-row follow-up query.
*/
listArchivedSessions(): ArchivedSessionListItem[] {
const archivedSessions = sessionsDb.getArchivedSessions();
const projectCache = new Map<string, ReturnType<typeof projectsDb.getProjectPath>>();
return archivedSessions.map((session) => {
const projectPath = session.project_path?.trim() ? session.project_path : null;
let project = null;
if (projectPath) {
if (!projectCache.has(projectPath)) {
projectCache.set(projectPath, projectsDb.getProjectPath(projectPath));
}
project = projectCache.get(projectPath) ?? null;
}
return {
sessionId: session.session_id,
provider: session.provider as LLMProvider,
projectId: project?.project_id ?? null,
projectPath,
projectDisplayName: resolveProjectDisplayName(projectPath, project?.custom_project_name),
sessionTitle: session.custom_name?.trim() || session.session_id,
createdAt: session.created_at ?? null,
updatedAt: session.updated_at ?? null,
lastActivity: session.updated_at ?? session.created_at ?? null,
isProjectArchived: Boolean(project?.isArchived),
};
});
},
/**
* Archives or permanently deletes one persisted session row by id.
*
* Soft-delete mirrors the project behavior by toggling `isArchived` so the
* row disappears from active lists but remains restorable. Force-delete
* optionally removes the transcript file before deleting the database row.
*/
async deleteOrArchiveSessionById(
sessionId: string,
options: {
force?: boolean;
deletedFromDisk?: boolean;
} = {},
): Promise<{ sessionId: string; action: 'archived' | 'deleted'; deletedFromDisk: boolean }> {
const session = sessionsDb.getSessionById(sessionId);
if (!session) {
throw new AppError(`Session "${sessionId}" was not found.`, {
code: 'SESSION_NOT_FOUND',
statusCode: 404,
});
}
if (!options.force) {
sessionsDb.updateSessionIsArchived(sessionId, true);
return {
sessionId,
action: 'archived',
deletedFromDisk: false,
};
}
let removedFromDisk = false;
if (options.deletedFromDisk && session.jsonl_path) {
removedFromDisk = await removeFileIfExists(session.jsonl_path);
}
const deleted = sessionsDb.deleteSessionById(sessionId);
if (!deleted) {
throw new AppError(`Session "${sessionId}" was not found.`, {
code: 'SESSION_NOT_FOUND',
statusCode: 404,
});
}
return {
sessionId,
action: 'deleted',
deletedFromDisk: removedFromDisk,
};
},
/**
* Restores one archived session back into the active sidebar lists.
*/
restoreSessionById(sessionId: string): { sessionId: string; isArchived: false } {
const session = sessionsDb.getSessionById(sessionId);
if (!session) {
throw new AppError(`Session "${sessionId}" was not found.`, {
code: 'SESSION_NOT_FOUND',
statusCode: 404,
});
}
sessionsDb.updateSessionIsArchived(sessionId, false);
return { sessionId, isArchived: false };
},
/**
* Renames one session by id without requiring the caller to pass provider.
*/
renameSessionById(sessionId: string, summary: string): { sessionId: string; summary: string } {
const session = sessionsDb.getSessionById(sessionId);
if (!session) {
throw new AppError(`Session "${sessionId}" was not found.`, {
code: 'SESSION_NOT_FOUND',
statusCode: 404,
});
}
sessionsDb.updateSessionCustomName(sessionId, summary);
return { sessionId, summary };
},
};


@@ -0,0 +1,15 @@
import { providerRegistry } from '@/modules/providers/provider.registry.js';
import type { ProviderSkill, ProviderSkillListOptions } from '@/shared/types.js';
export const providerSkillsService = {
/**
* Lists normalized skills visible to one provider.
*/
async listProviderSkills(
providerName: string,
options?: ProviderSkillListOptions,
): Promise<ProviderSkill[]> {
const provider = providerRegistry.resolveProvider(providerName);
return provider.skills.listSkills(options);
},
};


@@ -1,4 +1,11 @@
import type { IProvider, IProviderAuth, IProviderMcp, IProviderSessions } from '@/shared/interfaces.js';
import type {
IProvider,
IProviderAuth,
IProviderMcp,
IProviderSessionSynchronizer,
IProviderSkills,
IProviderSessions,
} from '@/shared/interfaces.js';
import type { LLMProvider } from '@/shared/types.js';
/**
@@ -12,7 +19,9 @@ export abstract class AbstractProvider implements IProvider {
readonly id: LLMProvider;
abstract readonly mcp: IProviderMcp;
abstract readonly auth: IProviderAuth;
abstract readonly skills: IProviderSkills;
abstract readonly sessions: IProviderSessions;
abstract readonly sessionSynchronizer: IProviderSessionSynchronizer;
protected constructor(id: LLMProvider) {
this.id = id;


@@ -0,0 +1,64 @@
import path from 'node:path';
import type { IProviderSkills } from '@/shared/interfaces.js';
import type {
LLMProvider,
ProviderSkill,
ProviderSkillListOptions,
ProviderSkillSource,
} from '@/shared/types.js';
import {
findProviderSkillMarkdownFiles,
readProviderSkillMarkdownDefinition,
} from '@/shared/utils.js';
const resolveWorkspacePath = (workspacePath?: string): string =>
path.resolve(workspacePath ?? process.cwd());
/**
* Shared skills provider for provider-specific skill source discovery.
*/
export abstract class SkillsProvider implements IProviderSkills {
protected readonly provider: LLMProvider;
protected constructor(provider: LLMProvider) {
this.provider = provider;
}
async listSkills(options?: ProviderSkillListOptions): Promise<ProviderSkill[]> {
const workspacePath = resolveWorkspacePath(options?.workspacePath);
const sources = await this.getSkillSources(workspacePath);
const skills: ProviderSkill[] = [];
for (const source of sources) {
const skillFiles = await findProviderSkillMarkdownFiles(source.rootDir, {
recursive: source.recursive,
});
for (const skillPath of skillFiles) {
try {
const definition = await readProviderSkillMarkdownDefinition(skillPath);
const command = source.commandForSkill
? source.commandForSkill(definition.name)
: `${source.commandPrefix ?? '/'}${definition.name}`;
skills.push({
provider: this.provider,
name: definition.name,
description: definition.description,
command,
scope: source.scope,
sourcePath: skillPath,
pluginName: source.pluginName,
pluginId: source.pluginId,
});
} catch {
// A malformed or unreadable skill markdown file should not hide other valid skills.
}
}
}
return skills;
}
protected abstract getSkillSources(workspacePath: string): Promise<ProviderSkillSource[]>;
}
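`listSkills` above derives each slash command either from a source-supplied `commandForSkill` callback or from a prefix plus the skill name. A sketch of the two resulting command shapes, matching the expectations in the test file below (helper names are illustrative):

```typescript
// Illustrative: the two command shapes SkillsProvider.listSkills produces.
// Default sources use prefix + name; plugin sources namespace by plugin name.
const defaultCommand = (name: string, prefix = '/'): string => `${prefix}${name}`;
const pluginCommand = (pluginName: string) => (name: string): string =>
  `/${pluginName}:${name}`;

console.log(defaultCommand('claude-user'));         // "/claude-user"
console.log(pluginCommand('Notion')('insert-row')); // "/Notion:insert-row"
```

The plugin namespace is why the earlier fix guards against empty or `@` plugin ids: with no safe plugin name, the namespaced form would degenerate to malformed commands like `/:command`.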


@@ -0,0 +1,446 @@
import assert from 'node:assert/strict';
import fs from 'node:fs/promises';
import os from 'node:os';
import path from 'node:path';
import test from 'node:test';
import { providerSkillsService } from '@/modules/providers/services/skills.service.js';
const patchHomeDir = (nextHomeDir: string) => {
const original = os.homedir;
(os as any).homedir = () => nextHomeDir;
return () => {
(os as any).homedir = original;
};
};
const writeSkill = async (
skillsRoot: string,
directoryName: string,
name: string,
description: string,
): Promise<string> => {
const skillDir = path.join(skillsRoot, directoryName);
await fs.mkdir(skillDir, { recursive: true });
const skillPath = path.join(skillDir, 'SKILL.md');
await fs.writeFile(
skillPath,
`---\nname: ${name}\ndescription: ${description}\n---\n\n`,
'utf8',
);
return skillPath;
};
const writeClaudePluginManifest = async (
installPath: string,
name: string,
): Promise<void> => {
const pluginConfigDir = path.join(installPath, '.claude-plugin');
await fs.mkdir(pluginConfigDir, { recursive: true });
await fs.writeFile(
path.join(pluginConfigDir, 'plugin.json'),
JSON.stringify(
{
name,
version: '0.1.0',
description: `${name} test plugin`,
},
null,
2,
),
'utf8',
);
};
const writeClaudePluginCommand = async (
commandsRoot: string,
commandName: string,
description: string,
): Promise<string> => {
await fs.mkdir(commandsRoot, { recursive: true });
const commandPath = path.join(commandsRoot, `${commandName}.md`);
await fs.writeFile(
commandPath,
`---\ndescription: ${description}\nargument-hint: 'test args'\n---\n\nCommand body.\n`,
'utf8',
);
return commandPath;
};
/**
* This test covers Claude user/project skill folders plus plugin discovery from
* installed plugin command files and fallback plugin skill files.
*/
test('providerSkillsService lists claude user, project, and enabled plugin skills', { concurrency: false }, async () => {
const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), 'llm-skills-claude-'));
const workspacePath = path.join(tempRoot, 'workspace');
const commandPluginInstallPath = path.join(
tempRoot,
'.claude',
'plugins',
'cache',
'notion-plugin',
'notion',
'abc123',
);
const skillPluginInstallPath = path.join(
tempRoot,
'.claude',
'plugins',
'cache',
'anthropic-agent-skills',
'example-skills',
'def456',
);
const disabledPluginInstallPath = path.join(
tempRoot,
'.claude',
'plugins',
'cache',
'disabled-marketplace',
'disabled-skills',
'ghi789',
);
const emptyIdPluginInstallPath = path.join(
tempRoot,
'.claude',
'plugins',
'cache',
'invalid-empty-plugin',
'empty',
'000',
);
const atIdPluginInstallPath = path.join(
tempRoot,
'.claude',
'plugins',
'cache',
'invalid-at-plugin',
'at',
'000',
);
const siblingSkillPluginPath = path.join(path.dirname(skillPluginInstallPath), 'legacy777');
await fs.mkdir(workspacePath, { recursive: true });
const restoreHomeDir = patchHomeDir(tempRoot);
try {
await writeSkill(
path.join(tempRoot, '.claude', 'skills'),
'claude-user-dir',
'claude-user',
'Claude user skill',
);
await writeSkill(
path.join(workspacePath, '.claude', 'skills'),
'claude-project-dir',
'claude-project',
'Claude project skill',
);
await writeClaudePluginManifest(commandPluginInstallPath, 'Notion');
await writeClaudePluginCommand(
path.join(commandPluginInstallPath, 'commands'),
'insert-row',
'Insert a Notion database row',
);
await writeSkill(
path.join(commandPluginInstallPath, 'skills'),
'ignored-command-plugin-skill-dir',
'ignored-command-plugin-skill',
'Command plugin fallback skill should be ignored',
);
await writeClaudePluginManifest(skillPluginInstallPath, 'ExampleSkills');
await writeSkill(
path.join(skillPluginInstallPath, 'skills'),
'claude-plugin-dir',
'claude-plugin',
'Claude plugin skill',
);
await writeSkill(
path.join(skillPluginInstallPath, 'skills'),
'claude-plugin-second-dir',
'claude-plugin-second',
'Second Claude plugin skill',
);
await writeSkill(
path.join(skillPluginInstallPath, 'skills', 'nested', 'collection'),
'claude-plugin-nested-dir',
'claude-plugin-nested',
'Nested Claude plugin skill',
);
await writeSkill(
path.join(siblingSkillPluginPath, 'skills'),
'claude-plugin-sibling-dir',
'claude-plugin-sibling',
'Sibling Claude plugin skill',
);
await writeClaudePluginManifest(disabledPluginInstallPath, 'DisabledSkills');
await writeClaudePluginCommand(
path.join(disabledPluginInstallPath, 'commands'),
'disabled-command',
'Disabled plugin command',
);
await writeClaudePluginCommand(
path.join(emptyIdPluginInstallPath, 'commands'),
'invalid-empty-command',
'Invalid empty id command',
);
await writeClaudePluginCommand(
path.join(atIdPluginInstallPath, 'commands'),
'invalid-at-command',
'Invalid at id command',
);
await writeSkill(
path.join(
disabledPluginInstallPath,
'skills',
),
'disabled-plugin-dir',
'disabled-plugin',
'Disabled plugin skill',
);
await fs.writeFile(
path.join(tempRoot, '.claude', 'settings.json'),
JSON.stringify(
{
enabledPlugins: {
'': true,
'@': true,
'notion@notion-marketplace': true,
'example-skills@anthropic-agent-skills': true,
'disabled-skills@disabled-marketplace': false,
},
},
null,
2,
),
'utf8',
);
await fs.writeFile(
path.join(tempRoot, '.claude', 'plugins', 'installed_plugins.json'),
JSON.stringify(
{
version: 2,
plugins: {
'': [
{
scope: 'user',
installPath: emptyIdPluginInstallPath,
version: '000',
},
],
'@': [
{
scope: 'user',
installPath: atIdPluginInstallPath,
version: '000',
},
],
'notion@notion-marketplace': [
{
scope: 'user',
installPath: commandPluginInstallPath,
version: 'abc123',
},
],
'example-skills@anthropic-agent-skills': [
{
scope: 'user',
installPath: skillPluginInstallPath,
version: 'def456',
},
],
'disabled-skills@disabled-marketplace': [
{
scope: 'user',
installPath: disabledPluginInstallPath,
version: 'ghi789',
},
],
},
},
null,
2,
),
'utf8',
);
const skills = await providerSkillsService.listProviderSkills('claude', { workspacePath });
const byName = new Map(skills.map((skill) => [skill.name, skill]));
assert.equal(byName.get('claude-user')?.scope, 'user');
assert.equal(byName.get('claude-user')?.command, '/claude-user');
assert.equal(byName.get('claude-project')?.scope, 'project');
assert.equal(byName.get('claude-project')?.command, '/claude-project');
const pluginCommand = byName.get('insert-row');
assert.equal(pluginCommand?.scope, 'plugin');
assert.equal(pluginCommand?.pluginName, 'Notion');
assert.equal(pluginCommand?.pluginId, 'notion@notion-marketplace');
assert.equal(pluginCommand?.command, '/Notion:insert-row');
assert.equal(pluginCommand?.description, 'Insert a Notion database row');
assert.match(pluginCommand?.sourcePath ?? '', /commands[\\/]insert-row\.md$/);
assert.equal(byName.has('ignored-command-plugin-skill'), false);
const pluginSkill = byName.get('claude-plugin');
assert.equal(pluginSkill?.scope, 'plugin');
assert.equal(pluginSkill?.pluginName, 'ExampleSkills');
assert.equal(pluginSkill?.pluginId, 'example-skills@anthropic-agent-skills');
assert.equal(pluginSkill?.command, '/ExampleSkills:claude-plugin');
assert.equal(pluginSkill?.description, 'Claude plugin skill');
assert.match(
pluginSkill?.sourcePath ?? '',
/cache[\\/]anthropic-agent-skills[\\/]example-skills[\\/]def456[\\/]skills[\\/]/,
);
const secondPluginSkill = byName.get('claude-plugin-second');
assert.equal(secondPluginSkill?.scope, 'plugin');
assert.equal(secondPluginSkill?.command, '/ExampleSkills:claude-plugin-second');
const nestedPluginSkill = byName.get('claude-plugin-nested');
assert.equal(nestedPluginSkill?.scope, 'plugin');
assert.equal(nestedPluginSkill?.command, '/ExampleSkills:claude-plugin-nested');
assert.equal(nestedPluginSkill?.description, 'Nested Claude plugin skill');
const siblingPluginSkill = byName.get('claude-plugin-sibling');
assert.equal(siblingPluginSkill?.scope, 'plugin');
assert.equal(siblingPluginSkill?.pluginName, 'example-skills');
assert.equal(siblingPluginSkill?.command, '/example-skills:claude-plugin-sibling');
assert.equal(siblingPluginSkill?.description, 'Sibling Claude plugin skill');
assert.equal(byName.has('disabled-command'), false);
assert.equal(byName.has('disabled-plugin'), false);
assert.equal(byName.has('invalid-empty-command'), false);
assert.equal(byName.has('invalid-at-command'), false);
assert.equal(skills.some((skill) => skill.command.startsWith('/:')), false);
} finally {
restoreHomeDir();
await fs.rm(tempRoot, { recursive: true, force: true });
}
});
/**
* This test covers Codex repository/user/system skill folders and verifies that
* repository lookup includes cwd, parent, and git root skill locations.
*/
test('providerSkillsService lists codex repository, user, and system skills', { concurrency: false }, async () => {
const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), 'llm-skills-codex-'));
const repoRoot = path.join(tempRoot, 'repo');
const workspacePath = path.join(repoRoot, 'packages', 'app');
await fs.mkdir(path.join(repoRoot, '.git'), { recursive: true });
await fs.mkdir(workspacePath, { recursive: true });
const restoreHomeDir = patchHomeDir(tempRoot);
try {
await writeSkill(
path.join(workspacePath, '.agents', 'skills'),
'codex-cwd-dir',
'codex-cwd',
'Codex cwd skill',
);
await writeSkill(
path.join(repoRoot, 'packages', '.agents', 'skills'),
'codex-parent-dir',
'codex-parent',
'Codex parent skill',
);
await writeSkill(
path.join(repoRoot, '.agents', 'skills'),
'codex-root-dir',
'codex-root',
'Codex root skill',
);
await writeSkill(
path.join(tempRoot, '.agents', 'skills'),
'codex-user-dir',
'codex-user',
'Codex user skill',
);
await writeSkill(
path.join(tempRoot, '.codex', 'skills', '.system'),
'codex-system-dir',
'codex-system',
'Codex system skill',
);
const skills = await providerSkillsService.listProviderSkills('codex', { workspacePath });
const byName = new Map(skills.map((skill) => [skill.name, skill]));
assert.equal(byName.get('codex-cwd')?.scope, 'repo');
assert.equal(byName.get('codex-parent')?.scope, 'repo');
assert.equal(byName.get('codex-root')?.scope, 'repo');
assert.equal(byName.get('codex-user')?.scope, 'user');
assert.equal(byName.get('codex-system')?.scope, 'system');
assert.equal(byName.get('codex-root')?.command, '$codex-root');
} finally {
restoreHomeDir();
await fs.rm(tempRoot, { recursive: true, force: true });
}
});
/**
* This test covers Gemini and Cursor skill directory rules, including shared
* `.agents/skills` project support.
*/
test('providerSkillsService lists gemini and cursor skills from their configured directories', { concurrency: false }, async () => {
const tempRoot = await fs.mkdtemp(path.join(os.tmpdir(), 'llm-skills-gc-'));
const workspacePath = path.join(tempRoot, 'workspace');
await fs.mkdir(workspacePath, { recursive: true });
const restoreHomeDir = patchHomeDir(tempRoot);
try {
await writeSkill(
path.join(tempRoot, '.gemini', 'skills'),
'gemini-user-dir',
'gemini-user',
'Gemini user skill',
);
await writeSkill(
path.join(tempRoot, '.agents', 'skills'),
'agents-user-dir',
'agents-user',
'Agents user skill',
);
await writeSkill(
path.join(workspacePath, '.gemini', 'skills'),
'gemini-project-dir',
'gemini-project',
'Gemini project skill',
);
await writeSkill(
path.join(workspacePath, '.agents', 'skills'),
'agents-project-dir',
'agents-project',
'Agents project skill',
);
await writeSkill(
path.join(workspacePath, '.cursor', 'skills'),
'cursor-project-dir',
'cursor-project',
'Cursor project skill',
);
await writeSkill(
path.join(tempRoot, '.cursor', 'skills'),
'cursor-user-dir',
'cursor-user',
'Cursor user skill',
);
const geminiSkills = await providerSkillsService.listProviderSkills('gemini', { workspacePath });
const geminiByName = new Map(geminiSkills.map((skill) => [skill.name, skill]));
assert.equal(geminiByName.get('gemini-user')?.scope, 'user');
assert.equal(geminiByName.get('agents-user')?.scope, 'user');
assert.equal(geminiByName.get('gemini-project')?.scope, 'project');
assert.equal(geminiByName.get('agents-project')?.scope, 'project');
assert.equal(geminiByName.get('gemini-project')?.command, '/gemini-project');
const cursorSkills = await providerSkillsService.listProviderSkills('cursor', { workspacePath });
const cursorByName = new Map(cursorSkills.map((skill) => [skill.name, skill]));
assert.equal(cursorByName.get('agents-project')?.scope, 'project');
assert.equal(cursorByName.get('cursor-project')?.scope, 'project');
assert.equal(cursorByName.get('cursor-user')?.scope, 'user');
assert.equal(cursorByName.get('cursor-user')?.command, '/cursor-user');
} finally {
restoreHomeDir();
await fs.rm(tempRoot, { recursive: true, force: true });
}
});


@@ -0,0 +1,267 @@
# WebSocket Module
This module owns the server-side WebSocket gateway used by:
1. Chat streaming (`/ws`)
2. Interactive terminal sessions (`/shell`)
3. Plugin WebSocket passthrough (`/plugin-ws/:pluginName`)
It is intentionally structured as **small services** plus a **barrel export** in `index.ts`.
## Public API
`server/modules/websocket/index.ts` exports:
1. `createWebSocketServer(server, dependencies)`
Creates and wires the shared `ws` server.
2. `connectedClients` and `WS_OPEN_STATE`
Shared chat client registry and open-state constant used by other modules.
## Why Dependency Injection Is Used
The module receives runtime-specific functions from `server/index.js` instead of importing legacy runtime files directly.
Benefits:
1. Keeps module boundaries clean (`server/modules/*` architecture rule).
2. Makes each service easier to test in isolation.
3. Keeps WebSocket transport concerns separate from provider runtime concerns.
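The injected-dependency pattern can be sketched in miniature. This is a hypothetical reduction, not the real `WebSocketServerDependencies` type (which carries many more fields); it keeps only the routing concern so the shape of the pattern is visible in isolation:

```typescript
// Hypothetical sketch of the dependency-injection shape described above.
// Handler names mirror the module's routes; the types are illustrative.
type RouteHandler = (socket: unknown) => void;

interface WebSocketGatewayDeps {
  chat: RouteHandler;
  shell: RouteHandler;
  pluginProxy: (socket: unknown, pathname: string) => void;
}

// Routes an accepted connection by pathname. Returns the branch taken so
// callers (and tests) can observe which injected handler was invoked.
function routeConnection(
  pathname: string,
  socket: unknown,
  deps: WebSocketGatewayDeps
): 'chat' | 'shell' | 'plugin' | 'closed' {
  if (pathname === '/ws') {
    deps.chat(socket);
    return 'chat';
  }
  if (pathname === '/shell') {
    deps.shell(socket);
    return 'shell';
  }
  if (pathname.startsWith('/plugin-ws/')) {
    deps.pluginProxy(socket, pathname);
    return 'plugin';
  }
  return 'closed'; // unknown paths are closed immediately
}
```

Because the handlers arrive as plain functions, a test can pass stubs and assert on the branch taken without touching `server/index.js` or any provider runtime.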
## Service Map
| File | Responsibility |
|---|---|
| `services/websocket-server.service.ts` | Creates `WebSocketServer`, binds `verifyClient`, routes connection by pathname |
| `services/websocket-auth.service.ts` | Authenticates upgrade requests and attaches `request.user` |
| `services/chat-websocket.service.ts` | Handles `/ws` chat protocol and provider command/session control messages |
| `services/shell-websocket.service.ts` | Handles `/shell` PTY lifecycle, reconnect buffering, auth URL detection |
| `services/plugin-websocket-proxy.service.ts` | Bridges client socket to plugin socket |
| `services/websocket-writer.service.ts` | Adapts raw WebSocket to writer interface (`send`, `setSessionId`, `getSessionId`) |
| `services/websocket-state.service.ts` | Holds shared chat client set and open-state constant |
## High-Level Architecture
```mermaid
flowchart LR
A[HTTP Server] --> B[createWebSocketServer]
B --> C[verifyWebSocketClient]
B --> D{Pathname}
D -->|/ws| E[handleChatConnection]
D -->|/shell| F[handleShellConnection]
D -->|/plugin-ws/:name| G[handlePluginWsProxy]
D -->|other| H["close()"]
E --> I[connectedClients Set]
E --> J[WebSocketWriter]
F --> K[ptySessionsMap]
G --> L[Upstream Plugin ws://127.0.0.1:port/ws]
I --> M[projects.service broadcastProgress]
I --> N[sessions-watcher.service projects_updated]
```
## Connection Handshake + Routing
```mermaid
sequenceDiagram
participant Client
participant WSS as WebSocketServer
participant Auth as verifyWebSocketClient
participant Router as connection router
participant Chat as /ws handler
participant Shell as /shell handler
participant Proxy as /plugin-ws handler
Client->>WSS: Upgrade Request
WSS->>Auth: verifyClient(info)
alt Platform mode
Auth->>Auth: authenticateWebSocket(null)
Auth->>Auth: attach request.user
else OSS mode
Auth->>Auth: read token from ?token or Authorization
Auth->>Auth: authenticateWebSocket(token)
Auth->>Auth: attach request.user
end
alt Auth failed
Auth-->>WSS: false (reject handshake)
else Auth ok
Auth-->>WSS: true
WSS->>Router: on("connection", ws, request)
alt pathname == /ws
Router->>Chat: handleChatConnection(ws, request, deps.chat)
else pathname == /shell
Router->>Shell: handleShellConnection(ws, deps.shell)
else pathname startsWith /plugin-ws/
Router->>Proxy: handlePluginWsProxy(ws, pathname, getPluginPort)
else unknown
Router->>Router: ws.close()
end
end
```
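The OSS-mode token extraction shown in the handshake diagram can be sketched as follows. The `?token` query parameter and the `Authorization` header come from the diagram; the function name and everything else here is illustrative, not the actual auth service code:

```typescript
// Hypothetical sketch: read an upgrade token from the request URL's ?token
// query parameter, falling back to a bearer Authorization header.
function readUpgradeToken(
  url: string,
  headers: Record<string, string | undefined>
): string | null {
  // The base URL only exists to make relative upgrade paths parseable.
  const query = new URL(url, 'http://localhost').searchParams.get('token');
  if (query) return query;
  const auth = headers['authorization'];
  if (auth?.startsWith('Bearer ')) {
    return auth.slice('Bearer '.length);
  }
  return null;
}
```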
## `/ws` Chat Flow
When a chat socket connects:
1. Add socket to `connectedClients`.
2. Build `WebSocketWriter` (captures `userId` from authenticated request).
3. Parse each incoming message with `parseIncomingJsonObject`.
4. Dispatch by `data.type`.
5. On close, remove socket from `connectedClients`.
### Chat Message Dispatch
```mermaid
flowchart TD
A[Incoming WS message] --> B[parseIncomingJsonObject]
B -->|invalid| C["send {type: error}"]
B -->|ok| D{data.type}
D -->|claude-command| E[queryClaudeSDK]
D -->|cursor-command| F[spawnCursor]
D -->|codex-command| G[queryCodex]
D -->|gemini-command| H[spawnGemini]
D -->|cursor-resume| I[spawnCursor resume]
D -->|abort-session| J[abort by provider]
D -->|claude-permission-response| K[resolveToolApproval]
D -->|cursor-abort| L[abortCursorSession]
D -->|check-session-status| M[is*SessionActive + optional reconnectSessionWriter]
D -->|get-pending-permissions| N[getPendingApprovalsForSession]
D -->|get-active-sessions| O[getActive*Sessions]
```
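The dispatch diagram above can be condensed into a lookup table. The real handler uses an if-chain over `data.type`; this hypothetical table restates the same mapping from incoming message types to the injected dependency each one invokes:

```typescript
// Hypothetical condensed view of the chat message dispatch. The dependency
// names on the right are the ones the real handler calls; entries with `*`
// are resolved per provider at runtime.
const chatDispatchTable: Record<string, string> = {
  'claude-command': 'queryClaudeSDK',
  'cursor-command': 'spawnCursor',
  'codex-command': 'queryCodex',
  'gemini-command': 'spawnGemini',
  'cursor-resume': 'spawnCursor',
  'abort-session': 'abort*Session',
  'claude-permission-response': 'resolveToolApproval',
  'cursor-abort': 'abortCursorSession',
  'check-session-status': 'is*SessionActive',
  'get-pending-permissions': 'getPendingApprovalsForSession',
  'get-active-sessions': 'getActive*Sessions',
};

// Resolves the dependency a message type maps to, or null for unknown types.
function resolveChatHandler(messageType: string): string | null {
  return chatDispatchTable[messageType] ?? null;
}
```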
### Chat Notes
1. `abort-session` returns a normalized `complete` message with `aborted: true`.
2. `check-session-status` returns `{ type: "session-status", isProcessing }`.
3. Claude status checks can reconnect output stream to the new socket via `reconnectSessionWriter`.
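The reply shape from note 2 can be sketched as a small builder, mirroring the payload the chat handler sends (`isSessionActive` here stands in for the injected `is*SessionActive` checks):

```typescript
// Sketch of the `check-session-status` reply described above. The field
// names match the handler's actual payload; the builder itself is
// illustrative.
function buildSessionStatusReply(
  sessionId: string,
  provider: string,
  isSessionActive: (id: string) => boolean
): { type: string; sessionId: string; provider: string; isProcessing: boolean } {
  return {
    type: 'session-status',
    sessionId,
    provider,
    isProcessing: isSessionActive(sessionId),
  };
}
```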
## `/shell` Terminal Flow
The shell handler manages persistent PTY sessions keyed by:
`<projectPath>_<sessionIdOrDefault>[_cmd_<hash>]`
This enables reconnect behavior and isolates command-specific plain-shell sessions.
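The key construction can be sketched directly from the handler's logic: plain-shell sessions that run a specific command get a `_cmd_<hash>` suffix so they never collide with the project's default terminal session.

```typescript
// Sketch mirroring the session-key logic in shell-websocket.service.ts.
function buildPtySessionKey(
  projectPath: string,
  sessionId: string | null,
  initialCommand: string,
  isPlainShell: boolean
): string {
  // Plain-shell sessions with a command get a short base64-derived suffix.
  const commandSuffix =
    isPlainShell && initialCommand
      ? `_cmd_${Buffer.from(initialCommand).toString('base64').slice(0, 16)}`
      : '';
  return `${projectPath}_${sessionId ?? 'default'}${commandSuffix}`;
}
```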
### Shell Lifecycle
```mermaid
stateDiagram-v2
[*] --> WaitingInit
WaitingInit --> ValidateInit: message.type == init
ValidateInit --> ReconnectExisting: session key exists and not login reset
ValidateInit --> SpawnNewPTY: valid path + valid sessionId
ValidateInit --> EmitError: invalid payload/path/sessionId
ReconnectExisting --> Running: attach ws, replay buffer
SpawnNewPTY --> Running: pty.spawn + wire onData/onExit
Running --> Running: input -> pty.write
Running --> Running: resize -> pty.resize
Running --> Running: onData -> buffer + output + auth_url detection
Running --> Exited: onExit
Running --> Detached: ws close
Detached --> Running: reconnect before timeout
Detached --> Killed: timeout reached -> pty.kill
Exited --> [*]
Killed --> [*]
EmitError --> WaitingInit
```
### Shell Behaviors in Detail
1. `init`:
Reads `projectPath`, `sessionId`, `provider`, `hasSession`, `initialCommand`, `isPlainShell`.
2. Login reset:
For login-like commands, existing keyed PTY session is killed and recreated.
3. Validation:
Path must exist and be a directory; `sessionId` must match safe pattern.
4. Command build:
Provider-specific command construction with resume semantics.
5. PTY output buffering:
Stores up to 5000 chunks for replay on reconnect.
6. URL detection:
Strips ANSI, accumulates text buffer, extracts URLs, emits `auth_url` once per normalized URL, supports `autoOpen`.
7. Close behavior:
Socket disconnect does not instantly kill PTY; session is kept alive and terminated on timeout.
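The URL de-duplication inside step 6 is worth isolating, since it mirrors the handler's filter exactly: when one detected URL is a strict prefix of another (for example a truncated copy emitted earlier in the stream), only the longer URL survives.

```typescript
// Sketch of the detected-URL de-duplication used during auth URL detection.
// Duplicates are removed first, then any URL that is a prefix of a longer
// detected URL is dropped in favor of the longer one.
function dedupeDetectedUrls(urls: string[]): string[] {
  return Array.from(new Set(urls)).filter(
    (url, _, all) => !all.some((other) => other !== url && other.startsWith(url))
  );
}
```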
## `/plugin-ws/:pluginName` Proxy Flow
```mermaid
sequenceDiagram
participant Client
participant Proxy as handlePluginWsProxy
participant PM as getPluginPort
participant Upstream as Plugin WS
Client->>Proxy: Connect /plugin-ws/:name
Proxy->>Proxy: Validate pluginName regex
alt Invalid name
Proxy-->>Client: close(4400, "Invalid plugin name")
else Valid
Proxy->>PM: getPluginPort(name)
alt Plugin not running
Proxy-->>Client: close(4404, "Plugin not running")
else Port found
Proxy->>Upstream: new WebSocket(ws://127.0.0.1:port/ws)
Client-->>Upstream: relay messages bidirectionally
Upstream-->>Client: relay messages bidirectionally
Upstream-->>Client: close propagation
Client-->>Upstream: close propagation
Upstream-->>Client: close(4502, "Upstream error") on upstream error
end
end
```
## Shared Client Registry and Broadcasts
Only chat sockets (`/ws`) are tracked in `connectedClients`.
That shared set is consumed by:
1. `modules/projects/services/projects-with-sessions-fetch.service.ts`
Broadcasts `loading_progress` while project snapshots are being built.
2. `modules/providers/services/sessions-watcher.service.ts`
Broadcasts `projects_updated` when provider session artifacts change.
This design centralizes cross-module real-time fan-out without requiring route-local references to WebSocket internals.
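The fan-out pattern those services follow can be sketched as below. This is an illustrative reduction, assuming a socket-like object with `readyState` and `send`; `WS_OPEN_STATE_SKETCH` stands in for the module's exported `WS_OPEN_STATE` constant:

```typescript
// Hypothetical sketch of broadcast fan-out over the shared chat-socket set:
// serialize once, send only to sockets still open, report delivery count.
const WS_OPEN_STATE_SKETCH = 1; // ws.WebSocket.OPEN

interface BroadcastSocket {
  readyState: number;
  send: (payload: string) => void;
}

function broadcastToClients(
  clients: Set<BroadcastSocket>,
  message: { type: string; [key: string]: unknown }
): number {
  let delivered = 0;
  const payload = JSON.stringify(message);
  for (const client of clients) {
    if (client.readyState === WS_OPEN_STATE_SKETCH) {
      client.send(payload);
      delivered += 1;
    }
  }
  return delivered;
}
```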
## Writer Adapter (`WebSocketWriter`)
`WebSocketWriter` normalizes chat transport behavior to match existing writer-style interfaces used elsewhere.
Methods:
1. `send(data)`
JSON-serializes and sends only if socket is open.
2. `setSessionId(sessionId)` / `getSessionId()`
Supports provider session bookkeeping and resume flows.
3. `updateWebSocket(newRawWs)`
Allows active session stream redirection on reconnect.
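The three behaviors above can be captured in a minimal sketch of the adapter contract. The real `WebSocketWriter` wraps a `ws` socket; this version wraps any object with `readyState`/`send` so each behavior is testable in isolation:

```typescript
// Minimal sketch of the writer adapter contract described above.
interface RawSocketLike {
  readyState: number;
  send: (payload: string) => void;
}

const OPEN = 1; // ws.WebSocket.OPEN

class WriterSketch {
  private sessionId: string | null = null;
  constructor(private raw: RawSocketLike) {}

  // JSON-serializes and sends only when the underlying socket is open.
  send(data: unknown): boolean {
    if (this.raw.readyState !== OPEN) return false;
    this.raw.send(JSON.stringify(data));
    return true;
  }

  setSessionId(sessionId: string): void {
    this.sessionId = sessionId;
  }

  getSessionId(): string | null {
    return this.sessionId;
  }

  // Redirects the active stream to a freshly reconnected socket.
  updateWebSocket(newRaw: RawSocketLike): void {
    this.raw = newRaw;
  }
}
```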
## Error Handling and Close Codes
Current explicit close codes in this module:
1. `4400`: Invalid plugin name
2. `4404`: Plugin not running
3. `4502`: Upstream plugin WebSocket error
Other errors:
1. Chat handler catches and emits `{ type: "error", error }`.
2. Shell handler catches and writes terminal-visible error output.
3. Unknown websocket paths are closed immediately.
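A client consuming the proxy can map these close codes to user-facing messages. The codes are the ones documented above; the helper name and message wording are illustrative:

```typescript
// Hypothetical client-side helper translating this module's explicit
// close codes into display strings.
function describeProxyCloseCode(code: number): string {
  switch (code) {
    case 4400:
      return 'Invalid plugin name';
    case 4404:
      return 'Plugin not running';
    case 4502:
      return 'Upstream plugin WebSocket error';
    default:
      return `Connection closed (code ${code})`;
  }
}
```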
## Extending This Module
To add a new websocket route:
1. Add a new handler service under `services/`.
2. Extend `WebSocketServerDependencies` in `websocket-server.service.ts` if needed.
3. Add a new pathname branch in the router.
4. Wire dependency injection from `server/index.js`.
5. Keep `index.ts` as barrel-only export surface.


@@ -0,0 +1,2 @@
export { WS_OPEN_STATE, connectedClients } from './services/websocket-state.service.js';
export { createWebSocketServer } from './services/websocket-server.service.js';


@@ -0,0 +1,271 @@
import type { WebSocket } from 'ws';
import { connectedClients } from '@/modules/websocket/services/websocket-state.service.js';
import { WebSocketWriter } from '@/modules/websocket/services/websocket-writer.service.js';
import type {
AnyRecord,
AuthenticatedWebSocketRequest,
LLMProvider,
} from '@/shared/types.js';
import { createNormalizedMessage, parseIncomingJsonObject } from '@/shared/utils.js';
type ChatIncomingMessage = AnyRecord & {
type?: string;
command?: string;
options?: AnyRecord;
provider?: string;
sessionId?: string;
requestId?: string;
allow?: unknown;
updatedInput?: unknown;
message?: unknown;
rememberEntry?: unknown;
};
const DEFAULT_PROVIDER: LLMProvider = 'claude';
type ChatWebSocketDependencies = {
queryClaudeSDK: (command: string, options: unknown, writer: WebSocketWriter) => Promise<unknown>;
spawnCursor: (command: string, options: unknown, writer: WebSocketWriter) => Promise<unknown>;
queryCodex: (command: string, options: unknown, writer: WebSocketWriter) => Promise<unknown>;
spawnGemini: (command: string, options: unknown, writer: WebSocketWriter) => Promise<unknown>;
abortClaudeSDKSession: (sessionId: string) => Promise<boolean>;
abortCursorSession: (sessionId: string) => boolean;
abortCodexSession: (sessionId: string) => boolean;
abortGeminiSession: (sessionId: string) => boolean;
resolveToolApproval: (
requestId: string,
payload: {
allow: boolean;
updatedInput?: unknown;
message?: string;
rememberEntry?: unknown;
}
) => void;
isClaudeSDKSessionActive: (sessionId: string) => boolean;
isCursorSessionActive: (sessionId: string) => boolean;
isCodexSessionActive: (sessionId: string) => boolean;
isGeminiSessionActive: (sessionId: string) => boolean;
reconnectSessionWriter: (sessionId: string, ws: WebSocket) => boolean;
getPendingApprovalsForSession: (sessionId: string) => unknown[];
getActiveClaudeSDKSessions: () => unknown;
getActiveCursorSessions: () => unknown;
getActiveCodexSessions: () => unknown;
getActiveGeminiSessions: () => unknown;
};
/**
* Normalizes potentially invalid provider names coming from websocket payloads.
*/
function readProvider(value: unknown): LLMProvider {
if (value === 'claude' || value === 'cursor' || value === 'codex' || value === 'gemini') {
return value;
}
return DEFAULT_PROVIDER;
}
/**
* Extracts the authenticated request user id in the formats currently produced
* by platform and OSS auth code paths.
*/
function readRequestUserId(
request: AuthenticatedWebSocketRequest | undefined
): string | number | null {
const user = request?.user;
if (!user) {
return null;
}
if (typeof user.id === 'string' || typeof user.id === 'number') {
return user.id;
}
if (typeof user.userId === 'string' || typeof user.userId === 'number') {
return user.userId;
}
return null;
}
/**
* Handles authenticated chat websocket messages used by the main chat panel.
*/
export function handleChatConnection(
ws: WebSocket,
request: AuthenticatedWebSocketRequest,
dependencies: ChatWebSocketDependencies
): void {
console.log('[INFO] Chat WebSocket connected');
connectedClients.add(ws);
const writer = new WebSocketWriter(ws, readRequestUserId(request));
ws.on('message', async (rawMessage) => {
try {
const parsed = parseIncomingJsonObject(rawMessage);
if (!parsed) {
throw new Error('Invalid websocket payload');
}
const data = parsed as ChatIncomingMessage;
const messageType = data.type;
if (!messageType) {
throw new Error('Message type is required');
}
if (messageType === 'claude-command') {
await dependencies.queryClaudeSDK(data.command ?? '', data.options, writer);
return;
}
if (messageType === 'cursor-command') {
await dependencies.spawnCursor(data.command ?? '', data.options, writer);
return;
}
if (messageType === 'codex-command') {
await dependencies.queryCodex(data.command ?? '', data.options, writer);
return;
}
if (messageType === 'gemini-command') {
await dependencies.spawnGemini(data.command ?? '', data.options, writer);
return;
}
if (messageType === 'cursor-resume') {
await dependencies.spawnCursor(
'',
{
sessionId: data.sessionId,
resume: true,
cwd: data.options?.cwd,
},
writer
);
return;
}
if (messageType === 'abort-session') {
const provider = readProvider(data.provider);
const sessionId = typeof data.sessionId === 'string' ? data.sessionId : '';
let success = false;
if (provider === 'cursor') {
success = dependencies.abortCursorSession(sessionId);
} else if (provider === 'codex') {
success = dependencies.abortCodexSession(sessionId);
} else if (provider === 'gemini') {
success = dependencies.abortGeminiSession(sessionId);
} else {
success = await dependencies.abortClaudeSDKSession(sessionId);
}
writer.send(
createNormalizedMessage({
kind: 'complete',
exitCode: success ? 0 : 1,
aborted: true,
success,
sessionId,
provider,
})
);
return;
}
if (messageType === 'claude-permission-response') {
if (typeof data.requestId === 'string' && data.requestId.length > 0) {
dependencies.resolveToolApproval(data.requestId, {
allow: Boolean(data.allow),
updatedInput: data.updatedInput,
message: typeof data.message === 'string' ? data.message : undefined,
rememberEntry: data.rememberEntry,
});
}
return;
}
if (messageType === 'cursor-abort') {
const sessionId = typeof data.sessionId === 'string' ? data.sessionId : '';
const success = dependencies.abortCursorSession(sessionId);
writer.send(
createNormalizedMessage({
kind: 'complete',
exitCode: success ? 0 : 1,
aborted: true,
success,
sessionId,
provider: 'cursor',
})
);
return;
}
if (messageType === 'check-session-status') {
const provider = readProvider(data.provider);
const sessionId = typeof data.sessionId === 'string' ? data.sessionId : '';
let isActive = false;
if (provider === 'cursor') {
isActive = dependencies.isCursorSessionActive(sessionId);
} else if (provider === 'codex') {
isActive = dependencies.isCodexSessionActive(sessionId);
} else if (provider === 'gemini') {
isActive = dependencies.isGeminiSessionActive(sessionId);
} else {
isActive = dependencies.isClaudeSDKSessionActive(sessionId);
if (isActive) {
dependencies.reconnectSessionWriter(sessionId, ws);
}
}
writer.send({
type: 'session-status',
sessionId,
provider,
isProcessing: isActive,
});
return;
}
if (messageType === 'get-pending-permissions') {
const sessionId = typeof data.sessionId === 'string' ? data.sessionId : '';
if (sessionId && dependencies.isClaudeSDKSessionActive(sessionId)) {
const pending = dependencies.getPendingApprovalsForSession(sessionId);
writer.send({
type: 'pending-permissions-response',
sessionId,
data: pending,
});
}
return;
}
if (messageType === 'get-active-sessions') {
writer.send({
type: 'active-sessions',
sessions: {
claude: dependencies.getActiveClaudeSDKSessions(),
cursor: dependencies.getActiveCursorSessions(),
codex: dependencies.getActiveCodexSessions(),
gemini: dependencies.getActiveGeminiSessions(),
},
});
}
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.error('[ERROR] Chat WebSocket error:', message);
writer.send({
type: 'error',
error: message,
});
}
});
ws.on('close', () => {
console.log('[INFO] Chat client disconnected');
connectedClients.delete(ws);
});
}


@@ -0,0 +1,65 @@
import { WebSocket } from 'ws';
/**
* Proxies an authenticated client websocket to a plugin websocket endpoint.
*/
export function handlePluginWsProxy(
clientWs: WebSocket,
pathname: string,
getPluginPort: (pluginName: string) => number | null
): void {
const pluginName = pathname.replace('/plugin-ws/', '');
if (!pluginName || /[^a-zA-Z0-9_-]/.test(pluginName)) {
clientWs.close(4400, 'Invalid plugin name');
return;
}
const port = getPluginPort(pluginName);
if (!port) {
clientWs.close(4404, 'Plugin not running');
return;
}
const upstream = new WebSocket(`ws://127.0.0.1:${port}/ws`);
upstream.on('open', () => {
console.log(`[Plugins] WS proxy connected to "${pluginName}" on port ${port}`);
});
upstream.on('message', (data) => {
if (clientWs.readyState === WebSocket.OPEN) {
clientWs.send(data);
}
});
clientWs.on('message', (data) => {
if (upstream.readyState === WebSocket.OPEN) {
upstream.send(data);
}
});
upstream.on('close', () => {
if (clientWs.readyState === WebSocket.OPEN) {
clientWs.close();
}
});
clientWs.on('close', () => {
if (upstream.readyState === WebSocket.OPEN) {
upstream.close();
}
});
upstream.on('error', (error) => {
console.error(`[Plugins] WS proxy error for "${pluginName}":`, error.message);
if (clientWs.readyState === WebSocket.OPEN) {
clientWs.close(4502, 'Upstream error');
}
});
clientWs.on('error', () => {
if (upstream.readyState === WebSocket.OPEN) {
upstream.close();
}
});
}


@@ -0,0 +1,453 @@
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import pty, { type IPty } from 'node-pty';
import { WebSocket, type RawData } from 'ws';
import { parseIncomingJsonObject } from '@/shared/utils.js';
type ShellIncomingMessage = {
type?: string;
data?: string;
cols?: number;
rows?: number;
projectPath?: string;
sessionId?: string;
hasSession?: boolean;
provider?: string;
initialCommand?: string;
isPlainShell?: boolean;
};
type PtySessionEntry = {
pty: IPty;
ws: WebSocket | null;
buffer: string[];
timeoutId: NodeJS.Timeout | null;
projectPath: string;
sessionId: string | null;
};
const ptySessionsMap = new Map<string, PtySessionEntry>();
const PTY_SESSION_TIMEOUT = 30 * 60 * 1000;
const SHELL_URL_PARSE_BUFFER_LIMIT = 32768;
type ShellWebSocketDependencies = {
getSessionById: (sessionId: string) => { cliSessionId?: string } | null | undefined;
stripAnsiSequences: (content: string) => string;
normalizeDetectedUrl: (url: string) => string | null;
extractUrlsFromText: (content: string) => string[];
shouldAutoOpenUrlFromOutput: (content: string) => boolean;
};
/**
* Reads a string field from untyped payloads and falls back when absent.
*/
function readString(value: unknown, fallback = ''): string {
return typeof value === 'string' ? value : fallback;
}
/**
* Reads a boolean field from untyped payloads and falls back when absent.
*/
function readBoolean(value: unknown, fallback = false): boolean {
return typeof value === 'boolean' ? value : fallback;
}
/**
* Reads a finite number field from untyped payloads and falls back when absent.
*/
function readNumber(value: unknown, fallback: number): number {
return typeof value === 'number' && Number.isFinite(value) ? value : fallback;
}
/**
* Parses incoming websocket shell messages and keeps processing safe when
* malformed payloads are received.
*/
function parseShellMessage(rawMessage: RawData): ShellIncomingMessage | null {
const payload = parseIncomingJsonObject(rawMessage);
if (!payload) {
return null;
}
return payload as ShellIncomingMessage;
}
/**
* Resolves provider command line for plain shell and agent-backed shell modes.
*/
function buildShellCommand(
message: ShellIncomingMessage,
dependencies: ShellWebSocketDependencies
): string {
const hasSession = readBoolean(message.hasSession);
const sessionId = readString(message.sessionId);
const initialCommand = readString(message.initialCommand);
const provider = readString(message.provider, 'claude');
const safeSessionIdPattern = /^[a-zA-Z0-9_.\-:]+$/;
const isPlainShell =
readBoolean(message.isPlainShell) ||
(!!initialCommand && !hasSession) ||
provider === 'plain-shell';
if (isPlainShell) {
return initialCommand;
}
if (provider === 'cursor') {
if (hasSession && sessionId) {
return `cursor-agent --resume="${sessionId}"`;
}
return 'cursor-agent';
}
if (provider === 'codex') {
if (hasSession && sessionId) {
if (os.platform() === 'win32') {
return `codex resume "${sessionId}"; if ($LASTEXITCODE -ne 0) { codex }`;
}
return `codex resume "${sessionId}" || codex`;
}
return 'codex';
}
if (provider === 'gemini') {
const command = initialCommand || 'gemini';
let resumeId = sessionId;
if (hasSession && sessionId) {
try {
const existingSession = dependencies.getSessionById(sessionId);
if (existingSession && existingSession.cliSessionId) {
resumeId = existingSession.cliSessionId;
if (!safeSessionIdPattern.test(resumeId)) {
resumeId = '';
}
}
} catch (error) {
console.error('Failed to get Gemini CLI session ID:', error);
}
}
if (hasSession && resumeId) {
return `${command} --resume "${resumeId}"`;
}
return command;
}
const command = initialCommand || 'claude';
if (hasSession && sessionId) {
if (os.platform() === 'win32') {
return `claude --resume "${sessionId}"; if ($LASTEXITCODE -ne 0) { claude }`;
}
return `claude --resume "${sessionId}" || claude`;
}
return command;
}
/**
* Handles websocket connections used by the standalone shell terminal UI.
*/
export function handleShellConnection(
ws: WebSocket,
dependencies: ShellWebSocketDependencies
): void {
console.log('[INFO] Shell websocket connected');
let shellProcess: IPty | null = null;
let ptySessionKey: string | null = null;
let urlDetectionBuffer = '';
const announcedAuthUrls = new Set<string>();
ws.on('message', async (rawMessage) => {
try {
const data = parseShellMessage(rawMessage);
if (!data?.type) {
throw new Error('Invalid websocket payload');
}
if (data.type === 'init') {
const projectPath = readString(data.projectPath, process.cwd());
const sessionId = readString(data.sessionId) || null;
const hasSession = readBoolean(data.hasSession);
const provider = readString(data.provider, 'claude');
const initialCommand = readString(data.initialCommand);
const isPlainShell =
readBoolean(data.isPlainShell) ||
(!!initialCommand && !hasSession) ||
provider === 'plain-shell';
urlDetectionBuffer = '';
announcedAuthUrls.clear();
const isLoginCommand =
!!initialCommand &&
(initialCommand.includes('setup-token') ||
initialCommand.includes('cursor-agent login') ||
initialCommand.includes('auth login'));
const commandSuffix =
isPlainShell && initialCommand
? `_cmd_${Buffer.from(initialCommand).toString('base64').slice(0, 16)}`
: '';
ptySessionKey = `${projectPath}_${sessionId ?? 'default'}${commandSuffix}`;
if (isLoginCommand) {
const oldSession = ptySessionsMap.get(ptySessionKey);
if (oldSession) {
if (oldSession.timeoutId) {
clearTimeout(oldSession.timeoutId);
}
oldSession.pty.kill();
ptySessionsMap.delete(ptySessionKey);
}
}
const existingSession = isLoginCommand ? null : ptySessionsMap.get(ptySessionKey);
if (existingSession) {
shellProcess = existingSession.pty;
if (existingSession.timeoutId) {
clearTimeout(existingSession.timeoutId);
}
ws.send(
JSON.stringify({
type: 'output',
data: '\x1b[36m[Reconnected to existing session]\x1b[0m\r\n',
})
);
if (existingSession.buffer.length > 0) {
existingSession.buffer.forEach((bufferedData) => {
ws.send(
JSON.stringify({
type: 'output',
data: bufferedData,
})
);
});
}
existingSession.ws = ws;
return;
}
const resolvedProjectPath = path.resolve(projectPath);
try {
const stats = fs.statSync(resolvedProjectPath);
if (!stats.isDirectory()) {
throw new Error('Not a directory');
}
} catch {
ws.send(JSON.stringify({ type: 'error', message: 'Invalid project path' }));
return;
}
const safeSessionIdPattern = /^[a-zA-Z0-9_.\-:]+$/;
if (sessionId && !safeSessionIdPattern.test(sessionId)) {
ws.send(JSON.stringify({ type: 'error', message: 'Invalid session ID' }));
return;
}
const shellCommand = buildShellCommand(data, dependencies);
const shell = os.platform() === 'win32' ? 'powershell.exe' : 'bash';
const shellArgs =
os.platform() === 'win32' ? ['-Command', shellCommand] : ['-c', shellCommand];
const termCols = readNumber(data.cols, 80);
const termRows = readNumber(data.rows, 24);
shellProcess = pty.spawn(shell, shellArgs, {
name: 'xterm-256color',
cols: termCols,
rows: termRows,
cwd: resolvedProjectPath,
env: {
...process.env,
TERM: 'xterm-256color',
COLORTERM: 'truecolor',
FORCE_COLOR: '3',
},
});
ptySessionsMap.set(ptySessionKey, {
pty: shellProcess,
ws,
buffer: [],
timeoutId: null,
projectPath,
sessionId,
});
shellProcess.onData((chunk) => {
if (!ptySessionKey) {
return;
}
const session = ptySessionsMap.get(ptySessionKey);
if (!session) {
return;
}
if (session.buffer.length < 5000) {
session.buffer.push(chunk);
} else {
session.buffer.shift();
session.buffer.push(chunk);
}
if (session.ws && session.ws.readyState === WebSocket.OPEN) {
let outputData = chunk;
const cleanChunk = dependencies.stripAnsiSequences(chunk);
urlDetectionBuffer = `${urlDetectionBuffer}${cleanChunk}`.slice(-SHELL_URL_PARSE_BUFFER_LIMIT);
outputData = outputData.replace(
/OPEN_URL:\s*(https?:\/\/[^\s\x1b\x07]+)/g,
'[INFO] Opening in browser: $1'
);
const emitAuthUrl = (detectedUrl: string, autoOpen = false) => {
const normalizedUrl = dependencies.normalizeDetectedUrl(detectedUrl);
if (!normalizedUrl) {
return;
}
const isNewUrl = !announcedAuthUrls.has(normalizedUrl);
if (isNewUrl) {
announcedAuthUrls.add(normalizedUrl);
session.ws?.send(
JSON.stringify({
type: 'auth_url',
url: normalizedUrl,
autoOpen,
})
);
}
};
const normalizedDetectedUrls = dependencies.extractUrlsFromText(urlDetectionBuffer)
.map((url) => dependencies.normalizeDetectedUrl(url))
.filter((url): url is string => Boolean(url));
const dedupedDetectedUrls = Array.from(new Set(normalizedDetectedUrls)).filter(
(url, _, urls) =>
!urls.some((otherUrl) => otherUrl !== url && otherUrl.startsWith(url))
);
dedupedDetectedUrls.forEach((url) => emitAuthUrl(url, false));
if (
dependencies.shouldAutoOpenUrlFromOutput(cleanChunk) &&
dedupedDetectedUrls.length > 0
) {
const bestUrl = dedupedDetectedUrls.reduce((longest, current) =>
current.length > longest.length ? current : longest
);
emitAuthUrl(bestUrl, true);
}
session.ws.send(
JSON.stringify({
type: 'output',
data: outputData,
})
);
}
});
shellProcess.onExit((exitCode) => {
if (!ptySessionKey) {
return;
}
const session = ptySessionsMap.get(ptySessionKey);
if (session && session.ws && session.ws.readyState === WebSocket.OPEN) {
session.ws.send(
JSON.stringify({
type: 'output',
data: `\r\n\x1b[33mProcess exited with code ${exitCode.exitCode}${
exitCode.signal != null ? ` (${exitCode.signal})` : ''
}\x1b[0m\r\n`,
})
);
}
if (session?.timeoutId) {
clearTimeout(session.timeoutId);
}
ptySessionsMap.delete(ptySessionKey);
shellProcess = null;
});
let welcomeMsg = `\x1b[36mStarting terminal in: ${projectPath}\x1b[0m\r\n`;
if (!isPlainShell) {
const providerName =
provider === 'cursor'
? 'Cursor'
: provider === 'codex'
? 'Codex'
: provider === 'gemini'
? 'Gemini'
: 'Claude';
welcomeMsg = hasSession
? `\x1b[36mResuming ${providerName} session ${sessionId} in: ${projectPath}\x1b[0m\r\n`
: `\x1b[36mStarting new ${providerName} session in: ${projectPath}\x1b[0m\r\n`;
}
ws.send(
JSON.stringify({
type: 'output',
data: welcomeMsg,
})
);
return;
}
if (data.type === 'input') {
if (shellProcess) {
shellProcess.write(readString(data.data));
}
return;
}
if (data.type === 'resize') {
if (shellProcess) {
shellProcess.resize(readNumber(data.cols, 80), readNumber(data.rows, 24));
}
}
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
console.error('[ERROR] Shell WebSocket error:', message);
if (ws.readyState === WebSocket.OPEN) {
ws.send(
JSON.stringify({
type: 'output',
data: `\r\n\x1b[31mError: ${message}\x1b[0m\r\n`,
})
);
}
}
});
ws.on('close', () => {
if (!ptySessionKey) {
return;
}
const session = ptySessionsMap.get(ptySessionKey);
if (!session) {
return;
}
session.ws = null;
session.timeoutId = setTimeout(() => {
session.pty.kill();
ptySessionsMap.delete(ptySessionKey as string);
}, PTY_SESSION_TIMEOUT);
});
ws.on('error', (error) => {
console.error('[ERROR] Shell WebSocket error:', error);
});
}
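The prefix de-duplication in the `onData` handler above — drop any detected URL that is merely a prefix of a longer one, then auto-open the longest survivor — can be isolated as a pair of pure helpers. This is a sketch with hypothetical names; the real handler inlines the logic.

```typescript
// Sketch of the URL de-duplication used in the PTY output handler.
// A URL is kept only if no other detected URL extends it, so partial
// matches produced by chunked terminal output are discarded.
function dedupeDetectedUrls(urls: string[]): string[] {
  const unique = Array.from(new Set(urls));
  return unique.filter(
    (url) => !unique.some((other) => other !== url && other.startsWith(url))
  );
}

// Mirrors the reduce() that picks the auto-open candidate: the longest URL wins.
function pickLongestUrl(urls: string[]): string | null {
  if (urls.length === 0) {
    return null;
  }
  return urls.reduce((longest, current) =>
    current.length > longest.length ? current : longest
  );
}
```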

View File

@@ -0,0 +1,54 @@
import type { VerifyClientCallbackSync } from 'ws';
import type { AuthenticatedWebSocketRequest } from '@/shared/types.js';
type WebSocketAuthDependencies = {
isPlatform: boolean;
authenticateWebSocket: (token: string | null) => {
id?: string | number;
userId?: string | number;
username?: string;
[key: string]: unknown;
} | null;
};
/**
* Authenticates websocket upgrade requests before the `connection` handler runs.
*/
export function verifyWebSocketClient(
info: Parameters<VerifyClientCallbackSync<AuthenticatedWebSocketRequest>>[0],
dependencies: WebSocketAuthDependencies
): boolean {
const request = info.req as AuthenticatedWebSocketRequest;
console.log('WebSocket connection attempt to:', request.url);
// Platform mode: use the first DB user and skip token checks.
if (dependencies.isPlatform) {
const user = dependencies.authenticateWebSocket(null);
if (!user) {
console.log('[WARN] Platform mode: No user found in database');
return false;
}
request.user = user;
console.log('[OK] Platform mode WebSocket authenticated for user:', user.username);
return true;
}
// OSS mode: read JWT from query string first, then Authorization header.
const upgradeUrl = new URL(request.url ?? '/', 'http://localhost');
const token =
upgradeUrl.searchParams.get('token') ??
request.headers.authorization?.split(' ')[1] ??
null;
const user = dependencies.authenticateWebSocket(token);
if (!user) {
console.log('[WARN] WebSocket authentication failed');
return false;
}
request.user = user;
console.log('[OK] WebSocket authenticated for user:', user.username);
return true;
}
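The two-step token lookup in the OSS branch (query-string `token` first, then the Bearer value from the `Authorization` header) can be sketched as a standalone helper; the function name is illustrative.

```typescript
// Sketch of the OSS-mode token resolution order. The base URL exists only to
// satisfy the URL constructor for relative upgrade paths; its host is ignored.
function extractWsToken(
  rawUrl: string | undefined,
  authorizationHeader: string | undefined
): string | null {
  const upgradeUrl = new URL(rawUrl ?? '/', 'http://localhost');
  return (
    upgradeUrl.searchParams.get('token') ??
    authorizationHeader?.split(' ')[1] ??
    null
  );
}
```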

View File

@@ -0,0 +1,58 @@
import type { Server as HttpServer } from 'node:http';
import { WebSocketServer, type VerifyClientCallbackSync } from 'ws';
import { handleChatConnection } from '@/modules/websocket/services/chat-websocket.service.js';
import { verifyWebSocketClient } from '@/modules/websocket/services/websocket-auth.service.js';
import { handlePluginWsProxy } from '@/modules/websocket/services/plugin-websocket-proxy.service.js';
import { handleShellConnection } from '@/modules/websocket/services/shell-websocket.service.js';
import type { AuthenticatedWebSocketRequest } from '@/shared/types.js';
type WebSocketServerDependencies = {
verifyClient: Parameters<typeof verifyWebSocketClient>[1];
chat: Parameters<typeof handleChatConnection>[2];
shell: Parameters<typeof handleShellConnection>[1];
getPluginPort: Parameters<typeof handlePluginWsProxy>[2];
};
/**
* Creates and wires the server-wide websocket gateway used for chat, shell, and
* plugin proxy routes.
*/
export function createWebSocketServer(
server: HttpServer,
dependencies: WebSocketServerDependencies
): WebSocketServer {
const wss = new WebSocketServer({
server,
verifyClient: ((
info: Parameters<VerifyClientCallbackSync<AuthenticatedWebSocketRequest>>[0]
) => verifyWebSocketClient(info, dependencies.verifyClient)),
});
wss.on('connection', (ws, request) => {
const incomingRequest = request as AuthenticatedWebSocketRequest;
const url = incomingRequest.url ?? '/';
const pathname = new URL(url, 'http://localhost').pathname;
if (pathname === '/shell') {
handleShellConnection(ws, dependencies.shell);
return;
}
if (pathname === '/ws') {
handleChatConnection(ws, incomingRequest, dependencies.chat);
return;
}
if (pathname.startsWith('/plugin-ws/')) {
handlePluginWsProxy(ws, pathname, dependencies.getPluginPort);
return;
}
console.log('[WARN] Unknown WebSocket path:', pathname);
ws.close();
});
return wss;
}
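The dispatch order above reduces to a pure routing function: exact matches for `/shell` and `/ws`, a prefix match for `/plugin-ws/`, and rejection of everything else. A sketch with an illustrative return type:

```typescript
type WsRoute = 'shell' | 'chat' | 'plugin-proxy' | 'unknown';

// Mirrors the connection handler's dispatch; query strings are stripped by
// parsing the upgrade URL before comparing pathnames.
function resolveWsRoute(rawUrl: string | undefined): WsRoute {
  const pathname = new URL(rawUrl ?? '/', 'http://localhost').pathname;
  if (pathname === '/shell') return 'shell';
  if (pathname === '/ws') return 'chat';
  if (pathname.startsWith('/plugin-ws/')) return 'plugin-proxy';
  return 'unknown';
}
```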

View File

@@ -0,0 +1,16 @@
import type { RealtimeClientConnection } from '@/shared/types.js';
/**
* Numeric readyState for an open WebSocket connection.
*
* We keep this in module state so services that broadcast updates do not need
* to import `ws` directly just to compare open/closed state.
*/
export const WS_OPEN_STATE = 1;
/**
* Shared registry of active chat WebSocket connections.
*
* Project/session services publish realtime updates by iterating this set.
*/
export const connectedClients = new Set<RealtimeClientConnection>();

View File

@@ -0,0 +1,38 @@
import { WS_OPEN_STATE } from '@/modules/websocket/services/websocket-state.service.js';
import type { RealtimeClientConnection } from '@/shared/types.js';
/**
* Thin transport adapter that gives WebSocket connections the same interface as
* SSE writers used by API routes (`send`, `setSessionId`, `getSessionId`).
*/
export class WebSocketWriter {
ws: RealtimeClientConnection;
sessionId: string | null;
userId: string | number | null;
isWebSocketWriter: boolean;
constructor(ws: RealtimeClientConnection, userId: string | number | null = null) {
this.ws = ws;
this.sessionId = null;
this.userId = userId;
this.isWebSocketWriter = true;
}
send(data: unknown): void {
if (this.ws.readyState === WS_OPEN_STATE) {
this.ws.send(JSON.stringify(data));
}
}
updateWebSocket(newRawWs: RealtimeClientConnection): void {
this.ws = newRawWs;
}
setSessionId(sessionId: string): void {
this.sessionId = sessionId;
}
getSessionId(): string | null {
return this.sessionId;
}
}
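Because the writer depends only on `readyState` and `send`, it can be exercised with a plain stub connection. A trimmed stand-in (the stub shape mirrors what `RealtimeClientConnection` is assumed to provide) shows the open-state guard: messages sent while the socket is not open are silently dropped.

```typescript
const WS_OPEN_STATE = 1;

type StubConnection = { readyState: number; send: (payload: string) => void };

// Trimmed stand-in for WebSocketWriter, enough to demonstrate the guard in
// send(): serialization and delivery happen only on an open socket.
class StubWriter {
  constructor(private ws: StubConnection) {}

  send(data: unknown): void {
    if (this.ws.readyState === WS_OPEN_STATE) {
      this.ws.send(JSON.stringify(data));
    }
  }
}
```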

View File

@@ -143,7 +143,7 @@ function transformCodexEvent(event) {
case 'thread.started':
return {
type: 'thread_started',
threadId: event.id
threadId: event.thread_id || event.id
};
case 'error':
@@ -207,7 +207,8 @@ export async function queryCodex(command, options = {}, ws) {
let codex;
let thread;
let currentSessionId = sessionId;
let capturedSessionId = sessionId;
let sessionCreatedSent = false;
let terminalFailure = null;
const abortController = new AbortController();
@@ -231,20 +232,23 @@ export async function queryCodex(command, options = {}, ws) {
thread = codex.startThread(threadOptions);
}
// Get the thread ID
currentSessionId = thread.id || sessionId || `codex-${Date.now()}`;
const registerSession = (id) => {
if (!id) {
return;
}
activeCodexSessions.set(id, {
thread,
codex,
status: 'running',
abortController,
startedAt: new Date().toISOString()
});
};
// Track the session
activeCodexSessions.set(currentSessionId, {
thread,
codex,
status: 'running',
abortController,
startedAt: new Date().toISOString()
});
// Send session created event
sendMessage(ws, createNormalizedMessage({ kind: 'session_created', newSessionId: currentSessionId, sessionId: currentSessionId, provider: 'codex' }));
// Existing sessions can be tracked immediately; new sessions are tracked after thread.started.
if (capturedSessionId) {
registerSession(capturedSessionId);
}
// Execute with streaming
const streamedTurn = await thread.runStreamed(command, {
@@ -252,11 +256,34 @@ export async function queryCodex(command, options = {}, ws) {
});
for await (const event of streamedTurn.events) {
// Capture thread/session id lazily from the stream (Codex emits this asynchronously).
if (event.type === 'thread.started') {
const discoveredSessionId = event.thread_id || event.id || null;
if (discoveredSessionId && !capturedSessionId) {
capturedSessionId = discoveredSessionId;
registerSession(capturedSessionId);
if (ws.setSessionId && typeof ws.setSessionId === 'function') {
ws.setSessionId(capturedSessionId);
}
if (!sessionId && !sessionCreatedSent) {
sessionCreatedSent = true;
sendMessage(ws, createNormalizedMessage({ kind: 'session_created', newSessionId: capturedSessionId, sessionId: capturedSessionId, provider: 'codex' }));
}
}
}
// Check if session was aborted
const session = activeCodexSessions.get(currentSessionId);
if (!session || session.status === 'aborted') {
if (abortController.signal.aborted) {
break;
}
if (capturedSessionId) {
const session = activeCodexSessions.get(capturedSessionId);
if (session?.status === 'aborted') {
break;
}
}
if (event.type === 'item.started' || event.type === 'item.updated') {
continue;
@@ -265,7 +292,7 @@ export async function queryCodex(command, options = {}, ws) {
const transformed = transformCodexEvent(event);
// Normalize the transformed event into NormalizedMessage(s) via adapter
const normalizedMsgs = sessionsService.normalizeMessage('codex', transformed, currentSessionId);
const normalizedMsgs = sessionsService.normalizeMessage('codex', transformed, capturedSessionId || sessionId || null);
for (const msg of normalizedMsgs) {
sendMessage(ws, msg);
}
@@ -275,7 +302,7 @@ export async function queryCodex(command, options = {}, ws) {
notifyRunFailed({
userId: ws?.userId || null,
provider: 'codex',
sessionId: currentSessionId,
sessionId: capturedSessionId || sessionId || null,
sessionName: sessionSummary,
error: terminalFailure
});
@@ -284,24 +311,29 @@ export async function queryCodex(command, options = {}, ws) {
// Extract and send token usage if available (normalized to match Claude format)
if (event.type === 'turn.completed' && event.usage) {
const totalTokens = (event.usage.input_tokens || 0) + (event.usage.output_tokens || 0);
sendMessage(ws, createNormalizedMessage({ kind: 'status', text: 'token_budget', tokenBudget: { used: totalTokens, total: 200000 }, sessionId: currentSessionId, provider: 'codex' }));
sendMessage(ws, createNormalizedMessage({ kind: 'status', text: 'token_budget', tokenBudget: { used: totalTokens, total: 200000 }, sessionId: capturedSessionId || sessionId || null, provider: 'codex' }));
}
}
// Send completion event
if (!terminalFailure) {
sendMessage(ws, createNormalizedMessage({ kind: 'complete', actualSessionId: thread.id, sessionId: currentSessionId, provider: 'codex' }));
sendMessage(ws, createNormalizedMessage({
kind: 'complete',
actualSessionId: capturedSessionId || thread.id || sessionId || null,
sessionId: capturedSessionId || sessionId || null,
provider: 'codex'
}));
notifyRunStopped({
userId: ws?.userId || null,
provider: 'codex',
sessionId: currentSessionId,
sessionId: capturedSessionId || sessionId || null,
sessionName: sessionSummary,
stopReason: 'completed'
});
}
} catch (error) {
const session = currentSessionId ? activeCodexSessions.get(currentSessionId) : null;
const session = capturedSessionId ? activeCodexSessions.get(capturedSessionId) : null;
const wasAborted =
session?.status === 'aborted' ||
error?.name === 'AbortError' ||
@@ -316,12 +348,12 @@ export async function queryCodex(command, options = {}, ws) {
? 'Codex CLI is not configured. Please set up authentication first.'
: error.message;
sendMessage(ws, createNormalizedMessage({ kind: 'error', content: errorContent, sessionId: currentSessionId, provider: 'codex' }));
sendMessage(ws, createNormalizedMessage({ kind: 'error', content: errorContent, sessionId: capturedSessionId || sessionId || null, provider: 'codex' }));
if (!terminalFailure) {
notifyRunFailed({
userId: ws?.userId || null,
provider: 'codex',
sessionId: currentSessionId,
sessionId: capturedSessionId || sessionId || null,
sessionName: sessionSummary,
error
});
@@ -330,8 +362,8 @@ export async function queryCodex(command, options = {}, ws) {
} finally {
// Update session status
if (currentSessionId) {
const session = activeCodexSessions.get(currentSessionId);
if (capturedSessionId) {
const session = activeCodexSessions.get(capturedSessionId);
if (session) {
session.status = session.status === 'aborted' ? 'aborted' : 'completed';
}
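The deferred-registration flow in this hunk — resumed sessions register immediately, new sessions register exactly once when `thread.started` reveals the id — reduces to a small state transition. A sketch with illustrative names (the real code also stores the thread, abort controller, and start timestamp):

```typescript
const activeSessions = new Map<string, { status: string }>();

function registerSession(id: string | null): void {
  if (!id) return;
  activeSessions.set(id, { status: 'running' });
}

// Returns the (possibly updated) captured id after a thread.started event.
// Only the first discovered id is captured and registered; later ids are ignored.
function onThreadStarted(
  event: { thread_id?: string; id?: string },
  captured: string | null
): string | null {
  const discovered = event.thread_id || event.id || null;
  if (discovered && !captured) {
    registerSession(discovered);
    return discovered;
  }
  return captured;
}
```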

File diff suppressed because it is too large


View File

@@ -4,8 +4,7 @@ import path from 'path';
import os from 'os';
import { promises as fs } from 'fs';
import crypto from 'crypto';
import { userDb, apiKeysDb, githubTokensDb } from '../database/db.js';
import { addProjectManually } from '../projects.js';
import { userDb, apiKeysDb, githubTokensDb, projectsDb } from '../modules/database/index.js';
import { queryClaudeSDK } from '../claude-sdk.js';
import { spawnCursor } from '../cursor-cli.js';
import { queryCodex } from '../openai-codex.js';
@@ -13,6 +12,7 @@ import { spawnGemini } from '../gemini-cli.js';
import { Octokit } from '@octokit/rest';
import { CLAUDE_MODELS, CURSOR_MODELS, CODEX_MODELS } from '../../shared/modelConstants.js';
import { IS_PLATFORM } from '../constants/config.js';
import { normalizeProjectPath } from '../shared/utils.js';
const router = express.Router();
@@ -890,7 +890,7 @@ router.post('/', validateExternalApiKey, async (req, res) => {
finalProjectPath = await cloneGitHubRepo(githubUrl.trim(), tokenToUse, targetPath);
} else {
// Use existing project path
finalProjectPath = path.resolve(projectPath);
finalProjectPath = normalizeProjectPath(path.resolve(projectPath));
// Verify the path exists
try {
@@ -900,19 +900,14 @@ router.post('/', validateExternalApiKey, async (req, res) => {
}
}
// Register the project (or use existing registration)
let project;
try {
project = await addProjectManually(finalProjectPath);
console.log('📦 Project registered:', project);
} catch (error) {
// If project already exists, that's fine - continue with the existing registration
if (error.message && error.message.includes('Project already configured')) {
console.log('📦 Using existing project registration for:', finalProjectPath);
project = { path: finalProjectPath };
} else {
throw error;
}
finalProjectPath = normalizeProjectPath(finalProjectPath);
// Register project path in DB (or reuse existing active registration)
const registrationResult = projectsDb.createProjectPath(finalProjectPath, null);
if (registrationResult.outcome === 'active_conflict') {
console.log('Project registration already exists for:', finalProjectPath);
} else {
console.log('Project registered:', registrationResult.project);
}
// Set up writer based on streaming mode

View File

@@ -1,9 +1,11 @@
import express from 'express';
import bcrypt from 'bcrypt';
import { userDb, db } from '../database/db.js';
import { userDb } from '../modules/database/index.js';
import { getConnection } from '../modules/database/connection.js';
import { generateToken, authenticateToken } from '../middleware/auth.js';
const router = express.Router();
const db = getConnection();
// Check auth status and setup requirements
router.get('/status', async (req, res) => {
@@ -132,4 +134,4 @@ router.post('/logout', authenticateToken, (req, res) => {
res.json({ success: true, message: 'Logged out successfully' });
});
export default router;
export default router;

View File

@@ -1,19 +0,0 @@
import express from 'express';
import { deleteCodexSession } from '../projects.js';
import { sessionNamesDb } from '../database/db.js';
const router = express.Router();
router.delete('/sessions/:sessionId', async (req, res) => {
try {
const { sessionId } = req.params;
await deleteCodexSession(sessionId);
sessionNamesDb.deleteName(sessionId, 'codex');
res.json({ success: true });
} catch (error) {
console.error(`Error deleting Codex session ${req.params.sessionId}:`, error);
res.status(500).json({ success: false, error: error.message });
}
});
export default router;

View File

@@ -1,9 +1,11 @@
import express from 'express';
import { promises as fs } from 'fs';
import path from 'path';
import os from 'os';
import path from 'path';
import express from 'express';
import { CLAUDE_MODELS, CURSOR_MODELS, CODEX_MODELS } from '../../shared/modelConstants.js';
import { parseFrontmatter } from '../utils/frontmatter.js';
import { parseFrontMatter } from '../shared/frontmatter.js';
import { findAppRoot, getModuleDir } from '../utils/runtime-paths.js';
const __dirname = getModuleDir(import.meta.url);
@@ -40,7 +42,7 @@ async function scanCommandsDirectory(dir, baseDir, namespace) {
// Parse markdown file for metadata
try {
const content = await fs.readFile(fullPath, 'utf8');
const { data: frontmatter, content: commandContent } = parseFrontmatter(content);
const { data: frontmatter, content: commandContent } = parseFrontMatter(content);
// Calculate relative path from baseDir for command name
const relativePath = path.relative(baseDir, fullPath);
@@ -320,7 +322,7 @@ Custom commands can be created in:
packageName,
uptime: uptimeFormatted,
uptimeSeconds: Math.floor(uptime),
model: context?.model || 'claude-sonnet-4.5',
model: context?.model || CLAUDE_MODELS.DEFAULT,
provider: context?.provider || 'claude',
nodeVersion: process.version,
platform: process.platform
@@ -513,7 +515,7 @@ router.post('/execute', async (req, res) => {
}
}
const content = await fs.readFile(commandPath, 'utf8');
const { data: metadata, content: commandContent } = parseFrontmatter(content);
const { data: metadata, content: commandContent } = parseFrontMatter(content);
// Basic argument replacement (will be enhanced in command parser utility)
let processedContent = commandContent;

View File

@@ -1,6 +1,7 @@
import express from 'express';
import sessionManager from '../sessionManager.js';
import { sessionNamesDb } from '../database/db.js';
import { sessionsDb } from '../modules/database/index.js';
const router = express.Router();
@@ -13,7 +14,7 @@ router.delete('/sessions/:sessionId', async (req, res) => {
}
await sessionManager.deleteSession(sessionId);
sessionNamesDb.deleteName(sessionId, 'gemini');
sessionsDb.deleteSessionById(sessionId);
res.json({ success: true });
} catch (error) {
console.error(`Error deleting Gemini session ${req.params.sessionId}:`, error);

View File

@@ -2,7 +2,7 @@ import express from 'express';
import { spawn } from 'child_process';
import path from 'path';
import { promises as fs } from 'fs';
import { extractProjectDirectory } from '../projects.js';
import { projectsDb } from '../modules/database/index.js';
import { queryClaudeSDK } from '../claude-sdk.js';
import { spawnCursor } from '../cursor-cli.js';
@@ -101,14 +101,19 @@ function validateProjectPath(projectPath) {
return resolved;
}
// Helper function to get the actual project path from the encoded project name
async function getActualProjectPath(projectName) {
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
console.error(`Error extracting project directory for ${projectName}:`, error);
throw new Error(`Unable to resolve project path for "${projectName}"`);
/**
* Resolve the absolute project directory for a given DB `projectId`.
*
* After the projectName → projectId migration, every git endpoint receives
* the DB primary key (`project` query/body param). The legacy filesystem
* resolver that walked Claude's JSONL history is no longer used here; the
* path comes straight from the `projects` table and is then sanity-checked
* by `validateProjectPath` before any `git` command runs against it.
*/
async function getActualProjectPath(projectId) {
const projectPath = await projectsDb.getProjectPathById(projectId);
if (!projectPath) {
throw new Error(`Unable to resolve project path for "${projectId}"`);
}
return validateProjectPath(projectPath);
}
@@ -292,7 +297,7 @@ router.get('/status', async (req, res) => {
const { project } = req.query;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -355,7 +360,7 @@ router.get('/diff', async (req, res) => {
const { project, file } = req.query;
if (!project || !file) {
return res.status(400).json({ error: 'Project name and file path are required' });
return res.status(400).json({ error: 'Project id and file path are required' });
}
try {
@@ -438,7 +443,7 @@ router.get('/file-with-diff', async (req, res) => {
const { project, file } = req.query;
if (!project || !file) {
return res.status(400).json({ error: 'Project name and file path are required' });
return res.status(400).json({ error: 'Project id and file path are required' });
}
try {
@@ -518,7 +523,7 @@ router.post('/initial-commit', async (req, res) => {
const { project } = req.body;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -593,7 +598,7 @@ router.post('/revert-local-commit', async (req, res) => {
const { project } = req.body;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -640,7 +645,7 @@ router.get('/branches', async (req, res) => {
const { project } = req.query;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -684,7 +689,7 @@ router.post('/checkout', async (req, res) => {
const { project, branch } = req.body;
if (!project || !branch) {
return res.status(400).json({ error: 'Project name and branch are required' });
return res.status(400).json({ error: 'Project id and branch are required' });
}
try {
@@ -706,7 +711,7 @@ router.post('/create-branch', async (req, res) => {
const { project, branch } = req.body;
if (!project || !branch) {
return res.status(400).json({ error: 'Project name and branch name are required' });
return res.status(400).json({ error: 'Project id and branch name are required' });
}
try {
@@ -728,7 +733,7 @@ router.post('/delete-branch', async (req, res) => {
const { project, branch } = req.body;
if (!project || !branch) {
return res.status(400).json({ error: 'Project name and branch name are required' });
return res.status(400).json({ error: 'Project id and branch name are required' });
}
try {
@@ -754,7 +759,7 @@ router.get('/commits', async (req, res) => {
const { project, limit = 10 } = req.query;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -811,7 +816,7 @@ router.get('/commit-diff', async (req, res) => {
const { project, commit } = req.query;
if (!project || !commit) {
return res.status(400).json({ error: 'Project name and commit hash are required' });
return res.status(400).json({ error: 'Project id and commit hash are required' });
}
try {
@@ -843,7 +848,7 @@ router.post('/generate-commit-message', async (req, res) => {
const { project, files, provider = 'claude' } = req.body;
if (!project || !files || files.length === 0) {
return res.status(400).json({ error: 'Project name and files are required' });
return res.status(400).json({ error: 'Project id and files are required' });
}
// Validate provider
@@ -1048,7 +1053,7 @@ router.get('/remote-status', async (req, res) => {
const { project } = req.query;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -1126,7 +1131,7 @@ router.post('/fetch', async (req, res) => {
const { project } = req.body;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -1167,7 +1172,7 @@ router.post('/pull', async (req, res) => {
const { project } = req.body;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -1235,7 +1240,7 @@ router.post('/push', async (req, res) => {
const { project } = req.body;
if (!project) {
return res.status(400).json({ error: 'Project name is required' });
return res.status(400).json({ error: 'Project id is required' });
}
try {
@@ -1306,7 +1311,7 @@ router.post('/publish', async (req, res) => {
const { project, branch } = req.body;
if (!project || !branch) {
return res.status(400).json({ error: 'Project name and branch are required' });
return res.status(400).json({ error: 'Project id and branch are required' });
}
try {
@@ -1385,7 +1390,7 @@ router.post('/discard', async (req, res) => {
const { project, file } = req.body;
if (!project || !file) {
return res.status(400).json({ error: 'Project name and file path are required' });
return res.status(400).json({ error: 'Project id and file path are required' });
}
try {
@@ -1439,7 +1444,7 @@ router.post('/delete-untracked', async (req, res) => {
const { project, file } = req.body;
if (!project || !file) {
return res.status(400).json({ error: 'Project name and file path are required' });
return res.status(400).json({ error: 'Project id and file path are required' });
}
try {

View File

@@ -1,61 +0,0 @@
/**
* Unified messages endpoint.
*
* GET /api/sessions/:sessionId/messages?provider=claude&projectName=foo&limit=50&offset=0
*
* Replaces the four provider-specific session message endpoints with a single route
* that delegates to the appropriate adapter via the provider registry.
*
* @module routes/messages
*/
import express from 'express';
import { sessionsService } from '../modules/providers/services/sessions.service.js';
const router = express.Router();
/**
* GET /api/sessions/:sessionId/messages
*
* Auth: authenticateToken applied at mount level in index.js
*
* Query params:
* provider - 'claude' | 'cursor' | 'codex' | 'gemini' (default: 'claude')
* projectName - required for claude provider
* projectPath - required for cursor provider (absolute path used for cwdId hash)
* limit - page size (omit or null for all)
* offset - pagination offset (default: 0)
*/
router.get('/:sessionId/messages', async (req, res) => {
try {
const { sessionId } = req.params;
const provider = String(req.query.provider || 'claude').trim().toLowerCase();
const projectName = req.query.projectName || '';
const projectPath = req.query.projectPath || '';
const limitParam = req.query.limit;
const limit = limitParam !== undefined && limitParam !== null && limitParam !== ''
? parseInt(limitParam, 10)
: null;
const offset = parseInt(req.query.offset || '0', 10);
const availableProviders = sessionsService.listProviderIds();
if (!availableProviders.includes(provider)) {
const available = availableProviders.join(', ');
return res.status(400).json({ error: `Unknown provider: ${provider}. Available: ${available}` });
}
const result = await sessionsService.fetchHistory(provider, sessionId, {
projectName,
projectPath,
limit,
offset,
});
return res.json(result);
} catch (error) {
console.error('Error fetching unified messages:', error);
return res.status(500).json({ error: 'Failed to fetch messages' });
}
});
export default router;

View File

@@ -1,548 +0,0 @@
import express from 'express';
import { promises as fs } from 'fs';
import path from 'path';
import { spawn } from 'child_process';
import os from 'os';
import { addProjectManually } from '../projects.js';
const router = express.Router();
function sanitizeGitError(message, token) {
if (!message || !token) return message;
return message.replace(new RegExp(token.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'), 'g'), '***');
}
// Configure allowed workspace root (defaults to user's home directory)
export const WORKSPACES_ROOT = process.env.WORKSPACES_ROOT || os.homedir();
// System-critical paths that should never be used as workspace directories
export const FORBIDDEN_PATHS = [
// Unix
'/',
'/etc',
'/bin',
'/sbin',
'/usr',
'/dev',
'/proc',
'/sys',
'/var',
'/boot',
'/root',
'/lib',
'/lib64',
'/opt',
'/tmp',
'/run',
// Windows
'C:\\Windows',
'C:\\Program Files',
'C:\\Program Files (x86)',
'C:\\ProgramData',
'C:\\System Volume Information',
'C:\\$Recycle.Bin'
];
/**
* Validates that a path is safe for workspace operations
* @param {string} requestedPath - The path to validate
* @returns {Promise<{valid: boolean, resolvedPath?: string, error?: string}>}
*/
export async function validateWorkspacePath(requestedPath) {
try {
// Resolve to absolute path
let absolutePath = path.resolve(requestedPath);
// Check if path is a forbidden system directory
const normalizedPath = path.normalize(absolutePath);
if (FORBIDDEN_PATHS.includes(normalizedPath) || normalizedPath === '/') {
return {
valid: false,
error: 'Cannot use system-critical directories as workspace locations'
};
}
// Additional check for paths starting with forbidden directories
for (const forbidden of FORBIDDEN_PATHS) {
if (normalizedPath === forbidden ||
normalizedPath.startsWith(forbidden + path.sep)) {
// Exception: /var/tmp and similar user-accessible paths might be allowed
// but /var itself and most /var subdirectories should be blocked
if (forbidden === '/var' &&
(normalizedPath.startsWith('/var/tmp') ||
normalizedPath.startsWith('/var/folders'))) {
continue; // Allow these specific cases
}
return {
valid: false,
error: `Cannot create workspace in system directory: ${forbidden}`
};
}
}
// Try to resolve the real path (following symlinks)
let realPath;
try {
// Check if path exists to resolve real path
await fs.access(absolutePath);
realPath = await fs.realpath(absolutePath);
} catch (error) {
if (error.code === 'ENOENT') {
// Path doesn't exist yet - check parent directory
let parentPath = path.dirname(absolutePath);
try {
const parentRealPath = await fs.realpath(parentPath);
// Reconstruct the full path with real parent
realPath = path.join(parentRealPath, path.basename(absolutePath));
} catch (parentError) {
if (parentError.code === 'ENOENT') {
// Parent doesn't exist either - use the absolute path as-is
// We'll validate it's within allowed root
realPath = absolutePath;
} else {
throw parentError;
}
}
} else {
throw error;
}
}
// Resolve the workspace root to its real path
const resolvedWorkspaceRoot = await fs.realpath(WORKSPACES_ROOT);
// Ensure the resolved path is contained within the allowed workspace root
if (!realPath.startsWith(resolvedWorkspaceRoot + path.sep) &&
realPath !== resolvedWorkspaceRoot) {
return {
valid: false,
error: `Workspace path must be within the allowed workspace root: ${WORKSPACES_ROOT}`
};
}
// Additional symlink check for existing paths
try {
await fs.access(absolutePath);
const stats = await fs.lstat(absolutePath);
if (stats.isSymbolicLink()) {
// Verify symlink target is also within allowed root
const linkTarget = await fs.readlink(absolutePath);
const resolvedTarget = path.resolve(path.dirname(absolutePath), linkTarget);
const realTarget = await fs.realpath(resolvedTarget);
if (!realTarget.startsWith(resolvedWorkspaceRoot + path.sep) &&
realTarget !== resolvedWorkspaceRoot) {
return {
valid: false,
error: 'Symlink target is outside the allowed workspace root'
};
}
}
} catch (error) {
if (error.code !== 'ENOENT') {
throw error;
}
// Path doesn't exist - that's fine for new workspace creation
}
return {
valid: true,
resolvedPath: realPath
};
} catch (error) {
return {
valid: false,
error: `Path validation failed: ${error.message}`
};
}
}
/**
* Create a new workspace
* POST /api/projects/create-workspace
*
* Body:
* - workspaceType: 'existing' | 'new'
* - path: string (workspace path)
* - githubUrl?: string (optional, for new workspaces)
* - githubTokenId?: number (optional, ID of stored token)
* - newGithubToken?: string (optional, one-time token)
*/
router.post('/create-workspace', async (req, res) => {
try {
const { workspaceType, path: workspacePath, githubUrl, githubTokenId, newGithubToken } = req.body;
// Validate required fields
if (!workspaceType || !workspacePath) {
return res.status(400).json({ error: 'workspaceType and path are required' });
}
if (!['existing', 'new'].includes(workspaceType)) {
return res.status(400).json({ error: 'workspaceType must be "existing" or "new"' });
}
// Validate path safety before any operations
const validation = await validateWorkspacePath(workspacePath);
if (!validation.valid) {
return res.status(400).json({
error: 'Invalid workspace path',
details: validation.error
});
}
const absolutePath = validation.resolvedPath;
// Handle existing workspace
if (workspaceType === 'existing') {
// Check if the path exists
try {
await fs.access(absolutePath);
const stats = await fs.stat(absolutePath);
if (!stats.isDirectory()) {
return res.status(400).json({ error: 'Path exists but is not a directory' });
}
} catch (error) {
if (error.code === 'ENOENT') {
return res.status(404).json({ error: 'Workspace path does not exist' });
}
throw error;
}
// Add the existing workspace to the project list
const project = await addProjectManually(absolutePath);
return res.json({
success: true,
project,
message: 'Existing workspace added successfully'
});
}
// Handle new workspace creation
if (workspaceType === 'new') {
// Create the directory if it doesn't exist
await fs.mkdir(absolutePath, { recursive: true });
// If GitHub URL is provided, clone the repository
if (githubUrl) {
let githubToken = null;
// Get GitHub token if needed
if (githubTokenId) {
// Fetch token from database
const token = await getGithubTokenById(githubTokenId, req.user.id);
if (!token) {
// Clean up created directory
await fs.rm(absolutePath, { recursive: true, force: true });
return res.status(404).json({ error: 'GitHub token not found' });
}
githubToken = token.github_token;
} else if (newGithubToken) {
githubToken = newGithubToken;
}
// Extract repo name from URL for the clone destination
const normalizedUrl = githubUrl.replace(/\/+$/, '').replace(/\.git$/, '');
const repoName = normalizedUrl.split('/').pop() || 'repository';
const clonePath = path.join(absolutePath, repoName);
// Check if clone destination already exists to prevent data loss
try {
await fs.access(clonePath);
return res.status(409).json({
error: 'Directory already exists',
details: `The destination path "${clonePath}" already exists. Please choose a different location or remove the existing directory.`
});
} catch (err) {
// Directory doesn't exist, which is what we want
}
// Clone the repository into a subfolder
try {
await cloneGitHubRepository(githubUrl, clonePath, githubToken);
} catch (error) {
// Clean up any partial data the failed clone may have left behind
try {
const stats = await fs.stat(clonePath);
if (stats.isDirectory()) {
await fs.rm(clonePath, { recursive: true, force: true });
}
} catch (cleanupError) {
// Directory doesn't exist or cleanup failed - ignore
}
throw new Error(`Failed to clone repository: ${error.message}`);
}
// Add the cloned repo path to the project list
const project = await addProjectManually(clonePath);
return res.json({
success: true,
project,
message: 'New workspace created and repository cloned successfully'
});
}
// Add the new workspace to the project list (no clone)
const project = await addProjectManually(absolutePath);
return res.json({
success: true,
project,
message: 'New workspace created successfully'
});
}
} catch (error) {
console.error('Error creating workspace:', error);
res.status(500).json({
error: error.message || 'Failed to create workspace',
details: process.env.NODE_ENV === 'development' ? error.stack : undefined
});
}
});
/**
* Helper function to get GitHub token from database
*/
async function getGithubTokenById(tokenId, userId) {
const { db } = await import('../database/db.js');
const credential = db.prepare(
'SELECT * FROM user_credentials WHERE id = ? AND user_id = ? AND credential_type = ? AND is_active = 1'
).get(tokenId, userId, 'github_token');
// Return in the expected format (github_token field for compatibility)
if (credential) {
return {
...credential,
github_token: credential.credential_value
};
}
return null;
}
/**
* Clone repository with progress streaming (SSE)
* GET /api/projects/clone-progress
*/
router.get('/clone-progress', async (req, res) => {
const { path: workspacePath, githubUrl, githubTokenId, newGithubToken } = req.query;
res.setHeader('Content-Type', 'text/event-stream');
res.setHeader('Cache-Control', 'no-cache');
res.setHeader('Connection', 'keep-alive');
res.flushHeaders();
const sendEvent = (type, data) => {
res.write(`data: ${JSON.stringify({ type, ...data })}\n\n`);
};
try {
if (!workspacePath || !githubUrl) {
sendEvent('error', { message: 'workspacePath and githubUrl are required' });
res.end();
return;
}
const validation = await validateWorkspacePath(workspacePath);
if (!validation.valid) {
sendEvent('error', { message: validation.error });
res.end();
return;
}
const absolutePath = validation.resolvedPath;
await fs.mkdir(absolutePath, { recursive: true });
let githubToken = null;
if (githubTokenId) {
const token = await getGithubTokenById(parseInt(githubTokenId, 10), req.user.id);
if (!token) {
await fs.rm(absolutePath, { recursive: true, force: true });
sendEvent('error', { message: 'GitHub token not found' });
res.end();
return;
}
githubToken = token.github_token;
} else if (newGithubToken) {
githubToken = newGithubToken;
}
const normalizedUrl = githubUrl.replace(/\/+$/, '').replace(/\.git$/, '');
const repoName = normalizedUrl.split('/').pop() || 'repository';
const clonePath = path.join(absolutePath, repoName);
// Check if clone destination already exists to prevent data loss
try {
await fs.access(clonePath);
sendEvent('error', { message: `Directory "${repoName}" already exists. Please choose a different location or remove the existing directory.` });
res.end();
return;
} catch (err) {
// Directory doesn't exist, which is what we want
}
let cloneUrl = githubUrl;
if (githubToken) {
try {
const url = new URL(githubUrl);
url.username = githubToken;
url.password = '';
cloneUrl = url.toString();
} catch (error) {
// SSH URL or invalid - use as-is
}
}
sendEvent('progress', { message: `Cloning into '${repoName}'...` });
const gitProcess = spawn('git', ['clone', '--progress', cloneUrl, clonePath], {
stdio: ['ignore', 'pipe', 'pipe'],
env: {
...process.env,
GIT_TERMINAL_PROMPT: '0'
}
});
let lastError = '';
gitProcess.stdout.on('data', (data) => {
const message = data.toString().trim();
if (message) {
sendEvent('progress', { message });
}
});
gitProcess.stderr.on('data', (data) => {
const message = data.toString().trim();
lastError = message;
if (message) {
sendEvent('progress', { message });
}
});
gitProcess.on('close', async (code) => {
if (code === 0) {
try {
const project = await addProjectManually(clonePath);
sendEvent('complete', { project, message: 'Repository cloned successfully' });
} catch (error) {
sendEvent('error', { message: `Clone succeeded but failed to add project: ${error.message}` });
}
} else {
const sanitizedError = sanitizeGitError(lastError, githubToken);
let errorMessage = 'Git clone failed';
if (lastError.includes('Authentication failed') || lastError.includes('could not read Username')) {
errorMessage = 'Authentication failed. Please check your credentials.';
} else if (lastError.includes('Repository not found')) {
errorMessage = 'Repository not found. Please check the URL and ensure you have access.';
} else if (lastError.includes('already exists')) {
errorMessage = 'Directory already exists';
} else if (sanitizedError) {
errorMessage = sanitizedError;
}
try {
await fs.rm(clonePath, { recursive: true, force: true });
} catch (cleanupError) {
console.error('Failed to clean up after clone failure:', sanitizeGitError(cleanupError.message, githubToken));
}
sendEvent('error', { message: errorMessage });
}
res.end();
});
gitProcess.on('error', (error) => {
if (error.code === 'ENOENT') {
sendEvent('error', { message: 'Git is not installed or not in PATH' });
} else {
sendEvent('error', { message: error.message });
}
res.end();
});
req.on('close', () => {
gitProcess.kill();
});
} catch (error) {
sendEvent('error', { message: error.message });
res.end();
}
});
/**
* Helper function to clone a GitHub repository
*/
function cloneGitHubRepository(githubUrl, destinationPath, githubToken = null) {
return new Promise((resolve, reject) => {
let cloneUrl = githubUrl;
if (githubToken) {
try {
const url = new URL(githubUrl);
url.username = githubToken;
url.password = '';
cloneUrl = url.toString();
} catch (error) {
// SSH URL - use as-is
}
}
const gitProcess = spawn('git', ['clone', '--progress', cloneUrl, destinationPath], {
stdio: ['ignore', 'pipe', 'pipe'],
env: {
...process.env,
GIT_TERMINAL_PROMPT: '0'
}
});
let stdout = '';
let stderr = '';
gitProcess.stdout.on('data', (data) => {
stdout += data.toString();
});
gitProcess.stderr.on('data', (data) => {
stderr += data.toString();
});
gitProcess.on('close', (code) => {
if (code === 0) {
resolve({ stdout, stderr });
} else {
let errorMessage = 'Git clone failed';
if (stderr.includes('Authentication failed') || stderr.includes('could not read Username')) {
errorMessage = 'Authentication failed. Please check your GitHub token.';
} else if (stderr.includes('Repository not found')) {
errorMessage = 'Repository not found. Please check the URL and ensure you have access.';
} else if (stderr.includes('already exists')) {
errorMessage = 'Directory already exists';
} else if (stderr) {
// Sanitize before surfacing: stderr can echo the token-bearing clone URL
errorMessage = sanitizeGitError(stderr, githubToken) || errorMessage;
}
reject(new Error(errorMessage));
}
});
gitProcess.on('error', (error) => {
if (error.code === 'ENOENT') {
reject(new Error('Git is not installed or not in PATH'));
} else {
reject(error);
}
});
});
}
export default router;

View File

@@ -1,5 +1,5 @@
import express from 'express';
import { apiKeysDb, credentialsDb, notificationPreferencesDb, pushSubscriptionsDb } from '../database/db.js';
import { apiKeysDb, credentialsDb, notificationPreferencesDb, pushSubscriptionsDb } from '../modules/database/index.js';
import { getPublicKey } from '../services/vapid-keys.js';
import { createNotificationEvent, notifyUserIfEnabled } from '../services/notification-orchestrator.js';

View File

@@ -13,10 +13,25 @@ import fs from 'fs';
import path from 'path';
import { promises as fsPromises } from 'fs';
import { spawn } from 'child_process';
import { extractProjectDirectory } from '../projects.js';
import { projectsDb } from '../modules/database/index.js';
import { detectTaskMasterMCPServer } from '../utils/mcp-detector.js';
import { broadcastTaskMasterProjectUpdate, broadcastTaskMasterTasksUpdate } from '../utils/taskmaster-websocket.js';
/**
* Resolve the absolute project directory from a DB-assigned `projectId`.
*
* TaskMaster routes used to accept a Claude-encoded folder name (`projectName`)
* and derive the path from JSONL history. After the projectId migration the
* only identifier we accept is the primary key of the `projects` table, so
* every handler calls this helper and 404s when the id is unknown.
*/
async function resolveProjectPathFromId(projectId) {
if (!projectId) {
return null;
}
return projectsDb.getProjectPathById(projectId);
}
const router = express.Router();
/**
@@ -132,21 +147,22 @@ router.get('/installation-status', async (req, res) => {
});
/**
* GET /api/taskmaster/tasks/:projectName
* GET /api/taskmaster/tasks/:projectId
* Load actual tasks from .taskmaster/tasks/tasks.json
*
* `projectId` is the DB primary key of the project; the folder is resolved via
* the projects table rather than extracted from Claude JSONL history.
*/
router.get('/tasks/:projectName', async (req, res) => {
router.get('/tasks/:projectId', async (req, res) => {
try {
const { projectName } = req.params;
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
const { projectId } = req.params;
// Get project path via the DB; the legacy JSONL-based resolver is gone.
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -158,7 +174,7 @@ router.get('/tasks/:projectName', async (req, res) => {
await fsPromises.access(tasksFilePath);
} catch (error) {
return res.json({
projectName,
projectId,
tasks: [],
message: 'No tasks.json file found'
});
@@ -213,7 +229,7 @@ router.get('/tasks/:projectName', async (req, res) => {
}));
res.json({
projectName,
projectId,
projectPath,
tasks: transformedTasks,
currentTag,
@@ -247,21 +263,19 @@ router.get('/tasks/:projectName', async (req, res) => {
});
/**
* GET /api/taskmaster/prd/:projectName
* GET /api/taskmaster/prd/:projectId
* List all PRD files in the project's .taskmaster/docs directory
*/
router.get('/prd/:projectName', async (req, res) => {
router.get('/prd/:projectId', async (req, res) => {
try {
const { projectName } = req.params;
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
const { projectId } = req.params;
// projectId → projectPath lookup through the DB (post-migration).
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -272,7 +286,7 @@ router.get('/prd/:projectName', async (req, res) => {
await fsPromises.access(docsPath, fs.constants.R_OK);
} catch (error) {
return res.json({
projectName,
projectId,
prdFiles: [],
message: 'No .taskmaster/docs directory found'
});
@@ -299,7 +313,7 @@ router.get('/prd/:projectName', async (req, res) => {
}
res.json({
projectName,
projectId,
projectPath,
prdFiles: prdFiles.sort((a, b) => new Date(b.modified) - new Date(a.modified)),
timestamp: new Date().toISOString()
@@ -323,12 +337,12 @@ router.get('/prd/:projectName', async (req, res) => {
});
/**
* POST /api/taskmaster/prd/:projectName
* POST /api/taskmaster/prd/:projectId
* Create or update a PRD file in the project's .taskmaster/docs directory
*/
router.post('/prd/:projectName', async (req, res) => {
router.post('/prd/:projectId', async (req, res) => {
try {
const { projectName } = req.params;
const { projectId } = req.params;
const { fileName, content } = req.body;
if (!fileName || !content) {
@@ -346,14 +360,12 @@ router.post('/prd/:projectName', async (req, res) => {
});
}
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
// Resolve the project folder through the DB using the projectId param.
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -379,7 +391,7 @@ router.post('/prd/:projectName', async (req, res) => {
const stats = await fsPromises.stat(filePath);
res.json({
projectName,
projectId,
projectPath,
fileName,
filePath: path.relative(projectPath, filePath),
@@ -408,21 +420,18 @@ router.post('/prd/:projectName', async (req, res) => {
});
/**
* GET /api/taskmaster/prd/:projectName/:fileName
* GET /api/taskmaster/prd/:projectId/:fileName
* Get content of a specific PRD file
*/
router.get('/prd/:projectName/:fileName', async (req, res) => {
router.get('/prd/:projectId/:fileName', async (req, res) => {
try {
const { projectName, fileName } = req.params;
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
const { projectId, fileName } = req.params;
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -444,7 +453,7 @@ router.get('/prd/:projectName/:fileName', async (req, res) => {
const stats = await fsPromises.stat(filePath);
res.json({
projectName,
projectId,
projectPath,
fileName,
filePath: path.relative(projectPath, filePath),
@@ -473,21 +482,18 @@ router.get('/prd/:projectName/:fileName', async (req, res) => {
});
/**
* POST /api/taskmaster/init/:projectName
* POST /api/taskmaster/init/:projectId
* Initialize TaskMaster in a project
*/
router.post('/init/:projectName', async (req, res) => {
router.post('/init/:projectId', async (req, res) => {
try {
const { projectName } = req.params;
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
const { projectId } = req.params;
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -522,17 +528,19 @@ router.post('/init/:projectName', async (req, res) => {
initProcess.on('close', (code) => {
if (code === 0) {
// Broadcast TaskMaster project update via WebSocket
// Broadcast TaskMaster project update via WebSocket. The
// WebSocket payload keeps using `projectId` so the frontend
// can match notifications against the current selection.
if (req.app.locals.wss) {
broadcastTaskMasterProjectUpdate(
req.app.locals.wss,
projectName,
req.app.locals.wss,
projectId,
{ hasTaskmaster: true, status: 'initialized' }
);
}
res.json({
projectName,
projectId,
projectPath,
message: 'TaskMaster initialized successfully',
output: stdout,
@@ -562,12 +570,12 @@ router.post('/init/:projectName', async (req, res) => {
});
/**
* POST /api/taskmaster/add-task/:projectName
* POST /api/taskmaster/add-task/:projectId
* Add a new task to the project
*/
router.post('/add-task/:projectName', async (req, res) => {
router.post('/add-task/:projectId', async (req, res) => {
try {
const { projectName } = req.params;
const { projectId } = req.params;
const { prompt, title, description, priority = 'medium', dependencies } = req.body;
if (!prompt && (!title || !description)) {
@@ -576,15 +584,12 @@ router.post('/add-task/:projectName', async (req, res) => {
message: 'Either "prompt" or both "title" and "description" are required'
});
}
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -629,16 +634,17 @@ router.post('/add-task/:projectName', async (req, res) => {
console.log('Stderr:', stderr);
if (code === 0) {
// Broadcast task update via WebSocket
// Broadcast task update via WebSocket using the projectId so
// clients subscribed to this project get notified immediately.
if (req.app.locals.wss) {
broadcastTaskMasterTasksUpdate(
req.app.locals.wss,
projectName
req.app.locals.wss,
projectId
);
}
res.json({
projectName,
projectId,
projectPath,
message: 'Task added successfully',
output: stdout,
@@ -666,22 +672,19 @@ router.post('/add-task/:projectName', async (req, res) => {
});
/**
* PUT /api/taskmaster/update-task/:projectName/:taskId
* PUT /api/taskmaster/update-task/:projectId/:taskId
* Update a specific task using TaskMaster CLI
*/
router.put('/update-task/:projectName/:taskId', async (req, res) => {
router.put('/update-task/:projectId/:taskId', async (req, res) => {
try {
const { projectName, taskId } = req.params;
const { projectId, taskId } = req.params;
const { title, description, status, priority, details } = req.body;
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -707,11 +710,11 @@ router.put('/update-task/:projectName/:taskId', async (req, res) => {
if (code === 0) {
// Broadcast task update via WebSocket
if (req.app.locals.wss) {
broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectName);
broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectId);
}
res.json({
projectName,
projectId,
projectPath,
taskId,
message: 'Task status updated successfully',
@@ -759,11 +762,11 @@ router.put('/update-task/:projectName/:taskId', async (req, res) => {
if (code === 0) {
// Broadcast task update via WebSocket
if (req.app.locals.wss) {
broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectName);
broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectId);
}
res.json({
projectName,
projectId,
projectPath,
taskId,
message: 'Task updated successfully',
@@ -793,22 +796,19 @@ router.put('/update-task/:projectName/:taskId', async (req, res) => {
});
/**
* POST /api/taskmaster/parse-prd/:projectName
* POST /api/taskmaster/parse-prd/:projectId
* Parse a PRD file to generate tasks
*/
router.post('/parse-prd/:projectName', async (req, res) => {
router.post('/parse-prd/:projectId', async (req, res) => {
try {
const { projectName } = req.params;
const { projectId } = req.params;
const { fileName = 'prd.txt', numTasks, append = false } = req.body;
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -859,13 +859,13 @@ router.post('/parse-prd/:projectName', async (req, res) => {
// Broadcast task update via WebSocket
if (req.app.locals.wss) {
broadcastTaskMasterTasksUpdate(
req.app.locals.wss,
projectName
req.app.locals.wss,
projectId
);
}
res.json({
projectName,
projectId,
projectPath,
prdFile: fileName,
message: 'PRD parsed and tasks generated successfully',
@@ -1340,12 +1340,12 @@ Description of the business problem, data sources, and expected insights.
});
/**
* POST /api/taskmaster/apply-template/:projectName
* POST /api/taskmaster/apply-template/:projectId
* Apply a PRD template to create a new PRD file
*/
router.post('/apply-template/:projectName', async (req, res) => {
router.post('/apply-template/:projectId', async (req, res) => {
try {
const { projectName } = req.params;
const { projectId } = req.params;
const { templateId, fileName = 'prd.txt', customizations = {} } = req.body;
if (!templateId) {
@@ -1355,14 +1355,11 @@ router.post('/apply-template/:projectName', async (req, res) => {
});
}
// Get project path
let projectPath;
try {
projectPath = await extractProjectDirectory(projectName);
} catch (error) {
const projectPath = await resolveProjectPathFromId(projectId);
if (!projectPath) {
return res.status(404).json({
error: 'Project not found',
message: `Project "${projectName}" does not exist`
message: `Project "${projectId}" does not exist`
});
}
@@ -1401,7 +1398,7 @@ router.post('/apply-template/:projectName', async (req, res) => {
await fsPromises.writeFile(filePath, content, 'utf8');
res.json({
projectName,
projectId,
projectPath,
templateId,
templateName: template.name,

View File

@@ -1,5 +1,5 @@
import express from 'express';
import { userDb } from '../database/db.js';
import { userDb } from '../modules/database/index.js';
import { authenticateToken } from '../middleware/auth.js';
import { getSystemGitConfig } from '../utils/gitConfig.js';
import { spawn } from 'child_process';

View File

@@ -1,5 +1,6 @@
import webPush from 'web-push';
import { notificationPreferencesDb, pushSubscriptionsDb, sessionNamesDb } from '../database/db.js';
import { notificationPreferencesDb, pushSubscriptionsDb, sessionsDb } from '../modules/database/index.js';
const KIND_TO_PREF_KEY = {
action_required: 'actionRequired',
@@ -107,7 +108,7 @@ function resolveSessionName(event) {
return null;
}
return normalizeSessionName(sessionNamesDb.getName(event.sessionId, event.provider));
return normalizeSessionName(sessionsDb.getSessionName(event.sessionId, event.provider));
}
function buildPushBody(event) {

View File

@@ -1,7 +1,8 @@
import webPush from 'web-push';
import { db } from '../database/db.js';
import { getConnection } from '../modules/database/connection.js';
let cachedKeys = null;
const db = getConnection();
function ensureVapidKeys() {
if (cachedKeys) return cachedKeys;

View File

@@ -0,0 +1,61 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import {
resolveClaudeCodeExecutablePath,
type ResolveClaudeCodeExecutablePathDependencies,
} from '@/shared/claude-cli-path.js';
test('resolveClaudeCodeExecutablePath resolves the npm Claude wrapper to its native exe on Windows', () => {
const wrapperDir = 'C:\\nvm4w\\nodejs';
const nativePath = `${wrapperDir}\\node_modules\\@anthropic-ai\\claude-code\\bin\\claude.exe`;
const execFileSync =
(() => `${wrapperDir}\\claude\r\n${wrapperDir}\\claude.cmd\r\n`) as unknown as ResolveClaudeCodeExecutablePathDependencies['execFileSync'];
const readFileSync = (() => '') as unknown as ResolveClaudeCodeExecutablePathDependencies['readFileSync'];
const resolved = resolveClaudeCodeExecutablePath('claude', {
platform: 'win32',
execFileSync,
existsSync: (candidate) => candidate === nativePath,
readFileSync,
});
assert.equal(resolved, nativePath);
});
test('resolveClaudeCodeExecutablePath keeps an explicit JavaScript launcher path unchanged', () => {
const scriptPath = 'C:\\tools\\claude.js';
const resolved = resolveClaudeCodeExecutablePath(scriptPath, {
platform: 'win32',
});
assert.equal(resolved, scriptPath);
});
test('resolveClaudeCodeExecutablePath can parse a wrapper file path containing letters r and n before claude.exe', () => {
const wrapperPath = 'C:\\tools\\claude';
const nativePath = 'C:\\tools\\custom\\bin\\node_modules\\@anthropic-ai\\claude-code\\bin\\claude.exe';
const readFileSync = (() => `exec "$basedir/custom/bin/node_modules/@anthropic-ai/claude-code/bin/claude.exe" "$@"`) as unknown as ResolveClaudeCodeExecutablePathDependencies['readFileSync'];
const resolved = resolveClaudeCodeExecutablePath(wrapperPath, {
platform: 'win32',
existsSync: (candidate) => candidate === nativePath,
readFileSync,
});
assert.equal(resolved, nativePath);
});
test('resolveClaudeCodeExecutablePath falls back to the configured command when PATH lookup fails', () => {
const execFileSync = (() => {
throw new Error('not found');
}) as unknown as ResolveClaudeCodeExecutablePathDependencies['execFileSync'];
const resolved = resolveClaudeCodeExecutablePath('claude', {
platform: 'win32',
execFileSync,
});
assert.equal(resolved, 'claude');
});

View File

@@ -0,0 +1,139 @@
import { execFileSync } from 'node:child_process';
import fs from 'node:fs';
import path from 'node:path';
const DEFAULT_CLAUDE_COMMAND = 'claude';
const CLAUDE_SCRIPT_EXTENSIONS = new Set(['.cjs', '.js', '.jsx', '.mjs', '.ts', '.tsx']);
const CLAUDE_WRAPPER_SEGMENTS = ['node_modules', '@anthropic-ai', 'claude-code', 'bin', 'claude.exe'] as const;
export type ResolveClaudeCodeExecutablePathDependencies = {
execFileSync?: typeof execFileSync;
existsSync?: typeof fs.existsSync;
platform?: NodeJS.Platform;
readFileSync?: typeof fs.readFileSync;
};
function getPathApi(platform: NodeJS.Platform) {
return platform === 'win32' ? path.win32 : path;
}
function stripWrappingQuotes(value: string): string {
const trimmed = value.trim();
if (
(trimmed.startsWith('"') && trimmed.endsWith('"')) ||
(trimmed.startsWith("'") && trimmed.endsWith("'"))
) {
return trimmed.slice(1, -1);
}
return trimmed;
}
function isPathLike(value: string): boolean {
return value.includes('/') || value.includes('\\');
}
function resolveClaudeWrapperBinary(
wrapperPath: string,
deps: Required<ResolveClaudeCodeExecutablePathDependencies>,
): string | null {
const pathApi = getPathApi(deps.platform);
const directCandidate = pathApi.resolve(pathApi.dirname(wrapperPath), ...CLAUDE_WRAPPER_SEGMENTS);
if (deps.existsSync(directCandidate)) {
return directCandidate;
}
let content: string;
try {
content = deps.readFileSync(wrapperPath, 'utf8');
} catch {
return null;
}
const matches = content.matchAll(/["']([^"'\\\r\n]*claude\.exe)["']/gi);
for (const match of matches) {
const rawTarget = match[1]
.replace(/^\$basedir[\\/]/i, '')
.replace(/^%dp0%[\\/]/i, '')
.replace(/^%~dp0[\\/]/i, '');
const normalizedTarget = rawTarget.replace(/[\\/]/g, pathApi.sep);
const candidate = pathApi.isAbsolute(normalizedTarget)
? normalizedTarget
: pathApi.resolve(pathApi.dirname(wrapperPath), normalizedTarget);
if (deps.existsSync(candidate)) {
return candidate;
}
}
return null;
}
function resolveWindowsClaudeExecutablePath(
configuredPath: string,
deps: Required<ResolveClaudeCodeExecutablePathDependencies>,
): string {
const pathApi = getPathApi(deps.platform);
const extension = pathApi.extname(configuredPath).toLowerCase();
const explicitPath = isPathLike(configuredPath) || pathApi.isAbsolute(configuredPath);
if (CLAUDE_SCRIPT_EXTENSIONS.has(extension)) {
return configuredPath;
}
if (explicitPath && extension === '.exe') {
return configuredPath;
}
if (explicitPath) {
return resolveClaudeWrapperBinary(configuredPath, deps) ?? configuredPath;
}
try {
const stdout = deps.execFileSync('where.exe', [configuredPath], {
encoding: 'utf8',
stdio: ['ignore', 'pipe', 'ignore'],
windowsHide: true,
});
const candidates = stdout
.split(/\r?\n/)
.map((entry) => entry.trim())
.filter(Boolean);
for (const candidate of candidates) {
if (pathApi.extname(candidate).toLowerCase() === '.exe') {
return candidate;
}
}
for (const candidate of candidates) {
const resolved = resolveClaudeWrapperBinary(candidate, deps);
if (resolved) {
return resolved;
}
}
} catch {
return configuredPath;
}
return configuredPath;
}
export function resolveClaudeCodeExecutablePath(
configuredPath: string | undefined = process.env.CLAUDE_CLI_PATH,
dependencies: ResolveClaudeCodeExecutablePathDependencies = {},
): string {
const deps: Required<ResolveClaudeCodeExecutablePathDependencies> = {
execFileSync: dependencies.execFileSync ?? execFileSync,
existsSync: dependencies.existsSync ?? fs.existsSync,
platform: dependencies.platform ?? process.platform,
readFileSync: dependencies.readFileSync ?? fs.readFileSync,
};
const normalizedPath = stripWrappingQuotes(configuredPath || DEFAULT_CLAUDE_COMMAND);
if (deps.platform !== 'win32') {
return normalizedPath;
}
return resolveWindowsClaudeExecutablePath(normalizedPath, deps);
}

View File

@@ -9,10 +9,10 @@ const frontmatterOptions = {
engines: {
js: disabledFrontmatterEngine,
javascript: disabledFrontmatterEngine,
json: disabledFrontmatterEngine
}
json: disabledFrontmatterEngine,
},
};
export function parseFrontmatter(content) {
export function parseFrontMatter(content: string) {
return matter(content, frontmatterOptions);
}

View File

@@ -4,11 +4,14 @@ import type {
LLMProvider,
McpScope,
NormalizedMessage,
ProviderSkill,
ProviderSkillListOptions,
ProviderAuthStatus,
ProviderMcpServer,
UpsertProviderMcpServerInput,
} from '@/shared/types.js';
//----------------- PROVIDER CONTRACT INTERFACES ------------
/**
* Main provider contract for CLI and SDK integrations.
*
@@ -19,12 +22,18 @@ export interface IProvider {
readonly id: LLMProvider;
readonly mcp: IProviderMcp;
readonly auth: IProviderAuth;
readonly skills: IProviderSkills;
readonly sessions: IProviderSessions;
readonly sessionSynchronizer: IProviderSessionSynchronizer;
}
// ---------------------------
//----------------- PROVIDER AUTH INTERFACE ------------
/**
* Auth contract for one provider.
*
* Implementations should return a complete installation/authentication status
* without throwing for normal "not installed" or "not authenticated" states.
*/
export interface IProviderAuth {
/**
@@ -33,8 +42,29 @@ export interface IProviderAuth {
getStatus(): Promise<ProviderAuthStatus>;
}
// ---------------------------
//----------------- PROVIDER SKILLS INTERFACE ------------
/**
* Skills contract for one provider.
*
* Implementations discover provider-native skill markdown locations and return
* normalized skill records with the exact command syntax expected by that
* provider. Each skill is read from a `SKILL.md` file under its skill directory.
*/
export interface IProviderSkills {
/**
* Lists all skills visible to this provider for the optional workspace.
*/
listSkills(options?: ProviderSkillListOptions): Promise<ProviderSkill[]>;
}
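A minimal in-memory implementation sketch of the skills contract above; the local type aliases are trimmed stand-ins for the shared `@/shared/types.js` imports, and `StaticSkills` is a hypothetical adapter used only for illustration.

```typescript
// Trimmed local stand-ins for the shared types.
type ProviderSkill = {
  provider: string;
  name: string;
  description: string;
  command: string;
};
type ProviderSkillListOptions = { workspacePath?: string };

interface IProviderSkills {
  listSkills(options?: ProviderSkillListOptions): Promise<ProviderSkill[]>;
}

// Illustrative adapter backed by a fixed list instead of filesystem discovery.
class StaticSkills implements IProviderSkills {
  constructor(private readonly skills: ProviderSkill[]) {}

  // Returns a defensive copy so callers cannot mutate adapter state.
  async listSkills(): Promise<ProviderSkill[]> {
    return [...this.skills];
  }
}
```

Because routes only depend on `IProviderSkills`, a real adapter can keep its provider-specific `SKILL.md` lookup rules private while the service layer stays generic.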
// ---------------------------
//----------------- PROVIDER MCP INTERFACE ------------
/**
* MCP contract for one provider.
*
* Implementations must map provider-native MCP config formats to shared
* `ProviderMcpServer` records used by routes and frontend state.
*/
export interface IProviderMcp {
listServers(options?: { workspacePath?: string }): Promise<Record<McpScope, ProviderMcpServer[]>>;
@@ -45,10 +75,37 @@ export interface IProviderMcp {
): Promise<{ removed: boolean; provider: LLMProvider; name: string; scope: McpScope }>;
}
// ---------------------------
//----------------- PROVIDER SESSION INTERFACE ------------
/**
* Session/history contract for one provider.
*
* Implementations normalize provider-specific events and message history into
* shared transport shapes consumed by API routes and realtime streams.
*/
export interface IProviderSessions {
normalizeMessage(raw: unknown, sessionId: string | null): NormalizedMessage[];
fetchHistory(sessionId: string, options?: FetchHistoryOptions): Promise<FetchHistoryResult>;
}
// ---------------------------
//----------------- PROVIDER SESSION SYNCHRONIZER INTERFACE ------------
/**
* Session indexing contract for one provider.
*
* Implementations scan provider-specific session artifacts on disk and upsert
* normalized session metadata into the database. The service layer uses this
* interface for both full rescans and single-file incremental sync triggered
* by filesystem watcher events.
*/
export interface IProviderSessionSynchronizer {
/**
* Scans provider session artifacts and upserts discovered sessions into DB.
*/
synchronize(since?: Date): Promise<number>;
/**
* Parses and upserts one provider artifact file without running a full scan.
*/
synchronizeFile(filePath: string): Promise<string | null>;
}

@@ -1,18 +1,77 @@
-// -------------- HTTP API response shapes for the server, shared across modules --------------
import type { IncomingMessage } from 'node:http';
//----------------- HTTP RESPONSE SHAPES ------------
/**
* Canonical success envelope used by backend APIs that return a structured payload.
*
* Use this for route handlers that need a stable `success/data` shape so frontend
* consumers can parse responses consistently across endpoints.
*/
export type ApiSuccessShape<TData = unknown> = {
success: true;
data: TData;
};
/**
* Generic plain-object record used when parsing loosely typed JSON payloads.
*
* Use this only after runtime shape checks, not as a replacement for validated
* domain models.
*/
export type AnyRecord = Record<string, any>;
-// ---------------------------------------------------------------------------------------------
+// ---------------------------
+//----------------- WEBSOCKET TRANSPORT TYPES ------------
/**
* Minimal websocket client contract used by backend broadcaster services.
*
* Any transport object added to `connectedClients` must implement these two
* members so shared services can safely send JSON strings and check whether the
* socket is still open before broadcasting.
*/
export type RealtimeClientConnection = {
readyState: number;
send(data: string): void;
};
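A sketch of how a broadcaster service might use this minimal contract. The `OPEN` constant mirrors the ws library's `WebSocket.OPEN` value (1); that mapping is an assumption here, and `broadcast` is an illustrative name.

```typescript
type RealtimeClientConnection = {
  readyState: number;
  send(data: string): void;
};

const OPEN = 1; // assumed to match ws's WebSocket.OPEN

// Serializes once, sends only to sockets that are still open, and reports how
// many clients actually received the payload.
function broadcast(
  clients: Iterable<RealtimeClientConnection>,
  payload: unknown,
): number {
  const data = JSON.stringify(payload);
  let sent = 0;
  for (const client of clients) {
    if (client.readyState === OPEN) {
      client.send(data);
      sent += 1;
    }
  }
  return sent;
}
```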
/**
* Authenticated user payload attached to websocket upgrade requests.
*
* Platform and OSS auth flows currently use either `id` or `userId`; both are
* represented here so websocket handlers can resolve a stable writer user id.
*/
export type AuthenticatedWebSocketUser = {
id?: string | number;
userId?: string | number;
username?: string;
[key: string]: unknown;
};
/**
* HTTP upgrade request shape after websocket authentication succeeds.
*
* `verifyClient` populates `request.user` with the authenticated payload, and
* downstream websocket handlers rely on this extended request type.
*/
export type AuthenticatedWebSocketRequest = IncomingMessage & {
user?: AuthenticatedWebSocketUser;
};
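A hypothetical helper showing how a websocket handler can resolve one stable writer id from the `id`/`userId` duality described above; the function name and null-on-missing behavior are assumptions for illustration.

```typescript
type AuthenticatedWebSocketUser = {
  id?: string | number;
  userId?: string | number;
  username?: string;
  [key: string]: unknown;
};

// Prefers `id`, falls back to `userId`, and normalizes to a string so
// downstream services do not care which auth flow produced the payload.
function resolveWriterUserId(user?: AuthenticatedWebSocketUser): string | null {
  const raw = user?.id ?? user?.userId;
  return raw === undefined ? null : String(raw);
}
```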
// ---------------------------
//----------------- PROVIDER MESSAGE MODEL ------------
/**
* Providers supported by the unified server runtime.
*
* Use this as the source of truth whenever a function or payload needs to identify
* a specific LLM integration.
*/
export type LLMProvider = 'claude' | 'codex' | 'gemini' | 'cursor';
-// ---------------------------------------------------------------------------------------------
/**
* Message/event variants emitted by provider adapters and normalized transports.
*
* Keep this union in sync with event kinds produced by provider session adapters.
*/
export type MessageKind =
| 'text'
| 'tool_use'
@@ -30,11 +89,10 @@ export type MessageKind =
| 'task_notification';
/**
- * Provider-neutral message event emitted over REST and realtime transports.
+ * Provider-neutral message envelope used in REST responses and realtime channels.
 *
- * Providers all produce their own native SDK/CLI event shapes, so this type keeps
- * the common envelope strict while allowing provider-specific details to ride
- * along as optional properties.
+ * Every provider-specific message must be converted into this shape before being
+ * emitted outside provider-specific modules.
*/
export type NormalizedMessage = {
id: string;
@@ -44,6 +102,21 @@ export type NormalizedMessage = {
kind: MessageKind;
role?: 'user' | 'assistant';
content?: string;
/**
* Optional display-oriented metadata used by providers that need to expose
* richer transcript artifacts without introducing a brand-new message kind.
*
* Current Claude usage:
* - local slash commands expose parsed command fields
* - compact summaries are flagged so the UI can treat them differently later
*/
displayText?: string;
commandName?: string;
commandMessage?: string;
commandArgs?: string;
isLocalCommand?: boolean;
isLocalCommandStdout?: boolean;
isCompactSummary?: boolean;
images?: unknown;
toolName?: string;
toolInput?: unknown;
@@ -73,21 +146,21 @@ export type NormalizedMessage = {
};
/**
- * Pagination and provider lookup options for reading persisted session history.
+ * Shared options used to fetch historical provider messages.
*
* Consumers should pass provider-specific lookup hints (`projectPath`) only
* when the selected provider requires them.
*/
export type FetchHistoryOptions = {
/** Claude project folder name. Required by Claude history lookup. */
projectName?: string;
/** Absolute workspace path. Required by Cursor to compute its chat hash. */
projectPath?: string;
/** Page size. `null` means all messages. */
limit?: number | null;
/** Pagination offset from the newest messages. */
offset?: number;
};
/**
- * Provider-neutral history result returned by the unified messages endpoint.
+ * Standardized response payload returned from provider history readers.
*
* Use this as the contract for APIs that return paginated conversation history.
*/
export type FetchHistoryResult = {
messages: NormalizedMessage[];
@@ -98,21 +171,103 @@ export type FetchHistoryResult = {
tokenUsage?: unknown;
};
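The pagination semantics documented on `FetchHistoryOptions` can be sketched directly: `offset` counts back from the newest message and `limit: null` means "return everything". `pageFromNewest` is an illustrative name, not this module's API.

```typescript
// Pages backwards from the newest element; limit === null returns everything
// before the offset window.
function pageFromNewest<T>(
  all: T[],
  limit: number | null = null,
  offset = 0,
): T[] {
  const end = Math.max(0, all.length - offset);
  const start = limit === null ? 0 : Math.max(0, end - limit);
  return all.slice(start, end);
}
```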
-// ---------------------------------------------------------------------------------------------
+// ---------------------------
+//----------------- PROVIDER SKILL TYPES ------------
/**
* Scope where a provider skill definition was discovered.
*
* Provider skill adapters should use this to describe the origin of each
* skill markdown file without leaking provider-specific folder names into route
* contracts. `repo` is used for Codex repository lookup locations, while
* `project` is used for providers that treat workspace-local skills as project
* scoped.
*/
export type ProviderSkillScope = 'user' | 'project' | 'plugin' | 'repo' | 'admin' | 'system';
/**
* Shared input accepted by provider skill listing operations.
*
* Routes pass `workspacePath` when a caller wants project/repository skills for
* a specific folder. Providers should fall back to the backend process cwd when
* this option is omitted.
*/
export type ProviderSkillListOptions = {
workspacePath?: string;
};
/**
* Normalized skill record returned by provider skill adapters.
*
* The `command` value is the exact invocation text the selected provider expects
* for this skill. Claude plugin skills use a namespaced command such as
* `/plugin-name:skill-name`, while Codex skills use the `$skill-name` form.
* `sourcePath` points to the skill markdown file that produced the record so
* callers can distinguish duplicate skill names across scopes.
*/
export type ProviderSkill = {
provider: LLMProvider;
name: string;
description: string;
command: string;
scope: ProviderSkillScope;
sourcePath: string;
pluginName?: string;
pluginId?: string;
};
/**
* Internal source descriptor consumed by shared provider skill discovery logic.
*
* Concrete provider adapters build these records from their native lookup rules.
* The shared skills provider then scans `rootDir` for child skill markdown files
* and uses `commandForSkill` or `commandPrefix` to produce the provider-specific
* invocation command. Set `recursive` only when a provider stores skills under
* arbitrary nested folders below the source root.
*/
export type ProviderSkillSource = {
scope: ProviderSkillScope;
rootDir: string;
recursive?: boolean;
commandPrefix?: '/' | '$';
commandForSkill?: (skillName: string) => string;
pluginName?: string;
pluginId?: string;
};
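The command shapes above, together with the plugin-id guard from the provider fixes (ids like `''` or `@` must never become command namespaces such as `/:skill`), can be tied together in a sketch. The derivation rules and helper names here are illustrative assumptions, not the adapters' exact logic.

```typescript
// Derives a usable plugin name from a raw plugin id, or null when no safe
// name exists (empty string, bare '@', punctuation-only ids, etc.).
function deriveSafePluginName(pluginId?: string): string | null {
  if (!pluginId) return null;
  const name = pluginId.split('@')[0].trim();
  return /^[A-Za-z0-9][A-Za-z0-9_.-]*$/.test(name) ? name : null;
}

// Produces the provider-specific invocation, skipping skills whose source
// cannot yield a valid command.
function commandForSkill(
  skillName: string,
  source: { commandPrefix?: '/' | '$'; pluginId?: string },
): string | null {
  if (source.commandPrefix === '$') return `$${skillName}`; // Codex-style
  const plugin = deriveSafePluginName(source.pluginId);
  return plugin ? `/${plugin}:${skillName}` : null; // Claude plugin namespace
}
```

Returning `null` instead of a malformed command is what lets discovery simply skip a plugin folder rather than surface a broken slash command in the UI.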
// ---------------------------
//----------------- SHARED ERROR TYPES ------------
/**
* Optional metadata used when constructing application-level errors.
*
* `statusCode` should reflect the HTTP response status, while `code` identifies
* the stable machine-readable error category.
*/
export type AppErrorOptions = {
code?: string;
statusCode?: number;
details?: unknown;
};
-// -------------------- MCP related shared types --------------------
+// ---------------------------
+//----------------- MCP TYPES ------------
/**
* Scope where an MCP server definition is stored and resolved.
*
* `user` is global for a user account, `local` is provider-local, and `project`
* is tied to a specific project path.
*/
export type McpScope = 'user' | 'local' | 'project';
/**
* Transport protocol used by an MCP server definition.
*/
export type McpTransport = 'stdio' | 'http' | 'sse';
/**
- * Provider MCP server descriptor normalized for frontend consumption.
+ * Normalized MCP server model exposed to frontend and route handlers.
*
* Provider adapters should map provider-native config to this structure before
* returning results.
*/
export type ProviderMcpServer = {
provider: LLMProvider;
@@ -131,7 +286,10 @@ export type ProviderMcpServer = {
};
/**
- * Shared payload shape for MCP server create/update operations.
+ * Payload for create/update MCP server operations.
*
* Routes and services should accept this type, validate it, and then persist it
* through provider-specific MCP repositories.
*/
export type UpsertProviderMcpServerInput = {
name: string;
@@ -149,18 +307,13 @@ export type UpsertProviderMcpServerInput = {
envHttpHeaders?: Record<string, string>;
};
-// ---------------------------------------------------------------------------------------------
-// -------------------- Provider auth status types --------------------
+// ---------------------------
+//----------------- PROVIDER AUTH TYPES ------------
/**
- * Result of a provider status check (installation + authentication).
+ * Authentication status result returned by provider health checks.
 *
- * installed - Whether the provider's CLI/SDK is available
- * provider - Provider id the status belongs to
- * authenticated - Whether valid credentials exist
- * email - User email or auth method identifier
- * method - Auth method (e.g. 'api_key', 'credentials_file')
- * [error] - Error message if not installed or not authenticated
+ * This shape is consumed by settings/status endpoints to report installation and
+ * credential state for each provider.
*/
export type ProviderAuthStatus = {
installed: boolean;
@@ -170,3 +323,83 @@ export type ProviderAuthStatus = {
method: string | null;
error?: string;
};
// ---------------------------
//----------------- SHARED DATABASE CREDENTIAL TYPES ------------
/**
* Safe credential view returned by credential listing APIs.
*
* This intentionally excludes the raw credential secret while still exposing
* metadata needed for UI rendering and management operations.
*/
export type CredentialPublicRow = {
id: number;
credential_name: string;
credential_type: string;
description: string | null;
created_at: string;
is_active: number;
};
/**
* Result returned after creating a credential record.
*
* Use this return shape when callers need the created id and display metadata,
* but must never receive the stored secret value.
*/
export type CreateCredentialResult = {
id: number | bigint;
credentialName: string;
credentialType: string;
};
// ---------------------------
//----------------- PROJECT PERSISTENCE TYPES ------------
/**
* Canonical project row shape returned by the projects repository.
*
* Use this type whenever backend services need to pass around one database
* project record without leaking raw SQL row typing across modules.
*/
export type ProjectRepositoryRow = {
project_id: string;
project_path: string;
custom_project_name: string | null;
isStarred: number;
isArchived: number;
};
/**
* Result category returned by `projectsDb.createProjectPath`.
*
* `created` means a fresh row was inserted, `reactivated_archived` means an
* existing archived path was accepted and updated, and `active_conflict` means
* an already-active path blocked project creation.
*/
export type CreateProjectPathOutcome =
| 'created'
| 'reactivated_archived'
| 'active_conflict';
/**
* Structured result returned by project-path upsert operations.
*
* Services should use this result to decide whether a request succeeded,
* should return a conflict, or needs follow-up retrieval of row metadata.
*/
export type CreateProjectPathResult = {
outcome: CreateProjectPathOutcome;
project: ProjectRepositoryRow | null;
};
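A sketch of how a service layer might act on the outcome union above. The HTTP status mapping (201/200/409) is an assumption for illustration, not the documented API behavior.

```typescript
type CreateProjectPathOutcome =
  | 'created'
  | 'reactivated_archived'
  | 'active_conflict';

// Exhaustive switch: TypeScript flags a missing case if the union grows.
function statusForOutcome(outcome: CreateProjectPathOutcome): number {
  switch (outcome) {
    case 'created':
      return 201; // fresh row inserted
    case 'reactivated_archived':
      return 200; // archived path accepted and updated
    case 'active_conflict':
      return 409; // already-active path blocks creation
  }
}
```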
/**
* Validation result for user-supplied workspace/project paths.
*
* `resolvedPath` is present only when validation succeeds. `error` is present
* only when validation fails and is suitable for user-facing diagnostics.
*/
export type WorkspacePathValidationResult = {
valid: boolean;
resolvedPath?: string;
error?: string;
};

Some files were not shown because too many files have changed in this diff.