claudecodeui/server/index.js
Haile 44edf94f3a Refactor provider/session architecture to be DB-driven, modular, and sessionId-first across backend and frontend (#715)
* refactor: remove unused exports

* refactor: remove unused fields from project and session objects

* refactor: rename session_names table and related code to sessions for clarity and consistency

* refactor(database): move db into typescript

- Implemented githubTokensDb for managing GitHub tokens with CRUD operations.
- Created notificationPreferencesDb to handle user notification preferences.
- Added projectsDb for project path management and related operations.
- Introduced pushSubscriptionsDb for managing browser push subscriptions.
- Developed scanStateDb to track the last scanned timestamp.
- Established sessionsDb for session management with CRUD functionalities.
- Created userDb for user management, including authentication and onboarding.
- Implemented vapidKeysDb for storing and managing VAPID keys.

feat(database): define schema for new database tables

- Added SQL schema definitions for users, API keys, user credentials, notification preferences, VAPID keys, push subscriptions, projects, sessions, scan state, and app configuration.
- Included necessary indexes for performance optimization.
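
For orientation, a minimal sketch of the sessions table shape, using only columns this change references elsewhere (session id as primary key, project_id, provider, custom_name, updated_at); the shipped schema defines more tables and indexes:

    // Illustrative DDL only; the real definitions live in the schema module.
    import Database from 'better-sqlite3';

    const db = new Database('app.db');
    db.exec(`
      CREATE TABLE IF NOT EXISTS sessions (
        session_id  TEXT PRIMARY KEY,
        project_id  TEXT NOT NULL,
        provider    TEXT NOT NULL,
        custom_name TEXT,
        updated_at  TEXT
      );
      CREATE INDEX IF NOT EXISTS idx_sessions_project_id ON sessions (project_id);
    `);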

refactor(shared): enhance type definitions and utility functions

- Updated shared types and interfaces for improved clarity and consistency.
- Added new types for credential management and provider-specific operations.
- Refined utility functions for better error handling and message normalization.

* feat: added session indexer logic

* perf(projects): lazy-load TaskMaster metadata per selected project

Why:

- /api/projects is a hot path (initial load, sidebar refresh, websocket sync).

- Scanning .taskmaster for every project on each call added avoidable fs I/O and payload size.

- TaskMaster metadata is only needed after selecting a specific project.

- Moving it to a project-scoped endpoint makes loading cost match user intent.

- The UI now hydrates TaskMaster state on selection and keeps it across refresh events.

- This prevents status flicker/regression while still removing global scan overhead.

- Selection fetches are sequence-guarded to block stale async responses on fast switching.

- isManuallyAdded was removed from responses to keep the public project contract minimal.

- Project dumps now use incrementing snapshot files to preserve history for debugging.

What changed:

- Added GET /api/projects/:projectName/taskmaster and getProjectTaskMaster().

- Removed TaskMaster detection from bulk getProjects().

- Added api.projectTaskmaster(...) plus selection-time hydration in frontend contexts.

- Merged cached taskmaster values into refreshed project lists for continuity.

- Removed isManuallyAdded from manual project payloads.
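
A minimal sketch of the selection-time hydration with the sequence guard described above (hydrateTaskMaster and setTaskMaster are hypothetical names; api.projectTaskmaster is the client helper named above, assumed here to resolve to the parsed payload):

    // Only the newest selection's response may write state; responses from
    // earlier, slower fetches are dropped instead of clobbering fresh data.
    let fetchSeq = 0;

    async function hydrateTaskMaster(projectId, setTaskMaster) {
      const seq = ++fetchSeq;                  // stamp this request
      const data = await api.projectTaskmaster(projectId);
      if (seq !== fetchSeq) return;            // superseded by a newer selection
      setTaskMaster(projectId, data);
    }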

* refactor: update import paths for database modules and remove legacy db.js and schema.js files

* refactor(projects): identify projects by DB projectId instead of folder-derived name

GET /api/projects used to scan ~/.claude/projects/ on every request, derive
each project's identity from the encoded folder name, and re-parse JSONL
files to build session lists. Using the folder-derived name as the project
identifier leaked the Claude CLI's on-disk encoding into every API route,
forced every downstream endpoint to re-resolve a real path via JSONL
'cwd' inspection, and made the project list endpoint O(projects x sessions)
on disk I/O.

This change switches the entire API surface to identify projects by the
stable primary key from the 'projects' table and drives the listing
straight from the DB:

- Add projectsDb.getProjectPathById as the canonical projectId -> path
  resolver so routes no longer need to touch the filesystem to figure out
  where a project lives.

- Rewrite getProjects so it reads the project list from the 'projects'
  table and the per-project session list from the 'sessions' table (one
  SELECT per project). No filesystem scanning happens for this endpoint
  anymore, which removes the dependency on ~/.claude/projects existing,
  on Cursor's MD5-hashed chat folders being discoverable, and on Codex's
  JSONL history being on disk. Per the migration spec each session now
  exposes 'summary' sourced from sessions.custom_name, 'messageCount' = 0
  (message counting is not implemented), and sessionMeta.hasMore is
  pinned to false since this endpoint doesn't drive session pagination.
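
  In shape, the rewritten listing reduces to plain DB reads; a sketch assuming
  better-sqlite3 prepared statements (column names are illustrative):

    function getProjectsSketch(db) {
      const projects = db.prepare(
        'SELECT project_id, path, custom_project_name FROM projects WHERE is_archived = 0'
      ).all();
      const sessionsStmt = db.prepare(
        'SELECT session_id, custom_name, updated_at FROM sessions WHERE project_id = ?'
      );
      return projects.map((p) => ({
        projectId: p.project_id,
        path: p.path,
        sessions: sessionsStmt.all(p.project_id).map((s) => ({
          id: s.session_id,
          summary: s.custom_name,        // per the migration spec
          messageCount: 0,               // message counting is not implemented
        })),
        sessionMeta: { hasMore: false }, // this endpoint does not paginate
      }));
    }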

- Introduce id-based wrappers (getSessionsById, renameProjectById,
  deleteSessionById, deleteProjectById, getProjectTaskMasterById) so
  every caller can pass projectId and resolve the real path through the
  DB. renameProjectById also writes to projects.custom_project_name so
  the DB-driven getProjects response reflects renames immediately; it
  keeps project-config.json in sync for any legacy reader that still
  consults the JSON file.

- Migrate every /api/projects/:projectName route in server/index.js,
  server/routes/taskmaster.js, and server/routes/messages.js to
  :projectId, and change server/routes/git.js so the 'project'
  query/body parameter carries a projectId that is resolved through the
  DB before any git command runs. TaskMaster WebSocket broadcasts emit
  'projectId' for the same reason so the frontend can match
  notifications against its current selection without another lookup.

- Delete helpers that existed only to feed the old getProjects path
  (getCursorSessions, getGeminiCliSessions, getProjectTaskMaster) along
  with their unused imports (better-sqlite3's Database,
  applyCustomSessionNames). The legacy folder-name helpers (getSessions,
  renameProject, deleteSession, deleteProject, extractProjectDirectory)
  are kept as internal implementation details of the id-based wrappers
  and of destructive cleanup / conversation search, but they are no
  longer re-exported.

- searchConversations still walks JSONL to produce match snippets (that
  data doesn't live in the DB), but it now includes the resolved
  projectId in each result so the sidebar can cross-reference hits with
  its already loaded project list without a second round-trip.

Frontend migration:

- Project.name is replaced by Project.projectId in src/types/app.ts, and
  ProjectSession.__projectName becomes __projectId so session tagging
  and sidebar state keys stay aligned with the backend identifier.
  Settings continues to use SettingsProject.name for legacy consumers,
  but it is populated from projectId by normalizeProjectForSettings.

- All places that previously indexed per-project state by project.name
  (sidebar expanded/starred/loading/deletingProjects sets,
  additionalSessions map, projectHasMoreOverrides, starredProjects
  localStorage, command history and draft-input localStorage,
  TaskMaster caches) now key on projectId so state survives
  display-name edits and is consistent across the app.

- src/utils/api.js renames every endpoint parameter to projectId, the
  unified messages endpoint takes projectId in its query string, and
  useSessionStore forwards projectId on fetchFromServer / fetchMore /
  refreshFromServer. Git panel, file tree, code editor, PRD editor,
  plugins context, MCP server flows and TaskMaster hooks are all
  updated to pass projectId.

- DEFAULT_PROJECT_FOR_EMPTY_SHELL is updated to carry a 'default'
  projectId sentinel so the empty-shell placeholder still satisfies the
  Project contract.

Bug fix bundled in:

- sessionsDb.setName no longer bumps updated_at when a row already
  exists. Renaming is a label change, not activity, so there is no
  reason for it to reset 'last activity' in the sidebar. It also no
  longer relies on SQLite's CURRENT_TIMESTAMP, which stores a naive
  'YYYY-MM-DD HH:MM:SS' value that JavaScript parses as local time and
  caused renamed sessions to appear shifted backwards by the client's
  UTC offset. When an INSERT actually happens it now writes ISO-8601
  UTC with a 'Z' suffix.

- buildSessionsByProviderFromDb normalizes any legacy naive timestamps
  in the sessions table to ISO-8601 UTC on the way out so rows written
  before this change also render correctly on the client.
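
  Both timestamp fixes reduce to writing and reading explicit UTC; a sketch
  (toIsoUtc is a hypothetical name for the normalization):

    // SQLite's CURRENT_TIMESTAMP yields naive 'YYYY-MM-DD HH:MM:SS', which
    // JavaScript's Date parses as *local* time, shifting values by the
    // client's UTC offset. Write ISO-8601 with a 'Z' suffix instead.
    const nowUtc = () => new Date().toISOString(); // e.g. '2026-04-30T06:19:26.000Z'

    // Read-side normalization for legacy rows written before the fix:
    function toIsoUtc(ts) {
      if (typeof ts === 'string' && /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}$/.test(ts)) {
        return ts.replace(' ', 'T') + 'Z';     // naive value -> explicit UTC
      }
      return ts;                               // already ISO-8601 (or absent)
    }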

Other cleanup:

- Removed the filesystem-first project-discovery comment block at the
  top of server/projects.js and replaced it with a short note that
  describes the new DB-driven flow and lists the few remaining
  filesystem-dependent helpers (message reads, search, destructive
  delete, manual project registration).

- server/modules/providers/index.ts is added as a small barrel so the
  providers module exposes a stable public surface.

Made-with: Cursor

* refactor(projects): reorganize project-related logic into dedicated modules

* refactor(projects): rename getProjects to getProjectsWithSessions

* refactor: update import path for getProjectsWithSessions to include file extension

* refactor: use updated session watcher
In addition, for projects_updated websocket response, send the sessionId instead

* refactor(websocket): move websocket logic to its own module

* refactor(sessions-watcher): remove redundant logging after session sync completion

* refactor(index.js): reorganize code structure

* refactor(index.js): fix import order

* refactor: remove unnecessary GitHub cloning logic from create-workspace endpoint

* refactor: modularize project services and the wizard create/clone flow

Restructure project creation, listing, GitHub clone progress, and TaskMaster
details behind a dedicated TypeScript module under server/modules/projects/,
and align the client wizard with a single path-based flow.

Server / routing
- Remove server/routes/projects.js and mount server/modules/projects/
  projects.routes.ts at /api/projects (still behind authenticateToken).
- Drop duplicate handlers from server/index.js for GET /api/projects and
  GET /api/projects/:projectId/taskmaster; those live on the new router.
- Import WORKSPACES_ROOT and validateWorkspacePath from shared utils in
  index.js instead of the deleted projects route module.

Projects router (projects.routes.ts)
- GET /: list projects with sessions (existing snapshot behavior).
- POST /create-project: validate body, reject legacy workspaceType and
  mixed clone fields, delegate to createProject service, return distinct
  success copy when an archived path is reactivated.
- GET /clone-progress: Server-Sent Events for clone progress/complete/error;
  requires authenticated user id for token resolution; wires startCloneProject.
- GET /:projectId/taskmaster: delegates to getProjectTaskMaster.

Services (new)
- project-management.service.ts: path validation, workspace directory
  creation, persistence via projectsDb.createProjectPath, mapping to API
  project shape; surfaces AppError for validation, conflict, and not-found
  cases; optional dependency injection for tests.
- project-clone.service.ts: validates workspace, resolves GitHub auth
  (stored token or inline token), runs git clone with progress callbacks,
  registers project via createProject on success; sanitizes errors and
  supports cancellation; injectable dependencies for tests.
- projects-has-taskmaster.service.ts: moves TaskMaster detection and
  normalization out of server/projects.js; resolve-by-id and public
  getProjectTaskMaster with structured AppError responses.
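
The "injectable dependencies for tests" seam these services share is essentially defaulted parameters; a sketch with hypothetical names:

    import { promises as fsPromises } from 'fs';
    import { projectsDb } from '../database/index.js'; // path illustrative

    // Production callers take the defaults; tests pass fakes, no module mocking.
    export function makeCreateProject({
      mkdir = fsPromises.mkdir,
      createProjectPath = projectsDb.createProjectPath,
    } = {}) {
      return async (absolutePath) => {
        await mkdir(absolutePath, { recursive: true });
        return createProjectPath(absolutePath);
      };
    }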

Persistence and shared types
- projectsDb.createProjectPath now returns CreateProjectPathResult
  (created | reactivated_archived | active_conflict) using INSERT … ON
  CONFLICT with selective update when the row is archived; normalizes
  display name from path or custom name; repository row typing moves to
  shared ProjectRepositoryRow.
- getProjectPaths() returns only non-archived rows (isArchived = 0).
- shared/types.ts: ProjectRepositoryRow, CreateProjectPathResult/outcome,
  WorkspacePathValidationResult.
- shared/utils.ts: WORKSPACES_ROOT, forbidden path lists, validateWorkspacePath,
  asyncHandler for Express async routes.
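
asyncHandler is presumably the usual Express wrapper; a minimal version for reference:

    // Forward a rejected promise from an async route to Express error
    // middleware, so services can throw AppError instead of per-route try/catch.
    const asyncHandler = (fn) => (req, res, next) =>
      Promise.resolve(fn(req, res, next)).catch(next);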

Legacy cleanup
- server/projects.js: remove detectTaskMasterFolder, normalizeTaskMasterInfo,
  and getProjectTaskMasterById (logic lives in the new service).
- server/routes/agent.js: register external API project paths with
  projectsDb.createProjectPath instead of addProjectManually try/catch;
  treat active_conflict as an existing registration and continue.

Tests
- Add Node test suites for project-management, project-clone, and
  projects-has-taskmaster services; update projects.service test import
  for renamed projects-with-sessions-fetch.service.ts.

Rename
- projects.service.ts → projects-with-sessions-fetch.service.ts;
  re-export from modules/projects/index.ts.

Client (project creation wizard)
- Remove StepTypeSelection and workspaceType from form state and types;
  wizard is two steps (configure path/GitHub auth, then review).
- createWorkspaceRequest → createProjectRequest; clone vs create-only
  inferred from githubUrl (pathUtils / isCloneWorkflow).
- Adjust step indices, WizardProgress, StepConfiguration/Review,
  WorkspacePathField, and src/utils/api.js as needed for the new API.

Docs
- Minor websocket README touch-up.

Net: ~1.6k insertions / ~0.9k deletions across 29 files; behavior is
centralized in typed services with explicit HTTP errors and test seams.

* refactor: remove loading sessions logic from sidebar

* refactor: move project rename to module

* refactor: move project deletion to module

* refactor: move project star state from localStorage to backend

* refactor: implement optimistic UI for project star state management
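
A sketch of the optimistic flow (names hypothetical; the real state lives in the projects context):

    // Flip the star locally first, persist in the background, revert on failure.
    async function toggleStar(projectId, setStarred, persistStar) {
      const flip = (prev) => {
        const next = new Set(prev);
        next.has(projectId) ? next.delete(projectId) : next.add(projectId);
        return next;
      };
      setStarred(flip);
      try {
        await persistStar(projectId);  // backend write
      } catch {
        setStarred(flip);              // roll back the optimistic update
      }
    }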

* feat: optimistic update for session watcher

* fix(projects-state): stop websocket message reprocessing loop

The websocket projects effect in useProjectsState could re-handle the same
latestMessage after local state writes triggered re-renders.

Under bursty websocket traffic, this created an update feedback cycle
that surfaced as 'Maximum update depth exceeded', often from Sidebar.

What changed:
- Added lastHandledMessageRef so each latestMessage object is handled once.
- Added an early return guard when the current message was already handled.
- Made projects updates idempotent by comparing previous and merged payloads
  before calling setProjects.

Result:
- Breaks the effect -> state update -> effect re-entry cycle.
- Reduces redundant renders during rapid projects_updated traffic while
  preserving normal project/session synchronization.
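
In outline (a sketch of the effect inside useProjectsState, assuming React hooks; mergeProjects and deepEqual stand in for the real merge/compare helpers):

    import { useRef, useEffect } from 'react';

    // Inside useProjectsState: latestMessage comes from the websocket hook,
    // setProjects is the projects state setter.
    const lastHandledMessageRef = useRef(null);

    useEffect(() => {
      // Each latestMessage object is handled exactly once.
      if (!latestMessage || lastHandledMessageRef.current === latestMessage) return;
      lastHandledMessageRef.current = latestMessage;
      setProjects((prev) => {
        const merged = mergeProjects(prev, latestMessage);
        // Idempotent write: returning the same reference skips a re-render.
        return deepEqual(prev, merged) ? prev : merged;
      });
    }, [latestMessage]);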

* refactor: optimize project auto-expand logic

* refactor: move projects provider specific logic into respective session providers

* refactor: move rename and delete sessions to modules

* refactor: move fetching messages to module

* fix: remove unused var

Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>

* Potential fix for pull request finding 'Useless assignment to local variable'

Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>

* refactor(projects/sidebar): remove temp snapshot side-effects and simplify session metadata UX

Why this change was needed:
- Project listing had an implicit side effect: every fetch wrote a debug snapshot under `.tmp/project-dumps`.
  That added unnecessary disk I/O to a hot path, introduced hidden runtime behavior, and created maintenance
  overhead for code that was not part of product functionality.
- Keeping snapshot-specific exports/tests around made the projects module API broader than needed and coupled
  tests to temporary/debug behavior instead of user-visible behavior.
- Codex sessions could remain stuck with a placeholder name (`Untitled Codex Session`) even after a real title
  became available from newer sync data, which degraded session discoverability in the UI.
- Sidebar session rows showed duplicated provider branding and long-form relative times, which added visual noise
  and reduced scan speed when many sessions are listed.

What changed:
- Removed temporary projects snapshot dumping from `projects-with-sessions-fetch.service.ts`:
  - deleted snapshot types/helpers and file-write flow
  - removed the write call from `getProjectsWithSessions`
- Removed snapshot-related surface area from `projects/index.ts`.
- Removed the snapshot-focused test `projects.service.test.ts` that only validated removed debug behavior.
- Updated `codex-session-synchronizer.provider.ts` to upgrade session names when an existing session still has
  the placeholder title but a real parsed name is now available.
- Updated `SidebarSessionItem.tsx`:
  - removed duplicate provider logo rendering in each session row
  - moved age indicator to the right side
  - made age indicator fade on hover to prioritize action controls
  - switched to compact relative time format (`<1m`, `Xm`, `Xhr`, `Xd`) for faster list scanning
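
A formatter matching that compact format (a sketch):

    // '<1m', 'Xm', 'Xhr', 'Xd' — short enough to scan in a dense list.
    function compactAge(isoTs, now = Date.now()) {
      const mins = Math.floor((now - Date.parse(isoTs)) / 60000);
      if (mins < 1) return '<1m';
      if (mins < 60) return `${mins}m`;
      const hrs = Math.floor(mins / 60);
      if (hrs < 24) return `${hrs}hr`;
      return `${Math.floor(hrs / 24)}d`;
    }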

Outcome:
- Lower overhead and fewer hidden side effects in project fetches.
- Cleaner module boundaries in projects.
- Better Codex session naming consistency after sync.
- Cleaner sidebar density and clearer hover/action behavior.

* refactor: implement pagination for project sessions loading

* refactor: move search to module

* fix: search performance

* refactor: add handling for internal Codex metadata in conversation search

* fix(migrations,projects,clone): normalize legacy schema before writes and harden conflict detection

Why

- Legacy installs can have a sessions table shape that predates provider/custom_name columns. Running migrateLegacySessionNames first caused its INSERT OR REPLACE INTO sessions (...) to target columns that may not exist and fail during startup migration.

- Some upgraded databases had projects.project_id as plain TEXT instead of a real PRIMARY KEY. That breaks assumptions used by id-based lookups and can allow invalid/duplicate identity semantics over time.

- projectsDb.createProjectPath inferred outcomes from row.isArchived, but the upsert path always returns the post-update row with isArchived=0, so archived-reactivation and fresh-create could be misclassified.

- git clone accepted user-controlled URLs directly in argv position, so inputs beginning with - could be interpreted as options instead of a repository argument.

What

- Added rebuildProjectsTableWithPrimaryKeySchema in migrations: detect table shape via getTableInfo('projects'), verify project_id has pk=1, and rebuild when missing.

- Rebuild flow now creates a canonical projects__new table (project_id TEXT PRIMARY KEY), copies rows with transformation, backfills empty ids via SQLITE_UUID_SQL, deduplicates conflicting ids/paths, then swaps tables inside a transaction.

- Replaced the prior addColumnToTableIfNotExists(...) + UPDATE project_id sequence with PK-aware detection/rebuild logic so legacy DBs converge to the required schema.

- Reordered migration sequence to run rebuildSessionsTableWithProjectSchema before migrateLegacySessionNames, ensuring sessions is normalized before legacy session_names merge writes execute.

- Updated projectsDb.createProjectPath to generate an attemptedId before insert, pass it into the prepared statement, and classify outcomes by comparing the returned row.project_id to attemptedId (created vs reactivated_archived), with the no-row case remaining active_conflict.

- Hardened clone execution by inserting -- before the clone URL in git argv and rejecting normalized GitHub URLs that start with - in startCloneProject.
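
In outline, the outcome classification and the argv hardening look like this (a sketch; the SQL is illustrative and assumes a unique index on projects.path):

    import crypto from 'node:crypto';
    import { spawn } from 'node:child_process';

    // The upsert returns whichever row survives, so compare its id with the
    // id we attempted to insert to tell the three outcomes apart.
    function createProjectPathSketch(db, projectPath) {
      const attemptedId = crypto.randomUUID();
      const row = db.prepare(`
        INSERT INTO projects (project_id, path, is_archived) VALUES (?, ?, 0)
        ON CONFLICT(path) DO UPDATE SET is_archived = 0
          WHERE projects.is_archived = 1
        RETURNING project_id
      `).get(attemptedId, projectPath);
      if (!row) return 'active_conflict';  // active row untouched, no row returned
      return row.project_id === attemptedId ? 'created' : 'reactivated_archived';
    }

    // '--' ends git option parsing, so a URL beginning with '-' can no longer
    // be read as an option; such URLs are also rejected before spawning.
    function cloneSketch(githubUrl, destDir) {
      if (githubUrl.startsWith('-')) throw new Error('INVALID_GITHUB_URL');
      return spawn('git', ['clone', '--', githubUrl, destDir]);
    }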

Tests

- Added integration coverage for projectsDb.createProjectPath branches: fresh insert, archived reactivation, and active conflict.

- Added clone service test for option-prefixed githubUrl rejection (INVALID_GITHUB_URL).

* refactor(session-synchronizer): update last scanned timestamp based on synchronization results

* refactor: improve session limit and offset validation in provider routes
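
The sort of clamping this refers to, sketched (exact bounds are assumptions):

    // Coerce limit/offset query params to safe integers with sane bounds.
    function parsePagination(query) {
      const limit = Number.parseInt(query.limit, 10);
      const offset = Number.parseInt(query.offset, 10);
      return {
        limit: Number.isInteger(limit) && limit > 0 ? Math.min(limit, 100) : 20,
        offset: Number.isInteger(offset) && offset >= 0 ? offset : 0,
      };
    }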

* refactor: normalize project paths across database and service modules

* refactor(database): make session id the primary key in sessions table

* fix(codex): preserve reasoning entries as thinking blocks

Codex history normalization was downgrading reasoning into plain assistant text
because of branch ordering, not because the raw data was missing.

Why this mattered:
- Codex reasoning JSONL entries are intentionally mapped to history items with
  type thinking, but they also carry message.role assistant.
- normalizeHistoryEntry evaluated the assistant-role branch before the
  thinking branch.
- As a result, reasoning content matched the assistant-text path first and was
  emitted as kind text instead of kind thinking.
- This collapses semantic intent, so UI and downstream features that rely on
  thinking blocks (separate rendering, filtering, and interpretation of model
  thought process vs final answer) receive the wrong message kind.

What changed:
- Prioritized thinking detection (raw.type === thinking or raw.isReasoning)
  before role-based assistant normalization.
- Kept a non-empty content guard for thinking payloads to avoid emitting empty
  artifacts.
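
The reordering in sketch form (extractContent is a hypothetical helper; field names follow the description above):

    // Thinking markers are checked before the assistant-role branch, so
    // reasoning entries keep kind 'thinking' even though they also carry
    // message.role === 'assistant'.
    function normalizeHistoryEntrySketch(raw) {
      const content = extractContent(raw);
      if ((raw.type === 'thinking' || raw.isReasoning) && content) {
        return { kind: 'thinking', content };   // non-empty guard retained
      }
      if (raw.message?.role === 'assistant' && content) {
        return { kind: 'text', content };       // unchanged assistant path
      }
      return null;
    }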

Impact:
- Reasoning entries from persisted Codex JSONL now remain thinking blocks
  end-to-end.
- Regular assistant text normalization behavior remains unchanged.

* refactor: remove dead code

* refactor: directly use getProjectPathById from projectsDb

* refactor: add gemini jsonl session support
2026-04-30 08:19:26 +02:00

1502 lines
56 KiB
JavaScript
Executable File

#!/usr/bin/env node
// Load environment variables before other imports execute
import './load-env.js';
import fs, { promises as fsPromises } from 'fs';
import path from 'path';
import os from 'os';
import http from 'http';
import { spawn } from 'child_process';
import express from 'express';
import cors from 'cors';
import mime from 'mime-types';
import { AppError, WORKSPACES_ROOT, validateWorkspacePath } from '@/shared/utils.js';
import { closeSessionsWatcher, initializeSessionsWatcher } from '@/modules/providers/index.js';
import { createWebSocketServer } from '@/modules/websocket/index.js';
import { getConnectableHost } from '../shared/networkHosts.js';
import { findAppRoot, getModuleDir } from './utils/runtime-paths.js';
import {
  queryClaudeSDK,
  abortClaudeSDKSession,
  isClaudeSDKSessionActive,
  getActiveClaudeSDKSessions,
  resolveToolApproval,
  getPendingApprovalsForSession,
  reconnectSessionWriter,
} from './claude-sdk.js';
import {
  spawnCursor,
  abortCursorSession,
  isCursorSessionActive,
  getActiveCursorSessions,
} from './cursor-cli.js';
import {
  queryCodex,
  abortCodexSession,
  isCodexSessionActive,
  getActiveCodexSessions,
} from './openai-codex.js';
import {
  spawnGemini,
  abortGeminiSession,
  isGeminiSessionActive,
  getActiveGeminiSessions,
} from './gemini-cli.js';
import sessionManager from './sessionManager.js';
import {
  stripAnsiSequences,
  normalizeDetectedUrl,
  extractUrlsFromText,
  shouldAutoOpenUrlFromOutput,
} from './utils/url-detection.js';
import gitRoutes from './routes/git.js';
import authRoutes from './routes/auth.js';
import cursorRoutes from './routes/cursor.js';
import taskmasterRoutes from './routes/taskmaster.js';
import mcpUtilsRoutes from './routes/mcp-utils.js';
import commandsRoutes from './routes/commands.js';
import settingsRoutes from './routes/settings.js';
import agentRoutes from './routes/agent.js';
import projectModuleRoutes from './modules/projects/projects.routes.js';
import userRoutes from './routes/user.js';
import geminiRoutes from './routes/gemini.js';
import pluginsRoutes from './routes/plugins.js';
import providerRoutes from './modules/providers/provider.routes.js';
import { startEnabledPluginServers, stopAllPlugins, getPluginPort } from './utils/plugin-process-manager.js';
import { initializeDatabase, projectsDb } from './modules/database/index.js';
import { configureWebPush } from './services/vapid-keys.js';
import { validateApiKey, authenticateToken, authenticateWebSocket } from './middleware/auth.js';
import { IS_PLATFORM } from './constants/config.js';
import { c } from './utils/colors.js';
const __dirname = getModuleDir(import.meta.url);
// The server source runs from /server, while the compiled output runs from /dist-server/server.
// Resolving the app root once keeps every repo-level lookup below aligned across both layouts.
const APP_ROOT = findAppRoot(__dirname);
const installMode = fs.existsSync(path.join(APP_ROOT, '.git')) ? 'git' : 'npm';
console.log('SERVER_PORT from env:', process.env.SERVER_PORT);
const app = express();
const server = http.createServer(app);
// Single WebSocket server that handles chat, shell, and plugin proxy paths.
const wss = createWebSocketServer(server, {
  verifyClient: {
    isPlatform: IS_PLATFORM,
    authenticateWebSocket,
  },
  chat: {
    queryClaudeSDK,
    spawnCursor,
    queryCodex,
    spawnGemini,
    abortClaudeSDKSession,
    abortCursorSession,
    abortCodexSession,
    abortGeminiSession,
    resolveToolApproval,
    isClaudeSDKSessionActive,
    isCursorSessionActive,
    isCodexSessionActive,
    isGeminiSessionActive,
    reconnectSessionWriter,
    getPendingApprovalsForSession,
    getActiveClaudeSDKSessions,
    getActiveCursorSessions,
    getActiveCodexSessions,
    getActiveGeminiSessions,
  },
  shell: {
    getSessionById: (sessionId) => sessionManager.getSession(sessionId),
    stripAnsiSequences,
    normalizeDetectedUrl,
    extractUrlsFromText,
    shouldAutoOpenUrlFromOutput,
  },
  getPluginPort,
});
// Make WebSocket server available to routes
app.locals.wss = wss;
app.use(cors({ exposedHeaders: ['X-Refreshed-Token'] }));
app.use(express.json({
limit: '50mb',
type: (req) => {
// Skip multipart/form-data requests (for file uploads like images)
const contentType = req.headers['content-type'] || '';
if (contentType.includes('multipart/form-data')) {
return false;
}
return contentType.includes('json');
}
}));
app.use(express.urlencoded({ limit: '50mb', extended: true }));
// Public health check endpoint (no authentication required)
app.get('/health', (req, res) => {
res.json({
status: 'ok',
timestamp: new Date().toISOString(),
installMode
});
});
// Optional API key validation (if configured)
app.use('/api', validateApiKey);
// Authentication routes (public)
app.use('/api/auth', authRoutes);
// Projects API Routes (protected)
app.use('/api/projects', authenticateToken, projectModuleRoutes);
// Git API Routes (protected)
app.use('/api/git', authenticateToken, gitRoutes);
// Cursor API Routes (protected)
app.use('/api/cursor', authenticateToken, cursorRoutes);
// TaskMaster API Routes (protected)
app.use('/api/taskmaster', authenticateToken, taskmasterRoutes);
// MCP utilities
app.use('/api/mcp-utils', authenticateToken, mcpUtilsRoutes);
// Commands API Routes (protected)
app.use('/api/commands', authenticateToken, commandsRoutes);
// Settings API Routes (protected)
app.use('/api/settings', authenticateToken, settingsRoutes);
// User API Routes (protected)
app.use('/api/user', authenticateToken, userRoutes);
// Gemini API Routes (protected)
app.use('/api/gemini', authenticateToken, geminiRoutes);
// Plugins API Routes (protected)
app.use('/api/plugins', authenticateToken, pluginsRoutes);
// Unified provider MCP routes (protected)
app.use('/api/providers', authenticateToken, providerRoutes);
// Agent API Routes (uses API key authentication)
app.use('/api/agent', agentRoutes);
// Serve public files (like api-docs.html)
app.use(express.static(path.join(APP_ROOT, 'public')));
// Static files served after API routes
// Add cache control: HTML files should not be cached, but assets can be cached
app.use(express.static(path.join(APP_ROOT, 'dist'), {
setHeaders: (res, filePath) => {
if (filePath.endsWith('.html')) {
// Prevent HTML caching to avoid service worker issues after builds
res.setHeader('Cache-Control', 'no-cache, no-store, must-revalidate');
res.setHeader('Pragma', 'no-cache');
res.setHeader('Expires', '0');
} else if (filePath.match(/\.(js|css|woff2?|ttf|eot|svg|png|jpg|jpeg|gif|ico)$/)) {
// Cache static assets for 1 year (they have hashed names)
res.setHeader('Cache-Control', 'public, max-age=31536000, immutable');
}
}
}));
// API Routes (protected)
// /api/config endpoint removed - no longer needed
// Frontend now uses window.location for WebSocket URLs
// System update endpoint
app.post('/api/system/update', authenticateToken, async (req, res) => {
try {
// Get the project root directory (parent of server directory)
const projectRoot = APP_ROOT;
console.log('Starting system update from directory:', projectRoot);
// Platform deployments use their own update workflow from the project root.
const updateCommand = IS_PLATFORM
// In platform, husky and dev dependencies are not needed
? 'npm run update:platform'
: installMode === 'git'
? 'git checkout main && git pull && npm install'
: 'npm install -g @cloudcli-ai/cloudcli@latest';
const updateCwd = IS_PLATFORM || installMode === 'git'
? projectRoot
: os.homedir();
const child = spawn('sh', ['-c', updateCommand], {
cwd: updateCwd,
env: process.env
});
let output = '';
let errorOutput = '';
child.stdout.on('data', (data) => {
const text = data.toString();
output += text;
console.log('Update output:', text);
});
child.stderr.on('data', (data) => {
const text = data.toString();
errorOutput += text;
console.error('Update error:', text);
});
child.on('close', (code) => {
if (code === 0) {
res.json({
success: true,
output: output || 'Update completed successfully',
message: 'Update completed. Please restart the server to apply changes.'
});
} else {
res.status(500).json({
success: false,
error: 'Update command failed',
output: output,
errorOutput: errorOutput
});
}
});
child.on('error', (error) => {
console.error('Update process error:', error);
res.status(500).json({
success: false,
error: error.message
});
});
} catch (error) {
console.error('System update error:', error);
res.status(500).json({
success: false,
error: error.message
});
}
});
const expandWorkspacePath = (inputPath) => {
if (!inputPath) return inputPath;
if (inputPath === '~') {
return WORKSPACES_ROOT;
}
if (inputPath.startsWith('~/') || inputPath.startsWith('~\\')) {
return path.join(WORKSPACES_ROOT, inputPath.slice(2));
}
return inputPath;
};
// Browse filesystem endpoint for project suggestions - uses existing getFileTree
app.get('/api/browse-filesystem', authenticateToken, async (req, res) => {
try {
const { path: dirPath } = req.query;
console.log('[API] Browse filesystem request for path:', dirPath);
console.log('[API] WORKSPACES_ROOT is:', WORKSPACES_ROOT);
// Default to home directory if no path provided
const defaultRoot = WORKSPACES_ROOT;
let targetPath = dirPath ? expandWorkspacePath(dirPath) : defaultRoot;
// Resolve and normalize the path
targetPath = path.resolve(targetPath);
// Security check - ensure path is within allowed workspace root
const validation = await validateWorkspacePath(targetPath);
if (!validation.valid) {
return res.status(403).json({ error: validation.error });
}
const resolvedPath = validation.resolvedPath || targetPath;
// Security check - ensure path is accessible
try {
await fs.promises.access(resolvedPath);
const stats = await fs.promises.stat(resolvedPath);
if (!stats.isDirectory()) {
return res.status(400).json({ error: 'Path is not a directory' });
}
} catch (err) {
return res.status(404).json({ error: 'Directory not accessible' });
}
// Use existing getFileTree function with shallow depth (only direct children)
const fileTree = await getFileTree(resolvedPath, 1, 0, false); // maxDepth=1, showHidden=false
// Filter only directories and format for suggestions
const directories = fileTree
.filter(item => item.type === 'directory')
.map(item => ({
path: item.path,
name: item.name,
type: 'directory'
}))
.sort((a, b) => {
const aHidden = a.name.startsWith('.');
const bHidden = b.name.startsWith('.');
if (aHidden && !bHidden) return 1;
if (!aHidden && bHidden) return -1;
return a.name.localeCompare(b.name);
});
// Add common directories if browsing home directory
const suggestions = [];
let resolvedWorkspaceRoot = defaultRoot;
try {
resolvedWorkspaceRoot = await fsPromises.realpath(defaultRoot);
} catch (error) {
// Use default root as-is if realpath fails
}
if (resolvedPath === resolvedWorkspaceRoot) {
const commonDirs = ['Desktop', 'Documents', 'Projects', 'Development', 'Dev', 'Code', 'workspace'];
const existingCommon = directories.filter(dir => commonDirs.includes(dir.name));
const otherDirs = directories.filter(dir => !commonDirs.includes(dir.name));
suggestions.push(...existingCommon, ...otherDirs);
} else {
suggestions.push(...directories);
}
res.json({
path: resolvedPath,
suggestions: suggestions
});
} catch (error) {
console.error('Error browsing filesystem:', error);
res.status(500).json({ error: 'Failed to browse filesystem' });
}
});
app.post('/api/create-folder', authenticateToken, async (req, res) => {
try {
const { path: folderPath } = req.body;
if (!folderPath) {
return res.status(400).json({ error: 'Path is required' });
}
const expandedPath = expandWorkspacePath(folderPath);
const resolvedInput = path.resolve(expandedPath);
const validation = await validateWorkspacePath(resolvedInput);
if (!validation.valid) {
return res.status(403).json({ error: validation.error });
}
const targetPath = validation.resolvedPath || resolvedInput;
const parentDir = path.dirname(targetPath);
try {
await fs.promises.access(parentDir);
} catch (err) {
return res.status(404).json({ error: 'Parent directory does not exist' });
}
try {
await fs.promises.access(targetPath);
return res.status(409).json({ error: 'Folder already exists' });
} catch (err) {
// Folder doesn't exist, which is what we want
}
try {
await fs.promises.mkdir(targetPath, { recursive: false });
res.json({ success: true, path: targetPath });
} catch (mkdirError) {
if (mkdirError.code === 'EEXIST') {
return res.status(409).json({ error: 'Folder already exists' });
}
throw mkdirError;
}
} catch (error) {
console.error('Error creating folder:', error);
res.status(500).json({ error: 'Failed to create folder' });
}
});
// Read file content endpoint
app.get('/api/projects/:projectId/file', authenticateToken, async (req, res) => {
try {
const { projectId } = req.params;
const { filePath } = req.query;
// Security: ensure the requested path is inside the project root
if (!filePath) {
return res.status(400).json({ error: 'Invalid file path' });
}
// Resolve the absolute project root via the DB-backed helper; the
// caller passes the DB-assigned `projectId`, not a folder name.
const projectRoot = await projectsDb.getProjectPathById(projectId);
if (!projectRoot) {
return res.status(404).json({ error: 'Project not found' });
}
// Handle both absolute and relative paths
const resolved = path.isAbsolute(filePath)
? path.resolve(filePath)
: path.resolve(projectRoot, filePath);
const normalizedRoot = path.resolve(projectRoot) + path.sep;
if (!resolved.startsWith(normalizedRoot)) {
return res.status(403).json({ error: 'Path must be under project root' });
}
const content = await fsPromises.readFile(resolved, 'utf8');
res.json({ content, path: resolved });
} catch (error) {
console.error('Error reading file:', error);
if (error.code === 'ENOENT') {
res.status(404).json({ error: 'File not found' });
} else if (error.code === 'EACCES') {
res.status(403).json({ error: 'Permission denied' });
} else {
res.status(500).json({ error: error.message });
}
}
});
// Serve raw file bytes for previews and downloads.
app.get('/api/projects/:projectId/files/content', authenticateToken, async (req, res) => {
try {
const { projectId } = req.params;
const { path: filePath } = req.query;
// Security: ensure the requested path is inside the project root
if (!filePath) {
return res.status(400).json({ error: 'Invalid file path' });
}
// Projects are now addressed by DB `projectId`, resolved to their path here.
const projectRoot = await projectsDb.getProjectPathById(projectId);
if (!projectRoot) {
return res.status(404).json({ error: 'Project not found' });
}
// Match the text reader endpoint so callers can pass either project-relative
// or absolute paths without changing how the bytes are served.
const resolved = path.isAbsolute(filePath)
? path.resolve(filePath)
: path.resolve(projectRoot, filePath);
const normalizedRoot = path.resolve(projectRoot) + path.sep;
if (!resolved.startsWith(normalizedRoot)) {
return res.status(403).json({ error: 'Path must be under project root' });
}
// Check if file exists
try {
await fsPromises.access(resolved);
} catch (error) {
return res.status(404).json({ error: 'File not found' });
}
// Get file extension and set appropriate content type
const mimeType = mime.lookup(resolved) || 'application/octet-stream';
res.setHeader('Content-Type', mimeType);
// Stream the file
const fileStream = fs.createReadStream(resolved);
fileStream.pipe(res);
fileStream.on('error', (error) => {
console.error('Error streaming file:', error);
if (!res.headersSent) {
res.status(500).json({ error: 'Error reading file' });
}
});
} catch (error) {
console.error('Error serving binary file:', error);
if (!res.headersSent) {
res.status(500).json({ error: error.message });
}
}
});
// Save file content endpoint
app.put('/api/projects/:projectId/file', authenticateToken, async (req, res) => {
try {
const { projectId } = req.params;
const { filePath, content } = req.body;
// Security: ensure the requested path is inside the project root
if (!filePath) {
return res.status(400).json({ error: 'Invalid file path' });
}
if (content === undefined) {
return res.status(400).json({ error: 'Content is required' });
}
// Projects are now addressed by DB `projectId`, resolved to their path here.
const projectRoot = await projectsDb.getProjectPathById(projectId);
if (!projectRoot) {
return res.status(404).json({ error: 'Project not found' });
}
// Handle both absolute and relative paths
const resolved = path.isAbsolute(filePath)
? path.resolve(filePath)
: path.resolve(projectRoot, filePath);
const normalizedRoot = path.resolve(projectRoot) + path.sep;
if (!resolved.startsWith(normalizedRoot)) {
return res.status(403).json({ error: 'Path must be under project root' });
}
// Write the new content
await fsPromises.writeFile(resolved, content, 'utf8');
res.json({
success: true,
path: resolved,
message: 'File saved successfully'
});
} catch (error) {
console.error('Error saving file:', error);
if (error.code === 'ENOENT') {
res.status(404).json({ error: 'File or directory not found' });
} else if (error.code === 'EACCES') {
res.status(403).json({ error: 'Permission denied' });
} else {
res.status(500).json({ error: error.message });
}
}
});
app.get('/api/projects/:projectId/files', authenticateToken, async (req, res) => {
try {
// Using fsPromises from import
// Resolve the project's absolute path through the DB (projectId is the
// primary key of the `projects` table after the identifier migration).
const actualPath = await projectsDb.getProjectPathById(req.params.projectId);
if (!actualPath) {
return res.status(404).json({ error: 'Project not found' });
}
// Check if path exists
try {
await fsPromises.access(actualPath);
} catch (e) {
return res.status(404).json({ error: `Project path not found: ${actualPath}` });
}
const files = await getFileTree(actualPath, 10, 0, true);
res.json(files);
} catch (error) {
console.error('[ERROR] File tree error:', error.message);
res.status(500).json({ error: error.message });
}
});
// ============================================================================
// FILE OPERATIONS API ENDPOINTS
// ============================================================================
/**
* Validate that a path is within the project root
* @param {string} projectRoot - The project root path
* @param {string} targetPath - The path to validate
* @returns {{ valid: boolean, resolved?: string, error?: string }}
*/
function validatePathInProject(projectRoot, targetPath) {
const resolved = path.isAbsolute(targetPath)
? path.resolve(targetPath)
: path.resolve(projectRoot, targetPath);
const normalizedRoot = path.resolve(projectRoot) + path.sep;
if (!resolved.startsWith(normalizedRoot)) {
return { valid: false, error: 'Path must be under project root' };
}
return { valid: true, resolved };
}
/**
* Validate filename - check for invalid characters
* @param {string} name - The filename to validate
* @returns {{ valid: boolean, error?: string }}
*/
function validateFilename(name) {
if (!name || !name.trim()) {
return { valid: false, error: 'Filename cannot be empty' };
}
// Check for invalid characters (Windows + Unix)
const invalidChars = /[<>:"/\\|?*\x00-\x1f]/;
if (invalidChars.test(name)) {
return { valid: false, error: 'Filename contains invalid characters' };
}
// Check for reserved names (Windows)
const reserved = /^(CON|PRN|AUX|NUL|COM[1-9]|LPT[1-9])$/i;
if (reserved.test(name)) {
return { valid: false, error: 'Filename is a reserved name' };
}
// Check for dots only
if (/^\.+$/.test(name)) {
return { valid: false, error: 'Filename cannot be only dots' };
}
return { valid: true };
}
// POST /api/projects/:projectId/files/create - Create new file or directory
app.post('/api/projects/:projectId/files/create', authenticateToken, async (req, res) => {
try {
const { projectId } = req.params;
const { path: parentPath, type, name } = req.body;
// Validate input
if (!name || !type) {
return res.status(400).json({ error: 'Name and type are required' });
}
if (!['file', 'directory'].includes(type)) {
return res.status(400).json({ error: 'Type must be "file" or "directory"' });
}
const nameValidation = validateFilename(name);
if (!nameValidation.valid) {
return res.status(400).json({ error: nameValidation.error });
}
// Resolve the project directory through the DB using the new projectId.
const projectRoot = await projectsDb.getProjectPathById(projectId);
if (!projectRoot) {
return res.status(404).json({ error: 'Project not found' });
}
// Build and validate target path
const targetDir = parentPath || '';
const targetPath = targetDir ? path.join(targetDir, name) : name;
const validation = validatePathInProject(projectRoot, targetPath);
if (!validation.valid) {
return res.status(403).json({ error: validation.error });
}
const resolvedPath = validation.resolved;
// Check if already exists
try {
await fsPromises.access(resolvedPath);
return res.status(409).json({ error: `${type === 'file' ? 'File' : 'Directory'} already exists` });
} catch {
// Doesn't exist, which is what we want
}
// Create file or directory
if (type === 'directory') {
await fsPromises.mkdir(resolvedPath, { recursive: false });
} else {
// Ensure parent directory exists
const parentDir = path.dirname(resolvedPath);
try {
await fsPromises.access(parentDir);
} catch {
await fsPromises.mkdir(parentDir, { recursive: true });
}
await fsPromises.writeFile(resolvedPath, '', 'utf8');
}
res.json({
success: true,
path: resolvedPath,
name,
type,
message: `${type === 'file' ? 'File' : 'Directory'} created successfully`
});
} catch (error) {
console.error('Error creating file/directory:', error);
if (error.code === 'EACCES') {
res.status(403).json({ error: 'Permission denied' });
} else if (error.code === 'ENOENT') {
res.status(404).json({ error: 'Parent directory not found' });
} else {
res.status(500).json({ error: error.message });
}
}
});
// PUT /api/projects/:projectId/files/rename - Rename file or directory
app.put('/api/projects/:projectId/files/rename', authenticateToken, async (req, res) => {
try {
const { projectId } = req.params;
const { oldPath, newName } = req.body;
// Validate input
if (!oldPath || !newName) {
return res.status(400).json({ error: 'oldPath and newName are required' });
}
const nameValidation = validateFilename(newName);
if (!nameValidation.valid) {
return res.status(400).json({ error: nameValidation.error });
}
// Resolve the project directory through the DB using the new projectId.
const projectRoot = await projectsDb.getProjectPathById(projectId);
if (!projectRoot) {
return res.status(404).json({ error: 'Project not found' });
}
// Validate old path
const oldValidation = validatePathInProject(projectRoot, oldPath);
if (!oldValidation.valid) {
return res.status(403).json({ error: oldValidation.error });
}
const resolvedOldPath = oldValidation.resolved;
// Check if old path exists
try {
await fsPromises.access(resolvedOldPath);
} catch {
return res.status(404).json({ error: 'File or directory not found' });
}
// Build and validate new path
const parentDir = path.dirname(resolvedOldPath);
const resolvedNewPath = path.join(parentDir, newName);
const newValidation = validatePathInProject(projectRoot, resolvedNewPath);
if (!newValidation.valid) {
return res.status(403).json({ error: newValidation.error });
}
// Check if new path already exists
try {
await fsPromises.access(resolvedNewPath);
return res.status(409).json({ error: 'A file or directory with this name already exists' });
} catch {
// Doesn't exist, which is what we want
}
// Rename
await fsPromises.rename(resolvedOldPath, resolvedNewPath);
res.json({
success: true,
oldPath: resolvedOldPath,
newPath: resolvedNewPath,
newName,
message: 'Renamed successfully'
});
} catch (error) {
console.error('Error renaming file/directory:', error);
if (error.code === 'EACCES') {
res.status(403).json({ error: 'Permission denied' });
} else if (error.code === 'ENOENT') {
res.status(404).json({ error: 'File or directory not found' });
} else if (error.code === 'EXDEV') {
res.status(400).json({ error: 'Cannot move across different filesystems' });
} else {
res.status(500).json({ error: error.message });
}
}
});
// DELETE /api/projects/:projectId/files - Delete file or directory
app.delete('/api/projects/:projectId/files', authenticateToken, async (req, res) => {
try {
const { projectId } = req.params;
const { path: targetPath, type } = req.body;
// Validate input
if (!targetPath) {
return res.status(400).json({ error: 'Path is required' });
}
// Resolve the project directory through the DB using the new projectId.
const projectRoot = await projectsDb.getProjectPathById(projectId);
if (!projectRoot) {
return res.status(404).json({ error: 'Project not found' });
}
// Validate path
const validation = validatePathInProject(projectRoot, targetPath);
if (!validation.valid) {
return res.status(403).json({ error: validation.error });
}
const resolvedPath = validation.resolved;
// Check if path exists and get stats
let stats;
try {
stats = await fsPromises.stat(resolvedPath);
} catch {
return res.status(404).json({ error: 'File or directory not found' });
}
// Prevent deleting the project root itself
if (resolvedPath === path.resolve(projectRoot)) {
return res.status(403).json({ error: 'Cannot delete project root directory' });
}
// Delete based on type
if (stats.isDirectory()) {
await fsPromises.rm(resolvedPath, { recursive: true, force: true });
} else {
await fsPromises.unlink(resolvedPath);
}
res.json({
success: true,
path: resolvedPath,
type: stats.isDirectory() ? 'directory' : 'file',
message: 'Deleted successfully'
});
} catch (error) {
console.error('Error deleting file/directory:', error);
if (error.code === 'EACCES') {
res.status(403).json({ error: 'Permission denied' });
} else if (error.code === 'ENOENT') {
res.status(404).json({ error: 'File or directory not found' });
} else if (error.code === 'ENOTEMPTY') {
res.status(400).json({ error: 'Directory is not empty' });
} else {
res.status(500).json({ error: error.message });
}
}
});
// POST /api/projects/:projectId/files/upload - Upload files
// Dynamic import of multer for file uploads
const uploadFilesHandler = async (req, res) => {
// Dynamic import of multer
const multer = (await import('multer')).default;
const uploadMiddleware = multer({
storage: multer.diskStorage({
destination: (req, file, cb) => {
cb(null, os.tmpdir());
},
filename: (req, file, cb) => {
// Use a unique temp name, but preserve original name in file.originalname
// Note: file.originalname may contain path separators for folder uploads
const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);
// For temp file, just use a safe unique name without the path
cb(null, `upload-${uniqueSuffix}`);
}
}),
limits: {
fileSize: 50 * 1024 * 1024, // 50MB limit
files: 20 // Max 20 files at once
}
});
// Use multer middleware
uploadMiddleware.array('files', 20)(req, res, async (err) => {
if (err) {
console.error('Multer error:', err);
if (err.code === 'LIMIT_FILE_SIZE') {
return res.status(400).json({ error: 'File too large. Maximum size is 50MB.' });
}
if (err.code === 'LIMIT_FILE_COUNT') {
return res.status(400).json({ error: 'Too many files. Maximum is 20 files.' });
}
return res.status(500).json({ error: err.message });
}
try {
const { projectId } = req.params;
const { targetPath, relativePaths } = req.body;
// Parse relative paths if provided (for folder uploads)
let filePaths = [];
if (relativePaths) {
try {
filePaths = JSON.parse(relativePaths);
} catch (e) {
console.log('[DEBUG] Failed to parse relativePaths:', relativePaths);
}
}
console.log('[DEBUG] File upload request:', {
projectId,
targetPath: JSON.stringify(targetPath),
targetPathType: typeof targetPath,
filesCount: req.files?.length,
relativePaths: filePaths
});
if (!req.files || req.files.length === 0) {
return res.status(400).json({ error: 'No files provided' });
}
// Resolve the project directory through the DB using the new projectId.
const projectRoot = await projectsDb.getProjectPathById(projectId);
if (!projectRoot) {
return res.status(404).json({ error: 'Project not found' });
}
console.log('[DEBUG] Project root:', projectRoot);
// Validate and resolve target path
// If targetPath is empty or '.', use project root directly
const targetDir = targetPath || '';
let resolvedTargetDir;
console.log('[DEBUG] Target dir:', JSON.stringify(targetDir));
if (!targetDir || targetDir === '.' || targetDir === './') {
// Empty path means upload to project root
resolvedTargetDir = path.resolve(projectRoot);
console.log('[DEBUG] Using project root as target:', resolvedTargetDir);
} else {
const validation = validatePathInProject(projectRoot, targetDir);
if (!validation.valid) {
console.log('[DEBUG] Path validation failed:', validation.error);
return res.status(403).json({ error: validation.error });
}
resolvedTargetDir = validation.resolved;
console.log('[DEBUG] Resolved target dir:', resolvedTargetDir);
}
// Ensure target directory exists
try {
await fsPromises.access(resolvedTargetDir);
} catch {
await fsPromises.mkdir(resolvedTargetDir, { recursive: true });
}
// Move uploaded files from temp to target directory
const uploadedFiles = [];
console.log('[DEBUG] Processing files:', req.files.map(f => ({ originalname: f.originalname, path: f.path })));
for (let i = 0; i < req.files.length; i++) {
const file = req.files[i];
// Use relative path if provided (for folder uploads), otherwise use originalname
const fileName = (filePaths && filePaths[i]) ? filePaths[i] : file.originalname;
console.log('[DEBUG] Processing file:', fileName, '(originalname:', file.originalname + ')');
const destPath = path.join(resolvedTargetDir, fileName);
// Validate destination path
const destValidation = validatePathInProject(projectRoot, destPath);
if (!destValidation.valid) {
console.log('[DEBUG] Destination validation failed for:', destPath);
// Clean up temp file
await fsPromises.unlink(file.path).catch(() => {});
continue;
}
// Ensure parent directory exists (for nested files from folder upload)
const parentDir = path.dirname(destPath);
try {
await fsPromises.access(parentDir);
} catch {
await fsPromises.mkdir(parentDir, { recursive: true });
}
// Move file (copy + unlink to handle cross-device scenarios)
await fsPromises.copyFile(file.path, destPath);
await fsPromises.unlink(file.path);
uploadedFiles.push({
name: fileName,
path: destPath,
size: file.size,
mimeType: file.mimetype
});
}
res.json({
success: true,
files: uploadedFiles,
targetPath: resolvedTargetDir,
message: `Uploaded ${uploadedFiles.length} file(s) successfully`
});
} catch (error) {
console.error('Error uploading files:', error);
// Clean up any remaining temp files
if (req.files) {
for (const file of req.files) {
await fsPromises.unlink(file.path).catch(() => {});
}
}
if (error.code === 'EACCES') {
res.status(403).json({ error: 'Permission denied' });
} else {
res.status(500).json({ error: error.message });
}
}
});
};
app.post('/api/projects/:projectId/files/upload', authenticateToken, uploadFilesHandler);
// Image upload endpoint. Accepts the DB-assigned `projectId` (not a folder name)
// but the current implementation doesn't need to touch the project directory,
// so we just leave the param rename for consistency with the rest of the API.
app.post('/api/projects/:projectId/upload-images', authenticateToken, async (req, res) => {
try {
const multer = (await import('multer')).default;
const path = (await import('path')).default;
const fs = (await import('fs')).promises;
const os = (await import('os')).default;
// Configure multer for image uploads
const storage = multer.diskStorage({
destination: async (req, file, cb) => {
const uploadDir = path.join(os.tmpdir(), 'claude-ui-uploads', String(req.user.id));
await fs.mkdir(uploadDir, { recursive: true });
cb(null, uploadDir);
},
filename: (req, file, cb) => {
const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);
const sanitizedName = file.originalname.replace(/[^a-zA-Z0-9.-]/g, '_');
cb(null, uniqueSuffix + '-' + sanitizedName);
}
});
const fileFilter = (req, file, cb) => {
const allowedMimes = ['image/jpeg', 'image/png', 'image/gif', 'image/webp', 'image/svg+xml'];
if (allowedMimes.includes(file.mimetype)) {
cb(null, true);
} else {
cb(new Error('Invalid file type. Only JPEG, PNG, GIF, WebP, and SVG are allowed.'));
}
};
const upload = multer({
storage,
fileFilter,
limits: {
fileSize: 5 * 1024 * 1024, // 5MB
files: 5
}
});
// Handle multipart form data
upload.array('images', 5)(req, res, async (err) => {
if (err) {
return res.status(400).json({ error: err.message });
}
if (!req.files || req.files.length === 0) {
return res.status(400).json({ error: 'No image files provided' });
}
try {
// Process uploaded images
const processedImages = await Promise.all(
req.files.map(async (file) => {
// Read file and convert to base64
const buffer = await fs.readFile(file.path);
const base64 = buffer.toString('base64');
const mimeType = file.mimetype;
// Clean up temp file immediately
await fs.unlink(file.path);
return {
name: file.originalname,
data: `data:${mimeType};base64,${base64}`,
size: file.size,
mimeType: mimeType
};
})
);
res.json({ images: processedImages });
} catch (error) {
console.error('Error processing images:', error);
// Clean up any remaining files
await Promise.all(req.files.map(f => fs.unlink(f.path).catch(() => { })));
res.status(500).json({ error: 'Failed to process images' });
}
});
} catch (error) {
console.error('Error in image upload endpoint:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
// Get token usage for a specific session. `projectId` is the DB primary key;
// the Claude branch below resolves it to an absolute path via the DB.
app.get('/api/projects/:projectId/sessions/:sessionId/token-usage', authenticateToken, async (req, res) => {
try {
const { projectId, sessionId } = req.params;
const { provider = 'claude' } = req.query;
const homeDir = os.homedir();
// Allow only safe characters in sessionId
const safeSessionId = String(sessionId).replace(/[^a-zA-Z0-9._-]/g, '');
if (!safeSessionId || safeSessionId !== String(sessionId)) {
return res.status(400).json({ error: 'Invalid sessionId' });
}
// Handle Cursor sessions - they use SQLite and don't have token usage info
if (provider === 'cursor') {
return res.json({
used: 0,
total: 0,
breakdown: { input: 0, cacheCreation: 0, cacheRead: 0 },
unsupported: true,
message: 'Token usage tracking not available for Cursor sessions'
});
}
// Handle Gemini sessions - they are raw logs in our current setup
if (provider === 'gemini') {
return res.json({
used: 0,
total: 0,
breakdown: { input: 0, cacheCreation: 0, cacheRead: 0 },
unsupported: true,
message: 'Token usage tracking not available for Gemini sessions'
});
}
// Handle Codex sessions
if (provider === 'codex') {
const codexSessionsDir = path.join(homeDir, '.codex', 'sessions');
// Find the session file by searching for the session ID
const findSessionFile = async (dir) => {
try {
const entries = await fsPromises.readdir(dir, { withFileTypes: true });
for (const entry of entries) {
const fullPath = path.join(dir, entry.name);
if (entry.isDirectory()) {
const found = await findSessionFile(fullPath);
if (found) return found;
} else if (entry.name.includes(safeSessionId) && entry.name.endsWith('.jsonl')) {
return fullPath;
}
}
} catch (error) {
// Skip directories we can't read
}
return null;
};
const sessionFilePath = await findSessionFile(codexSessionsDir);
if (!sessionFilePath) {
return res.status(404).json({ error: 'Codex session file not found', sessionId: safeSessionId });
}
// Read and parse the Codex JSONL file
let fileContent;
try {
fileContent = await fsPromises.readFile(sessionFilePath, 'utf8');
} catch (error) {
if (error.code === 'ENOENT') {
return res.status(404).json({ error: 'Session file not found', path: sessionFilePath });
}
throw error;
}
const lines = fileContent.trim().split('\n');
let totalTokens = 0;
let contextWindow = 200000; // Default for Codex/OpenAI
// Find the latest token_count event with info (scan from end)
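// Illustrative line shape (field names inferred from the checks below;
// real Codex logs may carry additional fields):
//   {"type":"event_msg","payload":{"type":"token_count","info":{
//     "total_token_usage":{"total_tokens":12345},"model_context_window":200000}}}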
for (let i = lines.length - 1; i >= 0; i--) {
try {
const entry = JSON.parse(lines[i]);
// Codex stores token info in event_msg with type: "token_count"
if (entry.type === 'event_msg' && entry.payload?.type === 'token_count' && entry.payload?.info) {
const tokenInfo = entry.payload.info;
if (tokenInfo.total_token_usage) {
totalTokens = tokenInfo.total_token_usage.total_tokens || 0;
}
if (tokenInfo.model_context_window) {
contextWindow = tokenInfo.model_context_window;
}
break; // Stop after finding the latest token count
}
} catch (parseError) {
// Skip lines that can't be parsed
continue;
}
}
return res.json({
used: totalTokens,
total: contextWindow
});
}
// Handle Claude sessions (default)
// Resolve the project path through the DB using the caller-supplied
// `projectId`. Legacy code here called extractProjectDirectory with a
// folder-encoded project name; the migration centralizes that lookup
// in the projects table.
const projectPath = await projectsDb.getProjectPathById(projectId);
if (!projectPath) {
return res.status(404).json({ error: 'Project not found' });
}
// Construct the JSONL file path
// Claude stores session files in ~/.claude/projects/[encoded-project-path]/[session-id].jsonl
// The encoding replaces any non-alphanumeric character (except -) with -
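// e.g. (illustrative): /Users/me/my.app -> -Users-me-my-app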
const encodedPath = projectPath.replace(/[^a-zA-Z0-9-]/g, '-');
const projectDir = path.join(homeDir, '.claude', 'projects', encodedPath);
const jsonlPath = path.join(projectDir, `${safeSessionId}.jsonl`);
// Constrain the resolved path to projectDir (defense in depth; sessionId is already sanitized above)
const rel = path.relative(path.resolve(projectDir), path.resolve(jsonlPath));
if (rel.startsWith('..') || path.isAbsolute(rel)) {
return res.status(400).json({ error: 'Invalid path' });
}
// Read and parse the JSONL file
let fileContent;
try {
fileContent = await fsPromises.readFile(jsonlPath, 'utf8');
} catch (error) {
if (error.code === 'ENOENT') {
return res.status(404).json({ error: 'Session file not found', path: jsonlPath });
}
throw error; // Re-throw other errors to be caught by outer try-catch
}
const lines = fileContent.trim().split('\n');
const parsedContextWindow = parseInt(process.env.CONTEXT_WINDOW, 10);
const contextWindow = Number.isFinite(parsedContextWindow) ? parsedContextWindow : 160000;
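// e.g. setting CONTEXT_WINDOW=200000 in the environment reports a 200k window;
// unset or non-numeric values fall back to 160000.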
let inputTokens = 0;
let cacheCreationTokens = 0;
let cacheReadTokens = 0;
// Find the latest assistant message with usage data (scan from end)
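// Illustrative entry shape (fields inferred from the checks below):
//   {"type":"assistant","message":{"usage":{"input_tokens":1200,
//     "cache_creation_input_tokens":300,"cache_read_input_tokens":45000}}}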
for (let i = lines.length - 1; i >= 0; i--) {
try {
const entry = JSON.parse(lines[i]);
// Only count assistant messages which have usage data
if (entry.type === 'assistant' && entry.message?.usage) {
const usage = entry.message.usage;
// Use token counts from latest assistant message only
inputTokens = usage.input_tokens || 0;
cacheCreationTokens = usage.cache_creation_input_tokens || 0;
cacheReadTokens = usage.cache_read_input_tokens || 0;
break; // Stop after finding the latest assistant message
}
} catch (parseError) {
// Skip lines that can't be parsed
continue;
}
}
// Calculate total context usage (excluding output_tokens, as per ccusage)
const totalUsed = inputTokens + cacheCreationTokens + cacheReadTokens;
res.json({
used: totalUsed,
total: contextWindow,
breakdown: {
input: inputTokens,
cacheCreation: cacheCreationTokens,
cacheRead: cacheReadTokens
}
});
} catch (error) {
console.error('Error reading session token usage:', error);
res.status(500).json({ error: 'Failed to read session token usage' });
}
});
// Serve React app for all other routes (excluding static files)
app.get('*', (req, res) => {
// Skip requests for static assets (files with extensions)
if (path.extname(req.path)) {
return res.status(404).send('Not found');
}
// Only serve index.html for HTML routes, not for static assets
// Static assets should already be handled by express.static middleware above
const indexPath = path.join(APP_ROOT, 'dist', 'index.html');
// Check if dist/index.html exists (production build available)
if (fs.existsSync(indexPath)) {
// Set no-cache headers for HTML to prevent service worker issues
res.setHeader('Cache-Control', 'no-cache, no-store, must-revalidate');
res.setHeader('Pragma', 'no-cache');
res.setHeader('Expires', '0');
res.sendFile(indexPath);
} else {
// In development, redirect to Vite dev server only if dist doesn't exist
const redirectHost = getConnectableHost(req.hostname);
res.redirect(`${req.protocol}://${redirectHost}:${VITE_PORT}`);
}
});
// Global error-handling middleware; must be registered last
app.use((err, req, res, next) => {
if (err instanceof AppError) {
return res.status(err.statusCode).json({
success: false,
error: {
code: err.code,
message: err.message,
details: err.details,
},
});
}
console.error(err);
return res.status(500).json({
success: false,
error: {
code: 'INTERNAL_ERROR',
message: 'Internal server error',
},
});
});
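// Example AppError response body (illustrative code/details; actual values depend on the thrown error):
//   { success: false, error: { code: 'NOT_FOUND', message: 'Project not found', details: { projectId: 42 } } }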
// Helper function to convert permissions to rwx format
function permToRwx(perm) {
const r = perm & 4 ? 'r' : '-';
const w = perm & 2 ? 'w' : '-';
const x = perm & 1 ? 'x' : '-';
return r + w + x;
}
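// e.g. permToRwx(7) === 'rwx', permToRwx(5) === 'r-x', permToRwx(0) === '---'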
async function getFileTree(dirPath, maxDepth = 3, currentDepth = 0, showHidden = true) {
// Recursively build a file tree up to maxDepth, sorted directories-first
const items = [];
try {
const entries = await fsPromises.readdir(dirPath, { withFileTypes: true });
for (const entry of entries) {
// Skip dotfiles unless showHidden is set
if (!showHidden && entry.name.startsWith('.')) continue;
// Skip heavy build directories and VCS directories
if (entry.name === 'node_modules' ||
entry.name === 'dist' ||
entry.name === 'build' ||
entry.name === '.git' ||
entry.name === '.svn' ||
entry.name === '.hg') continue;
const itemPath = path.join(dirPath, entry.name);
const item = {
name: entry.name,
path: itemPath,
type: entry.isDirectory() ? 'directory' : 'file'
};
// Get file stats for additional metadata
try {
const stats = await fsPromises.stat(itemPath);
item.size = stats.size;
item.modified = stats.mtime.toISOString();
// Convert permissions to rwx format
const mode = stats.mode;
const ownerPerm = (mode >> 6) & 7;
const groupPerm = (mode >> 3) & 7;
const otherPerm = mode & 7;
item.permissions = `${ownerPerm}${groupPerm}${otherPerm}`;
item.permissionsRwx = permToRwx(ownerPerm) + permToRwx(groupPerm) + permToRwx(otherPerm);
} catch (statError) {
// If stat fails, provide default values
item.size = 0;
item.modified = null;
item.permissions = '000';
item.permissionsRwx = '---------';
}
if (entry.isDirectory() && currentDepth < maxDepth) {
// Recursively get subdirectories but limit depth
try {
// Check if we can access the directory before trying to read it
await fsPromises.access(item.path, fs.constants.R_OK);
item.children = await getFileTree(item.path, maxDepth, currentDepth + 1, showHidden);
} catch (e) {
// Silently skip directories we can't access (permission denied, etc.)
item.children = [];
}
}
items.push(item);
}
} catch (error) {
// Only log non-permission errors to avoid spam
if (error.code !== 'EACCES' && error.code !== 'EPERM') {
console.error('Error reading directory:', error);
}
}
return items.sort((a, b) => {
if (a.type !== b.type) {
return a.type === 'directory' ? -1 : 1;
}
return a.name.localeCompare(b.name);
});
}
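// Example item shape returned above (illustrative values):
//   { name: 'src', path: '/repo/src', type: 'directory', size: 4096,
//     modified: '2024-01-01T00:00:00.000Z', permissions: '755',
//     permissionsRwx: 'rwxr-xr-x', children: [/* nested items */] }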
const SERVER_PORT = process.env.SERVER_PORT || 3001;
const HOST = process.env.HOST || '0.0.0.0';
const DISPLAY_HOST = getConnectableHost(HOST);
const VITE_PORT = process.env.VITE_PORT || 5173;
// Initialize database and start server
async function startServer() {
try {
// Initialize authentication database
await initializeDatabase();
// Configure Web Push (VAPID keys)
configureWebPush();
// Check if running in production mode (dist folder exists)
const distIndexPath = path.join(APP_ROOT, 'dist', 'index.html');
const isProduction = fs.existsSync(distIndexPath);
// Log Claude implementation mode
console.log(`${c.info('[INFO]')} Using Claude Agents SDK for Claude integration`);
console.log('');
if (isProduction) {
console.log(`${c.info('[INFO]')} To run in production mode, go to http://${DISPLAY_HOST}:${SERVER_PORT}`);
}
console.log(`${c.info('[INFO]')} To run in development mode with hot-module replacement, go to http://${DISPLAY_HOST}:${VITE_PORT}`);
server.listen(SERVER_PORT, HOST, async () => {
const appInstallPath = APP_ROOT;
console.log('');
console.log(c.dim('═'.repeat(63)));
console.log(` ${c.bright('CloudCLI Server - Ready')}`);
console.log(c.dim('═'.repeat(63)));
console.log('');
console.log(`${c.info('[INFO]')} Server URL: ${c.bright('http://' + DISPLAY_HOST + ':' + SERVER_PORT)}`);
console.log(`${c.info('[INFO]')} Installed at: ${c.dim(appInstallPath)}`);
console.log(`${c.tip('[TIP]')} Run "cloudcli status" for full configuration details`);
console.log('');
// Start watching the projects folder for changes
await initializeSessionsWatcher();
// Start server-side plugin processes for enabled plugins
startEnabledPluginServers().catch(err => {
console.error('[Plugins] Error during startup:', err.message);
});
});
// Close the sessions watcher and stop plugin processes on shutdown
const shutdown = async () => {
await closeSessionsWatcher();
await stopAllPlugins();
process.exit(0);
};
process.on('SIGTERM', () => void shutdown());
process.on('SIGINT', () => void shutdown());
} catch (error) {
console.error('[ERROR] Failed to start server:', error);
process.exit(1);
}
}
startServer();