mirror of
https://github.com/siteboon/claudecodeui.git
synced 2026-05-02 02:38:38 +00:00
* refactor: remove unused exports
* refactor: remove unused fields from project and session objects
* refactor: rename session_names table and related code to sessions for clarity and consistency
* refactor(database): move db into typescript
- Implemented githubTokensDb for managing GitHub tokens with CRUD operations.
- Created notificationPreferencesDb to handle user notification preferences.
- Added projectsDb for project path management and related operations.
- Introduced pushSubscriptionsDb for managing browser push subscriptions.
- Developed scanStateDb to track the last scanned timestamp.
- Established sessionsDb for session management with CRUD functionalities.
- Created userDb for user management, including authentication and onboarding.
- Implemented vapidKeysDb for storing and managing VAPID keys.
feat(database): define schema for new database tables
- Added SQL schema definitions for users, API keys, user credentials, notification preferences, VAPID keys, push subscriptions, projects, sessions, scan state, and app configuration.
- Included necessary indexes for performance optimization.
refactor(shared): enhance type definitions and utility functions
- Updated shared types and interfaces for improved clarity and consistency.
- Added new types for credential management and provider-specific operations.
- Refined utility functions for better error handling and message normalization.
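The per-table modules listed above share one shape: a small object exposing CRUD methods over a single table. As a rough, hypothetical sketch of that pattern (the real modules wrap better-sqlite3 prepared statements; here an in-memory Map stands in so the example is self-contained, and the method names are illustrative):

```javascript
// Hypothetical sketch of the per-table module pattern (e.g. scanStateDb).
// A real implementation would wrap better-sqlite3 prepared statements;
// an in-memory Map stands in here.
function createScanStateDb(store = new Map()) {
  return {
    // Return the last scanned timestamp for a provider, or null if never scanned.
    getLastScannedAt(provider) {
      return store.get(provider) ?? null;
    },
    // Record a new scan timestamp (ISO-8601 UTC string).
    setLastScannedAt(provider, isoTimestamp) {
      store.set(provider, isoTimestamp);
    }
  };
}

const scanStateDb = createScanStateDb();
scanStateDb.setLastScannedAt('codex', '2025-01-01T00:00:00Z');
```

Each module owning exactly one table keeps the database surface narrow and makes the later migration work (renames, schema rebuilds) local to one file.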
* feat: added session indexer logic
* perf(projects): lazy-load TaskMaster metadata per selected project
Why:
- /api/projects is a hot path (initial load, sidebar refresh, websocket sync).
- Scanning .taskmaster for every project on each call added avoidable fs I/O and payload size.
- TaskMaster metadata is only needed after selecting a specific project.
- Moving it to a project-scoped endpoint makes loading cost match user intent.
- The UI now hydrates TaskMaster state on selection and keeps it across refresh events.
- This prevents status flicker/regression while still removing global scan overhead.
- Selection fetches are sequence-guarded to block stale async responses on fast switching.
- isManuallyAdded was removed from responses to keep the public project contract minimal.
- Project dumps now use incrementing snapshot files to preserve history for debugging.
What changed:
- Added GET /api/projects/:projectName/taskmaster and getProjectTaskMaster().
- Removed TaskMaster detection from bulk getProjects().
- Added api.projectTaskmaster(...) plus selection-time hydration in frontend contexts.
- Merged cached taskmaster values into refreshed project lists for continuity.
- Removed isManuallyAdded from manual project payloads.
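The sequence guard mentioned above can be sketched as follows (a minimal illustration, not the actual implementation; fetchTaskMaster and applyTaskMaster are hypothetical names): each selection bumps a counter, and a response is applied only if no newer fetch started in the meantime.

```javascript
// Hypothetical sketch of a sequence-guarded selection fetch: stale async
// responses are dropped when a newer selection has superseded them.
function createSelectionFetcher(fetchTaskMaster, applyTaskMaster) {
  let seq = 0;
  return async function select(projectId) {
    const mySeq = ++seq;          // tag this fetch with its sequence number
    const data = await fetchTaskMaster(projectId);
    if (mySeq !== seq) return;    // a newer selection started: drop stale data
    applyTaskMaster(projectId, data);
  };
}
```

This keeps fast project switching from racing: only the latest selection's response ever reaches state.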
* refactor: update import paths for database modules and remove legacy db.js and schema.js files
* refactor(projects): identify projects by DB projectId instead of folder-derived name
GET /api/projects used to scan ~/.claude/projects/ on every request, derive
each project's identity from the encoded folder name, and re-parse JSONL
files to build session lists. Using the folder-derived name as the project
identifier leaked the Claude CLI's on-disk encoding into every API route,
forced every downstream endpoint to re-resolve a real path via JSONL
'cwd' inspection, and made the project list endpoint O(projects x sessions)
on disk I/O.
This change switches the entire API surface to identify projects by the
stable primary key from the 'projects' table and drives the listing
straight from the DB:
- Add projectsDb.getProjectPathById as the canonical projectId -> path
resolver so routes no longer need to touch the filesystem to figure out
where a project lives.
- Rewrite getProjects so it reads the project list from the 'projects'
table and the per-project session list from the 'sessions' table (one
SELECT per project). No filesystem scanning happens for this endpoint
anymore, which removes the dependency on ~/.claude/projects existing,
on Cursor's MD5-hashed chat folders being discoverable, and on Codex's
JSONL history being on disk. Per the migration spec each session now
exposes 'summary' sourced from sessions.custom_name, 'messageCount' = 0
(message counting is not implemented), and sessionMeta.hasMore is
pinned to false since this endpoint doesn't drive session pagination.
- Introduce id-based wrappers (getSessionsById, renameProjectById,
deleteSessionById, deleteProjectById, getProjectTaskMasterById) so
every caller can pass projectId and resolve the real path through the
DB. renameProjectById also writes to projects.custom_project_name so
the DB-driven getProjects response reflects renames immediately; it
keeps project-config.json in sync for any legacy reader that still
consults the JSON file.
- Migrate every /api/projects/:projectName route in server/index.js,
server/routes/taskmaster.js, and server/routes/messages.js to
:projectId, and change server/routes/git.js so the 'project'
query/body parameter carries a projectId that is resolved through the
DB before any git command runs. TaskMaster WebSocket broadcasts emit
'projectId' for the same reason so the frontend can match
notifications against its current selection without another lookup.
- Delete helpers that existed only to feed the old getProjects path
(getCursorSessions, getGeminiCliSessions, getProjectTaskMaster) along
with their unused imports (better-sqlite3's Database,
applyCustomSessionNames). The legacy folder-name helpers (getSessions,
renameProject, deleteSession, deleteProject, extractProjectDirectory)
are kept as internal implementation details of the id-based wrappers
and of destructive cleanup / conversation search, but they are no
longer re-exported.
- searchConversations still walks JSONL to produce match snippets (that
data doesn't live in the DB), but it now includes the resolved
projectId in each result so the sidebar can cross-reference hits with
its already loaded project list without a second round-trip.
Frontend migration:
- Project.name is replaced by Project.projectId in src/types/app.ts, and
ProjectSession.__projectName becomes __projectId so session tagging
and sidebar state keys stay aligned with the backend identifier.
Settings continues to use SettingsProject.name for legacy consumers,
but it is populated from projectId by normalizeProjectForSettings.
- All places that previously indexed per-project state by project.name
(sidebar expanded/starred/loading/deletingProjects sets,
additionalSessions map, projectHasMoreOverrides, starredProjects
localStorage, command history and draft-input localStorage,
TaskMaster caches) now key on projectId so state survives
display-name edits and is consistent across the app.
- src/utils/api.js renames every endpoint parameter to projectId, the
unified messages endpoint takes projectId in its query string, and
useSessionStore forwards projectId on fetchFromServer / fetchMore /
refreshFromServer. Git panel, file tree, code editor, PRD editor,
plugins context, MCP server flows and TaskMaster hooks are all
updated to pass projectId.
- DEFAULT_PROJECT_FOR_EMPTY_SHELL is updated to carry a 'default'
projectId sentinel so the empty-shell placeholder still satisfies the
Project contract.
Bug fix bundled in:
- sessionsDb.setName no longer bumps updated_at when a row already
exists. Renaming is a label change, not activity, so there is no
reason for it to reset 'last activity' in the sidebar. It also no
longer relies on SQLite's CURRENT_TIMESTAMP, which stores a naive
'YYYY-MM-DD HH:MM:SS' value that JavaScript parses as local time and
caused renamed sessions to appear shifted backwards by the client's
UTC offset. When an INSERT actually happens it now writes ISO-8601
UTC with a 'Z' suffix.
- buildSessionsByProviderFromDb normalizes any legacy naive timestamps
in the sessions table to ISO-8601 UTC on the way out so rows written
before this change also render correctly on the client.
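The normalization described above might look roughly like this (an illustrative sketch, with an assumed function name): naive `YYYY-MM-DD HH:MM:SS` values from SQLite's CURRENT_TIMESTAMP are reinterpreted as UTC and emitted as ISO-8601 with a `Z` suffix, while already-zoned values pass through untouched.

```javascript
// Sketch of normalizing SQLite's naive CURRENT_TIMESTAMP format to ISO-8601 UTC
// so JavaScript clients don't parse it as local time. Name is illustrative.
function normalizeSqliteTimestamp(value) {
  // Already ISO-8601 with an explicit zone: pass through unchanged.
  if (/T.*(Z|[+-]\d{2}:\d{2})$/.test(value)) return value;
  // Naive 'YYYY-MM-DD HH:MM:SS': interpret as UTC.
  const m = value.match(/^(\d{4}-\d{2}-\d{2}) (\d{2}:\d{2}:\d{2})$/);
  if (!m) return value; // unknown shape: leave untouched
  return `${m[1]}T${m[2]}.000Z`;
}
```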
Other cleanup:
- Removed the filesystem-first project-discovery comment block at the
top of server/projects.js and replaced it with a short note that
describes the new DB-driven flow and lists the few remaining
filesystem-dependent helpers (message reads, search, destructive
delete, manual project registration).
- server/modules/providers/index.ts is added as a small barrel so the
providers module exposes a stable public surface.
Made-with: Cursor
* refactor(projects): reorganize project-related logic into dedicated modules
* refactor(projects): rename getProjects to getProjectsWithSessions
* refactor: update import path for getProjectsWithSessions to include file extension
* refactor: use updated session watcher
In addition, the projects_updated websocket response now sends the sessionId instead
* refactor(websocket): move websocket logic to its own module
* refactor(sessions-watcher): remove redundant logging after session sync completion
* refactor(index.js): reorganize code structure
* refactor(index.js): fix import order
* refactor: remove unnecessary GitHub cloning logic from create-workspace endpoint
* refactor: modularize project services, and wizard create/clone flow
Restructure project creation, listing, GitHub clone progress, and TaskMaster
details behind a dedicated TypeScript module under server/modules/projects/,
and align the client wizard with a single path-based flow.
Server / routing
- Remove server/routes/projects.js and mount server/modules/projects/
projects.routes.ts at /api/projects (still behind authenticateToken).
- Drop duplicate handlers from server/index.js for GET /api/projects and
GET /api/projects/:projectId/taskmaster; those live on the new router.
- Import WORKSPACES_ROOT and validateWorkspacePath from shared utils in
index.js instead of the deleted projects route module.
Projects router (projects.routes.ts)
- GET /: list projects with sessions (existing snapshot behavior).
- POST /create-project: validate body, reject legacy workspaceType and
mixed clone fields, delegate to createProject service, return distinct
success copy when an archived path is reactivated.
- GET /clone-progress: Server-Sent Events for clone progress/complete/error;
requires authenticated user id for token resolution; wires startCloneProject.
- GET /:projectId/taskmaster: delegates to getProjectTaskMaster.
Services (new)
- project-management.service.ts: path validation, workspace directory
creation, persistence via projectsDb.createProjectPath, mapping to API
project shape; surfaces AppError for validation, conflict, and not-found
cases; optional dependency injection for tests.
- project-clone.service.ts: validates workspace, resolves GitHub auth
(stored token or inline token), runs git clone with progress callbacks,
registers project via createProject on success; sanitizes errors and
supports cancellation; injectable dependencies for tests.
- projects-has-taskmaster.service.ts: moves TaskMaster detection and
normalization out of server/projects.js; resolve-by-id and public
getProjectTaskMaster with structured AppError responses.
Persistence and shared types
- projectsDb.createProjectPath now returns CreateProjectPathResult
(created | reactivated_archived | active_conflict) using INSERT … ON
CONFLICT with selective update when the row is archived; normalizes
display name from path or custom name; repository row typing moves to
shared ProjectRepositoryRow.
- getProjectPaths() returns only non-archived rows (isArchived = 0).
- shared/types.ts: ProjectRepositoryRow, CreateProjectPathResult/outcome,
WorkspacePathValidationResult.
- shared/utils.ts: WORKSPACES_ROOT, forbidden path lists, validateWorkspacePath,
asyncHandler for Express async routes.
Legacy cleanup
- server/projects.js: remove detectTaskMasterFolder, normalizeTaskMasterInfo,
and getProjectTaskMasterById (logic lives in the new service).
- server/routes/agent.js: register external API project paths with
projectsDb.createProjectPath instead of addProjectManually try/catch;
treat active_conflict as an existing registration and continue.
Tests
- Add Node test suites for project-management, project-clone, and
projects-has-taskmaster services; update projects.service test import
for renamed projects-with-sessions-fetch.service.ts.
Rename
- projects.service.ts → projects-with-sessions-fetch.service.ts;
re-export from modules/projects/index.ts.
Client (project creation wizard)
- Remove StepTypeSelection and workspaceType from form state and types;
wizard is two steps (configure path/GitHub auth, then review).
- createWorkspaceRequest → createProjectRequest; clone vs create-only
inferred from githubUrl (pathUtils / isCloneWorkflow).
- Adjust step indices, WizardProgress, StepConfiguration/Review,
WorkspacePathField, and src/utils/api.js as needed for the new API.
Docs
- Minor websocket README touch-up.
Net: ~1.6k insertions / ~0.9k deletions across 29 files; behavior is
centralized in typed services with explicit HTTP errors and test seams.
* refactor: remove loading sessions logic from sidebar
* refactor: move project rename to module
* refactor: move project deletion to module
* refactor: move project star state from localStorage to backend
* refactor: implement optimistic UI for project star state management
* feat: optimistic update for session watcher
* fix(projects-state): stop websocket message reprocessing loop
The websocket projects effect in useProjectsState could re-handle the same
latestMessage after local state writes triggered re-renders.
Under bursty websocket traffic, this created an update feedback cycle
that surfaced as 'Maximum update depth exceeded', often from Sidebar.
What changed:
- Added lastHandledMessageRef so each latestMessage object is handled once.
- Added an early return guard when the current message was already handled.
- Made projects updates idempotent by comparing previous and merged payloads
before calling setProjects.
Result:
- Breaks the effect -> state update -> effect re-entry cycle.
- Reduces redundant renders during rapid projects_updated traffic while
preserving normal project/session synchronization.
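The two guards described above can be sketched framework-free (in the real hook the ref is a React useRef and the write goes through setState; the names here are illustrative):

```javascript
// Sketch of the reprocessing guards: (1) handle each latestMessage object at
// most once via a ref; (2) skip the state write when the payload is unchanged.
function createProjectsMessageHandler(setProjects) {
  const lastHandled = { current: null }; // stands in for a React useRef
  let projects = [];
  return function handleProjectsMessage(message) {
    // Guard 1: early return when this exact message object was already handled.
    if (message === lastHandled.current) return;
    lastHandled.current = message;
    // Guard 2: idempotent update — only write when the merged payload differs.
    const merged = message.projects;
    if (JSON.stringify(merged) === JSON.stringify(projects)) return;
    projects = merged;
    setProjects(merged);
  };
}
```

Together these break the effect -> state update -> effect re-entry cycle while still applying genuinely new payloads.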
* refactor: optimize project auto-expand logic
* refactor: move projects provider specific logic into respective session providers
* refactor: move rename and delete sessions to modules
* refactor: move fetching messages to module
* fix: remove unused var
Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>
* Potential fix for pull request finding 'Useless assignment to local variable'
Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>
* refactor(projects/sidebar): remove temp snapshot side-effects and simplify session metadata UX
Why this change was needed:
- Project listing had an implicit side effect: every fetch wrote a debug snapshot under `.tmp/project-dumps`.
That added unnecessary disk I/O to a hot path, introduced hidden runtime behavior, and created maintenance
overhead for code that was not part of product functionality.
- Keeping snapshot-specific exports/tests around made the projects module API broader than needed and coupled
tests to temporary/debug behavior instead of user-visible behavior.
- Codex sessions could remain stuck with a placeholder name (`Untitled Codex Session`) even after a real title
became available from newer sync data, which degraded session discoverability in the UI.
- Sidebar session rows showed duplicated provider branding and long-form relative times, which added visual noise
and reduced scan speed when many sessions are listed.
What changed:
- Removed temporary projects snapshot dumping from `projects-with-sessions-fetch.service.ts`:
- deleted snapshot types/helpers and file-write flow
- removed the write call from `getProjectsWithSessions`
- Removed snapshot-related surface area from `projects/index.ts`.
- Removed the snapshot-focused test `projects.service.test.ts` that only validated removed debug behavior.
- Updated `codex-session-synchronizer.provider.ts` to upgrade session names when an existing session still has
the placeholder title but a real parsed name is now available.
- Updated `SidebarSessionItem.tsx`:
- removed duplicate provider logo rendering in each session row
- moved age indicator to the right side
- made age indicator fade on hover to prioritize action controls
- switched to compact relative time format (`<1m`, `Xm`, `Xhr`, `Xd`) for faster list scanning
Outcome:
- Lower overhead and fewer hidden side effects in project fetches.
- Cleaner module boundaries in projects.
- Better Codex session naming consistency after sync.
- Cleaner sidebar density and clearer hover/action behavior.
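A compact relative-time formatter like the one described above might look as follows; the output labels (`<1m`, `Xm`, `Xhr`, `Xd`) come from the change, while the exact thresholds are assumptions:

```javascript
// Sketch of the compact relative age format for sidebar session rows.
// Thresholds are assumed; only the label shapes come from the change.
function formatCompactAge(ageMs) {
  const minutes = Math.floor(ageMs / 60_000);
  if (minutes < 1) return '<1m';
  if (minutes < 60) return `${minutes}m`;
  const hours = Math.floor(minutes / 60);
  if (hours < 24) return `${hours}hr`;
  return `${Math.floor(hours / 24)}d`;
}
```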
* refactor: implement pagination for project sessions loading
* refactor: move search to module
* fix: search performance
* refactor: add handling for internal Codex metadata in conversation search
* fix(migrations,projects,clone): normalize legacy schema before writes and harden conflict detection
Why
- Legacy installs can have a sessions table shape that predates provider/custom_name columns. Running migrateLegacySessionNames first caused its INSERT OR REPLACE INTO sessions (...) to target columns that may not exist and fail during startup migration.
- Some upgraded databases had projects.project_id as plain TEXT instead of a real PRIMARY KEY. That breaks assumptions used by id-based lookups and can allow invalid/duplicate identity semantics over time.
- projectsDb.createProjectPath inferred outcomes from row.isArchived, but the upsert path always returns the post-update row with isArchived=0, so archived-reactivation and fresh-create could be misclassified.
- git clone accepted user-controlled URLs directly in argv position, so inputs beginning with - could be interpreted as options instead of a repository argument.
What
- Added rebuildProjectsTableWithPrimaryKeySchema in migrations: detect table shape via getTableInfo('projects'), verify project_id has pk=1, and rebuild when missing.
- Rebuild flow now creates a canonical projects__new table (project_id TEXT PRIMARY KEY), copies rows with transformation, backfills empty ids via SQLITE_UUID_SQL, deduplicates conflicting ids/paths, then swaps tables inside a transaction.
- Replaced the prior addColumnToTableIfNotExists(...) + UPDATE project_id sequence with PK-aware detection/rebuild logic so legacy DBs converge to the required schema.
- Reordered migration sequence to run rebuildSessionsTableWithProjectSchema before migrateLegacySessionNames, ensuring sessions is normalized before legacy session_names merge writes execute.
- Updated projectsDb.createProjectPath to generate an attemptedId before insert, pass it into the prepared statement, and classify outcomes by comparing the returned row.project_id to attemptedId (created vs reactivated_archived), with no-row remaining active_conflict.
- Hardened clone execution by inserting -- before clone URL in git argv and rejecting normalized GitHub URLs that start with - in startCloneProject.
Tests
- Added integration coverage for projectsDb.createProjectPath branches: fresh insert, archived reactivation, and active conflict.
- Added clone service test for option-prefixed githubUrl rejection (INVALID_GITHUB_URL).
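The argv hardening above follows the standard git convention: `--` ends option parsing, so a URL can never be interpreted as a flag, and option-prefixed input is rejected outright. A hypothetical sketch (buildCloneArgs is an illustrative name, not the actual helper):

```javascript
// Sketch of the clone hardening: reject option-prefixed URLs up front and
// place `--` before positional arguments so git treats them as operands.
function buildCloneArgs(githubUrl, targetDir) {
  if (githubUrl.startsWith('-')) {
    throw new Error('INVALID_GITHUB_URL'); // e.g. '--upload-pack=...' injection
  }
  // `--` ensures the URL and directory can never be parsed as git options.
  return ['clone', '--progress', '--', githubUrl, targetDir];
}
```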
* refactor(session-synchronizer): update last scanned timestamp based on synchronization results
* refactor: improve session limit and offset validation in provider routes
* refactor: normalize project paths across database and service modules
* refactor(database): make session id the primary key in sessions table
* fix(codex): preserve reasoning entries as thinking blocks
Codex history normalization was downgrading reasoning into plain assistant text
because of branch ordering, not because the raw data was missing.
Why this mattered:
- Codex reasoning JSONL entries are intentionally mapped to history items with
type thinking, but they also carry message.role assistant.
- normalizeHistoryEntry evaluated the assistant-role branch before the
thinking branch.
- As a result, reasoning content matched the assistant-text path first and was
emitted as kind text instead of kind thinking.
- This collapses semantic intent, so UI and downstream features that rely on
thinking blocks (separate rendering, filtering, and interpretation of model
thought process vs final answer) receive the wrong message kind.
What changed:
- Prioritized thinking detection (raw.type === thinking or raw.isReasoning)
before role-based assistant normalization.
- Kept a non-empty content guard for thinking payloads to avoid emitting empty
artifacts.
Impact:
- Reasoning entries from persisted Codex JSONL now remain thinking blocks
end-to-end.
- Regular assistant text normalization behavior remains unchanged.
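The branch-ordering fix can be illustrated with a simplified sketch (entry shapes and the function body are assumptions; only the ordering rule comes from the commit): thinking detection runs before the assistant-role branch, so reasoning entries that also carry role assistant keep kind thinking.

```javascript
// Simplified sketch of normalizeHistoryEntry with thinking prioritized over
// the assistant-role branch, plus a non-empty content guard.
function normalizeHistoryEntry(raw) {
  // Thinking first: reasoning entries also carry message.role === 'assistant',
  // so this branch must win to preserve kind 'thinking'.
  if ((raw.type === 'thinking' || raw.isReasoning) && raw.content?.trim()) {
    return { kind: 'thinking', content: raw.content };
  }
  if (raw.message?.role === 'assistant') {
    return { kind: 'text', content: raw.message.content ?? '' };
  }
  return null; // nothing renderable
}
```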
* refactor: remove dead code
* refactor: directly use getProjectPathById from projectsDb
* refactor: add gemini jsonl session support
1241 lines
44 KiB
JavaScript
import express from 'express';
import { spawn } from 'child_process';
import path from 'path';
import os from 'os';
import { promises as fs } from 'fs';
import crypto from 'crypto';
import { userDb, apiKeysDb, githubTokensDb, projectsDb } from '../modules/database/index.js';
import { queryClaudeSDK } from '../claude-sdk.js';
import { spawnCursor } from '../cursor-cli.js';
import { queryCodex } from '../openai-codex.js';
import { spawnGemini } from '../gemini-cli.js';
import { Octokit } from '@octokit/rest';
import { CLAUDE_MODELS, CURSOR_MODELS, CODEX_MODELS } from '../../shared/modelConstants.js';
import { IS_PLATFORM } from '../constants/config.js';
import { normalizeProjectPath } from '../shared/utils.js';

const router = express.Router();

/**
 * Middleware to authenticate agent API requests.
 *
 * Supports two authentication modes:
 * 1. Platform mode (IS_PLATFORM=true): For managed/hosted deployments where
 *    authentication is handled by an external proxy. Requests are trusted and
 *    the default user context is used.
 *
 * 2. API key mode (default): For self-hosted deployments where users authenticate
 *    via API keys created in the UI. Keys are validated against the local database.
 */
const validateExternalApiKey = (req, res, next) => {
  // Platform mode: Authentication is handled externally (e.g., by a proxy layer).
  // Trust the request and use the default user context.
  if (IS_PLATFORM) {
    try {
      const user = userDb.getFirstUser();
      if (!user) {
        return res.status(500).json({ error: 'Platform mode: No user found in database' });
      }
      req.user = user;
      return next();
    } catch (error) {
      console.error('Platform mode error:', error);
      return res.status(500).json({ error: 'Platform mode: Failed to fetch user' });
    }
  }

  // Self-hosted mode: Validate API key from header or query parameter
  const apiKey = req.headers['x-api-key'] || req.query.apiKey;

  if (!apiKey) {
    return res.status(401).json({ error: 'API key required' });
  }

  const user = apiKeysDb.validateApiKey(apiKey);

  if (!user) {
    return res.status(401).json({ error: 'Invalid or inactive API key' });
  }

  req.user = user;
  next();
};

/**
 * Get the remote URL of a git repository
 * @param {string} repoPath - Path to the git repository
 * @returns {Promise<string>} - Remote URL of the repository
 */
async function getGitRemoteUrl(repoPath) {
  return new Promise((resolve, reject) => {
    const gitProcess = spawn('git', ['config', '--get', 'remote.origin.url'], {
      cwd: repoPath,
      stdio: ['pipe', 'pipe', 'pipe']
    });

    let stdout = '';
    let stderr = '';

    gitProcess.stdout.on('data', (data) => {
      stdout += data.toString();
    });

    gitProcess.stderr.on('data', (data) => {
      stderr += data.toString();
    });

    gitProcess.on('close', (code) => {
      if (code === 0) {
        resolve(stdout.trim());
      } else {
        reject(new Error(`Failed to get git remote: ${stderr}`));
      }
    });

    gitProcess.on('error', (error) => {
      reject(new Error(`Failed to execute git: ${error.message}`));
    });
  });
}

/**
 * Normalize GitHub URLs for comparison
 * @param {string} url - GitHub URL
 * @returns {string} - Normalized URL
 */
function normalizeGitHubUrl(url) {
  // Remove .git suffix
  let normalized = url.replace(/\.git$/, '');
  // Convert SSH to HTTPS format for comparison
  normalized = normalized.replace(/^git@github\.com:/, 'https://github.com/');
  // Remove trailing slash
  normalized = normalized.replace(/\/$/, '');
  return normalized.toLowerCase();
}

/**
 * Parse GitHub URL to extract owner and repo
 * @param {string} url - GitHub URL (HTTPS or SSH)
 * @returns {{owner: string, repo: string}} - Parsed owner and repo
 */
function parseGitHubUrl(url) {
  // Handle HTTPS URLs: https://github.com/owner/repo or https://github.com/owner/repo.git
  // Handle SSH URLs: git@github.com:owner/repo or git@github.com:owner/repo.git
  const match = url.match(/github\.com[:/]([^/]+)\/([^/]+?)(?:\.git)?$/);
  if (!match) {
    throw new Error('Invalid GitHub URL format');
  }
  return {
    owner: match[1],
    repo: match[2].replace(/\.git$/, '')
  };
}

/**
 * Auto-generate a branch name from a message
 * @param {string} message - The agent message
 * @returns {string} - Generated branch name
 */
function autogenerateBranchName(message) {
  // Convert to lowercase, replace spaces/special chars with hyphens
  let branchName = message
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, '') // Remove special characters
    .replace(/\s+/g, '-') // Replace spaces with hyphens
    .replace(/-+/g, '-') // Replace multiple hyphens with single
    .replace(/^-|-$/g, ''); // Remove leading/trailing hyphens

  // Ensure non-empty fallback
  if (!branchName) {
    branchName = 'task';
  }

  // Generate timestamp suffix (last 6 chars of base36 timestamp)
  const timestamp = Date.now().toString(36).slice(-6);
  const suffix = `-${timestamp}`;

  // Limit length to ensure total length including suffix fits within 50 characters
  const maxBaseLength = 50 - suffix.length;
  if (branchName.length > maxBaseLength) {
    branchName = branchName.substring(0, maxBaseLength);
  }

  // Remove any trailing hyphen after truncation and ensure no leading hyphen
  branchName = branchName.replace(/-$/, '').replace(/^-+/, '');

  // If still empty or starts with hyphen after cleanup, use fallback
  if (!branchName || branchName.startsWith('-')) {
    branchName = 'task';
  }

  // Combine base name with timestamp suffix
  branchName = `${branchName}${suffix}`;

  // Final validation: ensure it matches safe pattern
  if (!/^[a-z0-9]+(?:-[a-z0-9]+)*$/.test(branchName)) {
    // Fallback to deterministic safe name
    return `branch-${timestamp}`;
  }

  return branchName;
}

/**
 * Validate a Git branch name
 * @param {string} branchName - Branch name to validate
 * @returns {{valid: boolean, error?: string}} - Validation result
 */
function validateBranchName(branchName) {
  if (!branchName || branchName.trim() === '') {
    return { valid: false, error: 'Branch name cannot be empty' };
  }

  // Git branch name rules
  const invalidPatterns = [
    { pattern: /^\./, message: 'Branch name cannot start with a dot' },
    { pattern: /\.$/, message: 'Branch name cannot end with a dot' },
    { pattern: /\.\./, message: 'Branch name cannot contain consecutive dots (..)' },
    { pattern: /\s/, message: 'Branch name cannot contain spaces' },
    { pattern: /[~^:?*\[\\]/, message: 'Branch name cannot contain special characters: ~ ^ : ? * [ \\' },
    { pattern: /@{/, message: 'Branch name cannot contain @{' },
    { pattern: /\/$/, message: 'Branch name cannot end with a slash' },
    { pattern: /^\//, message: 'Branch name cannot start with a slash' },
    { pattern: /\/\//, message: 'Branch name cannot contain consecutive slashes' },
    { pattern: /\.lock$/, message: 'Branch name cannot end with .lock' }
  ];

  for (const { pattern, message } of invalidPatterns) {
    if (pattern.test(branchName)) {
      return { valid: false, error: message };
    }
  }

  // Check for ASCII control characters
  if (/[\x00-\x1F\x7F]/.test(branchName)) {
    return { valid: false, error: 'Branch name cannot contain control characters' };
  }

  return { valid: true };
}

/**
 * Get recent commit messages from a repository
 * @param {string} projectPath - Path to the git repository
 * @param {number} limit - Number of commits to retrieve (default: 5)
 * @returns {Promise<string[]>} - Array of commit messages
 */
async function getCommitMessages(projectPath, limit = 5) {
  return new Promise((resolve, reject) => {
    const gitProcess = spawn('git', ['log', `-${limit}`, '--pretty=format:%s'], {
      cwd: projectPath,
      stdio: ['pipe', 'pipe', 'pipe']
    });

    let stdout = '';
    let stderr = '';

    gitProcess.stdout.on('data', (data) => {
      stdout += data.toString();
    });

    gitProcess.stderr.on('data', (data) => {
      stderr += data.toString();
    });

    gitProcess.on('close', (code) => {
      if (code === 0) {
        const messages = stdout.trim().split('\n').filter(msg => msg.length > 0);
        resolve(messages);
      } else {
        reject(new Error(`Failed to get commit messages: ${stderr}`));
      }
    });

    gitProcess.on('error', (error) => {
      reject(new Error(`Failed to execute git: ${error.message}`));
    });
  });
}

/**
 * Create a new branch on GitHub using the API
 * @param {Octokit} octokit - Octokit instance
 * @param {string} owner - Repository owner
 * @param {string} repo - Repository name
 * @param {string} branchName - Name of the new branch
 * @param {string} baseBranch - Base branch to branch from (default: 'main')
 * @returns {Promise<void>}
 */
async function createGitHubBranch(octokit, owner, repo, branchName, baseBranch = 'main') {
  try {
    // Get the SHA of the base branch
    const { data: ref } = await octokit.git.getRef({
      owner,
      repo,
      ref: `heads/${baseBranch}`
    });

    const baseSha = ref.object.sha;

    // Create the new branch
    await octokit.git.createRef({
      owner,
      repo,
      ref: `refs/heads/${branchName}`,
      sha: baseSha
    });

    console.log(`✅ Created branch '${branchName}' on GitHub`);
  } catch (error) {
    if (error.status === 422 && error.message.includes('Reference already exists')) {
      console.log(`ℹ️ Branch '${branchName}' already exists on GitHub`);
    } else {
      throw error;
    }
  }
}

/**
|
||
* Create a pull request on GitHub
|
||
* @param {Octokit} octokit - Octokit instance
|
||
* @param {string} owner - Repository owner
|
||
* @param {string} repo - Repository name
|
||
* @param {string} branchName - Head branch name
|
||
* @param {string} title - PR title
|
||
* @param {string} body - PR body/description
|
||
* @param {string} baseBranch - Base branch (default: 'main')
|
||
* @returns {Promise<{number: number, url: string}>} - PR number and URL
|
||
*/
|
||
async function createGitHubPR(octokit, owner, repo, branchName, title, body, baseBranch = 'main') {
|
||
const { data: pr } = await octokit.pulls.create({
|
||
owner,
|
||
repo,
|
||
title,
|
||
head: branchName,
|
||
base: baseBranch,
|
||
body
|
||
});
|
||
|
||
console.log(`✅ Created pull request #${pr.number}: ${pr.html_url}`);
|
||
|
||
return {
|
||
number: pr.number,
|
||
url: pr.html_url
|
||
};
|
||
}

/**
 * Clone a GitHub repository to a directory
 * @param {string} githubUrl - GitHub repository URL
 * @param {string} githubToken - Optional GitHub token for private repos
 * @param {string} projectPath - Destination path for the clone
 * @returns {Promise<string>} - Path to the cloned repository
 */
async function cloneGitHubRepo(githubUrl, githubToken = null, projectPath) {
  // Validate GitHub URL
  if (!githubUrl || !githubUrl.includes('github.com')) {
    throw new Error('Invalid GitHub URL');
  }

  const cloneDir = path.resolve(projectPath);

  // Check whether the directory already exists
  let dirExists = true;
  try {
    await fs.access(cloneDir);
  } catch {
    dirExists = false; // Directory doesn't exist - proceed with clone
  }

  if (dirExists) {
    // Directory exists - it must be a git repo pointing at the same URL.
    // (Keep these checks outside the access try/catch so their errors
    // surface instead of being swallowed and falling through to the clone.)
    let existingUrl;
    try {
      existingUrl = await getGitRemoteUrl(cloneDir);
    } catch (gitError) {
      throw new Error(`Directory ${cloneDir} already exists but is not a valid git repository or git command failed`);
    }

    if (normalizeGitHubUrl(existingUrl) === normalizeGitHubUrl(githubUrl)) {
      console.log('✅ Repository already exists at path with correct URL');
      return cloneDir;
    }
    throw new Error(`Directory ${cloneDir} already exists with a different repository (${existingUrl}). Expected: ${githubUrl}`);
  }

  // Ensure the parent directory exists
  await fs.mkdir(path.dirname(cloneDir), { recursive: true });

  // Prepare the git clone URL with authentication if a token is provided
  // Example: https://github.com/user/repo -> https://token@github.com/user/repo
  let cloneUrl = githubUrl;
  if (githubToken) {
    cloneUrl = githubUrl.replace('https://github.com', `https://${githubToken}@github.com`);
  }

  console.log('🔄 Cloning repository:', githubUrl);
  console.log('📁 Destination:', cloneDir);

  // Execute git clone and wait for it to finish
  await new Promise((resolve, reject) => {
    const gitProcess = spawn('git', ['clone', '--depth', '1', cloneUrl, cloneDir], {
      stdio: ['pipe', 'pipe', 'pipe']
    });

    let stderr = '';

    gitProcess.stderr.on('data', (data) => {
      stderr += data.toString();
      console.log('Git stderr:', data.toString());
    });

    gitProcess.on('close', (code) => {
      if (code === 0) {
        console.log('✅ Repository cloned successfully');
        resolve();
      } else {
        console.error('❌ Git clone failed:', stderr);
        reject(new Error(`Git clone failed: ${stderr}`));
      }
    });

    gitProcess.on('error', (error) => {
      reject(new Error(`Failed to execute git: ${error.message}`));
    });
  });

  return cloneDir;
}

/**
 * Clean up a temporary project directory and its Claude session
 * @param {string} projectPath - Path to the project directory
 * @param {string} sessionId - Session ID to clean up
 */
async function cleanupProject(projectPath, sessionId = null) {
  try {
    // Only clean up projects in the external-projects directory
    if (!projectPath.includes('.claude/external-projects')) {
      console.warn('⚠️ Refusing to clean up non-external project:', projectPath);
      return;
    }

    console.log('🧹 Cleaning up project:', projectPath);
    await fs.rm(projectPath, { recursive: true, force: true });
    console.log('✅ Project cleaned up');

    // Also clean up the Claude session directory if a sessionId was provided
    if (sessionId) {
      try {
        const sessionPath = path.join(os.homedir(), '.claude', 'sessions', sessionId);
        console.log('🧹 Cleaning up session directory:', sessionPath);
        await fs.rm(sessionPath, { recursive: true, force: true });
        console.log('✅ Session directory cleaned up');
      } catch (error) {
        console.error('⚠️ Failed to clean up session directory:', error.message);
      }
    }
  } catch (error) {
    console.error('❌ Failed to clean up project:', error);
  }
}

/**
 * SSE Stream Writer - Adapts SDK/CLI output to Server-Sent Events
 */
class SSEStreamWriter {
  constructor(res, userId = null) {
    this.res = res;
    this.sessionId = null;
    this.userId = userId;
    this.isSSEStreamWriter = true; // Marker for transport detection
  }

  send(data) {
    if (this.res.writableEnded) {
      return;
    }

    // Format as SSE - providers send raw objects, we stringify
    this.res.write(`data: ${JSON.stringify(data)}\n\n`);
  }

  end() {
    if (!this.res.writableEnded) {
      this.res.write('data: {"type":"done"}\n\n');
      this.res.end();
    }
  }

  setSessionId(sessionId) {
    this.sessionId = sessionId;
    this.send({ type: 'session-id', sessionId });
  }

  getSessionId() {
    return this.sessionId;
  }
}
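Each `SSEStreamWriter.send()` call emits exactly one `data: <json>\n\n` frame, so a consumer can recover the event objects by splitting on blank lines. A minimal client-side sketch (illustrative only, not part of this module; it assumes every frame carries JSON, which holds for this endpoint):

```javascript
// Illustrative sketch: parse the `data: <json>` frames produced by
// SSEStreamWriter.send() and end() into an array of event objects.
function parseSSEFrames(text) {
  return text
    .split('\n\n')
    .filter((frame) => frame.startsWith('data: '))
    .map((frame) => JSON.parse(frame.slice('data: '.length)));
}
```

A real client would apply this incrementally to streamed chunks (buffering partial frames) rather than to one complete buffer.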

/**
 * Non-streaming response collector
 */
class ResponseCollector {
  constructor(userId = null) {
    this.messages = [];
    this.sessionId = null;
    this.userId = userId;
  }

  send(data) {
    // Store ALL messages for now - we'll filter when returning
    this.messages.push(data);

    // Extract sessionId if present
    if (typeof data === 'string') {
      try {
        const parsed = JSON.parse(data);
        if (parsed.sessionId) {
          this.sessionId = parsed.sessionId;
        }
      } catch (e) {
        // Not JSON, ignore
      }
    } else if (data && data.sessionId) {
      this.sessionId = data.sessionId;
    }
  }

  end() {
    // Do nothing - we'll collect all messages
  }

  setSessionId(sessionId) {
    this.sessionId = sessionId;
  }

  getSessionId() {
    return this.sessionId;
  }

  getMessages() {
    return this.messages;
  }

  /**
   * Get filtered assistant messages only
   */
  getAssistantMessages() {
    const assistantMessages = [];

    for (const msg of this.messages) {
      // Skip the initial status message
      if (msg && msg.type === 'status') {
        continue;
      }

      // Normalize JSON strings to objects so raw-object and string
      // messages are filtered the same way (matching getTotalTokens)
      let data = msg;
      if (typeof msg === 'string') {
        try {
          data = JSON.parse(msg);
        } catch (e) {
          continue; // Not JSON, skip
        }
      }

      // Only include claude-response messages with assistant type
      if (data && data.type === 'claude-response' && data.data && data.data.type === 'assistant') {
        assistantMessages.push(data.data);
      }
    }

    return assistantMessages;
  }

  /**
   * Calculate total tokens from all messages
   */
  getTotalTokens() {
    let totalInput = 0;
    let totalOutput = 0;
    let totalCacheRead = 0;
    let totalCacheCreation = 0;

    for (const msg of this.messages) {
      let data = msg;

      // Parse if string
      if (typeof msg === 'string') {
        try {
          data = JSON.parse(msg);
        } catch (e) {
          continue;
        }
      }

      // Extract usage from claude-response messages
      if (data && data.type === 'claude-response' && data.data) {
        const msgData = data.data;
        if (msgData.message && msgData.message.usage) {
          const usage = msgData.message.usage;
          totalInput += usage.input_tokens || 0;
          totalOutput += usage.output_tokens || 0;
          totalCacheRead += usage.cache_read_input_tokens || 0;
          totalCacheCreation += usage.cache_creation_input_tokens || 0;
        }
      }
    }

    return {
      inputTokens: totalInput,
      outputTokens: totalOutput,
      cacheReadTokens: totalCacheRead,
      cacheCreationTokens: totalCacheCreation,
      totalTokens: totalInput + totalOutput + totalCacheRead + totalCacheCreation
    };
  }
}
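The token roll-up in `getTotalTokens()` can be checked in isolation. A self-contained sketch of the same arithmetic (the `sumUsage` helper is illustrative and not part of the module; field names follow the usage payload carried inside `claude-response` messages):

```javascript
// Illustrative stand-alone version of the usage roll-up performed by
// ResponseCollector.getTotalTokens(). Each message may carry a usage
// object under data.message.usage; anything else is skipped.
function sumUsage(messages) {
  const totals = { inputTokens: 0, outputTokens: 0, cacheReadTokens: 0, cacheCreationTokens: 0 };
  for (const msg of messages) {
    const usage = msg && msg.data && msg.data.message && msg.data.message.usage;
    if (!usage) continue;
    totals.inputTokens += usage.input_tokens || 0;
    totals.outputTokens += usage.output_tokens || 0;
    totals.cacheReadTokens += usage.cache_read_input_tokens || 0;
    totals.cacheCreationTokens += usage.cache_creation_input_tokens || 0;
  }
  totals.totalTokens = totals.inputTokens + totals.outputTokens +
    totals.cacheReadTokens + totals.cacheCreationTokens;
  return totals;
}
```

This mirrors the non-streaming response's `tokens` object, e.g. 150 input + 50 output tokens yield `totalTokens: 200` as in the documented example below.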

// ===============================
// External API Endpoint
// ===============================

/**
 * POST /api/agent
 *
 * Trigger an AI agent (Claude, Cursor, Codex, or Gemini) to work on a project.
 * Supports automatic GitHub branch and pull request creation after successful completion.
 *
 * ================================================================================================
 * REQUEST BODY PARAMETERS
 * ================================================================================================
 *
 * @param {string} githubUrl - (Conditionally Required) GitHub repository URL to clone.
 *        Supported formats:
 *        - HTTPS: https://github.com/owner/repo
 *        - HTTPS with .git: https://github.com/owner/repo.git
 *        - SSH: git@github.com:owner/repo
 *        - SSH with .git: git@github.com:owner/repo.git
 *
 * @param {string} projectPath - (Conditionally Required) Path to an existing project OR destination for cloning.
 *        Behavior depends on usage:
 *        - If used alone: Must point to an existing project directory
 *        - If used with githubUrl: Target location for cloning
 *        - If omitted with githubUrl: Auto-generates a temporary path in ~/.claude/external-projects/
 *
 * @param {string} message - (Required) Task description for the AI agent. Used as:
 *        - Instructions for the agent
 *        - Source for auto-generated branch names (if createBranch=true and no branchName)
 *        - Fallback for the PR title if no commits are made
 *
 * @param {string} provider - (Optional) AI provider to use. Options: 'claude' | 'cursor' | 'codex' | 'gemini'
 *        Default: 'claude'
 *
 * @param {string} sessionId - (Optional) Existing session ID to resume; passed through to the selected provider.
 *
 * @param {boolean} stream - (Optional) Enable Server-Sent Events (SSE) streaming for real-time updates.
 *        Default: true
 *        - true: Returns text/event-stream with incremental updates
 *        - false: Returns a complete JSON response after completion
 *
 * @param {string} model - (Optional) Model identifier for providers.
 *
 *        Claude models: 'sonnet' (default), 'opus', 'haiku', 'opusplan', 'sonnet[1m]'
 *        Cursor models: 'gpt-5' (default), 'gpt-5.2', 'gpt-5.2-high', 'sonnet-4.5', 'opus-4.5',
 *                       'gemini-3-pro', 'composer-1', 'auto', 'gpt-5.1', 'gpt-5.1-high',
 *                       'gpt-5.1-codex', 'gpt-5.1-codex-high', 'gpt-5.1-codex-max',
 *                       'gpt-5.1-codex-max-high', 'opus-4.1', 'grok', and thinking variants
 *        Codex models: 'gpt-5.2' (default), 'gpt-5.1-codex-max', 'o3', 'o4-mini'
 *
 * @param {boolean} cleanup - (Optional) Auto-cleanup the project directory after completion.
 *        Default: true
 *        Behavior:
 *        - Only applies when cloning via githubUrl (not for existing projectPath)
 *        - Deletes the cloned repository after 5 seconds
 *        - Also deletes the associated Claude session directory
 *        - Remote branch and PR remain on GitHub if created
 *
 * @param {string} githubToken - (Optional) GitHub Personal Access Token for authentication.
 *        Overrides the stored token from user settings.
 *        Required for:
 *        - Private repositories
 *        - Branch/PR creation features
 *        The token must have 'repo' scope for full functionality.
 *
 * @param {string} branchName - (Optional) Custom name for the Git branch.
 *        If provided, createBranch is automatically set to true.
 *        Validation rules (errors returned if violated):
 *        - Cannot be empty or whitespace only
 *        - Cannot start or end with a dot (.)
 *        - Cannot contain consecutive dots (..)
 *        - Cannot contain spaces
 *        - Cannot contain special characters: ~ ^ : ? * [ \
 *        - Cannot contain @{
 *        - Cannot start or end with a forward slash (/)
 *        - Cannot contain consecutive slashes (//)
 *        - Cannot end with .lock
 *        - Cannot contain ASCII control characters
 *        Examples: 'feature/user-auth', 'bugfix/login-error', 'refactor/db-optimization'
 *
 * @param {boolean} createBranch - (Optional) Create a new Git branch after successful agent completion.
 *        Default: false (or true if branchName is provided)
 *        Behavior:
 *        - Creates the branch locally and pushes it to the remote
 *        - If the branch exists locally: Checks out the existing branch (no error)
 *        - If the branch exists on the remote: Uses the existing branch (no error)
 *        - Branch name: Custom (if branchName provided) or auto-generated from message
 *        - Requires either githubUrl OR projectPath with a GitHub remote
 *
 * @param {boolean} createPR - (Optional) Create a GitHub Pull Request after successful completion.
 *        Default: false
 *        Behavior:
 *        - PR title: First commit message (or fallback to the message parameter)
 *        - PR description: Auto-generated from all commit messages
 *        - Base branch: Always 'main' (currently hardcoded)
 *        - If the PR already exists: GitHub returns an error with details
 *        - Requires either githubUrl OR projectPath with a GitHub remote
 *
 * ================================================================================================
 * PATH HANDLING BEHAVIOR
 * ================================================================================================
 *
 * Scenario 1: Only githubUrl provided
 *   Input: { githubUrl: "https://github.com/owner/repo" }
 *   Action: Clones to an auto-generated temporary path: ~/.claude/external-projects/<hash>/
 *   Cleanup: Yes (if cleanup=true)
 *
 * Scenario 2: Only projectPath provided
 *   Input: { projectPath: "/home/user/my-project" }
 *   Action: Uses the existing project at the specified path
 *   Validation: Path must exist and be accessible
 *   Cleanup: No (existing projects are never cleaned up)
 *
 * Scenario 3: Both githubUrl and projectPath provided
 *   Input: { githubUrl: "https://github.com/owner/repo", projectPath: "/custom/path" }
 *   Action: Clones githubUrl to the projectPath location
 *   Validation:
 *   - If projectPath exists with a git repo:
 *     - Compares the remote URL with githubUrl
 *     - If the URLs match: Reuses the existing repo
 *     - If the URLs differ: Returns an error
 *   Cleanup: Yes (if cleanup=true)
 *
 * ================================================================================================
 * GITHUB BRANCH/PR CREATION REQUIREMENTS
 * ================================================================================================
 *
 * For createBranch or createPR to work, one of the following must be true:
 *
 * Option A: githubUrl provided
 *   - Repository URL directly specified
 *   - Works with both cloning and existing paths
 *
 * Option B: projectPath with a GitHub remote
 *   - Project must be a Git repository
 *   - Must have an 'origin' remote configured
 *   - Remote URL must point to github.com
 *   - The system auto-detects the GitHub URL via: git remote get-url origin
 *
 * Additional requirements:
 * - Valid GitHub token (from settings or the githubToken parameter)
 * - Token must have 'repo' scope for private repos
 * - Project must have commits (for PR creation)
 *
 * ================================================================================================
 * VALIDATION & ERROR HANDLING
 * ================================================================================================
 *
 * Input validations (400 Bad Request):
 * - Either githubUrl OR projectPath must be provided (not neither)
 * - message must be a non-empty string
 * - provider must be 'claude', 'cursor', 'codex', or 'gemini'
 * - createBranch/createPR requires githubUrl OR projectPath (not neither)
 * - branchName must pass Git naming rules (if provided)
 *
 * Runtime validations (500 Internal Server Error or a specific error in the response):
 * - projectPath must exist (if used alone)
 * - GitHub URL format must be valid
 * - Git remote URL must include github.com (for projectPath + branch/PR)
 * - GitHub token must be available (for private repos and branch/PR)
 * - Directory conflicts are handled (existing path with a different repo)
 *
 * Branch name validation errors (returned in the response, not as an HTTP error):
 *   Invalid names return: { branch: { error: "Invalid branch name: <reason>" } }
 *   Examples:
 *   - "my branch" → "Branch name cannot contain spaces"
 *   - ".feature" → "Branch name cannot start with a dot"
 *   - "feature.lock" → "Branch name cannot end with .lock"
 *
 * ================================================================================================
 * RESPONSE FORMATS
 * ================================================================================================
 *
 * Streaming response (stream=true):
 *   Content-Type: text/event-stream
 *   Events:
 *   - { type: "status", message: "...", projectPath: "..." }
 *   - { type: "claude-response", data: {...} }
 *   - { type: "github-branch", branch: { name: "...", url: "..." } }
 *   - { type: "github-pr", pullRequest: { number: 42, url: "..." } }
 *   - { type: "github-error", error: "..." }
 *   - { type: "done" }
 *
 * Non-streaming response (stream=false):
 *   Content-Type: application/json
 *   {
 *     success: true,
 *     sessionId: "session-123",
 *     messages: [...],          // Assistant messages only (filtered)
 *     tokens: {
 *       inputTokens: 150,
 *       outputTokens: 50,
 *       cacheReadTokens: 0,
 *       cacheCreationTokens: 0,
 *       totalTokens: 200
 *     },
 *     projectPath: "/path/to/project",
 *     branch: {                 // Only if createBranch=true
 *       name: "feature/xyz",
 *       url: "https://github.com/owner/repo/tree/feature/xyz"
 *     } | { error: "..." },
 *     pullRequest: {            // Only if createPR=true
 *       number: 42,
 *       url: "https://github.com/owner/repo/pull/42"
 *     } | { error: "..." }
 *   }
 *
 * Error response:
 *   HTTP Status: 400, 401, 500
 *   Content-Type: application/json
 *   { success: false, error: "Error description" }
 *
 * ================================================================================================
 * EXAMPLES
 * ================================================================================================
 *
 * Example 1: Clone and process with auto-cleanup
 *   POST /api/agent
 *   { "githubUrl": "https://github.com/user/repo", "message": "Fix bug" }
 *
 * Example 2: Use an existing project with a custom branch and PR
 *   POST /api/agent
 *   {
 *     "projectPath": "/home/user/project",
 *     "message": "Add feature",
 *     "branchName": "feature/new-feature",
 *     "createPR": true
 *   }
 *
 * Example 3: Clone to a specific path with an auto-generated branch
 *   POST /api/agent
 *   {
 *     "githubUrl": "https://github.com/user/repo",
 *     "projectPath": "/tmp/work",
 *     "message": "Refactor code",
 *     "createBranch": true,
 *     "cleanup": false
 *   }
 */
router.post('/', validateExternalApiKey, async (req, res) => {
  const { githubUrl, projectPath, message, provider = 'claude', model, githubToken, branchName, sessionId } = req.body;

  // Parse stream and cleanup as booleans (handle string "true"/"false" from curl)
  const stream = req.body.stream === undefined ? true : (req.body.stream === true || req.body.stream === 'true');
  const cleanup = req.body.cleanup === undefined ? true : (req.body.cleanup === true || req.body.cleanup === 'true');

  // If branchName is provided, automatically enable createBranch
  const createBranch = branchName ? true : (req.body.createBranch === true || req.body.createBranch === 'true');
  const createPR = req.body.createPR === true || req.body.createPR === 'true';

  // Validate inputs
  if (!githubUrl && !projectPath) {
    return res.status(400).json({ error: 'Either githubUrl or projectPath is required' });
  }

  if (!message || !message.trim()) {
    return res.status(400).json({ error: 'message is required' });
  }

  if (!['claude', 'cursor', 'codex', 'gemini'].includes(provider)) {
    return res.status(400).json({ error: 'provider must be "claude", "cursor", "codex", or "gemini"' });
  }

  // Validate GitHub branch/PR creation requirements.
  // Branch/PR creation is allowed with projectPath as long as it has a GitHub remote.
  if ((createBranch || createPR) && !githubUrl && !projectPath) {
    return res.status(400).json({ error: 'createBranch and createPR require either githubUrl or projectPath with a GitHub remote' });
  }

  let finalProjectPath = null;
  let writer = null;

  try {
    // Determine the final project path
    if (githubUrl) {
      // Clone the repository (to projectPath if provided, otherwise generate a path)
      const tokenToUse = githubToken || githubTokensDb.getActiveGithubToken(req.user.id);

      let targetPath;
      if (projectPath) {
        targetPath = projectPath;
      } else {
        // Generate a unique path for cloning
        const repoHash = crypto.createHash('md5').update(githubUrl + Date.now()).digest('hex');
        targetPath = path.join(os.homedir(), '.claude', 'external-projects', repoHash);
      }

      finalProjectPath = await cloneGitHubRepo(githubUrl.trim(), tokenToUse, targetPath);
    } else {
      // Use the existing project path
      finalProjectPath = normalizeProjectPath(path.resolve(projectPath));

      // Verify the path exists
      try {
        await fs.access(finalProjectPath);
      } catch (error) {
        throw new Error(`Project path does not exist: ${finalProjectPath}`);
      }
    }

    finalProjectPath = normalizeProjectPath(finalProjectPath);

    // Register the project path in the DB (or reuse an existing active registration)
    const registrationResult = projectsDb.createProjectPath(finalProjectPath, null);
    if (registrationResult.outcome === 'active_conflict') {
      console.log('Project registration already exists for:', finalProjectPath);
    } else {
      console.log('Project registered:', registrationResult.project);
    }

    // Set up the writer based on streaming mode
    if (stream) {
      // Set up SSE headers for streaming
      res.setHeader('Content-Type', 'text/event-stream');
      res.setHeader('Cache-Control', 'no-cache');
      res.setHeader('Connection', 'keep-alive');
      res.setHeader('X-Accel-Buffering', 'no'); // Disable nginx buffering

      writer = new SSEStreamWriter(res, req.user.id);
    } else {
      // Non-streaming mode: collect messages
      writer = new ResponseCollector(req.user.id);
    }

    // Send (or collect) the initial status message
    writer.send({
      type: 'status',
      message: githubUrl ? 'Repository cloned and session started' : 'Session started',
      projectPath: finalProjectPath
    });

    // Start the appropriate session
    if (provider === 'claude') {
      console.log('🤖 Starting Claude SDK session');

      await queryClaudeSDK(message.trim(), {
        projectPath: finalProjectPath,
        cwd: finalProjectPath,
        sessionId: sessionId || null,
        model: model,
        permissionMode: 'bypassPermissions' // Bypass all permissions for API calls
      }, writer);
    } else if (provider === 'cursor') {
      console.log('🖱️ Starting Cursor CLI session');

      await spawnCursor(message.trim(), {
        projectPath: finalProjectPath,
        cwd: finalProjectPath,
        sessionId: sessionId || null,
        model: model || undefined,
        skipPermissions: true // Bypass permissions for Cursor
      }, writer);
    } else if (provider === 'codex') {
      console.log('🤖 Starting Codex SDK session');

      await queryCodex(message.trim(), {
        projectPath: finalProjectPath,
        cwd: finalProjectPath,
        sessionId: sessionId || null,
        model: model || CODEX_MODELS.DEFAULT,
        permissionMode: 'bypassPermissions'
      }, writer);
    } else if (provider === 'gemini') {
      console.log('✨ Starting Gemini CLI session');

      await spawnGemini(message.trim(), {
        projectPath: finalProjectPath,
        cwd: finalProjectPath,
        sessionId: sessionId || null,
        model: model,
        skipPermissions: true // CLI mode bypasses permissions
      }, writer);
    }

    // Handle GitHub branch and PR creation after successful agent completion
    let branchInfo = null;
    let prInfo = null;

    if (createBranch || createPR) {
      try {
        console.log('🔄 Starting GitHub branch/PR creation workflow...');

        // Get the GitHub token
        const tokenToUse = githubToken || githubTokensDb.getActiveGithubToken(req.user.id);

        if (!tokenToUse) {
          throw new Error('GitHub token required for branch/PR creation. Please configure a GitHub token in settings.');
        }

        // Initialize Octokit
        const octokit = new Octokit({ auth: tokenToUse });

        // Get the GitHub URL - either from the parameter or from the git remote
        let repoUrl = githubUrl;
        if (!repoUrl) {
          console.log('🔍 Getting GitHub URL from git remote...');
          try {
            repoUrl = await getGitRemoteUrl(finalProjectPath);
            if (!repoUrl.includes('github.com')) {
              throw new Error('Project does not have a GitHub remote configured');
            }
            console.log(`✅ Found GitHub remote: ${repoUrl}`);
          } catch (error) {
            throw new Error(`Failed to get GitHub remote URL: ${error.message}`);
          }
        }

        // Parse the GitHub URL to get owner and repo
        const { owner, repo } = parseGitHubUrl(repoUrl);
        console.log(`📦 Repository: ${owner}/${repo}`);

        // Use the provided branch name or auto-generate one from the message
        const finalBranchName = branchName || autogenerateBranchName(message);
        if (branchName) {
          console.log(`🌿 Using provided branch name: ${finalBranchName}`);

          // Validate the custom branch name
          const validation = validateBranchName(finalBranchName);
          if (!validation.valid) {
            throw new Error(`Invalid branch name: ${validation.error}`);
          }
        } else {
          console.log(`🌿 Auto-generated branch name: ${finalBranchName}`);
        }

        if (createBranch) {
          // Create and check out the new branch locally
          console.log('🔄 Creating local branch...');
          const checkoutProcess = spawn('git', ['checkout', '-b', finalBranchName], {
            cwd: finalProjectPath,
            stdio: 'pipe'
          });

          await new Promise((resolve, reject) => {
            let stderr = '';
            checkoutProcess.stderr.on('data', (data) => { stderr += data.toString(); });
            checkoutProcess.on('close', (code) => {
              if (code === 0) {
                console.log(`✅ Created and checked out local branch '${finalBranchName}'`);
                resolve();
              } else if (stderr.includes('already exists')) {
                // Branch already exists locally - check it out instead
                console.log(`ℹ️ Branch '${finalBranchName}' already exists locally, checking out...`);
                const checkoutExisting = spawn('git', ['checkout', finalBranchName], {
                  cwd: finalProjectPath,
                  stdio: 'pipe'
                });
                let checkoutStderr = '';
                checkoutExisting.stderr.on('data', (data) => { checkoutStderr += data.toString(); });
                checkoutExisting.on('close', (checkoutCode) => {
                  if (checkoutCode === 0) {
                    console.log(`✅ Checked out existing branch '${finalBranchName}'`);
                    resolve();
                  } else {
                    reject(new Error(`Failed to checkout existing branch: ${checkoutStderr}`));
                  }
                });
              } else {
                reject(new Error(`Failed to create branch: ${stderr}`));
              }
            });
          });

          // Push the branch to the remote
          console.log('🔄 Pushing branch to remote...');
          const pushProcess = spawn('git', ['push', '-u', 'origin', finalBranchName], {
            cwd: finalProjectPath,
            stdio: 'pipe'
          });

          await new Promise((resolve, reject) => {
            let stderr = '';
            pushProcess.stderr.on('data', (data) => { stderr += data.toString(); });
            pushProcess.on('close', (code) => {
              if (code === 0) {
                console.log(`✅ Pushed branch '${finalBranchName}' to remote`);
                resolve();
              } else if (stderr.includes('already exists') || stderr.includes('up-to-date')) {
                // Branch already exists on the remote - use it as-is
                console.log(`ℹ️ Branch '${finalBranchName}' already exists on remote, using existing branch`);
                resolve();
              } else {
                reject(new Error(`Failed to push branch: ${stderr}`));
              }
            });
          });

          branchInfo = {
            name: finalBranchName,
            url: `https://github.com/${owner}/${repo}/tree/${finalBranchName}`
          };
        }

        if (createPR) {
          // Get commit messages to generate the PR description
          console.log('🔄 Generating PR title and description...');
          const commitMessages = await getCommitMessages(finalProjectPath, 5);

          // Use the first commit message as the PR title, falling back to the agent message
          const prTitle = commitMessages.length > 0 ? commitMessages[0] : message;

          // Generate the PR body from commit messages
          let prBody = '## Changes\n\n';
          if (commitMessages.length > 0) {
            prBody += commitMessages.map(msg => `- ${msg}`).join('\n');
          } else {
            prBody += `Agent task: ${message}`;
          }
          prBody += '\n\n---\n*This pull request was automatically created by CloudCLI.ai Agent.*';

          console.log(`📝 PR Title: ${prTitle}`);

          // Create the pull request
          console.log('🔄 Creating pull request...');
          prInfo = await createGitHubPR(octokit, owner, repo, finalBranchName, prTitle, prBody, 'main');
        }

        // Send branch/PR info in the streaming response
        if (stream) {
          if (branchInfo) {
            writer.send({ type: 'github-branch', branch: branchInfo });
          }
          if (prInfo) {
            writer.send({ type: 'github-pr', pullRequest: prInfo });
          }
        }
      } catch (error) {
        console.error('❌ GitHub branch/PR creation error:', error);

        // Report the error but don't fail the entire request
        if (stream) {
          writer.send({ type: 'github-error', error: error.message });
        } else {
          // Store the error info for the non-streaming response
          if (createBranch) branchInfo = { error: error.message };
          if (createPR) prInfo = { error: error.message };
        }
      }
    }

    // Handle the response based on streaming mode
    if (stream) {
      // Streaming mode: end the SSE stream
      writer.end();
    } else {
      // Non-streaming mode: send filtered messages and a token summary as JSON
      const assistantMessages = writer.getAssistantMessages();
      const tokenSummary = writer.getTotalTokens();

      const response = {
        success: true,
        sessionId: writer.getSessionId(),
        messages: assistantMessages,
        tokens: tokenSummary,
        projectPath: finalProjectPath
      };

      // Add branch/PR info if created
      if (branchInfo) {
        response.branch = branchInfo;
      }
      if (prInfo) {
        response.pullRequest = prInfo;
      }

      res.json(response);
    }

    // Clean up if requested (only when we cloned a repo, never for existing project paths)
    if (cleanup && githubUrl) {
      const sessionIdForCleanup = writer.getSessionId();
      setTimeout(() => {
        cleanupProject(finalProjectPath, sessionIdForCleanup);
      }, 5000);
    }
  } catch (error) {
    console.error('❌ External session error:', error);

    // Clean up on error
    if (finalProjectPath && cleanup && githubUrl) {
      const sessionIdForCleanup = writer ? writer.getSessionId() : null;
      cleanupProject(finalProjectPath, sessionIdForCleanup);
    }

    if (stream) {
      // For streaming, send an error event and stop
      if (!writer) {
        // Set up SSE headers if not already done
        res.setHeader('Content-Type', 'text/event-stream');
        res.setHeader('Cache-Control', 'no-cache');
        res.setHeader('Connection', 'keep-alive');
        res.setHeader('X-Accel-Buffering', 'no');
        writer = new SSEStreamWriter(res, req.user.id);
      }

      if (!res.writableEnded) {
        writer.send({
          type: 'error',
          error: error.message,
          message: `Failed: ${error.message}`
        });
        writer.end();
      }
    } else if (!res.headersSent) {
      res.status(500).json({
        success: false,
        error: error.message
      });
    }
  }
});

export default router;