mirror of
https://github.com/siteboon/claudecodeui.git
synced 2026-05-01 10:18:37 +00:00
refactor(projects): identify projects by DB projectId instead of folder-derived name
GET /api/projects used to scan ~/.claude/projects/ on every request, derive each project's identity from the encoded folder name, and re-parse JSONL files to build session lists. Using the folder-derived name as the project identifier leaked the Claude CLI's on-disk encoding into every API route, forced every downstream endpoint to re-resolve a real path via JSONL 'cwd' inspection, and made the project list endpoint O(projects x sessions) on disk I/O.

This change switches the entire API surface to identify projects by the stable primary key from the 'projects' table and drives the listing straight from the DB:

- Add projectsDb.getProjectPathById as the canonical projectId -> path resolver so routes no longer need to touch the filesystem to figure out where a project lives.

- Rewrite getProjects so it reads the project list from the 'projects' table and the per-project session list from the 'sessions' table (one SELECT per project). No filesystem scanning happens for this endpoint anymore, which removes the dependency on ~/.claude/projects existing, on Cursor's MD5-hashed chat folders being discoverable, and on Codex's JSONL history being on disk. Per the migration spec, each session now exposes 'summary' sourced from sessions.custom_name, 'messageCount' = 0 (message counting is not implemented), and sessionMeta.hasMore is pinned to false since this endpoint doesn't drive session pagination.

- Introduce id-based wrappers (getSessionsById, renameProjectById, deleteSessionById, deleteProjectById, getProjectTaskMasterById) so every caller can pass projectId and resolve the real path through the DB. renameProjectById also writes to projects.custom_project_name so the DB-driven getProjects response reflects renames immediately; it keeps project-config.json in sync for any legacy reader that still consults the JSON file.
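The DB-driven listing described above can be sketched roughly as follows. This is an illustrative reconstruction, not the repository's code: `listProjects`, the result shape, and any columns beyond those named in the commit text (project_id, project_path, custom_project_name, custom_name) are assumptions, and `db` stands in for a better-sqlite3-style connection with synchronous prepared statements.

```javascript
// Hypothetical sketch of the DB-driven GET /api/projects listing.
// One SELECT over `projects`, then one SELECT per project over `sessions`;
// no filesystem scanning is involved.
function listProjects(db) {
  const projects = db
    .prepare('SELECT project_id, project_path, custom_project_name FROM projects')
    .all();

  return projects.map((p) => {
    const sessions = db
      .prepare('SELECT session_id, provider, custom_name, updated_at FROM sessions WHERE project_path = ?')
      .all(p.project_path);

    return {
      projectId: p.project_id,
      path: p.project_path,
      displayName: p.custom_project_name ?? p.project_path,
      sessions: sessions.map((s) => ({
        id: s.session_id,
        provider: s.provider,
        summary: s.custom_name, // 'summary' sourced from sessions.custom_name
        messageCount: 0,        // message counting is not implemented
      })),
      sessionMeta: { hasMore: false }, // endpoint doesn't drive pagination
    };
  });
}
```

Because the function only depends on `prepare(...).all(...)`, it can be exercised against a stub connection in tests without touching SQLite.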
- Migrate every /api/projects/:projectName route in server/index.js, server/routes/taskmaster.js, and server/routes/messages.js to :projectId, and change server/routes/git.js so the 'project' query/body parameter carries a projectId that is resolved through the DB before any git command runs. TaskMaster WebSocket broadcasts emit 'projectId' for the same reason, so the frontend can match notifications against its current selection without another lookup.

- Delete helpers that existed only to feed the old getProjects path (getCursorSessions, getGeminiCliSessions, getProjectTaskMaster) along with their unused imports (better-sqlite3's Database, applyCustomSessionNames). The legacy folder-name helpers (getSessions, renameProject, deleteSession, deleteProject, extractProjectDirectory) are kept as internal implementation details of the id-based wrappers and of destructive cleanup / conversation search, but they are no longer re-exported.

- searchConversations still walks JSONL to produce match snippets (that data doesn't live in the DB), but it now includes the resolved projectId in each result so the sidebar can cross-reference hits with its already-loaded project list without a second round-trip.

Frontend migration:

- Project.name is replaced by Project.projectId in src/types/app.ts, and ProjectSession.__projectName becomes __projectId so session tagging and sidebar state keys stay aligned with the backend identifier. Settings continues to use SettingsProject.name for legacy consumers, but it is populated from projectId by normalizeProjectForSettings.

- All places that previously indexed per-project state by project.name (sidebar expanded/starred/loading/deletingProjects sets, additionalSessions map, projectHasMoreOverrides, starredProjects localStorage, command history and draft-input localStorage, TaskMaster caches) now key on projectId, so state survives display-name edits and is consistent across the app.
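The state-keying change can be illustrated with a small sketch. It is hypothetical: `toggleStar`, `renameProject`, and the project shape are stand-ins, not the app's real API; only the idea of keying on projectId rather than a display name comes from the commit.

```javascript
// Illustrative only: per-project UI state keyed by the stable projectId.
const starred = new Set();

function toggleStar(project) {
  // Keying on project.projectId (the DB primary key) instead of a display
  // name means the entry survives a display-name edit.
  if (starred.has(project.projectId)) starred.delete(project.projectId);
  else starred.add(project.projectId);
}

function renameProject(project, newDisplayName) {
  // A rename changes only the label; the identity used as a state key is stable.
  return { ...project, displayName: newDisplayName };
}
```

Had the set been keyed on the display name, the rename below would orphan the starred entry; with projectId it does not.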
- src/utils/api.js renames every endpoint parameter to projectId, the unified messages endpoint takes projectId in its query string, and useSessionStore forwards projectId on fetchFromServer / fetchMore / refreshFromServer. The git panel, file tree, code editor, PRD editor, plugins context, MCP server flows, and TaskMaster hooks are all updated to pass projectId.

- DEFAULT_PROJECT_FOR_EMPTY_SHELL is updated to carry a 'default' projectId sentinel so the empty-shell placeholder still satisfies the Project contract.

Bug fix bundled in:

- sessionsDb.setName no longer bumps updated_at when a row already exists. Renaming is a label change, not activity, so there is no reason for it to reset 'last activity' in the sidebar. It also no longer relies on SQLite's CURRENT_TIMESTAMP, which stores a naive 'YYYY-MM-DD HH:MM:SS' value that JavaScript parses as local time and caused renamed sessions to appear shifted backwards by the client's UTC offset. When an INSERT actually happens, it now writes ISO-8601 UTC with a 'Z' suffix.

- buildSessionsByProviderFromDb normalizes any legacy naive timestamps in the sessions table to ISO-8601 UTC on the way out, so rows written before this change also render correctly on the client.

Other cleanup:

- Removed the filesystem-first project-discovery comment block at the top of server/projects.js and replaced it with a short note that describes the new DB-driven flow and lists the few remaining filesystem-dependent helpers (message reads, search, destructive delete, manual project registration).

- server/modules/providers/index.ts is added as a small barrel so the providers module exposes a stable public surface.

Made-with: Cursor
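The CURRENT_TIMESTAMP pitfall fixed by this commit can be reproduced in a few lines. The date value is arbitrary; the behavior shown is how V8 (Node, Chrome) parses the non-ISO, space-separated form, which the ECMAScript spec leaves implementation-defined:

```javascript
// SQLite's CURRENT_TIMESTAMP yields a naive "YYYY-MM-DD HH:MM:SS" string.
const naive = '2024-06-01 12:00:00';

// V8 treats the space-separated form as *local* time, so the parsed instant
// drifts from the intended UTC instant by the client's UTC offset.
const parsedAsLocal = new Date(naive);
const intendedUtc = new Date(naive.replace(' ', 'T') + 'Z');
const driftMinutes = (intendedUtc.getTime() - parsedAsLocal.getTime()) / 60000;
// driftMinutes equals -parsedAsLocal.getTimezoneOffset(): zero only when the
// client happens to be on UTC.

// Writing ISO-8601 UTC with the 'Z' suffix (what the fix does) is unambiguous:
const iso = new Date().toISOString(); // always ends in 'Z'
```

This is exactly why renamed sessions appeared shifted backwards by the client's UTC offset until the INSERT path started writing `toISOString()` values.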
156	server/index.js
@@ -28,7 +28,17 @@ import { spawn } from 'child_process';
 import pty from 'node-pty';
 import mime from 'mime-types';
 
-import { getProjects, getSessions, renameProject, deleteSession, deleteProject, getProjectTaskMaster, extractProjectDirectory, clearProjectDirectoryCache, searchConversations } from './projects.js';
+import {
+  getProjects,
+  getSessionsById,
+  renameProjectById,
+  deleteSessionById,
+  deleteProjectById,
+  getProjectTaskMasterById,
+  getProjectPathById,
+  clearProjectDirectoryCache,
+  searchConversations,
+} from './projects.js';
 import { queryClaudeSDK, abortClaudeSDKSession, isClaudeSDKSessionActive, getActiveClaudeSDKSessions, resolveToolApproval, getPendingApprovalsForSession, reconnectSessionWriter } from './claude-sdk.js';
 import { spawnCursor, abortCursorSession, isCursorSessionActive, getActiveCursorSessions } from './cursor-cli.js';
 import { queryCodex, abortCodexSession, isCodexSessionActive, getActiveCodexSessions } from './openai-codex.js';
@@ -428,20 +438,25 @@ app.get('/api/projects', authenticateToken, async (req, res) => {
   }
 });
 
-app.get('/api/projects/:projectName/taskmaster', authenticateToken, async (req, res) => {
+// Project-scoped TaskMaster details; identified by DB-assigned `projectId`.
+app.get('/api/projects/:projectId/taskmaster', authenticateToken, async (req, res) => {
   try {
-    const { projectName } = req.params;
-    const taskMasterDetails = await getProjectTaskMaster(projectName);
+    const { projectId } = req.params;
+    const taskMasterDetails = await getProjectTaskMasterById(projectId);
     if (!taskMasterDetails) {
       return res.status(404).json({ error: 'Project not found' });
     }
     res.json(taskMasterDetails);
   } catch (error) {
     res.status(500).json({ error: error.message });
   }
 });
 
-app.get('/api/projects/:projectName/sessions', authenticateToken, async (req, res) => {
+// Sessions for a project; `projectId` is resolved to a path via the DB.
+app.get('/api/projects/:projectId/sessions', authenticateToken, async (req, res) => {
   try {
     const { limit = 5, offset = 0 } = req.query;
-    const result = await getSessions(req.params.projectName, parseInt(limit), parseInt(offset));
+    const result = await getSessionsById(req.params.projectId, parseInt(limit), parseInt(offset));
     applyCustomSessionNames(result.sessions, 'claude');
     res.json(result);
   } catch (error) {
@@ -449,23 +464,23 @@ app.get('/api/projects/:projectName/sessions', authenticateToken, async (req, re
   }
 });
 
-// Rename project endpoint
-app.put('/api/projects/:projectName/rename', authenticateToken, async (req, res) => {
+// Rename project endpoint; stores the custom name on the DB row for `projectId`.
+app.put('/api/projects/:projectId/rename', authenticateToken, async (req, res) => {
   try {
     const { displayName } = req.body;
-    await renameProject(req.params.projectName, displayName);
+    await renameProjectById(req.params.projectId, displayName);
     res.json({ success: true });
   } catch (error) {
     res.status(500).json({ error: error.message });
   }
 });
 
-// Delete session endpoint
-app.delete('/api/projects/:projectName/sessions/:sessionId', authenticateToken, async (req, res) => {
+// Delete session endpoint; resolves `projectId` to path before touching disk.
+app.delete('/api/projects/:projectId/sessions/:sessionId', authenticateToken, async (req, res) => {
   try {
-    const { projectName, sessionId } = req.params;
-    console.log(`[API] Deleting session: ${sessionId} from project: ${projectName}`);
-    await deleteSession(projectName, sessionId);
+    const { projectId, sessionId } = req.params;
+    console.log(`[API] Deleting session: ${sessionId} from project: ${projectId}`);
+    await deleteSessionById(projectId, sessionId);
     sessionsDb.deleteName(sessionId, 'claude');
     console.log(`[API] Session ${sessionId} deleted successfully`);
     res.json({ success: true });
@@ -504,12 +519,13 @@ app.put('/api/sessions/:sessionId/rename', authenticateToken, async (req, res) =
 // Delete project endpoint
 // force=true to allow removal even when sessions exist
 // deleteData=true to also delete session/memory files on disk (destructive)
-app.delete('/api/projects/:projectName', authenticateToken, async (req, res) => {
+// `projectId` is resolved to an absolute path through the DB before cleanup.
+app.delete('/api/projects/:projectId', authenticateToken, async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const force = req.query.force === 'true';
     const deleteData = req.query.deleteData === 'true';
-    await deleteProject(projectName, force, deleteData);
+    await deleteProjectById(projectId, force, deleteData);
     res.json({ success: true });
   } catch (error) {
     res.status(500).json({ error: error.message });
@@ -694,9 +710,9 @@ app.post('/api/create-folder', authenticateToken, async (req, res) => {
 });
 
 // Read file content endpoint
-app.get('/api/projects/:projectName/file', authenticateToken, async (req, res) => {
+app.get('/api/projects/:projectId/file', authenticateToken, async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { filePath } = req.query;
 
 
@@ -705,7 +721,9 @@ app.get('/api/projects/:projectName/file', authenticateToken, async (req, res) =
       return res.status(400).json({ error: 'Invalid file path' });
     }
 
-    const projectRoot = await extractProjectDirectory(projectName).catch(() => null);
+    // Resolve the absolute project root via the DB-backed helper; the
+    // caller passes the DB-assigned `projectId`, not a folder name.
+    const projectRoot = await getProjectPathById(projectId);
     if (!projectRoot) {
       return res.status(404).json({ error: 'Project not found' });
     }
@@ -734,9 +752,9 @@ app.get('/api/projects/:projectName/file', authenticateToken, async (req, res) =
 });
 
 // Serve raw file bytes for previews and downloads.
-app.get('/api/projects/:projectName/files/content', authenticateToken, async (req, res) => {
+app.get('/api/projects/:projectId/files/content', authenticateToken, async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { path: filePath } = req.query;
 
 
@@ -745,7 +763,8 @@ app.get('/api/projects/:projectName/files/content', authenticateToken, async (re
       return res.status(400).json({ error: 'Invalid file path' });
     }
 
-    const projectRoot = await extractProjectDirectory(projectName).catch(() => null);
+    // Projects are now addressed by DB `projectId`, resolved to their path here.
+    const projectRoot = await getProjectPathById(projectId);
     if (!projectRoot) {
       return res.status(404).json({ error: 'Project not found' });
     }
@@ -791,9 +810,9 @@ app.get('/api/projects/:projectName/files/content', authenticateToken, async (re
 });
 
 // Save file content endpoint
-app.put('/api/projects/:projectName/file', authenticateToken, async (req, res) => {
+app.put('/api/projects/:projectId/file', authenticateToken, async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { filePath, content } = req.body;
 
 
@@ -806,7 +825,8 @@ app.put('/api/projects/:projectName/file', authenticateToken, async (req, res) =
       return res.status(400).json({ error: 'Content is required' });
     }
 
-    const projectRoot = await extractProjectDirectory(projectName).catch(() => null);
+    // Projects are now addressed by DB `projectId`, resolved to their path here.
+    const projectRoot = await getProjectPathById(projectId);
     if (!projectRoot) {
       return res.status(404).json({ error: 'Project not found' });
     }
@@ -840,19 +860,16 @@ app.put('/api/projects/:projectName/file', authenticateToken, async (req, res) =
   }
 });
 
-app.get('/api/projects/:projectName/files', authenticateToken, async (req, res) => {
+app.get('/api/projects/:projectId/files', authenticateToken, async (req, res) => {
   try {
 
     // Using fsPromises from import
 
-    // Use extractProjectDirectory to get the actual project path
-    let actualPath;
-    try {
-      actualPath = await extractProjectDirectory(req.params.projectName);
-    } catch (error) {
-      console.error('Error extracting project directory:', error);
-      // Fallback to simple dash replacement
-      actualPath = req.params.projectName.replace(/-/g, '/');
+    // Resolve the project's absolute path through the DB (projectId is the
+    // primary key of the `projects` table after the identifier migration).
+    const actualPath = await getProjectPathById(req.params.projectId);
+    if (!actualPath) {
+      return res.status(404).json({ error: 'Project not found' });
     }
 
     // Check if path exists
@@ -917,10 +934,10 @@ function validateFilename(name) {
   return { valid: true };
 }
 
-// POST /api/projects/:projectName/files/create - Create new file or directory
-app.post('/api/projects/:projectName/files/create', authenticateToken, async (req, res) => {
+// POST /api/projects/:projectId/files/create - Create new file or directory
+app.post('/api/projects/:projectId/files/create', authenticateToken, async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { path: parentPath, type, name } = req.body;
 
     // Validate input
@@ -937,8 +954,8 @@ app.post('/api/projects/:projectName/files/create', authenticateToken, async (re
       return res.status(400).json({ error: nameValidation.error });
    }
 
-    // Get project root
-    const projectRoot = await extractProjectDirectory(projectName).catch(() => null);
+    // Resolve the project directory through the DB using the new projectId.
+    const projectRoot = await getProjectPathById(projectId);
     if (!projectRoot) {
       return res.status(404).json({ error: 'Project not found' });
     }
@@ -994,10 +1011,10 @@ app.post('/api/projects/:projectName/files/create', authenticateToken, async (re
   }
 });
 
-// PUT /api/projects/:projectName/files/rename - Rename file or directory
-app.put('/api/projects/:projectName/files/rename', authenticateToken, async (req, res) => {
+// PUT /api/projects/:projectId/files/rename - Rename file or directory
+app.put('/api/projects/:projectId/files/rename', authenticateToken, async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { oldPath, newName } = req.body;
 
     // Validate input
@@ -1010,8 +1027,8 @@ app.put('/api/projects/:projectName/files/rename', authenticateToken, async (req
      return res.status(400).json({ error: nameValidation.error });
    }
 
-    // Get project root
-    const projectRoot = await extractProjectDirectory(projectName).catch(() => null);
+    // Resolve the project directory through the DB using the new projectId.
+    const projectRoot = await getProjectPathById(projectId);
     if (!projectRoot) {
       return res.status(404).json({ error: 'Project not found' });
     }
@@ -1071,10 +1088,10 @@ app.put('/api/projects/:projectName/files/rename', authenticateToken, async (req
   }
 });
 
-// DELETE /api/projects/:projectName/files - Delete file or directory
-app.delete('/api/projects/:projectName/files', authenticateToken, async (req, res) => {
+// DELETE /api/projects/:projectId/files - Delete file or directory
+app.delete('/api/projects/:projectId/files', authenticateToken, async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { path: targetPath, type } = req.body;
 
     // Validate input
@@ -1082,8 +1099,8 @@ app.delete('/api/projects/:projectName/files', authenticateToken, async (req, re
       return res.status(400).json({ error: 'Path is required' });
     }
 
-    // Get project root
-    const projectRoot = await extractProjectDirectory(projectName).catch(() => null);
+    // Resolve the project directory through the DB using the new projectId.
+    const projectRoot = await getProjectPathById(projectId);
     if (!projectRoot) {
       return res.status(404).json({ error: 'Project not found' });
     }
@@ -1136,7 +1153,7 @@ app.delete('/api/projects/:projectName/files', authenticateToken, async (req, re
   }
 });
 
-// POST /api/projects/:projectName/files/upload - Upload files
+// POST /api/projects/:projectId/files/upload - Upload files
 // Dynamic import of multer for file uploads
 const uploadFilesHandler = async (req, res) => {
   // Dynamic import of multer
@@ -1175,7 +1192,7 @@ const uploadFilesHandler = async (req, res) => {
   }
 
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { targetPath, relativePaths } = req.body;
 
     // Parse relative paths if provided (for folder uploads)
@@ -1189,7 +1206,7 @@ const uploadFilesHandler = async (req, res) => {
    }
 
    console.log('[DEBUG] File upload request:', {
-      projectName,
+      projectId,
      targetPath: JSON.stringify(targetPath),
      targetPathType: typeof targetPath,
      filesCount: req.files?.length,
@@ -1200,8 +1217,8 @@ const uploadFilesHandler = async (req, res) => {
      return res.status(400).json({ error: 'No files provided' });
    }
 
-    // Get project root
-    const projectRoot = await extractProjectDirectory(projectName).catch(() => null);
+    // Resolve the project directory through the DB using the new projectId.
+    const projectRoot = await getProjectPathById(projectId);
    if (!projectRoot) {
      return res.status(404).json({ error: 'Project not found' });
    }
@@ -1298,7 +1315,7 @@ const uploadFilesHandler = async (req, res) => {
   });
 };
 
-app.post('/api/projects/:projectName/files/upload', authenticateToken, uploadFilesHandler);
+app.post('/api/projects/:projectId/files/upload', authenticateToken, uploadFilesHandler);
 
 /**
  * Proxy an authenticated client WebSocket to a plugin's internal WS server.
@@ -1905,8 +1922,10 @@ function handleShellConnection(ws) {
     console.error('[ERROR] Shell WebSocket error:', error);
   });
 }
-// Image upload endpoint
-app.post('/api/projects/:projectName/upload-images', authenticateToken, async (req, res) => {
+// Image upload endpoint. Accepts the DB-assigned `projectId` (not a folder name)
+// but the current implementation doesn't need to touch the project directory,
+// so we just leave the param rename for consistency with the rest of the API.
+app.post('/api/projects/:projectId/upload-images', authenticateToken, async (req, res) => {
   try {
     const multer = (await import('multer')).default;
     const path = (await import('path')).default;
@@ -1990,10 +2009,11 @@ app.post('/api/projects/:projectName/upload-images', authenticateToken, async (r
   }
 });
 
-// Get token usage for a specific session
-app.get('/api/projects/:projectName/sessions/:sessionId/token-usage', authenticateToken, async (req, res) => {
+// Get token usage for a specific session. `projectId` is the DB primary key;
+// the Claude branch below resolves it to an absolute path via the DB.
+app.get('/api/projects/:projectId/sessions/:sessionId/token-usage', authenticateToken, async (req, res) => {
   try {
-    const { projectName, sessionId } = req.params;
+    const { projectId, sessionId } = req.params;
     const { provider = 'claude' } = req.query;
     const homeDir = os.homedir();
 
@@ -2097,13 +2117,13 @@ app.get('/api/projects/:projectName/sessions/:sessionId/token-usage', authentica
     }
 
     // Handle Claude sessions (default)
-    // Extract actual project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
-      console.error('Error extracting project directory:', error);
-      return res.status(500).json({ error: 'Failed to determine project path' });
+    // Resolve the project path through the DB using the caller-supplied
+    // `projectId`. Legacy code here called extractProjectDirectory with a
+    // folder-encoded project name; the migration centralizes that lookup
+    // in the projects table.
+    const projectPath = await getProjectPathById(projectId);
+    if (!projectPath) {
+      return res.status(404).json({ error: 'Project not found' });
     }
 
     // Construct the JSONL file path
 
@@ -47,6 +47,25 @@ export const projectsDb = {
     return row ?? null;
   },
 
+  /**
+   * Resolve the absolute project directory from a database project_id.
+   *
+   * This is the canonical lookup used after the projectName → projectId migration:
+   * API routes receive the DB-assigned `projectId` and must resolve the real folder
+   * path through this helper before touching the filesystem. Returns `null` when the
+   * project row does not exist so callers can respond with a 404.
+   */
+  getProjectPathById(projectId: string): string | null {
+    const db = getConnection();
+    const row = db.prepare(`
+      SELECT project_path
+      FROM projects
+      WHERE project_id = ?
+    `).get(projectId) as Pick<ProjectRow, 'project_path'> | undefined;
+
+    return row?.project_path ?? null;
+  },
+
   getProjectPaths(): ProjectRow[] {
     const db = getConnection();
     return db.prepare(`
@@ -192,17 +192,29 @@ export const sessionsDb = {
 
   /**
    * Legacy-compatibility method kept for parity with `server/database/db.js`.
    *
+   * Renaming a session is a metadata-only change — it's not actual activity,
+   * so existing rows intentionally keep their `updated_at` untouched. This
+   * prevents the sidebar's "last activity" timestamp from jumping around when
+   * a user simply edits a session's label.
+   *
+   * When the row doesn't exist yet we still have to seed `created_at`/
+   * `updated_at`; we write ISO-8601 UTC (with the `Z` suffix) rather than
+   * rely on SQLite's `CURRENT_TIMESTAMP`, which stores a naive
+   * `"YYYY-MM-DD HH:MM:SS"` value that JavaScript's `new Date(...)` parses as
+   * local time and displays with the wrong offset.
+   *
    * TODO: Remove after all legacy imports are migrated to the new repository API.
    */
   setName(sessionId: string, provider: string, customName: string): void {
     const db = getConnection();
+    const nowIso = new Date().toISOString();
     db.prepare(
-      `INSERT INTO sessions (session_id, provider, custom_name)
-       VALUES (?, ?, ?)
+      `INSERT INTO sessions (session_id, provider, custom_name, created_at, updated_at)
+       VALUES (?, ?, ?, ?, ?)
        ON CONFLICT(session_id, provider) DO UPDATE SET
-         custom_name = excluded.custom_name,
-         updated_at = CURRENT_TIMESTAMP`
-    ).run(sessionId, provider, customName);
+         custom_name = excluded.custom_name`
-    ).run(sessionId, provider, customName);
+    ).run(sessionId, provider, customName, nowIso, nowIso);
   },
 
   /**
4	server/modules/providers/index.ts (new file)
@@ -0,0 +1,4 @@
+export { sessionSynchronizerService } from './services/session-synchronizer.service.js';
+
+export { initializeSessionsWatcher } from './services/sessions-watcher.service.js';
+export { closeSessionsWatcher } from './services/sessions-watcher.service.js';
@@ -1,71 +1,39 @@
 /**
- * PROJECT DISCOVERY AND MANAGEMENT SYSTEM
- * ========================================
- *
- * This module manages project discovery for both Claude CLI and Cursor CLI sessions.
- *
- * ## Architecture Overview
- *
- * 1. **Claude Projects** (stored in ~/.claude/projects/)
- *    - Each project is a directory named with the project path encoded (/ replaced with -)
- *    - Contains .jsonl files with conversation history including 'cwd' field
- *    - Project metadata stored in ~/.claude/project-config.json
- *
- * 2. **Cursor Projects** (stored in ~/.cursor/chats/)
- *    - Each project directory is named with MD5 hash of the absolute project path
- *    - Example: /Users/john/myproject -> MD5 -> a1b2c3d4e5f6...
- *    - Contains session directories with SQLite databases (store.db)
- *    - Project path is NOT stored in the database - only in the MD5 hash
- *
- * ## Project Discovery Strategy
- *
- * 1. **Claude Projects Discovery**:
- *    - Scan ~/.claude/projects/ directory for Claude project folders
- *    - Extract actual project path from .jsonl files (cwd field)
- *    - Fall back to decoded directory name if no sessions exist
- *
- * 2. **Cursor Sessions Discovery**:
- *    - For each KNOWN project (from Claude or manually added)
- *    - Compute MD5 hash of the project's absolute path
- *    - Check if ~/.cursor/chats/{md5_hash}/ directory exists
- *    - Read session metadata from SQLite store.db files
- *
- * 3. **Manual Project Addition**:
- *    - Users can manually add project paths via UI
- *    - Stored in ~/.claude/project-config.json with 'manuallyAdded' flag
- *    - Allows discovering Cursor sessions for projects without Claude sessions
- *
- * ## Critical Limitations
- *
- * - **CANNOT discover Cursor-only projects**: From a quick check, there was no mention of
- *   the cwd of each project. if someone has the time, you can try to reverse engineer it.
- *
- * - **Project relocation breaks history**: If a project directory is moved or renamed,
- *   the MD5 hash changes, making old Cursor sessions inaccessible unless the old
- *   path is known and manually added.
- *
- * ## Error Handling
- *
- * - Missing ~/.claude directory is handled gracefully with automatic creation
- * - ENOENT errors are caught and handled without crashing
- * - Empty arrays returned when no projects/sessions exist
- *
- * ## Caching Strategy
- *
- * - Project directory extraction is cached to minimize file I/O
- * - Cache is cleared when project configuration changes
- * - Session data is fetched on-demand, not cached
+ * PROJECT DISCOVERY AND MANAGEMENT
+ * ================================
+ *
+ * After the projectName → projectId migration, project and session listings
+ * for `GET /api/projects` are sourced entirely from the database:
+ *
+ * - `projects` table (via `projectsDb`) — the canonical list of projects and
+ *   their absolute `project_path`.
+ * - `sessions` table (via `sessionsDb`) — every provider's sessions for a
+ *   given project, keyed by `project_path`.
+ *
+ * Routes always address a project by its DB `projectId` and resolve the real
+ * directory through `getProjectPathById` before touching disk.
+ *
+ * The filesystem-aware helpers kept in this module serve the remaining
+ * features that still need on-disk data:
+ * - Session message reads for each provider (Claude/Codex/Gemini) for
+ *   `GET /api/sessions/:sessionId/messages`.
+ * - Conversation search (`searchConversations`) which scans JSONL history.
+ * - Destructive project cleanup (`deleteProjectById` -> `deleteProject`)
+ *   which removes Claude/Cursor/Codex artifacts on disk.
+ * - Manual project registration (`addProjectManually`) which syncs to
+ *   ~/.claude/project-config.json for backwards compatibility.
  */
 
-import { promises as fs } from 'fs';
-import fsSync from 'fs';
+import fsSync, { promises as fs } from 'fs';
 import path from 'path';
 import readline from 'readline';
 import crypto from 'crypto';
-import Database from 'better-sqlite3';
 import os from 'os';
 
+import { sessionSynchronizerService } from '@/modules/providers';
+
 import sessionManager from './sessionManager.js';
-import { applyCustomSessionNames } from './modules/database/index.js';
+import { projectsDb, sessionsDb } from './modules/database/index.js';
 import { getModuleDir, findAppRoot } from './utils/runtime-paths.js';
 
 // Snapshot files are kept as incrementing artifacts under .tmp/project-dumps for later review.
@@ -265,12 +233,56 @@ function normalizeTaskMasterInfo(taskMasterResult = null) {
  };
}

/**
 * Resolve the absolute project path for a database `projectId`.
 *
 * After the projectName → projectId migration, every API route receives a
 * `projectId` (the primary key from the `projects` table) and must translate
 * it into the real directory on disk through this helper. Returns `null` when
 * the id doesn't match any row so callers can respond with a 404.
 */
async function getProjectPathById(projectId) {
  if (!projectId) {
    return null;
  }

  return projectsDb.getProjectPathById(projectId);
}

/**
 * Compute the Claude CLI project folder name for an absolute path.
 *
 * Claude stores its JSONL history per project under
 * `~/.claude/projects/<encoded-path>/`. The folder name is derived from the
 * absolute path by replacing every non-alphanumeric character (except `-`) with
 * `-`. Filesystem helpers like `getSessions`/`deleteSession` still work on that
 * folder name, so routes that receive a `projectId` compute it from the path
 * resolved through the DB instead of keeping the encoded name as an identifier.
 */
function claudeFolderNameFromPath(projectPath) {
  if (!projectPath) {
    return '';
  }

  return projectPath.replace(/[^a-zA-Z0-9-]/g, '-');
}
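The encoding rule documented above can be sketched in isolation. This is a hypothetical standalone copy for illustration with a made-up sample path; the module itself uses `claudeFolderNameFromPath` directly.

```javascript
// Standalone sketch of the Claude folder-name encoding rule described above.
// The sample path is hypothetical; the real module calls its own helper.
function encodeClaudeFolderName(projectPath) {
  if (!projectPath) return '';
  // Every character outside [a-zA-Z0-9-] collapses to '-'.
  return projectPath.replace(/[^a-zA-Z0-9-]/g, '-');
}

console.log(encodeClaudeFolderName('/Users/me/dev/my_app'));
// -Users-me-dev-my-app
```

Note the mapping is lossy (both `/` and `_` become `-`), which is exactly why the commit stops using the encoded name as an identifier.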

/**
 * TaskMaster details for a project, addressed by DB `projectId`.
 *
 * Resolves the project path through the DB and inspects the `.taskmaster`
 * folder on disk for metadata the TaskMaster panel displays.
 */
async function getProjectTaskMasterById(projectId) {
  const projectPath = await getProjectPathById(projectId);
  if (!projectPath) {
    return null;
  }

  const taskMasterResult = await detectTaskMasterFolder(projectPath);

  return {
    projectId,
    projectPath,
    taskmaster: normalizeTaskMasterInfo(taskMasterResult)
  };
@@ -342,8 +354,10 @@ async function generateDisplayName(projectName, actualProjectDir = null) {
  return projectPath;
}

// Resolve a Claude-encoded folder name back to an absolute project directory
// by inspecting cached metadata and JSONL `cwd` fields. Used only by the
// legacy name-based helpers below (`getSessions`, `deleteProject`, etc.) and
// by the conversation search; id-based routes use `getProjectPathById`.
async function extractProjectDirectory(projectName) {
  // Check cache first
  if (projectDirectoryCache.has(projectName)) {
@@ -463,209 +477,115 @@ async function extractProjectDirectory(projectName) {
  }
}

/**
 * Group the `sessions` table rows for a project by provider.
 *
 * After the projectId migration, GET /api/projects no longer scans JSONL files
 * or any other session directory — every provider's session list comes from
 * the database. One `SELECT ... WHERE project_path = ?` gets us every row we
 * need, and we then bucket them by `provider` so each list (`sessions`,
 * `cursorSessions`, `codexSessions`, `geminiSessions`) can be built without
 * touching disk. Per the migration spec, each emitted session carries
 * `summary = custom_name`, `messageCount = 0` and `lastActivity` taken from
 * `updated_at` so the sidebar still sorts by recency.
 */
function buildSessionsByProviderFromDb(projectPath) {
  const rows = sessionsDb.getSessionsByProjectPath(projectPath);
  const byProvider = {
    claude: [],
    cursor: [],
    codex: [],
    gemini: [],
  };

  for (const row of rows) {
    const bucket = byProvider[row.provider];
    if (!bucket) {
      continue;
    }

    bucket.push({
      id: row.session_id,
      // The session summary intentionally mirrors the custom_name column only;
      // the historical JSONL-derived summary is no longer computed on this path.
      summary: row.custom_name || '',
      // messageCount is always 0 for now — counting is not implemented yet.
      messageCount: 0,
      lastActivity: row.updated_at ?? row.created_at ?? new Date().toISOString(),
    });
  }

  // Sort each bucket by recency so the sidebar's default ordering is preserved.
  for (const provider of Object.keys(byProvider)) {
    byProvider[provider].sort(
      (a, b) => new Date(b.lastActivity) - new Date(a.lastActivity),
    );
  }

  return byProvider;
}
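The bucketing shape above can be exercised on its own. This is a self-contained sketch using two hypothetical session rows (real rows come from the `sessions` table, not from literals like these).

```javascript
// Sketch of the provider bucketing, run against two hypothetical rows.
const rows = [
  { session_id: 'a1', provider: 'claude', custom_name: 'Fix login', updated_at: '2024-05-02T10:00:00Z' },
  { session_id: 'b2', provider: 'codex',  custom_name: null,        updated_at: '2024-05-01T09:00:00Z' },
];

const byProvider = { claude: [], cursor: [], codex: [], gemini: [] };
for (const row of rows) {
  const bucket = byProvider[row.provider];
  if (!bucket) continue; // rows from unknown providers are skipped
  bucket.push({
    id: row.session_id,
    summary: row.custom_name || '', // summary mirrors custom_name only
    messageCount: 0,                // pinned to 0 per the migration spec
    lastActivity: row.updated_at,
  });
}

console.log(byProvider.claude[0].summary);
// Fix login
```

A row whose `custom_name` is NULL still appears in its bucket, just with an empty summary, so the sidebar never drops a session for lacking a name.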

async function getProjects(progressCallback = null) {
  await sessionSynchronizerService.synchronizeSessions();

  // Source of truth for project listing is now the `projects` and `sessions`
  // tables — no directory scanning happens here. This keeps the API fast and
  // lets the frontend identify projects by their stable DB `projectId`.
  const projectRows = projectsDb.getProjectPaths();
  const totalProjects = projectRows.length;
  const projects = [];
  let processedProjects = 0;

  for (const row of projectRows) {
    processedProjects++;

    const projectId = row.project_id;
    const projectPath = row.project_path;

    if (progressCallback) {
      progressCallback({
        phase: 'loading',
        current: processedProjects,
        total: totalProjects,
        currentProject: projectPath
      });
    }

    // Use the stored custom name when present, otherwise fall back to a
    // generated display name derived from the project path.
    const displayName = row.custom_project_name && row.custom_project_name.trim().length > 0
      ? row.custom_project_name
      : await generateDisplayName(path.basename(projectPath) || projectPath, projectPath);

    // All provider session lists are built from a single DB query — no JSONL
    // parsing, no filesystem walks, no in-memory session manager lookups.
    const sessionsByProvider = buildSessionsByProviderFromDb(projectPath);
    const claudeSessionsAll = sessionsByProvider.claude;
    const claudeSessions = claudeSessionsAll.slice(0, 5);

    const project = {
      // Primary identifier used across the UI and API routes post-migration.
      projectId,
      path: projectPath,
      displayName,
      fullPath: projectPath,
      sessions: claudeSessions,
      cursorSessions: sessionsByProvider.cursor,
      codexSessions: sessionsByProvider.codex,
      geminiSessions: sessionsByProvider.gemini,
      // hasMore is pinned to false per the migration spec — pagination on the
      // project list is not driven by this endpoint anymore.
      sessionMeta: {
        hasMore: false,
        total: claudeSessionsAll.length
      }
    };

    // Custom-name overrides are already baked into each row's `summary` field
    // by buildSessionsByProviderFromDb, so we don't need to re-apply them.
    projects.push(project);
  }

  // Emit completion after all projects are processed
  if (progressCallback) {
    progressCallback({
      phase: 'complete',
@@ -1117,6 +1037,26 @@ async function getSessionMessages(projectName, sessionId, limit = null, offset =
  }
}

/**
 * ID-based wrapper around `getSessions`.
 *
 * Resolves a `projectId` to the underlying Claude JSONL folder name (via the
 * DB-backed project path) and defers to the legacy filesystem reader. Keeps
 * the previous pagination shape so the sidebar's "Load more sessions" UI keeps
 * working after the migration.
 */
async function getSessionsById(projectId, limit = 5, offset = 0) {
  const projectPath = await getProjectPathById(projectId);
  if (!projectPath) {
    return { sessions: [], hasMore: false, total: 0 };
  }

  // Claude stores history under ~/.claude/projects/<encoded-path>/; derive the
  // folder name from the absolute path the DB gave us.
  const claudeFolderName = claudeFolderNameFromPath(projectPath);
  return getSessions(claudeFolderName, limit, offset);
}

// Rename a project's display name
async function renameProject(projectName, newDisplayName) {
  const config = await loadProjectConfig();
@@ -1138,6 +1078,53 @@ async function renameProject(projectName, newDisplayName) {
  return true;
}

/**
 * ID-based wrapper around `renameProject`.
 *
 * Writes the new display name to the `projects.custom_project_name` column
 * (the source of truth for the DB-driven getProjects() response) and also
 * keeps the legacy project-config.json in sync for backwards compatibility
 * with any code that still reads it.
 */
async function renameProjectById(projectId, newDisplayName) {
  const projectPath = await getProjectPathById(projectId);
  if (!projectPath) {
    throw new Error(`Unknown projectId: ${projectId}`);
  }

  const trimmed = typeof newDisplayName === 'string' ? newDisplayName.trim() : '';
  // Persist on the DB row so getProjects() immediately reflects the change.
  projectsDb.updateCustomProjectNameById(projectId, trimmed.length > 0 ? trimmed : null);

  // Keep the legacy file-based project config in lockstep so historic readers
  // that still consult project-config.json don't diverge.
  const claudeFolderName = claudeFolderNameFromPath(projectPath);
  try {
    await renameProject(claudeFolderName, trimmed);
  } catch (error) {
    console.warn(`[projects] Legacy renameProject sync failed for ${projectId}:`, error.message);
  }

  return true;
}

/**
 * ID-based wrapper around `deleteSession`.
 *
 * Resolves the real Claude history folder via the DB-backed path, then defers
 * to the filesystem deletion routine. Callers should still clean up any DB
 * bookkeeping (e.g. the sessions table) at the route layer.
 */
async function deleteSessionById(projectId, sessionId) {
  const projectPath = await getProjectPathById(projectId);
  if (!projectPath) {
    throw new Error(`Unknown projectId: ${projectId}`);
  }

  const claudeFolderName = claudeFolderNameFromPath(projectPath);
  return deleteSession(claudeFolderName, sessionId);
}

// Delete a session from a project
async function deleteSession(projectName, sessionId) {
  const projectDir = path.join(os.homedir(), '.claude', 'projects', projectName);
@@ -1261,6 +1248,35 @@ async function deleteProject(projectName, force = false, deleteData = false) {
  }
}

/**
 * ID-based wrapper around `deleteProject`.
 *
 * Resolves the project path via the DB, defers destructive filesystem cleanup
 * to `deleteProject`, then removes the row from the `projects` table so the
 * DB-driven GET /api/projects response no longer lists it.
 */
async function deleteProjectById(projectId, force = false, deleteData = false) {
  const projectPath = await getProjectPathById(projectId);
  if (!projectPath) {
    throw new Error(`Unknown projectId: ${projectId}`);
  }

  const claudeFolderName = claudeFolderNameFromPath(projectPath);
  try {
    await deleteProject(claudeFolderName, force, deleteData);
  } catch (error) {
    // If the legacy Claude folder doesn't exist anymore we still want to drop
    // the DB row; rethrow otherwise so callers can surface the failure.
    if (error.code !== 'ENOENT') {
      throw error;
    }
  }

  // Drop the DB row so the DB-driven GET /api/projects stops listing it.
  projectsDb.deleteProjectById(projectId);
  return true;
}

// Add a project manually to the config (without creating folders)
async function addProjectManually(projectPath, displayName = null) {
  const absolutePath = path.resolve(projectPath);
@@ -1309,110 +1325,6 @@ async function addProjectManually(projectPath, displayName = null) {
  };
}

// Fetch Cursor sessions for a given project path
async function getCursorSessions(projectPath) {
  try {
    // Calculate cwdID hash for the project path (Cursor uses MD5 hash)
    const cwdId = crypto.createHash('md5').update(projectPath).digest('hex');
    const cursorChatsPath = path.join(os.homedir(), '.cursor', 'chats', cwdId);

    // Check if the directory exists
    try {
      await fs.access(cursorChatsPath);
    } catch (error) {
      // No sessions for this project
      return [];
    }

    // List all session directories
    const sessionDirs = await fs.readdir(cursorChatsPath);
    const sessions = [];

    for (const sessionId of sessionDirs) {
      const sessionPath = path.join(cursorChatsPath, sessionId);
      const storeDbPath = path.join(sessionPath, 'store.db');

      try {
        // Check if store.db exists
        await fs.access(storeDbPath);

        // Capture store.db mtime as a reliable fallback timestamp
        let dbStatMtimeMs = null;
        try {
          const stat = await fs.stat(storeDbPath);
          dbStatMtimeMs = stat.mtimeMs;
        } catch (_) { }

        // Open SQLite database
        const db = new Database(storeDbPath, { readonly: true, fileMustExist: true });

        // Get metadata from meta table
        const metaRows = db.prepare('SELECT key, value FROM meta').all();

        // Parse metadata
        let metadata = {};
        for (const row of metaRows) {
          if (row.value) {
            try {
              // Try to decode as hex-encoded JSON
              const hexMatch = row.value.toString().match(/^[0-9a-fA-F]+$/);
              if (hexMatch) {
                const jsonStr = Buffer.from(row.value, 'hex').toString('utf8');
                metadata[row.key] = JSON.parse(jsonStr);
              } else {
                metadata[row.key] = row.value.toString();
              }
            } catch (e) {
              metadata[row.key] = row.value.toString();
            }
          }
        }

        // Get message count
        const messageCountResult = db.prepare('SELECT COUNT(*) as count FROM blobs').get();

        db.close();

        // Extract session info
        const sessionName = metadata.title || metadata.sessionTitle || 'Untitled Session';

        // Determine timestamp - prefer createdAt from metadata, fall back to db file mtime
        let createdAt = null;
        if (metadata.createdAt) {
          createdAt = new Date(metadata.createdAt).toISOString();
        } else if (dbStatMtimeMs) {
          createdAt = new Date(dbStatMtimeMs).toISOString();
        } else {
          createdAt = new Date().toISOString();
        }

        sessions.push({
          id: sessionId,
          name: sessionName,
          createdAt: createdAt,
          lastActivity: createdAt, // For compatibility with Claude sessions
          messageCount: messageCountResult.count || 0,
          projectPath: projectPath
        });

      } catch (error) {
        console.warn(`Could not read Cursor session ${sessionId}:`, error.message);
      }
    }

    // Sort sessions by creation time (newest first)
    sessions.sort((a, b) => new Date(b.createdAt) - new Date(a.createdAt));

    // Return only the first 5 sessions for performance
    return sessions.slice(0, 5);

  } catch (error) {
    console.error('Error fetching Cursor sessions:', error);
    return [];
  }
}
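The hex-decoded-JSON fallback used for Cursor's `meta` table can be sketched on its own. This is a hypothetical helper for illustration (the module does the same inline); the sample value is made up.

```javascript
// Sketch of the meta-value decoding: values that look like hex are decoded
// to UTF-8 and parsed as JSON, anything else is kept as a plain string.
function decodeMetaValue(value) {
  const str = value.toString();
  if (/^[0-9a-fA-F]+$/.test(str)) {
    try {
      // Buffer is a Node.js global; no import needed.
      return JSON.parse(Buffer.from(str, 'hex').toString('utf8'));
    } catch {
      return str; // hex-looking but not JSON: keep the raw string
    }
  }
  return str;
}

// '7b226b223a317d' is the hex encoding of '{"k":1}'
console.log(decodeMetaValue('7b226b223a317d'));
```

One quirk of this heuristic: a plain-text value that happens to be valid hex and decodes to valid JSON would be mis-parsed, which the original code accepts as a trade-off.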


function normalizeComparablePath(inputPath) {
  if (!inputPath || typeof inputPath !== 'string') {
    return '';
@@ -2011,7 +1923,23 @@ async function searchConversations(query, limit = 50, onProjectResult = null, si
      file => file.endsWith('.jsonl') && !file.startsWith('agent-')
    );

    // Also include the DB `projectId` so the frontend (which now identifies
    // projects by `projectId`) can match search results to the
    // currently-loaded project list without a second round-trip.
    let searchProjectId = null;
    try {
      const resolvedPath = await extractProjectDirectory(projectName);
      const dbRow = projectsDb.getProjectPath(resolvedPath);
      if (dbRow?.project_id) {
        searchProjectId = dbRow.project_id;
      }
    } catch {
      // Best-effort: if we cannot resolve the projectId, the result is still
      // usable on the backend but the frontend will skip the auto-select.
    }

    const projectResult = {
      projectId: searchProjectId,
      projectName,
      projectDisplayName: displayName,
      sessions: []
@@ -2438,82 +2366,6 @@ async function searchGeminiSessionsForProject(
  }
}

async function getGeminiCliSessions(projectPath) {
  const normalizedProjectPath = normalizeComparablePath(projectPath);
  if (!normalizedProjectPath) return [];

  const geminiTmpDir = path.join(os.homedir(), '.gemini', 'tmp');
  try {
    await fs.access(geminiTmpDir);
  } catch {
    return [];
  }

  const sessions = [];
  let projectDirs;
  try {
    projectDirs = await fs.readdir(geminiTmpDir);
  } catch {
    return [];
  }

  for (const projectDir of projectDirs) {
    const projectRootFile = path.join(geminiTmpDir, projectDir, '.project_root');
    let projectRoot;
    try {
      projectRoot = (await fs.readFile(projectRootFile, 'utf8')).trim();
    } catch {
      continue;
    }

    if (normalizeComparablePath(projectRoot) !== normalizedProjectPath) continue;

    const chatsDir = path.join(geminiTmpDir, projectDir, 'chats');
    let chatFiles;
    try {
      chatFiles = await fs.readdir(chatsDir);
    } catch {
      continue;
    }

    for (const chatFile of chatFiles) {
      if (!chatFile.endsWith('.json')) continue;
      try {
        const filePath = path.join(chatsDir, chatFile);
        const data = await fs.readFile(filePath, 'utf8');
        const session = JSON.parse(data);
        if (!session.messages || !Array.isArray(session.messages)) continue;

        const sessionId = session.sessionId || chatFile.replace('.json', '');
        const firstUserMsg = session.messages.find(m => m.type === 'user');
        let summary = 'Gemini CLI Session';
        if (firstUserMsg) {
          const text = Array.isArray(firstUserMsg.content)
            ? firstUserMsg.content.filter(p => p.text).map(p => p.text).join(' ')
            : (typeof firstUserMsg.content === 'string' ? firstUserMsg.content : '');
          if (text) {
            summary = text.length > 50 ? text.substring(0, 50) + '...' : text;
          }
        }

        sessions.push({
          id: sessionId,
          summary,
          messageCount: session.messages.length,
          lastActivity: session.lastUpdated || session.startTime || null,
          provider: 'gemini'
        });
      } catch {
        continue;
      }
    }
  }

  return sessions.sort((a, b) =>
    new Date(b.lastActivity || 0) - new Date(a.lastActivity || 0)
  );
}

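The recency sort at the end of `getGeminiCliSessions` handles sessions without a timestamp by falling back to epoch 0, so they sort last. A self-contained sketch with hypothetical session objects:

```javascript
// Sessions lacking lastActivity compare as new Date(0) and land at the end.
const sessions = [
  { id: 'old',   lastActivity: '2024-01-01T00:00:00Z' },
  { id: 'none',  lastActivity: null },
  { id: 'fresh', lastActivity: '2024-06-01T00:00:00Z' },
];

sessions.sort((a, b) => new Date(b.lastActivity || 0) - new Date(a.lastActivity || 0));

console.log(sessions.map(s => s.id).join(','));
// fresh,old,none
```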
async function getGeminiCliSessionMessages(sessionId) {
  const geminiTmpDir = path.join(os.homedir(), '.gemini', 'tmp');
  let projectDirs;
@@ -2568,20 +2420,23 @@ async function getGeminiCliSessionMessages(sessionId) {
  return [];
}

// Only functions with consumers outside this module are exported. Folder-name
// based helpers (`getSessions`, `renameProject`, `deleteSession`, etc.) are
// kept as internal implementation details of the id-based wrappers below.
export {
  getProjects,
  getSessionsById,
  getSessionMessages,
  renameProjectById,
  deleteSessionById,
  deleteProjectById,
  addProjectManually,
  getProjectTaskMasterById,
  getProjectPathById,
  claudeFolderNameFromPath,
  clearProjectDirectoryCache,
  getCodexSessionMessages,
  deleteCodexSession,
  getGeminiCliSessions,
  getGeminiCliSessionMessages,
  searchConversations
};

@@ -2,7 +2,7 @@ import express from 'express';
import { spawn } from 'child_process';
import path from 'path';
import { promises as fs } from 'fs';
import { getProjectPathById } from '../projects.js';
import { queryClaudeSDK } from '../claude-sdk.js';
import { spawnCursor } from '../cursor-cli.js';

@@ -101,14 +101,19 @@ function validateProjectPath(projectPath) {
  return resolved;
}

/**
 * Resolve the absolute project directory for a given DB `projectId`.
 *
 * After the projectName → projectId migration, every git endpoint receives
 * the DB primary key (`project` query/body param). The legacy filesystem
 * resolver that walked Claude's JSONL history is no longer used here; the
 * path comes straight from the `projects` table and is then sanity-checked
 * by `validateProjectPath` before any `git` command runs against it.
 */
async function getActualProjectPath(projectId) {
  const projectPath = await getProjectPathById(projectId);
  if (!projectPath) {
    throw new Error(`Unable to resolve project path for "${projectId}"`);
  }
  return validateProjectPath(projectPath);
}
@@ -292,7 +297,7 @@ router.get('/status', async (req, res) => {
  const { project } = req.query;

  if (!project) {
    return res.status(400).json({ error: 'Project id is required' });
  }

  try {
@@ -355,7 +360,7 @@ router.get('/diff', async (req, res) => {
  const { project, file } = req.query;

  if (!project || !file) {
    return res.status(400).json({ error: 'Project id and file path are required' });
  }

  try {
@@ -438,7 +443,7 @@ router.get('/file-with-diff', async (req, res) => {
  const { project, file } = req.query;

  if (!project || !file) {
    return res.status(400).json({ error: 'Project id and file path are required' });
  }

  try {
@@ -518,7 +523,7 @@ router.post('/initial-commit', async (req, res) => {
  const { project } = req.body;

  if (!project) {
    return res.status(400).json({ error: 'Project id is required' });
  }

  try {
@@ -593,7 +598,7 @@ router.post('/revert-local-commit', async (req, res) => {
  const { project } = req.body;

  if (!project) {
    return res.status(400).json({ error: 'Project id is required' });
  }

  try {
@@ -640,7 +645,7 @@ router.get('/branches', async (req, res) => {
  const { project } = req.query;

  if (!project) {
    return res.status(400).json({ error: 'Project id is required' });
  }

  try {
@@ -684,7 +689,7 @@ router.post('/checkout', async (req, res) => {
|
||||
const { project, branch } = req.body;
|
||||
|
||||
if (!project || !branch) {
|
||||
return res.status(400).json({ error: 'Project name and branch are required' });
|
||||
return res.status(400).json({ error: 'Project id and branch are required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -706,7 +711,7 @@ router.post('/create-branch', async (req, res) => {
|
||||
const { project, branch } = req.body;
|
||||
|
||||
if (!project || !branch) {
|
||||
return res.status(400).json({ error: 'Project name and branch name are required' });
|
||||
return res.status(400).json({ error: 'Project id and branch name are required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -728,7 +733,7 @@ router.post('/delete-branch', async (req, res) => {
|
||||
const { project, branch } = req.body;
|
||||
|
||||
if (!project || !branch) {
|
||||
return res.status(400).json({ error: 'Project name and branch name are required' });
|
||||
return res.status(400).json({ error: 'Project id and branch name are required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -754,7 +759,7 @@ router.get('/commits', async (req, res) => {
|
||||
const { project, limit = 10 } = req.query;
|
||||
|
||||
if (!project) {
|
||||
return res.status(400).json({ error: 'Project name is required' });
|
||||
return res.status(400).json({ error: 'Project id is required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -811,7 +816,7 @@ router.get('/commit-diff', async (req, res) => {
|
||||
const { project, commit } = req.query;
|
||||
|
||||
if (!project || !commit) {
|
||||
return res.status(400).json({ error: 'Project name and commit hash are required' });
|
||||
return res.status(400).json({ error: 'Project id and commit hash are required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -843,7 +848,7 @@ router.post('/generate-commit-message', async (req, res) => {
|
||||
const { project, files, provider = 'claude' } = req.body;
|
||||
|
||||
if (!project || !files || files.length === 0) {
|
||||
return res.status(400).json({ error: 'Project name and files are required' });
|
||||
return res.status(400).json({ error: 'Project id and files are required' });
|
||||
}
|
||||
|
||||
// Validate provider
|
||||
@@ -1048,7 +1053,7 @@ router.get('/remote-status', async (req, res) => {
|
||||
const { project } = req.query;
|
||||
|
||||
if (!project) {
|
||||
return res.status(400).json({ error: 'Project name is required' });
|
||||
return res.status(400).json({ error: 'Project id is required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -1126,7 +1131,7 @@ router.post('/fetch', async (req, res) => {
|
||||
const { project } = req.body;
|
||||
|
||||
if (!project) {
|
||||
return res.status(400).json({ error: 'Project name is required' });
|
||||
return res.status(400).json({ error: 'Project id is required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -1167,7 +1172,7 @@ router.post('/pull', async (req, res) => {
|
||||
const { project } = req.body;
|
||||
|
||||
if (!project) {
|
||||
return res.status(400).json({ error: 'Project name is required' });
|
||||
return res.status(400).json({ error: 'Project id is required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -1235,7 +1240,7 @@ router.post('/push', async (req, res) => {
|
||||
const { project } = req.body;
|
||||
|
||||
if (!project) {
|
||||
return res.status(400).json({ error: 'Project name is required' });
|
||||
return res.status(400).json({ error: 'Project id is required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -1306,7 +1311,7 @@ router.post('/publish', async (req, res) => {
|
||||
const { project, branch } = req.body;
|
||||
|
||||
if (!project || !branch) {
|
||||
return res.status(400).json({ error: 'Project name and branch are required' });
|
||||
return res.status(400).json({ error: 'Project id and branch are required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -1385,7 +1390,7 @@ router.post('/discard', async (req, res) => {
|
||||
const { project, file } = req.body;
|
||||
|
||||
if (!project || !file) {
|
||||
return res.status(400).json({ error: 'Project name and file path are required' });
|
||||
return res.status(400).json({ error: 'Project id and file path are required' });
|
||||
}
|
||||
|
||||
try {
|
||||
@@ -1439,7 +1444,7 @@ router.post('/delete-untracked', async (req, res) => {
|
||||
const { project, file } = req.body;
|
||||
|
||||
if (!project || !file) {
|
||||
return res.status(400).json({ error: 'Project name and file path are required' });
|
||||
return res.status(400).json({ error: 'Project id and file path are required' });
|
||||
}
|
||||
|
||||
try {
|
||||
|
||||
@@ -1,16 +1,21 @@
 /**
  * Unified messages endpoint.
  *
- * GET /api/sessions/:sessionId/messages?provider=claude&projectName=foo&limit=50&offset=0
+ * GET /api/sessions/:sessionId/messages?provider=claude&projectId=<id>&limit=50&offset=0
  *
  * Replaces the four provider-specific session message endpoints with a single route
  * that delegates to the appropriate adapter via the provider registry.
  *
+ * After the projectName → projectId migration, Claude history is located via the
+ * DB-backed project path lookup; the route accepts `projectId` (preferred) and
+ * resolves it to the underlying Claude folder name for the downstream adapter.
+ *
  * @module routes/messages
  */

 import express from 'express';
 import { sessionsService } from '../modules/providers/services/sessions.service.js';
+import { getProjectPathById, claudeFolderNameFromPath } from '../projects.js';

 const router = express.Router();

@@ -21,7 +26,7 @@ const router = express.Router();
  *
  * Query params:
  *   provider - 'claude' | 'cursor' | 'codex' | 'gemini' (default: 'claude')
- *   projectName - required for claude provider
+ *   projectId - DB primary key of the project (required for claude provider)
  *   projectPath - required for cursor provider (absolute path used for cwdId hash)
  *   limit - page size (omit or null for all)
  *   offset - pagination offset (default: 0)
@@ -30,7 +35,7 @@ router.get('/:sessionId/messages', async (req, res) => {
   try {
     const { sessionId } = req.params;
     const provider = String(req.query.provider || 'claude').trim().toLowerCase();
-    const projectName = req.query.projectName || '';
+    const projectId = req.query.projectId || '';
     const projectPath = req.query.projectPath || '';
     const limitParam = req.query.limit;
     const limit = limitParam !== undefined && limitParam !== null && limitParam !== ''
@@ -44,8 +49,20 @@ router.get('/:sessionId/messages', async (req, res) => {
       return res.status(400).json({ error: `Unknown provider: ${provider}. Available: ${available}` });
     }

+    // The Claude adapter still reads sessions from ~/.claude/projects/<folder>/,
+    // so we translate the caller's projectId into the encoded folder name via
+    // the DB-stored project path before delegating to the adapter.
+    let claudeProjectName = '';
+    if (provider === 'claude' && projectId) {
+      const resolvedPath = await getProjectPathById(projectId);
+      if (!resolvedPath) {
+        return res.status(404).json({ error: 'Project not found' });
+      }
+      claudeProjectName = claudeFolderNameFromPath(resolvedPath);
+    }
+
     const result = await sessionsService.fetchHistory(provider, sessionId, {
-      projectName,
+      projectName: claudeProjectName,
       projectPath,
       limit,
       offset,
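The route above leans on `claudeFolderNameFromPath` to turn a DB-resolved absolute path back into the folder name the Claude CLI uses under `~/.claude/projects/`. A hedged stand-in for that helper is sketched below — the real implementation lives in `projects.js` and may handle more characters; the assumption here is only the commonly observed rule that path separators become dashes.

```javascript
// Hypothetical sketch of claudeFolderNameFromPath. Assumption: the Claude CLI
// derives each on-disk project folder name from the project's absolute path by
// replacing every path separator with '-'. Edge cases (dots, Windows drive
// letters) are NOT covered here; consult the real helper in projects.js.
function claudeFolderNameFromPath(projectPath) {
  return projectPath.replace(/[\\/]/g, '-');
}

// e.g. '/home/user/my-app' becomes '-home-user-my-app'
console.log(claudeFolderNameFromPath('/home/user/my-app'));
```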
@@ -13,10 +13,25 @@ import fs from 'fs';
 import path from 'path';
 import { promises as fsPromises } from 'fs';
 import { spawn } from 'child_process';
-import { extractProjectDirectory } from '../projects.js';
+import { getProjectPathById } from '../projects.js';
 import { detectTaskMasterMCPServer } from '../utils/mcp-detector.js';
 import { broadcastTaskMasterProjectUpdate, broadcastTaskMasterTasksUpdate } from '../utils/taskmaster-websocket.js';

+/**
+ * Resolve the absolute project directory from a DB-assigned `projectId`.
+ *
+ * TaskMaster routes used to accept a Claude-encoded folder name (`projectName`)
+ * and derive the path from JSONL history. After the projectId migration the
+ * only identifier we accept is the primary key of the `projects` table, so
+ * every handler calls this helper and 404s when the id is unknown.
+ */
+async function resolveProjectPathFromId(projectId) {
+  if (!projectId) {
+    return null;
+  }
+  return getProjectPathById(projectId);
+}
+
 const router = express.Router();

 /**
@@ -132,21 +147,22 @@ router.get('/installation-status', async (req, res) => {
 });

 /**
- * GET /api/taskmaster/tasks/:projectName
+ * GET /api/taskmaster/tasks/:projectId
  * Load actual tasks from .taskmaster/tasks/tasks.json
+ *
+ * `projectId` is the DB primary key of the project; the folder is resolved via
+ * the projects table rather than extracted from Claude JSONL history.
  */
-router.get('/tasks/:projectName', async (req, res) => {
+router.get('/tasks/:projectId', async (req, res) => {
   try {
-    const { projectName } = req.params;
-
-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    const { projectId } = req.params;
+
+    // Get project path via the DB; the legacy JSONL-based resolver is gone.
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
       });
     }

@@ -158,7 +174,7 @@ router.get('/tasks/:projectName', async (req, res) => {
       await fsPromises.access(tasksFilePath);
     } catch (error) {
       return res.json({
-        projectName,
+        projectId,
         tasks: [],
         message: 'No tasks.json file found'
       });
@@ -213,7 +229,7 @@ router.get('/tasks/:projectName', async (req, res) => {
    }));

     res.json({
-      projectName,
+      projectId,
       projectPath,
       tasks: transformedTasks,
       currentTag,
@@ -247,21 +263,19 @@ router.get('/tasks/:projectName', async (req, res) => {
 });

 /**
- * GET /api/taskmaster/prd/:projectName
+ * GET /api/taskmaster/prd/:projectId
  * List all PRD files in the project's .taskmaster/docs directory
  */
-router.get('/prd/:projectName', async (req, res) => {
+router.get('/prd/:projectId', async (req, res) => {
   try {
-    const { projectName } = req.params;
-
-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    const { projectId } = req.params;
+
+    // projectId → projectPath lookup through the DB (post-migration).
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
       });
     }

@@ -272,7 +286,7 @@ router.get('/prd/:projectName', async (req, res) => {
       await fsPromises.access(docsPath, fs.constants.R_OK);
     } catch (error) {
       return res.json({
-        projectName,
+        projectId,
         prdFiles: [],
         message: 'No .taskmaster/docs directory found'
       });
@@ -299,7 +313,7 @@ router.get('/prd/:projectName', async (req, res) => {
     }

     res.json({
-      projectName,
+      projectId,
       projectPath,
       prdFiles: prdFiles.sort((a, b) => new Date(b.modified) - new Date(a.modified)),
       timestamp: new Date().toISOString()
@@ -323,12 +337,12 @@ router.get('/prd/:projectName', async (req, res) => {
 });

 /**
- * POST /api/taskmaster/prd/:projectName
+ * POST /api/taskmaster/prd/:projectId
  * Create or update a PRD file in the project's .taskmaster/docs directory
  */
-router.post('/prd/:projectName', async (req, res) => {
+router.post('/prd/:projectId', async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { fileName, content } = req.body;

     if (!fileName || !content) {
@@ -346,14 +360,12 @@ router.post('/prd/:projectName', async (req, res) => {
       });
     }

-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    // Resolve the project folder through the DB using the projectId param.
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
       });
     }

@@ -379,7 +391,7 @@ router.post('/prd/:projectName', async (req, res) => {
     const stats = await fsPromises.stat(filePath);

     res.json({
-      projectName,
+      projectId,
       projectPath,
       fileName,
       filePath: path.relative(projectPath, filePath),
@@ -408,21 +420,18 @@ router.post('/prd/:projectName', async (req, res) => {
 });

 /**
- * GET /api/taskmaster/prd/:projectName/:fileName
+ * GET /api/taskmaster/prd/:projectId/:fileName
  * Get content of a specific PRD file
  */
-router.get('/prd/:projectName/:fileName', async (req, res) => {
+router.get('/prd/:projectId/:fileName', async (req, res) => {
   try {
-    const { projectName, fileName } = req.params;
-
-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    const { projectId, fileName } = req.params;
+
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
       });
     }

@@ -444,7 +453,7 @@ router.get('/prd/:projectName/:fileName', async (req, res) => {
     const stats = await fsPromises.stat(filePath);

     res.json({
-      projectName,
+      projectId,
       projectPath,
       fileName,
       filePath: path.relative(projectPath, filePath),
@@ -473,21 +482,18 @@ router.get('/prd/:projectName/:fileName', async (req, res) => {
 });

 /**
- * POST /api/taskmaster/init/:projectName
+ * POST /api/taskmaster/init/:projectId
  * Initialize TaskMaster in a project
  */
-router.post('/init/:projectName', async (req, res) => {
+router.post('/init/:projectId', async (req, res) => {
   try {
-    const { projectName } = req.params;
-
-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    const { projectId } = req.params;
+
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
      });
     }

@@ -522,17 +528,19 @@ router.post('/init/:projectName', async (req, res) => {

     initProcess.on('close', (code) => {
       if (code === 0) {
-        // Broadcast TaskMaster project update via WebSocket
+        // Broadcast TaskMaster project update via WebSocket. The
+        // WebSocket payload keeps using `projectId` so the frontend
+        // can match notifications against the current selection.
         if (req.app.locals.wss) {
           broadcastTaskMasterProjectUpdate(
-            req.app.locals.wss,
-            projectName,
+            req.app.locals.wss,
+            projectId,
             { hasTaskmaster: true, status: 'initialized' }
           );
         }

         res.json({
-          projectName,
+          projectId,
           projectPath,
           message: 'TaskMaster initialized successfully',
           output: stdout,
@@ -562,12 +570,12 @@ router.post('/init/:projectName', async (req, res) => {
 });

 /**
- * POST /api/taskmaster/add-task/:projectName
+ * POST /api/taskmaster/add-task/:projectId
  * Add a new task to the project
  */
-router.post('/add-task/:projectName', async (req, res) => {
+router.post('/add-task/:projectId', async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { prompt, title, description, priority = 'medium', dependencies } = req.body;

     if (!prompt && (!title || !description)) {
@@ -576,15 +584,12 @@ router.post('/add-task/:projectName', async (req, res) => {
         message: 'Either "prompt" or both "title" and "description" are required'
       });
     }

-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
       });
     }

@@ -629,16 +634,17 @@ router.post('/add-task/:projectName', async (req, res) => {
       console.log('Stderr:', stderr);

       if (code === 0) {
-        // Broadcast task update via WebSocket
+        // Broadcast task update via WebSocket using the projectId so
+        // clients subscribed to this project get notified immediately.
         if (req.app.locals.wss) {
           broadcastTaskMasterTasksUpdate(
-            req.app.locals.wss,
-            projectName
+            req.app.locals.wss,
+            projectId
           );
         }

         res.json({
-          projectName,
+          projectId,
           projectPath,
           message: 'Task added successfully',
           output: stdout,
@@ -666,22 +672,19 @@ router.post('/add-task/:projectName', async (req, res) => {
 });

 /**
- * PUT /api/taskmaster/update-task/:projectName/:taskId
+ * PUT /api/taskmaster/update-task/:projectId/:taskId
  * Update a specific task using TaskMaster CLI
  */
-router.put('/update-task/:projectName/:taskId', async (req, res) => {
+router.put('/update-task/:projectId/:taskId', async (req, res) => {
   try {
-    const { projectName, taskId } = req.params;
+    const { projectId, taskId } = req.params;
     const { title, description, status, priority, details } = req.body;

-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
       });
     }

@@ -707,11 +710,11 @@ router.put('/update-task/:projectName/:taskId', async (req, res) => {
       if (code === 0) {
         // Broadcast task update via WebSocket
         if (req.app.locals.wss) {
-          broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectName);
+          broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectId);
         }

         res.json({
-          projectName,
+          projectId,
           projectPath,
           taskId,
           message: 'Task status updated successfully',
@@ -759,11 +762,11 @@ router.put('/update-task/:projectName/:taskId', async (req, res) => {
       if (code === 0) {
         // Broadcast task update via WebSocket
         if (req.app.locals.wss) {
-          broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectName);
+          broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectId);
         }

         res.json({
-          projectName,
+          projectId,
           projectPath,
           taskId,
           message: 'Task updated successfully',
@@ -793,22 +796,19 @@ router.put('/update-task/:projectName/:taskId', async (req, res) => {
 });

 /**
- * POST /api/taskmaster/parse-prd/:projectName
+ * POST /api/taskmaster/parse-prd/:projectId
  * Parse a PRD file to generate tasks
  */
-router.post('/parse-prd/:projectName', async (req, res) => {
+router.post('/parse-prd/:projectId', async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { fileName = 'prd.txt', numTasks, append = false } = req.body;

-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
       });
     }

@@ -859,13 +859,13 @@ router.post('/parse-prd/:projectName', async (req, res) => {
         // Broadcast task update via WebSocket
         if (req.app.locals.wss) {
           broadcastTaskMasterTasksUpdate(
-            req.app.locals.wss,
-            projectName
+            req.app.locals.wss,
+            projectId
           );
         }

         res.json({
-          projectName,
+          projectId,
           projectPath,
           prdFile: fileName,
           message: 'PRD parsed and tasks generated successfully',
@@ -1340,12 +1340,12 @@ Description of the business problem, data sources, and expected insights.
 });

 /**
- * POST /api/taskmaster/apply-template/:projectName
+ * POST /api/taskmaster/apply-template/:projectId
  * Apply a PRD template to create a new PRD file
  */
-router.post('/apply-template/:projectName', async (req, res) => {
+router.post('/apply-template/:projectId', async (req, res) => {
   try {
-    const { projectName } = req.params;
+    const { projectId } = req.params;
     const { templateId, fileName = 'prd.txt', customizations = {} } = req.body;

     if (!templateId) {
@@ -1355,14 +1355,11 @@ router.post('/apply-template/:projectName', async (req, res) => {
       });
     }

-    // Get project path
-    let projectPath;
-    try {
-      projectPath = await extractProjectDirectory(projectName);
-    } catch (error) {
+    const projectPath = await resolveProjectPathFromId(projectId);
+    if (!projectPath) {
       return res.status(404).json({
         error: 'Project not found',
-        message: `Project "${projectName}" does not exist`
+        message: `Project "${projectId}" does not exist`
       });
     }

@@ -1401,7 +1398,7 @@ router.post('/apply-template/:projectName', async (req, res) => {
     await fsPromises.writeFile(filePath, content, 'utf8');

     res.json({
-      projectName,
+      projectId,
       projectPath,
       templateId,
       templateName: template.name,
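Every TaskMaster handler above funnels through the same `resolveProjectPathFromId` guard before touching the filesystem. The sketch below shows that resolution pattern in isolation, with the DB lookup stubbed by an in-memory `Map` (the real `getProjectPathById` reads the `projects` table); the map contents are illustrative only.

```javascript
// Stand-in for projectsDb.getProjectPathById: the id -> absolute-path mapping
// is stubbed with a Map here so the sketch runs without a database.
const projectPaths = new Map([['1', '/home/user/work/my-app']]);

async function getProjectPathById(projectId) {
  return projectPaths.get(String(projectId)) ?? null;
}

// Mirrors resolveProjectPathFromId from the diff: falsy ids short-circuit to
// null, and an unknown id also yields null, which the routes turn into a 404.
async function resolveProjectPathFromId(projectId) {
  if (!projectId) {
    return null;
  }
  return getProjectPathById(projectId);
}

async function main() {
  console.log(await resolveProjectPathFromId('1'));   // known id -> stored path
  console.log(await resolveProjectPathFromId('999')); // unknown id -> null
  console.log(await resolveProjectPathFromId(''));    // missing id -> null
}
main();
```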
@@ -7,20 +7,25 @@
 */

/**
- * Broadcast TaskMaster project update to all connected clients
+ * Broadcast TaskMaster project update to all connected clients.
+ *
+ * The payload key is `projectId` post-migration so frontend listeners can
+ * match notifications against the DB-assigned project identifier they
+ * already use everywhere else.
+ *
  * @param {WebSocket.Server} wss - WebSocket server instance
- * @param {string} projectName - Name of the updated project
+ * @param {string} projectId - DB id of the updated project
 * @param {Object} taskMasterData - Updated TaskMaster data
 */
-export function broadcastTaskMasterProjectUpdate(wss, projectName, taskMasterData) {
-  if (!wss || !projectName) {
-    console.warn('TaskMaster WebSocket broadcast: Missing wss or projectName');
+export function broadcastTaskMasterProjectUpdate(wss, projectId, taskMasterData) {
+  if (!wss || !projectId) {
+    console.warn('TaskMaster WebSocket broadcast: Missing wss or projectId');
     return;
   }

   const message = {
     type: 'taskmaster-project-updated',
-    projectName,
+    projectId,
     taskMasterData,
     timestamp: new Date().toISOString()
   };
@@ -38,20 +43,21 @@ export function broadcastTaskMasterProjectUpdate(wss, projectName, taskMasterDat
 }

 /**
- * Broadcast TaskMaster tasks update for a specific project
- * @param {WebSocket.Server} wss - WebSocket server instance
- * @param {string} projectName - Name of the project with updated tasks
+ * Broadcast TaskMaster tasks update for a specific project.
+ *
+ * @param {WebSocket.Server} wss - WebSocket server instance
+ * @param {string} projectId - DB id of the project with updated tasks
 * @param {Object} tasksData - Updated tasks data
 */
-export function broadcastTaskMasterTasksUpdate(wss, projectName, tasksData) {
-  if (!wss || !projectName) {
-    console.warn('TaskMaster WebSocket broadcast: Missing wss or projectName');
+export function broadcastTaskMasterTasksUpdate(wss, projectId, tasksData) {
+  if (!wss || !projectId) {
+    console.warn('TaskMaster WebSocket broadcast: Missing wss or projectId');
     return;
   }

   const message = {
     type: 'taskmaster-tasks-updated',
-    projectName,
+    projectId,
     tasksData,
     timestamp: new Date().toISOString()
   };
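The broadcast helpers above define the wire shape that frontend listeners match on after the migration. The sketch below exercises that shape with the WebSocket server stubbed as a plain object (a `clients` set of fake sockets), so it runs without the `ws` package; the `OPEN` constant and the send loop stand in for the broadcast body that the `@@ -38,20` hunk elides.

```javascript
const OPEN = 1; // stand-in for ws's WebSocket.OPEN

// Mirrors broadcastTaskMasterTasksUpdate from the diff: bail out when wss or
// projectId is missing, otherwise send a `projectId`-keyed payload to every
// client whose socket is open. Returning the message is an addition for
// inspection only; the real helper returns nothing.
function broadcastTaskMasterTasksUpdate(wss, projectId, tasksData) {
  if (!wss || !projectId) {
    console.warn('TaskMaster WebSocket broadcast: Missing wss or projectId');
    return null;
  }

  const message = {
    type: 'taskmaster-tasks-updated',
    projectId,
    tasksData,
    timestamp: new Date().toISOString()
  };

  for (const client of wss.clients) {
    if (client.readyState === OPEN) {
      client.send(JSON.stringify(message));
    }
  }
  return message;
}

// Stub server with one open client that records what it receives.
const received = [];
const wss = {
  clients: new Set([{ readyState: OPEN, send: (data) => received.push(data) }])
};

broadcastTaskMasterTasksUpdate(wss, '42', { total: 3 });
console.log(JSON.parse(received[0]).projectId); // the DB id, not a folder name
```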