Mirror of https://github.com/siteboon/claudecodeui.git, synced 2026-05-01 18:28:38 +00:00
* refactor: remove unused exports
* refactor: remove unused fields from project and session objects
* refactor: rename session_names table and related code to sessions for clarity and consistency
* refactor(database): move db into typescript
- Implemented githubTokensDb for managing GitHub tokens with CRUD operations.
- Created notificationPreferencesDb to handle user notification preferences.
- Added projectsDb for project path management and related operations.
- Introduced pushSubscriptionsDb for managing browser push subscriptions.
- Developed scanStateDb to track the last scanned timestamp.
- Established sessionsDb for session management with CRUD functionalities.
- Created userDb for user management, including authentication and onboarding.
- Implemented vapidKeysDb for storing and managing VAPID keys.
feat(database): define schema for new database tables
- Added SQL schema definitions for users, API keys, user credentials, notification preferences, VAPID keys, push subscriptions, projects, sessions, scan state, and app configuration.
- Included necessary indexes for performance optimization.
refactor(shared): enhance type definitions and utility functions
- Updated shared types and interfaces for improved clarity and consistency.
- Added new types for credential management and provider-specific operations.
- Refined utility functions for better error handling and message normalization.
* feat: added session indexer logic
* perf(projects): lazy-load TaskMaster metadata per selected project
Why:
- /api/projects is a hot path (initial load, sidebar refresh, websocket sync).
- Scanning .taskmaster for every project on each call added avoidable fs I/O and payload size.
- TaskMaster metadata is only needed after selecting a specific project.
- Moving it to a project-scoped endpoint makes loading cost match user intent.
- The UI now hydrates TaskMaster state on selection and keeps it across refresh events.
- This prevents status flicker/regression while still removing global scan overhead.
- Selection fetches are sequence-guarded to block stale async responses on fast switching.
- isManuallyAdded was removed from responses to keep the public project contract minimal.
- Project dumps now use incrementing snapshot files to preserve history for debugging.
What changed:
- Added GET /api/projects/:projectName/taskmaster and getProjectTaskMaster().
- Removed TaskMaster detection from bulk getProjects().
- Added api.projectTaskmaster(...) plus selection-time hydration in frontend contexts.
- Merged cached taskmaster values into refreshed project lists for continuity.
- Removed isManuallyAdded from manual project payloads.
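The sequence-guarded selection fetch described above can be sketched as follows. This is a minimal illustration of the technique, not the project's actual code; `createGuardedSelector`, `fetchTaskMaster`, and `applyTaskMaster` are hypothetical names:

```javascript
// Sketch of a sequence-guarded fetch: only the response for the most recent
// selection is applied; responses from earlier, slower requests are dropped.
// Names (createGuardedSelector, fetchTaskMaster, applyTaskMaster) are
// illustrative, not the real API.
function createGuardedSelector(fetchTaskMaster, applyTaskMaster) {
  let sequence = 0;
  return async function select(projectName) {
    const mySeq = ++sequence;          // stamp this request
    const data = await fetchTaskMaster(projectName);
    if (mySeq !== sequence) return;    // a newer selection superseded us
    applyTaskMaster(projectName, data);
  };
}
```

On fast project switching, the slow response from the first selection arrives after the second selection has bumped the sequence counter, so it is silently discarded.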
* refactor: update import paths for database modules and remove legacy db.js and schema.js files
* refactor(projects): identify projects by DB projectId instead of folder-derived name
GET /api/projects used to scan ~/.claude/projects/ on every request, derive
each project's identity from the encoded folder name, and re-parse JSONL
files to build session lists. Using the folder-derived name as the project
identifier leaked the Claude CLI's on-disk encoding into every API route,
forced every downstream endpoint to re-resolve a real path via JSONL
'cwd' inspection, and made the project list endpoint O(projects x sessions)
on disk I/O.
This change switches the entire API surface to identify projects by the
stable primary key from the 'projects' table and drives the listing
straight from the DB:
- Add projectsDb.getProjectPathById as the canonical projectId -> path
resolver so routes no longer need to touch the filesystem to figure out
where a project lives.
- Rewrite getProjects so it reads the project list from the 'projects'
table and the per-project session list from the 'sessions' table (one
SELECT per project). No filesystem scanning happens for this endpoint
anymore, which removes the dependency on ~/.claude/projects existing,
on Cursor's MD5-hashed chat folders being discoverable, and on Codex's
JSONL history being on disk. Per the migration spec each session now
exposes 'summary' sourced from sessions.custom_name, 'messageCount' = 0
(message counting is not implemented), and sessionMeta.hasMore is
pinned to false since this endpoint doesn't drive session pagination.
- Introduce id-based wrappers (getSessionsById, renameProjectById,
deleteSessionById, deleteProjectById, getProjectTaskMasterById) so
every caller can pass projectId and resolve the real path through the
DB. renameProjectById also writes to projects.custom_project_name so
the DB-driven getProjects response reflects renames immediately; it
keeps project-config.json in sync for any legacy reader that still
consults the JSON file.
- Migrate every /api/projects/:projectName route in server/index.js,
server/routes/taskmaster.js, and server/routes/messages.js to
:projectId, and change server/routes/git.js so the 'project'
query/body parameter carries a projectId that is resolved through the
DB before any git command runs. TaskMaster WebSocket broadcasts emit
'projectId' for the same reason so the frontend can match
notifications against its current selection without another lookup.
- Delete helpers that existed only to feed the old getProjects path
(getCursorSessions, getGeminiCliSessions, getProjectTaskMaster) along
with their unused imports (better-sqlite3's Database,
applyCustomSessionNames). The legacy folder-name helpers (getSessions,
renameProject, deleteSession, deleteProject, extractProjectDirectory)
are kept as internal implementation details of the id-based wrappers
and of destructive cleanup / conversation search, but they are no
longer re-exported.
- searchConversations still walks JSONL to produce match snippets (that
data doesn't live in the DB), but it now includes the resolved
projectId in each result so the sidebar can cross-reference hits with
its already loaded project list without a second round-trip.
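The DB-driven listing described above can be sketched roughly as below. The `db` interface (`listProjects`, `listSessionsForProject`) is an illustrative stand-in for the real `projectsDb`/`sessionsDb` modules; the pinned fields follow the migration spec quoted above:

```javascript
// Sketch: one read for project rows, then one sessions query per project —
// no filesystem scanning. The db interface here is hypothetical.
function getProjectsFromDb(db) {
  return db.listProjects().map((project) => ({
    projectId: project.project_id,
    path: project.path,
    sessions: db.listSessionsForProject(project.project_id).map((s) => ({
      id: s.session_id,
      summary: s.custom_name,        // per the migration spec
      messageCount: 0,               // message counting is not implemented
    })),
    sessionMeta: { hasMore: false }, // this endpoint doesn't drive pagination
  }));
}
```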
Frontend migration:
- Project.name is replaced by Project.projectId in src/types/app.ts, and
ProjectSession.__projectName becomes __projectId so session tagging
and sidebar state keys stay aligned with the backend identifier.
Settings continues to use SettingsProject.name for legacy consumers,
but it is populated from projectId by normalizeProjectForSettings.
- All places that previously indexed per-project state by project.name
(sidebar expanded/starred/loading/deletingProjects sets,
additionalSessions map, projectHasMoreOverrides, starredProjects
localStorage, command history and draft-input localStorage,
TaskMaster caches) now key on projectId so state survives
display-name edits and is consistent across the app.
- src/utils/api.js renames every endpoint parameter to projectId, the
unified messages endpoint takes projectId in its query string, and
useSessionStore forwards projectId on fetchFromServer / fetchMore /
refreshFromServer. Git panel, file tree, code editor, PRD editor,
plugins context, MCP server flows and TaskMaster hooks are all
updated to pass projectId.
- DEFAULT_PROJECT_FOR_EMPTY_SHELL is updated to carry a 'default'
projectId sentinel so the empty-shell placeholder still satisfies the
Project contract.
Bug fix bundled in:
- sessionsDb.setName no longer bumps updated_at when a row already
exists. Renaming is a label change, not activity, so there is no
reason for it to reset 'last activity' in the sidebar. It also no
longer relies on SQLite's CURRENT_TIMESTAMP, which stores a naive
'YYYY-MM-DD HH:MM:SS' value that JavaScript parses as local time and
caused renamed sessions to appear shifted backwards by the client's
UTC offset. When an INSERT actually happens it now writes ISO-8601
UTC with a 'Z' suffix.
- buildSessionsByProviderFromDb normalizes any legacy naive timestamps
in the sessions table to ISO-8601 UTC on the way out so rows written
before this change also render correctly on the client.
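The timestamp normalization described in this bug fix can be sketched as follows; `normalizeDbTimestamp` is an illustrative helper name, not necessarily the one used in `buildSessionsByProviderFromDb`:

```javascript
// Sketch: normalize a naive SQLite 'YYYY-MM-DD HH:MM:SS' timestamp (which
// JavaScript's Date parser would read as *local* time) into ISO-8601 UTC.
// Already-ISO values pass through unchanged. Helper name is hypothetical.
function normalizeDbTimestamp(value) {
  if (typeof value !== 'string') return value;
  const naive = /^(\d{4}-\d{2}-\d{2}) (\d{2}:\d{2}:\d{2})$/.exec(value);
  if (!naive) return value;              // already ISO (or unknown) -> leave as-is
  return `${naive[1]}T${naive[2]}.000Z`; // declare it UTC explicitly
}
```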
Other cleanup:
- Removed the filesystem-first project-discovery comment block at the
top of server/projects.js and replaced it with a short note that
describes the new DB-driven flow and lists the few remaining
filesystem-dependent helpers (message reads, search, destructive
delete, manual project registration).
- server/modules/providers/index.ts is added as a small barrel so the
providers module exposes a stable public surface.
Made-with: Cursor
* refactor(projects): reorganize project-related logic into dedicated modules
* refactor(projects): rename getProjects to getProjectsWithSessions
* refactor: update import path for getProjectsWithSessions to include file extension
* refactor: use updated session watcher
In addition, the projects_updated websocket response now sends the sessionId instead
* refactor(websocket): move websocket logic to its own module
* refactor(sessions-watcher): remove redundant logging after session sync completion
* refactor(index.js): reorganize code structure
* refactor(index.js): fix import order
* refactor: remove unnecessary GitHub cloning logic from create-workspace endpoint
* refactor: modularize project services and the wizard create/clone flow
Restructure project creation, listing, GitHub clone progress, and TaskMaster
details behind a dedicated TypeScript module under server/modules/projects/,
and align the client wizard with a single path-based flow.
Server / routing
- Remove server/routes/projects.js and mount server/modules/projects/
projects.routes.ts at /api/projects (still behind authenticateToken).
- Drop duplicate handlers from server/index.js for GET /api/projects and
GET /api/projects/:projectId/taskmaster; those live on the new router.
- Import WORKSPACES_ROOT and validateWorkspacePath from shared utils in
index.js instead of the deleted projects route module.
Projects router (projects.routes.ts)
- GET /: list projects with sessions (existing snapshot behavior).
- POST /create-project: validate body, reject legacy workspaceType and
mixed clone fields, delegate to createProject service, return distinct
success copy when an archived path is reactivated.
- GET /clone-progress: Server-Sent Events for clone progress/complete/error;
requires authenticated user id for token resolution; wires startCloneProject.
- GET /:projectId/taskmaster: delegates to getProjectTaskMaster.
Services (new)
- project-management.service.ts: path validation, workspace directory
creation, persistence via projectsDb.createProjectPath, mapping to API
project shape; surfaces AppError for validation, conflict, and not-found
cases; optional dependency injection for tests.
- project-clone.service.ts: validates workspace, resolves GitHub auth
(stored token or inline token), runs git clone with progress callbacks,
registers project via createProject on success; sanitizes errors and
supports cancellation; injectable dependencies for tests.
- projects-has-taskmaster.service.ts: moves TaskMaster detection and
normalization out of server/projects.js; resolve-by-id and public
getProjectTaskMaster with structured AppError responses.
Persistence and shared types
- projectsDb.createProjectPath now returns CreateProjectPathResult
(created | reactivated_archived | active_conflict) using INSERT … ON
CONFLICT with selective update when the row is archived; normalizes
display name from path or custom name; repository row typing moves to
shared ProjectRepositoryRow.
- getProjectPaths() returns only non-archived rows (isArchived = 0).
- shared/types.ts: ProjectRepositoryRow, CreateProjectPathResult/outcome,
WorkspacePathValidationResult.
- shared/utils.ts: WORKSPACES_ROOT, forbidden path lists, validateWorkspacePath,
asyncHandler for Express async routes.
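The `asyncHandler` mentioned above is a standard Express 4 pattern; one common shape (a sketch, not necessarily the project's exact implementation) is:

```javascript
// Wrap an async Express route so a rejected promise is forwarded to next();
// Express 4 does not catch rejections from async handlers on its own.
const asyncHandler = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);
```

Routes wrapped this way let a central error middleware turn thrown `AppError`s into the structured HTTP responses described above.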
Legacy cleanup
- server/projects.js: remove detectTaskMasterFolder, normalizeTaskMasterInfo,
and getProjectTaskMasterById (logic lives in the new service).
- server/routes/agent.js: register external API project paths with
projectsDb.createProjectPath instead of addProjectManually try/catch;
treat active_conflict as an existing registration and continue.
Tests
- Add Node test suites for project-management, project-clone, and
projects-has-taskmaster services; update projects.service test import
for renamed projects-with-sessions-fetch.service.ts.
Rename
- projects.service.ts → projects-with-sessions-fetch.service.ts;
re-export from modules/projects/index.ts.
Client (project creation wizard)
- Remove StepTypeSelection and workspaceType from form state and types;
wizard is two steps (configure path/GitHub auth, then review).
- createWorkspaceRequest → createProjectRequest; clone vs create-only
inferred from githubUrl (pathUtils / isCloneWorkflow).
- Adjust step indices, WizardProgress, StepConfiguration/Review,
WorkspacePathField, and src/utils/api.js as needed for the new API.
Docs
- Minor websocket README touch-up.
Net: ~1.6k insertions / ~0.9k deletions across 29 files; behavior is
centralized in typed services with explicit HTTP errors and test seams.
* refactor: remove loading sessions logic from sidebar
* refactor: move project rename to module
* refactor: move project deletion to module
* refactor: move project star state from localStorage to backend
* refactor: implement optimistic UI for project star state management
* feat: optimistic update for session watcher
* fix(projects-state): stop websocket message reprocessing loop
The websocket projects effect in useProjectsState could re-handle the same
latestMessage after local state writes triggered re-renders.
Under bursty websocket traffic, this created an update feedback cycle
that surfaced as 'Maximum update depth exceeded', often from Sidebar.
What changed:
- Added lastHandledMessageRef so each latestMessage object is handled once.
- Added an early return guard when the current message was already handled.
- Made projects updates idempotent by comparing previous and merged payloads
before calling setProjects.
Result:
- Breaks the effect -> state update -> effect re-entry cycle.
- Reduces redundant renders during rapid projects_updated traffic while
preserving normal project/session synchronization.
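The once-per-message guard can be sketched outside React like this; the ref is modeled as a plain `{ current }` holder, and `createMessageHandler`/`applyUpdate` are illustrative names:

```javascript
// Sketch of the lastHandledMessageRef guard: each latestMessage object is
// handled at most once, even if the effect re-runs after a state write.
function createMessageHandler(applyUpdate) {
  const lastHandledMessageRef = { current: null };
  return function handle(latestMessage) {
    if (!latestMessage || latestMessage === lastHandledMessageRef.current) {
      return false;                        // already handled (or nothing to do)
    }
    lastHandledMessageRef.current = latestMessage;
    applyUpdate(latestMessage);
    return true;
  };
}
```

The identity comparison is the key: a re-render replays the *same* message object, so it is skipped, while a genuinely new websocket message is a new object and is processed.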
* refactor: optimize project auto-expand logic
* refactor: move projects provider specific logic into respective session providers
* refactor: move rename and delete sessions to modules
* refactor: move fetching messages to module
* fix: remove unused var
Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>
* Potential fix for pull request finding 'Useless assignment to local variable'
Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>
* refactor(projects/sidebar): remove temp snapshot side-effects and simplify session metadata UX
Why this change was needed:
- Project listing had an implicit side effect: every fetch wrote a debug snapshot under `.tmp/project-dumps`.
That added unnecessary disk I/O to a hot path, introduced hidden runtime behavior, and created maintenance
overhead for code that was not part of product functionality.
- Keeping snapshot-specific exports/tests around made the projects module API broader than needed and coupled
tests to temporary/debug behavior instead of user-visible behavior.
- Codex sessions could remain stuck with a placeholder name (`Untitled Codex Session`) even after a real title
became available from newer sync data, which degraded session discoverability in the UI.
- Sidebar session rows showed duplicated provider branding and long-form relative times, which added visual noise
and reduced scan speed when many sessions are listed.
What changed:
- Removed temporary projects snapshot dumping from `projects-with-sessions-fetch.service.ts`:
- deleted snapshot types/helpers and file-write flow
- removed the write call from `getProjectsWithSessions`
- Removed snapshot-related surface area from `projects/index.ts`.
- Removed the snapshot-focused test `projects.service.test.ts` that only validated removed debug behavior.
- Updated `codex-session-synchronizer.provider.ts` to upgrade session names when an existing session still has
the placeholder title but a real parsed name is now available.
- Updated `SidebarSessionItem.tsx`:
- removed duplicate provider logo rendering in each session row
- moved age indicator to the right side
- made age indicator fade on hover to prioritize action controls
- switched to compact relative time format (`<1m`, `Xm`, `Xhr`, `Xd`) for faster list scanning
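The compact relative-age format can be sketched as below; the exact thresholds are assumptions inferred from the listed buckets (`<1m`, `Xm`, `Xhr`, `Xd`), and `formatCompactAge` is a hypothetical name:

```javascript
// Sketch of the compact relative-age labels used in the sidebar session rows.
// Bucket boundaries (60 min, 24 hr) are assumptions based on the description.
function formatCompactAge(ageMs) {
  const minutes = Math.floor(ageMs / 60000);
  if (minutes < 1) return '<1m';
  if (minutes < 60) return `${minutes}m`;
  const hours = Math.floor(minutes / 60);
  if (hours < 24) return `${hours}hr`;
  return `${Math.floor(hours / 24)}d`;
}
```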
Outcome:
- Lower overhead and fewer hidden side effects in project fetches.
- Cleaner module boundaries in projects.
- Better Codex session naming consistency after sync.
- Cleaner sidebar density and clearer hover/action behavior.
* refactor: implement pagination for project sessions loading
* refactor: move search to module
* fix: search performance
* refactor: add handling for internal Codex metadata in conversation search
* fix(migrations,projects,clone): normalize legacy schema before writes and harden conflict detection
Why
- Legacy installs can have a sessions table shape that predates provider/custom_name columns. Running migrateLegacySessionNames first caused its INSERT OR REPLACE INTO sessions (...) to target columns that may not exist and fail during startup migration.
- Some upgraded databases had projects.project_id as plain TEXT instead of a real PRIMARY KEY. That breaks assumptions used by id-based lookups and can allow invalid/duplicate identity semantics over time.
- projectsDb.createProjectPath inferred outcomes from row.isArchived, but the upsert path always returns the post-update row with isArchived=0, so archived-reactivation and fresh-create could be misclassified.
- git clone accepted user-controlled URLs directly in argv position, so inputs beginning with - could be interpreted as options instead of a repository argument.
What
- Added rebuildProjectsTableWithPrimaryKeySchema in migrations: detect table shape via getTableInfo('projects'), verify project_id has pk=1, and rebuild when missing.
- Rebuild flow now creates a canonical projects__new table (project_id TEXT PRIMARY KEY), copies rows with transformation, backfills empty ids via SQLITE_UUID_SQL, deduplicates conflicting ids/paths, then swaps tables inside a transaction.
- Replaced the prior addColumnToTableIfNotExists(...) + UPDATE project_id sequence with PK-aware detection/rebuild logic so legacy DBs converge to the required schema.
- Reordered migration sequence to run rebuildSessionsTableWithProjectSchema before migrateLegacySessionNames, ensuring sessions is normalized before legacy session_names merge writes execute.
- Updated projectsDb.createProjectPath to generate an attemptedId before insert, pass it into the prepared statement, and classify outcomes by comparing returned row.project_id to attemptedId (created vs reactivated_archived), with no-row remaining active_conflict.
- Hardened clone execution by inserting -- before clone URL in git argv and rejecting normalized GitHub URLs that start with - in startCloneProject.
Tests
- Added integration coverage for projectsDb.createProjectPath branches: fresh insert, archived reactivation, and active conflict.
- Added clone service test for option-prefixed githubUrl rejection (INVALID_GITHUB_URL).
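The clone-argument hardening can be sketched as follows; `buildCloneArgs` is an illustrative helper (the real code shells out to git via child_process), but the `--` separator and the leading-dash rejection match the description:

```javascript
// Sketch: reject URLs that git could parse as options, and pass '--' so
// nothing after it is ever treated as a flag.
function buildCloneArgs(githubUrl, targetDir) {
  if (githubUrl.startsWith('-')) {
    throw new Error('INVALID_GITHUB_URL'); // e.g. '--upload-pack=...' injection
  }
  return ['clone', '--', githubUrl, targetDir];
}
```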
* refactor(session-synchronizer): update last scanned timestamp based on synchronization results
* refactor: improve session limit and offset validation in provider routes
* refactor: normalize project paths across database and service modules
* refactor(database): make session id the primary key in sessions table
* fix(codex): preserve reasoning entries as thinking blocks
Codex history normalization was downgrading reasoning into plain assistant text
because of branch ordering, not because the raw data was missing.
Why this mattered:
- Codex reasoning JSONL entries are intentionally mapped to history items with
type thinking, but they also carry message.role assistant.
- normalizeHistoryEntry evaluated the assistant-role branch before the
thinking branch.
- As a result, reasoning content matched the assistant-text path first and was
emitted as kind text instead of kind thinking.
- This collapses semantic intent, so UI and downstream features that rely on
thinking blocks (separate rendering, filtering, and interpretation of model
thought process vs final answer) receive the wrong message kind.
What changed:
- Prioritized thinking detection (raw.type === thinking or raw.isReasoning)
before role-based assistant normalization.
- Kept a non-empty content guard for thinking payloads to avoid emitting empty
artifacts.
Impact:
- Reasoning entries from persisted Codex JSONL now remain thinking blocks
end-to-end.
- Regular assistant text normalization behavior remains unchanged.
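The branch-ordering fix can be sketched as below. This is a simplified stand-in for `normalizeHistoryEntry` (the real entries are richer); the fields shown (`type`, `isReasoning`, `message.role`) follow the description above:

```javascript
// Sketch: check thinking markers *before* falling back to role-based
// assistant handling, with a non-empty content guard on thinking payloads.
function classifyHistoryEntry(raw) {
  const content = raw.content || '';
  if ((raw.type === 'thinking' || raw.isReasoning) && content.trim()) {
    return { kind: 'thinking', content }; // reasoning stays a thinking block
  }
  if (raw.message && raw.message.role === 'assistant') {
    return { kind: 'text', content };     // plain assistant text
  }
  return null;
}
```

With the old ordering (role check first), the first branch was unreachable for Codex reasoning entries, which also carry `message.role === 'assistant'`.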
* refactor: remove dead code
* refactor: directly use getProjectPathById from projectsDb
* refactor: add gemini jsonl session support
1469 lines
46 KiB
JavaScript
/**
 * TASKMASTER API ROUTES
 * ====================
 *
 * This module provides API endpoints for TaskMaster integration including:
 * - .taskmaster folder detection in project directories
 * - MCP server configuration detection
 * - TaskMaster state and metadata management
 */

import express from 'express';
import fs from 'fs';
import path from 'path';
import { promises as fsPromises } from 'fs';
import { spawn } from 'child_process';
import { projectsDb } from '../modules/database/index.js';
import { detectTaskMasterMCPServer } from '../utils/mcp-detector.js';
import { broadcastTaskMasterProjectUpdate, broadcastTaskMasterTasksUpdate } from '../utils/taskmaster-websocket.js';

/**
 * Resolve the absolute project directory from a DB-assigned `projectId`.
 *
 * TaskMaster routes used to accept a Claude-encoded folder name (`projectName`)
 * and derive the path from JSONL history. After the projectId migration the
 * only identifier we accept is the primary key of the `projects` table, so
 * every handler calls this helper and 404s when the id is unknown.
 */
async function resolveProjectPathFromId(projectId) {
  if (!projectId) {
    return null;
  }
  return projectsDb.getProjectPathById(projectId);
}

const router = express.Router();

/**
 * Check if TaskMaster CLI is installed globally
 * @returns {Promise<Object>} Installation status result
 */
async function checkTaskMasterInstallation() {
  return new Promise((resolve) => {
    // Check if task-master command is available
    const child = spawn('which', ['task-master'], {
      stdio: ['ignore', 'pipe', 'pipe'],
      shell: true
    });

    let output = '';
    let errorOutput = '';

    child.stdout.on('data', (data) => {
      output += data.toString();
    });

    child.stderr.on('data', (data) => {
      errorOutput += data.toString();
    });

    child.on('close', (code) => {
      if (code === 0 && output.trim()) {
        // TaskMaster is installed, get version
        const versionChild = spawn('task-master', ['--version'], {
          stdio: ['ignore', 'pipe', 'pipe'],
          shell: true
        });

        let versionOutput = '';

        versionChild.stdout.on('data', (data) => {
          versionOutput += data.toString();
        });

        versionChild.on('close', (versionCode) => {
          resolve({
            isInstalled: true,
            installPath: output.trim(),
            version: versionCode === 0 ? versionOutput.trim() : 'unknown',
            reason: null
          });
        });

        versionChild.on('error', () => {
          resolve({
            isInstalled: true,
            installPath: output.trim(),
            version: 'unknown',
            reason: null
          });
        });
      } else {
        resolve({
          isInstalled: false,
          installPath: null,
          version: null,
          reason: 'TaskMaster CLI not found in PATH'
        });
      }
    });

    child.on('error', (error) => {
      resolve({
        isInstalled: false,
        installPath: null,
        version: null,
        reason: `Error checking installation: ${error.message}`
      });
    });
  });
}

// API Routes

/**
 * GET /api/taskmaster/installation-status
 * Check if TaskMaster CLI is installed on the system
 */
router.get('/installation-status', async (req, res) => {
  try {
    const installationStatus = await checkTaskMasterInstallation();

    // Also check for MCP server configuration
    const mcpStatus = await detectTaskMasterMCPServer();

    res.json({
      success: true,
      installation: installationStatus,
      mcpServer: mcpStatus,
      isReady: installationStatus.isInstalled && mcpStatus.hasMCPServer
    });
  } catch (error) {
    console.error('Error checking TaskMaster installation:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to check TaskMaster installation status',
      installation: {
        isInstalled: false,
        reason: `Server error: ${error.message}`
      },
      mcpServer: {
        hasMCPServer: false,
        reason: `Server error: ${error.message}`
      },
      isReady: false
    });
  }
});

/**
 * GET /api/taskmaster/tasks/:projectId
 * Load actual tasks from .taskmaster/tasks/tasks.json
 *
 * `projectId` is the DB primary key of the project; the folder is resolved via
 * the projects table rather than extracted from Claude JSONL history.
 */
router.get('/tasks/:projectId', async (req, res) => {
  try {
    const { projectId } = req.params;

    // Get project path via the DB; the legacy JSONL-based resolver is gone.
    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    const taskMasterPath = path.join(projectPath, '.taskmaster');
    const tasksFilePath = path.join(taskMasterPath, 'tasks', 'tasks.json');

    // Check if tasks file exists
    try {
      await fsPromises.access(tasksFilePath);
    } catch (error) {
      return res.json({
        projectId,
        tasks: [],
        message: 'No tasks.json file found'
      });
    }

    // Read and parse tasks file
    try {
      const tasksContent = await fsPromises.readFile(tasksFilePath, 'utf8');
      const tasksData = JSON.parse(tasksContent);

      let tasks = [];
      let currentTag = 'master';

      // Handle both tagged and legacy formats
      if (Array.isArray(tasksData)) {
        // Legacy format
        tasks = tasksData;
      } else if (tasksData.tasks) {
        // Simple format with tasks array
        tasks = tasksData.tasks;
      } else {
        // Tagged format - get tasks from current tag or master
        if (tasksData[currentTag] && tasksData[currentTag].tasks) {
          tasks = tasksData[currentTag].tasks;
        } else if (tasksData.master && tasksData.master.tasks) {
          tasks = tasksData.master.tasks;
        } else {
          // Get tasks from first available tag
          const firstTag = Object.keys(tasksData).find(key =>
            tasksData[key].tasks && Array.isArray(tasksData[key].tasks)
          );
          if (firstTag) {
            tasks = tasksData[firstTag].tasks;
            currentTag = firstTag;
          }
        }
      }

      // Transform tasks to ensure all have required fields
      const transformedTasks = tasks.map(task => ({
        id: task.id,
        title: task.title || 'Untitled Task',
        description: task.description || '',
        status: task.status || 'pending',
        priority: task.priority || 'medium',
        dependencies: task.dependencies || [],
        createdAt: task.createdAt || task.created || new Date().toISOString(),
        updatedAt: task.updatedAt || task.updated || new Date().toISOString(),
        details: task.details || '',
        testStrategy: task.testStrategy || task.test_strategy || '',
        subtasks: task.subtasks || []
      }));

      res.json({
        projectId,
        projectPath,
        tasks: transformedTasks,
        currentTag,
        totalTasks: transformedTasks.length,
        tasksByStatus: {
          pending: transformedTasks.filter(t => t.status === 'pending').length,
          'in-progress': transformedTasks.filter(t => t.status === 'in-progress').length,
          done: transformedTasks.filter(t => t.status === 'done').length,
          review: transformedTasks.filter(t => t.status === 'review').length,
          deferred: transformedTasks.filter(t => t.status === 'deferred').length,
          cancelled: transformedTasks.filter(t => t.status === 'cancelled').length
        },
        timestamp: new Date().toISOString()
      });

    } catch (parseError) {
      console.error('Failed to parse tasks.json:', parseError);
      return res.status(500).json({
        error: 'Failed to parse tasks file',
        message: parseError.message
      });
    }

  } catch (error) {
    console.error('TaskMaster tasks loading error:', error);
    res.status(500).json({
      error: 'Failed to load TaskMaster tasks',
      message: error.message
    });
  }
});

/**
 * GET /api/taskmaster/prd/:projectId
 * List all PRD files in the project's .taskmaster/docs directory
 */
router.get('/prd/:projectId', async (req, res) => {
  try {
    const { projectId } = req.params;

    // projectId → projectPath lookup through the DB (post-migration).
    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    const docsPath = path.join(projectPath, '.taskmaster', 'docs');

    // Check if docs directory exists
    try {
      await fsPromises.access(docsPath, fs.constants.R_OK);
    } catch (error) {
      return res.json({
        projectId,
        prdFiles: [],
        message: 'No .taskmaster/docs directory found'
      });
    }

    // Read directory and filter for PRD files
    try {
      const files = await fsPromises.readdir(docsPath);
      const prdFiles = [];

      for (const file of files) {
        const filePath = path.join(docsPath, file);
        const stats = await fsPromises.stat(filePath);

        if (stats.isFile() && (file.endsWith('.txt') || file.endsWith('.md'))) {
          prdFiles.push({
            name: file,
            path: path.relative(projectPath, filePath),
            size: stats.size,
            modified: stats.mtime.toISOString(),
            created: stats.birthtime.toISOString()
          });
        }
      }

      res.json({
        projectId,
        projectPath,
        prdFiles: prdFiles.sort((a, b) => new Date(b.modified) - new Date(a.modified)),
        timestamp: new Date().toISOString()
      });

    } catch (readError) {
      console.error('Error reading docs directory:', readError);
      return res.status(500).json({
        error: 'Failed to read PRD files',
        message: readError.message
      });
    }

  } catch (error) {
    console.error('PRD list error:', error);
    res.status(500).json({
      error: 'Failed to list PRD files',
      message: error.message
    });
  }
});

/**
 * POST /api/taskmaster/prd/:projectId
 * Create or update a PRD file in the project's .taskmaster/docs directory
 */
router.post('/prd/:projectId', async (req, res) => {
  try {
    const { projectId } = req.params;
    const { fileName, content } = req.body;

    if (!fileName || !content) {
      return res.status(400).json({
        error: 'Missing required fields',
        message: 'fileName and content are required'
      });
    }

    // Validate filename
    if (!fileName.match(/^[\w\-. ]+\.(txt|md)$/)) {
      return res.status(400).json({
        error: 'Invalid filename',
        message: 'Filename must end with .txt or .md and contain only alphanumeric characters, spaces, dots, and dashes'
      });
    }

    // Resolve the project folder through the DB using the projectId param.
    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    const docsPath = path.join(projectPath, '.taskmaster', 'docs');
    const filePath = path.join(docsPath, fileName);

    // Ensure docs directory exists
    try {
      await fsPromises.mkdir(docsPath, { recursive: true });
    } catch (error) {
      console.error('Failed to create docs directory:', error);
      return res.status(500).json({
        error: 'Failed to create directory',
        message: error.message
      });
    }

    // Write the PRD file
    try {
      await fsPromises.writeFile(filePath, content, 'utf8');

      // Get file stats
      const stats = await fsPromises.stat(filePath);

      res.json({
        projectId,
        projectPath,
        fileName,
        filePath: path.relative(projectPath, filePath),
        size: stats.size,
        created: stats.birthtime.toISOString(),
        modified: stats.mtime.toISOString(),
        message: 'PRD file saved successfully',
        timestamp: new Date().toISOString()
      });

    } catch (writeError) {
      console.error('Failed to write PRD file:', writeError);
      return res.status(500).json({
        error: 'Failed to write PRD file',
        message: writeError.message
      });
    }

  } catch (error) {
    console.error('PRD create/update error:', error);
    res.status(500).json({
      error: 'Failed to create/update PRD file',
      message: error.message
    });
  }
});

/**
 * GET /api/taskmaster/prd/:projectId/:fileName
 * Get content of a specific PRD file
 */
router.get('/prd/:projectId/:fileName', async (req, res) => {
  try {
    const { projectId, fileName } = req.params;

    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    const filePath = path.join(projectPath, '.taskmaster', 'docs', fileName);

    // Check if file exists
    try {
      await fsPromises.access(filePath, fs.constants.R_OK);
    } catch (error) {
      return res.status(404).json({
        error: 'PRD file not found',
        message: `File "${fileName}" does not exist`
      });
    }

    // Read file content
    try {
      const content = await fsPromises.readFile(filePath, 'utf8');
      const stats = await fsPromises.stat(filePath);

      res.json({
        projectId,
        projectPath,
        fileName,
        filePath: path.relative(projectPath, filePath),
        content,
        size: stats.size,
        created: stats.birthtime.toISOString(),
        modified: stats.mtime.toISOString(),
        timestamp: new Date().toISOString()
      });

    } catch (readError) {
      console.error('Failed to read PRD file:', readError);
      return res.status(500).json({
        error: 'Failed to read PRD file',
        message: readError.message
      });
    }

  } catch (error) {
    console.error('PRD read error:', error);
    res.status(500).json({
      error: 'Failed to read PRD file',
      message: error.message
    });
  }
});

/**
 * POST /api/taskmaster/init/:projectId
 * Initialize TaskMaster in a project
 */
router.post('/init/:projectId', async (req, res) => {
  try {
    const { projectId } = req.params;

    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    // Check if TaskMaster is already initialized
    const taskMasterPath = path.join(projectPath, '.taskmaster');
    try {
      await fsPromises.access(taskMasterPath, fs.constants.F_OK);
      return res.status(400).json({
        error: 'TaskMaster already initialized',
        message: 'TaskMaster is already configured for this project'
      });
    } catch (error) {
      // Directory doesn't exist, so we can proceed
    }

    // Run taskmaster init command
    const initProcess = spawn('npx', ['task-master', 'init'], {
      cwd: projectPath,
      stdio: ['pipe', 'pipe', 'pipe']
    });

    let stdout = '';
    let stderr = '';

    initProcess.stdout.on('data', (data) => {
      stdout += data.toString();
    });

    initProcess.stderr.on('data', (data) => {
      stderr += data.toString();
    });

    initProcess.on('close', (code) => {
      if (code === 0) {
        // Broadcast TaskMaster project update via WebSocket. The
        // WebSocket payload keeps using `projectId` so the frontend
        // can match notifications against the current selection.
        if (req.app.locals.wss) {
          broadcastTaskMasterProjectUpdate(
            req.app.locals.wss,
            projectId,
            { hasTaskmaster: true, status: 'initialized' }
          );
        }

        res.json({
          projectId,
          projectPath,
          message: 'TaskMaster initialized successfully',
          output: stdout,
          timestamp: new Date().toISOString()
        });
      } else {
        console.error('TaskMaster init failed:', stderr);
        res.status(500).json({
          error: 'Failed to initialize TaskMaster',
          message: stderr || stdout,
          code
        });
      }
    });

    // Answer 'yes' to any interactive prompts
    initProcess.stdin.write('yes\n');
    initProcess.stdin.end();

  } catch (error) {
    console.error('TaskMaster init error:', error);
    res.status(500).json({
      error: 'Failed to initialize TaskMaster',
      message: error.message
    });
  }
});

/**
 * POST /api/taskmaster/add-task/:projectId
 * Add a new task to the project
 */
router.post('/add-task/:projectId', async (req, res) => {
  try {
    const { projectId } = req.params;
    const { prompt, title, description, priority = 'medium', dependencies } = req.body;

    if (!prompt && (!title || !description)) {
      return res.status(400).json({
        error: 'Missing required parameters',
        message: 'Either "prompt" or both "title" and "description" are required'
      });
    }

    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    // Build the task-master add-task command
    const args = ['task-master-ai', 'add-task'];

    if (prompt) {
      args.push('--prompt', prompt);
      args.push('--research'); // Use research for AI-generated tasks
    } else {
      args.push('--prompt', `Create a task titled "${title}" with description: ${description}`);
    }

    if (priority) {
      args.push('--priority', priority);
    }

    if (dependencies) {
      args.push('--dependencies', dependencies);
    }

    // Run task-master add-task command
    const addTaskProcess = spawn('npx', args, {
      cwd: projectPath,
      stdio: ['pipe', 'pipe', 'pipe']
    });

    let stdout = '';
    let stderr = '';

    addTaskProcess.stdout.on('data', (data) => {
      stdout += data.toString();
    });

    addTaskProcess.stderr.on('data', (data) => {
      stderr += data.toString();
    });

    addTaskProcess.on('close', (code) => {
      console.log('Add task process completed with code:', code);
      console.log('Stdout:', stdout);
      console.log('Stderr:', stderr);

      if (code === 0) {
        // Broadcast task update via WebSocket using the projectId so
        // clients subscribed to this project get notified immediately.
        if (req.app.locals.wss) {
          broadcastTaskMasterTasksUpdate(
            req.app.locals.wss,
            projectId
          );
        }

        res.json({
          projectId,
          projectPath,
          message: 'Task added successfully',
          output: stdout,
          timestamp: new Date().toISOString()
        });
      } else {
        console.error('Add task failed:', stderr);
        res.status(500).json({
          error: 'Failed to add task',
          message: stderr || stdout,
          code
        });
      }
    });

    addTaskProcess.stdin.end();

  } catch (error) {
    console.error('Add task error:', error);
    res.status(500).json({
      error: 'Failed to add task',
      message: error.message
    });
  }
});

/**
 * PUT /api/taskmaster/update-task/:projectId/:taskId
 * Update a specific task using TaskMaster CLI
 */
router.put('/update-task/:projectId/:taskId', async (req, res) => {
  try {
    const { projectId, taskId } = req.params;
    const { title, description, status, priority, details } = req.body;

    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    // If only updating status, use set-status command
    if (status && Object.keys(req.body).length === 1) {
      const setStatusProcess = spawn('npx', ['task-master-ai', 'set-status', `--id=${taskId}`, `--status=${status}`], {
        cwd: projectPath,
        stdio: ['pipe', 'pipe', 'pipe']
      });

      let stdout = '';
      let stderr = '';

      setStatusProcess.stdout.on('data', (data) => {
        stdout += data.toString();
      });

      setStatusProcess.stderr.on('data', (data) => {
        stderr += data.toString();
      });

      setStatusProcess.on('close', (code) => {
        if (code === 0) {
          // Broadcast task update via WebSocket
          if (req.app.locals.wss) {
            broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectId);
          }

          res.json({
            projectId,
            projectPath,
            taskId,
            message: 'Task status updated successfully',
            output: stdout,
            timestamp: new Date().toISOString()
          });
        } else {
          console.error('Set task status failed:', stderr);
          res.status(500).json({
            error: 'Failed to update task status',
            message: stderr || stdout,
            code
          });
        }
      });

      setStatusProcess.stdin.end();
    } else {
      // For other updates, use update-task command with a prompt describing the changes
      const updates = [];
      if (title) updates.push(`title: "${title}"`);
      if (description) updates.push(`description: "${description}"`);
      if (priority) updates.push(`priority: "${priority}"`);
      if (details) updates.push(`details: "${details}"`);

      const prompt = `Update task with the following changes: ${updates.join(', ')}`;

      const updateProcess = spawn('npx', ['task-master-ai', 'update-task', `--id=${taskId}`, `--prompt=${prompt}`], {
        cwd: projectPath,
        stdio: ['pipe', 'pipe', 'pipe']
      });

      let stdout = '';
      let stderr = '';

      updateProcess.stdout.on('data', (data) => {
        stdout += data.toString();
      });

      updateProcess.stderr.on('data', (data) => {
        stderr += data.toString();
      });

      updateProcess.on('close', (code) => {
        if (code === 0) {
          // Broadcast task update via WebSocket
          if (req.app.locals.wss) {
            broadcastTaskMasterTasksUpdate(req.app.locals.wss, projectId);
          }

          res.json({
            projectId,
            projectPath,
            taskId,
            message: 'Task updated successfully',
            output: stdout,
            timestamp: new Date().toISOString()
          });
        } else {
          console.error('Update task failed:', stderr);
          res.status(500).json({
            error: 'Failed to update task',
            message: stderr || stdout,
            code
          });
        }
      });

      updateProcess.stdin.end();
    }

  } catch (error) {
    console.error('Update task error:', error);
    res.status(500).json({
      error: 'Failed to update task',
      message: error.message
    });
  }
});

/**
 * POST /api/taskmaster/parse-prd/:projectId
 * Parse a PRD file to generate tasks
 */
router.post('/parse-prd/:projectId', async (req, res) => {
  try {
    const { projectId } = req.params;
    const { fileName = 'prd.txt', numTasks, append = false } = req.body;

    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    const prdPath = path.join(projectPath, '.taskmaster', 'docs', fileName);

    // Check if PRD file exists
    try {
      await fsPromises.access(prdPath, fs.constants.F_OK);
    } catch (error) {
      return res.status(404).json({
        error: 'PRD file not found',
        message: `File "${fileName}" does not exist in .taskmaster/docs/`
      });
    }

    // Build the command args
    const args = ['task-master-ai', 'parse-prd', prdPath];

    if (numTasks) {
      args.push('--num-tasks', numTasks.toString());
    }

    if (append) {
      args.push('--append');
    }

    args.push('--research'); // Use research for better PRD parsing

    // Run task-master parse-prd command
    const parsePRDProcess = spawn('npx', args, {
      cwd: projectPath,
      stdio: ['pipe', 'pipe', 'pipe']
    });

    let stdout = '';
    let stderr = '';

    parsePRDProcess.stdout.on('data', (data) => {
      stdout += data.toString();
    });

    parsePRDProcess.stderr.on('data', (data) => {
      stderr += data.toString();
    });

    parsePRDProcess.on('close', (code) => {
      if (code === 0) {
        // Broadcast task update via WebSocket
        if (req.app.locals.wss) {
          broadcastTaskMasterTasksUpdate(
            req.app.locals.wss,
            projectId
          );
        }

        res.json({
          projectId,
          projectPath,
          prdFile: fileName,
          message: 'PRD parsed and tasks generated successfully',
          output: stdout,
          timestamp: new Date().toISOString()
        });
      } else {
        console.error('Parse PRD failed:', stderr);
        res.status(500).json({
          error: 'Failed to parse PRD',
          message: stderr || stdout,
          code
        });
      }
    });

    parsePRDProcess.stdin.end();

  } catch (error) {
    console.error('Parse PRD error:', error);
    res.status(500).json({
      error: 'Failed to parse PRD',
      message: error.message
    });
  }
});

/**
 * GET /api/taskmaster/prd-templates
 * Get available PRD templates
 */
router.get('/prd-templates', async (req, res) => {
  try {
    // Return built-in templates
    const templates = [
      {
        id: 'web-app',
        name: 'Web Application',
        description: 'Template for web application projects with frontend and backend components',
        category: 'web',
        content: `# Product Requirements Document - Web Application

## Overview
**Product Name:** [Your App Name]
**Version:** 1.0
**Date:** ${new Date().toISOString().split('T')[0]}
**Author:** [Your Name]

## Executive Summary
Brief description of what this web application will do and why it's needed.

## Product Goals
- Goal 1: [Specific measurable goal]
- Goal 2: [Specific measurable goal]
- Goal 3: [Specific measurable goal]

## User Stories
### Core Features
1. **User Registration & Authentication**
   - As a user, I want to create an account so I can access personalized features
   - As a user, I want to log in securely so my data is protected
   - As a user, I want to reset my password if I forget it

2. **Main Application Features**
   - As a user, I want to [core feature 1] so I can [benefit]
   - As a user, I want to [core feature 2] so I can [benefit]
   - As a user, I want to [core feature 3] so I can [benefit]

3. **User Interface**
   - As a user, I want a responsive design so I can use the app on any device
   - As a user, I want intuitive navigation so I can easily find features

## Technical Requirements
### Frontend
- Framework: React/Vue/Angular or vanilla JavaScript
- Styling: CSS framework (Tailwind, Bootstrap, etc.)
- State Management: Redux/Vuex/Context API
- Build Tools: Webpack/Vite
- Testing: Jest/Vitest for unit tests

### Backend
- Runtime: Node.js/Python/Java
- Database: PostgreSQL/MySQL/MongoDB
- API: RESTful API or GraphQL
- Authentication: JWT tokens
- Testing: Integration and unit tests

### Infrastructure
- Hosting: Cloud provider (AWS, Azure, GCP)
- CI/CD: GitHub Actions/GitLab CI
- Monitoring: Application monitoring tools
- Security: HTTPS, input validation, rate limiting

## Success Metrics
- User engagement metrics
- Performance benchmarks (load time < 2s)
- Error rates < 1%
- User satisfaction scores

## Timeline
- Phase 1: Core functionality (4-6 weeks)
- Phase 2: Advanced features (2-4 weeks)
- Phase 3: Polish and launch (2 weeks)

## Constraints & Assumptions
- Budget constraints
- Technical limitations
- Team size and expertise
- Timeline constraints`
      },
      {
        id: 'api',
        name: 'REST API',
        description: 'Template for REST API development projects',
        category: 'backend',
        content: `# Product Requirements Document - REST API

## Overview
**API Name:** [Your API Name]
**Version:** v1.0
**Date:** ${new Date().toISOString().split('T')[0]}
**Author:** [Your Name]

## Executive Summary
Description of the API's purpose, target users, and primary use cases.

## API Goals
- Goal 1: Provide secure data access
- Goal 2: Ensure scalable architecture
- Goal 3: Maintain high availability (99.9% uptime)

## Functional Requirements
### Core Endpoints
1. **Authentication Endpoints**
   - POST /api/auth/login - User authentication
   - POST /api/auth/logout - User logout
   - POST /api/auth/refresh - Token refresh
   - POST /api/auth/register - User registration

2. **Data Management Endpoints**
   - GET /api/resources - List resources with pagination
   - GET /api/resources/{id} - Get specific resource
   - POST /api/resources - Create new resource
   - PUT /api/resources/{id} - Update existing resource
   - DELETE /api/resources/{id} - Delete resource

3. **Administrative Endpoints**
   - GET /api/admin/users - Manage users (admin only)
   - GET /api/admin/analytics - System analytics
   - POST /api/admin/backup - Trigger system backup

## Technical Requirements
### API Design
- RESTful architecture following OpenAPI 3.0 specification
- JSON request/response format
- Consistent error response format
- API versioning strategy

### Authentication & Security
- JWT token-based authentication
- Role-based access control (RBAC)
- Rate limiting (100 requests/minute per user)
- Input validation and sanitization
- HTTPS enforcement

### Database
- Database type: [PostgreSQL/MongoDB/MySQL]
- Connection pooling
- Database migrations
- Backup and recovery procedures

### Performance Requirements
- Response time: < 200ms for 95% of requests
- Throughput: 1000+ requests/second
- Concurrent users: 10,000+
- Database query optimization

### Documentation
- Auto-generated API documentation (Swagger/OpenAPI)
- Code examples for common use cases
- SDK development for major languages
- Postman collection for testing

## Error Handling
- Standardized error codes and messages
- Proper HTTP status codes
- Detailed error logging
- Graceful degradation strategies

## Testing Strategy
- Unit tests (80%+ coverage)
- Integration tests for all endpoints
- Load testing and performance testing
- Security testing (OWASP compliance)

## Monitoring & Logging
- Application performance monitoring
- Error tracking and alerting
- Access logs and audit trails
- Health check endpoints

## Deployment
- Containerized deployment (Docker)
- CI/CD pipeline setup
- Environment management (dev, staging, prod)
- Blue-green deployment strategy

## Success Metrics
- API uptime > 99.9%
- Average response time < 200ms
- Zero critical security vulnerabilities
- Developer adoption metrics`
      },
      {
        id: 'mobile-app',
        name: 'Mobile Application',
        description: 'Template for mobile app development projects (iOS/Android)',
        category: 'mobile',
        content: `# Product Requirements Document - Mobile Application

## Overview
**App Name:** [Your App Name]
**Platform:** iOS / Android / Cross-platform
**Version:** 1.0
**Date:** ${new Date().toISOString().split('T')[0]}
**Author:** [Your Name]

## Executive Summary
Brief description of the mobile app's purpose, target audience, and key value proposition.

## Product Goals
- Goal 1: [Specific user engagement goal]
- Goal 2: [Specific functionality goal]
- Goal 3: [Specific performance goal]

## User Stories
### Core Features
1. **Onboarding & Authentication**
   - As a new user, I want a simple onboarding process
   - As a user, I want to sign up with email or social media
   - As a user, I want biometric authentication for security

2. **Main App Features**
   - As a user, I want [core feature 1] accessible from home screen
   - As a user, I want [core feature 2] to work offline
   - As a user, I want to sync data across devices

3. **User Experience**
   - As a user, I want intuitive navigation patterns
   - As a user, I want fast loading times
   - As a user, I want accessibility features

## Technical Requirements
### Mobile Development
- **Cross-platform:** React Native / Flutter / Xamarin
- **Native:** Swift (iOS) / Kotlin (Android)
- **State Management:** Redux / MobX / Provider
- **Navigation:** React Navigation / Flutter Navigation

### Backend Integration
- REST API or GraphQL integration
- Real-time features (WebSockets/Push notifications)
- Offline data synchronization
- Background processing

### Device Features
- Camera and photo library access
- GPS location services
- Push notifications
- Biometric authentication
- Device storage

### Performance Requirements
- App launch time < 3 seconds
- Screen transition animations < 300ms
- Memory usage optimization
- Battery usage optimization

## Platform-Specific Considerations
### iOS Requirements
- iOS 13.0+ minimum version
- App Store guidelines compliance
- iOS design guidelines (Human Interface Guidelines)
- TestFlight beta testing

### Android Requirements
- Android 8.0+ (API level 26) minimum
- Google Play Store guidelines
- Material Design guidelines
- Google Play Console testing

## User Interface Design
- Responsive design for different screen sizes
- Dark mode support
- Accessibility compliance (WCAG 2.1)
- Consistent design system

## Security & Privacy
- Secure data storage (Keychain/Keystore)
- API communication encryption
- Privacy policy compliance (GDPR/CCPA)
- App security best practices

## Testing Strategy
- Unit testing (80%+ coverage)
- UI/E2E testing (Detox/Appium)
- Device testing on multiple screen sizes
- Performance testing
- Security testing

## App Store Deployment
- App store optimization (ASO)
- App icons and screenshots
- Store listing content
- Release management strategy

## Analytics & Monitoring
- User analytics (Firebase/Analytics)
- Crash reporting (Crashlytics/Sentry)
- Performance monitoring
- User feedback collection

## Success Metrics
- App store ratings > 4.0
- User retention rates
- Daily/Monthly active users
- App performance metrics
- Conversion rates`
      },
      {
        id: 'data-analysis',
        name: 'Data Analysis Project',
        description: 'Template for data analysis and visualization projects',
        category: 'data',
        content: `# Product Requirements Document - Data Analysis Project

## Overview
**Project Name:** [Your Analysis Project]
**Analysis Type:** [Descriptive/Predictive/Prescriptive]
**Date:** ${new Date().toISOString().split('T')[0]}
**Author:** [Your Name]

## Executive Summary
Description of the business problem, data sources, and expected insights.

## Project Goals
- Goal 1: [Specific business question to answer]
- Goal 2: [Specific prediction to make]
- Goal 3: [Specific recommendation to provide]

## Business Requirements
### Key Questions
1. What patterns exist in the current data?
2. What factors influence [target variable]?
3. What predictions can be made for [future outcome]?
4. What recommendations can improve [business metric]?

### Success Criteria
- Actionable insights for stakeholders
- Statistical significance in findings
- Reproducible analysis pipeline
- Clear visualization and reporting

## Data Requirements
### Data Sources
1. **Primary Data**
   - Source: [Database/API/Files]
   - Format: [CSV/JSON/SQL]
   - Size: [Volume estimate]
   - Update frequency: [Real-time/Daily/Monthly]

2. **External Data**
   - Third-party APIs
   - Public datasets
   - Market research data

### Data Quality Requirements
- Data completeness (< 5% missing values)
- Data accuracy validation
- Data consistency checks
- Historical data availability

## Technical Requirements
### Data Pipeline
- Data extraction and ingestion
- Data cleaning and preprocessing
- Data transformation and feature engineering
- Data validation and quality checks

### Analysis Tools
- **Programming:** Python/R/SQL
- **Libraries:** pandas, numpy, scikit-learn, matplotlib
- **Visualization:** Tableau, PowerBI, or custom dashboards
- **Version Control:** Git for code and DVC for data

### Computing Resources
- Local development environment
- Cloud computing (AWS/GCP/Azure) if needed
- Database access and permissions
- Storage requirements

## Analysis Methodology
### Data Exploration
1. Descriptive statistics and data profiling
2. Data visualization and pattern identification
3. Correlation analysis
4. Outlier detection and handling

### Statistical Analysis
1. Hypothesis formulation
2. Statistical testing
3. Confidence intervals
4. Effect size calculations

### Machine Learning (if applicable)
1. Feature selection and engineering
2. Model selection and training
3. Cross-validation and evaluation
4. Model interpretation and explainability

## Deliverables
### Reports
- Executive summary for stakeholders
- Technical analysis report
- Data quality report
- Methodology documentation

### Visualizations
- Interactive dashboards
- Static charts and graphs
- Data story presentations
- Key findings infographics

### Code & Documentation
- Reproducible analysis scripts
- Data pipeline code
- Documentation and comments
- Testing and validation code

## Timeline
- Phase 1: Data collection and exploration (2 weeks)
- Phase 2: Analysis and modeling (3 weeks)
- Phase 3: Reporting and visualization (1 week)
- Phase 4: Stakeholder presentation (1 week)

## Risks & Assumptions
- Data availability and quality risks
- Technical complexity assumptions
- Resource and timeline constraints
- Stakeholder engagement assumptions

## Success Metrics
- Stakeholder satisfaction with insights
- Accuracy of predictions (if applicable)
- Business impact of recommendations
- Reproducibility of results`
      }
    ];

    res.json({
      templates,
      timestamp: new Date().toISOString()
    });

  } catch (error) {
    console.error('PRD templates error:', error);
    res.status(500).json({
      error: 'Failed to get PRD templates',
      message: error.message
    });
  }
});

/**
 * POST /api/taskmaster/apply-template/:projectId
 * Apply a PRD template to create a new PRD file
 */
router.post('/apply-template/:projectId', async (req, res) => {
  try {
    const { projectId } = req.params;
    const { templateId, fileName = 'prd.txt', customizations = {} } = req.body;

    if (!templateId) {
      return res.status(400).json({
        error: 'Missing required parameter',
        message: 'templateId is required'
      });
    }

    const projectPath = await resolveProjectPathFromId(projectId);
    if (!projectPath) {
      return res.status(404).json({
        error: 'Project not found',
        message: `Project "${projectId}" does not exist`
      });
    }

    // Get the template content (this would normally fetch from the templates list)
    const templates = await getAvailableTemplates();
    const template = templates.find(t => t.id === templateId);

    if (!template) {
      return res.status(404).json({
        error: 'Template not found',
        message: `Template "${templateId}" does not exist`
      });
    }

    // Apply customizations to template content
    let content = template.content;

    // Replace each "[key]" placeholder with its customization value,
    // escaping any regex metacharacters in the placeholder first
    for (const [key, value] of Object.entries(customizations)) {
      const placeholder = `[${key}]`;
      const escaped = placeholder.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
      content = content.replace(new RegExp(escaped, 'g'), value);
    }

    // Ensure .taskmaster/docs directory exists
    const docsDir = path.join(projectPath, '.taskmaster', 'docs');
    try {
      await fsPromises.mkdir(docsDir, { recursive: true });
    } catch (error) {
      console.error('Failed to create docs directory:', error);
    }

    const filePath = path.join(docsDir, fileName);

    // Write the template content to the file
    try {
      await fsPromises.writeFile(filePath, content, 'utf8');

      res.json({
        projectId,
        projectPath,
        templateId,
        templateName: template.name,
        fileName,
        filePath,
        message: 'PRD template applied successfully',
        timestamp: new Date().toISOString()
      });

    } catch (writeError) {
      console.error('Failed to write PRD template:', writeError);
      return res.status(500).json({
        error: 'Failed to write PRD template',
        message: writeError.message
      });
    }

  } catch (error) {
    console.error('Apply template error:', error);
    res.status(500).json({
      error: 'Failed to apply PRD template',
      message: error.message
    });
  }
});

// Helper function to get available templates
async function getAvailableTemplates() {
  // This could be extended to read from files or database
  return [
    {
      id: 'web-app',
      name: 'Web Application',
      description: 'Template for web application projects',
      category: 'web',
      content: `# Product Requirements Document - Web Application

## Overview
**Product Name:** [Your App Name]
**Version:** 1.0
**Date:** ${new Date().toISOString().split('T')[0]}
**Author:** [Your Name]

## Executive Summary
Brief description of what this web application will do and why it's needed.

## User Stories
1. As a user, I want [feature] so I can [benefit]
2. As a user, I want [feature] so I can [benefit]
3. As a user, I want [feature] so I can [benefit]

## Technical Requirements
- Frontend framework
- Backend services
- Database requirements
- Security considerations

## Success Metrics
- User engagement metrics
- Performance benchmarks
- Business objectives`
    },
    // Add other templates here if needed
  ];
}

export default router;