Optimize one-click conversion mode (一键转换模式优化)
All checks were successful
Gitea Actions Demo / Explore-Gitea-Actions (push) Successful in 1m22s

This commit is contained in:
Song367 2026-03-11 21:53:41 +08:00
parent 5615d28012
commit b49d703e3c
18 changed files with 3083 additions and 194 deletions


@ -0,0 +1,156 @@
# Account Sync Drafts Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Add lightweight account/password auth, cloud draft persistence, cross-device restore, resumable episode conversion, and full-regenerate controls.

**Architecture:** Keep the current Vite frontend, add a minimal Express + SQLite backend under `server/`, and let the frontend auto-save/load the current draft for the logged-in account. Continue using the existing conversion UI, but persist the generated episode results and resume state remotely.

**Tech Stack:** React 19, TypeScript, Vite, Express, better-sqlite3, Node crypto, SQLite

---
### Task 1: Create backend skeleton and shared storage model
**Files:**
- Create: `server/index.ts`
- Create: `server/db.ts`
- Create: `server/auth.ts`
- Modify: `package.json`
- Modify: `vite.config.ts`
**Step 1: Add a backend run script**
- Add `dev:server` and `dev:full` scripts.
- Keep existing frontend scripts unchanged.
**Step 2: Create SQLite schema helpers**
- Add `users`, `sessions`, and `drafts` tables.
- Store one current draft JSON blob per user for now.
**Step 3: Add Express server bootstrap**
- Enable JSON body parsing.
- Mount auth and draft endpoints under `/api`.
**Step 4: Add Vite proxy for local development**
- Proxy `/api` to the local backend server.
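A minimal sketch of that proxy entry, assuming the backend listens on port 3001 (the port is an assumption, and this should be merged into the existing `vite.config.ts` rather than replacing it):

```typescript
// vite.config.ts sketch: forward /api requests to the local Express server in dev.
// Port 3001 is illustrative; match whatever the backend actually listens on.
import { defineConfig } from 'vite';

export default defineConfig({
  server: {
    proxy: {
      '/api': {
        target: 'http://localhost:3001',
        changeOrigin: true,
      },
    },
  },
});
```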
**Step 5: Verify backend compiles under TypeScript**
- Run `npm run lint`.
### Task 2: Implement lightweight account/password authentication
**Files:**
- Modify: `server/db.ts`
- Modify: `server/auth.ts`
- Modify: `server/index.ts`
**Step 1: Implement password hashing with Node crypto**
- Use `scryptSync` + random salt.
**Step 2: Add register endpoint**
- Accept `username` and `password`.
- Reject duplicate usernames and invalid payloads.
**Step 3: Add login endpoint**
- Verify password.
- Create a random session token and persist it.
**Step 4: Add auth middleware**
- Read bearer token.
- Resolve current user from `sessions`.
**Step 5: Add session restore endpoint**
- Return current user info when token is valid.
### Task 3: Add cloud draft persistence endpoints
**Files:**
- Modify: `server/db.ts`
- Modify: `server/index.ts`
- Create: `server/types.ts`
**Step 1: Define draft payload shape**
- Include conversion settings, extracted episodes, conversion episode results, finalized selections, and source metadata.
**Step 2: Add load current draft endpoint**
- Return the saved draft JSON for the authenticated user.
**Step 3: Add save current draft endpoint**
- Upsert the latest draft JSON and update timestamp.
**Step 4: Add reset/restart endpoint if needed**
- Allow frontend to intentionally replace cloud draft on full regeneration.
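The load/save semantics above can be sketched with an in-memory stand-in for the `drafts` table; the real implementation would go through better-sqlite3, and the `saveDraft`/`loadDraft` names are illustrative, not the repo's actual API:

```typescript
// One current draft per user: saving always upserts, loading returns null when absent.
interface DraftRow {
  payload: string;   // JSON blob of the draft state
  updatedAt: string; // ISO timestamp, refreshed on every save
}

function saveDraft(store: Map<number, DraftRow>, userId: number, payload: unknown): DraftRow {
  const row: DraftRow = { payload: JSON.stringify(payload), updatedAt: new Date().toISOString() };
  store.set(userId, row); // upsert: replaces any existing draft for this user
  return row;
}

function loadDraft(store: Map<number, DraftRow>, userId: number): unknown | null {
  const row = store.get(userId);
  return row ? JSON.parse(row.payload) : null;
}
```

With SQLite, the same upsert maps to `INSERT INTO drafts ... ON CONFLICT(user_id) DO UPDATE`.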
### Task 4: Add frontend auth state and draft synchronization
**Files:**
- Create: `src/services/api.ts`
- Modify: `src/App.tsx`
- Modify: `src/main.tsx` if needed
**Step 1: Add auth modal/panel UI**
- Support register and login with username/password.
**Step 2: Add token persistence**
- Save token locally and restore session on app load.
**Step 3: Load remote draft on login/app restore**
- Hydrate the conversion state from the server.
**Step 4: Auto-save remote draft**
- Debounce save calls when relevant conversion state changes.
- Save generated content and manual edits.
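The debounced save can be sketched as below; `saveDraftRemote` is a hypothetical API helper, not an existing function in the repo:

```typescript
// Collapse bursts of state changes into a single trailing call after waitMs of quiet.
function debounce<T extends unknown[]>(fn: (...args: T) => void, waitMs: number): (...args: T) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: T) => {
    if (timer !== undefined) clearTimeout(timer); // drop the pending call
    timer = setTimeout(() => fn(...args), waitMs);
  };
}

// Usage sketch (saveDraftRemote is hypothetical):
// const scheduleSave = debounce((draft: unknown) => saveDraftRemote(draft), 1500);
```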
### Task 5: Strengthen resumable conversion workflow
**Files:**
- Modify: `src/App.tsx`
- Modify: `src/services/ai.ts`
- Modify: `src/lib/conversionWorkflow.ts`
- Modify: `tests/conversionWorkflow.test.ts`
**Step 1: Preserve current stop/resume behavior**
- Keep paused episode/version state intact.
**Step 2: Add `重新全部生成` control**
- When a resumable state exists, show both `继续续集生成` and `重新全部生成`.
- Full regenerate clears existing generated episode results before starting from episode 1.
**Step 3: Ensure manual edits persist**
- Save edited episode version content into draft state and remote storage.
### Task 6: Enforce continuity and scope constraints in generation
**Files:**
- Modify: `src/App.tsx`
- Modify: `src/services/ai.ts`
**Step 1: Build previous-episode continuity context**
- For episode N, pass episode N-1's selected final version, or its current active version if no version has been finalized.
**Step 2: Restrict generation scope**
- Explicitly tell the model to only generate within the current episode outline shown on the left.
**Step 3: Reassert global constraints**
- Require adherence to worldview, story outline, and core character settings.
**Step 4: Keep resume continuation safe**
- Continue from already generated content without rewriting earlier text.
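The continuity rule in Step 1 can be sketched as a small helper; `EpisodeState` and its field names are assumptions for illustration, not the repo's actual types:

```typescript
// Pick the context text to feed the model for episode N (0-based index).
interface EpisodeState {
  finalVersion?: string;  // user-selected finalized text, if any
  activeVersion?: string; // currently displayed draft text
}

function buildContinuityContext(episodes: EpisodeState[], episodeIndex: number): string {
  if (episodeIndex <= 0) return ''; // episode 1 has no predecessor
  const prev = episodes[episodeIndex - 1];
  // Prefer the finalized version; otherwise fall back to the active draft.
  return prev.finalVersion ?? prev.activeVersion ?? '';
}
```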
### Task 7: Verification
**Files:**
- Test: `tests/conversionWorkflow.test.ts`
- Test: backend endpoint smoke verification via local commands
**Step 1: Run draft workflow unit test**
- Run `node --experimental-strip-types tests/conversionWorkflow.test.ts`
**Step 2: Run type-check**
- Run `npm run lint`
**Step 3: Run production build**
- Run `npm run build`
**Step 4: Smoke-test backend boot if possible**
- Start local backend and verify auth/draft endpoints respond.


@ -0,0 +1,25 @@
# Audience And Word Range Design

**Context**

The app already exposes shared generation controls for script type, themes, and narrative methods. Both conversion mode and creation mode use those controls and persist them locally.

**Decision**

Add two more shared controls in both modes:
- `受众倾向`: `男频` and `女频`, default `男频`
- `字数范围`: `200 - 500`, `500 - 1000`, `1000 - 2000`, `2000 - 3000`, `3000以上`, `不限`, default `不限`
These controls should:
- appear in both modes alongside the existing generation parameters
- use the same pill-button interaction style as the existing filters
- persist via `localStorage`
- be included in all AI generation requests for conversion, single-page creation, and batch creation
**Implementation Notes**
- Keep the values in Chinese only.
- Follow the current state structure in `src/App.tsx` instead of doing a broader settings refactor.
- Extend the AI settings payload passed into `src/services/ai.ts` so prompts can explicitly mention the new choices.


@ -0,0 +1,97 @@
# Audience And Word Range Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Add shared audience and word-range controls to both modes, persist them locally, and include them in every script generation request.

**Architecture:** Extend `src/App.tsx` with two new shared state values and reusable option lists, render matching controls in both parameter areas, and thread the new values through the existing AI request payloads. Update `src/services/ai.ts` to accept the expanded settings object and append the new fields to prompts.

**Tech Stack:** React 19, TypeScript, Vite

---
### Task 1: Add shared state and persistence
**Files:**
- Modify: `src/App.tsx`
**Step 1: Add the failing test surrogate**
There is no existing automated test harness in this repo, so use TypeScript as the first guard by introducing the new state and payload types in a way that will initially break downstream call sites until they are wired through.
**Step 2: Verify the break**
Run: `npm run lint`
Expected: FAIL until all new settings usages are connected.
**Step 3: Write minimal implementation**
- Add `AudiencePreference` and `WordRange` union types.
- Add shared option arrays.
- Add `useState` hooks with Chinese defaults.
- Persist them with `localStorage`.
**Step 4: Verify**
Run: `npm run lint`
Expected: still failing until AI payload changes are completed.
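The state and persistence pieces can be sketched as below. The type and helper names (`AudiencePreference`, `WordRange`, `loadSetting`, `KeyValueStore`) are illustrative, and the storage is abstracted behind an interface so the sketch does not assume a browser environment:

```typescript
// Option lists drive both the union types and the pill-button rendering.
const AUDIENCE_OPTIONS = ['男频', '女频'] as const;
const WORD_RANGE_OPTIONS = ['200 - 500', '500 - 1000', '1000 - 2000', '2000 - 3000', '3000以上', '不限'] as const;

type AudiencePreference = (typeof AUDIENCE_OPTIONS)[number];
type WordRange = (typeof WORD_RANGE_OPTIONS)[number];

// Minimal localStorage-shaped interface, so restore logic is testable outside a browser.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Restore a persisted setting, falling back to the default for missing or invalid values.
function loadSetting<T extends string>(
  store: KeyValueStore,
  key: string,
  allowed: readonly T[],
  fallback: T,
): T {
  const raw = store.getItem(key);
  return raw !== null && (allowed as readonly string[]).includes(raw) ? (raw as T) : fallback;
}
```

In the app, `window.localStorage` would be passed as the store, and a `useState` initializer would call `loadSetting(localStorage, 'audiencePreference', AUDIENCE_OPTIONS, '男频')`.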
### Task 2: Render controls in both modes
**Files:**
- Modify: `src/App.tsx`
**Step 1: Add UI in conversion mode**
- Insert `受众倾向` and `字数范围` rows into the expanded conversion settings panel.
**Step 2: Add UI in creation mode**
- Insert the same two rows into the creation top filter area.
**Step 3: Keep styling consistent**
- Reuse the existing pill-button class patterns.
- Ensure the labels remain Chinese-only.
### Task 3: Thread settings into AI generation
**Files:**
- Modify: `src/App.tsx`
- Modify: `src/services/ai.ts`
**Step 1: Expand payload shape**
- Add `audiencePreference` and `wordRange` to the existing global settings object passed into generation functions.
**Step 2: Update all call sites**
- `convertTextToScript`
- `generatePageScript`
- `generateAllScripts`
**Step 3: Update prompts**
- Mention audience preference and target word range in each generated prompt alongside worldview, outline, and characters.
### Task 4: Verify
**Files:**
- Modify: `src/App.tsx`
- Modify: `src/services/ai.ts`
**Step 1: Run type check**
Run: `npm run lint`
Expected: PASS
**Step 2: Run production build**
Run: `npm run build`
Expected: PASS
**Step 3: Review**
- Confirm both modes show the two controls.
- Confirm defaults are `男频` and `不限`.
- Confirm no English values were introduced.


@ -0,0 +1,44 @@
# Upload And Extract Episodes Design

**Context**

The conversion mode currently accepts only manual text input in the left textarea. The app already has a Doubao streaming integration pattern for script generation, and the extracted content should feed back into the existing `sourceText` flow rather than replacing the rest of the conversion pipeline.

**Decision**

Adopt a client-side upload flow for four file types: Word (`.docx`), text (`.txt`), PDF (`.pdf`), and Markdown (`.md`). After upload, the app will read the file in the browser, send the raw text to a new Doubao extraction call using `doubao-seed-1-6-flash-250828`, and stream the model output directly into the left-side source textarea.
**Behavior**
- The source input area becomes a hybrid input surface: manual typing still works, and file upload is added alongside it.
- Upload immediately starts extraction without requiring the user to click `立即转换成剧本`.
- The extraction model is instructed to identify each episode and return the original script content 1:1 with no rewriting, normalization, cleanup, or omission.
- The streamed extraction result overwrites `sourceText` progressively so the user can see the result arrive in real time.
- Existing conversion generation stays separate. After extraction completes, the user can still click the existing conversion button to continue with the current workflow.
**Parsing Strategy**
- `.txt` and `.md`: read with `File.text()`.
- `.docx`: parse in-browser with a document-text extraction library.
- `.pdf`: parse in-browser with a PDF text extraction library.
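The format gate in front of that dispatch can be sketched as below; the parser wiring (`File.text()`, mammoth, pdfjs-dist, which the lockfile shows as dependencies) is indicated in comments only, and `classifyUpload` is an illustrative name:

```typescript
// Only these four extensions are accepted; everything else gets an actionable error.
const ALLOWED_EXTENSIONS = ['.txt', '.md', '.docx', '.pdf'] as const;
type AllowedExtension = (typeof ALLOWED_EXTENSIONS)[number];

function classifyUpload(fileName: string): AllowedExtension {
  const dot = fileName.lastIndexOf('.');
  const ext = dot >= 0 ? fileName.slice(dot).toLowerCase() : '';
  if (!(ALLOWED_EXTENSIONS as readonly string[]).includes(ext)) {
    throw new Error(`不支持的文件类型: ${ext || fileName}，仅支持 .docx / .txt / .pdf / .md`);
  }
  return ext as AllowedExtension;
}

// Dispatch sketch per the strategy above:
// '.txt' / '.md' -> await file.text()
// '.docx'        -> mammoth.extractRawText({ arrayBuffer })
// '.pdf'         -> pdfjs getDocument(...) then page.getTextContent()
```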
**UI And State**
- Add upload affordance, accepted-file hint, extraction loading state, and extraction error state in conversion mode.
- Preserve `sourceText` local persistence.
- Keep manual editing enabled after extraction.
**AI Contract**

The new extraction API will:
- use Doubao only
- stream results
- instruct the model to output episode-separated original content only
- avoid any transformations beyond episode boundary recognition
**Risks**
- PDF text extraction quality depends on document structure.
- Even with strict prompting, model-based extraction is probabilistic, so the prompt must strongly prohibit edits and define a deterministic output format.
- Browser-side parsing adds dependency and bundle-size cost.


@ -0,0 +1,109 @@
# Upload And Extract Episodes Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Add file upload for `.docx`, `.txt`, `.pdf`, and `.md` in conversion mode, stream a Doubao-based episode extraction result into the source textarea, and preserve the existing conversion workflow.

**Architecture:** Extend the conversion UI in `src/App.tsx` with upload controls and extraction state, add browser-side file parsing helpers for the four formats, and create a new streaming Doubao extraction method in `src/services/ai.ts`. The extraction result should progressively replace `sourceText`, then hand control back to the user for the existing conversion step.

**Tech Stack:** React 19, TypeScript, Vite, OpenAI SDK, browser file APIs, PDF/docx parsing libraries

---
### Task 1: Add dependency and extraction API surface
**Files:**
- Modify: `package.json`
- Modify: `src/services/ai.ts`
**Step 1: Write the failing test surrogate**
Use TypeScript as the first guard by adding a new extraction function signature and references from `App.tsx` before implementation is complete.
**Step 2: Run verification to confirm the break**
Run: `npm run lint`
Expected: FAIL because the new extraction path is not fully wired.
**Step 3: Write minimal implementation**
- Add any needed client-side parsing dependencies.
- Add `extractEpisodesFromSource` in `src/services/ai.ts`.
- Use model `doubao-seed-1-6-flash-250828`.
- Stream chunks through a callback similar to the existing conversion flow.
- Build a strict prompt requiring 1:1 episode/script extraction with no edits.
**Step 4: Re-run verification**
Run: `npm run lint`
Expected: still failing until UI wiring is complete.
### Task 2: Add browser-side file parsing helpers
**Files:**
- Create: `src/services/fileParsing.ts`
- Modify: `src/App.tsx`
**Step 1: Write the failing test surrogate**
Reference helper APIs from `App.tsx` before the helper module is complete.
**Step 2: Implement minimal parsing**
- Support `.txt` and `.md` via plain text reads.
- Support `.docx` via browser text extraction.
- Support `.pdf` via browser PDF text extraction.
- Return normalized raw text for LLM input without attempting episode splitting locally.
**Step 3: Handle unsupported files**
- Reject anything outside the four allowed extensions.
- Return actionable error messages.
### Task 3: Wire upload UX into conversion mode
**Files:**
- Modify: `src/App.tsx`
**Step 1: Add state**
- Add extraction progress/loading/error state.
- Keep `sourceText` persistence unchanged.
**Step 2: Add UI**
- Add file upload control near the source textarea.
- Show supported formats.
- Show extraction-in-progress feedback distinct from the existing conversion action.
**Step 3: Add upload flow**
- On file select, parse the file.
- Start the streaming extraction call immediately.
- Replace `sourceText` incrementally with streamed content.
- Preserve manual editing after completion.
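The incremental-replacement step can be sketched as a small driver over an async chunk stream; `streamToSourceText` and the setter signature are assumptions mirroring the callback style the plan attributes to the existing conversion flow:

```typescript
// Accumulate streamed chunks and push the growing text into the textarea state on each chunk,
// so the user sees the extraction arrive in real time.
async function streamToSourceText(
  chunks: AsyncIterable<string>,
  setSourceText: (text: string) => void,
): Promise<string> {
  let accumulated = '';
  for await (const chunk of chunks) {
    accumulated += chunk;
    setSourceText(accumulated); // progressive replacement of sourceText
  }
  return accumulated; // final text, left editable for the user afterwards
}
```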
### Task 4: Verify behavior and guardrails
**Files:**
- Modify: `src/App.tsx`
- Modify: `src/services/ai.ts`
- Modify: `package.json`
**Step 1: Run type check**
Run: `npm run lint`
Expected: PASS
**Step 2: Run build**
Run: `npm run build`
Expected: PASS
**Step 3: Manual review checklist**
- Upload accepts only `.docx`, `.txt`, `.pdf`, `.md`.
- Upload starts extraction immediately.
- Left source textarea updates in streaming fashion.
- Extraction uses the raw-output contract and does not intentionally rewrite content.
- Existing `立即转换成剧本` button still works after extraction.

package-lock.json generated

@ -18,8 +18,10 @@
"express": "^4.21.2",
"file-saver": "^2.0.5",
"lucide-react": "^0.546.0",
"mammoth": "^1.11.0",
"motion": "^12.23.24",
"openai": "^6.27.0",
"pdfjs-dist": "^5.5.207",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"react-markdown": "^10.1.0",
@ -801,6 +803,271 @@
"@jridgewell/sourcemap-codec": "^1.4.14"
}
},
"node_modules/@napi-rs/canvas": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas/-/canvas-0.1.96.tgz",
"integrity": "sha512-6NNmNxvoJKeucVjxaaRUt3La2i5jShgiAbaY3G/72s1Vp3U06XPrAIxkAjBxpDcamEn/t+WJ4OOlGmvILo4/Ew==",
"license": "MIT",
"optional": true,
"workspaces": [
"e2e/*"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
},
"optionalDependencies": {
"@napi-rs/canvas-android-arm64": "0.1.96",
"@napi-rs/canvas-darwin-arm64": "0.1.96",
"@napi-rs/canvas-darwin-x64": "0.1.96",
"@napi-rs/canvas-linux-arm-gnueabihf": "0.1.96",
"@napi-rs/canvas-linux-arm64-gnu": "0.1.96",
"@napi-rs/canvas-linux-arm64-musl": "0.1.96",
"@napi-rs/canvas-linux-riscv64-gnu": "0.1.96",
"@napi-rs/canvas-linux-x64-gnu": "0.1.96",
"@napi-rs/canvas-linux-x64-musl": "0.1.96",
"@napi-rs/canvas-win32-arm64-msvc": "0.1.96",
"@napi-rs/canvas-win32-x64-msvc": "0.1.96"
}
},
"node_modules/@napi-rs/canvas-android-arm64": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-android-arm64/-/canvas-android-arm64-0.1.96.tgz",
"integrity": "sha512-ew1sPrN3dGdZ3L4FoohPfnjq0f9/Jk7o+wP7HkQZokcXgIUD6FIyICEWGhMYzv53j63wUcPvZeAwgewX58/egg==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"android"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-darwin-arm64": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-darwin-arm64/-/canvas-darwin-arm64-0.1.96.tgz",
"integrity": "sha512-Q/wOXZ5PzTqpdmA5eUOcegCf4Go/zz3aZ5DlzSeDpOjFmfwMKh8EzLAoweQ+mJVagcHQyzoJhaTEnrO68TNyNg==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"darwin"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-darwin-x64": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-darwin-x64/-/canvas-darwin-x64-0.1.96.tgz",
"integrity": "sha512-UrXiQz28tQEvGM1qvyptewOAfmUrrd5+wvi6Rzjj2VprZI8iZ2KIvBD2lTTG1bVF95AbeDeG7PJA0D9sLKaOFA==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"darwin"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-arm-gnueabihf": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm-gnueabihf/-/canvas-linux-arm-gnueabihf-0.1.96.tgz",
"integrity": "sha512-I90ODxweD8aEP6XKU/NU+biso95MwCtQ2F46dUvhec1HesFi0tq/tAJkYic/1aBSiO/1kGKmSeD1B0duOHhEHQ==",
"cpu": [
"arm"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-arm64-gnu": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm64-gnu/-/canvas-linux-arm64-gnu-0.1.96.tgz",
"integrity": "sha512-Dx/0+RFV++w3PcRy+4xNXkghhXjA5d0Mw1bs95emn5Llinp1vihMaA6WJt3oYv2LAHc36+gnrhIBsPhUyI2SGw==",
"cpu": [
"arm64"
],
"libc": [
"glibc"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-arm64-musl": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-arm64-musl/-/canvas-linux-arm64-musl-0.1.96.tgz",
"integrity": "sha512-UvOi7fii3IE2KDfEfhh8m+LpzSRvhGK7o1eho99M2M0HTik11k3GX+2qgVx9EtujN3/bhFFS1kSO3+vPMaJ0Mg==",
"cpu": [
"arm64"
],
"libc": [
"musl"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-riscv64-gnu": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-riscv64-gnu/-/canvas-linux-riscv64-gnu-0.1.96.tgz",
"integrity": "sha512-MBSukhGCQ5nRtf9NbFYWOU080yqkZU1PbuH4o1ROvB4CbPl12fchDR35tU83Wz8gWIM9JTn99lBn9DenPIv7Ig==",
"cpu": [
"riscv64"
],
"libc": [
"glibc"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-x64-gnu": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-x64-gnu/-/canvas-linux-x64-gnu-0.1.96.tgz",
"integrity": "sha512-I/ccu2SstyKiV3HIeVzyBIWfrJo8cN7+MSQZPnabewWV6hfJ2nY7Df2WqOHmobBRUw84uGR6zfQHsUEio/m5Vg==",
"cpu": [
"x64"
],
"libc": [
"glibc"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-linux-x64-musl": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-linux-x64-musl/-/canvas-linux-x64-musl-0.1.96.tgz",
"integrity": "sha512-H3uov7qnTl73GDT4h52lAqpJPsl1tIUyNPWJyhQ6gHakohNqqRq3uf80+NEpzcytKGEOENP1wX3yGwZxhjiWEQ==",
"cpu": [
"x64"
],
"libc": [
"musl"
],
"license": "MIT",
"optional": true,
"os": [
"linux"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-win32-arm64-msvc": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-win32-arm64-msvc/-/canvas-win32-arm64-msvc-0.1.96.tgz",
"integrity": "sha512-ATp6Y+djOjYtkfV/VRH7CZ8I1MEtkUQBmKUbuWw5zWEHHqfL0cEcInE4Cxgx7zkNAhEdBbnH8HMVrqNp+/gwxA==",
"cpu": [
"arm64"
],
"license": "MIT",
"optional": true,
"os": [
"win32"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@napi-rs/canvas-win32-x64-msvc": {
"version": "0.1.96",
"resolved": "https://registry.npmjs.org/@napi-rs/canvas-win32-x64-msvc/-/canvas-win32-x64-msvc-0.1.96.tgz",
"integrity": "sha512-UYGdTltVd+Z8mcIuoqGmAXXUvwH5CLf2M6mIB5B0/JmX5J041jETjqtSYl7gN+aj3k1by/SG6sS0hAwCqyK7zw==",
"cpu": [
"x64"
],
"license": "MIT",
"optional": true,
"os": [
"win32"
],
"engines": {
"node": ">= 10"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Brooooooklyn"
}
},
"node_modules/@pkgjs/parseargs": {
"version": "0.11.0",
"resolved": "https://registry.npmjs.org/@pkgjs/parseargs/-/parseargs-0.11.0.tgz",
@ -1724,6 +1991,15 @@
"vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0"
}
},
"node_modules/@xmldom/xmldom": {
"version": "0.8.11",
"resolved": "https://registry.npmjs.org/@xmldom/xmldom/-/xmldom-0.8.11.tgz",
"integrity": "sha512-cQzWCtO6C8TQiYl1ruKNn2U6Ao4o4WBBcbL61yJl84x+j5sOWWFU9X7DpND8XZG3daDppSsigMdfAIl2upQBRw==",
"license": "MIT",
"engines": {
"node": ">=10.0.0"
}
},
"node_modules/accepts": {
"version": "1.3.8",
"resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.8.tgz",
@ -1770,6 +2046,15 @@
"url": "https://github.com/chalk/ansi-styles?sponsor=1"
}
},
"node_modules/argparse": {
"version": "1.0.10",
"resolved": "https://registry.npmjs.org/argparse/-/argparse-1.0.10.tgz",
"integrity": "sha512-o5Roy6tNG4SL/FOkCAN6RzjiakZS25RLYFrcMttJqbdd8BWrnA+fGz57iN5Pb06pvBGvl5gQ0B48dJlslXvoTg==",
"license": "MIT",
"dependencies": {
"sprintf-js": "~1.0.2"
}
},
"node_modules/array-flatten": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
@ -1904,6 +2189,12 @@
"readable-stream": "^3.4.0"
}
},
"node_modules/bluebird": {
"version": "3.4.7",
"resolved": "https://registry.npmjs.org/bluebird/-/bluebird-3.4.7.tgz",
"integrity": "sha512-iD3898SR7sWVRHbiQv+sHUtHnMvC1o3nW5rAcqnq3uOn07DSAppZYUkIGslDz6gXC7HfunPe7YVBgoEJASPcHA==",
"license": "MIT"
},
"node_modules/body-parser": {
"version": "1.20.4",
"resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.20.4.tgz",
@ -2348,6 +2639,12 @@
"url": "https://github.com/sponsors/wooorm"
}
},
"node_modules/dingbat-to-unicode": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/dingbat-to-unicode/-/dingbat-to-unicode-1.0.1.tgz",
"integrity": "sha512-98l0sW87ZT58pU4i61wa2OHwxbiYSbuxsCBozaVnYX2iCnr3bLM3fIes1/ej7h1YdOKuKt/MLs706TVnALA65w==",
"license": "BSD-2-Clause"
},
"node_modules/docx": {
"version": "9.6.0",
"resolved": "https://registry.npmjs.org/docx/-/docx-9.6.0.tgz",
@ -2410,6 +2707,15 @@
"url": "https://dotenvx.com"
}
},
"node_modules/duck": {
"version": "0.1.12",
"resolved": "https://registry.npmjs.org/duck/-/duck-0.1.12.tgz",
"integrity": "sha512-wkctla1O6VfP89gQ+J/yDesM0S7B7XLXjKGzXxMDVFg7uEn706niAtyYovKbyq1oT9YwDcly721/iUWoc8MVRg==",
"license": "BSD",
"dependencies": {
"underscore": "^1.13.1"
}
},
"node_modules/dunder-proto": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
@ -3689,6 +3995,17 @@
"url": "https://github.com/sponsors/wooorm"
}
},
"node_modules/lop": {
"version": "0.4.2",
"resolved": "https://registry.npmjs.org/lop/-/lop-0.4.2.tgz",
"integrity": "sha512-RefILVDQ4DKoRZsJ4Pj22TxE3omDO47yFpkIBoDKzkqPRISs5U1cnAdg/5583YPkWPaLIYHOKRMQSvjFsO26cw==",
"license": "BSD-2-Clause",
"dependencies": {
"duck": "^0.1.12",
"option": "~0.2.1",
"underscore": "^1.13.1"
}
},
"node_modules/lru-cache": {
"version": "5.1.1",
"resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz",
@ -3716,6 +4033,30 @@
"@jridgewell/sourcemap-codec": "^1.5.5"
}
},
"node_modules/mammoth": {
"version": "1.11.0",
"resolved": "https://registry.npmjs.org/mammoth/-/mammoth-1.11.0.tgz",
"integrity": "sha512-BcEqqY/BOwIcI1iR5tqyVlqc3KIaMRa4egSoK83YAVrBf6+yqdAAbtUcFDCWX8Zef8/fgNZ6rl4VUv+vVX8ddQ==",
"license": "BSD-2-Clause",
"dependencies": {
"@xmldom/xmldom": "^0.8.6",
"argparse": "~1.0.3",
"base64-js": "^1.5.1",
"bluebird": "~3.4.0",
"dingbat-to-unicode": "^1.0.1",
"jszip": "^3.7.1",
"lop": "^0.4.2",
"path-is-absolute": "^1.0.0",
"underscore": "^1.13.1",
"xmlbuilder": "^10.0.0"
},
"bin": {
"mammoth": "bin/mammoth"
},
"engines": {
"node": ">=12.0.0"
}
},
"node_modules/math-intrinsics": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
@ -4579,6 +4920,13 @@
"url": "https://opencollective.com/node-fetch"
}
},
"node_modules/node-readable-to-web-readable-stream": {
"version": "0.4.2",
"resolved": "https://registry.npmjs.org/node-readable-to-web-readable-stream/-/node-readable-to-web-readable-stream-0.4.2.tgz",
"integrity": "sha512-/cMZNI34v//jUTrI+UIo4ieHAB5EZRY/+7OmXZgBxaWBMcW2tGdceIw06RFxWxrKZ5Jp3sI2i5TsRo+CBhtVLQ==",
"license": "MIT",
"optional": true
},
"node_modules/node-releases": {
"version": "2.0.27",
"resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.27.tgz",
@ -4639,6 +4987,12 @@
}
}
},
"node_modules/option": {
"version": "0.2.4",
"resolved": "https://registry.npmjs.org/option/-/option-0.2.4.tgz",
"integrity": "sha512-pkEqbDyl8ou5cpq+VsnQbe/WlEy5qS7xPzMS1U55OCG9KPvwFD46zDbxQIj3egJSFc3D+XhYOPUzz49zQAVy7A==",
"license": "BSD-2-Clause"
},
"node_modules/p-retry": {
"version": "4.6.2",
"resolved": "https://registry.npmjs.org/p-retry/-/p-retry-4.6.2.tgz",
@ -4698,6 +5052,15 @@
"node": ">= 0.8"
}
},
"node_modules/path-is-absolute": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/path-is-absolute/-/path-is-absolute-1.0.1.tgz",
"integrity": "sha512-AVbw3UJ2e9bq64vSaS9Am0fje1Pa8pbGqTTsmXfaIiMpnr5DlDhfJOuLj9Sf95ZPVDAUerDfEk88MPmPe7UCQg==",
"license": "MIT",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/path-key": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz",
@ -4735,6 +5098,19 @@
"integrity": "sha512-RA1GjUVMnvYFxuqovrEqZoxxW5NUZqbwKtYz/Tt7nXerk0LbLblQmrsgdeOxV5SFHf0UDggjS/bSeOZwt1pmEQ==",
"license": "MIT"
},
"node_modules/pdfjs-dist": {
"version": "5.5.207",
"resolved": "https://registry.npmjs.org/pdfjs-dist/-/pdfjs-dist-5.5.207.tgz",
"integrity": "sha512-WMqqw06w1vUt9ZfT0gOFhMf3wHsWhaCrxGrckGs5Cci6ybDW87IvPaOd2pnBwT6BJuP/CzXDZxjFgmSULLdsdw==",
"license": "Apache-2.0",
"engines": {
"node": ">=20.19.0 || >=22.13.0 || >=24"
},
"optionalDependencies": {
"@napi-rs/canvas": "^0.1.95",
"node-readable-to-web-readable-stream": "^0.4.2"
}
},
"node_modules/picocolors": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz",
@ -5399,6 +5775,12 @@
"url": "https://github.com/sponsors/wooorm"
}
},
"node_modules/sprintf-js": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/sprintf-js/-/sprintf-js-1.0.3.tgz",
"integrity": "sha512-D9cPgkvLlV3t3IzL0D0YLvGA9Ahk4PcvVwUbN0dSGr1aP0Nrt4AEnTUbuGvquEC0mA64Gqt1fzirlRs5ibXx8g==",
"license": "BSD-3-Clause"
},
"node_modules/statuses": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/statuses/-/statuses-2.0.2.tgz",
@ -5721,6 +6103,12 @@
"node": ">=14.17"
}
},
"node_modules/underscore": {
"version": "1.13.8",
"resolved": "https://registry.npmjs.org/underscore/-/underscore-1.13.8.tgz",
"integrity": "sha512-DXtD3ZtEQzc7M8m4cXotyHR+FAS18C64asBYY5vqZexfYryNNnDc02W4hKg3rdQuqOYas1jkseX0+nZXjTXnvQ==",
"license": "MIT"
},
"node_modules/undici-types": {
"version": "6.21.0",
"resolved": "https://registry.npmjs.org/undici-types/-/undici-types-6.21.0.tgz",
@ -6596,6 +6984,15 @@
"xml-js": "bin/cli.js"
}
},
"node_modules/xmlbuilder": {
"version": "10.1.1",
"resolved": "https://registry.npmjs.org/xmlbuilder/-/xmlbuilder-10.1.1.tgz",
"integrity": "sha512-OyzrcFLL/nb6fMGHbiRDuPup9ljBycsdCypwuyg5AAHvyWzGfChJpCXMG88AGTIMFhGZ9RccFN1e6lhg3hkwKg==",
"license": "MIT",
"engines": {
"node": ">=4.0"
}
},
"node_modules/yallist": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/yallist/-/yallist-3.1.1.tgz",


@ -8,7 +8,8 @@
"build": "vite build",
"preview": "vite preview",
"clean": "rm -rf dist",
"lint": "tsc --noEmit"
"lint": "tsc --noEmit",
"dev:server": "node --experimental-strip-types server/index.ts"
},
"dependencies": {
"@google/genai": "^1.29.0",
@ -21,8 +22,10 @@
"express": "^4.21.2",
"file-saver": "^2.0.5",
"lucide-react": "^0.546.0",
"mammoth": "^1.11.0",
"motion": "^12.23.24",
"openai": "^6.27.0",
"pdfjs-dist": "^5.5.207",
"react": "^19.0.0",
"react-dom": "^19.0.0",
"react-markdown": "^10.1.0",

server/auth.ts Normal file

@ -0,0 +1,85 @@
import crypto from 'node:crypto';
import type { NextFunction, Request, Response } from 'express';
import { createSession, createUser, deleteSession, findUserBySessionToken, findUserByUsername } from './db.ts';
import type { AuthenticatedUser } from './types.ts';
declare global {
namespace Express {
interface Request {
authUser?: AuthenticatedUser;
authToken?: string;
}
}
}
export function hashPassword(password: string, salt = crypto.randomBytes(16).toString('hex')) {
const hash = crypto.scryptSync(password, salt, 64).toString('hex');
return { hash, salt };
}
export function verifyPassword(password: string, passwordHash: string, passwordSalt: string) {
const { hash } = hashPassword(password, passwordSalt);
return crypto.timingSafeEqual(Buffer.from(hash, 'hex'), Buffer.from(passwordHash, 'hex'));
}
export function issueSession(userId: number) {
const token = crypto.randomBytes(32).toString('hex');
createSession(token, userId);
return token;
}
export function registerUser(username: string, password: string) {
if (findUserByUsername(username)) {
throw new Error('账号已存在');
}
const { hash, salt } = hashPassword(password);
const user = createUser(username, hash, salt);
if (!user) {
throw new Error('账号创建失败');
}
const token = issueSession(user.id);
return {
token,
user: { id: user.id, username: user.username },
};
}
export function loginUser(username: string, password: string) {
const user = findUserByUsername(username);
if (!user || !verifyPassword(password, user.password_hash, user.password_salt)) {
throw new Error('账号或密码错误');
}
const token = issueSession(user.id);
return {
token,
user: { id: user.id, username: user.username },
};
}
export function requireAuth(req: Request, res: Response, next: NextFunction) {
const header = req.headers.authorization || '';
const token = header.startsWith('Bearer ') ? header.slice(7) : '';
if (!token) {
res.status(401).json({ error: '未登录' });
return;
}
const user = findUserBySessionToken(token);
if (!user) {
res.status(401).json({ error: '登录已失效' });
return;
}
req.authToken = token;
req.authUser = { id: user.id, username: user.username };
next();
}
export function resolveSession(token: string) {
const user = findUserBySessionToken(token);
return user ? { id: user.id, username: user.username } : null;
}
export function logoutSession(token: string) {
deleteSession(token);
}
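For a quick sanity check, the scrypt helpers above can be exercised in isolation. This is a standalone sketch that re-declares the two functions instead of importing from `server/auth.ts`:

```typescript
import crypto from "node:crypto";

// Same scheme as server/auth.ts: scrypt with a random 16-byte hex salt,
// compared via timingSafeEqual so mismatches take constant time.
function hashPassword(password: string, salt = crypto.randomBytes(16).toString("hex")) {
  const hash = crypto.scryptSync(password, salt, 64).toString("hex");
  return { hash, salt };
}

function verifyPassword(password: string, passwordHash: string, passwordSalt: string) {
  const { hash } = hashPassword(password, passwordSalt);
  return crypto.timingSafeEqual(Buffer.from(hash, "hex"), Buffer.from(passwordHash, "hex"));
}

const { hash, salt } = hashPassword("hunter2-secret");
console.log(verifyPassword("hunter2-secret", hash, salt)); // true
console.log(verifyPassword("wrong-password", hash, salt)); // false
```

Because the salt is stored next to the hash, the same derivation can be replayed at login; `timingSafeEqual` only throws on unequal buffer lengths, which cannot happen here since the scrypt output length is fixed at 64 bytes.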

server/db.ts Normal file

@ -0,0 +1,102 @@
import Database from 'better-sqlite3';
import fs from 'node:fs';
import path from 'node:path';
const dataDir = path.resolve(process.cwd(), '.data');
fs.mkdirSync(dataDir, { recursive: true });
const dbPath = path.join(dataDir, 'scriptflow.db');
export const db = new Database(dbPath);
db.pragma('journal_mode = WAL');
db.exec(`
CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY AUTOINCREMENT,
username TEXT NOT NULL UNIQUE,
password_hash TEXT NOT NULL,
password_salt TEXT NOT NULL,
created_at TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE TABLE IF NOT EXISTS sessions (
token TEXT PRIMARY KEY,
user_id INTEGER NOT NULL,
created_at TEXT NOT NULL DEFAULT (datetime('now')),
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
CREATE TABLE IF NOT EXISTS drafts (
user_id INTEGER PRIMARY KEY,
payload TEXT NOT NULL,
updated_at TEXT NOT NULL,
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
);
`);
export type UserRow = {
id: number;
username: string;
password_hash: string;
password_salt: string;
};
export function findUserByUsername(username: string) {
return db.prepare('SELECT * FROM users WHERE username = ?').get(username) as UserRow | undefined;
}
export function findUserById(id: number) {
return db.prepare('SELECT * FROM users WHERE id = ?').get(id) as UserRow | undefined;
}
export function createUser(username: string, passwordHash: string, passwordSalt: string) {
const result = db
.prepare('INSERT INTO users (username, password_hash, password_salt) VALUES (?, ?, ?)')
.run(username, passwordHash, passwordSalt);
return findUserById(Number(result.lastInsertRowid));
}
export function createSession(token: string, userId: number) {
db.prepare('INSERT INTO sessions (token, user_id) VALUES (?, ?)').run(token, userId);
}
export function findUserBySessionToken(token: string) {
return db
.prepare(`
SELECT users.*
FROM sessions
INNER JOIN users ON users.id = sessions.user_id
WHERE sessions.token = ?
`)
.get(token) as UserRow | undefined;
}
export function deleteSession(token: string) {
db.prepare('DELETE FROM sessions WHERE token = ?').run(token);
}
export function getDraftForUser(userId: number) {
const row = db.prepare('SELECT payload, updated_at FROM drafts WHERE user_id = ?').get(userId) as { payload: string; updated_at: string } | undefined;
if (!row) return null;
return {
payload: JSON.parse(row.payload),
updatedAt: row.updated_at,
};
}
export function saveDraftForUser(userId: number, payload: unknown) {
const serialized = JSON.stringify(payload);
const updatedAt = new Date().toISOString();
db
.prepare(`
INSERT INTO drafts (user_id, payload, updated_at)
VALUES (?, ?, ?)
ON CONFLICT(user_id) DO UPDATE SET payload = excluded.payload, updated_at = excluded.updated_at
`)
.run(userId, serialized, updatedAt);
return {
payload,
updatedAt,
};
}

server/index.ts Normal file

@ -0,0 +1,91 @@
import express from 'express';
import { getDraftForUser, saveDraftForUser } from './db.ts';
import { loginUser, logoutSession, registerUser, requireAuth, resolveSession } from './auth.ts';
import type { DraftPayload } from './types.ts';
const app = express();
const port = Number(process.env.PORT || 8787);
app.use(express.json({ limit: '10mb' }));
app.get('/api/health', (_req, res) => {
res.json({ ok: true });
});
app.post('/api/auth/register', (req, res) => {
const username = String(req.body?.username || '').trim();
const password = String(req.body?.password || '');
if (username.length < 3 || password.length < 6) {
    res.status(400).json({ error: 'Username must be at least 3 characters and password at least 6' });
return;
}
try {
const result = registerUser(username, password);
res.json(result);
} catch (error) {
    res.status(400).json({ error: error instanceof Error ? error.message : 'Registration failed' });
}
});
app.post('/api/auth/login', (req, res) => {
const username = String(req.body?.username || '').trim();
const password = String(req.body?.password || '');
if (!username || !password) {
    res.status(400).json({ error: 'Please enter a username and password' });
return;
}
try {
const result = loginUser(username, password);
res.json(result);
} catch (error) {
    res.status(401).json({ error: error instanceof Error ? error.message : 'Login failed' });
}
});
app.get('/api/auth/session', (req, res) => {
const header = req.headers.authorization || '';
const token = header.startsWith('Bearer ') ? header.slice(7) : '';
if (!token) {
    res.status(401).json({ error: 'Not logged in' });
return;
}
const user = resolveSession(token);
if (!user) {
    res.status(401).json({ error: 'Session expired' });
return;
}
res.json({ user });
});
app.post('/api/auth/logout', requireAuth, (req, res) => {
if (req.authToken) {
logoutSession(req.authToken);
}
res.json({ ok: true });
});
app.get('/api/draft/current', requireAuth, (req, res) => {
const draft = getDraftForUser(req.authUser!.id);
res.json({ draft });
});
app.post('/api/draft/current', requireAuth, (req, res) => {
const payload = req.body?.draft as DraftPayload | undefined;
if (!payload || typeof payload !== 'object') {
    res.status(400).json({ error: 'Invalid draft content' });
return;
}
const draft = saveDraftForUser(req.authUser!.id, payload);
res.json({ draft });
});
app.listen(port, '127.0.0.1', () => {
console.log(`ScriptFlow server listening on http://127.0.0.1:${port}`);
});

server/types.ts Normal file

@ -0,0 +1,10 @@
export interface DraftPayload {
version: number;
savedAt: string;
data: Record<string, unknown>;
}
export interface AuthenticatedUser {
id: number;
username: string;
}

File diff suppressed because it is too large


@ -0,0 +1,42 @@
export type ConversionDraftStatus = 'pending' | 'generating' | 'completed' | 'failed' | 'paused';
export type ConversionDraftVersion = {
id: '0' | '1';
label: 'A' | 'B';
content: string;
status: ConversionDraftStatus;
};
export type ConversionDraftEpisode = {
id: string;
title: string;
sourceContent: string;
status: ConversionDraftStatus;
activeVersionId: '0' | '1';
selectedFinalVersionId: '0' | '1' | null;
versions: ConversionDraftVersion[];
error?: string;
};
export function sanitizeSavedConversionEpisodeResults<T extends ConversionDraftEpisode>(results: T[]): T[] {
return results.map((episode) => ({
...episode,
status: episode.status === 'generating' ? 'paused' : episode.status,
versions: episode.versions.map((version) => ({
...version,
status: version.status === 'generating' ? 'paused' : version.status,
})),
}));
}
export function parseSavedConversionEpisodeResults(raw: string | null): ConversionDraftEpisode[] {
if (!raw) return [];
try {
const parsed = JSON.parse(raw);
if (!Array.isArray(parsed)) return [];
return sanitizeSavedConversionEpisodeResults(parsed as ConversionDraftEpisode[]);
} catch {
return [];
}
}
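The save-time rule above — anything still `generating` is persisted as `paused` — can be sketched and exercised on its own (hypothetical fixture data):

```typescript
type Status = 'pending' | 'generating' | 'completed' | 'failed' | 'paused';

// Mirrors sanitizeSavedConversionEpisodeResults: a draft saved mid-stream
// must not claim a generation is still live after restore.
function sanitize<T extends { status: Status; versions: { status: Status }[] }>(results: T[]): T[] {
  return results.map((episode) => ({
    ...episode,
    status: episode.status === 'generating' ? 'paused' : episode.status,
    versions: episode.versions.map((version) => ({
      ...version,
      status: version.status === 'generating' ? 'paused' : version.status,
    })),
  }));
}

const restored = sanitize([
  { status: 'generating' as Status, versions: [{ status: 'completed' as Status }, { status: 'generating' as Status }] },
]);
console.log(restored[0].status, restored[0].versions[1].status); // paused paused
```

Completed, failed, and pending statuses pass through unchanged; only live `generating` states are demoted, at both the episode and version level.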


@ -0,0 +1,43 @@
export type ConversionGenerationStatus = 'pending' | 'generating' | 'completed' | 'failed' | 'paused';
export type ConversionVersionLike = {
id: '0' | '1';
status: ConversionGenerationStatus;
content: string;
};
export type ConversionEpisodeLike = {
status: ConversionGenerationStatus;
versions: ConversionVersionLike[];
};
export class ConversionAbortedError extends Error {
constructor(message = 'Conversion aborted') {
super(message);
this.name = 'AbortError';
}
}
export function isConversionAbortedError(error: unknown): boolean {
if (!error) return false;
if (error instanceof ConversionAbortedError) return true;
if (error instanceof DOMException && error.name === 'AbortError') return true;
return typeof error === 'object' && 'name' in error && (error as { name?: string }).name === 'AbortError';
}
export function throwIfAborted(signal?: AbortSignal | null) {
if (signal?.aborted) {
throw new ConversionAbortedError();
}
}
export function getResumeStartIndex<T extends { status: ConversionGenerationStatus }>(episodes: T[]): number {
return episodes.findIndex((episode) => episode.status !== 'completed');
}
export function getVersionsToGenerate<T extends ConversionEpisodeLike>(episode: T): number[] {
return episode.versions
.filter((version) => version.status !== 'completed')
.map((version) => Number(version.id));
}
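Taken together, the two helpers above drive resume: the episode index to restart from, and which versions inside that episode still need work. A small sketch with hypothetical statuses:

```typescript
type Status = 'pending' | 'generating' | 'completed' | 'failed' | 'paused';

// First episode that is not fully done — conversion resumes here.
function getResumeStartIndex<T extends { status: Status }>(episodes: T[]): number {
  return episodes.findIndex((episode) => episode.status !== 'completed');
}

// Within an episode, only regenerate the versions that never finished.
function getVersionsToGenerate(episode: { versions: { id: '0' | '1'; status: Status }[] }): number[] {
  return episode.versions.filter((v) => v.status !== 'completed').map((v) => Number(v.id));
}

console.log(getResumeStartIndex([
  { status: 'completed' as Status },
  { status: 'paused' as Status },
  { status: 'pending' as Status },
])); // 1

console.log(getVersionsToGenerate({
  versions: [
    { id: '0', status: 'completed' as Status },
    { id: '1', status: 'paused' as Status },
  ],
})); // [ 1 ]
```

Note that `findIndex` returns -1 when every episode is completed, so callers must treat -1 as "nothing to resume" rather than an array index.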

src/lib/scriptPreview.ts Normal file

@ -0,0 +1,17 @@
export type ScriptPreviewBlock =
| { type: 'scene'; content: string }
| { type: 'text'; content: string };
const SCENE_LINE_PATTERN = /^\s*(?:#{0,2}\s*)?(?:\u573A\u666F(?:\s*\d+)?|SCENE\b)/i;
export function splitScriptPreviewBlocks(content: string): ScriptPreviewBlock[] {
return content
.replace(/\r\n/g, '\n')
.split('\n')
.map((line) => line.trim())
.filter(Boolean)
.map((line) => ({
type: SCENE_LINE_PATTERN.test(line) ? 'scene' : 'text',
content: line,
}));
}
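The splitter above tags each non-empty line as either a scene heading or plain text. A standalone sketch of the same pattern (re-declared here rather than imported):

```typescript
// Matches lines like "SCENE 1", "场景 2", or "## 场景 3" (optional ##, case-insensitive).
const SCENE_LINE_PATTERN = /^\s*(?:#{0,2}\s*)?(?:\u573A\u666F(?:\s*\d+)?|SCENE\b)/i;

function splitScriptPreviewBlocks(content: string) {
  return content
    .replace(/\r\n/g, '\n')
    .split('\n')
    .map((line) => line.trim())
    .filter(Boolean)
    .map((line) => ({
      type: SCENE_LINE_PATTERN.test(line) ? ('scene' as const) : ('text' as const),
      content: line,
    }));
}

const blocks = splitScriptPreviewBlocks('SCENE 1\r\nINT. OFFICE\n\n## \u573A\u666F 2');
console.log(blocks.map((b) => b.type)); // [ 'scene', 'text', 'scene' ]
```

Blank lines are dropped by the `filter(Boolean)` step, so the preview renders as a compact list of tagged lines.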


@ -1,5 +1,6 @@
import { GoogleGenAI, Type, ThinkingLevel } from "@google/genai";
import OpenAI from "openai";
import { isConversionAbortedError, throwIfAborted } from "../lib/conversionWorkflow";
export interface AIConfig {
model: 'gemini' | 'doubao';
@ -27,14 +28,206 @@ export interface ScriptOption {
content: string;
}
export interface PlotBackgroundFields {
worldview: string;
outline: string;
characters: string;
}
const PLOT_BACKGROUND_MARKERS = [
{ key: 'worldview', tag: '[[WORLDVIEW]]' },
{ key: 'outline', tag: '[[OUTLINE]]' },
{ key: 'characters', tag: '[[CHARACTERS]]' }
] as const;
const CHARACTER_FIELD_LABELS = [
'\u8eab\u4efd',
'\u6027\u683c',
'\u52a8\u673a',
'\u5173\u7cfb',
'\u5916\u8c8c',
'\u80cc\u666f',
'\u76ee\u6807',
'\u7279\u70b9',
'\u7ecf\u5386',
'\u59d3\u540d',
'\u4ecb\u7ecd'
];
const CHARACTER_FIELD_LABEL_PATTERN = CHARACTER_FIELD_LABELS.join('|');
function formatCharacterProfiles(value: string) {
let formatted = value.replace(/\r\n/g, '\n').trim();
if (!formatted) return '';
formatted = formatted.replace(/\[\[(?:WORLDVIEW|OUTLINE|CHARACTERS)\]\]/g, '').trim();
formatted = formatted.replace(
new RegExp(`([\u3002\uff01\uff1f\uff1b]|\n)\\s*(?!${CHARACTER_FIELD_LABEL_PATTERN}[\uff1a:])([^\n\uff1a:]{1,12}[\uff1a:])`, 'g'),
(_, separator, title) => `${separator}\n\n${title}`
);
formatted = formatted.replace(
new RegExp(`\\s*(${CHARACTER_FIELD_LABEL_PATTERN})[\uff1a:]`, 'g'),
'\n$1\uff1a'
);
formatted = formatted.replace(/(^|\n)([^\n\uff1a:]{1,12})[\uff1a:]\s*/g, (_match, prefix, name) => {
if (CHARACTER_FIELD_LABELS.includes(name)) {
return `${prefix}${name}\uff1a`;
}
return `${prefix}${name}\uff1a\n`;
});
formatted = formatted.replace(/\n{3,}/g, '\n\n').trim();
return formatted;
}
function normalizePlotBackgroundField(value: string) {
return value.replace(/^\s+/, '').trimEnd();
}
function parsePlotBackgroundFields(streamText: string): PlotBackgroundFields {
const result: PlotBackgroundFields = { worldview: '', outline: '', characters: '' };
PLOT_BACKGROUND_MARKERS.forEach((marker, index) => {
const start = streamText.indexOf(marker.tag);
if (start === -1) return;
const contentStart = start + marker.tag.length;
const nextMarkerIndex = PLOT_BACKGROUND_MARKERS
.slice(index + 1)
.map((nextMarker) => streamText.indexOf(nextMarker.tag, contentStart))
.find((position) => position !== -1);
const rawValue = nextMarkerIndex === undefined
? streamText.slice(contentStart)
: streamText.slice(contentStart, nextMarkerIndex);
const normalizedValue = normalizePlotBackgroundField(rawValue);
result[marker.key] = marker.key === 'characters'
? formatCharacterProfiles(normalizedValue)
: normalizedValue;
});
return result;
}
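The marker walk above is order-dependent: each field runs from its tag to the next tag that actually appears, which makes it safe to call repeatedly on a partially streamed response. A simplified sketch (it skips the `formatCharacterProfiles` post-processing):

```typescript
const MARKERS = [
  { key: 'worldview', tag: '[[WORLDVIEW]]' },
  { key: 'outline', tag: '[[OUTLINE]]' },
  { key: 'characters', tag: '[[CHARACTERS]]' },
] as const;

// Slice each section out of the stream text; missing tags simply stay empty.
function parseFields(text: string): Record<'worldview' | 'outline' | 'characters', string> {
  const result = { worldview: '', outline: '', characters: '' };
  MARKERS.forEach((marker, index) => {
    const start = text.indexOf(marker.tag);
    if (start === -1) return;
    const contentStart = start + marker.tag.length;
    const next = MARKERS.slice(index + 1)
      .map((m) => text.indexOf(m.tag, contentStart))
      .find((position) => position !== -1);
    result[marker.key] = (next === undefined ? text.slice(contentStart) : text.slice(contentStart, next)).trim();
  });
  return result;
}

console.log(parseFields('[[WORLDVIEW]] modern metropolis [[OUTLINE]] revenge arc'));
// { worldview: 'modern metropolis', outline: 'revenge arc', characters: '' }
```

Mid-stream, only the tags seen so far produce content, so the UI can fill the three fields progressively as chunks arrive.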
export async function extractEpisodesFromSource(
sourceText: string,
onUpdate: (content: string) => void,
config: AIConfig
): Promise<void> {
try {
const client = getDoubaoClient(config.apiKey);
const systemInstruction = `You are a screenplay document extraction assistant. Your only task is to identify episode boundaries and return the original script text for each episode without rewriting.`;
const prompt = `Extract each episode and its corresponding original script content from the source text.
Rules:
1. Do not rewrite, polish, summarize, expand, shorten, correct, translate, or reorder the original text.
2. The script body must be returned 1:1 in the same wording and order as the source.
3. You may only add episode boundary headings such as \u7b2c1\u96c6 and \u7b2c2\u96c6 when needed to separate episodes.
4. If the source already contains episode titles, preserve them as-is.
5. If episode boundaries are unclear, split only where the source provides the clearest signals; if no reliable split exists, return the source as a single block.
6. Output only the organized script content. Do not include explanations, notes, JSON, or markdown code fences.
7. Use Chinese in the output.
Source text:
${sourceText}`;
const stream: any = await client.chat.completions.create({
model: 'doubao-seed-1-6-flash-250828',
messages: [
{ role: 'system', content: systemInstruction },
{ role: 'user', content: prompt }
],
stream: true,
thinking: { type: 'enabled' }
} as any);
let fullText = '';
for await (const chunk of stream) {
const content = chunk.choices[0]?.delta?.content || '';
if (content) {
fullText += content;
onUpdate(fullText);
}
}
} catch (error) {
console.error('Error extracting episodes from source:', error);
throw error;
}
}
export async function extractPlotBackgroundFromSource(
sourceText: string,
onUpdate: (fields: PlotBackgroundFields) => void,
config: AIConfig
): Promise<void> {
try {
const client = getDoubaoClient(config.apiKey);
const systemInstruction = `You are a senior story analyst. Extract structured plot background settings from the source text accurately. Do not invent critical facts that are not supported by the source.`;
const prompt = `Read the source text and output exactly three tagged sections in this exact order:
[[WORLDVIEW]]
Summarize the story world, including era, location, social environment, special rules, and world setting.
[[OUTLINE]]
Summarize the core story outline, including setup, development, conflicts, climax, and ending direction.
[[CHARACTERS]]
Summarize the core character profiles, including name, identity, personality, appearance, and key relationships. Format this section as readable character blocks in Chinese. The appearance field must describe visible traits in detail, especially facial features, hairstyle, clothing, body shape, height, build, and overall look. Do not put goals, motives, or plot actions into the appearance field. Use this format:
\u89d2\u8272\u540d\uff1a\n\u8eab\u4efd\uff1a...\n\u6027\u683c\uff1a...\n\u5916\u8c8c\uff1a\u957f\u76f8...\u53d1\u578b...\u670d\u88c5...\u9ad8\u77ee\u80d6\u7626...\n\u5173\u7cfb\uff1a...\nLeave one blank line between characters.
Rules:
1. Output only these three tags and their content.
2. Do not output markdown code fences, explanations, or any extra headings.
3. Use Chinese for the content.
4. You may condense the source, but do not fabricate important settings or characters.
5. If appearance details are missing in the source, infer only cautious visual descriptions supported by role/context, and keep them modest.
6. Never use the label \u52a8\u673a in this section; use \u5916\u8c8c instead.
Source text:
${sourceText}`;
const stream: any = await client.chat.completions.create({
model: 'doubao-seed-1-8-251228',
messages: [
{ role: 'system', content: systemInstruction },
{ role: 'user', content: prompt }
],
stream: true,
extra_body: {
reasoning_effort: 'medium'
}
} as any);
let fullText = '';
for await (const chunk of stream) {
const content = chunk.choices[0]?.delta?.content || '';
if (content) {
fullText += content;
onUpdate(parsePlotBackgroundFields(fullText));
}
}
} catch (error) {
console.error('Error extracting plot background from source:', error);
throw error;
}
}
export async function convertTextToScript(
sourceText: string,
scriptType: string,
scriptFormat: string,
sourceText: string,
scriptType: string,
scriptFormat: string,
combinedStyle: string,
onUpdate: (versionIndex: number, content: string) => void,
config: AIConfig,
globalSettings?: { worldview: string; outline: string; characters: string }
globalSettings?: { worldview: string; outline: string; characters: string; audiencePreference: string; wordRange: string },
options?: {
signal?: AbortSignal;
targetVersionIds?: number[];
seedContents?: Partial<Record<number, string>>;
previousEpisodeContent?: string;
episodeTitle?: string;
episodeOutline?: string;
}
): Promise<void> {
try {
    const baseSystemInstruction = `You are a professional film and television screenwriter and script adaptation expert.
@ -376,59 +569,109 @@ export async function convertTextToScript(
{
id: 0,
instruction: `
#
# A??????
??????????
??????????????????????????????????????????
#
?????
1. ??????????????????????????????????
2. ???????????????????????????
3. ????????????????????????????????????????
4. ???????????????????????
5. ?????????????????????
Do not use Markdown code fences, do not output JSON, and do not include the [[VERSION]] marker.
???
?????????????????????????????
# ????
???????????????????????????????????????? Markdown ??????? JSON ??????? [[VERSION]] ????????????????????????????????????????????????????????
????????`
},
{
id: 1,
instruction: `
#
# B??????
??????????
??????????????????????????????????????????????????????????
#
?????
1. ??????????????????????????????????
2. ?????????????????????????????????????
3. ?????????????????????????????????
4. ???????????????????????????
5. ?????????????????????????
Do not use Markdown code fences, do not output JSON, and do not include the [[VERSION]] marker.
???
??????????????????????????????????????????
# ????
???????????????????????????????????????? Markdown ??????? JSON ??????? [[VERSION]] ????????????????????????????????????????????????????????
????????`
}
];
const generateVersion = async (v: { id: number, instruction: string }) => {
try {
        const prompt = `Create based on the following information:
const versionIds = options?.targetVersionIds ?? versionPrompts.map((version) => version.id);
const generateVersion = async (v: { id: number, instruction: string }) => {
const seedContent = options?.seedContents?.[v.id] || "";
const continuationPrompt = seedContent.trim()
? `
Partially generated script content:
${seedContent}
Continue writing from the end of the content above; keep the style and format consistent and do not repeat content that has already been generated.`
: "";
try {
throwIfAborted(options?.signal);
      const prompt = `Create based on the following information:
Episode title:
${options?.episodeTitle || 'Untitled'}
Episode outline:
${options?.episodeOutline || sourceText}
Source text:
${sourceText}
Script type:
${scriptType}
Style:
${combinedStyle}
Script format:
${scriptFormat}
${globalSettings ? `
${globalSettings.worldview || 'Not set'}
${globalSettings.outline || 'Not set'}
${globalSettings.characters || 'Not set'}
` : ''}
${options?.previousEpisodeContent ? `Previous episode script (for continuity):
${options.previousEpisodeContent}
`;
Ensure this episode continues coherently from the previous episode's ending.
` : ''}${globalSettings ? `
Global settings:
Worldview: ${globalSettings.worldview || 'Not set'}
Story outline: ${globalSettings.outline || 'Not set'}
Character profiles: ${globalSettings.characters || 'Not set'}
` : ''}${continuationPrompt}
Requirements:
1. ?????????????????????????????
2. ??????????????????????????????
3. ???????????????????
4. ???????????
Output the script body directly, without any explanations.`;
const systemInstruction = baseSystemInstruction + v.instruction;
let fullText = seedContent;
if (config.model === 'gemini') {
const ai = getGeminiClient(config.apiKey);
@ -446,8 +689,8 @@ ${globalSettings ? `
}
});
let fullText = "";
for await (const chunk of result) {
throwIfAborted(options?.signal);
const chunkText = chunk.text;
if (chunkText) {
fullText += chunkText;
@ -467,10 +710,12 @@ ${globalSettings ? `
"thinking": { "type": "enabled" },
"reasoning_effort": "low"
}
} as any, {
signal: options?.signal
} as any);
let fullText = "";
for await (const chunk of stream) {
throwIfAborted(options?.signal);
const content = chunk.choices[0]?.delta?.content || "";
if (content) {
fullText += content;
@ -479,15 +724,25 @@ ${globalSettings ? `
}
}
} catch (error) {
if (isConversionAbortedError(error)) {
throw error;
}
console.error(`Error generating version ${v.id}:`, error);
      onUpdate(v.id, "Generation failed, please retry.");
throw error;
}
};
await Promise.all(versionPrompts.map(v => generateVersion(v)));
await Promise.all(versionPrompts.filter((version) => versionIds.includes(version.id)).map((version) => generateVersion(version)));
throwIfAborted(options?.signal);
} catch (error) {
if (isConversionAbortedError(error)) {
throw error;
}
console.error("Error converting text to script:", error);
throw error;
}
}
@ -537,12 +792,12 @@ export async function refineConvertedScript(originalContent: string, instruction
}
export async function generatePageScript(
story: string,
scriptType: string,
scriptFormat: string,
story: string,
scriptType: string,
scriptFormat: string,
styles: string[],
narratives: string[],
globalSettings: { worldview: string; outline: string; characters: string } | undefined,
globalSettings: { worldview: string; outline: string; characters: string; audiencePreference: string; wordRange: string } | undefined,
previousScript: string | undefined,
onUpdate: (versionIndex: number, content: string) => void,
config: AIConfig
@ -1099,7 +1354,7 @@ export async function generateAllScripts(
styles: string[],
narratives: string[],
config: AIConfig,
globalSettings?: { worldview: string; outline: string; characters: string }
globalSettings?: { worldview: string; outline: string; characters: string; audiencePreference: string; wordRange: string }
): Promise<string[][]> {
try {
const generateForPage = async (pageStory: string, pageIndex: number): Promise<string[]> => {

src/services/api.ts Normal file

@ -0,0 +1,60 @@
export interface AuthUser {
id: number;
username: string;
}
export interface RemoteDraft {
version: number;
savedAt: string;
data: Record<string, unknown>;
}
async function request<T>(path: string, options: RequestInit = {}, token?: string): Promise<T> {
const response = await fetch(path, {
...options,
headers: {
'Content-Type': 'application/json',
...(token ? { Authorization: `Bearer ${token}` } : {}),
...(options.headers || {}),
},
});
const payload = await response.json().catch(() => ({}));
if (!response.ok) {
    throw new Error(payload.error || 'Request failed');
}
return payload as T;
}
export function register(username: string, password: string) {
return request<{ token: string; user: AuthUser }>('/api/auth/register', {
method: 'POST',
body: JSON.stringify({ username, password }),
});
}
export function login(username: string, password: string) {
return request<{ token: string; user: AuthUser }>('/api/auth/login', {
method: 'POST',
body: JSON.stringify({ username, password }),
});
}
export function restoreSession(token: string) {
return request<{ user: AuthUser }>('/api/auth/session', { method: 'GET' }, token);
}
export function logout(token: string) {
return request<{ ok: true }>('/api/auth/logout', { method: 'POST' }, token);
}
export function loadCurrentDraft(token: string) {
return request<{ draft: { payload: RemoteDraft; updatedAt: string } | null }>('/api/draft/current', { method: 'GET' }, token);
}
export function saveCurrentDraft(token: string, draft: RemoteDraft) {
return request<{ draft: { payload: RemoteDraft; updatedAt: string } }>('/api/draft/current', {
method: 'POST',
body: JSON.stringify({ draft }),
}, token);
}


@ -0,0 +1,71 @@
import mammoth from 'mammoth';
import { getDocument, GlobalWorkerOptions } from 'pdfjs-dist';
import pdfWorker from 'pdfjs-dist/build/pdf.worker.mjs?url';
GlobalWorkerOptions.workerSrc = pdfWorker;
const SUPPORTED_EXTENSIONS = ['docx', 'txt', 'pdf', 'md'] as const;
export type SupportedUploadExtension = typeof SUPPORTED_EXTENSIONS[number];
export interface ParsedUploadFile {
extension: SupportedUploadExtension;
text: string;
}
function getExtension(fileName: string): SupportedUploadExtension | null {
const ext = fileName.split('.').pop()?.toLowerCase();
if (!ext) return null;
return (SUPPORTED_EXTENSIONS as readonly string[]).includes(ext)
? (ext as SupportedUploadExtension)
: null;
}
async function parsePdf(file: File): Promise<string> {
const data = new Uint8Array(await file.arrayBuffer());
const pdf = await getDocument({ data }).promise;
const pages: string[] = [];
for (let pageNumber = 1; pageNumber <= pdf.numPages; pageNumber += 1) {
const page = await pdf.getPage(pageNumber);
const textContent = await page.getTextContent();
const pageText = textContent.items
.map((item) => ('str' in item ? item.str : ''))
.join(' ')
.replace(/\s+/g, ' ')
.trim();
if (pageText) {
pages.push(pageText);
}
}
return pages.join('\n\n');
}
async function parseDocx(file: File): Promise<string> {
const buffer = await file.arrayBuffer();
const result = await mammoth.extractRawText({ arrayBuffer: buffer });
return result.value.trim();
}
export async function parseUploadedSourceFile(file: File): Promise<ParsedUploadFile> {
const extension = getExtension(file.name);
if (!extension) {
    throw new Error('Only docx, txt, pdf, and md files are supported');
}
let text = '';
if (extension === 'txt' || extension === 'md') {
text = await file.text();
} else if (extension === 'docx') {
text = await parseDocx(file);
} else if (extension === 'pdf') {
text = await parsePdf(file);
}
const normalized = text.replace(/\r\n/g, '\n').trim();
if (!normalized) {
    throw new Error('The file was parsed, but no usable text was extracted');
}
return { extension, text: normalized };
}
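Extension gating above is a simple allow-list on the lowercased final dot-segment; a standalone sketch of that check:

```typescript
const SUPPORTED_EXTENSIONS = ['docx', 'txt', 'pdf', 'md'] as const;
type SupportedUploadExtension = typeof SUPPORTED_EXTENSIONS[number];

// Lowercase the final extension and check it against the allow-list.
function getExtension(fileName: string): SupportedUploadExtension | null {
  const ext = fileName.split('.').pop()?.toLowerCase();
  if (!ext) return null;
  return (SUPPORTED_EXTENSIONS as readonly string[]).includes(ext)
    ? (ext as SupportedUploadExtension)
    : null;
}

console.log(getExtension('draft.final.PDF')); // pdf
console.log(getExtension('notes.rtf'));       // null
```

One edge worth noting: for a bare name like `README`, `split('.').pop()` returns the whole name, which simply fails the allow-list check and yields null.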