Compare commits

...

99 Commits

Author SHA1 Message Date
707e65b017 refactor(annotation): streamline segment handling in the editor service
- Initialize the segments list variable when processing segmented annotations
- Ensure the segment info list is properly initialized at the start of the function
- Improve readability and the consistency of variable declarations
cda22a720c feat(annotation): improve the segmented text annotation implementation
- Add the getEditorTaskSegmentsUsingGet API for fetching task segment info
- Remove the text, start, and end fields from SegmentInfo to slim down the data structure
- Add the EditorTaskSegmentsResponse type for segment summary responses
- Implement the server-side get_task_segments method for segment queries
- Rework the frontend component cache to track segment state via segmentSummaryFileRef
- Extract segment construction into a shared _build_segment_contexts method
- Adjust segment handling in the backend _build_text_task method
- Unify the RequestParams and RequestPayload types in the API type definitions
394e2bda18 feat(data-management): add cancel-upload support for dataset files
- Define the cancel-upload REST endpoint in the OpenAPI spec
- Implement the cancel-upload business logic in DatasetFileApplicationService
- Add a complete cancel-upload service method to FileService
- Create the DatasetUploadController to handle cancel requests
- Clean up temporary chunk files and delete the database records
4220284f5a refactor(utils): rework streaming split-and-upload for files
- Split streamSplitAndUpload into a standalone processFileLines function
- Simplify line-by-line processing, removing redundant line collection and caching
- Improve concurrent uploads by managing tasks with a Promise set
- Fix abort-signal handling and error propagation during upload
- Unify the progress-callback parameter structure and improve byte and line tracking
- Improve empty-line skip counting and the upload result return value
8415166949 refactor(upload): rework chunk upload to resolve request IDs on demand
- Replace upfront batch fetching of reqId with on-demand resolution
- Add a resolveReqId function for dynamically obtaining request IDs
- Add an onReqIdResolved callback fired when an ID resolves
- Improve line-based chunk upload so each line is handled as an independent file
- Improve empty-line skipping and count the skipped empty lines
- Fix the mapping between fileNo and chunkNo
- Update the streamSplitAndUpload parameter structure
078f303f57 Revert "fix: resolve the conflict between hasArchive and splitByLine"
This reverts commit 50f2da5503.
2026-02-04 15:48:01 +08:00
50f2da5503 fix: resolve the conflict between hasArchive and splitByLine
Problem: hasArchive defaults to true, and splitByLine could be enabled at the same
      time, so archives were incorrectly split by line — a logical contradiction.

Fix:
1. Disable the splitByLine switch while hasArchive=true
2. Add a useEffect that automatically turns splitByLine off when hasArchive becomes true

Changed file: frontend/src/pages/DataManagement/Detail/components/ImportConfiguration.tsx
2026-02-04 15:43:53 +08:00
3af1daf8b6 fix: fix the "pre-upload request does not exist" error in streaming split upload
Problem: handleStreamUpload called preUpload only once for all files, with
      totalFileNum: files.length (the original file count), but the number of files
      actually uploaded is the total line count after splitting, so the backend
      deleted the pre-upload request prematurely.

Fix: move the preUpload call inside the file loop so each original file gets its own
      preUpload with totalFileNum: 1 and its own reqId. This avoids the premature
      deletion caused by splitting a file into lines (sketched after this entry).

Changed file: frontend/src/hooks/useSliceUpload.tsx
2026-02-04 15:39:05 +08:00
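The shape of that fix, as a minimal TypeScript sketch — preUpload and uploadSplitLines are hypothetical stand-ins for the project's actual helpers in useSliceUpload.tsx:

```ts
interface PreUploadResult { reqId: string }

declare function preUpload(args: { fileName: string; totalFileNum: number }): Promise<PreUploadResult>;
declare function uploadSplitLines(reqId: string, file: File): Promise<void>;

async function handleStreamUpload(files: File[]): Promise<void> {
  for (const file of files) {
    // One pre-upload request per original file. totalFileNum is always 1, so the
    // backend keeps the request alive however many lines the file splits into.
    const { reqId } = await preUpload({ fileName: file.name, totalFileNum: 1 });
    await uploadSplitLines(reqId, file);
  }
}
```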
7c7729434b fix: fix three issues in streaming split upload
1. Implement real concurrency control to avoid firing a flood of simultaneous requests
   - Use a task-queue pattern so no more than maxConcurrency tasks run at once
   - Start the next task only after one finishes, instead of launching everything
     upfront (see the sketch after this entry)

2. Fix the API error ("pre-upload request does not exist")
   - All chunks use the same fileNo=1 (they belong to the same pre-upload request)
   - chunkNo becomes the line number, i.e. which line of data it carries
   - This was the root cause: each line used to be treated as a different file,
     but only the first file had a valid pre-upload request

3. Preserve the original file extension
   - Extract and keep the extension correctly
   - e.g. 132.txt → 132_000001.txt (instead of 132_000001)
2026-02-04 15:06:02 +08:00
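The task-queue pattern from item 1 above, as a small self-contained sketch (illustrative; the project's implementation may differ in its details):

```ts
// Run tasks with at most maxConcurrency in flight at any time.
async function runWithConcurrency<T>(
  tasks: Array<() => Promise<T>>,
  maxConcurrency: number,
): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;

  // Each worker pulls the next task only after finishing its current one,
  // so the number of concurrent tasks never exceeds the worker count.
  async function worker(): Promise<void> {
    while (next < tasks.length) {
      const index = next++;
      results[index] = await tasks[index]();
    }
  }

  const workerCount = Math.min(maxConcurrency, tasks.length);
  await Promise.all(Array.from({ length: workerCount }, worker));
  return results;
}
```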
17a62cd3c2 fix: fix upload cancellation so HTTP requests are actually aborted
- Add a signal.aborted check in the XMLHttpRequest path (sketched after this entry)
- Fix the cancelFn closure issue in useSliceUpload
- Ensure both streaming and chunk uploads can be cancelled correctly
2026-02-04 14:51:23 +08:00
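A sketch of wiring an AbortSignal into XMLHttpRequest as this fix describes (the uploadChunk helper and its signature are illustrative, not the project's exact API):

```ts
function uploadChunk(url: string, body: Blob, signal: AbortSignal): Promise<void> {
  return new Promise((resolve, reject) => {
    // Bail out early if cancellation already happened before the request started.
    if (signal.aborted) {
      reject(new DOMException('Aborted', 'AbortError'));
      return;
    }
    const xhr = new XMLHttpRequest();
    xhr.open('POST', url);
    // Propagate a later cancellation to the in-flight request.
    signal.addEventListener('abort', () => xhr.abort(), { once: true });
    xhr.onload = () => (xhr.status < 400 ? resolve() : reject(new Error(`HTTP ${xhr.status}`)));
    xhr.onerror = () => reject(new Error('network error'));
    xhr.onabort = () => reject(new DOMException('Aborted', 'AbortError'));
    xhr.send(body);
  });
}
```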
f381d641ab fix(upload): fix filename handling in streaming upload
- Pass the real total file count to the pre-upload API instead of the fixed value -1
- Remove extension preservation from file splitting in the import configuration
- Remove the fileExtension option from the streaming upload options
- Remove the extension-handling code from the streaming upload implementation
- Simplify new-filename generation; no extension suffix is appended anymore
2026-02-04 07:47:41 +08:00
c8611d29ff feat(upload): implement streaming split upload for a better large-file experience
Split and upload in a streaming fashion so large files no longer freeze the frontend
by being loaded into memory all at once.

Changes:
1. file.util.ts - core streaming split-and-upload
   - Add streamSplitAndUpload, which uploads while splitting
   - Add shouldStreamUpload to decide whether to use streaming upload
   - Add the StreamUploadOptions and StreamUploadResult interfaces
   - Tune the chunk size (default 5 MB)

2. ImportConfiguration.tsx - smart upload strategy
   - Large files (>5 MB) use streaming split upload
   - Small files (≤5 MB) use the traditional split approach
   - UI unchanged

3. useSliceUpload.tsx - streaming upload handling
   - Add handleStreamUpload to process streaming upload events
   - Support concurrent uploads and better progress management

4. TaskUpload.tsx - progress display
   - Register the streaming upload event listeners
   - Show streaming upload info (lines uploaded, current file, etc.)

5. dataset.model.ts - type definitions
   - Add the StreamUploadInfo interface
   - Add streamUploadInfo and prefix fields to TaskItem

Implementation notes:
- Streaming reads: read chunk by chunk with Blob.slice instead of loading the whole
  file (see the sketch after this entry)
- Line detection: split on newlines and upload each completed line immediately
- Memory: the buffer holds only the current chunk and any incomplete line; split
  results never accumulate
- Concurrency: up to 3 concurrent uploads for better throughput
- Progress: lines uploaded and overall progress are shown in real time
- Error handling: one file failing does not affect the others
- Backward compatible: small files keep the original split path

Benefits:
- Large-file uploads no longer freeze the UI; a much better experience
- Memory usage drops significantly (from the whole file to just the current chunk)
- Faster uploads (split while uploading, several small files in parallel)

Files:
- frontend/src/utils/file.util.ts
- frontend/src/pages/DataManagement/Detail/components/ImportConfiguration.tsx
- frontend/src/hooks/useSliceUpload.tsx
- frontend/src/pages/Layout/TaskUpload.tsx
- frontend/src/pages/DataManagement/dataset.model.ts
2026-02-03 13:12:10 +00:00
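The streaming read described above, as a minimal sketch: read the file with Blob.slice in fixed-size chunks, stream-decode, and keep only the unfinished line in memory. Illustrative only — streamSplitAndUpload in file.util.ts is more involved:

```ts
async function splitByLine(
  file: File,
  onLine: (line: string, lineNo: number) => Promise<void>,
  chunkSize = 5 * 1024 * 1024, // 5 MB, matching the default chunk size
): Promise<number> {
  const decoder = new TextDecoder('utf-8'); // stream-decode so multi-byte chars can span chunks
  let buffer = '';
  let lineNo = 0;
  const emit = async (line: string) => {
    if (line.trim() === '') return; // skip empty lines instead of uploading them
    await onLine(line, ++lineNo);
  };
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    const bytes = await file.slice(offset, offset + chunkSize).arrayBuffer();
    buffer += decoder.decode(bytes, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? ''; // keep only the trailing, possibly incomplete line
    for (const line of lines) await emit(line);
  }
  buffer += decoder.decode(); // flush the decoder
  if (buffer !== '') await emit(buffer); // final line without a trailing newline
  return lineNo;
}
```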
147beb1ec7 feat(annotation): precompute text segmentation
Pre-generate the text segment structure when an annotation task is created, so it is
not recomputed every time the annotation page is opened.

Changes:
1. Add precompute_segmentation_for_project to AnnotationEditorService
   - Precompute the segment structure for every text file in the project
   - Run the split with AnnotationTextSplitter
   - Persist the segment structure to the AnnotationResult table (status IN_PROGRESS)
   - Support retry on failure
   - Return statistics

2. Update the create_mapping endpoint
   - After the annotation task is created, trigger segment pre-generation automatically
     if segmentation is enabled and the dataset is a text dataset
   - Wrap it in try-except so a failed split does not block project creation

Notes:
- Reuses the existing AnnotationTextSplitter class
- The segment structure matches the existing segmented-annotation format
- Backward compatible (unsegmented tasks still compute on the fly)
- Performance: avoids recomputation when entering the annotation page

Files:
- runtime/datamate-python/app/module/annotation/service/editor.py
- runtime/datamate-python/app/module/annotation/interface/project.py
2026-02-03 12:59:29 +00:00
699031dae7 fix: fix the compile error when clearing a dataset's parent dataset
Analysis:
A previous attempt used the @TableField(updateStrategy = FieldStrategy.IGNORED/ALWAYS)
annotation to force updates of null values, but FieldStrategy.ALWAYS may not exist in
the current MyBatis-Plus 3.5.14, causing a compile error.

Fix:
1. Remove the @TableField(updateStrategy) annotation from parentDatasetId in Dataset.java
2. Remove the unneeded import com.baomidou.mybatisplus.annotation.FieldStrategy
3. In DatasetApplicationService.updateDataset:
   - Add import com.baomidou.mybatisplus.core.conditions.update.LambdaUpdateWrapper
   - Save the original parentDatasetId for comparison
   - After handleParentChange, check whether parentDatasetId changed
   - If it changed, update the parentDatasetId column explicitly with a LambdaUpdateWrapper
   - That way the column is written to the database even when the value is null

Rationale:
MyBatis-Plus's updateById only updates non-null fields by default. Using
LambdaUpdateWrapper's set method writes the value explicitly, including null,
so the column is updated correctly.
2026-02-03 11:09:15 +00:00
88b1383653 fix: restore sending an empty string from the frontend so the parent dataset can be cleared
Notes:
The logic that converted empty strings to undefined has been removed;
the form values are now sent as-is, including empty strings.

Works together with the backend change (commit cc6415c):
1. When the user selects "no parent dataset", an empty string "" is sent
2. The backend handleParentChange converts the empty string to null via normalizeParentId
3. Dataset.parentDatasetId carries @TableField(updateStrategy = FieldStrategy.IGNORED)
4. This ensures the value is written to the database even when it is null
2026-02-03 10:57:14 +00:00
cc6415c4d9 fix: fix clearing a dataset's parent dataset during editing
Problem:
When editing a dataset that already had a parent dataset, selecting "no parent dataset" and saving had no effect.

Root cause:
MyBatis-Plus's updateById uses the FieldStrategy.NOT_NULL strategy by default,
so a field is only written to the database when its value is non-null.
When parentDatasetId goes from a value to null, it is not written by default.

Fix:
Add @TableField(updateStrategy = FieldStrategy.IGNORED) to parentDatasetId in Dataset.java,
meaning the column is updated even when the value is null.

Combined with the frontend change (sending the empty string again), the parent dataset can now be cleared correctly:
1. The frontend sends an empty string for "no parent dataset"
2. The backend handleParentChange converts it to null via normalizeParentId
3. dataset.setParentDatasetId(null) sets the field to null
4. With the IGNORED strategy, the null is written to the database
2026-02-03 10:57:08 +00:00
3d036c4cd6 fix: fix clearing a dataset's parent dataset during editing
Problem:
When editing a dataset that already had a parent dataset, selecting "no parent dataset" and saving had no effect.

Cause:
The condition in the backend updateDataset method:
```java
if (updateDatasetRequest.getParentDatasetId() != null) {
    handleParentChange(dataset, updateDatasetRequest.getParentDatasetId());
}
```
When parentDatasetId is null or an empty string, the condition is false, handleParentChange never runs, and the association cannot be cleared.

Fix:
Drop the condition and always call handleParentChange. It normalizes both empty strings and null to null via normalizeParentId, so it supports both setting a new parent dataset and clearing the association.

Together with the frontend change (commit 2445235) that converts the empty string to undefined (deserialized to null by the backend), clearing the association now works correctly.
2026-02-03 09:35:09 +00:00
2445235fd2 fix: fix clearing the parent dataset having no effect when editing
Problem:
When editing a dataset that already had a parent dataset, selecting "no parent dataset" and saving had no effect.

Cause:
- In BasicInformation.tsx, the "no parent dataset" option has the value ""
- When the user picks it, parentDatasetId is ""
- The backend API ignored the empty string as invalid instead of treating it as "clear the association"

Fix:
- In handleSubmit in EditDataset.tsx, convert an empty parentDatasetId to undefined
- Use formValues.parentDatasetId || undefined so the empty string becomes undefined
  (sketched after this entry)
- The backend then correctly recognizes the intent to clear the parent dataset
2026-02-03 09:23:13 +00:00
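The normalization in handleSubmit, sketched (the FormValues shape is illustrative):

```ts
interface FormValues {
  parentDatasetId?: string;
}

function toUpdatePayload(formValues: FormValues): FormValues {
  return {
    ...formValues,
    // "" (the "no parent dataset" option) becomes undefined, which the backend
    // deserializes to null and treats as "clear the association".
    parentDatasetId: formValues.parentDatasetId || undefined,
  };
}
```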
893e0a1580 fix: show the task center immediately when uploading files
Problem:
On the dataset detail page, after clicking confirm the dialog closed, but the task center only appeared once the file processing finished (especially with split-by-line enabled) — a poor experience.

Changes:
1. useSliceUpload.tsx: show the task center inside createTask so it appears as soon as the task is created
2. ImportConfiguration.tsx: in handleImportData, fire the show:task-popover event before the expensive file processing (e.g. splitting) starts (sketched after this entry)

Effect:
- Before: confirm → dialog closes → (wait for processing) → task center appears
- After: confirm → dialog closes + task center appears immediately → processing starts
2026-02-03 09:14:40 +00:00
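A sketch of the event handoff: the show:task-popover event name comes from the commit, while the wiring shown here is illustrative:

```ts
// Open the task center before the expensive splitting work starts.
function handleImportData(processFiles: () => Promise<void>): Promise<void> {
  window.dispatchEvent(new CustomEvent('show:task-popover'));
  return processFiles(); // splitting etc. happens after the popover is visible
}

// In the task center component:
window.addEventListener('show:task-popover', () => {
  // open the popover, e.g. setPopoverOpen(true)
});
```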
05e6842fc8 refactor(DataManagement): remove unnecessary dataset-type filtering
- Remove the dataset-type filter
- Remove the unused textDatasetTypeOptions variable
- Simplify data passing in the BasicInformation component
- Less redundancy, better component performance
2026-02-03 13:33:12 +08:00
da5b18e423 feat(scripts): preload the APT cache to fix offline builds
- Add an APT cache directory and the related export-cache.sh build script
- Add build-base-images.sh for building base images with APT packages preinstalled
- Add build-offline-final.sh, the final offline build script
- Add new offline build targets to Makefile.offline.mk
- Extend README.md with a detailed explanation of the APT cache problem and its solution
- Add offline Dockerfiles that use the preinstalled base images for several services
- Include the APT cache in the final archive produced by the packaging script
2026-02-03 13:16:17 +08:00
31629ab50b docs(offline): document the classic build path and add a troubleshooting guide
- Add the classic docker build approach as the recommended option
- Add the offline diagnostics command make offline-diagnose
- Expand the troubleshooting chapter with several common issues and their fixes
- Add a file inventory and a recommended-workflow section
- Provide several workarounds for BuildKit builders not using local images
- Update the build command usage notes and important tips
2026-02-03 13:10:28 +08:00
fb43052ddf feat(build): add a classic Docker build path and diagnostics
- Add --pull=false and better error handling to build-offline.sh
- Add --pull=false to each service build task in Makefile.offline.mk
- Add build-offline-classic.sh, a classic build path that avoids BuildKit
- Add build-offline-v2.sh, an enhanced BuildKit offline build
- Add diagnose.sh for diagnosing the offline build environment
- Add offline-build-classic and offline-diagnose to the Makefile
2026-02-02 23:53:45 +08:00
c44c75be25 fix(login): fix login page styling
- Fix the CSS class name on the description under the title by removing a stray space
- Update the style class on the footer copyright line
- Simplify the bottom description text for consistent branding
2026-02-02 22:49:46 +08:00
05f3efc148 build(docker): switch Docker image sources to the Nanjing University mirror
- Switch the frontend Dockerfile base image from gcr.io to gcr.nju.edu.cn
- Update the nodejs20-debian12 image source in the offline Dockerfile
- Point the base-image list in export-cache.sh at the NJU mirror
- Update the image pull addresses in Makefile.offline.mk to the local mirror
- Tidy the formatting and output of export-cache.sh
- Add warning handling during cache export
2026-02-02 22:48:41 +08:00
16eb5cacf9 feat(data-management): support extended metadata on knowledge items
- Implement metadata updates in KnowledgeItemApplicationService
- Add a metadata field to CreateKnowledgeItemRequest
- Add a metadata field to UpdateKnowledgeItemRequest
- Store extended metadata on knowledge item creation and update
2026-02-02 22:20:05 +08:00
e71116d117 refactor(components): update tag component types and data handling
- Make the id and color fields optional on the Tag interface
- Change the onAddTag callback parameter from an object to a string
- Use useCallback in AddTagPopover to optimize data fetching
- Deduplicate tags by matching on either id or name
- Update the DetailHeader component's types and generic constraints
- Add a parseMetadata helper for parsing metadata
- Implement isAnnotationItem to detect annotation-type data
- Improve tag handling and type conversion on the knowledge set detail page
2026-02-02 22:15:16 +08:00
cac53d7aac fix(knowledge): rename the knowledge management page title to "Knowledge Sets"
- Change the page title from "Knowledge Management" to "Knowledge Sets"
2026-02-02 21:49:39 +08:00
43b4a619bc refactor(knowledge): remove the extended-metadata field from knowledge set creation
- Remove the extended-metadata input area from the form
- Remove the corresponding Form.Item wrapper
- Simplify the creation form structure
2026-02-02 21:48:21 +08:00
9da187d2c6 feat(build): add offline build support
- Add build-offline.sh for building without network access
- Add offline Dockerfiles that use local resources instead of network downloads
- Add export-cache.sh to pre-download dependencies in a connected environment
- Integrate Makefile.offline.mk with convenient offline build commands
- Add detailed offline build docs and a troubleshooting guide
- Package base images, BuildKit caches, and external resources in one step
2026-02-02 21:44:44 +08:00
b36fdd2438 feat(annotation): filter the label config tree editor by data type
- Introduce the DataType enum
- Filter object-tag options dynamically by data type
- Watch the data type in the template form
- Improve error handling for better type safety
- Pass the data type into the config tree editor component
2026-02-02 20:37:38 +08:00
daa63bdd13 feat(knowledge): remove the sensitivity-level feature from knowledge management
- Comment out the sensitivity-level field in the knowledge set creation form
- Remove the sensitivity-level item from the knowledge set detail page
- Comment out the related sensitivity-level option constants
- Keep the form layout a consistent two-column grid
2026-02-02 19:06:03 +08:00
85433ac071 feat(template): drop template type/version fields and gate actions behind admin rights
- Remove the type and version fields from the template detail page
- Remove the type and version columns from the template list page
- Add an admin check controlled via a localStorage key
- Show the edit and delete buttons to admins only
- Show the create-template button to admins only
2026-02-02 18:59:32 +08:00
fc2e50b415 Revert "refactor(template): remove the type, version, and action columns from the template list"
This reverts commit a5261b33b2.
2026-02-02 18:39:52 +08:00
26e1ae69d7 Revert "refactor(template): remove the create button from the template list page"
This reverts commit b2bdf9e066.
2026-02-02 18:39:48 +08:00
7092c3f955 feat(annotation): adjust the text editor size limit
- Change the editor_max_text_bytes default from 2 MB to 0, meaning no limit
- Only enforce the size check in the text-fetch service when max_bytes > 0
- Update the byte limit shown in the error message
- Streamline the conditional handling of the config parameter
2026-02-02 17:53:09 +08:00
b2bdf9e066 refactor(template): remove the create button from the template list page
- Remove the create-template button in the top-right corner
- Remove the related click-handler call
- Adjust the page layout for the removed button
2026-02-02 16:35:09 +08:00
a5261b33b2 refactor(template): remove the type, version, and action columns from the template list
- Remove the type column (built-in/custom tag display)
- Remove the version column
- Remove the action column (view, edit, delete buttons)
- Keep the creation-time column and its rendering logic
2026-02-02 16:20:50 +08:00
root
52daf30869 a 2026-02-02 16:09:25 +08:00
07a901043a refactor(annotation): remove text-content fetching
- Remove the fetch_text_content_via_download_api import
- Remove text-content fetching for TEXT datasets
- Remove the _append_annotation_to_content implementation
- Simplify content handling in the knowledge sync service
2026-02-02 15:39:06 +08:00
32e3fc97c6 feat(annotation): enforce project isolation in knowledge base sync
- Validate the project ID when looking up a knowledge base to ensure correct ownership
- Include the project ID in log messages for easier debugging
- Look up knowledge bases by name plus project ID instead of name alone
- Add _metadata_matches_project to verify project ownership in metadata
- Add _parse_metadata to safely parse the metadata JSON string
- Update the fallback naming logic to guarantee per-project uniqueness
- Validate with the project name and project ID consistently across all knowledge base operations
2026-02-02 15:28:33 +08:00
a73571bd73 feat(annotation): improve attribute filling in the template config tree editor
- Only set a default on object config attributes when the name is missing
- Add tag-category detection for control configs
- Distinguish the attribute-fill strategy for annotation controls vs. layout controls
- Always fill required attributes for annotation controls; fill layout controls only when needed
- Fix the attribute assignment so the name attribute is referenced correctly
2026-02-02 15:26:25 +08:00
00fa1b86eb refactor(DataAnnotation): drop unused state and simplify the selector logic
- Remove the unused addChildTag and addSiblingTag state variables
- Set the Select component's value to null to reset the selection
- Simplify the handleAddNode call handling
- Remove unneeded state-management code for better performance
2026-02-02 15:23:01 +08:00
626c0fcd9a fix(data-annotation): fix progress calculation for annotation tasks
- Add a toSafeCount helper for safe numeric handling
- Support both the totalCount and total_count fields
-
2026-02-01 23:42:06 +08:00
2f2e0d6a8d feat(KnowledgeManagement): preserve original knowledge set fields
- Keep the knowledge set's name, description, status, and other core attributes when updating tags
- Preserve metadata such as domain, business line, and owner
- Preserve settings such as validity period and sensitivity
- Keep source type and custom metadata fields from being overwritten
- Prevent tag updates from accidentally dropping other important field values
2026-02-01 23:30:01 +08:00
10fad39e02 feat(KnowledgeManagement): add tagging to the knowledge set detail page
- Introduce the updateKnowledgeSetByIdUsingPut, createDatasetTagUsingPost, and queryDatasetTagsUsingGet APIs
- Add a Clock icon for the updated-at time
- Replace the item-count and updated-at icons with the File and Clock components
- Configure the tag component to support adding, fetching, and creating tags
- Implement the tag create and add logic
- Wire up async loading and updating of tags
2026-02-01 23:26:54 +08:00
9014dca1ac fix(knowledge): fix the status check on the knowledge set detail page
- Fix the condition for the office preview status
- Remove the redundant check for the PENDING status
- Refine when status polling is triggered
2026-02-01 23:15:50 +08:00
0b8fe34586 refactor(DataManagement): simplify file handling and drop the text-dataset type check
- Remove the unused DatasetType import
- Remove the TEXT_DATASET_TYPE_PREFIX constant
- Remove the isTextDataset helper
- Set excludeDerivedFiles to true directly, simplifying the query logic
2026-02-01 23:13:09 +08:00
27e27a09d4 fix(knowledge): remove a redundant toast in the knowledge item editor
- Remove the duplicate message shown after a successful file upload
- Keep the file-object handling correct
- Avoid unnecessary feedback for a better experience
2026-02-01 23:07:32 +08:00
d24fea83d8 feat(KnowledgeItemEditor): add a loading state for file upload and replace
- Add a loading state for the upload and replace operations
- Set loading to true before uploading a file
- Set loading to true before replacing a file
- Reset loading in a finally block when the operation completes
- Bind loading to the confirm button's confirmLoading prop
2026-02-01 23:07:10 +08:00
05088fef1a refactor(data-management): improve the text-dataset type check
- Add the TEXT_DATASET_TYPE_PREFIX constant
- Add an isTextDataset helper for detecting text datasets
- Replace direct comparisons with isTextDataset
- Improve readability and the accuracy of the type check
2026-02-01 23:02:05 +08:00
a0239518fb feat(dataset): implement file-visibility filtering for datasets
- Detect derived files via the derived_from_file_id key in their metadata
- Implement applyVisibleFileCounts to set visible file counts for datasets in batch
- Base the dataset statistics endpoint on the filtered, visible files
- Add a normalizeFilePath helper for consistent path formatting
- Support derived-file filtering in the file query logic
- Add the DatasetFileCount DTO for returning file counts
2026-02-01 22:55:07 +08:00
9d185bb10c feat(deploy): add a storage volume for uploads
- Add an uploads_volume for storing uploaded files
- Name the volume datamate-uploads-volume
- Mount the uploads volume at /uploads in the container
- Update the deployment config to support file uploads
2026-02-01 22:34:14 +08:00
6c4f05c0b9 fix(data-management): fix the file preview status check
- Stop polling previews in the PENDING state
- Avoid the performance hit of repeated polling while PENDING
- Tidy up the preview loading-state flow
2026-02-01 22:32:03 +08:00
438acebb89 feat(data-management): add Office document previews
- Convert DOC/DOCX to PDF via the LibreOffice converter
- Add DatasetFilePreviewService to manage preview files
- Add DatasetFilePreviewAsyncService for async conversion tasks
- Clean up preview files when a file is deleted
- Poll the Office preview status on the frontend
- Add preview API endpoints for status queries and triggering conversion
- Show conversion progress and errors in the preview UI
2026-02-01 22:26:05 +08:00
f06d6e5a7e fix(utils): fix the XMLHttpRequest setup in the request util
- Move XMLHttpRequest instantiation to the start of the method to avoid duplicate creation
- Delete the commented-out old request-complete handler
- Fix the error handling for request error and abort events
- Remove the duplicate xhr.open call so the HTTP method is set correctly
2026-02-01 22:07:43 +08:00
fda283198d refactor(knowledge): remove an unused Tag import
- Remove the unused Tag import from KnowledgeSetDetail.tsx
- Keep the code clean by dropping dead dependencies
2026-02-01 22:05:10 +08:00
d535d0ac1b feat(knowledge): add Office preview polling
- Use useRef for the polling timer and the item currently being processed
- Add a Spin component for the preview loading state
- Add the queryKnowledgeItemPreviewStatusUsingGet API call
- Add the OFFICE_PREVIEW_POLL_INTERVAL and OFFICE_PREVIEW_POLL_MAX_TIMES constants
- Remove the old Office preview metadata parsing
- Add officePreviewStatus and officePreviewError state
- Implement pollOfficePreviewStatus for preview status polling
- Add clearOfficePreviewPolling to clean up the polling timer
- Hook status polling into handlePreviewItemFile
- Clear polling and reset state when the preview closes
- Remove the Office preview tag from the table
- Show loading or error states in the PDF preview when no preview URL exists
2026-02-01 22:02:57 +08:00
4d2c9e546c refactor(menu): restructure the menu and rename data management titles
- Rename the data management menu item from "Data Management" to "Dataset Management"
- Reorder the menu so data annotation and content generation follow data management
- Rename the dataset statistics page title from "Data Management" to "Dataset Statistics"
- Remove the duplicated data annotation and content generation menu entries
2026-02-01 21:40:21 +08:00
02cd16523f refactor(data-management-service): remove the docx4j dependency
- Remove the docx4j-core dependency
- Remove the docx4j-export-fo dependency
- Update the project dependency configuration
- Simplify the build file
2026-02-01 21:18:50 +08:00
d4a44f3bf5 refactor(data-management): streamline file conversion in the knowledge item preview service
- Remove the docx4j dependencies and conversion methods
- Standardize Office-to-PDF conversion on LibreOffice throughout
- Delete the dedicated DOCX-to-PDF conversion method
- Rename the conversion method to convertOfficeToPdfByLibreOffice
- Strengthen path resolution with multiple candidate paths
- Add path-safety validation and normalization
- Add the extractRelativePathFromSegment and normalizeRelativePathValue helpers
- Improve file-existence checks and path construction
2026-02-01 21:18:14 +08:00
340a0ad364 refactor(data-management): update knowledge item storage path resolution
- Replace resolveKnowledgeItemStoragePath with resolveKnowledgeItemStoragePathWithFallback
- The new method adds fallback resolution for more reliable file lookup
2026-02-01 21:14:39 +08:00
00c41fbbd3 refactor(knowledge-item): improve preview file path handling
- Move the path validation from the top of the method to where it is actually used
- Fix how the preview filename is obtained: parse it directly from the relative path
- Run the file-existence check only when needed
- Better readability and execution efficiency
2026-02-01 21:00:07 +08:00
2430db290d fix(knowledge): fix the statistics shown on the knowledge management page
- Correct the second statistic from "total files" to "knowledge categories"
- Correct the third statistic from "total tags" to "total files"
- Move the total-tags figure within the statistics area
- Make the numbers match their headings
2026-02-01 20:46:54 +08:00
40889baacc feat(knowledge): add knowledge item previews
- Convert Office documents to PDF via docx4j and LibreOffice
- Add KnowledgeItemPreviewService for the conversion logic
- Add the async KnowledgeItemPreviewAsyncService for document conversion
- Track preview status: pending, converting, ready, and failed
- Show an Office preview status tag in the frontend
- Support online preview of DOC/DOCX files
- Store and manage preview metadata
2026-02-01 20:05:25 +08:00
551248ec76 feat(data-annotation): add a row-number column and remove the task ID column
- Add a row-number column computed from the current page
- Remove the existing task ID column
- Center the row-number column at 80px wide
- Compute the row number dynamically from the current page and page size
- Keep the table
2026-02-01 19:11:39 +08:00
0bb9abb200 feat(annotation): display the annotation type
- Add an annotation-type column rendered with the Tag component
- Add the AnnotationTypeMap constant for mapping annotation types
- Update the interface definitions to pass the labelingType field
- Store the annotation type in the backend project create and update logic
- Add a config-key constant for the annotation type
- Extend the DTOs with the annotation type
- Inherit the annotation type from the template
2026-02-01 19:08:11 +08:00
d135a7f336 feat(knowledge): add tag statistics for knowledge sets
- Inject TagMapper into KnowledgeItemApplicationService and call the counting method
- Add countKnowledgeSetTags to count the tags in a knowledge set
- Add a totalTags field to KnowledgeManagementStatisticsResponse
- Show the tag total on the frontend KnowledgeManagementPage
- Change the statistics cards from 3 to 4 columns for the new item
- Add the totalTags type to the knowledge management model
2026-02-01 18:46:31 +08:00
7043a26ab3 feat(auth): add login and route protection
- Add a logout button to the sidebar and implement the logout logic
- Add a ProtectedRoute component for route-level access control
- Create the LoginPage component with the login UI and logic
- Wire local login validation into the authSlice state
- Add the login page and protected routes to the route table
- Redirect automatically to the login page
2026-02-01 14:11:44 +08:00
906bb39b83 feat(annotation): add save-and-go-to-next-segment
- Add the SAVE_AND_NEXT_LABEL constant for the button text
- Add a saveDisabled state for the save button
- Switch the top toolbar to a three-column grid
- Add a save-and-next-segment / next-item button in the middle of the toolbar
- Drop the primary color from the save button
- Centralize the save button's disabled logic
- Distinguish plain save from save-and-advance
2026-02-01 13:09:55 +08:00
dbf8ec53dd style(ui): make preview modal widths responsive
- CreateAnnotationTaskDialog preview modal: fixed pixels → 80vw
- VisualTemplateBuilder preview drawer: 600px → 80vw
- PreviewPromptModal: 800px → 80vw
- Overview text and media previews unified at 80vw
- KnowledgeSetDetail text and media previews unified at 80vw
- Replace fixed pixel values with responsive units for a better experience
2026-02-01 12:49:56 +08:00
5f89968974 refactor(dataset): rework the dataset basic information component
- Tidy the BasicInformation component structure and logic
- Update the CreateDataset data flow
- Improve form validation and error handling
- Unify event passing between components
- Better readability and maintainability
2026-02-01 11:31:09 +08:00
be313cf425 refactor(db): streamline knowledge item table indexes
- Drop the index on relative_path in the knowledge item table
- Drop the unique constraint on relative_path in the knowledge item directory table
- Drop the index on relative_path in the knowledge item directory table
- Keep the necessary source_file and set_id
2026-02-01 11:26:10 +08:00
db37de8aee perf(db): tune knowledge item table indexes
- Add a length limit (768) to the idx_dm_ki_relative_path index
- Add a relative-path length limit (768) to the uk_dm_kd_set_path unique constraint
- Add a length limit (768) to the idx_dm_kd_relative_path index
- Better query performance and index efficiency
2026-02-01 11:24:35 +08:00
aeec19b99f feat(annotation): add a save shortcut
- Detect the Ctrl+S save shortcut (sketched after this entry)
- Add the handleSaveShortcut event handler
- Register the keyboard listener on the window
- Add an autoAdvance parameter to requestExport
- Pass autoAdvance from the save button's click handler
2026-01-31 20:47:33 +08:00
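A sketch of the Ctrl+S handling (requestExport and its autoAdvance flag come from the commit message; the wiring shown here is illustrative):

```ts
declare function requestExport(options: { autoAdvance: boolean }): void;

function handleSaveShortcut(event: KeyboardEvent): void {
  if ((event.ctrlKey || event.metaKey) && event.key.toLowerCase() === 's') {
    event.preventDefault(); // keep the browser's own "save page" dialog from opening
    requestExport({ autoAdvance: false });
  }
}

window.addEventListener('keydown', handleSaveShortcut);
```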
a4aefe66cd perf(file): raise the default upload timeout
- Increase the default timeout from 120 seconds to 1800 seconds
- Handle large-file uploads better
2026-01-31 19:15:21 +08:00
2f3a8b38d0 fix(dataset): handle missing directories in dataset file queries
- Check that the directory exists before touching the filesystem
- Return an empty page instead of throwing when the directory is missing
- Better experience right after a dataset is created
2026-01-31 19:10:22 +08:00
150af1a741 fix(annotation): fix the project mapping query
- Query the ORM model directly for raw data instead of the old mapping service
- Read the config fields from the new ORM object
- Fix the response payload when an update changes nothing
- Filter soft-deleted records so only live projects are returned
- Unify data access for better efficiency and consistency
2026-01-31 18:57:08 +08:00
e28f680abb feat(annotation): add annotation project updates
- Introduce the DatasetMappingUpdateRequest model supporting updates to name, description, template_id, and label_config
- Add a PUT /{project_id} endpoint for updating annotation project info
- Implement the update flow: look up the mapping record, process the config, update the database
- Return the result in the standard response format
- Add exception handling and logging for traceability
2026-01-31 18:54:05 +08:00
4f99875670 feat(data-management): show split-by-line only for text datasets
- Import the DatasetType definition from dataset.model
- Add an isTextDataset variable to detect whether the current dataset is a text dataset
- Render the split-by-line option conditionally, only for text datasets
- Keep the existing non-text-file disable logic unchanged
2026-01-31 18:50:56 +08:00
c23a9da8cb feat(knowledge): add directory management for knowledge sets
- Add a relative_path column to the knowledge item table for item paths
- Create a knowledge item directory table for the directory structure
- Implement the directory create/delete/query endpoints and the application service logic
- Integrate directory display and operations into the knowledge set detail page
- Add the directory create and delete APIs and DTOs
- Include the new directory table in the database init scripts
2026-01-31 18:36:40 +08:00
310bc356b1 feat(knowledge): support a file directory structure in knowledge sets
- Add a relativePath field to the KnowledgeItem model
- Handle directory prefixes and build relative paths on upload
- Add a batch-delete endpoint and implementation for knowledge items
- Rework the frontend KnowledgeSetDetail component for directory browsing and management
- Implement folder create, delete, and navigation operations
- Search and filter by relative path in the data queries
- Show folder icons and directory levels in the frontend
2026-01-31 17:45:43 +08:00
c1fb02b0f5 refactor(annotation): update the task edit mode types
- Remove the AnnotationTask type import
- Add the AnnotationTaskListItem type import
- Change the editTask prop from AnnotationTask to AnnotationTaskListItem
- Match the component types to the data structures actually used
2026-01-31 17:19:18 +08:00
4a3e466210 feat(annotation): show in-progress annotation counts
- Add AnnotationTaskListItem and related type definitions
- Add an in-progress column showing the amount of in-progress annotation data
- Update data fetching to include in-progress counts
- Map the in_progress_count field in the backend service layer
- Improve type safety and code structure
2026-01-31 17:14:23 +08:00
5d8d25ca8c fix(annotation): handle empty annotation results correctly
- Check for empty annotations when building the snapshot so empty objects are skipped
- Keep the current status when the annotation is empty and the status is NO_ANNOTATION or NOT_APPLICABLE
- Remove the redundant hasExistingAnnotation check
- Ensure correct status transitions for empty annotations, preventing false "annotated" states
2026-01-31 16:57:38 +08:00
f6788756d3 fix(annotation): fix segmented annotation data-structure compatibility
- Log and warn when merging segmented annotations fails
- Add detailed status logging when saving segmented annotations
- Fix the type check on segment structures so dict and list formats convert uniformly
- Avoid in-place modifications that defeat SQLAlchemy change detection
- Migrate the legacy list structure to the new dict structure for compatibility
2026-01-31 16:45:48 +08:00
5a5279869e feat(annotation): cache the segment total to improve performance
- Add a segment_total_hint variable in the editor service to cache the segment total
- Lock the query with with_for_update() to avoid concurrency issues
- Replace the repeated segment-total calculations with the cached hint
- Fewer database queries, faster annotation task handling
- Streamline fetching the total when a segment index exists
2026-01-31 16:28:39 +08:00
e1c963928a feat(annotation): add annotation object parsing and export
- Implement isAnnotationObject to validate annotation objects
- Add resolveSelectedAnnotation to resolve the selected annotation
- Improve annotation selection in exportSelectedAnnotation
- Handle the case where no annotation object is found
- Convert the results field to the result field automatically (sketched after this entry)
- More stable, accurate annotation exports
2026-01-31 16:14:12 +08:00
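The results-to-result conversion, sketched with an illustrative object shape:

```ts
interface AnnotationObject {
  id: string;
  result?: unknown[];
  results?: unknown[];
}

function normalizeAnnotation(candidate: AnnotationObject): AnnotationObject {
  if (!candidate.result && Array.isArray(candidate.results)) {
    // Legacy objects carry `results`; copy it into the `result` field the export expects.
    return { ...candidate, result: candidate.results };
  }
  return candidate;
}
```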
33cf65c9f8 feat(annotation): add segmented annotation statistics and progress tracking
- Add the SegmentStats type for segment statistics
- Implement segment progress calculation and caching
- Add task-status logic that understands segmented mode
- Show segment statistics in the task list
- Compute and validate segment totals automatically
- Extend the annotation status enum with an in-progress state
- Base task selection on segment completion state
- Preload and synchronize segment statistics
2026-01-31 15:42:04 +08:00
3e0a15ac8e fix(annotation): fix format options in the export dialog
- Add the py-1 class to format options for better layout
- Add a simpleLabel property for the option label display
- Switch optionLabelProp from label to simpleLabel
- Improve the select dropdown's labels
2026-01-31 15:35:54 +08:00
5318ee9641 fix(annotation): fix duplicate data handling in the export service
- Remove the duplicated else branch
- Fix the flow when the segment index key is missing
- Simplify the handling of list-type segments
- Eliminate the duplicate data-append operation
2026-01-31 14:39:21 +08:00
c5c8e6c69e feat(annotation): add segmented annotation support
- Define the segmented-annotation constants (segmented, segments, result, and other key names)
- Implement _extract_segment_annotations to handle both dict and list formats
- Add _is_segmented_annotation to detect the annotation state
- Update _has_annotation_result to use the new segmented handling
- Process segmented annotation data during task creation
- Flatten segmented results in the export service
- Normalize annotations to and from the segmented format
- Adapt the JSON and CSV export formats to the segmented structure
2026-01-31 14:36:16 +08:00
8fdc7d99b8 feat(docker): optimize Dockerfiles for caching on weak networks
- Cache-mount the DataX source to avoid repeated clones and speed up builds
- Cache-mount NLTK data and add a failure check
- Cache PaddleOCR model downloads for offline reuse
- Cache spaCy models for more reliable installs
- Adapt the build flow to dependency downloads on weak networks
2026-01-31 14:31:47 +08:00
2bc48fd465 refactor(annotation): remove editor label-config decoration
- Remove the _decorate_label_config_for_editor call
- Simplify fetching the label config
- Drop the unnecessary conditional checks
2026-01-31 14:14:32 +08:00
a21a632a4b refactor(DataManagement): improve file fetching on the dataset detail page
- Move file fetching out of fetchDataset into its own useEffect hook
- Depend on dataset.id so files load after the dataset does
- Fix a timing issue where files could be fetched too early on initial load
- Better render performance through more precise dependency tracking
- Same behavior, more maintainable code
2026-01-31 14:14:16 +08:00
595a758d05 refactor(data-management): run PDF text extraction after the transaction commits
- Inject the TransactionSynchronization dependencies
- Run PDF text extraction asynchronously after the transaction commits
- Add null checks for the dataset ID and file ID
- Register the synchronization callback only inside an active transaction
- Avoid firing the async task before the transaction has committed
2026-01-31 13:59:03 +08:00
4fa0ac1df4 config(security): disable frameOptions to allow iframe embedding
- Add a headers configuration to the SecurityFilterChain
- Disable frameOptions to lift the iframe embedding restriction
- Keep CSRF disabled and the other security settings unchanged
2026-01-31 13:57:38 +08:00
f2403f00ce feat(annotation): support a "not applicable" annotation status
- Add NOT_APPLICABLE to the AnnotationResultStatus enum
- Split "no annotation / not applicable" into two distinct status options
- Update the frontend label display for the new status type
- Allow choosing "not applicable" in the confirmation dialog
- Add the NOT_APPLICABLE value to the backend database model
- Update the API schema descriptions for the new status option
- Handle all three statuses in the status checks and save logic
- Update the table comments to include the new status type
2026-01-31 13:28:08 +08:00
f4fc574687 feat(annotation): add annotation status management
- Introduce the AnnotationResultStatus enum to distinguish annotated from unannotated
- Detect empty annotations in the frontend and show a confirmation dialog
- Add the annotation_status column to store the annotation status
- Validate and process the annotation status in the backend services
- Pass the annotation status through the API
- Reflect the different statuses in the task list
- Check annotation results in segmented mode
2026-01-31 13:23:38 +08:00
129 changed files with 10049 additions and 1790 deletions

304
Makefile.offline.mk Normal file
View File

@@ -0,0 +1,304 @@
# ============================================================================
# Makefile offline-build extension
# Append this file to the end of the main Makefile, or include it standalone
# ============================================================================
# Offline build configuration
CACHE_DIR ?= ./build-cache
OFFLINE_VERSION ?= latest
# Create the buildx builder (if it does not exist)
.PHONY: ensure-buildx
ensure-buildx:
@if ! docker buildx inspect offline-builder > /dev/null 2>&1; then \
echo "创建 buildx 构建器..."; \
docker buildx create --name offline-builder --driver docker-container --use 2>/dev/null || docker buildx use offline-builder; \
else \
docker buildx use offline-builder 2>/dev/null || true; \
fi
# ========== Offline cache export (connected environment) ==========
.PHONY: offline-export
offline-export: ensure-buildx
@echo "======================================"
@echo "导出离线构建缓存..."
@echo "======================================"
@mkdir -p $(CACHE_DIR)/buildkit $(CACHE_DIR)/images $(CACHE_DIR)/resources
@$(MAKE) _offline-export-base-images
@$(MAKE) _offline-export-cache
@$(MAKE) _offline-export-resources
@$(MAKE) _offline-package
.PHONY: _offline-export-base-images
_offline-export-base-images:
@echo ""
@echo "1. 导出基础镜像..."
@bash -c 'images=( \
"maven:3-eclipse-temurin-21" \
"maven:3-eclipse-temurin-8" \
"eclipse-temurin:21-jdk" \
"mysql:8" \
"node:20-alpine" \
"nginx:1.29" \
"ghcr.nju.edu.cn/astral-sh/uv:python3.11-bookworm" \
"ghcr.nju.edu.cn/astral-sh/uv:python3.12-bookworm" \
"ghcr.nju.edu.cn/astral-sh/uv:latest" \
"python:3.12-slim" \
"python:3.11-slim" \
"gcr.nju.edu.cn/distroless/nodejs20-debian12" \
); for img in "$${images[@]}"; do echo " Pulling $$img..."; docker pull "$$img" 2>/dev/null || true; done'
@echo " Saving base images..."
@docker save -o $(CACHE_DIR)/images/base-images.tar \
maven:3-eclipse-temurin-21 \
maven:3-eclipse-temurin-8 \
eclipse-temurin:21-jdk \
mysql:8 \
node:20-alpine \
nginx:1.29 \
ghcr.nju.edu.cn/astral-sh/uv:python3.11-bookworm \
ghcr.nju.edu.cn/astral-sh/uv:python3.12-bookworm \
ghcr.nju.edu.cn/astral-sh/uv:latest \
python:3.12-slim \
python:3.11-slim \
gcr.nju.edu.cn/distroless/nodejs20-debian12 2>/dev/null || echo " Warning: Some images may not exist"
.PHONY: _offline-export-cache
_offline-export-cache:
@echo ""
@echo "2. 导出 BuildKit 缓存..."
@echo " backend..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/backend-cache,mode=max -f scripts/images/backend/Dockerfile -t datamate-backend:cache . 2>/dev/null || echo " Warning: backend cache export failed"
@echo " backend-python..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/backend-python-cache,mode=max -f scripts/images/backend-python/Dockerfile -t datamate-backend-python:cache . 2>/dev/null || echo " Warning: backend-python cache export failed"
@echo " database..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/database-cache,mode=max -f scripts/images/database/Dockerfile -t datamate-database:cache . 2>/dev/null || echo " Warning: database cache export failed"
@echo " frontend..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/frontend-cache,mode=max -f scripts/images/frontend/Dockerfile -t datamate-frontend:cache . 2>/dev/null || echo " Warning: frontend cache export failed"
@echo " gateway..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/gateway-cache,mode=max -f scripts/images/gateway/Dockerfile -t datamate-gateway:cache . 2>/dev/null || echo " Warning: gateway cache export failed"
@echo " runtime..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/runtime-cache,mode=max -f scripts/images/runtime/Dockerfile -t datamate-runtime:cache . 2>/dev/null || echo " Warning: runtime cache export failed"
@echo " deer-flow-backend..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/deer-flow-backend-cache,mode=max -f scripts/images/deer-flow-backend/Dockerfile -t deer-flow-backend:cache . 2>/dev/null || echo " Warning: deer-flow-backend cache export failed"
@echo " deer-flow-frontend..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/deer-flow-frontend-cache,mode=max -f scripts/images/deer-flow-frontend/Dockerfile -t deer-flow-frontend:cache . 2>/dev/null || echo " Warning: deer-flow-frontend cache export failed"
@echo " mineru..."
@docker buildx build --cache-to type=local,dest=$(CACHE_DIR)/buildkit/mineru-cache,mode=max -f scripts/images/mineru/Dockerfile -t datamate-mineru:cache . 2>/dev/null || echo " Warning: mineru cache export failed"
.PHONY: _offline-export-resources
_offline-export-resources:
@echo ""
@echo "3. 预下载外部资源..."
@mkdir -p $(CACHE_DIR)/resources/models
@echo " PaddleOCR model..."
@wget -q -O $(CACHE_DIR)/resources/models/ch_ppocr_mobile_v2.0_cls_infer.tar \
https://paddleocr.bj.bcebos.com/dygraph_v2.0/ch/ch_ppocr_mobile_v2.0_cls_infer.tar 2>/dev/null || echo " Warning: PaddleOCR model download failed"
@echo " spaCy model..."
@wget -q -O $(CACHE_DIR)/resources/models/zh_core_web_sm-3.8.0-py3-none-any.whl \
https://ghproxy.net/https://github.com/explosion/spacy-models/releases/download/zh_core_web_sm-3.8.0/zh_core_web_sm-3.8.0-py3-none-any.whl 2>/dev/null || echo " Warning: spaCy model download failed"
@echo " DataX source..."
@if [ ! -d "$(CACHE_DIR)/resources/DataX" ]; then \
git clone --depth 1 https://gitee.com/alibaba/DataX.git $(CACHE_DIR)/resources/DataX 2>/dev/null || echo " Warning: DataX clone failed"; \
fi
@echo " deer-flow source..."
@if [ ! -d "$(CACHE_DIR)/resources/deer-flow" ]; then \
git clone --depth 1 https://ghproxy.net/https://github.com/ModelEngine-Group/deer-flow.git $(CACHE_DIR)/resources/deer-flow 2>/dev/null || echo " Warning: deer-flow clone failed"; \
fi
.PHONY: _offline-package
_offline-package:
@echo ""
@echo "4. 打包缓存..."
@cd $(CACHE_DIR) && tar -czf "build-cache-$$(date +%Y%m%d).tar.gz" buildkit images resources 2>/dev/null && cd - > /dev/null
@echo ""
@echo "======================================"
@echo "✓ 缓存导出完成!"
@echo "======================================"
@echo "传输文件: $(CACHE_DIR)/build-cache-$$(date +%Y%m%d).tar.gz"
# ========== Offline build (air-gapped environment) ==========
.PHONY: offline-setup
offline-setup:
@echo "======================================"
@echo "设置离线构建环境..."
@echo "======================================"
@if [ ! -d "$(CACHE_DIR)" ]; then \
echo "查找并解压缓存包..."; \
cache_file=$$(ls -t build-cache-*.tar.gz 2>/dev/null | head -1); \
if [ -z "$$cache_file" ]; then \
echo "错误: 未找到缓存压缩包 (build-cache-*.tar.gz)"; \
exit 1; \
fi; \
echo "解压 $$cache_file..."; \
tar -xzf "$$cache_file"; \
else \
echo "缓存目录已存在: $(CACHE_DIR)"; \
fi
@echo ""
@echo "加载基础镜像..."
@if [ -f "$(CACHE_DIR)/images/base-images.tar" ]; then \
docker load -i $(CACHE_DIR)/images/base-images.tar; \
else \
echo "警告: 基础镜像文件不存在,假设已手动加载"; \
fi
@$(MAKE) ensure-buildx
@echo ""
@echo "✓ 离线环境准备完成"
.PHONY: offline-build
offline-build: offline-setup
@echo ""
@echo "======================================"
@echo "开始离线构建..."
@echo "======================================"
@$(MAKE) _offline-build-services
.PHONY: _offline-build-services
_offline-build-services: ensure-buildx
@echo ""
@echo "构建 datamate-database..."
@docker buildx build \
--cache-from type=local,src=$(CACHE_DIR)/buildkit/database-cache \
--pull=false \
-f scripts/images/database/Dockerfile \
-t datamate-database:$(OFFLINE_VERSION) \
--load . || echo " Failed"
@echo ""
@echo "构建 datamate-gateway..."
@docker buildx build \
--cache-from type=local,src=$(CACHE_DIR)/buildkit/gateway-cache \
--pull=false \
-f scripts/images/gateway/Dockerfile \
-t datamate-gateway:$(OFFLINE_VERSION) \
--load . || echo " Failed"
@echo ""
@echo "构建 datamate-backend..."
@docker buildx build \
--cache-from type=local,src=$(CACHE_DIR)/buildkit/backend-cache \
--pull=false \
-f scripts/images/backend/Dockerfile \
-t datamate-backend:$(OFFLINE_VERSION) \
--load . || echo " Failed"
@echo ""
@echo "构建 datamate-frontend..."
@docker buildx build \
--cache-from type=local,src=$(CACHE_DIR)/buildkit/frontend-cache \
--pull=false \
-f scripts/images/frontend/Dockerfile \
-t datamate-frontend:$(OFFLINE_VERSION) \
--load . || echo " Failed"
@echo ""
@echo "构建 datamate-runtime..."
@docker buildx build \
--cache-from type=local,src=$(CACHE_DIR)/buildkit/runtime-cache \
--pull=false \
--build-arg RESOURCES_DIR=$(CACHE_DIR)/resources \
-f scripts/images/runtime/Dockerfile \
-t datamate-runtime:$(OFFLINE_VERSION) \
--load . || echo " Failed"
@echo ""
@echo "构建 datamate-backend-python..."
@docker buildx build \
--cache-from type=local,src=$(CACHE_DIR)/buildkit/backend-python-cache \
--pull=false \
--build-arg RESOURCES_DIR=$(CACHE_DIR)/resources \
-f scripts/images/backend-python/Dockerfile \
-t datamate-backend-python:$(OFFLINE_VERSION) \
--load . || echo " Failed"
@echo ""
@echo "======================================"
@echo "✓ 离线构建完成"
@echo "======================================"
# Offline build for a single service (BuildKit)
.PHONY: %-offline-build
%-offline-build: offline-setup ensure-buildx
@echo "离线构建 $*..."
@if [ ! -d "$(CACHE_DIR)/buildkit/$*-cache" ]; then \
echo "错误: $* 的缓存不存在"; \
exit 1; \
fi
@$(eval IMAGE_NAME := $(if $(filter deer-flow%,$*),$*,datamate-$*))
@docker buildx build \
--cache-from type=local,src=$(CACHE_DIR)/buildkit/$*-cache \
--pull=false \
$(if $(filter runtime backend-python deer-flow%,$*),--build-arg RESOURCES_DIR=$(CACHE_DIR)/resources,) \
-f scripts/images/$*/Dockerfile \
-t $(IMAGE_NAME):$(OFFLINE_VERSION) \
--load .
# Classic Docker build (no BuildKit; more stable)
.PHONY: offline-build-classic
offline-build-classic: offline-setup
@echo "使用传统 docker build 进行离线构建..."
@bash scripts/offline/build-offline-classic.sh $(CACHE_DIR) $(OFFLINE_VERSION)
# Diagnose the offline environment
.PHONY: offline-diagnose
offline-diagnose:
@bash scripts/offline/diagnose.sh $(CACHE_DIR)
# Build base images with APT packages preinstalled (connected environment)
.PHONY: offline-build-base-images
offline-build-base-images:
@echo "构建 APT 预装基础镜像..."
@bash scripts/offline/build-base-images.sh $(CACHE_DIR)
# Offline build using the preinstalled base images (recommended)
.PHONY: offline-build-final
offline-build-final: offline-setup
@echo "使用预装 APT 包的基础镜像进行离线构建..."
@bash scripts/offline/build-offline-final.sh $(CACHE_DIR) $(OFFLINE_VERSION)
# Full offline export (including APT-preinstalled base images)
.PHONY: offline-export-full
offline-export-full:
@echo "======================================"
@echo "完整离线缓存导出(含 APT 预装基础镜像)"
@echo "======================================"
@$(MAKE) offline-build-base-images
@$(MAKE) offline-export
@echo ""
@echo "导出完成!传输时请包含以下文件:"
@echo " - build-cache/images/base-images-with-apt.tar"
@echo " - build-cache-YYYYMMDD.tar.gz"
# ========== Help ==========
.PHONY: help-offline
help-offline:
@echo "离线构建命令:"
@echo ""
@echo "【有网环境】"
@echo " make offline-export [CACHE_DIR=./build-cache] - 导出构建缓存"
@echo " make offline-export-full - 导出完整缓存(含 APT 预装基础镜像)"
@echo " make offline-build-base-images - 构建 APT 预装基础镜像"
@echo ""
@echo "【无网环境】"
@echo " make offline-setup [CACHE_DIR=./build-cache] - 解压并准备离线缓存"
@echo " make offline-build-final - 使用预装基础镜像构建(推荐,解决 APT 问题)"
@echo " make offline-build-classic - 使用传统 docker build"
@echo " make offline-build - 使用 BuildKit 构建"
@echo " make offline-diagnose - 诊断离线构建环境"
@echo " make <service>-offline-build - 离线构建单个服务"
@echo ""
@echo "【完整工作流程(推荐)】"
@echo " # 1. 有网环境导出完整缓存"
@echo " make offline-export-full"
@echo ""
@echo " # 2. 传输到无网环境(需要传输两个文件)"
@echo " scp build-cache/images/base-images-with-apt.tar user@offline-server:/path/"
@echo " scp build-cache-*.tar.gz user@offline-server:/path/"
@echo ""
@echo " # 3. 无网环境构建"
@echo " tar -xzf build-cache-*.tar.gz"
@echo " docker load -i build-cache/images/base-images-with-apt.tar"
@echo " make offline-build-final"

View File

@@ -470,6 +470,23 @@ paths:
       '200':
         description: Upload succeeded
+  /data-management/datasets/upload/cancel-upload/{reqId}:
+    put:
+      tags: [ DatasetFile ]
+      operationId: cancelUpload
+      summary: Cancel upload
+      description: Cancel a pre-upload request and clean up temporary chunks
+      parameters:
+        - name: reqId
+          in: path
+          required: true
+          schema:
+            type: string
+          description: Pre-upload request ID
+      responses:
+        '200':
+          description: Cancel succeeded
   /data-management/dataset-types:
     get:
       operationId: getDatasetTypes

View File

@@ -1,5 +1,6 @@
 package com.datamate.datamanagement.application;
+import com.baomidou.mybatisplus.core.conditions.update.LambdaUpdateWrapper;
 import com.baomidou.mybatisplus.core.metadata.IPage;
 import com.baomidou.mybatisplus.extension.plugins.pagination.Page;
 import com.datamate.common.domain.utils.ChunksSaver;
@@ -19,8 +20,11 @@ import com.datamate.datamanagement.infrastructure.exception.DataManagementErrorC
 import com.datamate.datamanagement.infrastructure.persistence.mapper.TagMapper;
 import com.datamate.datamanagement.infrastructure.persistence.repository.DatasetFileRepository;
 import com.datamate.datamanagement.infrastructure.persistence.repository.DatasetRepository;
+import com.datamate.datamanagement.infrastructure.persistence.repository.dto.DatasetFileCount;
 import com.datamate.datamanagement.interfaces.converter.DatasetConverter;
 import com.datamate.datamanagement.interfaces.dto.*;
+import com.fasterxml.jackson.core.type.TypeReference;
+import com.fasterxml.jackson.databind.ObjectMapper;
 import lombok.RequiredArgsConstructor;
 import lombok.extern.slf4j.Slf4j;
 import org.apache.commons.collections4.CollectionUtils;
@@ -53,6 +57,7 @@ public class DatasetApplicationService {
     private static final int SIMILAR_DATASET_MAX_LIMIT = 50;
     private static final int SIMILAR_DATASET_CANDIDATE_FACTOR = 5;
     private static final int SIMILAR_DATASET_CANDIDATE_MAX = 100;
+    private static final String DERIVED_METADATA_KEY = "derived_from_file_id";
     private final DatasetRepository datasetRepository;
     private final TagMapper tagMapper;
     private final DatasetFileRepository datasetFileRepository;
@@ -97,6 +102,7 @@ public class DatasetApplicationService {
     public Dataset updateDataset(String datasetId, UpdateDatasetRequest updateDatasetRequest) {
         Dataset dataset = datasetRepository.getById(datasetId);
         BusinessAssert.notNull(dataset, DataManagementErrorCode.DATASET_NOT_FOUND);
         if (StringUtils.hasText(updateDatasetRequest.getName())) {
             dataset.setName(updateDatasetRequest.getName());
         }
@@ -109,13 +115,31 @@ public class DatasetApplicationService {
         if (Objects.nonNull(updateDatasetRequest.getStatus())) {
             dataset.setStatus(updateDatasetRequest.getStatus());
         }
-        if (updateDatasetRequest.getParentDatasetId() != null) {
+        if (updateDatasetRequest.isParentDatasetIdProvided()) {
+            // Remember the original parentDatasetId so we can tell whether it changed
+            String originalParentDatasetId = dataset.getParentDatasetId();
+            // Handle the parent change only when the request explicitly carries parentDatasetId;
+            // handleParentChange normalizes both empty strings and null to null via normalizeParentId,
+            // which supports setting a new parent as well as clearing the association
             handleParentChange(dataset, updateDatasetRequest.getParentDatasetId());
+            // If parentDatasetId changed, update it explicitly with a LambdaUpdateWrapper
+            // so the column is written even when the new value is null
+            if (!Objects.equals(originalParentDatasetId, dataset.getParentDatasetId())) {
+                datasetRepository.update(null, new LambdaUpdateWrapper<Dataset>()
+                        .eq(Dataset::getId, datasetId)
+                        .set(Dataset::getParentDatasetId, dataset.getParentDatasetId()));
+            }
         }
         if (StringUtils.hasText(updateDatasetRequest.getDataSource())) {
             // The data source id is present: scan the files and persist them asynchronously
             processDataSourceAsync(dataset.getId(), updateDatasetRequest.getDataSource());
         }
+        // Update the remaining fields (parentDatasetId excluded; it was handled above)
         datasetRepository.updateById(dataset);
         return dataset;
     }
@@ -142,6 +166,7 @@ public class DatasetApplicationService {
         BusinessAssert.notNull(dataset, DataManagementErrorCode.DATASET_NOT_FOUND);
         List<DatasetFile> datasetFiles = datasetFileRepository.findAllByDatasetId(datasetId);
         dataset.setFiles(datasetFiles);
+        applyVisibleFileCounts(Collections.singletonList(dataset));
         return dataset;
     }
@@ -153,6 +178,7 @@ public class DatasetApplicationService {
         IPage<Dataset> page = new Page<>(query.getPage(), query.getSize());
         page = datasetRepository.findByCriteria(page, query);
         String datasetPvcName = getDatasetPvcName();
+        applyVisibleFileCounts(page.getRecords());
         List<DatasetResponse> datasetResponses = DatasetConverter.INSTANCE.convertToResponse(page.getRecords());
         datasetResponses.forEach(dataset -> dataset.setPvcName(datasetPvcName));
         return PagedResponse.of(datasetResponses, page.getCurrent(), page.getTotal(), page.getPages());
@@ -200,6 +226,7 @@ public class DatasetApplicationService {
             })
             .limit(safeLimit)
             .toList();
+        applyVisibleFileCounts(sorted);
         List<DatasetResponse> responses = DatasetConverter.INSTANCE.convertToResponse(sorted);
         responses.forEach(item -> item.setPvcName(datasetPvcName));
         return responses;
@@ -345,6 +372,61 @@ public class DatasetApplicationService {
         dataset.setPath(newPath);
     }
+    private void applyVisibleFileCounts(List<Dataset> datasets) {
+        if (CollectionUtils.isEmpty(datasets)) {
+            return;
+        }
+        List<String> datasetIds = datasets.stream()
+                .filter(Objects::nonNull)
+                .map(Dataset::getId)
+                .filter(StringUtils::hasText)
+                .toList();
+        if (datasetIds.isEmpty()) {
+            return;
+        }
+        Map<String, Long> countMap = datasetFileRepository.countNonDerivedByDatasetIds(datasetIds).stream()
+                .filter(Objects::nonNull)
+                .collect(Collectors.toMap(
+                        DatasetFileCount::getDatasetId,
+                        count -> Optional.ofNullable(count.getFileCount()).orElse(0L),
+                        (left, right) -> left
+                ));
+        for (Dataset dataset : datasets) {
+            if (dataset == null || !StringUtils.hasText(dataset.getId())) {
+                continue;
+            }
+            Long visibleCount = countMap.get(dataset.getId());
+            dataset.setFileCount(visibleCount != null ? visibleCount : 0L);
+        }
+    }
+    private List<DatasetFile> filterVisibleFiles(List<DatasetFile> files) {
+        if (CollectionUtils.isEmpty(files)) {
+            return Collections.emptyList();
+        }
+        return files.stream()
+                .filter(file -> !isDerivedFile(file))
+                .collect(Collectors.toList());
+    }
+    private boolean isDerivedFile(DatasetFile datasetFile) {
+        if (datasetFile == null) {
+            return false;
+        }
+        String metadata = datasetFile.getMetadata();
+        if (!StringUtils.hasText(metadata)) {
+            return false;
+        }
+        try {
+            ObjectMapper mapper = new ObjectMapper();
+            Map<String, Object> metadataMap = mapper.readValue(metadata, new TypeReference<Map<String, Object>>() {});
+            return metadataMap.get(DERIVED_METADATA_KEY) != null;
+        } catch (Exception e) {
+            log.debug("Failed to parse dataset file metadata for derived detection: {}", datasetFile.getId(), e);
+            return false;
+        }
+    }
     /**
      * Get dataset statistics
      */
@@ -357,27 +439,29 @@ public class DatasetApplicationService {
         Map<String, Object> statistics = new HashMap<>();
-        // Basic statistics
-        Long totalFiles = datasetFileRepository.countByDatasetId(datasetId);
-        Long completedFiles = datasetFileRepository.countCompletedByDatasetId(datasetId);
+        List<DatasetFile> allFiles = datasetFileRepository.findAllByDatasetId(datasetId);
+        List<DatasetFile> visibleFiles = filterVisibleFiles(allFiles);
+        long totalFiles = visibleFiles.size();
+        long completedFiles = visibleFiles.stream()
+                .filter(file -> "COMPLETED".equalsIgnoreCase(file.getStatus()))
+                .count();
         Long totalSize = datasetFileRepository.sumSizeByDatasetId(datasetId);
-        statistics.put("totalFiles", totalFiles != null ? totalFiles.intValue() : 0);
-        statistics.put("completedFiles", completedFiles != null ? completedFiles.intValue() : 0);
+        statistics.put("totalFiles", (int) totalFiles);
+        statistics.put("completedFiles", (int) completedFiles);
         statistics.put("totalSize", totalSize != null ? totalSize : 0L);
         // Completion rate
         float completionRate = 0.0f;
-        if (totalFiles != null && totalFiles > 0) {
-            completionRate = (completedFiles != null ? completedFiles.floatValue() : 0.0f) / totalFiles.floatValue() * 100.0f;
+        if (totalFiles > 0) {
+            completionRate = ((float) completedFiles) / (float) totalFiles * 100.0f;
         }
         statistics.put("completionRate", completionRate);
         // File type distribution
         Map<String, Integer> fileTypeDistribution = new HashMap<>();
-        List<DatasetFile> allFiles = datasetFileRepository.findAllByDatasetId(datasetId);
-        if (allFiles != null) {
-            for (DatasetFile file : allFiles) {
+        if (!visibleFiles.isEmpty()) {
+            for (DatasetFile file : visibleFiles) {
                 String fileType = file.getFileType() != null ? file.getFileType() : "unknown";
                 fileTypeDistribution.put(fileType, fileTypeDistribution.getOrDefault(fileType, 0) + 1);
             }
@@ -386,8 +470,8 @@ public class DatasetApplicationService {
         // Status distribution
         Map<String, Integer> statusDistribution = new HashMap<>();
-        if (allFiles != null) {
-            for (DatasetFile file : allFiles) {
+        if (!visibleFiles.isEmpty()) {
+            for (DatasetFile file : visibleFiles) {
                 String status = file.getStatus() != null ? file.getStatus() : "unknown";
                 statusDistribution.put(status, statusDistribution.getOrDefault(status, 0) + 1);
             }

View File

@@ -43,6 +43,8 @@ import org.springframework.core.io.UrlResource;
import org.springframework.http.HttpHeaders; import org.springframework.http.HttpHeaders;
import org.springframework.stereotype.Service; import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional; import org.springframework.transaction.annotation.Transactional;
import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;
import java.io.File; import java.io.File;
import java.io.IOException; import java.io.IOException;
@@ -86,6 +88,7 @@ public class DatasetFileApplicationService {
private final DatasetRepository datasetRepository; private final DatasetRepository datasetRepository;
private final FileService fileService; private final FileService fileService;
private final PdfTextExtractAsyncService pdfTextExtractAsyncService; private final PdfTextExtractAsyncService pdfTextExtractAsyncService;
private final DatasetFilePreviewService datasetFilePreviewService;
@Value("${datamate.data-management.base-path:/dataset}") @Value("${datamate.data-management.base-path:/dataset}")
private String datasetBasePath; private String datasetBasePath;
@@ -97,11 +100,13 @@ public class DatasetFileApplicationService {
public DatasetFileApplicationService(DatasetFileRepository datasetFileRepository, public DatasetFileApplicationService(DatasetFileRepository datasetFileRepository,
DatasetRepository datasetRepository, DatasetRepository datasetRepository,
FileService fileService, FileService fileService,
PdfTextExtractAsyncService pdfTextExtractAsyncService) { PdfTextExtractAsyncService pdfTextExtractAsyncService,
DatasetFilePreviewService datasetFilePreviewService) {
this.datasetFileRepository = datasetFileRepository; this.datasetFileRepository = datasetFileRepository;
this.datasetRepository = datasetRepository; this.datasetRepository = datasetRepository;
this.fileService = fileService; this.fileService = fileService;
this.pdfTextExtractAsyncService = pdfTextExtractAsyncService; this.pdfTextExtractAsyncService = pdfTextExtractAsyncService;
this.datasetFilePreviewService = datasetFilePreviewService;
} }
/** /**
@@ -160,18 +165,31 @@ public class DatasetFileApplicationService {
String datasetPath = dataset.getPath(); String datasetPath = dataset.getPath();
Path queryPath = Path.of(dataset.getPath() + File.separator + prefix); Path queryPath = Path.of(dataset.getPath() + File.separator + prefix);
Map<String, DatasetFile> datasetFilesMap = datasetFileRepository.findAllByDatasetId(datasetId) Map<String, DatasetFile> datasetFilesMap = datasetFileRepository.findAllByDatasetId(datasetId)
.stream().collect(Collectors.toMap(DatasetFile::getFilePath, Function.identity())); .stream()
.filter(file -> file.getFilePath() != null)
.collect(Collectors.toMap(
file -> normalizeFilePath(file.getFilePath()),
Function.identity(),
(left, right) -> left
));
Set<String> derivedFilePaths = excludeDerivedFiles Set<String> derivedFilePaths = excludeDerivedFiles
? datasetFilesMap.values().stream() ? datasetFilesMap.values().stream()
.filter(this::isDerivedFile) .filter(this::isDerivedFile)
.map(DatasetFile::getFilePath) .map(DatasetFile::getFilePath)
.map(this::normalizeFilePath)
.filter(Objects::nonNull) .filter(Objects::nonNull)
.collect(Collectors.toSet()) .collect(Collectors.toSet())
: Collections.emptySet(); : Collections.emptySet();
// 如果目录不存在,直接返回空结果(数据集刚创建时目录可能还未生成)
if (!Files.exists(queryPath)) {
return new PagedResponse<>(page, size, 0, 0, Collections.emptyList());
}
try (Stream<Path> pathStream = Files.list(queryPath)) { try (Stream<Path> pathStream = Files.list(queryPath)) {
List<Path> allFiles = pathStream List<Path> allFiles = pathStream
.filter(path -> path.toString().startsWith(datasetPath)) .filter(path -> path.toString().startsWith(datasetPath))
.filter(path -> !excludeDerivedFiles || Files.isDirectory(path) || !derivedFilePaths.contains(path.toString())) .filter(path -> !excludeDerivedFiles
|| Files.isDirectory(path)
|| !derivedFilePaths.contains(normalizeFilePath(path.toString())))
.sorted(Comparator .sorted(Comparator
.comparing((Path path) -> !Files.isDirectory(path)) .comparing((Path path) -> !Files.isDirectory(path))
.thenComparing(path -> path.getFileName().toString())) .thenComparing(path -> path.getFileName().toString()))
@@ -190,7 +208,9 @@ public class DatasetFileApplicationService {
if (fromIndex < total) { if (fromIndex < total) {
pageData = allFiles.subList(fromIndex, toIndex); pageData = allFiles.subList(fromIndex, toIndex);
} }
List<DatasetFile> datasetFiles = pageData.stream().map(path -> getDatasetFile(path, datasetFilesMap)).toList(); List<DatasetFile> datasetFiles = pageData.stream()
.map(path -> getDatasetFile(path, datasetFilesMap, excludeDerivedFiles, derivedFilePaths))
.toList();
return new PagedResponse<>(page, size, total, totalPages, datasetFiles); return new PagedResponse<>(page, size, total, totalPages, datasetFiles);
} catch (IOException e) { } catch (IOException e) {
@@ -199,7 +219,10 @@ public class DatasetFileApplicationService {
} }
} }
private DatasetFile getDatasetFile(Path path, Map<String, DatasetFile> datasetFilesMap) { private DatasetFile getDatasetFile(Path path,
Map<String, DatasetFile> datasetFilesMap,
boolean excludeDerivedFiles,
Set<String> derivedFilePaths) {
DatasetFile datasetFile = new DatasetFile(); DatasetFile datasetFile = new DatasetFile();
LocalDateTime localDateTime = LocalDateTime.now(); LocalDateTime localDateTime = LocalDateTime.now();
try { try {
@@ -221,12 +244,21 @@ public class DatasetFileApplicationService {
long totalSize; long totalSize;
try (Stream<Path> walk = Files.walk(path)) { try (Stream<Path> walk = Files.walk(path)) {
fileCount = walk.filter(Files::isRegularFile).count(); Stream<Path> fileStream = walk.filter(Files::isRegularFile);
if (excludeDerivedFiles && !derivedFilePaths.isEmpty()) {
fileStream = fileStream.filter(filePath ->
!derivedFilePaths.contains(normalizeFilePath(filePath.toString())));
}
fileCount = fileStream.count();
} }
try (Stream<Path> walk = Files.walk(path)) { try (Stream<Path> walk = Files.walk(path)) {
totalSize = walk Stream<Path> fileStream = walk.filter(Files::isRegularFile);
.filter(Files::isRegularFile) if (excludeDerivedFiles && !derivedFilePaths.isEmpty()) {
fileStream = fileStream.filter(filePath ->
!derivedFilePaths.contains(normalizeFilePath(filePath.toString())));
}
totalSize = fileStream
.mapToLong(p -> { .mapToLong(p -> {
try { try {
return Files.size(p); return Files.size(p);
@@ -244,7 +276,7 @@ public class DatasetFileApplicationService {
log.error("stat directory info error", e); log.error("stat directory info error", e);
} }
} else { } else {
DatasetFile exist = datasetFilesMap.get(path.toString()); DatasetFile exist = datasetFilesMap.get(normalizeFilePath(path.toString()));
if (exist == null) { if (exist == null) {
datasetFile.setId("file-" + datasetFile.getFileName()); datasetFile.setId("file-" + datasetFile.getFileName());
datasetFile.setFileSize(path.toFile().length()); datasetFile.setFileSize(path.toFile().length());
@@ -255,6 +287,17 @@ public class DatasetFileApplicationService {
return datasetFile; return datasetFile;
} }
private String normalizeFilePath(String filePath) {
if (filePath == null || filePath.isBlank()) {
return null;
}
try {
return Paths.get(filePath).toAbsolutePath().normalize().toString();
} catch (Exception e) {
return filePath.replace("\\", "/");
}
}
private boolean isSourceDocument(DatasetFile datasetFile) { private boolean isSourceDocument(DatasetFile datasetFile) {
if (datasetFile == null) { if (datasetFile == null) {
return false; return false;
@@ -310,6 +353,7 @@ public class DatasetFileApplicationService {
datasetFileRepository.removeById(fileId);
dataset.removeFile(file);
datasetRepository.updateById(dataset);
datasetFilePreviewService.deletePreviewFileQuietly(datasetId, fileId);
// On deletion, a file uploaded into the dataset loses both its database record and its copy on the file system, while a collected file only loses its database record
if (file.getFilePath().startsWith(dataset.getPath())) {
try {
@@ -461,6 +505,14 @@ public class DatasetFileApplicationService {
saveFileInfoToDb(uploadResult, datasetId);
}
/**
 * Cancel an upload
 */
@Transactional
public void cancelUpload(String reqId) {
fileService.cancelUpload(reqId);
}
private void saveFileInfoToDb(FileUploadResult fileUploadResult, String datasetId) {
if (Objects.isNull(fileUploadResult.getSavedFile())) {
// The chunked file upload has not completed yet
@@ -682,6 +734,7 @@ public class DatasetFileApplicationService {
for (DatasetFile file : filesToDelete) {
datasetFileRepository.removeById(file.getId());
datasetFilePreviewService.deletePreviewFileQuietly(datasetId, file.getId());
}
// Delete the directory from the file system
@@ -961,6 +1014,20 @@ public class DatasetFileApplicationService {
if (fileType == null || !DOCUMENT_TEXT_FILE_TYPES.contains(fileType.toLowerCase(Locale.ROOT))) {
return;
}
-pdfTextExtractAsyncService.extractPdfText(dataset.getId(), datasetFile.getId());
+String datasetId = dataset.getId();
+String fileId = datasetFile.getId();
+if (datasetId == null || fileId == null) {
+return;
+}
+if (TransactionSynchronizationManager.isSynchronizationActive()) {
+TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronization() {
+@Override
+public void afterCommit() {
+pdfTextExtractAsyncService.extractPdfText(datasetId, fileId);
+}
+});
+return;
+}
+pdfTextExtractAsyncService.extractPdfText(datasetId, fileId);
}
}
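Note on the last hunk above: registering a TransactionSynchronization defers the asynchronous PDF extraction until the surrounding transaction commits, so the @Async worker can never query a dataset-file row that is not yet visible; when no transaction is active, extraction runs immediately. A minimal, hypothetical helper capturing the same pattern (the class name and its reuse here are assumptions, not part of this change set):

import org.springframework.transaction.support.TransactionSynchronization;
import org.springframework.transaction.support.TransactionSynchronizationManager;

/** Hypothetical sketch: run a task after commit if a transaction is active, otherwise immediately. */
final class AfterCommitRunner {
    private AfterCommitRunner() {
    }

    static void run(Runnable task) {
        if (TransactionSynchronizationManager.isSynchronizationActive()) {
            TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronization() {
                @Override
                public void afterCommit() {
                    task.run(); // fires only after the enclosing transaction commits
                }
            });
            return;
        }
        task.run(); // no active transaction: run right away
    }
}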

View File

@@ -0,0 +1,171 @@
package com.datamate.datamanagement.application;
import com.datamate.datamanagement.common.enums.KnowledgeItemPreviewStatus;
import com.datamate.datamanagement.domain.model.dataset.DatasetFile;
import com.datamate.datamanagement.infrastructure.config.DataManagementProperties;
import com.datamate.datamanagement.infrastructure.persistence.repository.DatasetFileRepository;
import com.fasterxml.jackson.databind.ObjectMapper;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Set;
/**
* Asynchronous task for converting dataset file previews
*/
@Service
@RequiredArgsConstructor
@Slf4j
public class DatasetFilePreviewAsyncService {
private static final Set<String> OFFICE_EXTENSIONS = Set.of("doc", "docx");
private static final String DATASET_PREVIEW_DIR = "dataset-previews";
private static final String PREVIEW_FILE_SUFFIX = ".pdf";
private static final String PATH_SEPARATOR = "/";
private static final int MAX_ERROR_LENGTH = 500;
private static final DateTimeFormatter PREVIEW_TIME_FORMATTER = DateTimeFormatter.ISO_LOCAL_DATE_TIME;
private final DatasetFileRepository datasetFileRepository;
private final DataManagementProperties dataManagementProperties;
private final ObjectMapper objectMapper = new ObjectMapper();
@Async
public void convertPreviewAsync(String fileId) {
if (StringUtils.isBlank(fileId)) {
return;
}
DatasetFile file = datasetFileRepository.getById(fileId);
if (file == null) {
return;
}
String extension = resolveFileExtension(resolveOriginalName(file));
if (!OFFICE_EXTENSIONS.contains(extension)) {
updatePreviewStatus(file, KnowledgeItemPreviewStatus.FAILED, null, "仅支持 DOC/DOCX 转换");
return;
}
if (StringUtils.isBlank(file.getFilePath())) {
updatePreviewStatus(file, KnowledgeItemPreviewStatus.FAILED, null, "源文件路径为空");
return;
}
Path sourcePath = Paths.get(file.getFilePath()).toAbsolutePath().normalize();
if (!Files.exists(sourcePath) || !Files.isRegularFile(sourcePath)) {
updatePreviewStatus(file, KnowledgeItemPreviewStatus.FAILED, null, "源文件不存在");
return;
}
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(file.getMetadata(), objectMapper);
String previewRelativePath = StringUtils.defaultIfBlank(
previewInfo.pdfPath(),
resolvePreviewRelativePath(file.getDatasetId(), file.getId())
);
Path targetPath = resolvePreviewStoragePath(previewRelativePath);
try {
ensureParentDirectory(targetPath);
LibreOfficeConverter.convertToPdf(sourcePath, targetPath);
updatePreviewStatus(file, KnowledgeItemPreviewStatus.READY, previewRelativePath, null);
} catch (Exception e) {
log.error("dataset preview convert failed, fileId: {}", file.getId(), e);
updatePreviewStatus(file, KnowledgeItemPreviewStatus.FAILED, previewRelativePath, trimError(e.getMessage()));
}
}
private void updatePreviewStatus(
DatasetFile file,
KnowledgeItemPreviewStatus status,
String previewRelativePath,
String error
) {
if (file == null) {
return;
}
String updatedMetadata = KnowledgeItemPreviewMetadataHelper.applyPreviewInfo(
file.getMetadata(),
objectMapper,
status,
previewRelativePath,
error,
nowText()
);
file.setMetadata(updatedMetadata);
datasetFileRepository.updateById(file);
}
private String resolveOriginalName(DatasetFile file) {
if (file == null) {
return "";
}
if (StringUtils.isNotBlank(file.getFileName())) {
return file.getFileName();
}
if (StringUtils.isNotBlank(file.getFilePath())) {
return Paths.get(file.getFilePath()).getFileName().toString();
}
return "";
}
private String resolveFileExtension(String fileName) {
if (StringUtils.isBlank(fileName)) {
return "";
}
int dotIndex = fileName.lastIndexOf('.');
if (dotIndex <= 0 || dotIndex >= fileName.length() - 1) {
return "";
}
return fileName.substring(dotIndex + 1).toLowerCase();
}
private String resolvePreviewRelativePath(String datasetId, String fileId) {
String relativePath = Paths.get(DATASET_PREVIEW_DIR, datasetId, fileId + PREVIEW_FILE_SUFFIX)
.toString();
return relativePath.replace("\\", PATH_SEPARATOR);
}
private Path resolvePreviewStoragePath(String relativePath) {
String normalizedRelativePath = StringUtils.defaultString(relativePath).replace("/", java.io.File.separator);
Path root = resolveUploadRootPath();
Path target = root.resolve(normalizedRelativePath).toAbsolutePath().normalize();
if (!target.startsWith(root)) {
throw new IllegalArgumentException("invalid preview path");
}
return target;
}
private Path resolveUploadRootPath() {
String uploadDir = dataManagementProperties.getFileStorage().getUploadDir();
return Paths.get(uploadDir).toAbsolutePath().normalize();
}
private void ensureParentDirectory(Path targetPath) {
try {
Path parent = targetPath.getParent();
if (parent != null) {
Files.createDirectories(parent);
}
} catch (Exception e) {
throw new IllegalStateException("创建预览目录失败", e);
}
}
private String trimError(String error) {
if (StringUtils.isBlank(error)) {
return "";
}
if (error.length() <= MAX_ERROR_LENGTH) {
return error;
}
return error.substring(0, MAX_ERROR_LENGTH);
}
private String nowText() {
return LocalDateTime.now().format(PREVIEW_TIME_FORMATTER);
}
}
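The @Async method above only leaves the caller's thread if async execution is enabled somewhere in the Spring context. A minimal sketch, assuming no such configuration already exists elsewhere (executor sizing is illustrative; binding this pool to the service would additionally need @Async("previewTaskExecutor") or an AsyncConfigurer):

import java.util.concurrent.Executor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
@EnableAsync
class PreviewAsyncConfig {

    /** Dedicated pool so long-running LibreOffice conversions do not starve other @Async work. */
    @Bean
    Executor previewTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(2); // illustrative sizing
        executor.setMaxPoolSize(4);
        executor.setQueueCapacity(100);
        executor.setThreadNamePrefix("preview-");
        executor.initialize();
        return executor;
    }
}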

View File

@@ -0,0 +1,233 @@
package com.datamate.datamanagement.application;
import com.datamate.common.infrastructure.exception.BusinessAssert;
import com.datamate.common.infrastructure.exception.CommonErrorCode;
import com.datamate.datamanagement.common.enums.KnowledgeItemPreviewStatus;
import com.datamate.datamanagement.domain.model.dataset.DatasetFile;
import com.datamate.datamanagement.infrastructure.config.DataManagementProperties;
import com.datamate.datamanagement.infrastructure.persistence.repository.DatasetFileRepository;
import com.datamate.datamanagement.interfaces.dto.DatasetFilePreviewStatusResponse;
import com.fasterxml.jackson.databind.ObjectMapper;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.springframework.stereotype.Service;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Objects;
import java.util.Set;
/**
* Dataset file preview conversion service
*/
@Service
@RequiredArgsConstructor
@Slf4j
public class DatasetFilePreviewService {
private static final Set<String> OFFICE_EXTENSIONS = Set.of("doc", "docx");
private static final String DATASET_PREVIEW_DIR = "dataset-previews";
private static final String PREVIEW_FILE_SUFFIX = ".pdf";
private static final String PATH_SEPARATOR = "/";
private static final DateTimeFormatter PREVIEW_TIME_FORMATTER = DateTimeFormatter.ISO_LOCAL_DATE_TIME;
private final DatasetFileRepository datasetFileRepository;
private final DataManagementProperties dataManagementProperties;
private final DatasetFilePreviewAsyncService datasetFilePreviewAsyncService;
private final ObjectMapper objectMapper = new ObjectMapper();
public DatasetFilePreviewStatusResponse getPreviewStatus(String datasetId, String fileId) {
DatasetFile file = requireDatasetFile(datasetId, fileId);
assertOfficeDocument(file);
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(file.getMetadata(), objectMapper);
if (previewInfo.status() == KnowledgeItemPreviewStatus.READY && !previewPdfExists(file, previewInfo)) {
previewInfo = markPreviewFailed(file, previewInfo, "预览文件不存在");
}
return buildResponse(previewInfo);
}
public DatasetFilePreviewStatusResponse ensurePreview(String datasetId, String fileId) {
DatasetFile file = requireDatasetFile(datasetId, fileId);
assertOfficeDocument(file);
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(file.getMetadata(), objectMapper);
if (previewInfo.status() == KnowledgeItemPreviewStatus.READY && previewPdfExists(file, previewInfo)) {
return buildResponse(previewInfo);
}
if (previewInfo.status() == KnowledgeItemPreviewStatus.PROCESSING) {
return buildResponse(previewInfo);
}
String previewRelativePath = resolvePreviewRelativePath(file.getDatasetId(), file.getId());
String updatedMetadata = KnowledgeItemPreviewMetadataHelper.applyPreviewInfo(
file.getMetadata(),
objectMapper,
KnowledgeItemPreviewStatus.PROCESSING,
previewRelativePath,
null,
nowText()
);
file.setMetadata(updatedMetadata);
datasetFileRepository.updateById(file);
datasetFilePreviewAsyncService.convertPreviewAsync(file.getId());
KnowledgeItemPreviewMetadataHelper.PreviewInfo refreshed = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(updatedMetadata, objectMapper);
return buildResponse(refreshed);
}
public boolean isOfficeDocument(String fileName) {
String extension = resolveFileExtension(fileName);
return StringUtils.isNotBlank(extension) && OFFICE_EXTENSIONS.contains(extension.toLowerCase());
}
public PreviewFile resolveReadyPreviewFile(String datasetId, DatasetFile file) {
if (file == null) {
return null;
}
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(file.getMetadata(), objectMapper);
if (previewInfo.status() != KnowledgeItemPreviewStatus.READY) {
return null;
}
String relativePath = StringUtils.defaultIfBlank(previewInfo.pdfPath(), resolvePreviewRelativePath(datasetId, file.getId()));
Path filePath = resolvePreviewStoragePath(relativePath);
if (!Files.exists(filePath) || !Files.isRegularFile(filePath)) {
markPreviewFailed(file, previewInfo, "预览文件不存在");
return null;
}
String previewName = resolvePreviewPdfName(file);
return new PreviewFile(filePath, previewName);
}
public void deletePreviewFileQuietly(String datasetId, String fileId) {
String relativePath = resolvePreviewRelativePath(datasetId, fileId);
Path filePath = resolvePreviewStoragePath(relativePath);
try {
Files.deleteIfExists(filePath);
} catch (Exception e) {
log.warn("delete dataset preview pdf error, fileId: {}", fileId, e);
}
}
private DatasetFilePreviewStatusResponse buildResponse(KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo) {
DatasetFilePreviewStatusResponse response = new DatasetFilePreviewStatusResponse();
KnowledgeItemPreviewStatus status = previewInfo.status() == null
? KnowledgeItemPreviewStatus.PENDING
: previewInfo.status();
response.setStatus(status);
response.setPreviewError(previewInfo.error());
response.setUpdatedAt(previewInfo.updatedAt());
return response;
}
private DatasetFile requireDatasetFile(String datasetId, String fileId) {
BusinessAssert.isTrue(StringUtils.isNotBlank(datasetId), CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(StringUtils.isNotBlank(fileId), CommonErrorCode.PARAM_ERROR);
DatasetFile datasetFile = datasetFileRepository.getById(fileId);
BusinessAssert.notNull(datasetFile, CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(Objects.equals(datasetFile.getDatasetId(), datasetId), CommonErrorCode.PARAM_ERROR);
return datasetFile;
}
private void assertOfficeDocument(DatasetFile file) {
BusinessAssert.notNull(file, CommonErrorCode.PARAM_ERROR);
String extension = resolveFileExtension(resolveOriginalName(file));
BusinessAssert.isTrue(OFFICE_EXTENSIONS.contains(extension), CommonErrorCode.PARAM_ERROR);
}
private String resolveOriginalName(DatasetFile file) {
if (file == null) {
return "";
}
if (StringUtils.isNotBlank(file.getFileName())) {
return file.getFileName();
}
if (StringUtils.isNotBlank(file.getFilePath())) {
return Paths.get(file.getFilePath()).getFileName().toString();
}
return "";
}
private String resolveFileExtension(String fileName) {
if (StringUtils.isBlank(fileName)) {
return "";
}
int dotIndex = fileName.lastIndexOf('.');
if (dotIndex <= 0 || dotIndex >= fileName.length() - 1) {
return "";
}
return fileName.substring(dotIndex + 1).toLowerCase();
}
private String resolvePreviewPdfName(DatasetFile file) {
String originalName = resolveOriginalName(file);
if (StringUtils.isBlank(originalName)) {
return "预览.pdf";
}
int dotIndex = originalName.lastIndexOf('.');
if (dotIndex <= 0) {
return originalName + PREVIEW_FILE_SUFFIX;
}
return originalName.substring(0, dotIndex) + PREVIEW_FILE_SUFFIX;
}
private boolean previewPdfExists(DatasetFile file, KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo) {
String relativePath = StringUtils.defaultIfBlank(previewInfo.pdfPath(), resolvePreviewRelativePath(file.getDatasetId(), file.getId()));
Path filePath = resolvePreviewStoragePath(relativePath);
return Files.exists(filePath) && Files.isRegularFile(filePath);
}
private KnowledgeItemPreviewMetadataHelper.PreviewInfo markPreviewFailed(
DatasetFile file,
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo,
String error
) {
String relativePath = StringUtils.defaultIfBlank(previewInfo.pdfPath(), resolvePreviewRelativePath(file.getDatasetId(), file.getId()));
String updatedMetadata = KnowledgeItemPreviewMetadataHelper.applyPreviewInfo(
file.getMetadata(),
objectMapper,
KnowledgeItemPreviewStatus.FAILED,
relativePath,
error,
nowText()
);
file.setMetadata(updatedMetadata);
datasetFileRepository.updateById(file);
return KnowledgeItemPreviewMetadataHelper.readPreviewInfo(updatedMetadata, objectMapper);
}
private String resolvePreviewRelativePath(String datasetId, String fileId) {
String relativePath = Paths.get(DATASET_PREVIEW_DIR, datasetId, fileId + PREVIEW_FILE_SUFFIX)
.toString();
return relativePath.replace("\\", PATH_SEPARATOR);
}
Path resolvePreviewStoragePath(String relativePath) {
String normalizedRelativePath = StringUtils.defaultString(relativePath).replace("/", java.io.File.separator);
Path root = resolveUploadRootPath();
Path target = root.resolve(normalizedRelativePath).toAbsolutePath().normalize();
BusinessAssert.isTrue(target.startsWith(root), CommonErrorCode.PARAM_ERROR);
return target;
}
private Path resolveUploadRootPath() {
String uploadDir = dataManagementProperties.getFileStorage().getUploadDir();
BusinessAssert.isTrue(StringUtils.isNotBlank(uploadDir), CommonErrorCode.PARAM_ERROR);
return Paths.get(uploadDir).toAbsolutePath().normalize();
}
private String nowText() {
return LocalDateTime.now().format(PREVIEW_TIME_FORMATTER);
}
public record PreviewFile(Path filePath, String fileName) {
}
}
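Taken together, ensurePreview and getPreviewStatus form a kick-off-then-poll protocol: the first call flips the metadata to PROCESSING and schedules the conversion, after which clients poll until the status is READY or FAILED. A hypothetical controller wiring, to make the flow concrete (the route shape and class name are assumptions; the actual endpoints live elsewhere in this change set):

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/datasets/{datasetId}/files/{fileId}/preview")
class DatasetFilePreviewPollingController {

    private final DatasetFilePreviewService previewService;

    DatasetFilePreviewPollingController(DatasetFilePreviewService previewService) {
        this.previewService = previewService;
    }

    /** Kick off (or reuse) a conversion; clients then poll the status endpoint below. */
    @PostMapping
    DatasetFilePreviewStatusResponse ensure(@PathVariable String datasetId, @PathVariable String fileId) {
        return previewService.ensurePreview(datasetId, fileId);
    }

    /** Poll until the status becomes READY or FAILED. */
    @GetMapping("/status")
    DatasetFilePreviewStatusResponse status(@PathVariable String datasetId, @PathVariable String fileId) {
        return previewService.getPreviewStatus(datasetId, fileId);
    }
}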

View File

@@ -0,0 +1,142 @@
package com.datamate.datamanagement.application;
import com.datamate.common.infrastructure.exception.BusinessAssert;
import com.datamate.common.infrastructure.exception.CommonErrorCode;
import com.datamate.datamanagement.common.enums.KnowledgeStatusType;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItemDirectory;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeSet;
import com.datamate.datamanagement.infrastructure.exception.DataManagementErrorCode;
import com.datamate.datamanagement.infrastructure.persistence.repository.KnowledgeItemDirectoryRepository;
import com.datamate.datamanagement.infrastructure.persistence.repository.KnowledgeItemRepository;
import com.datamate.datamanagement.infrastructure.persistence.repository.KnowledgeSetRepository;
import com.datamate.datamanagement.interfaces.dto.CreateKnowledgeDirectoryRequest;
import com.datamate.datamanagement.interfaces.dto.KnowledgeDirectoryQuery;
import lombok.RequiredArgsConstructor;
import org.apache.commons.lang3.StringUtils;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.List;
import java.util.UUID;
/**
* Application service for knowledge item directories
*/
@Service
@Transactional
@RequiredArgsConstructor
public class KnowledgeDirectoryApplicationService {
private static final String PATH_SEPARATOR = "/";
private static final String INVALID_PATH_SEGMENT = "..";
private final KnowledgeItemDirectoryRepository knowledgeItemDirectoryRepository;
private final KnowledgeItemRepository knowledgeItemRepository;
private final KnowledgeSetRepository knowledgeSetRepository;
@Transactional(readOnly = true)
public List<KnowledgeItemDirectory> getKnowledgeDirectories(String setId, KnowledgeDirectoryQuery query) {
BusinessAssert.notNull(query, CommonErrorCode.PARAM_ERROR);
query.setSetId(setId);
return knowledgeItemDirectoryRepository.findByCriteria(query);
}
public KnowledgeItemDirectory createKnowledgeDirectory(String setId, CreateKnowledgeDirectoryRequest request) {
BusinessAssert.notNull(request, CommonErrorCode.PARAM_ERROR);
KnowledgeSet knowledgeSet = requireKnowledgeSet(setId);
BusinessAssert.isTrue(!isReadOnlyStatus(knowledgeSet.getStatus()),
DataManagementErrorCode.KNOWLEDGE_SET_STATUS_ERROR);
String directoryName = normalizeDirectoryName(request.getDirectoryName());
validateDirectoryName(directoryName);
String parentPrefix = normalizeRelativePathPrefix(request.getParentPrefix());
String relativePath = normalizeRelativePathValue(parentPrefix + directoryName);
validateRelativePath(relativePath);
BusinessAssert.isTrue(!knowledgeItemRepository.existsBySetIdAndRelativePath(setId, relativePath),
CommonErrorCode.PARAM_ERROR);
KnowledgeItemDirectory existing = knowledgeItemDirectoryRepository.findBySetIdAndPath(setId, relativePath);
if (existing != null) {
return existing;
}
KnowledgeItemDirectory directory = new KnowledgeItemDirectory();
directory.setId(UUID.randomUUID().toString());
directory.setSetId(setId);
directory.setName(directoryName);
directory.setRelativePath(relativePath);
knowledgeItemDirectoryRepository.save(directory);
return directory;
}
public void deleteKnowledgeDirectory(String setId, String relativePath) {
KnowledgeSet knowledgeSet = requireKnowledgeSet(setId);
BusinessAssert.isTrue(!isReadOnlyStatus(knowledgeSet.getStatus()),
DataManagementErrorCode.KNOWLEDGE_SET_STATUS_ERROR);
String normalized = normalizeRelativePathValue(relativePath);
validateRelativePath(normalized);
knowledgeItemRepository.removeByRelativePathPrefix(setId, normalized);
knowledgeItemDirectoryRepository.removeByRelativePathPrefix(setId, normalized);
}
private KnowledgeSet requireKnowledgeSet(String setId) {
KnowledgeSet knowledgeSet = knowledgeSetRepository.getById(setId);
BusinessAssert.notNull(knowledgeSet, DataManagementErrorCode.KNOWLEDGE_SET_NOT_FOUND);
return knowledgeSet;
}
private boolean isReadOnlyStatus(KnowledgeStatusType status) {
return status == KnowledgeStatusType.ARCHIVED || status == KnowledgeStatusType.DEPRECATED;
}
private String normalizeDirectoryName(String name) {
return StringUtils.trimToEmpty(name);
}
private void validateDirectoryName(String name) {
BusinessAssert.isTrue(StringUtils.isNotBlank(name), CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(!name.contains(PATH_SEPARATOR), CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(!name.contains("\\"), CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(!name.contains(INVALID_PATH_SEGMENT), CommonErrorCode.PARAM_ERROR);
}
private void validateRelativePath(String relativePath) {
BusinessAssert.isTrue(StringUtils.isNotBlank(relativePath), CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(!relativePath.contains(INVALID_PATH_SEGMENT), CommonErrorCode.PARAM_ERROR);
}
private String normalizeRelativePathPrefix(String prefix) {
if (StringUtils.isBlank(prefix)) {
return "";
}
String normalized = prefix.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
while (normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(0, normalized.length() - 1);
}
if (StringUtils.isBlank(normalized)) {
return "";
}
validateRelativePath(normalized);
return normalized + PATH_SEPARATOR;
}
private String normalizeRelativePathValue(String relativePath) {
if (StringUtils.isBlank(relativePath)) {
return "";
}
String normalized = relativePath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
while (normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(0, normalized.length() - 1);
}
return normalized;
}
}
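The two normalization helpers above reduce every incoming prefix or path to a canonical, slash-separated form before any repository lookup. A standalone mirror of normalizeRelativePathValue with sample inputs, to show exactly what survives normalization (the class and method names here are illustrative, not part of the change set):

/** Illustrative mirror of the normalization rules above. */
final class RelativePathNormalizationDemo {

    private static final String SEP = "/";

    static String normalizeValue(String path) {
        if (path == null || path.isBlank()) {
            return "";
        }
        String normalized = path.replace("\\", SEP).trim();
        while (normalized.startsWith(SEP)) {
            normalized = normalized.substring(1);
        }
        while (normalized.endsWith(SEP)) {
            normalized = normalized.substring(0, normalized.length() - 1);
        }
        return normalized;
    }

    public static void main(String[] args) {
        System.out.println(normalizeValue("\\docs\\guides\\")); // -> docs/guides
        System.out.println(normalizeValue("/a//b/"));           // -> a//b (inner duplicate separators survive)
        System.out.println(normalizeValue("  /x/y  "));         // -> x/y
    }
}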

View File

@@ -16,12 +16,14 @@ import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItem;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeSet;
import com.datamate.datamanagement.infrastructure.config.DataManagementProperties;
import com.datamate.datamanagement.infrastructure.exception.DataManagementErrorCode;
import com.datamate.datamanagement.infrastructure.persistence.mapper.TagMapper;
import com.datamate.datamanagement.infrastructure.persistence.repository.DatasetFileRepository;
import com.datamate.datamanagement.infrastructure.persistence.repository.DatasetRepository;
import com.datamate.datamanagement.infrastructure.persistence.repository.KnowledgeItemRepository;
import com.datamate.datamanagement.infrastructure.persistence.repository.KnowledgeSetRepository;
import com.datamate.datamanagement.interfaces.converter.KnowledgeConverter;
import com.datamate.datamanagement.interfaces.dto.CreateKnowledgeItemRequest;
import com.datamate.datamanagement.interfaces.dto.DeleteKnowledgeItemsRequest;
import com.datamate.datamanagement.interfaces.dto.ImportKnowledgeItemsRequest;
import com.datamate.datamanagement.interfaces.dto.KnowledgeItemPagingQuery;
import com.datamate.datamanagement.interfaces.dto.KnowledgeItemResponse;
@@ -74,16 +76,20 @@ public class KnowledgeItemApplicationService {
private static final String EXPORT_FILE_PREFIX = "knowledge_set_";
private static final String EXPORT_FILE_SUFFIX = ".zip";
private static final String EXPORT_CONTENT_TYPE = "application/zip";
private static final String PREVIEW_PDF_CONTENT_TYPE = "application/pdf";
private static final int MAX_FILE_BASE_LENGTH = 120;
private static final int MAX_TITLE_LENGTH = 200;
private static final String KNOWLEDGE_ITEM_UPLOAD_DIR = "knowledge-items";
private static final String DEFAULT_FILE_EXTENSION = "bin";
private static final String PATH_SEPARATOR = "/";
private final KnowledgeItemRepository knowledgeItemRepository;
private final KnowledgeSetRepository knowledgeSetRepository;
private final DatasetRepository datasetRepository;
private final DatasetFileRepository datasetFileRepository;
private final DataManagementProperties dataManagementProperties;
private final TagMapper tagMapper;
private final KnowledgeItemPreviewService knowledgeItemPreviewService;
public KnowledgeItem createKnowledgeItem(String setId, CreateKnowledgeItemRequest request) {
KnowledgeSet knowledgeSet = requireKnowledgeSet(setId);
@@ -112,6 +118,7 @@ public class KnowledgeItemApplicationService {
List<MultipartFile> files = request.getFiles();
BusinessAssert.isTrue(CollectionUtils.isNotEmpty(files), CommonErrorCode.PARAM_ERROR);
String parentPrefix = normalizeRelativePathPrefix(request.getParentPrefix());
Path uploadRoot = resolveUploadRootPath();
Path setDir = uploadRoot.resolve(KNOWLEDGE_ITEM_UPLOAD_DIR).resolve(setId).normalize();
@@ -145,6 +152,7 @@ public class KnowledgeItemApplicationService {
knowledgeItem.setContentType(KnowledgeContentType.FILE);
knowledgeItem.setSourceType(KnowledgeSourceType.FILE_UPLOAD);
knowledgeItem.setSourceFileId(trimToLength(safeOriginalName, MAX_TITLE_LENGTH));
knowledgeItem.setRelativePath(buildRelativePath(parentPrefix, safeOriginalName));
items.add(knowledgeItem);
}
@@ -170,6 +178,9 @@ public class KnowledgeItemApplicationService {
if (request.getContentType() != null) {
knowledgeItem.setContentType(request.getContentType());
}
if (request.getMetadata() != null) {
knowledgeItem.setMetadata(request.getMetadata());
}
knowledgeItemRepository.updateById(knowledgeItem);
return knowledgeItem;
@@ -182,6 +193,22 @@ public class KnowledgeItemApplicationService {
knowledgeItemRepository.removeById(itemId);
}
public void deleteKnowledgeItems(String setId, DeleteKnowledgeItemsRequest request) {
BusinessAssert.notNull(request, CommonErrorCode.PARAM_ERROR);
List<String> ids = request.getIds();
BusinessAssert.isTrue(CollectionUtils.isNotEmpty(ids), CommonErrorCode.PARAM_ERROR);
List<KnowledgeItem> items = knowledgeItemRepository.listByIds(ids);
BusinessAssert.isTrue(CollectionUtils.isNotEmpty(items), DataManagementErrorCode.KNOWLEDGE_ITEM_NOT_FOUND);
BusinessAssert.isTrue(items.size() == ids.size(), DataManagementErrorCode.KNOWLEDGE_ITEM_NOT_FOUND);
boolean allMatch = items.stream().allMatch(item -> Objects.equals(item.getSetId(), setId));
BusinessAssert.isTrue(allMatch, CommonErrorCode.PARAM_ERROR);
List<String> deleteIds = items.stream().map(KnowledgeItem::getId).toList();
knowledgeItemRepository.removeByIds(deleteIds);
}
@Transactional(readOnly = true)
public KnowledgeItem getKnowledgeItem(String setId, String itemId) {
KnowledgeItem knowledgeItem = knowledgeItemRepository.getById(itemId);
@@ -213,6 +240,7 @@ public class KnowledgeItemApplicationService {
long datasetFileSize = safeLong(knowledgeItemRepository.sumDatasetFileSize());
long uploadFileSize = calculateUploadFileTotalSize();
response.setTotalSize(datasetFileSize + uploadFileSize);
response.setTotalTags(safeLong(tagMapper.countKnowledgeSetTags()));
return response;
}
@@ -256,6 +284,7 @@ public class KnowledgeItemApplicationService {
knowledgeItem.setSourceType(KnowledgeSourceType.DATASET_FILE);
knowledgeItem.setSourceDatasetId(dataset.getId());
knowledgeItem.setSourceFileId(datasetFile.getId());
knowledgeItem.setRelativePath(resolveDatasetFileRelativePath(dataset, datasetFile));
items.add(knowledgeItem);
}
@@ -307,7 +336,7 @@ public class KnowledgeItemApplicationService {
String relativePath = knowledgeItem.getContent();
BusinessAssert.isTrue(StringUtils.isNotBlank(relativePath), CommonErrorCode.PARAM_ERROR);
-Path filePath = resolveKnowledgeItemStoragePath(relativePath);
+Path filePath = resolveKnowledgeItemStoragePathWithFallback(relativePath);
BusinessAssert.isTrue(Files.exists(filePath) && Files.isRegularFile(filePath), CommonErrorCode.PARAM_ERROR);
String downloadName = StringUtils.isNotBlank(knowledgeItem.getSourceFileId())
@@ -340,12 +369,32 @@ public class KnowledgeItemApplicationService {
String relativePath = knowledgeItem.getContent();
BusinessAssert.isTrue(StringUtils.isNotBlank(relativePath), CommonErrorCode.PARAM_ERROR);
-Path filePath = resolveKnowledgeItemStoragePath(relativePath);
-BusinessAssert.isTrue(Files.exists(filePath) && Files.isRegularFile(filePath), CommonErrorCode.PARAM_ERROR);
String previewName = StringUtils.isNotBlank(knowledgeItem.getSourceFileId())
? knowledgeItem.getSourceFileId()
-: filePath.getFileName().toString();
+: Paths.get(relativePath).getFileName().toString();
if (knowledgeItemPreviewService.isOfficeDocument(previewName)) {
KnowledgeItemPreviewService.PreviewFile previewFile = knowledgeItemPreviewService.resolveReadyPreviewFile(setId, knowledgeItem);
if (previewFile == null) {
response.setStatus(HttpServletResponse.SC_CONFLICT);
return;
}
response.setContentType(PREVIEW_PDF_CONTENT_TYPE);
response.setCharacterEncoding(StandardCharsets.UTF_8.name());
response.setHeader(HttpHeaders.CONTENT_DISPOSITION,
"inline; filename=\"" + URLEncoder.encode(previewFile.fileName(), StandardCharsets.UTF_8) + "\"");
try (InputStream inputStream = Files.newInputStream(previewFile.filePath())) {
inputStream.transferTo(response.getOutputStream());
response.flushBuffer();
} catch (IOException e) {
log.error("preview knowledge item pdf error, itemId: {}", itemId, e);
throw BusinessException.of(SystemErrorCode.FILE_SYSTEM_ERROR);
}
return;
}
Path filePath = resolveKnowledgeItemStoragePathWithFallback(relativePath);
BusinessAssert.isTrue(Files.exists(filePath) && Files.isRegularFile(filePath), CommonErrorCode.PARAM_ERROR);
String contentType = null;
try {
@@ -418,7 +467,10 @@ public class KnowledgeItemApplicationService {
knowledgeItem.setContentType(KnowledgeContentType.FILE);
knowledgeItem.setSourceType(KnowledgeSourceType.FILE_UPLOAD);
knowledgeItem.setSourceFileId(sourceFileId);
knowledgeItem.setRelativePath(resolveReplacedRelativePath(knowledgeItem.getRelativePath(), sourceFileId));
knowledgeItem.setMetadata(knowledgeItemPreviewService.clearPreviewMetadata(knowledgeItem.getMetadata()));
knowledgeItemRepository.updateById(knowledgeItem);
knowledgeItemPreviewService.deletePreviewFileQuietly(setId, knowledgeItem.getId());
deleteFile(oldFilePath);
} catch (Exception e) {
deleteFileQuietly(targetPath);
@@ -483,6 +535,86 @@ public class KnowledgeItemApplicationService {
return target;
}
private Path resolveKnowledgeItemStoragePathWithFallback(String relativePath) {
BusinessAssert.isTrue(StringUtils.isNotBlank(relativePath), CommonErrorCode.PARAM_ERROR);
String normalizedInput = relativePath.replace("\\", PATH_SEPARATOR).trim();
Path root = resolveUploadRootPath();
java.util.LinkedHashSet<Path> candidates = new java.util.LinkedHashSet<>();
Path inputPath = Paths.get(normalizedInput.replace(PATH_SEPARATOR, File.separator));
if (inputPath.isAbsolute()) {
Path normalizedAbsolute = inputPath.toAbsolutePath().normalize();
if (normalizedAbsolute.startsWith(root)) {
candidates.add(normalizedAbsolute);
}
String segmentRelativePath = extractRelativePathFromSegment(normalizedInput, KNOWLEDGE_ITEM_UPLOAD_DIR);
if (StringUtils.isNotBlank(segmentRelativePath)) {
candidates.add(buildKnowledgeItemStoragePath(root, segmentRelativePath));
}
BusinessAssert.isTrue(!candidates.isEmpty(), CommonErrorCode.PARAM_ERROR);
} else {
String normalizedRelative = normalizeRelativePathValue(normalizedInput);
if (StringUtils.isNotBlank(normalizedRelative)) {
candidates.add(buildKnowledgeItemStoragePath(root, normalizedRelative));
}
String segmentRelativePath = extractRelativePathFromSegment(normalizedInput, KNOWLEDGE_ITEM_UPLOAD_DIR);
if (StringUtils.isNotBlank(segmentRelativePath)) {
candidates.add(buildKnowledgeItemStoragePath(root, segmentRelativePath));
}
if (StringUtils.isNotBlank(normalizedRelative)
&& !normalizedRelative.startsWith(KNOWLEDGE_ITEM_UPLOAD_DIR + PATH_SEPARATOR)
&& !normalizedRelative.equals(KNOWLEDGE_ITEM_UPLOAD_DIR)) {
candidates.add(buildKnowledgeItemStoragePath(root, KNOWLEDGE_ITEM_UPLOAD_DIR + PATH_SEPARATOR + normalizedRelative));
}
}
if (root.getFileName() != null && KNOWLEDGE_ITEM_UPLOAD_DIR.equals(root.getFileName().toString())) {
String normalizedRelative = normalizeRelativePathValue(normalizedInput);
if (StringUtils.isNotBlank(normalizedRelative)
&& normalizedRelative.startsWith(KNOWLEDGE_ITEM_UPLOAD_DIR + PATH_SEPARATOR)) {
String withoutPrefix = normalizedRelative.substring(KNOWLEDGE_ITEM_UPLOAD_DIR.length() + PATH_SEPARATOR.length());
if (StringUtils.isNotBlank(withoutPrefix)) {
candidates.add(buildKnowledgeItemStoragePath(root, withoutPrefix));
}
}
}
Path fallback = null;
for (Path candidate : candidates) {
if (fallback == null) {
fallback = candidate;
}
if (Files.exists(candidate) && Files.isRegularFile(candidate)) {
return candidate;
}
}
BusinessAssert.notNull(fallback, CommonErrorCode.PARAM_ERROR);
return fallback;
}
private Path buildKnowledgeItemStoragePath(Path root, String relativePath) {
String normalizedRelativePath = StringUtils.defaultString(relativePath).replace(PATH_SEPARATOR, File.separator);
Path target = root.resolve(normalizedRelativePath).toAbsolutePath().normalize();
BusinessAssert.isTrue(target.startsWith(root), CommonErrorCode.PARAM_ERROR);
return target;
}
private String extractRelativePathFromSegment(String rawPath, String segment) {
if (StringUtils.isBlank(rawPath) || StringUtils.isBlank(segment)) {
return null;
}
String normalized = rawPath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
String segmentPrefix = segment + PATH_SEPARATOR;
int index = normalized.indexOf(segmentPrefix);
if (index < 0) {
return segment.equals(normalized) ? segment : null;
}
return normalizeRelativePathValue(normalized.substring(index));
}
private KnowledgeItemSearchResponse normalizeSearchResponse(KnowledgeItemSearchResponse item) {
BusinessAssert.notNull(item, CommonErrorCode.PARAM_ERROR);
if (item.getSourceType() == KnowledgeSourceType.FILE_UPLOAD) {
@@ -540,6 +672,84 @@ public class KnowledgeItemApplicationService {
return relativePath.replace(File.separatorChar, '/');
}
private String buildRelativePath(String parentPrefix, String fileName) {
String safeName = sanitizeFileName(fileName);
if (StringUtils.isBlank(safeName)) {
safeName = "file";
}
String normalizedPrefix = normalizeRelativePathPrefix(parentPrefix);
return normalizedPrefix + safeName;
}
private String normalizeRelativePathPrefix(String prefix) {
if (StringUtils.isBlank(prefix)) {
return "";
}
String normalized = prefix.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
while (normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(0, normalized.length() - 1);
}
BusinessAssert.isTrue(!normalized.contains(".."), CommonErrorCode.PARAM_ERROR);
if (StringUtils.isBlank(normalized)) {
return "";
}
return normalized + PATH_SEPARATOR;
}
private String normalizeRelativePathValue(String relativePath) {
if (StringUtils.isBlank(relativePath)) {
return "";
}
String normalized = relativePath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
while (normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(0, normalized.length() - 1);
}
return normalized;
}
private String resolveDatasetFileRelativePath(Dataset dataset, DatasetFile datasetFile) {
if (datasetFile == null) {
return "";
}
String fileName = StringUtils.defaultIfBlank(datasetFile.getFileName(), datasetFile.getId());
String datasetPath = dataset == null ? null : dataset.getPath();
String filePath = datasetFile.getFilePath();
if (StringUtils.isBlank(datasetPath) || StringUtils.isBlank(filePath)) {
return buildRelativePath("", fileName);
}
try {
Path datasetRoot = Paths.get(datasetPath).toAbsolutePath().normalize();
Path targetPath = Paths.get(filePath).toAbsolutePath().normalize();
if (targetPath.startsWith(datasetRoot)) {
Path relative = datasetRoot.relativize(targetPath);
String relativeValue = relative.toString().replace(File.separatorChar, '/');
String normalized = normalizeRelativePathValue(relativeValue);
if (!normalized.contains("..") && StringUtils.isNotBlank(normalized)) {
return normalized;
}
}
} catch (Exception e) {
log.warn("resolve dataset file relative path failed, fileId: {}", datasetFile.getId(), e);
}
return buildRelativePath("", fileName);
}
private String resolveReplacedRelativePath(String existingRelativePath, String newFileName) {
String normalized = normalizeRelativePathValue(existingRelativePath);
if (StringUtils.isBlank(normalized)) {
return buildRelativePath("", newFileName);
}
int lastIndex = normalized.lastIndexOf(PATH_SEPARATOR);
String parentPrefix = lastIndex >= 0 ? normalized.substring(0, lastIndex + 1) : "";
return buildRelativePath(parentPrefix, newFileName);
}
private void createDirectories(Path path) {
try {
Files.createDirectories(path);
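resolveKnowledgeItemStoragePathWithFallback (added above) builds an ordered, de-duplicated candidate list and returns the first candidate that exists as a regular file; otherwise it returns the first candidate so the caller's existence assert can fail cleanly. A worked trace under an assumed upload root of /data/uploads (all paths are illustrative):

// Input "knowledge-items/set-1/a.docx" (relative):
//   1. /data/uploads/knowledge-items/set-1/a.docx   (normalized relative value)
//   2. the same path again via the "knowledge-items/" segment match, collapsed by the LinkedHashSet
// Input "/old/root/knowledge-items/set-1/a.docx" (absolute, outside the root):
//   the absolute form is rejected (it does not start with the root), and the path is
//   recovered from the "knowledge-items/" segment as /data/uploads/knowledge-items/set-1/a.docx
// Input "set-1/a.docx" (relative, missing the prefix):
//   1. /data/uploads/set-1/a.docx
//   2. /data/uploads/knowledge-items/set-1/a.docx   (prefix re-attached as a fallback)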

View File

@@ -0,0 +1,275 @@
package com.datamate.datamanagement.application;
import com.datamate.datamanagement.common.enums.KnowledgeItemPreviewStatus;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItem;
import com.datamate.datamanagement.infrastructure.config.DataManagementProperties;
import com.datamate.datamanagement.infrastructure.persistence.repository.KnowledgeItemRepository;
import com.fasterxml.jackson.databind.ObjectMapper;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Set;
/**
* Asynchronous task for converting knowledge item previews
*/
@Service
@RequiredArgsConstructor
@Slf4j
public class KnowledgeItemPreviewAsyncService {
private static final Set<String> OFFICE_EXTENSIONS = Set.of("doc", "docx");
private static final String KNOWLEDGE_ITEM_UPLOAD_DIR = "knowledge-items";
private static final String PREVIEW_SUB_DIR = "preview";
private static final String PREVIEW_FILE_SUFFIX = ".pdf";
private static final String PATH_SEPARATOR = "/";
private static final int MAX_ERROR_LENGTH = 500;
private static final DateTimeFormatter PREVIEW_TIME_FORMATTER = DateTimeFormatter.ISO_LOCAL_DATE_TIME;
private final KnowledgeItemRepository knowledgeItemRepository;
private final DataManagementProperties dataManagementProperties;
private final ObjectMapper objectMapper = new ObjectMapper();
@Async
public void convertPreviewAsync(String itemId) {
if (StringUtils.isBlank(itemId)) {
return;
}
KnowledgeItem item = knowledgeItemRepository.getById(itemId);
if (item == null) {
return;
}
String extension = resolveFileExtension(resolveOriginalName(item));
if (!OFFICE_EXTENSIONS.contains(extension)) {
updatePreviewStatus(item, KnowledgeItemPreviewStatus.FAILED, null, "仅支持 DOC/DOCX 转换");
return;
}
if (StringUtils.isBlank(item.getContent())) {
updatePreviewStatus(item, KnowledgeItemPreviewStatus.FAILED, null, "源文件路径为空");
return;
}
Path sourcePath = resolveKnowledgeItemStoragePath(item.getContent());
if (!Files.exists(sourcePath) || !Files.isRegularFile(sourcePath)) {
updatePreviewStatus(item, KnowledgeItemPreviewStatus.FAILED, null, "源文件不存在");
return;
}
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(item.getMetadata(), objectMapper);
String previewRelativePath = StringUtils.defaultIfBlank(
previewInfo.pdfPath(),
resolvePreviewRelativePath(item.getSetId(), item.getId())
);
Path targetPath = resolvePreviewStoragePath(previewRelativePath);
ensureParentDirectory(targetPath);
try {
LibreOfficeConverter.convertToPdf(sourcePath, targetPath);
updatePreviewStatus(item, KnowledgeItemPreviewStatus.READY, previewRelativePath, null);
} catch (Exception e) {
log.error("preview convert failed, itemId: {}", item.getId(), e);
updatePreviewStatus(item, KnowledgeItemPreviewStatus.FAILED, previewRelativePath, trimError(e.getMessage()));
}
}
private void updatePreviewStatus(
KnowledgeItem item,
KnowledgeItemPreviewStatus status,
String previewRelativePath,
String error
) {
if (item == null) {
return;
}
String updatedMetadata = KnowledgeItemPreviewMetadataHelper.applyPreviewInfo(
item.getMetadata(),
objectMapper,
status,
previewRelativePath,
error,
nowText()
);
item.setMetadata(updatedMetadata);
knowledgeItemRepository.updateById(item);
}
private String resolveOriginalName(KnowledgeItem item) {
if (item == null) {
return "";
}
if (StringUtils.isNotBlank(item.getSourceFileId())) {
return item.getSourceFileId();
}
if (StringUtils.isNotBlank(item.getContent())) {
return Paths.get(item.getContent()).getFileName().toString();
}
return "";
}
private String resolveFileExtension(String fileName) {
if (StringUtils.isBlank(fileName)) {
return "";
}
int dotIndex = fileName.lastIndexOf('.');
if (dotIndex <= 0 || dotIndex >= fileName.length() - 1) {
return "";
}
return fileName.substring(dotIndex + 1).toLowerCase();
}
private String resolvePreviewRelativePath(String setId, String itemId) {
String relativePath = Paths.get(KNOWLEDGE_ITEM_UPLOAD_DIR, setId, PREVIEW_SUB_DIR, itemId + PREVIEW_FILE_SUFFIX)
.toString();
return relativePath.replace("\\", PATH_SEPARATOR);
}
private Path resolvePreviewStoragePath(String relativePath) {
String normalizedRelativePath = StringUtils.defaultString(relativePath).replace("/", java.io.File.separator);
Path root = resolveUploadRootPath();
Path target = root.resolve(normalizedRelativePath).toAbsolutePath().normalize();
if (!target.startsWith(root)) {
throw new IllegalArgumentException("invalid preview path");
}
return target;
}
private Path resolveKnowledgeItemStoragePath(String relativePath) {
if (StringUtils.isBlank(relativePath)) {
throw new IllegalArgumentException("invalid knowledge item path");
}
String normalizedInput = relativePath.replace("\\", PATH_SEPARATOR).trim();
Path root = resolveUploadRootPath();
java.util.LinkedHashSet<Path> candidates = new java.util.LinkedHashSet<>();
Path inputPath = Paths.get(normalizedInput.replace(PATH_SEPARATOR, java.io.File.separator));
if (inputPath.isAbsolute()) {
Path normalizedAbsolute = inputPath.toAbsolutePath().normalize();
if (normalizedAbsolute.startsWith(root)) {
candidates.add(normalizedAbsolute);
}
String segmentRelativePath = extractRelativePathFromSegment(normalizedInput, KNOWLEDGE_ITEM_UPLOAD_DIR);
if (StringUtils.isNotBlank(segmentRelativePath)) {
candidates.add(buildKnowledgeItemStoragePath(root, segmentRelativePath));
}
if (candidates.isEmpty()) {
throw new IllegalArgumentException("invalid knowledge item path");
}
} else {
String normalizedRelative = normalizeRelativePathValue(normalizedInput);
if (StringUtils.isNotBlank(normalizedRelative)) {
candidates.add(buildKnowledgeItemStoragePath(root, normalizedRelative));
}
String segmentRelativePath = extractRelativePathFromSegment(normalizedInput, KNOWLEDGE_ITEM_UPLOAD_DIR);
if (StringUtils.isNotBlank(segmentRelativePath)) {
candidates.add(buildKnowledgeItemStoragePath(root, segmentRelativePath));
}
if (StringUtils.isNotBlank(normalizedRelative)
&& !normalizedRelative.startsWith(KNOWLEDGE_ITEM_UPLOAD_DIR + PATH_SEPARATOR)
&& !normalizedRelative.equals(KNOWLEDGE_ITEM_UPLOAD_DIR)) {
candidates.add(buildKnowledgeItemStoragePath(root, KNOWLEDGE_ITEM_UPLOAD_DIR + PATH_SEPARATOR + normalizedRelative));
}
}
if (root.getFileName() != null && KNOWLEDGE_ITEM_UPLOAD_DIR.equals(root.getFileName().toString())) {
String normalizedRelative = normalizeRelativePathValue(normalizedInput);
if (StringUtils.isNotBlank(normalizedRelative)
&& normalizedRelative.startsWith(KNOWLEDGE_ITEM_UPLOAD_DIR + PATH_SEPARATOR)) {
String withoutPrefix = normalizedRelative.substring(KNOWLEDGE_ITEM_UPLOAD_DIR.length() + PATH_SEPARATOR.length());
if (StringUtils.isNotBlank(withoutPrefix)) {
candidates.add(buildKnowledgeItemStoragePath(root, withoutPrefix));
}
}
}
Path fallback = null;
for (Path candidate : candidates) {
if (fallback == null) {
fallback = candidate;
}
if (Files.exists(candidate) && Files.isRegularFile(candidate)) {
return candidate;
}
}
if (fallback == null) {
throw new IllegalArgumentException("invalid knowledge item path");
}
return fallback;
}
private Path buildKnowledgeItemStoragePath(Path root, String relativePath) {
String normalizedRelativePath = StringUtils.defaultString(relativePath).replace(PATH_SEPARATOR, java.io.File.separator);
Path target = root.resolve(normalizedRelativePath).toAbsolutePath().normalize();
if (!target.startsWith(root)) {
throw new IllegalArgumentException("invalid knowledge item path");
}
return target;
}
private String extractRelativePathFromSegment(String rawPath, String segment) {
if (StringUtils.isBlank(rawPath) || StringUtils.isBlank(segment)) {
return null;
}
String normalized = rawPath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
String segmentPrefix = segment + PATH_SEPARATOR;
int index = normalized.indexOf(segmentPrefix);
if (index < 0) {
return segment.equals(normalized) ? segment : null;
}
return normalizeRelativePathValue(normalized.substring(index));
}
private String normalizeRelativePathValue(String relativePath) {
if (StringUtils.isBlank(relativePath)) {
return "";
}
String normalized = relativePath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
while (normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(0, normalized.length() - 1);
}
return normalized;
}
private Path resolveUploadRootPath() {
String uploadDir = dataManagementProperties.getFileStorage().getUploadDir();
return Paths.get(uploadDir).toAbsolutePath().normalize();
}
private void ensureParentDirectory(Path targetPath) {
try {
Path parent = targetPath.getParent();
if (parent != null) {
Files.createDirectories(parent);
}
} catch (IOException e) {
throw new IllegalStateException("创建预览目录失败", e);
}
}
private String trimError(String error) {
if (StringUtils.isBlank(error)) {
return "";
}
if (error.length() <= MAX_ERROR_LENGTH) {
return error;
}
return error.substring(0, MAX_ERROR_LENGTH);
}
private String nowText() {
return LocalDateTime.now().format(PREVIEW_TIME_FORMATTER);
}
}
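Both async services delegate the actual conversion to LibreOfficeConverter.convertToPdf, which is referenced but not shown in this compare view. A minimal sketch of what such a converter plausibly wraps, assuming a headless soffice binary on the PATH (the class shape, timeout, and renaming step are all assumptions; the real converter may differ):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.concurrent.TimeUnit;

/** Illustrative sketch only; not the LibreOfficeConverter referenced above. */
final class LibreOfficeConverterSketch {

    static void convertToPdf(Path source, Path target) throws IOException, InterruptedException {
        Path outDir = target.getParent();
        Process process = new ProcessBuilder(
                "soffice", "--headless", "--convert-to", "pdf",
                "--outdir", outDir.toString(), source.toString())
                .redirectErrorStream(true)
                .start();
        if (!process.waitFor(2, TimeUnit.MINUTES)) { // illustrative timeout
            process.destroyForcibly();
            throw new IOException("LibreOffice conversion timed out");
        }
        // soffice names its output after the source file; move it onto the expected target
        String produced = source.getFileName().toString().replaceAll("\\.[^.]+$", "") + ".pdf";
        Files.move(outDir.resolve(produced), target, StandardCopyOption.REPLACE_EXISTING);
    }
}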

View File

@@ -0,0 +1,134 @@
package com.datamate.datamanagement.application;
import com.datamate.datamanagement.common.enums.KnowledgeItemPreviewStatus;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import org.apache.commons.lang3.StringUtils;
/**
* Helper for reading and writing knowledge item preview metadata
*/
public final class KnowledgeItemPreviewMetadataHelper {
public static final String PREVIEW_STATUS_KEY = "previewStatus";
public static final String PREVIEW_PDF_PATH_KEY = "previewPdfPath";
public static final String PREVIEW_ERROR_KEY = "previewError";
public static final String PREVIEW_UPDATED_AT_KEY = "previewUpdatedAt";
private KnowledgeItemPreviewMetadataHelper() {
}
public static PreviewInfo readPreviewInfo(String metadata, ObjectMapper objectMapper) {
if (StringUtils.isBlank(metadata) || objectMapper == null) {
return PreviewInfo.empty();
}
try {
JsonNode node = objectMapper.readTree(metadata);
if (node == null || !node.isObject()) {
return PreviewInfo.empty();
}
String statusText = textValue(node, PREVIEW_STATUS_KEY);
KnowledgeItemPreviewStatus status = parseStatus(statusText);
return new PreviewInfo(
status,
textValue(node, PREVIEW_PDF_PATH_KEY),
textValue(node, PREVIEW_ERROR_KEY),
textValue(node, PREVIEW_UPDATED_AT_KEY)
);
} catch (Exception ignore) {
return PreviewInfo.empty();
}
}
public static String applyPreviewInfo(
String metadata,
ObjectMapper objectMapper,
KnowledgeItemPreviewStatus status,
String pdfPath,
String error,
String updatedAt
) {
if (objectMapper == null) {
return metadata;
}
ObjectNode root = parseRoot(metadata, objectMapper);
if (status == null) {
root.remove(PREVIEW_STATUS_KEY);
} else {
root.put(PREVIEW_STATUS_KEY, status.name());
}
if (StringUtils.isBlank(pdfPath)) {
root.remove(PREVIEW_PDF_PATH_KEY);
} else {
root.put(PREVIEW_PDF_PATH_KEY, pdfPath);
}
if (StringUtils.isBlank(error)) {
root.remove(PREVIEW_ERROR_KEY);
} else {
root.put(PREVIEW_ERROR_KEY, error);
}
if (StringUtils.isBlank(updatedAt)) {
root.remove(PREVIEW_UPDATED_AT_KEY);
} else {
root.put(PREVIEW_UPDATED_AT_KEY, updatedAt);
}
return root.size() == 0 ? null : root.toString();
}
public static String clearPreviewInfo(String metadata, ObjectMapper objectMapper) {
if (objectMapper == null) {
return metadata;
}
ObjectNode root = parseRoot(metadata, objectMapper);
root.remove(PREVIEW_STATUS_KEY);
root.remove(PREVIEW_PDF_PATH_KEY);
root.remove(PREVIEW_ERROR_KEY);
root.remove(PREVIEW_UPDATED_AT_KEY);
return root.size() == 0 ? null : root.toString();
}
private static ObjectNode parseRoot(String metadata, ObjectMapper objectMapper) {
if (StringUtils.isBlank(metadata)) {
return objectMapper.createObjectNode();
}
try {
JsonNode node = objectMapper.readTree(metadata);
if (node instanceof ObjectNode objectNode) {
return objectNode;
}
} catch (Exception ignore) {
return objectMapper.createObjectNode();
}
return objectMapper.createObjectNode();
}
private static String textValue(JsonNode node, String key) {
if (node == null || StringUtils.isBlank(key)) {
return null;
}
JsonNode value = node.get(key);
return value == null || value.isNull() ? null : value.asText();
}
private static KnowledgeItemPreviewStatus parseStatus(String statusText) {
if (StringUtils.isBlank(statusText)) {
return null;
}
try {
return KnowledgeItemPreviewStatus.valueOf(statusText);
} catch (Exception ignore) {
return null;
}
}
public record PreviewInfo(
KnowledgeItemPreviewStatus status,
String pdfPath,
String error,
String updatedAt
) {
public static PreviewInfo empty() {
return new PreviewInfo(null, null, null, null);
}
}
}
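A quick round-trip of the helper makes the stored metadata shape concrete (the path and timestamp values are illustrative; the JSON keys come from the constants above):

import com.datamate.datamanagement.common.enums.KnowledgeItemPreviewStatus;
import com.fasterxml.jackson.databind.ObjectMapper;

class PreviewMetadataRoundTripDemo {
    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();
        String metadata = KnowledgeItemPreviewMetadataHelper.applyPreviewInfo(
                null, mapper,
                KnowledgeItemPreviewStatus.READY,
                "knowledge-items/set-1/preview/item-1.pdf",
                null,
                "2026-02-04T17:00:00");
        // metadata: {"previewStatus":"READY","previewPdfPath":"knowledge-items/set-1/preview/item-1.pdf","previewUpdatedAt":"2026-02-04T17:00:00"}
        KnowledgeItemPreviewMetadataHelper.PreviewInfo info =
                KnowledgeItemPreviewMetadataHelper.readPreviewInfo(metadata, mapper);
        System.out.println(info.status() + " -> " + info.pdfPath());
        // Clearing removes all four preview keys; an otherwise-empty object collapses back to null.
        System.out.println(KnowledgeItemPreviewMetadataHelper.clearPreviewInfo(metadata, mapper));
    }
}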

View File

@@ -0,0 +1,244 @@
package com.datamate.datamanagement.application;
import com.datamate.common.infrastructure.exception.BusinessAssert;
import com.datamate.common.infrastructure.exception.CommonErrorCode;
import com.datamate.datamanagement.common.enums.KnowledgeContentType;
import com.datamate.datamanagement.common.enums.KnowledgeItemPreviewStatus;
import com.datamate.datamanagement.common.enums.KnowledgeSourceType;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItem;
import com.datamate.datamanagement.infrastructure.config.DataManagementProperties;
import com.datamate.datamanagement.infrastructure.persistence.repository.KnowledgeItemRepository;
import com.datamate.datamanagement.interfaces.dto.KnowledgeItemPreviewStatusResponse;
import com.fasterxml.jackson.databind.ObjectMapper;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang3.StringUtils;
import org.springframework.stereotype.Service;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Objects;
import java.util.Set;
/**
* Knowledge item preview conversion service
*/
@Service
@RequiredArgsConstructor
@Slf4j
public class KnowledgeItemPreviewService {
private static final Set<String> OFFICE_EXTENSIONS = Set.of("doc", "docx");
private static final String KNOWLEDGE_ITEM_UPLOAD_DIR = "knowledge-items";
private static final String PREVIEW_SUB_DIR = "preview";
private static final String PREVIEW_FILE_SUFFIX = ".pdf";
private static final String PATH_SEPARATOR = "/";
private static final DateTimeFormatter PREVIEW_TIME_FORMATTER = DateTimeFormatter.ISO_LOCAL_DATE_TIME;
private final KnowledgeItemRepository knowledgeItemRepository;
private final DataManagementProperties dataManagementProperties;
private final KnowledgeItemPreviewAsyncService knowledgeItemPreviewAsyncService;
private final ObjectMapper objectMapper = new ObjectMapper();
public KnowledgeItemPreviewStatusResponse getPreviewStatus(String setId, String itemId) {
KnowledgeItem item = requireKnowledgeItem(setId, itemId);
assertOfficeDocument(item);
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(item.getMetadata(), objectMapper);
if (previewInfo.status() == KnowledgeItemPreviewStatus.READY && !previewPdfExists(item, previewInfo)) {
previewInfo = markPreviewFailed(item, previewInfo, "预览文件不存在");
}
return buildResponse(previewInfo);
}
public KnowledgeItemPreviewStatusResponse ensurePreview(String setId, String itemId) {
KnowledgeItem item = requireKnowledgeItem(setId, itemId);
assertOfficeDocument(item);
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(item.getMetadata(), objectMapper);
if (previewInfo.status() == KnowledgeItemPreviewStatus.READY && previewPdfExists(item, previewInfo)) {
return buildResponse(previewInfo);
}
if (previewInfo.status() == KnowledgeItemPreviewStatus.PROCESSING) {
return buildResponse(previewInfo);
}
String previewRelativePath = resolvePreviewRelativePath(item.getSetId(), item.getId());
String updatedMetadata = KnowledgeItemPreviewMetadataHelper.applyPreviewInfo(
item.getMetadata(),
objectMapper,
KnowledgeItemPreviewStatus.PROCESSING,
previewRelativePath,
null,
nowText()
);
item.setMetadata(updatedMetadata);
knowledgeItemRepository.updateById(item);
knowledgeItemPreviewAsyncService.convertPreviewAsync(item.getId());
KnowledgeItemPreviewMetadataHelper.PreviewInfo refreshed = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(updatedMetadata, objectMapper);
return buildResponse(refreshed);
}
public boolean isOfficeDocument(String fileName) {
String extension = resolveFileExtension(fileName);
return StringUtils.isNotBlank(extension) && OFFICE_EXTENSIONS.contains(extension.toLowerCase());
}
public PreviewFile resolveReadyPreviewFile(String setId, KnowledgeItem item) {
if (item == null) {
return null;
}
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo = KnowledgeItemPreviewMetadataHelper
.readPreviewInfo(item.getMetadata(), objectMapper);
if (previewInfo.status() != KnowledgeItemPreviewStatus.READY) {
return null;
}
String relativePath = StringUtils.defaultIfBlank(previewInfo.pdfPath(), resolvePreviewRelativePath(setId, item.getId()));
Path filePath = resolvePreviewStoragePath(relativePath);
if (!Files.exists(filePath) || !Files.isRegularFile(filePath)) {
markPreviewFailed(item, previewInfo, "预览文件不存在");
return null;
}
String previewName = resolvePreviewPdfName(item);
return new PreviewFile(filePath, previewName);
}
public String clearPreviewMetadata(String metadata) {
return KnowledgeItemPreviewMetadataHelper.clearPreviewInfo(metadata, objectMapper);
}
public void deletePreviewFileQuietly(String setId, String itemId) {
String relativePath = resolvePreviewRelativePath(setId, itemId);
Path filePath = resolvePreviewStoragePath(relativePath);
try {
Files.deleteIfExists(filePath);
} catch (Exception e) {
log.warn("delete preview pdf error, itemId: {}", itemId, e);
}
}
private KnowledgeItemPreviewStatusResponse buildResponse(KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo) {
KnowledgeItemPreviewStatusResponse response = new KnowledgeItemPreviewStatusResponse();
KnowledgeItemPreviewStatus status = previewInfo.status() == null
? KnowledgeItemPreviewStatus.PENDING
: previewInfo.status();
response.setStatus(status);
response.setPreviewError(previewInfo.error());
response.setUpdatedAt(previewInfo.updatedAt());
return response;
}
private KnowledgeItem requireKnowledgeItem(String setId, String itemId) {
BusinessAssert.isTrue(StringUtils.isNotBlank(setId), CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(StringUtils.isNotBlank(itemId), CommonErrorCode.PARAM_ERROR);
KnowledgeItem knowledgeItem = knowledgeItemRepository.getById(itemId);
BusinessAssert.notNull(knowledgeItem, CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(Objects.equals(knowledgeItem.getSetId(), setId), CommonErrorCode.PARAM_ERROR);
return knowledgeItem;
}
private void assertOfficeDocument(KnowledgeItem item) {
BusinessAssert.notNull(item, CommonErrorCode.PARAM_ERROR);
BusinessAssert.isTrue(
item.getContentType() == KnowledgeContentType.FILE || item.getSourceType() == KnowledgeSourceType.FILE_UPLOAD,
CommonErrorCode.PARAM_ERROR
);
String extension = resolveFileExtension(resolveOriginalName(item));
BusinessAssert.isTrue(OFFICE_EXTENSIONS.contains(extension), CommonErrorCode.PARAM_ERROR);
}
private String resolveOriginalName(KnowledgeItem item) {
if (item == null) {
return "";
}
if (StringUtils.isNotBlank(item.getSourceFileId())) {
return item.getSourceFileId();
}
if (StringUtils.isNotBlank(item.getContent())) {
return Paths.get(item.getContent()).getFileName().toString();
}
return "";
}
private String resolveFileExtension(String fileName) {
if (StringUtils.isBlank(fileName)) {
return "";
}
int dotIndex = fileName.lastIndexOf('.');
if (dotIndex <= 0 || dotIndex >= fileName.length() - 1) {
return "";
}
return fileName.substring(dotIndex + 1).toLowerCase();
}
private String resolvePreviewPdfName(KnowledgeItem item) {
String originalName = resolveOriginalName(item);
if (StringUtils.isBlank(originalName)) {
return "预览.pdf";
}
int dotIndex = originalName.lastIndexOf('.');
if (dotIndex <= 0) {
return originalName + PREVIEW_FILE_SUFFIX;
}
return originalName.substring(0, dotIndex) + PREVIEW_FILE_SUFFIX;
}
private boolean previewPdfExists(KnowledgeItem item, KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo) {
String relativePath = StringUtils.defaultIfBlank(previewInfo.pdfPath(), resolvePreviewRelativePath(item.getSetId(), item.getId()));
Path filePath = resolvePreviewStoragePath(relativePath);
return Files.exists(filePath) && Files.isRegularFile(filePath);
}
private KnowledgeItemPreviewMetadataHelper.PreviewInfo markPreviewFailed(
KnowledgeItem item,
KnowledgeItemPreviewMetadataHelper.PreviewInfo previewInfo,
String error
) {
String relativePath = StringUtils.defaultIfBlank(previewInfo.pdfPath(), resolvePreviewRelativePath(item.getSetId(), item.getId()));
String updatedMetadata = KnowledgeItemPreviewMetadataHelper.applyPreviewInfo(
item.getMetadata(),
objectMapper,
KnowledgeItemPreviewStatus.FAILED,
relativePath,
error,
nowText()
);
item.setMetadata(updatedMetadata);
knowledgeItemRepository.updateById(item);
return KnowledgeItemPreviewMetadataHelper.readPreviewInfo(updatedMetadata, objectMapper);
}
private String resolvePreviewRelativePath(String setId, String itemId) {
String relativePath = Paths.get(KNOWLEDGE_ITEM_UPLOAD_DIR, setId, PREVIEW_SUB_DIR, itemId + PREVIEW_FILE_SUFFIX)
.toString();
return relativePath.replace("\\", PATH_SEPARATOR);
}
private Path resolvePreviewStoragePath(String relativePath) {
String normalizedRelativePath = StringUtils.defaultString(relativePath).replace("/", java.io.File.separator);
Path root = resolveUploadRootPath();
Path target = root.resolve(normalizedRelativePath).toAbsolutePath().normalize();
BusinessAssert.isTrue(target.startsWith(root), CommonErrorCode.PARAM_ERROR);
return target;
}
private Path resolveUploadRootPath() {
String uploadDir = dataManagementProperties.getFileStorage().getUploadDir();
BusinessAssert.isTrue(StringUtils.isNotBlank(uploadDir), CommonErrorCode.PARAM_ERROR);
return Paths.get(uploadDir).toAbsolutePath().normalize();
}
private String nowText() {
return LocalDateTime.now().format(PREVIEW_TIME_FORMATTER);
}
public record PreviewFile(Path filePath, String fileName) {
}
}
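
The conversion itself runs asynchronously, so a caller is expected to trigger it and then poll. A hypothetical sketch (poll interval is arbitrary; exception handling omitted):

// Hypothetical polling loop; previewService, setId and itemId come from the caller.
KnowledgeItemPreviewStatusResponse status = previewService.ensurePreview(setId, itemId);
while (status.getStatus() == KnowledgeItemPreviewStatus.PROCESSING) {
    Thread.sleep(1_000);                               // arbitrary poll interval
    status = previewService.getPreviewStatus(setId, itemId);
}
// status.getStatus() is now READY or FAILED.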

View File

@@ -0,0 +1,93 @@
package com.datamate.datamanagement.application;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.time.Duration;
import java.util.List;
import java.util.concurrent.TimeUnit;
/**
 * LibreOffice document conversion utility
*/
public final class LibreOfficeConverter {
private static final String LIBREOFFICE_COMMAND = "soffice";
private static final Duration CONVERT_TIMEOUT = Duration.ofMinutes(5);
private static final int MAX_OUTPUT_LENGTH = 500;
private LibreOfficeConverter() {
}
public static void convertToPdf(Path sourcePath, Path targetPath) throws Exception {
Path outputDir = targetPath.getParent();
List<String> command = List.of(
LIBREOFFICE_COMMAND,
"--headless",
"--nologo",
"--nolockcheck",
"--nodefault",
"--nofirststartwizard",
"--convert-to",
"pdf",
"--outdir",
outputDir.toString(),
sourcePath.toString()
);
ProcessBuilder processBuilder = new ProcessBuilder(command);
processBuilder.redirectErrorStream(true);
Process process = processBuilder.start();
boolean finished = process.waitFor(CONVERT_TIMEOUT.toMillis(), TimeUnit.MILLISECONDS);
String output = readProcessOutput(process.getInputStream());
if (!finished) {
process.destroyForcibly();
throw new IllegalStateException("LibreOffice 转换超时");
}
if (process.exitValue() != 0) {
throw new IllegalStateException("LibreOffice 转换失败: " + output);
}
Path generated = outputDir.resolve(stripExtension(sourcePath.getFileName().toString()) + ".pdf");
if (!Files.exists(generated)) {
throw new IllegalStateException("LibreOffice 输出文件不存在");
}
if (!generated.equals(targetPath)) {
Files.move(generated, targetPath, StandardCopyOption.REPLACE_EXISTING);
}
}
private static String readProcessOutput(InputStream inputStream) throws IOException {
if (inputStream == null) {
return "";
}
byte[] buffer = new byte[1024];
StringBuilder builder = new StringBuilder();
int total = 0;
int read;
while ((read = inputStream.read(buffer)) >= 0) {
if (read == 0) {
continue;
}
int remaining = MAX_OUTPUT_LENGTH - total;
if (remaining <= 0) {
break;
}
int toAppend = Math.min(remaining, read);
builder.append(new String(buffer, 0, toAppend, StandardCharsets.UTF_8));
total += toAppend;
if (total >= MAX_OUTPUT_LENGTH) {
break;
}
}
return builder.toString();
}
private static String stripExtension(String fileName) {
if (fileName == null || fileName.isBlank()) {
return "preview";
}
int dotIndex = fileName.lastIndexOf('.');
return dotIndex <= 0 ? fileName : fileName.substring(0, dotIndex);
}
}
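
A minimal usage sketch, assuming soffice is installed and on PATH; both paths are placeholders:

// Placeholders; the target directory must exist before conversion.
Path source = Paths.get("/uploads/knowledge-items/set-1/item-1.docx");
Path target = Paths.get("/uploads/knowledge-items/set-1/preview/item-1.pdf");
Files.createDirectories(target.getParent());
LibreOfficeConverter.convertToPdf(source, target);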

View File

@@ -0,0 +1,11 @@
package com.datamate.datamanagement.common.enums;
/**
 * Knowledge item preview conversion status
*/
public enum KnowledgeItemPreviewStatus {
PENDING,
PROCESSING,
READY,
FAILED
}

View File

@@ -38,4 +38,12 @@ public class KnowledgeItem extends BaseEntity<String> {
 * Source file ID
 */
private String sourceFileId;
/**
 * Relative path (for directory display)
 */
private String relativePath;
/**
 * Extended metadata
 */
private String metadata;
}

View File

@@ -0,0 +1,29 @@
package com.datamate.datamanagement.domain.model.knowledge;
import com.baomidou.mybatisplus.annotation.TableName;
import com.datamate.common.domain.model.base.BaseEntity;
import lombok.Getter;
import lombok.Setter;
/**
 * Knowledge item directory entity (aligned with database table t_dm_knowledge_item_directories)
*/
@Getter
@Setter
@TableName(value = "t_dm_knowledge_item_directories", autoResultMap = true)
public class KnowledgeItemDirectory extends BaseEntity<String> {
/**
 * Owning knowledge set ID
*/
private String setId;
/**
 * Directory name
*/
private String name;
/**
 * Directory relative path
*/
private String relativePath;
}

View File

@@ -2,6 +2,7 @@ package com.datamate.datamanagement.infrastructure.persistence.mapper;
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.datamate.datamanagement.domain.model.dataset.DatasetFile;
import com.datamate.datamanagement.infrastructure.persistence.repository.dto.DatasetFileCount;
import org.apache.ibatis.annotations.Mapper;
import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.session.RowBounds;
@@ -17,6 +18,7 @@ public interface DatasetFileMapper extends BaseMapper<DatasetFile> {
Long countByDatasetId(@Param("datasetId") String datasetId);
Long countCompletedByDatasetId(@Param("datasetId") String datasetId);
Long sumSizeByDatasetId(@Param("datasetId") String datasetId);
Long countNonDerivedByDatasetId(@Param("datasetId") String datasetId);
DatasetFile findByDatasetIdAndFileName(@Param("datasetId") String datasetId, @Param("fileName") String fileName);
List<DatasetFile> findAllByDatasetId(@Param("datasetId") String datasetId);
List<DatasetFile> findByCriteria(@Param("datasetId") String datasetId,
@@ -38,4 +40,12 @@ public interface DatasetFileMapper extends BaseMapper<DatasetFile> {
 * @return list of source file IDs
 */
List<String> findSourceFileIdsWithDerivedFiles(@Param("datasetId") String datasetId);
/**
 * Batch count files per dataset, excluding derived files
 *
 * @param datasetIds dataset ID list
 * @return list of per-dataset file counts
 */
List<DatasetFileCount> countNonDerivedByDatasetIds(@Param("datasetIds") List<String> datasetIds);
}

View File

@@ -0,0 +1,9 @@
package com.datamate.datamanagement.infrastructure.persistence.mapper;
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItemDirectory;
import org.apache.ibatis.annotations.Mapper;
@Mapper
public interface KnowledgeItemDirectoryMapper extends BaseMapper<KnowledgeItemDirectory> {
}

View File

@@ -28,13 +28,16 @@ public interface KnowledgeItemMapper extends BaseMapper<KnowledgeItem> {
WHEN ki.source_type = 'FILE_UPLOAD' THEN ki.content
ELSE NULL
END AS content,
ki.relative_path AS relativePath,
ki.created_at AS createdAt,
ki.updated_at AS updatedAt
FROM t_dm_knowledge_items ki
LEFT JOIN t_dm_knowledge_sets ks ON ki.set_id = ks.id
LEFT JOIN t_dm_dataset_files df ON ki.source_file_id = df.id AND ki.source_type = 'DATASET_FILE'
WHERE (ki.source_type = 'FILE_UPLOAD' AND (ki.source_file_id LIKE CONCAT('%', #{keyword}, '%')
OR ki.relative_path LIKE CONCAT('%', #{keyword}, '%')))
OR (ki.source_type = 'DATASET_FILE' AND (df.file_name LIKE CONCAT('%', #{keyword}, '%')
OR ki.relative_path LIKE CONCAT('%', #{keyword}, '%')))
ORDER BY ki.created_at DESC
""")
IPage<KnowledgeItemSearchResponse> searchFileItems(IPage<?> page, @Param("keyword") String keyword);

View File

@@ -14,6 +14,7 @@ public interface TagMapper {
List<Tag> findByIdIn(@Param("ids") List<String> ids);
List<Tag> findByKeyword(@Param("keyword") String keyword);
List<Tag> findAllByOrderByUsageCountDesc();
Long countKnowledgeSetTags();
int insert(Tag tag);
int update(Tag tag);

View File

@@ -3,6 +3,7 @@ package com.datamate.datamanagement.infrastructure.persistence.repository;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.baomidou.mybatisplus.extension.repository.IRepository;
import com.datamate.datamanagement.domain.model.dataset.DatasetFile;
import com.datamate.datamanagement.infrastructure.persistence.repository.dto.DatasetFileCount;
import java.util.List;
@@ -15,6 +16,8 @@ import java.util.List;
public interface DatasetFileRepository extends IRepository<DatasetFile> {
Long countByDatasetId(String datasetId);
Long countNonDerivedByDatasetId(String datasetId);
Long countCompletedByDatasetId(String datasetId);
Long sumSizeByDatasetId(String datasetId);
@@ -36,4 +39,6 @@ public interface DatasetFileRepository extends IRepository<DatasetFile> {
 * @return list of source file IDs
 */
List<String> findSourceFileIdsWithDerivedFiles(String datasetId);
List<DatasetFileCount> countNonDerivedByDatasetIds(List<String> datasetIds);
}

View File

@@ -0,0 +1,18 @@
package com.datamate.datamanagement.infrastructure.persistence.repository;
import com.baomidou.mybatisplus.extension.repository.IRepository;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItemDirectory;
import com.datamate.datamanagement.interfaces.dto.KnowledgeDirectoryQuery;
import java.util.List;
/**
 * Knowledge item directory repository interface
*/
public interface KnowledgeItemDirectoryRepository extends IRepository<KnowledgeItemDirectory> {
List<KnowledgeItemDirectory> findByCriteria(KnowledgeDirectoryQuery query);
KnowledgeItemDirectory findBySetIdAndPath(String setId, String relativePath);
int removeByRelativePathPrefix(String setId, String relativePath);
}

View File

@@ -26,4 +26,8 @@ public interface KnowledgeItemRepository extends IRepository<KnowledgeItem> {
IPage<KnowledgeItemSearchResponse> searchFileItems(IPage<?> page, String keyword);
Long sumDatasetFileSize();
boolean existsBySetIdAndRelativePath(String setId, String relativePath);
int removeByRelativePathPrefix(String setId, String relativePath);
}

View File

@@ -0,0 +1,18 @@
package com.datamate.datamanagement.infrastructure.persistence.repository.dto;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
/**
 * Per-dataset file count statistics result
*/
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public class DatasetFileCount {
private String datasetId;
private Long fileCount;
}

View File

@@ -6,6 +6,7 @@ import com.baomidou.mybatisplus.extension.repository.CrudRepository;
import com.datamate.datamanagement.domain.model.dataset.DatasetFile;
import com.datamate.datamanagement.infrastructure.persistence.mapper.DatasetFileMapper;
import com.datamate.datamanagement.infrastructure.persistence.repository.DatasetFileRepository;
import com.datamate.datamanagement.infrastructure.persistence.repository.dto.DatasetFileCount;
import lombok.RequiredArgsConstructor;
import org.springframework.stereotype.Repository;
import org.springframework.util.StringUtils;
@@ -30,6 +31,11 @@ public class DatasetFileRepositoryImpl extends CrudRepository<DatasetFileMapper,
return datasetFileMapper.selectCount(new LambdaQueryWrapper<DatasetFile>().eq(DatasetFile::getDatasetId, datasetId));
}
@Override
public Long countNonDerivedByDatasetId(String datasetId) {
return datasetFileMapper.countNonDerivedByDatasetId(datasetId);
}
@Override
public Long countCompletedByDatasetId(String datasetId) {
return datasetFileMapper.countCompletedByDatasetId(datasetId);
@@ -71,4 +77,9 @@ public class DatasetFileRepositoryImpl extends CrudRepository<DatasetFileMapper,
// Uses a MyBatis @Select annotation, or calls the mapper method directly
return datasetFileMapper.findSourceFileIdsWithDerivedFiles(datasetId);
}
@Override
public List<DatasetFileCount> countNonDerivedByDatasetIds(List<String> datasetIds) {
return datasetFileMapper.countNonDerivedByDatasetIds(datasetIds);
}
}

View File

@@ -0,0 +1,96 @@
package com.datamate.datamanagement.infrastructure.persistence.repository.impl;
import com.baomidou.mybatisplus.core.conditions.query.LambdaQueryWrapper;
import com.baomidou.mybatisplus.extension.repository.CrudRepository;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItemDirectory;
import com.datamate.datamanagement.infrastructure.persistence.mapper.KnowledgeItemDirectoryMapper;
import com.datamate.datamanagement.infrastructure.persistence.repository.KnowledgeItemDirectoryRepository;
import com.datamate.datamanagement.interfaces.dto.KnowledgeDirectoryQuery;
import lombok.RequiredArgsConstructor;
import org.apache.commons.lang3.StringUtils;
import org.springframework.stereotype.Repository;
import java.util.List;
/**
 * Knowledge item directory repository implementation
*/
@Repository
@RequiredArgsConstructor
public class KnowledgeItemDirectoryRepositoryImpl
extends CrudRepository<KnowledgeItemDirectoryMapper, KnowledgeItemDirectory>
implements KnowledgeItemDirectoryRepository {
private static final String PATH_SEPARATOR = "/";
private final KnowledgeItemDirectoryMapper knowledgeItemDirectoryMapper;
@Override
public List<KnowledgeItemDirectory> findByCriteria(KnowledgeDirectoryQuery query) {
String relativePath = normalizeRelativePathPrefix(query.getRelativePath());
LambdaQueryWrapper<KnowledgeItemDirectory> wrapper = new LambdaQueryWrapper<KnowledgeItemDirectory>()
.eq(StringUtils.isNotBlank(query.getSetId()), KnowledgeItemDirectory::getSetId, query.getSetId())
.likeRight(StringUtils.isNotBlank(relativePath), KnowledgeItemDirectory::getRelativePath, relativePath);
if (StringUtils.isNotBlank(query.getKeyword())) {
wrapper.and(w -> w.like(KnowledgeItemDirectory::getName, query.getKeyword())
.or()
.like(KnowledgeItemDirectory::getRelativePath, query.getKeyword()));
}
wrapper.orderByAsc(KnowledgeItemDirectory::getRelativePath);
return knowledgeItemDirectoryMapper.selectList(wrapper);
}
@Override
public KnowledgeItemDirectory findBySetIdAndPath(String setId, String relativePath) {
return knowledgeItemDirectoryMapper.selectOne(new LambdaQueryWrapper<KnowledgeItemDirectory>()
.eq(KnowledgeItemDirectory::getSetId, setId)
.eq(KnowledgeItemDirectory::getRelativePath, relativePath));
}
@Override
public int removeByRelativePathPrefix(String setId, String relativePath) {
String normalized = normalizeRelativePathValue(relativePath);
if (StringUtils.isBlank(normalized)) {
return 0;
}
String prefix = normalizeRelativePathPrefix(normalized);
LambdaQueryWrapper<KnowledgeItemDirectory> wrapper = new LambdaQueryWrapper<KnowledgeItemDirectory>()
.eq(KnowledgeItemDirectory::getSetId, setId)
.and(w -> w.eq(KnowledgeItemDirectory::getRelativePath, normalized)
.or()
.likeRight(KnowledgeItemDirectory::getRelativePath, prefix));
return knowledgeItemDirectoryMapper.delete(wrapper);
}
private String normalizeRelativePathPrefix(String relativePath) {
if (StringUtils.isBlank(relativePath)) {
return "";
}
String normalized = relativePath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
if (StringUtils.isBlank(normalized)) {
return "";
}
if (!normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized + PATH_SEPARATOR;
}
return normalized;
}
private String normalizeRelativePathValue(String relativePath) {
if (StringUtils.isBlank(relativePath)) {
return "";
}
String normalized = relativePath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
while (normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(0, normalized.length() - 1);
}
return normalized;
}
}
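
To make the normalization rules above concrete, a few expected input/output pairs, derived by reading the two helpers (not from tests in this change):

// normalizeRelativePathPrefix("\\docs\\guides") -> "docs/guides/"  (prefix form, trailing slash)
// normalizeRelativePathValue("/docs/guides/")   -> "docs/guides"   (value form, no slashes at ends)
// removeByRelativePathPrefix(setId, "docs") deletes the "docs" entry itself
// plus every directory whose relativePath starts with "docs/".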

View File

@@ -21,21 +21,26 @@ import java.util.List;
@Repository
@RequiredArgsConstructor
public class KnowledgeItemRepositoryImpl extends CrudRepository<KnowledgeItemMapper, KnowledgeItem> implements KnowledgeItemRepository {
private static final String PATH_SEPARATOR = "/";
private final KnowledgeItemMapper knowledgeItemMapper;
@Override
public IPage<KnowledgeItem> findByCriteria(IPage<KnowledgeItem> page, KnowledgeItemPagingQuery query) {
String relativePath = normalizeRelativePathPrefix(query.getRelativePath());
LambdaQueryWrapper<KnowledgeItem> wrapper = new LambdaQueryWrapper<KnowledgeItem>()
.eq(StringUtils.isNotBlank(query.getSetId()), KnowledgeItem::getSetId, query.getSetId())
.eq(query.getContentType() != null, KnowledgeItem::getContentType, query.getContentType())
.eq(query.getSourceType() != null, KnowledgeItem::getSourceType, query.getSourceType())
.eq(StringUtils.isNotBlank(query.getSourceDatasetId()), KnowledgeItem::getSourceDatasetId, query.getSourceDatasetId())
.eq(StringUtils.isNotBlank(query.getSourceFileId()), KnowledgeItem::getSourceFileId, query.getSourceFileId())
.likeRight(StringUtils.isNotBlank(relativePath), KnowledgeItem::getRelativePath, relativePath);
if (StringUtils.isNotBlank(query.getKeyword())) {
wrapper.and(w -> w.like(KnowledgeItem::getSourceFileId, query.getKeyword())
.or()
.like(KnowledgeItem::getContent, query.getKeyword())
.or()
.like(KnowledgeItem::getRelativePath, query.getKeyword()));
}
wrapper.orderByDesc(KnowledgeItem::getCreatedAt);
@@ -77,4 +82,60 @@ public class KnowledgeItemRepositoryImpl extends CrudRepository<KnowledgeItemMap
public Long sumDatasetFileSize() {
return knowledgeItemMapper.sumDatasetFileSize();
}
@Override
public boolean existsBySetIdAndRelativePath(String setId, String relativePath) {
if (StringUtils.isBlank(setId) || StringUtils.isBlank(relativePath)) {
return false;
}
return knowledgeItemMapper.selectCount(new LambdaQueryWrapper<KnowledgeItem>()
.eq(KnowledgeItem::getSetId, setId)
.eq(KnowledgeItem::getRelativePath, relativePath)) > 0;
}
@Override
public int removeByRelativePathPrefix(String setId, String relativePath) {
String normalized = normalizeRelativePathValue(relativePath);
if (StringUtils.isBlank(setId) || StringUtils.isBlank(normalized)) {
return 0;
}
String prefix = normalizeRelativePathPrefix(normalized);
LambdaQueryWrapper<KnowledgeItem> wrapper = new LambdaQueryWrapper<KnowledgeItem>()
.eq(KnowledgeItem::getSetId, setId)
.and(w -> w.eq(KnowledgeItem::getRelativePath, normalized)
.or()
.likeRight(KnowledgeItem::getRelativePath, prefix));
return knowledgeItemMapper.delete(wrapper);
}
private String normalizeRelativePathPrefix(String relativePath) {
if (StringUtils.isBlank(relativePath)) {
return "";
}
String normalized = relativePath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
if (StringUtils.isBlank(normalized)) {
return "";
}
if (!normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized + PATH_SEPARATOR;
}
return normalized;
}
private String normalizeRelativePathValue(String relativePath) {
if (StringUtils.isBlank(relativePath)) {
return "";
}
String normalized = relativePath.replace("\\", PATH_SEPARATOR).trim();
while (normalized.startsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(1);
}
while (normalized.endsWith(PATH_SEPARATOR)) {
normalized = normalized.substring(0, normalized.length() - 1);
}
return normalized;
}
}

View File

@@ -1,9 +1,11 @@
package com.datamate.datamanagement.interfaces.converter;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItem;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItemDirectory;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeSet;
import com.datamate.datamanagement.interfaces.dto.CreateKnowledgeItemRequest;
import com.datamate.datamanagement.interfaces.dto.CreateKnowledgeSetRequest;
import com.datamate.datamanagement.interfaces.dto.KnowledgeDirectoryResponse;
import com.datamate.datamanagement.interfaces.dto.KnowledgeItemResponse;
import com.datamate.datamanagement.interfaces.dto.KnowledgeSetResponse;
import org.mapstruct.Mapper;
@@ -31,4 +33,8 @@ public interface KnowledgeConverter {
KnowledgeItemResponse convertToResponse(KnowledgeItem knowledgeItem);
List<KnowledgeItemResponse> convertItemResponses(List<KnowledgeItem> items);
KnowledgeDirectoryResponse convertToResponse(KnowledgeItemDirectory directory);
List<KnowledgeDirectoryResponse> convertDirectoryResponses(List<KnowledgeItemDirectory> directories);
}

View File

@@ -0,0 +1,20 @@
package com.datamate.datamanagement.interfaces.dto;
import jakarta.validation.constraints.NotBlank;
import lombok.Getter;
import lombok.Setter;
/**
 * Create knowledge item directory request
*/
@Getter
@Setter
public class CreateKnowledgeDirectoryRequest {
/** Parent prefix path, e.g. "docs/"; blank means the knowledge set root */
private String parentPrefix;
/** Name of the new directory */
@NotBlank
private String directoryName;
}

View File

@@ -34,4 +34,8 @@ public class CreateKnowledgeItemRequest {
 * Source file ID (used for scenarios such as annotation synchronization)
 */
private String sourceFileId;
/**
 * Extended metadata
 */
private String metadata;
}

View File

@@ -0,0 +1,16 @@
package com.datamate.datamanagement.interfaces.dto;
import com.datamate.datamanagement.common.enums.KnowledgeItemPreviewStatus;
import lombok.Getter;
import lombok.Setter;
/**
 * Dataset file preview status response
*/
@Getter
@Setter
public class DatasetFilePreviewStatusResponse {
private KnowledgeItemPreviewStatus status;
private String previewError;
private String updatedAt;
}

View File

@@ -0,0 +1,20 @@
package com.datamate.datamanagement.interfaces.dto;
import jakarta.validation.constraints.NotEmpty;
import lombok.Getter;
import lombok.Setter;
import java.util.List;
/**
 * Batch delete knowledge items request
*/
@Getter
@Setter
public class DeleteKnowledgeItemsRequest {
/**
 * Knowledge item ID list
*/
@NotEmpty(message = "知识条目ID不能为空")
private List<String> ids;
}

View File

@@ -0,0 +1,20 @@
package com.datamate.datamanagement.interfaces.dto;
import lombok.Getter;
import lombok.Setter;
/**
 * Knowledge item directory query parameters
*/
@Getter
@Setter
public class KnowledgeDirectoryQuery {
/** Owning knowledge set ID */
private String setId;
/** Directory relative path prefix */
private String relativePath;
/** Search keyword */
private String keyword;
}

View File

@@ -0,0 +1,20 @@
package com.datamate.datamanagement.interfaces.dto;
import lombok.Getter;
import lombok.Setter;
import java.time.LocalDateTime;
/**
 * Knowledge item directory response
*/
@Getter
@Setter
public class KnowledgeDirectoryResponse {
private String id;
private String setId;
private String name;
private String relativePath;
private LocalDateTime createdAt;
private LocalDateTime updatedAt;
}

View File

@@ -41,4 +41,8 @@ public class KnowledgeItemPagingQuery extends PagingQuery {
 * Source file ID
 */
private String sourceFileId;
/**
 * Relative path prefix
 */
private String relativePath;
}

View File

@@ -0,0 +1,16 @@
package com.datamate.datamanagement.interfaces.dto;
import com.datamate.datamanagement.common.enums.KnowledgeItemPreviewStatus;
import lombok.Getter;
import lombok.Setter;
/**
 * Knowledge item preview status response
*/
@Getter
@Setter
public class KnowledgeItemPreviewStatusResponse {
private KnowledgeItemPreviewStatus status;
private String previewError;
private String updatedAt;
}

View File

@@ -20,6 +20,14 @@ public class KnowledgeItemResponse {
private KnowledgeSourceType sourceType;
private String sourceDatasetId;
private String sourceFileId;
/**
 * Relative path (for directory display)
 */
private String relativePath;
/**
 * Extended metadata
 */
private String metadata;
private LocalDateTime createdAt;
private LocalDateTime updatedAt;
private String createdBy;

View File

@@ -23,6 +23,10 @@ public class KnowledgeItemSearchResponse {
private String sourceFileId;
private String fileName;
private Long fileSize;
/**
 * Relative path (for directory display)
 */
private String relativePath;
private LocalDateTime createdAt;
private LocalDateTime updatedAt;

View File

@@ -12,4 +12,5 @@ public class KnowledgeManagementStatisticsResponse {
private Long totalKnowledgeSets = 0L;
private Long totalFiles = 0L;
private Long totalSize = 0L;
private Long totalTags = 0L;
}

View File

@@ -1,8 +1,10 @@
package com.datamate.datamanagement.interfaces.dto;
import com.datamate.datamanagement.common.enums.DatasetStatusType;
import com.fasterxml.jackson.annotation.JsonIgnore;
import jakarta.validation.constraints.NotBlank;
import jakarta.validation.constraints.Size;
import lombok.AccessLevel;
import lombok.Getter;
import lombok.Setter;
@@ -24,9 +26,18 @@ public class UpdateDatasetRequest {
/** Data collection task ID */
private String dataSource;
/** Parent dataset ID */
@Setter(AccessLevel.NONE)
private String parentDatasetId;
@JsonIgnore
@Setter(AccessLevel.NONE)
private boolean parentDatasetIdProvided;
/** Tag list */
private List<String> tags;
/** Dataset status */
private DatasetStatusType status;
public void setParentDatasetId(String parentDatasetId) {
this.parentDatasetIdProvided = true;
this.parentDatasetId = parentDatasetId;
}
}

View File

@@ -18,4 +18,8 @@ public class UpdateKnowledgeItemRequest {
 * Content type
 */
private KnowledgeContentType contentType;
/**
 * Extended metadata
 */
private String metadata;
}

View File

@@ -17,4 +17,8 @@ public class UploadKnowledgeItemsRequest {
 */
@NotEmpty(message = "文件列表不能为空")
private List<MultipartFile> files;
/**
 * Directory prefix (used for directory uploads)
 */
private String parentPrefix;
}

View File

@@ -6,11 +6,13 @@ import com.datamate.common.infrastructure.exception.SystemErrorCode;
import com.datamate.common.interfaces.PagedResponse;
import com.datamate.common.interfaces.PagingQuery;
import com.datamate.datamanagement.application.DatasetFileApplicationService;
import com.datamate.datamanagement.application.DatasetFilePreviewService;
import com.datamate.datamanagement.domain.model.dataset.DatasetFile;
import com.datamate.datamanagement.interfaces.converter.DatasetConverter;
import com.datamate.datamanagement.interfaces.dto.AddFilesRequest;
import com.datamate.datamanagement.interfaces.dto.CopyFilesRequest;
import com.datamate.datamanagement.interfaces.dto.CreateDirectoryRequest;
import com.datamate.datamanagement.interfaces.dto.DatasetFilePreviewStatusResponse;
import com.datamate.datamanagement.interfaces.dto.DatasetFileResponse;
import com.datamate.datamanagement.interfaces.dto.UploadFileRequest;
import com.datamate.datamanagement.interfaces.dto.UploadFilesPreRequest;
@@ -19,6 +21,7 @@ import jakarta.validation.Valid;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.Resource;
import org.springframework.core.io.UrlResource;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpStatus;
import org.springframework.http.MediaType;
@@ -37,10 +40,13 @@ import java.util.List;
public class DatasetFileController {
private final DatasetFileApplicationService datasetFileApplicationService;
private final DatasetFilePreviewService datasetFilePreviewService;
@Autowired
public DatasetFileController(DatasetFileApplicationService datasetFileApplicationService,
DatasetFilePreviewService datasetFilePreviewService) {
this.datasetFileApplicationService = datasetFileApplicationService;
this.datasetFilePreviewService = datasetFilePreviewService;
}
@GetMapping
@@ -120,6 +126,19 @@ public class DatasetFileController {
@PathVariable("fileId") String fileId) { @PathVariable("fileId") String fileId) {
try { try {
DatasetFile datasetFile = datasetFileApplicationService.getDatasetFile(datasetId, fileId); DatasetFile datasetFile = datasetFileApplicationService.getDatasetFile(datasetId, fileId);
if (datasetFilePreviewService.isOfficeDocument(datasetFile.getFileName())) {
DatasetFilePreviewService.PreviewFile previewFile = datasetFilePreviewService
.resolveReadyPreviewFile(datasetId, datasetFile);
if (previewFile == null) {
return ResponseEntity.status(HttpStatus.CONFLICT).build();
}
Resource previewResource = new UrlResource(previewFile.filePath().toUri());
return ResponseEntity.ok()
.contentType(MediaType.APPLICATION_PDF)
.header(HttpHeaders.CONTENT_DISPOSITION,
"inline; filename=\"" + previewFile.fileName() + "\"")
.body(previewResource);
}
Resource resource = datasetFileApplicationService.downloadFile(datasetId, fileId);
MediaType mediaType = MediaTypeFactory.getMediaType(resource)
.orElse(MediaType.APPLICATION_OCTET_STREAM);
@@ -136,6 +155,18 @@ public class DatasetFileController {
}
}
@GetMapping("/{fileId}/preview/status")
public DatasetFilePreviewStatusResponse getDatasetFilePreviewStatus(@PathVariable("datasetId") String datasetId,
@PathVariable("fileId") String fileId) {
return datasetFilePreviewService.getPreviewStatus(datasetId, fileId);
}
@PostMapping("/{fileId}/preview/convert")
public DatasetFilePreviewStatusResponse convertDatasetFilePreview(@PathVariable("datasetId") String datasetId,
@PathVariable("fileId") String fileId) {
return datasetFilePreviewService.ensurePreview(datasetId, fileId);
}
@IgnoreResponseWrap
@GetMapping(value = "/download", produces = MediaType.APPLICATION_OCTET_STREAM_VALUE)
public void downloadDatasetFileAsZip(@PathVariable("datasetId") String datasetId, HttpServletResponse response) {

View File

@@ -0,0 +1,33 @@
package com.datamate.datamanagement.interfaces.rest;
import com.datamate.datamanagement.application.DatasetFileApplicationService;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
/**
 * Dataset upload controller
*/
@Slf4j
@RestController
@RequiredArgsConstructor
@RequestMapping("/data-management/datasets/upload")
public class DatasetUploadController {
private final DatasetFileApplicationService datasetFileApplicationService;
/**
 * Cancel an upload
 *
 * @param reqId pre-upload request ID
*/
@PutMapping("/cancel-upload/{reqId}")
public ResponseEntity<Void> cancelUpload(@PathVariable("reqId") String reqId) {
datasetFileApplicationService.cancelUpload(reqId);
return ResponseEntity.ok().build();
}
}
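
A hypothetical client call against this endpoint; the base URL and reqId are placeholders, and error handling is omitted:

// java.net.http sketch; send() throws IOException/InterruptedException.
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:8080/data-management/datasets/upload/cancel-upload/" + reqId))
        .PUT(HttpRequest.BodyPublishers.noBody())
        .build();
client.send(request, HttpResponse.BodyHandlers.discarding());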

View File

@@ -0,0 +1,43 @@
package com.datamate.datamanagement.interfaces.rest;
import com.datamate.datamanagement.application.KnowledgeDirectoryApplicationService;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItemDirectory;
import com.datamate.datamanagement.interfaces.converter.KnowledgeConverter;
import com.datamate.datamanagement.interfaces.dto.CreateKnowledgeDirectoryRequest;
import com.datamate.datamanagement.interfaces.dto.KnowledgeDirectoryQuery;
import com.datamate.datamanagement.interfaces.dto.KnowledgeDirectoryResponse;
import jakarta.validation.Valid;
import lombok.RequiredArgsConstructor;
import org.springframework.web.bind.annotation.*;
import java.util.List;
/**
 * Knowledge item directory REST controller
*/
@RestController
@RequiredArgsConstructor
@RequestMapping("/data-management/knowledge-sets/{setId}/directories")
public class KnowledgeDirectoryController {
private final KnowledgeDirectoryApplicationService knowledgeDirectoryApplicationService;
@GetMapping
public List<KnowledgeDirectoryResponse> getKnowledgeDirectories(@PathVariable("setId") String setId,
KnowledgeDirectoryQuery query) {
List<KnowledgeItemDirectory> directories = knowledgeDirectoryApplicationService.getKnowledgeDirectories(setId, query);
return KnowledgeConverter.INSTANCE.convertDirectoryResponses(directories);
}
@PostMapping
public KnowledgeDirectoryResponse createKnowledgeDirectory(@PathVariable("setId") String setId,
@RequestBody @Valid CreateKnowledgeDirectoryRequest request) {
KnowledgeItemDirectory directory = knowledgeDirectoryApplicationService.createKnowledgeDirectory(setId, request);
return KnowledgeConverter.INSTANCE.convertToResponse(directory);
}
@DeleteMapping
public void deleteKnowledgeDirectory(@PathVariable("setId") String setId,
@RequestParam("relativePath") String relativePath) {
knowledgeDirectoryApplicationService.deleteKnowledgeDirectory(setId, relativePath);
}
}

View File

@@ -3,11 +3,14 @@ package com.datamate.datamanagement.interfaces.rest;
import com.datamate.common.infrastructure.common.IgnoreResponseWrap;
import com.datamate.common.interfaces.PagedResponse;
import com.datamate.datamanagement.application.KnowledgeItemApplicationService;
import com.datamate.datamanagement.application.KnowledgeItemPreviewService;
import com.datamate.datamanagement.domain.model.knowledge.KnowledgeItem;
import com.datamate.datamanagement.interfaces.converter.KnowledgeConverter;
import com.datamate.datamanagement.interfaces.dto.CreateKnowledgeItemRequest;
import com.datamate.datamanagement.interfaces.dto.DeleteKnowledgeItemsRequest;
import com.datamate.datamanagement.interfaces.dto.ImportKnowledgeItemsRequest;
import com.datamate.datamanagement.interfaces.dto.KnowledgeItemPagingQuery;
import com.datamate.datamanagement.interfaces.dto.KnowledgeItemPreviewStatusResponse;
import com.datamate.datamanagement.interfaces.dto.KnowledgeItemResponse;
import com.datamate.datamanagement.interfaces.dto.ReplaceKnowledgeItemFileRequest;
import com.datamate.datamanagement.interfaces.dto.UpdateKnowledgeItemRequest;
@@ -30,6 +33,7 @@ import java.util.List;
@RequestMapping("/data-management/knowledge-sets/{setId}/items") @RequestMapping("/data-management/knowledge-sets/{setId}/items")
public class KnowledgeItemController { public class KnowledgeItemController {
private final KnowledgeItemApplicationService knowledgeItemApplicationService; private final KnowledgeItemApplicationService knowledgeItemApplicationService;
private final KnowledgeItemPreviewService knowledgeItemPreviewService;
@GetMapping
public PagedResponse<KnowledgeItemResponse> getKnowledgeItems(@PathVariable("setId") String setId,
@@ -80,6 +84,18 @@ public class KnowledgeItemController {
knowledgeItemApplicationService.previewKnowledgeItemFile(setId, itemId, response);
}
@GetMapping("/{itemId}/preview/status")
public KnowledgeItemPreviewStatusResponse getKnowledgeItemPreviewStatus(@PathVariable("setId") String setId,
@PathVariable("itemId") String itemId) {
return knowledgeItemPreviewService.getPreviewStatus(setId, itemId);
}
@PostMapping("/{itemId}/preview/convert")
public KnowledgeItemPreviewStatusResponse convertKnowledgeItemPreview(@PathVariable("setId") String setId,
@PathVariable("itemId") String itemId) {
return knowledgeItemPreviewService.ensurePreview(setId, itemId);
}
@GetMapping("/{itemId}") @GetMapping("/{itemId}")
public KnowledgeItemResponse getKnowledgeItemById(@PathVariable("setId") String setId, public KnowledgeItemResponse getKnowledgeItemById(@PathVariable("setId") String setId,
@PathVariable("itemId") String itemId) { @PathVariable("itemId") String itemId) {
@@ -108,4 +124,10 @@ public class KnowledgeItemController {
@PathVariable("itemId") String itemId) { @PathVariable("itemId") String itemId) {
knowledgeItemApplicationService.deleteKnowledgeItem(setId, itemId); knowledgeItemApplicationService.deleteKnowledgeItem(setId, itemId);
} }
@PostMapping("/batch-delete")
public void deleteKnowledgeItems(@PathVariable("setId") String setId,
@RequestBody @Valid DeleteKnowledgeItemsRequest request) {
knowledgeItemApplicationService.deleteKnowledgeItems(setId, request);
}
}

View File

@@ -42,6 +42,13 @@
SELECT COUNT(*) FROM t_dm_dataset_files WHERE dataset_id = #{datasetId}
</select>
<select id="countNonDerivedByDatasetId" parameterType="string" resultType="long">
SELECT COUNT(*)
FROM t_dm_dataset_files
WHERE dataset_id = #{datasetId}
AND (metadata IS NULL OR JSON_EXTRACT(metadata, '$.derived_from_file_id') IS NULL)
</select>
<select id="countCompletedByDatasetId" parameterType="string" resultType="long"> <select id="countCompletedByDatasetId" parameterType="string" resultType="long">
SELECT COUNT(*) FROM t_dm_dataset_files WHERE dataset_id = #{datasetId} AND status = 'COMPLETED' SELECT COUNT(*) FROM t_dm_dataset_files WHERE dataset_id = #{datasetId} AND status = 'COMPLETED'
</select> </select>
@@ -110,4 +117,16 @@
AND metadata IS NOT NULL
AND JSON_EXTRACT(metadata, '$.derived_from_file_id') IS NOT NULL
</select>
<select id="countNonDerivedByDatasetIds" resultType="com.datamate.datamanagement.infrastructure.persistence.repository.dto.DatasetFileCount">
SELECT dataset_id AS datasetId,
COUNT(*) AS fileCount
FROM t_dm_dataset_files
WHERE dataset_id IN
<foreach collection="datasetIds" item="datasetId" open="(" separator="," close=")">
#{datasetId}
</foreach>
AND (metadata IS NULL OR JSON_EXTRACT(metadata, '$.derived_from_file_id') IS NULL)
GROUP BY dataset_id
</select>
</mapper>
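
One plausible way to consume the batch query from the repository layer; the surrounding service code is assumed, not part of this change:

// Hypothetical aggregation into a per-dataset lookup map (uses java.util.stream.Collectors).
Map<String, Long> countsByDataset = datasetFileRepository
        .countNonDerivedByDatasetIds(datasetIds)
        .stream()
        .collect(Collectors.toMap(DatasetFileCount::getDatasetId, DatasetFileCount::getFileCount));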

View File

@@ -145,9 +145,10 @@
<select id="getAllDatasetStatistics" resultType="com.datamate.datamanagement.interfaces.dto.AllDatasetStatisticsResponse"> <select id="getAllDatasetStatistics" resultType="com.datamate.datamanagement.interfaces.dto.AllDatasetStatisticsResponse">
SELECT SELECT
COUNT(*) AS total_datasets, (SELECT COUNT(*) FROM t_dm_datasets) AS total_datasets,
SUM(size_bytes) AS total_size, (SELECT COALESCE(SUM(size_bytes), 0) FROM t_dm_datasets) AS total_size,
SUM(file_count) AS total_files (SELECT COUNT(*)
FROM t_dm_datasets; FROM t_dm_dataset_files
WHERE metadata IS NULL OR JSON_EXTRACT(metadata, '$.derived_from_file_id') IS NULL) AS total_files
</select> </select>
</mapper> </mapper>

View File

@@ -53,6 +53,19 @@
ORDER BY usage_count DESC, name ASC
</select>
<select id="countKnowledgeSetTags" resultType="long">
SELECT COUNT(DISTINCT t.id)
FROM t_dm_tags t
WHERE EXISTS (
SELECT 1
FROM t_dm_knowledge_sets ks
WHERE ks.tags IS NOT NULL
AND JSON_VALID(ks.tags) = 1
AND JSON_LENGTH(ks.tags) > 0
AND JSON_SEARCH(ks.tags, 'one', t.name, NULL, '$[*].name') IS NOT NULL
)
</select>
<insert id="insert" parameterType="com.datamate.datamanagement.domain.model.dataset.Tag"> <insert id="insert" parameterType="com.datamate.datamanagement.domain.model.dataset.Tag">
INSERT INTO t_dm_tags (id, name, description, category, color, usage_count) INSERT INTO t_dm_tags (id, name, description, category, color, usage_count)
VALUES (#{id}, #{name}, #{description}, #{category}, #{color}, #{usageCount}) VALUES (#{id}, #{name}, #{description}, #{category}, #{color}, #{usageCount})

View File

@@ -17,6 +17,7 @@ public class SecurityConfig {
@Bean
public SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
http.csrf(csrf -> csrf.disable())
.headers(headers -> headers.frameOptions(frameOptions -> frameOptions.disable()))
.authorizeHttpRequests(authz -> authz
.anyRequest().permitAll() // allow all requests without authentication
);

View File

@@ -21,7 +21,7 @@ import java.util.UUID;
 */
@Component
public class FileService {
private static final int DEFAULT_TIMEOUT = 1800;
private final ChunkUploadRequestMapper chunkUploadRequestMapper;
@@ -74,6 +74,26 @@ public class FileService {
.build();
}
/**
 * Cancel an upload
*/
@Transactional
public void cancelUpload(String reqId) {
if (reqId == null || reqId.isBlank()) {
throw BusinessException.of(CommonErrorCode.PARAM_ERROR);
}
ChunkUploadPreRequest preRequest = chunkUploadRequestMapper.findById(reqId);
if (preRequest == null) {
return;
}
String uploadPath = preRequest.getUploadPath();
if (uploadPath != null && !uploadPath.isBlank()) {
File tempDir = new File(uploadPath, String.format(ChunksSaver.TEMP_DIR_NAME_FORMAT, preRequest.getId()));
ChunksSaver.deleteFolder(tempDir.getPath());
}
chunkUploadRequestMapper.deleteById(reqId);
}
private File uploadFile(ChunkUploadRequest fileUploadRequest, ChunkUploadPreRequest preRequest) {
File savedFile = ChunksSaver.saveFile(fileUploadRequest, preRequest);
preRequest.setTimeout(LocalDateTime.now().plusSeconds(DEFAULT_TIMEOUT));

View File

@@ -5,7 +5,7 @@ server {
access_log /var/log/datamate/frontend/access.log main;
error_log /var/log/datamate/frontend/error.log notice;
client_max_body_size 0;
add_header Set-Cookie "NEXT_LOCALE=zh";

View File

@@ -11,6 +11,7 @@ services:
- log_volume:/var/log/datamate
- operator-upload-volume:/operators/upload
- operator-runtime-volume:/operators/extract
- uploads_volume:/uploads
networks: [ datamate ]
depends_on:
- datamate-database
@@ -154,6 +155,8 @@ services:
profiles: [ data-juicer ]
volumes:
uploads_volume:
name: datamate-uploads-volume
dataset_volume:
name: datamate-dataset-volume
flow_volume:

View File

@@ -169,6 +169,33 @@
}
}
function isAnnotationObject(value) {
if (!value || typeof value !== "object") return false;
return typeof value.serializeAnnotation === "function" || typeof value.serialize === "function";
}
function resolveSelectedAnnotation(store) {
if (!store) return null;
const annotations = Array.isArray(store.annotations) ? store.annotations : [];
if (isAnnotationObject(store.selectedAnnotation)) {
return store.selectedAnnotation;
}
if (isAnnotationObject(store.selected)) {
return store.selected;
}
const selectedId = store.selected;
if (selectedId !== undefined && selectedId !== null && annotations.length) {
const matched = annotations.find((ann) => ann && String(ann.id) === String(selectedId));
if (isAnnotationObject(matched)) {
return matched;
}
}
if (annotations.length && isAnnotationObject(annotations[0])) {
return annotations[0];
}
return null;
}
function exportSelectedAnnotation() {
if (!lsInstance) {
throw new Error("LabelStudio 未初始化");
@@ -179,10 +206,10 @@
throw new Error("无法访问 annotationStore"); throw new Error("无法访问 annotationStore");
} }
const selected = const selected = resolveSelectedAnnotation(store);
store.selected || if (!selected) {
store.selectedAnnotation || throw new Error("未找到可导出的标注对象");
(Array.isArray(store.annotations) && store.annotations.length ? store.annotations[0] : null); }
let serialized = null; let serialized = null;
if (selected && typeof selected.serializeAnnotation === "function") { if (selected && typeof selected.serializeAnnotation === "function") {
@@ -197,6 +224,10 @@
? { id: selected?.id || serialized.id || "draft", ...serialized }
: { id: selected?.id || "draft", result: (selected && selected.result) || [] };
if (!Array.isArray(annotationPayload.result) && Array.isArray(annotationPayload.results)) {
annotationPayload.result = annotationPayload.results;
}
// 最小化对齐 Label Studio Server 的字段(DataMate 侧会原样存储)
const taskId = typeof currentTask?.id === "number" ? currentTask.id : Number(currentTask?.id) || null;
const fileId = currentTask?.data?.file_id || currentTask?.data?.fileId || null;
@@ -237,6 +268,17 @@
return true;
}
function isSaveShortcut(event) {
if (!event || event.defaultPrevented || event.isComposing) return false;
const key = event.key;
const code = event.code;
const isS = key === "s" || key === "S" || code === "KeyS";
if (!isS) return false;
if (!(event.ctrlKey || event.metaKey)) return false;
if (event.shiftKey || event.altKey) return false;
return true;
}
function handleSaveAndNextShortcut(event) {
if (!isSaveAndNextShortcut(event) || event.repeat) return;
event.preventDefault();
@@ -249,6 +291,18 @@
}
}
function handleSaveShortcut(event) {
if (!isSaveShortcut(event) || event.repeat) return;
event.preventDefault();
event.stopPropagation();
try {
const raw = exportSelectedAnnotation();
postToParent("LS_EXPORT_RESULT", raw);
} catch (e) {
postToParent("LS_ERROR", { message: e?.message || String(e) });
}
}
function initLabelStudio(payload) {
if (!window.LabelStudio) {
throw new Error("LabelStudio 未加载(请检查静态资源/网络)");
@@ -320,6 +374,7 @@
}
window.addEventListener("keydown", handleSaveAndNextShortcut);
window.addEventListener("keydown", handleSaveShortcut);
window.addEventListener("message", (event) => {
if (event.origin !== ORIGIN) return;
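Both shortcut handlers funnel into the same export path: the iframe serializes the current annotation and posts it to the parent window. A sketch of the parent side of that message protocol, assuming plain { type, payload } envelopes and a same-origin iframe (the real editor routes this through its own postToIframe helper and ORIGIN checks):

// Parent-side sketch; the selector and origin handling are illustrative.
const lsfFrame = document.querySelector("iframe");
lsfFrame?.contentWindow?.postMessage({ type: "LS_EXPORT", payload: {} }, window.location.origin);

window.addEventListener("message", (event) => {
  if (event.origin !== window.location.origin) return; // mirror the iframe's origin guard
  const { type, payload } = event.data || {};
  if (type === "LS_EXPORT_RESULT") console.log("exported annotation:", payload);
  if (type === "LS_ERROR") console.error("LSF error:", payload?.message);
});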

View File

@@ -1,17 +1,17 @@
import { Button, Input, Popover, theme, Tag, Empty } from "antd";
import { PlusOutlined } from "@ant-design/icons";
-import { useEffect, useMemo, useState } from "react";
+import { useCallback, useEffect, useMemo, useState } from "react";
interface Tag {
-id: number;
+id?: string | number;
name: string;
-color: string;
+color?: string;
}
interface AddTagPopoverProps {
tags: Tag[];
onFetchTags?: () => Promise<Tag[]>;
-onAddTag?: (tag: Tag) => void;
+onAddTag?: (tagName: string) => void;
onCreateAndTag?: (tagName: string) => void;
}
@@ -27,20 +27,23 @@ export default function AddTagPopover({
const [newTag, setNewTag] = useState("");
const [allTags, setAllTags] = useState<Tag[]>([]);
-const tagsSet = useMemo(() => new Set(tags.map((tag) => tag.id)), [tags]);
+const tagsSet = useMemo(
+() => new Set(tags.map((tag) => (tag.id ?? tag.name))),
+[tags]
+);
-const fetchTags = async () => {
+const fetchTags = useCallback(async () => {
if (onFetchTags && showPopover) {
const data = await onFetchTags?.();
setAllTags(data || []);
}
-};
+}, [onFetchTags, showPopover]);
useEffect(() => {
fetchTags();
-}, [showPopover]);
+}, [fetchTags]);
const availableTags = useMemo(() => {
-return allTags.filter((tag) => !tagsSet.has(tag.id));
+return allTags.filter((tag) => !tagsSet.has(tag.id ?? tag.name));
}, [allTags, tagsSet]);
const handleCreateAndAddTag = () => {
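The useCallback change here fixes a stale-closure hazard: the effect now re-runs exactly when the memoized loader's inputs change, instead of keying off showPopover alone. The same pattern in isolation, with hypothetical names (useLoader is not part of the codebase):

// Minimal sketch of the memoized-loader pattern.
import { useCallback, useEffect, useState } from "react";

function useLoader<T>(load: (() => Promise<T[]>) | undefined, active: boolean) {
  const [items, setItems] = useState<T[]>([]);
  const fetchItems = useCallback(async () => {
    if (load && active) setItems((await load()) || []);
  }, [load, active]);
  useEffect(() => {
    fetchItems(); // depends on the memoized function, not on raw flags
  }, [fetchItems]);
  return items;
}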

View File

@@ -24,21 +24,28 @@ interface OperationItem {
interface TagConfig {
showAdd: boolean;
-tags: { id: number; name: string; color: string }[];
+tags: { id?: string | number; name: string; color?: string }[];
-onFetchTags?: () => Promise<{
-data: { id: number; name: string; color: string }[];
-}>;
-onAddTag?: (tag: { id: number; name: string; color: string }) => void;
+onFetchTags?: () => Promise<{ id?: string | number; name: string; color?: string }[]>;
+onAddTag?: (tagName: string) => void;
onCreateAndTag?: (tagName: string) => void;
}
-interface DetailHeaderProps<T> {
+interface DetailHeaderData {
+name?: string;
+description?: string;
+status?: { color?: string; icon?: React.ReactNode; label?: string };
+tags?: { id?: string | number; name?: string }[];
+icon?: React.ReactNode;
+iconColor?: string;
+}
+interface DetailHeaderProps<T extends DetailHeaderData> {
data: T;
statistics: StatisticItem[];
operations: OperationItem[];
tagConfig?: TagConfig;
}
-function DetailHeader<T>({
+function DetailHeader<T extends DetailHeaderData>({
data = {} as T,
statistics,
operations,
@@ -50,13 +57,13 @@ function DetailHeader<T>({
<div className="flex items-start gap-4 flex-1"> <div className="flex items-start gap-4 flex-1">
<div <div
className={`w-16 h-16 text-white rounded-lg flex-center shadow-lg ${ className={`w-16 h-16 text-white rounded-lg flex-center shadow-lg ${
(data as any)?.iconColor data?.iconColor
? "" ? ""
: "bg-gradient-to-br from-sky-300 to-blue-500 text-white" : "bg-gradient-to-br from-sky-300 to-blue-500 text-white"
}`} }`}
style={(data as any)?.iconColor ? { backgroundColor: (data as any).iconColor } : undefined} style={data?.iconColor ? { backgroundColor: data.iconColor } : undefined}
> >
{<div className="w-[2.8rem] h-[2.8rem] text-gray-50">{(data as any)?.icon}</div> || ( {<div className="w-[2.8rem] h-[2.8rem] text-gray-50">{data?.icon}</div> || (
<Database className="w-8 h-8 text-white" /> <Database className="w-8 h-8 text-white" />
)} )}
</div> </div>

View File

@@ -0,0 +1,21 @@
import React from 'react';
import { Navigate, useLocation, Outlet } from 'react-router';
import { useAppSelector } from '@/store/hooks';
interface ProtectedRouteProps {
children?: React.ReactNode;
}
const ProtectedRoute: React.FC<ProtectedRouteProps> = ({ children }) => {
const { isAuthenticated } = useAppSelector((state) => state.auth);
const location = useLocation();
if (!isAuthenticated) {
// Redirect to the login page, but save the current location they were trying to go to
return <Navigate to="/login" state={{ from: location }} replace />;
}
return children ? <>{children}</> : <Outlet />;
};
export default ProtectedRoute;
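A usage sketch for the new guard, assuming a react-router v7-style route tree; the placeholder pages and paths are illustrative, and only ProtectedRoute itself comes from the file above:

import React from 'react';
import { createBrowserRouter } from 'react-router';
import ProtectedRoute from '@/components/ProtectedRoute';

// Placeholder pages, for illustration only.
const LoginPage = () => <div>login</div>;
const DataManagementPage = () => <div>data</div>;

const router = createBrowserRouter([
  { path: '/login', element: <LoginPage /> },
  {
    // With no children prop, ProtectedRoute renders an <Outlet /> for the routes below
    element: <ProtectedRoute />,
    children: [{ path: '/data', element: <DataManagementPage /> }],
  },
]);
export default router;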

View File

@@ -1,5 +1,5 @@
import { TaskItem } from "@/pages/DataManagement/dataset.model";
-import { calculateSHA256, checkIsFilesExist } from "@/utils/file.util";
+import { calculateSHA256, checkIsFilesExist, streamSplitAndUpload, StreamUploadResult } from "@/utils/file.util";
import { App } from "antd";
import { useRef, useState } from "react";
@@ -9,17 +9,18 @@ export function useFileSliceUpload(
uploadChunk,
cancelUpload,
}: {
-preUpload: (id: string, params: any) => Promise<{ data: number }>;
-uploadChunk: (id: string, formData: FormData, config: any) => Promise<any>;
-cancelUpload: ((reqId: number) => Promise<any>) | null;
+preUpload: (id: string, params: Record<string, unknown>) => Promise<{ data: number }>;
+uploadChunk: (id: string, formData: FormData, config: Record<string, unknown>) => Promise<unknown>;
+cancelUpload: ((reqId: number) => Promise<unknown>) | null;
},
-showTaskCenter = true // 上传时是否显示任务中心
+showTaskCenter = true, // 上传时是否显示任务中心
+enableStreamUpload = true // 是否启用流式分割上传
) {
const { message } = App.useApp();
const [taskList, setTaskList] = useState<TaskItem[]>([]);
const taskListRef = useRef<TaskItem[]>([]); // 用于固定任务顺序
-const createTask = (detail: any = {}) => {
+const createTask = (detail: Record<string, unknown> = {}) => {
const { dataset } = detail;
const title = `上传数据集: ${dataset.name} `;
const controller = new AbortController();
@@ -37,6 +38,14 @@ export function useFileSliceUpload(
taskListRef.current = [task, ...taskListRef.current];
setTaskList(taskListRef.current);
// 立即显示任务中心,让用户感知上传已开始
if (showTaskCenter) {
window.dispatchEvent(
new CustomEvent("show:task-popover", { detail: { show: true } })
);
}
return task;
};
@@ -60,7 +69,7 @@ export function useFileSliceUpload(
// 携带前缀信息,便于刷新后仍停留在当前目录
window.dispatchEvent(
new CustomEvent(task.updateEvent, {
-detail: { prefix: (task as any).prefix },
+detail: { prefix: task.prefix },
})
);
}
@@ -71,7 +80,7 @@ export function useFileSliceUpload(
}
};
-async function buildFormData({ file, reqId, i, j }) {
+async function buildFormData({ file, reqId, i, j }: { file: { slices: Blob[]; name: string; size: number }; reqId: number; i: number; j: number }) {
const formData = new FormData();
const { slices, name, size } = file;
const checkSum = await calculateSHA256(slices[j]);
@@ -86,12 +95,18 @@ export function useFileSliceUpload(
return formData;
}
-async function uploadSlice(task: TaskItem, fileInfo) {
+async function uploadSlice(task: TaskItem, fileInfo: { loaded: number; i: number; j: number; files: { slices: Blob[]; name: string; size: number }[]; totalSize: number }) {
if (!task) {
return;
}
-const { reqId, key } = task;
+const { reqId, key, controller } = task;
const { loaded, i, j, files, totalSize } = fileInfo;
// 检查是否已取消
if (controller.signal.aborted) {
throw new Error("Upload cancelled");
}
const formData = await buildFormData({
file: files[i],
i,
@@ -101,6 +116,7 @@ export function useFileSliceUpload(
let newTask = { ...task };
await uploadChunk(key, formData, {
signal: controller.signal,
onUploadProgress: (e) => {
const loadedSize = loaded + e.loaded;
const curPercent = Number((loadedSize / totalSize) * 100).toFixed(2);
@@ -116,7 +132,7 @@ export function useFileSliceUpload(
});
}
-async function uploadFile({ task, files, totalSize }) {
+async function uploadFile({ task, files, totalSize }: { task: TaskItem; files: { slices: Blob[]; name: string; size: number; originFile: Blob }[]; totalSize: number }) {
console.log('[useSliceUpload] Calling preUpload with prefix:', task.prefix);
const { data: reqId } = await preUpload(task.key, {
totalFileNum: files.length,
@@ -132,24 +148,29 @@ export function useFileSliceUpload(
reqId,
isCancel: false,
cancelFn: () => {
-task.controller.abort();
+// 使用 newTask 的 controller 确保一致性
+newTask.controller.abort();
cancelUpload?.(reqId);
-if (task.updateEvent) window.dispatchEvent(new Event(task.updateEvent));
+if (newTask.updateEvent) window.dispatchEvent(new Event(newTask.updateEvent));
},
};
updateTaskList(newTask);
-if (showTaskCenter) {
-window.dispatchEvent(
-new CustomEvent("show:task-popover", { detail: { show: true } })
-);
-}
+// 注意:show:task-popover 事件已在 createTask 中触发,此处不再重复触发
// // 更新数据状态
if (task.updateEvent) window.dispatchEvent(new Event(task.updateEvent));
let loaded = 0;
for (let i = 0; i < files.length; i++) {
// 检查是否已取消
if (newTask.controller.signal.aborted) {
throw new Error("Upload cancelled");
}
const { slices } = files[i];
for (let j = 0; j < slices.length; j++) {
// 检查是否已取消
if (newTask.controller.signal.aborted) {
throw new Error("Upload cancelled");
}
await uploadSlice(newTask, {
loaded,
i,
@@ -163,7 +184,7 @@ export function useFileSliceUpload(
removeTask(newTask);
}
-const handleUpload = async ({ task, files }) => {
+const handleUpload = async ({ task, files }: { task: TaskItem; files: { slices: Blob[]; name: string; size: number; originFile: Blob }[] }) => {
const isErrorFile = await checkIsFilesExist(files);
if (isErrorFile) {
message.error("文件被修改或删除,请重新选择文件上传");
@@ -189,10 +210,174 @@ export function useFileSliceUpload(
}
};
/**
* 流式分割上传处理
* 用于大文件按行分割并立即上传的场景
*/
const handleStreamUpload = async ({ task, files }: { task: TaskItem; files: File[] }) => {
try {
console.log('[useSliceUpload] Starting stream upload for', files.length, 'files');
const totalSize = files.reduce((acc, file) => acc + file.size, 0);
// 存储所有文件的 reqId,用于取消上传
const reqIds: number[] = [];
const newTask: TaskItem = {
...task,
reqId: -1,
isCancel: false,
cancelFn: () => {
// 使用 newTask 的 controller 确保一致性
newTask.controller.abort();
// 取消所有文件的预上传请求
reqIds.forEach(id => cancelUpload?.(id));
if (newTask.updateEvent) window.dispatchEvent(new Event(newTask.updateEvent));
},
};
updateTaskList(newTask);
let totalUploadedLines = 0;
let totalProcessedBytes = 0;
const results: StreamUploadResult[] = [];
// 逐个处理文件,每个文件单独调用 preUpload
for (let i = 0; i < files.length; i++) {
// 检查是否已取消
if (newTask.controller.signal.aborted) {
throw new Error("Upload cancelled");
}
const file = files[i];
console.log(`[useSliceUpload] Processing file ${i + 1}/${files.length}: ${file.name}`);
const result = await streamSplitAndUpload(
file,
(formData, config) => uploadChunk(task.key, formData, {
...config,
signal: newTask.controller.signal,
}),
(currentBytes, totalBytes, uploadedLines) => {
// 检查是否已取消
if (newTask.controller.signal.aborted) {
return;
}
// 更新进度
const overallBytes = totalProcessedBytes + currentBytes;
const curPercent = Number((overallBytes / totalSize) * 100).toFixed(2);
const updatedTask: TaskItem = {
...newTask,
...taskListRef.current.find((item) => item.key === task.key),
size: overallBytes,
percent: curPercent >= 100 ? 99.99 : curPercent,
streamUploadInfo: {
currentFile: file.name,
fileIndex: i + 1,
totalFiles: files.length,
uploadedLines: totalUploadedLines + uploadedLines,
},
};
updateTaskList(updatedTask);
},
1024 * 1024, // 1MB chunk size
{
resolveReqId: async ({ totalFileNum, totalSize }) => {
const { data: reqId } = await preUpload(task.key, {
totalFileNum,
totalSize,
datasetId: task.key,
hasArchive: task.hasArchive,
prefix: task.prefix,
});
console.log(`[useSliceUpload] File ${file.name} preUpload response reqId:`, reqId);
reqIds.push(reqId);
return reqId;
},
hasArchive: newTask.hasArchive,
prefix: newTask.prefix,
signal: newTask.controller.signal,
maxConcurrency: 3,
}
);
results.push(result);
totalUploadedLines += result.uploadedCount;
totalProcessedBytes += file.size;
console.log(`[useSliceUpload] File ${file.name} processed, uploaded ${result.uploadedCount} lines`);
}
console.log('[useSliceUpload] Stream upload completed, total lines:', totalUploadedLines);
removeTask(newTask);
message.success(`成功上传 ${totalUploadedLines} 个文件(按行分割)`);
} catch (err) {
console.error('[useSliceUpload] Stream upload error:', err);
if (err.message === "Upload cancelled") {
message.info("上传已取消");
} else {
message.error("文件上传失败,请稍后重试");
}
removeTask({
...task,
isCancel: true,
...taskListRef.current.find((item) => item.key === task.key),
});
}
};
/**
* 注册流式上传事件监听
* 返回注销函数
*/
const registerStreamUploadListener = () => {
if (!enableStreamUpload) return () => {};
const streamUploadHandler = async (e: Event) => {
const customEvent = e as CustomEvent;
const { dataset, files, updateEvent, hasArchive, prefix } = customEvent.detail;
const controller = new AbortController();
const task: TaskItem = {
key: dataset.id,
title: `上传数据集: ${dataset.name} (按行分割)`,
percent: 0,
reqId: -1,
controller,
size: 0,
updateEvent,
hasArchive,
prefix,
};
taskListRef.current = [task, ...taskListRef.current];
setTaskList(taskListRef.current);
// 显示任务中心
if (showTaskCenter) {
window.dispatchEvent(
new CustomEvent("show:task-popover", { detail: { show: true } })
);
}
await handleStreamUpload({ task, files });
};
window.addEventListener("upload:dataset-stream", streamUploadHandler);
return () => {
window.removeEventListener("upload:dataset-stream", streamUploadHandler);
};
};
return {
taskList,
createTask,
removeTask,
handleUpload,
handleStreamUpload,
registerStreamUploadListener,
};
}

View File

@@ -3,7 +3,9 @@
* 通过 iframe 加载外部页面 * 通过 iframe 加载外部页面
*/ */
export default function ContentGenerationPage() {
-const iframeUrl = "http://192.168.0.8:3000";
+const iframeUrl = "/api#/meeting";
window.localStorage.setItem("geeker-user", '{"token":"123","userInfo":{"name":"xteam"},"loginFrom":null,"loginData":null}');
return (
<div className="h-full w-full flex flex-col">
@@ -16,6 +18,11 @@ export default function ContentGenerationPage() {
className="w-full h-full border-0" className="w-full h-full border-0"
title="内容生成" title="内容生成"
sandbox="allow-same-origin allow-scripts allow-popups allow-forms allow-downloads" sandbox="allow-same-origin allow-scripts allow-popups allow-forms allow-downloads"
style={{marginLeft: "-220px",
marginTop: "-66px",
width: "calc(100% + 233px)",
height: "calc(100% + 108px)"
}}
/> />
</div> </div>
</div> </div>

View File

@@ -6,9 +6,11 @@ import { useNavigate, useParams } from "react-router";
import {
getEditorProjectInfoUsingGet,
getEditorTaskUsingGet,
getEditorTaskSegmentsUsingGet,
listEditorTasksUsingGet,
upsertEditorAnnotationUsingPut,
} from "../annotation.api";
import { AnnotationResultStatus } from "../annotation.model";
type EditorProjectInfo = {
projectId: string;
@@ -26,6 +28,8 @@ type EditorTaskListItem = {
fileType?: string | null;
hasAnnotation: boolean;
annotationUpdatedAt?: string | null;
annotationStatus?: AnnotationResultStatus | null;
segmentStats?: SegmentStats;
};
type LsfMessage = {
@@ -35,14 +39,16 @@ type LsfMessage = {
type SegmentInfo = {
idx: number;
-text: string;
-start: number;
-end: number;
hasAnnotation: boolean;
lineIndex: number;
chunkIndex: number;
};
type SegmentStats = {
done: number;
total: number;
};
type ApiResponse<T> = { type ApiResponse<T> = {
code?: number; code?: number;
message?: string; message?: string;
@@ -58,10 +64,16 @@ type EditorTaskPayload = {
type EditorTaskResponse = {
task?: EditorTaskPayload;
segmented?: boolean;
-segments?: SegmentInfo[];
+totalSegments?: number;
currentSegmentIndex?: number;
};
type EditorTaskSegmentsResponse = {
segmented?: boolean;
segments?: SegmentInfo[];
totalSegments?: number;
};
type EditorTaskListResponse = {
content?: EditorTaskListItem[];
totalElements?: number;
@@ -88,6 +100,13 @@ type SwitchDecision = "save" | "discard" | "cancel";
const LSF_IFRAME_SRC = "/lsf/lsf.html";
const TASK_PAGE_START = 0;
const TASK_PAGE_SIZE = 200;
const NO_ANNOTATION_LABEL = "无标注";
const NOT_APPLICABLE_LABEL = "不适用";
const NO_ANNOTATION_CONFIRM_TITLE = "没有标注任何内容";
const NO_ANNOTATION_CONFIRM_OK_TEXT = "设为无标注并保存";
const NOT_APPLICABLE_CONFIRM_TEXT = "设为不适用并保存";
const NO_ANNOTATION_CONFIRM_CANCEL_TEXT = "继续标注";
const SAVE_AND_NEXT_LABEL = "保存并跳转到下一段/下一条";
type NormalizedTaskList = {
items: EditorTaskListItem[];
@@ -103,6 +122,17 @@ const resolveSegmentIndex = (value: unknown) => {
return Number.isFinite(parsed) ? parsed : undefined;
};
const isSaveShortcut = (event: KeyboardEvent) => {
if (event.defaultPrevented || event.isComposing) return false;
const key = event.key;
const code = event.code;
const isS = key === "s" || key === "S" || code === "KeyS";
if (!isS) return false;
if (!(event.ctrlKey || event.metaKey)) return false;
if (event.shiftKey || event.altKey) return false;
return true;
};
const normalizePayload = (payload: unknown): ExportPayload | undefined => {
if (!payload || typeof payload !== "object") return undefined;
return payload as ExportPayload;
@@ -119,6 +149,40 @@ const resolvePayloadMessage = (payload: unknown) => {
const isRecord = (value: unknown): value is Record<string, unknown> =>
!!value && typeof value === "object" && !Array.isArray(value);
const isAnnotationResultEmpty = (annotation?: Record<string, unknown>) => {
if (!annotation) return true;
if (!("result" in annotation)) return true;
const result = (annotation as { result?: unknown }).result;
if (!Array.isArray(result)) return false;
return result.length === 0;
};
const resolveTaskStatusMeta = (item: EditorTaskListItem) => {
const segmentSummary = resolveSegmentSummary(item);
if (segmentSummary) {
if (segmentSummary.done >= segmentSummary.total) {
return { text: "已标注", type: "success" as const };
}
if (segmentSummary.done > 0) {
return { text: "标注中", type: "warning" as const };
}
return { text: "未标注", type: "secondary" as const };
}
if (!item.hasAnnotation) {
return { text: "未标注", type: "secondary" as const };
}
if (item.annotationStatus === AnnotationResultStatus.NO_ANNOTATION) {
return { text: NO_ANNOTATION_LABEL, type: "warning" as const };
}
if (item.annotationStatus === AnnotationResultStatus.NOT_APPLICABLE) {
return { text: NOT_APPLICABLE_LABEL, type: "warning" as const };
}
if (item.annotationStatus === AnnotationResultStatus.IN_PROGRESS) {
return { text: "标注中", type: "warning" as const };
}
return { text: "已标注", type: "success" as const };
};
const normalizeSnapshotValue = (value: unknown, seen: WeakSet<object>): unknown => {
if (!value || typeof value !== "object") return value;
const obj = value as object;
@@ -144,6 +208,7 @@ const stableStringify = (value: unknown) => {
const buildAnnotationSnapshot = (annotation?: Record<string, unknown>) => {
if (!annotation) return "";
if (isAnnotationResultEmpty(annotation)) return "";
const cleaned: Record<string, unknown> = { ...annotation };
delete cleaned.updated_at;
delete cleaned.updatedAt;
@@ -155,6 +220,25 @@ const buildAnnotationSnapshot = (annotation?: Record<string, unknown>) => {
const buildSnapshotKey = (fileId: string, segmentIndex?: number) =>
`${fileId}::${segmentIndex ?? "full"}`;
const buildSegmentStats = (segmentList?: SegmentInfo[] | null): SegmentStats | null => {
if (!Array.isArray(segmentList) || segmentList.length === 0) return null;
const total = segmentList.length;
const done = segmentList.reduce((count, seg) => count + (seg.hasAnnotation ? 1 : 0), 0);
return { done, total };
};
const normalizeSegmentStats = (stats?: SegmentStats | null): SegmentStats | null => {
if (!stats) return null;
const total = Number(stats.total);
const done = Number(stats.done);
if (!Number.isFinite(total) || total <= 0) return null;
const safeDone = Math.min(Math.max(done, 0), total);
return { done: safeDone, total };
};
const resolveSegmentSummary = (item: EditorTaskListItem) =>
normalizeSegmentStats(item.segmentStats);
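// Quick illustration of the clamping above: done is forced into [0, total],
// and a non-positive or non-finite total yields null rather than bogus stats.
// normalizeSegmentStats({ done: 7, total: 5 })  -> { done: 5, total: 5 }
// normalizeSegmentStats({ done: -1, total: 5 }) -> { done: 0, total: 5 }
// normalizeSegmentStats({ done: 2, total: 0 })  -> null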
const mergeTaskItems = (base: EditorTaskListItem[], next: EditorTaskListItem[]) => {
if (next.length === 0) return base;
const seen = new Set(base.map((item) => item.fileId));
@@ -205,6 +289,10 @@ export default function LabelStudioTextEditor() {
const exportCheckSeqRef = useRef(0);
const savedSnapshotsRef = useRef<Record<string, string>>({});
const pendingAutoAdvanceRef = useRef(false);
const segmentStatsCacheRef = useRef<Record<string, SegmentStats>>({});
const segmentStatsSeqRef = useRef(0);
const segmentStatsLoadingRef = useRef<Set<string>>(new Set());
const segmentSummaryFileRef = useRef<string>("");
const [loadingProject, setLoadingProject] = useState(true);
const [loadingTasks, setLoadingTasks] = useState(false);
@@ -247,6 +335,98 @@ export default function LabelStudioTextEditor() {
win.postMessage({ type, payload }, origin);
}, [origin]);
const applySegmentStats = useCallback((fileId: string, stats: SegmentStats | null) => {
if (!fileId) return;
const normalized = normalizeSegmentStats(stats);
setTasks((prev) =>
prev.map((item) =>
item.fileId === fileId
? { ...item, segmentStats: normalized || undefined }
: item
)
);
}, []);
const updateSegmentStatsCache = useCallback((fileId: string, stats: SegmentStats | null) => {
if (!fileId) return;
const normalized = normalizeSegmentStats(stats);
if (normalized) {
segmentStatsCacheRef.current[fileId] = normalized;
} else {
delete segmentStatsCacheRef.current[fileId];
}
applySegmentStats(fileId, normalized);
}, [applySegmentStats]);
const fetchSegmentStatsForFile = useCallback(async (fileId: string, seq: number) => {
if (!projectId || !fileId) return;
if (segmentStatsCacheRef.current[fileId] || segmentStatsLoadingRef.current.has(fileId)) return;
segmentStatsLoadingRef.current.add(fileId);
try {
const resp = (await getEditorTaskSegmentsUsingGet(projectId, fileId)) as ApiResponse<EditorTaskSegmentsResponse>;
if (segmentStatsSeqRef.current !== seq) return;
const data = resp?.data;
if (!data?.segmented) return;
const stats = buildSegmentStats(data.segments);
if (!stats) return;
segmentStatsCacheRef.current[fileId] = stats;
applySegmentStats(fileId, stats);
} catch (e) {
console.error(e);
} finally {
segmentStatsLoadingRef.current.delete(fileId);
}
}, [applySegmentStats, projectId]);
const prefetchSegmentStats = useCallback((items: EditorTaskListItem[]) => {
if (!projectId) return;
const fileIds = items
.map((item) => item.fileId)
.filter((fileId) => fileId && !segmentStatsCacheRef.current[fileId]);
if (fileIds.length === 0) return;
const seq = segmentStatsSeqRef.current;
let cursor = 0;
const workerCount = Math.min(3, fileIds.length);
const runWorker = async () => {
while (cursor < fileIds.length && segmentStatsSeqRef.current === seq) {
const fileId = fileIds[cursor];
cursor += 1;
await fetchSegmentStatsForFile(fileId, seq);
}
};
void Promise.all(Array.from({ length: workerCount }, () => runWorker()));
}, [fetchSegmentStatsForFile, projectId]);
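// The prefetch above is a small bounded-concurrency pool: a few workers share a
// cursor and pull file ids until the list is drained, so at most three segment
// requests are in flight at once. The same pattern in isolation (runPool is a
// sketch, not part of this file):
async function runPool<T>(items: T[], worker: (item: T) => Promise<void>, limit = 3) {
  let cursor = 0;
  const run = async () => {
    while (cursor < items.length) {
      const item = items[cursor];
      cursor += 1; // single-threaded JS: safe to bump the shared cursor between awaits
      await worker(item);
    }
  };
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, () => run()));
}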
const confirmEmptyAnnotationStatus = useCallback(() => {
return new Promise<AnnotationResultStatus | null>((resolve) => {
let resolved = false;
let modalInstance: { destroy: () => void } | null = null;
const settle = (value: AnnotationResultStatus | null) => {
if (resolved) return;
resolved = true;
resolve(value);
if (modalInstance) modalInstance.destroy();
};
const handleNotApplicable = () => settle(AnnotationResultStatus.NOT_APPLICABLE);
modalInstance = modal.confirm({
title: NO_ANNOTATION_CONFIRM_TITLE,
content: (
<div className="flex flex-col gap-2">
<Typography.Text></Typography.Text>
<Typography.Text type="secondary"></Typography.Text>
<Button type="link" style={{ padding: 0, height: "auto" }} onClick={handleNotApplicable}>
{NOT_APPLICABLE_CONFIRM_TEXT}
</Button>
</div>
),
okText: NO_ANNOTATION_CONFIRM_OK_TEXT,
cancelText: NO_ANNOTATION_CONFIRM_CANCEL_TEXT,
onOk: () => settle(AnnotationResultStatus.NO_ANNOTATION),
onCancel: () => settle(null),
});
});
}, [modal]);
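// confirmEmptyAnnotationStatus wraps an antd confirm dialog in a Promise and
// guards against double resolution (the extra "not applicable" link can settle
// the dialog before onOk/onCancel fire). The bare pattern, reduced to a sketch:
function confirmChoice(modal: { confirm: (cfg: object) => { destroy: () => void } }) {
  return new Promise<"ok" | "cancel">((resolve) => {
    let settled = false;
    const settle = (value: "ok" | "cancel") => {
      if (settled) return;
      settled = true;
      resolve(value);
      instance.destroy(); // close the dialog no matter which path settled it
    };
    const instance = modal.confirm({
      title: "确认操作?",
      onOk: () => settle("ok"),
      onCancel: () => settle("cancel"),
    });
  });
}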
const loadProject = useCallback(async () => {
setLoadingProject(true);
try {
@@ -268,8 +448,13 @@ export default function LabelStudioTextEditor() {
}, [message, projectId]);
const updateTaskSelection = useCallback((items: EditorTaskListItem[]) => {
const isCompleted = (item: EditorTaskListItem) => {
const summary = resolveSegmentSummary(item);
if (summary) return summary.done >= summary.total;
return item.hasAnnotation;
};
const defaultFileId =
-items.find((item) => !item.hasAnnotation)?.fileId || items[0]?.fileId || "";
+items.find((item) => !isCompleted(item))?.fileId || items[0]?.fileId || "";
setSelectedFileId((prev) => {
if (prev && items.some((item) => item.fileId === prev)) return prev;
return defaultFileId;
@@ -326,6 +511,9 @@ export default function LabelStudioTextEditor() {
if (mode === "reset") { if (mode === "reset") {
prefetchSeqRef.current += 1; prefetchSeqRef.current += 1;
setPrefetching(false); setPrefetching(false);
segmentStatsSeqRef.current += 1;
segmentStatsCacheRef.current = {};
segmentStatsLoadingRef.current = new Set();
} }
if (mode === "append") { if (mode === "append") {
setLoadingMore(true); setLoadingMore(true);
@@ -406,17 +594,38 @@ export default function LabelStudioTextEditor() {
if (seq !== initSeqRef.current) return;
// 更新分段状态
-const segmentIndex = data?.segmented
+const isSegmented = !!data?.segmented;
+const segmentIndex = isSegmented
? resolveSegmentIndex(data.currentSegmentIndex) ?? 0
: undefined;
-if (data?.segmented) {
+if (isSegmented) {
let nextSegments: SegmentInfo[] = [];
if (segmentSummaryFileRef.current === fileId && segments.length > 0) {
nextSegments = segments;
} else {
try {
const segmentResp = (await getEditorTaskSegmentsUsingGet(projectId, fileId)) as ApiResponse<EditorTaskSegmentsResponse>;
if (seq !== initSeqRef.current) return;
const segmentData = segmentResp?.data;
if (segmentData?.segmented) {
nextSegments = Array.isArray(segmentData.segments) ? segmentData.segments : [];
}
} catch (e) {
console.error(e);
}
}
const stats = buildSegmentStats(nextSegments);
setSegmented(true);
-setSegments(data.segments || []);
+setSegments(nextSegments);
setCurrentSegmentIndex(segmentIndex ?? 0);
updateSegmentStatsCache(fileId, stats);
segmentSummaryFileRef.current = fileId;
} else {
setSegmented(false);
setSegments([]);
setCurrentSegmentIndex(0);
updateSegmentStatsCache(fileId, null);
segmentSummaryFileRef.current = fileId;
}
const taskData = {
@@ -476,7 +685,7 @@ export default function LabelStudioTextEditor() {
} finally {
if (seq === initSeqRef.current) setLoadingTaskDetail(false);
}
-}, [iframeReady, message, postToIframe, project, projectId]);
+}, [iframeReady, message, postToIframe, project, projectId, segments, updateSegmentStatsCache]);
const advanceAfterSave = useCallback(async (fileId: string, segmentIndex?: number) => {
if (!fileId) return;
@@ -539,11 +748,31 @@ export default function LabelStudioTextEditor() {
? currentSegmentIndex
: undefined;
const annotationRecord = annotation as Record<string, unknown>;
const currentTask = tasks.find((item) => item.fileId === String(fileId));
const currentStatus = currentTask?.annotationStatus;
let resolvedStatus: AnnotationResultStatus;
if (isAnnotationResultEmpty(annotationRecord)) {
if (
currentStatus === AnnotationResultStatus.NO_ANNOTATION ||
currentStatus === AnnotationResultStatus.NOT_APPLICABLE
) {
resolvedStatus = currentStatus;
} else {
const selectedStatus = await confirmEmptyAnnotationStatus();
if (!selectedStatus) return false;
resolvedStatus = selectedStatus;
}
} else {
resolvedStatus = AnnotationResultStatus.ANNOTATED;
}
setSaving(true);
try {
const resp = (await upsertEditorAnnotationUsingPut(projectId, String(fileId), {
annotation,
segmentIndex,
annotationStatus: resolvedStatus,
})) as ApiResponse<UpsertAnnotationResponse>;
const updatedAt = resp?.data?.updatedAt;
message.success("标注已保存");
@@ -553,6 +782,7 @@ export default function LabelStudioTextEditor() {
? {
...item,
hasAnnotation: true,
annotationStatus: resolvedStatus,
annotationUpdatedAt: updatedAt || item.annotationUpdatedAt,
}
: item
@@ -565,13 +795,13 @@ export default function LabelStudioTextEditor() {
// 分段模式下更新当前段落的标注状态
if (segmented && segmentIndex !== undefined) {
-setSegments((prev) =>
-prev.map((seg) =>
+const nextSegments = segments.map((seg) =>
seg.idx === segmentIndex
? { ...seg, hasAnnotation: true }
: seg
-)
);
+setSegments(nextSegments);
+updateSegmentStatsCache(String(fileId), buildSegmentStats(nextSegments));
}
if (options?.autoAdvance) {
await advanceAfterSave(String(fileId), segmentIndex);
@@ -586,11 +816,15 @@ export default function LabelStudioTextEditor() {
}
}, [
advanceAfterSave,
confirmEmptyAnnotationStatus,
currentSegmentIndex,
message,
projectId,
segmented,
segments,
selectedFileId,
tasks,
updateSegmentStatsCache,
]);
const requestExportForCheck = useCallback(() => {
@@ -650,14 +884,27 @@ export default function LabelStudioTextEditor() {
});
}, [modal]);
-const requestExport = () => {
+const requestExport = useCallback((autoAdvance: boolean) => {
if (!selectedFileId) {
message.warning("请先选择文件");
return;
}
-pendingAutoAdvanceRef.current = true;
+pendingAutoAdvanceRef.current = autoAdvance;
postToIframe("LS_EXPORT", {});
}, [message, postToIframe, selectedFileId]);
useEffect(() => {
const handleSaveShortcut = (event: KeyboardEvent) => {
if (!isSaveShortcut(event) || event.repeat) return;
if (saving || loadingTaskDetail || segmentSwitching) return;
if (!iframeReady || !lsReady) return;
event.preventDefault();
event.stopPropagation();
requestExport(false);
};
window.addEventListener("keydown", handleSaveShortcut);
return () => window.removeEventListener("keydown", handleSaveShortcut);
}, [iframeReady, loadingTaskDetail, lsReady, requestExport, saving, segmentSwitching]);
// 段落切换处理
const handleSegmentChange = useCallback(async (newIndex: number) => {
@@ -753,7 +1000,11 @@ export default function LabelStudioTextEditor() {
setSegmented(false);
setSegments([]);
setCurrentSegmentIndex(0);
segmentSummaryFileRef.current = "";
savedSnapshotsRef.current = {};
segmentStatsSeqRef.current += 1;
segmentStatsCacheRef.current = {};
segmentStatsLoadingRef.current = new Set();
if (exportCheckRef.current?.timer) {
window.clearTimeout(exportCheckRef.current.timer);
}
@@ -767,6 +1018,12 @@ export default function LabelStudioTextEditor() {
loadTasks({ mode: "reset" }); loadTasks({ mode: "reset" });
}, [project?.supported, loadTasks]); }, [project?.supported, loadTasks]);
useEffect(() => {
if (!segmented) return;
if (tasks.length === 0) return;
prefetchSegmentStats(tasks);
}, [prefetchSegmentStats, segmented, tasks]);
useEffect(() => {
if (!selectedFileId) return;
initEditorForFile(selectedFileId);
@@ -826,6 +1083,15 @@ export default function LabelStudioTextEditor() {
[segmentTreeData]
);
const inProgressSegmentedCount = useMemo(() => {
if (tasks.length === 0) return 0;
return tasks.reduce((count, item) => {
const summary = resolveSegmentSummary(item);
if (!summary) return count;
return summary.done < summary.total ? count + 1 : count;
}, 0);
}, [tasks]);
const handleSegmentSelect = useCallback((keys: Array<string | number>) => {
const [first] = keys;
if (first === undefined || first === null) return;
@@ -903,6 +1169,8 @@ export default function LabelStudioTextEditor() {
}, [message, origin, saveFromExport]);
const canLoadMore = taskTotalPages > 0 && taskPage + 1 < taskTotalPages;
const saveDisabled =
!iframeReady || !selectedFileId || saving || segmentSwitching || loadingTaskDetail;
const loadMoreNode = canLoadMore ? (
<div className="p-2 text-center">
<Button
@@ -966,7 +1234,7 @@ export default function LabelStudioTextEditor() {
return (
<div className="h-full flex flex-col">
{/* 顶部工具栏 */}
-<div className="flex items-center justify-between px-3 py-2 border-b border-gray-200 bg-white">
+<div className="grid grid-cols-[1fr_auto_1fr] items-center px-3 py-2 border-b border-gray-200 bg-white">
<div className="flex items-center gap-2">
<Button icon={<LeftOutlined />} onClick={() => navigate("/data/annotation")}>
@@ -980,7 +1248,18 @@ export default function LabelStudioTextEditor() {
</Typography.Title> </Typography.Title>
</div> </div>
<div className="flex items-center gap-2"> <div className="flex items-center justify-center">
<Button
type="primary"
icon={<SaveOutlined />}
loading={saving}
disabled={saveDisabled}
onClick={() => requestExport(true)}
>
{SAVE_AND_NEXT_LABEL}
</Button>
</div>
<div className="flex items-center gap-2 justify-end">
<Button <Button
icon={<ReloadOutlined />} icon={<ReloadOutlined />}
loading={loadingTasks} loading={loadingTasks}
@@ -989,11 +1268,10 @@ export default function LabelStudioTextEditor() {
</Button> </Button>
<Button <Button
type="primary"
icon={<SaveOutlined />} icon={<SaveOutlined />}
loading={saving} loading={saving}
disabled={!iframeReady || !selectedFileId} disabled={saveDisabled}
onClick={requestExport} onClick={() => requestExport(false)}
> >
</Button> </Button>
@@ -1007,8 +1285,13 @@ export default function LabelStudioTextEditor() {
className="border-r border-gray-200 bg-gray-50 flex flex-col transition-all duration-200 min-h-0" className="border-r border-gray-200 bg-gray-50 flex flex-col transition-all duration-200 min-h-0"
style={{ width: sidebarCollapsed ? 0 : 240, overflow: "hidden" }} style={{ width: sidebarCollapsed ? 0 : 240, overflow: "hidden" }}
> >
<div className="px-3 py-2 border-b border-gray-200 bg-white font-medium text-sm"> <div className="px-3 py-2 border-b border-gray-200 bg-white font-medium text-sm flex items-center justify-between gap-2">
<span></span>
{segmented && (
<Tag color="orange" style={{ margin: 0 }}>
{inProgressSegmentedCount}
</Tag>
)}
</div> </div>
<div className="flex-1 min-h-0 overflow-auto"> <div className="flex-1 min-h-0 overflow-auto">
<List <List
@@ -1016,7 +1299,10 @@ export default function LabelStudioTextEditor() {
size="small" size="small"
dataSource={tasks} dataSource={tasks}
loadMore={loadMoreNode} loadMore={loadMoreNode}
renderItem={(item) => ( renderItem={(item) => {
const segmentSummary = resolveSegmentSummary(item);
const statusMeta = resolveTaskStatusMeta(item);
return (
<List.Item <List.Item
key={item.fileId} key={item.fileId}
className="cursor-pointer hover:bg-blue-50" className="cursor-pointer hover:bg-blue-50"
@@ -1032,12 +1318,16 @@ export default function LabelStudioTextEditor() {
{item.fileName}
</Typography.Text>
<div className="flex items-center justify-between">
-<Typography.Text
-type={item.hasAnnotation ? "success" : "secondary"}
-style={{ fontSize: 11 }}
->
-{item.hasAnnotation ? "已标注" : "未标注"}
-</Typography.Text>
+<div className="flex items-center gap-2">
+<Typography.Text type={statusMeta.type} style={{ fontSize: 11 }}>
+{statusMeta.text}
+</Typography.Text>
+{segmentSummary && (
+<Typography.Text type="secondary" style={{ fontSize: 10 }}>
+{segmentSummary.done}/{segmentSummary.total}
+</Typography.Text>
+)}
+</div>
{item.annotationUpdatedAt && (
<Typography.Text type="secondary" style={{ fontSize: 10 }}>
{item.annotationUpdatedAt}
@@ -1046,7 +1336,8 @@ export default function LabelStudioTextEditor() {
</div> </div>
</div> </div>
</List.Item> </List.Item>
)} );
}}
/> />
</div> </div>
{segmented && ( {segmented && (

View File

@@ -19,7 +19,8 @@ import {
queryAnnotationTemplatesUsingGet,
} from "../../annotation.api";
import { DatasetType, type Dataset } from "@/pages/DataManagement/dataset.model";
-import { DataType, type AnnotationTemplate, type AnnotationTask } from "../../annotation.model";
+import { DataType, type AnnotationTemplate } from "../../annotation.model";
+import type { AnnotationTaskListItem } from "../../annotation.const";
import LabelStudioEmbed from "@/components/business/LabelStudioEmbed";
import TemplateConfigurationTreeEditor from "../../components/TemplateConfigurationTreeEditor";
import { useTagConfig } from "@/hooks/useTagConfig";
@@ -29,7 +30,7 @@ interface AnnotationTaskDialogProps {
onClose: () => void; onClose: () => void;
onRefresh: () => void; onRefresh: () => void;
/** 编辑模式:传入要编辑的任务数据 */ /** 编辑模式:传入要编辑的任务数据 */
editTask?: AnnotationTask | null; editTask?: AnnotationTaskListItem | null;
} }
type DatasetOption = Dataset & { icon?: ReactNode }; type DatasetOption = Dataset & { icon?: ReactNode };
@@ -60,6 +61,7 @@ const isRecord = (value: unknown): value is Record<string, unknown> =>
const DEFAULT_SEGMENTATION_ENABLED = true;
const FILE_PREVIEW_MAX_HEIGHT = 500;
const PREVIEW_MODAL_WIDTH = "80vw";
const SEGMENTATION_OPTIONS = [
{ label: "需要切片段", value: true },
{ label: "不需要切片段", value: false },
@@ -828,7 +830,7 @@ export default function CreateAnnotationTask({
open={showPreview}
onCancel={() => setShowPreview(false)}
title="标注界面预览"
-width={1000}
+width={PREVIEW_MODAL_WIDTH}
footer={[
<Button key="close" onClick={() => setShowPreview(false)}>
@@ -853,7 +855,7 @@ export default function CreateAnnotationTask({
open={datasetPreviewVisible}
onCancel={() => setDatasetPreviewVisible(false)}
title="数据集预览(前10条文件)"
-width={700}
+width={PREVIEW_MODAL_WIDTH}
footer={[
<Button key="close" onClick={() => setDatasetPreviewVisible(false)}>
@@ -910,7 +912,7 @@ export default function CreateAnnotationTask({
setFileContent(""); setFileContent("");
}} }}
title={`文件预览:${previewFileName}`} title={`文件预览:${previewFileName}`}
width={previewFileType === "text" ? 800 : 700} width={PREVIEW_MODAL_WIDTH}
footer={[ footer={[
<Button key="close" onClick={() => { <Button key="close" onClick={() => {
setFileContentVisible(false); setFileContentVisible(false);

View File

@@ -1,5 +1,5 @@
import { useState } from "react"; import { useState } from "react";
import { Card, Button, Table, message, Modal, Tabs } from "antd"; import { Card, Button, Table, Tag, message, Modal, Tabs } from "antd";
import { import {
PlusOutlined, PlusOutlined,
EditOutlined, EditOutlined,
@@ -10,27 +10,39 @@ import {
import { useNavigate } from "react-router"; import { useNavigate } from "react-router";
import { SearchControls } from "@/components/SearchControls"; import { SearchControls } from "@/components/SearchControls";
import CardView from "@/components/CardView"; import CardView from "@/components/CardView";
import type { AnnotationTask } from "../annotation.model";
import useFetchData from "@/hooks/useFetchData"; import useFetchData from "@/hooks/useFetchData";
import { import {
deleteAnnotationTaskByIdUsingDelete, deleteAnnotationTaskByIdUsingDelete,
queryAnnotationTasksUsingGet, queryAnnotationTasksUsingGet,
} from "../annotation.api"; } from "../annotation.api";
import { mapAnnotationTask } from "../annotation.const"; import {
AnnotationTypeMap,
mapAnnotationTask,
type AnnotationTaskListItem,
} from "../annotation.const";
import CreateAnnotationTask from "../Create/components/CreateAnnotationTaskDialog"; import CreateAnnotationTask from "../Create/components/CreateAnnotationTaskDialog";
import ExportAnnotationDialog from "./ExportAnnotationDialog"; import ExportAnnotationDialog from "./ExportAnnotationDialog";
import { ColumnType } from "antd/es/table"; import { ColumnType } from "antd/es/table";
import { TemplateList } from "../Template"; import { TemplateList } from "../Template";
// Note: DevelopmentInProgress intentionally not used here // Note: DevelopmentInProgress intentionally not used here
type AnnotationTaskRowKey = string | number;
type AnnotationTaskOperation = {
key: string;
label: string;
icon: JSX.Element;
danger?: boolean;
onClick: (task: AnnotationTaskListItem) => void;
};
export default function DataAnnotation() {
// return <DevelopmentInProgress showTime="2025.10.30" />;
const navigate = useNavigate();
const [activeTab, setActiveTab] = useState("tasks");
const [viewMode, setViewMode] = useState<"list" | "card">("list");
const [showCreateDialog, setShowCreateDialog] = useState(false);
-const [exportTask, setExportTask] = useState<AnnotationTask | null>(null);
+const [exportTask, setExportTask] = useState<AnnotationTaskListItem | null>(null);
-const [editTask, setEditTask] = useState<AnnotationTask | null>(null);
+const [editTask, setEditTask] = useState<AnnotationTaskListItem | null>(null);
const {
loading,
@@ -40,13 +52,16 @@ export default function DataAnnotation() {
fetchData,
handleFiltersChange,
handleKeywordChange,
-} = useFetchData(queryAnnotationTasksUsingGet, mapAnnotationTask, 30000, true, [], 0);
+} = useFetchData<AnnotationTaskListItem>(queryAnnotationTasksUsingGet, mapAnnotationTask, 30000, true, [], 0);
-const [selectedRowKeys, setSelectedRowKeys] = useState<(string | number)[]>([]);
+const [selectedRowKeys, setSelectedRowKeys] = useState<AnnotationTaskRowKey[]>([]);
-const [selectedRows, setSelectedRows] = useState<any[]>([]);
+const [selectedRows, setSelectedRows] = useState<AnnotationTaskListItem[]>([]);
+const toSafeCount = (value: unknown) =>
+typeof value === "number" && Number.isFinite(value) ? value : 0;
-const handleAnnotate = (task: AnnotationTask) => {
+const handleAnnotate = (task: AnnotationTaskListItem) => {
-const projectId = (task as any)?.id;
+const projectId = task.id;
if (!projectId) {
message.error("无法进入标注:缺少标注项目ID");
return;
@@ -54,15 +69,15 @@ export default function DataAnnotation() {
navigate(`/data/annotation/annotate/${projectId}`);
};
-const handleExport = (task: AnnotationTask) => {
+const handleExport = (task: AnnotationTaskListItem) => {
setExportTask(task);
};
-const handleEdit = (task: AnnotationTask) => {
+const handleEdit = (task: AnnotationTaskListItem) => {
setEditTask(task);
};
-const handleDelete = (task: AnnotationTask) => {
+const handleDelete = (task: AnnotationTaskListItem) => {
Modal.confirm({
title: `确认删除标注任务「${task.name}」吗?`,
content: "删除标注任务不会删除对应数据集,但会删除该任务的所有标注结果。",
@@ -110,7 +125,7 @@ export default function DataAnnotation() {
});
};
-const operations = [
+const operations: AnnotationTaskOperation[] = [
{
key: "annotate",
label: "标注",
@@ -142,24 +157,45 @@ export default function DataAnnotation() {
},
];
-const columns: ColumnType<any>[] = [
+const columns: ColumnType<AnnotationTaskListItem>[] = [
{
title: "序号",
key: "index",
width: 80,
align: "center" as const,
render: (_value: unknown, _record: AnnotationTaskListItem, index: number) => {
const current = pagination.current ?? 1;
const pageSize = pagination.pageSize ?? tableData.length ?? 0;
return (current - 1) * pageSize + index + 1;
},
},
{
title: "任务名称",
dataIndex: "name",
key: "name",
fixed: "left" as const,
},
{
title: "任务ID",
dataIndex: "id",
key: "id",
},
{
title: "数据集",
dataIndex: "datasetName",
key: "datasetName",
width: 180,
},
{
title: "标注类型",
dataIndex: "labelingType",
key: "labelingType",
width: 160,
render: (value?: string) => {
if (!value) {
return "-";
}
const label =
AnnotationTypeMap[value as keyof typeof AnnotationTypeMap]?.label ||
value;
return <Tag color="geekblue">{label}</Tag>;
},
},
{
title: "数据量",
dataIndex: "totalCount",
@@ -173,9 +209,21 @@ export default function DataAnnotation() {
key: "annotatedCount", key: "annotatedCount",
width: 100, width: 100,
align: "center" as const, align: "center" as const,
render: (value: number, record: any) => { render: (value: number, record: AnnotationTaskListItem) => {
const total = record.totalCount || 0; const total = toSafeCount(record.totalCount ?? record.total_count);
const annotated = value || 0; const annotatedRaw = toSafeCount(
value ?? record.annotatedCount ?? record.annotated_count
);
const segmentationEnabled =
record.segmentationEnabled ?? record.segmentation_enabled;
const inProgressRaw = segmentationEnabled
? toSafeCount(record.inProgressCount ?? record.in_progress_count)
: 0;
const shouldExcludeInProgress =
total > 0 && annotatedRaw + inProgressRaw > total;
const annotated = shouldExcludeInProgress
? Math.max(annotatedRaw - inProgressRaw, 0)
: annotatedRaw;
const percent = total > 0 ? Math.round((annotated / total) * 100) : 0;
return (
<span title={`${annotated}/${total} (${percent}%)`}>
@@ -184,6 +232,23 @@ export default function DataAnnotation() {
);
},
},
{
title: "标注中",
dataIndex: "inProgressCount",
key: "inProgressCount",
width: 100,
align: "center" as const,
render: (value: number, record: AnnotationTaskListItem) => {
const segmentationEnabled =
record.segmentationEnabled ?? record.segmentation_enabled;
if (!segmentationEnabled) return "-";
const resolved =
Number.isFinite(value)
? value
: record.inProgressCount ?? record.in_progress_count ?? 0;
return resolved;
},
},
{
title: "创建时间",
dataIndex: "createdAt",
@@ -202,14 +267,14 @@ export default function DataAnnotation() {
fixed: "right" as const, fixed: "right" as const,
width: 150, width: 150,
dataIndex: "actions", dataIndex: "actions",
render: (_: any, task: any) => ( render: (_value: unknown, task: AnnotationTaskListItem) => (
<div className="flex items-center justify-center space-x-1"> <div className="flex items-center justify-center space-x-1">
{operations.map((operation) => ( {operations.map((operation) => (
<Button <Button
key={operation.key} key={operation.key}
type="text" type="text"
icon={operation.icon} icon={operation.icon}
onClick={() => (operation?.onClick as any)?.(task)} onClick={() => operation.onClick(task)}
title={operation.label} title={operation.label}
/> />
))} ))}
@@ -282,9 +347,9 @@ export default function DataAnnotation() {
pagination={pagination}
rowSelection={{
selectedRowKeys,
-onChange: (keys, rows) => {
+onChange: (keys: AnnotationTaskRowKey[], rows: AnnotationTaskListItem[]) => {
-setSelectedRowKeys(keys as (string | number)[]);
+setSelectedRowKeys(keys);
-setSelectedRows(rows as any[]);
+setSelectedRows(rows);
},
}}
scroll={{ x: "max-content", y: "calc(100vh - 24rem)" }}
@@ -293,7 +358,7 @@ export default function DataAnnotation() {
) : ( ) : (
<CardView <CardView
data={tableData} data={tableData}
operations={operations as any} operations={operations}
pagination={pagination} pagination={pagination}
loading={loading} loading={loading}
/> />
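
Note: toSafeCount is referenced by the renderers above but defined outside this hunk. A minimal sketch of what it plausibly does — coercing a possibly-missing or string-typed count into a non-negative integer — with the name and signature assumed from the call sites, not taken from the actual commit:

// Hypothetical helper inferred from the call sites above; not the committed implementation.
const toSafeCount = (value: unknown): number => {
  const num = typeof value === "string" ? Number(value) : Number(value ?? 0);
  return Number.isFinite(num) && num > 0 ? Math.floor(num) : 0;
};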

View File

@@ -178,14 +178,15 @@ export default function ExportAnnotationDialog({
   <Select
     options={FORMAT_OPTIONS.map((opt) => ({
       label: (
-        <div>
+        <div className="py-1">
           <div className="font-medium">{opt.label}</div>
           <div className="text-xs text-gray-400">{opt.description}</div>
         </div>
       ),
       value: opt.value,
+      simpleLabel: opt.label,
     }))}
-    optionLabelProp="label"
+    optionLabelProp="simpleLabel"
   />
 </Form.Item>

View File

@@ -43,14 +43,6 @@ const TemplateDetail: React.FC<TemplateDetailProps> = ({
<Descriptions.Item label="样式"> <Descriptions.Item label="样式">
{template.style} {template.style}
</Descriptions.Item> </Descriptions.Item>
<Descriptions.Item label="类型">
<Tag color={template.builtIn ? "gold" : "default"}>
{template.builtIn ? "系统内置" : "自定义"}
</Tag>
</Descriptions.Item>
<Descriptions.Item label="版本">
{template.version}
</Descriptions.Item>
<Descriptions.Item label="创建时间" span={2}> <Descriptions.Item label="创建时间" span={2}>
{new Date(template.createdAt).toLocaleString()} {new Date(template.createdAt).toLocaleString()}
</Descriptions.Item> </Descriptions.Item>

View File

@@ -36,6 +36,7 @@ const TemplateForm: React.FC<TemplateFormProps> = ({
   const [form] = Form.useForm();
   const [loading, setLoading] = useState(false);
   const [labelConfig, setLabelConfig] = useState("");
+  const selectedDataType = Form.useWatch("dataType", form);

   useEffect(() => {
     if (visible && template && mode === "edit") {
@@ -96,8 +97,12 @@ const TemplateForm: React.FC<TemplateFormProps> = ({
     } else {
       message.error(response.message || `模板${mode === "create" ? "创建" : "更新"}失败`);
     }
-  } catch (error: any) {
-    if (error.errorFields) {
+  } catch (error: unknown) {
+    const hasErrorFields =
+      typeof error === "object" &&
+      error !== null &&
+      "errorFields" in error;
+    if (hasErrorFields) {
       message.error("请填写所有必填字段");
     } else {
       message.error(`模板${mode === "create" ? "创建" : "更新"}失败`);
@@ -195,6 +200,7 @@ const TemplateForm: React.FC<TemplateFormProps> = ({
       value={labelConfig}
       onChange={setLabelConfig}
       height={420}
+      dataType={selectedDataType}
     />
   </div>
 </Form>

View File

@@ -1,4 +1,4 @@
-import React, { useState } from "react";
+import React, { useState, useEffect } from "react";
 import {
   Button,
   Table,
@@ -32,7 +32,16 @@ import {
   TemplateTypeMap
 } from "@/pages/DataAnnotation/annotation.const.tsx";

+const TEMPLATE_ADMIN_KEY = "datamate_template_admin";
+
 const TemplateList: React.FC = () => {
+  const [isAdmin, setIsAdmin] = useState(false);
+
+  useEffect(() => {
+    // 检查 localStorage 中是否存在特殊键
+    const hasAdminKey = localStorage.getItem(TEMPLATE_ADMIN_KEY) !== null;
+    setIsAdmin(hasAdminKey);
+  }, []);
+
   const filterOptions = [
     {
       key: "category",
@@ -225,23 +234,7 @@ const TemplateList: React.FC = () => {
         <Tag color={getCategoryColor(category)}>{ClassificationMap[category as keyof typeof ClassificationMap]?.label || category}</Tag>
       ),
     },
-    {
-      title: "类型",
-      dataIndex: "builtIn",
-      key: "builtIn",
-      width: 100,
-      render: (builtIn: boolean) => (
-        <Tag color={builtIn ? "gold" : "default"}>
-          {builtIn ? "系统内置" : "自定义"}
-        </Tag>
-      ),
-    },
-    {
-      title: "版本",
-      dataIndex: "version",
-      key: "version",
-      width: 80,
-    },
     {
       title: "创建时间",
       dataIndex: "createdAt",
@@ -263,6 +256,7 @@ const TemplateList: React.FC = () => {
           onClick={() => handleView(record)}
         />
       </Tooltip>
+      {isAdmin && (
       <>
         <Tooltip title="编辑">
           <Button
@@ -286,6 +280,7 @@ const TemplateList: React.FC = () => {
         </Tooltip>
       </Popconfirm>
       </>
+      )}
     </Space>
   ),
 },
@@ -310,11 +305,13 @@ const TemplateList: React.FC = () => {
   </div>
   {/* Right side: Create button */}
+  {isAdmin && (
   <div className="flex items-center gap-2">
     <Button type="primary" icon={<PlusOutlined />} onClick={handleCreate}>
     </Button>
   </div>
+  )}
 </div>
 <Card>
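
The admin gate above only checks that the key exists (getItem(...) !== null), so any stored value enables it. To toggle the hidden template-admin mode from the browser console (key name taken from the diff; the value is arbitrary):

// Enable template administration, then reload the page so the effect re-runs.
localStorage.setItem("datamate_template_admin", "1");
// Disable it again.
localStorage.removeItem("datamate_template_admin");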

View File

@@ -18,6 +18,7 @@ import {
 import { TagBrowser } from "./components";

 const { Paragraph } = Typography;
+const PREVIEW_DRAWER_WIDTH = "80vw";

 interface VisualTemplateBuilderProps {
   onSave?: (templateCode: string) => void;
@@ -129,7 +130,7 @@ const VisualTemplateBuilder: React.FC<VisualTemplateBuilderProps> = ({
 <Drawer
   title="模板代码预览"
   placement="right"
-  width={600}
+  width={PREVIEW_DRAWER_WIDTH}
   open={previewVisible}
   onClose={() => setPreviewVisible(false)}
 >

View File

@@ -3,16 +3,19 @@ import { get, post, put, del, download } from "@/utils/request";
 // 导出格式类型
 export type ExportFormat = "json" | "jsonl" | "csv" | "coco" | "yolo";

+type RequestParams = Record<string, unknown>;
+type RequestPayload = Record<string, unknown>;
+
 // 标注任务管理相关接口
-export function queryAnnotationTasksUsingGet(params?: any) {
+export function queryAnnotationTasksUsingGet(params?: RequestParams) {
   return get("/api/annotation/project", params);
 }

-export function createAnnotationTaskUsingPost(data: any) {
+export function createAnnotationTaskUsingPost(data: RequestPayload) {
   return post("/api/annotation/project", data);
 }

-export function syncAnnotationTaskUsingPost(data: any) {
+export function syncAnnotationTaskUsingPost(data: RequestPayload) {
   return post(`/api/annotation/task/sync`, data);
 }
@@ -25,7 +28,7 @@ export function getAnnotationTaskByIdUsingGet(taskId: string) {
   return get(`/api/annotation/project/${taskId}`);
 }

-export function updateAnnotationTaskByIdUsingPut(taskId: string, data: any) {
+export function updateAnnotationTaskByIdUsingPut(taskId: string, data: RequestPayload) {
   return put(`/api/annotation/project/${taskId}`, data);
 }
@@ -35,17 +38,17 @@ export function getTagConfigUsingGet() {
 }

 // 标注模板管理
-export function queryAnnotationTemplatesUsingGet(params?: any) {
+export function queryAnnotationTemplatesUsingGet(params?: RequestParams) {
   return get("/api/annotation/template", params);
 }

-export function createAnnotationTemplateUsingPost(data: any) {
+export function createAnnotationTemplateUsingPost(data: RequestPayload) {
   return post("/api/annotation/template", data);
 }

 export function updateAnnotationTemplateByIdUsingPut(
   templateId: string | number,
-  data: any
+  data: RequestPayload
 ) {
   return put(`/api/annotation/template/${templateId}`, data);
 }
@@ -65,7 +68,7 @@ export function getEditorProjectInfoUsingGet(projectId: string) {
   return get(`/api/annotation/editor/projects/${projectId}`);
 }

-export function listEditorTasksUsingGet(projectId: string, params?: any) {
+export function listEditorTasksUsingGet(projectId: string, params?: RequestParams) {
   return get(`/api/annotation/editor/projects/${projectId}/tasks`, params);
 }
@@ -77,11 +80,15 @@ export function getEditorTaskUsingGet(
   return get(`/api/annotation/editor/projects/${projectId}/tasks/${fileId}`, params);
 }

+export function getEditorTaskSegmentsUsingGet(projectId: string, fileId: string) {
+  return get(`/api/annotation/editor/projects/${projectId}/tasks/${fileId}/segments`);
+}
+
 export function upsertEditorAnnotationUsingPut(
   projectId: string,
   fileId: string,
   data: {
-    annotation: any;
+    annotation: Record<string, unknown>;
     expectedUpdatedAt?: string;
     segmentIndex?: number;
   }
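
With the RequestParams/RequestPayload aliases above, existing call sites compile unchanged while losing the implicit any. A usage sketch (the query keys shown are illustrative; this diff does not define them):

// Hypothetical parameters; RequestParams accepts any string-keyed object.
const tasks = await queryAnnotationTasksUsingGet({ page: 1, size: 10 });
const segments = await getEditorTaskSegmentsUsingGet("proj-1", "file-1");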

View File

@@ -6,6 +6,71 @@ import {
   CloseCircleOutlined,
 } from "@ant-design/icons";

+type AnnotationTaskStatistics = {
+  accuracy?: number | string;
+  averageTime?: number | string;
+  reviewCount?: number | string;
+};
+
+type AnnotationTaskPayload = {
+  id?: string;
+  labelingProjId?: string;
+  labelingProjectId?: string;
+  projId?: string;
+  labeling_project_id?: string;
+  name?: string;
+  description?: string;
+  datasetId?: string;
+  datasetName?: string;
+  dataset_name?: string;
+  labelingType?: string;
+  labeling_type?: string;
+  template?: {
+    labelingType?: string;
+    labeling_type?: string;
+  };
+  totalCount?: number;
+  total_count?: number;
+  annotatedCount?: number;
+  annotated_count?: number;
+  inProgressCount?: number;
+  in_progress_count?: number;
+  segmentationEnabled?: boolean;
+  segmentation_enabled?: boolean;
+  createdAt?: string;
+  created_at?: string;
+  updatedAt?: string;
+  updated_at?: string;
+  status?: string;
+  statistics?: AnnotationTaskStatistics;
+  [key: string]: unknown;
+};
+
+export type AnnotationTaskListItem = {
+  id?: string;
+  labelingProjId?: string;
+  projId?: string;
+  name?: string;
+  description?: string;
+  datasetId?: string;
+  datasetName?: string;
+  labelingType?: string;
+  totalCount?: number;
+  annotatedCount?: number;
+  inProgressCount?: number;
+  segmentationEnabled?: boolean;
+  createdAt?: string;
+  updatedAt?: string;
+  icon?: JSX.Element;
+  iconColor?: string;
+  status?: {
+    label: string;
+    color: string;
+  };
+  statistics?: { label: string; value: string | number }[];
+  [key: string]: unknown;
+};
+
 export const AnnotationTaskStatusMap = {
   [AnnotationTaskStatus.ACTIVE]: {
     label: "活跃",
@@ -27,9 +92,16 @@ export const AnnotationTaskStatusMap = {
   },
 };

-export function mapAnnotationTask(task: any) {
+export function mapAnnotationTask(task: AnnotationTaskPayload): AnnotationTaskListItem {
   // Normalize labeling project id from possible backend field names
   const labelingProjId = task?.labelingProjId || task?.labelingProjectId || task?.projId || task?.labeling_project_id || "";
+  const segmentationEnabled = task?.segmentationEnabled ?? task?.segmentation_enabled ?? false;
+  const inProgressCount = task?.inProgressCount ?? task?.in_progress_count ?? 0;
+  const labelingType =
+    task?.labelingType ||
+    task?.labeling_type ||
+    task?.template?.labelingType ||
+    task?.template?.labeling_type;

   const statsArray = task?.statistics
     ? [
@@ -45,6 +117,9 @@ export function mapAnnotationTask(task: any) {
     // provide consistent field for components
     labelingProjId,
     projId: labelingProjId,
+    segmentationEnabled,
+    inProgressCount,
+    labelingType,
     name: task.name,
     description: task.description || "",
     datasetName: task.datasetName || task.dataset_name || "-",
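
The widened payload type lets mapAnnotationTask accept both camelCase and snake_case backends. An illustration of the normalization using a made-up payload:

// Hypothetical input; only fields relevant to the new normalization are shown.
const item = mapAnnotationTask({
  labeling_project_id: "lp-42",
  segmentation_enabled: true,
  in_progress_count: 3,
  template: { labeling_type: "text" },
  name: "demo",
});
// item.labelingProjId === "lp-42" && item.projId === "lp-42"
// item.segmentationEnabled === true, item.inProgressCount === 3
// item.labelingType === "text"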

View File

@@ -8,6 +8,13 @@ export enum AnnotationTaskStatus {
   SKIPPED = "skipped",
 }

+export enum AnnotationResultStatus {
+  ANNOTATED = "ANNOTATED",
+  IN_PROGRESS = "IN_PROGRESS",
+  NO_ANNOTATION = "NO_ANNOTATION",
+  NOT_APPLICABLE = "NOT_APPLICABLE",
+}
+
 export interface AnnotationTask {
   id: string;
   name: string;
@@ -52,7 +59,7 @@ export interface ObjectDefinition {

 export interface TemplateConfiguration {
   labels: LabelDefinition[];
   objects: ObjectDefinition[];
-  metadata?: Record<string, any>;
+  metadata?: Record<string, unknown>;
 }

 export interface AnnotationTemplate {

View File

@@ -22,6 +22,7 @@ import {
   getObjectDisplayName,
   type LabelStudioTagConfig,
 } from "../annotation.tagconfig";
+import { DataType } from "../annotation.model";

 const { Text, Title } = Typography;
@@ -44,10 +45,22 @@ interface TemplateConfigurationTreeEditorProps {
   readOnly?: boolean;
   readOnlyStructure?: boolean;
   height?: number | string;
+  dataType?: DataType;
 }

 const DEFAULT_ROOT_TAG = "View";
 const CHILD_TAGS = ["Label", "Choice", "Relation", "Item", "Path", "Channel"];
+const OBJECT_TAGS_BY_DATA_TYPE: Record<DataType, string[]> = {
+  [DataType.TEXT]: ["Text", "Paragraphs", "Markdown"],
+  [DataType.IMAGE]: ["Image", "Bitmask"],
+  [DataType.AUDIO]: ["Audio", "AudioPlus"],
+  [DataType.VIDEO]: ["Video"],
+  [DataType.PDF]: ["PDF"],
+  [DataType.TIMESERIES]: ["Timeseries", "TimeSeries", "Vector"],
+  [DataType.CHAT]: ["Chat"],
+  [DataType.HTML]: ["HyperText", "Markdown"],
+  [DataType.TABLE]: ["Table", "Vector"],
+};

 const createId = () =>
   `node_${Date.now().toString(36)}_${Math.random().toString(36).slice(2, 8)}`;
@@ -247,19 +260,35 @@ const createNode = (
     attrs[attr] = "";
   });

-  if (objectConfig && attrs.name !== undefined) {
+  if (objectConfig) {
     const name = getDefaultName(tag);
-    attrs.name = name;
-    if (attrs.value !== undefined) {
-      attrs.value = `$${name}`;
+    if (!attrs.name) {
+      attrs.name = name;
+    }
+    if (!attrs.value) {
+      attrs.value = `$${attrs.name}`;
     }
   }

-  if (controlConfig && attrs.name !== undefined) {
-    attrs.name = getDefaultName(tag);
-    if (attrs.toName !== undefined) {
-      attrs.toName = objectNames[0] || "";
+  if (controlConfig) {
+    const isLabeling = controlConfig.category === "labeling";
+    if (isLabeling) {
+      if (!attrs.name) {
+        attrs.name = getDefaultName(tag);
+      }
+      if (!attrs.toName) {
+        attrs.toName = objectNames[0] || "";
+      }
+    } else {
+      // For layout controls, only fill if required
+      if (attrs.name !== undefined && !attrs.name) {
+        attrs.name = getDefaultName(tag);
+      }
+      if (attrs.toName !== undefined && !attrs.toName) {
+        attrs.toName = objectNames[0] || "";
+      }
     }
   }

   if (CHILD_TAGS.includes(tag)) {
@@ -420,14 +449,13 @@ const TemplateConfigurationTreeEditor = ({
   readOnly = false,
   readOnlyStructure = false,
   height = 420,
+  dataType,
 }: TemplateConfigurationTreeEditorProps) => {
   const { config } = useTagConfig(false);
   const [tree, setTree] = useState<XmlNode>(() => createEmptyTree());
   const [selectedId, setSelectedId] = useState<string>(tree.id);
   const [parseError, setParseError] = useState<string | null>(null);
   const lastSerialized = useRef<string>("");
-  const [addChildTag, setAddChildTag] = useState<string | undefined>();
-  const [addSiblingTag, setAddSiblingTag] = useState<string | undefined>();

   useEffect(() => {
     if (!value) {
@@ -498,11 +526,17 @@ const TemplateConfigurationTreeEditor = ({
   const objectOptions = useMemo(() => {
     if (!config?.objects) return [];
-    return Object.keys(config.objects).map((tag) => ({
+    const options = Object.keys(config.objects).map((tag) => ({
       value: tag,
       label: getObjectDisplayName(tag),
     }));
-  }, [config]);
+    if (!dataType) return options;
+    const allowedTags = OBJECT_TAGS_BY_DATA_TYPE[dataType];
+    if (!allowedTags) return options;
+    const allowedSet = new Set(allowedTags);
+    const filtered = options.filter((option) => allowedSet.has(option.value));
+    return filtered.length > 0 ? filtered : options;
+  }, [config, dataType]);

   const tagOptions = useMemo(() => {
     const options = [] as {
@@ -763,9 +797,8 @@ const TemplateConfigurationTreeEditor = ({
     <Select
       placeholder="添加子节点"
       options={tagOptions}
-      value={addChildTag}
+      value={null}
       onChange={(value) => {
-        setAddChildTag(undefined);
         handleAddNode(value, "child");
       }}
       disabled={isStructureLocked}
@@ -773,9 +806,8 @@ const TemplateConfigurationTreeEditor = ({
     <Select
       placeholder="添加同级节点"
       options={tagOptions}
-      value={addSiblingTag}
+      value={null}
       onChange={(value) => {
-        setAddSiblingTag(undefined);
         handleAddNode(value, "sibling");
       }}
       disabled={isStructureLocked || selectedNode.id === tree.id}
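
Note the fallback in objectOptions: if filtering by OBJECT_TAGS_BY_DATA_TYPE would leave nothing (an unmapped data type, or a config whose tags don't overlap the allow-list), the full list is returned so the picker never goes empty. The same guard in isolation, as a generic sketch:

// Filter with graceful fallback: never return an empty option set.
function filterWithFallback<T>(options: T[], allowed: Set<T>): T[] {
  const filtered = options.filter((option) => allowed.has(option));
  return filtered.length > 0 ? filtered : options;
}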

View File

@@ -7,6 +7,8 @@ interface PreviewPromptModalProps {
   evaluationPrompt: string;
 }

+const PREVIEW_MODAL_WIDTH = "80vw";
+
 const PreviewPromptModal: React.FC<PreviewPromptModalProps> = ({ previewVisible, onCancel, evaluationPrompt }) => {
   return (
     <Modal
@@ -24,7 +26,7 @@ const PreviewPromptModal: React.FC<PreviewPromptModalProps> = ({ previewVisible,
       </Button>
     ]}
-    width={800}
+    width={PREVIEW_MODAL_WIDTH}
   >
     <div style={{
       background: '#f5f5f5',

View File

@@ -11,10 +11,12 @@ export default function BasicInformation({
   data,
   setData,
   hidden = [],
+  datasetTypeOptions = datasetTypes,
 }: {
   data: DatasetFormData;
   setData: Dispatch<SetStateAction<DatasetFormData>>;
   hidden?: string[];
+  datasetTypeOptions?: DatasetTypeOption[];
 }) {
   const [tagOptions, setTagOptions] = useState<DatasetTagOption[]>([]);
   const [collectionOptions, setCollectionOptions] = useState<SelectOption[]>([]);
@@ -119,7 +121,7 @@ export default function BasicInformation({
   rules={[{ required: true, message: "请选择数据集类型" }]}
 >
   <RadioCard
-    options={datasetTypes}
+    options={datasetTypeOptions}
     value={data.type}
     onChange={(datasetType) => setData({ ...data, datasetType })}
   />
@@ -149,6 +151,8 @@ type DatasetFormData = Partial<Dataset> & {
   parentDatasetId?: string;
 };

+type DatasetTypeOption = (typeof datasetTypes)[number];
+
 type DatasetTagOption = {
   label: string;
   value: string;

View File

@@ -198,8 +198,14 @@ export default function DatasetDetail() {
     return;
   }
   fetchDataset();
-  filesOperation.fetchFiles("", 1, 10); // 从根目录开始,第一页
 }, [id]);

+useEffect(() => {
+  if (dataset?.id) {
+    filesOperation.fetchFiles("", 1, 10); // 从根目录开始,第一页
+  }
+  // eslint-disable-next-line react-hooks/exhaustive-deps
+}, [dataset?.id]);
+
 useEffect(() => {
   if (dataset?.parentDatasetId && activeTab === "children") {
     setActiveTab("overview");

View File

@@ -1,11 +1,11 @@
 import { Select, Input, Form, Radio, Modal, Button, UploadFile, Switch, Tooltip } from "antd";
 import { InboxOutlined, QuestionCircleOutlined } from "@ant-design/icons";
 import { dataSourceOptions } from "../../dataset.const";
-import { Dataset, DataSource } from "../../dataset.model";
+import { Dataset, DatasetType, DataSource } from "../../dataset.model";
 import { useCallback, useEffect, useMemo, useState } from "react";
 import { queryTasksUsingGet } from "@/pages/DataCollection/collection.apis";
 import { updateDatasetByIdUsingPut } from "../../dataset.api";
-import { sliceFile } from "@/utils/file.util";
+import { sliceFile, shouldStreamUpload } from "@/utils/file.util";
 import Dragger from "antd/es/upload/Dragger";

 const TEXT_FILE_MIME_PREFIX = "text/";
@@ -90,14 +90,16 @@ async function splitFileByLines(file: UploadFile): Promise<UploadFile[]> {
   const lines = text.split(/\r?\n/).filter((line: string) => line.trim() !== "");
   if (lines.length === 0) return [];

-  // 生成文件名:原文件名_序号.扩展名
+  // 生成文件名:原文件名_序号(不保留后缀)
   const nameParts = file.name.split(".");
-  const ext = nameParts.length > 1 ? "." + nameParts.pop() : "";
+  if (nameParts.length > 1) {
+    nameParts.pop();
+  }
   const baseName = nameParts.join(".");
   const padLength = String(lines.length).length;

   return lines.map((line: string, index: number) => {
-    const newFileName = `${baseName}_${String(index + 1).padStart(padLength, "0")}${ext}`;
+    const newFileName = `${baseName}_${String(index + 1).padStart(padLength, "0")}`;
     const blob = new Blob([line], { type: "text/plain" });
     const newFile = new File([blob], newFileName, { type: "text/plain" });
     return {
@@ -159,21 +161,80 @@ export default function ImportConfiguration({
     if (files.length === 0) return false;
     return files.some((file) => !isTextUploadFile(file));
   }, [importConfig.files]);

+  const isTextDataset = data?.datasetType === DatasetType.TEXT;
+
   // 本地上传文件相关逻辑
   const handleUpload = async (dataset: Dataset) => {
-    let filesToUpload =
+    const filesToUpload =
       (form.getFieldValue("files") as UploadFile[] | undefined) || [];

-    // 如果启用分行分割,处理文件
+    // 如果启用分行分割,对大文件使用流式处理
     if (importConfig.splitByLine && !hasNonTextFile) {
-      const splitResults = await Promise.all(
-        filesToUpload.map((file) => splitFileByLines(file))
-      );
-      filesToUpload = splitResults.flat();
+      // 检查是否有大文件需要流式分割上传
+      const filesForStreamUpload: File[] = [];
+      const filesForNormalUpload: UploadFile[] = [];
+
+      for (const file of filesToUpload) {
+        const originFile = file.originFileObj ?? file;
+        if (originFile instanceof File && shouldStreamUpload(originFile)) {
+          filesForStreamUpload.push(originFile);
+        } else {
+          filesForNormalUpload.push(file);
+        }
+      }
+
+      // 大文件使用流式分割上传
+      if (filesForStreamUpload.length > 0) {
+        window.dispatchEvent(
+          new CustomEvent("upload:dataset-stream", {
+            detail: {
+              dataset,
+              files: filesForStreamUpload,
+              updateEvent,
+              hasArchive: importConfig.hasArchive,
+              prefix: currentPrefix,
+            },
+          })
+        );
+      }
+
+      // 小文件使用传统分割方式
+      if (filesForNormalUpload.length > 0) {
+        const splitResults = await Promise.all(
+          filesForNormalUpload.map((file) => splitFileByLines(file))
+        );
+        const smallFilesToUpload = splitResults.flat();
+
+        // 计算分片列表
+        const sliceList = smallFilesToUpload.map((file) => {
+          const originFile = (file.originFileObj ?? file) as Blob;
+          const slices = sliceFile(originFile);
+          return {
+            originFile: originFile,
+            slices,
+            name: file.name,
+            size: originFile.size || 0,
+          };
+        });
+
+        console.log("[ImportConfiguration] Uploading small files with currentPrefix:", currentPrefix);
+        window.dispatchEvent(
+          new CustomEvent("upload:dataset", {
+            detail: {
+              dataset,
+              files: sliceList,
+              updateEvent,
+              hasArchive: importConfig.hasArchive,
+              prefix: currentPrefix,
+            },
+          })
+        );
+      }
+      return;
     }

+    // 未启用分行分割,使用普通上传
     // 计算分片列表
     const sliceList = filesToUpload.map((file) => {
       const originFile = (file.originFileObj ?? file) as Blob;
@@ -233,6 +294,10 @@ export default function ImportConfiguration({
     if (!data) return;
     console.log('[ImportConfiguration] handleImportData called, currentPrefix:', currentPrefix);
     if (importConfig.source === DataSource.UPLOAD) {
+      // 立即显示任务中心,让用户感知上传已开始(在文件分割等耗时操作之前)
+      window.dispatchEvent(
+        new CustomEvent("show:task-popover", { detail: { show: true } })
+      );
       await handleUpload(data);
     } else if (importConfig.source === DataSource.COLLECTION) {
       await updateDatasetByIdUsingPut(data.id, {
@@ -363,6 +428,7 @@ export default function ImportConfiguration({
         >
           <Switch />
         </Form.Item>
+        {isTextDataset && (
         <Form.Item
           label={
             <span>
@@ -383,6 +449,7 @@ export default function ImportConfiguration({
         >
           <Switch disabled={hasNonTextFile} />
         </Form.Item>
+        )}
         <Form.Item
           label="上传文件"
           name="files"

View File

@@ -4,6 +4,7 @@ import {
   Descriptions,
   DescriptionsProps,
   Modal,
+  Spin,
   Table,
   Input,
 } from "antd";
@@ -21,8 +22,8 @@ type DatasetFileRow = DatasetFile & {
 const PREVIEW_MAX_HEIGHT = 500;
 const PREVIEW_MODAL_WIDTH = {
-  text: 800,
-  media: 700,
+  text: "80vw",
+  media: "80vw",
 };
 const PREVIEW_TEXT_FONT_SIZE = 12;
 const PREVIEW_TEXT_PADDING = 12;
@@ -52,6 +53,8 @@ export default function Overview({
   previewFileType,
   previewMediaUrl,
   previewLoading,
+  officePreviewStatus,
+  officePreviewError,
   closePreview,
   handleDeleteFile,
   handleDownloadFile,
@@ -447,11 +450,39 @@ export default function Overview({
     </div>
   )}
   {previewFileType === "pdf" && (
+    <>
+      {previewMediaUrl ? (
         <iframe
           src={previewMediaUrl}
           title={previewFileName || "PDF 预览"}
           style={{ width: "100%", height: `${PREVIEW_MAX_HEIGHT}px`, border: "none" }}
         />
+      ) : (
+        <div
+          style={{
+            height: `${PREVIEW_MAX_HEIGHT}px`,
+            display: "flex",
+            flexDirection: "column",
+            alignItems: "center",
+            justifyContent: "center",
+            gap: 12,
+            color: "#666",
+          }}
+        >
+          {officePreviewStatus === "FAILED" ? (
+            <>
+              <div></div>
+              <div>{officePreviewError || "请稍后重试"}</div>
+            </>
+          ) : (
+            <>
+              <Spin />
+              <div>...</div>
+            </>
+          )}
+        </div>
+      )}
+    </>
   )}
   {previewFileType === "video" && (
     <div style={{ textAlign: "center" }}>

View File

@@ -2,9 +2,8 @@ import type {
   Dataset,
   DatasetFile,
 } from "@/pages/DataManagement/dataset.model";
-import { DatasetType } from "@/pages/DataManagement/dataset.model";
 import { App } from "antd";
-import { useState } from "react";
+import { useCallback, useEffect, useRef, useState } from "react";
 import {
   PREVIEW_TEXT_MAX_LENGTH,
   resolvePreviewFileType,
@@ -19,9 +18,33 @@ import {
   createDatasetDirectoryUsingPost,
   downloadDirectoryUsingGet,
   deleteDirectoryUsingDelete,
+  queryDatasetFilePreviewStatusUsingGet,
+  convertDatasetFilePreviewUsingPost,
 } from "../dataset.api";
 import { useParams } from "react-router";

+const OFFICE_FILE_EXTENSIONS = [".doc", ".docx"];
+const OFFICE_PREVIEW_POLL_INTERVAL = 2000;
+const OFFICE_PREVIEW_POLL_MAX_TIMES = 60;
+
+type OfficePreviewStatus = "UNSET" | "PENDING" | "PROCESSING" | "READY" | "FAILED";
+
+const isOfficeFileName = (fileName?: string) => {
+  const lowerName = (fileName || "").toLowerCase();
+  return OFFICE_FILE_EXTENSIONS.some((ext) => lowerName.endsWith(ext));
+};
+
+const normalizeOfficePreviewStatus = (status?: string): OfficePreviewStatus => {
+  if (!status) {
+    return "UNSET";
+  }
+  const upper = status.toUpperCase();
+  if (upper === "PENDING" || upper === "PROCESSING" || upper === "READY" || upper === "FAILED") {
+    return upper as OfficePreviewStatus;
+  }
+  return "UNSET";
+};
+
 export function useFilesOperation(dataset: Dataset) {
   const { message } = App.useApp();
@@ -44,6 +67,23 @@ export function useFilesOperation(dataset: Dataset) {
   const [previewFileType, setPreviewFileType] = useState<PreviewFileType>("text");
   const [previewMediaUrl, setPreviewMediaUrl] = useState("");
   const [previewLoading, setPreviewLoading] = useState(false);
+  const [officePreviewStatus, setOfficePreviewStatus] = useState<OfficePreviewStatus | null>(null);
+  const [officePreviewError, setOfficePreviewError] = useState("");
+  const officePreviewPollingRef = useRef<number | null>(null);
+  const officePreviewFileRef = useRef<string | null>(null);
+
+  const clearOfficePreviewPolling = useCallback(() => {
+    if (officePreviewPollingRef.current) {
+      window.clearTimeout(officePreviewPollingRef.current);
+      officePreviewPollingRef.current = null;
+    }
+  }, []);
+
+  useEffect(() => {
+    return () => {
+      clearOfficePreviewPolling();
+    };
+  }, [clearOfficePreviewPolling]);

   const fetchFiles = async (
     prefix?: string,
@@ -52,14 +92,13 @@ export function useFilesOperation(dataset: Dataset) {
   ) => {
     // 如果明确传了 prefix(包括空字符串),使用传入的值;否则使用当前 pagination.prefix
     const targetPrefix = prefix !== undefined ? prefix : (pagination.prefix || '');
-    const shouldExcludeDerivedFiles = dataset?.datasetType === DatasetType.TEXT;

     const params: DatasetFilesQueryParams = {
       page: current !== undefined ? current : pagination.current,
       size: pageSize !== undefined ? pageSize : pagination.pageSize,
       isWithDirectory: true,
       prefix: targetPrefix,
-      ...(shouldExcludeDerivedFiles ? { excludeDerivedFiles: true } : {}),
+      excludeDerivedFiles: true,
     };

     const { data } = await queryDatasetFilesUsingGet(id!, params);
@@ -113,17 +152,61 @@ export function useFilesOperation(dataset: Dataset) {
       return;
     }

+    const previewUrl = `/api/data-management/datasets/${datasetId}/files/${file.id}/preview`;
+    setPreviewFileName(file.fileName);
+    setPreviewContent("");
+    setPreviewMediaUrl("");
+
+    if (isOfficeFileName(file?.fileName)) {
+      setPreviewFileType("pdf");
+      setPreviewVisible(true);
+      setPreviewLoading(true);
+      setOfficePreviewStatus("PROCESSING");
+      setOfficePreviewError("");
+      officePreviewFileRef.current = file.id;
+      try {
+        const { data: statusData } = await queryDatasetFilePreviewStatusUsingGet(datasetId, file.id);
+        const currentStatus = normalizeOfficePreviewStatus(statusData?.status);
+        if (currentStatus === "READY") {
+          setPreviewMediaUrl(previewUrl);
+          setOfficePreviewStatus("READY");
+          setPreviewLoading(false);
+          return;
+        }
+        if (currentStatus === "PROCESSING") {
+          pollOfficePreviewStatus(datasetId, file.id, 0);
+          return;
+        }
+        const { data } = await convertDatasetFilePreviewUsingPost(datasetId, file.id);
+        const status = normalizeOfficePreviewStatus(data?.status);
+        if (status === "READY") {
+          setPreviewMediaUrl(previewUrl);
+          setOfficePreviewStatus("READY");
+        } else if (status === "FAILED") {
+          setOfficePreviewStatus("FAILED");
+          setOfficePreviewError(data?.previewError || "转换失败,请稍后重试");
+        } else {
+          setOfficePreviewStatus("PROCESSING");
+          pollOfficePreviewStatus(datasetId, file.id, 0);
+          return;
+        }
+      } catch (error) {
+        console.error("触发预览转换失败", error);
+        message.error({ content: "触发预览转换失败" });
+        setOfficePreviewStatus("FAILED");
+        setOfficePreviewError("触发预览转换失败");
+      } finally {
+        setPreviewLoading(false);
+      }
+      return;
+    }
+
     const fileType = resolvePreviewFileType(file?.fileName);
     if (!fileType) {
       message.warning({ content: "不支持预览该文件类型" });
       return;
     }

-    const previewUrl = `/api/data-management/datasets/${datasetId}/files/${file.id}/preview`;
-    setPreviewFileName(file.fileName);
     setPreviewFileType(fileType);
-    setPreviewContent("");
-    setPreviewMediaUrl("");

     if (fileType === "text") {
       setPreviewLoading(true);
@@ -149,13 +232,62 @@ export function useFilesOperation(dataset: Dataset) {
   };

   const closePreview = () => {
+    clearOfficePreviewPolling();
+    officePreviewFileRef.current = null;
     setPreviewVisible(false);
     setPreviewContent("");
     setPreviewMediaUrl("");
     setPreviewFileName("");
     setPreviewFileType("text");
+    setOfficePreviewStatus(null);
+    setOfficePreviewError("");
   };

+  const pollOfficePreviewStatus = useCallback(
+    async (datasetId: string, fileId: string, attempt: number) => {
+      clearOfficePreviewPolling();
+      officePreviewPollingRef.current = window.setTimeout(async () => {
+        if (officePreviewFileRef.current !== fileId) {
+          return;
+        }
+        try {
+          const { data } = await queryDatasetFilePreviewStatusUsingGet(datasetId, fileId);
+          const status = normalizeOfficePreviewStatus(data?.status);
+          if (status === "READY") {
+            setPreviewMediaUrl(`/api/data-management/datasets/${datasetId}/files/${fileId}/preview`);
+            setOfficePreviewStatus("READY");
+            setOfficePreviewError("");
+            setPreviewLoading(false);
+            return;
+          }
+          if (status === "FAILED") {
+            setOfficePreviewStatus("FAILED");
+            setOfficePreviewError(data?.previewError || "转换失败,请稍后重试");
+            setPreviewLoading(false);
+            return;
+          }
+          if (attempt >= OFFICE_PREVIEW_POLL_MAX_TIMES - 1) {
+            setOfficePreviewStatus("FAILED");
+            setOfficePreviewError("转换超时,请稍后重试");
+            setPreviewLoading(false);
+            return;
+          }
+          pollOfficePreviewStatus(datasetId, fileId, attempt + 1);
+        } catch (error) {
+          console.error("轮询预览状态失败", error);
+          if (attempt >= OFFICE_PREVIEW_POLL_MAX_TIMES - 1) {
+            setOfficePreviewStatus("FAILED");
+            setOfficePreviewError("转换超时,请稍后重试");
+            setPreviewLoading(false);
+            return;
+          }
+          pollOfficePreviewStatus(datasetId, fileId, attempt + 1);
+        }
+      }, OFFICE_PREVIEW_POLL_INTERVAL);
+    },
+    [clearOfficePreviewPolling]
+  );
+
   const handleDeleteFile = async (file: DatasetFile) => {
     try {
       await deleteDatasetFileUsingDelete(dataset.id, file.id);
@@ -198,6 +330,8 @@ export function useFilesOperation(dataset: Dataset) {
     previewFileType,
     previewMediaUrl,
     previewLoading,
+    officePreviewStatus,
+    officePreviewError,
     closePreview,
     fetchFiles,
     setFileList,
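
A note on the polling shape above: chaining window.setTimeout per attempt (rather than setInterval) guarantees each status request finishes before the next is scheduled, and keeping the timeout handle in a ref lets closePreview cancel the chain mid-flight. The same pattern reduced to a standalone skeleton (names here are illustrative, not from the commit):

// Generic timeout-chained poller: at most one in-flight check, cancellable.
function startPolling(
  check: () => Promise<boolean>, // resolve true to stop polling
  intervalMs: number,
  maxTimes: number
): () => void {
  let timer: number | null = null;
  let cancelled = false;
  const run = (attempt: number) => {
    timer = window.setTimeout(async () => {
      if (cancelled) return;
      const done = await check().catch(() => false);
      if (!done && attempt + 1 < maxTimes) run(attempt + 1);
    }, intervalMs);
  };
  run(0);
  return () => {
    cancelled = true;
    if (timer !== null) window.clearTimeout(timer);
  };
}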

View File

@@ -329,7 +329,7 @@ export default function DatasetManagementPage() {
<div className="gap-4 h-full flex flex-col"> <div className="gap-4 h-full flex flex-col">
{/* Header */} {/* Header */}
<div className="flex items-center justify-between"> <div className="flex items-center justify-between">
<h1 className="text-xl font-bold"></h1> <h1 className="text-xl font-bold"></h1>
<div className="flex gap-2 items-center"> <div className="flex gap-2 items-center">
{/* tasks */} {/* tasks */}
<TagManager <TagManager

View File

@@ -119,6 +119,22 @@ export function downloadFileByIdUsingGet(
   );
 }

+// 数据集文件预览状态
+export function queryDatasetFilePreviewStatusUsingGet(
+  datasetId: string | number,
+  fileId: string | number
+) {
+  return get(`/api/data-management/datasets/${datasetId}/files/${fileId}/preview/status`);
+}
+
+// 触发数据集文件预览转换
+export function convertDatasetFilePreviewUsingPost(
+  datasetId: string | number,
+  fileId: string | number
+) {
+  return post(`/api/data-management/datasets/${datasetId}/files/${fileId}/preview/convert`, {});
+}
+
 // 删除数据集文件
 export function deleteDatasetFileUsingDelete(
   datasetId: string | number,

View File

@@ -102,6 +102,13 @@ export interface DatasetTask {
   executionHistory?: { time: string; status: string }[];
 }

+export interface StreamUploadInfo {
+  currentFile: string;
+  fileIndex: number;
+  totalFiles: number;
+  uploadedLines: number;
+}
+
 export interface TaskItem {
   key: string;
   title: string;
@@ -113,4 +120,6 @@ export interface TaskItem {
   updateEvent?: string;
   size?: number;
   hasArchive?: boolean;
+  prefix?: string;
+  streamUploadInfo?: StreamUploadInfo;
 }

View File

@@ -36,6 +36,10 @@ const DEFAULT_STATISTICS: StatisticsItem[] = [
title: "知识集总数", title: "知识集总数",
value: 0, value: 0,
}, },
{
title: "知识类别",
value: 0,
},
{ {
title: "文件总数", title: "文件总数",
value: 0, value: 0,
@@ -109,6 +113,10 @@ export default function KnowledgeManagementPage() {
title: "知识集总数", title: "知识集总数",
value: stats?.totalKnowledgeSets ?? 0, value: stats?.totalKnowledgeSets ?? 0,
}, },
{
title: "知识类别",
value: stats?.totalTags ?? 0,
},
{ {
title: "文件总数", title: "文件总数",
value: stats?.totalFiles ?? 0, value: stats?.totalFiles ?? 0,
@@ -249,7 +257,7 @@ export default function KnowledgeManagementPage() {
return ( return (
<div className="h-full flex flex-col gap-4"> <div className="h-full flex flex-col gap-4">
<div className="flex items-center justify-between"> <div className="flex items-center justify-between">
<h1 className="text-xl font-bold"></h1> <h1 className="text-xl font-bold"></h1>
<div className="flex gap-2 items-center"> <div className="flex gap-2 items-center">
<Button onClick={() => navigate("/data/knowledge-management/search")}> <Button onClick={() => navigate("/data/knowledge-management/search")}>
@@ -276,7 +284,7 @@ export default function KnowledgeManagementPage() {
<div className="grid grid-cols-1 gap-4"> <div className="grid grid-cols-1 gap-4">
<Card> <Card>
<div className="grid grid-cols-3"> <div className="grid grid-cols-4">
{statisticsData.map((item) => ( {statisticsData.map((item) => (
<Statistic <Statistic
title={item.title} title={item.title}

View File

@@ -9,6 +9,7 @@ import {
 import {
   knowledgeSourceTypeOptions,
   knowledgeStatusOptions,
+  // sensitivityOptions,
 } from "../knowledge-management.const";
 import {
   KnowledgeSet,
@@ -169,9 +170,9 @@ export default function CreateKnowledgeSet({
   <Form.Item label="负责人" name="owner">
     <Input placeholder="请输入负责人" />
   </Form.Item>
-  <Form.Item label="敏感级别" name="sensitivity">
-    <Input placeholder="请输入敏感级别" />
-  </Form.Item>
+  {/* <Form.Item label="敏感级别" name="sensitivity">
+    <Select options={sensitivityOptions} placeholder="请选择敏感级别" />
+  </Form.Item> */}
 </div>
 <div className="grid grid-cols-2 gap-4">
   <Form.Item label="有效期开始" name="validFrom">
@@ -191,9 +192,6 @@ export default function CreateKnowledgeSet({
     placeholder="请选择或输入标签"
   />
 </Form.Item>
-<Form.Item label="扩展元数据" name="metadata">
-  <Input.TextArea placeholder="请输入元数据(JSON)" rows={3} />
-</Form.Item>
 </Form>
 </Modal>
 </>

View File

@@ -16,6 +16,7 @@ export default function KnowledgeItemEditor({
   open,
   setId,
   data,
+  parentPrefix,
   onCancel,
   onSuccess,
   readOnly,
@@ -23,12 +24,14 @@ export default function KnowledgeItemEditor({
   open: boolean;
   setId: string;
   data?: Partial<KnowledgeItem> | null;
+  parentPrefix?: string;
   readOnly?: boolean;
   onCancel: () => void;
   onSuccess: () => void;
 }) {
   const [fileList, setFileList] = useState<UploadFile[]>([]);
   const [replaceFileList, setReplaceFileList] = useState<UploadFile[]>([]);
+  const [loading, setLoading] = useState(false);
   const isFileItem =
     data?.contentType === KnowledgeContentType.FILE ||
     data?.sourceType === KnowledgeSourceType.FILE_UPLOAD;
@@ -49,7 +52,6 @@ export default function KnowledgeItemEditor({
       originFileObj: file,
     },
   ]);
-  message.success("文件已就绪,可提交创建条目");
   return false;
 };
@@ -95,6 +97,7 @@ export default function KnowledgeItemEditor({
       message.warning("请先选择文件");
       return;
     }
+    setLoading(true);
     const formData = new FormData();
     fileList.forEach((file) => {
       const origin = file.originFileObj as File | undefined;
@@ -102,6 +105,9 @@ export default function KnowledgeItemEditor({
         formData.append("files", origin);
       }
     });
+    if (parentPrefix) {
+      formData.append("parentPrefix", parentPrefix);
+    }
     await uploadKnowledgeItemsUsingPost(setId, formData);
     message.success(`已创建 ${fileList.length} 个知识条目`);
   } else {
@@ -121,6 +127,7 @@ export default function KnowledgeItemEditor({
       message.warning("请先选择要替换的文件");
       return;
     }
+    setLoading(true);
     const formData = new FormData();
     formData.append("file", replaceFile);
     await replaceKnowledgeItemFileUsingPut(setId, data.id, formData);
@@ -132,6 +139,8 @@ export default function KnowledgeItemEditor({
     onSuccess();
   } catch {
     message.error("操作失败,请重试");
+  } finally {
+    setLoading(false);
   }
 };
@@ -148,6 +157,7 @@ export default function KnowledgeItemEditor({
   width={860}
   maskClosable={false}
   okButtonProps={{ disabled: readOnly }}
+  confirmLoading={loading}
 >
   <Form layout="vertical" disabled={readOnly}>
     {isCreateMode && (

View File

@@ -35,6 +35,22 @@ export function queryKnowledgeItemsUsingGet(setId: string, params?: Record<strin
   return get(`/api/data-management/knowledge-sets/${setId}/items`, params);
 }

+// 知识条目目录列表
+export function queryKnowledgeDirectoriesUsingGet(setId: string, params?: Record<string, unknown>) {
+  return get(`/api/data-management/knowledge-sets/${setId}/directories`, params);
+}
+
+// 创建知识条目目录
+export function createKnowledgeDirectoryUsingPost(setId: string, data: Record<string, unknown>) {
+  return post(`/api/data-management/knowledge-sets/${setId}/directories`, data);
+}
+
+// 删除知识条目目录
+export function deleteKnowledgeDirectoryUsingDelete(setId: string, relativePath: string) {
+  const query = new URLSearchParams({ relativePath }).toString();
+  return del(`/api/data-management/knowledge-sets/${setId}/directories?${query}`);
+}
+
 // 知识条目文件搜索
 export function searchKnowledgeItemsUsingGet(params?: Record<string, unknown>) {
   return get("/api/data-management/knowledge-items/search", params);
@@ -70,6 +86,11 @@ export function deleteKnowledgeItemByIdUsingDelete(setId: string, itemId: string
   return del(`/api/data-management/knowledge-sets/${setId}/items/${itemId}`);
 }

+// 批量删除知识条目
+export function deleteKnowledgeItemsByIdsUsingPost(setId: string, data: { ids: string[] }) {
+  return post(`/api/data-management/knowledge-sets/${setId}/items/batch-delete`, data);
+}
+
 // 上传知识条目文件
 export function uploadKnowledgeItemsUsingPost(setId: string, data: FormData) {
   return post(`/api/data-management/knowledge-sets/${setId}/items/upload`, data);
@@ -80,6 +101,16 @@ export function downloadKnowledgeItemFileUsingGet(setId: string, itemId: string,
   return download(`/api/data-management/knowledge-sets/${setId}/items/${itemId}/file`, null, fileName || "");
 }

+// 知识条目预览状态
+export function queryKnowledgeItemPreviewStatusUsingGet(setId: string, itemId: string) {
+  return get(`/api/data-management/knowledge-sets/${setId}/items/${itemId}/preview/status`);
+}
+
+// 触发知识条目预览转换
+export function convertKnowledgeItemPreviewUsingPost(setId: string, itemId: string) {
+  return post(`/api/data-management/knowledge-sets/${setId}/items/${itemId}/preview/convert`, {});
+}
+
 // 导出知识条目
 export function exportKnowledgeItemsUsingGet(setId: string) {
   return download(`/api/data-management/knowledge-sets/${setId}/items/export`);

View File

@@ -66,6 +66,11 @@ export const knowledgeSourceTypeOptions = [
{ label: "文件上传", value: KnowledgeSourceType.FILE_UPLOAD }, { label: "文件上传", value: KnowledgeSourceType.FILE_UPLOAD },
]; ];
// export const sensitivityOptions = [
// { label: "敏感", value: "敏感" },
// { label: "不敏感", value: "不敏感" },
// ];
export type KnowledgeSetView = { export type KnowledgeSetView = {
id: string; id: string;
name: string; name: string;
@@ -106,6 +111,7 @@ export type KnowledgeItemView = {
sensitivity?: string; sensitivity?: string;
sourceDatasetId?: string; sourceDatasetId?: string;
sourceFileId?: string; sourceFileId?: string;
relativePath?: string;
metadata?: string; metadata?: string;
createdAt?: string; createdAt?: string;
updatedAt?: string; updatedAt?: string;
@@ -153,6 +159,7 @@ export function mapKnowledgeItem(data: KnowledgeItem): KnowledgeItemView {
sensitivity: data.sensitivity, sensitivity: data.sensitivity,
sourceDatasetId: data.sourceDatasetId, sourceDatasetId: data.sourceDatasetId,
sourceFileId: data.sourceFileId, sourceFileId: data.sourceFileId,
relativePath: data.relativePath,
metadata: data.metadata, metadata: data.metadata,
createdAt: data.createdAt ? formatDateTime(data.createdAt) : "", createdAt: data.createdAt ? formatDateTime(data.createdAt) : "",
updatedAt: data.updatedAt ? formatDateTime(data.updatedAt) : "", updatedAt: data.updatedAt ? formatDateTime(data.updatedAt) : "",

View File

@@ -61,6 +61,7 @@ export interface KnowledgeItem {
   sensitivity?: string;
   sourceDatasetId?: string;
   sourceFileId?: string;
+  relativePath?: string;
   metadata?: string;
   createdAt?: string;
   updatedAt?: string;
@@ -68,10 +69,20 @@ export interface KnowledgeItem {
   updatedBy?: string;
 }

+export interface KnowledgeDirectory {
+  id: string;
+  setId: string;
+  name: string;
+  relativePath: string;
+  createdAt?: string;
+  updatedAt?: string;
+}
+
 export interface KnowledgeManagementStatistics {
   totalKnowledgeSets: number;
   totalFiles: number;
   totalSize: number;
+  totalTags: number;
 }

 export interface KnowledgeItemSearchResult {
@@ -84,6 +95,7 @@ export interface KnowledgeItemSearchResult {
   sourceFileId?: string;
   fileName?: string;
   fileSize?: number;
+  relativePath?: string;
   createdAt?: string;
   updatedAt?: string;
 }

View File

@@ -4,6 +4,7 @@ import {
   CloseOutlined,
   MenuOutlined,
   SettingOutlined,
+  LogoutOutlined,
 } from "@ant-design/icons";
 import { ClipboardList, X } from "lucide-react";
 import { menuItems } from "@/pages/Layout/menu";
@@ -12,6 +13,7 @@ import TaskUpload from "./TaskUpload";
 import SettingsPage from "../SettingsPage/SettingsPage";
 import { useAppSelector, useAppDispatch } from "@/store/hooks";
 import { showSettings, hideSettings } from "@/store/slices/settingsSlice";
+import { logout } from "@/store/slices/authSlice";

 const isPathMatch = (currentPath: string, targetPath: string) =>
   currentPath === targetPath || currentPath.startsWith(`${targetPath}/`);
@@ -67,6 +69,11 @@ const AsiderAndHeaderLayout = () => {
     };
   }, []);

+  const handleLogout = () => {
+    dispatch(logout());
+    navigate("/login");
+  };
+
   return (
     <div
       className={`${
@@ -148,6 +155,9 @@ const AsiderAndHeaderLayout = () => {
             >
             </Button>
+            <Button block danger onClick={handleLogout}>
+              退
+            </Button>
           </div>
         ) : (
           <div className="space-y-2">
@@ -175,6 +185,7 @@ const AsiderAndHeaderLayout = () => {
             >
               <SettingOutlined />
             </Button>
+            <Button block danger onClick={handleLogout} icon={<LogoutOutlined />} />
           </div>
         )}
       </div>

View File

@@ -3,25 +3,28 @@ import {
   preUploadUsingPost,
   uploadFileChunkUsingPost,
 } from "@/pages/DataManagement/dataset.api";
-import { Button, Empty, Progress } from "antd";
+import { Button, Empty, Progress, Tag } from "antd";
-import { DeleteOutlined } from "@ant-design/icons";
+import { DeleteOutlined, FileTextOutlined } from "@ant-design/icons";
 import { useEffect } from "react";
 import { useFileSliceUpload } from "@/hooks/useSliceUpload";

 export default function TaskUpload() {
-  const { createTask, taskList, removeTask, handleUpload } = useFileSliceUpload(
+  const { createTask, taskList, removeTask, handleUpload, registerStreamUploadListener } = useFileSliceUpload(
     {
       preUpload: preUploadUsingPost,
       uploadChunk: uploadFileChunkUsingPost,
       cancelUpload: cancelUploadUsingPut,
-    }
+    },
+    true, // showTaskCenter
+    true // enableStreamUpload
   );

   useEffect(() => {
-    const uploadHandler = (e: any) => {
-      console.log('[TaskUpload] Received upload event detail:', e.detail);
-      const { files } = e.detail;
-      const task = createTask(e.detail);
+    const uploadHandler = (e: Event) => {
+      const customEvent = e as CustomEvent;
+      console.log('[TaskUpload] Received upload event detail:', customEvent.detail);
+      const { files } = customEvent.detail;
+      const task = createTask(customEvent.detail);
       console.log('[TaskUpload] Created task with prefix:', task.prefix);
       handleUpload({ task, files });
     };
@@ -29,7 +32,13 @@ export default function TaskUpload() {
     return () => {
       window.removeEventListener("upload:dataset", uploadHandler);
     };
-  }, []);
+  }, [createTask, handleUpload]);
+
+  // 注册流式上传监听器
+  useEffect(() => {
+    const unregister = registerStreamUploadListener();
+    return unregister;
+  }, [registerStreamUploadListener]);

   return (
     <div
@@ -55,7 +64,22 @@ export default function TaskUpload() {
             ></Button>
           </div>
-          <Progress size="small" percent={task.percent} />
+          <Progress size="small" percent={Number(task.percent)} />
+          {task.streamUploadInfo && (
+            <div className="flex items-center gap-2 text-xs text-gray-500 mt-1">
+              <Tag icon={<FileTextOutlined />} size="small">
+              </Tag>
+              <span>
+                : {task.streamUploadInfo.uploadedLines}
+              </span>
+              {task.streamUploadInfo.totalFiles > 1 && (
+                <span>
+                  ({task.streamUploadInfo.fileIndex}/{task.streamUploadInfo.totalFiles} )
+                </span>
+              )}
+            </div>
+          )}
         </div>
       ))}
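
Both upload paths communicate through window CustomEvents ("upload:dataset" and "upload:dataset-stream"). A typed subscription sketch for the stream event — the detail fields mirror what ImportConfiguration dispatches earlier in this changeset, but the interface name and wrapper are ours, not part of the commit:

// Sketch only: event name and detail fields taken from the dispatch site above.
interface StreamUploadEventDetail {
  dataset: unknown;
  files: File[];
  updateEvent?: string;
  hasArchive?: boolean;
  prefix?: string;
}

function onStreamUpload(
  handler: (detail: StreamUploadEventDetail) => void
): () => void {
  const listener = (e: Event) =>
    handler((e as CustomEvent<StreamUploadEventDetail>).detail);
  window.addEventListener("upload:dataset-stream", listener);
  // Return an unregister function, matching the registerStreamUploadListener pattern.
  return () => window.removeEventListener("upload:dataset-stream", listener);
}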
{taskList.length === 0 && ( {taskList.length === 0 && (

View File

@@ -24,11 +24,25 @@ export const menuItems = [
   // },
   {
     id: "management",
     title: "数据管理",
     icon: FolderOpen,
     description: "创建、导入和管理数据集",
     color: "bg-blue-500",
   },
+  {
+    id: "annotation",
+    title: "数据标注",
+    icon: Tag,
+    description: "对数据进行标注和标记",
+    color: "bg-green-500",
+  },
+  {
+    id: "content-generation",
+    title: "内容生成",
+    icon: Sparkles,
+    description: "智能内容生成与创作",
+    color: "bg-purple-500",
+  },
   {
     id: "knowledge-management",
     title: "知识管理",
@@ -43,20 +57,6 @@ export const menuItems = [
   //   description: "数据清洗和预处理",
   //   color: "bg-purple-500",
   // },
-  {
-    id: "annotation",
-    title: "数据标注",
-    icon: Tag,
-    description: "对数据进行标注和标记",
-    color: "bg-green-500",
-  },
-  {
-    id: "content-generation",
-    title: "内容生成",
-    icon: Sparkles,
-    description: "智能内容生成与创作",
-    color: "bg-purple-500",
-  },
   // {
   //   id: "synthesis",
   //   title: "数据合成",

@@ -0,0 +1,114 @@
import React, { useState } from 'react';
import { useNavigate, useLocation } from 'react-router';
import { Form, Input, Button, Typography, message, Card } from 'antd';
import { UserOutlined, LockOutlined } from '@ant-design/icons';
import { useAppDispatch, useAppSelector } from '@/store/hooks';
import { loginLocal } from '@/store/slices/authSlice';
const { Title, Text } = Typography;
const LoginPage: React.FC = () => {
const navigate = useNavigate();
const location = useLocation();
const dispatch = useAppDispatch();
const { loading, error } = useAppSelector((state) => state.auth);
const [messageApi, contextHolder] = message.useMessage();
const from = location.state?.from?.pathname || '/data';
const onFinish = (values: any) => {
dispatch(loginLocal(values));
// The reducer updates state synchronously.
if (values.username === 'admin' && values.password === '123456') {
messageApi.success('登录成功');
navigate(from, { replace: true });
} else {
messageApi.error('账号或密码错误');
}
};
return (
<div className="min-h-screen flex items-center justify-center bg-[#050b14] relative overflow-hidden">
{contextHolder}
{/* Background Effects */}
<div className="absolute inset-0 z-0">
<div className="absolute top-0 left-0 w-full h-full bg-[radial-gradient(ellipse_at_center,_var(--tw-gradient-stops))] from-blue-900/20 via-[#050b14] to-[#050b14]"></div>
{/* Simple grid pattern if possible, or just gradient */}
</div>
<div className="absolute top-1/4 left-1/4 w-72 h-72 bg-blue-500/10 rounded-full blur-3xl animate-pulse"></div>
<div className="absolute bottom-1/4 right-1/4 w-96 h-96 bg-cyan-500/10 rounded-full blur-3xl animate-pulse delay-700"></div>
<div className="z-10 w-full max-w-md p-8 animate-[fadeIn_0.5s_ease-out_forwards]">
<div className="backdrop-blur-xl bg-white/5 border border-white/10 rounded-2xl shadow-2xl p-8 relative overflow-hidden">
{/* Decorative line */}
<div className="absolute top-0 left-0 w-full h-1 bg-gradient-to-r from-transparent via-blue-500 to-transparent"></div>
<div className="text-center mb-8">
<div className="inline-flex items-center justify-center w-16 h-16 rounded-full bg-blue-500/20 mb-4 border border-blue-500/30">
<svg className="w-8 h-8 text-blue-400" fill="none" stroke="currentColor" viewBox="0 0 24 24" xmlns="http://www.w3.org/2000/svg">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10" />
</svg>
</div>
<Title level={2} className="!text-white !mb-2 tracking-wide font-bold">
DataBuilder
</Title>
<Text className="text-gray-400! text-sm tracking-wider">
</Text>
</div>
<Form
name="login"
initialValues={{ remember: true, username: 'admin', password: '123456' }}
onFinish={onFinish}
layout="vertical"
size="large"
>
<Form.Item
name="username"
rules={[{ required: true, message: '请输入账号!' }]}
>
<Input
prefix={<UserOutlined className="text-blue-400" />}
placeholder="账号"
className="!bg-white/5 !border-white/10 !text-white placeholder:!text-gray-600 hover:!border-blue-500/50 focus:!border-blue-500 !rounded-lg"
/>
</Form.Item>
<Form.Item
name="password"
rules={[{ required: true, message: '请输入密码!' }]}
>
<Input.Password
prefix={<LockOutlined className="text-blue-400" />}
type="password"
placeholder="密码"
className="!bg-white/5 !border-white/10 !text-white placeholder:!text-gray-600 hover:!border-blue-500/50 focus:!border-blue-500 !rounded-lg"
/>
</Form.Item>
<Form.Item className="mb-2">
<Button
type="primary"
htmlType="submit"
className="w-full bg-gradient-to-r from-blue-600 to-cyan-600 hover:from-blue-500 hover:to-cyan-500 border-none h-12 rounded-lg font-semibold tracking-wide shadow-lg shadow-blue-900/20"
loading={loading}
>
</Button>
</Form.Item>
<div className="text-center mt-4">
<Text className="text-gray-600! text-xs">
·
</Text>
</div>
</Form>
</div>
</div>
</div>
);
};
export default LoginPage;

@@ -49,12 +49,21 @@ import EvaluationDetailPage from "@/pages/DataEvaluation/Detail/TaskDetail.tsx";
 import SynthDataDetail from "@/pages/SynthesisTask/SynthDataDetail.tsx";
 import Home from "@/pages/Home/Home";
 import ContentGenerationPage from "@/pages/ContentGeneration/ContentGenerationPage";
+import LoginPage from "@/pages/Login/LoginPage";
+import ProtectedRoute from "@/components/ProtectedRoute";
 const router = createBrowserRouter([
+  {
+    path: "/login",
+    Component: LoginPage,
+  },
   {
     path: "/",
     Component: Home,
   },
+  {
+    Component: ProtectedRoute,
+    children: [
       {
         path: "/chat",
         Component: withErrorBoundary(AgentPage),
@@ -286,6 +295,8 @@ const router = createBrowserRouter([
       },
     ],
   },
+  ]
+  }
 ]);
 export default router;
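Note: the diff references a ProtectedRoute component whose implementation is not shown here. A minimal sketch of what such a layout-route guard typically looks like, assuming it reads isAuthenticated from the auth slice and preserves the attempted location for the post-login redirect that LoginPage reads from location.state:

// Hypothetical sketch — the real ProtectedRoute is not shown in this diff.
import { Navigate, Outlet, useLocation } from "react-router";
import { useAppSelector } from "@/store/hooks";

export default function ProtectedRoute() {
  const isAuthenticated = useAppSelector((s) => s.auth.isAuthenticated);
  const location = useLocation();
  if (!isAuthenticated) {
    // remember where the user was headed so LoginPage can return there
    return <Navigate to="/login" state={{ from: location }} replace />;
  }
  return <Outlet />; // render the matched child route
}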

@@ -31,7 +31,7 @@ const authSlice = createSlice({
   initialState: {
     user: null,
     token: localStorage.getItem('token'),
-    isAuthenticated: false,
+    isAuthenticated: !!localStorage.getItem('token'),
     loading: false,
     error: null,
   },
@@ -49,6 +49,19 @@ const authSlice = createSlice({
       state.token = action.payload;
       localStorage.setItem('token', action.payload);
     },
+    loginLocal: (state, action) => {
+      const { username, password } = action.payload;
+      if (username === 'admin' && password === '123456') {
+        state.user = { username: 'admin', role: 'admin' };
+        state.token = 'mock-token-' + Date.now();
+        state.isAuthenticated = true;
+        localStorage.setItem('token', state.token);
+        state.error = null;
+      } else {
+        state.error = 'Invalid credentials';
+        state.isAuthenticated = false;
+      }
+    },
   },
   extraReducers: (builder) => {
     builder
@@ -71,5 +84,5 @@ const authSlice = createSlice({
   },
 });
-export const { logout, clearError, setToken } = authSlice.actions;
+export const { logout, clearError, setToken, loginLocal } = authSlice.actions;
 export default authSlice.reducer;

@@ -1,79 +1,657 @@
 import { UploadFile } from "antd";
 import jsSHA from "jssha";
-const CHUNK_SIZE = 1024 * 1024 * 60;
+// 默认分片大小:5MB(适合大多数网络环境)
+export const DEFAULT_CHUNK_SIZE = 1024 * 1024 * 5;
+// 大文件阈值:10MB
+export const LARGE_FILE_THRESHOLD = 1024 * 1024 * 10;
+// 最大并发上传数
+export const MAX_CONCURRENT_UPLOADS = 3;
+// 文本文件读取块大小:20MB(用于计算 SHA256)
+const BUFFER_CHUNK_SIZE = 1024 * 1024 * 20;
-export function sliceFile(file, chunkSize = CHUNK_SIZE): Blob[] {
+/**
+ * 将文件分割为多个分片
+ * @param file 文件对象
+ * @param chunkSize 分片大小(字节),默认 5MB
+ * @returns 分片数组(Blob 列表)
+ */
+export function sliceFile(file: Blob, chunkSize = DEFAULT_CHUNK_SIZE): Blob[] {
   const totalSize = file.size;
+  const chunks: Blob[] = [];
+  // 小文件不需要分片
+  if (totalSize <= chunkSize) {
+    return [file];
+  }
   let start = 0;
-  let end = start + chunkSize;
-  const chunks = [];
   while (start < totalSize) {
+    const end = Math.min(start + chunkSize, totalSize);
     const blob = file.slice(start, end);
     chunks.push(blob);
     start = end;
-    end = start + chunkSize;
   }
   return chunks;
 }
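A quick usage sketch of the reworked sliceFile: files at or below chunkSize come back as a single-element array, larger files are cut into chunkSize pieces with a smaller final chunk.

// 12 MB blob with the 5 MB default -> 3 chunks: 5 MB, 5 MB, 2 MB.
const blob = new Blob([new Uint8Array(12 * 1024 * 1024)]);
const chunks = sliceFile(blob);
console.log(chunks.map((c) => c.size)); // [5242880, 5242880, 2097152]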
-export function calculateSHA256(file: Blob): Promise<string> {
-  let count = 0;
-  const hash = new jsSHA("SHA-256", "ARRAYBUFFER", { encoding: "UTF8" });
+/**
+ * 计算文件的 SHA256 哈希值
+ * @param file 文件 Blob
+ * @param onProgress 进度回调(可选)
+ * @returns SHA256 哈希字符串
+ */
+export function calculateSHA256(
+  file: Blob,
+  onProgress?: (percent: number) => void
+): Promise<string> {
   return new Promise((resolve, reject) => {
+    const hash = new jsSHA("SHA-256", "ARRAYBUFFER", { encoding: "UTF8" });
     const reader = new FileReader();
+    let processedSize = 0;
     function readChunk(start: number, end: number) {
       const slice = file.slice(start, end);
       reader.readAsArrayBuffer(slice);
     }
-    const bufferChunkSize = 1024 * 1024 * 20;
     function processChunk(offset: number) {
       const start = offset;
-      const end = Math.min(start + bufferChunkSize, file.size);
-      count = end;
+      const end = Math.min(start + BUFFER_CHUNK_SIZE, file.size);
       readChunk(start, end);
     }
-    reader.onloadend = function () {
-      const arraybuffer = reader.result;
+    reader.onloadend = function (e) {
+      const arraybuffer = reader.result as ArrayBuffer;
+      if (!arraybuffer) {
+        reject(new Error("Failed to read file"));
+        return;
+      }
       hash.update(arraybuffer);
-      if (count < file.size) {
-        processChunk(count);
+      processedSize += (e.target as FileReader).result?.byteLength || 0;
+      if (onProgress) {
+        const percent = Math.min(100, Math.round((processedSize / file.size) * 100));
+        onProgress(percent);
+      }
+      if (processedSize < file.size) {
+        processChunk(processedSize);
       } else {
         resolve(hash.getHash("HEX", { outputLen: 256 }));
       }
     };
+    reader.onerror = () => reject(new Error("File reading failed"));
    processChunk(0);
   });
 }
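Usage sketch for the reworked hash helper, now that the running offset (processedSize) also drives the optional progress callback:

// inside an async function:
const digest = await calculateSHA256(blob, (percent) => {
  console.log(`hashing: ${percent}%`);
});
console.log("sha256:", digest); // 64-char hex string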
+/**
+ * 批量计算多个文件的 SHA256
+ * @param files 文件列表
+ * @param onFileProgress 单个文件进度回调(可选)
+ * @returns 哈希值数组
+ */
+export async function calculateSHA256Batch(
+  files: Blob[],
+  onFileProgress?: (index: number, percent: number) => void
+): Promise<string[]> {
+  const results: string[] = [];
+  for (let i = 0; i < files.length; i++) {
+    const hash = await calculateSHA256(files[i], (percent) => {
+      onFileProgress?.(i, percent);
+    });
+    results.push(hash);
+  }
+  return results;
+}
+/**
+ * 检查文件是否存在(未被修改或删除)
+ * @param fileList 文件列表
+ * @returns 返回第一个不存在的文件,或 null(如果都存在)
+ */
 export function checkIsFilesExist(
-  fileList: UploadFile[]
-): Promise<UploadFile | null> {
+  fileList: Array<{ originFile?: Blob }>
+): Promise<{ originFile?: Blob } | null> {
   return new Promise((resolve) => {
-    const loadEndFn = (file: UploadFile, reachEnd: boolean, e) => {
-      const fileNotExist = !e.target.result;
+    if (!fileList.length) {
+      resolve(null);
+      return;
+    }
+    let checkedCount = 0;
+    const totalCount = fileList.length;
+    const loadEndFn = (file: { originFile?: Blob }, e: ProgressEvent<FileReader>) => {
+      checkedCount++;
+      const fileNotExist = !e.target?.result;
       if (fileNotExist) {
         resolve(file);
+        return;
       }
-      if (reachEnd) {
+      if (checkedCount >= totalCount) {
         resolve(null);
       }
     };
-    for (let i = 0; i < fileList.length; i++) {
-      const { originFile: file } = fileList[i];
+    for (const file of fileList) {
       const fileReader = new FileReader();
-      fileReader.readAsArrayBuffer(file);
-      fileReader.onloadend = (e) =>
-        loadEndFn(fileList[i], i === fileList.length - 1, e);
+      const actualFile = file.originFile;
+      if (!actualFile) {
+        checkedCount++;
+        if (checkedCount >= totalCount) {
+          resolve(null);
+        }
+        continue;
+      }
+      fileReader.readAsArrayBuffer(actualFile.slice(0, 1));
+      fileReader.onloadend = (e) => loadEndFn(file, e);
+      fileReader.onerror = () => {
+        checkedCount++;
+        resolve(file);
+      };
     }
   });
 }
/**
* 判断文件是否为大文件
* @param size 文件大小(字节)
* @param threshold 阈值(字节),默认 10MB
*/
export function isLargeFile(size: number, threshold = LARGE_FILE_THRESHOLD): boolean {
return size > threshold;
}
/**
* 格式化文件大小为人类可读格式
* @param bytes 字节数
* @param decimals 小数位数
*/
export function formatFileSize(bytes: number, decimals = 2): string {
if (bytes === 0) return "0 B";
const k = 1024;
const sizes = ["B", "KB", "MB", "GB", "TB", "PB"];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return `${parseFloat((bytes / Math.pow(k, i)).toFixed(decimals))} ${sizes[i]}`;
}
/**
* 并发执行异步任务
* @param tasks 任务函数数组
* @param maxConcurrency 最大并发数
* @param onTaskComplete 单个任务完成回调(可选)
*/
export async function runConcurrentTasks<T>(
tasks: (() => Promise<T>)[],
maxConcurrency: number,
onTaskComplete?: (index: number, result: T) => void
): Promise<T[]> {
const results: T[] = new Array(tasks.length);
let index = 0;
async function runNext(): Promise<void> {
const currentIndex = index++;
if (currentIndex >= tasks.length) return;
const result = await tasks[currentIndex]();
results[currentIndex] = result;
onTaskComplete?.(currentIndex, result);
await runNext();
}
const workers = Array(Math.min(maxConcurrency, tasks.length))
.fill(null)
.map(() => runNext());
await Promise.all(workers);
return results;
}
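A small sketch of runConcurrentTasks: ten jobs, at most three in flight at once; results keep their original index order regardless of completion order.

const tasks = Array.from({ length: 10 }, (_, i) => async () => {
  await new Promise((r) => setTimeout(r, Math.random() * 100));
  return i * 2;
});
// inside an async function:
const results = await runConcurrentTasks(tasks, 3, (i, r) =>
  console.log(`task ${i} finished with ${r}`)
);
console.log(results); // [0, 2, 4, ..., 18] in task order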
/**
* 按行分割文本文件内容
* @param text 文本内容
* @param skipEmptyLines 是否跳过空行,默认 true
* @returns 行数组
*/
export function splitTextByLines(text: string, skipEmptyLines = true): string[] {
const lines = text.split(/\r?\n/);
if (skipEmptyLines) {
return lines.filter((line) => line.trim() !== "");
}
return lines;
}
/**
* 创建分片信息对象
* @param file 原始文件
* @param chunkSize 分片大小
*/
export function createFileSliceInfo(
file: File | Blob,
chunkSize = DEFAULT_CHUNK_SIZE
): {
originFile: Blob;
slices: Blob[];
name: string;
size: number;
totalChunks: number;
} {
const slices = sliceFile(file, chunkSize);
return {
originFile: file,
slices,
name: (file as File).name || "unnamed",
size: file.size,
totalChunks: slices.length,
};
}
/**
* 支持的文本文件 MIME 类型前缀
*/
export const TEXT_FILE_MIME_PREFIX = "text/";
/**
* 支持的文本文件 MIME 类型集合
*/
export const TEXT_FILE_MIME_TYPES = new Set([
"application/json",
"application/xml",
"application/csv",
"application/ndjson",
"application/x-ndjson",
"application/x-yaml",
"application/yaml",
"application/javascript",
"application/x-javascript",
"application/sql",
"application/rtf",
"application/xhtml+xml",
"application/svg+xml",
]);
/**
* 支持的文本文件扩展名集合
*/
export const TEXT_FILE_EXTENSIONS = new Set([
".txt",
".md",
".markdown",
".csv",
".tsv",
".json",
".jsonl",
".ndjson",
".log",
".xml",
".yaml",
".yml",
".sql",
".js",
".ts",
".jsx",
".tsx",
".html",
".htm",
".css",
".scss",
".less",
".py",
".java",
".c",
".cpp",
".h",
".hpp",
".go",
".rs",
".rb",
".php",
".sh",
".bash",
".zsh",
".ps1",
".bat",
".cmd",
".svg",
".rtf",
]);
/**
* 判断文件是否为文本文件(支持 UploadFile 类型)
* @param file UploadFile 对象
*/
export function isTextUploadFile(file: UploadFile): boolean {
const mimeType = (file.type || "").toLowerCase();
if (mimeType) {
if (mimeType.startsWith(TEXT_FILE_MIME_PREFIX)) return true;
if (TEXT_FILE_MIME_TYPES.has(mimeType)) return true;
}
const fileName = file.name || "";
const dotIndex = fileName.lastIndexOf(".");
if (dotIndex < 0) return false;
const ext = fileName.slice(dotIndex).toLowerCase();
return TEXT_FILE_EXTENSIONS.has(ext);
}
/**
* 判断文件名是否为文本文件
* @param fileName 文件名
*/
export function isTextFileByName(fileName: string): boolean {
const lowerName = fileName.toLowerCase();
// 先检查 MIME 类型(如果有)
// 这里简化处理,主要通过扩展名判断
const dotIndex = lowerName.lastIndexOf(".");
if (dotIndex < 0) return false;
const ext = lowerName.slice(dotIndex);
return TEXT_FILE_EXTENSIONS.has(ext);
}
/**
* 获取文件扩展名
* @param fileName 文件名
*/
export function getFileExtension(fileName: string): string {
const dotIndex = fileName.lastIndexOf(".");
if (dotIndex < 0) return "";
return fileName.slice(dotIndex).toLowerCase();
}
/**
* 安全地读取文件为文本
* @param file 文件对象
* @param encoding 编码,默认 UTF-8
*/
export function readFileAsText(
file: File | Blob,
encoding = "UTF-8"
): Promise<string> {
return new Promise((resolve, reject) => {
const reader = new FileReader();
reader.onload = (e) => resolve(e.target?.result as string);
reader.onerror = () => reject(new Error("Failed to read file"));
reader.readAsText(file, encoding);
});
}
/**
* 流式分割文件并逐行上传
* 使用 Blob.slice 逐块读取,避免一次性加载大文件到内存
* @param file 文件对象
* @param uploadFn 上传函数,接收 FormData 和配置,返回 Promise
* @param onProgress 进度回调 (currentBytes, totalBytes, uploadedLines)
* @param chunkSize 每次读取的块大小,默认 1MB
* @param options 其他选项
* @returns 上传结果统计
*/
export interface StreamUploadOptions {
reqId?: number;
resolveReqId?: (params: { totalFileNum: number; totalSize: number }) => Promise<number>;
onReqIdResolved?: (reqId: number) => void;
fileNamePrefix?: string;
hasArchive?: boolean;
prefix?: string;
signal?: AbortSignal;
maxConcurrency?: number;
}
export interface StreamUploadResult {
uploadedCount: number;
totalBytes: number;
skippedEmptyCount: number;
}
async function processFileLines(
file: File,
chunkSize: number,
signal: AbortSignal | undefined,
onLine?: (line: string, index: number) => Promise<void> | void,
onProgress?: (currentBytes: number, totalBytes: number, processedLines: number) => void
): Promise<{ lineCount: number; skippedEmptyCount: number }> {
const fileSize = file.size;
let offset = 0;
let buffer = "";
let skippedEmptyCount = 0;
let lineIndex = 0;
while (offset < fileSize) {
if (signal?.aborted) {
throw new Error("Upload cancelled");
}
const end = Math.min(offset + chunkSize, fileSize);
const chunk = file.slice(offset, end);
const text = await readFileAsText(chunk);
const combined = buffer + text;
const lines = combined.split(/\r?\n/);
buffer = lines.pop() || "";
for (const line of lines) {
if (signal?.aborted) {
throw new Error("Upload cancelled");
}
if (!line.trim()) {
skippedEmptyCount++;
continue;
}
const currentIndex = lineIndex;
lineIndex += 1;
if (onLine) {
await onLine(line, currentIndex);
}
}
offset = end;
onProgress?.(offset, fileSize, lineIndex);
}
if (buffer.trim()) {
const currentIndex = lineIndex;
lineIndex += 1;
if (onLine) {
await onLine(buffer, currentIndex);
}
} else if (buffer.length > 0) {
skippedEmptyCount++;
}
return { lineCount: lineIndex, skippedEmptyCount };
}
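The carry-over buffer in processFileLines is the part worth pausing on: a read boundary can land mid-line, so the last (possibly partial) element of split() is held back and prepended to the next chunk. A toy, self-contained illustration of the same pattern:

// Toy demo of the carry-over buffer used by processFileLines above:
// a line split across two reads is reassembled before being emitted.
function demoSplit(chunks: string[]): string[] {
  const out: string[] = [];
  let buffer = "";
  for (const text of chunks) {
    const lines = (buffer + text).split(/\r?\n/);
    buffer = lines.pop() || ""; // partial last line carries into the next chunk
    out.push(...lines);
  }
  if (buffer) out.push(buffer); // flush the trailing line
  return out;
}
console.log(demoSplit(["abc\nhel", "lo\nworld"])); // ["abc", "hello", "world"]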
export async function streamSplitAndUpload(
file: File,
uploadFn: (formData: FormData, config?: { onUploadProgress?: (e: { loaded: number; total: number }) => void }) => Promise<unknown>,
onProgress?: (currentBytes: number, totalBytes: number, uploadedLines: number) => void,
chunkSize: number = 1024 * 1024, // 1MB
options: StreamUploadOptions
): Promise<StreamUploadResult> {
const {
reqId: initialReqId,
resolveReqId,
onReqIdResolved,
fileNamePrefix,
prefix,
signal,
maxConcurrency = 3,
} = options;
const fileSize = file.size;
let uploadedCount = 0;
let skippedEmptyCount = 0;
// 获取文件名基础部分和扩展名
const originalFileName = fileNamePrefix || file.name;
const lastDotIndex = originalFileName.lastIndexOf(".");
const baseName = lastDotIndex > 0 ? originalFileName.slice(0, lastDotIndex) : originalFileName;
const fileExtension = lastDotIndex > 0 ? originalFileName.slice(lastDotIndex) : "";
let resolvedReqId = initialReqId;
if (!resolvedReqId) {
const scanResult = await processFileLines(file, chunkSize, signal);
const totalFileNum = scanResult.lineCount;
skippedEmptyCount = scanResult.skippedEmptyCount;
if (totalFileNum === 0) {
return {
uploadedCount: 0,
totalBytes: fileSize,
skippedEmptyCount,
};
}
if (signal?.aborted) {
throw new Error("Upload cancelled");
}
if (!resolveReqId) {
throw new Error("Missing pre-upload request id");
}
resolvedReqId = await resolveReqId({ totalFileNum, totalSize: fileSize });
if (!resolvedReqId) {
throw new Error("Failed to resolve pre-upload request id");
}
onReqIdResolved?.(resolvedReqId);
}
if (!resolvedReqId) {
throw new Error("Missing pre-upload request id");
}
/**
* 上传单行内容
* 每行作为独立文件上传,fileNo 对应行序号,chunkNo 固定为 1
*/
async function uploadLine(line: string, index: number): Promise<void> {
// 检查是否已取消
if (signal?.aborted) {
throw new Error("Upload cancelled");
}
if (!line.trim()) {
skippedEmptyCount++;
return;
}
// 保留原始文件扩展名
const fileIndex = index + 1;
const newFileName = `${baseName}_${String(fileIndex).padStart(6, "0")}${fileExtension}`;
const blob = new Blob([line], { type: "text/plain" });
const lineFile = new File([blob], newFileName, { type: "text/plain" });
// 计算分片(小文件通常只需要一个分片)
const slices = sliceFile(lineFile, DEFAULT_CHUNK_SIZE);
const checkSum = await calculateSHA256(slices[0]);
// 检查是否已取消(计算哈希后)
if (signal?.aborted) {
throw new Error("Upload cancelled");
}
const formData = new FormData();
formData.append("file", slices[0]);
formData.append("reqId", resolvedReqId.toString());
// 每行作为独立文件上传
formData.append("fileNo", fileIndex.toString());
formData.append("chunkNo", "1");
formData.append("fileName", newFileName);
formData.append("fileSize", lineFile.size.toString());
formData.append("totalChunkNum", "1");
formData.append("checkSumHex", checkSum);
if (prefix !== undefined) {
formData.append("prefix", prefix);
}
await uploadFn(formData, {
onUploadProgress: () => {
// 单行文件很小,进度主要用于追踪上传状态
},
});
}
const inFlight = new Set<Promise<void>>();
let uploadError: unknown = null;
const enqueueUpload = async (line: string, index: number) => {
if (signal?.aborted) {
throw new Error("Upload cancelled");
}
if (uploadError) {
throw uploadError;
}
const uploadPromise = uploadLine(line, index)
.then(() => {
uploadedCount++;
})
.catch((err) => {
uploadError = err;
});
inFlight.add(uploadPromise);
uploadPromise.finally(() => inFlight.delete(uploadPromise));
if (inFlight.size >= maxConcurrency) {
await Promise.race(inFlight);
if (uploadError) {
throw uploadError;
}
}
};
let uploadResult: { lineCount: number; skippedEmptyCount: number } | null = null;
try {
uploadResult = await processFileLines(
file,
chunkSize,
signal,
enqueueUpload,
(currentBytes, totalBytes) => {
onProgress?.(currentBytes, totalBytes, uploadedCount);
}
);
if (uploadError) {
throw uploadError;
}
} finally {
if (inFlight.size > 0) {
await Promise.allSettled(inFlight);
}
}
if (!uploadResult || (initialReqId && uploadResult.lineCount === 0)) {
return {
uploadedCount: 0,
totalBytes: fileSize,
skippedEmptyCount: uploadResult?.skippedEmptyCount ?? 0,
};
}
if (!initialReqId) {
skippedEmptyCount = skippedEmptyCount || uploadResult.skippedEmptyCount;
} else {
skippedEmptyCount = uploadResult.skippedEmptyCount;
}
return {
uploadedCount,
totalBytes: fileSize,
skippedEmptyCount,
};
}
/**
* 判断文件是否需要流式分割上传
* @param file 文件对象
* @param threshold 阈值,默认 5MB
*/
export function shouldStreamUpload(file: File, threshold: number = 5 * 1024 * 1024): boolean {
return file.size > threshold;
}
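A usage sketch tying the pieces together. The upload endpoint and the body of resolveReqId below are placeholders; in the real flow resolveReqId would call the pre-upload API and return its reqId.

// inside an async function:
if (shouldStreamUpload(file) && isTextFileByName(file.name)) {
  const result = await streamSplitAndUpload(
    file,
    // placeholder endpoint; the real uploadFn posts the chunk FormData
    (formData) => fetch("/api/upload/chunk", { method: "POST", body: formData }),
    (bytes, total, lines) => console.log(`${bytes}/${total} bytes, ${lines} lines`),
    1024 * 1024,
    {
      // assumed: call the real preUpload API here and return its reqId
      resolveReqId: async ({ totalFileNum, totalSize }) => 123,
      onReqIdResolved: (reqId) => console.log("reqId:", reqId),
    }
  );
  console.log(result.uploadedCount, "lines uploaded,",
    result.skippedEmptyCount, "empty lines skipped");
}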

@@ -82,6 +82,9 @@ class Request {
    */
   createXHRWithProgress(url, config, onProgress, onDownloadProgress) {
     return new Promise((resolve, reject) => {
+      const xhr = new XMLHttpRequest();
+      xhr.open(config.method || "POST", url);
       // 设置请求头
       if (config.headers) {
         Object.keys(config.headers).forEach((key) => {
@@ -89,7 +92,13 @@ class Request {
         });
       }
-      const xhr = new XMLHttpRequest();
+      // 监听 AbortSignal 来中止请求
+      if (config.signal) {
+        config.signal.addEventListener("abort", () => {
+          xhr.abort();
+          reject(new Error("上传已取消"));
+        });
+      }
       // 监听上传进度
       xhr.upload.addEventListener("progress", function (event) {
@@ -103,14 +112,6 @@ class Request {
         }
       });
-      // 请求完成
-      // xhr.addEventListener("load", function () {
-      //   if (xhr.status >= 200 && xhr.status < 300) {
-      //     const response = JSON.parse(xhr.responseText);
-      //     resolve(xhr);
-      //   }
-      // });
       // 请求完成处理
       xhr.addEventListener("load", () => {
         if (xhr.status >= 200 && xhr.status < 300) {
@@ -142,16 +143,15 @@ class Request {
       // 请求错误
       xhr.addEventListener("error", function () {
         console.error("网络错误");
-        if (onError) onError(new Error("网络错误"));
+        reject(new Error("网络错误"));
       });
       // 请求中止
       xhr.addEventListener("abort", function () {
         console.log("上传已取消");
-        if (onError) onError(new Error("上传已取消"));
+        reject(new Error("上传已取消"));
       });
-      xhr.open("POST", url);
       xhr.send(config.body);
       return xhr; // 返回 xhr 对象以便后续控制
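A caller-side sketch of the new signal wiring; the Request instance name and the exact config shape are assumptions based on the fields the method above reads (method, headers, signal, body).

const controller = new AbortController();
request
  .createXHRWithProgress(
    "/api/upload/chunk", // placeholder URL
    { method: "POST", body: formData, signal: controller.signal },
    (e) => console.log(`${e.loaded}/${e.total}`)
  )
  .catch((err) => console.log(err.message)); // "上传已取消" after abort
// later, e.g. from a cancel button:
controller.abort();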

@@ -66,7 +66,7 @@ class Settings(BaseSettings):
     datamate_backend_base_url: str = "http://datamate-backend:8080/api"
     # 标注编辑器(Label Studio Editor)相关
-    editor_max_text_bytes: int = 2 * 1024 * 1024  # 2MB,避免一次加载超大文本卡死前端
+    editor_max_text_bytes: int = 0  # <=0 表示不限制,正数为最大字节数
 # 全局设置实例
 settings = Settings()

@@ -6,6 +6,22 @@ from sqlalchemy.sql import func
 from app.db.session import Base
+ANNOTATION_STATUS_ANNOTATED = "ANNOTATED"
+ANNOTATION_STATUS_NO_ANNOTATION = "NO_ANNOTATION"
+ANNOTATION_STATUS_NOT_APPLICABLE = "NOT_APPLICABLE"
+ANNOTATION_STATUS_IN_PROGRESS = "IN_PROGRESS"
+ANNOTATION_STATUS_VALUES = {
+    ANNOTATION_STATUS_ANNOTATED,
+    ANNOTATION_STATUS_NO_ANNOTATION,
+    ANNOTATION_STATUS_NOT_APPLICABLE,
+    ANNOTATION_STATUS_IN_PROGRESS,
+}
+ANNOTATION_STATUS_CLIENT_VALUES = {
+    ANNOTATION_STATUS_ANNOTATED,
+    ANNOTATION_STATUS_NO_ANNOTATION,
+    ANNOTATION_STATUS_NOT_APPLICABLE,
+}
 class AnnotationTemplate(Base):
     """标注配置模板模型"""
@@ -88,6 +104,12 @@ class AnnotationResult(Base):
     project_id = Column(String(36), nullable=False, comment="标注项目ID(t_dm_labeling_projects.id)")
     file_id = Column(String(36), nullable=False, comment="文件ID(t_dm_dataset_files.id)")
     annotation = Column(JSON, nullable=False, comment="Label Studio annotation 原始JSON(单人单份最终结果)")
+    annotation_status = Column(
+        String(32),
+        nullable=False,
+        default=ANNOTATION_STATUS_ANNOTATED,
+        comment="标注状态: ANNOTATED/NO_ANNOTATION/NOT_APPLICABLE/IN_PROGRESS",
+    )
     created_at = Column(TIMESTAMP, server_default=func.current_timestamp(), comment="创建时间")
     updated_at = Column(TIMESTAMP, server_default=func.current_timestamp(), onupdate=func.current_timestamp(), comment="更新时间")

@@ -20,6 +20,7 @@ from app.module.annotation.schema.editor import (
     EditorProjectInfo,
     EditorTaskListResponse,
     EditorTaskResponse,
+    EditorTaskSegmentsResponse,
     UpsertAnnotationRequest,
     UpsertAnnotationResponse,
 )
@@ -87,6 +88,20 @@
     return StandardResponse(code=200, message="success", data=task)
+@router.get(
+    "/projects/{project_id}/tasks/{file_id}/segments",
+    response_model=StandardResponse[EditorTaskSegmentsResponse],
+)
+async def list_editor_task_segments(
+    project_id: str = Path(..., description="标注项目ID(t_dm_labeling_projects.id)"),
+    file_id: str = Path(..., description="文件ID(t_dm_dataset_files.id)"),
+    db: AsyncSession = Depends(get_db),
+):
+    service = AnnotationEditorService(db)
+    result = await service.get_task_segments(project_id, file_id)
+    return StandardResponse(code=200, message="success", data=result)
 @router.put(
     "/projects/{project_id}/tasks/{file_id}/annotation",
     response_model=StandardResponse[UpsertAnnotationResponse],
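A client-side sketch of calling the new segments endpoint; the path prefix is an assumption, since the router's mount point is not shown in this hunk.

// inside an async function; path prefix assumed:
const res = await fetch(
  `/api/annotation/editor/projects/${projectId}/tasks/${fileId}/segments`
);
const { data } = await res.json(); // { segmented, segments, totalSegments }
console.log(data.segmented, data.totalSegments, data.segments.length);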

@@ -3,7 +3,7 @@ import math
 import uuid
 from fastapi import APIRouter, Depends, HTTPException, Query, Path
-from sqlalchemy import select
+from sqlalchemy import select, update
 from sqlalchemy.ext.asyncio import AsyncSession
 from app.db.session import get_db
@@ -17,6 +17,7 @@ from ..service.template import AnnotationTemplateService
 from ..schema import (
     DatasetMappingCreateRequest,
     DatasetMappingCreateResponse,
+    DatasetMappingUpdateRequest,
     DeleteDatasetResponse,
     DatasetMappingResponse,
 )
@@ -28,6 +29,7 @@ router = APIRouter(
 logger = get_logger(__name__)
 TEXT_DATASET_TYPE = "TEXT"
 SOURCE_DOCUMENT_FILE_TYPES = {"pdf", "doc", "docx", "xls", "xlsx"}
+LABELING_TYPE_CONFIG_KEY = "labeling_type"
 @router.get("/{mapping_id}/login")
 async def login_label_studio(
@@ -81,6 +83,7 @@ async def create_mapping(
     # 如果提供了模板ID,获取模板配置
     label_config = None
+    template_labeling_type = None
     if request.template_id:
         logger.info(f"Using template: {request.template_id}")
         template = await template_service.get_template(db, request.template_id)
@@ -90,6 +93,7 @@
             detail=f"Template not found: {request.template_id}"
         )
         label_config = template.label_config
+        template_labeling_type = getattr(template, "labeling_type", None)
         logger.debug(f"Template label config loaded for template: {template.name}")
     # 如果直接提供了 label_config (自定义或修改后的),则覆盖模板配置
@@ -108,6 +112,8 @@
     project_configuration["description"] = project_description
     if dataset_type == TEXT_DATASET_TYPE and request.segmentation_enabled is not None:
         project_configuration["segmentation_enabled"] = bool(request.segmentation_enabled)
+    if template_labeling_type:
+        project_configuration[LABELING_TYPE_CONFIG_KEY] = template_labeling_type
     labeling_project = LabelingProject(
         id=str(uuid.uuid4()),  # Generate UUID here
@@ -144,6 +150,18 @@
         labeling_project, snapshot_file_ids
     )
+    # 如果启用了分段且为文本数据集,预生成切片结构
+    if dataset_type == TEXT_DATASET_TYPE and request.segmentation_enabled:
+        try:
+            from ..service.editor import AnnotationEditorService
+            editor_service = AnnotationEditorService(db)
+            # 异步预计算切片(不阻塞创建响应)
+            segmentation_result = await editor_service.precompute_segmentation_for_project(labeling_project.id)
+            logger.info(f"Precomputed segmentation for project {labeling_project.id}: {segmentation_result}")
+        except Exception as e:
+            logger.warning(f"Failed to precompute segmentation for project {labeling_project.id}: {e}")
+            # 不影响项目创建,只记录警告
     response_data = DatasetMappingCreateResponse(
         id=mapping.id,
         labeling_project_id=str(mapping.labeling_project_id),
@@ -382,3 +400,116 @@
     except Exception as e:
         logger.error(f"Error deleting mapping: {e}")
         raise HTTPException(status_code=500, detail="Internal server error")
@router.put("/{project_id}", response_model=StandardResponse[DatasetMappingResponse])
async def update_mapping(
project_id: str = Path(..., description="映射UUID(path param)"),
request: DatasetMappingUpdateRequest = None,
db: AsyncSession = Depends(get_db)
):
"""
更新标注项目信息
通过 path 参数 `project_id` 指定要更新的映射(映射的 UUID)。
支持更新的字段:
- name: 标注项目名称
- description: 标注项目描述
- template_id: 标注模板ID
- label_config: Label Studio XML配置
"""
try:
logger.info(f"Update mapping request received: project_id={project_id!r}")
service = DatasetMappingService(db)
# 直接查询 ORM 模型获取原始数据
result = await db.execute(
select(LabelingProject).where(
LabelingProject.id == project_id,
LabelingProject.deleted_at.is_(None)
)
)
mapping_orm = result.scalar_one_or_none()
if not mapping_orm:
raise HTTPException(
status_code=404,
detail=f"Mapping not found: {project_id}"
)
# 构建更新数据
update_values = {}
if request.name is not None:
update_values["name"] = request.name
# 从 configuration 字段中读取和更新 description 和 label_config
configuration = {}
if mapping_orm.configuration:
configuration = mapping_orm.configuration.copy() if isinstance(mapping_orm.configuration, dict) else {}
if request.description is not None:
configuration["description"] = request.description
if request.label_config is not None:
configuration["label_config"] = request.label_config
if configuration:
update_values["configuration"] = configuration
if request.template_id is not None:
update_values["template_id"] = request.template_id
template_labeling_type = None
if request.template_id:
template_service = AnnotationTemplateService()
template = await template_service.get_template(db, request.template_id)
if not template:
raise HTTPException(
status_code=404,
detail=f"Template not found: {request.template_id}"
)
template_labeling_type = getattr(template, "labeling_type", None)
if template_labeling_type:
configuration[LABELING_TYPE_CONFIG_KEY] = template_labeling_type
if not update_values:
# 没有要更新的字段,直接返回当前数据
response_data = await service.get_mapping_by_uuid(project_id)
return StandardResponse(
code=200,
message="success",
data=response_data
)
# 执行更新
from datetime import datetime
update_values["updated_at"] = datetime.now()
result = await db.execute(
update(LabelingProject)
.where(LabelingProject.id == project_id)
.values(**update_values)
)
await db.commit()
if result.rowcount == 0:
raise HTTPException(
status_code=500,
detail="Failed to update mapping"
)
# 重新获取更新后的数据
updated_mapping = await service.get_mapping_by_uuid(project_id)
logger.info(f"Successfully updated mapping: {project_id}")
return StandardResponse(
code=200,
message="success",
data=updated_mapping
)
except HTTPException:
raise
except Exception as e:
logger.error(f"Error updating mapping: {e}")
raise HTTPException(status_code=500, detail="Internal server error")
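A matching client sketch for the new PUT endpoint; the mount path of this router is again an assumption.

// inside an async function; router mount path assumed:
await fetch(`/api/annotation/mappings/${projectId}`, {
  method: "PUT",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    name: "Renamed labeling project",
    templateId: "tpl-123", // also refreshes labeling_type in configuration
  }),
});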

@@ -9,10 +9,27 @@
 from __future__ import annotations
 from datetime import datetime
+from enum import Enum
 from typing import Any, Dict, List, Optional
 from pydantic import BaseModel, Field, ConfigDict
+from app.db.models.annotation_management import (
+    ANNOTATION_STATUS_ANNOTATED,
+    ANNOTATION_STATUS_IN_PROGRESS,
+    ANNOTATION_STATUS_NO_ANNOTATION,
+    ANNOTATION_STATUS_NOT_APPLICABLE,
+)
+class AnnotationStatus(str, Enum):
+    """标注状态枚举"""
+    ANNOTATED = ANNOTATION_STATUS_ANNOTATED
+    IN_PROGRESS = ANNOTATION_STATUS_IN_PROGRESS
+    NO_ANNOTATION = ANNOTATION_STATUS_NO_ANNOTATION
+    NOT_APPLICABLE = ANNOTATION_STATUS_NOT_APPLICABLE
 class EditorProjectInfo(BaseModel):
     """编辑器项目元信息"""
@@ -40,8 +57,13 @@ class EditorTaskListItem(BaseModel):
     file_type: Optional[str] = Field(None, alias="fileType", description="文件类型")
     has_annotation: bool = Field(..., alias="hasAnnotation", description="是否已有最终标注")
     annotation_updated_at: Optional[datetime] = Field(None, alias="annotationUpdatedAt", description="标注更新时间")
+    annotation_status: Optional[AnnotationStatus] = Field(
+        None,
+        alias="annotationStatus",
+        description="标注状态",
+    )
-    model_config = ConfigDict(populate_by_name=True)
+    model_config = ConfigDict(populate_by_name=True, use_enum_values=True)
 class EditorTaskListResponse(BaseModel):
@@ -57,12 +79,9 @@ class EditorTaskListResponse(BaseModel):
 class SegmentInfo(BaseModel):
-    """段落信息(用于文本分段标注)"""
+    """段落摘要(用于文本分段标注)"""
     idx: int = Field(..., description="段落索引")
-    text: str = Field(..., description="段落文本")
-    start: int = Field(..., description="在原文中的起始位置")
-    end: int = Field(..., description="在原文中的结束位置")
     has_annotation: bool = Field(False, alias="hasAnnotation", description="该段落是否已有标注")
     line_index: int = Field(0, alias="lineIndex", description="JSONL 行索引(从0开始)")
     chunk_index: int = Field(0, alias="chunkIndex", description="行内分片索引(从0开始)")
@@ -78,17 +97,31 @@ class EditorTaskResponse(BaseModel):
     # 分段相关字段
     segmented: bool = Field(False, description="是否启用分段模式")
-    segments: Optional[List[SegmentInfo]] = Field(None, description="段落列表")
     total_segments: int = Field(0, alias="totalSegments", description="总段落数")
     current_segment_index: int = Field(0, alias="currentSegmentIndex", description="当前段落索引")
     model_config = ConfigDict(populate_by_name=True)
+class EditorTaskSegmentsResponse(BaseModel):
+    """编辑器段落摘要响应"""
+    segmented: bool = Field(False, description="是否启用分段模式")
+    segments: List[SegmentInfo] = Field(default_factory=list, description="段落摘要列表")
+    total_segments: int = Field(0, alias="totalSegments", description="总段落数")
+    model_config = ConfigDict(populate_by_name=True)
 class UpsertAnnotationRequest(BaseModel):
     """保存/覆盖最终标注(Label Studio annotation 原始对象)"""
     annotation: Dict[str, Any] = Field(..., description="Label Studio annotation 对象(包含 result 等)")
+    annotation_status: Optional[AnnotationStatus] = Field(
+        None,
+        alias="annotationStatus",
+        description="标注状态(无标注传 NO_ANNOTATION,不适用传 NOT_APPLICABLE,IN_PROGRESS 由后端维护)",
+    )
     expected_updated_at: Optional[datetime] = Field(
         None,
         alias="expectedUpdatedAt",
@@ -101,7 +134,7 @@ class UpsertAnnotationRequest(BaseModel):
         description="段落索引(分段模式下必填)",
     )
-    model_config = ConfigDict(populate_by_name=True)
+    model_config = ConfigDict(populate_by_name=True, use_enum_values=True)
 class UpsertAnnotationResponse(BaseModel):

@@ -39,9 +39,22 @@ class DatasetMappingCreateResponse(BaseResponseModel):
     labeling_project_id: str = Field(..., description="Label Studio项目ID")
     labeling_project_name: str = Field(..., description="Label Studio项目名称")
-class DatasetMappingUpdateRequest(BaseResponseModel):
-    """数据集映射 更新 请求模型"""
-    dataset_id: Optional[str] = Field(None, description="源数据集ID")
+class DatasetMappingUpdateRequest(BaseModel):
+    """数据集映射 更新 请求模型
+    支持更新的字段:
+    - name: 标注项目名称
+    - description: 标注项目描述
+    - template_id: 标注模板ID
+    - label_config: Label Studio XML配置
+    """
+    name: Optional[str] = Field(None, alias="name", description="标注项目名称")
+    description: Optional[str] = Field(None, alias="description", description="标注项目描述")
+    template_id: Optional[str] = Field(None, alias="templateId", description="标注模板ID")
+    label_config: Optional[str] = Field(None, alias="labelConfig", description="Label Studio XML配置")
+    class Config:
+        validate_by_name = True
 class DatasetMappingResponse(BaseModel):
     """数据集映射 查询 响应模型"""
@@ -52,6 +65,7 @@
     name: Optional[str] = Field(None, description="标注项目名称")
     description: Optional[str] = Field(None, description="标注项目描述")
     template_id: Optional[str] = Field(None, alias="templateId", description="关联的模板ID")
+    labeling_type: Optional[str] = Field(None, alias="labelingType", description="标注类型")
     template: Optional['AnnotationTemplateResponse'] = Field(None, description="关联的标注模板详情")
     label_config: Optional[str] = Field(None, alias="labelConfig", description="实际使用的 Label Studio XML 配置")
     segmentation_enabled: Optional[bool] = Field(
@@ -61,6 +75,7 @@
     )
     total_count: int = Field(0, alias="totalCount", description="数据集总数据量")
     annotated_count: int = Field(0, alias="annotatedCount", description="已标注数据量")
+    in_progress_count: int = Field(0, alias="inProgressCount", description="分段标注中数据量")
     created_at: datetime = Field(..., alias="createdAt", description="创建时间")
     updated_at: Optional[datetime] = Field(None, alias="updatedAt", description="更新时间")
     deleted_at: Optional[datetime] = Field(None, alias="deletedAt", description="删除时间")

Some files were not shown because too many files have changed in this diff.