Automatically pre-generate the text segmentation structure when an annotation task is created, avoiding real-time computation every time the annotation page is opened.

Changes:
1. Add a precompute_segmentation_for_project method to AnnotationEditorService
   - Pre-computes the segmentation structure for all text files in the project
   - Uses AnnotationTextSplitter to perform the segmentation
   - Persists the segmentation structure to the AnnotationResult table (with status IN_PROGRESS)
   - Supports retrying on failure
   - Returns statistics
2. Modify the create_mapping interface
   - After the annotation task is created, automatically triggers segmentation pre-generation if segmentation is enabled and the dataset is a text dataset
   - Wraps the call in try-except so that a segmentation failure does not affect project creation

Highlights:
- Reuses the existing AnnotationTextSplitter class
- The segmentation data structure is consistent with the existing segmented-annotation format
- Backward compatible (tasks without precomputed segments still use real-time computation)
- Performance optimization: avoids repeated computation when entering the annotation page

Related files:
- runtime/datamate-python/app/module/annotation/service/editor.py
- runtime/datamate-python/app/module/annotation/interface/project.py
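As a rough illustration of the flow described above, the sketch below shows how the service method and the create_mapping hook could fit together. Only the names taken from the description (precompute_segmentation_for_project, create_mapping, AnnotationTextSplitter, AnnotationResult, IN_PROGRESS) come from the source; the repository layer, the splitting logic, and the data fields are simplified assumptions, not the actual implementation in runtime/datamate-python.

```python
from dataclasses import dataclass


@dataclass
class AnnotationResult:  # placeholder for the real ORM model
    file_id: str
    segments: list
    status: str = "IN_PROGRESS"


class AnnotationTextSplitter:  # placeholder for the real splitter class
    def split(self, text: str) -> list:
        # naive paragraph split; the real splitter is more sophisticated
        return [p for p in text.split("\n\n") if p.strip()]


class AnnotationEditorService:
    def __init__(self, repo):
        self.repo = repo  # hypothetical persistence layer
        self.splitter = AnnotationTextSplitter()

    def precompute_segmentation_for_project(self, project_id, max_retries=2):
        """Pre-compute segments for every text file in the project,
        persist them as IN_PROGRESS results, and return statistics."""
        stats = {"succeeded": 0, "failed": 0}
        for file_id in self.repo.list_text_files(project_id):
            for attempt in range(max_retries + 1):
                try:
                    text = self.repo.read_text(file_id)
                    segments = self.splitter.split(text)
                    self.repo.save(AnnotationResult(file_id, segments))
                    stats["succeeded"] += 1
                    break
                except Exception:
                    if attempt == max_retries:  # simple retry-on-failure
                        stats["failed"] += 1
        return stats


def create_mapping(service, project):
    """Interface-level hook: trigger pre-generation after the task is
    created, but never let a segmentation failure block creation."""
    mapping = ...  # create the annotation task / mapping as before
    if project.get("segmentation_enabled") and project.get("dataset_type") == "text":
        try:
            service.precompute_segmentation_for_project(project["id"])
        except Exception:
            pass  # log and continue; editor falls back to real-time splitting
    return mapping
```

The retry loop makes pre-generation best-effort, and swallowing exceptions in create_mapping preserves the backward-compatible fallback: tasks without precomputed segments are still split in real time when the annotation page is opened.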
DataMate All-in-One Data Work Platform
DataMate is an enterprise-level data processing platform for model fine-tuning and RAG retrieval, supporting core functions such as data collection, data management, operator marketplace, data cleaning, data synthesis, data annotation, data evaluation, and knowledge generation.
If you like this project, please give it a Star⭐️!
🌟 Core Features
- Core Modules: Data Collection, Data Management, Operator Marketplace, Data Cleaning, Data Synthesis, Data Annotation, Data Evaluation, Knowledge Generation.
- Visual Orchestration: Drag-and-drop data processing workflow design.
- Operator Ecosystem: Rich built-in operators and support for custom operators.
🚀 Quick Start
Prerequisites
- Git (for pulling source code)
- Make (for building and installing)
- Docker (for building images and deploying services)
- Docker-Compose (for service deployment - Docker method)
- Kubernetes (for service deployment - k8s method)
- Helm (for service deployment - k8s method)
This project supports two deployment methods: docker-compose and helm. After running the install command, enter the number for the deployment method you want when prompted. The prompt looks like this:
Choose a deployment method:
1. Docker/Docker-Compose
2. Kubernetes/Helm
Enter choice:
Clone the Code
git clone git@github.com:ModelEngine-Group/DataMate.git
cd DataMate
Deploy the basic services
make install
If make is not installed on your machine, run the following commands to deploy the services instead:
# Windows
set REGISTRY=ghcr.io/modelengine-group/
docker compose -f ./deployment/docker/datamate/docker-compose.yml up -d
docker compose -f ./deployment/docker/milvus/docker-compose.yml up -d
# Linux/Mac
export REGISTRY=ghcr.io/modelengine-group/
docker compose -f ./deployment/docker/datamate/docker-compose.yml up -d
docker compose -f ./deployment/docker/milvus/docker-compose.yml up -d
Once the containers are running, open http://localhost:30000 in a browser to access the front-end interface.
To list all available Make targets, flags and help text, run:
make help
Build and deploy MinerU enhanced PDF processing
make build-mineru
make install-mineru
Deploy the DeerFlow service
make install-deer-flow
Local Development and Deployment
After modifying the code locally, run the following commands to build the image and deploy with the local image:
make build
make install dev=true
Uninstall
make uninstall
When running make uninstall, the installer will prompt once whether to delete volumes; that single choice is applied to all components. The uninstall order is: milvus -> label-studio -> datamate, which ensures the datamate network is removed cleanly after services that use it have stopped.
🤝 Contribution Guidelines
Thank you for your interest in this project! We warmly welcome contributions from the community. Whether it's submitting bug reports, suggesting new features, or directly participating in code development, all forms of help make the project better.
• 📮 GitHub Issues: Submit bugs or feature suggestions.
• 🔧 GitHub Pull Requests: Contribute code improvements.
📄 License
DataMate is open source under the MIT license. You are free to use, modify, and distribute the code of this project in compliance with the license terms.