Fix configuration and routing issues across the full path from a fresh deployment to a running system.

## P0 Fixes (broken functionality)

### P0-1: Wrong GraphRAG KG service URL
- config.py — change GRAPHRAG_KG_SERVICE_URL from http://datamate-kg:8080 to http://datamate-backend:8080 (corrected container name)
- kg_client.py — fix API paths: /knowledge-graph/... → /api/knowledge-graph/...
- kb_access.py — same class of fix: /knowledge-base/... → /api/knowledge-base/...
- test_kb_access.py — update test assertions to match

Root cause: the container datamate-kg does not exist, and absolute request paths in httpx discard the /api path segment of the base_url.

### P0-2: Vite dev proxy strips the /api prefix
- vite.config.ts — remove the dedicated /api/knowledge-graph proxy rule (it stripped /api, causing 404s) and route everything through the unified ^/api rule

## P1 Fixes (degraded functionality)

### P1-1: Gateway missing routes for the KG Python endpoints
- ApiGatewayApplication.java — add an /api/kg/** route (pointing to the kg-extraction Python service)
- ApiGatewayApplication.java — add an /api/graphrag/** route (pointing to the GraphRAG service)

### P1-2: DATA_MANAGEMENT_URL default value missing /api
- KnowledgeGraphProperties.java — dataManagementUrl default http://localhost:8080 → http://localhost:8080/api
- KnowledgeGraphProperties.java — annotationServiceUrl default http://localhost:8081 → http://localhost:8080/api (same JVM)
- application-knowledgegraph.yml — update the YAML defaults to match

### P1-3: Neo4j k8s install chain fails
- Makefile — add neo4j to VALID_K8S_TARGETS
- Makefile — add a neo4j case to %-k8s-install (explicit skip, with a hint to use Docker or an external instance)
- Makefile — add a neo4j case to %-k8s-uninstall (explicit skip)

Root cause: the install target unconditionally invokes neo4j-$(INSTALLER)-install, but in k8s mode neo4j was not in VALID_K8S_TARGETS, producing an "Unknown k8s target 'neo4j'" error.

## P2 Fixes (minor)

### P2-1: Add Neo4j to the Docker install flow
- Makefile — install target now runs neo4j-$(INSTALLER)-install, started before datamate
- Makefile — add neo4j to VALID_SERVICE_TARGETS
- Makefile — add neo4j cases to %-docker-install / %-docker-uninstall

## Verification
- mvn test: 311 tests, 0 failures ✅
- eslint: 0 errors ✅
- tsc --noEmit: passes ✅
- vite build: succeeds (17.71s) ✅
- Python tests: 46 passed ✅
- make -n install INSTALLER=k8s: no longer reports an unknown target ✅
- make -n neo4j-k8s-install: correctly prints the skip message ✅
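The base_url pitfall behind P0-1 follows standard RFC 3986 URL resolution: a request path beginning with "/" resolves against the host root and bypasses any path prefix in the base URL. A minimal sketch with Python's `urllib.parse.urljoin` (httpx's own merge logic may differ in detail, but the fix of spelling out the /api prefix addresses the same resolution behavior; the `/query` endpoint below is illustrative only):

```python
from urllib.parse import urljoin

base = "http://datamate-backend:8080/api/"

# An absolute path ("/...") resolves against the host root and
# silently drops the "/api" segment of the base URL:
print(urljoin(base, "/knowledge-graph/query"))
# → http://datamate-backend:8080/knowledge-graph/query

# Including the prefix in the request path (the P0-1 fix) keeps it:
print(urljoin(base, "/api/knowledge-graph/query"))
# → http://datamate-backend:8080/api/knowledge-graph/query
```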
DataMate: All-in-One Data Work Platform
DataMate is an enterprise-level data processing platform for model fine-tuning and RAG retrieval, supporting core functions such as data collection, data management, operator marketplace, data cleaning, data synthesis, data annotation, data evaluation, and knowledge generation.
If you like this project, please give it a Star⭐️!
🌟 Core Features
- Core Modules: Data Collection, Data Management, Operator Marketplace, Data Cleaning, Data Synthesis, Data Annotation, Data Evaluation, Knowledge Generation.
- Visual Orchestration: Drag-and-drop data processing workflow design.
- Operator Ecosystem: Rich built-in operators and support for custom operators.
🚀 Quick Start
Prerequisites
- Git (for pulling source code)
- Make (for building and installing)
- Docker (for building images and deploying services)
- Docker Compose (for service deployment - Docker method)
- Kubernetes (for service deployment - k8s method)
- Helm (for service deployment - k8s method)
This project can be deployed in two ways: Docker Compose or Helm. When the install command runs, enter the number of the deployment method you want at the prompt, which looks like this:
Choose a deployment method:
1. Docker/Docker-Compose
2. Kubernetes/Helm
Enter choice:
Clone the Code
git clone git@github.com:ModelEngine-Group/DataMate.git
cd DataMate
Deploy the basic services
make install
If make is not installed on your machine, run the following commands to deploy manually:
# Windows
set REGISTRY=ghcr.io/modelengine-group/
docker compose -f ./deployment/docker/datamate/docker-compose.yml up -d
docker compose -f ./deployment/docker/milvus/docker-compose.yml up -d
# Linux/Mac
export REGISTRY=ghcr.io/modelengine-group/
docker compose -f ./deployment/docker/datamate/docker-compose.yml up -d
docker compose -f ./deployment/docker/milvus/docker-compose.yml up -d
Once the containers are running, open http://localhost:30000 in a browser to access the front-end interface.
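For CI or provisioning scripts, the readiness check above can be automated by polling the UI port until it accepts connections. A minimal sketch (the host and port 30000 come from the step above; the helper name `wait_for_port` is ours):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 60.0) -> bool:
    """Poll host:port until a TCP connection succeeds or timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2.0):
                return True
        except OSError:
            time.sleep(1.0)
    return False

# After `make install`, the UI should accept connections on port 30000:
# wait_for_port("localhost", 30000)
```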
To list all available Make targets, flags and help text, run:
make help
Build and deploy Mineru Enhanced PDF Processing
make build-mineru
make install-mineru
Deploy the DeerFlow service
make install-deer-flow
Local Development and Deployment
After modifying the code locally, run the following commands to build the images and deploy using them:
make build
make install dev=true
Uninstall
make uninstall
When running make uninstall, the installer will prompt once whether to delete volumes; that single choice is applied to all components. The uninstall order is: milvus -> label-studio -> datamate, which ensures the datamate network is removed cleanly after services that use it have stopped.
🤝 Contribution Guidelines
Thank you for your interest in this project! We warmly welcome contributions from the community. Whether it's submitting bug reports, suggesting new features, or directly participating in code development, all forms of help make the project better.
- 📮 GitHub Issues: Submit bugs or feature suggestions.
- 🔧 GitHub Pull Requests: Contribute code improvements.
📄 License
DataMate is open source under the MIT license. You are free to use, modify, and distribute the code of this project in compliance with the license terms.