feat: implement initial version of the RCU upgrade backend service
- Add a Kafka consumer component to consume upgrade event data
- Implement a data processor for data validation and transformation
- Add a database writer component with batch writes to the G5 database
- Manage connection parameters through environment variables
- Add logging and error handling
- Implement graceful shutdown and flow control
42
bls-upgrade-backend/openspec/AGENTS.md
Normal file
@@ -0,0 +1,42 @@
# Agents

## Overview
This document lists the agents involved in the RCU Upgrade Backend project.

## Agents

### 1. System Administrator
- **Responsibilities**: Server setup, network configuration, security
- **Contact**: admin@example.com

### 2. Database Administrator
- **Responsibilities**: Database setup, schema management, performance tuning
- **Contact**: dba@example.com

### 3. Kafka Administrator
- **Responsibilities**: Kafka cluster management, topic configuration
- **Contact**: kafka-admin@example.com

### 4. Developer
- **Responsibilities**: Code implementation, testing, deployment
- **Contact**: developer@example.com

### 5. DevOps Engineer
- **Responsibilities**: CI/CD pipeline, monitoring, deployment automation
- **Contact**: devops@example.com

## Agent Responsibilities Matrix

| Task | System Admin | DBA | Kafka Admin | Developer | DevOps |
|------|--------------|-----|-------------|-----------|--------|
| Server setup | ✅ | | | | |
| Network configuration | ✅ | | | | |
| Database setup | | ✅ | | | |
| Schema management | | ✅ | | | |
| Kafka cluster setup | | | ✅ | | |
| Topic configuration | | | ✅ | | |
| Code implementation | | | | ✅ | |
| Testing | | | | ✅ | |
| CI/CD pipeline | | | | | ✅ |
| Monitoring | | | | | ✅ |
| Deployment automation | | | | | ✅ |
@@ -0,0 +1,50 @@
# Initial Implementation Proposal

## Overview
This proposal outlines the initial implementation of the RCU Upgrade Backend service, which consumes data from Kafka, processes it, and writes it to the G5 database.

## Background
The service is needed to process and store RCU upgrade event data arriving from Kafka while preserving data integrity and performance.

## Proposed Changes
1. **Project Structure**: Create a complete Node.js project structure with the following components:
   - Kafka consumer
   - Data processor
   - Database writer
   - Flow control mechanism

2. **Configuration**: Set up environment variables for:
   - Kafka connection
   - Database connection
   - Performance settings

3. **Data Processing**: Implement:
   - Data validation
   - hotel_id value range check
   - Batch processing
   - Flow control

4. **Error Handling**: Implement comprehensive error handling and logging

5. **Testing**: Prepare for unit and integration testing

## Benefits
- Efficient processing of high-volume data
- Data integrity through validation
- Controlled database write frequency
- Comprehensive logging and error handling

## Risks
- Potential performance issues with large batch sizes
- Kafka connection reliability
- Database connection limits

## Mitigation Strategies
- Configurable batch size and write frequency
- Robust error handling and retry mechanisms
- Monitoring and alerting

## Timeline
- Initial implementation: 1 day
- Testing: 1 day
- Deployment: 1 day
@@ -0,0 +1,73 @@
# Initial Implementation Tasks

## Overview
This document outlines the specific tasks required for the initial implementation of the RCU Upgrade Backend service.

## Tasks

### 1. Project Setup
- [x] Create project directory structure
- [x] Set up package.json with dependencies
- [x] Configure environment variables

### 2. Core Components
- [x] Implement Kafka consumer
- [x] Implement data processor
- [x] Implement database writer
- [x] Implement flow control mechanism

### 3. Data Processing
- [x] Implement data validation
- [x] Implement hotel_id value range check
- [x] Implement batch processing
- [x] Implement flow control

### 4. Error Handling and Logging
- [x] Implement error handling
- [x] Implement logging

### 5. Testing
- [ ] Write unit tests
- [ ] Write integration tests

### 6. Deployment
- [ ] Set up build process
- [ ] Create deployment script

## Task Details

### Task 1: Project Setup
- Create the directory structure, including src, tests, scripts, and openspec
- Set up package.json with the required dependencies (kafka-node, pg, dotenv, etc.)
- Configure the .env file with connection details

### Task 2: Core Components
- Kafka consumer: Connect to the Kafka broker and consume messages from blwlog4Nodejs-rcu-upgrade-topic
- Data processor: Validate and transform data according to the database schema
- Database writer: Write processed data to the G5 database
- Flow control: Limit database writes to at most once per second
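The processor step above can be sketched as a plain function, independent of Kafka and Postgres. This is a minimal illustration only: the field names (`device_id`, `upgrade_status`, `hotel_id`, `timestamp`) are assumptions, since the actual message schema is not shown in this document.

```javascript
// Hypothetical sketch of the data-processor step: parse a raw Kafka
// message value, validate required fields, and shape a row for the
// database writer. Field names are illustrative assumptions.
function processMessage(rawValue) {
  let event;
  try {
    event = JSON.parse(rawValue);
  } catch (err) {
    return { ok: false, reason: "invalid JSON" };
  }
  // Required fields must be present; optional ones fall back to defaults.
  if (event.device_id == null || event.upgrade_status == null) {
    return { ok: false, reason: "missing required field" };
  }
  return {
    ok: true,
    row: {
      device_id: String(event.device_id),
      upgrade_status: String(event.upgrade_status),
      hotel_id: Number(event.hotel_id ?? 0),
      created_at: event.timestamp ?? new Date().toISOString(),
    },
  };
}
```

Keeping this step a pure function makes it straightforward to cover with the unit tests planned in Task 5.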
### Task 3: Data Processing
- Data validation: Ensure all fields are present and valid
- hotel_id value range check: Ensure hotel_id is within the int2 range (-32768 to 32767); otherwise set it to 0
- Batch processing: Process data in batches of 1000 records
- Flow control: Ensure database writes occur at most once per second
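The batching step can be sketched as a small helper, again independent of Kafka or Postgres; `chunkRecords` is a hypothetical name, not from the project source.

```javascript
// Minimal sketch of batch accumulation: split buffered records into
// chunks of up to BATCH_SIZE (1000 per the spec) for the writer.
const BATCH_SIZE = 1000;

function chunkRecords(records, size = BATCH_SIZE) {
  const batches = [];
  for (let i = 0; i < records.length; i += size) {
    batches.push(records.slice(i, i + size));
  }
  return batches;
}
```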
### Task 4: Error Handling and Logging
- Error handling: Handle and log errors gracefully
- Logging: Implement structured logging for all operations

### Task 5: Testing
- Unit tests: Test individual components
- Integration tests: Test the entire flow

### Task 6: Deployment
- Build process: Set up the Vite build
- Deployment script: Create a script for deployment

## Completion Criteria
- All core components are implemented
- Data is correctly processed and written to the database
- Flow control works as expected
- Error handling and logging are in place
- The service can be started and run without errors
42
bls-upgrade-backend/openspec/specs/rcu_upgrade/spec.md
Normal file
@@ -0,0 +1,42 @@
# RCU Upgrade Backend Spec

## Overview
This service is responsible for consuming data from Kafka, processing it, and writing it to the G5 database.

## Architecture
- **Kafka Consumer**: Consumes data from the blwlog4Nodejs-rcu-upgrade-topic topic
- **Data Processor**: Validates and transforms data
- **Database Writer**: Writes processed data to the G5 database
- **Flow Control**: Limits database writes to at most once per second

## Data Flow
1. The Kafka consumer receives messages
2. Messages are parsed and validated
3. Data is transformed to match the database schema
4. Data is batched and written to the database

## Configuration
All configuration is managed through environment variables in the .env file:
- Kafka connection settings
- Database connection settings
- Performance settings
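A possible .env layout for these settings is sketched below. The variable names are illustrative assumptions, not the project's actual keys; only the topic name and the 1000/one-per-second figures come from this document.

```ini
# Hypothetical .env sketch; variable names are assumptions.
KAFKA_BROKERS=localhost:9092
KAFKA_TOPIC=blwlog4Nodejs-rcu-upgrade-topic
KAFKA_GROUP_ID=rcu-upgrade-backend

DB_HOST=localhost
DB_PORT=5432
DB_NAME=g5
DB_USER=rcu_writer
DB_PASSWORD=change-me

# Performance settings (from the spec: 1000 records per batch,
# at most one database write per second)
BATCH_SIZE=1000
WRITE_INTERVAL_MS=1000
```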
## Data Validation
- hotel_id: Must be within the int2 range (-32768 to 32767); otherwise it is set to 0
- All other fields are validated, and default values are provided if missing
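The hotel_id rule above can be sketched as a small normalizer; `normalizeHotelId` is a hypothetical name for illustration.

```javascript
// Sketch of the hotel_id range rule: values outside the Postgres int2
// range (-32768..32767), and non-integer values, are replaced with 0.
const INT2_MIN = -32768;
const INT2_MAX = 32767;

function normalizeHotelId(value) {
  const n = Number(value);
  if (!Number.isInteger(n) || n < INT2_MIN || n > INT2_MAX) {
    return 0;
  }
  return n;
}
```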
## Performance Requirements
- Batch processing: 1000 records per batch
- Database write frequency: at most once per second
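The write-frequency limit can be sketched as a simple rate limiter the writer consults before each flush. This is an assumed shape, not the project's actual implementation; the injectable clock exists only to make the sketch testable.

```javascript
// Sketch of the flow-control rule: allow at most one database write
// per intervalMs (1000 ms per the spec). Denied flushes stay buffered
// until the next tick.
function createWriteLimiter(intervalMs = 1000, now = Date.now) {
  let lastWrite = -Infinity;
  return function tryAcquire() {
    const t = now();
    if (t - lastWrite >= intervalMs) {
      lastWrite = t;
      return true;
    }
    return false;
  };
}
```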
## Error Handling
- All errors are logged
- Failed batches can be retried
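The batch-retry behaviour could look like the sketch below; it shows the control flow only, and the real writer would presumably add backoff and logging. `writeWithRetry` and its parameters are hypothetical names.

```javascript
// Sketch of batch retry: re-attempt a failed write up to maxAttempts
// times, rethrowing the last error if every attempt fails.
async function writeWithRetry(writeBatch, batch, maxAttempts = 3) {
  let lastError;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await writeBatch(batch);
    } catch (err) {
      lastError = err; // a real implementation would log and back off here
    }
  }
  throw lastError;
}
```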
## Monitoring
- Logs are generated for all operations
- Performance metrics can be collected

## Deployment
- The service is deployed as a Node.js application
- It can be run with npm start or as a system service
36
bls-upgrade-backend/openspec/specs/rcu_upgrade/status.md
Normal file
@@ -0,0 +1,36 @@
# RCU Upgrade Backend Status

## Status: In Progress

## Implementation Progress

### Core Components
- [x] Kafka Consumer: Implemented
- [x] Data Processor: Implemented
- [x] Database Writer: Implemented
- [x] Flow Control: Implemented

### Features
- [x] Data validation
- [x] Batch processing
- [x] Error handling
- [x] Logging

### Configuration
- [x] Environment variables
- [x] Database connection
- [x] Kafka connection

### Testing
- [ ] Unit tests
- [ ] Integration tests

### Deployment
- [ ] Build process
- [ ] Deployment script

## Next Steps
1. Complete unit tests
2. Complete integration tests
3. Finalize the deployment process
4. Perform performance testing