Compare commits: 225e63a985 ... master (5 commits)

| Author | SHA1 | Date |
|---|---|---|
| | 1200d26866 | |
| | 21185e23cf | |
| | 9db11531d6 | |
| | 06aa2372e4 | |
| | c45ae24647 | |

205 dashboard-server/acot-server/README.md (new file)
@@ -0,0 +1,205 @@
# ACOT Server

This server replaces the Klaviyo integration with direct database queries against the production MySQL database over an SSH tunnel. It provides drop-in API compatibility for all frontend components, so no frontend changes are required.

## Setup

1. **Environment Variables**: Copy `.env.example` to `.env` and configure (the `PROD_*` names below are the ones read by `db/connection.js`):

```
PROD_SSH_HOST=your_ssh_host
PROD_SSH_PORT=22
PROD_SSH_USER=your_ssh_user
PROD_SSH_KEY_PATH=/path/to/private_key
PROD_DB_HOST=localhost
PROD_DB_PORT=3306
PROD_DB_USER=your_db_user
PROD_DB_PASSWORD=your_db_password
PROD_DB_NAME=your_db_name
PORT=3007
NODE_ENV=development
```

2. **SSH Access**: The server opens its own SSH tunnel to the production database (see `db/connection.js`), so the SSH credentials above must allow access to the database host.

3. **Install Dependencies**:

```bash
npm install
```

4. **Start Server**:

```bash
npm start
```

## API Endpoints

All endpoints are API-compatible with the previous Klaviyo implementation:

### Main Statistics

- `GET /api/acot/events/stats` - Complete statistics dashboard data
- Query params: `timeRange` (today, yesterday, thisWeek, lastWeek, thisMonth, lastMonth, last7days, last30days, last90days) or `startDate`/`endDate` for custom ranges
- Returns: Revenue, orders, AOV, shipping data, order types, brands/categories, refunds, cancellations, best day, peak hour, order ranges, period progress, projections

### Daily Details

- `GET /api/acot/events/stats/details` - Daily breakdown with previous-period comparisons
- Query params: `timeRange`, `metric` (revenue, orders, average_order, etc.), `daily=true`
- Returns: Array of daily data points with trend comparisons

### Products

- `GET /api/acot/events/products` - Top products with sales data
- Query params: `timeRange`
- Returns: Product list with images, sales quantities, revenue, and order counts

### Projections

- `GET /api/acot/events/projection` - Smart revenue projections for incomplete periods
- Query params: `timeRange`
- Returns: Projected revenue with confidence levels based on historical patterns

### Health Check

- `GET /api/acot/test` - Server health and database connectivity test

## Database Schema

The server queries the following main tables:

### Orders (`_order`)

- **Key fields**: `order_id`, `date_placed`, `summary_total`, `order_status`, `ship_method_selected`, `stats_waiting_preorder`
- **Valid orders**: `order_status > 15`
- **Cancelled orders**: `order_status = 15`
- **Shipped orders**: `order_status IN (100, 92)`
- **Pre-orders**: `stats_waiting_preorder > 0`
- **Local pickup**: `ship_method_selected = 'localpickup'`
- **On-hold orders**: `ship_method_selected = 'holdit'`
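The status rules above can be sketched as a small classifier. This is an illustrative helper, not code from the server; the field names follow the `_order` table and the thresholds are the ones documented above (note that shipped orders are a subset of valid orders):

```javascript
// Classify an _order row by order_status, per the README's rules.
function classifyOrder(row) {
  if (row.order_status < 15) return 'invalid';
  if (row.order_status === 15) return 'cancelled';
  if ([100, 92].includes(row.order_status)) return 'shipped'; // also valid
  return 'valid';
}

// Order-type flags derived from the documented column conditions.
function orderFlags(row) {
  return {
    preOrder: row.stats_waiting_preorder > 0,
    localPickup: row.ship_method_selected === 'localpickup',
    onHold: row.ship_method_selected === 'holdit'
  };
}
```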
### Order Items (`order_items`)

- **Fields**: `order_id`, `prod_pid`, `qty_ordered`, `prod_price`
- **Purpose**: Links orders to products for detailed analysis

### Products (`products`)

- **Fields**: `pid`, `description` (product name), `company`
- **Purpose**: Product information and brand data

### Product Images (`product_images`)

- **Fields**: `pid`, `iid`, `order` (priority)
- **Primary image**: `order = 255` (highest priority)
- **Image URL generation**: `https://sbing.com/i/products/0000/{prefix}/{pid}-{type}-{iid}.jpg`

### Payments (`order_payment`)

- **Refunds**: `payment_amount < 0`
- **Purpose**: Track refund amounts and counts

## Business Logic

### Time Handling

- **Timezone**: All calculations use UTC-5 (Eastern Time)
- **Business Day**: 1 AM through 12:59 AM Eastern the following day (a shifted 24-hour window)
- **Format**: MySQL DATETIME (YYYY-MM-DD HH:MM:SS)
- **Period Boundaries**: Calculated in `timeUtils.js` for consistent time-range handling
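A minimal sketch of the fixed-offset convention described above (illustrative only — `timeUtils.js` is the source of truth, and a hard-coded -5 offset ignores daylight saving):

```javascript
// Format a UTC timestamp as a MySQL DATETIME string shifted to UTC-5.
function toEasternDatetime(date) {
  const shifted = new Date(date.getTime() - 5 * 60 * 60 * 1000);
  // toISOString gives "YYYY-MM-DDTHH:MM:SS.sssZ"; trim to DATETIME shape.
  return shifted.toISOString().slice(0, 19).replace('T', ' ');
}
```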
### Order Processing

- **Revenue Calculation**: Only includes orders with `order_status > 15`
- **Order Types**:
  - Pre-orders: `stats_waiting_preorder > 0`
  - Local pickup: `ship_method_selected = 'localpickup'`
  - On-hold: `ship_method_selected = 'holdit'`
- **Shipping Methods**: Mapped to friendly names (e.g., `usps_ground_advantage` → "USPS Ground Advantage")
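The friendly-name mapping could look like the sketch below. Only the `usps_ground_advantage` example is documented here; the override table and the title-case fallback are assumptions, not the server's actual table:

```javascript
// Map a raw ship_method_selected value to a display label.
// Overrides handle names the generic title-caser would get wrong
// (e.g. "usps" should render as "USPS", not "Usps").
function shippingMethodLabel(method) {
  const overrides = { usps_ground_advantage: 'USPS Ground Advantage' };
  if (overrides[method]) return overrides[method];
  return method
    .split('_')
    .map((word) => word.charAt(0).toUpperCase() + word.slice(1))
    .join(' ');
}
```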
### Projections

- **Period Progress**: Calculated from the current time's position within the selected period
- **Simple Projection**: Linear extrapolation based on current progress
- **Smart Projection**: Uses historical data patterns for more accurate forecasting
- **Confidence Levels**: Based on data consistency and historical accuracy
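The simple linear projection is just the observed revenue scaled by the fraction of the period elapsed — the same formula the `/stats` route uses for `projectedRevenue`:

```javascript
// Linear extrapolation: revenue so far divided by the elapsed fraction.
function projectRevenue(revenueSoFar, periodProgressPct) {
  if (periodProgressPct <= 0) return 0;
  if (periodProgressPct >= 100) return revenueSoFar; // period complete
  return revenueSoFar / (periodProgressPct / 100);
}
```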
### Image URL Generation

- **Pattern**: `https://sbing.com/i/products/0000/{prefix}/{pid}-{type}-{iid}.jpg`
- **Prefix**: The first three digits of the product ID zero-padded to six digits (e.g. pid `12345` → prefix `012`)
- **Type**: `t` (thumbnail), `175x175`, or `o` (original), per `getImageUrls` in `routes/events.js`
- **Fallback**: Uses the primary image (`order = 255`) when available
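This pattern is implemented by the `getImageUrls` helper in `routes/events.js`, abridged here:

```javascript
// Build product image URLs; prefix = first 3 digits of the 6-digit-padded pid.
const getImageUrls = (pid, iid = 1) => {
  const imageUrlBase = 'https://sbing.com/i/products/0000/';
  const prefix = pid.toString().padStart(6, '0').slice(0, 3);
  const basePath = `${imageUrlBase}${prefix}/${pid}`;
  return {
    image: `${basePath}-t-${iid}.jpg`,          // thumbnail
    image_175: `${basePath}-175x175-${iid}.jpg`, // 175x175 crop
    image_full: `${basePath}-o-${iid}.jpg`       // original
  };
};
```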
## Frontend Integration

### Service Layer (`services/acotService.js`)

- **Purpose**: Replaces direct Klaviyo API calls with acot-server calls
- **Methods**: `getStats()`, `getStatsDetails()`, `getProducts()`, `getProjection()`
- **Logging**: Axios interceptors for request/response logging
- **Environment**: Automatic URL handling (proxy in dev, direct in production)
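A hypothetical sketch of how such a service layer might build its request URLs — only the method names and the `/api/acot` base path come from this README; the actual `acotService.js` uses axios and may differ:

```javascript
const BASE = '/api/acot';

// Build a request URL, dropping undefined query params.
function buildUrl(path, params = {}) {
  const defined = Object.entries(params).filter(([, v]) => v !== undefined);
  const qs = new URLSearchParams(defined).toString();
  return qs ? `${BASE}${path}?${qs}` : `${BASE}${path}`;
}

const acotService = {
  getStats: (timeRange) => buildUrl('/events/stats', { timeRange }),
  getProducts: (timeRange) => buildUrl('/events/products', { timeRange }),
  getProjection: (timeRange) => buildUrl('/events/projection', { timeRange })
};
```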
### Component Updates

All five main components were updated to use `acotService`:

- **StatCards.jsx**: Main dashboard statistics
- **MiniStatCards.jsx**: Compact statistics view
- **SalesChart.jsx**: Revenue and order trends
- **MiniSalesChart.jsx**: Compact chart view
- **ProductGrid.jsx**: Top products table

### Proxy Configuration (`vite.config.js`)

```javascript
'/api/acot': {
  target: 'http://localhost:3007',
  changeOrigin: true,
  secure: false
}
```

## Key Features

### Complete Business Intelligence

- **Revenue Analytics**: Total revenue, trends, projections
- **Order Analysis**: Counts, types, status tracking
- **Product Performance**: Top sellers, revenue contribution
- **Shipping Intelligence**: Methods, locations, distribution
- **Customer Insights**: Order value ranges, patterns
- **Operational Metrics**: Refunds, cancellations, peak hours

### Performance Optimizations

- **Connection Pooling**: Efficient database connection management
- **Query Optimization**: Indexed queries with proper WHERE clauses
- **Caching Strategy**: Frontend caching for detail views
- **Batch Processing**: Efficient data aggregation

### Error Handling

- **Database Connectivity**: Graceful handling of connection issues
- **Query Failures**: Detailed error logging and user-friendly messages
- **Data Validation**: Input sanitization and validation
- **Fallback Mechanisms**: Default values for missing data

## Simplified Elements

Due to database complexity, some features are simplified:

- **Brands**: Shows "Various Brands" (the companies table structure is complex)
- **Categories**: Shows "General" (category relationships are complex)

These can be enhanced in future iterations with proper category mapping.

## Testing

Test the server functionality:

```bash
# Health check
curl http://localhost:3007/api/acot/test

# Today's stats
curl "http://localhost:3007/api/acot/events/stats?timeRange=today"

# Last 30 days with details (quotes keep the shell from splitting on &)
curl "http://localhost:3007/api/acot/events/stats/details?timeRange=last30days&daily=true"

# Top products
curl "http://localhost:3007/api/acot/events/products?timeRange=thisWeek"

# Revenue projection
curl "http://localhost:3007/api/acot/events/projection?timeRange=today"
```

## Development Notes

- **No Frontend Changes**: Complete drop-in replacement for Klaviyo
- **API Compatibility**: Maintains the exact response structure
- **Business Logic**: Implements all complex e-commerce calculations
- **Scalability**: Designed for production workloads
- **Maintainability**: Well-documented code with clear separation of concerns

## Future Enhancements

- Enhanced category and brand mapping
- Real-time notifications for significant events
- Advanced analytics and forecasting
- Customer segmentation analysis
- Inventory integration
297 dashboard-server/acot-server/db/connection.js (new file)
@@ -0,0 +1,297 @@
const { Client } = require('ssh2');
const mysql = require('mysql2/promise');
const fs = require('fs');

// Connection pool configuration
const connectionPool = {
  connections: [],
  maxConnections: 20,
  currentConnections: 0,
  pendingRequests: [],
  // Cache for query results (key: query string, value: {data, timestamp})
  queryCache: new Map(),
  // Cache duration for different query types in milliseconds
  cacheDuration: {
    'stats': 60 * 1000,        // 1 minute for stats
    'products': 5 * 60 * 1000, // 5 minutes for products
    'orders': 60 * 1000,       // 1 minute for orders
    'default': 60 * 1000       // 1 minute default
  },
  // Circuit breaker state
  circuitBreaker: {
    failures: 0,
    lastFailure: 0,
    isOpen: false,
    threshold: 5,
    timeout: 30000 // 30 seconds
  }
};

/**
 * Get a database connection from the pool
 * @returns {Promise<{connection: object, release: function}>} The database connection and release function
 */
async function getDbConnection() {
  return new Promise(async (resolve, reject) => {
    // Check circuit breaker
    const now = Date.now();
    if (connectionPool.circuitBreaker.isOpen) {
      if (now - connectionPool.circuitBreaker.lastFailure > connectionPool.circuitBreaker.timeout) {
        // Reset circuit breaker
        connectionPool.circuitBreaker.isOpen = false;
        connectionPool.circuitBreaker.failures = 0;
        console.log('Circuit breaker reset');
      } else {
        reject(new Error('Circuit breaker is open - too many connection failures'));
        return;
      }
    }

    // Check if there's an available connection in the pool
    if (connectionPool.connections.length > 0) {
      const conn = connectionPool.connections.pop();
      console.log(`Using pooled connection. Pool size: ${connectionPool.connections.length}`);
      resolve({
        connection: conn.connection,
        release: () => releaseConnection(conn)
      });
      return;
    }

    // If we haven't reached max connections, create a new one
    if (connectionPool.currentConnections < connectionPool.maxConnections) {
      try {
        console.log(`Creating new connection. Current: ${connectionPool.currentConnections}/${connectionPool.maxConnections}`);
        connectionPool.currentConnections++;

        const tunnel = await setupSshTunnel();
        const { ssh, stream, dbConfig } = tunnel;

        const connection = await mysql.createConnection({
          ...dbConfig,
          stream
        });

        const conn = { ssh, connection, inUse: true, created: Date.now() };

        console.log('Database connection established');

        // Reset circuit breaker on successful connection
        if (connectionPool.circuitBreaker.failures > 0) {
          connectionPool.circuitBreaker.failures = 0;
          connectionPool.circuitBreaker.isOpen = false;
        }

        resolve({
          connection: conn.connection,
          release: () => releaseConnection(conn)
        });
      } catch (error) {
        connectionPool.currentConnections--;

        // Track circuit breaker failures
        connectionPool.circuitBreaker.failures++;
        connectionPool.circuitBreaker.lastFailure = Date.now();

        if (connectionPool.circuitBreaker.failures >= connectionPool.circuitBreaker.threshold) {
          connectionPool.circuitBreaker.isOpen = true;
          console.log(`Circuit breaker opened after ${connectionPool.circuitBreaker.failures} failures`);
        }

        reject(error);
      }
      return;
    }

    // Pool is full, queue the request with timeout
    console.log('Connection pool full, queuing request...');
    const timeoutId = setTimeout(() => {
      // Remove from queue if still there
      const index = connectionPool.pendingRequests.findIndex(req => req.resolve === resolve);
      if (index !== -1) {
        connectionPool.pendingRequests.splice(index, 1);
        reject(new Error('Connection pool queue timeout after 15 seconds'));
      }
    }, 15000);

    connectionPool.pendingRequests.push({
      resolve,
      reject,
      timeoutId,
      timestamp: Date.now()
    });
  });
}

/**
 * Release a connection back to the pool
 */
function releaseConnection(conn) {
  conn.inUse = false;

  // Check if there are pending requests
  if (connectionPool.pendingRequests.length > 0) {
    const { resolve, timeoutId } = connectionPool.pendingRequests.shift();

    // Clear the timeout since we're serving the request
    if (timeoutId) {
      clearTimeout(timeoutId);
    }

    conn.inUse = true;
    console.log(`Serving queued request. Queue length: ${connectionPool.pendingRequests.length}`);
    resolve({
      connection: conn.connection,
      release: () => releaseConnection(conn)
    });
  } else {
    // Return to pool
    connectionPool.connections.push(conn);
    console.log(`Connection returned to pool. Pool size: ${connectionPool.connections.length}, Active: ${connectionPool.currentConnections}`);
  }
}

/**
 * Get cached query results or execute query if not cached
 * @param {string} cacheKey - Unique key to identify the query
 * @param {string} queryType - Type of query (stats, products, orders, etc.)
 * @param {Function} queryFn - Function to execute if cache miss
 * @returns {Promise<any>} The query result
 */
async function getCachedQuery(cacheKey, queryType, queryFn) {
  // Get cache duration based on query type
  const cacheDuration = connectionPool.cacheDuration[queryType] || connectionPool.cacheDuration.default;

  // Check if we have a valid cached result
  const cachedResult = connectionPool.queryCache.get(cacheKey);
  const now = Date.now();

  if (cachedResult && (now - cachedResult.timestamp < cacheDuration)) {
    console.log(`Cache hit for ${queryType} query: ${cacheKey}`);
    return cachedResult.data;
  }

  // No valid cache found, execute the query
  console.log(`Cache miss for ${queryType} query: ${cacheKey}`);
  const result = await queryFn();

  // Cache the result
  connectionPool.queryCache.set(cacheKey, {
    data: result,
    timestamp: now
  });

  return result;
}

/**
 * Setup SSH tunnel to production database
 * @private - Should only be used by getDbConnection
 * @returns {Promise<{ssh: object, stream: object, dbConfig: object}>}
 */
async function setupSshTunnel() {
  const sshConfig = {
    host: process.env.PROD_SSH_HOST,
    port: process.env.PROD_SSH_PORT || 22,
    username: process.env.PROD_SSH_USER,
    privateKey: process.env.PROD_SSH_KEY_PATH
      ? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
      : undefined,
    compress: true
  };

  const dbConfig = {
    host: process.env.PROD_DB_HOST || 'localhost',
    user: process.env.PROD_DB_USER,
    password: process.env.PROD_DB_PASSWORD,
    database: process.env.PROD_DB_NAME,
    port: process.env.PROD_DB_PORT || 3306,
    timezone: 'Z'
  };

  return new Promise((resolve, reject) => {
    const ssh = new Client();

    ssh.on('error', (err) => {
      console.error('SSH connection error:', err);
      reject(err);
    });

    ssh.on('ready', () => {
      ssh.forwardOut(
        '127.0.0.1',
        0,
        dbConfig.host,
        dbConfig.port,
        (err, stream) => {
          if (err) return reject(err); // bail out so we never resolve with a dead stream
          resolve({ ssh, stream, dbConfig });
        }
      );
    }).connect(sshConfig);
  });
}

/**
 * Clear cached query results
 * @param {string} [cacheKey] - Specific cache key to clear (clears all if not provided)
 */
function clearQueryCache(cacheKey) {
  if (cacheKey) {
    connectionPool.queryCache.delete(cacheKey);
    console.log(`Cleared cache for key: ${cacheKey}`);
  } else {
    connectionPool.queryCache.clear();
    console.log('Cleared all query cache');
  }
}

/**
 * Force close all active connections
 * Useful for server shutdown or manual connection reset
 */
async function closeAllConnections() {
  // Close all pooled connections
  for (const conn of connectionPool.connections) {
    try {
      await conn.connection.end();
      conn.ssh.end();
      console.log('Closed pooled connection');
    } catch (error) {
      console.error('Error closing pooled connection:', error);
    }
  }

  // Reset pool state
  connectionPool.connections = [];
  connectionPool.currentConnections = 0;
  connectionPool.pendingRequests = [];
  connectionPool.queryCache.clear();

  console.log('All connections closed and pool reset');
}

/**
 * Get connection pool status for debugging
 */
function getPoolStatus() {
  return {
    poolSize: connectionPool.connections.length,
    activeConnections: connectionPool.currentConnections,
    maxConnections: connectionPool.maxConnections,
    pendingRequests: connectionPool.pendingRequests.length,
    cacheSize: connectionPool.queryCache.size,
    queuedRequests: connectionPool.pendingRequests.map(req => ({
      waitTime: Date.now() - req.timestamp,
      hasTimeout: !!req.timeoutId
    }))
  };
}

module.exports = {
  getDbConnection,
  getCachedQuery,
  clearQueryCache,
  closeAllConnections,
  getPoolStatus
};
1543 dashboard-server/acot-server/package-lock.json (generated, new file)
File diff suppressed because it is too large

22 dashboard-server/acot-server/package.json (new file)
@@ -0,0 +1,22 @@
{
  "name": "acot-server",
  "version": "1.0.0",
  "description": "A Cherry On Top production database server",
  "main": "server.js",
  "scripts": {
    "start": "node server.js",
    "dev": "nodemon server.js"
  },
  "dependencies": {
    "express": "^4.18.2",
    "cors": "^2.8.5",
    "dotenv": "^16.3.1",
    "morgan": "^1.10.0",
    "ssh2": "^1.14.0",
    "mysql2": "^3.6.5",
    "compression": "^1.7.4"
  },
  "devDependencies": {
    "nodemon": "^3.0.1"
  }
}
767 dashboard-server/acot-server/routes/events.js (new file)
@@ -0,0 +1,767 @@
const express = require('express');
const router = express.Router();
const { getDbConnection, getPoolStatus } = require('../db/connection');
const { getTimeRangeConditions, formatBusinessDate, getBusinessDayBounds } = require('../utils/timeUtils');

// Image URL generation utility
const getImageUrls = (pid, iid = 1) => {
  const imageUrlBase = 'https://sbing.com/i/products/0000/';
  const paddedPid = pid.toString().padStart(6, '0');
  const prefix = paddedPid.slice(0, 3);
  const basePath = `${imageUrlBase}${prefix}/${pid}`;
  return {
    image: `${basePath}-t-${iid}.jpg`,
    image_175: `${basePath}-175x175-${iid}.jpg`,
    image_full: `${basePath}-o-${iid}.jpg`,
    ImgThumb: `${basePath}-175x175-${iid}.jpg` // For ProductGrid component
  };
};

// Main stats endpoint - replaces /api/klaviyo/events/stats
router.get('/stats', async (req, res) => {
  const startTime = Date.now();
  console.log(`[STATS] Starting request for timeRange: ${req.query.timeRange}`);

  // Set a timeout for the entire operation
  const timeoutPromise = new Promise((_, reject) => {
    setTimeout(() => reject(new Error('Request timeout after 15 seconds')), 15000);
  });

  try {
    const mainOperation = async () => {
      const { timeRange, startDate, endDate } = req.query;
      console.log(`[STATS] Getting DB connection...`);
      const { connection, release } = await getDbConnection();
      console.log(`[STATS] DB connection obtained in ${Date.now() - startTime}ms`);

      const { whereClause, params, dateRange } = getTimeRangeConditions(timeRange, startDate, endDate);

      // Main order stats query. Cancelled orders (order_status = 15) must be
      // inside the WHERE scope, otherwise cancelledCount/cancelledTotal can
      // never be non-zero; the CASE guards keep them out of the other aggregates.
      const mainStatsQuery = `
        SELECT
          SUM(CASE WHEN order_status > 15 THEN 1 ELSE 0 END) as orderCount,
          SUM(CASE WHEN order_status > 15 THEN summary_total ELSE 0 END) as revenue,
          SUM(CASE WHEN order_status > 15 THEN stats_prod_pieces ELSE 0 END) as itemCount,
          AVG(CASE WHEN order_status > 15 THEN summary_total END) as averageOrderValue,
          AVG(CASE WHEN order_status > 15 THEN stats_prod_pieces END) as averageItemsPerOrder,
          SUM(CASE WHEN order_status > 15 AND stats_waiting_preorder > 0 THEN 1 ELSE 0 END) as preOrderCount,
          SUM(CASE WHEN order_status > 15 AND ship_method_selected = 'localpickup' THEN 1 ELSE 0 END) as localPickupCount,
          SUM(CASE WHEN order_status > 15 AND ship_method_selected = 'holdit' THEN 1 ELSE 0 END) as onHoldCount,
          SUM(CASE WHEN order_status IN (100, 92) THEN 1 ELSE 0 END) as shippedCount,
          SUM(CASE WHEN order_status = 15 THEN 1 ELSE 0 END) as cancelledCount,
          SUM(CASE WHEN order_status = 15 THEN summary_total ELSE 0 END) as cancelledTotal
        FROM _order
        WHERE order_status >= 15 AND ${whereClause}
      `;

      const [mainStats] = await connection.execute(mainStatsQuery, params);
      const stats = mainStats[0];

      // Refunds query
      const refundsQuery = `
        SELECT
          COUNT(*) as refundCount,
          ABS(SUM(payment_amount)) as refundTotal
        FROM order_payment op
        JOIN _order o ON op.order_id = o.order_id
        WHERE payment_amount < 0 AND o.order_status > 15 AND ${whereClause.replace('date_placed', 'o.date_placed')}
      `;

      const [refundStats] = await connection.execute(refundsQuery, params);

      // Best revenue day query
      const bestDayQuery = `
        SELECT
          DATE(date_placed) as date,
          SUM(summary_total) as revenue,
          COUNT(*) as orders
        FROM _order
        WHERE order_status > 15 AND ${whereClause}
        GROUP BY DATE(date_placed)
        ORDER BY revenue DESC
        LIMIT 1
      `;

      const [bestDayResult] = await connection.execute(bestDayQuery, params);

      // Peak hour query (for single day periods)
      let peakHour = null;
      if (['today', 'yesterday'].includes(timeRange)) {
        const peakHourQuery = `
          SELECT
            HOUR(date_placed) as hour,
            COUNT(*) as count
          FROM _order
          WHERE order_status > 15 AND ${whereClause}
          GROUP BY HOUR(date_placed)
          ORDER BY count DESC
          LIMIT 1
        `;

        const [peakHourResult] = await connection.execute(peakHourQuery, params);
        if (peakHourResult.length > 0) {
          const hour = peakHourResult[0].hour;
          const date = new Date();
          date.setHours(hour, 0, 0);
          peakHour = {
            hour,
            count: peakHourResult[0].count,
            displayHour: date.toLocaleString("en-US", { hour: "numeric", hour12: true })
          };
        }
      }
      // Brands and categories query - simplified for now since we don't have the category tables
      // We'll use a simple approach without the company table for now
      const brandsQuery = `
        SELECT
          'Various Brands' as brandName,
          COUNT(DISTINCT oi.order_id) as orderCount,
          SUM(oi.qty_ordered) as itemCount,
          SUM(oi.qty_ordered * oi.prod_price) as revenue
        FROM order_items oi
        JOIN _order o ON oi.order_id = o.order_id
        JOIN products p ON oi.prod_pid = p.pid
        WHERE o.order_status > 15 AND ${whereClause.replace('date_placed', 'o.date_placed')}
        HAVING revenue > 0
      `;

      const [brandsResult] = await connection.execute(brandsQuery, params);

      // For categories, we'll use a simplified approach
      const categoriesQuery = `
        SELECT
          'General' as categoryName,
          COUNT(DISTINCT oi.order_id) as orderCount,
          SUM(oi.qty_ordered) as itemCount,
          SUM(oi.qty_ordered * oi.prod_price) as revenue
        FROM order_items oi
        JOIN _order o ON oi.order_id = o.order_id
        JOIN products p ON oi.prod_pid = p.pid
        WHERE o.order_status > 15 AND ${whereClause.replace('date_placed', 'o.date_placed')}
        HAVING revenue > 0
      `;

      const [categoriesResult] = await connection.execute(categoriesQuery, params);

      // Shipping locations query
      const shippingQuery = `
        SELECT
          ship_country,
          ship_state,
          ship_method_selected,
          COUNT(*) as count
        FROM _order
        WHERE order_status IN (100, 92) AND ${whereClause}
        GROUP BY ship_country, ship_state, ship_method_selected
      `;

      const [shippingResult] = await connection.execute(shippingQuery, params);

      // Process shipping data
      const shippingStats = processShippingData(shippingResult, stats.shippedCount);

      // Order value range query
      const orderRangeQuery = `
        SELECT
          MIN(summary_total) as smallest,
          MAX(summary_total) as largest
        FROM _order
        WHERE order_status > 15 AND ${whereClause}
      `;

      const [orderRangeResult] = await connection.execute(orderRangeQuery, params);

      // Calculate period progress for incomplete periods
      let periodProgress = 100;
      if (['today', 'thisWeek', 'thisMonth'].includes(timeRange)) {
        periodProgress = calculatePeriodProgress(timeRange);
      }

      // Previous period comparison data
      const prevPeriodData = await getPreviousPeriodData(connection, timeRange, startDate, endDate);

      const response = {
        timeRange: dateRange,
        stats: {
          revenue: parseFloat(stats.revenue || 0),
          orderCount: parseInt(stats.orderCount || 0),
          itemCount: parseInt(stats.itemCount || 0),
          averageOrderValue: parseFloat(stats.averageOrderValue || 0),
          averageItemsPerOrder: parseFloat(stats.averageItemsPerOrder || 0),

          // Order types
          orderTypes: {
            preOrders: {
              count: parseInt(stats.preOrderCount || 0),
              percentage: stats.orderCount > 0 ? (stats.preOrderCount / stats.orderCount) * 100 : 0
            },
            localPickup: {
              count: parseInt(stats.localPickupCount || 0),
              percentage: stats.orderCount > 0 ? (stats.localPickupCount / stats.orderCount) * 100 : 0
            },
            heldItems: {
              count: parseInt(stats.onHoldCount || 0),
              percentage: stats.orderCount > 0 ? (stats.onHoldCount / stats.orderCount) * 100 : 0
            }
          },

          // Shipping
          shipping: {
            shippedCount: parseInt(stats.shippedCount || 0),
            locations: shippingStats.locations,
            methodStats: shippingStats.methods
          },

          // Brands and categories
          brands: {
            total: brandsResult.length,
            list: brandsResult.slice(0, 50).map(brand => ({
              name: brand.brandName,
              count: parseInt(brand.itemCount),
              revenue: parseFloat(brand.revenue)
            }))
          },

          categories: {
            total: categoriesResult.length,
            list: categoriesResult.slice(0, 50).map(category => ({
              name: category.categoryName,
              count: parseInt(category.itemCount),
              revenue: parseFloat(category.revenue)
            }))
          },

          // Refunds and cancellations
          refunds: {
            total: parseFloat(refundStats[0]?.refundTotal || 0),
            count: parseInt(refundStats[0]?.refundCount || 0)
          },

          canceledOrders: {
            total: parseFloat(stats.cancelledTotal || 0),
            count: parseInt(stats.cancelledCount || 0)
          },

          // Best day
          bestRevenueDay: bestDayResult.length > 0 ? {
            amount: parseFloat(bestDayResult[0].revenue),
            displayDate: bestDayResult[0].date,
            orders: parseInt(bestDayResult[0].orders)
          } : null,

          // Peak hour (for single days)
          peakOrderHour: peakHour,

          // Order value range
          orderValueRange: orderRangeResult.length > 0 ? {
            smallest: parseFloat(orderRangeResult[0].smallest || 0),
            largest: parseFloat(orderRangeResult[0].largest || 0)
          } : { smallest: 0, largest: 0 },

          // Period progress and projections
          periodProgress,
          projectedRevenue: periodProgress < 100 ? (stats.revenue / (periodProgress / 100)) : stats.revenue,

          // Previous period comparison
          prevPeriodRevenue: prevPeriodData.revenue,
          prevPeriodOrders: prevPeriodData.orderCount,
          prevPeriodAOV: prevPeriodData.averageOrderValue
        }
      };

      return { response, release };
    };

    // Race between the main operation and timeout
    let result;
    try {
      result = await Promise.race([mainOperation(), timeoutPromise]);
    } catch (error) {
      // If it's a timeout, we don't have a release function to call
|
||||||
|
if (error.message.includes('timeout')) {
|
||||||
|
console.log(`[STATS] Request timed out in ${Date.now() - startTime}ms`);
|
||||||
|
throw error;
|
||||||
|
}
|
||||||
|
// For other errors, re-throw
|
||||||
|
throw error;
|
||||||
|
}
|
||||||
|
|
||||||
|
const { response, release } = result;
|
||||||
|
|
||||||
|
// Release connection back to pool
|
||||||
|
if (release) release();
|
||||||
|
|
||||||
|
console.log(`[STATS] Request completed in ${Date.now() - startTime}ms`);
|
||||||
|
res.json(response);
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error in /stats:', error);
|
||||||
|
console.log(`[STATS] Request failed in ${Date.now() - startTime}ms`);
|
||||||
|
res.status(500).json({ error: error.message });
|
||||||
|
}
|
||||||
|
});
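The race-between-operation-and-timeout pattern used in this handler can be exercised in isolation. The sketch below is standalone and assumes nothing from this codebase; `withTimeout` is a hypothetical helper name chosen for illustration:

```javascript
// Standalone sketch of the Promise.race timeout pattern used above.
// `withTimeout` is a hypothetical helper name, not part of this codebase.
const withTimeout = (promise, ms) =>
  Promise.race([
    promise,
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error('Request timeout')), ms)
    ),
  ]);

// A fast operation resolves normally; a slow one rejects with a timeout error,
// which the caller can recognize via error.message.includes('timeout').
async function demo() {
  const fast = await withTimeout(Promise.resolve('ok'), 50);
  let slowFailed = false;
  try {
    await withTimeout(new Promise((resolve) => setTimeout(resolve, 200)), 10);
  } catch (e) {
    slowFailed = e.message.includes('timeout');
  }
  return { fast, slowFailed };
}
```

Note that `Promise.race` does not cancel the losing promise; the main operation keeps running after a timeout, which is why the handler above cannot release the connection on that path.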

// Daily details endpoint - replaces /api/klaviyo/events/stats/details
router.get('/stats/details', async (req, res) => {
  let release;
  try {
    const { timeRange, startDate, endDate, metric, daily } = req.query;
    const { connection, release: releaseConn } = await getDbConnection();
    release = releaseConn;

    const { whereClause, params } = getTimeRangeConditions(timeRange, startDate, endDate);

    // Daily breakdown query
    const dailyQuery = `
      SELECT
        DATE(date_placed) as date,
        COUNT(*) as orders,
        SUM(summary_total) as revenue,
        AVG(summary_total) as averageOrderValue,
        SUM(stats_prod_pieces) as itemCount
      FROM _order
      WHERE order_status > 15 AND ${whereClause}
      GROUP BY DATE(date_placed)
      ORDER BY DATE(date_placed)
    `;

    const [dailyResults] = await connection.execute(dailyQuery, params);

    // Get previous period data using the same logic as main stats endpoint
    let prevWhereClause, prevParams;

    if (timeRange && timeRange !== 'custom') {
      const prevTimeRange = getPreviousTimeRange(timeRange);
      const result = getTimeRangeConditions(prevTimeRange);
      prevWhereClause = result.whereClause;
      prevParams = result.params;
    } else {
      // Custom date range - go back by the same duration
      const start = new Date(startDate);
      const end = new Date(endDate);
      const duration = end.getTime() - start.getTime();

      const prevEnd = new Date(start.getTime() - 1);
      const prevStart = new Date(prevEnd.getTime() - duration);

      prevWhereClause = 'date_placed >= ? AND date_placed <= ?';
      prevParams = [prevStart.toISOString(), prevEnd.toISOString()];
    }

    // Get previous period daily data
    const prevQuery = `
      SELECT
        DATE(date_placed) as date,
        COUNT(*) as prevOrders,
        SUM(summary_total) as prevRevenue,
        AVG(summary_total) as prevAvgOrderValue
      FROM _order
      WHERE order_status > 15 AND ${prevWhereClause}
      GROUP BY DATE(date_placed)
    `;

    const [prevResults] = await connection.execute(prevQuery, prevParams);

    // Create a map for quick lookup of previous period data
    const prevMap = new Map();
    prevResults.forEach(prev => {
      const key = new Date(prev.date).toISOString().split('T')[0];
      prevMap.set(key, prev);
    });

    // For period-to-period comparison, we need to map days by relative position
    // since dates won't match exactly (e.g., current week vs previous week)
    const dailyArray = dailyResults.map(day => ({
      timestamp: day.date,
      date: day.date,
      orders: parseInt(day.orders),
      revenue: parseFloat(day.revenue),
      averageOrderValue: parseFloat(day.averageOrderValue || 0),
      itemCount: parseInt(day.itemCount)
    }));

    const prevArray = prevResults.map(day => ({
      orders: parseInt(day.prevOrders),
      revenue: parseFloat(day.prevRevenue),
      averageOrderValue: parseFloat(day.prevAvgOrderValue || 0)
    }));

    // Combine current and previous period data by matching relative positions
    const statsWithComparison = dailyArray.map((day, index) => {
      const prev = prevArray[index] || { orders: 0, revenue: 0, averageOrderValue: 0 };

      return {
        ...day,
        prevOrders: prev.orders,
        prevRevenue: prev.revenue,
        prevAvgOrderValue: prev.averageOrderValue
      };
    });

    res.json({ stats: statsWithComparison });

  } catch (error) {
    console.error('Error in /stats/details:', error);
    res.status(500).json({ error: error.message });
  } finally {
    // Release connection back to pool
    if (release) release();
  }
});

// Products endpoint - replaces /api/klaviyo/events/products
router.get('/products', async (req, res) => {
  let release;
  try {
    const { timeRange, startDate, endDate } = req.query;
    const { connection, release: releaseConn } = await getDbConnection();
    release = releaseConn;

    const { whereClause, params } = getTimeRangeConditions(timeRange, startDate, endDate);

    const productsQuery = `
      SELECT
        p.pid,
        p.description as name,
        SUM(oi.qty_ordered) as totalQuantity,
        SUM(oi.qty_ordered * oi.prod_price) as totalRevenue,
        COUNT(DISTINCT oi.order_id) as orderCount,
        (SELECT pi.iid FROM product_images pi WHERE pi.pid = p.pid AND pi.order = 255 LIMIT 1) as primary_iid
      FROM order_items oi
      JOIN _order o ON oi.order_id = o.order_id
      JOIN products p ON oi.prod_pid = p.pid
      WHERE o.order_status > 15 AND ${whereClause.replace(/date_placed/g, 'o.date_placed')}
      GROUP BY p.pid, p.description
      ORDER BY totalRevenue DESC
      LIMIT 500
    `;
    // Note: the replace must be global (/g) so BOTH date_placed placeholders
    // in the where clause get qualified with the _order alias.

    const [productsResult] = await connection.execute(productsQuery, params);

    // Add image URLs to each product
    const productsWithImages = productsResult.map(product => {
      const imageUrls = getImageUrls(product.pid, product.primary_iid || 1);
      return {
        id: product.pid,
        name: product.name,
        totalQuantity: parseInt(product.totalQuantity),
        totalRevenue: parseFloat(product.totalRevenue),
        orderCount: parseInt(product.orderCount),
        ...imageUrls
      };
    });

    res.json({
      stats: {
        products: {
          total: productsWithImages.length,
          list: productsWithImages
        }
      }
    });

  } catch (error) {
    console.error('Error in /products:', error);
    res.status(500).json({ error: error.message });
  } finally {
    // Release connection back to pool
    if (release) release();
  }
});

// Projection endpoint - replaces /api/klaviyo/events/projection
router.get('/projection', async (req, res) => {
  let release;
  try {
    const { timeRange, startDate, endDate } = req.query;

    // Only provide projections for incomplete periods
    if (!['today', 'thisWeek', 'thisMonth'].includes(timeRange)) {
      return res.json({ projectedRevenue: 0, confidence: 0 });
    }

    const { connection, release: releaseConn } = await getDbConnection();
    release = releaseConn;

    // Get current period data
    const { whereClause, params } = getTimeRangeConditions(timeRange, startDate, endDate);

    const currentQuery = `
      SELECT
        SUM(summary_total) as currentRevenue,
        COUNT(*) as currentOrders
      FROM _order
      WHERE order_status > 15 AND ${whereClause}
    `;

    const [currentResult] = await connection.execute(currentQuery, params);
    const current = currentResult[0];

    // Get historical data for the same period type
    const historicalData = await getHistoricalProjectionData(connection, timeRange);

    // Calculate projection based on current progress and historical patterns
    const periodProgress = calculatePeriodProgress(timeRange);
    const projection = calculateSmartProjection(
      parseFloat(current.currentRevenue || 0),
      parseInt(current.currentOrders || 0),
      periodProgress,
      historicalData
    );

    res.json(projection);

  } catch (error) {
    console.error('Error in /projection:', error);
    res.status(500).json({ error: error.message });
  } finally {
    // Release connection back to pool
    if (release) release();
  }
});

// Debug endpoint to check connection pool status
router.get('/debug/pool', (req, res) => {
  res.json(getPoolStatus());
});

// Health check endpoint
router.get('/health', async (req, res) => {
  try {
    const { connection, release } = await getDbConnection();

    // Simple query to test connection
    const [result] = await connection.execute('SELECT 1 as test');
    release();

    res.json({
      status: 'healthy',
      timestamp: new Date().toISOString(),
      pool: getPoolStatus(),
      dbTest: result[0]
    });
  } catch (error) {
    res.status(500).json({
      status: 'unhealthy',
      error: error.message,
      timestamp: new Date().toISOString(),
      pool: getPoolStatus()
    });
  }
});

// Helper functions
function processShippingData(shippingResult, totalShipped) {
  const countries = {};
  const states = {};
  const methods = {};

  shippingResult.forEach(row => {
    // Countries
    if (row.ship_country) {
      countries[row.ship_country] = (countries[row.ship_country] || 0) + row.count;
    }

    // States
    if (row.ship_state) {
      states[row.ship_state] = (states[row.ship_state] || 0) + row.count;
    }

    // Methods
    if (row.ship_method_selected) {
      methods[row.ship_method_selected] = (methods[row.ship_method_selected] || 0) + row.count;
    }
  });

  // Guard against division by zero when nothing shipped in the period
  const pct = (count) => (totalShipped > 0 ? (count / totalShipped) * 100 : 0);

  return {
    locations: {
      total: totalShipped,
      byCountry: Object.entries(countries)
        .map(([country, count]) => ({
          country,
          count,
          percentage: pct(count)
        }))
        .sort((a, b) => b.count - a.count),
      byState: Object.entries(states)
        .map(([state, count]) => ({
          state,
          count,
          percentage: pct(count)
        }))
        .sort((a, b) => b.count - a.count)
    },
    methods: Object.entries(methods)
      .map(([name, value]) => ({ name, value }))
      .sort((a, b) => b.value - a.value)
  };
}

function calculatePeriodProgress(timeRange) {
  const now = new Date();
  const easternTime = new Date(now.getTime() - (5 * 60 * 60 * 1000)); // UTC-5 (fixed offset; DST not accounted for)

  switch (timeRange) {
    case 'today': {
      const { start } = getBusinessDayBounds('today');
      const businessStart = new Date(start);
      const businessEnd = new Date(businessStart);
      businessEnd.setDate(businessEnd.getDate() + 1);
      businessEnd.setHours(0, 59, 59, 999); // 12:59 AM next day

      const elapsed = easternTime.getTime() - businessStart.getTime();
      const total = businessEnd.getTime() - businessStart.getTime();
      return Math.min(100, Math.max(0, (elapsed / total) * 100));
    }
    case 'thisWeek': {
      const startOfWeek = new Date(easternTime);
      startOfWeek.setDate(easternTime.getDate() - easternTime.getDay()); // Sunday
      startOfWeek.setHours(1, 0, 0, 0); // 1 AM business day start

      const endOfWeek = new Date(startOfWeek);
      endOfWeek.setDate(endOfWeek.getDate() + 7);

      const elapsed = easternTime.getTime() - startOfWeek.getTime();
      const total = endOfWeek.getTime() - startOfWeek.getTime();
      return Math.min(100, Math.max(0, (elapsed / total) * 100));
    }
    case 'thisMonth': {
      const startOfMonth = new Date(easternTime.getFullYear(), easternTime.getMonth(), 1, 1, 0, 0, 0);
      const endOfMonth = new Date(easternTime.getFullYear(), easternTime.getMonth() + 1, 1, 0, 59, 59, 999);

      const elapsed = easternTime.getTime() - startOfMonth.getTime();
      const total = endOfMonth.getTime() - startOfMonth.getTime();
      return Math.min(100, Math.max(0, (elapsed / total) * 100));
    }
    default:
      return 100;
  }
}

async function getPreviousPeriodData(connection, timeRange, startDate, endDate) {
  // Calculate previous period dates
  let prevWhereClause, prevParams;

  if (timeRange && timeRange !== 'custom') {
    const prevTimeRange = getPreviousTimeRange(timeRange);
    const result = getTimeRangeConditions(prevTimeRange);
    prevWhereClause = result.whereClause;
    prevParams = result.params;
  } else {
    // Custom date range - go back by the same duration
    const start = new Date(startDate);
    const end = new Date(endDate);
    const duration = end.getTime() - start.getTime();

    const prevEnd = new Date(start.getTime() - 1);
    const prevStart = new Date(prevEnd.getTime() - duration);

    prevWhereClause = 'date_placed >= ? AND date_placed <= ?';
    prevParams = [prevStart.toISOString(), prevEnd.toISOString()];
  }

  const prevQuery = `
    SELECT
      COUNT(*) as orderCount,
      SUM(summary_total) as revenue,
      AVG(summary_total) as averageOrderValue
    FROM _order
    WHERE order_status > 15 AND ${prevWhereClause}
  `;

  const [prevResult] = await connection.execute(prevQuery, prevParams);
  const prev = prevResult[0] || { orderCount: 0, revenue: 0, averageOrderValue: 0 };

  return {
    orderCount: parseInt(prev.orderCount || 0),
    revenue: parseFloat(prev.revenue || 0),
    averageOrderValue: parseFloat(prev.averageOrderValue || 0)
  };
}

function getPreviousTimeRange(timeRange) {
  const map = {
    today: 'yesterday',
    thisWeek: 'lastWeek',
    thisMonth: 'lastMonth',
    last7days: 'previous7days',
    last30days: 'previous30days',
    last90days: 'previous90days',
    yesterday: 'twoDaysAgo'
  };
  return map[timeRange] || timeRange;
}

async function getHistoricalProjectionData(connection, timeRange) {
  // Get historical data for projection calculations
  // This is a simplified version - you could make this more sophisticated
  const historicalQuery = `
    SELECT
      SUM(summary_total) as revenue,
      COUNT(*) as orders
    FROM _order
    WHERE order_status > 15
      AND date_placed >= DATE_SUB(NOW(), INTERVAL 30 DAY)
      AND date_placed < DATE_SUB(NOW(), INTERVAL 1 DAY)
  `;

  const [result] = await connection.execute(historicalQuery);
  return result;
}

function calculateSmartProjection(currentRevenue, currentOrders, periodProgress, historicalData) {
  if (periodProgress >= 100) {
    return { projectedRevenue: currentRevenue, projectedOrders: currentOrders, confidence: 1.0 };
  }

  // Simple linear projection with confidence based on how much of the period has elapsed
  const projectedRevenue = currentRevenue / (periodProgress / 100);
  const projectedOrders = Math.round(currentOrders / (periodProgress / 100));

  // Confidence increases with more data (higher period progress)
  const confidence = Math.min(0.95, Math.max(0.1, periodProgress / 100));

  return {
    projectedRevenue,
    projectedOrders,
    confidence
  };
}

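The linear projection above is simple enough to check by hand: revenue divided by the elapsed fraction of the period. The sketch below is a standalone restatement of that arithmetic (a hypothetical `linearProjection` name, mirroring but not part of `calculateSmartProjection`):

```javascript
// Standalone restatement of the linear projection arithmetic above.
// Example: $5,000 of revenue at 50% period progress projects to $10,000.
function linearProjection(currentRevenue, currentOrders, periodProgress) {
  if (periodProgress >= 100) {
    // Period complete: no extrapolation, full confidence.
    return { projectedRevenue: currentRevenue, projectedOrders: currentOrders, confidence: 1.0 };
  }
  return {
    projectedRevenue: currentRevenue / (periodProgress / 100),
    projectedOrders: Math.round(currentOrders / (periodProgress / 100)),
    // Confidence grows with elapsed progress, clamped to [0.1, 0.95].
    confidence: Math.min(0.95, Math.max(0.1, periodProgress / 100)),
  };
}
```

One design consequence worth noting: early in a period (small `periodProgress`) the divisor is tiny, so projections swing wildly, which is exactly why the confidence floor sits at 0.1.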
module.exports = router;
57
dashboard-server/acot-server/routes/test.js
Normal file
@@ -0,0 +1,57 @@
const express = require('express');
const router = express.Router();
const { getDbConnection, getCachedQuery } = require('../db/connection');

// Test endpoint to count orders
router.get('/order-count', async (req, res) => {
  let release;
  try {
    const { connection, release: releaseConn } = await getDbConnection();
    release = releaseConn;

    // Simple query to count orders from _order table
    const queryFn = async () => {
      const [rows] = await connection.execute('SELECT COUNT(*) as count FROM _order');
      return rows[0].count;
    };

    const cacheKey = 'order-count';
    const count = await getCachedQuery(cacheKey, 'default', queryFn);

    res.json({
      success: true,
      data: {
        orderCount: count,
        timestamp: new Date().toISOString()
      }
    });
  } catch (error) {
    console.error('Error fetching order count:', error);
    res.status(500).json({
      success: false,
      error: error.message
    });
  } finally {
    // Release connection back to pool
    if (release) release();
  }
});

// Test connection endpoint
router.get('/test-connection', async (req, res) => {
  let release;
  try {
    const { connection, release: releaseConn } = await getDbConnection();
    release = releaseConn;

    // Test the connection with a simple query
    const [rows] = await connection.execute('SELECT 1 as test');

    res.json({
      success: true,
      message: 'Database connection successful',
      data: rows[0]
    });
  } catch (error) {
    console.error('Error testing connection:', error);
    res.status(500).json({
      success: false,
      error: error.message
    });
  } finally {
    // Release connection back to pool
    if (release) release();
  }
});

module.exports = router;
98
dashboard-server/acot-server/server.js
Normal file
@@ -0,0 +1,98 @@
require('dotenv').config();
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const compression = require('compression');
const fs = require('fs');
const path = require('path');
const { closeAllConnections } = require('./db/connection');

const app = express();
const PORT = process.env.ACOT_PORT || 3012;

// Create logs directory if it doesn't exist
const logDir = path.join(__dirname, 'logs/app');
if (!fs.existsSync(logDir)) {
  fs.mkdirSync(logDir, { recursive: true });
}

// Create a write stream for access logs
const accessLogStream = fs.createWriteStream(
  path.join(logDir, 'access.log'),
  { flags: 'a' }
);

// Middleware
app.use(compression());
app.use(cors());
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

// Logging middleware
if (process.env.NODE_ENV === 'production') {
  app.use(morgan('combined', { stream: accessLogStream }));
} else {
  app.use(morgan('dev'));
}

// Health check endpoint
app.get('/health', (req, res) => {
  res.json({
    status: 'healthy',
    service: 'acot-server',
    timestamp: new Date().toISOString(),
    uptime: process.uptime()
  });
});

// Routes
app.use('/api/acot/test', require('./routes/test'));
app.use('/api/acot/events', require('./routes/events'));

// Error handling middleware
app.use((err, req, res, next) => {
  console.error('Unhandled error:', err);
  res.status(500).json({
    success: false,
    error: process.env.NODE_ENV === 'production'
      ? 'Internal server error'
      : err.message
  });
});

// 404 handler
app.use((req, res) => {
  res.status(404).json({
    success: false,
    error: 'Route not found'
  });
});

// Start server
const server = app.listen(PORT, () => {
  console.log(`ACOT Server running on port ${PORT}`);
  console.log(`Environment: ${process.env.NODE_ENV}`);
});

// Graceful shutdown (handles both SIGTERM and SIGINT)
const gracefulShutdown = async () => {
  console.log('Shutdown signal received: closing HTTP server');
  server.close(async () => {
    console.log('HTTP server closed');

    // Close database connections
    try {
      await closeAllConnections();
      console.log('Database connections closed');
    } catch (error) {
      console.error('Error closing database connections:', error);
    }

    process.exit(0);
  });
};

process.on('SIGTERM', gracefulShutdown);
process.on('SIGINT', gracefulShutdown);

module.exports = app;
259
dashboard-server/acot-server/utils/timeUtils.js
Normal file
@@ -0,0 +1,259 @@
// Time utilities for handling business day logic and time ranges
// Business day is 1am-12:59am Eastern time (UTC-5; fixed offset, DST not accounted for)

const getBusinessDayBounds = (timeRange) => {
  const now = new Date();
  const easternTime = new Date(now.getTime() - (5 * 60 * 60 * 1000)); // UTC-5

  switch (timeRange) {
    case 'today': {
      const start = new Date(easternTime);
      start.setHours(1, 0, 0, 0); // 1 AM start of business day

      const end = new Date(start);
      end.setDate(end.getDate() + 1);
      end.setHours(0, 59, 59, 999); // 12:59 AM next day

      return { start, end };
    }

    case 'yesterday': {
      const start = new Date(easternTime);
      start.setDate(start.getDate() - 1);
      start.setHours(1, 0, 0, 0);

      const end = new Date(start);
      end.setDate(end.getDate() + 1);
      end.setHours(0, 59, 59, 999);

      return { start, end };
    }

    case 'thisWeek': {
      const start = new Date(easternTime);
      start.setDate(easternTime.getDate() - easternTime.getDay()); // Sunday
      start.setHours(1, 0, 0, 0);

      const end = new Date(easternTime);
      end.setDate(end.getDate() + 1);
      end.setHours(0, 59, 59, 999);

      return { start, end };
    }

    case 'lastWeek': {
      const start = new Date(easternTime);
      start.setDate(easternTime.getDate() - easternTime.getDay() - 7); // Previous Sunday
      start.setHours(1, 0, 0, 0);

      const end = new Date(start);
      end.setDate(end.getDate() + 7);
      end.setHours(0, 59, 59, 999);

      return { start, end };
    }

    case 'thisMonth': {
      const start = new Date(easternTime.getFullYear(), easternTime.getMonth(), 1, 1, 0, 0, 0);
      const end = new Date(easternTime);
      end.setDate(end.getDate() + 1);
      end.setHours(0, 59, 59, 999);

      return { start, end };
    }

    case 'lastMonth': {
      const start = new Date(easternTime.getFullYear(), easternTime.getMonth() - 1, 1, 1, 0, 0, 0);
      const end = new Date(easternTime.getFullYear(), easternTime.getMonth(), 1, 0, 59, 59, 999);

      return { start, end };
    }

    case 'last7days': {
      const end = new Date(easternTime);
      end.setHours(0, 59, 59, 999);

      const start = new Date(end);
      start.setDate(start.getDate() - 7);
      start.setHours(1, 0, 0, 0);

      return { start, end };
    }

    case 'last30days': {
      const end = new Date(easternTime);
      end.setHours(0, 59, 59, 999);

      const start = new Date(end);
      start.setDate(start.getDate() - 30);
      start.setHours(1, 0, 0, 0);

      return { start, end };
    }

    case 'last90days': {
      const end = new Date(easternTime);
      end.setHours(0, 59, 59, 999);

      const start = new Date(end);
      start.setDate(start.getDate() - 90);
      start.setHours(1, 0, 0, 0);

      return { start, end };
    }

    case 'previous7days': {
      const end = new Date(easternTime);
      end.setDate(end.getDate() - 1);
      end.setHours(0, 59, 59, 999);

      const start = new Date(end);
      start.setDate(start.getDate() - 6);
      start.setHours(1, 0, 0, 0);

      return { start, end };
    }

    case 'previous30days': {
      const end = new Date(easternTime);
      end.setDate(end.getDate() - 1);
      end.setHours(0, 59, 59, 999);

      const start = new Date(end);
      start.setDate(start.getDate() - 29);
      start.setHours(1, 0, 0, 0);

      return { start, end };
    }

    case 'previous90days': {
      const end = new Date(easternTime);
      end.setDate(end.getDate() - 1);
      end.setHours(0, 59, 59, 999);

      const start = new Date(end);
      start.setDate(start.getDate() - 89);
      start.setHours(1, 0, 0, 0);

      return { start, end };
    }

    case 'twoDaysAgo': {
      const start = new Date(easternTime);
      start.setDate(start.getDate() - 2);
      start.setHours(1, 0, 0, 0);

      const end = new Date(start);
      end.setDate(end.getDate() + 1);
      end.setHours(0, 59, 59, 999);

      return { start, end };
    }

    default:
      throw new Error(`Unknown time range: ${timeRange}`);
  }
};

const getTimeRangeConditions = (timeRange, startDate, endDate) => {
  if (timeRange === 'custom' && startDate && endDate) {
    // Custom date range
    const start = new Date(startDate);
    const end = new Date(endDate);

    // Convert to UTC-5 (Eastern time)
    const startUTC5 = new Date(start.getTime() - (5 * 60 * 60 * 1000));
    const endUTC5 = new Date(end.getTime() - (5 * 60 * 60 * 1000));

    return {
      whereClause: 'date_placed >= ? AND date_placed <= ?',
      params: [
        startUTC5.toISOString().slice(0, 19).replace('T', ' '),
        endUTC5.toISOString().slice(0, 19).replace('T', ' ')
      ],
      dateRange: {
        start: startDate,
        end: endDate,
        label: `${formatBusinessDate(start)} - ${formatBusinessDate(end)}`
      }
    };
  }

  if (!timeRange) {
    timeRange = 'today';
  }

  const { start, end } = getBusinessDayBounds(timeRange);

  // Convert to MySQL datetime format (UTC-5)
  const startStr = start.toISOString().slice(0, 19).replace('T', ' ');
  const endStr = end.toISOString().slice(0, 19).replace('T', ' ');

  return {
    whereClause: 'date_placed >= ? AND date_placed <= ?',
    params: [startStr, endStr],
    dateRange: {
      start: start.toISOString(),
      end: end.toISOString(),
      label: getTimeRangeLabel(timeRange)
    }
  };
};
|
||||||
|
|
||||||
|
const formatBusinessDate = (date) => {
|
||||||
|
return date.toLocaleDateString('en-US', {
|
||||||
|
month: 'short',
|
||||||
|
day: 'numeric',
|
||||||
|
year: 'numeric'
|
||||||
|
});
|
||||||
|
};
|
||||||
|
|
||||||
|
const getTimeRangeLabel = (timeRange) => {
|
||||||
|
const labels = {
|
||||||
|
today: 'Today',
|
||||||
|
yesterday: 'Yesterday',
|
||||||
|
thisWeek: 'This Week',
|
||||||
|
lastWeek: 'Last Week',
|
||||||
|
thisMonth: 'This Month',
|
||||||
|
lastMonth: 'Last Month',
|
||||||
|
last7days: 'Last 7 Days',
|
||||||
|
last30days: 'Last 30 Days',
|
||||||
|
last90days: 'Last 90 Days',
|
||||||
|
previous7days: 'Previous 7 Days',
|
||||||
|
previous30days: 'Previous 30 Days',
|
||||||
|
previous90days: 'Previous 90 Days',
|
||||||
|
twoDaysAgo: 'Two Days Ago'
|
||||||
|
};
|
||||||
|
|
||||||
|
return labels[timeRange] || timeRange;
|
||||||
|
};
|
||||||
|
|
||||||
|
// Helper to convert MySQL datetime to JavaScript Date
|
||||||
|
const parseBusinessDate = (mysqlDatetime) => {
|
||||||
|
if (!mysqlDatetime || mysqlDatetime === '0000-00-00 00:00:00') {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
// MySQL datetime is stored in UTC-5, so we need to add 5 hours to get UTC
|
||||||
|
const date = new Date(mysqlDatetime + ' UTC');
|
||||||
|
date.setHours(date.getHours() + 5);
|
||||||
|
return date;
|
||||||
|
};
|
||||||
|
|
||||||
|
// Helper to format date for MySQL queries
|
||||||
|
const formatMySQLDate = (date) => {
|
||||||
|
if (!date) return null;
|
||||||
|
|
||||||
|
// Convert to UTC-5 for storage
|
||||||
|
const utc5Date = new Date(date.getTime() - (5 * 60 * 60 * 1000));
|
||||||
|
return utc5Date.toISOString().slice(0, 19).replace('T', ' ');
|
||||||
|
};
|
||||||
|
|
||||||
|
module.exports = {
|
||||||
|
getBusinessDayBounds,
|
||||||
|
getTimeRangeConditions,
|
||||||
|
formatBusinessDate,
|
||||||
|
getTimeRangeLabel,
|
||||||
|
parseBusinessDate,
|
||||||
|
formatMySQLDate
|
||||||
|
};
|
||||||
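The UTC-5 conversion used throughout these helpers can be sketched in isolation. The `formatMySQLDate` below mirrors the helper in this file; the sample instant is hypothetical:

```javascript
// Minimal sketch of the UTC-5 conversion used by getTimeRangeConditions:
// shift the instant back 5 hours, then serialize in MySQL DATETIME format.
const formatMySQLDate = (date) => {
  if (!date) return null;
  const utc5Date = new Date(date.getTime() - (5 * 60 * 60 * 1000));
  return utc5Date.toISOString().slice(0, 19).replace('T', ' ');
};

// A hypothetical instant: 2024-06-01T14:30:00Z.
const sample = new Date(Date.UTC(2024, 5, 1, 14, 30, 0));
console.log(formatMySQLDate(sample)); // "2024-06-01 09:30:00"
```

Note this applies a fixed 5-hour offset, so it tracks EST year-round rather than following daylight-saving transitions.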
6  dashboard/package-lock.json  generated
@@ -4058,9 +4058,9 @@
       }
     },
     "node_modules/caniuse-lite": {
-      "version": "1.0.30001686",
-      "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001686.tgz",
-      "integrity": "sha512-Y7deg0Aergpa24M3qLC5xjNklnKnhsmSyR/V89dLZ1n0ucJIFNs7PgR2Yfa/Zf6W79SbBicgtGxZr2juHkEUIA==",
+      "version": "1.0.30001720",
+      "resolved": "https://registry.npmjs.org/caniuse-lite/-/caniuse-lite-1.0.30001720.tgz",
+      "integrity": "sha512-Ec/2yV2nNPwb4DnTANEV99ZWwm3ZWfdlfkQbWSDDt+PsXEVYwlhPH8tdMaPunYTKKmz7AnHi2oNEi1GcmKCD8g==",
       "dev": true,
       "funding": [
         {
@@ -161,7 +161,6 @@ const DashboardLayout = () => {
       <Navigation />
-
       <div className="p-4 space-y-4">

         <div className="grid grid-cols-1 xl:grid-cols-6 gap-4">
           <div className="xl:col-span-4 col-span-6">
             <div className="space-y-4 h-full w-full">
133  dashboard/src/components/dashboard/AcotTest.jsx  Normal file

@@ -0,0 +1,133 @@
import React, { useState, useEffect } from "react";
import axios from "axios";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Button } from "@/components/ui/button";
import { Alert, AlertDescription, AlertTitle } from "@/components/ui/alert";
import { Loader2, AlertCircle, CheckCircle, RefreshCw } from "lucide-react";

const AcotTest = () => {
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState(null);
  const [data, setData] = useState(null);
  const [connectionStatus, setConnectionStatus] = useState(null);

  const testConnection = async () => {
    setLoading(true);
    setError(null);
    try {
      const response = await axios.get("/api/acot/test/test-connection");
      setConnectionStatus(response.data);
    } catch (err) {
      setError(err.response?.data?.error || err.message);
      setConnectionStatus(null);
    } finally {
      setLoading(false);
    }
  };

  const fetchOrderCount = async () => {
    setLoading(true);
    setError(null);
    try {
      const response = await axios.get("/api/acot/test/order-count");
      setData(response.data.data);
    } catch (err) {
      setError(err.response?.data?.error || err.message);
      setData(null);
    } finally {
      setLoading(false);
    }
  };

  useEffect(() => {
    testConnection();
  }, []);

  return (
    <Card className="w-full max-w-md">
      <CardHeader>
        <CardTitle className="flex items-center justify-between">
          ACOT Server Test
          <Button
            size="icon"
            variant="outline"
            onClick={() => {
              testConnection();
              if (connectionStatus?.success) {
                fetchOrderCount();
              }
            }}
            disabled={loading}
          >
            {loading ? (
              <Loader2 className="h-4 w-4 animate-spin" />
            ) : (
              <RefreshCw className="h-4 w-4" />
            )}
          </Button>
        </CardTitle>
      </CardHeader>
      <CardContent className="space-y-4">
        {/* Connection Status */}
        <div className="space-y-2">
          <h3 className="text-sm font-medium">Connection Status</h3>
          {connectionStatus?.success ? (
            <Alert className="bg-green-50 border-green-200">
              <CheckCircle className="h-4 w-4 text-green-600" />
              <AlertTitle className="text-green-800">Connected</AlertTitle>
              <AlertDescription className="text-green-700">
                {connectionStatus.message}
              </AlertDescription>
            </Alert>
          ) : error ? (
            <Alert variant="destructive">
              <AlertCircle className="h-4 w-4" />
              <AlertTitle>Connection Failed</AlertTitle>
              <AlertDescription>{error}</AlertDescription>
            </Alert>
          ) : (
            <div className="text-sm text-muted-foreground">
              Testing connection...
            </div>
          )}
        </div>

        {/* Order Count */}
        {connectionStatus?.success && (
          <div className="space-y-2">
            <Button
              onClick={fetchOrderCount}
              disabled={loading}
              className="w-full"
            >
              {loading ? (
                <>
                  <Loader2 className="mr-2 h-4 w-4 animate-spin" />
                  Loading...
                </>
              ) : (
                "Fetch Order Count"
              )}
            </Button>

            {data && (
              <div className="p-4 bg-muted rounded-lg">
                <div className="text-sm text-muted-foreground">
                  Total Orders in Database
                </div>
                <div className="text-2xl font-bold">
                  {data.orderCount?.toLocaleString()}
                </div>
                <div className="text-xs text-muted-foreground mt-1">
                  Last updated: {new Date(data.timestamp).toLocaleTimeString()}
                </div>
              </div>
            )}
          </div>
        )}
      </CardContent>
    </Card>
  );
};

export default AcotTest;
@@ -28,7 +28,6 @@ import {
   Zap,
   Timer,
   BarChart3,
-  Bot,
   ClipboardCheck,
 } from "lucide-react";
 import axios from "axios";

@@ -214,7 +213,7 @@ const GorgiasOverview = () => {
     const filters = getDateRange(timeRange);

     try {
-      const [overview, channelStats, agentStats, satisfaction, selfService] =
+      const [overview, channelStats, agentStats, satisfaction] =
         await Promise.all([
           axios.post('/api/gorgias/stats/overview', filters)
             .then(res => res.data?.data?.data?.data || []),

@@ -224,8 +223,6 @@ const GorgiasOverview = () => {
             .then(res => res.data?.data?.data?.data?.lines || []),
           axios.post('/api/gorgias/stats/satisfaction-surveys', filters)
             .then(res => res.data?.data?.data?.data || []),
-          axios.post('/api/gorgias/stats/self-service-overview', filters)
-            .then(res => res.data?.data?.data?.data || []),
         ]);

       console.log('Raw API responses:', {

@@ -233,7 +230,6 @@ const GorgiasOverview = () => {
         channelStats,
         agentStats,
         satisfaction,
-        selfService
       });

       setData({

@@ -241,7 +237,6 @@ const GorgiasOverview = () => {
         channels: channelStats,
         agents: agentStats,
         satisfaction,
-        selfService,
       });

       setError(null);

@@ -292,19 +287,6 @@ const GorgiasOverview = () => {

     console.log('Processed satisfaction stats:', satisfactionStats);

-    // Process self-service data
-    const selfServiceStats = (data.selfService || []).reduce((acc, item) => {
-      acc[item.name] = {
-        value: item.value || 0,
-        delta: item.delta || 0,
-        type: item.type,
-        more_is_better: item.more_is_better
-      };
-      return acc;
-    }, {});
-
-    console.log('Processed self-service stats:', selfServiceStats);
-
     // Process channel data
     const channels = data.channels?.map(line => ({
       name: line[0]?.value || '',

@@ -377,7 +359,7 @@ const GorgiasOverview = () => {
       <div className="grid grid-cols-2 lg:grid-cols-4 gap-4">
         {/* Message & Response Metrics */}
         {loading ? (
-          [...Array(8)].map((_, i) => (
+          [...Array(7)].map((_, i) => (
             <SkeletonMetricCard key={i} />
           ))
         ) : (

@@ -457,17 +439,6 @@ const GorgiasOverview = () => {
             loading={loading}
           />
         </div>
-        <div className="h-full">
-          <MetricCard
-            title="Self-Service Rate"
-            value={selfServiceStats.self_service_automation_rate?.value}
-            delta={selfServiceStats.self_service_automation_rate?.delta}
-            suffix="%"
-            icon={Bot}
-            colorClass="cyan"
-            loading={loading}
-          />
-        </div>
       </>
     )}
   </div>
@@ -1,5 +1,6 @@
 import React, { useState, useEffect, useCallback, memo } from "react";
 import axios from "axios";
+import { acotService } from "@/services/acotService";
 import {
   Card,
   CardContent,

@@ -175,10 +176,8 @@ const MiniSalesChart = ({ className = "" }) => {

     try {
       setProjectionLoading(true);
-      const response = await axios.get("/api/klaviyo/events/projection", {
-        params: { timeRange: "last30days" }
-      });
-      setProjection(response.data);
+      const response = await acotService.getProjection({ timeRange: "last30days" });
+      setProjection(response);
     } catch (error) {
       console.error("Error loading projection:", error);
     } finally {

@@ -191,21 +190,19 @@ const MiniSalesChart = ({ className = "" }) => {
       setLoading(true);
       setError(null);

-      const response = await axios.get("/api/klaviyo/events/stats/details", {
-        params: {
-          timeRange: "last30days",
-          metric: "revenue",
-          daily: true,
-        },
+      const response = await acotService.getStatsDetails({
+        timeRange: "last30days",
+        metric: "revenue",
+        daily: true,
       });

-      if (!response.data) {
+      if (!response.stats) {
         throw new Error("Invalid response format");
       }

-      const stats = Array.isArray(response.data)
-        ? response.data
-        : response.data.stats || [];
+      const stats = Array.isArray(response.stats)
+        ? response.stats
+        : [];

       const processedData = processData(stats);
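The diffs in this comparison repeatedly swap raw `axios` calls against `/api/klaviyo/events/*` for methods on an `acotService` module whose source is not part of this comparison. A minimal sketch of the shape those call sites imply — assuming axios-style semantics underneath, with endpoint paths inferred from the README and method names from the call sites, so every detail here is an assumption — might look like:

```javascript
// Hypothetical sketch of the acotService wrapper implied by the call sites.
// Each method queries the ACOT server and unwraps response.data, which is why
// callers read response.stats instead of response.data.stats. The httpClient
// is injected so the sketch can run without a live server.
const createAcotService = (httpClient) => ({
  getStats: (params) =>
    httpClient.get("/api/acot/events/stats", { params }).then((res) => res.data),
  getStatsDetails: (params) =>
    httpClient.get("/api/acot/events/stats/details", { params }).then((res) => res.data),
  getProjection: (params) =>
    httpClient.get("/api/acot/events/projection", { params }).then((res) => res.data),
  getProducts: (params) =>
    httpClient.get("/api/acot/events/products", { params }).then((res) => res.data),
});

// Usage with a stubbed client, mirroring how the components consume it:
const fakeClient = {
  get: async (url, config) => ({ data: { stats: { revenue: 1234 }, url, config } }),
};
createAcotService(fakeClient)
  .getStats({ timeRange: "today" })
  .then((response) => console.log(response.stats.revenue)); // 1234
```

Unwrapping `response.data` inside the service is what lets every component drop one level of nesting in these hunks.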
@@ -1,5 +1,6 @@
 import React, { useState, useEffect, useCallback, memo } from "react";
 import axios from "axios";
+import { acotService } from "@/services/acotService";
 import {
   Card,
   CardContent,

@@ -307,13 +308,11 @@ const MiniStatCards = ({

       const params =
         timeRange === "custom" ? { startDate, endDate } : { timeRange };
-      const response = await axios.get("/api/klaviyo/events/stats", {
-        params,
-      });
+      const response = await acotService.getStats(params);

       if (!isMounted) return;

-      setStats(response.data.stats);
+      setStats(response.stats);
       setLastUpdate(DateTime.now().setZone("America/New_York"));
       setError(null);
     } catch (error) {

@@ -345,12 +344,10 @@ const MiniStatCards = ({
       setProjectionLoading(true);
       const params =
         timeRange === "custom" ? { startDate, endDate } : { timeRange };
-      const response = await axios.get("/api/klaviyo/events/projection", {
-        params,
-      });
+      const response = await acotService.getProjection(params);

       if (!isMounted) return;
-      setProjection(response.data);
+      setProjection(response);
     } catch (error) {
       console.error("Error loading projection:", error);
     } finally {

@@ -373,16 +370,12 @@ const MiniStatCards = ({
     const interval = setInterval(async () => {
       try {
         const [statsResponse, projectionResponse] = await Promise.all([
-          axios.get("/api/klaviyo/events/stats", {
-            params: { timeRange: "today" },
-          }),
-          axios.get("/api/klaviyo/events/projection", {
-            params: { timeRange: "today" },
-          }),
+          acotService.getStats({ timeRange: "today" }),
+          acotService.getProjection({ timeRange: "today" }),
         ]);

-        setStats(statsResponse.data.stats);
-        setProjection(projectionResponse.data);
+        setStats(statsResponse.stats);
+        setProjection(projectionResponse);
         setLastUpdate(DateTime.now().setZone("America/New_York"));
       } catch (error) {
         console.error("Error auto-refreshing stats:", error);

@@ -399,15 +392,13 @@ const MiniStatCards = ({

     setDetailDataLoading((prev) => ({ ...prev, [metric]: true }));
     try {
-      const response = await axios.get("/api/klaviyo/events/stats/details", {
-        params: {
-          timeRange: "last30days",
-          metric,
-          daily: true,
-        },
+      const response = await acotService.getStatsDetails({
+        timeRange: "last30days",
+        metric,
+        daily: true,
       });

-      setDetailData((prev) => ({ ...prev, [metric]: response.data.stats }));
+      setDetailData((prev) => ({ ...prev, [metric]: response.stats }));
     } catch (error) {
       console.error(`Error fetching detail data for ${metric}:`, error);
     } finally {

@@ -424,13 +415,23 @@ const MiniStatCards = ({
     }
   }, [selectedMetric, fetchDetailData]);

-  // Add preload effect
+  // Add preload effect with throttling
   useEffect(() => {
-    // Preload all detail data when component mounts
-    const metrics = ["revenue", "orders", "average_order", "shipping"];
-    metrics.forEach((metric) => {
-      fetchDetailData(metric);
-    });
+    // Preload detail data with throttling to avoid overwhelming the server
+    const preloadData = async () => {
+      const metrics = ["revenue", "orders", "average_order", "shipping"];
+      for (const metric of metrics) {
+        try {
+          await fetchDetailData(metric);
+          // Small delay between requests
+          await new Promise(resolve => setTimeout(resolve, 25));
+        } catch (error) {
+          console.error(`Error preloading ${metric}:`, error);
+        }
+      }
+    };
+
+    preloadData();
   }, []); // eslint-disable-line react-hooks/exhaustive-deps

   if (loading && !stats) {
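The preload rewrites in this comparison all follow the same pattern: serialize or batch the requests with a short pause so the database connection pool is not hit with everything at once. Extracted as a standalone helper — the names here are illustrative, not from the codebase — the batched variant can be sketched as:

```javascript
// Illustrative throttled-batch helper: run async tasks in groups of
// `batchSize`, pausing `delayMs` between groups. This mirrors the
// batched preload rewrite, generalized over any worker function.
const runInBatches = async (items, worker, batchSize = 3, delayMs = 50) => {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Each batch runs concurrently; batches run sequentially.
    results.push(...(await Promise.all(batch.map(worker))));
    if (i + batchSize < items.length) {
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  return results;
};

// Usage: "fetch" nine metrics three at a time.
const metrics = ["revenue", "orders", "average_order", "shipping",
  "brands_categories", "refunds", "cancellations", "pre_orders", "on_hold"];
runInBatches(metrics, async (m) => `${m}:ok`, 3, 10)
  .then((results) => console.log(results.length)); // 9
```

Compared with firing all requests via one `Promise.all`, this caps concurrent MySQL connections at `batchSize` while keeping results in input order.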
@@ -1,5 +1,6 @@
 import React, { useState, useEffect } from "react";
 import axios from "axios";
+import { acotService } from "@/services/acotService";
 import { Card, CardContent, CardHeader, CardTitle, CardDescription } from "@/components/ui/card";
 import { ScrollArea } from "@/components/ui/scroll-area";
 import { Loader2, ArrowUpDown, AlertCircle, Package, Settings2, Search, X } from "lucide-react";

@@ -57,10 +58,8 @@ const ProductGrid = ({
       setLoading(true);
       setError(null);

-      const response = await axios.get("/api/klaviyo/events/products", {
-        params: { timeRange: selectedTimeRange },
-      });
-      setProducts(response.data.stats.products.list || []);
+      const response = await acotService.getProducts({ timeRange: selectedTimeRange });
+      setProducts(response.stats.products.list || []);
     } catch (error) {
       console.error("Error fetching products:", error);
       setError(error.message);
@@ -1,5 +1,6 @@
 import React, { useState, useEffect, useMemo, useCallback, memo } from "react";
 import axios from "axios";
+import { acotService } from "@/services/acotService";
 import {
   Card,
   CardContent,

@@ -550,10 +551,8 @@ const SalesChart = ({ timeRange = "last30days", title = "Sales Overview" }) => {

     try {
       setProjectionLoading(true);
-      const response = await axios.get("/api/klaviyo/events/projection", {
-        params,
-      });
-      setProjection(response.data);
+      const response = await acotService.getProjection(params);
+      setProjection(response);
     } catch (error) {
       console.error("Error loading projection:", error);
     } finally {

@@ -568,22 +567,20 @@ const SalesChart = ({ timeRange = "last30days", title = "Sales Overview" }) => {
       setError(null);

       // Fetch data
-      const response = await axios.get("/api/klaviyo/events/stats/details", {
-        params: {
-          ...params,
-          metric: "revenue",
-          daily: true,
-        },
+      const response = await acotService.getStatsDetails({
+        ...params,
+        metric: "revenue",
+        daily: true,
       });

-      if (!response.data) {
+      if (!response.stats) {
         throw new Error("Invalid response format");
       }

       // Process the data
-      const currentStats = Array.isArray(response.data)
-        ? response.data
-        : response.data.stats || [];
+      const currentStats = Array.isArray(response.stats)
+        ? response.stats
+        : [];

       // Process the data directly without remapping
       const processedData = processData(currentStats);

@@ -614,20 +611,15 @@ const SalesChart = ({ timeRange = "last30days", title = "Sales Overview" }) => {
     [fetchData]
   );

-  // Initial load effect
-  useEffect(() => {
-    fetchData({ timeRange: selectedTimeRange });
-  }, [selectedTimeRange, fetchData]);
-
-  // Auto-refresh effect for 'today' view
+  // Initial load and auto-refresh effect
   useEffect(() => {
     let intervalId = null;

-    if (selectedTimeRange === "today") {
-      // Initial fetch
-      fetchData({ timeRange: "today" });
+    // Initial fetch
+    fetchData({ timeRange: selectedTimeRange });

-      // Set up interval
+    // Set up auto-refresh only for 'today' view
+    if (selectedTimeRange === "today") {
       intervalId = setInterval(() => {
         fetchData({ timeRange: "today" });
       }, 60000);
@@ -1,5 +1,6 @@
 import React, { useState, useEffect, useCallback, Suspense, memo } from "react";
 import axios from "axios";
+import { acotService } from "@/services/acotService";
 import {
   Card,
   CardContent,

@@ -1415,10 +1416,8 @@ const StatCards = ({

       // For metrics that need the full stats
       if (["shipping", "brands_categories"].includes(metric)) {
-        const response = await axios.get("/api/klaviyo/events/stats", {
-          params,
-        });
-        const data = [response.data.stats];
+        const response = await acotService.getStats(params);
+        const data = [response.stats];
         setCacheData(detailTimeRange, metric, data);
         setDetailData((prev) => ({ ...prev, [metric]: data }));
         setError(null);

@@ -1427,16 +1426,11 @@ const StatCards = ({

       // For order types (pre_orders, local_pickup, on_hold)
       if (["pre_orders", "local_pickup", "on_hold"].includes(metric)) {
-        const response = await axios.get(
-          "/api/klaviyo/events/stats/details",
-          {
-            params: {
-              ...params,
-              orderType: orderType,
-            },
-          }
-        );
-        const data = response.data.stats;
+        const response = await acotService.getStatsDetails({
+          ...params,
+          orderType: orderType,
+        });
+        const data = response.stats;
         setCacheData(detailTimeRange, metric, data);
         setDetailData((prev) => ({ ...prev, [metric]: data }));
         setError(null);

@@ -1445,17 +1439,12 @@ const StatCards = ({

       // For refunds and cancellations
       if (["refunds", "cancellations"].includes(metric)) {
-        const response = await axios.get(
-          "/api/klaviyo/events/stats/details",
-          {
-            params: {
-              ...params,
-              eventType:
-                metric === "refunds" ? "PAYMENT_REFUNDED" : "CANCELED_ORDER",
-            },
-          }
-        );
-        const data = response.data.stats;
+        const response = await acotService.getStatsDetails({
+          ...params,
+          eventType:
+            metric === "refunds" ? "PAYMENT_REFUNDED" : "CANCELED_ORDER",
+        });
+        const data = response.stats;

         // Transform the data to match the expected format
         const transformedData = data.map((day) => ({

@@ -1487,16 +1476,11 @@ const StatCards = ({

       // For order range
       if (metric === "order_range") {
-        const response = await axios.get(
-          "/api/klaviyo/events/stats/details",
-          {
-            params: {
-              ...params,
-              eventType: "PLACED_ORDER",
-            },
-          }
-        );
-        const data = response.data.stats;
+        const response = await acotService.getStatsDetails({
+          ...params,
+          eventType: "PLACED_ORDER",
+        });
+        const data = response.stats;
         console.log("Fetched order range data:", data);
         setCacheData(detailTimeRange, metric, data);
         setDetailData((prev) => ({ ...prev, [metric]: data }));

@@ -1505,10 +1489,8 @@ const StatCards = ({
       }

       // For all other metrics
-      const response = await axios.get("/api/klaviyo/events/stats/details", {
-        params,
-      });
-      const data = response.data.stats;
+      const response = await acotService.getStatsDetails(params);
+      const data = response.stats;
       setCacheData(detailTimeRange, metric, data);
       setDetailData((prev) => ({ ...prev, [metric]: data }));
       setError(null);

@@ -1531,8 +1513,8 @@ const StatCards = ({
     ]
   );

-  // Corrected preloadDetailData function
-  const preloadDetailData = useCallback(() => {
+  // Throttled preloadDetailData function to avoid overwhelming the server
+  const preloadDetailData = useCallback(async () => {
     const metrics = [
       "revenue",
       "orders",

@@ -1545,11 +1527,22 @@ const StatCards = ({
       "on_hold",
     ];

-    return Promise.all(
-      metrics.map((metric) => fetchDetailData(metric, metric))
-    ).catch((error) => {
-      console.error("Error during detail data preload:", error);
-    });
+    // Process metrics in batches of 3 to avoid overwhelming the connection pool
+    const batchSize = 3;
+    for (let i = 0; i < metrics.length; i += batchSize) {
+      const batch = metrics.slice(i, i + batchSize);
+      try {
+        await Promise.all(
+          batch.map((metric) => fetchDetailData(metric, metric))
+        );
+        // Small delay between batches to prevent overwhelming the server
+        if (i + batchSize < metrics.length) {
+          await new Promise(resolve => setTimeout(resolve, 50));
+        }
+      } catch (error) {
+        console.error(`Error during detail data preload batch ${i / batchSize + 1}:`, error);
+      }
+    }
   }, [fetchDetailData]);

   // Move trend calculation functions inside the component

@@ -1630,14 +1623,12 @@ const StatCards = ({
       const params =
         timeRange === "custom" ? { startDate, endDate } : { timeRange };

-      const response = await axios.get("/api/klaviyo/events/stats", {
-        params,
-      });
+      const response = await acotService.getStats(params);

       if (!isMounted) return;

-      setDateRange(response.data.timeRange);
-      setStats(response.data.stats);
+      setDateRange(response.timeRange);
+      setStats(response.stats);
       setLastUpdate(DateTime.now().setZone("America/New_York"));
       setError(null);

@@ -1674,12 +1665,10 @@ const StatCards = ({
       const params =
         timeRange === "custom" ? { startDate, endDate } : { timeRange };

-      const response = await axios.get("/api/klaviyo/events/projection", {
-        params,
-      });
+      const response = await acotService.getProjection(params);

       if (!isMounted) return;
-      setProjection(response.data);
+      setProjection(response);
     } catch (error) {
       console.error("Error loading projection:", error);
     } finally {

@@ -1702,16 +1691,12 @@ const StatCards = ({
     const interval = setInterval(async () => {
       try {
         const [statsResponse, projectionResponse] = await Promise.all([
-          axios.get("/api/klaviyo/events/stats", {
|
acotService.getStats({ timeRange: "today" }),
|
||||||
params: { timeRange: "today" },
|
acotService.getProjection({ timeRange: "today" }),
|
||||||
}),
|
|
||||||
axios.get("/api/klaviyo/events/projection", {
|
|
||||||
params: { timeRange: "today" },
|
|
||||||
}),
|
|
||||||
]);
|
]);
|
||||||
|
|
||||||
setStats(statsResponse.data.stats);
|
setStats(statsResponse.stats);
|
||||||
setProjection(projectionResponse.data);
|
setProjection(projectionResponse);
|
||||||
setLastUpdate(DateTime.now().setZone("America/New_York"));
|
setLastUpdate(DateTime.now().setZone("America/New_York"));
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error("Error auto-refreshing stats:", error);
|
console.error("Error auto-refreshing stats:", error);
|
||||||
|
|||||||
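The batched preload introduced in the hunk above can be distilled into a standalone sketch. `fetchDetail` below is an illustrative stand-in for the component's `fetchDetailData`; the batch size and inter-batch delay mirror the values in the diff. A counter tracks peak concurrency to show that no more than one batch of requests is ever in flight at once.

```javascript
// Run async tasks in groups of `batchSize`, pausing briefly between groups so
// a limited server connection pool is never asked for more than `batchSize`
// connections at a time. A failed batch is logged and the loop continues.
async function preloadInBatches(metrics, fetchDetail, batchSize = 3, delayMs = 50) {
  const results = [];
  for (let i = 0; i < metrics.length; i += batchSize) {
    const batch = metrics.slice(i, i + batchSize);
    try {
      results.push(...(await Promise.all(batch.map((m) => fetchDetail(m)))));
      // Small delay between batches, skipped after the final batch
      if (i + batchSize < metrics.length) {
        await new Promise((resolve) => setTimeout(resolve, delayMs));
      }
    } catch (error) {
      console.error(`Error in preload batch ${i / batchSize + 1}:`, error);
    }
  }
  return results;
}

// Demo: track how many stand-in fetches run concurrently
let active = 0;
let peak = 0;
const fakeFetch = async (metric) => {
  active += 1;
  peak = Math.max(peak, active);
  await new Promise((r) => setTimeout(r, 10));
  active -= 1;
  return metric;
};

const done = preloadInBatches(
  ["revenue", "orders", "aov", "shipping", "refunds", "cancellations", "on_hold"],
  fakeFetch
).then((results) => ({ results, peak }));
```

With seven metrics and a batch size of three, the helper issues batches of 3, 3, and 1, so peak concurrency never exceeds three.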
24 dashboard/src/pages/test-acot.jsx Normal file
@@ -0,0 +1,24 @@
+import React from 'react';
+import DashboardLayout from '@/components/DashboardLayout';
+import AcotTest from '@/components/dashboard/AcotTest';
+
+const TestAcotPage = () => {
+  return (
+    <DashboardLayout>
+      <div className="p-6">
+        <div className="mb-6">
+          <h1 className="text-2xl font-bold">ACOT Server Test</h1>
+          <p className="text-muted-foreground mt-1">
+            Test connection to production database through ACOT server
+          </p>
+        </div>
+
+        <div className="flex justify-center">
+          <AcotTest />
+        </div>
+      </div>
+    </DashboardLayout>
+  );
+};
+
+export default TestAcotPage;
176 dashboard/src/services/acotService.js Normal file
@@ -0,0 +1,176 @@
+import axios from 'axios';
+
+// Use the proxy in development, direct URL in production
+const ACOT_BASE_URL = process.env.NODE_ENV === 'development'
+  ? '' // Use proxy in development (which now points to production)
+  : (process.env.REACT_APP_ACOT_API_URL || 'https://dashboard.kent.pw');
+
+const acotApi = axios.create({
+  baseURL: ACOT_BASE_URL,
+  timeout: 30000,
+});
+
+// Request deduplication cache
+const requestCache = new Map();
+
+// Periodic cache cleanup (every 5 minutes)
+setInterval(() => {
+  const now = Date.now();
+  const maxAge = 5 * 60 * 1000; // 5 minutes
+
+  for (const [key, value] of requestCache.entries()) {
+    if (value.timestamp && now - value.timestamp > maxAge) {
+      requestCache.delete(key);
+    }
+  }
+
+  if (requestCache.size > 0) {
+    console.log(`[ACOT API] Cache cleanup: ${requestCache.size} entries remaining`);
+  }
+}, 5 * 60 * 1000);
+
+// Retry function for timeout errors
+const retryRequest = async (requestFn, maxRetries = 2, delay = 1000) => {
+  for (let attempt = 1; attempt <= maxRetries + 1; attempt++) {
+    try {
+      return await requestFn();
+    } catch (error) {
+      const isTimeout = error.code === 'ECONNABORTED' || error.message.includes('timeout');
+      const isLastAttempt = attempt === maxRetries + 1;
+
+      if (isTimeout && !isLastAttempt) {
+        console.log(`[ACOT API] Timeout on attempt ${attempt}, retrying in ${delay}ms...`);
+        await new Promise(resolve => setTimeout(resolve, delay));
+        delay *= 1.5; // Exponential backoff
+        continue;
+      }
+
+      throw error;
+    }
+  }
+};
+
+// Request deduplication function
+const deduplicatedRequest = async (cacheKey, requestFn, cacheDuration = 5000) => {
+  // Check if we have a pending request for this key
+  if (requestCache.has(cacheKey)) {
+    const cached = requestCache.get(cacheKey);
+
+    // If it's a pending promise, return it
+    if (cached.promise) {
+      console.log(`[ACOT API] Deduplicating request: ${cacheKey}`);
+      return cached.promise;
+    }
+
+    // If it's cached data and still fresh, return it
+    if (cached.data && Date.now() - cached.timestamp < cacheDuration) {
+      console.log(`[ACOT API] Using cached data: ${cacheKey}`);
+      return cached.data;
+    }
+  }
+
+  // Create new request
+  const promise = requestFn().then(data => {
+    // Cache the result
+    requestCache.set(cacheKey, {
+      data,
+      timestamp: Date.now(),
+      promise: null
+    });
+    return data;
+  }).catch(error => {
+    // Remove from cache on error
+    requestCache.delete(cacheKey);
+    throw error;
+  });
+
+  // Cache the promise while it's pending
+  requestCache.set(cacheKey, {
+    promise,
+    timestamp: Date.now(),
+    data: null
+  });
+
+  return promise;
+};
+
+// Add request interceptor for logging
+acotApi.interceptors.request.use(
+  (config) => {
+    console.log(`[ACOT API] ${config.method?.toUpperCase()} ${config.url}`, config.params);
+    return config;
+  },
+  (error) => {
+    console.error('[ACOT API] Request error:', error);
+    return Promise.reject(error);
+  }
+);
+
+// Add response interceptor for logging
+acotApi.interceptors.response.use(
+  (response) => {
+    console.log(`[ACOT API] Response ${response.status}:`, response.data);
+    return response;
+  },
+  (error) => {
+    console.error('[ACOT API] Response error:', error.response?.data || error.message);
+    return Promise.reject(error);
+  }
+);
+
+// Cleanup function to clear cache
+const clearCache = () => {
+  requestCache.clear();
+  console.log('[ACOT API] Request cache cleared');
+};
+
+export const acotService = {
+  // Get main stats - replaces klaviyo events/stats
+  getStats: async (params) => {
+    const cacheKey = `stats_${JSON.stringify(params)}`;
+    return deduplicatedRequest(cacheKey, () =>
+      retryRequest(async () => {
+        const response = await acotApi.get('/api/acot/events/stats', { params });
+        return response.data;
+      })
+    );
+  },
+
+  // Get detailed stats - replaces klaviyo events/stats/details
+  getStatsDetails: async (params) => {
+    const cacheKey = `details_${JSON.stringify(params)}`;
+    return deduplicatedRequest(cacheKey, () =>
+      retryRequest(async () => {
+        const response = await acotApi.get('/api/acot/events/stats/details', { params });
+        return response.data;
+      })
+    );
+  },
+
+  // Get products data - replaces klaviyo events/products
+  getProducts: async (params) => {
+    const cacheKey = `products_${JSON.stringify(params)}`;
+    return deduplicatedRequest(cacheKey, () =>
+      retryRequest(async () => {
+        const response = await acotApi.get('/api/acot/events/products', { params });
+        return response.data;
+      })
+    );
+  },
+
+  // Get projections - replaces klaviyo events/projection
+  getProjection: async (params) => {
+    const cacheKey = `projection_${JSON.stringify(params)}`;
+    return deduplicatedRequest(cacheKey, () =>
+      retryRequest(async () => {
+        const response = await acotApi.get('/api/acot/events/projection', { params });
+        return response.data;
+      })
+    );
+  },
+
+  // Utility functions
+  clearCache,
+};
+
+export default acotService;
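The deduplication logic in `acotService.js` can be reduced to a minimal sketch: while a request for a given key is in flight, concurrent callers share the same promise rather than issuing duplicate requests, and failed entries are evicted so the next caller retries. `slowFetch` below is an illustrative stand-in for the real axios call.

```javascript
// Shared cache: each entry holds either an in-flight promise or resolved data.
const cache = new Map();

function deduplicated(key, requestFn) {
  const cached = cache.get(key);
  if (cached && cached.promise) return cached.promise; // share the in-flight request
  const promise = requestFn()
    .then((data) => {
      cache.set(key, { data, timestamp: Date.now(), promise: null });
      return data;
    })
    .catch((err) => {
      cache.delete(key); // drop failed entries so the next call retries
      throw err;
    });
  cache.set(key, { promise, timestamp: Date.now(), data: null });
  return promise;
}

let calls = 0;
const slowFetch = () =>
  new Promise((resolve) => setTimeout(() => resolve({ call: ++calls }), 20));

// Three concurrent callers share one underlying request
const done = Promise.all([
  deduplicated("stats", slowFetch),
  deduplicated("stats", slowFetch),
  deduplicated("stats", slowFetch),
]).then((results) => ({ results, calls }));
```

All three callers resolve from the single `slowFetch` invocation; the full service additionally serves fresh data from the cache for a short window after resolution.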
@@ -31,6 +31,42 @@ export default defineConfig(({ mode }) => {
     host: "0.0.0.0",
     port: 3000,
     proxy: {
+      "/api/acot": {
+        target: "https://dashboard.kent.pw",
+        changeOrigin: true,
+        secure: true,
+        rewrite: (path) => path.replace(/^\/api\/acot/, "/api/acot"),
+        configure: (proxy, _options) => {
+          proxy.on("error", (err, req, res) => {
+            console.error("ACOT proxy error:", err);
+            res.writeHead(500, {
+              "Content-Type": "application/json",
+            });
+            res.end(
+              JSON.stringify({
+                error: "Proxy Error",
+                message: err.message,
+                details: err.stack
+              })
+            );
+          });
+          proxy.on("proxyReq", (proxyReq, req, _res) => {
+            console.log("Outgoing ACOT request:", {
+              method: req.method,
+              url: req.url,
+              path: proxyReq.path,
+              headers: proxyReq.getHeaders(),
+            });
+          });
+          proxy.on("proxyRes", (proxyRes, req, _res) => {
+            console.log("ACOT proxy response:", {
+              statusCode: proxyRes.statusCode,
+              url: req.url,
+              headers: proxyRes.headers,
+            });
+          });
+        },
+      },
       "/api/klaviyo": {
         target: "https://dashboard.kent.pw",
         changeOrigin: true,
239 examples DO NOT USE OR EDIT/EXAMPLE ONLY dbConnection.js Normal file
@@ -0,0 +1,239 @@
+const { Client } = require('ssh2');
+const mysql = require('mysql2/promise');
+const fs = require('fs');
+
+// Connection pooling and cache configuration
+const connectionCache = {
+  ssh: null,
+  dbConnection: null,
+  lastUsed: 0,
+  isConnecting: false,
+  connectionPromise: null,
+  // Cache expiration time in milliseconds (5 minutes)
+  expirationTime: 5 * 60 * 1000,
+  // Cache for query results (key: query string, value: {data, timestamp})
+  queryCache: new Map(),
+  // Cache duration for different query types in milliseconds
+  cacheDuration: {
+    'field-options': 30 * 60 * 1000, // 30 minutes for field options
+    'product-lines': 10 * 60 * 1000, // 10 minutes for product lines
+    'sublines': 10 * 60 * 1000, // 10 minutes for sublines
+    'taxonomy': 30 * 60 * 1000, // 30 minutes for taxonomy data
+    'default': 60 * 1000 // 1 minute default
+  }
+};
+
+/**
+ * Get a database connection with connection pooling
+ * @returns {Promise<{ssh: object, connection: object}>} The SSH and database connection
+ */
+async function getDbConnection() {
+  const now = Date.now();
+
+  // Check if we need to refresh the connection due to inactivity
+  const needsRefresh = !connectionCache.ssh ||
+    !connectionCache.dbConnection ||
+    (now - connectionCache.lastUsed > connectionCache.expirationTime);
+
+  // If connection is still valid, update last used time and return existing connection
+  if (!needsRefresh) {
+    connectionCache.lastUsed = now;
+    return {
+      ssh: connectionCache.ssh,
+      connection: connectionCache.dbConnection
+    };
+  }
+
+  // If another request is already establishing a connection, wait for that promise
+  if (connectionCache.isConnecting && connectionCache.connectionPromise) {
+    try {
+      await connectionCache.connectionPromise;
+      return {
+        ssh: connectionCache.ssh,
+        connection: connectionCache.dbConnection
+      };
+    } catch (error) {
+      // If that connection attempt failed, we'll try again below
+      console.error('Error waiting for existing connection:', error);
+    }
+  }
+
+  // Close existing connections if they exist
+  if (connectionCache.dbConnection) {
+    try {
+      await connectionCache.dbConnection.end();
+    } catch (error) {
+      console.error('Error closing existing database connection:', error);
+    }
+  }
+
+  if (connectionCache.ssh) {
+    try {
+      connectionCache.ssh.end();
+    } catch (error) {
+      console.error('Error closing existing SSH connection:', error);
+    }
+  }
+
+  // Mark that we're establishing a new connection
+  connectionCache.isConnecting = true;
+
+  // Create a new promise for this connection attempt
+  connectionCache.connectionPromise = setupSshTunnel().then(tunnel => {
+    const { ssh, stream, dbConfig } = tunnel;
+
+    return mysql.createConnection({
+      ...dbConfig,
+      stream
+    }).then(connection => {
+      // Store the new connections
+      connectionCache.ssh = ssh;
+      connectionCache.dbConnection = connection;
+      connectionCache.lastUsed = Date.now();
+      connectionCache.isConnecting = false;
+
+      return {
+        ssh,
+        connection
+      };
+    });
+  }).catch(error => {
+    connectionCache.isConnecting = false;
+    throw error;
+  });
+
+  // Wait for the connection to be established
+  return connectionCache.connectionPromise;
+}
+
+/**
+ * Get cached query results or execute query if not cached
+ * @param {string} cacheKey - Unique key to identify the query
+ * @param {string} queryType - Type of query (field-options, product-lines, etc.)
+ * @param {Function} queryFn - Function to execute if cache miss
+ * @returns {Promise<any>} The query result
+ */
+async function getCachedQuery(cacheKey, queryType, queryFn) {
+  // Get cache duration based on query type
+  const cacheDuration = connectionCache.cacheDuration[queryType] || connectionCache.cacheDuration.default;
+
+  // Check if we have a valid cached result
+  const cachedResult = connectionCache.queryCache.get(cacheKey);
+  const now = Date.now();
+
+  if (cachedResult && (now - cachedResult.timestamp < cacheDuration)) {
+    console.log(`Cache hit for ${queryType} query: ${cacheKey}`);
+    return cachedResult.data;
+  }
+
+  // No valid cache found, execute the query
+  console.log(`Cache miss for ${queryType} query: ${cacheKey}`);
+  const result = await queryFn();
+
+  // Cache the result
+  connectionCache.queryCache.set(cacheKey, {
+    data: result,
+    timestamp: now
+  });
+
+  return result;
+}
+
+/**
+ * Setup SSH tunnel to production database
+ * @private - Should only be used by getDbConnection
+ * @returns {Promise<{ssh: object, stream: object, dbConfig: object}>}
+ */
+async function setupSshTunnel() {
+  const sshConfig = {
+    host: process.env.PROD_SSH_HOST,
+    port: process.env.PROD_SSH_PORT || 22,
+    username: process.env.PROD_SSH_USER,
+    privateKey: process.env.PROD_SSH_KEY_PATH
+      ? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
+      : undefined,
+    compress: true
+  };
+
+  const dbConfig = {
+    host: process.env.PROD_DB_HOST || 'localhost',
+    user: process.env.PROD_DB_USER,
+    password: process.env.PROD_DB_PASSWORD,
+    database: process.env.PROD_DB_NAME,
+    port: process.env.PROD_DB_PORT || 3306,
+    timezone: 'Z'
+  };
+
+  return new Promise((resolve, reject) => {
+    const ssh = new Client();
+
+    ssh.on('error', (err) => {
+      console.error('SSH connection error:', err);
+      reject(err);
+    });
+
+    ssh.on('ready', () => {
+      ssh.forwardOut(
+        '127.0.0.1',
+        0,
+        dbConfig.host,
+        dbConfig.port,
+        (err, stream) => {
+          if (err) reject(err);
+          resolve({ ssh, stream, dbConfig });
+        }
+      );
+    }).connect(sshConfig);
+  });
+}
+
+/**
+ * Clear cached query results
+ * @param {string} [cacheKey] - Specific cache key to clear (clears all if not provided)
+ */
+function clearQueryCache(cacheKey) {
+  if (cacheKey) {
+    connectionCache.queryCache.delete(cacheKey);
+    console.log(`Cleared cache for key: ${cacheKey}`);
+  } else {
+    connectionCache.queryCache.clear();
+    console.log('Cleared all query cache');
+  }
+}
+
+/**
+ * Force close all active connections
+ * Useful for server shutdown or manual connection reset
+ */
+async function closeAllConnections() {
+  if (connectionCache.dbConnection) {
+    try {
+      await connectionCache.dbConnection.end();
+      console.log('Closed database connection');
+    } catch (error) {
+      console.error('Error closing database connection:', error);
+    }
+    connectionCache.dbConnection = null;
+  }
+
+  if (connectionCache.ssh) {
+    try {
+      connectionCache.ssh.end();
+      console.log('Closed SSH connection');
+    } catch (error) {
+      console.error('Error closing SSH connection:', error);
+    }
+    connectionCache.ssh = null;
+  }
+
+  connectionCache.lastUsed = 0;
+  connectionCache.isConnecting = false;
+  connectionCache.connectionPromise = null;
+}
+
+module.exports = {
+  getDbConnection,
+  getCachedQuery,
+  clearQueryCache,
+  closeAllConnections
+};
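The per-type query cache in the example file above reduces to a small pattern: each query type gets its own time-to-live, and the query function only runs on a cache miss or after the entry expires. The sketch below mirrors the example's `getCachedQuery` shape; `fakeQuery` and the durations are illustrative.

```javascript
// Per-type TTL cache: long-lived taxonomy-style lookups get a longer TTL,
// everything else falls back to a short default.
const queryCache = new Map();
const cacheDuration = {
  'field-options': 30 * 60 * 1000, // 30 minutes
  'taxonomy': 30 * 60 * 1000,      // 30 minutes
  'default': 60 * 1000,            // 1 minute
};

async function getCachedQuery(cacheKey, queryType, queryFn) {
  const ttl = cacheDuration[queryType] || cacheDuration.default;
  const cached = queryCache.get(cacheKey);
  const now = Date.now();
  if (cached && now - cached.timestamp < ttl) {
    return cached.data; // cache hit: skip the query entirely
  }
  const data = await queryFn();
  queryCache.set(cacheKey, { data, timestamp: now });
  return data;
}

let executions = 0;
const fakeQuery = async () => ({ rows: ++executions });

const done = (async () => {
  await getCachedQuery('brands', 'field-options', fakeQuery);
  await getCachedQuery('brands', 'field-options', fakeQuery); // served from cache
  return executions;
})();
```

The second call within the TTL never touches the database, which matters here because every real query would otherwise travel through the SSH tunnel.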
363 examples DO NOT USE OR EDIT/EXAMPLE ONLY import-from-prod.js Normal file
@@ -0,0 +1,363 @@
+const dotenv = require("dotenv");
+const path = require("path");
+const { outputProgress, formatElapsedTime } = require('./metrics-new/utils/progress');
+const { setupConnections, closeConnections } = require('./import/utils');
+const importCategories = require('./import/categories');
+const { importProducts } = require('./import/products');
+const importOrders = require('./import/orders');
+const importPurchaseOrders = require('./import/purchase-orders');
+const importHistoricalData = require('./import/historical-data');
+
+dotenv.config({ path: path.join(__dirname, "../.env") });
+
+// Constants to control which imports run
+const IMPORT_CATEGORIES = true;
+const IMPORT_PRODUCTS = true;
+const IMPORT_ORDERS = true;
+const IMPORT_PURCHASE_ORDERS = true;
+const IMPORT_HISTORICAL_DATA = false;
+
+// Add flag for incremental updates
+const INCREMENTAL_UPDATE = process.env.INCREMENTAL_UPDATE !== 'false'; // Default to true unless explicitly set to false
+
+// SSH configuration
+const sshConfig = {
+  ssh: {
+    host: process.env.PROD_SSH_HOST,
+    port: process.env.PROD_SSH_PORT || 22,
+    username: process.env.PROD_SSH_USER,
+    privateKey: process.env.PROD_SSH_KEY_PATH
+      ? require("fs").readFileSync(process.env.PROD_SSH_KEY_PATH)
+      : undefined,
+    compress: true, // Enable SSH compression
+  },
+  prodDbConfig: {
+    // MySQL config for production
+    host: process.env.PROD_DB_HOST || "localhost",
+    user: process.env.PROD_DB_USER,
+    password: process.env.PROD_DB_PASSWORD,
+    database: process.env.PROD_DB_NAME,
+    port: process.env.PROD_DB_PORT || 3306,
+    timezone: '-05:00', // Production DB always stores times in EST (UTC-5) regardless of DST
+  },
+  localDbConfig: {
+    // PostgreSQL config for local
+    host: process.env.DB_HOST,
+    user: process.env.DB_USER,
+    password: process.env.DB_PASSWORD,
+    database: process.env.DB_NAME,
+    port: process.env.DB_PORT || 5432,
+    ssl: process.env.DB_SSL === 'true',
+    connectionTimeoutMillis: 60000,
+    idleTimeoutMillis: 30000,
+    max: 10 // connection pool max size
+  }
+};
+
+let isImportCancelled = false;
+
+// Add cancel function
+function cancelImport() {
+  isImportCancelled = true;
+  outputProgress({
+    status: 'cancelled',
+    operation: 'Import process',
+    message: 'Import cancelled by user',
+    current: 0,
+    total: 0,
+    elapsed: null,
+    remaining: null,
+    rate: 0
+  });
+}
+
+async function main() {
+  const startTime = Date.now();
+  let connections;
+  let completedSteps = 0;
+  let importHistoryId;
+  const totalSteps = [
+    IMPORT_CATEGORIES,
+    IMPORT_PRODUCTS,
+    IMPORT_ORDERS,
+    IMPORT_PURCHASE_ORDERS,
+    IMPORT_HISTORICAL_DATA
+  ].filter(Boolean).length;
+
+  try {
+    // Initial progress update
+    outputProgress({
+      status: "running",
+      operation: "Import process",
+      message: `Initializing SSH tunnel for ${INCREMENTAL_UPDATE ? 'incremental' : 'full'} import...`,
+      current: completedSteps,
+      total: totalSteps,
+      elapsed: formatElapsedTime(startTime)
+    });
+
+    connections = await setupConnections(sshConfig);
+    const { prodConnection, localConnection } = connections;
+
+    if (isImportCancelled) throw new Error("Import cancelled");
+
+    // Clean up any previously running imports that weren't completed
+    await localConnection.query(`
+      UPDATE import_history
+      SET
+        status = 'cancelled',
+        end_time = NOW(),
+        duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
+        error_message = 'Previous import was not completed properly'
+      WHERE status = 'running'
+    `);
+
+    // Create import history record for the overall session
+    try {
+      const [historyResult] = await localConnection.query(`
+        INSERT INTO import_history (
+          table_name,
+          start_time,
+          is_incremental,
+          status,
+          additional_info
+        ) VALUES (
+          'all_tables',
+          NOW(),
+          $1::boolean,
+          'running',
+          jsonb_build_object(
+            'categories_enabled', $2::boolean,
+            'products_enabled', $3::boolean,
+            'orders_enabled', $4::boolean,
+            'purchase_orders_enabled', $5::boolean,
+            'historical_data_enabled', $6::boolean
+          )
+        ) RETURNING id
+      `, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS, IMPORT_HISTORICAL_DATA]);
+      importHistoryId = historyResult.rows[0].id;
+    } catch (error) {
+      console.error("Error creating import history record:", error);
+      outputProgress({
+        status: "error",
+        operation: "Import process",
+        message: "Failed to create import history record",
+        error: error.message
+      });
+      throw error;
+    }
+
+    const results = {
+      categories: null,
+      products: null,
+      orders: null,
+      purchaseOrders: null,
+      historicalData: null
+    };
+
+    let totalRecordsAdded = 0;
+    let totalRecordsUpdated = 0;
+
+    // Run each import based on constants
+    if (IMPORT_CATEGORIES) {
+      results.categories = await importCategories(prodConnection, localConnection);
+      if (isImportCancelled) throw new Error("Import cancelled");
+      completedSteps++;
+      console.log('Categories import result:', results.categories);
+      totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0);
+      totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0);
+    }
+
+    if (IMPORT_PRODUCTS) {
+      results.products = await importProducts(prodConnection, localConnection, INCREMENTAL_UPDATE);
+      if (isImportCancelled) throw new Error("Import cancelled");
+      completedSteps++;
+      console.log('Products import result:', results.products);
+      totalRecordsAdded += parseInt(results.products?.recordsAdded || 0);
+      totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0);
+    }
+
+    if (IMPORT_ORDERS) {
+      results.orders = await importOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
+      if (isImportCancelled) throw new Error("Import cancelled");
+      completedSteps++;
+      console.log('Orders import result:', results.orders);
+      totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0);
+      totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0);
+    }
+
+    if (IMPORT_PURCHASE_ORDERS) {
+      try {
+        results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
+        if (isImportCancelled) throw new Error("Import cancelled");
+        completedSteps++;
+        console.log('Purchase orders import result:', results.purchaseOrders);
+
+        // Handle potential error status
+        if (results.purchaseOrders?.status === 'error') {
+          console.error('Purchase orders import had an error:', results.purchaseOrders.error);
+        } else {
+          totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0);
+          totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0);
+        }
+      } catch (error) {
+        console.error('Error during purchase orders import:', error);
+        // Continue with other imports, don't fail the whole process
+        results.purchaseOrders = {
+          status: 'error',
+          error: error.message,
+          recordsAdded: 0,
+          recordsUpdated: 0
+        };
+      }
+    }
+
+    if (IMPORT_HISTORICAL_DATA) {
+      try {
+        results.historicalData = await importHistoricalData(prodConnection, localConnection, INCREMENTAL_UPDATE);
+        if (isImportCancelled) throw new Error("Import cancelled");
+        completedSteps++;
+        console.log('Historical data import result:', results.historicalData);
+
+        // Handle potential error status
+        if (results.historicalData?.status === 'error') {
+          console.error('Historical data import had an error:', results.historicalData.error);
+        } else {
+          totalRecordsAdded += parseInt(results.historicalData?.recordsAdded || 0);
+          totalRecordsUpdated += parseInt(results.historicalData?.recordsUpdated || 0);
+        }
+      } catch (error) {
+        console.error('Error during historical data import:', error);
+        // Continue with other imports, don't fail the whole process
+        results.historicalData = {
+          status: 'error',
+          error: error.message,
+          recordsAdded: 0,
+          recordsUpdated: 0
+        };
+      }
+    }
+
+    const endTime = Date.now();
+    const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
+
+    // Update import history with final stats
+    await localConnection.query(`
+      UPDATE import_history
+      SET
+        end_time = NOW(),
+        duration_seconds = $1,
+        records_added = $2,
+        records_updated = $3,
+        status = 'completed',
+        additional_info = jsonb_build_object(
+          'categories_enabled', $4::boolean,
+          'products_enabled', $5::boolean,
+          'orders_enabled', $6::boolean,
+          'purchase_orders_enabled', $7::boolean,
+          'historical_data_enabled', $8::boolean,
+          'categories_result', COALESCE($9::jsonb, 'null'::jsonb),
+          'products_result', COALESCE($10::jsonb, 'null'::jsonb),
+          'orders_result', COALESCE($11::jsonb, 'null'::jsonb),
+          'purchase_orders_result', COALESCE($12::jsonb, 'null'::jsonb),
+          'historical_data_result', COALESCE($13::jsonb, 'null'::jsonb)
+        )
+      WHERE id = $14
+    `, [
+      totalElapsedSeconds,
+      parseInt(totalRecordsAdded),
+      parseInt(totalRecordsUpdated),
+      IMPORT_CATEGORIES,
+      IMPORT_PRODUCTS,
+      IMPORT_ORDERS,
+      IMPORT_PURCHASE_ORDERS,
+      IMPORT_HISTORICAL_DATA,
+      JSON.stringify(results.categories),
+      JSON.stringify(results.products),
+      JSON.stringify(results.orders),
+      JSON.stringify(results.purchaseOrders),
+      JSON.stringify(results.historicalData),
+      importHistoryId
+    ]);
+
+    outputProgress({
+      status: "complete",
+      operation: "Import process",
+      message: `${INCREMENTAL_UPDATE ? 'Incremental' : 'Full'} import completed successfully in ${formatElapsedTime(totalElapsedSeconds)}`,
+      current: completedSteps,
+      total: totalSteps,
+      elapsed: formatElapsedTime(startTime),
+      timing: {
+        start_time: new Date(startTime).toISOString(),
+        end_time: new Date(endTime).toISOString(),
+        elapsed_time: formatElapsedTime(startTime),
+        elapsed_seconds: totalElapsedSeconds,
+        total_duration: formatElapsedTime(totalElapsedSeconds)
+      },
+      results
+    });
+
+    return results;
+  } catch (error) {
+    const endTime = Date.now();
+    const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
+
+    // Update import history with error
+    if (importHistoryId && connections?.localConnection) {
+      await connections.localConnection.query(`
+        UPDATE import_history
+        SET
+          end_time = NOW(),
+          duration_seconds = $1,
+          status = $2,
+          error_message = $3
+        WHERE id = $4
+      `, [totalElapsedSeconds, error.message === "Import cancelled" ? 'cancelled' : 'failed', error.message, importHistoryId]);
}
|
||||||
|
|
||||||
|
console.error("Error during import process:", error);
|
||||||
|
outputProgress({
|
||||||
|
status: error.message === "Import cancelled" ? "cancelled" : "error",
|
||||||
|
operation: "Import process",
|
||||||
|
message: error.message === "Import cancelled"
|
||||||
|
? `${INCREMENTAL_UPDATE ? 'Incremental' : 'Full'} import cancelled by user after ${formatElapsedTime(totalElapsedSeconds)}`
|
||||||
|
: `${INCREMENTAL_UPDATE ? 'Incremental' : 'Full'} import failed after ${formatElapsedTime(totalElapsedSeconds)}`,
|
||||||
|
error: error.message,
|
||||||
|
current: completedSteps,
|
||||||
|
total: totalSteps,
|
||||||
|
elapsed: formatElapsedTime(startTime),
|
||||||
|
timing: {
|
||||||
|
start_time: new Date(startTime).toISOString(),
|
||||||
|
end_time: new Date(endTime).toISOString(),
|
||||||
|
elapsed_time: formatElapsedTime(startTime),
|
||||||
|
elapsed_seconds: totalElapsedSeconds,
|
||||||
|
total_duration: formatElapsedTime(totalElapsedSeconds)
|
||||||
|
}
|
||||||
|
});
|
||||||
|
throw error;
|
||||||
|
} finally {
|
||||||
|
if (connections) {
|
||||||
|
await closeConnections(connections).catch(err => {
|
||||||
|
console.error("Error closing connections:", err);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Run the import only if this is the main module
|
||||||
|
if (require.main === module) {
|
||||||
|
main().then((results) => {
|
||||||
|
console.log('Import completed successfully:', results);
|
||||||
|
// Force exit after a small delay to ensure all logs are written
|
||||||
|
setTimeout(() => process.exit(0), 500);
|
||||||
|
}).catch((error) => {
|
||||||
|
console.error("Unhandled error in main process:", error);
|
||||||
|
// Force exit with error code after a small delay
|
||||||
|
setTimeout(() => process.exit(1), 500);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// Export the functions needed by the route
|
||||||
|
module.exports = {
|
||||||
|
main,
|
||||||
|
cancelImport,
|
||||||
|
};
|
||||||
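The cancellation checks threaded through the import steps follow a cooperative pattern: a module-level flag that each long-running step tests between units of work, flipped by the exported `cancelImport()`. A minimal, self-contained sketch of that pattern (names like `runSteps` and the step objects are illustrative, not the actual module):

```javascript
// Cooperative cancellation: steps never get killed mid-flight; instead the
// loop checks a shared flag before starting each step and bails with the
// same "Import cancelled" error the import script throws.
let isImportCancelled = false;

function cancelImport() {
  isImportCancelled = true;
}

async function runSteps(steps) {
  const completed = [];
  for (const step of steps) {
    if (isImportCancelled) throw new Error("Import cancelled");
    await step.run();
    completed.push(step.name);
  }
  return completed;
}

// Example: the first step requests cancellation, so the second never runs.
async function demo() {
  const steps = [
    { name: 'historicalData', run: async () => cancelImport() },
    { name: 'orders', run: async () => {} },
  ];
  try {
    await runSteps(steps);
    return 'finished';
  } catch (err) {
    return err.message; // "Import cancelled"
  }
}
```

Because the flag is only consulted between steps, cancellation is prompt but never leaves a single step half-executed, which is why the script can record a clean `'cancelled'` status in `import_history`.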
@@ -1,7 +1,7 @@
 #!/bin/zsh

 #Clear previous mount in case it’s still there
-umount ~/Dev/dashboard-server
+umount /Users/matt/Dev/dashboard/dashboard-server

 #Mount
-sshfs matt@dashboard.kent.pw:/var/www/html/dashboard -p 22122 ~/Dev/dashboard-server
+sshfs matt@dashboard.kent.pw:/var/www/html/dashboard -p 22122 /Users/matt/Dev/dashboard/dashboard-server
83 nginx.conf
@@ -1,83 +0,0 @@
# Gorgias API endpoints
location /api/gorgias/ {
    proxy_pass http://localhost:3006/api/gorgias/;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    # CORS headers
    add_header 'Access-Control-Allow-Origin' '*';
    add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT, DELETE';
    add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';

    # Handle OPTIONS method
    if ($request_method = 'OPTIONS') {
        add_header 'Access-Control-Allow-Origin' '*';
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT, DELETE';
        add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
        add_header 'Access-Control-Max-Age' 1728000;
        add_header 'Content-Type' 'text/plain charset=UTF-8';
        add_header 'Content-Length' 0;
        return 204;
    }
}

# Google Analytics API endpoints
location /api/analytics/ {
    proxy_pass http://localhost:3007/api/analytics/;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    # CORS headers
    add_header 'Access-Control-Allow-Origin' '*';
    add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
    add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';

    # Handle OPTIONS method
    if ($request_method = 'OPTIONS') {
        add_header 'Access-Control-Allow-Origin' '*';
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
        add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
        add_header 'Access-Control-Max-Age' 1728000;
        add_header 'Content-Type' 'text/plain charset=UTF-8';
        add_header 'Content-Length' 0;
        return 204;
    }
}

# Typeform API endpoints
location /api/typeform/ {
    proxy_pass http://localhost:3008/api/typeform/;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    # CORS headers
    add_header 'Access-Control-Allow-Origin' '*';
    add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
    add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';

    # Handle OPTIONS method
    if ($request_method = 'OPTIONS') {
        add_header 'Access-Control-Allow-Origin' '*';
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS';
        add_header 'Access-Control-Allow-Headers' 'DNT,X-CustomHeader,Keep-Alive,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Authorization';
        add_header 'Access-Control-Max-Age' 1728000;
        add_header 'Content-Type' 'text/plain charset=UTF-8';
        add_header 'Content-Length' 0;
        return 204;
    }
}