Move dashboard server into project
This commit is contained in:
205
inventory-server/dashboard/acot-server/README.md
Normal file
205
inventory-server/dashboard/acot-server/README.md
Normal file
@@ -0,0 +1,205 @@
|
||||
# ACOT Server
|
||||
|
||||
This server replaces the Klaviyo integration with direct database queries to the production MySQL database via SSH tunnel. It provides seamless API compatibility for all frontend components without requiring any frontend changes.
|
||||
|
||||
## Setup
|
||||
|
||||
1. **Environment Variables**: Copy `.env.example` to `.env` and configure:
|
||||
```
|
||||
DB_HOST=localhost
|
||||
DB_PORT=3306
|
||||
DB_USER=your_db_user
|
||||
DB_PASSWORD=your_db_password
|
||||
DB_NAME=your_db_name
|
||||
PORT=3007
|
||||
NODE_ENV=development
|
||||
```
|
||||
|
||||
2. **SSH Tunnel**: Ensure your SSH tunnel to the production database is running on localhost:3306.
|
||||
|
||||
3. **Install Dependencies**:
|
||||
```bash
|
||||
npm install
|
||||
```
|
||||
|
||||
4. **Start Server**:
|
||||
```bash
|
||||
npm start
|
||||
```
|
||||
|
||||
## API Endpoints
|
||||
|
||||
All endpoints provide exact API compatibility with the previous Klaviyo implementation:
|
||||
|
||||
### Main Statistics
|
||||
- `GET /api/acot/events/stats` - Complete statistics dashboard data
|
||||
- Query params: `timeRange` (today, yesterday, thisWeek, lastWeek, thisMonth, lastMonth, last7days, last30days, last90days) or `startDate`/`endDate` for custom ranges
|
||||
- Returns: Revenue, orders, AOV, shipping data, order types, brands/categories, refunds, cancellations, best day, peak hour, order ranges, period progress, projections
|
||||
|
||||
### Daily Details
|
||||
- `GET /api/acot/events/stats/details` - Daily breakdown with previous period comparisons
|
||||
- Query params: `timeRange`, `metric` (revenue, orders, average_order, etc.), `daily=true`
|
||||
- Returns: Array of daily data points with trend comparisons
|
||||
|
||||
### Products
|
||||
- `GET /api/acot/events/products` - Top products with sales data
|
||||
- Query params: `timeRange`
|
||||
- Returns: Product list with images, sales quantities, revenue, and order counts
|
||||
|
||||
### Projections
|
||||
- `GET /api/acot/events/projection` - Smart revenue projections for incomplete periods
|
||||
- Query params: `timeRange`
|
||||
- Returns: Projected revenue with confidence levels based on historical patterns
|
||||
|
||||
### Health Check
|
||||
- `GET /api/acot/test` - Server health and database connectivity test
|
||||
|
||||
## Database Schema
|
||||
|
||||
The server queries the following main tables:
|
||||
|
||||
### Orders (`_order`)
|
||||
- **Key fields**: `order_id`, `date_placed`, `summary_total`, `order_status`, `ship_method_selected`, `stats_waiting_preorder`
|
||||
- **Valid orders**: `order_status > 15`
|
||||
- **Cancelled orders**: `order_status = 15`
|
||||
- **Shipped orders**: `order_status IN (100, 92)`
|
||||
- **Pre-orders**: `stats_waiting_preorder > 0`
|
||||
- **Local pickup**: `ship_method_selected = 'localpickup'`
|
||||
- **On-hold orders**: `ship_method_selected = 'holdit'`
|
||||
|
||||
### Order Items (`order_items`)
|
||||
- **Fields**: `order_id`, `prod_pid`, `qty_ordered`, `prod_price`
|
||||
- **Purpose**: Links orders to products for detailed analysis
|
||||
|
||||
### Products (`products`)
|
||||
- **Fields**: `pid`, `description` (product name), `company`
|
||||
- **Purpose**: Product information and brand data
|
||||
|
||||
### Product Images (`product_images`)
|
||||
- **Fields**: `pid`, `iid`, `order` (priority)
|
||||
- **Primary image**: `order = 255` (highest priority)
|
||||
- **Image URL generation**: `https://sbing.com/i/products/0000/{prefix}/{pid}-{type}-{iid}.jpg`
|
||||
|
||||
### Payments (`order_payment`)
|
||||
- **Refunds**: `payment_amount < 0`
|
||||
- **Purpose**: Track refund amounts and counts
|
||||
|
||||
## Business Logic
|
||||
|
||||
### Time Handling
|
||||
- **Timezone**: All calculations in UTC-5 (Eastern Time)
|
||||
- **Business Day**: 1 AM - 12:59 AM Eastern (25-hour business day)
|
||||
- **Format**: MySQL DATETIME format (YYYY-MM-DD HH:MM:SS)
|
||||
- **Period Boundaries**: Calculated using `timeUtils.js` for consistent time range handling
|
||||
|
||||
### Order Processing
|
||||
- **Revenue Calculation**: Only includes orders with `order_status > 15`
|
||||
- **Order Types**:
|
||||
- Pre-orders: `stats_waiting_preorder > 0`
|
||||
- Local pickup: `ship_method_selected = 'localpickup'`
|
||||
- On-hold: `ship_method_selected = 'holdit'`
|
||||
- **Shipping Methods**: Mapped to friendly names (e.g., `usps_ground_advantage` → "USPS Ground Advantage")
|
||||
|
||||
### Projections
|
||||
- **Period Progress**: Calculated based on current time within the selected period
|
||||
- **Simple Projection**: Linear extrapolation based on current progress
|
||||
- **Smart Projection**: Uses historical data patterns for more accurate forecasting
|
||||
- **Confidence Levels**: Based on data consistency and historical accuracy
|
||||
|
||||
### Image URL Generation
|
||||
- **Pattern**: `https://sbing.com/i/products/0000/{prefix}/{pid}-{type}-{iid}.jpg`
|
||||
- **Prefix**: First 2 digits of product ID
|
||||
- **Type**: "main" for primary images
|
||||
- **Fallback**: Uses primary image (order=255) when available
|
||||
|
||||
## Frontend Integration
|
||||
|
||||
### Service Layer (`services/acotService.js`)
|
||||
- **Purpose**: Replaces direct Klaviyo API calls with acot-server calls
|
||||
- **Methods**: `getStats()`, `getStatsDetails()`, `getProducts()`, `getProjection()`
|
||||
- **Logging**: Axios interceptors for request/response logging
|
||||
- **Environment**: Automatic URL handling (proxy in dev, direct in production)
|
||||
|
||||
### Component Updates
|
||||
All 5 main components updated to use `acotService`:
|
||||
- **StatCards.jsx**: Main dashboard statistics
|
||||
- **MiniStatCards.jsx**: Compact statistics view
|
||||
- **SalesChart.jsx**: Revenue and order trends
|
||||
- **MiniSalesChart.jsx**: Compact chart view
|
||||
- **ProductGrid.jsx**: Top products table
|
||||
|
||||
### Proxy Configuration (`vite.config.js`)
|
||||
```javascript
|
||||
'/api/acot': {
|
||||
target: 'http://localhost:3007',
|
||||
changeOrigin: true,
|
||||
secure: false
|
||||
}
|
||||
```
|
||||
|
||||
## Key Features
|
||||
|
||||
### Complete Business Intelligence
|
||||
- **Revenue Analytics**: Total revenue, trends, projections
|
||||
- **Order Analysis**: Counts, types, status tracking
|
||||
- **Product Performance**: Top sellers, revenue contribution
|
||||
- **Shipping Intelligence**: Methods, locations, distribution
|
||||
- **Customer Insights**: Order value ranges, patterns
|
||||
- **Operational Metrics**: Refunds, cancellations, peak hours
|
||||
|
||||
### Performance Optimizations
|
||||
- **Connection Pooling**: Efficient database connection management
|
||||
- **Query Optimization**: Indexed queries with proper WHERE clauses
|
||||
- **Caching Strategy**: Frontend caching for detail views
|
||||
- **Batch Processing**: Efficient data aggregation
|
||||
|
||||
### Error Handling
|
||||
- **Database Connectivity**: Graceful handling of connection issues
|
||||
- **Query Failures**: Detailed error logging and user-friendly messages
|
||||
- **Data Validation**: Input sanitization and validation
|
||||
- **Fallback Mechanisms**: Default values for missing data
|
||||
|
||||
## Simplified Elements
|
||||
|
||||
Due to database complexity, some features are simplified:
|
||||
- **Brands**: Shows "Various Brands" (companies table structure complex)
|
||||
- **Categories**: Shows "General" (category relationships complex)
|
||||
|
||||
These can be enhanced in future iterations with proper category mapping.
|
||||
|
||||
## Testing
|
||||
|
||||
Test the server functionality:
|
||||
|
||||
```bash
|
||||
# Health check
|
||||
curl http://localhost:3007/api/acot/test
|
||||
|
||||
# Today's stats
|
||||
curl http://localhost:3007/api/acot/events/stats?timeRange=today
|
||||
|
||||
# Last 30 days with details
|
||||
curl http://localhost:3007/api/acot/events/stats/details?timeRange=last30days&daily=true
|
||||
|
||||
# Top products
|
||||
curl http://localhost:3007/api/acot/events/products?timeRange=thisWeek
|
||||
|
||||
# Revenue projection
|
||||
curl http://localhost:3007/api/acot/events/projection?timeRange=today
|
||||
```
|
||||
|
||||
## Development Notes
|
||||
|
||||
- **No Frontend Changes**: Complete drop-in replacement for Klaviyo
|
||||
- **API Compatibility**: Maintains exact response structure
|
||||
- **Business Logic**: Implements all complex e-commerce calculations
|
||||
- **Scalability**: Designed for production workloads
|
||||
- **Maintainability**: Well-documented code with clear separation of concerns
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
- Enhanced category and brand mapping
|
||||
- Real-time notifications for significant events
|
||||
- Advanced analytics and forecasting
|
||||
- Customer segmentation analysis
|
||||
- Inventory integration
|
||||
297
inventory-server/dashboard/acot-server/db/connection.js
Normal file
297
inventory-server/dashboard/acot-server/db/connection.js
Normal file
@@ -0,0 +1,297 @@
|
||||
const { Client } = require('ssh2');
|
||||
const mysql = require('mysql2/promise');
|
||||
const fs = require('fs');
|
||||
|
||||
// Connection pool configuration
|
||||
const connectionPool = {
|
||||
connections: [],
|
||||
maxConnections: 20,
|
||||
currentConnections: 0,
|
||||
pendingRequests: [],
|
||||
// Cache for query results (key: query string, value: {data, timestamp})
|
||||
queryCache: new Map(),
|
||||
// Cache duration for different query types in milliseconds
|
||||
cacheDuration: {
|
||||
'stats': 60 * 1000, // 1 minute for stats
|
||||
'products': 5 * 60 * 1000, // 5 minutes for products
|
||||
'orders': 60 * 1000, // 1 minute for orders
|
||||
'default': 60 * 1000 // 1 minute default
|
||||
},
|
||||
// Circuit breaker state
|
||||
circuitBreaker: {
|
||||
failures: 0,
|
||||
lastFailure: 0,
|
||||
isOpen: false,
|
||||
threshold: 5,
|
||||
timeout: 30000 // 30 seconds
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Get a database connection from the pool
|
||||
* @returns {Promise<{connection: object, release: function}>} The database connection and release function
|
||||
*/
|
||||
async function getDbConnection() {
|
||||
return new Promise(async (resolve, reject) => {
|
||||
// Check circuit breaker
|
||||
const now = Date.now();
|
||||
if (connectionPool.circuitBreaker.isOpen) {
|
||||
if (now - connectionPool.circuitBreaker.lastFailure > connectionPool.circuitBreaker.timeout) {
|
||||
// Reset circuit breaker
|
||||
connectionPool.circuitBreaker.isOpen = false;
|
||||
connectionPool.circuitBreaker.failures = 0;
|
||||
console.log('Circuit breaker reset');
|
||||
} else {
|
||||
reject(new Error('Circuit breaker is open - too many connection failures'));
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// Check if there's an available connection in the pool
|
||||
if (connectionPool.connections.length > 0) {
|
||||
const conn = connectionPool.connections.pop();
|
||||
console.log(`Using pooled connection. Pool size: ${connectionPool.connections.length}`);
|
||||
resolve({
|
||||
connection: conn.connection,
|
||||
release: () => releaseConnection(conn)
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
// If we haven't reached max connections, create a new one
|
||||
if (connectionPool.currentConnections < connectionPool.maxConnections) {
|
||||
try {
|
||||
console.log(`Creating new connection. Current: ${connectionPool.currentConnections}/${connectionPool.maxConnections}`);
|
||||
connectionPool.currentConnections++;
|
||||
|
||||
const tunnel = await setupSshTunnel();
|
||||
const { ssh, stream, dbConfig } = tunnel;
|
||||
|
||||
const connection = await mysql.createConnection({
|
||||
...dbConfig,
|
||||
stream
|
||||
});
|
||||
|
||||
const conn = { ssh, connection, inUse: true, created: Date.now() };
|
||||
|
||||
console.log('Database connection established');
|
||||
|
||||
// Reset circuit breaker on successful connection
|
||||
if (connectionPool.circuitBreaker.failures > 0) {
|
||||
connectionPool.circuitBreaker.failures = 0;
|
||||
connectionPool.circuitBreaker.isOpen = false;
|
||||
}
|
||||
|
||||
resolve({
|
||||
connection: conn.connection,
|
||||
release: () => releaseConnection(conn)
|
||||
});
|
||||
} catch (error) {
|
||||
connectionPool.currentConnections--;
|
||||
|
||||
// Track circuit breaker failures
|
||||
connectionPool.circuitBreaker.failures++;
|
||||
connectionPool.circuitBreaker.lastFailure = Date.now();
|
||||
|
||||
if (connectionPool.circuitBreaker.failures >= connectionPool.circuitBreaker.threshold) {
|
||||
connectionPool.circuitBreaker.isOpen = true;
|
||||
console.log(`Circuit breaker opened after ${connectionPool.circuitBreaker.failures} failures`);
|
||||
}
|
||||
|
||||
reject(error);
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
// Pool is full, queue the request with timeout
|
||||
console.log('Connection pool full, queuing request...');
|
||||
const timeoutId = setTimeout(() => {
|
||||
// Remove from queue if still there
|
||||
const index = connectionPool.pendingRequests.findIndex(req => req.resolve === resolve);
|
||||
if (index !== -1) {
|
||||
connectionPool.pendingRequests.splice(index, 1);
|
||||
reject(new Error('Connection pool queue timeout after 15 seconds'));
|
||||
}
|
||||
}, 15000);
|
||||
|
||||
connectionPool.pendingRequests.push({
|
||||
resolve,
|
||||
reject,
|
||||
timeoutId,
|
||||
timestamp: Date.now()
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Release a connection back to the pool
|
||||
*/
|
||||
function releaseConnection(conn) {
|
||||
conn.inUse = false;
|
||||
|
||||
// Check if there are pending requests
|
||||
if (connectionPool.pendingRequests.length > 0) {
|
||||
const { resolve, timeoutId } = connectionPool.pendingRequests.shift();
|
||||
|
||||
// Clear the timeout since we're serving the request
|
||||
if (timeoutId) {
|
||||
clearTimeout(timeoutId);
|
||||
}
|
||||
|
||||
conn.inUse = true;
|
||||
console.log(`Serving queued request. Queue length: ${connectionPool.pendingRequests.length}`);
|
||||
resolve({
|
||||
connection: conn.connection,
|
||||
release: () => releaseConnection(conn)
|
||||
});
|
||||
} else {
|
||||
// Return to pool
|
||||
connectionPool.connections.push(conn);
|
||||
console.log(`Connection returned to pool. Pool size: ${connectionPool.connections.length}, Active: ${connectionPool.currentConnections}`);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get cached query results or execute query if not cached
|
||||
* @param {string} cacheKey - Unique key to identify the query
|
||||
* @param {string} queryType - Type of query (stats, products, orders, etc.)
|
||||
* @param {Function} queryFn - Function to execute if cache miss
|
||||
* @returns {Promise<any>} The query result
|
||||
*/
|
||||
async function getCachedQuery(cacheKey, queryType, queryFn) {
|
||||
// Get cache duration based on query type
|
||||
const cacheDuration = connectionPool.cacheDuration[queryType] || connectionPool.cacheDuration.default;
|
||||
|
||||
// Check if we have a valid cached result
|
||||
const cachedResult = connectionPool.queryCache.get(cacheKey);
|
||||
const now = Date.now();
|
||||
|
||||
if (cachedResult && (now - cachedResult.timestamp < cacheDuration)) {
|
||||
console.log(`Cache hit for ${queryType} query: ${cacheKey}`);
|
||||
return cachedResult.data;
|
||||
}
|
||||
|
||||
// No valid cache found, execute the query
|
||||
console.log(`Cache miss for ${queryType} query: ${cacheKey}`);
|
||||
const result = await queryFn();
|
||||
|
||||
// Cache the result
|
||||
connectionPool.queryCache.set(cacheKey, {
|
||||
data: result,
|
||||
timestamp: now
|
||||
});
|
||||
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Setup SSH tunnel to production database
|
||||
* @private - Should only be used by getDbConnection
|
||||
* @returns {Promise<{ssh: object, stream: object, dbConfig: object}>}
|
||||
*/
|
||||
async function setupSshTunnel() {
|
||||
const sshConfig = {
|
||||
host: process.env.PROD_SSH_HOST,
|
||||
port: process.env.PROD_SSH_PORT || 22,
|
||||
username: process.env.PROD_SSH_USER,
|
||||
privateKey: process.env.PROD_SSH_KEY_PATH
|
||||
? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
|
||||
: undefined,
|
||||
compress: true
|
||||
};
|
||||
|
||||
const dbConfig = {
|
||||
host: process.env.PROD_DB_HOST || 'localhost',
|
||||
user: process.env.PROD_DB_USER,
|
||||
password: process.env.PROD_DB_PASSWORD,
|
||||
database: process.env.PROD_DB_NAME,
|
||||
port: process.env.PROD_DB_PORT || 3306,
|
||||
timezone: 'Z'
|
||||
};
|
||||
|
||||
return new Promise((resolve, reject) => {
|
||||
const ssh = new Client();
|
||||
|
||||
ssh.on('error', (err) => {
|
||||
console.error('SSH connection error:', err);
|
||||
reject(err);
|
||||
});
|
||||
|
||||
ssh.on('ready', () => {
|
||||
ssh.forwardOut(
|
||||
'127.0.0.1',
|
||||
0,
|
||||
dbConfig.host,
|
||||
dbConfig.port,
|
||||
(err, stream) => {
|
||||
if (err) reject(err);
|
||||
resolve({ ssh, stream, dbConfig });
|
||||
}
|
||||
);
|
||||
}).connect(sshConfig);
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear cached query results
|
||||
* @param {string} [cacheKey] - Specific cache key to clear (clears all if not provided)
|
||||
*/
|
||||
function clearQueryCache(cacheKey) {
|
||||
if (cacheKey) {
|
||||
connectionPool.queryCache.delete(cacheKey);
|
||||
console.log(`Cleared cache for key: ${cacheKey}`);
|
||||
} else {
|
||||
connectionPool.queryCache.clear();
|
||||
console.log('Cleared all query cache');
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Force close all active connections
|
||||
* Useful for server shutdown or manual connection reset
|
||||
*/
|
||||
async function closeAllConnections() {
|
||||
// Close all pooled connections
|
||||
for (const conn of connectionPool.connections) {
|
||||
try {
|
||||
await conn.connection.end();
|
||||
conn.ssh.end();
|
||||
console.log('Closed pooled connection');
|
||||
} catch (error) {
|
||||
console.error('Error closing pooled connection:', error);
|
||||
}
|
||||
}
|
||||
|
||||
// Reset pool state
|
||||
connectionPool.connections = [];
|
||||
connectionPool.currentConnections = 0;
|
||||
connectionPool.pendingRequests = [];
|
||||
connectionPool.queryCache.clear();
|
||||
|
||||
console.log('All connections closed and pool reset');
|
||||
}
|
||||
|
||||
/**
|
||||
* Get connection pool status for debugging
|
||||
*/
|
||||
function getPoolStatus() {
|
||||
return {
|
||||
poolSize: connectionPool.connections.length,
|
||||
activeConnections: connectionPool.currentConnections,
|
||||
maxConnections: connectionPool.maxConnections,
|
||||
pendingRequests: connectionPool.pendingRequests.length,
|
||||
cacheSize: connectionPool.queryCache.size,
|
||||
queuedRequests: connectionPool.pendingRequests.map(req => ({
|
||||
waitTime: Date.now() - req.timestamp,
|
||||
hasTimeout: !!req.timeoutId
|
||||
}))
|
||||
};
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
getDbConnection,
|
||||
getCachedQuery,
|
||||
clearQueryCache,
|
||||
closeAllConnections,
|
||||
getPoolStatus
|
||||
};
|
||||
1543
inventory-server/dashboard/acot-server/package-lock.json
generated
Normal file
1543
inventory-server/dashboard/acot-server/package-lock.json
generated
Normal file
File diff suppressed because it is too large
Load Diff
22
inventory-server/dashboard/acot-server/package.json
Normal file
22
inventory-server/dashboard/acot-server/package.json
Normal file
@@ -0,0 +1,22 @@
|
||||
{
|
||||
"name": "acot-server",
|
||||
"version": "1.0.0",
|
||||
"description": "A Cherry On Top production database server",
|
||||
"main": "server.js",
|
||||
"scripts": {
|
||||
"start": "node server.js",
|
||||
"dev": "nodemon server.js"
|
||||
},
|
||||
"dependencies": {
|
||||
"express": "^4.18.2",
|
||||
"cors": "^2.8.5",
|
||||
"dotenv": "^16.3.1",
|
||||
"morgan": "^1.10.0",
|
||||
"ssh2": "^1.14.0",
|
||||
"mysql2": "^3.6.5",
|
||||
"compression": "^1.7.4"
|
||||
},
|
||||
"devDependencies": {
|
||||
"nodemon": "^3.0.1"
|
||||
}
|
||||
}
|
||||
767
inventory-server/dashboard/acot-server/routes/events.js
Normal file
767
inventory-server/dashboard/acot-server/routes/events.js
Normal file
@@ -0,0 +1,767 @@
|
||||
const express = require('express');
|
||||
const router = express.Router();
|
||||
const { getDbConnection, getPoolStatus } = require('../db/connection');
|
||||
const { getTimeRangeConditions, formatBusinessDate, getBusinessDayBounds } = require('../utils/timeUtils');
|
||||
|
||||
// Image URL generation utility
|
||||
const getImageUrls = (pid, iid = 1) => {
|
||||
const imageUrlBase = 'https://sbing.com/i/products/0000/';
|
||||
const paddedPid = pid.toString().padStart(6, '0');
|
||||
const prefix = paddedPid.slice(0, 3);
|
||||
const basePath = `${imageUrlBase}${prefix}/${pid}`;
|
||||
return {
|
||||
image: `${basePath}-t-${iid}.jpg`,
|
||||
image_175: `${basePath}-175x175-${iid}.jpg`,
|
||||
image_full: `${basePath}-o-${iid}.jpg`,
|
||||
ImgThumb: `${basePath}-175x175-${iid}.jpg` // For ProductGrid component
|
||||
};
|
||||
};
|
||||
|
||||
// Main stats endpoint - replaces /api/klaviyo/events/stats
|
||||
router.get('/stats', async (req, res) => {
|
||||
const startTime = Date.now();
|
||||
console.log(`[STATS] Starting request for timeRange: ${req.query.timeRange}`);
|
||||
|
||||
// Set a timeout for the entire operation
|
||||
const timeoutPromise = new Promise((_, reject) => {
|
||||
setTimeout(() => reject(new Error('Request timeout after 15 seconds')), 15000);
|
||||
});
|
||||
|
||||
try {
|
||||
const mainOperation = async () => {
|
||||
const { timeRange, startDate, endDate } = req.query;
|
||||
console.log(`[STATS] Getting DB connection...`);
|
||||
const { connection, release } = await getDbConnection();
|
||||
console.log(`[STATS] DB connection obtained in ${Date.now() - startTime}ms`);
|
||||
|
||||
const { whereClause, params, dateRange } = getTimeRangeConditions(timeRange, startDate, endDate);
|
||||
|
||||
// Main order stats query
|
||||
const mainStatsQuery = `
|
||||
SELECT
|
||||
COUNT(*) as orderCount,
|
||||
SUM(summary_total) as revenue,
|
||||
SUM(stats_prod_pieces) as itemCount,
|
||||
AVG(summary_total) as averageOrderValue,
|
||||
AVG(stats_prod_pieces) as averageItemsPerOrder,
|
||||
SUM(CASE WHEN stats_waiting_preorder > 0 THEN 1 ELSE 0 END) as preOrderCount,
|
||||
SUM(CASE WHEN ship_method_selected = 'localpickup' THEN 1 ELSE 0 END) as localPickupCount,
|
||||
SUM(CASE WHEN ship_method_selected = 'holdit' THEN 1 ELSE 0 END) as onHoldCount,
|
||||
SUM(CASE WHEN order_status IN (100, 92) THEN 1 ELSE 0 END) as shippedCount,
|
||||
SUM(CASE WHEN order_status = 15 THEN 1 ELSE 0 END) as cancelledCount,
|
||||
SUM(CASE WHEN order_status = 15 THEN summary_total ELSE 0 END) as cancelledTotal
|
||||
FROM _order
|
||||
WHERE order_status > 15 AND ${whereClause}
|
||||
`;
|
||||
|
||||
const [mainStats] = await connection.execute(mainStatsQuery, params);
|
||||
const stats = mainStats[0];
|
||||
|
||||
// Refunds query
|
||||
const refundsQuery = `
|
||||
SELECT
|
||||
COUNT(*) as refundCount,
|
||||
ABS(SUM(payment_amount)) as refundTotal
|
||||
FROM order_payment op
|
||||
JOIN _order o ON op.order_id = o.order_id
|
||||
WHERE payment_amount < 0 AND o.order_status > 15 AND ${whereClause.replace('date_placed', 'o.date_placed')}
|
||||
`;
|
||||
|
||||
const [refundStats] = await connection.execute(refundsQuery, params);
|
||||
|
||||
// Best revenue day query
|
||||
const bestDayQuery = `
|
||||
SELECT
|
||||
DATE(date_placed) as date,
|
||||
SUM(summary_total) as revenue,
|
||||
COUNT(*) as orders
|
||||
FROM _order
|
||||
WHERE order_status > 15 AND ${whereClause}
|
||||
GROUP BY DATE(date_placed)
|
||||
ORDER BY revenue DESC
|
||||
LIMIT 1
|
||||
`;
|
||||
|
||||
const [bestDayResult] = await connection.execute(bestDayQuery, params);
|
||||
|
||||
// Peak hour query (for single day periods)
|
||||
let peakHour = null;
|
||||
if (['today', 'yesterday'].includes(timeRange)) {
|
||||
const peakHourQuery = `
|
||||
SELECT
|
||||
HOUR(date_placed) as hour,
|
||||
COUNT(*) as count
|
||||
FROM _order
|
||||
WHERE order_status > 15 AND ${whereClause}
|
||||
GROUP BY HOUR(date_placed)
|
||||
ORDER BY count DESC
|
||||
LIMIT 1
|
||||
`;
|
||||
|
||||
const [peakHourResult] = await connection.execute(peakHourQuery, params);
|
||||
if (peakHourResult.length > 0) {
|
||||
const hour = peakHourResult[0].hour;
|
||||
const date = new Date();
|
||||
date.setHours(hour, 0, 0);
|
||||
peakHour = {
|
||||
hour,
|
||||
count: peakHourResult[0].count,
|
||||
displayHour: date.toLocaleString("en-US", { hour: "numeric", hour12: true })
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
// Brands and categories query - simplified for now since we don't have the category tables
|
||||
// We'll use a simple approach without company table for now
|
||||
const brandsQuery = `
|
||||
SELECT
|
||||
'Various Brands' as brandName,
|
||||
COUNT(DISTINCT oi.order_id) as orderCount,
|
||||
SUM(oi.qty_ordered) as itemCount,
|
||||
SUM(oi.qty_ordered * oi.prod_price) as revenue
|
||||
FROM order_items oi
|
||||
JOIN _order o ON oi.order_id = o.order_id
|
||||
JOIN products p ON oi.prod_pid = p.pid
|
||||
WHERE o.order_status > 15 AND ${whereClause.replace('date_placed', 'o.date_placed')}
|
||||
HAVING revenue > 0
|
||||
`;
|
||||
|
||||
const [brandsResult] = await connection.execute(brandsQuery, params);
|
||||
|
||||
// For categories, we'll use a simplified approach
|
||||
const categoriesQuery = `
|
||||
SELECT
|
||||
'General' as categoryName,
|
||||
COUNT(DISTINCT oi.order_id) as orderCount,
|
||||
SUM(oi.qty_ordered) as itemCount,
|
||||
SUM(oi.qty_ordered * oi.prod_price) as revenue
|
||||
FROM order_items oi
|
||||
JOIN _order o ON oi.order_id = o.order_id
|
||||
JOIN products p ON oi.prod_pid = p.pid
|
||||
WHERE o.order_status > 15 AND ${whereClause.replace('date_placed', 'o.date_placed')}
|
||||
HAVING revenue > 0
|
||||
`;
|
||||
|
||||
const [categoriesResult] = await connection.execute(categoriesQuery, params);
|
||||
|
||||
// Shipping locations query
|
||||
const shippingQuery = `
|
||||
SELECT
|
||||
ship_country,
|
||||
ship_state,
|
||||
ship_method_selected,
|
||||
COUNT(*) as count
|
||||
FROM _order
|
||||
WHERE order_status IN (100, 92) AND ${whereClause}
|
||||
GROUP BY ship_country, ship_state, ship_method_selected
|
||||
`;
|
||||
|
||||
const [shippingResult] = await connection.execute(shippingQuery, params);
|
||||
|
||||
// Process shipping data
|
||||
const shippingStats = processShippingData(shippingResult, stats.shippedCount);
|
||||
|
||||
// Order value range query
|
||||
const orderRangeQuery = `
|
||||
SELECT
|
||||
MIN(summary_total) as smallest,
|
||||
MAX(summary_total) as largest
|
||||
FROM _order
|
||||
WHERE order_status > 15 AND ${whereClause}
|
||||
`;
|
||||
|
||||
const [orderRangeResult] = await connection.execute(orderRangeQuery, params);
|
||||
|
||||
// Calculate period progress for incomplete periods
|
||||
let periodProgress = 100;
|
||||
if (['today', 'thisWeek', 'thisMonth'].includes(timeRange)) {
|
||||
periodProgress = calculatePeriodProgress(timeRange);
|
||||
}
|
||||
|
||||
// Previous period comparison data
|
||||
const prevPeriodData = await getPreviousPeriodData(connection, timeRange, startDate, endDate);
|
||||
|
||||
const response = {
|
||||
timeRange: dateRange,
|
||||
stats: {
|
||||
revenue: parseFloat(stats.revenue || 0),
|
||||
orderCount: parseInt(stats.orderCount || 0),
|
||||
itemCount: parseInt(stats.itemCount || 0),
|
||||
averageOrderValue: parseFloat(stats.averageOrderValue || 0),
|
||||
averageItemsPerOrder: parseFloat(stats.averageItemsPerOrder || 0),
|
||||
|
||||
// Order types
|
||||
orderTypes: {
|
||||
preOrders: {
|
||||
count: parseInt(stats.preOrderCount || 0),
|
||||
percentage: stats.orderCount > 0 ? (stats.preOrderCount / stats.orderCount) * 100 : 0
|
||||
},
|
||||
localPickup: {
|
||||
count: parseInt(stats.localPickupCount || 0),
|
||||
percentage: stats.orderCount > 0 ? (stats.localPickupCount / stats.orderCount) * 100 : 0
|
||||
},
|
||||
heldItems: {
|
||||
count: parseInt(stats.onHoldCount || 0),
|
||||
percentage: stats.orderCount > 0 ? (stats.onHoldCount / stats.orderCount) * 100 : 0
|
||||
}
|
||||
},
|
||||
|
||||
// Shipping
|
||||
shipping: {
|
||||
shippedCount: parseInt(stats.shippedCount || 0),
|
||||
locations: shippingStats.locations,
|
||||
methodStats: shippingStats.methods
|
||||
},
|
||||
|
||||
// Brands and categories
|
||||
brands: {
|
||||
total: brandsResult.length,
|
||||
list: brandsResult.slice(0, 50).map(brand => ({
|
||||
name: brand.brandName,
|
||||
count: parseInt(brand.itemCount),
|
||||
revenue: parseFloat(brand.revenue)
|
||||
}))
|
||||
},
|
||||
|
||||
categories: {
|
||||
total: categoriesResult.length,
|
||||
list: categoriesResult.slice(0, 50).map(category => ({
|
||||
name: category.categoryName,
|
||||
count: parseInt(category.itemCount),
|
||||
revenue: parseFloat(category.revenue)
|
||||
}))
|
||||
},
|
||||
|
||||
// Refunds and cancellations
|
||||
refunds: {
|
||||
total: parseFloat(refundStats[0]?.refundTotal || 0),
|
||||
count: parseInt(refundStats[0]?.refundCount || 0)
|
||||
},
|
||||
|
||||
canceledOrders: {
|
||||
total: parseFloat(stats.cancelledTotal || 0),
|
||||
count: parseInt(stats.cancelledCount || 0)
|
||||
},
|
||||
|
||||
// Best day
|
||||
bestRevenueDay: bestDayResult.length > 0 ? {
|
||||
amount: parseFloat(bestDayResult[0].revenue),
|
||||
displayDate: bestDayResult[0].date,
|
||||
orders: parseInt(bestDayResult[0].orders)
|
||||
} : null,
|
||||
|
||||
// Peak hour (for single days)
|
||||
peakOrderHour: peakHour,
|
||||
|
||||
// Order value range
|
||||
orderValueRange: orderRangeResult.length > 0 ? {
|
||||
smallest: parseFloat(orderRangeResult[0].smallest || 0),
|
||||
largest: parseFloat(orderRangeResult[0].largest || 0)
|
||||
} : { smallest: 0, largest: 0 },
|
||||
|
||||
// Period progress and projections
|
||||
periodProgress,
|
||||
projectedRevenue: periodProgress < 100 ? (stats.revenue / (periodProgress / 100)) : stats.revenue,
|
||||
|
||||
// Previous period comparison
|
||||
prevPeriodRevenue: prevPeriodData.revenue,
|
||||
prevPeriodOrders: prevPeriodData.orderCount,
|
||||
prevPeriodAOV: prevPeriodData.averageOrderValue
|
||||
}
|
||||
};
|
||||
|
||||
return { response, release };
|
||||
};
|
||||
|
||||
// Race between the main operation and timeout
|
||||
let result;
|
||||
try {
|
||||
result = await Promise.race([mainOperation(), timeoutPromise]);
|
||||
} catch (error) {
|
||||
// If it's a timeout, we don't have a release function to call
|
||||
if (error.message.includes('timeout')) {
|
||||
console.log(`[STATS] Request timed out in ${Date.now() - startTime}ms`);
|
||||
throw error;
|
||||
}
|
||||
// For other errors, re-throw
|
||||
throw error;
|
||||
}
|
||||
|
||||
const { response, release } = result;
|
||||
|
||||
// Release connection back to pool
|
||||
if (release) release();
|
||||
|
||||
console.log(`[STATS] Request completed in ${Date.now() - startTime}ms`);
|
||||
res.json(response);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error in /stats:', error);
|
||||
console.log(`[STATS] Request failed in ${Date.now() - startTime}ms`);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
// Daily details endpoint - replaces /api/klaviyo/events/stats/details
|
||||
router.get('/stats/details', async (req, res) => {
|
||||
let release;
|
||||
try {
|
||||
const { timeRange, startDate, endDate, metric, daily } = req.query;
|
||||
const { connection, release: releaseConn } = await getDbConnection();
|
||||
release = releaseConn;
|
||||
|
||||
const { whereClause, params } = getTimeRangeConditions(timeRange, startDate, endDate);
|
||||
|
||||
// Daily breakdown query
|
||||
const dailyQuery = `
|
||||
SELECT
|
||||
DATE(date_placed) as date,
|
||||
COUNT(*) as orders,
|
||||
SUM(summary_total) as revenue,
|
||||
AVG(summary_total) as averageOrderValue,
|
||||
SUM(stats_prod_pieces) as itemCount
|
||||
FROM _order
|
||||
WHERE order_status > 15 AND ${whereClause}
|
||||
GROUP BY DATE(date_placed)
|
||||
ORDER BY DATE(date_placed)
|
||||
`;
|
||||
|
||||
const [dailyResults] = await connection.execute(dailyQuery, params);
|
||||
|
||||
// Get previous period data using the same logic as main stats endpoint
|
||||
let prevWhereClause, prevParams;
|
||||
|
||||
if (timeRange && timeRange !== 'custom') {
|
||||
const prevTimeRange = getPreviousTimeRange(timeRange);
|
||||
const result = getTimeRangeConditions(prevTimeRange);
|
||||
prevWhereClause = result.whereClause;
|
||||
prevParams = result.params;
|
||||
} else {
|
||||
// Custom date range - go back by the same duration
|
||||
const start = new Date(startDate);
|
||||
const end = new Date(endDate);
|
||||
const duration = end.getTime() - start.getTime();
|
||||
|
||||
const prevEnd = new Date(start.getTime() - 1);
|
||||
const prevStart = new Date(prevEnd.getTime() - duration);
|
||||
|
||||
prevWhereClause = 'date_placed >= ? AND date_placed <= ?';
|
||||
prevParams = [prevStart.toISOString(), prevEnd.toISOString()];
|
||||
}
|
||||
|
||||
// Get previous period daily data
|
||||
const prevQuery = `
|
||||
SELECT
|
||||
DATE(date_placed) as date,
|
||||
COUNT(*) as prevOrders,
|
||||
SUM(summary_total) as prevRevenue,
|
||||
AVG(summary_total) as prevAvgOrderValue
|
||||
FROM _order
|
||||
WHERE order_status > 15 AND ${prevWhereClause}
|
||||
GROUP BY DATE(date_placed)
|
||||
`;
|
||||
|
||||
const [prevResults] = await connection.execute(prevQuery, prevParams);
|
||||
|
||||
// Create a map for quick lookup of previous period data
|
||||
const prevMap = new Map();
|
||||
prevResults.forEach(prev => {
|
||||
const key = new Date(prev.date).toISOString().split('T')[0];
|
||||
prevMap.set(key, prev);
|
||||
});
|
||||
|
||||
// For period-to-period comparison, we need to map days by relative position
|
||||
// since dates won't match exactly (e.g., current week vs previous week)
|
||||
const dailyArray = dailyResults.map(day => ({
|
||||
timestamp: day.date,
|
||||
date: day.date,
|
||||
orders: parseInt(day.orders),
|
||||
revenue: parseFloat(day.revenue),
|
||||
averageOrderValue: parseFloat(day.averageOrderValue || 0),
|
||||
itemCount: parseInt(day.itemCount)
|
||||
}));
|
||||
|
||||
const prevArray = prevResults.map(day => ({
|
||||
orders: parseInt(day.prevOrders),
|
||||
revenue: parseFloat(day.prevRevenue),
|
||||
averageOrderValue: parseFloat(day.prevAvgOrderValue || 0)
|
||||
}));
|
||||
|
||||
// Combine current and previous period data by matching relative positions
|
||||
const statsWithComparison = dailyArray.map((day, index) => {
|
||||
const prev = prevArray[index] || { orders: 0, revenue: 0, averageOrderValue: 0 };
|
||||
|
||||
return {
|
||||
...day,
|
||||
prevOrders: prev.orders,
|
||||
prevRevenue: prev.revenue,
|
||||
prevAvgOrderValue: prev.averageOrderValue
|
||||
};
|
||||
});
|
||||
|
||||
res.json({ stats: statsWithComparison });
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error in /stats/details:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
} finally {
|
||||
// Release connection back to pool
|
||||
if (release) release();
|
||||
}
|
||||
});
|
||||
|
||||
// Products endpoint - replaces /api/klaviyo/events/products
|
||||
router.get('/products', async (req, res) => {
|
||||
let release;
|
||||
try {
|
||||
const { timeRange, startDate, endDate } = req.query;
|
||||
const { connection, release: releaseConn } = await getDbConnection();
|
||||
release = releaseConn;
|
||||
|
||||
const { whereClause, params } = getTimeRangeConditions(timeRange, startDate, endDate);
|
||||
|
||||
const productsQuery = `
|
||||
SELECT
|
||||
p.pid,
|
||||
p.description as name,
|
||||
SUM(oi.qty_ordered) as totalQuantity,
|
||||
SUM(oi.qty_ordered * oi.prod_price) as totalRevenue,
|
||||
COUNT(DISTINCT oi.order_id) as orderCount,
|
||||
(SELECT pi.iid FROM product_images pi WHERE pi.pid = p.pid AND pi.order = 255 LIMIT 1) as primary_iid
|
||||
FROM order_items oi
|
||||
JOIN _order o ON oi.order_id = o.order_id
|
||||
JOIN products p ON oi.prod_pid = p.pid
|
||||
WHERE o.order_status > 15 AND ${whereClause.replace('date_placed', 'o.date_placed')}
|
||||
GROUP BY p.pid, p.description
|
||||
ORDER BY totalRevenue DESC
|
||||
LIMIT 500
|
||||
`;
|
||||
|
||||
const [productsResult] = await connection.execute(productsQuery, params);
|
||||
|
||||
// Add image URLs to each product
|
||||
const productsWithImages = productsResult.map(product => {
|
||||
const imageUrls = getImageUrls(product.pid, product.primary_iid || 1);
|
||||
return {
|
||||
id: product.pid,
|
||||
name: product.name,
|
||||
totalQuantity: parseInt(product.totalQuantity),
|
||||
totalRevenue: parseFloat(product.totalRevenue),
|
||||
orderCount: parseInt(product.orderCount),
|
||||
...imageUrls
|
||||
};
|
||||
});
|
||||
|
||||
res.json({
|
||||
stats: {
|
||||
products: {
|
||||
total: productsWithImages.length,
|
||||
list: productsWithImages
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error in /products:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
} finally {
|
||||
// Release connection back to pool
|
||||
if (release) release();
|
||||
}
|
||||
});
|
||||
|
||||
// Projection endpoint - replaces /api/klaviyo/events/projection
|
||||
router.get('/projection', async (req, res) => {
|
||||
let release;
|
||||
try {
|
||||
const { timeRange, startDate, endDate } = req.query;
|
||||
|
||||
// Only provide projections for incomplete periods
|
||||
if (!['today', 'thisWeek', 'thisMonth'].includes(timeRange)) {
|
||||
return res.json({ projectedRevenue: 0, confidence: 0 });
|
||||
}
|
||||
|
||||
const { connection, release: releaseConn } = await getDbConnection();
|
||||
release = releaseConn;
|
||||
|
||||
// Get current period data
|
||||
const { whereClause, params } = getTimeRangeConditions(timeRange, startDate, endDate);
|
||||
|
||||
const currentQuery = `
|
||||
SELECT
|
||||
SUM(summary_total) as currentRevenue,
|
||||
COUNT(*) as currentOrders
|
||||
FROM _order
|
||||
WHERE order_status > 15 AND ${whereClause}
|
||||
`;
|
||||
|
||||
const [currentResult] = await connection.execute(currentQuery, params);
|
||||
const current = currentResult[0];
|
||||
|
||||
// Get historical data for the same period type
|
||||
const historicalQuery = await getHistoricalProjectionData(connection, timeRange);
|
||||
|
||||
// Calculate projection based on current progress and historical patterns
|
||||
const periodProgress = calculatePeriodProgress(timeRange);
|
||||
const projection = calculateSmartProjection(
|
||||
parseFloat(current.currentRevenue || 0),
|
||||
parseInt(current.currentOrders || 0),
|
||||
periodProgress,
|
||||
historicalQuery
|
||||
);
|
||||
|
||||
res.json(projection);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error in /projection:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
} finally {
|
||||
// Release connection back to pool
|
||||
if (release) release();
|
||||
}
|
||||
});
|
||||
|
||||
// Debug endpoint to check connection pool status
|
||||
router.get('/debug/pool', (req, res) => {
|
||||
res.json(getPoolStatus());
|
||||
});
|
||||
|
||||
// Health check endpoint
|
||||
router.get('/health', async (req, res) => {
|
||||
try {
|
||||
const { connection, release } = await getDbConnection();
|
||||
|
||||
// Simple query to test connection
|
||||
const [result] = await connection.execute('SELECT 1 as test');
|
||||
release();
|
||||
|
||||
res.json({
|
||||
status: 'healthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
pool: getPoolStatus(),
|
||||
dbTest: result[0]
|
||||
});
|
||||
} catch (error) {
|
||||
res.status(500).json({
|
||||
status: 'unhealthy',
|
||||
error: error.message,
|
||||
timestamp: new Date().toISOString(),
|
||||
pool: getPoolStatus()
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Helper functions
|
||||
function processShippingData(shippingResult, totalShipped) {
|
||||
const countries = {};
|
||||
const states = {};
|
||||
const methods = {};
|
||||
|
||||
shippingResult.forEach(row => {
|
||||
// Countries
|
||||
if (row.ship_country) {
|
||||
countries[row.ship_country] = (countries[row.ship_country] || 0) + row.count;
|
||||
}
|
||||
|
||||
// States
|
||||
if (row.ship_state) {
|
||||
states[row.ship_state] = (states[row.ship_state] || 0) + row.count;
|
||||
}
|
||||
|
||||
// Methods
|
||||
if (row.ship_method_selected) {
|
||||
methods[row.ship_method_selected] = (methods[row.ship_method_selected] || 0) + row.count;
|
||||
}
|
||||
});
|
||||
|
||||
return {
|
||||
locations: {
|
||||
total: totalShipped,
|
||||
byCountry: Object.entries(countries)
|
||||
.map(([country, count]) => ({
|
||||
country,
|
||||
count,
|
||||
percentage: (count / totalShipped) * 100
|
||||
}))
|
||||
.sort((a, b) => b.count - a.count),
|
||||
byState: Object.entries(states)
|
||||
.map(([state, count]) => ({
|
||||
state,
|
||||
count,
|
||||
percentage: (count / totalShipped) * 100
|
||||
}))
|
||||
.sort((a, b) => b.count - a.count)
|
||||
},
|
||||
methods: Object.entries(methods)
|
||||
.map(([name, value]) => ({ name, value }))
|
||||
.sort((a, b) => b.value - a.value)
|
||||
};
|
||||
}
|
||||
|
||||
function calculatePeriodProgress(timeRange) {
|
||||
const now = new Date();
|
||||
const easternTime = new Date(now.getTime() - (5 * 60 * 60 * 1000)); // UTC-5
|
||||
|
||||
switch (timeRange) {
|
||||
case 'today': {
|
||||
const { start } = getBusinessDayBounds('today');
|
||||
const businessStart = new Date(start);
|
||||
const businessEnd = new Date(businessStart);
|
||||
businessEnd.setDate(businessEnd.getDate() + 1);
|
||||
businessEnd.setHours(0, 59, 59, 999); // 12:59 AM next day
|
||||
|
||||
const elapsed = easternTime.getTime() - businessStart.getTime();
|
||||
const total = businessEnd.getTime() - businessStart.getTime();
|
||||
return Math.min(100, Math.max(0, (elapsed / total) * 100));
|
||||
}
|
||||
case 'thisWeek': {
|
||||
const startOfWeek = new Date(easternTime);
|
||||
startOfWeek.setDate(easternTime.getDate() - easternTime.getDay()); // Sunday
|
||||
startOfWeek.setHours(1, 0, 0, 0); // 1 AM business day start
|
||||
|
||||
const endOfWeek = new Date(startOfWeek);
|
||||
endOfWeek.setDate(endOfWeek.getDate() + 7);
|
||||
|
||||
const elapsed = easternTime.getTime() - startOfWeek.getTime();
|
||||
const total = endOfWeek.getTime() - startOfWeek.getTime();
|
||||
return Math.min(100, Math.max(0, (elapsed / total) * 100));
|
||||
}
|
||||
case 'thisMonth': {
|
||||
const startOfMonth = new Date(easternTime.getFullYear(), easternTime.getMonth(), 1, 1, 0, 0, 0);
|
||||
const endOfMonth = new Date(easternTime.getFullYear(), easternTime.getMonth() + 1, 1, 0, 59, 59, 999);
|
||||
|
||||
const elapsed = easternTime.getTime() - startOfMonth.getTime();
|
||||
const total = endOfMonth.getTime() - startOfMonth.getTime();
|
||||
return Math.min(100, Math.max(0, (elapsed / total) * 100));
|
||||
}
|
||||
default:
|
||||
return 100;
|
||||
}
|
||||
}
|
||||
|
||||
async function getPreviousPeriodData(connection, timeRange, startDate, endDate) {
|
||||
// Calculate previous period dates
|
||||
let prevWhereClause, prevParams;
|
||||
|
||||
if (timeRange && timeRange !== 'custom') {
|
||||
const prevTimeRange = getPreviousTimeRange(timeRange);
|
||||
const result = getTimeRangeConditions(prevTimeRange);
|
||||
prevWhereClause = result.whereClause;
|
||||
prevParams = result.params;
|
||||
} else {
|
||||
// Custom date range - go back by the same duration
|
||||
const start = new Date(startDate);
|
||||
const end = new Date(endDate);
|
||||
const duration = end.getTime() - start.getTime();
|
||||
|
||||
const prevEnd = new Date(start.getTime() - 1);
|
||||
const prevStart = new Date(prevEnd.getTime() - duration);
|
||||
|
||||
prevWhereClause = 'date_placed >= ? AND date_placed <= ?';
|
||||
prevParams = [prevStart.toISOString(), prevEnd.toISOString()];
|
||||
}
|
||||
|
||||
const prevQuery = `
|
||||
SELECT
|
||||
COUNT(*) as orderCount,
|
||||
SUM(summary_total) as revenue,
|
||||
AVG(summary_total) as averageOrderValue
|
||||
FROM _order
|
||||
WHERE order_status > 15 AND ${prevWhereClause}
|
||||
`;
|
||||
|
||||
const [prevResult] = await connection.execute(prevQuery, prevParams);
|
||||
const prev = prevResult[0] || { orderCount: 0, revenue: 0, averageOrderValue: 0 };
|
||||
|
||||
return {
|
||||
orderCount: parseInt(prev.orderCount || 0),
|
||||
revenue: parseFloat(prev.revenue || 0),
|
||||
averageOrderValue: parseFloat(prev.averageOrderValue || 0)
|
||||
};
|
||||
}
|
||||
|
||||
function getPreviousTimeRange(timeRange) {
|
||||
const map = {
|
||||
today: 'yesterday',
|
||||
thisWeek: 'lastWeek',
|
||||
thisMonth: 'lastMonth',
|
||||
last7days: 'previous7days',
|
||||
last30days: 'previous30days',
|
||||
last90days: 'previous90days',
|
||||
yesterday: 'twoDaysAgo'
|
||||
};
|
||||
return map[timeRange] || timeRange;
|
||||
}
|
||||
|
||||
async function getHistoricalProjectionData(connection, timeRange) {
|
||||
// Get historical data for projection calculations
|
||||
// This is a simplified version - you could make this more sophisticated
|
||||
const historicalQuery = `
|
||||
SELECT
|
||||
SUM(summary_total) as revenue,
|
||||
COUNT(*) as orders
|
||||
FROM _order
|
||||
WHERE order_status > 15
|
||||
AND date_placed >= DATE_SUB(NOW(), INTERVAL 30 DAY)
|
||||
AND date_placed < DATE_SUB(NOW(), INTERVAL 1 DAY)
|
||||
`;
|
||||
|
||||
const [result] = await connection.execute(historicalQuery);
|
||||
return result;
|
||||
}
|
||||
|
||||
function calculateSmartProjection(currentRevenue, currentOrders, periodProgress, historicalData) {
|
||||
if (periodProgress >= 100) {
|
||||
return { projectedRevenue: currentRevenue, projectedOrders: currentOrders, confidence: 1.0 };
|
||||
}
|
||||
|
||||
// Simple linear projection with confidence based on how much of the period has elapsed
|
||||
const projectedRevenue = currentRevenue / (periodProgress / 100);
|
||||
const projectedOrders = Math.round(currentOrders / (periodProgress / 100));
|
||||
|
||||
// Confidence increases with more data (higher period progress)
|
||||
const confidence = Math.min(0.95, Math.max(0.1, periodProgress / 100));
|
||||
|
||||
return {
|
||||
projectedRevenue,
|
||||
projectedOrders,
|
||||
confidence
|
||||
};
|
||||
}
|
||||
|
||||
// Health check endpoint
|
||||
router.get('/health', async (req, res) => {
|
||||
try {
|
||||
const poolStatus = getPoolStatus();
|
||||
|
||||
// Test database connectivity
|
||||
const { connection, release } = await getDbConnection();
|
||||
await connection.execute('SELECT 1 as test');
|
||||
release();
|
||||
|
||||
res.json({
|
||||
status: 'healthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
pool: poolStatus,
|
||||
database: 'connected'
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Health check failed:', error);
|
||||
res.status(500).json({
|
||||
status: 'unhealthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
error: error.message,
|
||||
pool: getPoolStatus()
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Debug endpoint for pool status
|
||||
router.get('/debug/pool', (req, res) => {
|
||||
res.json({
|
||||
timestamp: new Date().toISOString(),
|
||||
pool: getPoolStatus()
|
||||
});
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
57
inventory-server/dashboard/acot-server/routes/test.js
Normal file
57
inventory-server/dashboard/acot-server/routes/test.js
Normal file
@@ -0,0 +1,57 @@
|
||||
const express = require('express');
|
||||
const router = express.Router();
|
||||
const { getDbConnection, getCachedQuery } = require('../db/connection');
|
||||
|
||||
// Test endpoint to count orders
|
||||
router.get('/order-count', async (req, res) => {
|
||||
try {
|
||||
const { connection } = await getDbConnection();
|
||||
|
||||
// Simple query to count orders from _order table
|
||||
const queryFn = async () => {
|
||||
const [rows] = await connection.execute('SELECT COUNT(*) as count FROM _order');
|
||||
return rows[0].count;
|
||||
};
|
||||
|
||||
const cacheKey = 'order-count';
|
||||
const count = await getCachedQuery(cacheKey, 'default', queryFn);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: {
|
||||
orderCount: count,
|
||||
timestamp: new Date().toISOString()
|
||||
}
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error fetching order count:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
error: error.message
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Test connection endpoint
|
||||
router.get('/test-connection', async (req, res) => {
|
||||
try {
|
||||
const { connection } = await getDbConnection();
|
||||
|
||||
// Test the connection with a simple query
|
||||
const [rows] = await connection.execute('SELECT 1 as test');
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
message: 'Database connection successful',
|
||||
data: rows[0]
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error testing connection:', error);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
error: error.message
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
98
inventory-server/dashboard/acot-server/server.js
Normal file
98
inventory-server/dashboard/acot-server/server.js
Normal file
@@ -0,0 +1,98 @@
|
||||
require('dotenv').config();
|
||||
const express = require('express');
|
||||
const cors = require('cors');
|
||||
const morgan = require('morgan');
|
||||
const compression = require('compression');
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
const { closeAllConnections } = require('./db/connection');
|
||||
|
||||
const app = express();
|
||||
const PORT = process.env.ACOT_PORT || 3012;
|
||||
|
||||
// Create logs directory if it doesn't exist
|
||||
const logDir = path.join(__dirname, 'logs/app');
|
||||
if (!fs.existsSync(logDir)) {
|
||||
fs.mkdirSync(logDir, { recursive: true });
|
||||
}
|
||||
|
||||
// Create a write stream for access logs
|
||||
const accessLogStream = fs.createWriteStream(
|
||||
path.join(logDir, 'access.log'),
|
||||
{ flags: 'a' }
|
||||
);
|
||||
|
||||
// Middleware
|
||||
app.use(compression());
|
||||
app.use(cors());
|
||||
app.use(express.json());
|
||||
app.use(express.urlencoded({ extended: true }));
|
||||
|
||||
// Logging middleware
|
||||
if (process.env.NODE_ENV === 'production') {
|
||||
app.use(morgan('combined', { stream: accessLogStream }));
|
||||
} else {
|
||||
app.use(morgan('dev'));
|
||||
}
|
||||
|
||||
// Health check endpoint
|
||||
app.get('/health', (req, res) => {
|
||||
res.json({
|
||||
status: 'healthy',
|
||||
service: 'acot-server',
|
||||
timestamp: new Date().toISOString(),
|
||||
uptime: process.uptime()
|
||||
});
|
||||
});
|
||||
|
||||
// Routes
|
||||
app.use('/api/acot/test', require('./routes/test'));
|
||||
app.use('/api/acot/events', require('./routes/events'));
|
||||
|
||||
// Error handling middleware
|
||||
app.use((err, req, res, next) => {
|
||||
console.error('Unhandled error:', err);
|
||||
res.status(500).json({
|
||||
success: false,
|
||||
error: process.env.NODE_ENV === 'production'
|
||||
? 'Internal server error'
|
||||
: err.message
|
||||
});
|
||||
});
|
||||
|
||||
// 404 handler
|
||||
app.use((req, res) => {
|
||||
res.status(404).json({
|
||||
success: false,
|
||||
error: 'Route not found'
|
||||
});
|
||||
});
|
||||
|
||||
// Start server
|
||||
const server = app.listen(PORT, () => {
|
||||
console.log(`ACOT Server running on port ${PORT}`);
|
||||
console.log(`Environment: ${process.env.NODE_ENV}`);
|
||||
});
|
||||
|
||||
// Graceful shutdown
|
||||
const gracefulShutdown = async () => {
|
||||
console.log('SIGTERM signal received: closing HTTP server');
|
||||
server.close(async () => {
|
||||
console.log('HTTP server closed');
|
||||
|
||||
// Close database connections
|
||||
try {
|
||||
await closeAllConnections();
|
||||
console.log('Database connections closed');
|
||||
} catch (error) {
|
||||
console.error('Error closing database connections:', error);
|
||||
}
|
||||
|
||||
process.exit(0);
|
||||
});
|
||||
};
|
||||
|
||||
process.on('SIGTERM', gracefulShutdown);
|
||||
process.on('SIGINT', gracefulShutdown);
|
||||
|
||||
module.exports = app;
|
||||
259
inventory-server/dashboard/acot-server/utils/timeUtils.js
Normal file
259
inventory-server/dashboard/acot-server/utils/timeUtils.js
Normal file
@@ -0,0 +1,259 @@
|
||||
// Time utilities for handling business day logic and time ranges
|
||||
// Business day is 1am-12:59am Eastern time (UTC-5)
|
||||
|
||||
const getBusinessDayBounds = (timeRange) => {
|
||||
const now = new Date();
|
||||
const easternTime = new Date(now.getTime() - (5 * 60 * 60 * 1000)); // UTC-5
|
||||
|
||||
switch (timeRange) {
|
||||
case 'today': {
|
||||
const start = new Date(easternTime);
|
||||
start.setHours(1, 0, 0, 0); // 1 AM start of business day
|
||||
|
||||
const end = new Date(start);
|
||||
end.setDate(end.getDate() + 1);
|
||||
end.setHours(0, 59, 59, 999); // 12:59 AM next day
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'yesterday': {
|
||||
const start = new Date(easternTime);
|
||||
start.setDate(start.getDate() - 1);
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
const end = new Date(start);
|
||||
end.setDate(end.getDate() + 1);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'thisWeek': {
|
||||
const start = new Date(easternTime);
|
||||
start.setDate(easternTime.getDate() - easternTime.getDay()); // Sunday
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
const end = new Date(easternTime);
|
||||
end.setDate(end.getDate() + 1);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'lastWeek': {
|
||||
const start = new Date(easternTime);
|
||||
start.setDate(easternTime.getDate() - easternTime.getDay() - 7); // Previous Sunday
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
const end = new Date(start);
|
||||
end.setDate(end.getDate() + 7);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'thisMonth': {
|
||||
const start = new Date(easternTime.getFullYear(), easternTime.getMonth(), 1, 1, 0, 0, 0);
|
||||
const end = new Date(easternTime);
|
||||
end.setDate(end.getDate() + 1);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'lastMonth': {
|
||||
const start = new Date(easternTime.getFullYear(), easternTime.getMonth() - 1, 1, 1, 0, 0, 0);
|
||||
const end = new Date(easternTime.getFullYear(), easternTime.getMonth(), 1, 0, 59, 59, 999);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'last7days': {
|
||||
const end = new Date(easternTime);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
const start = new Date(end);
|
||||
start.setDate(start.getDate() - 7);
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'last30days': {
|
||||
const end = new Date(easternTime);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
const start = new Date(end);
|
||||
start.setDate(start.getDate() - 30);
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'last90days': {
|
||||
const end = new Date(easternTime);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
const start = new Date(end);
|
||||
start.setDate(start.getDate() - 90);
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'previous7days': {
|
||||
const end = new Date(easternTime);
|
||||
end.setDate(end.getDate() - 1);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
const start = new Date(end);
|
||||
start.setDate(start.getDate() - 6);
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'previous30days': {
|
||||
const end = new Date(easternTime);
|
||||
end.setDate(end.getDate() - 1);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
const start = new Date(end);
|
||||
start.setDate(start.getDate() - 29);
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'previous90days': {
|
||||
const end = new Date(easternTime);
|
||||
end.setDate(end.getDate() - 1);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
const start = new Date(end);
|
||||
start.setDate(start.getDate() - 89);
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
case 'twoDaysAgo': {
|
||||
const start = new Date(easternTime);
|
||||
start.setDate(start.getDate() - 2);
|
||||
start.setHours(1, 0, 0, 0);
|
||||
|
||||
const end = new Date(start);
|
||||
end.setDate(end.getDate() + 1);
|
||||
end.setHours(0, 59, 59, 999);
|
||||
|
||||
return { start, end };
|
||||
}
|
||||
|
||||
default:
|
||||
throw new Error(`Unknown time range: ${timeRange}`);
|
||||
}
|
||||
};
|
||||
|
||||
const getTimeRangeConditions = (timeRange, startDate, endDate) => {
|
||||
if (timeRange === 'custom' && startDate && endDate) {
|
||||
// Custom date range
|
||||
const start = new Date(startDate);
|
||||
const end = new Date(endDate);
|
||||
|
||||
// Convert to UTC-5 (Eastern time)
|
||||
const startUTC5 = new Date(start.getTime() - (5 * 60 * 60 * 1000));
|
||||
const endUTC5 = new Date(end.getTime() - (5 * 60 * 60 * 1000));
|
||||
|
||||
return {
|
||||
whereClause: 'date_placed >= ? AND date_placed <= ?',
|
||||
params: [
|
||||
startUTC5.toISOString().slice(0, 19).replace('T', ' '),
|
||||
endUTC5.toISOString().slice(0, 19).replace('T', ' ')
|
||||
],
|
||||
dateRange: {
|
||||
start: startDate,
|
||||
end: endDate,
|
||||
label: `${formatBusinessDate(start)} - ${formatBusinessDate(end)}`
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
if (!timeRange) {
|
||||
timeRange = 'today';
|
||||
}
|
||||
|
||||
const { start, end } = getBusinessDayBounds(timeRange);
|
||||
|
||||
// Convert to MySQL datetime format (UTC-5)
|
||||
const startStr = start.toISOString().slice(0, 19).replace('T', ' ');
|
||||
const endStr = end.toISOString().slice(0, 19).replace('T', ' ');
|
||||
|
||||
return {
|
||||
whereClause: 'date_placed >= ? AND date_placed <= ?',
|
||||
params: [startStr, endStr],
|
||||
dateRange: {
|
||||
start: start.toISOString(),
|
||||
end: end.toISOString(),
|
||||
label: getTimeRangeLabel(timeRange)
|
||||
}
|
||||
};
|
||||
};
|
||||
|
||||
const formatBusinessDate = (date) => {
|
||||
return date.toLocaleDateString('en-US', {
|
||||
month: 'short',
|
||||
day: 'numeric',
|
||||
year: 'numeric'
|
||||
});
|
||||
};
|
||||
|
||||
const getTimeRangeLabel = (timeRange) => {
|
||||
const labels = {
|
||||
today: 'Today',
|
||||
yesterday: 'Yesterday',
|
||||
thisWeek: 'This Week',
|
||||
lastWeek: 'Last Week',
|
||||
thisMonth: 'This Month',
|
||||
lastMonth: 'Last Month',
|
||||
last7days: 'Last 7 Days',
|
||||
last30days: 'Last 30 Days',
|
||||
last90days: 'Last 90 Days',
|
||||
previous7days: 'Previous 7 Days',
|
||||
previous30days: 'Previous 30 Days',
|
||||
previous90days: 'Previous 90 Days',
|
||||
twoDaysAgo: 'Two Days Ago'
|
||||
};
|
||||
|
||||
return labels[timeRange] || timeRange;
|
||||
};
|
||||
|
||||
// Helper to convert MySQL datetime to JavaScript Date
|
||||
const parseBusinessDate = (mysqlDatetime) => {
|
||||
if (!mysqlDatetime || mysqlDatetime === '0000-00-00 00:00:00') {
|
||||
return null;
|
||||
}
|
||||
|
||||
// MySQL datetime is stored in UTC-5, so we need to add 5 hours to get UTC
|
||||
const date = new Date(mysqlDatetime + ' UTC');
|
||||
date.setHours(date.getHours() + 5);
|
||||
return date;
|
||||
};
|
||||
|
||||
// Helper to format date for MySQL queries
|
||||
const formatMySQLDate = (date) => {
|
||||
if (!date) return null;
|
||||
|
||||
// Convert to UTC-5 for storage
|
||||
const utc5Date = new Date(date.getTime() - (5 * 60 * 60 * 1000));
|
||||
return utc5Date.toISOString().slice(0, 19).replace('T', ' ');
|
||||
};
|
||||
|
||||
module.exports = {
|
||||
getBusinessDayBounds,
|
||||
getTimeRangeConditions,
|
||||
formatBusinessDate,
|
||||
getTimeRangeLabel,
|
||||
parseBusinessDate,
|
||||
formatMySQLDate
|
||||
};
|
||||
Reference in New Issue
Block a user