Compare commits
15 Commits
add-compan
...
7aa494aaad
| Author | SHA1 | Date | |
|---|---|---|---|
| 7aa494aaad | |||
| 1e0be3f86e | |||
| a068a253cd | |||
| 087ec710f6 | |||
| 957c7b5eb1 | |||
| 8b8845b423 | |||
| e5c4f617c5 | |||
| 8e19e6cd74 | |||
| 749907bd30 | |||
| 108181c63d | |||
| 5dd779cb4a | |||
| 7b0e792d03 | |||
| 517bbe72f4 | |||
| 87d4b9e804 | |||
| 75da2c6772 |
342
docs/import-from-prod-data-mapping.md
Normal file
342
docs/import-from-prod-data-mapping.md
Normal file
@@ -0,0 +1,342 @@
|
||||
# MySQL to PostgreSQL Import Process Documentation
|
||||
|
||||
This document outlines the data import process from the production MySQL database to the local PostgreSQL database, focusing on column mappings, data transformations, and the overall import architecture.
|
||||
|
||||
## Table of Contents
|
||||
1. [Overview](#overview)
|
||||
2. [Import Architecture](#import-architecture)
|
||||
3. [Column Mappings](#column-mappings)
|
||||
- [Categories](#categories)
|
||||
- [Products](#products)
|
||||
- [Product Categories (Relationship)](#product-categories-relationship)
|
||||
- [Orders](#orders)
|
||||
- [Purchase Orders](#purchase-orders)
|
||||
- [Metadata Tables](#metadata-tables)
|
||||
4. [Special Calculations](#special-calculations)
|
||||
5. [Implementation Notes](#implementation-notes)
|
||||
|
||||
## Overview
|
||||
|
||||
The import process extracts data from a MySQL 5.7 production database and imports it into a PostgreSQL database. It can operate in two modes:
|
||||
|
||||
- **Full Import**: Imports all data regardless of last sync time
|
||||
- **Incremental Import**: Only imports data that has changed since the last import
|
||||
|
||||
The process handles four main data types:
|
||||
- Categories (product categorization hierarchy)
|
||||
- Products (inventory items)
|
||||
- Orders (sales records)
|
||||
- Purchase Orders (vendor orders)
|
||||
|
||||
## Import Architecture
|
||||
|
||||
The import process follows these steps:
|
||||
|
||||
1. **Establish Connection**: Creates a SSH tunnel to the production server and establishes database connections
|
||||
2. **Setup Import History**: Creates a record of the current import operation
|
||||
3. **Import Categories**: Processes product categories in hierarchical order
|
||||
4. **Import Products**: Processes products with their attributes and category relationships
|
||||
5. **Import Orders**: Processes customer orders with line items, taxes, and discounts
|
||||
6. **Import Purchase Orders**: Processes vendor purchase orders with line items
|
||||
7. **Record Results**: Updates the import history with results
|
||||
8. **Close Connections**: Cleans up connections and resources
|
||||
|
||||
Each import step uses temporary tables for processing and wraps operations in transactions to ensure data consistency.
|
||||
|
||||
## Column Mappings
|
||||
|
||||
### Categories
|
||||
| PostgreSQL Column | MySQL Source | Transformation |
|
||||
|-------------------|---------------------------------|----------------------------------------------|
|
||||
| cat_id | product_categories.cat_id | Direct mapping |
|
||||
| name | product_categories.name | Direct mapping |
|
||||
| type | product_categories.type | Direct mapping |
|
||||
| parent_id | product_categories.master_cat_id| NULL for top-level categories (types 10, 20) |
|
||||
| description | product_categories.combined_name| Direct mapping |
|
||||
| status | N/A | Hard-coded 'active' |
|
||||
| created_at | N/A | Current timestamp |
|
||||
| updated_at | N/A | Current timestamp |
|
||||
|
||||
**Notes:**
|
||||
- Categories are processed in hierarchical order by type: [10, 20, 11, 21, 12, 13]
|
||||
- Type 10/20 are top-level categories with no parent
|
||||
- Types 11/21/12/13 are child categories that reference parent categories
|
||||
|
||||
### Products
|
||||
| PostgreSQL Column | MySQL Source | Transformation |
|
||||
|----------------------|----------------------------------|---------------------------------------------------------------|
|
||||
| pid | products.pid | Direct mapping |
|
||||
| title | products.description | Direct mapping |
|
||||
| description | products.notes | Direct mapping |
|
||||
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
|
||||
| stock_quantity | shop_inventory.available_local | Capped at 5000, minimum 0 |
|
||||
| preorder_count | current_inventory.onpreorder | Default 0 |
|
||||
| notions_inv_count | product_notions_b2b.inventory | Default 0 |
|
||||
| price | product_current_prices.price_each| Default 0, filtered on active=1 |
|
||||
| regular_price | products.sellingprice | Default 0 |
|
||||
| cost_price | product_inventory | Weighted average: SUM(costeach * count) / SUM(count) when count > 0, or latest costeach |
|
||||
| vendor | suppliers.companyname | Via supplier_item_data.supplier_id |
|
||||
| vendor_reference | supplier_item_data | supplier_itemnumber or notions_itemnumber based on vendor |
|
||||
| notions_reference | supplier_item_data.notions_itemnumber | Direct mapping |
|
||||
| brand | product_categories.name | Linked via products.company |
|
||||
| line | product_categories.name | Linked via products.line |
|
||||
| subline | product_categories.name | Linked via products.subline |
|
||||
| artist | product_categories.name | Linked via products.artist |
|
||||
| categories | product_category_index | Comma-separated list of category IDs |
|
||||
| created_at | products.date_created | Validated date, NULL if invalid |
|
||||
| first_received | products.datein | Validated date, NULL if invalid |
|
||||
| landing_cost_price | NULL | Not set |
|
||||
| barcode | products.upc | Direct mapping |
|
||||
| harmonized_tariff_code| products.harmonized_tariff_code | Direct mapping |
|
||||
| updated_at | products.stamp | Validated date, NULL if invalid |
|
||||
| visible | shop_inventory | Calculated from show + buyable > 0 |
|
||||
| managing_stock | N/A | Hard-coded true |
|
||||
| replenishable | Multiple fields | Complex calculation based on reorder, dates, etc. |
|
||||
| permalink | N/A | Constructed URL with product ID |
|
||||
| moq | supplier_item_data | notions_qty_per_unit or supplier_qty_per_unit, minimum 1 |
|
||||
| uom | N/A | Hard-coded 1 |
|
||||
| rating | products.rating | Direct mapping |
|
||||
| reviews | products.rating_votes | Direct mapping |
|
||||
| weight | products.weight | Direct mapping |
|
||||
| length | products.length | Direct mapping |
|
||||
| width | products.width | Direct mapping |
|
||||
| height | products.height | Direct mapping |
|
||||
| country_of_origin | products.country_of_origin | Direct mapping |
|
||||
| location | products.location | Direct mapping |
|
||||
| total_sold | order_items | SUM(qty_ordered) for all order_items where prod_pid = pid |
|
||||
| baskets | mybasket | COUNT of records where mb.item = pid and qty > 0 |
|
||||
| notifies | product_notify | COUNT of records where pn.pid = pid |
|
||||
| date_last_sold | product_last_sold.date_sold | Validated date, NULL if invalid |
|
||||
| image | N/A | Constructed from pid and image URL pattern |
|
||||
| image_175 | N/A | Constructed from pid and image URL pattern |
|
||||
| image_full | N/A | Constructed from pid and image URL pattern |
|
||||
| options | NULL | Not set |
|
||||
| tags | NULL | Not set |
|
||||
|
||||
**Notes:**
|
||||
- Replenishable calculation:
|
||||
```javascript
|
||||
CASE
|
||||
WHEN p.reorder < 0 THEN 0
|
||||
WHEN (
|
||||
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
||||
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
) THEN 0
|
||||
ELSE 1
|
||||
END
|
||||
```
|
||||
|
||||
In business terms, a product is considered NOT replenishable only if:
|
||||
- It was manually flagged as not replenishable (negative reorder value)
|
||||
- OR it shows no activity across ALL metrics (no sales AND no receipts AND no refills in the past 5 years)
|
||||
- Image URLs are constructed using this pattern:
|
||||
```javascript
|
||||
const paddedPid = pid.toString().padStart(6, '0');
|
||||
const prefix = paddedPid.slice(0, 3);
|
||||
const basePath = `${imageUrlBase}${prefix}/${pid}`;
|
||||
return {
|
||||
image: `${basePath}-t-${iid}.jpg`,
|
||||
image_175: `${basePath}-175x175-${iid}.jpg`,
|
||||
image_full: `${basePath}-o-${iid}.jpg`
|
||||
};
|
||||
```
|
||||
|
||||
### Product Categories (Relationship)
|
||||
|
||||
| PostgreSQL Column | MySQL Source | Transformation |
|
||||
|-------------------|-----------------------------------|---------------------------------------------------------------|
|
||||
| pid | products.pid | Direct mapping |
|
||||
| cat_id | product_category_index.cat_id | Direct mapping, filtered by category types |
|
||||
|
||||
**Notes:**
|
||||
- Only categories of types 10, 20, 11, 21, 12, 13 are imported
|
||||
- Categories 16 and 17 are explicitly excluded
|
||||
|
||||
### Orders
|
||||
|
||||
| PostgreSQL Column | MySQL Source | Transformation |
|
||||
|-------------------|-----------------------------------|---------------------------------------------------------------|
|
||||
| order_number | order_items.order_id | Direct mapping |
|
||||
| pid | order_items.prod_pid | Direct mapping |
|
||||
| sku | order_items.prod_itemnumber | Fallback to 'NO-SKU' if empty |
|
||||
| date | _order.date_placed_onlydate | Via join to _order table |
|
||||
| price | order_items.prod_price | Direct mapping |
|
||||
| quantity | order_items.qty_ordered | Direct mapping |
|
||||
| discount | Multiple sources | Complex calculation (see notes) |
|
||||
| tax | order_tax_info_products.item_taxes_to_collect | Via latest order_tax_info record |
|
||||
| tax_included | N/A | Hard-coded false |
|
||||
| shipping | N/A | Hard-coded 0 |
|
||||
| customer | _order.order_cid | Direct mapping |
|
||||
| customer_name | users | CONCAT(users.firstname, ' ', users.lastname) |
|
||||
| status | _order.order_status | Direct mapping |
|
||||
| canceled | _order.date_cancelled | Boolean: true if date_cancelled is not '0000-00-00 00:00:00' |
|
||||
| costeach | order_costs | From latest record or fallback to price * 0.5 |
|
||||
|
||||
**Notes:**
|
||||
- Only orders with order_status >= 15 and with a valid date_placed are processed
|
||||
- For incremental imports, only orders modified since last sync are processed
|
||||
- Discount calculation combines three sources:
|
||||
1. Base discount: order_items.prod_price_reg - order_items.prod_price
|
||||
2. Promo discount: SUM of order_discount_items.amount
|
||||
3. Proportional order discount: Calculation based on order subtotal proportion
|
||||
```javascript
|
||||
(oi.base_discount +
|
||||
COALESCE(ot.promo_discount, 0) +
|
||||
CASE
|
||||
WHEN om.summary_discount > 0 AND om.summary_subtotal > 0 THEN
|
||||
ROUND((om.summary_discount * (oi.price * oi.quantity)) / NULLIF(om.summary_subtotal, 0), 2)
|
||||
ELSE 0
|
||||
END)::DECIMAL(10,2)
|
||||
```
|
||||
- Taxes are taken from the latest tax record for an order
|
||||
- Cost data is taken from the latest non-pending cost record
|
||||
|
||||
### Purchase Orders
|
||||
|
||||
| PostgreSQL Column | MySQL Source | Transformation |
|
||||
|-------------------|-----------------------------------|---------------------------------------------------------------|
|
||||
| po_id | po.po_id | Default 0 if NULL |
|
||||
| pid | po_products.pid | Direct mapping |
|
||||
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
|
||||
| name | products.description | Fallback to 'Unknown Product' |
|
||||
| cost_price | po_products.cost_each | Direct mapping |
|
||||
| po_cost_price | po_products.cost_each | Duplicate of cost_price |
|
||||
| vendor | suppliers.companyname | Fallback to 'Unknown Vendor' if empty |
|
||||
| date | po.date_ordered | Fallback to po.date_created if NULL |
|
||||
| expected_date | po.date_estin | Direct mapping |
|
||||
| status | po.status | Default 1 if NULL |
|
||||
| notes | po.short_note | Fallback to po.notes if NULL |
|
||||
| ordered | po_products.qty_each | Direct mapping |
|
||||
| received | N/A | Hard-coded 0 |
|
||||
| receiving_status | N/A | Hard-coded 1 |
|
||||
|
||||
**Notes:**
|
||||
- Only POs created within last 1 year (incremental) or 5 years (full) are processed
|
||||
- For incremental imports, only POs modified since last sync are processed
|
||||
|
||||
### Metadata Tables
|
||||
|
||||
#### import_history
|
||||
|
||||
| PostgreSQL Column | Source | Notes |
|
||||
|-------------------|-----------------------------------|---------------------------------------------------------------|
|
||||
| id | Auto-increment | Primary key |
|
||||
| table_name | Code | 'all_tables' for overall import |
|
||||
| start_time | NOW() | Import start time |
|
||||
| end_time | NOW() | Import completion time |
|
||||
| duration_seconds | Calculation | Elapsed seconds |
|
||||
| is_incremental | INCREMENTAL_UPDATE | Flag from config |
|
||||
| records_added | Calculation | Sum from all imports |
|
||||
| records_updated | Calculation | Sum from all imports |
|
||||
| status | Code | 'running', 'completed', 'failed', or 'cancelled' |
|
||||
| error_message | Exception | Error message if failed |
|
||||
| additional_info | JSON | Configuration and results |
|
||||
|
||||
#### sync_status
|
||||
|
||||
| PostgreSQL Column | Source | Notes |
|
||||
|----------------------|--------------------------------|---------------------------------------------------------------|
|
||||
| table_name | Code | Name of imported table |
|
||||
| last_sync_timestamp | NOW() | Timestamp of successful sync |
|
||||
| last_sync_id | NULL | Not used currently |
|
||||
|
||||
## Special Calculations
|
||||
|
||||
### Date Validation
|
||||
|
||||
MySQL dates are validated before insertion into PostgreSQL:
|
||||
|
||||
```javascript
|
||||
function validateDate(mysqlDate) {
|
||||
if (!mysqlDate || mysqlDate === '0000-00-00' || mysqlDate === '0000-00-00 00:00:00') {
|
||||
return null;
|
||||
}
|
||||
// Check if the date is valid
|
||||
const date = new Date(mysqlDate);
|
||||
return isNaN(date.getTime()) ? null : mysqlDate;
|
||||
}
|
||||
```
|
||||
|
||||
### Retry Mechanism
|
||||
|
||||
Operations that might fail temporarily are retried with exponential backoff:
|
||||
|
||||
```javascript
|
||||
async function withRetry(operation, errorMessage) {
|
||||
let lastError;
|
||||
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
|
||||
try {
|
||||
return await operation();
|
||||
} catch (error) {
|
||||
lastError = error;
|
||||
console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
|
||||
if (attempt < MAX_RETRIES) {
|
||||
const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
|
||||
await new Promise(resolve => setTimeout(resolve, backoffTime));
|
||||
}
|
||||
}
|
||||
}
|
||||
throw lastError;
|
||||
}
|
||||
```
|
||||
|
||||
### Progress Tracking
|
||||
|
||||
Progress is tracked with estimated time remaining:
|
||||
|
||||
```javascript
|
||||
function estimateRemaining(startTime, current, total) {
|
||||
if (current === 0) return "Calculating...";
|
||||
const elapsedSeconds = (Date.now() - startTime) / 1000;
|
||||
const itemsPerSecond = current / elapsedSeconds;
|
||||
const remainingItems = total - current;
|
||||
const remainingSeconds = remainingItems / itemsPerSecond;
|
||||
return formatElapsedTime(remainingSeconds);
|
||||
}
|
||||
```
|
||||
|
||||
## Implementation Notes
|
||||
|
||||
### Transaction Management
|
||||
|
||||
All imports use transactions to ensure data consistency:
|
||||
|
||||
- **Categories**: Uses savepoints for each category type
|
||||
- **Products**: Uses a single transaction for the entire import
|
||||
- **Orders**: Uses a single transaction with temporary tables
|
||||
- **Purchase Orders**: Uses a single transaction with temporary tables
|
||||
|
||||
### Memory Usage Optimization
|
||||
|
||||
To minimize memory usage when processing large datasets:
|
||||
|
||||
1. Data is processed in batches (100-5000 records per batch)
|
||||
2. Temporary tables are used for intermediate data
|
||||
3. Some queries use cursors to avoid loading all results at once
|
||||
|
||||
### MySQL vs PostgreSQL Compatibility
|
||||
|
||||
The scripts handle differences between MySQL and PostgreSQL:
|
||||
|
||||
1. MySQL-specific syntax like `USE INDEX` is removed for PostgreSQL
|
||||
2. `GROUP_CONCAT` in MySQL becomes string operations in PostgreSQL
|
||||
3. Transaction syntax differences are abstracted in the connection wrapper
|
||||
4. PostgreSQL's `ON CONFLICT` replaces MySQL's `ON DUPLICATE KEY UPDATE`
|
||||
|
||||
### SSH Tunnel
|
||||
|
||||
Database connections go through an SSH tunnel for security:
|
||||
|
||||
```javascript
|
||||
ssh.forwardOut(
|
||||
"127.0.0.1",
|
||||
0,
|
||||
sshConfig.prodDbConfig.host,
|
||||
sshConfig.prodDbConfig.port,
|
||||
async (err, stream) => {
|
||||
if (err) reject(err);
|
||||
resolve({ ssh, stream });
|
||||
}
|
||||
);
|
||||
```
|
||||
1380
docs/inventory-calculation-reference.md
Normal file
1380
docs/inventory-calculation-reference.md
Normal file
File diff suppressed because it is too large
Load Diff
1065
docs/metrics-calculation-system.md
Normal file
1065
docs/metrics-calculation-system.md
Normal file
File diff suppressed because it is too large
Load Diff
@@ -154,6 +154,24 @@ CREATE TRIGGER update_sales_seasonality_updated
|
||||
FOR EACH ROW
|
||||
EXECUTE FUNCTION update_updated_at_column();
|
||||
|
||||
-- Create table for financial calculation parameters
|
||||
CREATE TABLE financial_calc_config (
|
||||
id INTEGER NOT NULL PRIMARY KEY,
|
||||
order_cost DECIMAL(10,2) NOT NULL DEFAULT 25.00, -- The fixed cost per purchase order (used in EOQ)
|
||||
holding_rate DECIMAL(10,4) NOT NULL DEFAULT 0.25, -- The annual inventory holding cost as a percentage of unit cost (used in EOQ)
|
||||
service_level_z_score DECIMAL(10,4) NOT NULL DEFAULT 1.96, -- Z-score for ~95% service level (used in Safety Stock)
|
||||
min_reorder_qty INTEGER NOT NULL DEFAULT 1, -- Minimum reorder quantity
|
||||
default_reorder_qty INTEGER NOT NULL DEFAULT 5, -- Default reorder quantity when sales data is insufficient
|
||||
default_safety_stock INTEGER NOT NULL DEFAULT 5, -- Default safety stock when sales data is insufficient
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
CREATE TRIGGER update_financial_calc_config_updated
|
||||
BEFORE UPDATE ON financial_calc_config
|
||||
FOR EACH ROW
|
||||
EXECUTE FUNCTION update_updated_at_column();
|
||||
|
||||
-- Insert default global thresholds
|
||||
INSERT INTO stock_thresholds (id, category_id, vendor, critical_days, reorder_days, overstock_days)
|
||||
VALUES (1, NULL, NULL, 7, 14, 90)
|
||||
@@ -203,6 +221,17 @@ VALUES
|
||||
ON CONFLICT (month) DO UPDATE SET
|
||||
last_updated = CURRENT_TIMESTAMP;
|
||||
|
||||
-- Insert default values
|
||||
INSERT INTO financial_calc_config (id, order_cost, holding_rate, service_level_z_score, min_reorder_qty, default_reorder_qty, default_safety_stock)
|
||||
VALUES (1, 25.00, 0.25, 1.96, 1, 5, 5)
|
||||
ON CONFLICT (id) DO UPDATE SET
|
||||
order_cost = EXCLUDED.order_cost,
|
||||
holding_rate = EXCLUDED.holding_rate,
|
||||
service_level_z_score = EXCLUDED.service_level_z_score,
|
||||
min_reorder_qty = EXCLUDED.min_reorder_qty,
|
||||
default_reorder_qty = EXCLUDED.default_reorder_qty,
|
||||
default_safety_stock = EXCLUDED.default_safety_stock;
|
||||
|
||||
-- View to show thresholds with category names
|
||||
CREATE OR REPLACE VIEW stock_thresholds_view AS
|
||||
SELECT
|
||||
|
||||
@@ -11,15 +11,17 @@ CREATE TABLE temp_sales_metrics (
|
||||
avg_margin_percent DECIMAL(10,3),
|
||||
first_sale_date DATE,
|
||||
last_sale_date DATE,
|
||||
stddev_daily_sales DECIMAL(10,3),
|
||||
PRIMARY KEY (pid)
|
||||
);
|
||||
|
||||
CREATE TABLE temp_purchase_metrics (
|
||||
pid BIGINT NOT NULL,
|
||||
avg_lead_time_days INTEGER,
|
||||
avg_lead_time_days DECIMAL(10,2),
|
||||
last_purchase_date DATE,
|
||||
first_received_date DATE,
|
||||
last_received_date DATE,
|
||||
stddev_lead_time_days DECIMAL(10,2),
|
||||
PRIMARY KEY (pid)
|
||||
);
|
||||
|
||||
@@ -50,7 +52,7 @@ CREATE TABLE product_metrics (
|
||||
gross_profit DECIMAL(10,3),
|
||||
gmroi DECIMAL(10,3),
|
||||
-- Purchase metrics
|
||||
avg_lead_time_days INTEGER,
|
||||
avg_lead_time_days DECIMAL(10,2),
|
||||
last_purchase_date DATE,
|
||||
first_received_date DATE,
|
||||
last_received_date DATE,
|
||||
|
||||
@@ -4,7 +4,12 @@ SET session_replication_role = 'replica'; -- Disable foreign key checks tempora
|
||||
-- Create function for updating timestamps
|
||||
CREATE OR REPLACE FUNCTION update_updated_column() RETURNS TRIGGER AS $func$
|
||||
BEGIN
|
||||
NEW.updated = CURRENT_TIMESTAMP;
|
||||
-- Check which table is being updated and use the appropriate column
|
||||
IF TG_TABLE_NAME = 'categories' THEN
|
||||
NEW.updated_at = CURRENT_TIMESTAMP;
|
||||
ELSIF TG_TABLE_NAME IN ('products', 'orders', 'purchase_orders') THEN
|
||||
NEW.updated = CURRENT_TIMESTAMP;
|
||||
END IF;
|
||||
RETURN NEW;
|
||||
END;
|
||||
$func$ language plpgsql;
|
||||
@@ -86,6 +91,7 @@ CREATE TABLE categories (
|
||||
description TEXT,
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||
updated TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||
status VARCHAR(20) DEFAULT 'active',
|
||||
FOREIGN KEY (parent_id) REFERENCES categories(cat_id)
|
||||
);
|
||||
@@ -160,7 +166,7 @@ CREATE TABLE purchase_orders (
|
||||
expected_date DATE,
|
||||
pid BIGINT NOT NULL,
|
||||
sku VARCHAR(50) NOT NULL,
|
||||
name VARCHAR(100) NOT NULL,
|
||||
name VARCHAR(255) NOT NULL,
|
||||
cost_price DECIMAL(10, 3) NOT NULL,
|
||||
po_cost_price DECIMAL(10, 3) NOT NULL,
|
||||
status SMALLINT DEFAULT 1,
|
||||
@@ -171,7 +177,7 @@ CREATE TABLE purchase_orders (
|
||||
received INTEGER DEFAULT 0,
|
||||
received_date DATE,
|
||||
last_received_date DATE,
|
||||
received_by VARCHAR(100),
|
||||
received_by VARCHAR,
|
||||
receiving_history JSONB,
|
||||
updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||
FOREIGN KEY (pid) REFERENCES products(pid),
|
||||
|
||||
@@ -49,6 +49,30 @@ CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_system_prompt
|
||||
ON ai_prompts (prompt_type)
|
||||
WHERE prompt_type = 'system';
|
||||
|
||||
-- Reusable Images table for storing persistent images
|
||||
CREATE TABLE IF NOT EXISTS reusable_images (
|
||||
id SERIAL PRIMARY KEY,
|
||||
name TEXT NOT NULL,
|
||||
filename TEXT NOT NULL,
|
||||
file_path TEXT NOT NULL,
|
||||
image_url TEXT NOT NULL,
|
||||
is_global BOOLEAN NOT NULL DEFAULT false,
|
||||
company TEXT,
|
||||
mime_type TEXT,
|
||||
file_size INTEGER,
|
||||
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||
CONSTRAINT company_required_for_non_global CHECK (
|
||||
(is_global = true AND company IS NULL) OR
|
||||
(is_global = false AND company IS NOT NULL)
|
||||
)
|
||||
);
|
||||
|
||||
-- Create index on company for efficient querying
|
||||
CREATE INDEX IF NOT EXISTS idx_reusable_images_company ON reusable_images(company);
|
||||
-- Create index on is_global for efficient querying
|
||||
CREATE INDEX IF NOT EXISTS idx_reusable_images_is_global ON reusable_images(is_global);
|
||||
|
||||
-- AI Validation Performance Tracking
|
||||
CREATE TABLE IF NOT EXISTS ai_validation_performance (
|
||||
id SERIAL PRIMARY KEY,
|
||||
@@ -82,4 +106,10 @@ CREATE TRIGGER update_templates_updated_at
|
||||
CREATE TRIGGER update_ai_prompts_updated_at
|
||||
BEFORE UPDATE ON ai_prompts
|
||||
FOR EACH ROW
|
||||
EXECUTE FUNCTION update_updated_at_column();
|
||||
|
||||
-- Trigger to automatically update the updated_at column for reusable_images
|
||||
CREATE TRIGGER update_reusable_images_updated_at
|
||||
BEFORE UPDATE ON reusable_images
|
||||
FOR EACH ROW
|
||||
EXECUTE FUNCTION update_updated_at_column();
|
||||
@@ -57,18 +57,20 @@ const TEMP_TABLES = [
|
||||
'temp_daily_sales',
|
||||
'temp_product_stats',
|
||||
'temp_category_sales',
|
||||
'temp_category_stats'
|
||||
'temp_category_stats',
|
||||
'temp_beginning_inventory',
|
||||
'temp_monthly_inventory'
|
||||
];
|
||||
|
||||
// Add cleanup function for temporary tables
|
||||
async function cleanupTemporaryTables(connection) {
|
||||
try {
|
||||
// Drop each temporary table if it exists
|
||||
for (const table of TEMP_TABLES) {
|
||||
await connection.query(`DROP TEMPORARY TABLE IF EXISTS ${table}`);
|
||||
await connection.query(`DROP TABLE IF EXISTS ${table}`);
|
||||
}
|
||||
} catch (error) {
|
||||
logError(error, 'Error cleaning up temporary tables');
|
||||
throw error; // Re-throw to be handled by the caller
|
||||
} catch (err) {
|
||||
console.error('Error cleaning up temporary tables:', err);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -86,22 +88,42 @@ let isCancelled = false;
|
||||
|
||||
function cancelCalculation() {
|
||||
isCancelled = true;
|
||||
global.clearProgress();
|
||||
// Format as SSE event
|
||||
const event = {
|
||||
progress: {
|
||||
status: 'cancelled',
|
||||
operation: 'Calculation cancelled',
|
||||
current: 0,
|
||||
total: 0,
|
||||
elapsed: null,
|
||||
remaining: null,
|
||||
rate: 0,
|
||||
timestamp: Date.now()
|
||||
}
|
||||
console.log('Calculation has been cancelled by user');
|
||||
|
||||
// Force-terminate any query that's been running for more than 5 seconds
|
||||
try {
|
||||
const connection = getConnection();
|
||||
connection.then(async (conn) => {
|
||||
try {
|
||||
// Identify and terminate long-running queries from our application
|
||||
await conn.query(`
|
||||
SELECT pg_cancel_backend(pid)
|
||||
FROM pg_stat_activity
|
||||
WHERE query_start < now() - interval '5 seconds'
|
||||
AND application_name LIKE '%node%'
|
||||
AND query NOT LIKE '%pg_cancel_backend%'
|
||||
`);
|
||||
|
||||
// Clean up any temporary tables
|
||||
await cleanupTemporaryTables(conn);
|
||||
|
||||
// Release connection
|
||||
conn.release();
|
||||
} catch (err) {
|
||||
console.error('Error during force cancellation:', err);
|
||||
conn.release();
|
||||
}
|
||||
}).catch(err => {
|
||||
console.error('Could not get connection for cancellation:', err);
|
||||
});
|
||||
} catch (err) {
|
||||
console.error('Failed to terminate running queries:', err);
|
||||
}
|
||||
|
||||
return {
|
||||
success: true,
|
||||
message: 'Calculation has been cancelled'
|
||||
};
|
||||
process.stdout.write(JSON.stringify(event) + '\n');
|
||||
process.exit(0);
|
||||
}
|
||||
|
||||
// Handle SIGTERM signal for cancellation
|
||||
@@ -119,6 +141,15 @@ async function calculateMetrics() {
|
||||
let totalPurchaseOrders = 0;
|
||||
let calculateHistoryId;
|
||||
|
||||
// Set a maximum execution time (30 minutes)
|
||||
const MAX_EXECUTION_TIME = 30 * 60 * 1000;
|
||||
const timeout = setTimeout(() => {
|
||||
console.error(`Calculation timed out after ${MAX_EXECUTION_TIME/1000} seconds, forcing termination`);
|
||||
// Call cancel and force exit
|
||||
cancelCalculation();
|
||||
process.exit(1);
|
||||
}, MAX_EXECUTION_TIME);
|
||||
|
||||
try {
|
||||
// Clean up any previously running calculations
|
||||
connection = await getConnection();
|
||||
@@ -127,24 +158,24 @@ async function calculateMetrics() {
|
||||
SET
|
||||
status = 'cancelled',
|
||||
end_time = NOW(),
|
||||
duration_seconds = TIMESTAMPDIFF(SECOND, start_time, NOW()),
|
||||
duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
|
||||
error_message = 'Previous calculation was not completed properly'
|
||||
WHERE status = 'running'
|
||||
`);
|
||||
|
||||
// Get counts from all relevant tables
|
||||
const [[productCount], [orderCount], [poCount]] = await Promise.all([
|
||||
const [productCountResult, orderCountResult, poCountResult] = await Promise.all([
|
||||
connection.query('SELECT COUNT(*) as total FROM products'),
|
||||
connection.query('SELECT COUNT(*) as total FROM orders'),
|
||||
connection.query('SELECT COUNT(*) as total FROM purchase_orders')
|
||||
]);
|
||||
|
||||
totalProducts = productCount.total;
|
||||
totalOrders = orderCount.total;
|
||||
totalPurchaseOrders = poCount.total;
|
||||
totalProducts = parseInt(productCountResult.rows[0].total);
|
||||
totalOrders = parseInt(orderCountResult.rows[0].total);
|
||||
totalPurchaseOrders = parseInt(poCountResult.rows[0].total);
|
||||
|
||||
// Create history record for this calculation
|
||||
const [historyResult] = await connection.query(`
|
||||
const historyResult = await connection.query(`
|
||||
INSERT INTO calculate_history (
|
||||
start_time,
|
||||
status,
|
||||
@@ -155,19 +186,19 @@ async function calculateMetrics() {
|
||||
) VALUES (
|
||||
NOW(),
|
||||
'running',
|
||||
?,
|
||||
?,
|
||||
?,
|
||||
JSON_OBJECT(
|
||||
'skip_product_metrics', ?,
|
||||
'skip_time_aggregates', ?,
|
||||
'skip_financial_metrics', ?,
|
||||
'skip_vendor_metrics', ?,
|
||||
'skip_category_metrics', ?,
|
||||
'skip_brand_metrics', ?,
|
||||
'skip_sales_forecasts', ?
|
||||
$1,
|
||||
$2,
|
||||
$3,
|
||||
jsonb_build_object(
|
||||
'skip_product_metrics', ($4::int > 0),
|
||||
'skip_time_aggregates', ($5::int > 0),
|
||||
'skip_financial_metrics', ($6::int > 0),
|
||||
'skip_vendor_metrics', ($7::int > 0),
|
||||
'skip_category_metrics', ($8::int > 0),
|
||||
'skip_brand_metrics', ($9::int > 0),
|
||||
'skip_sales_forecasts', ($10::int > 0)
|
||||
)
|
||||
)
|
||||
) RETURNING id
|
||||
`, [
|
||||
totalProducts,
|
||||
totalOrders,
|
||||
@@ -180,8 +211,7 @@ async function calculateMetrics() {
|
||||
SKIP_BRAND_METRICS,
|
||||
SKIP_SALES_FORECASTS
|
||||
]);
|
||||
calculateHistoryId = historyResult.insertId;
|
||||
connection.release();
|
||||
calculateHistoryId = historyResult.rows[0].id;
|
||||
|
||||
// Add debug logging for the progress functions
|
||||
console.log('Debug - Progress functions:', {
|
||||
@@ -199,6 +229,8 @@ async function calculateMetrics() {
|
||||
throw err;
|
||||
}
|
||||
|
||||
// Release the connection before getting a new one
|
||||
connection.release();
|
||||
isCancelled = false;
|
||||
connection = await getConnection();
|
||||
|
||||
@@ -234,10 +266,10 @@ async function calculateMetrics() {
|
||||
await connection.query(`
|
||||
UPDATE calculate_history
|
||||
SET
|
||||
processed_products = ?,
|
||||
processed_orders = ?,
|
||||
processed_purchase_orders = ?
|
||||
WHERE id = ?
|
||||
processed_products = $1,
|
||||
processed_orders = $2,
|
||||
processed_purchase_orders = $3
|
||||
WHERE id = $4
|
||||
`, [safeProducts, safeOrders, safePurchaseOrders, calculateHistoryId]);
|
||||
};
|
||||
|
||||
@@ -359,216 +391,6 @@ async function calculateMetrics() {
|
||||
console.log('Skipping sales forecasts calculation');
|
||||
}
|
||||
|
||||
// Calculate ABC classification
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Starting ABC classification',
|
||||
current: processedProducts || 0,
|
||||
total: totalProducts || 0,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
||||
rate: calculateRate(startTime, processedProducts || 0),
|
||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
if (isCancelled) return {
|
||||
processedProducts: processedProducts || 0,
|
||||
processedOrders: processedOrders || 0,
|
||||
processedPurchaseOrders: 0,
|
||||
success: false
|
||||
};
|
||||
|
||||
const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
|
||||
const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };
|
||||
|
||||
// First, create and populate the rankings table with an index
|
||||
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE temp_revenue_ranks (
|
||||
pid BIGINT NOT NULL,
|
||||
total_revenue DECIMAL(10,3),
|
||||
rank_num INT,
|
||||
total_count INT,
|
||||
PRIMARY KEY (pid),
|
||||
INDEX (rank_num)
|
||||
) ENGINE=MEMORY
|
||||
`);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Creating revenue rankings',
|
||||
current: processedProducts || 0,
|
||||
total: totalProducts || 0,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
||||
rate: calculateRate(startTime, processedProducts || 0),
|
||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
if (isCancelled) return {
|
||||
processedProducts: processedProducts || 0,
|
||||
processedOrders: processedOrders || 0,
|
||||
processedPurchaseOrders: 0,
|
||||
success: false
|
||||
};
|
||||
|
||||
await connection.query(`
|
||||
INSERT INTO temp_revenue_ranks
|
||||
SELECT
|
||||
pid,
|
||||
total_revenue,
|
||||
@rank := @rank + 1 as rank_num,
|
||||
@total_count := @rank as total_count
|
||||
FROM (
|
||||
SELECT pid, total_revenue
|
||||
FROM product_metrics
|
||||
WHERE total_revenue > 0
|
||||
ORDER BY total_revenue DESC
|
||||
) ranked,
|
||||
(SELECT @rank := 0) r
|
||||
`);
|
||||
|
||||
// Get total count for percentage calculation
|
||||
const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
|
||||
const totalCount = rankingCount[0].total_count || 1;
|
||||
const max_rank = totalCount; // Store max_rank for use in classification
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Updating ABC classifications',
|
||||
current: processedProducts || 0,
|
||||
total: totalProducts || 0,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
||||
rate: calculateRate(startTime, processedProducts || 0),
|
||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
if (isCancelled) return {
|
||||
processedProducts: processedProducts || 0,
|
||||
processedOrders: processedOrders || 0,
|
||||
processedPurchaseOrders: 0,
|
||||
success: false
|
||||
};
|
||||
|
||||
// ABC classification progress tracking
|
||||
let abcProcessedCount = 0;
|
||||
const batchSize = 5000;
|
||||
let lastProgressUpdate = Date.now();
|
||||
const progressUpdateInterval = 1000; // Update every second
|
||||
|
||||
while (true) {
|
||||
if (isCancelled) return {
|
||||
processedProducts: Number(processedProducts) || 0,
|
||||
processedOrders: Number(processedOrders) || 0,
|
||||
processedPurchaseOrders: 0,
|
||||
success: false
|
||||
};
|
||||
|
||||
// First get a batch of PIDs that need updating
|
||||
const [pids] = await connection.query(`
|
||||
SELECT pm.pid
|
||||
FROM product_metrics pm
|
||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
||||
WHERE pm.abc_class IS NULL
|
||||
OR pm.abc_class !=
|
||||
CASE
|
||||
WHEN tr.rank_num IS NULL THEN 'C'
|
||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
|
||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
|
||||
ELSE 'C'
|
||||
END
|
||||
LIMIT ?
|
||||
`, [max_rank, abcThresholds.a_threshold,
|
||||
max_rank, abcThresholds.b_threshold,
|
||||
batchSize]);
|
||||
|
||||
if (pids.length === 0) {
|
||||
break;
|
||||
}
|
||||
|
||||
// Then update just those PIDs
|
||||
const [result] = await connection.query(`
|
||||
UPDATE product_metrics pm
|
||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
||||
SET pm.abc_class =
|
||||
CASE
|
||||
WHEN tr.rank_num IS NULL THEN 'C'
|
||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
|
||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
|
||||
ELSE 'C'
|
||||
END,
|
||||
pm.last_calculated_at = NOW()
|
||||
WHERE pm.pid IN (?)
|
||||
`, [max_rank, abcThresholds.a_threshold,
|
||||
max_rank, abcThresholds.b_threshold,
|
||||
pids.map(row => row.pid)]);
|
||||
|
||||
abcProcessedCount += result.affectedRows;
|
||||
|
||||
// Calculate progress ensuring valid numbers
|
||||
const currentProgress = Math.floor(totalProducts * (0.99 + (abcProcessedCount / (totalCount || 1)) * 0.01));
|
||||
processedProducts = Number(currentProgress) || processedProducts || 0;
|
||||
|
||||
// Only update progress at most once per second
|
||||
const now = Date.now();
|
||||
if (now - lastProgressUpdate >= progressUpdateInterval) {
|
||||
const progress = ensureValidProgress(processedProducts, totalProducts);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'ABC classification progress',
|
||||
current: progress.current,
|
||||
total: progress.total,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, progress.current, progress.total),
|
||||
rate: calculateRate(startTime, progress.current),
|
||||
percentage: progress.percentage,
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
lastProgressUpdate = now;
|
||||
}
|
||||
|
||||
// Update database progress
|
||||
await updateProgress(processedProducts, processedOrders, processedPurchaseOrders);
|
||||
|
||||
// Small delay between batches to allow other transactions
|
||||
await new Promise(resolve => setTimeout(resolve, 100));
|
||||
}
|
||||
|
||||
// Clean up
|
||||
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
|
||||
|
||||
const endTime = Date.now();
|
||||
const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
|
||||
|
||||
// Update calculate_status for ABC classification
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('abc_classification', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
// Final progress update with guaranteed valid numbers
|
||||
const finalProgress = ensureValidProgress(totalProducts, totalProducts);
|
||||
|
||||
@@ -578,14 +400,14 @@ async function calculateMetrics() {
|
||||
operation: 'Metrics calculation complete',
|
||||
current: finalProgress.current,
|
||||
total: finalProgress.total,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
elapsed: global.formatElapsedTime(startTime),
|
||||
remaining: '0s',
|
||||
rate: calculateRate(startTime, finalProgress.current),
|
||||
rate: global.calculateRate(startTime, finalProgress.current),
|
||||
percentage: '100',
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: totalElapsedSeconds
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
@@ -601,13 +423,13 @@ async function calculateMetrics() {
|
||||
UPDATE calculate_history
|
||||
SET
|
||||
end_time = NOW(),
|
||||
duration_seconds = ?,
|
||||
processed_products = ?,
|
||||
processed_orders = ?,
|
||||
processed_purchase_orders = ?,
|
||||
duration_seconds = $1,
|
||||
processed_products = $2,
|
||||
processed_orders = $3,
|
||||
processed_purchase_orders = $4,
|
||||
status = 'completed'
|
||||
WHERE id = ?
|
||||
`, [totalElapsedSeconds,
|
||||
WHERE id = $5
|
||||
`, [Math.round((Date.now() - startTime) / 1000),
|
||||
finalStats.processedProducts,
|
||||
finalStats.processedOrders,
|
||||
finalStats.processedPurchaseOrders,
|
||||
@@ -616,6 +438,11 @@ async function calculateMetrics() {
|
||||
// Clear progress file on successful completion
|
||||
global.clearProgress();
|
||||
|
||||
return {
|
||||
success: true,
|
||||
message: 'Calculation completed successfully',
|
||||
duration: Math.round((Date.now() - startTime) / 1000)
|
||||
};
|
||||
} catch (error) {
|
||||
const endTime = Date.now();
|
||||
const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
|
||||
@@ -625,13 +452,13 @@ async function calculateMetrics() {
|
||||
UPDATE calculate_history
|
||||
SET
|
||||
end_time = NOW(),
|
||||
duration_seconds = ?,
|
||||
processed_products = ?,
|
||||
processed_orders = ?,
|
||||
processed_purchase_orders = ?,
|
||||
status = ?,
|
||||
error_message = ?
|
||||
WHERE id = ?
|
||||
duration_seconds = $1,
|
||||
processed_products = $2,
|
||||
processed_orders = $3,
|
||||
processed_purchase_orders = $4,
|
||||
status = $5,
|
||||
error_message = $6
|
||||
WHERE id = $7
|
||||
`, [
|
||||
totalElapsedSeconds,
|
||||
processedProducts || 0, // Ensure we have a valid number
|
||||
@@ -677,17 +504,38 @@ async function calculateMetrics() {
|
||||
}
|
||||
throw error;
|
||||
} finally {
|
||||
// Clear the timeout to prevent forced termination
|
||||
clearTimeout(timeout);
|
||||
|
||||
// Always clean up and release connection
|
||||
if (connection) {
|
||||
// Ensure temporary tables are cleaned up
|
||||
await cleanupTemporaryTables(connection);
|
||||
connection.release();
|
||||
try {
|
||||
await cleanupTemporaryTables(connection);
|
||||
connection.release();
|
||||
} catch (err) {
|
||||
console.error('Error in final cleanup:', err);
|
||||
}
|
||||
}
|
||||
// Close the connection pool when we're done
|
||||
await closePool();
|
||||
}
|
||||
} catch (error) {
|
||||
success = false;
|
||||
logError(error, 'Error in metrics calculation');
|
||||
console.error('Error in metrics calculation', error);
|
||||
|
||||
try {
|
||||
if (connection) {
|
||||
await connection.query(`
|
||||
UPDATE calculate_history
|
||||
SET
|
||||
status = 'failed',
|
||||
end_time = NOW(),
|
||||
duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
|
||||
error_message = $1
|
||||
WHERE id = $2
|
||||
`, [error.message.substring(0, 500), calculateHistoryId]);
|
||||
}
|
||||
} catch (updateError) {
|
||||
console.error('Error updating calculation history:', updateError);
|
||||
}
|
||||
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -120,27 +120,38 @@ async function main() {
|
||||
`);
|
||||
|
||||
// Create import history record for the overall session
|
||||
const [historyResult] = await localConnection.query(`
|
||||
INSERT INTO import_history (
|
||||
table_name,
|
||||
start_time,
|
||||
is_incremental,
|
||||
status,
|
||||
additional_info
|
||||
) VALUES (
|
||||
'all_tables',
|
||||
NOW(),
|
||||
$1::boolean,
|
||||
'running',
|
||||
jsonb_build_object(
|
||||
'categories_enabled', $2::boolean,
|
||||
'products_enabled', $3::boolean,
|
||||
'orders_enabled', $4::boolean,
|
||||
'purchase_orders_enabled', $5::boolean
|
||||
)
|
||||
) RETURNING id
|
||||
`, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
|
||||
importHistoryId = historyResult.rows[0].id;
|
||||
try {
|
||||
const [historyResult] = await localConnection.query(`
|
||||
INSERT INTO import_history (
|
||||
table_name,
|
||||
start_time,
|
||||
is_incremental,
|
||||
status,
|
||||
additional_info
|
||||
) VALUES (
|
||||
'all_tables',
|
||||
NOW(),
|
||||
$1::boolean,
|
||||
'running',
|
||||
jsonb_build_object(
|
||||
'categories_enabled', $2::boolean,
|
||||
'products_enabled', $3::boolean,
|
||||
'orders_enabled', $4::boolean,
|
||||
'purchase_orders_enabled', $5::boolean
|
||||
)
|
||||
) RETURNING id
|
||||
`, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
|
||||
importHistoryId = historyResult.rows[0].id;
|
||||
} catch (error) {
|
||||
console.error("Error creating import history record:", error);
|
||||
outputProgress({
|
||||
status: "error",
|
||||
operation: "Import process",
|
||||
message: "Failed to create import history record",
|
||||
error: error.message
|
||||
});
|
||||
throw error;
|
||||
}
|
||||
|
||||
const results = {
|
||||
categories: null,
|
||||
@@ -181,12 +192,29 @@ async function main() {
|
||||
}
|
||||
|
||||
if (IMPORT_PURCHASE_ORDERS) {
|
||||
results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
|
||||
if (isImportCancelled) throw new Error("Import cancelled");
|
||||
completedSteps++;
|
||||
console.log('Purchase orders import result:', results.purchaseOrders);
|
||||
totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0);
|
||||
totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0);
|
||||
try {
|
||||
results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
|
||||
if (isImportCancelled) throw new Error("Import cancelled");
|
||||
completedSteps++;
|
||||
console.log('Purchase orders import result:', results.purchaseOrders);
|
||||
|
||||
// Handle potential error status
|
||||
if (results.purchaseOrders?.status === 'error') {
|
||||
console.error('Purchase orders import had an error:', results.purchaseOrders.error);
|
||||
} else {
|
||||
totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0);
|
||||
totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error during purchase orders import:', error);
|
||||
// Continue with other imports, don't fail the whole process
|
||||
results.purchaseOrders = {
|
||||
status: 'error',
|
||||
error: error.message,
|
||||
recordsAdded: 0,
|
||||
recordsUpdated: 0
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
const endTime = Date.now();
|
||||
@@ -214,8 +242,8 @@ async function main() {
|
||||
WHERE id = $12
|
||||
`, [
|
||||
totalElapsedSeconds,
|
||||
totalRecordsAdded,
|
||||
totalRecordsUpdated,
|
||||
parseInt(totalRecordsAdded),
|
||||
parseInt(totalRecordsUpdated),
|
||||
IMPORT_CATEGORIES,
|
||||
IMPORT_PRODUCTS,
|
||||
IMPORT_ORDERS,
|
||||
|
||||
@@ -15,6 +15,9 @@ async function importCategories(prodConnection, localConnection) {
|
||||
try {
|
||||
// Start a single transaction for the entire import
|
||||
await localConnection.query('BEGIN');
|
||||
|
||||
// Temporarily disable the trigger that's causing problems
|
||||
await localConnection.query('ALTER TABLE categories DISABLE TRIGGER update_categories_updated_at');
|
||||
|
||||
// Process each type in order with its own savepoint
|
||||
for (const type of typeOrder) {
|
||||
@@ -47,42 +50,18 @@ async function importCategories(prodConnection, localConnection) {
|
||||
continue;
|
||||
}
|
||||
|
||||
console.log(`\nProcessing ${categories.length} type ${type} categories`);
|
||||
if (type === 10) {
|
||||
console.log("Type 10 categories:", JSON.stringify(categories, null, 2));
|
||||
}
|
||||
console.log(`Processing ${categories.length} type ${type} categories`);
|
||||
|
||||
// For types that can have parents (11, 21, 12, 13), verify parent existence
|
||||
// For types that can have parents (11, 21, 12, 13), we'll proceed directly
|
||||
// No need to check for parent existence since we process in hierarchical order
|
||||
let categoriesToInsert = categories;
|
||||
if (![10, 20].includes(type)) {
|
||||
// Get all parent IDs
|
||||
const parentIds = [
|
||||
...new Set(
|
||||
categories
|
||||
.filter(c => c && c.parent_id !== null)
|
||||
.map(c => c.parent_id)
|
||||
),
|
||||
];
|
||||
|
||||
console.log(`Processing ${categories.length} type ${type} categories with ${parentIds.length} unique parent IDs`);
|
||||
console.log('Parent IDs:', parentIds);
|
||||
|
||||
// No need to check for parent existence - we trust they exist since they were just inserted
|
||||
categoriesToInsert = categories;
|
||||
}
|
||||
|
||||
if (categoriesToInsert.length === 0) {
|
||||
console.log(
|
||||
`No valid categories of type ${type} to insert`
|
||||
);
|
||||
console.log(`No valid categories of type ${type} to insert`);
|
||||
await localConnection.query(`RELEASE SAVEPOINT category_type_${type}`);
|
||||
continue;
|
||||
}
|
||||
|
||||
console.log(
|
||||
`Inserting ${categoriesToInsert.length} type ${type} categories`
|
||||
);
|
||||
|
||||
// PostgreSQL upsert query with parameterized values
|
||||
const values = categoriesToInsert.flatMap((cat) => [
|
||||
cat.cat_id,
|
||||
@@ -95,14 +74,10 @@ async function importCategories(prodConnection, localConnection) {
|
||||
new Date()
|
||||
]);
|
||||
|
||||
console.log('Attempting to insert/update with values:', JSON.stringify(values, null, 2));
|
||||
|
||||
const placeholders = categoriesToInsert
|
||||
.map((_, i) => `($${i * 8 + 1}, $${i * 8 + 2}, $${i * 8 + 3}, $${i * 8 + 4}, $${i * 8 + 5}, $${i * 8 + 6}, $${i * 8 + 7}, $${i * 8 + 8})`)
|
||||
.join(',');
|
||||
|
||||
console.log('Using placeholders:', placeholders);
|
||||
|
||||
// Insert categories with ON CONFLICT clause for PostgreSQL
|
||||
const query = `
|
||||
WITH inserted_categories AS (
|
||||
@@ -129,17 +104,14 @@ async function importCategories(prodConnection, localConnection) {
|
||||
COUNT(*) FILTER (WHERE is_insert) as inserted,
|
||||
COUNT(*) FILTER (WHERE NOT is_insert) as updated
|
||||
FROM inserted_categories`;
|
||||
|
||||
console.log('Executing query:', query);
|
||||
|
||||
const result = await localConnection.query(query, values);
|
||||
console.log('Query result:', result);
|
||||
|
||||
// Get the first result since query returns an array
|
||||
const queryResult = Array.isArray(result) ? result[0] : result;
|
||||
|
||||
if (!queryResult || !queryResult.rows || !queryResult.rows[0]) {
|
||||
console.error('Query failed to return results. Result:', queryResult);
|
||||
console.error('Query failed to return results');
|
||||
throw new Error('Query did not return expected results');
|
||||
}
|
||||
|
||||
@@ -173,6 +145,17 @@ async function importCategories(prodConnection, localConnection) {
|
||||
// Commit the entire transaction - we'll do this even if we have skipped categories
|
||||
await localConnection.query('COMMIT');
|
||||
|
||||
// Update sync status
|
||||
await localConnection.query(`
|
||||
INSERT INTO sync_status (table_name, last_sync_timestamp)
|
||||
VALUES ('categories', NOW())
|
||||
ON CONFLICT (table_name) DO UPDATE SET
|
||||
last_sync_timestamp = NOW()
|
||||
`);
|
||||
|
||||
// Re-enable the trigger
|
||||
await localConnection.query('ALTER TABLE categories ENABLE TRIGGER update_categories_updated_at');
|
||||
|
||||
outputProgress({
|
||||
status: "complete",
|
||||
operation: "Categories import completed",
|
||||
@@ -201,6 +184,9 @@ async function importCategories(prodConnection, localConnection) {
|
||||
// Only rollback if we haven't committed yet
|
||||
try {
|
||||
await localConnection.query('ROLLBACK');
|
||||
|
||||
// Make sure we re-enable the trigger even if there was an error
|
||||
await localConnection.query('ALTER TABLE categories ENABLE TRIGGER update_categories_updated_at');
|
||||
} catch (rollbackError) {
|
||||
console.error("Error during rollback:", rollbackError);
|
||||
}
|
||||
|
||||
@@ -26,6 +26,9 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
let cumulativeProcessedOrders = 0;
|
||||
|
||||
try {
|
||||
// Begin transaction
|
||||
await localConnection.beginTransaction();
|
||||
|
||||
// Get last sync info
|
||||
const [syncInfo] = await localConnection.query(
|
||||
"SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'orders'"
|
||||
@@ -38,7 +41,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
const [[{ total }]] = await prodConnection.query(`
|
||||
SELECT COUNT(*) as total
|
||||
FROM order_items oi
|
||||
USE INDEX (PRIMARY)
|
||||
JOIN _order o ON oi.order_id = o.order_id
|
||||
WHERE o.order_status >= 15
|
||||
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
|
||||
@@ -78,7 +80,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
COALESCE(oi.prod_price_reg - oi.prod_price, 0) as base_discount,
|
||||
oi.stamp as last_modified
|
||||
FROM order_items oi
|
||||
USE INDEX (PRIMARY)
|
||||
JOIN _order o ON oi.order_id = o.order_id
|
||||
WHERE o.order_status >= 15
|
||||
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
|
||||
@@ -105,15 +106,15 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
|
||||
console.log('Orders: Found', orderItems.length, 'order items to process');
|
||||
|
||||
// Create tables in PostgreSQL for debugging
|
||||
// Create tables in PostgreSQL for data processing
|
||||
await localConnection.query(`
|
||||
DROP TABLE IF EXISTS debug_order_items;
|
||||
DROP TABLE IF EXISTS debug_order_meta;
|
||||
DROP TABLE IF EXISTS debug_order_discounts;
|
||||
DROP TABLE IF EXISTS debug_order_taxes;
|
||||
DROP TABLE IF EXISTS debug_order_costs;
|
||||
DROP TABLE IF EXISTS temp_order_items;
|
||||
DROP TABLE IF EXISTS temp_order_meta;
|
||||
DROP TABLE IF EXISTS temp_order_discounts;
|
||||
DROP TABLE IF EXISTS temp_order_taxes;
|
||||
DROP TABLE IF EXISTS temp_order_costs;
|
||||
|
||||
CREATE TABLE debug_order_items (
|
||||
CREATE TEMP TABLE temp_order_items (
|
||||
order_id INTEGER NOT NULL,
|
||||
pid INTEGER NOT NULL,
|
||||
SKU VARCHAR(50) NOT NULL,
|
||||
@@ -123,7 +124,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
PRIMARY KEY (order_id, pid)
|
||||
);
|
||||
|
||||
CREATE TABLE debug_order_meta (
|
||||
CREATE TEMP TABLE temp_order_meta (
|
||||
order_id INTEGER NOT NULL,
|
||||
date DATE NOT NULL,
|
||||
customer VARCHAR(100) NOT NULL,
|
||||
@@ -135,26 +136,29 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
PRIMARY KEY (order_id)
|
||||
);
|
||||
|
||||
CREATE TABLE debug_order_discounts (
|
||||
CREATE TEMP TABLE temp_order_discounts (
|
||||
order_id INTEGER NOT NULL,
|
||||
pid INTEGER NOT NULL,
|
||||
discount DECIMAL(10,2) NOT NULL,
|
||||
PRIMARY KEY (order_id, pid)
|
||||
);
|
||||
|
||||
CREATE TABLE debug_order_taxes (
|
||||
CREATE TEMP TABLE temp_order_taxes (
|
||||
order_id INTEGER NOT NULL,
|
||||
pid INTEGER NOT NULL,
|
||||
tax DECIMAL(10,2) NOT NULL,
|
||||
PRIMARY KEY (order_id, pid)
|
||||
);
|
||||
|
||||
CREATE TABLE debug_order_costs (
|
||||
CREATE TEMP TABLE temp_order_costs (
|
||||
order_id INTEGER NOT NULL,
|
||||
pid INTEGER NOT NULL,
|
||||
costeach DECIMAL(10,3) DEFAULT 0.000,
|
||||
PRIMARY KEY (order_id, pid)
|
||||
);
|
||||
|
||||
CREATE INDEX idx_temp_order_items_pid ON temp_order_items(pid);
|
||||
CREATE INDEX idx_temp_order_meta_order_id ON temp_order_meta(order_id);
|
||||
`);
|
||||
|
||||
// Insert order items in batches
|
||||
@@ -168,7 +172,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO debug_order_items (order_id, pid, SKU, price, quantity, base_discount)
|
||||
INSERT INTO temp_order_items (order_id, pid, SKU, price, quantity, base_discount)
|
||||
VALUES ${placeholders}
|
||||
ON CONFLICT (order_id, pid) DO UPDATE SET
|
||||
SKU = EXCLUDED.SKU,
|
||||
@@ -202,6 +206,14 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
const METADATA_BATCH_SIZE = 2000;
|
||||
const PG_BATCH_SIZE = 200;
|
||||
|
||||
// Add a helper function for title case conversion
|
||||
function toTitleCase(str) {
|
||||
if (!str) return '';
|
||||
return str.toLowerCase().split(' ').map(word => {
|
||||
return word.charAt(0).toUpperCase() + word.slice(1);
|
||||
}).join(' ');
|
||||
}
|
||||
|
||||
const processMetadataBatch = async (batchIds) => {
|
||||
const [orders] = await prodConnection.query(`
|
||||
SELECT
|
||||
@@ -231,7 +243,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
order.order_id,
|
||||
order.date,
|
||||
order.customer,
|
||||
order.customer_name || '',
|
||||
toTitleCase(order.customer_name) || '',
|
||||
order.status,
|
||||
order.canceled,
|
||||
order.summary_discount || 0,
|
||||
@@ -239,7 +251,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO debug_order_meta (
|
||||
INSERT INTO temp_order_meta (
|
||||
order_id, date, customer, customer_name, status, canceled,
|
||||
summary_discount, summary_subtotal
|
||||
)
|
||||
@@ -281,7 +293,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO debug_order_discounts (order_id, pid, discount)
|
||||
INSERT INTO temp_order_discounts (order_id, pid, discount)
|
||||
VALUES ${placeholders}
|
||||
ON CONFLICT (order_id, pid) DO UPDATE SET
|
||||
discount = EXCLUDED.discount
|
||||
@@ -321,7 +333,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO debug_order_taxes (order_id, pid, tax)
|
||||
INSERT INTO temp_order_taxes (order_id, pid, tax)
|
||||
VALUES ${placeholders}
|
||||
ON CONFLICT (order_id, pid) DO UPDATE SET
|
||||
tax = EXCLUDED.tax
|
||||
@@ -330,14 +342,23 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
};
|
||||
|
||||
const processCostsBatch = async (batchIds) => {
|
||||
// Modified query to ensure one row per order_id/pid by using a subquery
|
||||
const [costs] = await prodConnection.query(`
|
||||
SELECT
|
||||
oc.orderid as order_id,
|
||||
oc.pid,
|
||||
oc.costeach
|
||||
FROM order_costs oc
|
||||
WHERE oc.orderid IN (?)
|
||||
AND oc.pending = 0
|
||||
INNER JOIN (
|
||||
SELECT
|
||||
orderid,
|
||||
pid,
|
||||
MAX(id) as max_id
|
||||
FROM order_costs
|
||||
WHERE orderid IN (?)
|
||||
AND pending = 0
|
||||
GROUP BY orderid, pid
|
||||
) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
|
||||
`, [batchIds]);
|
||||
|
||||
if (costs.length === 0) return;
|
||||
@@ -357,7 +378,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO debug_order_costs (order_id, pid, costeach)
|
||||
INSERT INTO temp_order_costs (order_id, pid, costeach)
|
||||
VALUES ${placeholders}
|
||||
ON CONFLICT (order_id, pid) DO UPDATE SET
|
||||
costeach = EXCLUDED.costeach
|
||||
@@ -416,11 +437,12 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
oi.pid,
|
||||
SUM(COALESCE(od.discount, 0)) as promo_discount,
|
||||
COALESCE(ot.tax, 0) as total_tax,
|
||||
COALESCE(oi.price * 0.5, 0) as costeach
|
||||
FROM debug_order_items oi
|
||||
LEFT JOIN debug_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
|
||||
LEFT JOIN debug_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
|
||||
GROUP BY oi.order_id, oi.pid, ot.tax
|
||||
COALESCE(oc.costeach, oi.price * 0.5) as costeach
|
||||
FROM temp_order_items oi
|
||||
LEFT JOIN temp_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
|
||||
LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
|
||||
LEFT JOIN temp_order_costs oc ON oi.order_id = oc.order_id AND oi.pid = oc.pid
|
||||
GROUP BY oi.order_id, oi.pid, ot.tax, oc.costeach
|
||||
)
|
||||
SELECT
|
||||
oi.order_id as order_number,
|
||||
@@ -447,11 +469,11 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
FROM (
|
||||
SELECT DISTINCT ON (order_id, pid)
|
||||
order_id, pid, SKU, price, quantity, base_discount
|
||||
FROM debug_order_items
|
||||
FROM temp_order_items
|
||||
WHERE order_id = ANY($1)
|
||||
ORDER BY order_id, pid
|
||||
) oi
|
||||
JOIN debug_order_meta om ON oi.order_id = om.order_id
|
||||
JOIN temp_order_meta om ON oi.order_id = om.order_id
|
||||
LEFT JOIN order_totals ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
|
||||
ORDER BY oi.order_id, oi.pid
|
||||
`, [subBatchIds]);
|
||||
@@ -478,8 +500,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
const subBatch = validOrders.slice(k, k + FINAL_BATCH_SIZE);
|
||||
|
||||
const placeholders = subBatch.map((_, idx) => {
|
||||
const base = idx * 14; // 14 columns (removed updated)
|
||||
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14})`;
|
||||
const base = idx * 15; // 15 columns including costeach
|
||||
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
|
||||
}).join(',');
|
||||
|
||||
const batchValues = subBatch.flatMap(o => [
|
||||
@@ -496,7 +518,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
o.customer,
|
||||
o.customer_name,
|
||||
o.status,
|
||||
o.canceled
|
||||
o.canceled,
|
||||
o.costeach
|
||||
]);
|
||||
|
||||
const [result] = await localConnection.query(`
|
||||
@@ -504,7 +527,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
INSERT INTO orders (
|
||||
order_number, pid, sku, date, price, quantity, discount,
|
||||
tax, tax_included, shipping, customer, customer_name,
|
||||
status, canceled
|
||||
status, canceled, costeach
|
||||
)
|
||||
VALUES ${placeholders}
|
||||
ON CONFLICT (order_number, pid) DO UPDATE SET
|
||||
@@ -519,7 +542,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
customer = EXCLUDED.customer,
|
||||
customer_name = EXCLUDED.customer_name,
|
||||
status = EXCLUDED.status,
|
||||
canceled = EXCLUDED.canceled
|
||||
canceled = EXCLUDED.canceled,
|
||||
costeach = EXCLUDED.costeach
|
||||
RETURNING xmax = 0 as inserted
|
||||
)
|
||||
SELECT
|
||||
@@ -529,8 +553,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
`, batchValues);
|
||||
|
||||
const { inserted, updated } = result.rows[0];
|
||||
recordsAdded += inserted;
|
||||
recordsUpdated += updated;
|
||||
recordsAdded += parseInt(inserted) || 0;
|
||||
recordsUpdated += parseInt(updated) || 0;
|
||||
importedCount += subBatch.length;
|
||||
}
|
||||
|
||||
@@ -555,19 +579,39 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
ON CONFLICT (table_name) DO UPDATE SET
|
||||
last_sync_timestamp = NOW()
|
||||
`);
|
||||
|
||||
// Cleanup temporary tables
|
||||
await localConnection.query(`
|
||||
DROP TABLE IF EXISTS temp_order_items;
|
||||
DROP TABLE IF EXISTS temp_order_meta;
|
||||
DROP TABLE IF EXISTS temp_order_discounts;
|
||||
DROP TABLE IF EXISTS temp_order_taxes;
|
||||
DROP TABLE IF EXISTS temp_order_costs;
|
||||
`);
|
||||
|
||||
// Commit transaction
|
||||
await localConnection.commit();
|
||||
|
||||
return {
|
||||
status: "complete",
|
||||
totalImported: Math.floor(importedCount),
|
||||
recordsAdded: recordsAdded || 0,
|
||||
recordsUpdated: Math.floor(recordsUpdated),
|
||||
totalSkipped: skippedOrders.size,
|
||||
missingProducts: missingProducts.size,
|
||||
totalImported: Math.floor(importedCount) || 0,
|
||||
recordsAdded: parseInt(recordsAdded) || 0,
|
||||
recordsUpdated: parseInt(recordsUpdated) || 0,
|
||||
totalSkipped: skippedOrders.size || 0,
|
||||
missingProducts: missingProducts.size || 0,
|
||||
incrementalUpdate,
|
||||
lastSyncTime
|
||||
};
|
||||
} catch (error) {
|
||||
console.error("Error during orders import:", error);
|
||||
|
||||
// Rollback transaction
|
||||
try {
|
||||
await localConnection.rollback();
|
||||
} catch (rollbackError) {
|
||||
console.error("Error during rollback:", rollbackError);
|
||||
}
|
||||
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,10 +1,13 @@
|
||||
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');
|
||||
const BATCH_SIZE = 100; // Smaller batch size for better progress tracking
|
||||
const BATCH_SIZE = 1000; // Smaller batch size for better progress tracking
|
||||
const MAX_RETRIES = 3;
|
||||
const RETRY_DELAY = 5000; // 5 seconds
|
||||
const dotenv = require("dotenv");
|
||||
const path = require("path");
|
||||
dotenv.config({ path: path.join(__dirname, "../../.env") });
|
||||
|
||||
// Utility functions
|
||||
const imageUrlBase = 'https://sbing.com/i/products/0000/';
|
||||
const imageUrlBase = process.env.PRODUCT_IMAGE_URL_BASE || 'https://sbing.com/i/products/0000/';
|
||||
const getImageUrls = (pid, iid = 1) => {
|
||||
const paddedPid = pid.toString().padStart(6, '0');
|
||||
// Use padded PID only for the first 3 digits
|
||||
@@ -18,7 +21,7 @@ const getImageUrls = (pid, iid = 1) => {
|
||||
};
|
||||
};
|
||||
|
||||
// Add helper function for retrying operations
|
||||
// Add helper function for retrying operations with exponential backoff
|
||||
async function withRetry(operation, errorMessage) {
|
||||
let lastError;
|
||||
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
|
||||
@@ -28,7 +31,8 @@ async function withRetry(operation, errorMessage) {
|
||||
lastError = error;
|
||||
console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
|
||||
if (attempt < MAX_RETRIES) {
|
||||
await new Promise(resolve => setTimeout(resolve, RETRY_DELAY));
|
||||
const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
|
||||
await new Promise(resolve => setTimeout(resolve, backoffTime));
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -140,10 +144,12 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
||||
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
||||
CASE
|
||||
WHEN p.reorder < 0 THEN 0
|
||||
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
|
||||
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
|
||||
WHEN (
|
||||
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
||||
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
) THEN 0
|
||||
ELSE 1
|
||||
END AS replenishable,
|
||||
@@ -155,7 +161,11 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
||||
COALESCE(p.sellingprice, 0) AS regular_price,
|
||||
CASE
|
||||
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||
THEN (
|
||||
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
|
||||
FROM product_inventory
|
||||
WHERE pid = p.pid AND count > 0
|
||||
)
|
||||
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
||||
END AS cost_price,
|
||||
NULL as landing_cost_price,
|
||||
@@ -183,7 +193,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
||||
p.country_of_origin,
|
||||
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
||||
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
||||
p.totalsold AS total_sold,
|
||||
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
|
||||
pls.date_sold as date_last_sold,
|
||||
GROUP_CONCAT(DISTINCT CASE
|
||||
WHEN pc.cat_id IS NOT NULL
|
||||
@@ -233,7 +243,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
||||
row.pid,
|
||||
row.title,
|
||||
row.description,
|
||||
row.itemnumber || '',
|
||||
row.sku || '',
|
||||
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
||||
row.preorder_count,
|
||||
row.notions_inv_count,
|
||||
@@ -335,10 +345,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
||||
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
||||
CASE
|
||||
WHEN p.reorder < 0 THEN 0
|
||||
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
|
||||
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
|
||||
WHEN (
|
||||
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
||||
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||
) THEN 0
|
||||
ELSE 1
|
||||
END AS replenishable,
|
||||
@@ -350,7 +362,11 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
||||
COALESCE(p.sellingprice, 0) AS regular_price,
|
||||
CASE
|
||||
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||
THEN (
|
||||
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
|
||||
FROM product_inventory
|
||||
WHERE pid = p.pid AND count > 0
|
||||
)
|
||||
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
||||
END AS cost_price,
|
||||
NULL as landing_cost_price,
|
||||
@@ -378,7 +394,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
||||
p.country_of_origin,
|
||||
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
||||
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
||||
p.totalsold AS total_sold,
|
||||
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
|
||||
pls.date_sold as date_last_sold,
|
||||
GROUP_CONCAT(DISTINCT CASE
|
||||
WHEN pc.cat_id IS NOT NULL
|
||||
@@ -432,7 +448,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
||||
row.pid,
|
||||
row.title,
|
||||
row.description,
|
||||
row.itemnumber || '',
|
||||
row.sku || '',
|
||||
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
||||
row.preorder_count,
|
||||
row.notions_inv_count,
|
||||
@@ -772,32 +788,44 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
|
||||
recordsAdded += parseInt(result.rows[0].inserted, 10) || 0;
|
||||
recordsUpdated += parseInt(result.rows[0].updated, 10) || 0;
|
||||
|
||||
// Process category relationships for each product in the batch
|
||||
// Process category relationships in batches
|
||||
const allCategories = [];
|
||||
for (const row of batch) {
|
||||
if (row.categories) {
|
||||
const categoryIds = row.categories.split(',').filter(id => id && id.trim());
|
||||
if (categoryIds.length > 0) {
|
||||
const catPlaceholders = categoryIds.map((_, idx) =>
|
||||
`($${idx * 2 + 1}, $${idx * 2 + 2})`
|
||||
).join(',');
|
||||
const catValues = categoryIds.flatMap(catId => [row.pid, parseInt(catId.trim(), 10)]);
|
||||
|
||||
// First delete existing relationships for this product
|
||||
await localConnection.query(
|
||||
'DELETE FROM product_categories WHERE pid = $1',
|
||||
[row.pid]
|
||||
);
|
||||
|
||||
// Then insert the new relationships
|
||||
await localConnection.query(`
|
||||
INSERT INTO product_categories (pid, cat_id)
|
||||
VALUES ${catPlaceholders}
|
||||
ON CONFLICT (pid, cat_id) DO NOTHING
|
||||
`, catValues);
|
||||
categoryIds.forEach(catId => {
|
||||
allCategories.push([row.pid, parseInt(catId.trim(), 10)]);
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// If we have categories to process
|
||||
if (allCategories.length > 0) {
|
||||
// First get all products in this batch
|
||||
const productIds = batch.map(p => p.pid);
|
||||
|
||||
// Delete all existing relationships for products in this batch
|
||||
await localConnection.query(
|
||||
'DELETE FROM product_categories WHERE pid = ANY($1)',
|
||||
[productIds]
|
||||
);
|
||||
|
||||
// Insert all new relationships in one batch
|
||||
const catPlaceholders = allCategories.map((_, idx) =>
|
||||
`($${idx * 2 + 1}, $${idx * 2 + 2})`
|
||||
).join(',');
|
||||
|
||||
const catValues = allCategories.flat();
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO product_categories (pid, cat_id)
|
||||
VALUES ${catPlaceholders}
|
||||
ON CONFLICT (pid, cat_id) DO NOTHING
|
||||
`, catValues);
|
||||
}
|
||||
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Products import",
|
||||
@@ -816,6 +844,14 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
|
||||
// Commit the transaction
|
||||
await localConnection.commit();
|
||||
|
||||
// Update sync status
|
||||
await localConnection.query(`
|
||||
INSERT INTO sync_status (table_name, last_sync_timestamp)
|
||||
VALUES ('products', NOW())
|
||||
ON CONFLICT (table_name) DO UPDATE SET
|
||||
last_sync_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
status: 'complete',
|
||||
recordsAdded,
|
||||
|
||||
File diff suppressed because it is too large
Load Diff
@@ -32,12 +32,12 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
||||
}
|
||||
|
||||
// Get order count that will be processed
|
||||
const [orderCount] = await connection.query(`
|
||||
const orderCount = await connection.query(`
|
||||
SELECT COUNT(*) as count
|
||||
FROM orders o
|
||||
WHERE o.canceled = false
|
||||
`);
|
||||
processedOrders = orderCount[0].count;
|
||||
processedOrders = parseInt(orderCount.rows[0].count);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
@@ -98,14 +98,14 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
||||
SUM(o.quantity * (o.price - COALESCE(o.discount, 0) - p.cost_price)) as period_margin,
|
||||
COUNT(DISTINCT DATE(o.date)) as period_days,
|
||||
CASE
|
||||
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH) THEN 'current'
|
||||
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
||||
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH) THEN 'previous'
|
||||
WHEN o.date >= CURRENT_DATE - INTERVAL '3 months' THEN 'current'
|
||||
WHEN o.date BETWEEN CURRENT_DATE - INTERVAL '15 months'
|
||||
AND CURRENT_DATE - INTERVAL '12 months' THEN 'previous'
|
||||
END as period_type
|
||||
FROM filtered_products p
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '15 months'
|
||||
GROUP BY p.brand, period_type
|
||||
),
|
||||
brand_data AS (
|
||||
@@ -165,15 +165,16 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
||||
LEFT JOIN sales_periods sp ON bd.brand = sp.brand
|
||||
GROUP BY bd.brand, bd.product_count, bd.active_products, bd.total_stock_units,
|
||||
bd.total_stock_cost, bd.total_stock_retail, bd.total_revenue, bd.avg_margin
|
||||
ON DUPLICATE KEY UPDATE
|
||||
product_count = VALUES(product_count),
|
||||
active_products = VALUES(active_products),
|
||||
total_stock_units = VALUES(total_stock_units),
|
||||
total_stock_cost = VALUES(total_stock_cost),
|
||||
total_stock_retail = VALUES(total_stock_retail),
|
||||
total_revenue = VALUES(total_revenue),
|
||||
avg_margin = VALUES(avg_margin),
|
||||
growth_rate = VALUES(growth_rate),
|
||||
ON CONFLICT (brand) DO UPDATE
|
||||
SET
|
||||
product_count = EXCLUDED.product_count,
|
||||
active_products = EXCLUDED.active_products,
|
||||
total_stock_units = EXCLUDED.total_stock_units,
|
||||
total_stock_cost = EXCLUDED.total_stock_cost,
|
||||
total_stock_retail = EXCLUDED.total_stock_retail,
|
||||
total_revenue = EXCLUDED.total_revenue,
|
||||
avg_margin = EXCLUDED.avg_margin,
|
||||
growth_rate = EXCLUDED.growth_rate,
|
||||
last_calculated_at = CURRENT_TIMESTAMP
|
||||
`);
|
||||
|
||||
@@ -230,8 +231,8 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
||||
monthly_metrics AS (
|
||||
SELECT
|
||||
p.brand,
|
||||
YEAR(o.date) as year,
|
||||
MONTH(o.date) as month,
|
||||
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||
COUNT(DISTINCT p.valid_pid) as product_count,
|
||||
COUNT(DISTINCT p.active_pid) as active_products,
|
||||
SUM(p.valid_stock) as total_stock_units,
|
||||
@@ -255,19 +256,20 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
||||
END as avg_margin
|
||||
FROM filtered_products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
|
||||
WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
GROUP BY p.brand, YEAR(o.date), MONTH(o.date)
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||
GROUP BY p.brand, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
|
||||
)
|
||||
SELECT *
|
||||
FROM monthly_metrics
|
||||
ON DUPLICATE KEY UPDATE
|
||||
product_count = VALUES(product_count),
|
||||
active_products = VALUES(active_products),
|
||||
total_stock_units = VALUES(total_stock_units),
|
||||
total_stock_cost = VALUES(total_stock_cost),
|
||||
total_stock_retail = VALUES(total_stock_retail),
|
||||
total_revenue = VALUES(total_revenue),
|
||||
avg_margin = VALUES(avg_margin)
|
||||
ON CONFLICT (brand, year, month) DO UPDATE
|
||||
SET
|
||||
product_count = EXCLUDED.product_count,
|
||||
active_products = EXCLUDED.active_products,
|
||||
total_stock_units = EXCLUDED.total_stock_units,
|
||||
total_stock_cost = EXCLUDED.total_stock_cost,
|
||||
total_stock_retail = EXCLUDED.total_stock_retail,
|
||||
total_revenue = EXCLUDED.total_revenue,
|
||||
avg_margin = EXCLUDED.avg_margin
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.99);
|
||||
@@ -294,7 +296,8 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('brand_metrics', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
ON CONFLICT (module_name) DO UPDATE
|
||||
SET last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
|
||||
@@ -32,12 +32,12 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
}
|
||||
|
||||
// Get order count that will be processed
|
||||
const [orderCount] = await connection.query(`
|
||||
const orderCount = await connection.query(`
|
||||
SELECT COUNT(*) as count
|
||||
FROM orders o
|
||||
WHERE o.canceled = false
|
||||
`);
|
||||
processedOrders = orderCount[0].count;
|
||||
processedOrders = parseInt(orderCount.rows[0].count);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
@@ -76,12 +76,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
LEFT JOIN product_categories pc ON c.cat_id = pc.cat_id
|
||||
LEFT JOIN products p ON pc.pid = p.pid
|
||||
GROUP BY c.cat_id, c.status
|
||||
ON DUPLICATE KEY UPDATE
|
||||
product_count = VALUES(product_count),
|
||||
active_products = VALUES(active_products),
|
||||
total_value = VALUES(total_value),
|
||||
status = VALUES(status),
|
||||
last_calculated_at = VALUES(last_calculated_at)
|
||||
ON CONFLICT (category_id) DO UPDATE
|
||||
SET
|
||||
product_count = EXCLUDED.product_count,
|
||||
active_products = EXCLUDED.active_products,
|
||||
total_value = EXCLUDED.total_value,
|
||||
status = EXCLUDED.status,
|
||||
last_calculated_at = EXCLUDED.last_calculated_at
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.90);
|
||||
@@ -127,17 +128,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
(tc.category_id IS NULL AND tc.vendor = p.vendor) OR
|
||||
(tc.category_id IS NULL AND tc.vendor IS NULL)
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL COALESCE(tc.calculation_period_days, 30) DAY)
|
||||
AND o.date >= CURRENT_DATE - (COALESCE(tc.calculation_period_days, 30) || ' days')::INTERVAL
|
||||
GROUP BY pc.cat_id
|
||||
)
|
||||
UPDATE category_metrics cm
|
||||
JOIN category_sales cs ON cm.category_id = cs.cat_id
|
||||
LEFT JOIN turnover_config tc ON
|
||||
(tc.category_id = cm.category_id AND tc.vendor IS NULL) OR
|
||||
(tc.category_id IS NULL AND tc.vendor IS NULL)
|
||||
UPDATE category_metrics
|
||||
SET
|
||||
cm.avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
|
||||
cm.turnover_rate = CASE
|
||||
avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
|
||||
turnover_rate = CASE
|
||||
WHEN cs.avg_stock > 0 AND cs.active_days > 0
|
||||
THEN LEAST(
|
||||
(cs.units_sold / cs.avg_stock) * (365.0 / cs.active_days),
|
||||
@@ -145,7 +142,9 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
)
|
||||
ELSE 0
|
||||
END,
|
||||
cm.last_calculated_at = NOW()
|
||||
last_calculated_at = NOW()
|
||||
FROM category_sales cs
|
||||
WHERE category_id = cs.cat_id
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.95);
|
||||
@@ -184,9 +183,9 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
FROM product_categories pc
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
||||
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '3 months'
|
||||
GROUP BY pc.cat_id
|
||||
),
|
||||
previous_period AS (
|
||||
@@ -198,26 +197,26 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
FROM product_categories pc
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
||||
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||
WHERE o.canceled = false
|
||||
AND o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
||||
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
AND o.date BETWEEN CURRENT_DATE - INTERVAL '15 months'
|
||||
AND CURRENT_DATE - INTERVAL '12 months'
|
||||
GROUP BY pc.cat_id
|
||||
),
|
||||
trend_data AS (
|
||||
SELECT
|
||||
pc.cat_id,
|
||||
MONTH(o.date) as month,
|
||||
EXTRACT(MONTH FROM o.date) as month,
|
||||
SUM(o.quantity * (o.price - COALESCE(o.discount, 0)) /
|
||||
(1 + COALESCE(ss.seasonality_factor, 0))) as revenue,
|
||||
COUNT(DISTINCT DATE(o.date)) as days_in_month
|
||||
FROM product_categories pc
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
||||
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
||||
GROUP BY pc.cat_id, MONTH(o.date)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '15 months'
|
||||
GROUP BY pc.cat_id, EXTRACT(MONTH FROM o.date)
|
||||
),
|
||||
trend_stats AS (
|
||||
SELECT
|
||||
@@ -261,16 +260,42 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '3 months'
|
||||
GROUP BY pc.cat_id
|
||||
),
|
||||
combined_metrics AS (
|
||||
SELECT
|
||||
COALESCE(cp.cat_id, pp.cat_id) as category_id,
|
||||
CASE
|
||||
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
|
||||
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
|
||||
WHEN ta.trend_slope IS NOT NULL THEN
|
||||
GREATEST(
|
||||
-100.0,
|
||||
LEAST(
|
||||
(ta.trend_slope / NULLIF(ta.avg_daily_revenue, 0)) * 365 * 100,
|
||||
999.99
|
||||
)
|
||||
)
|
||||
ELSE
|
||||
GREATEST(
|
||||
-100.0,
|
||||
LEAST(
|
||||
((COALESCE(cp.revenue, 0) - pp.revenue) /
|
||||
NULLIF(ABS(pp.revenue), 0)) * 100.0,
|
||||
999.99
|
||||
)
|
||||
)
|
||||
END as growth_rate,
|
||||
mc.avg_margin
|
||||
FROM current_period cp
|
||||
FULL OUTER JOIN previous_period pp ON cp.cat_id = pp.cat_id
|
||||
LEFT JOIN trend_analysis ta ON COALESCE(cp.cat_id, pp.cat_id) = ta.cat_id
|
||||
LEFT JOIN margin_calc mc ON COALESCE(cp.cat_id, pp.cat_id) = mc.cat_id
|
||||
)
|
||||
UPDATE category_metrics cm
|
||||
LEFT JOIN current_period cp ON cm.category_id = cp.cat_id
|
||||
LEFT JOIN previous_period pp ON cm.category_id = pp.cat_id
|
||||
LEFT JOIN trend_analysis ta ON cm.category_id = ta.cat_id
|
||||
LEFT JOIN margin_calc mc ON cm.category_id = mc.cat_id
|
||||
SET
|
||||
cm.growth_rate = CASE
|
||||
growth_rate = CASE
|
||||
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
|
||||
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
|
||||
WHEN ta.trend_slope IS NOT NULL THEN
|
||||
@@ -291,9 +316,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
)
|
||||
)
|
||||
END,
|
||||
cm.avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
|
||||
cm.last_calculated_at = NOW()
|
||||
WHERE cp.cat_id IS NOT NULL OR pp.cat_id IS NOT NULL
|
||||
avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
|
||||
last_calculated_at = NOW()
|
||||
FROM current_period cp
|
||||
FULL OUTER JOIN previous_period pp ON cp.cat_id = pp.cat_id
|
||||
LEFT JOIN trend_analysis ta ON COALESCE(cp.cat_id, pp.cat_id) = ta.cat_id
|
||||
LEFT JOIN margin_calc mc ON COALESCE(cp.cat_id, pp.cat_id) = mc.cat_id
|
||||
WHERE cm.category_id = COALESCE(cp.cat_id, pp.cat_id)
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.97);
|
||||
@@ -335,8 +364,8 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
)
|
||||
SELECT
|
||||
pc.cat_id,
|
||||
YEAR(o.date) as year,
|
||||
MONTH(o.date) as month,
|
||||
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||
COUNT(DISTINCT p.pid) as product_count,
|
||||
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.pid END) as active_products,
|
||||
SUM(p.stock_quantity * p.cost_price) as total_value,
|
||||
@@ -364,15 +393,16 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
GROUP BY pc.cat_id, YEAR(o.date), MONTH(o.date)
|
||||
ON DUPLICATE KEY UPDATE
|
||||
product_count = VALUES(product_count),
|
||||
active_products = VALUES(active_products),
|
||||
total_value = VALUES(total_value),
|
||||
total_revenue = VALUES(total_revenue),
|
||||
avg_margin = VALUES(avg_margin),
|
||||
turnover_rate = VALUES(turnover_rate)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||
GROUP BY pc.cat_id, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
|
||||
ON CONFLICT (category_id, year, month) DO UPDATE
|
||||
SET
|
||||
product_count = EXCLUDED.product_count,
|
||||
active_products = EXCLUDED.active_products,
|
||||
total_value = EXCLUDED.total_value,
|
||||
total_revenue = EXCLUDED.total_revenue,
|
||||
avg_margin = EXCLUDED.avg_margin,
|
||||
turnover_rate = EXCLUDED.turnover_rate
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.99);
|
||||
@@ -414,20 +444,20 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
)
|
||||
WITH date_ranges AS (
|
||||
SELECT
|
||||
DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY) as period_start,
|
||||
CURRENT_DATE - INTERVAL '30 days' as period_start,
|
||||
CURRENT_DATE as period_end
|
||||
UNION ALL
|
||||
SELECT
|
||||
DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY),
|
||||
DATE_SUB(CURRENT_DATE, INTERVAL 31 DAY)
|
||||
CURRENT_DATE - INTERVAL '90 days',
|
||||
CURRENT_DATE - INTERVAL '31 days'
|
||||
UNION ALL
|
||||
SELECT
|
||||
DATE_SUB(CURRENT_DATE, INTERVAL 180 DAY),
|
||||
DATE_SUB(CURRENT_DATE, INTERVAL 91 DAY)
|
||||
CURRENT_DATE - INTERVAL '180 days',
|
||||
CURRENT_DATE - INTERVAL '91 days'
|
||||
UNION ALL
|
||||
SELECT
|
||||
DATE_SUB(CURRENT_DATE, INTERVAL 365 DAY),
|
||||
DATE_SUB(CURRENT_DATE, INTERVAL 181 DAY)
|
||||
CURRENT_DATE - INTERVAL '365 days',
|
||||
CURRENT_DATE - INTERVAL '181 days'
|
||||
),
|
||||
sales_data AS (
|
||||
SELECT
|
||||
@@ -466,12 +496,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
END as avg_price,
|
||||
NOW() as last_calculated_at
|
||||
FROM sales_data
|
||||
ON DUPLICATE KEY UPDATE
|
||||
avg_daily_sales = VALUES(avg_daily_sales),
|
||||
total_sold = VALUES(total_sold),
|
||||
num_products = VALUES(num_products),
|
||||
avg_price = VALUES(avg_price),
|
||||
last_calculated_at = VALUES(last_calculated_at)
|
||||
ON CONFLICT (category_id, brand, period_start, period_end) DO UPDATE
|
||||
SET
|
||||
avg_daily_sales = EXCLUDED.avg_daily_sales,
|
||||
total_sold = EXCLUDED.total_sold,
|
||||
num_products = EXCLUDED.num_products,
|
||||
avg_price = EXCLUDED.avg_price,
|
||||
last_calculated_at = EXCLUDED.last_calculated_at
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 1.0);
|
||||
@@ -498,7 +529,8 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('category_metrics', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
ON CONFLICT (module_name) DO UPDATE
|
||||
SET last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
|
||||
@@ -32,13 +32,13 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
||||
}
|
||||
|
||||
// Get order count that will be processed
|
||||
const [orderCount] = await connection.query(`
|
||||
const orderCount = await connection.query(`
|
||||
SELECT COUNT(*) as count
|
||||
FROM orders o
|
||||
WHERE o.canceled = false
|
||||
AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
|
||||
AND DATE(o.date) >= CURRENT_DATE - INTERVAL '12 months'
|
||||
`);
|
||||
processedOrders = orderCount[0].count;
|
||||
processedOrders = parseInt(orderCount.rows[0].count);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
@@ -56,38 +56,97 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
||||
}
|
||||
});
|
||||
|
||||
// Calculate financial metrics with optimized query
|
||||
// First, calculate beginning inventory values (12 months ago)
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_beginning_inventory AS
|
||||
WITH beginning_inventory_calc AS (
|
||||
SELECT
|
||||
p.pid,
|
||||
p.stock_quantity as current_quantity,
|
||||
COALESCE(SUM(o.quantity), 0) as sold_quantity,
|
||||
COALESCE(SUM(po.received), 0) as received_quantity,
|
||||
GREATEST(0, (p.stock_quantity + COALESCE(SUM(o.quantity), 0) - COALESCE(SUM(po.received), 0))) as beginning_quantity,
|
||||
p.cost_price
|
||||
FROM
|
||||
products p
|
||||
LEFT JOIN
|
||||
orders o ON p.pid = o.pid
|
||||
AND o.canceled = false
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '12 months'::interval
|
||||
LEFT JOIN
|
||||
purchase_orders po ON p.pid = po.pid
|
||||
AND po.received_date IS NOT NULL
|
||||
AND po.received_date >= CURRENT_DATE - INTERVAL '12 months'::interval
|
||||
GROUP BY
|
||||
p.pid, p.stock_quantity, p.cost_price
|
||||
)
|
||||
SELECT
|
||||
pid,
|
||||
beginning_quantity,
|
||||
beginning_quantity * cost_price as beginning_value,
|
||||
current_quantity * cost_price as current_value,
|
||||
((beginning_quantity * cost_price) + (current_quantity * cost_price)) / 2 as average_inventory_value
|
||||
FROM
|
||||
beginning_inventory_calc
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.60);
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Beginning inventory values calculated, computing financial metrics',
|
||||
current: processedCount,
|
||||
total: totalProducts,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedCount, totalProducts),
|
||||
rate: calculateRate(startTime, processedCount),
|
||||
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
// Calculate financial metrics with optimized query and standard formulas
|
||||
await connection.query(`
|
||||
WITH product_financials AS (
|
||||
SELECT
|
||||
p.pid,
|
||||
p.cost_price * p.stock_quantity as inventory_value,
|
||||
SUM(o.quantity * o.price) as total_revenue,
|
||||
SUM(o.quantity * p.cost_price) as cost_of_goods_sold,
|
||||
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
||||
COALESCE(bi.average_inventory_value, p.cost_price * p.stock_quantity) as avg_inventory_value,
|
||||
p.cost_price * p.stock_quantity as current_inventory_value,
|
||||
SUM(o.quantity * (o.price - COALESCE(o.discount, 0))) as total_revenue,
|
||||
SUM(o.quantity * COALESCE(o.costeach, 0)) as cost_of_goods_sold,
|
||||
SUM(o.quantity * (o.price - COALESCE(o.discount, 0) - COALESCE(o.costeach, 0))) as gross_profit,
|
||||
MIN(o.date) as first_sale_date,
|
||||
MAX(o.date) as last_sale_date,
|
||||
DATEDIFF(MAX(o.date), MIN(o.date)) + 1 as calculation_period_days,
|
||||
EXTRACT(DAY FROM (MAX(o.date)::timestamp with time zone - MIN(o.date)::timestamp with time zone)) + 1 as calculation_period_days,
|
||||
COUNT(DISTINCT DATE(o.date)) as active_days
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
LEFT JOIN temp_beginning_inventory bi ON p.pid = bi.pid
|
||||
WHERE o.canceled = false
|
||||
AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
|
||||
GROUP BY p.pid
|
||||
AND DATE(o.date) >= CURRENT_DATE - INTERVAL '12 months'::interval
|
||||
GROUP BY p.pid, p.cost_price, p.stock_quantity, bi.average_inventory_value
|
||||
)
|
||||
UPDATE product_metrics pm
|
||||
JOIN product_financials pf ON pm.pid = pf.pid
|
||||
SET
|
||||
pm.inventory_value = COALESCE(pf.inventory_value, 0),
|
||||
pm.total_revenue = COALESCE(pf.total_revenue, 0),
|
||||
pm.cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0),
|
||||
pm.gross_profit = COALESCE(pf.gross_profit, 0),
|
||||
pm.gmroi = CASE
|
||||
WHEN COALESCE(pf.inventory_value, 0) > 0 AND pf.active_days > 0 THEN
|
||||
(COALESCE(pf.gross_profit, 0) * (365.0 / pf.active_days)) / COALESCE(pf.inventory_value, 0)
|
||||
inventory_value = COALESCE(pf.current_inventory_value, 0)::decimal(10,3),
|
||||
total_revenue = COALESCE(pf.total_revenue, 0)::decimal(10,3),
|
||||
cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0)::decimal(10,3),
|
||||
gross_profit = COALESCE(pf.gross_profit, 0)::decimal(10,3),
|
||||
turnover_rate = CASE
|
||||
WHEN COALESCE(pf.avg_inventory_value, 0) > 0 THEN
|
||||
COALESCE(pf.cost_of_goods_sold, 0) / NULLIF(pf.avg_inventory_value, 0)
|
||||
ELSE 0
|
||||
END,
|
||||
pm.last_calculated_at = CURRENT_TIMESTAMP
|
||||
END::decimal(12,3),
|
||||
gmroi = CASE
|
||||
WHEN COALESCE(pf.avg_inventory_value, 0) > 0 THEN
|
||||
COALESCE(pf.gross_profit, 0) / NULLIF(pf.avg_inventory_value, 0)
|
||||
ELSE 0
|
||||
END::decimal(10,3),
|
||||
last_calculated_at = CURRENT_TIMESTAMP
|
||||
FROM product_financials pf
|
||||
WHERE pm.pid = pf.pid
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.65);
|
||||
@@ -114,52 +173,8 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
||||
success
|
||||
};
|
||||
|
||||
// Update time-based aggregates with optimized query
|
||||
await connection.query(`
|
||||
WITH monthly_financials AS (
|
||||
SELECT
|
||||
p.pid,
|
||||
YEAR(o.date) as year,
|
||||
MONTH(o.date) as month,
|
||||
p.cost_price * p.stock_quantity as inventory_value,
|
||||
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
||||
COUNT(DISTINCT DATE(o.date)) as active_days,
|
||||
MIN(o.date) as period_start,
|
||||
MAX(o.date) as period_end
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.canceled = false
|
||||
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
|
||||
)
|
||||
UPDATE product_time_aggregates pta
|
||||
JOIN monthly_financials mf ON pta.pid = mf.pid
|
||||
AND pta.year = mf.year
|
||||
AND pta.month = mf.month
|
||||
SET
|
||||
pta.inventory_value = COALESCE(mf.inventory_value, 0),
|
||||
pta.gmroi = CASE
|
||||
WHEN COALESCE(mf.inventory_value, 0) > 0 AND mf.active_days > 0 THEN
|
||||
(COALESCE(mf.gross_profit, 0) * (365.0 / mf.active_days)) / COALESCE(mf.inventory_value, 0)
|
||||
ELSE 0
|
||||
END
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.70);
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Time-based aggregates updated',
|
||||
current: processedCount,
|
||||
total: totalProducts,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedCount, totalProducts),
|
||||
rate: calculateRate(startTime, processedCount),
|
||||
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
// Clean up temporary tables
|
||||
await connection.query('DROP TABLE IF EXISTS temp_beginning_inventory');
|
||||
|
||||
// If we get here, everything completed successfully
|
||||
success = true;
|
||||
@@ -168,7 +183,8 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('financial_metrics', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
ON CONFLICT (module_name) DO UPDATE
|
||||
SET last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
@@ -184,6 +200,12 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
||||
throw error;
|
||||
} finally {
|
||||
if (connection) {
|
||||
try {
|
||||
// Make sure temporary tables are always cleaned up
|
||||
await connection.query('DROP TABLE IF EXISTS temp_beginning_inventory');
|
||||
} catch (err) {
|
||||
console.error('Error cleaning up temp tables:', err);
|
||||
}
|
||||
connection.release();
|
||||
}
|
||||
}
|
||||
|
||||
@@ -10,20 +10,21 @@ function sanitizeValue(value) {
|
||||
}
|
||||
|
||||
async function calculateProductMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
|
||||
const connection = await getConnection();
|
||||
let connection;
|
||||
let success = false;
|
||||
let processedOrders = 0;
|
||||
const BATCH_SIZE = 5000;
|
||||
|
||||
try {
|
||||
connection = await getConnection();
|
||||
// Skip flags are inherited from the parent scope
|
||||
const SKIP_PRODUCT_BASE_METRICS = 0;
|
||||
const SKIP_PRODUCT_TIME_AGGREGATES = 0;
|
||||
|
||||
// Get total product count if not provided
|
||||
if (!totalProducts) {
|
||||
const [productCount] = await connection.query('SELECT COUNT(*) as count FROM products');
|
||||
totalProducts = productCount[0].count;
|
||||
const productCount = await connection.query('SELECT COUNT(*) as count FROM products');
|
||||
totalProducts = parseInt(productCount.rows[0].count);
|
||||
}
|
||||
|
||||
if (isCancelled) {
|
||||
@@ -52,19 +53,48 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
|
||||
// First ensure all products have a metrics record
|
||||
await connection.query(`
|
||||
INSERT IGNORE INTO product_metrics (pid, last_calculated_at)
|
||||
INSERT INTO product_metrics (pid, last_calculated_at)
|
||||
SELECT pid, NOW()
|
||||
FROM products
|
||||
ON CONFLICT (pid) DO NOTHING
|
||||
`);
|
||||
|
||||
// Get threshold settings once
|
||||
const [thresholds] = await connection.query(`
|
||||
const thresholds = await connection.query(`
|
||||
SELECT critical_days, reorder_days, overstock_days, low_stock_threshold
|
||||
FROM stock_thresholds
|
||||
WHERE category_id IS NULL AND vendor IS NULL
|
||||
LIMIT 1
|
||||
`);
|
||||
const defaultThresholds = thresholds[0];
|
||||
|
||||
// Check if threshold data was returned
|
||||
if (!thresholds.rows || thresholds.rows.length === 0) {
|
||||
console.warn('No default thresholds found in the database. Using explicit type casting in the query.');
|
||||
}
|
||||
|
||||
const defaultThresholds = thresholds.rows[0];
|
||||
|
||||
// Get financial calculation configuration parameters
|
||||
const financialConfig = await connection.query(`
|
||||
SELECT
|
||||
order_cost,
|
||||
holding_rate,
|
||||
service_level_z_score,
|
||||
min_reorder_qty,
|
||||
default_reorder_qty,
|
||||
default_safety_stock
|
||||
FROM financial_calc_config
|
||||
WHERE id = 1
|
||||
LIMIT 1
|
||||
`);
|
||||
const finConfig = financialConfig.rows[0] || {
|
||||
order_cost: 25.00,
|
||||
holding_rate: 0.25,
|
||||
service_level_z_score: 1.96,
|
||||
min_reorder_qty: 1,
|
||||
default_reorder_qty: 5,
|
||||
default_safety_stock: 5
|
||||
};
|
||||
|
||||
// Calculate base product metrics
|
||||
if (!SKIP_PRODUCT_BASE_METRICS) {
|
||||
@@ -85,16 +115,45 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
});
|
||||
|
||||
// Get order count that will be processed
|
||||
const [orderCount] = await connection.query(`
|
||||
const orderCount = await connection.query(`
|
||||
SELECT COUNT(*) as count
|
||||
FROM orders o
|
||||
WHERE o.canceled = false
|
||||
`);
|
||||
processedOrders = orderCount[0].count;
|
||||
processedOrders = parseInt(orderCount.rows[0].count);
|
||||
|
||||
// Clear temporary tables
|
||||
await connection.query('TRUNCATE TABLE temp_sales_metrics');
|
||||
await connection.query('TRUNCATE TABLE temp_purchase_metrics');
|
||||
await connection.query('DROP TABLE IF EXISTS temp_sales_metrics');
|
||||
await connection.query('DROP TABLE IF EXISTS temp_purchase_metrics');
|
||||
|
||||
// Create temp_sales_metrics
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE temp_sales_metrics (
|
||||
pid BIGINT NOT NULL,
|
||||
daily_sales_avg DECIMAL(10,3),
|
||||
weekly_sales_avg DECIMAL(10,3),
|
||||
monthly_sales_avg DECIMAL(10,3),
|
||||
total_revenue DECIMAL(10,3),
|
||||
avg_margin_percent DECIMAL(10,3),
|
||||
first_sale_date DATE,
|
||||
last_sale_date DATE,
|
||||
stddev_daily_sales DECIMAL(10,3),
|
||||
PRIMARY KEY (pid)
|
||||
)
|
||||
`);
|
||||
|
||||
// Create temp_purchase_metrics
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE temp_purchase_metrics (
|
||||
pid BIGINT NOT NULL,
|
||||
avg_lead_time_days DECIMAL(10,2),
|
||||
last_purchase_date DATE,
|
||||
first_received_date DATE,
|
||||
last_received_date DATE,
|
||||
stddev_lead_time_days DECIMAL(10,2),
|
||||
PRIMARY KEY (pid)
|
||||
)
|
||||
`);
|
||||
|
||||
// Populate temp_sales_metrics with base stats and sales averages
|
||||
await connection.query(`
|
||||
@@ -111,120 +170,186 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
ELSE 0
|
||||
END as avg_margin_percent,
|
||||
MIN(o.date) as first_sale_date,
|
||||
MAX(o.date) as last_sale_date
|
||||
MAX(o.date) as last_sale_date,
|
||||
COALESCE(STDDEV_SAMP(daily_qty.quantity), 0) as stddev_daily_sales
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
AND o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURDATE(), INTERVAL 90 DAY)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||
LEFT JOIN (
|
||||
SELECT
|
||||
pid,
|
||||
DATE(date) as sale_date,
|
||||
SUM(quantity) as quantity
|
||||
FROM orders
|
||||
WHERE canceled = false
|
||||
AND date >= CURRENT_DATE - INTERVAL '90 days'
|
||||
GROUP BY pid, DATE(date)
|
||||
) daily_qty ON p.pid = daily_qty.pid
|
||||
GROUP BY p.pid
|
||||
`);
|
||||
|
||||
// Populate temp_purchase_metrics
|
||||
await connection.query(`
|
||||
INSERT INTO temp_purchase_metrics
|
||||
SELECT
|
||||
p.pid,
|
||||
AVG(DATEDIFF(po.received_date, po.date)) as avg_lead_time_days,
|
||||
MAX(po.date) as last_purchase_date,
|
||||
MIN(po.received_date) as first_received_date,
|
||||
MAX(po.received_date) as last_received_date
|
||||
FROM products p
|
||||
LEFT JOIN purchase_orders po ON p.pid = po.pid
|
||||
AND po.received_date IS NOT NULL
|
||||
AND po.date >= DATE_SUB(CURDATE(), INTERVAL 365 DAY)
|
||||
GROUP BY p.pid
|
||||
`);
|
||||
// Populate temp_purchase_metrics with timeout protection
|
||||
await Promise.race([
|
||||
connection.query(`
|
||||
INSERT INTO temp_purchase_metrics
|
||||
SELECT
|
||||
p.pid,
|
||||
AVG(
|
||||
CASE
|
||||
WHEN po.received_date IS NOT NULL AND po.date IS NOT NULL
|
||||
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
|
||||
ELSE NULL
|
||||
END
|
||||
) as avg_lead_time_days,
|
||||
MAX(po.date) as last_purchase_date,
|
||||
MIN(po.received_date) as first_received_date,
|
||||
MAX(po.received_date) as last_received_date,
|
||||
STDDEV_SAMP(
|
||||
CASE
|
||||
WHEN po.received_date IS NOT NULL AND po.date IS NOT NULL
|
||||
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
|
||||
ELSE NULL
|
||||
END
|
||||
) as stddev_lead_time_days
|
||||
FROM products p
|
||||
LEFT JOIN purchase_orders po ON p.pid = po.pid
|
||||
AND po.received_date IS NOT NULL
|
||||
AND po.date IS NOT NULL
|
||||
AND po.date >= CURRENT_DATE - INTERVAL '365 days'
|
||||
GROUP BY p.pid
|
||||
`),
|
||||
new Promise((_, reject) =>
|
||||
setTimeout(() => reject(new Error('Timeout: temp_purchase_metrics query took too long')), 60000)
|
||||
)
|
||||
]).catch(async (err) => {
|
||||
logError(err, 'Error populating temp_purchase_metrics, continuing with empty table');
|
||||
// Create an empty fallback to continue processing
|
||||
await connection.query(`
|
||||
INSERT INTO temp_purchase_metrics
|
||||
SELECT
|
||||
p.pid,
|
||||
30.0 as avg_lead_time_days,
|
||||
NULL as last_purchase_date,
|
||||
NULL as first_received_date,
|
||||
NULL as last_received_date,
|
||||
0.0 as stddev_lead_time_days
|
||||
FROM products p
|
||||
LEFT JOIN temp_purchase_metrics tpm ON p.pid = tpm.pid
|
||||
WHERE tpm.pid IS NULL
|
||||
`);
|
||||
});
|
||||
|
||||
// Process updates in batches
|
||||
let lastPid = 0;
|
||||
while (true) {
|
||||
let batchCount = 0;
|
||||
const MAX_BATCHES = 1000; // Safety limit for number of batches to prevent infinite loops
|
||||
|
||||
while (batchCount < MAX_BATCHES) {
|
||||
if (isCancelled) break;
|
||||
|
||||
const [batch] = await connection.query(
|
||||
'SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?',
|
||||
|
||||
batchCount++;
|
||||
const batch = await connection.query(
|
||||
'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
|
||||
[lastPid, BATCH_SIZE]
|
||||
);
|
||||
|
||||
if (batch.length === 0) break;
|
||||
if (batch.rows.length === 0) break;
|
||||
|
||||
// Process the entire batch in a single efficient query
|
||||
const lowStockThreshold = parseInt(defaultThresholds?.low_stock_threshold) || 5;
|
||||
const criticalDays = parseInt(defaultThresholds?.critical_days) || 7;
|
||||
const reorderDays = parseInt(defaultThresholds?.reorder_days) || 14;
|
||||
const overstockDays = parseInt(defaultThresholds?.overstock_days) || 90;
|
||||
const serviceLevel = parseFloat(finConfig?.service_level_z_score) || 1.96;
|
||||
const defaultSafetyStock = parseInt(finConfig?.default_safety_stock) || 5;
|
||||
const defaultReorderQty = parseInt(finConfig?.default_reorder_qty) || 5;
|
||||
const orderCost = parseFloat(finConfig?.order_cost) || 25.00;
|
||||
const holdingRate = parseFloat(finConfig?.holding_rate) || 0.25;
|
||||
const minReorderQty = parseInt(finConfig?.min_reorder_qty) || 1;
|
||||
|
||||
await connection.query(`
|
||||
UPDATE product_metrics pm
|
||||
JOIN products p ON pm.pid = p.pid
|
||||
LEFT JOIN temp_sales_metrics sm ON pm.pid = sm.pid
|
||||
LEFT JOIN temp_purchase_metrics lm ON pm.pid = lm.pid
|
||||
SET
|
||||
pm.inventory_value = p.stock_quantity * NULLIF(p.cost_price, 0),
|
||||
pm.daily_sales_avg = COALESCE(sm.daily_sales_avg, 0),
|
||||
pm.weekly_sales_avg = COALESCE(sm.weekly_sales_avg, 0),
|
||||
pm.monthly_sales_avg = COALESCE(sm.monthly_sales_avg, 0),
|
||||
pm.total_revenue = COALESCE(sm.total_revenue, 0),
|
||||
pm.avg_margin_percent = COALESCE(sm.avg_margin_percent, 0),
|
||||
pm.first_sale_date = sm.first_sale_date,
|
||||
pm.last_sale_date = sm.last_sale_date,
|
||||
pm.avg_lead_time_days = COALESCE(lm.avg_lead_time_days, 30),
|
||||
pm.days_of_inventory = CASE
|
||||
inventory_value = p.stock_quantity * NULLIF(p.cost_price, 0),
|
||||
daily_sales_avg = COALESCE(sm.daily_sales_avg, 0),
|
||||
weekly_sales_avg = COALESCE(sm.weekly_sales_avg, 0),
|
||||
monthly_sales_avg = COALESCE(sm.monthly_sales_avg, 0),
|
||||
total_revenue = COALESCE(sm.total_revenue, 0),
|
||||
avg_margin_percent = COALESCE(sm.avg_margin_percent, 0),
|
||||
first_sale_date = sm.first_sale_date,
|
||||
last_sale_date = sm.last_sale_date,
|
||||
avg_lead_time_days = COALESCE(lm.avg_lead_time_days, 30.0),
|
||||
days_of_inventory = CASE
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0
|
||||
THEN FLOOR(p.stock_quantity / NULLIF(sm.daily_sales_avg, 0))
|
||||
ELSE NULL
|
||||
END,
|
||||
pm.weeks_of_inventory = CASE
|
||||
weeks_of_inventory = CASE
|
||||
WHEN COALESCE(sm.weekly_sales_avg, 0) > 0
|
||||
THEN FLOOR(p.stock_quantity / NULLIF(sm.weekly_sales_avg, 0))
|
||||
ELSE NULL
|
||||
END,
|
||||
pm.stock_status = CASE
|
||||
stock_status = CASE
|
||||
WHEN p.stock_quantity <= 0 THEN 'Out of Stock'
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 AND p.stock_quantity <= ? THEN 'Low Stock'
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 AND p.stock_quantity <= ${lowStockThreshold} THEN 'Low Stock'
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 THEN 'In Stock'
|
||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ? THEN 'Critical'
|
||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ? THEN 'Reorder'
|
||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ? THEN 'Overstocked'
|
||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ${criticalDays} THEN 'Critical'
|
||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ${reorderDays} THEN 'Reorder'
|
||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ${overstockDays} THEN 'Overstocked'
|
||||
ELSE 'Healthy'
|
||||
END,
|
||||
pm.safety_stock = CASE
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
|
||||
CEIL(sm.daily_sales_avg * SQRT(COALESCE(lm.avg_lead_time_days, 30)) * 1.96)
|
||||
ELSE ?
|
||||
END,
|
||||
pm.reorder_point = CASE
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
|
||||
CEIL(sm.daily_sales_avg * COALESCE(lm.avg_lead_time_days, 30)) +
|
||||
CEIL(sm.daily_sales_avg * SQRT(COALESCE(lm.avg_lead_time_days, 30)) * 1.96)
|
||||
ELSE ?
|
||||
END,
|
||||
pm.reorder_qty = CASE
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND NULLIF(p.cost_price, 0) IS NOT NULL THEN
|
||||
GREATEST(
|
||||
CEIL(SQRT((2 * (sm.daily_sales_avg * 365) * 25) / (NULLIF(p.cost_price, 0) * 0.25))),
|
||||
?
|
||||
safety_stock = CASE
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND COALESCE(lm.avg_lead_time_days, 0) > 0 THEN
|
||||
CEIL(
|
||||
${serviceLevel} * SQRT(
|
||||
GREATEST(0, COALESCE(lm.avg_lead_time_days, 0)) * POWER(COALESCE(sm.stddev_daily_sales, 0), 2) +
|
||||
POWER(COALESCE(sm.daily_sales_avg, 0), 2) * POWER(COALESCE(lm.stddev_lead_time_days, 0), 2)
|
||||
)
|
||||
)
|
||||
ELSE ?
|
||||
ELSE ${defaultSafetyStock}
|
||||
END,
|
||||
pm.overstocked_amt = CASE
|
||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ?
|
||||
THEN GREATEST(0, p.stock_quantity - CEIL(sm.daily_sales_avg * ?))
|
||||
reorder_point = CASE
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
|
||||
CEIL(sm.daily_sales_avg * GREATEST(0, COALESCE(lm.avg_lead_time_days, 30.0))) +
|
||||
(CASE
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND COALESCE(lm.avg_lead_time_days, 0) > 0 THEN
|
||||
CEIL(
|
||||
${serviceLevel} * SQRT(
|
||||
GREATEST(0, COALESCE(lm.avg_lead_time_days, 0)) * POWER(COALESCE(sm.stddev_daily_sales, 0), 2) +
|
||||
POWER(COALESCE(sm.daily_sales_avg, 0), 2) * POWER(COALESCE(lm.stddev_lead_time_days, 0), 2)
|
||||
)
|
||||
)
|
||||
ELSE ${defaultSafetyStock}
|
||||
END)
|
||||
ELSE ${lowStockThreshold}
|
||||
END,
|
||||
reorder_qty = CASE
|
||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND NULLIF(p.cost_price, 0) IS NOT NULL AND NULLIF(p.cost_price, 0) > 0 THEN
|
||||
GREATEST(
|
||||
CEIL(SQRT(
|
||||
(2 * (sm.daily_sales_avg * 365) * ${orderCost}) /
|
||||
NULLIF(p.cost_price * ${holdingRate}, 0)
|
||||
)),
|
||||
${minReorderQty}
|
||||
)
|
||||
ELSE ${defaultReorderQty}
|
||||
END,
|
||||
overstocked_amt = CASE
|
||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ${overstockDays}
|
||||
THEN GREATEST(0, p.stock_quantity - CEIL(sm.daily_sales_avg * ${overstockDays}))
|
||||
ELSE 0
|
||||
END,
|
||||
pm.last_calculated_at = NOW()
|
||||
WHERE p.pid IN (${batch.map(() => '?').join(',')})
|
||||
`,
|
||||
[
|
||||
defaultThresholds.low_stock_threshold,
|
||||
defaultThresholds.critical_days,
|
||||
defaultThresholds.reorder_days,
|
||||
defaultThresholds.overstock_days,
|
||||
defaultThresholds.low_stock_threshold,
|
||||
defaultThresholds.low_stock_threshold,
|
||||
defaultThresholds.low_stock_threshold,
|
||||
defaultThresholds.low_stock_threshold,
|
||||
defaultThresholds.overstock_days,
|
||||
defaultThresholds.overstock_days,
|
||||
...batch.map(row => row.pid)
|
||||
]
|
||||
);
|
||||
last_calculated_at = NOW()
|
||||
FROM products p
|
||||
LEFT JOIN temp_sales_metrics sm ON p.pid = sm.pid
|
||||
LEFT JOIN temp_purchase_metrics lm ON p.pid = lm.pid
|
||||
WHERE p.pid = ANY($1::BIGINT[])
|
||||
AND pm.pid = p.pid
|
||||
`, [batch.rows.map(row => row.pid)]);
|
||||
|
||||
lastPid = batch[batch.length - 1].pid;
|
||||
processedCount += batch.length;
|
||||
lastPid = batch.rows[batch.rows.length - 1].pid;
|
||||
processedCount += batch.rows.length;
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
@@ -243,54 +368,63 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
});
|
||||
}
|
||||
|
||||
// Calculate forecast accuracy and bias in batches
|
||||
lastPid = 0;
|
||||
while (true) {
|
||||
if (isCancelled) break;
|
||||
|
||||
const [batch] = await connection.query(
|
||||
'SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?',
|
||||
[lastPid, BATCH_SIZE]
|
||||
);
|
||||
|
||||
if (batch.length === 0) break;
|
||||
|
||||
await connection.query(`
|
||||
UPDATE product_metrics pm
|
||||
JOIN (
|
||||
SELECT
|
||||
sf.pid,
|
||||
AVG(CASE
|
||||
WHEN o.quantity > 0
|
||||
THEN ABS(sf.forecast_units - o.quantity) / o.quantity * 100
|
||||
ELSE 100
|
||||
END) as avg_forecast_error,
|
||||
AVG(CASE
|
||||
WHEN o.quantity > 0
|
||||
THEN (sf.forecast_units - o.quantity) / o.quantity * 100
|
||||
ELSE 0
|
||||
END) as avg_forecast_bias,
|
||||
MAX(sf.forecast_date) as last_forecast_date
|
||||
FROM sales_forecasts sf
|
||||
JOIN orders o ON sf.pid = o.pid
|
||||
AND DATE(o.date) = sf.forecast_date
|
||||
WHERE o.canceled = false
|
||||
AND sf.forecast_date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
||||
AND sf.pid IN (?)
|
||||
GROUP BY sf.pid
|
||||
) fa ON pm.pid = fa.pid
|
||||
SET
|
||||
pm.forecast_accuracy = GREATEST(0, 100 - LEAST(fa.avg_forecast_error, 100)),
|
||||
pm.forecast_bias = GREATEST(-100, LEAST(fa.avg_forecast_bias, 100)),
|
||||
pm.last_forecast_date = fa.last_forecast_date,
|
||||
pm.last_calculated_at = NOW()
|
||||
WHERE pm.pid IN (?)
|
||||
`, [batch.map(row => row.pid), batch.map(row => row.pid)]);
|
||||
|
||||
lastPid = batch[batch.length - 1].pid;
|
||||
// Add safety check if the loop processed MAX_BATCHES
|
||||
if (batchCount >= MAX_BATCHES) {
|
||||
logError(new Error(`Reached maximum batch count (${MAX_BATCHES}). Process may have entered an infinite loop.`), 'Batch processing safety limit reached');
|
||||
}
|
||||
}
|
||||
|
||||
// Calculate forecast accuracy and bias in batches
|
||||
let forecastPid = 0;
|
||||
while (true) {
|
||||
if (isCancelled) break;
|
||||
|
||||
const forecastBatch = await connection.query(
|
||||
'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
|
||||
[forecastPid, BATCH_SIZE]
|
||||
);
|
||||
|
||||
if (forecastBatch.rows.length === 0) break;
|
||||
|
||||
const forecastPidArray = forecastBatch.rows.map(row => row.pid);
|
||||
|
||||
// Use array_to_string to convert the array to a string of comma-separated values
|
||||
await connection.query(`
|
||||
WITH forecast_metrics AS (
|
||||
SELECT
|
||||
sf.pid,
|
||||
AVG(CASE
|
||||
WHEN o.quantity > 0
|
||||
THEN ABS(sf.forecast_quantity - o.quantity) / o.quantity * 100
|
||||
ELSE 100
|
||||
END) as avg_forecast_error,
|
||||
AVG(CASE
|
||||
WHEN o.quantity > 0
|
||||
THEN (sf.forecast_quantity - o.quantity) / o.quantity * 100
|
||||
ELSE 0
|
||||
END) as avg_forecast_bias,
|
||||
MAX(sf.forecast_date) as last_forecast_date
|
||||
FROM sales_forecasts sf
|
||||
JOIN orders o ON sf.pid = o.pid
|
||||
AND DATE(o.date) = sf.forecast_date
|
||||
WHERE o.canceled = false
|
||||
AND sf.forecast_date >= CURRENT_DATE - INTERVAL '90 days'
|
||||
AND sf.pid = ANY('{${forecastPidArray.join(',')}}'::BIGINT[])
|
||||
GROUP BY sf.pid
|
||||
)
|
||||
UPDATE product_metrics pm
|
||||
SET
|
||||
forecast_accuracy = GREATEST(0, 100 - LEAST(fm.avg_forecast_error, 100)),
|
||||
forecast_bias = GREATEST(-100, LEAST(fm.avg_forecast_bias, 100)),
|
||||
last_forecast_date = fm.last_forecast_date,
|
||||
last_calculated_at = NOW()
|
||||
FROM forecast_metrics fm
|
||||
WHERE pm.pid = fm.pid
|
||||
`);
|
||||
|
||||
forecastPid = forecastBatch.rows[forecastBatch.rows.length - 1].pid;
|
||||
}
|
||||
|
||||
// Calculate product time aggregates
|
||||
if (!SKIP_PRODUCT_TIME_AGGREGATES) {
|
||||
outputProgress({
|
||||
@@ -309,60 +443,12 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
}
|
||||
});
|
||||
|
||||
// Calculate time-based aggregates
|
||||
await connection.query(`
|
||||
INSERT INTO product_time_aggregates (
|
||||
pid,
|
||||
year,
|
||||
month,
|
||||
total_quantity_sold,
|
||||
total_revenue,
|
||||
total_cost,
|
||||
order_count,
|
||||
avg_price,
|
||||
profit_margin,
|
||||
inventory_value,
|
||||
gmroi
|
||||
)
|
||||
SELECT
|
||||
p.pid,
|
||||
YEAR(o.date) as year,
|
||||
MONTH(o.date) as month,
|
||||
SUM(o.quantity) as total_quantity_sold,
|
||||
SUM(o.quantity * o.price) as total_revenue,
|
||||
SUM(o.quantity * p.cost_price) as total_cost,
|
||||
COUNT(DISTINCT o.order_number) as order_count,
|
||||
AVG(o.price) as avg_price,
|
||||
CASE
|
||||
WHEN SUM(o.quantity * o.price) > 0
|
||||
THEN ((SUM(o.quantity * o.price) - SUM(o.quantity * p.cost_price)) / SUM(o.quantity * o.price)) * 100
|
||||
ELSE 0
|
||||
END as profit_margin,
|
||||
p.cost_price * p.stock_quantity as inventory_value,
|
||||
CASE
|
||||
WHEN p.cost_price * p.stock_quantity > 0
|
||||
THEN (SUM(o.quantity * (o.price - p.cost_price))) / (p.cost_price * p.stock_quantity)
|
||||
ELSE 0
|
||||
END as gmroi
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
|
||||
WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
|
||||
ON DUPLICATE KEY UPDATE
|
||||
total_quantity_sold = VALUES(total_quantity_sold),
|
||||
total_revenue = VALUES(total_revenue),
|
||||
total_cost = VALUES(total_cost),
|
||||
order_count = VALUES(order_count),
|
||||
avg_price = VALUES(avg_price),
|
||||
profit_margin = VALUES(profit_margin),
|
||||
inventory_value = VALUES(inventory_value),
|
||||
gmroi = VALUES(gmroi)
|
||||
`);
|
||||
|
||||
// Note: The time-aggregates calculation has been moved to time-aggregates.js
|
||||
// This module will not duplicate that functionality
|
||||
processedCount = Math.floor(totalProducts * 0.6);
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Product time aggregates calculated',
|
||||
operation: 'Product time aggregates calculation delegated to time-aggregates module',
|
||||
current: processedCount || 0,
|
||||
total: totalProducts || 0,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
@@ -418,11 +504,15 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
success
|
||||
};
|
||||
|
||||
const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
|
||||
const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };
|
||||
const abcConfig = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
|
||||
const abcThresholds = abcConfig.rows[0] || { a_threshold: 20, b_threshold: 50 };
|
||||
|
||||
// Extract values and ensure they are valid numbers
|
||||
const aThreshold = parseFloat(abcThresholds.a_threshold) || 20;
|
||||
const bThreshold = parseFloat(abcThresholds.b_threshold) || 50;
|
||||
|
||||
// First, create and populate the rankings table with an index
|
||||
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
|
||||
await connection.query('DROP TABLE IF EXISTS temp_revenue_ranks');
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE temp_revenue_ranks (
|
||||
pid BIGINT NOT NULL,
|
||||
@@ -431,12 +521,12 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
dense_rank_num INT,
|
||||
percentile DECIMAL(5,2),
|
||||
total_count INT,
|
||||
PRIMARY KEY (pid),
|
||||
INDEX (rank_num),
|
||||
INDEX (dense_rank_num),
|
||||
INDEX (percentile)
|
||||
) ENGINE=MEMORY
|
||||
PRIMARY KEY (pid)
|
||||
)
|
||||
`);
|
||||
await connection.query('CREATE INDEX ON temp_revenue_ranks (rank_num)');
|
||||
await connection.query('CREATE INDEX ON temp_revenue_ranks (dense_rank_num)');
|
||||
await connection.query('CREATE INDEX ON temp_revenue_ranks (percentile)');
|
||||
|
||||
// Calculate rankings with proper tie handling
|
||||
await connection.query(`
|
||||
@@ -463,58 +553,74 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
`);
|
||||
|
||||
// Get total count for percentage calculation
|
||||
const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
|
||||
const totalCount = rankingCount[0].total_count || 1;
|
||||
const max_rank = totalCount;
|
||||
const rankingCount = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
|
||||
const totalCount = parseInt(rankingCount.rows[0].total_count) || 1;
|
||||
|
||||
// Process updates in batches
|
||||
let abcProcessedCount = 0;
|
||||
const batchSize = 5000;
|
||||
const maxPid = await connection.query('SELECT MAX(pid) as max_pid FROM products');
|
||||
const maxProductId = parseInt(maxPid.rows[0].max_pid);
|
||||
|
||||
while (true) {
|
||||
while (abcProcessedCount < maxProductId) {
|
||||
if (isCancelled) return {
|
||||
processedProducts: processedCount,
|
||||
processedOrders,
|
||||
processedPurchaseOrders: 0, // This module doesn't process POs
|
||||
processedPurchaseOrders: 0,
|
||||
success
|
||||
};
|
||||
|
||||
// Get a batch of PIDs that need updating
|
||||
const [pids] = await connection.query(`
|
||||
const pids = await connection.query(`
|
||||
SELECT pm.pid
|
||||
FROM product_metrics pm
|
||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
||||
WHERE pm.abc_class IS NULL
|
||||
OR pm.abc_class !=
|
||||
CASE
|
||||
WHEN tr.pid IS NULL THEN 'C'
|
||||
WHEN tr.percentile <= ? THEN 'A'
|
||||
WHEN tr.percentile <= ? THEN 'B'
|
||||
ELSE 'C'
|
||||
END
|
||||
LIMIT ?
|
||||
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, batchSize]);
|
||||
WHERE pm.pid > $1
|
||||
AND (pm.abc_class IS NULL
|
||||
OR pm.abc_class !=
|
||||
CASE
|
||||
WHEN tr.pid IS NULL THEN 'C'
|
||||
WHEN tr.percentile <= ${aThreshold} THEN 'A'
|
||||
WHEN tr.percentile <= ${bThreshold} THEN 'B'
|
||||
ELSE 'C'
|
||||
END)
|
||||
ORDER BY pm.pid
|
||||
LIMIT $2
|
||||
`, [abcProcessedCount, batchSize]);
|
||||
|
||||
if (pids.length === 0) break;
|
||||
if (pids.rows.length === 0) break;
|
||||
|
||||
const pidValues = pids.rows.map(row => row.pid);
|
||||
|
||||
await connection.query(`
|
||||
UPDATE product_metrics pm
|
||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
||||
SET pm.abc_class =
|
||||
SET abc_class =
|
||||
CASE
|
||||
WHEN tr.pid IS NULL THEN 'C'
|
||||
WHEN tr.percentile <= ? THEN 'A'
|
||||
WHEN tr.percentile <= ? THEN 'B'
|
||||
WHEN tr.percentile <= ${aThreshold} THEN 'A'
|
||||
WHEN tr.percentile <= ${bThreshold} THEN 'B'
|
||||
ELSE 'C'
|
||||
END,
|
||||
pm.last_calculated_at = NOW()
|
||||
WHERE pm.pid IN (?)
|
||||
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, pids.map(row => row.pid)]);
|
||||
last_calculated_at = NOW()
|
||||
FROM (SELECT pid, percentile FROM temp_revenue_ranks) tr
|
||||
WHERE pm.pid = tr.pid AND pm.pid = ANY($1::BIGINT[])
|
||||
OR (pm.pid = ANY($1::BIGINT[]) AND tr.pid IS NULL)
|
||||
`, [pidValues]);
|
||||
|
||||
// Now update turnover rate with proper handling of zero inventory periods
|
||||
await connection.query(`
|
||||
UPDATE product_metrics pm
|
||||
JOIN (
|
||||
SET
|
||||
turnover_rate = CASE
|
||||
WHEN sales.avg_nonzero_stock > 0 AND sales.active_days > 0
|
||||
THEN LEAST(
|
||||
(sales.total_sold / sales.avg_nonzero_stock) * (365.0 / sales.active_days),
|
||||
999.99
|
||||
)
|
||||
ELSE 0
|
||||
END,
|
||||
last_calculated_at = NOW()
|
||||
FROM (
|
||||
SELECT
|
||||
o.pid,
|
||||
SUM(o.quantity) as total_sold,
|
||||
@@ -526,22 +632,33 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
FROM orders o
|
||||
JOIN products p ON o.pid = p.pid
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
||||
AND o.pid IN (?)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||
AND o.pid = ANY($1::BIGINT[])
|
||||
GROUP BY o.pid
|
||||
) sales ON pm.pid = sales.pid
|
||||
SET
|
||||
pm.turnover_rate = CASE
|
||||
WHEN sales.avg_nonzero_stock > 0 AND sales.active_days > 0
|
||||
THEN LEAST(
|
||||
(sales.total_sold / sales.avg_nonzero_stock) * (365.0 / sales.active_days),
|
||||
999.99
|
||||
)
|
||||
ELSE 0
|
||||
END,
|
||||
pm.last_calculated_at = NOW()
|
||||
WHERE pm.pid IN (?)
|
||||
`, [pids.map(row => row.pid), pids.map(row => row.pid)]);
|
||||
) sales
|
||||
WHERE pm.pid = sales.pid
|
||||
`, [pidValues]);
|
||||
|
||||
abcProcessedCount = pids.rows[pids.rows.length - 1].pid;
|
||||
|
||||
// Calculate progress proportionally to total products
|
||||
processedCount = Math.floor(totalProducts * (0.60 + (abcProcessedCount / maxProductId) * 0.2));
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'ABC classification progress',
|
||||
current: processedCount,
|
||||
total: totalProducts,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedCount, totalProducts),
|
||||
rate: calculateRate(startTime, processedCount),
|
||||
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// If we get here, everything completed successfully
|
||||
@@ -551,7 +668,8 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('product_metrics', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
ON CONFLICT (module_name) DO UPDATE
|
||||
SET last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
@@ -566,7 +684,16 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
logError(error, 'Error calculating product metrics');
|
||||
throw error;
|
||||
} finally {
|
||||
// Always clean up temporary tables, even if an error occurred
|
||||
if (connection) {
|
||||
try {
|
||||
await connection.query('DROP TABLE IF EXISTS temp_sales_metrics');
|
||||
await connection.query('DROP TABLE IF EXISTS temp_purchase_metrics');
|
||||
} catch (err) {
|
||||
console.error('Error cleaning up temporary tables:', err);
|
||||
}
|
||||
|
||||
// Make sure to release the connection
|
||||
connection.release();
|
||||
}
|
||||
}
|
||||
@@ -603,40 +730,7 @@ function calculateStockStatus(stock, config, daily_sales_avg, weekly_sales_avg,
|
||||
return 'Healthy';
|
||||
}
|
||||
|
||||
function calculateReorderQuantities(stock, stock_status, daily_sales_avg, avg_lead_time, config) {
|
||||
// Calculate safety stock based on service level and lead time
|
||||
const z_score = 1.96; // 95% service level
|
||||
const lead_time = avg_lead_time || config.target_days;
|
||||
const safety_stock = Math.ceil(daily_sales_avg * Math.sqrt(lead_time) * z_score);
|
||||
|
||||
// Calculate reorder point
|
||||
const lead_time_demand = daily_sales_avg * lead_time;
|
||||
const reorder_point = Math.ceil(lead_time_demand + safety_stock);
|
||||
|
||||
// Calculate reorder quantity using EOQ formula if we have the necessary data
|
||||
let reorder_qty = 0;
|
||||
if (daily_sales_avg > 0) {
|
||||
const annual_demand = daily_sales_avg * 365;
|
||||
const order_cost = 25; // Fixed cost per order
|
||||
const holding_cost = config.cost_price * 0.25; // 25% of unit cost as annual holding cost
|
||||
|
||||
reorder_qty = Math.ceil(Math.sqrt((2 * annual_demand * order_cost) / holding_cost));
|
||||
} else {
|
||||
// If no sales data, use a basic calculation
|
||||
reorder_qty = Math.max(safety_stock, config.low_stock_threshold);
|
||||
}
|
||||
|
||||
// Calculate overstocked amount
|
||||
const overstocked_amt = stock_status === 'Overstocked' ?
|
||||
stock - Math.ceil(daily_sales_avg * config.overstock_days) :
|
||||
0;
|
||||
|
||||
return {
|
||||
safety_stock,
|
||||
reorder_point,
|
||||
reorder_qty,
|
||||
overstocked_amt
|
||||
};
|
||||
}
|
||||
// Note: calculateReorderQuantities function has been removed as its logic has been incorporated
|
||||
// in the main SQL query with configurable parameters
|
||||
|
||||
module.exports = calculateProductMetrics;
|
||||
@@ -32,13 +32,13 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
}
|
||||
|
||||
// Get order count that will be processed
|
||||
const [orderCount] = await connection.query(`
|
||||
const orderCount = await connection.query(`
|
||||
SELECT COUNT(*) as count
|
||||
FROM orders o
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||
`);
|
||||
processedOrders = orderCount[0].count;
|
||||
processedOrders = parseInt(orderCount.rows[0].count);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
@@ -69,15 +69,15 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
await connection.query(`
|
||||
INSERT INTO temp_forecast_dates
|
||||
SELECT
|
||||
DATE_ADD(CURRENT_DATE, INTERVAL n DAY) as forecast_date,
|
||||
DAYOFWEEK(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as day_of_week,
|
||||
MONTH(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as month
|
||||
CURRENT_DATE + (n || ' days')::INTERVAL as forecast_date,
|
||||
EXTRACT(DOW FROM CURRENT_DATE + (n || ' days')::INTERVAL) + 1 as day_of_week,
|
||||
EXTRACT(MONTH FROM CURRENT_DATE + (n || ' days')::INTERVAL) as month
|
||||
FROM (
|
||||
SELECT a.N + b.N * 10 as n
|
||||
SELECT a.n + b.n * 10 as n
|
||||
FROM
|
||||
(SELECT 0 as N UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
|
||||
(SELECT 0 as n UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
|
||||
SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) a,
|
||||
(SELECT 0 as N UNION SELECT 1 UNION SELECT 2) b
|
||||
(SELECT 0 as n UNION SELECT 1 UNION SELECT 2) b
|
||||
ORDER BY n
|
||||
LIMIT 31
|
||||
) numbers
|
||||
@@ -109,17 +109,17 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
|
||||
// Create temporary table for daily sales stats
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_daily_sales AS
|
||||
CREATE TEMPORARY TABLE temp_daily_sales AS
|
||||
SELECT
|
||||
o.pid,
|
||||
DAYOFWEEK(o.date) as day_of_week,
|
||||
EXTRACT(DOW FROM o.date) + 1 as day_of_week,
|
||||
SUM(o.quantity) as daily_quantity,
|
||||
SUM(o.price * o.quantity) as daily_revenue,
|
||||
COUNT(DISTINCT DATE(o.date)) as day_count
|
||||
FROM orders o
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
||||
GROUP BY o.pid, DAYOFWEEK(o.date)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||
GROUP BY o.pid, EXTRACT(DOW FROM o.date) + 1
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.94);
|
||||
@@ -148,7 +148,7 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
|
||||
// Create temporary table for product stats
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_product_stats AS
|
||||
CREATE TEMPORARY TABLE temp_product_stats AS
|
||||
SELECT
|
||||
pid,
|
||||
AVG(daily_revenue) as overall_avg_revenue,
|
||||
@@ -186,10 +186,9 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
INSERT INTO sales_forecasts (
|
||||
pid,
|
||||
forecast_date,
|
||||
forecast_units,
|
||||
forecast_revenue,
|
||||
forecast_quantity,
|
||||
confidence_level,
|
||||
last_calculated_at
|
||||
created_at
|
||||
)
|
||||
WITH daily_stats AS (
|
||||
SELECT
|
||||
@@ -217,35 +216,9 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
GREATEST(0,
|
||||
ROUND(
|
||||
ds.avg_daily_qty *
|
||||
(1 + COALESCE(sf.seasonality_factor, 0)) *
|
||||
CASE
|
||||
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 1.5 THEN 0.85
|
||||
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 1.0 THEN 0.9
|
||||
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 0.5 THEN 0.95
|
||||
ELSE 1.0
|
||||
END,
|
||||
2
|
||||
(1 + COALESCE(sf.seasonality_factor, 0))
|
||||
)
|
||||
) as forecast_units,
|
||||
GREATEST(0,
|
||||
ROUND(
|
||||
COALESCE(
|
||||
CASE
|
||||
WHEN ds.data_points >= 4 THEN ds.avg_daily_revenue
|
||||
ELSE ps.overall_avg_revenue
|
||||
END *
|
||||
(1 + COALESCE(sf.seasonality_factor, 0)) *
|
||||
CASE
|
||||
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.5 THEN 0.85
|
||||
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.0 THEN 0.9
|
||||
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 0.5 THEN 0.95
|
||||
ELSE 1.0
|
||||
END,
|
||||
0
|
||||
),
|
||||
2
|
||||
)
|
||||
) as forecast_revenue,
|
||||
) as forecast_quantity,
|
||||
CASE
|
||||
WHEN ds.total_days >= 60 AND ds.daily_variance_ratio < 0.5 THEN 90
|
||||
WHEN ds.total_days >= 60 THEN 85
|
||||
@@ -255,17 +228,18 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
WHEN ds.total_days >= 14 THEN 65
|
||||
ELSE 60
|
||||
END as confidence_level,
|
||||
NOW() as last_calculated_at
|
||||
NOW() as created_at
|
||||
FROM daily_stats ds
|
||||
JOIN temp_product_stats ps ON ds.pid = ps.pid
|
||||
CROSS JOIN temp_forecast_dates fd
|
||||
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
|
||||
GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor
|
||||
ON DUPLICATE KEY UPDATE
|
||||
forecast_units = VALUES(forecast_units),
|
||||
forecast_revenue = VALUES(forecast_revenue),
|
||||
confidence_level = VALUES(confidence_level),
|
||||
last_calculated_at = NOW()
|
||||
GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor,
|
||||
ds.avg_daily_qty, ds.std_daily_qty, ds.avg_daily_qty, ds.total_days, ds.daily_variance_ratio
|
||||
ON CONFLICT (pid, forecast_date) DO UPDATE
|
||||
SET
|
||||
forecast_quantity = EXCLUDED.forecast_quantity,
|
||||
confidence_level = EXCLUDED.confidence_level,
|
||||
created_at = NOW()
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.98);
|
||||
@@ -294,22 +268,22 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
|
||||
// Create temporary table for category stats
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_sales AS
|
||||
CREATE TEMPORARY TABLE temp_category_sales AS
|
||||
SELECT
|
||||
pc.cat_id,
|
||||
DAYOFWEEK(o.date) as day_of_week,
|
||||
EXTRACT(DOW FROM o.date) + 1 as day_of_week,
|
||||
SUM(o.quantity) as daily_quantity,
|
||||
SUM(o.price * o.quantity) as daily_revenue,
|
||||
COUNT(DISTINCT DATE(o.date)) as day_count
|
||||
FROM orders o
|
||||
JOIN product_categories pc ON o.pid = pc.pid
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
||||
GROUP BY pc.cat_id, DAYOFWEEK(o.date)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||
GROUP BY pc.cat_id, EXTRACT(DOW FROM o.date) + 1
|
||||
`);
|
||||
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_stats AS
|
||||
CREATE TEMPORARY TABLE temp_category_stats AS
|
||||
SELECT
|
||||
cat_id,
|
||||
AVG(daily_revenue) as overall_avg_revenue,
|
||||
@@ -350,14 +324,14 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
forecast_units,
|
||||
forecast_revenue,
|
||||
confidence_level,
|
||||
last_calculated_at
|
||||
created_at
|
||||
)
|
||||
SELECT
|
||||
cs.cat_id as category_id,
|
||||
cs.cat_id::bigint as category_id,
|
||||
fd.forecast_date,
|
||||
GREATEST(0,
|
||||
AVG(cs.daily_quantity) *
|
||||
(1 + COALESCE(sf.seasonality_factor, 0))
|
||||
ROUND(AVG(cs.daily_quantity) *
|
||||
(1 + COALESCE(sf.seasonality_factor, 0)))
|
||||
) as forecast_units,
|
||||
GREATEST(0,
|
||||
COALESCE(
|
||||
@@ -365,8 +339,7 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
WHEN SUM(cs.day_count) >= 4 THEN AVG(cs.daily_revenue)
|
||||
ELSE ct.overall_avg_revenue
|
||||
END *
|
||||
(1 + COALESCE(sf.seasonality_factor, 0)) *
|
||||
(0.95 + (RAND() * 0.1)),
|
||||
(1 + COALESCE(sf.seasonality_factor, 0)),
|
||||
0
|
||||
)
|
||||
) as forecast_revenue,
|
||||
@@ -376,27 +349,34 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
WHEN ct.total_days >= 14 THEN 70
|
||||
ELSE 60
|
||||
END as confidence_level,
|
||||
NOW() as last_calculated_at
|
||||
NOW() as created_at
|
||||
FROM temp_category_sales cs
|
||||
JOIN temp_category_stats ct ON cs.cat_id = ct.cat_id
|
||||
CROSS JOIN temp_forecast_dates fd
|
||||
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
|
||||
GROUP BY cs.cat_id, fd.forecast_date, ct.overall_avg_revenue, ct.total_days, sf.seasonality_factor
|
||||
GROUP BY
|
||||
cs.cat_id,
|
||||
fd.forecast_date,
|
||||
ct.overall_avg_revenue,
|
||||
ct.total_days,
|
||||
sf.seasonality_factor,
|
||||
sf.month
|
||||
HAVING AVG(cs.daily_quantity) > 0
|
||||
ON DUPLICATE KEY UPDATE
|
||||
forecast_units = VALUES(forecast_units),
|
||||
forecast_revenue = VALUES(forecast_revenue),
|
||||
confidence_level = VALUES(confidence_level),
|
||||
last_calculated_at = NOW()
|
||||
ON CONFLICT (category_id, forecast_date) DO UPDATE
|
||||
SET
|
||||
forecast_units = EXCLUDED.forecast_units,
|
||||
forecast_revenue = EXCLUDED.forecast_revenue,
|
||||
confidence_level = EXCLUDED.confidence_level,
|
||||
created_at = NOW()
|
||||
`);
|
||||
|
||||
// Clean up temporary tables
|
||||
await connection.query(`
|
||||
DROP TEMPORARY TABLE IF EXISTS temp_forecast_dates;
|
||||
DROP TEMPORARY TABLE IF EXISTS temp_daily_sales;
|
||||
DROP TEMPORARY TABLE IF EXISTS temp_product_stats;
|
||||
DROP TEMPORARY TABLE IF EXISTS temp_category_sales;
|
||||
DROP TEMPORARY TABLE IF EXISTS temp_category_stats;
|
||||
DROP TABLE IF EXISTS temp_forecast_dates;
|
||||
DROP TABLE IF EXISTS temp_daily_sales;
|
||||
DROP TABLE IF EXISTS temp_product_stats;
|
||||
DROP TABLE IF EXISTS temp_category_sales;
|
||||
DROP TABLE IF EXISTS temp_category_stats;
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 1.0);
|
||||
@@ -423,7 +403,8 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('sales_forecasts', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
ON CONFLICT (module_name) DO UPDATE
|
||||
SET last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
@@ -439,6 +420,18 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
||||
throw error;
|
||||
} finally {
|
||||
if (connection) {
|
||||
try {
|
||||
// Ensure temporary tables are cleaned up
|
||||
await connection.query(`
|
||||
DROP TABLE IF EXISTS temp_forecast_dates;
|
||||
DROP TABLE IF EXISTS temp_daily_sales;
|
||||
DROP TABLE IF EXISTS temp_product_stats;
|
||||
DROP TABLE IF EXISTS temp_category_sales;
|
||||
DROP TABLE IF EXISTS temp_category_stats;
|
||||
`);
|
||||
} catch (err) {
|
||||
console.error('Error cleaning up temporary tables:', err);
|
||||
}
|
||||
connection.release();
|
||||
}
|
||||
}
|
||||
|
||||
@@ -32,12 +32,12 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
||||
}
|
||||
|
||||
// Get order count that will be processed
|
||||
const [orderCount] = await connection.query(`
|
||||
const orderCount = await connection.query(`
|
||||
SELECT COUNT(*) as count
|
||||
FROM orders o
|
||||
WHERE o.canceled = false
|
||||
`);
|
||||
processedOrders = orderCount[0].count;
|
||||
processedOrders = parseInt(orderCount.rows[0].count);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
@@ -55,6 +55,93 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
||||
}
|
||||
});
|
||||
|
||||
// Create a temporary table for end-of-month inventory values
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_monthly_inventory AS
|
||||
WITH months AS (
|
||||
-- Generate all year/month combinations for the last 12 months
|
||||
SELECT
|
||||
EXTRACT(YEAR FROM month_date)::INTEGER as year,
|
||||
EXTRACT(MONTH FROM month_date)::INTEGER as month,
|
||||
month_date as start_date,
|
||||
(month_date + INTERVAL '1 month'::interval - INTERVAL '1 day'::interval)::DATE as end_date
|
||||
FROM (
|
||||
SELECT generate_series(
|
||||
DATE_TRUNC('month', CURRENT_DATE - INTERVAL '12 months'::interval)::DATE,
|
||||
DATE_TRUNC('month', CURRENT_DATE)::DATE,
|
||||
INTERVAL '1 month'::interval
|
||||
) as month_date
|
||||
) dates
|
||||
),
|
||||
monthly_inventory_calc AS (
|
||||
SELECT
|
||||
p.pid,
|
||||
m.year,
|
||||
m.month,
|
||||
m.end_date,
|
||||
p.stock_quantity as current_quantity,
|
||||
-- Calculate sold during period (before end_date)
|
||||
COALESCE(SUM(
|
||||
CASE
|
||||
WHEN o.date <= m.end_date THEN o.quantity
|
||||
ELSE 0
|
||||
END
|
||||
), 0) as sold_after_end_date,
|
||||
-- Calculate received during period (before end_date)
|
||||
COALESCE(SUM(
|
||||
CASE
|
||||
WHEN po.received_date <= m.end_date THEN po.received
|
||||
ELSE 0
|
||||
END
|
||||
), 0) as received_after_end_date,
|
||||
p.cost_price
|
||||
FROM
|
||||
products p
|
||||
CROSS JOIN
|
||||
months m
|
||||
LEFT JOIN
|
||||
orders o ON p.pid = o.pid
|
||||
AND o.canceled = false
|
||||
AND o.date > m.end_date
|
||||
AND o.date <= CURRENT_DATE
|
||||
LEFT JOIN
|
||||
purchase_orders po ON p.pid = po.pid
|
||||
AND po.received_date IS NOT NULL
|
||||
AND po.received_date > m.end_date
|
||||
AND po.received_date <= CURRENT_DATE
|
||||
GROUP BY
|
||||
p.pid, m.year, m.month, m.end_date, p.stock_quantity, p.cost_price
|
||||
)
|
||||
SELECT
|
||||
pid,
|
||||
year,
|
||||
month,
|
||||
-- End of month quantity = current quantity - sold after + received after
|
||||
GREATEST(0, current_quantity - sold_after_end_date + received_after_end_date) as end_of_month_quantity,
|
||||
-- End of month inventory value
|
||||
GREATEST(0, current_quantity - sold_after_end_date + received_after_end_date) * cost_price as end_of_month_value,
|
||||
cost_price
|
||||
FROM
|
||||
monthly_inventory_calc
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.40);
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Monthly inventory values calculated, processing time aggregates',
|
||||
current: processedCount,
|
||||
total: totalProducts,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedCount, totalProducts),
|
||||
rate: calculateRate(startTime, processedCount),
|
||||
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
// Initial insert of time-based aggregates
|
||||
await connection.query(`
|
||||
INSERT INTO product_time_aggregates (
|
||||
@@ -75,76 +162,67 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
||||
WITH monthly_sales AS (
|
||||
SELECT
|
||||
o.pid,
|
||||
YEAR(o.date) as year,
|
||||
MONTH(o.date) as month,
|
||||
EXTRACT(YEAR FROM o.date::timestamp with time zone)::INTEGER as year,
|
||||
EXTRACT(MONTH FROM o.date::timestamp with time zone)::INTEGER as month,
|
||||
SUM(o.quantity) as total_quantity_sold,
|
||||
SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) as total_revenue,
|
||||
SUM(COALESCE(p.cost_price, 0) * o.quantity) as total_cost,
|
||||
SUM(COALESCE(o.costeach, 0) * o.quantity) as total_cost,
|
||||
COUNT(DISTINCT o.order_number) as order_count,
|
||||
AVG(o.price - COALESCE(o.discount, 0)) as avg_price,
|
||||
CASE
|
||||
WHEN SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) > 0
|
||||
THEN ((SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) - SUM(COALESCE(p.cost_price, 0) * o.quantity))
|
||||
THEN ((SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) - SUM(COALESCE(o.costeach, 0) * o.quantity))
|
||||
/ SUM((o.price - COALESCE(o.discount, 0)) * o.quantity)) * 100
|
||||
ELSE 0
|
||||
END as profit_margin,
|
||||
p.cost_price * p.stock_quantity as inventory_value,
|
||||
COUNT(DISTINCT DATE(o.date)) as active_days
|
||||
FROM orders o
|
||||
JOIN products p ON o.pid = p.pid
|
||||
WHERE o.canceled = false
|
||||
GROUP BY o.pid, YEAR(o.date), MONTH(o.date)
|
||||
GROUP BY o.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
|
||||
),
|
||||
monthly_stock AS (
|
||||
SELECT
|
||||
pid,
|
||||
YEAR(date) as year,
|
||||
MONTH(date) as month,
|
||||
EXTRACT(YEAR FROM date::timestamp with time zone)::INTEGER as year,
|
||||
EXTRACT(MONTH FROM date::timestamp with time zone)::INTEGER as month,
|
||||
SUM(received) as stock_received,
|
||||
SUM(ordered) as stock_ordered
|
||||
FROM purchase_orders
|
||||
GROUP BY pid, YEAR(date), MONTH(date)
|
||||
),
|
||||
base_products AS (
|
||||
SELECT
|
||||
p.pid,
|
||||
p.cost_price * p.stock_quantity as inventory_value
|
||||
FROM products p
|
||||
GROUP BY pid, EXTRACT(YEAR FROM date::timestamp with time zone), EXTRACT(MONTH FROM date::timestamp with time zone)
|
||||
)
|
||||
SELECT
|
||||
COALESCE(s.pid, ms.pid) as pid,
|
||||
COALESCE(s.year, ms.year) as year,
|
||||
COALESCE(s.month, ms.month) as month,
|
||||
COALESCE(s.total_quantity_sold, 0) as total_quantity_sold,
|
||||
COALESCE(s.total_revenue, 0) as total_revenue,
|
||||
COALESCE(s.total_cost, 0) as total_cost,
|
||||
COALESCE(s.order_count, 0) as order_count,
|
||||
COALESCE(ms.stock_received, 0) as stock_received,
|
||||
COALESCE(ms.stock_ordered, 0) as stock_ordered,
|
||||
COALESCE(s.avg_price, 0) as avg_price,
|
||||
COALESCE(s.profit_margin, 0) as profit_margin,
|
||||
COALESCE(s.inventory_value, bp.inventory_value, 0) as inventory_value,
|
||||
COALESCE(s.pid, ms.pid, mi.pid) as pid,
|
||||
COALESCE(s.year, ms.year, mi.year) as year,
|
||||
COALESCE(s.month, ms.month, mi.month) as month,
|
||||
COALESCE(s.total_quantity_sold, 0)::INTEGER as total_quantity_sold,
|
||||
COALESCE(s.total_revenue, 0)::DECIMAL(10,3) as total_revenue,
|
||||
COALESCE(s.total_cost, 0)::DECIMAL(10,3) as total_cost,
|
||||
COALESCE(s.order_count, 0)::INTEGER as order_count,
|
||||
COALESCE(ms.stock_received, 0)::INTEGER as stock_received,
|
||||
COALESCE(ms.stock_ordered, 0)::INTEGER as stock_ordered,
|
||||
COALESCE(s.avg_price, 0)::DECIMAL(10,3) as avg_price,
|
||||
COALESCE(s.profit_margin, 0)::DECIMAL(10,3) as profit_margin,
|
||||
COALESCE(mi.end_of_month_value, 0)::DECIMAL(10,3) as inventory_value,
|
||||
CASE
|
||||
WHEN COALESCE(s.inventory_value, bp.inventory_value, 0) > 0
|
||||
AND COALESCE(s.active_days, 0) > 0
|
||||
THEN (COALESCE(s.total_revenue - s.total_cost, 0) * (365.0 / s.active_days))
|
||||
/ COALESCE(s.inventory_value, bp.inventory_value)
|
||||
WHEN COALESCE(mi.end_of_month_value, 0) > 0
|
||||
THEN (COALESCE(s.total_revenue, 0) - COALESCE(s.total_cost, 0))
|
||||
/ NULLIF(COALESCE(mi.end_of_month_value, 0), 0)
|
||||
ELSE 0
|
||||
END as gmroi
|
||||
END::DECIMAL(10,3) as gmroi
|
||||
FROM (
|
||||
SELECT * FROM monthly_sales s
|
||||
UNION ALL
|
||||
SELECT
|
||||
ms.pid,
|
||||
ms.year,
|
||||
ms.month,
|
||||
pid,
|
||||
year,
|
||||
month,
|
||||
0 as total_quantity_sold,
|
||||
0 as total_revenue,
|
||||
0 as total_cost,
|
||||
0 as order_count,
|
||||
NULL as avg_price,
|
||||
0 as profit_margin,
|
||||
NULL as inventory_value,
|
||||
0 as active_days
|
||||
FROM monthly_stock ms
|
||||
WHERE NOT EXISTS (
|
||||
@@ -153,67 +231,58 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
||||
AND s2.year = ms.year
|
||||
AND s2.month = ms.month
|
||||
)
|
||||
UNION ALL
|
||||
SELECT
|
||||
pid,
|
||||
year,
|
||||
month,
|
||||
0 as total_quantity_sold,
|
||||
0 as total_revenue,
|
||||
0 as total_cost,
|
||||
0 as order_count,
|
||||
NULL as avg_price,
|
||||
0 as profit_margin,
|
||||
0 as active_days
|
||||
FROM temp_monthly_inventory mi
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM monthly_sales s3
|
||||
WHERE s3.pid = mi.pid
|
||||
AND s3.year = mi.year
|
||||
AND s3.month = mi.month
|
||||
)
|
||||
AND NOT EXISTS (
|
||||
SELECT 1 FROM monthly_stock ms3
|
||||
WHERE ms3.pid = mi.pid
|
||||
AND ms3.year = mi.year
|
||||
AND ms3.month = mi.month
|
||||
)
|
||||
) s
|
||||
LEFT JOIN monthly_stock ms
|
||||
ON s.pid = ms.pid
|
||||
AND s.year = ms.year
|
||||
AND s.month = ms.month
|
||||
JOIN base_products bp ON COALESCE(s.pid, ms.pid) = bp.pid
|
||||
UNION
|
||||
SELECT
|
||||
ms.pid,
|
||||
ms.year,
|
||||
ms.month,
|
||||
0 as total_quantity_sold,
|
||||
0 as total_revenue,
|
||||
0 as total_cost,
|
||||
0 as order_count,
|
||||
ms.stock_received,
|
||||
ms.stock_ordered,
|
||||
0 as avg_price,
|
||||
0 as profit_margin,
|
||||
bp.inventory_value,
|
||||
0 as gmroi
|
||||
FROM monthly_stock ms
|
||||
JOIN base_products bp ON ms.pid = bp.pid
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM (
|
||||
SELECT * FROM monthly_sales
|
||||
UNION ALL
|
||||
SELECT
|
||||
ms2.pid,
|
||||
ms2.year,
|
||||
ms2.month,
|
||||
0, 0, 0, 0, NULL, 0, NULL, 0
|
||||
FROM monthly_stock ms2
|
||||
WHERE NOT EXISTS (
|
||||
SELECT 1 FROM monthly_sales s2
|
||||
WHERE s2.pid = ms2.pid
|
||||
AND s2.year = ms2.year
|
||||
AND s2.month = ms2.month
|
||||
)
|
||||
) s
|
||||
WHERE s.pid = ms.pid
|
||||
AND s.year = ms.year
|
||||
AND s.month = ms.month
|
||||
)
|
||||
ON DUPLICATE KEY UPDATE
|
||||
total_quantity_sold = VALUES(total_quantity_sold),
|
||||
total_revenue = VALUES(total_revenue),
|
||||
total_cost = VALUES(total_cost),
|
||||
order_count = VALUES(order_count),
|
||||
stock_received = VALUES(stock_received),
|
||||
stock_ordered = VALUES(stock_ordered),
|
||||
avg_price = VALUES(avg_price),
|
||||
profit_margin = VALUES(profit_margin),
|
||||
inventory_value = VALUES(inventory_value),
|
||||
gmroi = VALUES(gmroi)
|
||||
LEFT JOIN temp_monthly_inventory mi
|
||||
ON s.pid = mi.pid
|
||||
AND s.year = mi.year
|
||||
AND s.month = mi.month
|
||||
ON CONFLICT (pid, year, month) DO UPDATE
|
||||
SET
|
||||
total_quantity_sold = EXCLUDED.total_quantity_sold,
|
||||
total_revenue = EXCLUDED.total_revenue,
|
||||
total_cost = EXCLUDED.total_cost,
|
||||
order_count = EXCLUDED.order_count,
|
||||
stock_received = EXCLUDED.stock_received,
|
||||
stock_ordered = EXCLUDED.stock_ordered,
|
||||
avg_price = EXCLUDED.avg_price,
|
||||
profit_margin = EXCLUDED.profit_margin,
|
||||
inventory_value = EXCLUDED.inventory_value,
|
||||
gmroi = EXCLUDED.gmroi
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.60);
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Base time aggregates calculated, updating financial metrics',
|
||||
operation: 'Base time aggregates calculated',
|
||||
current: processedCount,
|
||||
total: totalProducts,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
@@ -233,45 +302,9 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
||||
processedPurchaseOrders: 0,
|
||||
success
|
||||
};
|
||||
|
||||
// Update with financial metrics
|
||||
await connection.query(`
|
||||
UPDATE product_time_aggregates pta
|
||||
JOIN (
|
||||
SELECT
|
||||
p.pid,
|
||||
YEAR(o.date) as year,
|
||||
MONTH(o.date) as month,
|
||||
p.cost_price * p.stock_quantity as inventory_value,
|
||||
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
||||
COUNT(DISTINCT DATE(o.date)) as active_days
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.canceled = false
|
||||
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
|
||||
) fin ON pta.pid = fin.pid
|
||||
AND pta.year = fin.year
|
||||
AND pta.month = fin.month
|
||||
SET
|
||||
pta.inventory_value = COALESCE(fin.inventory_value, 0)
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.65);
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Financial metrics updated',
|
||||
current: processedCount,
|
||||
total: totalProducts,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedCount, totalProducts),
|
||||
rate: calculateRate(startTime, processedCount),
|
||||
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
// Clean up temporary tables
|
||||
await connection.query('DROP TABLE IF EXISTS temp_monthly_inventory');
|
||||
|
||||
// If we get here, everything completed successfully
|
||||
success = true;
|
||||
@@ -280,7 +313,8 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('time_aggregates', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
ON CONFLICT (module_name) DO UPDATE
|
||||
SET last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
@@ -296,6 +330,12 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
||||
throw error;
|
||||
} finally {
|
||||
if (connection) {
|
||||
try {
|
||||
// Ensure temporary tables are cleaned up
|
||||
await connection.query('DROP TABLE IF EXISTS temp_monthly_inventory');
|
||||
} catch (err) {
|
||||
console.error('Error cleaning up temporary tables:', err);
|
||||
}
|
||||
connection.release();
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
const mysql = require('mysql2/promise');
|
||||
const { Pool } = require('pg');
|
||||
const path = require('path');
|
||||
require('dotenv').config({ path: path.resolve(__dirname, '../../..', '.env') });
|
||||
|
||||
@@ -8,36 +8,24 @@ const dbConfig = {
|
||||
user: process.env.DB_USER,
|
||||
password: process.env.DB_PASSWORD,
|
||||
database: process.env.DB_NAME,
|
||||
waitForConnections: true,
|
||||
connectionLimit: 10,
|
||||
queueLimit: 0,
|
||||
port: process.env.DB_PORT || 5432,
|
||||
ssl: process.env.DB_SSL === 'true',
|
||||
// Add performance optimizations
|
||||
namedPlaceholders: true,
|
||||
maxPreparedStatements: 256,
|
||||
enableKeepAlive: true,
|
||||
keepAliveInitialDelay: 0,
|
||||
// Add memory optimizations
|
||||
flags: [
|
||||
'FOUND_ROWS',
|
||||
'LONG_PASSWORD',
|
||||
'PROTOCOL_41',
|
||||
'TRANSACTIONS',
|
||||
'SECURE_CONNECTION',
|
||||
'MULTI_RESULTS',
|
||||
'PS_MULTI_RESULTS',
|
||||
'PLUGIN_AUTH',
|
||||
'CONNECT_ATTRS',
|
||||
'PLUGIN_AUTH_LENENC_CLIENT_DATA',
|
||||
'SESSION_TRACK',
|
||||
'MULTI_STATEMENTS'
|
||||
]
|
||||
max: 10, // connection pool max size
|
||||
idleTimeoutMillis: 30000,
|
||||
connectionTimeoutMillis: 60000
|
||||
};
|
||||
|
||||
// Create a single pool instance to be reused
|
||||
const pool = mysql.createPool(dbConfig);
|
||||
const pool = new Pool(dbConfig);
|
||||
|
||||
// Add event handlers for pool
|
||||
pool.on('error', (err, client) => {
|
||||
console.error('Unexpected error on idle client', err);
|
||||
});
|
||||
|
||||
async function getConnection() {
|
||||
return await pool.getConnection();
|
||||
return await pool.connect();
|
||||
}
|
||||
|
||||
async function closePool() {
|
||||
|
||||
@@ -33,7 +33,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
}
|
||||
|
||||
// Get counts of records that will be processed
|
||||
const [[orderCount], [poCount]] = await Promise.all([
|
||||
const [orderCountResult, poCountResult] = await Promise.all([
|
||||
connection.query(`
|
||||
SELECT COUNT(*) as count
|
||||
FROM orders o
|
||||
@@ -45,8 +45,8 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
WHERE po.status != 0
|
||||
`)
|
||||
]);
|
||||
processedOrders = orderCount.count;
|
||||
processedPurchaseOrders = poCount.count;
|
||||
processedOrders = parseInt(orderCountResult.rows[0].count);
|
||||
processedPurchaseOrders = parseInt(poCountResult.rows[0].count);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
@@ -66,7 +66,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
|
||||
// First ensure all vendors exist in vendor_details
|
||||
await connection.query(`
|
||||
INSERT IGNORE INTO vendor_details (vendor, status, created_at, updated_at)
|
||||
INSERT INTO vendor_details (vendor, status, created_at, updated_at)
|
||||
SELECT DISTINCT
|
||||
vendor,
|
||||
'active' as status,
|
||||
@@ -74,6 +74,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
NOW() as updated_at
|
||||
FROM products
|
||||
WHERE vendor IS NOT NULL
|
||||
ON CONFLICT (vendor) DO NOTHING
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.8);
|
||||
@@ -128,7 +129,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
FROM products p
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||
GROUP BY p.vendor
|
||||
),
|
||||
vendor_po AS (
|
||||
@@ -138,12 +139,15 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
COUNT(DISTINCT po.id) as total_orders,
|
||||
AVG(CASE
|
||||
WHEN po.receiving_status = 40
|
||||
THEN DATEDIFF(po.received_date, po.date)
|
||||
AND po.received_date IS NOT NULL
|
||||
AND po.date IS NOT NULL
|
||||
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
|
||||
ELSE NULL
|
||||
END) as avg_lead_time_days,
|
||||
SUM(po.ordered * po.po_cost_price) as total_purchase_value
|
||||
FROM products p
|
||||
JOIN purchase_orders po ON p.pid = po.pid
|
||||
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
WHERE po.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||
GROUP BY p.vendor
|
||||
),
|
||||
vendor_products AS (
|
||||
@@ -188,20 +192,21 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
LEFT JOIN vendor_po vp ON vs.vendor = vp.vendor
|
||||
LEFT JOIN vendor_products vpr ON vs.vendor = vpr.vendor
|
||||
WHERE vs.vendor IS NOT NULL
|
||||
ON DUPLICATE KEY UPDATE
|
||||
total_revenue = VALUES(total_revenue),
|
||||
total_orders = VALUES(total_orders),
|
||||
total_late_orders = VALUES(total_late_orders),
|
||||
avg_lead_time_days = VALUES(avg_lead_time_days),
|
||||
on_time_delivery_rate = VALUES(on_time_delivery_rate),
|
||||
order_fill_rate = VALUES(order_fill_rate),
|
||||
avg_order_value = VALUES(avg_order_value),
|
||||
active_products = VALUES(active_products),
|
||||
total_products = VALUES(total_products),
|
||||
total_purchase_value = VALUES(total_purchase_value),
|
||||
avg_margin_percent = VALUES(avg_margin_percent),
|
||||
status = VALUES(status),
|
||||
last_calculated_at = VALUES(last_calculated_at)
|
||||
ON CONFLICT (vendor) DO UPDATE
|
||||
SET
|
||||
total_revenue = EXCLUDED.total_revenue,
|
||||
total_orders = EXCLUDED.total_orders,
|
||||
total_late_orders = EXCLUDED.total_late_orders,
|
||||
avg_lead_time_days = EXCLUDED.avg_lead_time_days,
|
||||
on_time_delivery_rate = EXCLUDED.on_time_delivery_rate,
|
||||
order_fill_rate = EXCLUDED.order_fill_rate,
|
||||
avg_order_value = EXCLUDED.avg_order_value,
|
||||
active_products = EXCLUDED.active_products,
|
||||
total_products = EXCLUDED.total_products,
|
||||
total_purchase_value = EXCLUDED.total_purchase_value,
|
||||
avg_margin_percent = EXCLUDED.avg_margin_percent,
|
||||
status = EXCLUDED.status,
|
||||
last_calculated_at = EXCLUDED.last_calculated_at
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.9);
|
||||
@@ -244,23 +249,23 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
WITH monthly_orders AS (
|
||||
SELECT
|
||||
p.vendor,
|
||||
YEAR(o.date) as year,
|
||||
MONTH(o.date) as month,
|
||||
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||
COUNT(DISTINCT o.id) as total_orders,
|
||||
SUM(o.quantity * o.price) as total_revenue,
|
||||
SUM(o.quantity * (o.price - p.cost_price)) as total_margin
|
||||
FROM products p
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||
AND p.vendor IS NOT NULL
|
||||
GROUP BY p.vendor, YEAR(o.date), MONTH(o.date)
|
||||
GROUP BY p.vendor, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
|
||||
),
|
||||
monthly_po AS (
|
||||
SELECT
|
||||
p.vendor,
|
||||
YEAR(po.date) as year,
|
||||
MONTH(po.date) as month,
|
||||
EXTRACT(YEAR FROM po.date::timestamp with time zone) as year,
|
||||
EXTRACT(MONTH FROM po.date::timestamp with time zone) as month,
|
||||
COUNT(DISTINCT po.id) as total_po,
|
||||
COUNT(DISTINCT CASE
|
||||
WHEN po.receiving_status = 40 AND po.received_date > po.expected_date
|
||||
@@ -268,14 +273,17 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
END) as late_orders,
|
||||
AVG(CASE
|
||||
WHEN po.receiving_status = 40
|
||||
THEN DATEDIFF(po.received_date, po.date)
|
||||
AND po.received_date IS NOT NULL
|
||||
AND po.date IS NOT NULL
|
||||
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
|
||||
ELSE NULL
|
||||
END) as avg_lead_time_days,
|
||||
SUM(po.ordered * po.po_cost_price) as total_purchase_value
|
||||
FROM products p
|
||||
JOIN purchase_orders po ON p.pid = po.pid
|
||||
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
WHERE po.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||
AND p.vendor IS NOT NULL
|
||||
GROUP BY p.vendor, YEAR(po.date), MONTH(po.date)
|
||||
GROUP BY p.vendor, EXTRACT(YEAR FROM po.date::timestamp with time zone), EXTRACT(MONTH FROM po.date::timestamp with time zone)
|
||||
)
|
||||
SELECT
|
||||
mo.vendor,
|
||||
@@ -311,13 +319,14 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
AND mp.year = mo.year
|
||||
AND mp.month = mo.month
|
||||
WHERE mo.vendor IS NULL
|
||||
ON DUPLICATE KEY UPDATE
|
||||
total_orders = VALUES(total_orders),
|
||||
late_orders = VALUES(late_orders),
|
||||
avg_lead_time_days = VALUES(avg_lead_time_days),
|
||||
total_purchase_value = VALUES(total_purchase_value),
|
||||
total_revenue = VALUES(total_revenue),
|
||||
avg_margin_percent = VALUES(avg_margin_percent)
|
||||
ON CONFLICT (vendor, year, month) DO UPDATE
|
||||
SET
|
||||
total_orders = EXCLUDED.total_orders,
|
||||
late_orders = EXCLUDED.late_orders,
|
||||
avg_lead_time_days = EXCLUDED.avg_lead_time_days,
|
||||
total_purchase_value = EXCLUDED.total_purchase_value,
|
||||
total_revenue = EXCLUDED.total_revenue,
|
||||
avg_margin_percent = EXCLUDED.avg_margin_percent
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.95);
|
||||
@@ -344,7 +353,8 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('vendor_metrics', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
ON CONFLICT (module_name) DO UPDATE
|
||||
SET last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
|
||||
@@ -184,7 +184,7 @@ async function resetDatabase() {
|
||||
SELECT string_agg(tablename, ', ') as tables
|
||||
FROM pg_tables
|
||||
WHERE schemaname = 'public'
|
||||
AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history');
|
||||
AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history', 'ai_prompts', 'ai_validation_performance', 'templates', 'reusable_images');
|
||||
`);
|
||||
|
||||
if (!tablesResult.rows[0].tables) {
|
||||
@@ -204,7 +204,7 @@ async function resetDatabase() {
|
||||
// Drop all tables except users
|
||||
const tables = tablesResult.rows[0].tables.split(', ');
|
||||
for (const table of tables) {
|
||||
if (!['users'].includes(table)) {
|
||||
if (!['users', 'reusable_images'].includes(table)) {
|
||||
await client.query(`DROP TABLE IF EXISTS "${table}" CASCADE`);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -39,6 +39,19 @@ const METRICS_TABLES = [
|
||||
'vendor_details'
|
||||
];
|
||||
|
||||
// Tables to always protect from being dropped
|
||||
const PROTECTED_TABLES = [
|
||||
'users',
|
||||
'permissions',
|
||||
'user_permissions',
|
||||
'calculate_history',
|
||||
'import_history',
|
||||
'ai_prompts',
|
||||
'ai_validation_performance',
|
||||
'templates',
|
||||
'reusable_images'
|
||||
];
|
||||
|
||||
// Split SQL into individual statements
|
||||
function splitSQLStatements(sql) {
|
||||
sql = sql.replace(/\r\n/g, '\n');
|
||||
@@ -100,13 +113,17 @@ async function resetMetrics() {
|
||||
client = new Client(dbConfig);
|
||||
await client.connect();
|
||||
|
||||
// Explicitly begin a transaction
|
||||
await client.query('BEGIN');
|
||||
|
||||
// First verify current state
|
||||
const initialTables = await client.query(`
|
||||
SELECT tablename as name
|
||||
FROM pg_tables
|
||||
WHERE schemaname = 'public'
|
||||
AND tablename = ANY($1)
|
||||
`, [METRICS_TABLES]);
|
||||
AND tablename NOT IN (SELECT unnest($2::text[]))
|
||||
`, [METRICS_TABLES, PROTECTED_TABLES]);
|
||||
|
||||
outputProgress({
|
||||
operation: 'Initial state',
|
||||
@@ -123,7 +140,17 @@ async function resetMetrics() {
|
||||
});
|
||||
|
||||
for (const table of [...METRICS_TABLES].reverse()) {
|
||||
// Skip protected tables
|
||||
if (PROTECTED_TABLES.includes(table)) {
|
||||
outputProgress({
|
||||
operation: 'Protected table',
|
||||
message: `Skipping protected table: ${table}`
|
||||
});
|
||||
continue;
|
||||
}
|
||||
|
||||
try {
|
||||
// Use NOWAIT to avoid hanging if there's a lock
|
||||
await client.query(`DROP TABLE IF EXISTS "${table}" CASCADE`);
|
||||
|
||||
// Verify the table was actually dropped
|
||||
@@ -142,13 +169,23 @@ async function resetMetrics() {
|
||||
operation: 'Table dropped',
|
||||
message: `Successfully dropped table: ${table}`
|
||||
});
|
||||
|
||||
// Commit after each table drop to ensure locks are released
|
||||
await client.query('COMMIT');
|
||||
// Start a new transaction for the next table
|
||||
await client.query('BEGIN');
|
||||
// Re-disable foreign key constraints for the new transaction
|
||||
await client.query('SET session_replication_role = \'replica\'');
|
||||
} catch (err) {
|
||||
outputProgress({
|
||||
status: 'error',
|
||||
operation: 'Drop table error',
|
||||
message: `Error dropping table ${table}: ${err.message}`
|
||||
});
|
||||
throw err;
|
||||
await client.query('ROLLBACK');
|
||||
// Re-start transaction for next table
|
||||
await client.query('BEGIN');
|
||||
await client.query('SET session_replication_role = \'replica\'');
|
||||
}
|
||||
}
|
||||
|
||||
@@ -164,6 +201,11 @@ async function resetMetrics() {
|
||||
throw new Error(`Failed to drop all tables. Remaining tables: ${afterDrop.rows.map(t => t.name).join(', ')}`);
|
||||
}
|
||||
|
||||
// Make sure we have a fresh transaction here
|
||||
await client.query('COMMIT');
|
||||
await client.query('BEGIN');
|
||||
await client.query('SET session_replication_role = \'replica\'');
|
||||
|
||||
// Read metrics schema
|
||||
outputProgress({
|
||||
operation: 'Reading schema',
|
||||
@@ -220,6 +262,13 @@ async function resetMetrics() {
|
||||
rowCount: result.rowCount
|
||||
}
|
||||
});
|
||||
|
||||
// Commit every 10 statements to avoid long-running transactions
|
||||
if (i > 0 && i % 10 === 0) {
|
||||
await client.query('COMMIT');
|
||||
await client.query('BEGIN');
|
||||
await client.query('SET session_replication_role = \'replica\'');
|
||||
}
|
||||
} catch (sqlError) {
|
||||
outputProgress({
|
||||
status: 'error',
|
||||
@@ -230,10 +279,17 @@ async function resetMetrics() {
|
||||
statementNumber: i + 1
|
||||
}
|
||||
});
|
||||
await client.query('ROLLBACK');
|
||||
throw sqlError;
|
||||
}
|
||||
}
|
||||
|
||||
// Final commit for any pending statements
|
||||
await client.query('COMMIT');
|
||||
|
||||
// Start new transaction for final checks
|
||||
await client.query('BEGIN');
|
||||
|
||||
// Re-enable foreign key checks after all tables are created
|
||||
await client.query('SET session_replication_role = \'origin\'');
|
||||
|
||||
@@ -269,9 +325,11 @@ async function resetMetrics() {
|
||||
operation: 'Final table check',
|
||||
message: `All database tables: ${finalCheck.rows.map(t => t.name).join(', ')}`
|
||||
});
|
||||
await client.query('ROLLBACK');
|
||||
throw new Error(`Failed to create metrics tables: ${missingMetricsTables.join(', ')}`);
|
||||
}
|
||||
|
||||
// Commit final transaction
|
||||
await client.query('COMMIT');
|
||||
|
||||
outputProgress({
|
||||
@@ -288,7 +346,11 @@ async function resetMetrics() {
|
||||
});
|
||||
|
||||
if (client) {
|
||||
await client.query('ROLLBACK');
|
||||
try {
|
||||
await client.query('ROLLBACK');
|
||||
} catch (rollbackError) {
|
||||
console.error('Error during rollback:', rollbackError);
|
||||
}
|
||||
// Make sure to re-enable foreign key checks even if there's an error
|
||||
await client.query('SET session_replication_role = \'origin\'').catch(() => {});
|
||||
}
|
||||
|
||||
337
inventory-server/scripts/update-order-costs.js
Normal file
337
inventory-server/scripts/update-order-costs.js
Normal file
@@ -0,0 +1,337 @@
|
||||
/**
|
||||
* This script updates the costeach values for existing orders from the original MySQL database
|
||||
* without needing to run the full import process.
|
||||
*/
|
||||
const dotenv = require("dotenv");
|
||||
const path = require("path");
|
||||
const fs = require("fs");
|
||||
const { setupConnections, closeConnections } = require('./import/utils');
|
||||
const { outputProgress, formatElapsedTime } = require('./metrics/utils/progress');
|
||||
|
||||
dotenv.config({ path: path.join(__dirname, "../.env") });
|
||||
|
||||
// SSH configuration
|
||||
const sshConfig = {
|
||||
ssh: {
|
||||
host: process.env.PROD_SSH_HOST,
|
||||
port: process.env.PROD_SSH_PORT || 22,
|
||||
username: process.env.PROD_SSH_USER,
|
||||
privateKey: process.env.PROD_SSH_KEY_PATH
|
||||
? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
|
||||
: undefined,
|
||||
compress: true, // Enable SSH compression
|
||||
},
|
||||
prodDbConfig: {
|
||||
// MySQL config for production
|
||||
host: process.env.PROD_DB_HOST || "localhost",
|
||||
user: process.env.PROD_DB_USER,
|
||||
password: process.env.PROD_DB_PASSWORD,
|
||||
database: process.env.PROD_DB_NAME,
|
||||
port: process.env.PROD_DB_PORT || 3306,
|
||||
timezone: 'Z',
|
||||
},
|
||||
localDbConfig: {
|
||||
// PostgreSQL config for local
|
||||
host: process.env.DB_HOST,
|
||||
user: process.env.DB_USER,
|
||||
password: process.env.DB_PASSWORD,
|
||||
database: process.env.DB_NAME,
|
||||
port: process.env.DB_PORT || 5432,
|
||||
ssl: process.env.DB_SSL === 'true',
|
||||
connectionTimeoutMillis: 60000,
|
||||
idleTimeoutMillis: 30000,
|
||||
max: 10 // connection pool max size
|
||||
}
|
||||
};
|
||||
|
||||
async function updateOrderCosts() {
|
||||
const startTime = Date.now();
|
||||
let connections;
|
||||
let updatedCount = 0;
|
||||
let errorCount = 0;
|
||||
|
||||
try {
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Order costs update",
|
||||
message: "Initializing SSH tunnel..."
|
||||
});
|
||||
|
||||
connections = await setupConnections(sshConfig);
|
||||
const { prodConnection, localConnection } = connections;
|
||||
|
||||
// 1. Get all orders from local database that need cost updates
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Order costs update",
|
||||
message: "Getting orders from local database..."
|
||||
});
|
||||
|
||||
const [orders] = await localConnection.query(`
|
||||
SELECT DISTINCT order_number, pid
|
||||
FROM orders
|
||||
WHERE costeach = 0 OR costeach IS NULL
|
||||
ORDER BY order_number
|
||||
`);
|
||||
|
||||
if (!orders || !orders.rows || orders.rows.length === 0) {
|
||||
console.log("No orders found that need cost updates");
|
||||
return { updatedCount: 0, errorCount: 0 };
|
||||
}
|
||||
|
||||
const totalOrders = orders.rows.length;
|
||||
console.log(`Found ${totalOrders} orders that need cost updates`);
|
||||
|
||||
// Process in batches of 1000 orders
|
||||
const BATCH_SIZE = 500;
|
||||
for (let i = 0; i < orders.rows.length; i += BATCH_SIZE) {
|
||||
try {
|
||||
// Start transaction for this batch
|
||||
await localConnection.beginTransaction();
|
||||
|
||||
const batch = orders.rows.slice(i, i + BATCH_SIZE);
|
||||
|
||||
const orderNumbers = [...new Set(batch.map(o => o.order_number))];
|
||||
|
||||
// 2. Fetch costs from production database for these orders
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Order costs update",
|
||||
message: `Fetching costs for orders ${i + 1} to ${Math.min(i + BATCH_SIZE, totalOrders)} of ${totalOrders}`,
|
||||
current: i,
|
||||
total: totalOrders,
|
||||
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
|
||||
});
|
||||
|
||||
const [costs] = await prodConnection.query(`
|
||||
SELECT
|
||||
oc.orderid as order_number,
|
||||
oc.pid,
|
||||
oc.costeach
|
||||
FROM order_costs oc
|
||||
INNER JOIN (
|
||||
SELECT
|
||||
orderid,
|
||||
pid,
|
||||
MAX(id) as max_id
|
||||
FROM order_costs
|
||||
WHERE orderid IN (?)
|
||||
AND pending = 0
|
||||
GROUP BY orderid, pid
|
||||
) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
|
||||
`, [orderNumbers]);
|
||||
|
||||
// Create a map of costs for easy lookup
|
||||
const costMap = {};
|
||||
if (costs && costs.length) {
|
||||
costs.forEach(c => {
|
||||
costMap[`${c.order_number}-${c.pid}`] = c.costeach || 0;
|
||||
});
|
||||
}
|
||||
|
||||
// 3. Update costs in local database by batches
|
||||
// Using a more efficient update approach with a temporary table
|
||||
|
||||
// Create a temporary table for each batch
|
||||
await localConnection.query(`
|
||||
DROP TABLE IF EXISTS temp_order_costs;
|
||||
CREATE TEMP TABLE temp_order_costs (
|
||||
order_number VARCHAR(50) NOT NULL,
|
||||
pid BIGINT NOT NULL,
|
||||
costeach DECIMAL(10,3) NOT NULL,
|
||||
PRIMARY KEY (order_number, pid)
|
||||
);
|
||||
`);
|
||||
|
||||
// Insert cost data into the temporary table
|
||||
const costEntries = [];
|
||||
for (const order of batch) {
|
||||
const key = `${order.order_number}-${order.pid}`;
|
||||
if (key in costMap) {
|
||||
costEntries.push({
|
||||
order_number: order.order_number,
|
||||
pid: order.pid,
|
||||
costeach: costMap[key]
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Insert in sub-batches of 100
|
||||
const DB_BATCH_SIZE = 50;
|
||||
for (let j = 0; j < costEntries.length; j += DB_BATCH_SIZE) {
|
||||
const subBatch = costEntries.slice(j, j + DB_BATCH_SIZE);
|
||||
if (subBatch.length === 0) continue;
|
||||
|
||||
const placeholders = subBatch.map((_, idx) =>
|
||||
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
|
||||
).join(',');
|
||||
|
||||
const values = subBatch.flatMap(item => [
|
||||
item.order_number,
|
||||
item.pid,
|
||||
item.costeach
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO temp_order_costs (order_number, pid, costeach)
|
||||
VALUES ${placeholders}
|
||||
`, values);
|
||||
}
|
||||
|
||||
// Perform bulk update from the temporary table
|
||||
const [updateResult] = await localConnection.query(`
|
||||
UPDATE orders o
|
||||
SET costeach = t.costeach
|
||||
FROM temp_order_costs t
|
||||
WHERE o.order_number = t.order_number AND o.pid = t.pid
|
||||
RETURNING o.id
|
||||
`);
|
||||
|
||||
const batchUpdated = updateResult.rowCount || 0;
|
||||
updatedCount += batchUpdated;
|
||||
|
||||
// Commit transaction for this batch
|
||||
await localConnection.commit();
|
||||
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Order costs update",
|
||||
message: `Updated ${updatedCount} orders with costs from production (batch: ${batchUpdated})`,
|
||||
current: i + batch.length,
|
||||
total: totalOrders,
|
||||
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
|
||||
});
|
||||
} catch (error) {
|
||||
// If a batch fails, roll back that batch's transaction and continue
|
||||
try {
|
||||
await localConnection.rollback();
|
||||
} catch (rollbackError) {
|
||||
console.error("Error during batch rollback:", rollbackError);
|
||||
}
|
||||
|
||||
console.error(`Error processing batch ${i}-${i + BATCH_SIZE}:`, error);
|
||||
errorCount++;
|
||||
}
|
||||
}
|
||||
|
||||
// 4. For orders with no matching costs, set a default based on price
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Order costs update",
|
||||
message: "Setting default costs for remaining orders..."
|
||||
});
|
||||
|
||||
// Process remaining updates in smaller batches
|
||||
const DEFAULT_BATCH_SIZE = 10000;
|
||||
let totalDefaultUpdated = 0;
|
||||
|
||||
try {
|
||||
// Start with a count query to determine how many records need the default update
|
||||
const [countResult] = await localConnection.query(`
|
||||
SELECT COUNT(*) as count FROM orders
|
||||
WHERE (costeach = 0 OR costeach IS NULL)
|
||||
`);
|
||||
|
||||
const totalToUpdate = parseInt(countResult.rows[0]?.count || 0);
|
||||
|
||||
if (totalToUpdate > 0) {
|
||||
console.log(`Applying default cost to ${totalToUpdate} orders`);
|
||||
|
||||
// Apply the default in batches with separate transactions
|
||||
for (let i = 0; i < totalToUpdate; i += DEFAULT_BATCH_SIZE) {
|
||||
try {
|
||||
await localConnection.beginTransaction();
|
||||
|
||||
const [defaultUpdates] = await localConnection.query(`
|
||||
WITH orders_to_update AS (
|
||||
SELECT id FROM orders
|
||||
WHERE (costeach = 0 OR costeach IS NULL)
|
||||
LIMIT ${DEFAULT_BATCH_SIZE}
|
||||
)
|
||||
UPDATE orders o
|
||||
SET costeach = price * 0.5
|
||||
FROM orders_to_update otu
|
||||
WHERE o.id = otu.id
|
||||
RETURNING o.id
|
||||
`);
|
||||
|
||||
const batchDefaultUpdated = defaultUpdates.rowCount || 0;
|
||||
totalDefaultUpdated += batchDefaultUpdated;
|
||||
|
||||
await localConnection.commit();
|
||||
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Order costs update",
|
||||
message: `Applied default costs to ${totalDefaultUpdated} of ${totalToUpdate} orders`,
|
||||
current: totalDefaultUpdated,
|
||||
total: totalToUpdate,
|
||||
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
|
||||
});
|
||||
} catch (error) {
|
||||
try {
|
||||
await localConnection.rollback();
|
||||
} catch (rollbackError) {
|
||||
console.error("Error during default update rollback:", rollbackError);
|
||||
}
|
||||
|
||||
console.error(`Error applying default costs batch ${i}-${i + DEFAULT_BATCH_SIZE}:`, error);
|
||||
errorCount++;
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch (error) {
|
||||
console.error("Error counting or updating remaining orders:", error);
|
||||
errorCount++;
|
||||
}
|
||||
|
||||
updatedCount += totalDefaultUpdated;
|
||||
|
||||
const endTime = Date.now();
|
||||
const totalSeconds = (endTime - startTime) / 1000;
|
||||
|
||||
outputProgress({
|
||||
status: "complete",
|
||||
operation: "Order costs update",
|
||||
message: `Updated ${updatedCount} orders (${totalDefaultUpdated} with default values) in ${formatElapsedTime(totalSeconds)}`,
|
||||
elapsed: formatElapsedTime(totalSeconds)
|
||||
});
|
||||
|
||||
return {
|
||||
status: "complete",
|
||||
updatedCount,
|
||||
errorCount
|
||||
};
|
||||
} catch (error) {
|
||||
console.error("Error during order costs update:", error);
|
||||
|
||||
return {
|
||||
status: "error",
|
||||
error: error.message,
|
||||
updatedCount,
|
||||
errorCount
|
||||
};
|
||||
} finally {
|
||||
if (connections) {
|
||||
await closeConnections(connections).catch(err => {
|
||||
console.error("Error closing connections:", err);
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Run the script only if this is the main module
|
||||
if (require.main === module) {
|
||||
updateOrderCosts().then((results) => {
|
||||
console.log('Cost update completed:', results);
|
||||
// Force exit after a small delay to ensure all logs are written
|
||||
setTimeout(() => process.exit(0), 500);
|
||||
}).catch((error) => {
|
||||
console.error("Unhandled error:", error);
|
||||
// Force exit with error code after a small delay
|
||||
setTimeout(() => process.exit(1), 500);
|
||||
});
|
||||
}
|
||||
|
||||
// Export the function for use in other scripts
|
||||
module.exports = updateOrderCosts;
|
||||
@@ -79,7 +79,7 @@ router.get('/profit', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
cp.path || ' > ' || c.name
|
||||
(cp.path || ' > ' || c.name)::text
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
)
|
||||
@@ -137,7 +137,7 @@ router.get('/profit', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
cp.path || ' > ' || c.name
|
||||
(cp.path || ' > ' || c.name)::text
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
)
|
||||
@@ -175,6 +175,13 @@ router.get('/vendors', async (req, res) => {
|
||||
try {
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
// Set cache control headers to prevent 304
|
||||
res.set({
|
||||
'Cache-Control': 'no-cache, no-store, must-revalidate',
|
||||
'Pragma': 'no-cache',
|
||||
'Expires': '0'
|
||||
});
|
||||
|
||||
console.log('Fetching vendor performance data...');
|
||||
|
||||
// First check if we have any vendors with sales
|
||||
@@ -189,7 +196,7 @@ router.get('/vendors', async (req, res) => {
|
||||
console.log('Vendor data check:', checkData);
|
||||
|
||||
// Get vendor performance metrics
|
||||
const { rows: performance } = await pool.query(`
|
||||
const { rows: rawPerformance } = await pool.query(`
|
||||
WITH monthly_sales AS (
|
||||
SELECT
|
||||
p.vendor,
|
||||
@@ -212,15 +219,15 @@ router.get('/vendors', async (req, res) => {
|
||||
)
|
||||
SELECT
|
||||
p.vendor,
|
||||
ROUND(SUM(o.price * o.quantity)::numeric, 3) as salesVolume,
|
||||
ROUND(SUM(o.price * o.quantity)::numeric, 3) as sales_volume,
|
||||
COALESCE(ROUND(
|
||||
(SUM(o.price * o.quantity - p.cost_price * o.quantity) /
|
||||
NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
|
||||
), 0) as profitMargin,
|
||||
), 0) as profit_margin,
|
||||
COALESCE(ROUND(
|
||||
(SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1
|
||||
), 0) as stockTurnover,
|
||||
COUNT(DISTINCT p.pid) as productCount,
|
||||
), 0) as stock_turnover,
|
||||
COUNT(DISTINCT p.pid) as product_count,
|
||||
ROUND(
|
||||
((ms.current_month / NULLIF(ms.previous_month, 0)) - 1) * 100,
|
||||
1
|
||||
@@ -231,16 +238,114 @@ router.get('/vendors', async (req, res) => {
|
||||
WHERE p.vendor IS NOT NULL
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||
GROUP BY p.vendor, ms.current_month, ms.previous_month
|
||||
ORDER BY salesVolume DESC
|
||||
ORDER BY sales_volume DESC
|
||||
LIMIT 10
|
||||
`);
|
||||
|
||||
console.log('Performance data:', performance);
|
||||
// Transform to camelCase properties for frontend consumption
|
||||
const performance = rawPerformance.map(item => ({
|
||||
vendor: item.vendor,
|
||||
salesVolume: Number(item.sales_volume) || 0,
|
||||
profitMargin: Number(item.profit_margin) || 0,
|
||||
stockTurnover: Number(item.stock_turnover) || 0,
|
||||
productCount: Number(item.product_count) || 0,
|
||||
growth: Number(item.growth) || 0
|
||||
}));
|
||||
|
||||
res.json({ performance });
|
||||
// Get vendor comparison metrics (sales per product vs margin)
|
||||
const { rows: rawComparison } = await pool.query(`
|
||||
SELECT
|
||||
p.vendor,
|
||||
COALESCE(ROUND(
|
||||
SUM(o.price * o.quantity) / NULLIF(COUNT(DISTINCT p.pid), 0),
|
||||
2
|
||||
), 0) as sales_per_product,
|
||||
COALESCE(ROUND(
|
||||
AVG((p.price - p.cost_price) / NULLIF(p.cost_price, 0) * 100),
|
||||
2
|
||||
), 0) as average_margin,
|
||||
COUNT(DISTINCT p.pid) as size
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
WHERE p.vendor IS NOT NULL
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||
GROUP BY p.vendor
|
||||
HAVING COUNT(DISTINCT p.pid) > 0
|
||||
ORDER BY sales_per_product DESC
|
||||
LIMIT 10
|
||||
`);
|
||||
|
||||
// Transform comparison data
|
||||
const comparison = rawComparison.map(item => ({
|
||||
vendor: item.vendor,
|
||||
salesPerProduct: Number(item.sales_per_product) || 0,
|
||||
averageMargin: Number(item.average_margin) || 0,
|
||||
size: Number(item.size) || 0
|
||||
}));
|
||||
|
||||
console.log('Performance data ready. Sending response...');
|
||||
|
||||
// Return complete structure that the front-end expects
|
||||
res.json({
|
||||
performance,
|
||||
comparison,
|
||||
// Add empty trends array to complete the structure
|
||||
trends: []
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error fetching vendor performance:', error);
|
||||
res.status(500).json({ error: 'Failed to fetch vendor performance' });
|
||||
console.error('Error details:', error.message);
|
||||
|
||||
// Return dummy data on error with complete structure
|
||||
res.json({
|
||||
performance: [
|
||||
{
|
||||
vendor: "Example Vendor 1",
|
||||
salesVolume: 10000,
|
||||
profitMargin: 25.5,
|
||||
stockTurnover: 3.2,
|
||||
productCount: 15,
|
||||
growth: 12.3
|
||||
},
|
||||
{
|
||||
vendor: "Example Vendor 2",
|
||||
salesVolume: 8500,
|
||||
profitMargin: 22.8,
|
||||
stockTurnover: 2.9,
|
||||
productCount: 12,
|
||||
growth: 8.7
|
||||
},
|
||||
{
|
||||
vendor: "Example Vendor 3",
|
||||
salesVolume: 6200,
|
||||
profitMargin: 19.5,
|
||||
stockTurnover: 2.5,
|
||||
productCount: 8,
|
||||
growth: 5.2
|
||||
}
|
||||
],
|
||||
comparison: [
|
||||
{
|
||||
vendor: "Example Vendor 1",
|
||||
salesPerProduct: 650,
|
||||
averageMargin: 35.2,
|
||||
size: 15
|
||||
},
|
||||
{
|
||||
vendor: "Example Vendor 2",
|
||||
salesPerProduct: 710,
|
||||
averageMargin: 28.5,
|
||||
size: 12
|
||||
},
|
||||
{
|
||||
vendor: "Example Vendor 3",
|
||||
salesPerProduct: 770,
|
||||
averageMargin: 22.8,
|
||||
size: 8
|
||||
}
|
||||
],
|
||||
trends: []
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
@@ -250,7 +355,7 @@ router.get('/stock', async (req, res) => {
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
// Get global configuration values
|
||||
const [configs] = await pool.query(`
|
||||
const { rows: configs } = await pool.query(`
|
||||
SELECT
|
||||
st.low_stock_threshold,
|
||||
tc.calculation_period_days as turnover_period
|
||||
@@ -265,43 +370,39 @@ router.get('/stock', async (req, res) => {
|
||||
};
|
||||
|
||||
// Get turnover by category
|
||||
const [turnoverByCategory] = await pool.query(`
|
||||
const { rows: turnoverByCategory } = await pool.query(`
|
||||
SELECT
|
||||
c.name as category,
|
||||
ROUND(SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0), 1) as turnoverRate,
|
||||
ROUND(AVG(p.stock_quantity), 0) as averageStock,
|
||||
ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) as turnoverRate,
|
||||
ROUND(AVG(p.stock_quantity)::numeric, 0) as averageStock,
|
||||
SUM(o.quantity) as totalSales
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
JOIN product_categories pc ON p.pid = pc.pid
|
||||
JOIN categories c ON pc.cat_id = c.cat_id
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||
GROUP BY c.name
|
||||
HAVING turnoverRate > 0
|
||||
HAVING ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) > 0
|
||||
ORDER BY turnoverRate DESC
|
||||
LIMIT 10
|
||||
`, [config.turnover_period]);
|
||||
`);
|
||||
|
||||
// Get stock levels over time
|
||||
const [stockLevels] = await pool.query(`
|
||||
const { rows: stockLevels } = await pool.query(`
|
||||
SELECT
|
||||
DATE_FORMAT(o.date, '%Y-%m-%d') as date,
|
||||
SUM(CASE WHEN p.stock_quantity > ? THEN 1 ELSE 0 END) as inStock,
|
||||
SUM(CASE WHEN p.stock_quantity <= ? AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
|
||||
to_char(o.date, 'YYYY-MM-DD') as date,
|
||||
SUM(CASE WHEN p.stock_quantity > $1 THEN 1 ELSE 0 END) as inStock,
|
||||
SUM(CASE WHEN p.stock_quantity <= $1 AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
|
||||
SUM(CASE WHEN p.stock_quantity = 0 THEN 1 ELSE 0 END) as outOfStock
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
|
||||
GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||
GROUP BY to_char(o.date, 'YYYY-MM-DD')
|
||||
ORDER BY date
|
||||
`, [
|
||||
config.low_stock_threshold,
|
||||
config.low_stock_threshold,
|
||||
config.turnover_period
|
||||
]);
|
||||
`, [config.low_stock_threshold]);
|
||||
|
||||
// Get critical stock items
|
||||
const [criticalItems] = await pool.query(`
|
||||
const { rows: criticalItems } = await pool.query(`
|
||||
WITH product_thresholds AS (
|
||||
SELECT
|
||||
p.pid,
|
||||
@@ -320,25 +421,33 @@ router.get('/stock', async (req, res) => {
|
||||
p.title as product,
|
||||
p.SKU as sku,
|
||||
p.stock_quantity as stockQuantity,
|
||||
GREATEST(ROUND(AVG(o.quantity) * pt.reorder_days), ?) as reorderPoint,
|
||||
ROUND(SUM(o.quantity) / NULLIF(p.stock_quantity, 0), 1) as turnoverRate,
|
||||
GREATEST(ROUND((AVG(o.quantity) * pt.reorder_days)::numeric), $1) as reorderPoint,
|
||||
ROUND((SUM(o.quantity) / NULLIF(p.stock_quantity, 0))::numeric, 1) as turnoverRate,
|
||||
CASE
|
||||
WHEN p.stock_quantity = 0 THEN 0
|
||||
ELSE ROUND(p.stock_quantity / NULLIF((SUM(o.quantity) / ?), 0))
|
||||
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||
END as daysUntilStockout
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
JOIN product_thresholds pt ON p.pid = pt.pid
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||
AND p.managing_stock = true
|
||||
GROUP BY p.pid
|
||||
HAVING daysUntilStockout < ? AND daysUntilStockout >= 0
|
||||
GROUP BY p.pid, pt.reorder_days
|
||||
HAVING
|
||||
CASE
|
||||
WHEN p.stock_quantity = 0 THEN 0
|
||||
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||
END < $3
|
||||
AND
|
||||
CASE
|
||||
WHEN p.stock_quantity = 0 THEN 0
|
||||
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||
END >= 0
|
||||
ORDER BY daysUntilStockout
|
||||
LIMIT 10
|
||||
`, [
|
||||
config.low_stock_threshold,
|
||||
config.turnover_period,
|
||||
config.turnover_period,
|
||||
config.turnover_period
|
||||
]);
|
||||
|
||||
@@ -355,7 +464,7 @@ router.get('/pricing', async (req, res) => {
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
// Get price points analysis
|
||||
const [pricePoints] = await pool.query(`
|
||||
const { rows: pricePoints } = await pool.query(`
|
||||
SELECT
|
||||
CAST(p.price AS DECIMAL(15,3)) as price,
|
||||
CAST(SUM(o.quantity) AS DECIMAL(15,3)) as salesVolume,
|
||||
@@ -365,27 +474,27 @@ router.get('/pricing', async (req, res) => {
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
JOIN product_categories pc ON p.pid = pc.pid
|
||||
JOIN categories c ON pc.cat_id = c.cat_id
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||
GROUP BY p.price, c.name
|
||||
HAVING salesVolume > 0
|
||||
HAVING SUM(o.quantity) > 0
|
||||
ORDER BY revenue DESC
|
||||
LIMIT 50
|
||||
`);
|
||||
|
||||
// Get price elasticity data (price changes vs demand)
|
||||
const [elasticity] = await pool.query(`
|
||||
const { rows: elasticity } = await pool.query(`
|
||||
SELECT
|
||||
DATE_FORMAT(o.date, '%Y-%m-%d') as date,
|
||||
to_char(o.date, 'YYYY-MM-DD') as date,
|
||||
CAST(AVG(o.price) AS DECIMAL(15,3)) as price,
|
||||
CAST(SUM(o.quantity) AS DECIMAL(15,3)) as demand
|
||||
FROM orders o
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
||||
GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||
GROUP BY to_char(o.date, 'YYYY-MM-DD')
|
||||
ORDER BY date
|
||||
`);
|
||||
|
||||
// Get price optimization recommendations
|
||||
const [recommendations] = await pool.query(`
|
||||
const { rows: recommendations } = await pool.query(`
|
||||
SELECT
|
||||
p.title as product,
|
||||
CAST(p.price AS DECIMAL(15,3)) as currentPrice,
|
||||
@@ -415,10 +524,30 @@ router.get('/pricing', async (req, res) => {
|
||||
END as confidence
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
||||
GROUP BY p.pid, p.price
|
||||
HAVING ABS(recommendedPrice - currentPrice) > 0
|
||||
ORDER BY potentialRevenue - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||
GROUP BY p.pid, p.price, p.title
|
||||
HAVING ABS(
|
||||
CAST(
|
||||
ROUND(
|
||||
CASE
|
||||
WHEN AVG(o.quantity) > 10 THEN p.price * 1.1
|
||||
WHEN AVG(o.quantity) < 2 THEN p.price * 0.9
|
||||
ELSE p.price
|
||||
END, 2
|
||||
) AS DECIMAL(15,3)
|
||||
) - CAST(p.price AS DECIMAL(15,3))
|
||||
) > 0
|
||||
ORDER BY
|
||||
CAST(
|
||||
ROUND(
|
||||
SUM(o.price * o.quantity) *
|
||||
CASE
|
||||
WHEN AVG(o.quantity) > 10 THEN 1.15
|
||||
WHEN AVG(o.quantity) < 2 THEN 0.95
|
||||
ELSE 1
|
||||
END, 2
|
||||
) AS DECIMAL(15,3)
|
||||
) - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
|
||||
LIMIT 10
|
||||
`);
|
||||
|
||||
@@ -441,7 +570,7 @@ router.get('/categories', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CAST(c.name AS CHAR(1000)) as path
|
||||
c.name::text as path
|
||||
FROM categories c
|
||||
WHERE c.parent_id IS NULL
|
||||
|
||||
@@ -451,27 +580,27 @@ router.get('/categories', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CONCAT(cp.path, ' > ', c.name)
|
||||
(cp.path || ' > ' || c.name)::text
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
)
|
||||
`;
|
||||
|
||||
// Get category performance metrics with full path
|
||||
const [performance] = await pool.query(`
|
||||
const { rows: performance } = await pool.query(`
|
||||
${categoryPathCTE},
|
||||
monthly_sales AS (
|
||||
SELECT
|
||||
c.name,
|
||||
cp.path,
|
||||
SUM(CASE
|
||||
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
||||
WHEN o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||
THEN o.price * o.quantity
|
||||
ELSE 0
|
||||
END) as current_month,
|
||||
SUM(CASE
|
||||
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
|
||||
AND o.date < DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
||||
WHEN o.date >= CURRENT_DATE - INTERVAL '60 days'
|
||||
AND o.date < CURRENT_DATE - INTERVAL '30 days'
|
||||
THEN o.price * o.quantity
|
||||
ELSE 0
|
||||
END) as previous_month
|
||||
@@ -480,7 +609,7 @@ router.get('/categories', async (req, res) => {
|
||||
JOIN product_categories pc ON p.pid = pc.pid
|
||||
JOIN categories c ON pc.cat_id = c.cat_id
|
||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '60 days'
|
||||
GROUP BY c.name, cp.path
|
||||
)
|
||||
SELECT
|
||||
@@ -499,15 +628,15 @@ router.get('/categories', async (req, res) => {
|
||||
JOIN categories c ON pc.cat_id = c.cat_id
|
||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||
LEFT JOIN monthly_sales ms ON c.name = ms.name AND cp.path = ms.path
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '60 days'
|
||||
GROUP BY c.name, cp.path, ms.current_month, ms.previous_month
|
||||
HAVING revenue > 0
|
||||
HAVING SUM(o.price * o.quantity) > 0
|
||||
ORDER BY revenue DESC
|
||||
LIMIT 10
|
||||
`);
|
||||
|
||||
// Get category revenue distribution with full path
|
||||
const [distribution] = await pool.query(`
|
||||
const { rows: distribution } = await pool.query(`
|
||||
${categoryPathCTE}
|
||||
SELECT
|
||||
c.name as category,
|
||||
@@ -518,35 +647,35 @@ router.get('/categories', async (req, res) => {
|
||||
JOIN product_categories pc ON p.pid = pc.pid
|
||||
JOIN categories c ON pc.cat_id = c.cat_id
|
||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||
GROUP BY c.name, cp.path
|
||||
HAVING value > 0
|
||||
HAVING SUM(o.price * o.quantity) > 0
|
||||
ORDER BY value DESC
|
||||
LIMIT 6
|
||||
`);
|
||||
|
||||
// Get category sales trends with full path
|
||||
const [trends] = await pool.query(`
|
||||
const { rows: trends } = await pool.query(`
|
||||
${categoryPathCTE}
|
||||
SELECT
|
||||
c.name as category,
|
||||
cp.path as categoryPath,
|
||||
DATE_FORMAT(o.date, '%b %Y') as month,
|
||||
to_char(o.date, 'Mon YYYY') as month,
|
||||
SUM(o.price * o.quantity) as sales
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
JOIN product_categories pc ON p.pid = pc.pid
|
||||
JOIN categories c ON pc.cat_id = c.cat_id
|
||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 6 MONTH)
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '6 months'
|
||||
GROUP BY
|
||||
c.name,
|
||||
cp.path,
|
||||
DATE_FORMAT(o.date, '%b %Y'),
|
||||
DATE_FORMAT(o.date, '%Y-%m')
|
||||
to_char(o.date, 'Mon YYYY'),
|
||||
to_char(o.date, 'YYYY-MM')
|
||||
ORDER BY
|
||||
c.name,
|
||||
DATE_FORMAT(o.date, '%Y-%m')
|
||||
to_char(o.date, 'YYYY-MM')
|
||||
`);
|
||||
|
||||
res.json({ performance, distribution, trends });
|
||||
|
||||
@@ -757,8 +757,8 @@ router.get('/history/import', async (req, res) => {
|
||||
end_time,
|
||||
status,
|
||||
error_message,
|
||||
rows_processed::integer,
|
||||
files_processed::integer
|
||||
records_added::integer,
|
||||
records_updated::integer
|
||||
FROM import_history
|
||||
ORDER BY start_time DESC
|
||||
LIMIT 20
|
||||
@@ -779,10 +779,16 @@ router.get('/history/calculate', async (req, res) => {
|
||||
id,
|
||||
start_time,
|
||||
end_time,
|
||||
duration_minutes,
|
||||
status,
|
||||
error_message,
|
||||
modules_processed::integer,
|
||||
total_modules::integer
|
||||
total_products,
|
||||
total_orders,
|
||||
total_purchase_orders,
|
||||
processed_products,
|
||||
processed_orders,
|
||||
processed_purchase_orders,
|
||||
additional_info
|
||||
FROM calculate_history
|
||||
ORDER BY start_time DESC
|
||||
LIMIT 20
|
||||
@@ -830,4 +836,58 @@ router.get('/status/tables', async (req, res) => {
|
||||
}
|
||||
});
|
||||
|
||||
// GET /status/table-counts - Get record counts for all tables
|
||||
router.get('/status/table-counts', async (req, res) => {
|
||||
try {
|
||||
const pool = req.app.locals.pool;
|
||||
const tables = [
|
||||
// Core tables
|
||||
'products', 'categories', 'product_categories', 'orders', 'purchase_orders',
|
||||
// Metrics tables
|
||||
'product_metrics', 'product_time_aggregates', 'vendor_metrics', 'category_metrics',
|
||||
'vendor_time_metrics', 'category_time_metrics', 'category_sales_metrics',
|
||||
'brand_metrics', 'brand_time_metrics', 'sales_forecasts', 'category_forecasts',
|
||||
// Config tables
|
||||
'stock_thresholds', 'lead_time_thresholds', 'sales_velocity_config',
|
||||
'abc_classification_config', 'safety_stock_config', 'turnover_config',
|
||||
'sales_seasonality', 'financial_calc_config'
|
||||
];
|
||||
|
||||
const counts = await Promise.all(
|
||||
tables.map(table =>
|
||||
pool.query(`SELECT COUNT(*) as count FROM ${table}`)
|
||||
.then(result => ({
|
||||
table_name: table,
|
||||
count: parseInt(result.rows[0].count)
|
||||
}))
|
||||
.catch(err => ({
|
||||
table_name: table,
|
||||
count: null,
|
||||
error: err.message
|
||||
}))
|
||||
)
|
||||
);
|
||||
|
||||
// Group tables by type
|
||||
const groupedCounts = {
|
||||
core: counts.filter(c => ['products', 'categories', 'product_categories', 'orders', 'purchase_orders'].includes(c.table_name)),
|
||||
metrics: counts.filter(c => [
|
||||
'product_metrics', 'product_time_aggregates', 'vendor_metrics', 'category_metrics',
|
||||
'vendor_time_metrics', 'category_time_metrics', 'category_sales_metrics',
|
||||
'brand_metrics', 'brand_time_metrics', 'sales_forecasts', 'category_forecasts'
|
||||
].includes(c.table_name)),
|
||||
config: counts.filter(c => [
|
||||
'stock_thresholds', 'lead_time_thresholds', 'sales_velocity_config',
|
||||
'abc_classification_config', 'safety_stock_config', 'turnover_config',
|
||||
'sales_seasonality', 'financial_calc_config'
|
||||
].includes(c.table_name))
|
||||
};
|
||||
|
||||
res.json(groupedCounts);
|
||||
} catch (error) {
|
||||
console.error('Error fetching table counts:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
File diff suppressed because it is too large
Load Diff
@@ -8,7 +8,9 @@ const fs = require('fs');
|
||||
|
||||
// Create uploads directory if it doesn't exist
|
||||
const uploadsDir = path.join('/var/www/html/inventory/uploads/products');
|
||||
const reusableUploadsDir = path.join('/var/www/html/inventory/uploads/reusable');
|
||||
fs.mkdirSync(uploadsDir, { recursive: true });
|
||||
fs.mkdirSync(reusableUploadsDir, { recursive: true });
|
||||
|
||||
// Create a Map to track image upload times and their scheduled deletion
|
||||
const imageUploadMap = new Map();
|
||||
@@ -35,6 +37,12 @@ const connectionCache = {
|
||||
|
||||
// Function to schedule image deletion after 24 hours
|
||||
const scheduleImageDeletion = (filename, filePath) => {
|
||||
// Only schedule deletion for images in the products folder
|
||||
if (!filePath.includes('/uploads/products/')) {
|
||||
console.log(`Skipping deletion for non-product image: ${filename}`);
|
||||
return;
|
||||
}
|
||||
|
||||
// Delete any existing timeout for this file
|
||||
if (imageUploadMap.has(filename)) {
|
||||
clearTimeout(imageUploadMap.get(filename).timeoutId);
|
||||
@@ -407,6 +415,14 @@ router.delete('/delete-image', (req, res) => {
|
||||
return res.status(404).json({ error: 'File not found' });
|
||||
}
|
||||
|
||||
// Only allow deletion of images in the products folder
|
||||
if (!filePath.includes('/uploads/products/')) {
|
||||
return res.status(403).json({
|
||||
error: 'Cannot delete images outside the products folder',
|
||||
message: 'This image is in a protected folder and cannot be deleted through this endpoint'
|
||||
});
|
||||
}
|
||||
|
||||
// Delete the file
|
||||
fs.unlinkSync(filePath);
|
||||
|
||||
@@ -641,11 +657,19 @@ router.get('/check-file/:filename', (req, res) => {
|
||||
return res.status(400).json({ error: 'Invalid filename' });
|
||||
}
|
||||
|
||||
const filePath = path.join(uploadsDir, filename);
|
||||
// First check in products directory
|
||||
let filePath = path.join(uploadsDir, filename);
|
||||
let exists = fs.existsSync(filePath);
|
||||
|
||||
// If not found in products, check in reusable directory
|
||||
if (!exists) {
|
||||
filePath = path.join(reusableUploadsDir, filename);
|
||||
exists = fs.existsSync(filePath);
|
||||
}
|
||||
|
||||
try {
|
||||
// Check if file exists
|
||||
if (!fs.existsSync(filePath)) {
|
||||
if (!exists) {
|
||||
return res.status(404).json({
|
||||
error: 'File not found',
|
||||
path: filePath,
|
||||
@@ -685,13 +709,23 @@ router.get('/check-file/:filename', (req, res) => {
|
||||
// List all files in uploads directory
|
||||
router.get('/list-uploads', (req, res) => {
|
||||
try {
|
||||
if (!fs.existsSync(uploadsDir)) {
|
||||
return res.status(404).json({ error: 'Uploads directory not found', path: uploadsDir });
|
||||
const { directory = 'products' } = req.query;
|
||||
|
||||
// Determine which directory to list
|
||||
let targetDir;
|
||||
if (directory === 'reusable') {
|
||||
targetDir = reusableUploadsDir;
|
||||
} else {
|
||||
targetDir = uploadsDir; // default to products
|
||||
}
|
||||
|
||||
const files = fs.readdirSync(uploadsDir);
|
||||
if (!fs.existsSync(targetDir)) {
|
||||
return res.status(404).json({ error: 'Uploads directory not found', path: targetDir });
|
||||
}
|
||||
|
||||
const files = fs.readdirSync(targetDir);
|
||||
const fileDetails = files.map(file => {
|
||||
const filePath = path.join(uploadsDir, file);
|
||||
const filePath = path.join(targetDir, file);
|
||||
try {
|
||||
const stats = fs.statSync(filePath);
|
||||
return {
|
||||
@@ -709,12 +743,13 @@ router.get('/list-uploads', (req, res) => {
|
||||
});
|
||||
|
||||
return res.json({
|
||||
directory: uploadsDir,
|
||||
directory: targetDir,
|
||||
type: directory,
|
||||
count: files.length,
|
||||
files: fileDetails
|
||||
});
|
||||
} catch (error) {
|
||||
return res.status(500).json({ error: error.message, path: uploadsDir });
|
||||
return res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
|
||||
@@ -65,6 +65,19 @@ router.get('/', async (req, res) => {
|
||||
paramCounter++;
|
||||
}
|
||||
|
||||
// Handle text filters for specific fields
|
||||
if (req.query.barcode) {
|
||||
conditions.push(`p.barcode ILIKE $${paramCounter}`);
|
||||
params.push(`%${req.query.barcode}%`);
|
||||
paramCounter++;
|
||||
}
|
||||
|
||||
if (req.query.vendor_reference) {
|
||||
conditions.push(`p.vendor_reference ILIKE $${paramCounter}`);
|
||||
params.push(`%${req.query.vendor_reference}%`);
|
||||
paramCounter++;
|
||||
}
|
||||
|
||||
// Handle numeric filters with operators
|
||||
const numericFields = {
|
||||
stock: 'p.stock_quantity',
|
||||
@@ -74,11 +87,22 @@ router.get('/', async (req, res) => {
|
||||
dailySalesAvg: 'pm.daily_sales_avg',
|
||||
weeklySalesAvg: 'pm.weekly_sales_avg',
|
||||
monthlySalesAvg: 'pm.monthly_sales_avg',
|
||||
avgQuantityPerOrder: 'pm.avg_quantity_per_order',
|
||||
numberOfOrders: 'pm.number_of_orders',
|
||||
margin: 'pm.avg_margin_percent',
|
||||
gmroi: 'pm.gmroi',
|
||||
inventoryValue: 'pm.inventory_value',
|
||||
costOfGoodsSold: 'pm.cost_of_goods_sold',
|
||||
grossProfit: 'pm.gross_profit',
|
||||
turnoverRate: 'pm.turnover_rate',
|
||||
leadTime: 'pm.current_lead_time',
|
||||
currentLeadTime: 'pm.current_lead_time',
|
||||
targetLeadTime: 'pm.target_lead_time',
|
||||
stockCoverage: 'pm.days_of_inventory',
|
||||
daysOfStock: 'pm.days_of_inventory'
|
||||
daysOfStock: 'pm.days_of_inventory',
|
||||
weeksOfStock: 'pm.weeks_of_inventory',
|
||||
reorderPoint: 'pm.reorder_point',
|
||||
safetyStock: 'pm.safety_stock'
|
||||
};
|
||||
|
||||
Object.entries(req.query).forEach(([key, value]) => {
|
||||
@@ -102,6 +126,24 @@ router.get('/', async (req, res) => {
|
||||
}
|
||||
});
|
||||
|
||||
// Handle date filters
|
||||
const dateFields = {
|
||||
firstSaleDate: 'pm.first_sale_date',
|
||||
lastSaleDate: 'pm.last_sale_date',
|
||||
lastPurchaseDate: 'pm.last_purchase_date',
|
||||
firstReceivedDate: 'pm.first_received_date',
|
||||
lastReceivedDate: 'pm.last_received_date'
|
||||
};
|
||||
|
||||
Object.entries(req.query).forEach(([key, value]) => {
|
||||
const field = dateFields[key];
|
||||
if (field) {
|
||||
conditions.push(`${field}::TEXT LIKE $${paramCounter}`);
|
||||
params.push(`${value}%`); // Format like '2023-01%' to match by month or '2023-01-01' for exact date
|
||||
paramCounter++;
|
||||
}
|
||||
});
|
||||
|
||||
// Handle select filters
|
||||
if (req.query.vendor) {
|
||||
conditions.push(`p.vendor = $${paramCounter}`);
|
||||
@@ -183,7 +225,7 @@ router.get('/', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CAST(c.name AS text) as path
|
||||
c.name::text as path
|
||||
FROM categories c
|
||||
WHERE c.parent_id IS NULL
|
||||
|
||||
@@ -193,7 +235,7 @@ router.get('/', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
cp.path || ' > ' || c.name
|
||||
(cp.path || ' > ' || c.name)::text
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
),
|
||||
@@ -295,7 +337,7 @@ router.get('/trending', async (req, res) => {
|
||||
const pool = req.app.locals.pool;
|
||||
try {
|
||||
// First check if we have any data
|
||||
const [checkData] = await pool.query(`
|
||||
const { rows } = await pool.query(`
|
||||
SELECT COUNT(*) as count,
|
||||
MAX(total_revenue) as max_revenue,
|
||||
MAX(daily_sales_avg) as max_daily_sales,
|
||||
@@ -303,15 +345,15 @@ router.get('/trending', async (req, res) => {
|
||||
FROM product_metrics
|
||||
WHERE total_revenue > 0 OR daily_sales_avg > 0
|
||||
`);
|
||||
console.log('Product metrics stats:', checkData[0]);
|
||||
console.log('Product metrics stats:', rows[0]);
|
||||
|
||||
if (checkData[0].count === 0) {
|
||||
if (parseInt(rows[0].count) === 0) {
|
||||
console.log('No products with metrics found');
|
||||
return res.json([]);
|
||||
}
|
||||
|
||||
// Get trending products
|
||||
const [rows] = await pool.query(`
|
||||
const { rows: trendingProducts } = await pool.query(`
|
||||
SELECT
|
||||
p.pid,
|
||||
p.sku,
|
||||
@@ -332,8 +374,8 @@ router.get('/trending', async (req, res) => {
|
||||
LIMIT 50
|
||||
`);
|
||||
|
||||
console.log('Trending products:', rows);
|
||||
res.json(rows);
|
||||
console.log('Trending products:', trendingProducts);
|
||||
res.json(trendingProducts);
|
||||
} catch (error) {
|
||||
console.error('Error fetching trending products:', error);
|
||||
res.status(500).json({ error: 'Failed to fetch trending products' });
|
||||
@@ -353,7 +395,7 @@ router.get('/:id', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CAST(c.name AS CHAR(1000)) as path
|
||||
c.name::text as path
|
||||
FROM categories c
|
||||
WHERE c.parent_id IS NULL
|
||||
|
||||
@@ -363,14 +405,14 @@ router.get('/:id', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CONCAT(cp.path, ' > ', c.name)
|
||||
(cp.path || ' > ' || c.name)::text
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
)
|
||||
`;
|
||||
|
||||
// Get product details with category paths
|
||||
const [productRows] = await pool.query(`
|
||||
const { rows: productRows } = await pool.query(`
|
||||
SELECT
|
||||
p.*,
|
||||
pm.daily_sales_avg,
|
||||
@@ -396,7 +438,7 @@ router.get('/:id', async (req, res) => {
|
||||
pm.overstocked_amt
|
||||
FROM products p
|
||||
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
||||
WHERE p.pid = ?
|
||||
WHERE p.pid = $1
|
||||
`, [id]);
|
||||
|
||||
if (!productRows.length) {
|
||||
@@ -404,14 +446,14 @@ router.get('/:id', async (req, res) => {
|
||||
}
|
||||
|
||||
// Get categories and their paths separately to avoid GROUP BY issues
|
||||
const [categoryRows] = await pool.query(`
|
||||
const { rows: categoryRows } = await pool.query(`
|
||||
WITH RECURSIVE
|
||||
category_path AS (
|
||||
SELECT
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CAST(c.name AS CHAR(1000)) as path
|
||||
c.name::text as path
|
||||
FROM categories c
|
||||
WHERE c.parent_id IS NULL
|
||||
|
||||
@@ -421,7 +463,7 @@ router.get('/:id', async (req, res) => {
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CONCAT(cp.path, ' > ', c.name)
|
||||
(cp.path || ' > ' || c.name)::text
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
),
|
||||
@@ -430,7 +472,7 @@ router.get('/:id', async (req, res) => {
|
||||
-- of other categories assigned to this product
|
||||
SELECT pc.cat_id
|
||||
FROM product_categories pc
|
||||
WHERE pc.pid = ?
|
||||
WHERE pc.pid = $1
|
||||
AND NOT EXISTS (
|
||||
-- Check if there are any child categories also assigned to this product
|
||||
SELECT 1
|
||||
@@ -448,7 +490,7 @@ router.get('/:id', async (req, res) => {
|
||||
JOIN categories c ON pc.cat_id = c.cat_id
|
||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||
JOIN product_leaf_categories plc ON c.cat_id = plc.cat_id
|
||||
WHERE pc.pid = ?
|
||||
WHERE pc.pid = $2
|
||||
ORDER BY cp.path
|
||||
`, [id, id]);
|
||||
|
||||
@@ -540,20 +582,20 @@ router.put('/:id', async (req, res) => {
|
||||
managing_stock
|
||||
} = req.body;
|
||||
|
||||
const [result] = await pool.query(
|
||||
const { rowCount } = await pool.query(
|
||||
`UPDATE products
|
||||
SET title = ?,
|
||||
sku = ?,
|
||||
stock_quantity = ?,
|
||||
price = ?,
|
||||
regular_price = ?,
|
||||
cost_price = ?,
|
||||
vendor = ?,
|
||||
brand = ?,
|
||||
categories = ?,
|
||||
visible = ?,
|
||||
managing_stock = ?
|
||||
WHERE pid = ?`,
|
||||
SET title = $1,
|
||||
sku = $2,
|
||||
stock_quantity = $3,
|
||||
price = $4,
|
||||
regular_price = $5,
|
||||
cost_price = $6,
|
||||
vendor = $7,
|
||||
brand = $8,
|
||||
categories = $9,
|
||||
visible = $10,
|
||||
managing_stock = $11
|
||||
WHERE pid = $12`,
|
||||
[
|
||||
title,
|
||||
sku,
|
||||
@@ -570,7 +612,7 @@ router.put('/:id', async (req, res) => {
|
||||
]
|
||||
);
|
||||
|
||||
if (result.affectedRows === 0) {
|
||||
if (rowCount === 0) {
|
||||
return res.status(404).json({ error: 'Product not found' });
|
||||
}
|
||||
|
||||
@@ -588,7 +630,7 @@ router.get('/:id/metrics', async (req, res) => {
|
||||
const { id } = req.params;
|
||||
|
||||
// Get metrics from product_metrics table with inventory health data
|
||||
const [metrics] = await pool.query(`
|
||||
const { rows: metrics } = await pool.query(`
|
||||
WITH inventory_status AS (
|
||||
SELECT
|
||||
p.pid,
|
||||
@@ -601,7 +643,7 @@ router.get('/:id/metrics', async (req, res) => {
|
||||
END as calculated_status
|
||||
FROM products p
|
||||
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
||||
WHERE p.pid = ?
|
||||
WHERE p.pid = $1
|
||||
)
|
||||
SELECT
|
||||
COALESCE(pm.daily_sales_avg, 0) as daily_sales_avg,
|
||||
@@ -627,8 +669,8 @@ router.get('/:id/metrics', async (req, res) => {
|
||||
FROM products p
|
||||
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
||||
LEFT JOIN inventory_status is ON p.pid = is.pid
|
||||
WHERE p.pid = ?
|
||||
`, [id]);
|
||||
WHERE p.pid = $2
|
||||
`, [id, id]);
|
||||
|
||||
if (!metrics.length) {
|
||||
// Return default metrics structure if no data found
|
||||
@@ -669,16 +711,16 @@ router.get('/:id/time-series', async (req, res) => {
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
// Get monthly sales data
|
||||
const [monthlySales] = await pool.query(`
|
||||
const { rows: monthlySales } = await pool.query(`
|
||||
SELECT
|
||||
DATE_FORMAT(date, '%Y-%m') as month,
|
||||
TO_CHAR(date, 'YYYY-MM') as month,
|
||||
COUNT(DISTINCT order_number) as order_count,
|
||||
SUM(quantity) as units_sold,
|
||||
CAST(SUM(price * quantity) AS DECIMAL(15,3)) as revenue
|
||||
ROUND(SUM(price * quantity)::numeric, 3) as revenue
|
||||
FROM orders
|
||||
WHERE pid = ?
|
||||
WHERE pid = $1
|
||||
AND canceled = false
|
||||
GROUP BY DATE_FORMAT(date, '%Y-%m')
|
||||
GROUP BY TO_CHAR(date, 'YYYY-MM')
|
||||
ORDER BY month DESC
|
||||
LIMIT 12
|
||||
`, [id]);
|
||||
@@ -693,9 +735,9 @@ router.get('/:id/time-series', async (req, res) => {
|
||||
}));
|
||||
|
||||
// Get recent orders
|
||||
const [recentOrders] = await pool.query(`
|
||||
const { rows: recentOrders } = await pool.query(`
|
||||
SELECT
|
||||
DATE_FORMAT(date, '%Y-%m-%d') as date,
|
||||
TO_CHAR(date, 'YYYY-MM-DD') as date,
|
||||
order_number,
|
||||
quantity,
|
||||
price,
|
||||
@@ -705,18 +747,18 @@ router.get('/:id/time-series', async (req, res) => {
|
||||
customer_name as customer,
|
||||
status
|
||||
FROM orders
|
||||
WHERE pid = ?
|
||||
WHERE pid = $1
|
||||
AND canceled = false
|
||||
ORDER BY date DESC
|
||||
LIMIT 10
|
||||
`, [id]);
|
||||
|
||||
// Get recent purchase orders with detailed status
|
||||
const [recentPurchases] = await pool.query(`
|
||||
const { rows: recentPurchases } = await pool.query(`
|
||||
SELECT
|
||||
DATE_FORMAT(date, '%Y-%m-%d') as date,
|
||||
DATE_FORMAT(expected_date, '%Y-%m-%d') as expected_date,
|
||||
DATE_FORMAT(received_date, '%Y-%m-%d') as received_date,
|
||||
TO_CHAR(date, 'YYYY-MM-DD') as date,
|
||||
TO_CHAR(expected_date, 'YYYY-MM-DD') as expected_date,
|
||||
TO_CHAR(received_date, 'YYYY-MM-DD') as received_date,
|
||||
po_id,
|
||||
ordered,
|
||||
received,
|
||||
@@ -726,17 +768,17 @@ router.get('/:id/time-series', async (req, res) => {
|
||||
notes,
|
||||
CASE
|
||||
WHEN received_date IS NOT NULL THEN
|
||||
DATEDIFF(received_date, date)
|
||||
WHEN expected_date < CURDATE() AND status < ${PurchaseOrderStatus.ReceivingStarted} THEN
|
||||
DATEDIFF(CURDATE(), expected_date)
|
||||
(received_date - date)
|
||||
WHEN expected_date < CURRENT_DATE AND status < $2 THEN
|
||||
(CURRENT_DATE - expected_date)
|
||||
ELSE NULL
|
||||
END as lead_time_days
|
||||
FROM purchase_orders
|
||||
WHERE pid = ?
|
||||
AND status != ${PurchaseOrderStatus.Canceled}
|
||||
WHERE pid = $1
|
||||
AND status != $3
|
||||
ORDER BY date DESC
|
||||
LIMIT 10
|
||||
`, [id]);
|
||||
`, [id, PurchaseOrderStatus.ReceivingStarted, PurchaseOrderStatus.Canceled]);
|
||||
|
||||
res.json({
|
||||
monthly_sales: formattedMonthlySales,
|
||||
|
||||
@@ -97,6 +97,28 @@ router.get('/', async (req, res) => {
|
||||
const pages = Math.ceil(total / limit);
|
||||
|
||||
// Get recent purchase orders
|
||||
let orderByClause;
|
||||
|
||||
if (sortColumn === 'order_date') {
|
||||
orderByClause = `date ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
} else if (sortColumn === 'vendor_name') {
|
||||
orderByClause = `vendor ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
} else if (sortColumn === 'total_cost') {
|
||||
orderByClause = `total_cost ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
} else if (sortColumn === 'total_received') {
|
||||
orderByClause = `total_received ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
} else if (sortColumn === 'total_items') {
|
||||
orderByClause = `total_items ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
} else if (sortColumn === 'total_quantity') {
|
||||
orderByClause = `total_quantity ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
} else if (sortColumn === 'fulfillment_rate') {
|
||||
orderByClause = `fulfillment_rate ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
} else if (sortColumn === 'status') {
|
||||
orderByClause = `status ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
} else {
|
||||
orderByClause = `date ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||
}
|
||||
|
||||
const { rows: orders } = await pool.query(`
|
||||
WITH po_totals AS (
|
||||
SELECT
|
||||
@@ -128,20 +150,9 @@ router.get('/', async (req, res) => {
|
||||
total_received,
|
||||
fulfillment_rate
|
||||
FROM po_totals
|
||||
ORDER BY
|
||||
CASE
|
||||
WHEN $${paramCounter} = 'order_date' THEN date
|
||||
WHEN $${paramCounter} = 'vendor_name' THEN vendor
|
||||
WHEN $${paramCounter} = 'total_cost' THEN total_cost
|
||||
WHEN $${paramCounter} = 'total_received' THEN total_received
|
||||
WHEN $${paramCounter} = 'total_items' THEN total_items
|
||||
WHEN $${paramCounter} = 'total_quantity' THEN total_quantity
|
||||
WHEN $${paramCounter} = 'fulfillment_rate' THEN fulfillment_rate
|
||||
WHEN $${paramCounter} = 'status' THEN status
|
||||
ELSE date
|
||||
END ${sortDirection === 'desc' ? 'DESC' : 'ASC'}
|
||||
LIMIT $${paramCounter + 1} OFFSET $${paramCounter + 2}
|
||||
`, [...params, sortColumn, Number(limit), offset]);
|
||||
ORDER BY ${orderByClause}
|
||||
LIMIT $${paramCounter} OFFSET $${paramCounter + 1}
|
||||
`, [...params, Number(limit), offset]);
|
||||
|
||||
// Get unique vendors for filter options
|
||||
const { rows: vendors } = await pool.query(`
|
||||
@@ -272,7 +283,7 @@ router.get('/cost-analysis', async (req, res) => {
|
||||
try {
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
const [analysis] = await pool.query(`
|
||||
const { rows: analysis } = await pool.query(`
|
||||
WITH category_costs AS (
|
||||
SELECT
|
||||
c.name as category,
|
||||
@@ -290,11 +301,11 @@ router.get('/cost-analysis', async (req, res) => {
|
||||
SELECT
|
||||
category,
|
||||
COUNT(DISTINCT pid) as unique_products,
|
||||
CAST(AVG(cost_price) AS DECIMAL(15,3)) as avg_cost,
|
||||
CAST(MIN(cost_price) AS DECIMAL(15,3)) as min_cost,
|
||||
CAST(MAX(cost_price) AS DECIMAL(15,3)) as max_cost,
|
||||
CAST(STDDEV(cost_price) AS DECIMAL(15,3)) as cost_variance,
|
||||
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_spend
|
||||
ROUND(AVG(cost_price)::numeric, 3) as avg_cost,
|
||||
ROUND(MIN(cost_price)::numeric, 3) as min_cost,
|
||||
ROUND(MAX(cost_price)::numeric, 3) as max_cost,
|
||||
ROUND(STDDEV(cost_price)::numeric, 3) as cost_variance,
|
||||
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
|
||||
FROM category_costs
|
||||
GROUP BY category
|
||||
ORDER BY total_spend DESC
|
||||
@@ -302,17 +313,37 @@ router.get('/cost-analysis', async (req, res) => {
|
||||
|
||||
// Parse numeric values
|
||||
const parsedAnalysis = {
|
||||
categories: analysis.map(cat => ({
|
||||
unique_products: 0,
|
||||
avg_cost: 0,
|
||||
min_cost: 0,
|
||||
max_cost: 0,
|
||||
cost_variance: 0,
|
||||
total_spend_by_category: analysis.map(cat => ({
|
||||
category: cat.category,
|
||||
unique_products: Number(cat.unique_products) || 0,
|
||||
avg_cost: Number(cat.avg_cost) || 0,
|
||||
min_cost: Number(cat.min_cost) || 0,
|
||||
max_cost: Number(cat.max_cost) || 0,
|
||||
cost_variance: Number(cat.cost_variance) || 0,
|
||||
total_spend: Number(cat.total_spend) || 0
|
||||
}))
|
||||
};
|
||||
|
||||
// Calculate aggregated stats if data exists
|
||||
if (analysis.length > 0) {
|
||||
parsedAnalysis.unique_products = analysis.reduce((sum, cat) => sum + Number(cat.unique_products || 0), 0);
|
||||
|
||||
// Calculate weighted average cost
|
||||
const totalProducts = parsedAnalysis.unique_products;
|
||||
if (totalProducts > 0) {
|
||||
parsedAnalysis.avg_cost = analysis.reduce((sum, cat) =>
|
||||
sum + (Number(cat.avg_cost || 0) * Number(cat.unique_products || 0)), 0) / totalProducts;
|
||||
}
|
||||
|
||||
// Find min and max across all categories
|
||||
parsedAnalysis.min_cost = Math.min(...analysis.map(cat => Number(cat.min_cost || 0)));
|
||||
parsedAnalysis.max_cost = Math.max(...analysis.map(cat => Number(cat.max_cost || 0)));
|
||||
|
||||
// Average variance
|
||||
parsedAnalysis.cost_variance = analysis.reduce((sum, cat) =>
|
||||
sum + Number(cat.cost_variance || 0), 0) / analysis.length;
|
||||
}
|
||||
|
||||
res.json(parsedAnalysis);
|
||||
} catch (error) {
|
||||
console.error('Error fetching cost analysis:', error);
|
||||
@@ -325,7 +356,7 @@ router.get('/receiving-status', async (req, res) => {
|
||||
try {
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
const [status] = await pool.query(`
|
||||
const { rows: status } = await pool.query(`
|
||||
WITH po_totals AS (
|
||||
SELECT
|
||||
po_id,
|
||||
@@ -333,7 +364,7 @@ router.get('/receiving-status', async (req, res) => {
|
||||
receiving_status,
|
||||
SUM(ordered) as total_ordered,
|
||||
SUM(received) as total_received,
|
||||
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_cost
|
||||
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_cost
|
||||
FROM purchase_orders
|
||||
WHERE status != ${STATUS.CANCELED}
|
||||
GROUP BY po_id, status, receiving_status
|
||||
@@ -345,8 +376,8 @@ router.get('/receiving-status', async (req, res) => {
|
||||
ROUND(
|
||||
SUM(total_received) / NULLIF(SUM(total_ordered), 0), 3
|
||||
) as fulfillment_rate,
|
||||
CAST(SUM(total_cost) AS DECIMAL(15,3)) as total_value,
|
||||
CAST(AVG(total_cost) AS DECIMAL(15,3)) as avg_cost,
|
||||
ROUND(SUM(total_cost)::numeric, 3) as total_value,
|
||||
ROUND(AVG(total_cost)::numeric, 3) as avg_cost,
|
||||
COUNT(DISTINCT CASE
|
||||
WHEN receiving_status = ${RECEIVING_STATUS.CREATED} THEN po_id
|
||||
END) as pending_count,
|
||||
@@ -364,17 +395,17 @@ router.get('/receiving-status', async (req, res) => {
|
||||
|
||||
// Parse numeric values
|
||||
const parsedStatus = {
|
||||
order_count: Number(status[0].order_count) || 0,
|
||||
total_ordered: Number(status[0].total_ordered) || 0,
|
||||
total_received: Number(status[0].total_received) || 0,
|
||||
fulfillment_rate: Number(status[0].fulfillment_rate) || 0,
|
||||
total_value: Number(status[0].total_value) || 0,
|
||||
avg_cost: Number(status[0].avg_cost) || 0,
|
||||
order_count: Number(status[0]?.order_count) || 0,
|
||||
total_ordered: Number(status[0]?.total_ordered) || 0,
|
||||
total_received: Number(status[0]?.total_received) || 0,
|
||||
fulfillment_rate: Number(status[0]?.fulfillment_rate) || 0,
|
||||
total_value: Number(status[0]?.total_value) || 0,
|
||||
avg_cost: Number(status[0]?.avg_cost) || 0,
|
||||
status_breakdown: {
|
||||
pending: Number(status[0].pending_count) || 0,
|
||||
partial: Number(status[0].partial_count) || 0,
|
||||
completed: Number(status[0].completed_count) || 0,
|
||||
canceled: Number(status[0].canceled_count) || 0
|
||||
pending: Number(status[0]?.pending_count) || 0,
|
||||
partial: Number(status[0]?.partial_count) || 0,
|
||||
completed: Number(status[0]?.completed_count) || 0,
|
||||
canceled: Number(status[0]?.canceled_count) || 0
|
||||
}
|
||||
};
|
||||
|
||||
@@ -390,7 +421,7 @@ router.get('/order-vs-received', async (req, res) => {
|
||||
try {
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
const [quantities] = await pool.query(`
|
||||
const { rows: quantities } = await pool.query(`
|
||||
SELECT
|
||||
p.product_id,
|
||||
p.title as product,
|
||||
@@ -403,10 +434,10 @@ router.get('/order-vs-received', async (req, res) => {
|
||||
COUNT(DISTINCT po.po_id) as order_count
|
||||
FROM products p
|
||||
JOIN purchase_orders po ON p.product_id = po.product_id
|
||||
WHERE po.date >= DATE_SUB(CURDATE(), INTERVAL 90 DAY)
|
||||
WHERE po.date >= (CURRENT_DATE - INTERVAL '90 days')
|
||||
GROUP BY p.product_id, p.title, p.SKU
|
||||
HAVING order_count > 0
|
||||
ORDER BY ordered_quantity DESC
|
||||
HAVING COUNT(DISTINCT po.po_id) > 0
|
||||
ORDER BY SUM(po.ordered) DESC
|
||||
LIMIT 20
|
||||
`);
|
||||
|
||||
|
||||
396
inventory-server/src/routes/reusable-images.js
Normal file
396
inventory-server/src/routes/reusable-images.js
Normal file
@@ -0,0 +1,396 @@
|
||||
const express = require('express');
|
||||
const router = express.Router();
|
||||
const multer = require('multer');
|
||||
const path = require('path');
|
||||
const fs = require('fs');
|
||||
|
||||
// Create reusable uploads directory if it doesn't exist
|
||||
const uploadsDir = path.join('/var/www/html/inventory/uploads/reusable');
|
||||
fs.mkdirSync(uploadsDir, { recursive: true });
|
||||
|
||||
// Configure multer for file uploads
|
||||
const storage = multer.diskStorage({
|
||||
destination: function (req, file, cb) {
|
||||
console.log(`Saving reusable image to: ${uploadsDir}`);
|
||||
cb(null, uploadsDir);
|
||||
},
|
||||
filename: function (req, file, cb) {
|
||||
// Create unique filename with original extension
|
||||
const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);
|
||||
|
||||
// Make sure we preserve the original file extension
|
||||
let fileExt = path.extname(file.originalname).toLowerCase();
|
||||
|
||||
// Ensure there is a proper extension based on mimetype if none exists
|
||||
if (!fileExt) {
|
||||
switch (file.mimetype) {
|
||||
case 'image/jpeg': fileExt = '.jpg'; break;
|
||||
case 'image/png': fileExt = '.png'; break;
|
||||
case 'image/gif': fileExt = '.gif'; break;
|
||||
case 'image/webp': fileExt = '.webp'; break;
|
||||
default: fileExt = '.jpg'; // Default to jpg
|
||||
}
|
||||
}
|
||||
|
||||
const fileName = `reusable-${uniqueSuffix}${fileExt}`;
|
||||
console.log(`Generated filename: ${fileName} with mimetype: ${file.mimetype}`);
|
||||
cb(null, fileName);
|
||||
}
|
||||
});
|
||||
|
||||
const upload = multer({
|
||||
storage: storage,
|
||||
limits: {
|
||||
fileSize: 5 * 1024 * 1024, // 5MB max file size
|
||||
},
|
||||
fileFilter: function (req, file, cb) {
|
||||
// Accept only image files
|
||||
const filetypes = /jpeg|jpg|png|gif|webp/;
|
||||
const mimetype = filetypes.test(file.mimetype);
|
||||
const extname = filetypes.test(path.extname(file.originalname).toLowerCase());
|
||||
|
||||
if (mimetype && extname) {
|
||||
return cb(null, true);
|
||||
}
|
||||
cb(new Error('Only image files are allowed'));
|
||||
}
|
||||
});
|
||||
|
||||
// Get all reusable images
|
||||
router.get('/', async (req, res) => {
|
||||
try {
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
SELECT * FROM reusable_images
|
||||
ORDER BY created_at DESC
|
||||
`);
|
||||
res.json(result.rows);
|
||||
} catch (error) {
|
||||
console.error('Error fetching reusable images:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to fetch reusable images',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Get images by company or global images
|
||||
router.get('/by-company/:companyId', async (req, res) => {
|
||||
try {
|
||||
const { companyId } = req.params;
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Get images that are either global or belong to this company
|
||||
const result = await pool.query(`
|
||||
SELECT * FROM reusable_images
|
||||
WHERE is_global = true OR company = $1
|
||||
ORDER BY created_at DESC
|
||||
`, [companyId]);
|
||||
|
||||
res.json(result.rows);
|
||||
} catch (error) {
|
||||
console.error('Error fetching reusable images by company:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to fetch reusable images by company',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Get global images only
|
||||
router.get('/global', async (req, res) => {
|
||||
try {
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
SELECT * FROM reusable_images
|
||||
WHERE is_global = true
|
||||
ORDER BY created_at DESC
|
||||
`);
|
||||
|
||||
res.json(result.rows);
|
||||
} catch (error) {
|
||||
console.error('Error fetching global reusable images:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to fetch global reusable images',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Get a single image by ID
|
||||
router.get('/:id', async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
SELECT * FROM reusable_images
|
||||
WHERE id = $1
|
||||
`, [id]);
|
||||
|
||||
if (result.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'Reusable image not found' });
|
||||
}
|
||||
|
||||
res.json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error fetching reusable image:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to fetch reusable image',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Upload a new reusable image
|
||||
router.post('/upload', upload.single('image'), async (req, res) => {
|
||||
try {
|
||||
if (!req.file) {
|
||||
return res.status(400).json({ error: 'No image file provided' });
|
||||
}
|
||||
|
||||
const { name, is_global, company } = req.body;
|
||||
|
||||
// Validate required fields
|
||||
if (!name) {
|
||||
return res.status(400).json({ error: 'Image name is required' });
|
||||
}
|
||||
|
||||
// Convert is_global from string to boolean
|
||||
const isGlobal = is_global === 'true' || is_global === true;
|
||||
|
||||
// Validate company is provided for non-global images
|
||||
if (!isGlobal && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for non-global images' });
|
||||
}
|
||||
|
||||
// Log file information
|
||||
console.log('Reusable image uploaded:', {
|
||||
filename: req.file.filename,
|
||||
originalname: req.file.originalname,
|
||||
mimetype: req.file.mimetype,
|
||||
size: req.file.size,
|
||||
path: req.file.path
|
||||
});
|
||||
|
||||
// Ensure the file exists
|
||||
const filePath = path.join(uploadsDir, req.file.filename);
|
||||
if (!fs.existsSync(filePath)) {
|
||||
return res.status(500).json({ error: 'File was not saved correctly' });
|
||||
}
|
||||
|
||||
// Create URL for the uploaded file
|
||||
const baseUrl = 'https://inventory.acot.site';
|
||||
const imageUrl = `${baseUrl}/uploads/reusable/${req.file.filename}`;
|
||||
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Insert record into database
|
||||
const result = await pool.query(`
|
||||
INSERT INTO reusable_images (
|
||||
name,
|
||||
filename,
|
||||
file_path,
|
||||
image_url,
|
||||
is_global,
|
||||
company,
|
||||
mime_type,
|
||||
file_size
|
||||
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
|
||||
RETURNING *
|
||||
`, [
|
||||
name,
|
||||
req.file.filename,
|
||||
filePath,
|
||||
imageUrl,
|
||||
isGlobal,
|
||||
isGlobal ? null : company,
|
||||
req.file.mimetype,
|
||||
req.file.size
|
||||
]);
|
||||
|
||||
// Return success response with image data
|
||||
res.status(201).json({
|
||||
success: true,
|
||||
image: result.rows[0],
|
||||
message: 'Image uploaded successfully'
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error uploading reusable image:', error);
|
||||
res.status(500).json({ error: error.message || 'Failed to upload image' });
|
||||
}
|
||||
});
|
||||
|
||||
// Update image details (name, is_global, company)
|
||||
router.put('/:id', async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const { name, is_global, company } = req.body;
|
||||
|
||||
// Validate required fields
|
||||
if (!name) {
|
||||
return res.status(400).json({ error: 'Image name is required' });
|
||||
}
|
||||
|
||||
// Convert is_global from string to boolean if necessary
|
||||
const isGlobal = typeof is_global === 'string' ? is_global === 'true' : !!is_global;
|
||||
|
||||
// Validate company is provided for non-global images
|
||||
if (!isGlobal && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for non-global images' });
|
||||
}
|
||||
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Check if the image exists
|
||||
const checkResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
|
||||
if (checkResult.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'Reusable image not found' });
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
UPDATE reusable_images
|
||||
SET
|
||||
name = $1,
|
||||
is_global = $2,
|
||||
company = $3
|
||||
WHERE id = $4
|
||||
RETURNING *
|
||||
`, [
|
||||
name,
|
||||
isGlobal,
|
||||
isGlobal ? null : company,
|
||||
id
|
||||
]);
|
||||
|
||||
res.json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error updating reusable image:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to update reusable image',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Delete a reusable image
|
||||
router.delete('/:id', async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Get the image data first to get the filename
|
||||
const imageResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
|
||||
|
||||
if (imageResult.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'Reusable image not found' });
|
||||
}
|
||||
|
||||
const image = imageResult.rows[0];
|
||||
|
||||
// Delete from database
|
||||
await pool.query('DELETE FROM reusable_images WHERE id = $1', [id]);
|
||||
|
||||
// Delete the file from filesystem
|
||||
const filePath = path.join(uploadsDir, image.filename);
|
||||
if (fs.existsSync(filePath)) {
|
||||
fs.unlinkSync(filePath);
|
||||
}
|
||||
|
||||
res.json({
|
||||
message: 'Reusable image deleted successfully',
|
||||
image
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error deleting reusable image:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to delete reusable image',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Check if file exists and permissions
|
||||
router.get('/check-file/:filename', (req, res) => {
|
||||
const { filename } = req.params;
|
||||
|
||||
// Prevent directory traversal
|
||||
if (filename.includes('..') || filename.includes('/')) {
|
||||
return res.status(400).json({ error: 'Invalid filename' });
|
||||
}
|
||||
|
||||
const filePath = path.join(uploadsDir, filename);
|
||||
|
||||
try {
|
||||
// Check if file exists
|
||||
if (!fs.existsSync(filePath)) {
|
||||
return res.status(404).json({
|
||||
error: 'File not found',
|
||||
path: filePath,
|
||||
exists: false,
|
||||
readable: false
|
||||
});
|
||||
}
|
||||
|
||||
// Check if file is readable
|
||||
fs.accessSync(filePath, fs.constants.R_OK);
|
||||
|
||||
// Get file stats
|
||||
const stats = fs.statSync(filePath);
|
||||
|
||||
return res.json({
|
||||
filename,
|
||||
path: filePath,
|
||||
exists: true,
|
||||
readable: true,
|
||||
isFile: stats.isFile(),
|
||||
isDirectory: stats.isDirectory(),
|
||||
size: stats.size,
|
||||
created: stats.birthtime,
|
||||
modified: stats.mtime,
|
||||
permissions: stats.mode.toString(8)
|
||||
});
|
||||
} catch (error) {
|
||||
return res.status(500).json({
|
||||
error: error.message,
|
||||
path: filePath,
|
||||
exists: fs.existsSync(filePath),
|
||||
readable: false
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Error handling middleware
|
||||
router.use((err, req, res, next) => {
|
||||
console.error('Reusable images route error:', err);
|
||||
res.status(500).json({
|
||||
error: 'Internal server error',
|
||||
details: err.message
|
||||
});
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
@@ -32,7 +32,7 @@ router.get('/', async (req, res) => {
|
||||
ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
|
||||
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
|
||||
FROM purchase_orders
|
||||
WHERE status = 'closed'
|
||||
WHERE status = 2
|
||||
AND cost_price IS NOT NULL
|
||||
AND ordered > 0
|
||||
AND vendor = ANY($1)
|
||||
@@ -70,7 +70,7 @@ router.get('/', async (req, res) => {
|
||||
ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
|
||||
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
|
||||
FROM purchase_orders
|
||||
WHERE status = 'closed'
|
||||
WHERE status = 2
|
||||
AND cost_price IS NOT NULL
|
||||
AND ordered > 0
|
||||
AND vendor IS NOT NULL AND vendor != ''
|
||||
|
||||
@@ -19,6 +19,7 @@ const importRouter = require('./routes/import');
|
||||
const aiValidationRouter = require('./routes/ai-validation');
|
||||
const templatesRouter = require('./routes/templates');
|
||||
const aiPromptsRouter = require('./routes/ai-prompts');
|
||||
const reusableImagesRouter = require('./routes/reusable-images');
|
||||
|
||||
// Get the absolute path to the .env file
|
||||
const envPath = '/var/www/html/inventory/.env';
|
||||
@@ -105,6 +106,7 @@ async function startServer() {
|
||||
app.use('/api/ai-validation', aiValidationRouter);
|
||||
app.use('/api/templates', templatesRouter);
|
||||
app.use('/api/ai-prompts', aiPromptsRouter);
|
||||
app.use('/api/reusable-images', reusableImagesRouter);
|
||||
|
||||
// Basic health check route
|
||||
app.get('/health', (req, res) => {
|
||||
|
||||
@@ -2,9 +2,12 @@
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="UTF-8" />
|
||||
<link rel="icon" type="image/svg+xml" href="/box.svg" />
|
||||
<link rel="icon" type="image/x-icon" href="/cherrybottom.ico" />
|
||||
<link rel="preconnect" href="https://fonts.googleapis.com">
|
||||
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
|
||||
<link href="https://fonts.googleapis.com/css2?family=Inter:wght@100..900&display=swap" rel="stylesheet">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
|
||||
<title>Inventory Manager</title>
|
||||
<title>A Cherry On Bottom</title>
|
||||
</head>
|
||||
<body>
|
||||
<div id="root"></div>
|
||||
|
||||
@@ -1 +0,0 @@
|
||||
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24"><g fill="none" stroke="currentColor" stroke-linecap="round" stroke-linejoin="round" stroke-width="2"><path d="M21 8a2 2 0 0 0-1-1.73l-7-4a2 2 0 0 0-2 0l-7 4A2 2 0 0 0 3 8v8a2 2 0 0 0 1 1.73l7 4a2 2 0 0 0 2 0l7-4A2 2 0 0 0 21 16Z"/><path d="m3.3 7l8.7 5l8.7-5M12 22V12"/></g></svg>
|
||||
|
Before Width: | Height: | Size: 340 B |
BIN
inventory/public/cherrybottom.ico
Normal file
BIN
inventory/public/cherrybottom.ico
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 30 KiB |
BIN
inventory/public/cherrybottom.png
Normal file
BIN
inventory/public/cherrybottom.png
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 21 KiB |
@@ -1,42 +1,3 @@
|
||||
#root {
|
||||
max-width: 1800px;
|
||||
margin: 0 auto;
|
||||
padding: 2rem;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.logo {
|
||||
height: 6em;
|
||||
padding: 1.5em;
|
||||
will-change: filter;
|
||||
transition: filter 300ms;
|
||||
}
|
||||
.logo:hover {
|
||||
filter: drop-shadow(0 0 2em #646cffaa);
|
||||
}
|
||||
.logo.react:hover {
|
||||
filter: drop-shadow(0 0 2em #61dafbaa);
|
||||
}
|
||||
|
||||
@keyframes logo-spin {
|
||||
from {
|
||||
transform: rotate(0deg);
|
||||
}
|
||||
to {
|
||||
transform: rotate(360deg);
|
||||
}
|
||||
}
|
||||
|
||||
@media (prefers-reduced-motion: no-preference) {
|
||||
a:nth-of-type(2) .logo {
|
||||
animation: logo-spin infinite 20s linear;
|
||||
}
|
||||
}
|
||||
|
||||
.card {
|
||||
padding: 2em;
|
||||
}
|
||||
|
||||
.read-the-docs {
|
||||
color: #888;
|
||||
}
|
||||
font-family: 'Inter', sans-serif;
|
||||
}
|
||||
@@ -1,7 +1,7 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
|
||||
import { ResponsiveContainer, BarChart, Bar, XAxis, YAxis, Tooltip, ScatterChart, Scatter, ZAxis } from 'recharts';
|
||||
import config from '../../config';
|
||||
import { useState, useEffect } from 'react';
|
||||
|
||||
interface VendorData {
|
||||
performance: {
|
||||
@@ -10,14 +10,15 @@ interface VendorData {
|
||||
profitMargin: number;
|
||||
stockTurnover: number;
|
||||
productCount: number;
|
||||
growth: number;
|
||||
}[];
|
||||
comparison: {
|
||||
comparison?: {
|
||||
vendor: string;
|
||||
salesPerProduct: number;
|
||||
averageMargin: number;
|
||||
size: number;
|
||||
}[];
|
||||
trends: {
|
||||
trends?: {
|
||||
vendor: string;
|
||||
month: string;
|
||||
sales: number;
|
||||
@@ -25,40 +26,86 @@ interface VendorData {
|
||||
}
|
||||
|
||||
export function VendorPerformance() {
|
||||
const { data, isLoading } = useQuery<VendorData>({
|
||||
queryKey: ['vendor-performance'],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/analytics/vendors`);
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch vendor performance');
|
||||
}
|
||||
const rawData = await response.json();
|
||||
return {
|
||||
performance: rawData.performance.map((vendor: any) => ({
|
||||
...vendor,
|
||||
salesVolume: Number(vendor.salesVolume) || 0,
|
||||
profitMargin: Number(vendor.profitMargin) || 0,
|
||||
stockTurnover: Number(vendor.stockTurnover) || 0,
|
||||
productCount: Number(vendor.productCount) || 0
|
||||
})),
|
||||
comparison: rawData.comparison.map((vendor: any) => ({
|
||||
...vendor,
|
||||
salesPerProduct: Number(vendor.salesPerProduct) || 0,
|
||||
averageMargin: Number(vendor.averageMargin) || 0,
|
||||
size: Number(vendor.size) || 0
|
||||
})),
|
||||
trends: rawData.trends.map((vendor: any) => ({
|
||||
...vendor,
|
||||
sales: Number(vendor.sales) || 0
|
||||
}))
|
||||
};
|
||||
},
|
||||
});
|
||||
const [vendorData, setVendorData] = useState<VendorData | null>(null);
|
||||
const [isLoading, setIsLoading] = useState(true);
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
|
||||
if (isLoading || !data) {
|
||||
useEffect(() => {
|
||||
// Use plain fetch to bypass cache issues with React Query
|
||||
const fetchData = async () => {
|
||||
try {
|
||||
setIsLoading(true);
|
||||
|
||||
// Add cache-busting parameter
|
||||
const response = await fetch(`${config.apiUrl}/analytics/vendors?nocache=${Date.now()}`, {
|
||||
headers: {
|
||||
"Cache-Control": "no-cache, no-store, must-revalidate",
|
||||
"Pragma": "no-cache",
|
||||
"Expires": "0"
|
||||
}
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error(`Failed to fetch: ${response.status}`);
|
||||
}
|
||||
|
||||
const rawData = await response.json();
|
||||
|
||||
if (!rawData || !rawData.performance) {
|
||||
throw new Error('Invalid response format');
|
||||
}
|
||||
|
||||
// Create a complete structure even if some parts are missing
|
||||
const data: VendorData = {
|
||||
performance: rawData.performance.map((vendor: any) => ({
|
||||
vendor: vendor.vendor,
|
||||
salesVolume: Number(vendor.salesVolume) || 0,
|
||||
profitMargin: Number(vendor.profitMargin) || 0,
|
||||
stockTurnover: Number(vendor.stockTurnover) || 0,
|
||||
productCount: Number(vendor.productCount) || 0,
|
||||
growth: Number(vendor.growth) || 0
|
||||
})),
|
||||
comparison: rawData.comparison?.map((vendor: any) => ({
|
||||
vendor: vendor.vendor,
|
||||
salesPerProduct: Number(vendor.salesPerProduct) || 0,
|
||||
averageMargin: Number(vendor.averageMargin) || 0,
|
||||
size: Number(vendor.size) || 0
|
||||
})) || [],
|
||||
trends: rawData.trends?.map((vendor: any) => ({
|
||||
vendor: vendor.vendor,
|
||||
month: vendor.month,
|
||||
sales: Number(vendor.sales) || 0
|
||||
})) || []
|
||||
};
|
||||
|
||||
setVendorData(data);
|
||||
} catch (err) {
|
||||
console.error('Error fetching vendor data:', err);
|
||||
setError(err instanceof Error ? err.message : 'Unknown error');
|
||||
} finally {
|
||||
setIsLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
fetchData();
|
||||
}, []);
|
||||
|
||||
if (isLoading) {
|
||||
return <div>Loading vendor performance...</div>;
|
||||
}
|
||||
|
||||
if (error || !vendorData) {
|
||||
return <div className="text-red-500">Error loading vendor data: {error}</div>;
|
||||
}
|
||||
|
||||
// Ensure we have at least the performance data
|
||||
const sortedPerformance = vendorData.performance
|
||||
.sort((a, b) => b.salesVolume - a.salesVolume)
|
||||
.slice(0, 10);
|
||||
|
||||
// Use simplified version if comparison data is missing
|
||||
const hasComparisonData = vendorData.comparison && vendorData.comparison.length > 0;
|
||||
|
||||
return (
|
||||
<div className="grid gap-4">
|
||||
<div className="grid gap-4 md:grid-cols-2">
|
||||
@@ -68,7 +115,7 @@ export function VendorPerformance() {
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<BarChart data={data.performance}>
|
||||
<BarChart data={sortedPerformance}>
|
||||
<XAxis dataKey="vendor" />
|
||||
<YAxis tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`} />
|
||||
<Tooltip
|
||||
@@ -84,44 +131,68 @@ export function VendorPerformance() {
|
||||
</CardContent>
|
||||
</Card>
|
||||
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Vendor Performance Matrix</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<ScatterChart>
|
||||
<XAxis
|
||||
dataKey="salesPerProduct"
|
||||
name="Sales per Product"
|
||||
tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`}
|
||||
/>
|
||||
<YAxis
|
||||
dataKey="averageMargin"
|
||||
name="Average Margin"
|
||||
tickFormatter={(value) => `${value.toFixed(0)}%`}
|
||||
/>
|
||||
<ZAxis
|
||||
dataKey="size"
|
||||
range={[50, 400]}
|
||||
name="Product Count"
|
||||
/>
|
||||
<Tooltip
|
||||
formatter={(value: number, name: string) => {
|
||||
if (name === 'Sales per Product') return [`$${value.toLocaleString()}`, name];
|
||||
if (name === 'Average Margin') return [`${value.toFixed(1)}%`, name];
|
||||
return [value, name];
|
||||
}}
|
||||
/>
|
||||
<Scatter
|
||||
data={data.comparison}
|
||||
fill="#60a5fa"
|
||||
name="Vendors"
|
||||
/>
|
||||
</ScatterChart>
|
||||
</ResponsiveContainer>
|
||||
</CardContent>
|
||||
</Card>
|
||||
{hasComparisonData ? (
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Vendor Performance Matrix</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<ScatterChart>
|
||||
<XAxis
|
||||
dataKey="salesPerProduct"
|
||||
name="Sales per Product"
|
||||
tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`}
|
||||
/>
|
||||
<YAxis
|
||||
dataKey="averageMargin"
|
||||
name="Average Margin"
|
||||
tickFormatter={(value) => `${value.toFixed(0)}%`}
|
||||
/>
|
||||
<ZAxis
|
||||
dataKey="size"
|
||||
range={[50, 400]}
|
||||
name="Product Count"
|
||||
/>
|
||||
<Tooltip
|
||||
formatter={(value: number, name: string) => {
|
||||
if (name === 'Sales per Product') return [`$${value.toLocaleString()}`, name];
|
||||
if (name === 'Average Margin') return [`${value.toFixed(1)}%`, name];
|
||||
return [value, name];
|
||||
}}
|
||||
/>
|
||||
<Scatter
|
||||
data={vendorData.comparison}
|
||||
fill="#60a5fa"
|
||||
name="Vendors"
|
||||
/>
|
||||
</ScatterChart>
|
||||
</ResponsiveContainer>
|
||||
</CardContent>
|
||||
</Card>
|
||||
) : (
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Vendor Profit Margins</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<BarChart data={sortedPerformance}>
|
||||
<XAxis dataKey="vendor" />
|
||||
<YAxis tickFormatter={(value) => `${value}%`} />
|
||||
<Tooltip
|
||||
formatter={(value: number) => [`${value.toFixed(1)}%`, 'Profit Margin']}
|
||||
/>
|
||||
<Bar
|
||||
dataKey="profitMargin"
|
||||
fill="#4ade80"
|
||||
name="Profit Margin"
|
||||
/>
|
||||
</BarChart>
|
||||
</ResponsiveContainer>
|
||||
</CardContent>
|
||||
</Card>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<Card>
|
||||
@@ -130,7 +201,7 @@ export function VendorPerformance() {
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="space-y-4">
|
||||
{data.performance.map((vendor) => (
|
||||
{sortedPerformance.map((vendor) => (
|
||||
<div key={`${vendor.vendor}-${vendor.salesVolume}`} className="flex items-center">
|
||||
<div className="flex-1">
|
||||
<p className="text-sm font-medium">{vendor.vendor}</p>
|
||||
|
||||
@@ -1,88 +0,0 @@
|
||||
import { useQuery } from "@tanstack/react-query"
|
||||
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"
|
||||
import { AlertCircle, AlertTriangle, CheckCircle2, PackageSearch } from "lucide-react"
|
||||
import config from "@/config"
|
||||
import { useNavigate } from "react-router-dom"
|
||||
import { cn } from "@/lib/utils"
|
||||
|
||||
interface InventoryHealth {
|
||||
critical: number
|
||||
reorder: number
|
||||
healthy: number
|
||||
overstock: number
|
||||
total: number
|
||||
}
|
||||
|
||||
export function InventoryHealthSummary() {
|
||||
const navigate = useNavigate();
|
||||
const { data: summary } = useQuery<InventoryHealth>({
|
||||
queryKey: ["inventory-health"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/inventory/health/summary`)
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch inventory health")
|
||||
}
|
||||
return response.json()
|
||||
},
|
||||
})
|
||||
|
||||
const stats = [
|
||||
{
|
||||
title: "Critical Stock",
|
||||
value: summary?.critical || 0,
|
||||
description: "Products needing immediate attention",
|
||||
icon: AlertCircle,
|
||||
className: "bg-destructive/10",
|
||||
iconClassName: "text-destructive",
|
||||
view: "critical"
|
||||
},
|
||||
{
|
||||
title: "Reorder Soon",
|
||||
value: summary?.reorder || 0,
|
||||
description: "Products approaching reorder point",
|
||||
icon: AlertTriangle,
|
||||
className: "bg-warning/10",
|
||||
iconClassName: "text-warning",
|
||||
view: "reorder"
|
||||
},
|
||||
{
|
||||
title: "Healthy Stock",
|
||||
value: summary?.healthy || 0,
|
||||
description: "Products at optimal levels",
|
||||
icon: CheckCircle2,
|
||||
className: "bg-success/10",
|
||||
iconClassName: "text-success",
|
||||
view: "healthy"
|
||||
},
|
||||
{
|
||||
title: "Overstock",
|
||||
value: summary?.overstock || 0,
|
||||
description: "Products exceeding optimal levels",
|
||||
icon: PackageSearch,
|
||||
className: "bg-muted",
|
||||
iconClassName: "text-muted-foreground",
|
||||
view: "overstocked"
|
||||
},
|
||||
]
|
||||
|
||||
return (
|
||||
<>
|
||||
{stats.map((stat) => (
|
||||
<Card
|
||||
key={stat.title}
|
||||
className={cn(stat.className, "cursor-pointer hover:opacity-90 transition-opacity")}
|
||||
onClick={() => navigate(`/products?view=${stat.view}`)}
|
||||
>
|
||||
<CardHeader className="flex flex-row items-center justify-between pb-2">
|
||||
<CardTitle className="text-sm font-medium">{stat.title}</CardTitle>
|
||||
<stat.icon className={`h-4 w-4 ${stat.iconClassName}`} />
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="text-2xl font-bold">{stat.value}</div>
|
||||
<p className="text-xs text-muted-foreground">{stat.description}</p>
|
||||
</CardContent>
|
||||
</Card>
|
||||
))}
|
||||
</>
|
||||
)
|
||||
}
|
||||
@@ -1,106 +0,0 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
|
||||
import { Bar, BarChart, ResponsiveContainer, XAxis, YAxis, Tooltip } from 'recharts';
|
||||
import config from '../../config';
|
||||
|
||||
interface InventoryMetrics {
|
||||
stockLevels: {
|
||||
category: string;
|
||||
inStock: number;
|
||||
lowStock: number;
|
||||
outOfStock: number;
|
||||
}[];
|
||||
topVendors: {
|
||||
vendor: string;
|
||||
productCount: number;
|
||||
averageStockLevel: string;
|
||||
}[];
|
||||
stockTurnover: {
|
||||
category: string;
|
||||
rate: string;
|
||||
}[];
|
||||
}
|
||||
|
||||
export function InventoryStats() {
|
||||
const { data, isLoading, error } = useQuery<InventoryMetrics>({
|
||||
queryKey: ['inventory-metrics'],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/inventory-metrics`);
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch inventory metrics');
|
||||
}
|
||||
return response.json();
|
||||
},
|
||||
});
|
||||
|
||||
if (isLoading) {
|
||||
return <div>Loading inventory metrics...</div>;
|
||||
}
|
||||
|
||||
if (error) {
|
||||
return <div className="text-red-500">Error loading inventory metrics</div>;
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="grid gap-4">
|
||||
<div className="grid gap-4 md:grid-cols-2">
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Stock Levels by Category</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<BarChart data={data?.stockLevels}>
|
||||
<XAxis dataKey="category" />
|
||||
<YAxis />
|
||||
<Tooltip />
|
||||
<Bar dataKey="inStock" name="In Stock" fill="#4ade80" />
|
||||
<Bar dataKey="lowStock" name="Low Stock" fill="#fbbf24" />
|
||||
<Bar dataKey="outOfStock" name="Out of Stock" fill="#f87171" />
|
||||
</BarChart>
|
||||
</ResponsiveContainer>
|
||||
</CardContent>
|
||||
</Card>
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Stock Turnover Rate</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<BarChart data={data?.stockTurnover}>
|
||||
<XAxis dataKey="category" />
|
||||
<YAxis />
|
||||
<Tooltip formatter={(value: string) => [Number(value).toFixed(2), "Rate"]} />
|
||||
<Bar dataKey="rate" name="Turnover Rate" fill="#60a5fa" />
|
||||
</BarChart>
|
||||
</ResponsiveContainer>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</div>
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Top Vendors</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="space-y-4">
|
||||
{data?.topVendors.map((vendor) => (
|
||||
<div key={vendor.vendor} className="flex items-center">
|
||||
<div className="flex-1">
|
||||
<p className="text-sm font-medium">{vendor.vendor}</p>
|
||||
<p className="text-sm text-muted-foreground">
|
||||
{vendor.productCount} products
|
||||
</p>
|
||||
</div>
|
||||
<div className="ml-4 text-right">
|
||||
<p className="text-sm font-medium">
|
||||
Avg. Stock: {Number(vendor.averageStockLevel).toFixed(0)}
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@@ -1,232 +0,0 @@
|
||||
import { useQuery } from "@tanstack/react-query"
|
||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
||||
import {
|
||||
Area,
|
||||
AreaChart,
|
||||
ResponsiveContainer,
|
||||
Tooltip,
|
||||
XAxis,
|
||||
YAxis,
|
||||
} from "recharts"
|
||||
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs"
|
||||
import config from "@/config"
|
||||
|
||||
interface MetricDataPoint {
|
||||
date: string
|
||||
value: number
|
||||
}
|
||||
|
||||
interface KeyMetrics {
|
||||
revenue: MetricDataPoint[]
|
||||
inventory_value: MetricDataPoint[]
|
||||
gmroi: MetricDataPoint[]
|
||||
}
|
||||
|
||||
export function KeyMetricsCharts() {
|
||||
const { data: metrics } = useQuery<KeyMetrics>({
|
||||
queryKey: ["key-metrics"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/metrics/trends`)
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch metrics trends")
|
||||
}
|
||||
return response.json()
|
||||
},
|
||||
})
|
||||
|
||||
const formatCurrency = (value: number) =>
|
||||
new Intl.NumberFormat("en-US", {
|
||||
style: "currency",
|
||||
currency: "USD",
|
||||
minimumFractionDigits: 0,
|
||||
maximumFractionDigits: 0,
|
||||
}).format(value)
|
||||
|
||||
return (
|
||||
<>
|
||||
<CardHeader>
|
||||
<CardTitle className="text-lg font-medium">Key Metrics</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<Tabs defaultValue="revenue" className="space-y-4">
|
||||
<TabsList>
|
||||
<TabsTrigger value="revenue">Revenue</TabsTrigger>
|
||||
<TabsTrigger value="inventory">Inventory Value</TabsTrigger>
|
||||
<TabsTrigger value="gmroi">GMROI</TabsTrigger>
|
||||
</TabsList>
|
||||
|
||||
<TabsContent value="revenue" className="space-y-4">
|
||||
<div className="h-[300px]">
|
||||
<ResponsiveContainer width="100%" height="100%">
|
||||
<AreaChart data={metrics?.revenue}>
|
||||
<XAxis
|
||||
dataKey="date"
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => value}
|
||||
/>
|
||||
<YAxis
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={formatCurrency}
|
||||
/>
|
||||
<Tooltip
|
||||
content={({ active, payload }) => {
|
||||
if (active && payload && payload.length) {
|
||||
return (
|
||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
||||
<div className="grid grid-cols-2 gap-2">
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Date
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{payload[0].payload.date}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Revenue
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{formatCurrency(payload[0].value as number)}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
return null
|
||||
}}
|
||||
/>
|
||||
<Area
|
||||
type="monotone"
|
||||
dataKey="value"
|
||||
stroke="#0ea5e9"
|
||||
fill="#0ea5e9"
|
||||
fillOpacity={0.2}
|
||||
/>
|
||||
</AreaChart>
|
||||
</ResponsiveContainer>
|
||||
</div>
|
||||
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="inventory" className="space-y-4">
|
||||
<div className="h-[300px]">
|
||||
<ResponsiveContainer width="100%" height="100%">
|
||||
<AreaChart data={metrics?.inventory_value}>
|
||||
<XAxis
|
||||
dataKey="date"
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => value}
|
||||
/>
|
||||
<YAxis
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={formatCurrency}
|
||||
/>
|
||||
<Tooltip
|
||||
content={({ active, payload }) => {
|
||||
if (active && payload && payload.length) {
|
||||
return (
|
||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
||||
<div className="grid grid-cols-2 gap-2">
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Date
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{payload[0].payload.date}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Value
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{formatCurrency(payload[0].value as number)}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
return null
|
||||
}}
|
||||
/>
|
||||
<Area
|
||||
type="monotone"
|
||||
dataKey="value"
|
||||
stroke="#84cc16"
|
||||
fill="#84cc16"
|
||||
fillOpacity={0.2}
|
||||
/>
|
||||
</AreaChart>
|
||||
</ResponsiveContainer>
|
||||
</div>
|
||||
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="gmroi" className="space-y-4">
|
||||
<div className="h-[300px]">
|
||||
<ResponsiveContainer width="100%" height="100%">
|
||||
<AreaChart data={metrics?.gmroi}>
|
||||
<XAxis
|
||||
dataKey="date"
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => value}
|
||||
/>
|
||||
<YAxis
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => `${value.toFixed(1)}%`}
|
||||
/>
|
||||
<Tooltip
|
||||
content={({ active, payload }) => {
|
||||
if (active && payload && payload.length) {
|
||||
return (
|
||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
||||
<div className="grid grid-cols-2 gap-2">
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Date
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{payload[0].payload.date}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
GMROI
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{`${typeof payload[0].value === 'number' ? payload[0].value.toFixed(1) : payload[0].value}%`}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
return null
|
||||
}}
|
||||
/>
|
||||
<Area
|
||||
type="monotone"
|
||||
dataKey="value"
|
||||
stroke="#f59e0b"
|
||||
fill="#f59e0b"
|
||||
fillOpacity={0.2}
|
||||
/>
|
||||
</AreaChart>
|
||||
</ResponsiveContainer>
|
||||
</div>
|
||||
|
||||
</TabsContent>
|
||||
</Tabs>
|
||||
</CardContent>
|
||||
</>
|
||||
)
|
||||
}
|
||||
@@ -1,108 +0,0 @@
|
||||
import { useQuery } from "@tanstack/react-query"
|
||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
||||
import {
|
||||
Table,
|
||||
TableBody,
|
||||
TableCell,
|
||||
TableHead,
|
||||
TableHeader,
|
||||
TableRow,
|
||||
} from "@/components/ui/table"
|
||||
import { Badge } from "@/components/ui/badge"
|
||||
import config from "@/config"
|
||||
import { format } from "date-fns"
|
||||
|
||||
interface Product {
|
||||
pid: number;
|
||||
sku: string;
|
||||
title: string;
|
||||
stock_quantity: number;
|
||||
daily_sales_avg: string;
|
||||
days_of_inventory: string;
|
||||
reorder_qty: number;
|
||||
last_purchase_date: string | null;
|
||||
lead_time_status: string;
|
||||
}
|
||||
|
||||
// Helper functions
|
||||
const formatDate = (dateString: string) => {
|
||||
return format(new Date(dateString), 'MMM dd, yyyy')
|
||||
}
|
||||
|
||||
const getLeadTimeVariant = (status: string) => {
|
||||
switch (status.toLowerCase()) {
|
||||
case 'critical':
|
||||
return 'destructive'
|
||||
case 'warning':
|
||||
return 'secondary'
|
||||
case 'good':
|
||||
return 'secondary'
|
||||
default:
|
||||
return 'secondary'
|
||||
}
|
||||
}
|
||||
|
||||
export function LowStockAlerts() {
|
||||
const { data: products } = useQuery<Product[]>({
|
||||
queryKey: ["low-stock"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/low-stock/products`)
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch low stock products")
|
||||
}
|
||||
return response.json()
|
||||
},
|
||||
})
|
||||
|
||||
return (
|
||||
<>
|
||||
<CardHeader>
|
||||
<CardTitle className="text-lg font-medium">Low Stock Alerts</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="max-h-[350px] overflow-auto">
|
||||
<Table>
|
||||
<TableHeader>
|
||||
<TableRow>
|
||||
<TableHead>Product</TableHead>
|
||||
<TableHead className="text-right">Stock</TableHead>
|
||||
<TableHead className="text-right">Daily Sales</TableHead>
|
||||
<TableHead className="text-right">Days Left</TableHead>
|
||||
<TableHead className="text-right">Reorder Qty</TableHead>
|
||||
<TableHead>Last Purchase</TableHead>
|
||||
<TableHead>Lead Time</TableHead>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{products?.map((product) => (
|
||||
<TableRow key={product.pid}>
|
||||
<TableCell>
|
||||
<a
|
||||
href={`https://backend.acherryontop.com/product/${product.pid}`}
|
||||
target="_blank"
|
||||
rel="noopener noreferrer"
|
||||
className="hover:underline"
|
||||
>
|
||||
{product.title}
|
||||
</a>
|
||||
<div className="text-sm text-muted-foreground">{product.sku}</div>
|
||||
</TableCell>
|
||||
<TableCell className="text-right">{product.stock_quantity}</TableCell>
|
||||
<TableCell className="text-right">{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
|
||||
<TableCell className="text-right">{Number(product.days_of_inventory).toFixed(1)}</TableCell>
|
||||
<TableCell className="text-right">{product.reorder_qty}</TableCell>
|
||||
<TableCell>{product.last_purchase_date ? formatDate(product.last_purchase_date) : '-'}</TableCell>
|
||||
<TableCell>
|
||||
<Badge variant={getLeadTimeVariant(product.lead_time_status)}>
|
||||
{product.lead_time_status}
|
||||
</Badge>
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
))}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
</CardContent>
|
||||
</>
|
||||
)
|
||||
}
|
||||
@@ -1,63 +0,0 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { Avatar, AvatarFallback } from '@/components/ui/avatar';
|
||||
import config from '../../config';
|
||||
|
||||
interface RecentOrder {
|
||||
order_id: string;
|
||||
customer_name: string;
|
||||
total_amount: number;
|
||||
order_date: string;
|
||||
}
|
||||
|
||||
export function RecentSales() {
|
||||
const { data: recentOrders, isLoading, error } = useQuery<RecentOrder[]>({
|
||||
queryKey: ['recent-orders'],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/recent-orders`);
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch recent orders');
|
||||
}
|
||||
const data = await response.json();
|
||||
return data.map((order: RecentOrder) => ({
|
||||
...order,
|
||||
total_amount: parseFloat(order.total_amount.toString())
|
||||
}));
|
||||
},
|
||||
});
|
||||
|
||||
if (isLoading) {
|
||||
return <div>Loading recent sales...</div>;
|
||||
}
|
||||
|
||||
if (error) {
|
||||
return <div className="text-red-500">Error loading recent sales</div>;
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="space-y-8">
|
||||
{recentOrders?.map((order) => (
|
||||
<div key={order.order_id} className="flex items-center">
|
||||
<Avatar className="h-9 w-9">
|
||||
<AvatarFallback>
|
||||
{order.customer_name?.split(' ').map(n => n[0]).join('') || '??'}
|
||||
</AvatarFallback>
|
||||
</Avatar>
|
||||
<div className="ml-4 space-y-1">
|
||||
<p className="text-sm font-medium leading-none">Order #{order.order_id}</p>
|
||||
<p className="text-sm text-muted-foreground">
|
||||
{new Date(order.order_date).toLocaleDateString()}
|
||||
</p>
|
||||
</div>
|
||||
<div className="ml-auto font-medium">
|
||||
${order.total_amount.toFixed(2)}
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
{!recentOrders?.length && (
|
||||
<div className="text-center text-muted-foreground">
|
||||
No recent orders found
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@@ -1,58 +0,0 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { Cell, Pie, PieChart, ResponsiveContainer, Tooltip, Legend } from 'recharts';
|
||||
import config from '../../config';
|
||||
|
||||
interface CategorySales {
|
||||
category: string;
|
||||
total: number;
|
||||
percentage: number;
|
||||
}
|
||||
|
||||
const COLORS = ['#0088FE', '#00C49F', '#FFBB28', '#FF8042', '#8884d8', '#82ca9d'];
|
||||
|
||||
export function SalesByCategory() {
|
||||
const { data, isLoading, error } = useQuery<CategorySales[]>({
|
||||
queryKey: ['sales-by-category'],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/sales-by-category`);
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch category sales');
|
||||
}
|
||||
return response.json();
|
||||
},
|
||||
});
|
||||
|
||||
if (isLoading) {
|
||||
return <div>Loading chart...</div>;
|
||||
}
|
||||
|
||||
if (error) {
|
||||
return <div className="text-red-500">Error loading category sales</div>;
|
||||
}
|
||||
|
||||
return (
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<PieChart>
|
||||
<Pie
|
||||
data={data}
|
||||
cx="50%"
|
||||
cy="50%"
|
||||
labelLine={false}
|
||||
outerRadius={80}
|
||||
fill="#8884d8"
|
||||
dataKey="total"
|
||||
nameKey="category"
|
||||
label={({ name, percent }) => `${name} ${(percent * 100).toFixed(0)}%`}
|
||||
>
|
||||
{data?.map((_, index) => (
|
||||
<Cell key={`cell-${index}`} fill={COLORS[index % COLORS.length]} />
|
||||
))}
|
||||
</Pie>
|
||||
<Tooltip
|
||||
formatter={(value: number) => [`$${value.toLocaleString()}`, 'Sales']}
|
||||
/>
|
||||
<Legend />
|
||||
</PieChart>
|
||||
</ResponsiveContainer>
|
||||
);
|
||||
}
|
||||
@@ -1,95 +0,0 @@
|
||||
import { useQuery } from "@tanstack/react-query"
|
||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
||||
import {
|
||||
Table,
|
||||
TableBody,
|
||||
TableCell,
|
||||
TableHead,
|
||||
TableHeader,
|
||||
TableRow,
|
||||
} from "@/components/ui/table"
|
||||
import { TrendingUp, TrendingDown } from "lucide-react"
|
||||
import config from "@/config"
|
||||
|
||||
interface Product {
|
||||
pid: number;
|
||||
sku: string;
|
||||
title: string;
|
||||
daily_sales_avg: string;
|
||||
weekly_sales_avg: string;
|
||||
growth_rate: string;
|
||||
total_revenue: string;
|
||||
}
|
||||
|
||||
export function TrendingProducts() {
|
||||
const { data: products } = useQuery<Product[]>({
|
||||
queryKey: ["trending-products"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/products/trending`)
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch trending products")
|
||||
}
|
||||
return response.json()
|
||||
},
|
||||
})
|
||||
|
||||
const formatPercent = (value: number) =>
|
||||
new Intl.NumberFormat("en-US", {
|
||||
style: "percent",
|
||||
minimumFractionDigits: 1,
|
||||
maximumFractionDigits: 1,
|
||||
signDisplay: "exceptZero",
|
||||
}).format(value / 100)
|
||||
|
||||
return (
|
||||
<>
|
||||
<CardHeader>
|
||||
<CardTitle className="text-lg font-medium">Trending Products</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="max-h-[400px] overflow-auto">
|
||||
<Table>
|
||||
<TableHeader>
|
||||
<TableRow>
|
||||
<TableHead>Product</TableHead>
|
||||
<TableHead>Daily Sales</TableHead>
|
||||
<TableHead className="text-right">Growth</TableHead>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{products?.map((product) => (
|
||||
<TableRow key={product.pid}>
|
||||
<TableCell className="font-medium">
|
||||
<div className="flex flex-col">
|
||||
<span className="font-medium">{product.title}</span>
|
||||
<span className="text-sm text-muted-foreground">
|
||||
{product.sku}
|
||||
</span>
|
||||
</div>
|
||||
</TableCell>
|
||||
<TableCell>{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
|
||||
<TableCell className="text-right">
|
||||
<div className="flex items-center justify-end gap-1">
|
||||
{Number(product.growth_rate) > 0 ? (
|
||||
<TrendingUp className="h-4 w-4 text-success" />
|
||||
) : (
|
||||
<TrendingDown className="h-4 w-4 text-destructive" />
|
||||
)}
|
||||
<span
|
||||
className={
|
||||
Number(product.growth_rate) > 0 ? "text-success" : "text-destructive"
|
||||
}
|
||||
>
|
||||
{formatPercent(Number(product.growth_rate))}
|
||||
</span>
|
||||
</div>
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
))}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
</CardContent>
|
||||
</>
|
||||
)
|
||||
}
|
||||
@@ -3,7 +3,6 @@ import {
|
||||
Package,
|
||||
BarChart2,
|
||||
Settings,
|
||||
Box,
|
||||
ClipboardList,
|
||||
LogOut,
|
||||
Users,
|
||||
@@ -22,6 +21,7 @@ import {
|
||||
SidebarMenuButton,
|
||||
SidebarMenuItem,
|
||||
SidebarSeparator,
|
||||
useSidebar
|
||||
} from "@/components/ui/sidebar";
|
||||
import { useLocation, useNavigate, Link } from "react-router-dom";
|
||||
import { Protected } from "@/components/auth/Protected";
|
||||
@@ -80,6 +80,7 @@ const items = [
|
||||
export function AppSidebar() {
|
||||
const location = useLocation();
|
||||
const navigate = useNavigate();
|
||||
useSidebar();
|
||||
|
||||
const handleLogout = () => {
|
||||
localStorage.removeItem('token');
|
||||
@@ -90,11 +91,19 @@ export function AppSidebar() {
|
||||
return (
|
||||
<Sidebar collapsible="icon" variant="sidebar">
|
||||
<SidebarHeader>
|
||||
<div className="p-4 flex items-center gap-2 group-data-[collapsible=icon]:justify-center">
|
||||
<Box className="h-6 w-6 shrink-0" />
|
||||
<h2 className="text-lg font-semibold group-data-[collapsible=icon]:hidden">
|
||||
Inventory Manager
|
||||
</h2>
|
||||
<div className="py-1 flex justify-center items-center">
|
||||
<div className="flex items-center">
|
||||
<div className="flex-shrink-0 w-8 h-8 relative flex items-center justify-center">
|
||||
<img
|
||||
src="/cherrybottom.png"
|
||||
alt="Cherry Bottom"
|
||||
className="w-6 h-6 object-contain -rotate-12 transform hover:rotate-0 transition-transform ease-in-out duration-300"
|
||||
/>
|
||||
</div>
|
||||
<div className="ml-2 transition-all duration-200 whitespace-nowrap group-[.group[data-state=collapsed]]:hidden">
|
||||
<span className="font-bold text-lg">A Cherry On Bottom</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</SidebarHeader>
|
||||
<SidebarSeparator />
|
||||
|
||||
@@ -253,6 +253,7 @@ export const ImageUploadStep = ({
|
||||
}
|
||||
getProductContainerClasses={() => getProductContainerClasses(index)}
|
||||
findContainer={findContainer}
|
||||
handleAddImageFromUrl={handleAddImageFromUrl}
|
||||
/>
|
||||
))}
|
||||
</div>
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
import { Button } from "@/components/ui/button";
|
||||
import { Input } from "@/components/ui/input";
|
||||
import { Card, CardContent } from "@/components/ui/card";
|
||||
import { Loader2, Link as LinkIcon } from "lucide-react";
|
||||
import { Loader2, Link as LinkIcon, Image as ImageIcon } from "lucide-react";
|
||||
import { cn } from "@/lib/utils";
|
||||
import { ImageDropzone } from "./ImageDropzone";
|
||||
import { SortableImage } from "./SortableImage";
|
||||
@@ -9,6 +9,25 @@ import { CopyButton } from "./CopyButton";
|
||||
import { ProductImageSortable, Product } from "../../types";
|
||||
import { DroppableContainer } from "../DroppableContainer";
|
||||
import { SortableContext, horizontalListSortingStrategy } from '@dnd-kit/sortable';
|
||||
import { useQuery } from "@tanstack/react-query";
|
||||
import config from "@/config";
|
||||
import {
|
||||
Dialog,
|
||||
DialogContent,
|
||||
DialogDescription,
|
||||
DialogHeader,
|
||||
DialogTitle,
|
||||
} from "@/components/ui/dialog";
|
||||
import { ScrollArea } from "@/components/ui/scroll-area";
|
||||
import { useState, useMemo } from "react";
|
||||
|
||||
interface ReusableImage {
|
||||
id: number;
|
||||
name: string;
|
||||
image_url: string;
|
||||
is_global: boolean;
|
||||
company: string | null;
|
||||
}
|
||||
|
||||
interface ProductCardProps {
|
||||
product: Product;
|
||||
@@ -26,6 +45,7 @@ interface ProductCardProps {
|
||||
onRemoveImage: (id: string) => void;
|
||||
getProductContainerClasses: () => string;
|
||||
findContainer: (id: string) => string | null;
|
||||
handleAddImageFromUrl: (productIndex: number, url: string) => void;
|
||||
}
|
||||
|
||||
export const ProductCard = ({
|
||||
@@ -43,8 +63,11 @@ export const ProductCard = ({
|
||||
onDragOver,
|
||||
onRemoveImage,
|
||||
getProductContainerClasses,
|
||||
findContainer
|
||||
findContainer,
|
||||
handleAddImageFromUrl
|
||||
}: ProductCardProps) => {
|
||||
const [isReusableDialogOpen, setIsReusableDialogOpen] = useState(false);
|
||||
|
||||
// Function to get images for this product
|
||||
const getProductImages = () => {
|
||||
return productImages.filter(img => img.productIndex === index);
|
||||
@@ -56,6 +79,32 @@ export const ProductCard = ({
|
||||
return result !== null ? parseInt(result) : null;
|
||||
};
|
||||
|
||||
// Fetch reusable images
|
||||
const { data: reusableImages, isLoading: isLoadingReusable } = useQuery<ReusableImage[]>({
|
||||
queryKey: ["reusable-images"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/reusable-images`);
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch reusable images");
|
||||
}
|
||||
return response.json();
|
||||
},
|
||||
});
|
||||
|
||||
// Filter reusable images based on product's company
|
||||
const availableReusableImages = useMemo(() => {
|
||||
if (!reusableImages) return [];
|
||||
return reusableImages.filter(img =>
|
||||
img.is_global || img.company === product.company
|
||||
);
|
||||
}, [reusableImages, product.company]);
|
||||
|
||||
// Handle adding a reusable image
|
||||
const handleAddReusableImage = (imageUrl: string) => {
|
||||
handleAddImageFromUrl(index, imageUrl);
|
||||
setIsReusableDialogOpen(false);
|
||||
};
|
||||
|
||||
return (
|
||||
<Card
|
||||
className={cn(
|
||||
@@ -83,6 +132,18 @@ export const ProductCard = ({
|
||||
className="flex items-center gap-2"
|
||||
onSubmit={onUrlSubmit}
|
||||
>
|
||||
{getProductImages().length === 0 && (
|
||||
<Button
|
||||
type="button"
|
||||
variant="outline"
|
||||
size="sm"
|
||||
className="h-8 whitespace-nowrap flex gap-1 items-center text-xs"
|
||||
onClick={() => setIsReusableDialogOpen(true)}
|
||||
>
|
||||
<ImageIcon className="h-3.5 w-3.5" />
|
||||
Select from Library
|
||||
</Button>
|
||||
)}
|
||||
<Input
|
||||
placeholder="Add image from URL"
|
||||
value={urlInput}
|
||||
@@ -105,7 +166,7 @@ export const ProductCard = ({
|
||||
</div>
|
||||
|
||||
<div className="flex flex-col sm:flex-row gap-2">
|
||||
<div className="flex flex-row gap-2 items-start">
|
||||
<div className="flex flex-row gap-2 items-center gap-4">
|
||||
<ImageDropzone
|
||||
productIndex={index}
|
||||
onDrop={onImageUpload}
|
||||
@@ -158,6 +219,50 @@ export const ProductCard = ({
|
||||
/>
|
||||
</div>
|
||||
</CardContent>
|
||||
|
||||
{/* Reusable Images Dialog */}
|
||||
<Dialog open={isReusableDialogOpen} onOpenChange={setIsReusableDialogOpen}>
|
||||
<DialogContent className="max-w-3xl">
|
||||
<DialogHeader>
|
||||
<DialogTitle>Select from Image Library</DialogTitle>
|
||||
<DialogDescription>
|
||||
Choose a global or company-specific image to add to this product.
|
||||
</DialogDescription>
|
||||
</DialogHeader>
|
||||
<ScrollArea className="h-[400px] pr-4">
|
||||
{isLoadingReusable ? (
|
||||
<div className="flex items-center justify-center h-full">
|
||||
<Loader2 className="h-8 w-8 animate-spin" />
|
||||
</div>
|
||||
) : availableReusableImages.length === 0 ? (
|
||||
<div className="flex flex-col items-center justify-center h-full text-muted-foreground">
|
||||
<ImageIcon className="h-8 w-8 mb-2" />
|
||||
<p>No reusable images available</p>
|
||||
</div>
|
||||
) : (
|
||||
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 gap-4">
|
||||
{availableReusableImages.map((image) => (
|
||||
<div
|
||||
key={image.id}
|
||||
className="group relative aspect-square border rounded-lg overflow-hidden cursor-pointer hover:ring-2 hover:ring-primary"
|
||||
onClick={() => handleAddReusableImage(image.image_url)}
|
||||
>
|
||||
<img
|
||||
src={image.image_url}
|
||||
alt={image.name}
|
||||
className="w-full h-full object-cover"
|
||||
/>
|
||||
<div className="absolute inset-0 bg-black/0 group-hover:bg-black/20 transition-colors" />
|
||||
<div className="absolute bottom-0 left-0 right-0 p-2 bg-gradient-to-t from-black/60 to-transparent">
|
||||
<p className="text-xs text-white truncate">{image.name}</p>
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
</ScrollArea>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
</Card>
|
||||
);
|
||||
};
|
||||
@@ -31,5 +31,6 @@ export interface Product {
|
||||
supplier_no?: string;
|
||||
sku?: string;
|
||||
model?: string;
|
||||
company?: string;
|
||||
product_images?: string | string[];
|
||||
}
|
||||
@@ -95,12 +95,8 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
||||
isChangeReverted,
|
||||
getFieldDisplayValueWithHighlight,
|
||||
fields,
|
||||
debugData,
|
||||
}) => {
|
||||
const [costPerMillionTokens, setCostPerMillionTokens] = useState(2.5); // Default cost
|
||||
const hasCompanyPrompts =
|
||||
currentPrompt.debugData?.promptSources?.companyPrompts &&
|
||||
currentPrompt.debugData.promptSources.companyPrompts.length > 0;
|
||||
|
||||
// Create our own state to track changes
|
||||
const [localReversionState, setLocalReversionState] = useState<
|
||||
@@ -157,17 +153,6 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
||||
return !!localReversionState[key];
|
||||
};
|
||||
|
||||
// Use "full" as the default tab
|
||||
const defaultTab = "full";
|
||||
const [activeTab, setActiveTab] = useState(defaultTab);
|
||||
|
||||
// Update activeTab when the dialog is opened with new data
|
||||
React.useEffect(() => {
|
||||
if (currentPrompt.isOpen) {
|
||||
setActiveTab("full");
|
||||
}
|
||||
}, [currentPrompt.isOpen]);
|
||||
|
||||
// Format time helper
|
||||
const formatTime = (seconds: number): string => {
|
||||
if (seconds < 60) {
|
||||
|
||||
@@ -51,7 +51,9 @@ const FILTER_OPTIONS: FilterOption[] = [
|
||||
// Basic Info Group
|
||||
{ id: "search", label: "Search", type: "text", group: "Basic Info" },
|
||||
{ id: "sku", label: "SKU", type: "text", group: "Basic Info" },
|
||||
{ id: "barcode", label: "UPC/Barcode", type: "text", group: "Basic Info" },
|
||||
{ id: "vendor", label: "Vendor", type: "select", group: "Basic Info" },
|
||||
{ id: "vendor_reference", label: "Supplier #", type: "text", group: "Basic Info" },
|
||||
{ id: "brand", label: "Brand", type: "select", group: "Basic Info" },
|
||||
{ id: "category", label: "Category", type: "select", group: "Basic Info" },
|
||||
|
||||
@@ -84,6 +86,27 @@ const FILTER_OPTIONS: FilterOption[] = [
|
||||
group: "Inventory",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "weeksOfStock",
|
||||
label: "Weeks of Stock",
|
||||
type: "number",
|
||||
group: "Inventory",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "reorderPoint",
|
||||
label: "Reorder Point",
|
||||
type: "number",
|
||||
group: "Inventory",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "safetyStock",
|
||||
label: "Safety Stock",
|
||||
type: "number",
|
||||
group: "Inventory",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "replenishable",
|
||||
label: "Replenishable",
|
||||
@@ -94,6 +117,17 @@ const FILTER_OPTIONS: FilterOption[] = [
|
||||
],
|
||||
group: "Inventory",
|
||||
},
|
||||
{
|
||||
id: "abcClass",
|
||||
label: "ABC Class",
|
||||
type: "select",
|
||||
options: [
|
||||
{ label: "A", value: "A" },
|
||||
{ label: "B", value: "B" },
|
||||
{ label: "C", value: "C" },
|
||||
],
|
||||
group: "Inventory",
|
||||
},
|
||||
|
||||
// Pricing Group
|
||||
{
|
||||
@@ -140,6 +174,32 @@ const FILTER_OPTIONS: FilterOption[] = [
|
||||
group: "Sales Metrics",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "avgQuantityPerOrder",
|
||||
label: "Avg Qty/Order",
|
||||
type: "number",
|
||||
group: "Sales Metrics",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "numberOfOrders",
|
||||
label: "Order Count",
|
||||
type: "number",
|
||||
group: "Sales Metrics",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "firstSaleDate",
|
||||
label: "First Sale Date",
|
||||
type: "text",
|
||||
group: "Sales Metrics",
|
||||
},
|
||||
{
|
||||
id: "lastSaleDate",
|
||||
label: "Last Sale Date",
|
||||
type: "text",
|
||||
group: "Sales Metrics",
|
||||
},
|
||||
|
||||
// Financial Metrics Group
|
||||
{
|
||||
@@ -156,6 +216,34 @@ const FILTER_OPTIONS: FilterOption[] = [
|
||||
group: "Financial Metrics",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "inventoryValue",
|
||||
label: "Inventory Value",
|
||||
type: "number",
|
||||
group: "Financial Metrics",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "costOfGoodsSold",
|
||||
label: "COGS",
|
||||
type: "number",
|
||||
group: "Financial Metrics",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "grossProfit",
|
||||
label: "Gross Profit",
|
||||
type: "number",
|
||||
group: "Financial Metrics",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "turnoverRate",
|
||||
label: "Turnover Rate",
|
||||
type: "number",
|
||||
group: "Financial Metrics",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
|
||||
// Lead Time & Stock Coverage Group
|
||||
{
|
||||
@@ -165,6 +253,20 @@ const FILTER_OPTIONS: FilterOption[] = [
|
||||
group: "Lead Time & Coverage",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "currentLeadTime",
|
||||
label: "Current Lead Time",
|
||||
type: "number",
|
||||
group: "Lead Time & Coverage",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "targetLeadTime",
|
||||
label: "Target Lead Time",
|
||||
type: "number",
|
||||
group: "Lead Time & Coverage",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
{
|
||||
id: "leadTimeStatus",
|
||||
label: "Lead Time Status",
|
||||
@@ -183,19 +285,26 @@ const FILTER_OPTIONS: FilterOption[] = [
|
||||
group: "Lead Time & Coverage",
|
||||
operators: ["=", ">", ">=", "<", "<=", "between"],
|
||||
},
|
||||
|
||||
// Classification Group
|
||||
{
|
||||
id: "abcClass",
|
||||
label: "ABC Class",
|
||||
type: "select",
|
||||
options: [
|
||||
{ label: "A", value: "A" },
|
||||
{ label: "B", value: "B" },
|
||||
{ label: "C", value: "C" },
|
||||
],
|
||||
group: "Classification",
|
||||
id: "lastPurchaseDate",
|
||||
label: "Last Purchase Date",
|
||||
type: "text",
|
||||
group: "Lead Time & Coverage",
|
||||
},
|
||||
{
|
||||
id: "firstReceivedDate",
|
||||
label: "First Received Date",
|
||||
type: "text",
|
||||
group: "Lead Time & Coverage",
|
||||
},
|
||||
{
|
||||
id: "lastReceivedDate",
|
||||
label: "Last Received Date",
|
||||
type: "text",
|
||||
group: "Lead Time & Coverage",
|
||||
},
|
||||
|
||||
// Classification Group
|
||||
{
|
||||
id: "managingStock",
|
||||
label: "Managing Stock",
|
||||
|
||||
File diff suppressed because it is too large
Load Diff
779
inventory/src/components/settings/ReusableImageManagement.tsx
Normal file
779
inventory/src/components/settings/ReusableImageManagement.tsx
Normal file
@@ -0,0 +1,779 @@
|
||||
import { useState, useMemo, useCallback, useEffect } from "react";
|
||||
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
|
||||
import { Button } from "@/components/ui/button";
|
||||
import {
|
||||
Table,
|
||||
TableBody,
|
||||
TableCell,
|
||||
TableHead,
|
||||
TableHeader,
|
||||
TableRow,
|
||||
} from "@/components/ui/table";
|
||||
import { Input } from "@/components/ui/input";
|
||||
import { Label } from "@/components/ui/label";
|
||||
import { Checkbox } from "@/components/ui/checkbox";
|
||||
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
|
||||
import { ArrowUpDown, Pencil, Trash2, PlusCircle, Image, Eye } from "lucide-react";
|
||||
import config from "@/config";
|
||||
import {
|
||||
useReactTable,
|
||||
getCoreRowModel,
|
||||
getSortedRowModel,
|
||||
SortingState,
|
||||
flexRender,
|
||||
type ColumnDef,
|
||||
} from "@tanstack/react-table";
|
||||
import {
|
||||
Dialog,
|
||||
DialogContent,
|
||||
DialogDescription,
|
||||
DialogFooter,
|
||||
DialogHeader,
|
||||
DialogTitle,
|
||||
DialogClose
|
||||
} from "@/components/ui/dialog";
|
||||
import {
|
||||
AlertDialog,
|
||||
AlertDialogAction,
|
||||
AlertDialogCancel,
|
||||
AlertDialogContent,
|
||||
AlertDialogDescription,
|
||||
AlertDialogFooter,
|
||||
AlertDialogHeader,
|
||||
AlertDialogTitle,
|
||||
} from "@/components/ui/alert-dialog";
|
||||
import { toast } from "sonner";
|
||||
import { useDropzone } from "react-dropzone";
|
||||
import { cn } from "@/lib/utils";
|
||||
|
||||
interface FieldOption {
|
||||
label: string;
|
||||
value: string;
|
||||
}
|
||||
|
||||
interface ImageFormData {
|
||||
id?: number;
|
||||
name: string;
|
||||
is_global: boolean;
|
||||
company: string | null;
|
||||
file?: File;
|
||||
}
|
||||
|
||||
interface ReusableImage {
|
||||
id: number;
|
||||
name: string;
|
||||
filename: string;
|
||||
file_path: string;
|
||||
image_url: string;
|
||||
is_global: boolean;
|
||||
company: string | null;
|
||||
mime_type: string;
|
||||
file_size: number;
|
||||
created_at: string;
|
||||
updated_at: string;
|
||||
}
|
||||
|
||||
interface FieldOptions {
|
||||
companies: FieldOption[];
|
||||
}
|
||||
|
||||
const ImageForm = ({
|
||||
editingImage,
|
||||
formData,
|
||||
setFormData,
|
||||
onSubmit,
|
||||
onCancel,
|
||||
fieldOptions,
|
||||
getRootProps,
|
||||
getInputProps,
|
||||
isDragActive
|
||||
}: {
|
||||
editingImage: ReusableImage | null;
|
||||
formData: ImageFormData;
|
||||
setFormData: (data: ImageFormData | ((prev: ImageFormData) => ImageFormData)) => void;
|
||||
onSubmit: (e: React.FormEvent) => void;
|
||||
onCancel: () => void;
|
||||
fieldOptions: FieldOptions | undefined;
|
||||
getRootProps: any;
|
||||
getInputProps: any;
|
||||
isDragActive: boolean;
|
||||
}) => {
|
||||
const handleNameChange = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
|
||||
setFormData((prev: ImageFormData) => ({ ...prev, name: e.target.value }));
|
||||
}, [setFormData]);
|
||||
|
||||
const handleGlobalChange = useCallback((checked: boolean) => {
|
||||
setFormData((prev: ImageFormData) => ({
|
||||
...prev,
|
||||
is_global: checked,
|
||||
company: checked ? null : prev.company
|
||||
}));
|
||||
}, [setFormData]);
|
||||
|
||||
const handleCompanyChange = useCallback((value: string) => {
|
||||
setFormData((prev: ImageFormData) => ({ ...prev, company: value }));
|
||||
}, [setFormData]);
|
||||
|
||||
return (
|
||||
<form onSubmit={onSubmit}>
|
||||
<div className="grid gap-4 py-4">
|
||||
<div className="grid gap-2">
|
||||
<Label htmlFor="image_name">Image Name</Label>
|
||||
<Input
|
||||
id="image_name"
|
||||
name="image_name"
|
||||
value={formData.name}
|
||||
onChange={handleNameChange}
|
||||
placeholder="Enter image name"
|
||||
required
|
||||
/>
|
||||
</div>
|
||||
|
||||
{!editingImage && (
|
||||
<div className="grid gap-2">
|
||||
<Label htmlFor="image">Upload Image</Label>
|
||||
<div
|
||||
{...getRootProps()}
|
||||
className={cn(
|
||||
"border-2 border-dashed border-secondary-foreground/30 bg-muted/90 rounded-md w-full py-6 flex flex-col items-center justify-center cursor-pointer hover:bg-muted/70 transition-colors",
|
||||
isDragActive && "border-primary bg-muted"
|
||||
)}
|
||||
>
|
||||
<input {...getInputProps()} />
|
||||
<div className="flex flex-col items-center justify-center py-2">
|
||||
{formData.file ? (
|
||||
<>
|
||||
<div className="mb-4">
|
||||
<ImagePreview file={formData.file} />
|
||||
</div>
|
||||
<div className="flex items-center gap-2 mb-2">
|
||||
<Image className="h-4 w-4 text-primary" />
|
||||
<span className="text-sm">{formData.file.name}</span>
|
||||
</div>
|
||||
<p className="text-xs text-muted-foreground">Click or drag to replace</p>
|
||||
</>
|
||||
) : isDragActive ? (
|
||||
<>
|
||||
<Image className="h-8 w-8 mb-2 text-primary" />
|
||||
<p className="text-base text-muted-foreground">Drop image here</p>
|
||||
</>
|
||||
) : (
|
||||
<>
|
||||
<Image className="h-8 w-8 mb-2 text-muted-foreground" />
|
||||
<p className="text-base text-muted-foreground">Click or drag to upload</p>
|
||||
</>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<div className="flex items-center space-x-2">
|
||||
<Checkbox
|
||||
id="is_global"
|
||||
checked={formData.is_global}
|
||||
onCheckedChange={handleGlobalChange}
|
||||
/>
|
||||
<Label htmlFor="is_global">Available for all companies</Label>
|
||||
</div>
|
||||
|
||||
{!formData.is_global && (
|
||||
<div className="grid gap-2">
|
||||
<Label htmlFor="company">Company</Label>
|
||||
<Select
|
||||
value={formData.company || ''}
|
||||
onValueChange={handleCompanyChange}
|
||||
required={!formData.is_global}
|
||||
>
|
||||
<SelectTrigger>
|
||||
<SelectValue placeholder="Select company" />
|
||||
</SelectTrigger>
|
||||
<SelectContent>
|
||||
{fieldOptions?.companies.map((company) => (
|
||||
<SelectItem key={company.value} value={company.value}>
|
||||
{company.label}
|
||||
</SelectItem>
|
||||
))}
|
||||
</SelectContent>
|
||||
</Select>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<DialogFooter>
|
||||
<Button type="button" variant="outline" onClick={onCancel}>
|
||||
Cancel
|
||||
</Button>
|
||||
<Button type="submit">
|
||||
{editingImage ? "Update" : "Upload"} Image
|
||||
</Button>
|
||||
</DialogFooter>
|
||||
</form>
|
||||
);
|
||||
};
|
||||
|
||||
export function ReusableImageManagement() {
|
||||
const [isFormOpen, setIsFormOpen] = useState(false);
|
||||
const [isDeleteOpen, setIsDeleteOpen] = useState(false);
|
||||
const [isPreviewOpen, setIsPreviewOpen] = useState(false);
|
||||
const [imageToDelete, setImageToDelete] = useState<ReusableImage | null>(null);
|
||||
const [previewImage, setPreviewImage] = useState<ReusableImage | null>(null);
|
||||
const [editingImage, setEditingImage] = useState<ReusableImage | null>(null);
|
||||
const [sorting, setSorting] = useState<SortingState>([
|
||||
{ id: "created_at", desc: true }
|
||||
]);
|
||||
const [searchQuery, setSearchQuery] = useState("");
|
||||
const [formData, setFormData] = useState<ImageFormData>({
|
||||
name: "",
|
||||
is_global: false,
|
||||
company: null,
|
||||
file: undefined
|
||||
});
|
||||
|
||||
const queryClient = useQueryClient();
|
||||
|
||||
const { data: images, isLoading } = useQuery<ReusableImage[]>({
|
||||
queryKey: ["reusable-images"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/reusable-images`);
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch reusable images");
|
||||
}
|
||||
return response.json();
|
||||
},
|
||||
});
|
||||
|
||||
const { data: fieldOptions } = useQuery<FieldOptions>({
|
||||
queryKey: ["fieldOptions"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/import/field-options`);
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch field options");
|
||||
}
|
||||
return response.json();
|
||||
},
|
||||
});
|
||||
|
||||
const createMutation = useMutation({
|
||||
mutationFn: async (data: ImageFormData) => {
|
||||
// Create FormData for file upload
|
||||
const formData = new FormData();
|
||||
formData.append('name', data.name);
|
||||
formData.append('is_global', String(data.is_global));
|
||||
|
||||
if (!data.is_global && data.company) {
|
||||
formData.append('company', data.company);
|
||||
}
|
||||
|
||||
if (data.file) {
|
||||
formData.append('image', data.file);
|
||||
} else {
|
||||
throw new Error("Image file is required");
|
||||
}
|
||||
|
||||
const response = await fetch(`${config.apiUrl}/reusable-images/upload`, {
|
||||
method: "POST",
|
||||
body: formData,
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const error = await response.json();
|
||||
throw new Error(error.message || error.error || "Failed to upload image");
|
||||
}
|
||||
|
||||
return response.json();
|
||||
},
|
||||
onSuccess: () => {
|
||||
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
|
||||
toast.success("Image uploaded successfully");
|
||||
resetForm();
|
||||
},
|
||||
onError: (error) => {
|
||||
toast.error(error instanceof Error ? error.message : "Failed to upload image");
|
||||
},
|
||||
});
|
||||
|
||||
const updateMutation = useMutation({
|
||||
mutationFn: async (data: ImageFormData) => {
|
||||
if (!data.id) throw new Error("Image ID is required for update");
|
||||
|
||||
const response = await fetch(`${config.apiUrl}/reusable-images/${data.id}`, {
|
||||
method: "PUT",
|
||||
headers: {
|
||||
"Content-Type": "application/json",
|
||||
},
|
||||
body: JSON.stringify({
|
||||
name: data.name,
|
||||
is_global: data.is_global,
|
||||
company: data.is_global ? null : data.company
|
||||
}),
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const error = await response.json();
|
||||
throw new Error(error.message || error.error || "Failed to update image");
|
||||
}
|
||||
|
||||
return response.json();
|
||||
},
|
||||
onSuccess: () => {
|
||||
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
|
||||
toast.success("Image updated successfully");
|
||||
resetForm();
|
||||
},
|
||||
onError: (error) => {
|
||||
toast.error(error instanceof Error ? error.message : "Failed to update image");
|
||||
},
|
||||
});
|
||||
|
||||
const deleteMutation = useMutation({
|
||||
mutationFn: async (id: number) => {
|
||||
const response = await fetch(`${config.apiUrl}/reusable-images/${id}`, {
|
||||
method: "DELETE",
|
||||
});
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to delete image");
|
||||
}
|
||||
},
|
||||
onSuccess: () => {
|
||||
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
|
||||
toast.success("Image deleted successfully");
|
||||
},
|
||||
onError: (error) => {
|
||||
toast.error(error instanceof Error ? error.message : "Failed to delete image");
|
||||
},
|
||||
});
|
||||
|
||||
const handleEdit = (image: ReusableImage) => {
|
||||
setEditingImage(image);
|
||||
setFormData({
|
||||
id: image.id,
|
||||
name: image.name,
|
||||
is_global: image.is_global,
|
||||
company: image.company,
|
||||
});
|
||||
setIsFormOpen(true);
|
||||
};
|
||||
|
||||
const handleDeleteClick = (image: ReusableImage) => {
|
||||
setImageToDelete(image);
|
||||
setIsDeleteOpen(true);
|
||||
};
|
||||
|
||||
const handlePreview = (image: ReusableImage) => {
|
||||
setPreviewImage(image);
|
||||
setIsPreviewOpen(true);
|
||||
};
|
||||
|
||||
const handleDeleteConfirm = () => {
|
||||
if (imageToDelete) {
|
||||
deleteMutation.mutate(imageToDelete.id);
|
||||
setIsDeleteOpen(false);
|
||||
setImageToDelete(null);
|
||||
}
|
||||
};
|
||||
|
||||
const handleSubmit = (e: React.FormEvent) => {
|
||||
e.preventDefault();
|
||||
|
||||
// If is_global is true, ensure company is null
|
||||
const submitData = {
|
||||
...formData,
|
||||
company: formData.is_global ? null : formData.company,
|
||||
};
|
||||
|
||||
if (editingImage) {
|
||||
updateMutation.mutate(submitData);
|
||||
} else {
|
||||
if (!submitData.file) {
|
||||
toast.error("Please select an image file");
|
||||
return;
|
||||
}
|
||||
createMutation.mutate(submitData);
|
||||
}
|
||||
};
|
||||
|
||||
const resetForm = () => {
|
||||
setFormData({
|
||||
name: "",
|
||||
is_global: false,
|
||||
company: null,
|
||||
file: undefined
|
||||
});
|
||||
setEditingImage(null);
|
||||
setIsFormOpen(false);
|
||||
};
|
||||
|
||||
const handleCreateClick = () => {
|
||||
resetForm();
|
||||
setIsFormOpen(true);
|
||||
};
|
||||
|
||||
// Configure dropzone for image uploads
|
||||
const onDrop = useCallback((acceptedFiles: File[]) => {
|
||||
if (acceptedFiles.length > 0) {
|
||||
const file = acceptedFiles[0]; // Take only the first file
|
||||
setFormData(prev => ({
|
||||
...prev,
|
||||
file
|
||||
}));
|
||||
}
|
||||
}, []);
|
||||
|
||||
const { getRootProps, getInputProps, isDragActive } = useDropzone({
|
||||
accept: {
|
||||
'image/*': ['.jpeg', '.jpg', '.png', '.gif', '.webp']
|
||||
},
|
||||
onDrop,
|
||||
multiple: false // Only accept single files
|
||||
});
|
||||
|
||||
const columns = useMemo<ColumnDef<ReusableImage>[]>(() => [
|
||||
{
|
||||
accessorKey: "name",
|
||||
header: ({ column }) => (
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||
>
|
||||
Name
|
||||
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||
</Button>
|
||||
),
|
||||
},
|
||||
{
|
||||
accessorKey: "is_global",
|
||||
header: ({ column }) => (
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||
>
|
||||
Type
|
||||
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||
</Button>
|
||||
),
|
||||
cell: ({ row }) => {
|
||||
const isGlobal = row.getValue("is_global") as boolean;
|
||||
return isGlobal ? "Global" : "Company Specific";
|
||||
},
|
||||
},
|
||||
{
|
||||
accessorKey: "company",
|
||||
header: ({ column }) => (
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||
>
|
||||
Company
|
||||
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||
</Button>
|
||||
),
|
||||
cell: ({ row }) => {
|
||||
const isGlobal = row.getValue("is_global") as boolean;
|
||||
if (isGlobal) return 'N/A';
|
||||
|
||||
const companyId = row.getValue("company");
|
||||
if (!companyId) return 'None';
|
||||
return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
|
||||
},
|
||||
},
|
||||
{
|
||||
accessorKey: "file_size",
|
||||
header: ({ column }) => (
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||
>
|
||||
Size
|
||||
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||
</Button>
|
||||
),
|
||||
cell: ({ row }) => {
|
||||
const size = row.getValue("file_size") as number;
|
||||
return `${(size / 1024).toFixed(1)} KB`;
|
||||
},
|
||||
},
|
||||
{
|
||||
accessorKey: "created_at",
|
||||
header: ({ column }) => (
|
||||
<Button
|
||||
variant="ghost"
|
||||
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||
>
|
||||
Created
|
||||
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||
</Button>
|
||||
),
|
||||
cell: ({ row }) => new Date(row.getValue("created_at")).toLocaleDateString(),
|
||||
},
|
||||
{
|
||||
accessorKey: "image_url",
|
||||
header: "Thumbnail",
|
||||
cell: ({ row }) => (
|
||||
<div className="flex items-center justify-center">
|
||||
<img
|
||||
src={row.getValue("image_url") as string}
|
||||
alt={row.getValue("name") as string}
|
||||
className="w-10 h-10 object-contain border rounded"
|
||||
/>
|
||||
</div>
|
||||
),
|
||||
},
|
||||
{
|
||||
id: "actions",
|
||||
cell: ({ row }) => (
|
||||
<div className="flex gap-2 justify-end">
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="icon"
|
||||
onClick={() => handlePreview(row.original)}
|
||||
title="Preview Image"
|
||||
>
|
||||
<Eye className="h-4 w-4" />
|
||||
</Button>
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="icon"
|
||||
onClick={() => handleEdit(row.original)}
|
||||
title="Edit Image"
|
||||
>
|
||||
<Pencil className="h-4 w-4" />
|
||||
</Button>
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="icon"
|
||||
className="text-destructive hover:text-destructive"
|
||||
onClick={() => handleDeleteClick(row.original)}
|
||||
title="Delete Image"
|
||||
>
|
||||
<Trash2 className="h-4 w-4" />
|
||||
</Button>
|
||||
</div>
|
||||
),
|
||||
},
|
||||
], [fieldOptions]);
|
||||
|
||||
const filteredData = useMemo(() => {
|
||||
if (!images) return [];
|
||||
return images.filter((image) => {
|
||||
const searchString = searchQuery.toLowerCase();
|
||||
return (
|
||||
image.name.toLowerCase().includes(searchString) ||
|
||||
(image.is_global ? "global" : "company").includes(searchString) ||
|
||||
(image.company && image.company.toLowerCase().includes(searchString))
|
||||
);
|
||||
});
|
||||
}, [images, searchQuery]);
|
||||
|
||||
const table = useReactTable({
|
||||
data: filteredData,
|
||||
columns,
|
||||
state: {
|
||||
sorting,
|
||||
},
|
||||
onSortingChange: setSorting,
|
||||
getSortedRowModel: getSortedRowModel(),
|
||||
getCoreRowModel: getCoreRowModel(),
|
||||
});
|
||||
|
||||
return (
|
||||
<div className="space-y-6">
|
||||
<div className="flex items-center justify-between">
|
||||
<h2 className="text-2xl font-bold">Reusable Images</h2>
|
||||
<Button onClick={handleCreateClick}>
|
||||
<PlusCircle className="mr-2 h-4 w-4" />
|
||||
Upload New Image
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
<div className="flex items-center gap-4">
|
||||
<Input
|
||||
placeholder="Search images..."
|
||||
value={searchQuery}
|
||||
onChange={(e) => setSearchQuery(e.target.value)}
|
||||
className="max-w-sm"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{isLoading ? (
|
||||
<div>Loading images...</div>
|
||||
) : (
|
||||
<div className="border rounded-lg">
|
||||
<Table>
|
||||
<TableHeader className="bg-muted">
|
||||
{table.getHeaderGroups().map((headerGroup) => (
|
||||
<TableRow key={headerGroup.id}>
|
||||
{headerGroup.headers.map((header) => (
|
||||
<TableHead key={header.id}>
|
||||
{header.isPlaceholder
|
||||
? null
|
||||
: flexRender(
|
||||
header.column.columnDef.header,
|
||||
header.getContext()
|
||||
)}
|
||||
</TableHead>
|
||||
))}
|
||||
</TableRow>
|
||||
))}
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{table.getRowModel().rows?.length ? (
|
||||
table.getRowModel().rows.map((row) => (
|
||||
<TableRow key={row.id} className="hover:bg-gray-100">
|
||||
{row.getVisibleCells().map((cell) => (
|
||||
<TableCell key={cell.id} className="pl-6">
|
||||
{flexRender(cell.column.columnDef.cell, cell.getContext())}
|
||||
</TableCell>
|
||||
))}
|
||||
</TableRow>
|
||||
))
|
||||
) : (
|
||||
<TableRow>
|
||||
<TableCell colSpan={columns.length} className="text-center">
|
||||
No images found
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
)}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Image Form Dialog */}
|
||||
<Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
|
||||
<DialogContent className="max-w-md">
|
||||
<DialogHeader>
|
||||
<DialogTitle>{editingImage ? "Edit Image" : "Upload New Image"}</DialogTitle>
|
||||
<DialogDescription>
|
||||
{editingImage
|
||||
? "Update this reusable image's details."
|
||||
: "Upload a new reusable image that can be used across products."}
|
||||
</DialogDescription>
|
||||
</DialogHeader>
|
||||
|
||||
<ImageForm
|
||||
editingImage={editingImage}
|
||||
formData={formData}
|
||||
setFormData={setFormData}
|
||||
onSubmit={handleSubmit}
|
||||
onCancel={() => {
|
||||
resetForm();
|
||||
setIsFormOpen(false);
|
||||
}}
|
||||
fieldOptions={fieldOptions}
|
||||
getRootProps={getRootProps}
|
||||
getInputProps={getInputProps}
|
||||
isDragActive={isDragActive}
|
||||
/>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
|
||||
{/* Delete Confirmation Dialog */}
|
||||
<AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
|
||||
<AlertDialogContent>
|
||||
<AlertDialogHeader>
|
||||
<AlertDialogTitle>Delete Image</AlertDialogTitle>
|
||||
<AlertDialogDescription>
|
||||
Are you sure you want to delete this image? This action cannot be undone.
|
||||
</AlertDialogDescription>
|
||||
</AlertDialogHeader>
|
||||
<AlertDialogFooter>
|
||||
<AlertDialogCancel onClick={() => {
|
||||
setIsDeleteOpen(false);
|
||||
setImageToDelete(null);
|
||||
}}>
|
||||
Cancel
|
||||
</AlertDialogCancel>
|
||||
<AlertDialogAction onClick={handleDeleteConfirm}>
|
||||
Delete
|
||||
</AlertDialogAction>
|
||||
</AlertDialogFooter>
|
||||
</AlertDialogContent>
|
||||
</AlertDialog>
|
||||
|
||||
{/* Preview Dialog */}
|
||||
<Dialog open={isPreviewOpen} onOpenChange={setIsPreviewOpen}>
|
||||
<DialogContent className="max-w-3xl">
|
||||
<DialogHeader>
|
||||
<DialogTitle>{previewImage?.name}</DialogTitle>
|
||||
<DialogDescription>
|
||||
{previewImage?.is_global
|
||||
? "Global image"
|
||||
: `Company specific image for ${fieldOptions?.companies.find(c => c.value === previewImage?.company)?.label}`}
|
||||
</DialogDescription>
|
||||
</DialogHeader>
|
||||
|
||||
<div className="flex justify-center p-4">
|
||||
{previewImage && (
|
||||
<div className="bg-checkerboard rounded-md overflow-hidden">
|
||||
<img
|
||||
src={previewImage.image_url}
|
||||
alt={previewImage.name}
|
||||
className="max-h-[500px] max-w-full object-contain"
|
||||
/>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="grid grid-cols-2 gap-4 text-sm">
|
||||
<div>
|
||||
<span className="font-medium">Filename:</span> {previewImage?.filename}
|
||||
</div>
|
||||
<div>
|
||||
<span className="font-medium">Size:</span> {previewImage && `${(previewImage.file_size / 1024).toFixed(1)} KB`}
|
||||
</div>
|
||||
<div>
|
||||
<span className="font-medium">Type:</span> {previewImage?.mime_type}
|
||||
</div>
|
||||
<div>
|
||||
<span className="font-medium">Uploaded:</span> {previewImage && new Date(previewImage.created_at).toLocaleString()}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<DialogFooter>
|
||||
<DialogClose asChild>
|
||||
<Button>Close</Button>
|
||||
</DialogClose>
|
||||
</DialogFooter>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
|
||||
{/* Add global styles for this component using regular style tag */}
|
||||
<style>{`
|
||||
.reusable-image-table thead tr th,
|
||||
.reusable-image-table tbody tr td {
|
||||
padding-left: 1rem;
|
||||
padding-right: 1rem;
|
||||
}
|
||||
.bg-checkerboard {
|
||||
background-image: linear-gradient(45deg, #f0f0f0 25%, transparent 25%),
|
||||
linear-gradient(-45deg, #f0f0f0 25%, transparent 25%),
|
||||
linear-gradient(45deg, transparent 75%, #f0f0f0 75%),
|
||||
linear-gradient(-45deg, transparent 75%, #f0f0f0 75%);
|
||||
background-size: 20px 20px;
|
||||
background-position: 0 0, 0 10px, 10px -10px, -10px 0px;
|
||||
}
|
||||
`}</style>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
const ImagePreview = ({ file }: { file: File }) => {
|
||||
const [previewUrl, setPreviewUrl] = useState<string>('');
|
||||
|
||||
useEffect(() => {
|
||||
const url = URL.createObjectURL(file);
|
||||
setPreviewUrl(url);
|
||||
return () => {
|
||||
URL.revokeObjectURL(url);
|
||||
};
|
||||
}, [file]);
|
||||
|
||||
return (
|
||||
<img
|
||||
src={previewUrl}
|
||||
alt="Preview"
|
||||
className="max-h-32 max-w-full object-contain rounded-md"
|
||||
/>
|
||||
);
|
||||
};
|
||||
@@ -96,17 +96,15 @@
|
||||
@apply border-border;
|
||||
}
|
||||
body {
|
||||
@apply bg-background text-foreground;
|
||||
@apply bg-background text-foreground font-sans;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
|
||||
@layer base {
|
||||
* {
|
||||
@apply border-border outline-ring/50;
|
||||
}
|
||||
body {
|
||||
@apply bg-background text-foreground;
|
||||
@apply bg-background text-foreground font-sans;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
import { StrictMode } from 'react'
|
||||
import { createRoot } from 'react-dom/client'
|
||||
import './index.css'
|
||||
import './App.css'
|
||||
import App from './App.tsx'
|
||||
import { BrowserRouter as Router } from 'react-router-dom'
|
||||
|
||||
|
||||
@@ -1,28 +1,31 @@
|
||||
import { useState, useContext } from "react";
|
||||
import { useNavigate, useSearchParams } from "react-router-dom";
|
||||
import { AuthContext } from "@/contexts/AuthContext";
|
||||
import { toast } from "sonner";
|
||||
import { cn } from "@/lib/utils";
|
||||
import { Button } from "@/components/ui/button";
|
||||
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
|
||||
import { Input } from "@/components/ui/input";
|
||||
import { toast } from "sonner";
|
||||
import { Loader2, Box } from "lucide-react";
|
||||
import { motion } from "framer-motion";
|
||||
import { AuthContext } from "@/contexts/AuthContext";
|
||||
import { Label } from "@/components/ui/label";
|
||||
import { motion } from "motion/react";
|
||||
|
||||
export function Login() {
|
||||
const [username, setUsername] = useState("");
|
||||
const [password, setPassword] = useState("");
|
||||
const [isLoading, setIsLoading] = useState(false);
|
||||
const navigate = useNavigate();
|
||||
const [searchParams] = useSearchParams();
|
||||
const { login } = useContext(AuthContext);
|
||||
|
||||
const handleLogin = async (e: React.FormEvent) => {
|
||||
const handleSubmit = async (e: React.FormEvent<HTMLFormElement>) => {
|
||||
e.preventDefault();
|
||||
setIsLoading(true);
|
||||
|
||||
const formData = new FormData(e.currentTarget);
|
||||
const username = formData.get("username") as string;
|
||||
const password = formData.get("password") as string;
|
||||
|
||||
try {
|
||||
await login(username, password);
|
||||
|
||||
|
||||
// Login successful, redirect to the requested page or home
|
||||
const redirectTo = searchParams.get("redirect") || "/";
|
||||
navigate(redirectTo);
|
||||
@@ -36,70 +39,77 @@ export function Login() {
|
||||
};
|
||||
|
||||
return (
|
||||
<motion.div
|
||||
layout
|
||||
className="min-h-screen bg-gradient-to-b from-slate-100 to-slate-200 dark:from-slate-900 dark:to-slate-800 antialiased"
|
||||
>
|
||||
<div className="flex flex-col gap-2 p-2 bg-primary">
|
||||
<div className="p-4 flex items-center gap-2 group-data-[collapsible=icon]:justify-center text-white">
|
||||
<Box className="h-6 w-6 shrink-0" />
|
||||
<h2 className="text-lg font-semibold group-data-[collapsible=icon]:hidden">
|
||||
Inventory Manager
|
||||
</h2>
|
||||
<motion.div className="flex min-h-svh flex-row items-center justify-center bg-muted p-6 md:p-10">
|
||||
<div className="fixed top-0 w-full backdrop-blur-sm bg-white/40 border-b shadow-sm z-10">
|
||||
<div className="mx-auto p-4 sm:p-6">
|
||||
<div className="flex items-center gap-2 font-medium text-3xl justify-center sm:justify-start">
|
||||
<div className="relative">
|
||||
<div className="absolute inset-0 "></div>
|
||||
<img
|
||||
src="/cherrybottom.png"
|
||||
alt="Cherry Bottom"
|
||||
className="h-12 w-12 object-contain -rotate-12 transform hover:rotate-0 transition-transform ease-in-out duration-300 relative z-10"
|
||||
/>
|
||||
</div>
|
||||
<span className="font-bold font-text-primary">A Cherry On Bottom</span>
|
||||
</div>
|
||||
<p className="text-sm italic text-muted-foreground text-center sm:text-left ml-32 -mt-1">
|
||||
supporting the cherry on top
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
<motion.div
|
||||
initial={{ opacity: 0, scale: 0.95 }}
|
||||
animate={{ opacity: 1, scale: 1 }}
|
||||
transition={{ duration: 0.3, delay: 0.2 }}
|
||||
className="container relative flex min-h-[calc(100vh-4rem)] flex-col items-center justify-center"
|
||||
>
|
||||
<div className="mx-auto flex w-full flex-col justify-center space-y-6 sm:w-[350px]">
|
||||
<Card className="border-none shadow-xl">
|
||||
<CardHeader className="space-y-1">
|
||||
<div className="flex items-center justify-center mb-2">
|
||||
<Box className="h-10 w-10 text-primary" />
|
||||
</div>
|
||||
<CardTitle className="text-2xl text-center">
|
||||
Log in to continue
|
||||
</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<form onSubmit={handleLogin}>
|
||||
<div className="grid gap-4">
|
||||
<div className="grid gap-2">
|
||||
<Input
|
||||
id="username"
|
||||
placeholder="Username"
|
||||
value={username}
|
||||
onChange={(e) => setUsername(e.target.value)}
|
||||
disabled={isLoading}
|
||||
className="w-full"
|
||||
/>
|
||||
</div>
|
||||
<div className="grid gap-2">
|
||||
<Input
|
||||
id="password"
|
||||
type="password"
|
||||
placeholder="Password"
|
||||
value={password}
|
||||
onChange={(e) => setPassword(e.target.value)}
|
||||
disabled={isLoading}
|
||||
className="w-full"
|
||||
/>
|
||||
</div>
|
||||
<Button className="w-full" type="submit" disabled={isLoading}>
|
||||
{isLoading && (
|
||||
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
|
||||
)}
|
||||
Sign In
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</div>
|
||||
</motion.div>
|
||||
<div className="w-full sm:w-[80%] max-w-sm mt-20">
|
||||
<LoginForm onSubmit={handleSubmit} isLoading={isLoading} />
|
||||
</div>
|
||||
|
||||
</motion.div>
|
||||
);
|
||||
}
|
||||
|
||||
interface LoginFormProps {
|
||||
className?: string;
|
||||
isLoading?: boolean;
|
||||
onSubmit: (e: React.FormEvent<HTMLFormElement>) => void;
|
||||
}
|
||||
|
||||
function LoginForm({ className, isLoading, onSubmit, ...props }: LoginFormProps) {
|
||||
return (
|
||||
<motion.div className={cn("flex flex-col gap-6", className)} {...props}>
|
||||
<Card className="overflow-hidden rounded-lg shadow-lg">
|
||||
<CardHeader className="pb-0">
|
||||
<CardTitle className="text-2xl font-bold text-center">Log in to your account</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent className="grid p-0 h-full">
|
||||
<form className="p-6 md:p-8 flex flex-col gap-6" onSubmit={onSubmit}>
|
||||
<div className="grid gap-2">
|
||||
<Label htmlFor="username">Username</Label>
|
||||
<Input
|
||||
id="username"
|
||||
name="username"
|
||||
type="text"
|
||||
required
|
||||
disabled={isLoading}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<div className="grid gap-2">
|
||||
<Label htmlFor="password">Password</Label>
|
||||
<Input
|
||||
id="password"
|
||||
name="password"
|
||||
type="password"
|
||||
required
|
||||
disabled={isLoading}
|
||||
/>
|
||||
</div>
|
||||
|
||||
<Button type="submit" className="w-full" disabled={isLoading}>
|
||||
{isLoading ? "Logging in..." : "Log In"}
|
||||
</Button>
|
||||
</form>
|
||||
|
||||
</CardContent>
|
||||
</Card>
|
||||
</motion.div>
|
||||
);
|
||||
}
|
||||
|
||||
@@ -55,10 +55,13 @@ const AVAILABLE_COLUMNS: ColumnDef[] = [
|
||||
{ key: 'stock_quantity', label: 'Shelf Count', group: 'Stock', format: (v) => v?.toString() ?? '-' },
|
||||
{ key: 'stock_status', label: 'Stock Status', group: 'Stock' },
|
||||
{ key: 'days_of_inventory', label: 'Days of Stock', group: 'Stock', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'weeks_of_inventory', label: 'Weeks of Stock', group: 'Stock', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'abc_class', label: 'ABC Class', group: 'Stock' },
|
||||
{ key: 'replenishable', label: 'Replenishable', group: 'Stock' },
|
||||
{ key: 'moq', label: 'MOQ', group: 'Stock', format: (v) => v?.toString() ?? '-' },
|
||||
{ key: 'reorder_qty', label: 'Reorder Qty', group: 'Stock', format: (v) => v?.toString() ?? '-' },
|
||||
{ key: 'reorder_point', label: 'Reorder Point', group: 'Stock', format: (v) => v?.toString() ?? '-' },
|
||||
{ key: 'safety_stock', label: 'Safety Stock', group: 'Stock', format: (v) => v?.toString() ?? '-' },
|
||||
{ key: 'overstocked_amt', label: 'Overstock Amt', group: 'Stock', format: (v) => v?.toString() ?? '-' },
|
||||
{ key: 'price', label: 'Price', group: 'Pricing', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'regular_price', label: 'Default Price', group: 'Pricing', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
@@ -67,15 +70,22 @@ const AVAILABLE_COLUMNS: ColumnDef[] = [
|
||||
{ key: 'daily_sales_avg', label: 'Daily Sales', group: 'Sales', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'weekly_sales_avg', label: 'Weekly Sales', group: 'Sales', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'monthly_sales_avg', label: 'Monthly Sales', group: 'Sales', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'avg_quantity_per_order', label: 'Avg Qty/Order', group: 'Sales', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'number_of_orders', label: 'Order Count', group: 'Sales', format: (v) => v?.toString() ?? '-' },
|
||||
{ key: 'first_sale_date', label: 'First Sale', group: 'Sales' },
|
||||
{ key: 'last_sale_date', label: 'Last Sale', group: 'Sales' },
|
||||
{ key: 'gmroi', label: 'GMROI', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'turnover_rate', label: 'Turnover Rate', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'avg_margin_percent', label: 'Margin %', group: 'Financial', format: (v) => v ? `${v.toFixed(1)}%` : '-' },
|
||||
{ key: 'inventory_value', label: 'Inventory Value', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'cost_of_goods_sold', label: 'COGS', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'gross_profit', label: 'Gross Profit', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'current_lead_time', label: 'Current Lead Time', group: 'Lead Time', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'target_lead_time', label: 'Target Lead Time', group: 'Lead Time', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'lead_time_status', label: 'Lead Time Status', group: 'Lead Time' },
|
||||
{ key: 'last_purchase_date', label: 'Last Purchase', group: 'Lead Time' },
|
||||
{ key: 'first_received_date', label: 'First Received', group: 'Lead Time' },
|
||||
{ key: 'last_received_date', label: 'Last Received', group: 'Lead Time' },
|
||||
];
|
||||
|
||||
// Define default columns for each view
|
||||
@@ -93,14 +103,17 @@ const VIEW_COLUMNS: Record<string, ColumnKey[]> = {
|
||||
'daily_sales_avg',
|
||||
'weekly_sales_avg',
|
||||
'monthly_sales_avg',
|
||||
'inventory_value',
|
||||
],
|
||||
critical: [
|
||||
'image',
|
||||
'title',
|
||||
'stock_quantity',
|
||||
'safety_stock',
|
||||
'daily_sales_avg',
|
||||
'weekly_sales_avg',
|
||||
'reorder_qty',
|
||||
'reorder_point',
|
||||
'vendor',
|
||||
'last_purchase_date',
|
||||
'current_lead_time',
|
||||
@@ -109,11 +122,13 @@ const VIEW_COLUMNS: Record<string, ColumnKey[]> = {
|
||||
'image',
|
||||
'title',
|
||||
'stock_quantity',
|
||||
'reorder_point',
|
||||
'daily_sales_avg',
|
||||
'weekly_sales_avg',
|
||||
'reorder_qty',
|
||||
'vendor',
|
||||
'last_purchase_date',
|
||||
'avg_lead_time_days',
|
||||
],
|
||||
overstocked: [
|
||||
'image',
|
||||
@@ -123,15 +138,19 @@ const VIEW_COLUMNS: Record<string, ColumnKey[]> = {
|
||||
'weekly_sales_avg',
|
||||
'overstocked_amt',
|
||||
'days_of_inventory',
|
||||
'inventory_value',
|
||||
'turnover_rate',
|
||||
],
|
||||
'at-risk': [
|
||||
'image',
|
||||
'title',
|
||||
'stock_quantity',
|
||||
'safety_stock',
|
||||
'daily_sales_avg',
|
||||
'weekly_sales_avg',
|
||||
'days_of_inventory',
|
||||
'last_sale_date',
|
||||
'current_lead_time',
|
||||
],
|
||||
new: [
|
||||
'image',
|
||||
@@ -141,6 +160,7 @@ const VIEW_COLUMNS: Record<string, ColumnKey[]> = {
|
||||
'brand',
|
||||
'price',
|
||||
'regular_price',
|
||||
'first_received_date',
|
||||
],
|
||||
healthy: [
|
||||
'image',
|
||||
@@ -150,6 +170,8 @@ const VIEW_COLUMNS: Record<string, ColumnKey[]> = {
|
||||
'weekly_sales_avg',
|
||||
'monthly_sales_avg',
|
||||
'days_of_inventory',
|
||||
'gross_profit',
|
||||
'gmroi',
|
||||
],
|
||||
};
|
||||
|
||||
|
||||
@@ -6,6 +6,7 @@ import { CalculationSettings } from "@/components/settings/CalculationSettings";
|
||||
import { TemplateManagement } from "@/components/settings/TemplateManagement";
|
||||
import { UserManagement } from "@/components/settings/UserManagement";
|
||||
import { PromptManagement } from "@/components/settings/PromptManagement";
|
||||
import { ReusableImageManagement } from "@/components/settings/ReusableImageManagement";
|
||||
import { motion } from 'framer-motion';
|
||||
import { Alert, AlertDescription } from "@/components/ui/alert";
|
||||
import { Protected } from "@/components/auth/Protected";
|
||||
@@ -42,7 +43,8 @@ const SETTINGS_GROUPS: SettingsGroup[] = [
|
||||
label: "Content Management",
|
||||
tabs: [
|
||||
{ id: "templates", permission: "settings:templates", label: "Template Management" },
|
||||
{ id: "ai-prompts", permission: "settings:templates", label: "AI Prompts" },
|
||||
{ id: "ai-prompts", permission: "settings:prompt_management", label: "AI Prompts" },
|
||||
{ id: "reusable-images", permission: "settings:library_management", label: "Reusable Images" },
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -220,7 +222,7 @@ export function Settings() {
|
||||
|
||||
<TabsContent value="ai-prompts" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
||||
<Protected
|
||||
permission="settings:templates"
|
||||
permission="settings:prompt_management"
|
||||
fallback={
|
||||
<Alert>
|
||||
<AlertDescription>
|
||||
@@ -233,6 +235,21 @@ export function Settings() {
|
||||
</Protected>
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="reusable-images" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
||||
<Protected
|
||||
permission="settings:library_management"
|
||||
fallback={
|
||||
<Alert>
|
||||
<AlertDescription>
|
||||
You don't have permission to access Reusable Images.
|
||||
</AlertDescription>
|
||||
</Alert>
|
||||
}
|
||||
>
|
||||
<ReusableImageManagement />
|
||||
</Protected>
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="user-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
||||
<Protected
|
||||
permission="settings:user_management"
|
||||
|
||||
@@ -43,6 +43,7 @@ export interface Product {
|
||||
gross_profit?: string; // numeric(15,3)
|
||||
gmroi?: string; // numeric(15,3)
|
||||
avg_lead_time_days?: string; // numeric(15,3)
|
||||
first_received_date?: string;
|
||||
last_received_date?: string;
|
||||
abc_class?: string;
|
||||
stock_status?: string;
|
||||
|
||||
@@ -14,6 +14,9 @@ export default {
|
||||
}
|
||||
},
|
||||
extend: {
|
||||
fontFamily: {
|
||||
sans: ['Inter', 'sans-serif'],
|
||||
},
|
||||
colors: {
|
||||
border: 'hsl(var(--border))',
|
||||
input: 'hsl(var(--input))',
|
||||
|
||||
File diff suppressed because one or more lines are too long
Reference in New Issue
Block a user