Compare commits
13 Commits
7eb4077224
...
move-to-po
| Author | SHA1 | Date | |
|---|---|---|---|
| e5c4f617c5 | |||
| 8e19e6cd74 | |||
| 749907bd30 | |||
| 108181c63d | |||
| 5dd779cb4a | |||
| 7b0e792d03 | |||
| 517bbe72f4 | |||
| 87d4b9e804 | |||
| 75da2c6772 | |||
| 00a02aa788 | |||
| 114018080a | |||
| 228ae8b2a9 | |||
| dd4b3f7145 |
342
docs/import-from-prod-data-mapping.md
Normal file
342
docs/import-from-prod-data-mapping.md
Normal file
@@ -0,0 +1,342 @@
|
|||||||
|
# MySQL to PostgreSQL Import Process Documentation
|
||||||
|
|
||||||
|
This document outlines the data import process from the production MySQL database to the local PostgreSQL database, focusing on column mappings, data transformations, and the overall import architecture.
|
||||||
|
|
||||||
|
## Table of Contents
|
||||||
|
1. [Overview](#overview)
|
||||||
|
2. [Import Architecture](#import-architecture)
|
||||||
|
3. [Column Mappings](#column-mappings)
|
||||||
|
- [Categories](#categories)
|
||||||
|
- [Products](#products)
|
||||||
|
- [Product Categories (Relationship)](#product-categories-relationship)
|
||||||
|
- [Orders](#orders)
|
||||||
|
- [Purchase Orders](#purchase-orders)
|
||||||
|
- [Metadata Tables](#metadata-tables)
|
||||||
|
4. [Special Calculations](#special-calculations)
|
||||||
|
5. [Implementation Notes](#implementation-notes)
|
||||||
|
|
||||||
|
## Overview
|
||||||
|
|
||||||
|
The import process extracts data from a MySQL 5.7 production database and imports it into a PostgreSQL database. It can operate in two modes:
|
||||||
|
|
||||||
|
- **Full Import**: Imports all data regardless of last sync time
|
||||||
|
- **Incremental Import**: Only imports data that has changed since the last import
|
||||||
|
|
||||||
|
The process handles four main data types:
|
||||||
|
- Categories (product categorization hierarchy)
|
||||||
|
- Products (inventory items)
|
||||||
|
- Orders (sales records)
|
||||||
|
- Purchase Orders (vendor orders)
|
||||||
|
|
||||||
|
## Import Architecture
|
||||||
|
|
||||||
|
The import process follows these steps:
|
||||||
|
|
||||||
|
1. **Establish Connection**: Creates a SSH tunnel to the production server and establishes database connections
|
||||||
|
2. **Setup Import History**: Creates a record of the current import operation
|
||||||
|
3. **Import Categories**: Processes product categories in hierarchical order
|
||||||
|
4. **Import Products**: Processes products with their attributes and category relationships
|
||||||
|
5. **Import Orders**: Processes customer orders with line items, taxes, and discounts
|
||||||
|
6. **Import Purchase Orders**: Processes vendor purchase orders with line items
|
||||||
|
7. **Record Results**: Updates the import history with results
|
||||||
|
8. **Close Connections**: Cleans up connections and resources
|
||||||
|
|
||||||
|
Each import step uses temporary tables for processing and wraps operations in transactions to ensure data consistency.
|
||||||
|
|
||||||
|
## Column Mappings
|
||||||
|
|
||||||
|
### Categories
|
||||||
|
| PostgreSQL Column | MySQL Source | Transformation |
|
||||||
|
|-------------------|---------------------------------|----------------------------------------------|
|
||||||
|
| cat_id | product_categories.cat_id | Direct mapping |
|
||||||
|
| name | product_categories.name | Direct mapping |
|
||||||
|
| type | product_categories.type | Direct mapping |
|
||||||
|
| parent_id | product_categories.master_cat_id| NULL for top-level categories (types 10, 20) |
|
||||||
|
| description | product_categories.combined_name| Direct mapping |
|
||||||
|
| status | N/A | Hard-coded 'active' |
|
||||||
|
| created_at | N/A | Current timestamp |
|
||||||
|
| updated_at | N/A | Current timestamp |
|
||||||
|
|
||||||
|
**Notes:**
|
||||||
|
- Categories are processed in hierarchical order by type: [10, 20, 11, 21, 12, 13]
|
||||||
|
- Type 10/20 are top-level categories with no parent
|
||||||
|
- Types 11/21/12/13 are child categories that reference parent categories
|
||||||
|
|
||||||
|
### Products
|
||||||
|
| PostgreSQL Column | MySQL Source | Transformation |
|
||||||
|
|----------------------|----------------------------------|---------------------------------------------------------------|
|
||||||
|
| pid | products.pid | Direct mapping |
|
||||||
|
| title | products.description | Direct mapping |
|
||||||
|
| description | products.notes | Direct mapping |
|
||||||
|
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
|
||||||
|
| stock_quantity | shop_inventory.available_local | Capped at 5000, minimum 0 |
|
||||||
|
| preorder_count | current_inventory.onpreorder | Default 0 |
|
||||||
|
| notions_inv_count | product_notions_b2b.inventory | Default 0 |
|
||||||
|
| price | product_current_prices.price_each| Default 0, filtered on active=1 |
|
||||||
|
| regular_price | products.sellingprice | Default 0 |
|
||||||
|
| cost_price | product_inventory | Weighted average: SUM(costeach * count) / SUM(count) when count > 0, or latest costeach |
|
||||||
|
| vendor | suppliers.companyname | Via supplier_item_data.supplier_id |
|
||||||
|
| vendor_reference | supplier_item_data | supplier_itemnumber or notions_itemnumber based on vendor |
|
||||||
|
| notions_reference | supplier_item_data.notions_itemnumber | Direct mapping |
|
||||||
|
| brand | product_categories.name | Linked via products.company |
|
||||||
|
| line | product_categories.name | Linked via products.line |
|
||||||
|
| subline | product_categories.name | Linked via products.subline |
|
||||||
|
| artist | product_categories.name | Linked via products.artist |
|
||||||
|
| categories | product_category_index | Comma-separated list of category IDs |
|
||||||
|
| created_at | products.date_created | Validated date, NULL if invalid |
|
||||||
|
| first_received | products.datein | Validated date, NULL if invalid |
|
||||||
|
| landing_cost_price | NULL | Not set |
|
||||||
|
| barcode | products.upc | Direct mapping |
|
||||||
|
| harmonized_tariff_code| products.harmonized_tariff_code | Direct mapping |
|
||||||
|
| updated_at | products.stamp | Validated date, NULL if invalid |
|
||||||
|
| visible | shop_inventory | Calculated from show + buyable > 0 |
|
||||||
|
| managing_stock | N/A | Hard-coded true |
|
||||||
|
| replenishable | Multiple fields | Complex calculation based on reorder, dates, etc. |
|
||||||
|
| permalink | N/A | Constructed URL with product ID |
|
||||||
|
| moq | supplier_item_data | notions_qty_per_unit or supplier_qty_per_unit, minimum 1 |
|
||||||
|
| uom | N/A | Hard-coded 1 |
|
||||||
|
| rating | products.rating | Direct mapping |
|
||||||
|
| reviews | products.rating_votes | Direct mapping |
|
||||||
|
| weight | products.weight | Direct mapping |
|
||||||
|
| length | products.length | Direct mapping |
|
||||||
|
| width | products.width | Direct mapping |
|
||||||
|
| height | products.height | Direct mapping |
|
||||||
|
| country_of_origin | products.country_of_origin | Direct mapping |
|
||||||
|
| location | products.location | Direct mapping |
|
||||||
|
| total_sold | order_items | SUM(qty_ordered) for all order_items where prod_pid = pid |
|
||||||
|
| baskets | mybasket | COUNT of records where mb.item = pid and qty > 0 |
|
||||||
|
| notifies | product_notify | COUNT of records where pn.pid = pid |
|
||||||
|
| date_last_sold | product_last_sold.date_sold | Validated date, NULL if invalid |
|
||||||
|
| image | N/A | Constructed from pid and image URL pattern |
|
||||||
|
| image_175 | N/A | Constructed from pid and image URL pattern |
|
||||||
|
| image_full | N/A | Constructed from pid and image URL pattern |
|
||||||
|
| options | NULL | Not set |
|
||||||
|
| tags | NULL | Not set |
|
||||||
|
|
||||||
|
**Notes:**
|
||||||
|
- Replenishable calculation:
|
||||||
|
```javascript
|
||||||
|
CASE
|
||||||
|
WHEN p.reorder < 0 THEN 0
|
||||||
|
WHEN (
|
||||||
|
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
||||||
|
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||||
|
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||||
|
) THEN 0
|
||||||
|
ELSE 1
|
||||||
|
END
|
||||||
|
```
|
||||||
|
|
||||||
|
In business terms, a product is considered NOT replenishable only if:
|
||||||
|
- It was manually flagged as not replenishable (negative reorder value)
|
||||||
|
- OR it shows no activity across ALL metrics (no sales AND no receipts AND no refills in the past 5 years)
|
||||||
|
- Image URLs are constructed using this pattern:
|
||||||
|
```javascript
|
||||||
|
const paddedPid = pid.toString().padStart(6, '0');
|
||||||
|
const prefix = paddedPid.slice(0, 3);
|
||||||
|
const basePath = `${imageUrlBase}${prefix}/${pid}`;
|
||||||
|
return {
|
||||||
|
image: `${basePath}-t-${iid}.jpg`,
|
||||||
|
image_175: `${basePath}-175x175-${iid}.jpg`,
|
||||||
|
image_full: `${basePath}-o-${iid}.jpg`
|
||||||
|
};
|
||||||
|
```
|
||||||
|
|
||||||
|
### Product Categories (Relationship)
|
||||||
|
|
||||||
|
| PostgreSQL Column | MySQL Source | Transformation |
|
||||||
|
|-------------------|-----------------------------------|---------------------------------------------------------------|
|
||||||
|
| pid | products.pid | Direct mapping |
|
||||||
|
| cat_id | product_category_index.cat_id | Direct mapping, filtered by category types |
|
||||||
|
|
||||||
|
**Notes:**
|
||||||
|
- Only categories of types 10, 20, 11, 21, 12, 13 are imported
|
||||||
|
- Categories 16 and 17 are explicitly excluded
|
||||||
|
|
||||||
|
### Orders
|
||||||
|
|
||||||
|
| PostgreSQL Column | MySQL Source | Transformation |
|
||||||
|
|-------------------|-----------------------------------|---------------------------------------------------------------|
|
||||||
|
| order_number | order_items.order_id | Direct mapping |
|
||||||
|
| pid | order_items.prod_pid | Direct mapping |
|
||||||
|
| sku | order_items.prod_itemnumber | Fallback to 'NO-SKU' if empty |
|
||||||
|
| date | _order.date_placed_onlydate | Via join to _order table |
|
||||||
|
| price | order_items.prod_price | Direct mapping |
|
||||||
|
| quantity | order_items.qty_ordered | Direct mapping |
|
||||||
|
| discount | Multiple sources | Complex calculation (see notes) |
|
||||||
|
| tax | order_tax_info_products.item_taxes_to_collect | Via latest order_tax_info record |
|
||||||
|
| tax_included | N/A | Hard-coded false |
|
||||||
|
| shipping | N/A | Hard-coded 0 |
|
||||||
|
| customer | _order.order_cid | Direct mapping |
|
||||||
|
| customer_name | users | CONCAT(users.firstname, ' ', users.lastname) |
|
||||||
|
| status | _order.order_status | Direct mapping |
|
||||||
|
| canceled | _order.date_cancelled | Boolean: true if date_cancelled is not '0000-00-00 00:00:00' |
|
||||||
|
| costeach | order_costs | From latest record or fallback to price * 0.5 |
|
||||||
|
|
||||||
|
**Notes:**
|
||||||
|
- Only orders with order_status >= 15 and with a valid date_placed are processed
|
||||||
|
- For incremental imports, only orders modified since last sync are processed
|
||||||
|
- Discount calculation combines three sources:
|
||||||
|
1. Base discount: order_items.prod_price_reg - order_items.prod_price
|
||||||
|
2. Promo discount: SUM of order_discount_items.amount
|
||||||
|
3. Proportional order discount: Calculation based on order subtotal proportion
|
||||||
|
```javascript
|
||||||
|
(oi.base_discount +
|
||||||
|
COALESCE(ot.promo_discount, 0) +
|
||||||
|
CASE
|
||||||
|
WHEN om.summary_discount > 0 AND om.summary_subtotal > 0 THEN
|
||||||
|
ROUND((om.summary_discount * (oi.price * oi.quantity)) / NULLIF(om.summary_subtotal, 0), 2)
|
||||||
|
ELSE 0
|
||||||
|
END)::DECIMAL(10,2)
|
||||||
|
```
|
||||||
|
- Taxes are taken from the latest tax record for an order
|
||||||
|
- Cost data is taken from the latest non-pending cost record
|
||||||
|
|
||||||
|
### Purchase Orders
|
||||||
|
|
||||||
|
| PostgreSQL Column | MySQL Source | Transformation |
|
||||||
|
|-------------------|-----------------------------------|---------------------------------------------------------------|
|
||||||
|
| po_id | po.po_id | Default 0 if NULL |
|
||||||
|
| pid | po_products.pid | Direct mapping |
|
||||||
|
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
|
||||||
|
| name | products.description | Fallback to 'Unknown Product' |
|
||||||
|
| cost_price | po_products.cost_each | Direct mapping |
|
||||||
|
| po_cost_price | po_products.cost_each | Duplicate of cost_price |
|
||||||
|
| vendor | suppliers.companyname | Fallback to 'Unknown Vendor' if empty |
|
||||||
|
| date | po.date_ordered | Fallback to po.date_created if NULL |
|
||||||
|
| expected_date | po.date_estin | Direct mapping |
|
||||||
|
| status | po.status | Default 1 if NULL |
|
||||||
|
| notes | po.short_note | Fallback to po.notes if NULL |
|
||||||
|
| ordered | po_products.qty_each | Direct mapping |
|
||||||
|
| received | N/A | Hard-coded 0 |
|
||||||
|
| receiving_status | N/A | Hard-coded 1 |
|
||||||
|
|
||||||
|
**Notes:**
|
||||||
|
- Only POs created within last 1 year (incremental) or 5 years (full) are processed
|
||||||
|
- For incremental imports, only POs modified since last sync are processed
|
||||||
|
|
||||||
|
### Metadata Tables
|
||||||
|
|
||||||
|
#### import_history
|
||||||
|
|
||||||
|
| PostgreSQL Column | Source | Notes |
|
||||||
|
|-------------------|-----------------------------------|---------------------------------------------------------------|
|
||||||
|
| id | Auto-increment | Primary key |
|
||||||
|
| table_name | Code | 'all_tables' for overall import |
|
||||||
|
| start_time | NOW() | Import start time |
|
||||||
|
| end_time | NOW() | Import completion time |
|
||||||
|
| duration_seconds | Calculation | Elapsed seconds |
|
||||||
|
| is_incremental | INCREMENTAL_UPDATE | Flag from config |
|
||||||
|
| records_added | Calculation | Sum from all imports |
|
||||||
|
| records_updated | Calculation | Sum from all imports |
|
||||||
|
| status | Code | 'running', 'completed', 'failed', or 'cancelled' |
|
||||||
|
| error_message | Exception | Error message if failed |
|
||||||
|
| additional_info | JSON | Configuration and results |
|
||||||
|
|
||||||
|
#### sync_status
|
||||||
|
|
||||||
|
| PostgreSQL Column | Source | Notes |
|
||||||
|
|----------------------|--------------------------------|---------------------------------------------------------------|
|
||||||
|
| table_name | Code | Name of imported table |
|
||||||
|
| last_sync_timestamp | NOW() | Timestamp of successful sync |
|
||||||
|
| last_sync_id | NULL | Not used currently |
|
||||||
|
|
||||||
|
## Special Calculations
|
||||||
|
|
||||||
|
### Date Validation
|
||||||
|
|
||||||
|
MySQL dates are validated before insertion into PostgreSQL:
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
function validateDate(mysqlDate) {
|
||||||
|
if (!mysqlDate || mysqlDate === '0000-00-00' || mysqlDate === '0000-00-00 00:00:00') {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
// Check if the date is valid
|
||||||
|
const date = new Date(mysqlDate);
|
||||||
|
return isNaN(date.getTime()) ? null : mysqlDate;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### Retry Mechanism
|
||||||
|
|
||||||
|
Operations that might fail temporarily are retried with exponential backoff:
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
async function withRetry(operation, errorMessage) {
|
||||||
|
let lastError;
|
||||||
|
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
|
||||||
|
try {
|
||||||
|
return await operation();
|
||||||
|
} catch (error) {
|
||||||
|
lastError = error;
|
||||||
|
console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
|
||||||
|
if (attempt < MAX_RETRIES) {
|
||||||
|
const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
|
||||||
|
await new Promise(resolve => setTimeout(resolve, backoffTime));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
throw lastError;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### Progress Tracking
|
||||||
|
|
||||||
|
Progress is tracked with estimated time remaining:
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
function estimateRemaining(startTime, current, total) {
|
||||||
|
if (current === 0) return "Calculating...";
|
||||||
|
const elapsedSeconds = (Date.now() - startTime) / 1000;
|
||||||
|
const itemsPerSecond = current / elapsedSeconds;
|
||||||
|
const remainingItems = total - current;
|
||||||
|
const remainingSeconds = remainingItems / itemsPerSecond;
|
||||||
|
return formatElapsedTime(remainingSeconds);
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## Implementation Notes
|
||||||
|
|
||||||
|
### Transaction Management
|
||||||
|
|
||||||
|
All imports use transactions to ensure data consistency:
|
||||||
|
|
||||||
|
- **Categories**: Uses savepoints for each category type
|
||||||
|
- **Products**: Uses a single transaction for the entire import
|
||||||
|
- **Orders**: Uses a single transaction with temporary tables
|
||||||
|
- **Purchase Orders**: Uses a single transaction with temporary tables
|
||||||
|
|
||||||
|
### Memory Usage Optimization
|
||||||
|
|
||||||
|
To minimize memory usage when processing large datasets:
|
||||||
|
|
||||||
|
1. Data is processed in batches (100-5000 records per batch)
|
||||||
|
2. Temporary tables are used for intermediate data
|
||||||
|
3. Some queries use cursors to avoid loading all results at once
|
||||||
|
|
||||||
|
### MySQL vs PostgreSQL Compatibility
|
||||||
|
|
||||||
|
The scripts handle differences between MySQL and PostgreSQL:
|
||||||
|
|
||||||
|
1. MySQL-specific syntax like `USE INDEX` is removed for PostgreSQL
|
||||||
|
2. `GROUP_CONCAT` in MySQL becomes string operations in PostgreSQL
|
||||||
|
3. Transaction syntax differences are abstracted in the connection wrapper
|
||||||
|
4. PostgreSQL's `ON CONFLICT` replaces MySQL's `ON DUPLICATE KEY UPDATE`
|
||||||
|
|
||||||
|
### SSH Tunnel
|
||||||
|
|
||||||
|
Database connections go through an SSH tunnel for security:
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
ssh.forwardOut(
|
||||||
|
"127.0.0.1",
|
||||||
|
0,
|
||||||
|
sshConfig.prodDbConfig.host,
|
||||||
|
sshConfig.prodDbConfig.port,
|
||||||
|
async (err, stream) => {
|
||||||
|
if (err) reject(err);
|
||||||
|
resolve({ ssh, stream });
|
||||||
|
}
|
||||||
|
);
|
||||||
|
```
|
||||||
1065
docs/metrics-calculation-system.md
Normal file
1065
docs/metrics-calculation-system.md
Normal file
File diff suppressed because it is too large
Load Diff
@@ -4,7 +4,12 @@ SET session_replication_role = 'replica'; -- Disable foreign key checks tempora
|
|||||||
-- Create function for updating timestamps
|
-- Create function for updating timestamps
|
||||||
CREATE OR REPLACE FUNCTION update_updated_column() RETURNS TRIGGER AS $func$
|
CREATE OR REPLACE FUNCTION update_updated_column() RETURNS TRIGGER AS $func$
|
||||||
BEGIN
|
BEGIN
|
||||||
|
-- Check which table is being updated and use the appropriate column
|
||||||
|
IF TG_TABLE_NAME = 'categories' THEN
|
||||||
|
NEW.updated_at = CURRENT_TIMESTAMP;
|
||||||
|
ELSE
|
||||||
NEW.updated = CURRENT_TIMESTAMP;
|
NEW.updated = CURRENT_TIMESTAMP;
|
||||||
|
END IF;
|
||||||
RETURN NEW;
|
RETURN NEW;
|
||||||
END;
|
END;
|
||||||
$func$ language plpgsql;
|
$func$ language plpgsql;
|
||||||
@@ -160,7 +165,7 @@ CREATE TABLE purchase_orders (
|
|||||||
expected_date DATE,
|
expected_date DATE,
|
||||||
pid BIGINT NOT NULL,
|
pid BIGINT NOT NULL,
|
||||||
sku VARCHAR(50) NOT NULL,
|
sku VARCHAR(50) NOT NULL,
|
||||||
name VARCHAR(100) NOT NULL,
|
name VARCHAR(255) NOT NULL,
|
||||||
cost_price DECIMAL(10, 3) NOT NULL,
|
cost_price DECIMAL(10, 3) NOT NULL,
|
||||||
po_cost_price DECIMAL(10, 3) NOT NULL,
|
po_cost_price DECIMAL(10, 3) NOT NULL,
|
||||||
status SMALLINT DEFAULT 1,
|
status SMALLINT DEFAULT 1,
|
||||||
@@ -171,7 +176,7 @@ CREATE TABLE purchase_orders (
|
|||||||
received INTEGER DEFAULT 0,
|
received INTEGER DEFAULT 0,
|
||||||
received_date DATE,
|
received_date DATE,
|
||||||
last_received_date DATE,
|
last_received_date DATE,
|
||||||
received_by VARCHAR(100),
|
received_by VARCHAR,
|
||||||
receiving_history JSONB,
|
receiving_history JSONB,
|
||||||
updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||||
FOREIGN KEY (pid) REFERENCES products(pid),
|
FOREIGN KEY (pid) REFERENCES products(pid),
|
||||||
|
|||||||
@@ -23,6 +23,56 @@ CREATE TABLE IF NOT EXISTS templates (
|
|||||||
UNIQUE(company, product_type)
|
UNIQUE(company, product_type)
|
||||||
);
|
);
|
||||||
|
|
||||||
|
-- AI Prompts table for storing validation prompts
|
||||||
|
CREATE TABLE IF NOT EXISTS ai_prompts (
|
||||||
|
id SERIAL PRIMARY KEY,
|
||||||
|
prompt_text TEXT NOT NULL,
|
||||||
|
prompt_type TEXT NOT NULL CHECK (prompt_type IN ('general', 'company_specific', 'system')),
|
||||||
|
company TEXT,
|
||||||
|
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||||
|
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||||
|
CONSTRAINT unique_company_prompt UNIQUE (company),
|
||||||
|
CONSTRAINT company_required_for_specific CHECK (
|
||||||
|
(prompt_type = 'general' AND company IS NULL) OR
|
||||||
|
(prompt_type = 'system' AND company IS NULL) OR
|
||||||
|
(prompt_type = 'company_specific' AND company IS NOT NULL)
|
||||||
|
)
|
||||||
|
);
|
||||||
|
|
||||||
|
-- Create a unique partial index to ensure only one general prompt
|
||||||
|
CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_general_prompt
|
||||||
|
ON ai_prompts (prompt_type)
|
||||||
|
WHERE prompt_type = 'general';
|
||||||
|
|
||||||
|
-- Create a unique partial index to ensure only one system prompt
|
||||||
|
CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_system_prompt
|
||||||
|
ON ai_prompts (prompt_type)
|
||||||
|
WHERE prompt_type = 'system';
|
||||||
|
|
||||||
|
-- Reusable Images table for storing persistent images
|
||||||
|
CREATE TABLE IF NOT EXISTS reusable_images (
|
||||||
|
id SERIAL PRIMARY KEY,
|
||||||
|
name TEXT NOT NULL,
|
||||||
|
filename TEXT NOT NULL,
|
||||||
|
file_path TEXT NOT NULL,
|
||||||
|
image_url TEXT NOT NULL,
|
||||||
|
is_global BOOLEAN NOT NULL DEFAULT false,
|
||||||
|
company TEXT,
|
||||||
|
mime_type TEXT,
|
||||||
|
file_size INTEGER,
|
||||||
|
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||||
|
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
|
||||||
|
CONSTRAINT company_required_for_non_global CHECK (
|
||||||
|
(is_global = true AND company IS NULL) OR
|
||||||
|
(is_global = false AND company IS NOT NULL)
|
||||||
|
)
|
||||||
|
);
|
||||||
|
|
||||||
|
-- Create index on company for efficient querying
|
||||||
|
CREATE INDEX IF NOT EXISTS idx_reusable_images_company ON reusable_images(company);
|
||||||
|
-- Create index on is_global for efficient querying
|
||||||
|
CREATE INDEX IF NOT EXISTS idx_reusable_images_is_global ON reusable_images(is_global);
|
||||||
|
|
||||||
-- AI Validation Performance Tracking
|
-- AI Validation Performance Tracking
|
||||||
CREATE TABLE IF NOT EXISTS ai_validation_performance (
|
CREATE TABLE IF NOT EXISTS ai_validation_performance (
|
||||||
id SERIAL PRIMARY KEY,
|
id SERIAL PRIMARY KEY,
|
||||||
@@ -51,3 +101,15 @@ CREATE TRIGGER update_templates_updated_at
|
|||||||
BEFORE UPDATE ON templates
|
BEFORE UPDATE ON templates
|
||||||
FOR EACH ROW
|
FOR EACH ROW
|
||||||
EXECUTE FUNCTION update_updated_at_column();
|
EXECUTE FUNCTION update_updated_at_column();
|
||||||
|
|
||||||
|
-- Trigger to automatically update the updated_at column for ai_prompts
|
||||||
|
CREATE TRIGGER update_ai_prompts_updated_at
|
||||||
|
BEFORE UPDATE ON ai_prompts
|
||||||
|
FOR EACH ROW
|
||||||
|
EXECUTE FUNCTION update_updated_at_column();
|
||||||
|
|
||||||
|
-- Trigger to automatically update the updated_at column for reusable_images
|
||||||
|
CREATE TRIGGER update_reusable_images_updated_at
|
||||||
|
BEFORE UPDATE ON reusable_images
|
||||||
|
FOR EACH ROW
|
||||||
|
EXECUTE FUNCTION update_updated_at_column();
|
||||||
@@ -62,13 +62,24 @@ const TEMP_TABLES = [
|
|||||||
|
|
||||||
// Add cleanup function for temporary tables
|
// Add cleanup function for temporary tables
|
||||||
async function cleanupTemporaryTables(connection) {
|
async function cleanupTemporaryTables(connection) {
|
||||||
|
// List of possible temporary tables that might exist
|
||||||
|
const tempTables = [
|
||||||
|
'temp_sales_metrics',
|
||||||
|
'temp_purchase_metrics',
|
||||||
|
'temp_forecast_dates',
|
||||||
|
'temp_daily_sales',
|
||||||
|
'temp_product_stats',
|
||||||
|
'temp_category_sales',
|
||||||
|
'temp_category_stats'
|
||||||
|
];
|
||||||
|
|
||||||
try {
|
try {
|
||||||
for (const table of TEMP_TABLES) {
|
// Drop each temporary table if it exists
|
||||||
await connection.query(`DROP TEMPORARY TABLE IF EXISTS ${table}`);
|
for (const table of tempTables) {
|
||||||
|
await connection.query(`DROP TABLE IF EXISTS ${table}`);
|
||||||
}
|
}
|
||||||
} catch (error) {
|
} catch (err) {
|
||||||
logError(error, 'Error cleaning up temporary tables');
|
console.error('Error cleaning up temporary tables:', err);
|
||||||
throw error; // Re-throw to be handled by the caller
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -86,22 +97,42 @@ let isCancelled = false;
|
|||||||
|
|
||||||
function cancelCalculation() {
|
function cancelCalculation() {
|
||||||
isCancelled = true;
|
isCancelled = true;
|
||||||
global.clearProgress();
|
console.log('Calculation has been cancelled by user');
|
||||||
// Format as SSE event
|
|
||||||
const event = {
|
// Force-terminate any query that's been running for more than 5 seconds
|
||||||
progress: {
|
try {
|
||||||
status: 'cancelled',
|
const connection = getConnection();
|
||||||
operation: 'Calculation cancelled',
|
connection.then(async (conn) => {
|
||||||
current: 0,
|
try {
|
||||||
total: 0,
|
// Identify and terminate long-running queries from our application
|
||||||
elapsed: null,
|
await conn.query(`
|
||||||
remaining: null,
|
SELECT pg_cancel_backend(pid)
|
||||||
rate: 0,
|
FROM pg_stat_activity
|
||||||
timestamp: Date.now()
|
WHERE query_start < now() - interval '5 seconds'
|
||||||
|
AND application_name LIKE '%node%'
|
||||||
|
AND query NOT LIKE '%pg_cancel_backend%'
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Clean up any temporary tables
|
||||||
|
await cleanupTemporaryTables(conn);
|
||||||
|
|
||||||
|
// Release connection
|
||||||
|
conn.release();
|
||||||
|
} catch (err) {
|
||||||
|
console.error('Error during force cancellation:', err);
|
||||||
|
conn.release();
|
||||||
}
|
}
|
||||||
|
}).catch(err => {
|
||||||
|
console.error('Could not get connection for cancellation:', err);
|
||||||
|
});
|
||||||
|
} catch (err) {
|
||||||
|
console.error('Failed to terminate running queries:', err);
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
success: true,
|
||||||
|
message: 'Calculation has been cancelled'
|
||||||
};
|
};
|
||||||
process.stdout.write(JSON.stringify(event) + '\n');
|
|
||||||
process.exit(0);
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// Handle SIGTERM signal for cancellation
|
// Handle SIGTERM signal for cancellation
|
||||||
@@ -119,6 +150,15 @@ async function calculateMetrics() {
|
|||||||
let totalPurchaseOrders = 0;
|
let totalPurchaseOrders = 0;
|
||||||
let calculateHistoryId;
|
let calculateHistoryId;
|
||||||
|
|
||||||
|
// Set a maximum execution time (30 minutes)
|
||||||
|
const MAX_EXECUTION_TIME = 30 * 60 * 1000;
|
||||||
|
const timeout = setTimeout(() => {
|
||||||
|
console.error(`Calculation timed out after ${MAX_EXECUTION_TIME/1000} seconds, forcing termination`);
|
||||||
|
// Call cancel and force exit
|
||||||
|
cancelCalculation();
|
||||||
|
process.exit(1);
|
||||||
|
}, MAX_EXECUTION_TIME);
|
||||||
|
|
||||||
try {
|
try {
|
||||||
// Clean up any previously running calculations
|
// Clean up any previously running calculations
|
||||||
connection = await getConnection();
|
connection = await getConnection();
|
||||||
@@ -127,24 +167,24 @@ async function calculateMetrics() {
|
|||||||
SET
|
SET
|
||||||
status = 'cancelled',
|
status = 'cancelled',
|
||||||
end_time = NOW(),
|
end_time = NOW(),
|
||||||
duration_seconds = TIMESTAMPDIFF(SECOND, start_time, NOW()),
|
duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
|
||||||
error_message = 'Previous calculation was not completed properly'
|
error_message = 'Previous calculation was not completed properly'
|
||||||
WHERE status = 'running'
|
WHERE status = 'running'
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Get counts from all relevant tables
|
// Get counts from all relevant tables
|
||||||
const [[productCount], [orderCount], [poCount]] = await Promise.all([
|
const [productCountResult, orderCountResult, poCountResult] = await Promise.all([
|
||||||
connection.query('SELECT COUNT(*) as total FROM products'),
|
connection.query('SELECT COUNT(*) as total FROM products'),
|
||||||
connection.query('SELECT COUNT(*) as total FROM orders'),
|
connection.query('SELECT COUNT(*) as total FROM orders'),
|
||||||
connection.query('SELECT COUNT(*) as total FROM purchase_orders')
|
connection.query('SELECT COUNT(*) as total FROM purchase_orders')
|
||||||
]);
|
]);
|
||||||
|
|
||||||
totalProducts = productCount.total;
|
totalProducts = parseInt(productCountResult.rows[0].total);
|
||||||
totalOrders = orderCount.total;
|
totalOrders = parseInt(orderCountResult.rows[0].total);
|
||||||
totalPurchaseOrders = poCount.total;
|
totalPurchaseOrders = parseInt(poCountResult.rows[0].total);
|
||||||
|
|
||||||
// Create history record for this calculation
|
// Create history record for this calculation
|
||||||
const [historyResult] = await connection.query(`
|
const historyResult = await connection.query(`
|
||||||
INSERT INTO calculate_history (
|
INSERT INTO calculate_history (
|
||||||
start_time,
|
start_time,
|
||||||
status,
|
status,
|
||||||
@@ -155,19 +195,19 @@ async function calculateMetrics() {
|
|||||||
) VALUES (
|
) VALUES (
|
||||||
NOW(),
|
NOW(),
|
||||||
'running',
|
'running',
|
||||||
?,
|
$1,
|
||||||
?,
|
$2,
|
||||||
?,
|
$3,
|
||||||
JSON_OBJECT(
|
jsonb_build_object(
|
||||||
'skip_product_metrics', ?,
|
'skip_product_metrics', ($4::int > 0),
|
||||||
'skip_time_aggregates', ?,
|
'skip_time_aggregates', ($5::int > 0),
|
||||||
'skip_financial_metrics', ?,
|
'skip_financial_metrics', ($6::int > 0),
|
||||||
'skip_vendor_metrics', ?,
|
'skip_vendor_metrics', ($7::int > 0),
|
||||||
'skip_category_metrics', ?,
|
'skip_category_metrics', ($8::int > 0),
|
||||||
'skip_brand_metrics', ?,
|
'skip_brand_metrics', ($9::int > 0),
|
||||||
'skip_sales_forecasts', ?
|
'skip_sales_forecasts', ($10::int > 0)
|
||||||
)
|
|
||||||
)
|
)
|
||||||
|
) RETURNING id
|
||||||
`, [
|
`, [
|
||||||
totalProducts,
|
totalProducts,
|
||||||
totalOrders,
|
totalOrders,
|
||||||
@@ -180,8 +220,7 @@ async function calculateMetrics() {
|
|||||||
SKIP_BRAND_METRICS,
|
SKIP_BRAND_METRICS,
|
||||||
SKIP_SALES_FORECASTS
|
SKIP_SALES_FORECASTS
|
||||||
]);
|
]);
|
||||||
calculateHistoryId = historyResult.insertId;
|
calculateHistoryId = historyResult.rows[0].id;
|
||||||
connection.release();
|
|
||||||
|
|
||||||
// Add debug logging for the progress functions
|
// Add debug logging for the progress functions
|
||||||
console.log('Debug - Progress functions:', {
|
console.log('Debug - Progress functions:', {
|
||||||
@@ -199,6 +238,8 @@ async function calculateMetrics() {
|
|||||||
throw err;
|
throw err;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Release the connection before getting a new one
|
||||||
|
connection.release();
|
||||||
isCancelled = false;
|
isCancelled = false;
|
||||||
connection = await getConnection();
|
connection = await getConnection();
|
||||||
|
|
||||||
@@ -234,10 +275,10 @@ async function calculateMetrics() {
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
UPDATE calculate_history
|
UPDATE calculate_history
|
||||||
SET
|
SET
|
||||||
processed_products = ?,
|
processed_products = $1,
|
||||||
processed_orders = ?,
|
processed_orders = $2,
|
||||||
processed_purchase_orders = ?
|
processed_purchase_orders = $3
|
||||||
WHERE id = ?
|
WHERE id = $4
|
||||||
`, [safeProducts, safeOrders, safePurchaseOrders, calculateHistoryId]);
|
`, [safeProducts, safeOrders, safePurchaseOrders, calculateHistoryId]);
|
||||||
};
|
};
|
||||||
|
|
||||||
@@ -359,216 +400,6 @@ async function calculateMetrics() {
|
|||||||
console.log('Skipping sales forecasts calculation');
|
console.log('Skipping sales forecasts calculation');
|
||||||
}
|
}
|
||||||
|
|
||||||
// Calculate ABC classification
|
|
||||||
outputProgress({
|
|
||||||
status: 'running',
|
|
||||||
operation: 'Starting ABC classification',
|
|
||||||
current: processedProducts || 0,
|
|
||||||
total: totalProducts || 0,
|
|
||||||
elapsed: formatElapsedTime(startTime),
|
|
||||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
|
||||||
rate: calculateRate(startTime, processedProducts || 0),
|
|
||||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
|
||||||
timing: {
|
|
||||||
start_time: new Date(startTime).toISOString(),
|
|
||||||
end_time: new Date().toISOString(),
|
|
||||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
if (isCancelled) return {
|
|
||||||
processedProducts: processedProducts || 0,
|
|
||||||
processedOrders: processedOrders || 0,
|
|
||||||
processedPurchaseOrders: 0,
|
|
||||||
success: false
|
|
||||||
};
|
|
||||||
|
|
||||||
const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
|
|
||||||
const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };
|
|
||||||
|
|
||||||
// First, create and populate the rankings table with an index
|
|
||||||
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
|
|
||||||
await connection.query(`
|
|
||||||
CREATE TEMPORARY TABLE temp_revenue_ranks (
|
|
||||||
pid BIGINT NOT NULL,
|
|
||||||
total_revenue DECIMAL(10,3),
|
|
||||||
rank_num INT,
|
|
||||||
total_count INT,
|
|
||||||
PRIMARY KEY (pid),
|
|
||||||
INDEX (rank_num)
|
|
||||||
) ENGINE=MEMORY
|
|
||||||
`);
|
|
||||||
|
|
||||||
outputProgress({
|
|
||||||
status: 'running',
|
|
||||||
operation: 'Creating revenue rankings',
|
|
||||||
current: processedProducts || 0,
|
|
||||||
total: totalProducts || 0,
|
|
||||||
elapsed: formatElapsedTime(startTime),
|
|
||||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
|
||||||
rate: calculateRate(startTime, processedProducts || 0),
|
|
||||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
|
||||||
timing: {
|
|
||||||
start_time: new Date(startTime).toISOString(),
|
|
||||||
end_time: new Date().toISOString(),
|
|
||||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
if (isCancelled) return {
|
|
||||||
processedProducts: processedProducts || 0,
|
|
||||||
processedOrders: processedOrders || 0,
|
|
||||||
processedPurchaseOrders: 0,
|
|
||||||
success: false
|
|
||||||
};
|
|
||||||
|
|
||||||
await connection.query(`
|
|
||||||
INSERT INTO temp_revenue_ranks
|
|
||||||
SELECT
|
|
||||||
pid,
|
|
||||||
total_revenue,
|
|
||||||
@rank := @rank + 1 as rank_num,
|
|
||||||
@total_count := @rank as total_count
|
|
||||||
FROM (
|
|
||||||
SELECT pid, total_revenue
|
|
||||||
FROM product_metrics
|
|
||||||
WHERE total_revenue > 0
|
|
||||||
ORDER BY total_revenue DESC
|
|
||||||
) ranked,
|
|
||||||
(SELECT @rank := 0) r
|
|
||||||
`);
|
|
||||||
|
|
||||||
// Get total count for percentage calculation
|
|
||||||
const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
|
|
||||||
const totalCount = rankingCount[0].total_count || 1;
|
|
||||||
const max_rank = totalCount; // Store max_rank for use in classification
|
|
||||||
|
|
||||||
outputProgress({
|
|
||||||
status: 'running',
|
|
||||||
operation: 'Updating ABC classifications',
|
|
||||||
current: processedProducts || 0,
|
|
||||||
total: totalProducts || 0,
|
|
||||||
elapsed: formatElapsedTime(startTime),
|
|
||||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
|
||||||
rate: calculateRate(startTime, processedProducts || 0),
|
|
||||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
|
||||||
timing: {
|
|
||||||
start_time: new Date(startTime).toISOString(),
|
|
||||||
end_time: new Date().toISOString(),
|
|
||||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
if (isCancelled) return {
|
|
||||||
processedProducts: processedProducts || 0,
|
|
||||||
processedOrders: processedOrders || 0,
|
|
||||||
processedPurchaseOrders: 0,
|
|
||||||
success: false
|
|
||||||
};
|
|
||||||
|
|
||||||
// ABC classification progress tracking
|
|
||||||
let abcProcessedCount = 0;
|
|
||||||
const batchSize = 5000;
|
|
||||||
let lastProgressUpdate = Date.now();
|
|
||||||
const progressUpdateInterval = 1000; // Update every second
|
|
||||||
|
|
||||||
while (true) {
|
|
||||||
if (isCancelled) return {
|
|
||||||
processedProducts: Number(processedProducts) || 0,
|
|
||||||
processedOrders: Number(processedOrders) || 0,
|
|
||||||
processedPurchaseOrders: 0,
|
|
||||||
success: false
|
|
||||||
};
|
|
||||||
|
|
||||||
// First get a batch of PIDs that need updating
|
|
||||||
const [pids] = await connection.query(`
|
|
||||||
SELECT pm.pid
|
|
||||||
FROM product_metrics pm
|
|
||||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
|
||||||
WHERE pm.abc_class IS NULL
|
|
||||||
OR pm.abc_class !=
|
|
||||||
CASE
|
|
||||||
WHEN tr.rank_num IS NULL THEN 'C'
|
|
||||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
|
|
||||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
|
|
||||||
ELSE 'C'
|
|
||||||
END
|
|
||||||
LIMIT ?
|
|
||||||
`, [max_rank, abcThresholds.a_threshold,
|
|
||||||
max_rank, abcThresholds.b_threshold,
|
|
||||||
batchSize]);
|
|
||||||
|
|
||||||
if (pids.length === 0) {
|
|
||||||
break;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Then update just those PIDs
|
|
||||||
const [result] = await connection.query(`
|
|
||||||
UPDATE product_metrics pm
|
|
||||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
|
||||||
SET pm.abc_class =
|
|
||||||
CASE
|
|
||||||
WHEN tr.rank_num IS NULL THEN 'C'
|
|
||||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
|
|
||||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
|
|
||||||
ELSE 'C'
|
|
||||||
END,
|
|
||||||
pm.last_calculated_at = NOW()
|
|
||||||
WHERE pm.pid IN (?)
|
|
||||||
`, [max_rank, abcThresholds.a_threshold,
|
|
||||||
max_rank, abcThresholds.b_threshold,
|
|
||||||
pids.map(row => row.pid)]);
|
|
||||||
|
|
||||||
abcProcessedCount += result.affectedRows;
|
|
||||||
|
|
||||||
// Calculate progress ensuring valid numbers
|
|
||||||
const currentProgress = Math.floor(totalProducts * (0.99 + (abcProcessedCount / (totalCount || 1)) * 0.01));
|
|
||||||
processedProducts = Number(currentProgress) || processedProducts || 0;
|
|
||||||
|
|
||||||
// Only update progress at most once per second
|
|
||||||
const now = Date.now();
|
|
||||||
if (now - lastProgressUpdate >= progressUpdateInterval) {
|
|
||||||
const progress = ensureValidProgress(processedProducts, totalProducts);
|
|
||||||
|
|
||||||
outputProgress({
|
|
||||||
status: 'running',
|
|
||||||
operation: 'ABC classification progress',
|
|
||||||
current: progress.current,
|
|
||||||
total: progress.total,
|
|
||||||
elapsed: formatElapsedTime(startTime),
|
|
||||||
remaining: estimateRemaining(startTime, progress.current, progress.total),
|
|
||||||
rate: calculateRate(startTime, progress.current),
|
|
||||||
percentage: progress.percentage,
|
|
||||||
timing: {
|
|
||||||
start_time: new Date(startTime).toISOString(),
|
|
||||||
end_time: new Date().toISOString(),
|
|
||||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
lastProgressUpdate = now;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Update database progress
|
|
||||||
await updateProgress(processedProducts, processedOrders, processedPurchaseOrders);
|
|
||||||
|
|
||||||
// Small delay between batches to allow other transactions
|
|
||||||
await new Promise(resolve => setTimeout(resolve, 100));
|
|
||||||
}
|
|
||||||
|
|
||||||
// Clean up
|
|
||||||
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
|
|
||||||
|
|
||||||
const endTime = Date.now();
|
|
||||||
const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
|
|
||||||
|
|
||||||
// Update calculate_status for ABC classification
|
|
||||||
await connection.query(`
|
|
||||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
|
||||||
VALUES ('abc_classification', NOW())
|
|
||||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
|
||||||
`);
|
|
||||||
|
|
||||||
// Final progress update with guaranteed valid numbers
|
// Final progress update with guaranteed valid numbers
|
||||||
const finalProgress = ensureValidProgress(totalProducts, totalProducts);
|
const finalProgress = ensureValidProgress(totalProducts, totalProducts);
|
||||||
|
|
||||||
@@ -578,14 +409,14 @@ async function calculateMetrics() {
|
|||||||
operation: 'Metrics calculation complete',
|
operation: 'Metrics calculation complete',
|
||||||
current: finalProgress.current,
|
current: finalProgress.current,
|
||||||
total: finalProgress.total,
|
total: finalProgress.total,
|
||||||
elapsed: formatElapsedTime(startTime),
|
elapsed: global.formatElapsedTime(startTime),
|
||||||
remaining: '0s',
|
remaining: '0s',
|
||||||
rate: calculateRate(startTime, finalProgress.current),
|
rate: global.calculateRate(startTime, finalProgress.current),
|
||||||
percentage: '100',
|
percentage: '100',
|
||||||
timing: {
|
timing: {
|
||||||
start_time: new Date(startTime).toISOString(),
|
start_time: new Date(startTime).toISOString(),
|
||||||
end_time: new Date().toISOString(),
|
end_time: new Date().toISOString(),
|
||||||
elapsed_seconds: totalElapsedSeconds
|
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||||
}
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
@@ -601,13 +432,13 @@ async function calculateMetrics() {
|
|||||||
UPDATE calculate_history
|
UPDATE calculate_history
|
||||||
SET
|
SET
|
||||||
end_time = NOW(),
|
end_time = NOW(),
|
||||||
duration_seconds = ?,
|
duration_seconds = $1,
|
||||||
processed_products = ?,
|
processed_products = $2,
|
||||||
processed_orders = ?,
|
processed_orders = $3,
|
||||||
processed_purchase_orders = ?,
|
processed_purchase_orders = $4,
|
||||||
status = 'completed'
|
status = 'completed'
|
||||||
WHERE id = ?
|
WHERE id = $5
|
||||||
`, [totalElapsedSeconds,
|
`, [Math.round((Date.now() - startTime) / 1000),
|
||||||
finalStats.processedProducts,
|
finalStats.processedProducts,
|
||||||
finalStats.processedOrders,
|
finalStats.processedOrders,
|
||||||
finalStats.processedPurchaseOrders,
|
finalStats.processedPurchaseOrders,
|
||||||
@@ -616,6 +447,11 @@ async function calculateMetrics() {
|
|||||||
// Clear progress file on successful completion
|
// Clear progress file on successful completion
|
||||||
global.clearProgress();
|
global.clearProgress();
|
||||||
|
|
||||||
|
return {
|
||||||
|
success: true,
|
||||||
|
message: 'Calculation completed successfully',
|
||||||
|
duration: Math.round((Date.now() - startTime) / 1000)
|
||||||
|
};
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
const endTime = Date.now();
|
const endTime = Date.now();
|
||||||
const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
|
const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
|
||||||
@@ -625,13 +461,13 @@ async function calculateMetrics() {
|
|||||||
UPDATE calculate_history
|
UPDATE calculate_history
|
||||||
SET
|
SET
|
||||||
end_time = NOW(),
|
end_time = NOW(),
|
||||||
duration_seconds = ?,
|
duration_seconds = $1,
|
||||||
processed_products = ?,
|
processed_products = $2,
|
||||||
processed_orders = ?,
|
processed_orders = $3,
|
||||||
processed_purchase_orders = ?,
|
processed_purchase_orders = $4,
|
||||||
status = ?,
|
status = $5,
|
||||||
error_message = ?
|
error_message = $6
|
||||||
WHERE id = ?
|
WHERE id = $7
|
||||||
`, [
|
`, [
|
||||||
totalElapsedSeconds,
|
totalElapsedSeconds,
|
||||||
processedProducts || 0, // Ensure we have a valid number
|
processedProducts || 0, // Ensure we have a valid number
|
||||||
@@ -677,17 +513,38 @@ async function calculateMetrics() {
|
|||||||
}
|
}
|
||||||
throw error;
|
throw error;
|
||||||
} finally {
|
} finally {
|
||||||
|
// Clear the timeout to prevent forced termination
|
||||||
|
clearTimeout(timeout);
|
||||||
|
|
||||||
|
// Always clean up and release connection
|
||||||
if (connection) {
|
if (connection) {
|
||||||
// Ensure temporary tables are cleaned up
|
try {
|
||||||
await cleanupTemporaryTables(connection);
|
await cleanupTemporaryTables(connection);
|
||||||
connection.release();
|
connection.release();
|
||||||
|
} catch (err) {
|
||||||
|
console.error('Error in final cleanup:', err);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
// Close the connection pool when we're done
|
|
||||||
await closePool();
|
|
||||||
}
|
}
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
success = false;
|
console.error('Error in metrics calculation', error);
|
||||||
logError(error, 'Error in metrics calculation');
|
|
||||||
|
try {
|
||||||
|
if (connection) {
|
||||||
|
await connection.query(`
|
||||||
|
UPDATE calculate_history
|
||||||
|
SET
|
||||||
|
status = 'error',
|
||||||
|
end_time = NOW(),
|
||||||
|
duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
|
||||||
|
error_message = $1
|
||||||
|
WHERE id = $2
|
||||||
|
`, [error.message.substring(0, 500), calculateHistoryId]);
|
||||||
|
}
|
||||||
|
} catch (updateError) {
|
||||||
|
console.error('Error updating calculation history:', updateError);
|
||||||
|
}
|
||||||
|
|
||||||
throw error;
|
throw error;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -10,9 +10,9 @@ const importPurchaseOrders = require('./import/purchase-orders');
|
|||||||
dotenv.config({ path: path.join(__dirname, "../.env") });
|
dotenv.config({ path: path.join(__dirname, "../.env") });
|
||||||
|
|
||||||
// Constants to control which imports run
|
// Constants to control which imports run
|
||||||
const IMPORT_CATEGORIES = true;
|
const IMPORT_CATEGORIES = false;
|
||||||
const IMPORT_PRODUCTS = true;
|
const IMPORT_PRODUCTS = false;
|
||||||
const IMPORT_ORDERS = true;
|
const IMPORT_ORDERS = false;
|
||||||
const IMPORT_PURCHASE_ORDERS = true;
|
const IMPORT_PURCHASE_ORDERS = true;
|
||||||
|
|
||||||
// Add flag for incremental updates
|
// Add flag for incremental updates
|
||||||
@@ -120,6 +120,7 @@ async function main() {
|
|||||||
`);
|
`);
|
||||||
|
|
||||||
// Create import history record for the overall session
|
// Create import history record for the overall session
|
||||||
|
try {
|
||||||
const [historyResult] = await localConnection.query(`
|
const [historyResult] = await localConnection.query(`
|
||||||
INSERT INTO import_history (
|
INSERT INTO import_history (
|
||||||
table_name,
|
table_name,
|
||||||
@@ -141,6 +142,16 @@ async function main() {
|
|||||||
) RETURNING id
|
) RETURNING id
|
||||||
`, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
|
`, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
|
||||||
importHistoryId = historyResult.rows[0].id;
|
importHistoryId = historyResult.rows[0].id;
|
||||||
|
} catch (error) {
|
||||||
|
console.error("Error creating import history record:", error);
|
||||||
|
outputProgress({
|
||||||
|
status: "error",
|
||||||
|
operation: "Import process",
|
||||||
|
message: "Failed to create import history record",
|
||||||
|
error: error.message
|
||||||
|
});
|
||||||
|
throw error;
|
||||||
|
}
|
||||||
|
|
||||||
const results = {
|
const results = {
|
||||||
categories: null,
|
categories: null,
|
||||||
@@ -158,8 +169,8 @@ async function main() {
|
|||||||
if (isImportCancelled) throw new Error("Import cancelled");
|
if (isImportCancelled) throw new Error("Import cancelled");
|
||||||
completedSteps++;
|
completedSteps++;
|
||||||
console.log('Categories import result:', results.categories);
|
console.log('Categories import result:', results.categories);
|
||||||
totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0);
|
totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0) || 0;
|
||||||
totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0);
|
totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0) || 0;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (IMPORT_PRODUCTS) {
|
if (IMPORT_PRODUCTS) {
|
||||||
@@ -167,8 +178,8 @@ async function main() {
|
|||||||
if (isImportCancelled) throw new Error("Import cancelled");
|
if (isImportCancelled) throw new Error("Import cancelled");
|
||||||
completedSteps++;
|
completedSteps++;
|
||||||
console.log('Products import result:', results.products);
|
console.log('Products import result:', results.products);
|
||||||
totalRecordsAdded += parseInt(results.products?.recordsAdded || 0);
|
totalRecordsAdded += parseInt(results.products?.recordsAdded || 0) || 0;
|
||||||
totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0);
|
totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0) || 0;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (IMPORT_ORDERS) {
|
if (IMPORT_ORDERS) {
|
||||||
@@ -176,17 +187,34 @@ async function main() {
|
|||||||
if (isImportCancelled) throw new Error("Import cancelled");
|
if (isImportCancelled) throw new Error("Import cancelled");
|
||||||
completedSteps++;
|
completedSteps++;
|
||||||
console.log('Orders import result:', results.orders);
|
console.log('Orders import result:', results.orders);
|
||||||
totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0);
|
totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0) || 0;
|
||||||
totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0);
|
totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0) || 0;
|
||||||
}
|
}
|
||||||
|
|
||||||
if (IMPORT_PURCHASE_ORDERS) {
|
if (IMPORT_PURCHASE_ORDERS) {
|
||||||
|
try {
|
||||||
results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
|
results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
|
||||||
if (isImportCancelled) throw new Error("Import cancelled");
|
if (isImportCancelled) throw new Error("Import cancelled");
|
||||||
completedSteps++;
|
completedSteps++;
|
||||||
console.log('Purchase orders import result:', results.purchaseOrders);
|
console.log('Purchase orders import result:', results.purchaseOrders);
|
||||||
totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0);
|
|
||||||
totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0);
|
// Handle potential error status
|
||||||
|
if (results.purchaseOrders?.status === 'error') {
|
||||||
|
console.error('Purchase orders import had an error:', results.purchaseOrders.error);
|
||||||
|
} else {
|
||||||
|
totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0) || 0;
|
||||||
|
totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0) || 0;
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error during purchase orders import:', error);
|
||||||
|
// Continue with other imports, don't fail the whole process
|
||||||
|
results.purchaseOrders = {
|
||||||
|
status: 'error',
|
||||||
|
error: error.message,
|
||||||
|
recordsAdded: 0,
|
||||||
|
recordsUpdated: 0
|
||||||
|
};
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
const endTime = Date.now();
|
const endTime = Date.now();
|
||||||
@@ -214,8 +242,8 @@ async function main() {
|
|||||||
WHERE id = $12
|
WHERE id = $12
|
||||||
`, [
|
`, [
|
||||||
totalElapsedSeconds,
|
totalElapsedSeconds,
|
||||||
totalRecordsAdded,
|
parseInt(totalRecordsAdded) || 0,
|
||||||
totalRecordsUpdated,
|
parseInt(totalRecordsUpdated) || 0,
|
||||||
IMPORT_CATEGORIES,
|
IMPORT_CATEGORIES,
|
||||||
IMPORT_PRODUCTS,
|
IMPORT_PRODUCTS,
|
||||||
IMPORT_ORDERS,
|
IMPORT_ORDERS,
|
||||||
|
|||||||
@@ -47,42 +47,18 @@ async function importCategories(prodConnection, localConnection) {
|
|||||||
continue;
|
continue;
|
||||||
}
|
}
|
||||||
|
|
||||||
console.log(`\nProcessing ${categories.length} type ${type} categories`);
|
console.log(`Processing ${categories.length} type ${type} categories`);
|
||||||
if (type === 10) {
|
|
||||||
console.log("Type 10 categories:", JSON.stringify(categories, null, 2));
|
|
||||||
}
|
|
||||||
|
|
||||||
// For types that can have parents (11, 21, 12, 13), verify parent existence
|
// For types that can have parents (11, 21, 12, 13), we'll proceed directly
|
||||||
|
// No need to check for parent existence since we process in hierarchical order
|
||||||
let categoriesToInsert = categories;
|
let categoriesToInsert = categories;
|
||||||
if (![10, 20].includes(type)) {
|
|
||||||
// Get all parent IDs
|
|
||||||
const parentIds = [
|
|
||||||
...new Set(
|
|
||||||
categories
|
|
||||||
.filter(c => c && c.parent_id !== null)
|
|
||||||
.map(c => c.parent_id)
|
|
||||||
),
|
|
||||||
];
|
|
||||||
|
|
||||||
console.log(`Processing ${categories.length} type ${type} categories with ${parentIds.length} unique parent IDs`);
|
|
||||||
console.log('Parent IDs:', parentIds);
|
|
||||||
|
|
||||||
// No need to check for parent existence - we trust they exist since they were just inserted
|
|
||||||
categoriesToInsert = categories;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (categoriesToInsert.length === 0) {
|
if (categoriesToInsert.length === 0) {
|
||||||
console.log(
|
console.log(`No valid categories of type ${type} to insert`);
|
||||||
`No valid categories of type ${type} to insert`
|
|
||||||
);
|
|
||||||
await localConnection.query(`RELEASE SAVEPOINT category_type_${type}`);
|
await localConnection.query(`RELEASE SAVEPOINT category_type_${type}`);
|
||||||
continue;
|
continue;
|
||||||
}
|
}
|
||||||
|
|
||||||
console.log(
|
|
||||||
`Inserting ${categoriesToInsert.length} type ${type} categories`
|
|
||||||
);
|
|
||||||
|
|
||||||
// PostgreSQL upsert query with parameterized values
|
// PostgreSQL upsert query with parameterized values
|
||||||
const values = categoriesToInsert.flatMap((cat) => [
|
const values = categoriesToInsert.flatMap((cat) => [
|
||||||
cat.cat_id,
|
cat.cat_id,
|
||||||
@@ -95,14 +71,10 @@ async function importCategories(prodConnection, localConnection) {
|
|||||||
new Date()
|
new Date()
|
||||||
]);
|
]);
|
||||||
|
|
||||||
console.log('Attempting to insert/update with values:', JSON.stringify(values, null, 2));
|
|
||||||
|
|
||||||
const placeholders = categoriesToInsert
|
const placeholders = categoriesToInsert
|
||||||
.map((_, i) => `($${i * 8 + 1}, $${i * 8 + 2}, $${i * 8 + 3}, $${i * 8 + 4}, $${i * 8 + 5}, $${i * 8 + 6}, $${i * 8 + 7}, $${i * 8 + 8})`)
|
.map((_, i) => `($${i * 8 + 1}, $${i * 8 + 2}, $${i * 8 + 3}, $${i * 8 + 4}, $${i * 8 + 5}, $${i * 8 + 6}, $${i * 8 + 7}, $${i * 8 + 8})`)
|
||||||
.join(',');
|
.join(',');
|
||||||
|
|
||||||
console.log('Using placeholders:', placeholders);
|
|
||||||
|
|
||||||
// Insert categories with ON CONFLICT clause for PostgreSQL
|
// Insert categories with ON CONFLICT clause for PostgreSQL
|
||||||
const query = `
|
const query = `
|
||||||
WITH inserted_categories AS (
|
WITH inserted_categories AS (
|
||||||
@@ -130,16 +102,13 @@ async function importCategories(prodConnection, localConnection) {
|
|||||||
COUNT(*) FILTER (WHERE NOT is_insert) as updated
|
COUNT(*) FILTER (WHERE NOT is_insert) as updated
|
||||||
FROM inserted_categories`;
|
FROM inserted_categories`;
|
||||||
|
|
||||||
console.log('Executing query:', query);
|
|
||||||
|
|
||||||
const result = await localConnection.query(query, values);
|
const result = await localConnection.query(query, values);
|
||||||
console.log('Query result:', result);
|
|
||||||
|
|
||||||
// Get the first result since query returns an array
|
// Get the first result since query returns an array
|
||||||
const queryResult = Array.isArray(result) ? result[0] : result;
|
const queryResult = Array.isArray(result) ? result[0] : result;
|
||||||
|
|
||||||
if (!queryResult || !queryResult.rows || !queryResult.rows[0]) {
|
if (!queryResult || !queryResult.rows || !queryResult.rows[0]) {
|
||||||
console.error('Query failed to return results. Result:', queryResult);
|
console.error('Query failed to return results');
|
||||||
throw new Error('Query did not return expected results');
|
throw new Error('Query did not return expected results');
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -173,6 +142,14 @@ async function importCategories(prodConnection, localConnection) {
|
|||||||
// Commit the entire transaction - we'll do this even if we have skipped categories
|
// Commit the entire transaction - we'll do this even if we have skipped categories
|
||||||
await localConnection.query('COMMIT');
|
await localConnection.query('COMMIT');
|
||||||
|
|
||||||
|
// Update sync status
|
||||||
|
await localConnection.query(`
|
||||||
|
INSERT INTO sync_status (table_name, last_sync_timestamp)
|
||||||
|
VALUES ('categories', NOW())
|
||||||
|
ON CONFLICT (table_name) DO UPDATE SET
|
||||||
|
last_sync_timestamp = NOW()
|
||||||
|
`);
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: "complete",
|
status: "complete",
|
||||||
operation: "Categories import completed",
|
operation: "Categories import completed",
|
||||||
|
|||||||
@@ -26,6 +26,9 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
let cumulativeProcessedOrders = 0;
|
let cumulativeProcessedOrders = 0;
|
||||||
|
|
||||||
try {
|
try {
|
||||||
|
// Begin transaction
|
||||||
|
await localConnection.beginTransaction();
|
||||||
|
|
||||||
// Get last sync info
|
// Get last sync info
|
||||||
const [syncInfo] = await localConnection.query(
|
const [syncInfo] = await localConnection.query(
|
||||||
"SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'orders'"
|
"SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'orders'"
|
||||||
@@ -38,7 +41,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
const [[{ total }]] = await prodConnection.query(`
|
const [[{ total }]] = await prodConnection.query(`
|
||||||
SELECT COUNT(*) as total
|
SELECT COUNT(*) as total
|
||||||
FROM order_items oi
|
FROM order_items oi
|
||||||
USE INDEX (PRIMARY)
|
|
||||||
JOIN _order o ON oi.order_id = o.order_id
|
JOIN _order o ON oi.order_id = o.order_id
|
||||||
WHERE o.order_status >= 15
|
WHERE o.order_status >= 15
|
||||||
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
|
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
|
||||||
@@ -78,7 +80,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
COALESCE(oi.prod_price_reg - oi.prod_price, 0) as base_discount,
|
COALESCE(oi.prod_price_reg - oi.prod_price, 0) as base_discount,
|
||||||
oi.stamp as last_modified
|
oi.stamp as last_modified
|
||||||
FROM order_items oi
|
FROM order_items oi
|
||||||
USE INDEX (PRIMARY)
|
|
||||||
JOIN _order o ON oi.order_id = o.order_id
|
JOIN _order o ON oi.order_id = o.order_id
|
||||||
WHERE o.order_status >= 15
|
WHERE o.order_status >= 15
|
||||||
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
|
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
|
||||||
@@ -105,15 +106,15 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
|
|
||||||
console.log('Orders: Found', orderItems.length, 'order items to process');
|
console.log('Orders: Found', orderItems.length, 'order items to process');
|
||||||
|
|
||||||
// Create tables in PostgreSQL for debugging
|
// Create tables in PostgreSQL for data processing
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
DROP TABLE IF EXISTS debug_order_items;
|
DROP TABLE IF EXISTS temp_order_items;
|
||||||
DROP TABLE IF EXISTS debug_order_meta;
|
DROP TABLE IF EXISTS temp_order_meta;
|
||||||
DROP TABLE IF EXISTS debug_order_discounts;
|
DROP TABLE IF EXISTS temp_order_discounts;
|
||||||
DROP TABLE IF EXISTS debug_order_taxes;
|
DROP TABLE IF EXISTS temp_order_taxes;
|
||||||
DROP TABLE IF EXISTS debug_order_costs;
|
DROP TABLE IF EXISTS temp_order_costs;
|
||||||
|
|
||||||
CREATE TABLE debug_order_items (
|
CREATE TEMP TABLE temp_order_items (
|
||||||
order_id INTEGER NOT NULL,
|
order_id INTEGER NOT NULL,
|
||||||
pid INTEGER NOT NULL,
|
pid INTEGER NOT NULL,
|
||||||
SKU VARCHAR(50) NOT NULL,
|
SKU VARCHAR(50) NOT NULL,
|
||||||
@@ -123,7 +124,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
PRIMARY KEY (order_id, pid)
|
PRIMARY KEY (order_id, pid)
|
||||||
);
|
);
|
||||||
|
|
||||||
CREATE TABLE debug_order_meta (
|
CREATE TEMP TABLE temp_order_meta (
|
||||||
order_id INTEGER NOT NULL,
|
order_id INTEGER NOT NULL,
|
||||||
date DATE NOT NULL,
|
date DATE NOT NULL,
|
||||||
customer VARCHAR(100) NOT NULL,
|
customer VARCHAR(100) NOT NULL,
|
||||||
@@ -135,26 +136,29 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
PRIMARY KEY (order_id)
|
PRIMARY KEY (order_id)
|
||||||
);
|
);
|
||||||
|
|
||||||
CREATE TABLE debug_order_discounts (
|
CREATE TEMP TABLE temp_order_discounts (
|
||||||
order_id INTEGER NOT NULL,
|
order_id INTEGER NOT NULL,
|
||||||
pid INTEGER NOT NULL,
|
pid INTEGER NOT NULL,
|
||||||
discount DECIMAL(10,2) NOT NULL,
|
discount DECIMAL(10,2) NOT NULL,
|
||||||
PRIMARY KEY (order_id, pid)
|
PRIMARY KEY (order_id, pid)
|
||||||
);
|
);
|
||||||
|
|
||||||
CREATE TABLE debug_order_taxes (
|
CREATE TEMP TABLE temp_order_taxes (
|
||||||
order_id INTEGER NOT NULL,
|
order_id INTEGER NOT NULL,
|
||||||
pid INTEGER NOT NULL,
|
pid INTEGER NOT NULL,
|
||||||
tax DECIMAL(10,2) NOT NULL,
|
tax DECIMAL(10,2) NOT NULL,
|
||||||
PRIMARY KEY (order_id, pid)
|
PRIMARY KEY (order_id, pid)
|
||||||
);
|
);
|
||||||
|
|
||||||
CREATE TABLE debug_order_costs (
|
CREATE TEMP TABLE temp_order_costs (
|
||||||
order_id INTEGER NOT NULL,
|
order_id INTEGER NOT NULL,
|
||||||
pid INTEGER NOT NULL,
|
pid INTEGER NOT NULL,
|
||||||
costeach DECIMAL(10,3) DEFAULT 0.000,
|
costeach DECIMAL(10,3) DEFAULT 0.000,
|
||||||
PRIMARY KEY (order_id, pid)
|
PRIMARY KEY (order_id, pid)
|
||||||
);
|
);
|
||||||
|
|
||||||
|
CREATE INDEX idx_temp_order_items_pid ON temp_order_items(pid);
|
||||||
|
CREATE INDEX idx_temp_order_meta_order_id ON temp_order_meta(order_id);
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Insert order items in batches
|
// Insert order items in batches
|
||||||
@@ -168,7 +172,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
]);
|
]);
|
||||||
|
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
INSERT INTO debug_order_items (order_id, pid, SKU, price, quantity, base_discount)
|
INSERT INTO temp_order_items (order_id, pid, SKU, price, quantity, base_discount)
|
||||||
VALUES ${placeholders}
|
VALUES ${placeholders}
|
||||||
ON CONFLICT (order_id, pid) DO UPDATE SET
|
ON CONFLICT (order_id, pid) DO UPDATE SET
|
||||||
SKU = EXCLUDED.SKU,
|
SKU = EXCLUDED.SKU,
|
||||||
@@ -202,6 +206,14 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
const METADATA_BATCH_SIZE = 2000;
|
const METADATA_BATCH_SIZE = 2000;
|
||||||
const PG_BATCH_SIZE = 200;
|
const PG_BATCH_SIZE = 200;
|
||||||
|
|
||||||
|
// Add a helper function for title case conversion
|
||||||
|
function toTitleCase(str) {
|
||||||
|
if (!str) return '';
|
||||||
|
return str.toLowerCase().split(' ').map(word => {
|
||||||
|
return word.charAt(0).toUpperCase() + word.slice(1);
|
||||||
|
}).join(' ');
|
||||||
|
}
|
||||||
|
|
||||||
const processMetadataBatch = async (batchIds) => {
|
const processMetadataBatch = async (batchIds) => {
|
||||||
const [orders] = await prodConnection.query(`
|
const [orders] = await prodConnection.query(`
|
||||||
SELECT
|
SELECT
|
||||||
@@ -231,7 +243,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
order.order_id,
|
order.order_id,
|
||||||
order.date,
|
order.date,
|
||||||
order.customer,
|
order.customer,
|
||||||
order.customer_name || '',
|
toTitleCase(order.customer_name) || '',
|
||||||
order.status,
|
order.status,
|
||||||
order.canceled,
|
order.canceled,
|
||||||
order.summary_discount || 0,
|
order.summary_discount || 0,
|
||||||
@@ -239,7 +251,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
]);
|
]);
|
||||||
|
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
INSERT INTO debug_order_meta (
|
INSERT INTO temp_order_meta (
|
||||||
order_id, date, customer, customer_name, status, canceled,
|
order_id, date, customer, customer_name, status, canceled,
|
||||||
summary_discount, summary_subtotal
|
summary_discount, summary_subtotal
|
||||||
)
|
)
|
||||||
@@ -281,7 +293,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
]);
|
]);
|
||||||
|
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
INSERT INTO debug_order_discounts (order_id, pid, discount)
|
INSERT INTO temp_order_discounts (order_id, pid, discount)
|
||||||
VALUES ${placeholders}
|
VALUES ${placeholders}
|
||||||
ON CONFLICT (order_id, pid) DO UPDATE SET
|
ON CONFLICT (order_id, pid) DO UPDATE SET
|
||||||
discount = EXCLUDED.discount
|
discount = EXCLUDED.discount
|
||||||
@@ -321,7 +333,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
]);
|
]);
|
||||||
|
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
INSERT INTO debug_order_taxes (order_id, pid, tax)
|
INSERT INTO temp_order_taxes (order_id, pid, tax)
|
||||||
VALUES ${placeholders}
|
VALUES ${placeholders}
|
||||||
ON CONFLICT (order_id, pid) DO UPDATE SET
|
ON CONFLICT (order_id, pid) DO UPDATE SET
|
||||||
tax = EXCLUDED.tax
|
tax = EXCLUDED.tax
|
||||||
@@ -330,14 +342,23 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
};
|
};
|
||||||
|
|
||||||
const processCostsBatch = async (batchIds) => {
|
const processCostsBatch = async (batchIds) => {
|
||||||
|
// Modified query to ensure one row per order_id/pid by using a subquery
|
||||||
const [costs] = await prodConnection.query(`
|
const [costs] = await prodConnection.query(`
|
||||||
SELECT
|
SELECT
|
||||||
oc.orderid as order_id,
|
oc.orderid as order_id,
|
||||||
oc.pid,
|
oc.pid,
|
||||||
oc.costeach
|
oc.costeach
|
||||||
FROM order_costs oc
|
FROM order_costs oc
|
||||||
WHERE oc.orderid IN (?)
|
INNER JOIN (
|
||||||
AND oc.pending = 0
|
SELECT
|
||||||
|
orderid,
|
||||||
|
pid,
|
||||||
|
MAX(id) as max_id
|
||||||
|
FROM order_costs
|
||||||
|
WHERE orderid IN (?)
|
||||||
|
AND pending = 0
|
||||||
|
GROUP BY orderid, pid
|
||||||
|
) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
|
||||||
`, [batchIds]);
|
`, [batchIds]);
|
||||||
|
|
||||||
if (costs.length === 0) return;
|
if (costs.length === 0) return;
|
||||||
@@ -357,7 +378,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
]);
|
]);
|
||||||
|
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
INSERT INTO debug_order_costs (order_id, pid, costeach)
|
INSERT INTO temp_order_costs (order_id, pid, costeach)
|
||||||
VALUES ${placeholders}
|
VALUES ${placeholders}
|
||||||
ON CONFLICT (order_id, pid) DO UPDATE SET
|
ON CONFLICT (order_id, pid) DO UPDATE SET
|
||||||
costeach = EXCLUDED.costeach
|
costeach = EXCLUDED.costeach
|
||||||
@@ -416,11 +437,12 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
oi.pid,
|
oi.pid,
|
||||||
SUM(COALESCE(od.discount, 0)) as promo_discount,
|
SUM(COALESCE(od.discount, 0)) as promo_discount,
|
||||||
COALESCE(ot.tax, 0) as total_tax,
|
COALESCE(ot.tax, 0) as total_tax,
|
||||||
COALESCE(oi.price * 0.5, 0) as costeach
|
COALESCE(oc.costeach, oi.price * 0.5) as costeach
|
||||||
FROM debug_order_items oi
|
FROM temp_order_items oi
|
||||||
LEFT JOIN debug_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
|
LEFT JOIN temp_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
|
||||||
LEFT JOIN debug_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
|
LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
|
||||||
GROUP BY oi.order_id, oi.pid, ot.tax
|
LEFT JOIN temp_order_costs oc ON oi.order_id = oc.order_id AND oi.pid = oc.pid
|
||||||
|
GROUP BY oi.order_id, oi.pid, ot.tax, oc.costeach
|
||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
oi.order_id as order_number,
|
oi.order_id as order_number,
|
||||||
@@ -447,11 +469,11 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
FROM (
|
FROM (
|
||||||
SELECT DISTINCT ON (order_id, pid)
|
SELECT DISTINCT ON (order_id, pid)
|
||||||
order_id, pid, SKU, price, quantity, base_discount
|
order_id, pid, SKU, price, quantity, base_discount
|
||||||
FROM debug_order_items
|
FROM temp_order_items
|
||||||
WHERE order_id = ANY($1)
|
WHERE order_id = ANY($1)
|
||||||
ORDER BY order_id, pid
|
ORDER BY order_id, pid
|
||||||
) oi
|
) oi
|
||||||
JOIN debug_order_meta om ON oi.order_id = om.order_id
|
JOIN temp_order_meta om ON oi.order_id = om.order_id
|
||||||
LEFT JOIN order_totals ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
|
LEFT JOIN order_totals ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
|
||||||
ORDER BY oi.order_id, oi.pid
|
ORDER BY oi.order_id, oi.pid
|
||||||
`, [subBatchIds]);
|
`, [subBatchIds]);
|
||||||
@@ -478,8 +500,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
const subBatch = validOrders.slice(k, k + FINAL_BATCH_SIZE);
|
const subBatch = validOrders.slice(k, k + FINAL_BATCH_SIZE);
|
||||||
|
|
||||||
const placeholders = subBatch.map((_, idx) => {
|
const placeholders = subBatch.map((_, idx) => {
|
||||||
const base = idx * 14; // 14 columns (removed updated)
|
const base = idx * 15; // 15 columns including costeach
|
||||||
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14})`;
|
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
|
||||||
}).join(',');
|
}).join(',');
|
||||||
|
|
||||||
const batchValues = subBatch.flatMap(o => [
|
const batchValues = subBatch.flatMap(o => [
|
||||||
@@ -496,7 +518,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
o.customer,
|
o.customer,
|
||||||
o.customer_name,
|
o.customer_name,
|
||||||
o.status,
|
o.status,
|
||||||
o.canceled
|
o.canceled,
|
||||||
|
o.costeach
|
||||||
]);
|
]);
|
||||||
|
|
||||||
const [result] = await localConnection.query(`
|
const [result] = await localConnection.query(`
|
||||||
@@ -504,7 +527,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
INSERT INTO orders (
|
INSERT INTO orders (
|
||||||
order_number, pid, sku, date, price, quantity, discount,
|
order_number, pid, sku, date, price, quantity, discount,
|
||||||
tax, tax_included, shipping, customer, customer_name,
|
tax, tax_included, shipping, customer, customer_name,
|
||||||
status, canceled
|
status, canceled, costeach
|
||||||
)
|
)
|
||||||
VALUES ${placeholders}
|
VALUES ${placeholders}
|
||||||
ON CONFLICT (order_number, pid) DO UPDATE SET
|
ON CONFLICT (order_number, pid) DO UPDATE SET
|
||||||
@@ -519,7 +542,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
customer = EXCLUDED.customer,
|
customer = EXCLUDED.customer,
|
||||||
customer_name = EXCLUDED.customer_name,
|
customer_name = EXCLUDED.customer_name,
|
||||||
status = EXCLUDED.status,
|
status = EXCLUDED.status,
|
||||||
canceled = EXCLUDED.canceled
|
canceled = EXCLUDED.canceled,
|
||||||
|
costeach = EXCLUDED.costeach
|
||||||
RETURNING xmax = 0 as inserted
|
RETURNING xmax = 0 as inserted
|
||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
@@ -529,8 +553,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
`, batchValues);
|
`, batchValues);
|
||||||
|
|
||||||
const { inserted, updated } = result.rows[0];
|
const { inserted, updated } = result.rows[0];
|
||||||
recordsAdded += inserted;
|
recordsAdded += parseInt(inserted) || 0;
|
||||||
recordsUpdated += updated;
|
recordsUpdated += parseInt(updated) || 0;
|
||||||
importedCount += subBatch.length;
|
importedCount += subBatch.length;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -556,18 +580,38 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
|||||||
last_sync_timestamp = NOW()
|
last_sync_timestamp = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
|
// Cleanup temporary tables
|
||||||
|
await localConnection.query(`
|
||||||
|
DROP TABLE IF EXISTS temp_order_items;
|
||||||
|
DROP TABLE IF EXISTS temp_order_meta;
|
||||||
|
DROP TABLE IF EXISTS temp_order_discounts;
|
||||||
|
DROP TABLE IF EXISTS temp_order_taxes;
|
||||||
|
DROP TABLE IF EXISTS temp_order_costs;
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Commit transaction
|
||||||
|
await localConnection.commit();
|
||||||
|
|
||||||
return {
|
return {
|
||||||
status: "complete",
|
status: "complete",
|
||||||
totalImported: Math.floor(importedCount),
|
totalImported: Math.floor(importedCount) || 0,
|
||||||
recordsAdded: recordsAdded || 0,
|
recordsAdded: parseInt(recordsAdded) || 0,
|
||||||
recordsUpdated: Math.floor(recordsUpdated),
|
recordsUpdated: parseInt(recordsUpdated) || 0,
|
||||||
totalSkipped: skippedOrders.size,
|
totalSkipped: skippedOrders.size || 0,
|
||||||
missingProducts: missingProducts.size,
|
missingProducts: missingProducts.size || 0,
|
||||||
incrementalUpdate,
|
incrementalUpdate,
|
||||||
lastSyncTime
|
lastSyncTime
|
||||||
};
|
};
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error("Error during orders import:", error);
|
console.error("Error during orders import:", error);
|
||||||
|
|
||||||
|
// Rollback transaction
|
||||||
|
try {
|
||||||
|
await localConnection.rollback();
|
||||||
|
} catch (rollbackError) {
|
||||||
|
console.error("Error during rollback:", rollbackError);
|
||||||
|
}
|
||||||
|
|
||||||
throw error;
|
throw error;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -1,10 +1,13 @@
|
|||||||
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');
|
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');
|
||||||
const BATCH_SIZE = 100; // Smaller batch size for better progress tracking
|
const BATCH_SIZE = 1000; // Smaller batch size for better progress tracking
|
||||||
const MAX_RETRIES = 3;
|
const MAX_RETRIES = 3;
|
||||||
const RETRY_DELAY = 5000; // 5 seconds
|
const RETRY_DELAY = 5000; // 5 seconds
|
||||||
|
const dotenv = require("dotenv");
|
||||||
|
const path = require("path");
|
||||||
|
dotenv.config({ path: path.join(__dirname, "../../.env") });
|
||||||
|
|
||||||
// Utility functions
|
// Utility functions
|
||||||
const imageUrlBase = 'https://sbing.com/i/products/0000/';
|
const imageUrlBase = process.env.PRODUCT_IMAGE_URL_BASE || 'https://sbing.com/i/products/0000/';
|
||||||
const getImageUrls = (pid, iid = 1) => {
|
const getImageUrls = (pid, iid = 1) => {
|
||||||
const paddedPid = pid.toString().padStart(6, '0');
|
const paddedPid = pid.toString().padStart(6, '0');
|
||||||
// Use padded PID only for the first 3 digits
|
// Use padded PID only for the first 3 digits
|
||||||
@@ -18,7 +21,7 @@ const getImageUrls = (pid, iid = 1) => {
|
|||||||
};
|
};
|
||||||
};
|
};
|
||||||
|
|
||||||
// Add helper function for retrying operations
|
// Add helper function for retrying operations with exponential backoff
|
||||||
async function withRetry(operation, errorMessage) {
|
async function withRetry(operation, errorMessage) {
|
||||||
let lastError;
|
let lastError;
|
||||||
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
|
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
|
||||||
@@ -28,7 +31,8 @@ async function withRetry(operation, errorMessage) {
|
|||||||
lastError = error;
|
lastError = error;
|
||||||
console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
|
console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
|
||||||
if (attempt < MAX_RETRIES) {
|
if (attempt < MAX_RETRIES) {
|
||||||
await new Promise(resolve => setTimeout(resolve, RETRY_DELAY));
|
const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
|
||||||
|
await new Promise(resolve => setTimeout(resolve, backoffTime));
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -140,10 +144,12 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
|||||||
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
||||||
CASE
|
CASE
|
||||||
WHEN p.reorder < 0 THEN 0
|
WHEN p.reorder < 0 THEN 0
|
||||||
|
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
|
||||||
|
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
|
||||||
WHEN (
|
WHEN (
|
||||||
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
||||||
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||||
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||||
) THEN 0
|
) THEN 0
|
||||||
ELSE 1
|
ELSE 1
|
||||||
END AS replenishable,
|
END AS replenishable,
|
||||||
@@ -155,7 +161,11 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
|||||||
COALESCE(p.sellingprice, 0) AS regular_price,
|
COALESCE(p.sellingprice, 0) AS regular_price,
|
||||||
CASE
|
CASE
|
||||||
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||||
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
|
THEN (
|
||||||
|
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
|
||||||
|
FROM product_inventory
|
||||||
|
WHERE pid = p.pid AND count > 0
|
||||||
|
)
|
||||||
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
||||||
END AS cost_price,
|
END AS cost_price,
|
||||||
NULL as landing_cost_price,
|
NULL as landing_cost_price,
|
||||||
@@ -183,7 +193,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
|||||||
p.country_of_origin,
|
p.country_of_origin,
|
||||||
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
||||||
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
||||||
p.totalsold AS total_sold,
|
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
|
||||||
pls.date_sold as date_last_sold,
|
pls.date_sold as date_last_sold,
|
||||||
GROUP_CONCAT(DISTINCT CASE
|
GROUP_CONCAT(DISTINCT CASE
|
||||||
WHEN pc.cat_id IS NOT NULL
|
WHEN pc.cat_id IS NOT NULL
|
||||||
@@ -233,7 +243,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
|||||||
row.pid,
|
row.pid,
|
||||||
row.title,
|
row.title,
|
||||||
row.description,
|
row.description,
|
||||||
row.itemnumber || '',
|
row.sku || '',
|
||||||
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
||||||
row.preorder_count,
|
row.preorder_count,
|
||||||
row.notions_inv_count,
|
row.notions_inv_count,
|
||||||
@@ -335,10 +345,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
|||||||
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
||||||
CASE
|
CASE
|
||||||
WHEN p.reorder < 0 THEN 0
|
WHEN p.reorder < 0 THEN 0
|
||||||
|
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
|
||||||
|
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
|
||||||
WHEN (
|
WHEN (
|
||||||
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
||||||
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||||
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||||
) THEN 0
|
) THEN 0
|
||||||
ELSE 1
|
ELSE 1
|
||||||
END AS replenishable,
|
END AS replenishable,
|
||||||
@@ -350,7 +362,11 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
|||||||
COALESCE(p.sellingprice, 0) AS regular_price,
|
COALESCE(p.sellingprice, 0) AS regular_price,
|
||||||
CASE
|
CASE
|
||||||
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||||
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
|
THEN (
|
||||||
|
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
|
||||||
|
FROM product_inventory
|
||||||
|
WHERE pid = p.pid AND count > 0
|
||||||
|
)
|
||||||
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
||||||
END AS cost_price,
|
END AS cost_price,
|
||||||
NULL as landing_cost_price,
|
NULL as landing_cost_price,
|
||||||
@@ -378,7 +394,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
|||||||
p.country_of_origin,
|
p.country_of_origin,
|
||||||
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
||||||
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
||||||
p.totalsold AS total_sold,
|
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
|
||||||
pls.date_sold as date_last_sold,
|
pls.date_sold as date_last_sold,
|
||||||
GROUP_CONCAT(DISTINCT CASE
|
GROUP_CONCAT(DISTINCT CASE
|
||||||
WHEN pc.cat_id IS NOT NULL
|
WHEN pc.cat_id IS NOT NULL
|
||||||
@@ -432,7 +448,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
|||||||
row.pid,
|
row.pid,
|
||||||
row.title,
|
row.title,
|
||||||
row.description,
|
row.description,
|
||||||
row.itemnumber || '',
|
row.sku || '',
|
||||||
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
||||||
row.preorder_count,
|
row.preorder_count,
|
||||||
row.notions_inv_count,
|
row.notions_inv_count,
|
||||||
@@ -772,31 +788,43 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
|
|||||||
recordsAdded += parseInt(result.rows[0].inserted, 10) || 0;
|
recordsAdded += parseInt(result.rows[0].inserted, 10) || 0;
|
||||||
recordsUpdated += parseInt(result.rows[0].updated, 10) || 0;
|
recordsUpdated += parseInt(result.rows[0].updated, 10) || 0;
|
||||||
|
|
||||||
// Process category relationships for each product in the batch
|
// Process category relationships in batches
|
||||||
|
const allCategories = [];
|
||||||
for (const row of batch) {
|
for (const row of batch) {
|
||||||
if (row.categories) {
|
if (row.categories) {
|
||||||
const categoryIds = row.categories.split(',').filter(id => id && id.trim());
|
const categoryIds = row.categories.split(',').filter(id => id && id.trim());
|
||||||
if (categoryIds.length > 0) {
|
if (categoryIds.length > 0) {
|
||||||
const catPlaceholders = categoryIds.map((_, idx) =>
|
categoryIds.forEach(catId => {
|
||||||
`($${idx * 2 + 1}, $${idx * 2 + 2})`
|
allCategories.push([row.pid, parseInt(catId.trim(), 10)]);
|
||||||
).join(',');
|
});
|
||||||
const catValues = categoryIds.flatMap(catId => [row.pid, parseInt(catId.trim(), 10)]);
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
// First delete existing relationships for this product
|
// If we have categories to process
|
||||||
|
if (allCategories.length > 0) {
|
||||||
|
// First get all products in this batch
|
||||||
|
const productIds = batch.map(p => p.pid);
|
||||||
|
|
||||||
|
// Delete all existing relationships for products in this batch
|
||||||
await localConnection.query(
|
await localConnection.query(
|
||||||
'DELETE FROM product_categories WHERE pid = $1',
|
'DELETE FROM product_categories WHERE pid = ANY($1)',
|
||||||
[row.pid]
|
[productIds]
|
||||||
);
|
);
|
||||||
|
|
||||||
// Then insert the new relationships
|
// Insert all new relationships in one batch
|
||||||
|
const catPlaceholders = allCategories.map((_, idx) =>
|
||||||
|
`($${idx * 2 + 1}, $${idx * 2 + 2})`
|
||||||
|
).join(',');
|
||||||
|
|
||||||
|
const catValues = allCategories.flat();
|
||||||
|
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
INSERT INTO product_categories (pid, cat_id)
|
INSERT INTO product_categories (pid, cat_id)
|
||||||
VALUES ${catPlaceholders}
|
VALUES ${catPlaceholders}
|
||||||
ON CONFLICT (pid, cat_id) DO NOTHING
|
ON CONFLICT (pid, cat_id) DO NOTHING
|
||||||
`, catValues);
|
`, catValues);
|
||||||
}
|
}
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: "running",
|
status: "running",
|
||||||
@@ -816,6 +844,14 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
|
|||||||
// Commit the transaction
|
// Commit the transaction
|
||||||
await localConnection.commit();
|
await localConnection.commit();
|
||||||
|
|
||||||
|
// Update sync status
|
||||||
|
await localConnection.query(`
|
||||||
|
INSERT INTO sync_status (table_name, last_sync_timestamp)
|
||||||
|
VALUES ('products', NOW())
|
||||||
|
ON CONFLICT (table_name) DO UPDATE SET
|
||||||
|
last_sync_timestamp = NOW()
|
||||||
|
`);
|
||||||
|
|
||||||
return {
|
return {
|
||||||
status: 'complete',
|
status: 'complete',
|
||||||
recordsAdded,
|
recordsAdded,
|
||||||
|
|||||||
File diff suppressed because it is too large
Load Diff
@@ -32,12 +32,12 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Get order count that will be processed
|
// Get order count that will be processed
|
||||||
const [orderCount] = await connection.query(`
|
const orderCount = await connection.query(`
|
||||||
SELECT COUNT(*) as count
|
SELECT COUNT(*) as count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
`);
|
`);
|
||||||
processedOrders = orderCount[0].count;
|
processedOrders = parseInt(orderCount.rows[0].count);
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'running',
|
status: 'running',
|
||||||
@@ -98,14 +98,14 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
|||||||
SUM(o.quantity * (o.price - COALESCE(o.discount, 0) - p.cost_price)) as period_margin,
|
SUM(o.quantity * (o.price - COALESCE(o.discount, 0) - p.cost_price)) as period_margin,
|
||||||
COUNT(DISTINCT DATE(o.date)) as period_days,
|
COUNT(DISTINCT DATE(o.date)) as period_days,
|
||||||
CASE
|
CASE
|
||||||
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH) THEN 'current'
|
WHEN o.date >= CURRENT_DATE - INTERVAL '3 months' THEN 'current'
|
||||||
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
WHEN o.date BETWEEN CURRENT_DATE - INTERVAL '15 months'
|
||||||
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH) THEN 'previous'
|
AND CURRENT_DATE - INTERVAL '12 months' THEN 'previous'
|
||||||
END as period_type
|
END as period_type
|
||||||
FROM filtered_products p
|
FROM filtered_products p
|
||||||
JOIN orders o ON p.pid = o.pid
|
JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
AND o.date >= CURRENT_DATE - INTERVAL '15 months'
|
||||||
GROUP BY p.brand, period_type
|
GROUP BY p.brand, period_type
|
||||||
),
|
),
|
||||||
brand_data AS (
|
brand_data AS (
|
||||||
@@ -165,15 +165,16 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
|||||||
LEFT JOIN sales_periods sp ON bd.brand = sp.brand
|
LEFT JOIN sales_periods sp ON bd.brand = sp.brand
|
||||||
GROUP BY bd.brand, bd.product_count, bd.active_products, bd.total_stock_units,
|
GROUP BY bd.brand, bd.product_count, bd.active_products, bd.total_stock_units,
|
||||||
bd.total_stock_cost, bd.total_stock_retail, bd.total_revenue, bd.avg_margin
|
bd.total_stock_cost, bd.total_stock_retail, bd.total_revenue, bd.avg_margin
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (brand) DO UPDATE
|
||||||
product_count = VALUES(product_count),
|
SET
|
||||||
active_products = VALUES(active_products),
|
product_count = EXCLUDED.product_count,
|
||||||
total_stock_units = VALUES(total_stock_units),
|
active_products = EXCLUDED.active_products,
|
||||||
total_stock_cost = VALUES(total_stock_cost),
|
total_stock_units = EXCLUDED.total_stock_units,
|
||||||
total_stock_retail = VALUES(total_stock_retail),
|
total_stock_cost = EXCLUDED.total_stock_cost,
|
||||||
total_revenue = VALUES(total_revenue),
|
total_stock_retail = EXCLUDED.total_stock_retail,
|
||||||
avg_margin = VALUES(avg_margin),
|
total_revenue = EXCLUDED.total_revenue,
|
||||||
growth_rate = VALUES(growth_rate),
|
avg_margin = EXCLUDED.avg_margin,
|
||||||
|
growth_rate = EXCLUDED.growth_rate,
|
||||||
last_calculated_at = CURRENT_TIMESTAMP
|
last_calculated_at = CURRENT_TIMESTAMP
|
||||||
`);
|
`);
|
||||||
|
|
||||||
@@ -230,8 +231,8 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
|||||||
monthly_metrics AS (
|
monthly_metrics AS (
|
||||||
SELECT
|
SELECT
|
||||||
p.brand,
|
p.brand,
|
||||||
YEAR(o.date) as year,
|
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||||
MONTH(o.date) as month,
|
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||||
COUNT(DISTINCT p.valid_pid) as product_count,
|
COUNT(DISTINCT p.valid_pid) as product_count,
|
||||||
COUNT(DISTINCT p.active_pid) as active_products,
|
COUNT(DISTINCT p.active_pid) as active_products,
|
||||||
SUM(p.valid_stock) as total_stock_units,
|
SUM(p.valid_stock) as total_stock_units,
|
||||||
@@ -255,19 +256,20 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
|||||||
END as avg_margin
|
END as avg_margin
|
||||||
FROM filtered_products p
|
FROM filtered_products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
|
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
|
||||||
WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
GROUP BY p.brand, YEAR(o.date), MONTH(o.date)
|
GROUP BY p.brand, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
|
||||||
)
|
)
|
||||||
SELECT *
|
SELECT *
|
||||||
FROM monthly_metrics
|
FROM monthly_metrics
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (brand, year, month) DO UPDATE
|
||||||
product_count = VALUES(product_count),
|
SET
|
||||||
active_products = VALUES(active_products),
|
product_count = EXCLUDED.product_count,
|
||||||
total_stock_units = VALUES(total_stock_units),
|
active_products = EXCLUDED.active_products,
|
||||||
total_stock_cost = VALUES(total_stock_cost),
|
total_stock_units = EXCLUDED.total_stock_units,
|
||||||
total_stock_retail = VALUES(total_stock_retail),
|
total_stock_cost = EXCLUDED.total_stock_cost,
|
||||||
total_revenue = VALUES(total_revenue),
|
total_stock_retail = EXCLUDED.total_stock_retail,
|
||||||
avg_margin = VALUES(avg_margin)
|
total_revenue = EXCLUDED.total_revenue,
|
||||||
|
avg_margin = EXCLUDED.avg_margin
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.99);
|
processedCount = Math.floor(totalProducts * 0.99);
|
||||||
@@ -294,7 +296,8 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||||
VALUES ('brand_metrics', NOW())
|
VALUES ('brand_metrics', NOW())
|
||||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
ON CONFLICT (module_name) DO UPDATE
|
||||||
|
SET last_calculation_timestamp = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
return {
|
return {
|
||||||
|
|||||||
@@ -32,12 +32,12 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Get order count that will be processed
|
// Get order count that will be processed
|
||||||
const [orderCount] = await connection.query(`
|
const orderCount = await connection.query(`
|
||||||
SELECT COUNT(*) as count
|
SELECT COUNT(*) as count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
`);
|
`);
|
||||||
processedOrders = orderCount[0].count;
|
processedOrders = parseInt(orderCount.rows[0].count);
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'running',
|
status: 'running',
|
||||||
@@ -76,12 +76,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
LEFT JOIN product_categories pc ON c.cat_id = pc.cat_id
|
LEFT JOIN product_categories pc ON c.cat_id = pc.cat_id
|
||||||
LEFT JOIN products p ON pc.pid = p.pid
|
LEFT JOIN products p ON pc.pid = p.pid
|
||||||
GROUP BY c.cat_id, c.status
|
GROUP BY c.cat_id, c.status
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (category_id) DO UPDATE
|
||||||
product_count = VALUES(product_count),
|
SET
|
||||||
active_products = VALUES(active_products),
|
product_count = EXCLUDED.product_count,
|
||||||
total_value = VALUES(total_value),
|
active_products = EXCLUDED.active_products,
|
||||||
status = VALUES(status),
|
total_value = EXCLUDED.total_value,
|
||||||
last_calculated_at = VALUES(last_calculated_at)
|
status = EXCLUDED.status,
|
||||||
|
last_calculated_at = EXCLUDED.last_calculated_at
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.90);
|
processedCount = Math.floor(totalProducts * 0.90);
|
||||||
@@ -127,17 +128,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
(tc.category_id IS NULL AND tc.vendor = p.vendor) OR
|
(tc.category_id IS NULL AND tc.vendor = p.vendor) OR
|
||||||
(tc.category_id IS NULL AND tc.vendor IS NULL)
|
(tc.category_id IS NULL AND tc.vendor IS NULL)
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL COALESCE(tc.calculation_period_days, 30) DAY)
|
AND o.date >= CURRENT_DATE - (COALESCE(tc.calculation_period_days, 30) || ' days')::INTERVAL
|
||||||
GROUP BY pc.cat_id
|
GROUP BY pc.cat_id
|
||||||
)
|
)
|
||||||
UPDATE category_metrics cm
|
UPDATE category_metrics
|
||||||
JOIN category_sales cs ON cm.category_id = cs.cat_id
|
|
||||||
LEFT JOIN turnover_config tc ON
|
|
||||||
(tc.category_id = cm.category_id AND tc.vendor IS NULL) OR
|
|
||||||
(tc.category_id IS NULL AND tc.vendor IS NULL)
|
|
||||||
SET
|
SET
|
||||||
cm.avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
|
avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
|
||||||
cm.turnover_rate = CASE
|
turnover_rate = CASE
|
||||||
WHEN cs.avg_stock > 0 AND cs.active_days > 0
|
WHEN cs.avg_stock > 0 AND cs.active_days > 0
|
||||||
THEN LEAST(
|
THEN LEAST(
|
||||||
(cs.units_sold / cs.avg_stock) * (365.0 / cs.active_days),
|
(cs.units_sold / cs.avg_stock) * (365.0 / cs.active_days),
|
||||||
@@ -145,7 +142,9 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
)
|
)
|
||||||
ELSE 0
|
ELSE 0
|
||||||
END,
|
END,
|
||||||
cm.last_calculated_at = NOW()
|
last_calculated_at = NOW()
|
||||||
|
FROM category_sales cs
|
||||||
|
WHERE category_id = cs.cat_id
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.95);
|
processedCount = Math.floor(totalProducts * 0.95);
|
||||||
@@ -184,9 +183,9 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
FROM product_categories pc
|
FROM product_categories pc
|
||||||
JOIN products p ON pc.pid = p.pid
|
JOIN products p ON pc.pid = p.pid
|
||||||
JOIN orders o ON p.pid = o.pid
|
JOIN orders o ON p.pid = o.pid
|
||||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
|
AND o.date >= CURRENT_DATE - INTERVAL '3 months'
|
||||||
GROUP BY pc.cat_id
|
GROUP BY pc.cat_id
|
||||||
),
|
),
|
||||||
previous_period AS (
|
previous_period AS (
|
||||||
@@ -198,26 +197,26 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
FROM product_categories pc
|
FROM product_categories pc
|
||||||
JOIN products p ON pc.pid = p.pid
|
JOIN products p ON pc.pid = p.pid
|
||||||
JOIN orders o ON p.pid = o.pid
|
JOIN orders o ON p.pid = o.pid
|
||||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
AND o.date BETWEEN CURRENT_DATE - INTERVAL '15 months'
|
||||||
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
AND CURRENT_DATE - INTERVAL '12 months'
|
||||||
GROUP BY pc.cat_id
|
GROUP BY pc.cat_id
|
||||||
),
|
),
|
||||||
trend_data AS (
|
trend_data AS (
|
||||||
SELECT
|
SELECT
|
||||||
pc.cat_id,
|
pc.cat_id,
|
||||||
MONTH(o.date) as month,
|
EXTRACT(MONTH FROM o.date) as month,
|
||||||
SUM(o.quantity * (o.price - COALESCE(o.discount, 0)) /
|
SUM(o.quantity * (o.price - COALESCE(o.discount, 0)) /
|
||||||
(1 + COALESCE(ss.seasonality_factor, 0))) as revenue,
|
(1 + COALESCE(ss.seasonality_factor, 0))) as revenue,
|
||||||
COUNT(DISTINCT DATE(o.date)) as days_in_month
|
COUNT(DISTINCT DATE(o.date)) as days_in_month
|
||||||
FROM product_categories pc
|
FROM product_categories pc
|
||||||
JOIN products p ON pc.pid = p.pid
|
JOIN products p ON pc.pid = p.pid
|
||||||
JOIN orders o ON p.pid = o.pid
|
JOIN orders o ON p.pid = o.pid
|
||||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
AND o.date >= CURRENT_DATE - INTERVAL '15 months'
|
||||||
GROUP BY pc.cat_id, MONTH(o.date)
|
GROUP BY pc.cat_id, EXTRACT(MONTH FROM o.date)
|
||||||
),
|
),
|
||||||
trend_stats AS (
|
trend_stats AS (
|
||||||
SELECT
|
SELECT
|
||||||
@@ -261,16 +260,42 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
JOIN products p ON pc.pid = p.pid
|
JOIN products p ON pc.pid = p.pid
|
||||||
JOIN orders o ON p.pid = o.pid
|
JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
|
AND o.date >= CURRENT_DATE - INTERVAL '3 months'
|
||||||
GROUP BY pc.cat_id
|
GROUP BY pc.cat_id
|
||||||
|
),
|
||||||
|
combined_metrics AS (
|
||||||
|
SELECT
|
||||||
|
COALESCE(cp.cat_id, pp.cat_id) as category_id,
|
||||||
|
CASE
|
||||||
|
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
|
||||||
|
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
|
||||||
|
WHEN ta.trend_slope IS NOT NULL THEN
|
||||||
|
GREATEST(
|
||||||
|
-100.0,
|
||||||
|
LEAST(
|
||||||
|
(ta.trend_slope / NULLIF(ta.avg_daily_revenue, 0)) * 365 * 100,
|
||||||
|
999.99
|
||||||
|
)
|
||||||
|
)
|
||||||
|
ELSE
|
||||||
|
GREATEST(
|
||||||
|
-100.0,
|
||||||
|
LEAST(
|
||||||
|
((COALESCE(cp.revenue, 0) - pp.revenue) /
|
||||||
|
NULLIF(ABS(pp.revenue), 0)) * 100.0,
|
||||||
|
999.99
|
||||||
|
)
|
||||||
|
)
|
||||||
|
END as growth_rate,
|
||||||
|
mc.avg_margin
|
||||||
|
FROM current_period cp
|
||||||
|
FULL OUTER JOIN previous_period pp ON cp.cat_id = pp.cat_id
|
||||||
|
LEFT JOIN trend_analysis ta ON COALESCE(cp.cat_id, pp.cat_id) = ta.cat_id
|
||||||
|
LEFT JOIN margin_calc mc ON COALESCE(cp.cat_id, pp.cat_id) = mc.cat_id
|
||||||
)
|
)
|
||||||
UPDATE category_metrics cm
|
UPDATE category_metrics cm
|
||||||
LEFT JOIN current_period cp ON cm.category_id = cp.cat_id
|
|
||||||
LEFT JOIN previous_period pp ON cm.category_id = pp.cat_id
|
|
||||||
LEFT JOIN trend_analysis ta ON cm.category_id = ta.cat_id
|
|
||||||
LEFT JOIN margin_calc mc ON cm.category_id = mc.cat_id
|
|
||||||
SET
|
SET
|
||||||
cm.growth_rate = CASE
|
growth_rate = CASE
|
||||||
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
|
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
|
||||||
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
|
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
|
||||||
WHEN ta.trend_slope IS NOT NULL THEN
|
WHEN ta.trend_slope IS NOT NULL THEN
|
||||||
@@ -291,9 +316,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
)
|
)
|
||||||
)
|
)
|
||||||
END,
|
END,
|
||||||
cm.avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
|
avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
|
||||||
cm.last_calculated_at = NOW()
|
last_calculated_at = NOW()
|
||||||
WHERE cp.cat_id IS NOT NULL OR pp.cat_id IS NOT NULL
|
FROM current_period cp
|
||||||
|
FULL OUTER JOIN previous_period pp ON cp.cat_id = pp.cat_id
|
||||||
|
LEFT JOIN trend_analysis ta ON COALESCE(cp.cat_id, pp.cat_id) = ta.cat_id
|
||||||
|
LEFT JOIN margin_calc mc ON COALESCE(cp.cat_id, pp.cat_id) = mc.cat_id
|
||||||
|
WHERE cm.category_id = COALESCE(cp.cat_id, pp.cat_id)
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.97);
|
processedCount = Math.floor(totalProducts * 0.97);
|
||||||
@@ -335,8 +364,8 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
pc.cat_id,
|
pc.cat_id,
|
||||||
YEAR(o.date) as year,
|
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||||
MONTH(o.date) as month,
|
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||||
COUNT(DISTINCT p.pid) as product_count,
|
COUNT(DISTINCT p.pid) as product_count,
|
||||||
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.pid END) as active_products,
|
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.pid END) as active_products,
|
||||||
SUM(p.stock_quantity * p.cost_price) as total_value,
|
SUM(p.stock_quantity * p.cost_price) as total_value,
|
||||||
@@ -364,15 +393,16 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
JOIN products p ON pc.pid = p.pid
|
JOIN products p ON pc.pid = p.pid
|
||||||
JOIN orders o ON p.pid = o.pid
|
JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
GROUP BY pc.cat_id, YEAR(o.date), MONTH(o.date)
|
GROUP BY pc.cat_id, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (category_id, year, month) DO UPDATE
|
||||||
product_count = VALUES(product_count),
|
SET
|
||||||
active_products = VALUES(active_products),
|
product_count = EXCLUDED.product_count,
|
||||||
total_value = VALUES(total_value),
|
active_products = EXCLUDED.active_products,
|
||||||
total_revenue = VALUES(total_revenue),
|
total_value = EXCLUDED.total_value,
|
||||||
avg_margin = VALUES(avg_margin),
|
total_revenue = EXCLUDED.total_revenue,
|
||||||
turnover_rate = VALUES(turnover_rate)
|
avg_margin = EXCLUDED.avg_margin,
|
||||||
|
turnover_rate = EXCLUDED.turnover_rate
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.99);
|
processedCount = Math.floor(totalProducts * 0.99);
|
||||||
@@ -414,20 +444,20 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
)
|
)
|
||||||
WITH date_ranges AS (
|
WITH date_ranges AS (
|
||||||
SELECT
|
SELECT
|
||||||
DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY) as period_start,
|
CURRENT_DATE - INTERVAL '30 days' as period_start,
|
||||||
CURRENT_DATE as period_end
|
CURRENT_DATE as period_end
|
||||||
UNION ALL
|
UNION ALL
|
||||||
SELECT
|
SELECT
|
||||||
DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY),
|
CURRENT_DATE - INTERVAL '90 days',
|
||||||
DATE_SUB(CURRENT_DATE, INTERVAL 31 DAY)
|
CURRENT_DATE - INTERVAL '31 days'
|
||||||
UNION ALL
|
UNION ALL
|
||||||
SELECT
|
SELECT
|
||||||
DATE_SUB(CURRENT_DATE, INTERVAL 180 DAY),
|
CURRENT_DATE - INTERVAL '180 days',
|
||||||
DATE_SUB(CURRENT_DATE, INTERVAL 91 DAY)
|
CURRENT_DATE - INTERVAL '91 days'
|
||||||
UNION ALL
|
UNION ALL
|
||||||
SELECT
|
SELECT
|
||||||
DATE_SUB(CURRENT_DATE, INTERVAL 365 DAY),
|
CURRENT_DATE - INTERVAL '365 days',
|
||||||
DATE_SUB(CURRENT_DATE, INTERVAL 181 DAY)
|
CURRENT_DATE - INTERVAL '181 days'
|
||||||
),
|
),
|
||||||
sales_data AS (
|
sales_data AS (
|
||||||
SELECT
|
SELECT
|
||||||
@@ -466,12 +496,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
END as avg_price,
|
END as avg_price,
|
||||||
NOW() as last_calculated_at
|
NOW() as last_calculated_at
|
||||||
FROM sales_data
|
FROM sales_data
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (category_id, brand, period_start, period_end) DO UPDATE
|
||||||
avg_daily_sales = VALUES(avg_daily_sales),
|
SET
|
||||||
total_sold = VALUES(total_sold),
|
avg_daily_sales = EXCLUDED.avg_daily_sales,
|
||||||
num_products = VALUES(num_products),
|
total_sold = EXCLUDED.total_sold,
|
||||||
avg_price = VALUES(avg_price),
|
num_products = EXCLUDED.num_products,
|
||||||
last_calculated_at = VALUES(last_calculated_at)
|
avg_price = EXCLUDED.avg_price,
|
||||||
|
last_calculated_at = EXCLUDED.last_calculated_at
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 1.0);
|
processedCount = Math.floor(totalProducts * 1.0);
|
||||||
@@ -498,7 +529,8 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||||
VALUES ('category_metrics', NOW())
|
VALUES ('category_metrics', NOW())
|
||||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
ON CONFLICT (module_name) DO UPDATE
|
||||||
|
SET last_calculation_timestamp = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
return {
|
return {
|
||||||
|
|||||||
@@ -32,13 +32,13 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Get order count that will be processed
|
// Get order count that will be processed
|
||||||
const [orderCount] = await connection.query(`
|
const orderCount = await connection.query(`
|
||||||
SELECT COUNT(*) as count
|
SELECT COUNT(*) as count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
|
AND DATE(o.date) >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
`);
|
`);
|
||||||
processedOrders = orderCount[0].count;
|
processedOrders = parseInt(orderCount.rows[0].count);
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'running',
|
status: 'running',
|
||||||
@@ -67,27 +67,28 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
|||||||
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
||||||
MIN(o.date) as first_sale_date,
|
MIN(o.date) as first_sale_date,
|
||||||
MAX(o.date) as last_sale_date,
|
MAX(o.date) as last_sale_date,
|
||||||
DATEDIFF(MAX(o.date), MIN(o.date)) + 1 as calculation_period_days,
|
EXTRACT(DAY FROM (MAX(o.date)::timestamp with time zone - MIN(o.date)::timestamp with time zone)) + 1 as calculation_period_days,
|
||||||
COUNT(DISTINCT DATE(o.date)) as active_days
|
COUNT(DISTINCT DATE(o.date)) as active_days
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
|
AND DATE(o.date) >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
GROUP BY p.pid
|
GROUP BY p.pid, p.cost_price, p.stock_quantity
|
||||||
)
|
)
|
||||||
UPDATE product_metrics pm
|
UPDATE product_metrics pm
|
||||||
JOIN product_financials pf ON pm.pid = pf.pid
|
|
||||||
SET
|
SET
|
||||||
pm.inventory_value = COALESCE(pf.inventory_value, 0),
|
inventory_value = COALESCE(pf.inventory_value, 0),
|
||||||
pm.total_revenue = COALESCE(pf.total_revenue, 0),
|
total_revenue = COALESCE(pf.total_revenue, 0),
|
||||||
pm.cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0),
|
cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0),
|
||||||
pm.gross_profit = COALESCE(pf.gross_profit, 0),
|
gross_profit = COALESCE(pf.gross_profit, 0),
|
||||||
pm.gmroi = CASE
|
gmroi = CASE
|
||||||
WHEN COALESCE(pf.inventory_value, 0) > 0 AND pf.active_days > 0 THEN
|
WHEN COALESCE(pf.inventory_value, 0) > 0 AND pf.active_days > 0 THEN
|
||||||
(COALESCE(pf.gross_profit, 0) * (365.0 / pf.active_days)) / COALESCE(pf.inventory_value, 0)
|
(COALESCE(pf.gross_profit, 0) * (365.0 / pf.active_days)) / COALESCE(pf.inventory_value, 0)
|
||||||
ELSE 0
|
ELSE 0
|
||||||
END,
|
END,
|
||||||
pm.last_calculated_at = CURRENT_TIMESTAMP
|
last_calculated_at = CURRENT_TIMESTAMP
|
||||||
|
FROM product_financials pf
|
||||||
|
WHERE pm.pid = pf.pid
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.65);
|
processedCount = Math.floor(totalProducts * 0.65);
|
||||||
@@ -119,8 +120,8 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
|||||||
WITH monthly_financials AS (
|
WITH monthly_financials AS (
|
||||||
SELECT
|
SELECT
|
||||||
p.pid,
|
p.pid,
|
||||||
YEAR(o.date) as year,
|
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||||
MONTH(o.date) as month,
|
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||||
p.cost_price * p.stock_quantity as inventory_value,
|
p.cost_price * p.stock_quantity as inventory_value,
|
||||||
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
||||||
COUNT(DISTINCT DATE(o.date)) as active_days,
|
COUNT(DISTINCT DATE(o.date)) as active_days,
|
||||||
@@ -129,19 +130,20 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
|||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
|
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
|
||||||
)
|
)
|
||||||
UPDATE product_time_aggregates pta
|
UPDATE product_time_aggregates pta
|
||||||
JOIN monthly_financials mf ON pta.pid = mf.pid
|
|
||||||
AND pta.year = mf.year
|
|
||||||
AND pta.month = mf.month
|
|
||||||
SET
|
SET
|
||||||
pta.inventory_value = COALESCE(mf.inventory_value, 0),
|
inventory_value = COALESCE(mf.inventory_value, 0),
|
||||||
pta.gmroi = CASE
|
gmroi = CASE
|
||||||
WHEN COALESCE(mf.inventory_value, 0) > 0 AND mf.active_days > 0 THEN
|
WHEN COALESCE(mf.inventory_value, 0) > 0 AND mf.active_days > 0 THEN
|
||||||
(COALESCE(mf.gross_profit, 0) * (365.0 / mf.active_days)) / COALESCE(mf.inventory_value, 0)
|
(COALESCE(mf.gross_profit, 0) * (365.0 / mf.active_days)) / COALESCE(mf.inventory_value, 0)
|
||||||
ELSE 0
|
ELSE 0
|
||||||
END
|
END
|
||||||
|
FROM monthly_financials mf
|
||||||
|
WHERE pta.pid = mf.pid
|
||||||
|
AND pta.year = mf.year
|
||||||
|
AND pta.month = mf.month
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.70);
|
processedCount = Math.floor(totalProducts * 0.70);
|
||||||
@@ -168,7 +170,8 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||||
VALUES ('financial_metrics', NOW())
|
VALUES ('financial_metrics', NOW())
|
||||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
ON CONFLICT (module_name) DO UPDATE
|
||||||
|
SET last_calculation_timestamp = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
return {
|
return {
|
||||||
|
|||||||
@@ -10,20 +10,21 @@ function sanitizeValue(value) {
|
|||||||
}
|
}
|
||||||
|
|
||||||
async function calculateProductMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
|
async function calculateProductMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
|
||||||
const connection = await getConnection();
|
let connection;
|
||||||
let success = false;
|
let success = false;
|
||||||
let processedOrders = 0;
|
let processedOrders = 0;
|
||||||
const BATCH_SIZE = 5000;
|
const BATCH_SIZE = 5000;
|
||||||
|
|
||||||
try {
|
try {
|
||||||
|
connection = await getConnection();
|
||||||
// Skip flags are inherited from the parent scope
|
// Skip flags are inherited from the parent scope
|
||||||
const SKIP_PRODUCT_BASE_METRICS = 0;
|
const SKIP_PRODUCT_BASE_METRICS = 0;
|
||||||
const SKIP_PRODUCT_TIME_AGGREGATES = 0;
|
const SKIP_PRODUCT_TIME_AGGREGATES = 0;
|
||||||
|
|
||||||
// Get total product count if not provided
|
// Get total product count if not provided
|
||||||
if (!totalProducts) {
|
if (!totalProducts) {
|
||||||
const [productCount] = await connection.query('SELECT COUNT(*) as count FROM products');
|
const productCount = await connection.query('SELECT COUNT(*) as count FROM products');
|
||||||
totalProducts = productCount[0].count;
|
totalProducts = parseInt(productCount.rows[0].count);
|
||||||
}
|
}
|
||||||
|
|
||||||
if (isCancelled) {
|
if (isCancelled) {
|
||||||
@@ -52,19 +53,20 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
|
|
||||||
// First ensure all products have a metrics record
|
// First ensure all products have a metrics record
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT IGNORE INTO product_metrics (pid, last_calculated_at)
|
INSERT INTO product_metrics (pid, last_calculated_at)
|
||||||
SELECT pid, NOW()
|
SELECT pid, NOW()
|
||||||
FROM products
|
FROM products
|
||||||
|
ON CONFLICT (pid) DO NOTHING
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Get threshold settings once
|
// Get threshold settings once
|
||||||
const [thresholds] = await connection.query(`
|
const thresholds = await connection.query(`
|
||||||
SELECT critical_days, reorder_days, overstock_days, low_stock_threshold
|
SELECT critical_days, reorder_days, overstock_days, low_stock_threshold
|
||||||
FROM stock_thresholds
|
FROM stock_thresholds
|
||||||
WHERE category_id IS NULL AND vendor IS NULL
|
WHERE category_id IS NULL AND vendor IS NULL
|
||||||
LIMIT 1
|
LIMIT 1
|
||||||
`);
|
`);
|
||||||
const defaultThresholds = thresholds[0];
|
const defaultThresholds = thresholds.rows[0];
|
||||||
|
|
||||||
// Calculate base product metrics
|
// Calculate base product metrics
|
||||||
if (!SKIP_PRODUCT_BASE_METRICS) {
|
if (!SKIP_PRODUCT_BASE_METRICS) {
|
||||||
@@ -85,16 +87,43 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
});
|
});
|
||||||
|
|
||||||
// Get order count that will be processed
|
// Get order count that will be processed
|
||||||
const [orderCount] = await connection.query(`
|
const orderCount = await connection.query(`
|
||||||
SELECT COUNT(*) as count
|
SELECT COUNT(*) as count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
`);
|
`);
|
||||||
processedOrders = orderCount[0].count;
|
processedOrders = parseInt(orderCount.rows[0].count);
|
||||||
|
|
||||||
// Clear temporary tables
|
// Clear temporary tables
|
||||||
await connection.query('TRUNCATE TABLE temp_sales_metrics');
|
await connection.query('DROP TABLE IF EXISTS temp_sales_metrics');
|
||||||
await connection.query('TRUNCATE TABLE temp_purchase_metrics');
|
await connection.query('DROP TABLE IF EXISTS temp_purchase_metrics');
|
||||||
|
|
||||||
|
// Create temp_sales_metrics
|
||||||
|
await connection.query(`
|
||||||
|
CREATE TEMPORARY TABLE temp_sales_metrics (
|
||||||
|
pid BIGINT NOT NULL,
|
||||||
|
daily_sales_avg DECIMAL(10,3),
|
||||||
|
weekly_sales_avg DECIMAL(10,3),
|
||||||
|
monthly_sales_avg DECIMAL(10,3),
|
||||||
|
total_revenue DECIMAL(10,3),
|
||||||
|
avg_margin_percent DECIMAL(10,3),
|
||||||
|
first_sale_date DATE,
|
||||||
|
last_sale_date DATE,
|
||||||
|
PRIMARY KEY (pid)
|
||||||
|
)
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Create temp_purchase_metrics
|
||||||
|
await connection.query(`
|
||||||
|
CREATE TEMPORARY TABLE temp_purchase_metrics (
|
||||||
|
pid BIGINT NOT NULL,
|
||||||
|
avg_lead_time_days DOUBLE PRECISION,
|
||||||
|
last_purchase_date DATE,
|
||||||
|
first_received_date DATE,
|
||||||
|
last_received_date DATE,
|
||||||
|
PRIMARY KEY (pid)
|
||||||
|
)
|
||||||
|
`);
|
||||||
|
|
||||||
// Populate temp_sales_metrics with base stats and sales averages
|
// Populate temp_sales_metrics with base stats and sales averages
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
@@ -115,98 +144,131 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
AND o.canceled = false
|
AND o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURDATE(), INTERVAL 90 DAY)
|
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||||
GROUP BY p.pid
|
GROUP BY p.pid
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Populate temp_purchase_metrics
|
// Populate temp_purchase_metrics with timeout protection
|
||||||
await connection.query(`
|
await Promise.race([
|
||||||
|
connection.query(`
|
||||||
INSERT INTO temp_purchase_metrics
|
INSERT INTO temp_purchase_metrics
|
||||||
SELECT
|
SELECT
|
||||||
p.pid,
|
p.pid,
|
||||||
AVG(DATEDIFF(po.received_date, po.date)) as avg_lead_time_days,
|
AVG(
|
||||||
|
CASE
|
||||||
|
WHEN po.received_date IS NOT NULL AND po.date IS NOT NULL
|
||||||
|
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
|
||||||
|
ELSE NULL
|
||||||
|
END
|
||||||
|
) as avg_lead_time_days,
|
||||||
MAX(po.date) as last_purchase_date,
|
MAX(po.date) as last_purchase_date,
|
||||||
MIN(po.received_date) as first_received_date,
|
MIN(po.received_date) as first_received_date,
|
||||||
MAX(po.received_date) as last_received_date
|
MAX(po.received_date) as last_received_date
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN purchase_orders po ON p.pid = po.pid
|
LEFT JOIN purchase_orders po ON p.pid = po.pid
|
||||||
AND po.received_date IS NOT NULL
|
AND po.received_date IS NOT NULL
|
||||||
AND po.date >= DATE_SUB(CURDATE(), INTERVAL 365 DAY)
|
AND po.date IS NOT NULL
|
||||||
|
AND po.date >= CURRENT_DATE - INTERVAL '365 days'
|
||||||
GROUP BY p.pid
|
GROUP BY p.pid
|
||||||
|
`),
|
||||||
|
new Promise((_, reject) =>
|
||||||
|
setTimeout(() => reject(new Error('Timeout: temp_purchase_metrics query took too long')), 60000)
|
||||||
|
)
|
||||||
|
]).catch(async (err) => {
|
||||||
|
logError(err, 'Error populating temp_purchase_metrics, continuing with empty table');
|
||||||
|
// Create an empty fallback to continue processing
|
||||||
|
await connection.query(`
|
||||||
|
INSERT INTO temp_purchase_metrics
|
||||||
|
SELECT
|
||||||
|
p.pid,
|
||||||
|
30.0 as avg_lead_time_days,
|
||||||
|
NULL as last_purchase_date,
|
||||||
|
NULL as first_received_date,
|
||||||
|
NULL as last_received_date
|
||||||
|
FROM products p
|
||||||
|
LEFT JOIN temp_purchase_metrics tpm ON p.pid = tpm.pid
|
||||||
|
WHERE tpm.pid IS NULL
|
||||||
`);
|
`);
|
||||||
|
});
|
||||||
|
|
||||||
// Process updates in batches
|
// Process updates in batches
|
||||||
let lastPid = 0;
|
let lastPid = 0;
|
||||||
while (true) {
|
let batchCount = 0;
|
||||||
|
const MAX_BATCHES = 1000; // Safety limit for number of batches to prevent infinite loops
|
||||||
|
|
||||||
|
while (batchCount < MAX_BATCHES) {
|
||||||
if (isCancelled) break;
|
if (isCancelled) break;
|
||||||
|
|
||||||
const [batch] = await connection.query(
|
batchCount++;
|
||||||
'SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?',
|
const batch = await connection.query(
|
||||||
|
'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
|
||||||
[lastPid, BATCH_SIZE]
|
[lastPid, BATCH_SIZE]
|
||||||
);
|
);
|
||||||
|
|
||||||
if (batch.length === 0) break;
|
if (batch.rows.length === 0) break;
|
||||||
|
|
||||||
|
// Process the entire batch in a single efficient query
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
UPDATE product_metrics pm
|
UPDATE product_metrics pm
|
||||||
JOIN products p ON pm.pid = p.pid
|
|
||||||
LEFT JOIN temp_sales_metrics sm ON pm.pid = sm.pid
|
|
||||||
LEFT JOIN temp_purchase_metrics lm ON pm.pid = lm.pid
|
|
||||||
SET
|
SET
|
||||||
pm.inventory_value = p.stock_quantity * NULLIF(p.cost_price, 0),
|
inventory_value = p.stock_quantity * NULLIF(p.cost_price, 0),
|
||||||
pm.daily_sales_avg = COALESCE(sm.daily_sales_avg, 0),
|
daily_sales_avg = COALESCE(sm.daily_sales_avg, 0),
|
||||||
pm.weekly_sales_avg = COALESCE(sm.weekly_sales_avg, 0),
|
weekly_sales_avg = COALESCE(sm.weekly_sales_avg, 0),
|
||||||
pm.monthly_sales_avg = COALESCE(sm.monthly_sales_avg, 0),
|
monthly_sales_avg = COALESCE(sm.monthly_sales_avg, 0),
|
||||||
pm.total_revenue = COALESCE(sm.total_revenue, 0),
|
total_revenue = COALESCE(sm.total_revenue, 0),
|
||||||
pm.avg_margin_percent = COALESCE(sm.avg_margin_percent, 0),
|
avg_margin_percent = COALESCE(sm.avg_margin_percent, 0),
|
||||||
pm.first_sale_date = sm.first_sale_date,
|
first_sale_date = sm.first_sale_date,
|
||||||
pm.last_sale_date = sm.last_sale_date,
|
last_sale_date = sm.last_sale_date,
|
||||||
pm.avg_lead_time_days = COALESCE(lm.avg_lead_time_days, 30),
|
avg_lead_time_days = COALESCE(lm.avg_lead_time_days, 30),
|
||||||
pm.days_of_inventory = CASE
|
days_of_inventory = CASE
|
||||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0
|
WHEN COALESCE(sm.daily_sales_avg, 0) > 0
|
||||||
THEN FLOOR(p.stock_quantity / NULLIF(sm.daily_sales_avg, 0))
|
THEN FLOOR(p.stock_quantity / NULLIF(sm.daily_sales_avg, 0))
|
||||||
ELSE NULL
|
ELSE NULL
|
||||||
END,
|
END,
|
||||||
pm.weeks_of_inventory = CASE
|
weeks_of_inventory = CASE
|
||||||
WHEN COALESCE(sm.weekly_sales_avg, 0) > 0
|
WHEN COALESCE(sm.weekly_sales_avg, 0) > 0
|
||||||
THEN FLOOR(p.stock_quantity / NULLIF(sm.weekly_sales_avg, 0))
|
THEN FLOOR(p.stock_quantity / NULLIF(sm.weekly_sales_avg, 0))
|
||||||
ELSE NULL
|
ELSE NULL
|
||||||
END,
|
END,
|
||||||
pm.stock_status = CASE
|
stock_status = CASE
|
||||||
WHEN p.stock_quantity <= 0 THEN 'Out of Stock'
|
WHEN p.stock_quantity <= 0 THEN 'Out of Stock'
|
||||||
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 AND p.stock_quantity <= ? THEN 'Low Stock'
|
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 AND p.stock_quantity <= $1 THEN 'Low Stock'
|
||||||
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 THEN 'In Stock'
|
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 THEN 'In Stock'
|
||||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ? THEN 'Critical'
|
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= $2 THEN 'Critical'
|
||||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ? THEN 'Reorder'
|
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= $3 THEN 'Reorder'
|
||||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ? THEN 'Overstocked'
|
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > $4 THEN 'Overstocked'
|
||||||
ELSE 'Healthy'
|
ELSE 'Healthy'
|
||||||
END,
|
END,
|
||||||
pm.safety_stock = CASE
|
safety_stock = CASE
|
||||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
|
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
|
||||||
CEIL(sm.daily_sales_avg * SQRT(COALESCE(lm.avg_lead_time_days, 30)) * 1.96)
|
CEIL(sm.daily_sales_avg * SQRT(ABS(COALESCE(lm.avg_lead_time_days, 30))) * 1.96)
|
||||||
ELSE ?
|
ELSE $5
|
||||||
END,
|
END,
|
||||||
pm.reorder_point = CASE
|
reorder_point = CASE
|
||||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
|
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
|
||||||
CEIL(sm.daily_sales_avg * COALESCE(lm.avg_lead_time_days, 30)) +
|
CEIL(sm.daily_sales_avg * COALESCE(lm.avg_lead_time_days, 30)) +
|
||||||
CEIL(sm.daily_sales_avg * SQRT(COALESCE(lm.avg_lead_time_days, 30)) * 1.96)
|
CEIL(sm.daily_sales_avg * SQRT(ABS(COALESCE(lm.avg_lead_time_days, 30))) * 1.96)
|
||||||
ELSE ?
|
ELSE $6
|
||||||
END,
|
END,
|
||||||
pm.reorder_qty = CASE
|
reorder_qty = CASE
|
||||||
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND NULLIF(p.cost_price, 0) IS NOT NULL THEN
|
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND NULLIF(p.cost_price, 0) IS NOT NULL AND NULLIF(p.cost_price, 0) > 0 THEN
|
||||||
GREATEST(
|
GREATEST(
|
||||||
CEIL(SQRT((2 * (sm.daily_sales_avg * 365) * 25) / (NULLIF(p.cost_price, 0) * 0.25))),
|
CEIL(SQRT(ABS((2 * (sm.daily_sales_avg * 365) * 25) / (NULLIF(p.cost_price, 0) * 0.25)))),
|
||||||
?
|
$7
|
||||||
)
|
)
|
||||||
ELSE ?
|
ELSE $8
|
||||||
END,
|
END,
|
||||||
pm.overstocked_amt = CASE
|
overstocked_amt = CASE
|
||||||
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ?
|
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > $9
|
||||||
THEN GREATEST(0, p.stock_quantity - CEIL(sm.daily_sales_avg * ?))
|
THEN GREATEST(0, p.stock_quantity - CEIL(sm.daily_sales_avg * $10))
|
||||||
ELSE 0
|
ELSE 0
|
||||||
END,
|
END,
|
||||||
pm.last_calculated_at = NOW()
|
last_calculated_at = NOW()
|
||||||
WHERE p.pid IN (${batch.map(() => '?').join(',')})
|
FROM products p
|
||||||
|
LEFT JOIN temp_sales_metrics sm ON p.pid = sm.pid
|
||||||
|
LEFT JOIN temp_purchase_metrics lm ON p.pid = lm.pid
|
||||||
|
WHERE p.pid = ANY($11::bigint[])
|
||||||
|
AND pm.pid = p.pid
|
||||||
`,
|
`,
|
||||||
[
|
[
|
||||||
defaultThresholds.low_stock_threshold,
|
defaultThresholds.low_stock_threshold,
|
||||||
@@ -219,12 +281,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
defaultThresholds.low_stock_threshold,
|
defaultThresholds.low_stock_threshold,
|
||||||
defaultThresholds.overstock_days,
|
defaultThresholds.overstock_days,
|
||||||
defaultThresholds.overstock_days,
|
defaultThresholds.overstock_days,
|
||||||
...batch.map(row => row.pid)
|
batch.rows.map(row => row.pid)
|
||||||
]
|
]);
|
||||||
);
|
|
||||||
|
|
||||||
lastPid = batch[batch.length - 1].pid;
|
lastPid = batch.rows[batch.rows.length - 1].pid;
|
||||||
processedCount += batch.length;
|
processedCount += batch.rows.length;
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'running',
|
status: 'running',
|
||||||
@@ -243,31 +304,42 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
});
|
});
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Add safety check if the loop processed MAX_BATCHES
|
||||||
|
if (batchCount >= MAX_BATCHES) {
|
||||||
|
logError(new Error(`Reached maximum batch count (${MAX_BATCHES}). Process may have entered an infinite loop.`), 'Batch processing safety limit reached');
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
// Calculate forecast accuracy and bias in batches
|
// Calculate forecast accuracy and bias in batches
|
||||||
lastPid = 0;
|
lastPid = 0;
|
||||||
while (true) {
|
while (true) {
|
||||||
if (isCancelled) break;
|
if (isCancelled) break;
|
||||||
|
|
||||||
const [batch] = await connection.query(
|
const batch = await connection.query(
|
||||||
'SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?',
|
'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
|
||||||
[lastPid, BATCH_SIZE]
|
[lastPid, BATCH_SIZE]
|
||||||
);
|
);
|
||||||
|
|
||||||
if (batch.length === 0) break;
|
if (batch.rows.length === 0) break;
|
||||||
|
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
UPDATE product_metrics pm
|
UPDATE product_metrics pm
|
||||||
JOIN (
|
SET
|
||||||
|
forecast_accuracy = GREATEST(0, 100 - LEAST(fa.avg_forecast_error, 100)),
|
||||||
|
forecast_bias = GREATEST(-100, LEAST(fa.avg_forecast_bias, 100)),
|
||||||
|
last_forecast_date = fa.last_forecast_date,
|
||||||
|
last_calculated_at = NOW()
|
||||||
|
FROM (
|
||||||
SELECT
|
SELECT
|
||||||
sf.pid,
|
sf.pid,
|
||||||
AVG(CASE
|
AVG(CASE
|
||||||
WHEN o.quantity > 0
|
WHEN o.quantity > 0
|
||||||
THEN ABS(sf.forecast_units - o.quantity) / o.quantity * 100
|
THEN ABS(sf.forecast_quantity - o.quantity) / o.quantity * 100
|
||||||
ELSE 100
|
ELSE 100
|
||||||
END) as avg_forecast_error,
|
END) as avg_forecast_error,
|
||||||
AVG(CASE
|
AVG(CASE
|
||||||
WHEN o.quantity > 0
|
WHEN o.quantity > 0
|
||||||
THEN (sf.forecast_units - o.quantity) / o.quantity * 100
|
THEN (sf.forecast_quantity - o.quantity) / o.quantity * 100
|
||||||
ELSE 0
|
ELSE 0
|
||||||
END) as avg_forecast_bias,
|
END) as avg_forecast_bias,
|
||||||
MAX(sf.forecast_date) as last_forecast_date
|
MAX(sf.forecast_date) as last_forecast_date
|
||||||
@@ -275,20 +347,14 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
JOIN orders o ON sf.pid = o.pid
|
JOIN orders o ON sf.pid = o.pid
|
||||||
AND DATE(o.date) = sf.forecast_date
|
AND DATE(o.date) = sf.forecast_date
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND sf.forecast_date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
AND sf.forecast_date >= CURRENT_DATE - INTERVAL '90 days'
|
||||||
AND sf.pid IN (?)
|
AND sf.pid = ANY($1::bigint[])
|
||||||
GROUP BY sf.pid
|
GROUP BY sf.pid
|
||||||
) fa ON pm.pid = fa.pid
|
) fa
|
||||||
SET
|
WHERE pm.pid = fa.pid
|
||||||
pm.forecast_accuracy = GREATEST(0, 100 - LEAST(fa.avg_forecast_error, 100)),
|
`, [batch.rows.map(row => row.pid)]);
|
||||||
pm.forecast_bias = GREATEST(-100, LEAST(fa.avg_forecast_bias, 100)),
|
|
||||||
pm.last_forecast_date = fa.last_forecast_date,
|
|
||||||
pm.last_calculated_at = NOW()
|
|
||||||
WHERE pm.pid IN (?)
|
|
||||||
`, [batch.map(row => row.pid), batch.map(row => row.pid)]);
|
|
||||||
|
|
||||||
lastPid = batch[batch.length - 1].pid;
|
lastPid = batch.rows[batch.rows.length - 1].pid;
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// Calculate product time aggregates
|
// Calculate product time aggregates
|
||||||
@@ -326,11 +392,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
p.pid,
|
p.pid,
|
||||||
YEAR(o.date) as year,
|
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||||
MONTH(o.date) as month,
|
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||||
SUM(o.quantity) as total_quantity_sold,
|
SUM(o.quantity) as total_quantity_sold,
|
||||||
SUM(o.quantity * o.price) as total_revenue,
|
SUM(o.price * o.quantity) as total_revenue,
|
||||||
SUM(o.quantity * p.cost_price) as total_cost,
|
SUM(p.cost_price * o.quantity) as total_cost,
|
||||||
COUNT(DISTINCT o.order_number) as order_count,
|
COUNT(DISTINCT o.order_number) as order_count,
|
||||||
AVG(o.price) as avg_price,
|
AVG(o.price) as avg_price,
|
||||||
CASE
|
CASE
|
||||||
@@ -346,17 +412,18 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
END as gmroi
|
END as gmroi
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
|
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
|
||||||
WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
|
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (pid, year, month) DO UPDATE
|
||||||
total_quantity_sold = VALUES(total_quantity_sold),
|
SET
|
||||||
total_revenue = VALUES(total_revenue),
|
total_quantity_sold = EXCLUDED.total_quantity_sold,
|
||||||
total_cost = VALUES(total_cost),
|
total_revenue = EXCLUDED.total_revenue,
|
||||||
order_count = VALUES(order_count),
|
total_cost = EXCLUDED.total_cost,
|
||||||
avg_price = VALUES(avg_price),
|
order_count = EXCLUDED.order_count,
|
||||||
profit_margin = VALUES(profit_margin),
|
avg_price = EXCLUDED.avg_price,
|
||||||
inventory_value = VALUES(inventory_value),
|
profit_margin = EXCLUDED.profit_margin,
|
||||||
gmroi = VALUES(gmroi)
|
inventory_value = EXCLUDED.inventory_value,
|
||||||
|
gmroi = EXCLUDED.gmroi
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.6);
|
processedCount = Math.floor(totalProducts * 0.6);
|
||||||
@@ -418,11 +485,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
success
|
success
|
||||||
};
|
};
|
||||||
|
|
||||||
const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
|
const abcConfig = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
|
||||||
const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };
|
const abcThresholds = abcConfig.rows[0] || { a_threshold: 20, b_threshold: 50 };
|
||||||
|
|
||||||
// First, create and populate the rankings table with an index
|
// First, create and populate the rankings table with an index
|
||||||
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
|
await connection.query('DROP TABLE IF EXISTS temp_revenue_ranks');
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
CREATE TEMPORARY TABLE temp_revenue_ranks (
|
CREATE TEMPORARY TABLE temp_revenue_ranks (
|
||||||
pid BIGINT NOT NULL,
|
pid BIGINT NOT NULL,
|
||||||
@@ -431,12 +498,12 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
dense_rank_num INT,
|
dense_rank_num INT,
|
||||||
percentile DECIMAL(5,2),
|
percentile DECIMAL(5,2),
|
||||||
total_count INT,
|
total_count INT,
|
||||||
PRIMARY KEY (pid),
|
PRIMARY KEY (pid)
|
||||||
INDEX (rank_num),
|
)
|
||||||
INDEX (dense_rank_num),
|
|
||||||
INDEX (percentile)
|
|
||||||
) ENGINE=MEMORY
|
|
||||||
`);
|
`);
|
||||||
|
await connection.query('CREATE INDEX ON temp_revenue_ranks (rank_num)');
|
||||||
|
await connection.query('CREATE INDEX ON temp_revenue_ranks (dense_rank_num)');
|
||||||
|
await connection.query('CREATE INDEX ON temp_revenue_ranks (percentile)');
|
||||||
|
|
||||||
// Calculate rankings with proper tie handling
|
// Calculate rankings with proper tie handling
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
@@ -463,58 +530,74 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
`);
|
`);
|
||||||
|
|
||||||
// Get total count for percentage calculation
|
// Get total count for percentage calculation
|
||||||
const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
|
const rankingCount = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
|
||||||
const totalCount = rankingCount[0].total_count || 1;
|
const totalCount = parseInt(rankingCount.rows[0].total_count) || 1;
|
||||||
const max_rank = totalCount;
|
|
||||||
|
|
||||||
// Process updates in batches
|
// Process updates in batches
|
||||||
let abcProcessedCount = 0;
|
let abcProcessedCount = 0;
|
||||||
const batchSize = 5000;
|
const batchSize = 5000;
|
||||||
|
const maxPid = await connection.query('SELECT MAX(pid) as max_pid FROM products');
|
||||||
|
const maxProductId = parseInt(maxPid.rows[0].max_pid);
|
||||||
|
|
||||||
while (true) {
|
while (abcProcessedCount < maxProductId) {
|
||||||
if (isCancelled) return {
|
if (isCancelled) return {
|
||||||
processedProducts: processedCount,
|
processedProducts: processedCount,
|
||||||
processedOrders,
|
processedOrders,
|
||||||
processedPurchaseOrders: 0, // This module doesn't process POs
|
processedPurchaseOrders: 0,
|
||||||
success
|
success
|
||||||
};
|
};
|
||||||
|
|
||||||
// Get a batch of PIDs that need updating
|
// Get a batch of PIDs that need updating
|
||||||
const [pids] = await connection.query(`
|
const pids = await connection.query(`
|
||||||
SELECT pm.pid
|
SELECT pm.pid
|
||||||
FROM product_metrics pm
|
FROM product_metrics pm
|
||||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
||||||
WHERE pm.abc_class IS NULL
|
WHERE pm.pid > $1
|
||||||
|
AND (pm.abc_class IS NULL
|
||||||
OR pm.abc_class !=
|
OR pm.abc_class !=
|
||||||
CASE
|
CASE
|
||||||
WHEN tr.pid IS NULL THEN 'C'
|
WHEN tr.pid IS NULL THEN 'C'
|
||||||
WHEN tr.percentile <= ? THEN 'A'
|
WHEN tr.percentile <= $2 THEN 'A'
|
||||||
WHEN tr.percentile <= ? THEN 'B'
|
WHEN tr.percentile <= $3 THEN 'B'
|
||||||
ELSE 'C'
|
ELSE 'C'
|
||||||
END
|
END)
|
||||||
LIMIT ?
|
ORDER BY pm.pid
|
||||||
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, batchSize]);
|
LIMIT $4
|
||||||
|
`, [abcProcessedCount, abcThresholds.a_threshold, abcThresholds.b_threshold, batchSize]);
|
||||||
|
|
||||||
if (pids.length === 0) break;
|
if (pids.rows.length === 0) break;
|
||||||
|
|
||||||
|
const pidValues = pids.rows.map(row => row.pid);
|
||||||
|
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
UPDATE product_metrics pm
|
UPDATE product_metrics pm
|
||||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
SET abc_class =
|
||||||
SET pm.abc_class =
|
|
||||||
CASE
|
CASE
|
||||||
WHEN tr.pid IS NULL THEN 'C'
|
WHEN tr.pid IS NULL THEN 'C'
|
||||||
WHEN tr.percentile <= ? THEN 'A'
|
WHEN tr.percentile <= $1 THEN 'A'
|
||||||
WHEN tr.percentile <= ? THEN 'B'
|
WHEN tr.percentile <= $2 THEN 'B'
|
||||||
ELSE 'C'
|
ELSE 'C'
|
||||||
END,
|
END,
|
||||||
pm.last_calculated_at = NOW()
|
last_calculated_at = NOW()
|
||||||
WHERE pm.pid IN (?)
|
FROM (SELECT pid, percentile FROM temp_revenue_ranks) tr
|
||||||
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, pids.map(row => row.pid)]);
|
WHERE pm.pid = tr.pid AND pm.pid = ANY($3::bigint[])
|
||||||
|
OR (pm.pid = ANY($3::bigint[]) AND tr.pid IS NULL)
|
||||||
|
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, pidValues]);
|
||||||
|
|
||||||
// Now update turnover rate with proper handling of zero inventory periods
|
// Now update turnover rate with proper handling of zero inventory periods
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
UPDATE product_metrics pm
|
UPDATE product_metrics pm
|
||||||
JOIN (
|
SET
|
||||||
|
turnover_rate = CASE
|
||||||
|
WHEN sales.avg_nonzero_stock > 0 AND sales.active_days > 0
|
||||||
|
THEN LEAST(
|
||||||
|
(sales.total_sold / sales.avg_nonzero_stock) * (365.0 / sales.active_days),
|
||||||
|
999.99
|
||||||
|
)
|
||||||
|
ELSE 0
|
||||||
|
END,
|
||||||
|
last_calculated_at = NOW()
|
||||||
|
FROM (
|
||||||
SELECT
|
SELECT
|
||||||
o.pid,
|
o.pid,
|
||||||
SUM(o.quantity) as total_sold,
|
SUM(o.quantity) as total_sold,
|
||||||
@@ -526,22 +609,33 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
FROM orders o
|
FROM orders o
|
||||||
JOIN products p ON o.pid = p.pid
|
JOIN products p ON o.pid = p.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||||
AND o.pid IN (?)
|
AND o.pid = ANY($1::bigint[])
|
||||||
GROUP BY o.pid
|
GROUP BY o.pid
|
||||||
) sales ON pm.pid = sales.pid
|
) sales
|
||||||
SET
|
WHERE pm.pid = sales.pid
|
||||||
pm.turnover_rate = CASE
|
`, [pidValues]);
|
||||||
WHEN sales.avg_nonzero_stock > 0 AND sales.active_days > 0
|
|
||||||
THEN LEAST(
|
abcProcessedCount = pids.rows[pids.rows.length - 1].pid;
|
||||||
(sales.total_sold / sales.avg_nonzero_stock) * (365.0 / sales.active_days),
|
|
||||||
999.99
|
// Calculate progress proportionally to total products
|
||||||
)
|
processedCount = Math.floor(totalProducts * (0.60 + (abcProcessedCount / maxProductId) * 0.2));
|
||||||
ELSE 0
|
|
||||||
END,
|
outputProgress({
|
||||||
pm.last_calculated_at = NOW()
|
status: 'running',
|
||||||
WHERE pm.pid IN (?)
|
operation: 'ABC classification progress',
|
||||||
`, [pids.map(row => row.pid), pids.map(row => row.pid)]);
|
current: processedCount,
|
||||||
|
total: totalProducts,
|
||||||
|
elapsed: formatElapsedTime(startTime),
|
||||||
|
remaining: estimateRemaining(startTime, processedCount, totalProducts),
|
||||||
|
rate: calculateRate(startTime, processedCount),
|
||||||
|
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
|
||||||
|
timing: {
|
||||||
|
start_time: new Date(startTime).toISOString(),
|
||||||
|
end_time: new Date().toISOString(),
|
||||||
|
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||||
|
}
|
||||||
|
});
|
||||||
}
|
}
|
||||||
|
|
||||||
// If we get here, everything completed successfully
|
// If we get here, everything completed successfully
|
||||||
@@ -551,7 +645,8 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||||
VALUES ('product_metrics', NOW())
|
VALUES ('product_metrics', NOW())
|
||||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
ON CONFLICT (module_name) DO UPDATE
|
||||||
|
SET last_calculation_timestamp = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
return {
|
return {
|
||||||
@@ -566,7 +661,16 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
|||||||
logError(error, 'Error calculating product metrics');
|
logError(error, 'Error calculating product metrics');
|
||||||
throw error;
|
throw error;
|
||||||
} finally {
|
} finally {
|
||||||
|
// Always clean up temporary tables, even if an error occurred
|
||||||
if (connection) {
|
if (connection) {
|
||||||
|
try {
|
||||||
|
await connection.query('DROP TABLE IF EXISTS temp_sales_metrics');
|
||||||
|
await connection.query('DROP TABLE IF EXISTS temp_purchase_metrics');
|
||||||
|
} catch (err) {
|
||||||
|
console.error('Error cleaning up temporary tables:', err);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Make sure to release the connection
|
||||||
connection.release();
|
connection.release();
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -32,13 +32,13 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Get order count that will be processed
|
// Get order count that will be processed
|
||||||
const [orderCount] = await connection.query(`
|
const orderCount = await connection.query(`
|
||||||
SELECT COUNT(*) as count
|
SELECT COUNT(*) as count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||||
`);
|
`);
|
||||||
processedOrders = orderCount[0].count;
|
processedOrders = parseInt(orderCount.rows[0].count);
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'running',
|
status: 'running',
|
||||||
@@ -69,15 +69,15 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT INTO temp_forecast_dates
|
INSERT INTO temp_forecast_dates
|
||||||
SELECT
|
SELECT
|
||||||
DATE_ADD(CURRENT_DATE, INTERVAL n DAY) as forecast_date,
|
CURRENT_DATE + (n || ' days')::INTERVAL as forecast_date,
|
||||||
DAYOFWEEK(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as day_of_week,
|
EXTRACT(DOW FROM CURRENT_DATE + (n || ' days')::INTERVAL) + 1 as day_of_week,
|
||||||
MONTH(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as month
|
EXTRACT(MONTH FROM CURRENT_DATE + (n || ' days')::INTERVAL) as month
|
||||||
FROM (
|
FROM (
|
||||||
SELECT a.N + b.N * 10 as n
|
SELECT a.n + b.n * 10 as n
|
||||||
FROM
|
FROM
|
||||||
(SELECT 0 as N UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
|
(SELECT 0 as n UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
|
||||||
SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) a,
|
SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) a,
|
||||||
(SELECT 0 as N UNION SELECT 1 UNION SELECT 2) b
|
(SELECT 0 as n UNION SELECT 1 UNION SELECT 2) b
|
||||||
ORDER BY n
|
ORDER BY n
|
||||||
LIMIT 31
|
LIMIT 31
|
||||||
) numbers
|
) numbers
|
||||||
@@ -109,17 +109,17 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
|
|
||||||
// Create temporary table for daily sales stats
|
// Create temporary table for daily sales stats
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_daily_sales AS
|
CREATE TEMPORARY TABLE temp_daily_sales AS
|
||||||
SELECT
|
SELECT
|
||||||
o.pid,
|
o.pid,
|
||||||
DAYOFWEEK(o.date) as day_of_week,
|
EXTRACT(DOW FROM o.date) + 1 as day_of_week,
|
||||||
SUM(o.quantity) as daily_quantity,
|
SUM(o.quantity) as daily_quantity,
|
||||||
SUM(o.price * o.quantity) as daily_revenue,
|
SUM(o.price * o.quantity) as daily_revenue,
|
||||||
COUNT(DISTINCT DATE(o.date)) as day_count
|
COUNT(DISTINCT DATE(o.date)) as day_count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||||
GROUP BY o.pid, DAYOFWEEK(o.date)
|
GROUP BY o.pid, EXTRACT(DOW FROM o.date) + 1
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.94);
|
processedCount = Math.floor(totalProducts * 0.94);
|
||||||
@@ -148,7 +148,7 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
|
|
||||||
// Create temporary table for product stats
|
// Create temporary table for product stats
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_product_stats AS
|
CREATE TEMPORARY TABLE temp_product_stats AS
|
||||||
SELECT
|
SELECT
|
||||||
pid,
|
pid,
|
||||||
AVG(daily_revenue) as overall_avg_revenue,
|
AVG(daily_revenue) as overall_avg_revenue,
|
||||||
@@ -186,10 +186,9 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
INSERT INTO sales_forecasts (
|
INSERT INTO sales_forecasts (
|
||||||
pid,
|
pid,
|
||||||
forecast_date,
|
forecast_date,
|
||||||
forecast_units,
|
forecast_quantity,
|
||||||
forecast_revenue,
|
|
||||||
confidence_level,
|
confidence_level,
|
||||||
last_calculated_at
|
created_at
|
||||||
)
|
)
|
||||||
WITH daily_stats AS (
|
WITH daily_stats AS (
|
||||||
SELECT
|
SELECT
|
||||||
@@ -223,29 +222,9 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 1.0 THEN 0.9
|
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 1.0 THEN 0.9
|
||||||
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 0.5 THEN 0.95
|
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 0.5 THEN 0.95
|
||||||
ELSE 1.0
|
ELSE 1.0
|
||||||
END,
|
END
|
||||||
2
|
|
||||||
)
|
)
|
||||||
) as forecast_units,
|
) as forecast_quantity,
|
||||||
GREATEST(0,
|
|
||||||
ROUND(
|
|
||||||
COALESCE(
|
|
||||||
CASE
|
|
||||||
WHEN ds.data_points >= 4 THEN ds.avg_daily_revenue
|
|
||||||
ELSE ps.overall_avg_revenue
|
|
||||||
END *
|
|
||||||
(1 + COALESCE(sf.seasonality_factor, 0)) *
|
|
||||||
CASE
|
|
||||||
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.5 THEN 0.85
|
|
||||||
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.0 THEN 0.9
|
|
||||||
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 0.5 THEN 0.95
|
|
||||||
ELSE 1.0
|
|
||||||
END,
|
|
||||||
0
|
|
||||||
),
|
|
||||||
2
|
|
||||||
)
|
|
||||||
) as forecast_revenue,
|
|
||||||
CASE
|
CASE
|
||||||
WHEN ds.total_days >= 60 AND ds.daily_variance_ratio < 0.5 THEN 90
|
WHEN ds.total_days >= 60 AND ds.daily_variance_ratio < 0.5 THEN 90
|
||||||
WHEN ds.total_days >= 60 THEN 85
|
WHEN ds.total_days >= 60 THEN 85
|
||||||
@@ -255,17 +234,18 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
WHEN ds.total_days >= 14 THEN 65
|
WHEN ds.total_days >= 14 THEN 65
|
||||||
ELSE 60
|
ELSE 60
|
||||||
END as confidence_level,
|
END as confidence_level,
|
||||||
NOW() as last_calculated_at
|
NOW() as created_at
|
||||||
FROM daily_stats ds
|
FROM daily_stats ds
|
||||||
JOIN temp_product_stats ps ON ds.pid = ps.pid
|
JOIN temp_product_stats ps ON ds.pid = ps.pid
|
||||||
CROSS JOIN temp_forecast_dates fd
|
CROSS JOIN temp_forecast_dates fd
|
||||||
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
|
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
|
||||||
GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor
|
GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor,
|
||||||
ON DUPLICATE KEY UPDATE
|
ds.avg_daily_qty, ds.std_daily_qty, ds.avg_daily_qty, ds.total_days, ds.daily_variance_ratio
|
||||||
forecast_units = VALUES(forecast_units),
|
ON CONFLICT (pid, forecast_date) DO UPDATE
|
||||||
forecast_revenue = VALUES(forecast_revenue),
|
SET
|
||||||
confidence_level = VALUES(confidence_level),
|
forecast_quantity = EXCLUDED.forecast_quantity,
|
||||||
last_calculated_at = NOW()
|
confidence_level = EXCLUDED.confidence_level,
|
||||||
|
created_at = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.98);
|
processedCount = Math.floor(totalProducts * 0.98);
|
||||||
@@ -294,22 +274,22 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
|
|
||||||
// Create temporary table for category stats
|
// Create temporary table for category stats
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_sales AS
|
CREATE TEMPORARY TABLE temp_category_sales AS
|
||||||
SELECT
|
SELECT
|
||||||
pc.cat_id,
|
pc.cat_id,
|
||||||
DAYOFWEEK(o.date) as day_of_week,
|
EXTRACT(DOW FROM o.date) + 1 as day_of_week,
|
||||||
SUM(o.quantity) as daily_quantity,
|
SUM(o.quantity) as daily_quantity,
|
||||||
SUM(o.price * o.quantity) as daily_revenue,
|
SUM(o.price * o.quantity) as daily_revenue,
|
||||||
COUNT(DISTINCT DATE(o.date)) as day_count
|
COUNT(DISTINCT DATE(o.date)) as day_count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
JOIN product_categories pc ON o.pid = pc.pid
|
JOIN product_categories pc ON o.pid = pc.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
|
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
|
||||||
GROUP BY pc.cat_id, DAYOFWEEK(o.date)
|
GROUP BY pc.cat_id, EXTRACT(DOW FROM o.date) + 1
|
||||||
`);
|
`);
|
||||||
|
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_stats AS
|
CREATE TEMPORARY TABLE temp_category_stats AS
|
||||||
SELECT
|
SELECT
|
||||||
cat_id,
|
cat_id,
|
||||||
AVG(daily_revenue) as overall_avg_revenue,
|
AVG(daily_revenue) as overall_avg_revenue,
|
||||||
@@ -350,10 +330,10 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
forecast_units,
|
forecast_units,
|
||||||
forecast_revenue,
|
forecast_revenue,
|
||||||
confidence_level,
|
confidence_level,
|
||||||
last_calculated_at
|
created_at
|
||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
cs.cat_id as category_id,
|
cs.cat_id::bigint as category_id,
|
||||||
fd.forecast_date,
|
fd.forecast_date,
|
||||||
GREATEST(0,
|
GREATEST(0,
|
||||||
AVG(cs.daily_quantity) *
|
AVG(cs.daily_quantity) *
|
||||||
@@ -366,7 +346,7 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
ELSE ct.overall_avg_revenue
|
ELSE ct.overall_avg_revenue
|
||||||
END *
|
END *
|
||||||
(1 + COALESCE(sf.seasonality_factor, 0)) *
|
(1 + COALESCE(sf.seasonality_factor, 0)) *
|
||||||
(0.95 + (RAND() * 0.1)),
|
(0.95 + (random() * 0.1)),
|
||||||
0
|
0
|
||||||
)
|
)
|
||||||
) as forecast_revenue,
|
) as forecast_revenue,
|
||||||
@@ -376,27 +356,34 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
WHEN ct.total_days >= 14 THEN 70
|
WHEN ct.total_days >= 14 THEN 70
|
||||||
ELSE 60
|
ELSE 60
|
||||||
END as confidence_level,
|
END as confidence_level,
|
||||||
NOW() as last_calculated_at
|
NOW() as created_at
|
||||||
FROM temp_category_sales cs
|
FROM temp_category_sales cs
|
||||||
JOIN temp_category_stats ct ON cs.cat_id = ct.cat_id
|
JOIN temp_category_stats ct ON cs.cat_id = ct.cat_id
|
||||||
CROSS JOIN temp_forecast_dates fd
|
CROSS JOIN temp_forecast_dates fd
|
||||||
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
|
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
|
||||||
GROUP BY cs.cat_id, fd.forecast_date, ct.overall_avg_revenue, ct.total_days, sf.seasonality_factor
|
GROUP BY
|
||||||
|
cs.cat_id,
|
||||||
|
fd.forecast_date,
|
||||||
|
ct.overall_avg_revenue,
|
||||||
|
ct.total_days,
|
||||||
|
sf.seasonality_factor,
|
||||||
|
sf.month
|
||||||
HAVING AVG(cs.daily_quantity) > 0
|
HAVING AVG(cs.daily_quantity) > 0
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (category_id, forecast_date) DO UPDATE
|
||||||
forecast_units = VALUES(forecast_units),
|
SET
|
||||||
forecast_revenue = VALUES(forecast_revenue),
|
forecast_units = EXCLUDED.forecast_units,
|
||||||
confidence_level = VALUES(confidence_level),
|
forecast_revenue = EXCLUDED.forecast_revenue,
|
||||||
last_calculated_at = NOW()
|
confidence_level = EXCLUDED.confidence_level,
|
||||||
|
created_at = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Clean up temporary tables
|
// Clean up temporary tables
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
DROP TEMPORARY TABLE IF EXISTS temp_forecast_dates;
|
DROP TABLE IF EXISTS temp_forecast_dates;
|
||||||
DROP TEMPORARY TABLE IF EXISTS temp_daily_sales;
|
DROP TABLE IF EXISTS temp_daily_sales;
|
||||||
DROP TEMPORARY TABLE IF EXISTS temp_product_stats;
|
DROP TABLE IF EXISTS temp_product_stats;
|
||||||
DROP TEMPORARY TABLE IF EXISTS temp_category_sales;
|
DROP TABLE IF EXISTS temp_category_sales;
|
||||||
DROP TEMPORARY TABLE IF EXISTS temp_category_stats;
|
DROP TABLE IF EXISTS temp_category_stats;
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 1.0);
|
processedCount = Math.floor(totalProducts * 1.0);
|
||||||
@@ -423,7 +410,8 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||||
VALUES ('sales_forecasts', NOW())
|
VALUES ('sales_forecasts', NOW())
|
||||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
ON CONFLICT (module_name) DO UPDATE
|
||||||
|
SET last_calculation_timestamp = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
return {
|
return {
|
||||||
|
|||||||
@@ -32,12 +32,12 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Get order count that will be processed
|
// Get order count that will be processed
|
||||||
const [orderCount] = await connection.query(`
|
const orderCount = await connection.query(`
|
||||||
SELECT COUNT(*) as count
|
SELECT COUNT(*) as count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
`);
|
`);
|
||||||
processedOrders = orderCount[0].count;
|
processedOrders = parseInt(orderCount.rows[0].count);
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'running',
|
status: 'running',
|
||||||
@@ -75,8 +75,8 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
|||||||
WITH monthly_sales AS (
|
WITH monthly_sales AS (
|
||||||
SELECT
|
SELECT
|
||||||
o.pid,
|
o.pid,
|
||||||
YEAR(o.date) as year,
|
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||||
MONTH(o.date) as month,
|
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||||
SUM(o.quantity) as total_quantity_sold,
|
SUM(o.quantity) as total_quantity_sold,
|
||||||
SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) as total_revenue,
|
SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) as total_revenue,
|
||||||
SUM(COALESCE(p.cost_price, 0) * o.quantity) as total_cost,
|
SUM(COALESCE(p.cost_price, 0) * o.quantity) as total_cost,
|
||||||
@@ -93,17 +93,17 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
|||||||
FROM orders o
|
FROM orders o
|
||||||
JOIN products p ON o.pid = p.pid
|
JOIN products p ON o.pid = p.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
GROUP BY o.pid, YEAR(o.date), MONTH(o.date)
|
GROUP BY o.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
|
||||||
),
|
),
|
||||||
monthly_stock AS (
|
monthly_stock AS (
|
||||||
SELECT
|
SELECT
|
||||||
pid,
|
pid,
|
||||||
YEAR(date) as year,
|
EXTRACT(YEAR FROM date::timestamp with time zone) as year,
|
||||||
MONTH(date) as month,
|
EXTRACT(MONTH FROM date::timestamp with time zone) as month,
|
||||||
SUM(received) as stock_received,
|
SUM(received) as stock_received,
|
||||||
SUM(ordered) as stock_ordered
|
SUM(ordered) as stock_ordered
|
||||||
FROM purchase_orders
|
FROM purchase_orders
|
||||||
GROUP BY pid, YEAR(date), MONTH(date)
|
GROUP BY pid, EXTRACT(YEAR FROM date::timestamp with time zone), EXTRACT(MONTH FROM date::timestamp with time zone)
|
||||||
),
|
),
|
||||||
base_products AS (
|
base_products AS (
|
||||||
SELECT
|
SELECT
|
||||||
@@ -197,17 +197,18 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
|||||||
AND s.year = ms.year
|
AND s.year = ms.year
|
||||||
AND s.month = ms.month
|
AND s.month = ms.month
|
||||||
)
|
)
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (pid, year, month) DO UPDATE
|
||||||
total_quantity_sold = VALUES(total_quantity_sold),
|
SET
|
||||||
total_revenue = VALUES(total_revenue),
|
total_quantity_sold = EXCLUDED.total_quantity_sold,
|
||||||
total_cost = VALUES(total_cost),
|
total_revenue = EXCLUDED.total_revenue,
|
||||||
order_count = VALUES(order_count),
|
total_cost = EXCLUDED.total_cost,
|
||||||
stock_received = VALUES(stock_received),
|
order_count = EXCLUDED.order_count,
|
||||||
stock_ordered = VALUES(stock_ordered),
|
stock_received = EXCLUDED.stock_received,
|
||||||
avg_price = VALUES(avg_price),
|
stock_ordered = EXCLUDED.stock_ordered,
|
||||||
profit_margin = VALUES(profit_margin),
|
avg_price = EXCLUDED.avg_price,
|
||||||
inventory_value = VALUES(inventory_value),
|
profit_margin = EXCLUDED.profit_margin,
|
||||||
gmroi = VALUES(gmroi)
|
inventory_value = EXCLUDED.inventory_value,
|
||||||
|
gmroi = EXCLUDED.gmroi
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.60);
|
processedCount = Math.floor(totalProducts * 0.60);
|
||||||
@@ -237,23 +238,23 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
|||||||
// Update with financial metrics
|
// Update with financial metrics
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
UPDATE product_time_aggregates pta
|
UPDATE product_time_aggregates pta
|
||||||
JOIN (
|
SET inventory_value = COALESCE(fin.inventory_value, 0)
|
||||||
|
FROM (
|
||||||
SELECT
|
SELECT
|
||||||
p.pid,
|
p.pid,
|
||||||
YEAR(o.date) as year,
|
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||||
MONTH(o.date) as month,
|
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||||
p.cost_price * p.stock_quantity as inventory_value,
|
p.cost_price * p.stock_quantity as inventory_value,
|
||||||
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
|
||||||
COUNT(DISTINCT DATE(o.date)) as active_days
|
COUNT(DISTINCT DATE(o.date)) as active_days
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
|
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
|
||||||
) fin ON pta.pid = fin.pid
|
) fin
|
||||||
|
WHERE pta.pid = fin.pid
|
||||||
AND pta.year = fin.year
|
AND pta.year = fin.year
|
||||||
AND pta.month = fin.month
|
AND pta.month = fin.month
|
||||||
SET
|
|
||||||
pta.inventory_value = COALESCE(fin.inventory_value, 0)
|
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.65);
|
processedCount = Math.floor(totalProducts * 0.65);
|
||||||
@@ -280,7 +281,8 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||||
VALUES ('time_aggregates', NOW())
|
VALUES ('time_aggregates', NOW())
|
||||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
ON CONFLICT (module_name) DO UPDATE
|
||||||
|
SET last_calculation_timestamp = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
return {
|
return {
|
||||||
|
|||||||
@@ -1,4 +1,4 @@
|
|||||||
const mysql = require('mysql2/promise');
|
const { Pool } = require('pg');
|
||||||
const path = require('path');
|
const path = require('path');
|
||||||
require('dotenv').config({ path: path.resolve(__dirname, '../../..', '.env') });
|
require('dotenv').config({ path: path.resolve(__dirname, '../../..', '.env') });
|
||||||
|
|
||||||
@@ -8,36 +8,24 @@ const dbConfig = {
|
|||||||
user: process.env.DB_USER,
|
user: process.env.DB_USER,
|
||||||
password: process.env.DB_PASSWORD,
|
password: process.env.DB_PASSWORD,
|
||||||
database: process.env.DB_NAME,
|
database: process.env.DB_NAME,
|
||||||
waitForConnections: true,
|
port: process.env.DB_PORT || 5432,
|
||||||
connectionLimit: 10,
|
ssl: process.env.DB_SSL === 'true',
|
||||||
queueLimit: 0,
|
|
||||||
// Add performance optimizations
|
// Add performance optimizations
|
||||||
namedPlaceholders: true,
|
max: 10, // connection pool max size
|
||||||
maxPreparedStatements: 256,
|
idleTimeoutMillis: 30000,
|
||||||
enableKeepAlive: true,
|
connectionTimeoutMillis: 60000
|
||||||
keepAliveInitialDelay: 0,
|
|
||||||
// Add memory optimizations
|
|
||||||
flags: [
|
|
||||||
'FOUND_ROWS',
|
|
||||||
'LONG_PASSWORD',
|
|
||||||
'PROTOCOL_41',
|
|
||||||
'TRANSACTIONS',
|
|
||||||
'SECURE_CONNECTION',
|
|
||||||
'MULTI_RESULTS',
|
|
||||||
'PS_MULTI_RESULTS',
|
|
||||||
'PLUGIN_AUTH',
|
|
||||||
'CONNECT_ATTRS',
|
|
||||||
'PLUGIN_AUTH_LENENC_CLIENT_DATA',
|
|
||||||
'SESSION_TRACK',
|
|
||||||
'MULTI_STATEMENTS'
|
|
||||||
]
|
|
||||||
};
|
};
|
||||||
|
|
||||||
// Create a single pool instance to be reused
|
// Create a single pool instance to be reused
|
||||||
const pool = mysql.createPool(dbConfig);
|
const pool = new Pool(dbConfig);
|
||||||
|
|
||||||
|
// Add event handlers for pool
|
||||||
|
pool.on('error', (err, client) => {
|
||||||
|
console.error('Unexpected error on idle client', err);
|
||||||
|
});
|
||||||
|
|
||||||
async function getConnection() {
|
async function getConnection() {
|
||||||
return await pool.getConnection();
|
return await pool.connect();
|
||||||
}
|
}
|
||||||
|
|
||||||
async function closePool() {
|
async function closePool() {
|
||||||
|
|||||||
@@ -33,7 +33,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Get counts of records that will be processed
|
// Get counts of records that will be processed
|
||||||
const [[orderCount], [poCount]] = await Promise.all([
|
const [orderCountResult, poCountResult] = await Promise.all([
|
||||||
connection.query(`
|
connection.query(`
|
||||||
SELECT COUNT(*) as count
|
SELECT COUNT(*) as count
|
||||||
FROM orders o
|
FROM orders o
|
||||||
@@ -45,8 +45,8 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
WHERE po.status != 0
|
WHERE po.status != 0
|
||||||
`)
|
`)
|
||||||
]);
|
]);
|
||||||
processedOrders = orderCount.count;
|
processedOrders = parseInt(orderCountResult.rows[0].count);
|
||||||
processedPurchaseOrders = poCount.count;
|
processedPurchaseOrders = parseInt(poCountResult.rows[0].count);
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'running',
|
status: 'running',
|
||||||
@@ -66,7 +66,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
|
|
||||||
// First ensure all vendors exist in vendor_details
|
// First ensure all vendors exist in vendor_details
|
||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT IGNORE INTO vendor_details (vendor, status, created_at, updated_at)
|
INSERT INTO vendor_details (vendor, status, created_at, updated_at)
|
||||||
SELECT DISTINCT
|
SELECT DISTINCT
|
||||||
vendor,
|
vendor,
|
||||||
'active' as status,
|
'active' as status,
|
||||||
@@ -74,6 +74,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
NOW() as updated_at
|
NOW() as updated_at
|
||||||
FROM products
|
FROM products
|
||||||
WHERE vendor IS NOT NULL
|
WHERE vendor IS NOT NULL
|
||||||
|
ON CONFLICT (vendor) DO NOTHING
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.8);
|
processedCount = Math.floor(totalProducts * 0.8);
|
||||||
@@ -128,7 +129,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
FROM products p
|
FROM products p
|
||||||
JOIN orders o ON p.pid = o.pid
|
JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
GROUP BY p.vendor
|
GROUP BY p.vendor
|
||||||
),
|
),
|
||||||
vendor_po AS (
|
vendor_po AS (
|
||||||
@@ -138,12 +139,15 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
COUNT(DISTINCT po.id) as total_orders,
|
COUNT(DISTINCT po.id) as total_orders,
|
||||||
AVG(CASE
|
AVG(CASE
|
||||||
WHEN po.receiving_status = 40
|
WHEN po.receiving_status = 40
|
||||||
THEN DATEDIFF(po.received_date, po.date)
|
AND po.received_date IS NOT NULL
|
||||||
|
AND po.date IS NOT NULL
|
||||||
|
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
|
||||||
|
ELSE NULL
|
||||||
END) as avg_lead_time_days,
|
END) as avg_lead_time_days,
|
||||||
SUM(po.ordered * po.po_cost_price) as total_purchase_value
|
SUM(po.ordered * po.po_cost_price) as total_purchase_value
|
||||||
FROM products p
|
FROM products p
|
||||||
JOIN purchase_orders po ON p.pid = po.pid
|
JOIN purchase_orders po ON p.pid = po.pid
|
||||||
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
WHERE po.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
GROUP BY p.vendor
|
GROUP BY p.vendor
|
||||||
),
|
),
|
||||||
vendor_products AS (
|
vendor_products AS (
|
||||||
@@ -188,20 +192,21 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
LEFT JOIN vendor_po vp ON vs.vendor = vp.vendor
|
LEFT JOIN vendor_po vp ON vs.vendor = vp.vendor
|
||||||
LEFT JOIN vendor_products vpr ON vs.vendor = vpr.vendor
|
LEFT JOIN vendor_products vpr ON vs.vendor = vpr.vendor
|
||||||
WHERE vs.vendor IS NOT NULL
|
WHERE vs.vendor IS NOT NULL
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (vendor) DO UPDATE
|
||||||
total_revenue = VALUES(total_revenue),
|
SET
|
||||||
total_orders = VALUES(total_orders),
|
total_revenue = EXCLUDED.total_revenue,
|
||||||
total_late_orders = VALUES(total_late_orders),
|
total_orders = EXCLUDED.total_orders,
|
||||||
avg_lead_time_days = VALUES(avg_lead_time_days),
|
total_late_orders = EXCLUDED.total_late_orders,
|
||||||
on_time_delivery_rate = VALUES(on_time_delivery_rate),
|
avg_lead_time_days = EXCLUDED.avg_lead_time_days,
|
||||||
order_fill_rate = VALUES(order_fill_rate),
|
on_time_delivery_rate = EXCLUDED.on_time_delivery_rate,
|
||||||
avg_order_value = VALUES(avg_order_value),
|
order_fill_rate = EXCLUDED.order_fill_rate,
|
||||||
active_products = VALUES(active_products),
|
avg_order_value = EXCLUDED.avg_order_value,
|
||||||
total_products = VALUES(total_products),
|
active_products = EXCLUDED.active_products,
|
||||||
total_purchase_value = VALUES(total_purchase_value),
|
total_products = EXCLUDED.total_products,
|
||||||
avg_margin_percent = VALUES(avg_margin_percent),
|
total_purchase_value = EXCLUDED.total_purchase_value,
|
||||||
status = VALUES(status),
|
avg_margin_percent = EXCLUDED.avg_margin_percent,
|
||||||
last_calculated_at = VALUES(last_calculated_at)
|
status = EXCLUDED.status,
|
||||||
|
last_calculated_at = EXCLUDED.last_calculated_at
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.9);
|
processedCount = Math.floor(totalProducts * 0.9);
|
||||||
@@ -244,23 +249,23 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
WITH monthly_orders AS (
|
WITH monthly_orders AS (
|
||||||
SELECT
|
SELECT
|
||||||
p.vendor,
|
p.vendor,
|
||||||
YEAR(o.date) as year,
|
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
|
||||||
MONTH(o.date) as month,
|
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
|
||||||
COUNT(DISTINCT o.id) as total_orders,
|
COUNT(DISTINCT o.id) as total_orders,
|
||||||
SUM(o.quantity * o.price) as total_revenue,
|
SUM(o.quantity * o.price) as total_revenue,
|
||||||
SUM(o.quantity * (o.price - p.cost_price)) as total_margin
|
SUM(o.quantity * (o.price - p.cost_price)) as total_margin
|
||||||
FROM products p
|
FROM products p
|
||||||
JOIN orders o ON p.pid = o.pid
|
JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.canceled = false
|
WHERE o.canceled = false
|
||||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
AND p.vendor IS NOT NULL
|
AND p.vendor IS NOT NULL
|
||||||
GROUP BY p.vendor, YEAR(o.date), MONTH(o.date)
|
GROUP BY p.vendor, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
|
||||||
),
|
),
|
||||||
monthly_po AS (
|
monthly_po AS (
|
||||||
SELECT
|
SELECT
|
||||||
p.vendor,
|
p.vendor,
|
||||||
YEAR(po.date) as year,
|
EXTRACT(YEAR FROM po.date::timestamp with time zone) as year,
|
||||||
MONTH(po.date) as month,
|
EXTRACT(MONTH FROM po.date::timestamp with time zone) as month,
|
||||||
COUNT(DISTINCT po.id) as total_po,
|
COUNT(DISTINCT po.id) as total_po,
|
||||||
COUNT(DISTINCT CASE
|
COUNT(DISTINCT CASE
|
||||||
WHEN po.receiving_status = 40 AND po.received_date > po.expected_date
|
WHEN po.receiving_status = 40 AND po.received_date > po.expected_date
|
||||||
@@ -268,14 +273,17 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
END) as late_orders,
|
END) as late_orders,
|
||||||
AVG(CASE
|
AVG(CASE
|
||||||
WHEN po.receiving_status = 40
|
WHEN po.receiving_status = 40
|
||||||
THEN DATEDIFF(po.received_date, po.date)
|
AND po.received_date IS NOT NULL
|
||||||
|
AND po.date IS NOT NULL
|
||||||
|
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
|
||||||
|
ELSE NULL
|
||||||
END) as avg_lead_time_days,
|
END) as avg_lead_time_days,
|
||||||
SUM(po.ordered * po.po_cost_price) as total_purchase_value
|
SUM(po.ordered * po.po_cost_price) as total_purchase_value
|
||||||
FROM products p
|
FROM products p
|
||||||
JOIN purchase_orders po ON p.pid = po.pid
|
JOIN purchase_orders po ON p.pid = po.pid
|
||||||
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
WHERE po.date >= CURRENT_DATE - INTERVAL '12 months'
|
||||||
AND p.vendor IS NOT NULL
|
AND p.vendor IS NOT NULL
|
||||||
GROUP BY p.vendor, YEAR(po.date), MONTH(po.date)
|
GROUP BY p.vendor, EXTRACT(YEAR FROM po.date::timestamp with time zone), EXTRACT(MONTH FROM po.date::timestamp with time zone)
|
||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
mo.vendor,
|
mo.vendor,
|
||||||
@@ -311,13 +319,14 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
AND mp.year = mo.year
|
AND mp.year = mo.year
|
||||||
AND mp.month = mo.month
|
AND mp.month = mo.month
|
||||||
WHERE mo.vendor IS NULL
|
WHERE mo.vendor IS NULL
|
||||||
ON DUPLICATE KEY UPDATE
|
ON CONFLICT (vendor, year, month) DO UPDATE
|
||||||
total_orders = VALUES(total_orders),
|
SET
|
||||||
late_orders = VALUES(late_orders),
|
total_orders = EXCLUDED.total_orders,
|
||||||
avg_lead_time_days = VALUES(avg_lead_time_days),
|
late_orders = EXCLUDED.late_orders,
|
||||||
total_purchase_value = VALUES(total_purchase_value),
|
avg_lead_time_days = EXCLUDED.avg_lead_time_days,
|
||||||
total_revenue = VALUES(total_revenue),
|
total_purchase_value = EXCLUDED.total_purchase_value,
|
||||||
avg_margin_percent = VALUES(avg_margin_percent)
|
total_revenue = EXCLUDED.total_revenue,
|
||||||
|
avg_margin_percent = EXCLUDED.avg_margin_percent
|
||||||
`);
|
`);
|
||||||
|
|
||||||
processedCount = Math.floor(totalProducts * 0.95);
|
processedCount = Math.floor(totalProducts * 0.95);
|
||||||
@@ -344,7 +353,8 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
|
|||||||
await connection.query(`
|
await connection.query(`
|
||||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||||
VALUES ('vendor_metrics', NOW())
|
VALUES ('vendor_metrics', NOW())
|
||||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
ON CONFLICT (module_name) DO UPDATE
|
||||||
|
SET last_calculation_timestamp = NOW()
|
||||||
`);
|
`);
|
||||||
|
|
||||||
return {
|
return {
|
||||||
|
|||||||
@@ -184,7 +184,7 @@ async function resetDatabase() {
|
|||||||
SELECT string_agg(tablename, ', ') as tables
|
SELECT string_agg(tablename, ', ') as tables
|
||||||
FROM pg_tables
|
FROM pg_tables
|
||||||
WHERE schemaname = 'public'
|
WHERE schemaname = 'public'
|
||||||
AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history');
|
AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history', 'ai_prompts', 'ai_validation_performance', 'templates');
|
||||||
`);
|
`);
|
||||||
|
|
||||||
if (!tablesResult.rows[0].tables) {
|
if (!tablesResult.rows[0].tables) {
|
||||||
|
|||||||
@@ -100,6 +100,9 @@ async function resetMetrics() {
|
|||||||
client = new Client(dbConfig);
|
client = new Client(dbConfig);
|
||||||
await client.connect();
|
await client.connect();
|
||||||
|
|
||||||
|
// Explicitly begin a transaction
|
||||||
|
await client.query('BEGIN');
|
||||||
|
|
||||||
// First verify current state
|
// First verify current state
|
||||||
const initialTables = await client.query(`
|
const initialTables = await client.query(`
|
||||||
SELECT tablename as name
|
SELECT tablename as name
|
||||||
@@ -124,6 +127,7 @@ async function resetMetrics() {
|
|||||||
|
|
||||||
for (const table of [...METRICS_TABLES].reverse()) {
|
for (const table of [...METRICS_TABLES].reverse()) {
|
||||||
try {
|
try {
|
||||||
|
// Use NOWAIT to avoid hanging if there's a lock
|
||||||
await client.query(`DROP TABLE IF EXISTS "${table}" CASCADE`);
|
await client.query(`DROP TABLE IF EXISTS "${table}" CASCADE`);
|
||||||
|
|
||||||
// Verify the table was actually dropped
|
// Verify the table was actually dropped
|
||||||
@@ -142,13 +146,23 @@ async function resetMetrics() {
|
|||||||
operation: 'Table dropped',
|
operation: 'Table dropped',
|
||||||
message: `Successfully dropped table: ${table}`
|
message: `Successfully dropped table: ${table}`
|
||||||
});
|
});
|
||||||
|
|
||||||
|
// Commit after each table drop to ensure locks are released
|
||||||
|
await client.query('COMMIT');
|
||||||
|
// Start a new transaction for the next table
|
||||||
|
await client.query('BEGIN');
|
||||||
|
// Re-disable foreign key constraints for the new transaction
|
||||||
|
await client.query('SET session_replication_role = \'replica\'');
|
||||||
} catch (err) {
|
} catch (err) {
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'error',
|
status: 'error',
|
||||||
operation: 'Drop table error',
|
operation: 'Drop table error',
|
||||||
message: `Error dropping table ${table}: ${err.message}`
|
message: `Error dropping table ${table}: ${err.message}`
|
||||||
});
|
});
|
||||||
throw err;
|
await client.query('ROLLBACK');
|
||||||
|
// Re-start transaction for next table
|
||||||
|
await client.query('BEGIN');
|
||||||
|
await client.query('SET session_replication_role = \'replica\'');
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -164,6 +178,11 @@ async function resetMetrics() {
|
|||||||
throw new Error(`Failed to drop all tables. Remaining tables: ${afterDrop.rows.map(t => t.name).join(', ')}`);
|
throw new Error(`Failed to drop all tables. Remaining tables: ${afterDrop.rows.map(t => t.name).join(', ')}`);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Make sure we have a fresh transaction here
|
||||||
|
await client.query('COMMIT');
|
||||||
|
await client.query('BEGIN');
|
||||||
|
await client.query('SET session_replication_role = \'replica\'');
|
||||||
|
|
||||||
// Read metrics schema
|
// Read metrics schema
|
||||||
outputProgress({
|
outputProgress({
|
||||||
operation: 'Reading schema',
|
operation: 'Reading schema',
|
||||||
@@ -220,6 +239,13 @@ async function resetMetrics() {
|
|||||||
rowCount: result.rowCount
|
rowCount: result.rowCount
|
||||||
}
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
|
// Commit every 10 statements to avoid long-running transactions
|
||||||
|
if (i > 0 && i % 10 === 0) {
|
||||||
|
await client.query('COMMIT');
|
||||||
|
await client.query('BEGIN');
|
||||||
|
await client.query('SET session_replication_role = \'replica\'');
|
||||||
|
}
|
||||||
} catch (sqlError) {
|
} catch (sqlError) {
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: 'error',
|
status: 'error',
|
||||||
@@ -230,10 +256,17 @@ async function resetMetrics() {
|
|||||||
statementNumber: i + 1
|
statementNumber: i + 1
|
||||||
}
|
}
|
||||||
});
|
});
|
||||||
|
await client.query('ROLLBACK');
|
||||||
throw sqlError;
|
throw sqlError;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Final commit for any pending statements
|
||||||
|
await client.query('COMMIT');
|
||||||
|
|
||||||
|
// Start new transaction for final checks
|
||||||
|
await client.query('BEGIN');
|
||||||
|
|
||||||
// Re-enable foreign key checks after all tables are created
|
// Re-enable foreign key checks after all tables are created
|
||||||
await client.query('SET session_replication_role = \'origin\'');
|
await client.query('SET session_replication_role = \'origin\'');
|
||||||
|
|
||||||
@@ -269,9 +302,11 @@ async function resetMetrics() {
|
|||||||
operation: 'Final table check',
|
operation: 'Final table check',
|
||||||
message: `All database tables: ${finalCheck.rows.map(t => t.name).join(', ')}`
|
message: `All database tables: ${finalCheck.rows.map(t => t.name).join(', ')}`
|
||||||
});
|
});
|
||||||
|
await client.query('ROLLBACK');
|
||||||
throw new Error(`Failed to create metrics tables: ${missingMetricsTables.join(', ')}`);
|
throw new Error(`Failed to create metrics tables: ${missingMetricsTables.join(', ')}`);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Commit final transaction
|
||||||
await client.query('COMMIT');
|
await client.query('COMMIT');
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
@@ -288,7 +323,11 @@ async function resetMetrics() {
|
|||||||
});
|
});
|
||||||
|
|
||||||
if (client) {
|
if (client) {
|
||||||
|
try {
|
||||||
await client.query('ROLLBACK');
|
await client.query('ROLLBACK');
|
||||||
|
} catch (rollbackError) {
|
||||||
|
console.error('Error during rollback:', rollbackError);
|
||||||
|
}
|
||||||
// Make sure to re-enable foreign key checks even if there's an error
|
// Make sure to re-enable foreign key checks even if there's an error
|
||||||
await client.query('SET session_replication_role = \'origin\'').catch(() => {});
|
await client.query('SET session_replication_role = \'origin\'').catch(() => {});
|
||||||
}
|
}
|
||||||
|
|||||||
337
inventory-server/scripts/update-order-costs.js
Normal file
337
inventory-server/scripts/update-order-costs.js
Normal file
@@ -0,0 +1,337 @@
|
|||||||
|
/**
|
||||||
|
* This script updates the costeach values for existing orders from the original MySQL database
|
||||||
|
* without needing to run the full import process.
|
||||||
|
*/
|
||||||
|
const dotenv = require("dotenv");
|
||||||
|
const path = require("path");
|
||||||
|
const fs = require("fs");
|
||||||
|
const { setupConnections, closeConnections } = require('./import/utils');
|
||||||
|
const { outputProgress, formatElapsedTime } = require('./metrics/utils/progress');
|
||||||
|
|
||||||
|
dotenv.config({ path: path.join(__dirname, "../.env") });
|
||||||
|
|
||||||
|
// SSH configuration
|
||||||
|
const sshConfig = {
|
||||||
|
ssh: {
|
||||||
|
host: process.env.PROD_SSH_HOST,
|
||||||
|
port: process.env.PROD_SSH_PORT || 22,
|
||||||
|
username: process.env.PROD_SSH_USER,
|
||||||
|
privateKey: process.env.PROD_SSH_KEY_PATH
|
||||||
|
? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
|
||||||
|
: undefined,
|
||||||
|
compress: true, // Enable SSH compression
|
||||||
|
},
|
||||||
|
prodDbConfig: {
|
||||||
|
// MySQL config for production
|
||||||
|
host: process.env.PROD_DB_HOST || "localhost",
|
||||||
|
user: process.env.PROD_DB_USER,
|
||||||
|
password: process.env.PROD_DB_PASSWORD,
|
||||||
|
database: process.env.PROD_DB_NAME,
|
||||||
|
port: process.env.PROD_DB_PORT || 3306,
|
||||||
|
timezone: 'Z',
|
||||||
|
},
|
||||||
|
localDbConfig: {
|
||||||
|
// PostgreSQL config for local
|
||||||
|
host: process.env.DB_HOST,
|
||||||
|
user: process.env.DB_USER,
|
||||||
|
password: process.env.DB_PASSWORD,
|
||||||
|
database: process.env.DB_NAME,
|
||||||
|
port: process.env.DB_PORT || 5432,
|
||||||
|
ssl: process.env.DB_SSL === 'true',
|
||||||
|
connectionTimeoutMillis: 60000,
|
||||||
|
idleTimeoutMillis: 30000,
|
||||||
|
max: 10 // connection pool max size
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
async function updateOrderCosts() {
|
||||||
|
const startTime = Date.now();
|
||||||
|
let connections;
|
||||||
|
let updatedCount = 0;
|
||||||
|
let errorCount = 0;
|
||||||
|
|
||||||
|
try {
|
||||||
|
outputProgress({
|
||||||
|
status: "running",
|
||||||
|
operation: "Order costs update",
|
||||||
|
message: "Initializing SSH tunnel..."
|
||||||
|
});
|
||||||
|
|
||||||
|
connections = await setupConnections(sshConfig);
|
||||||
|
const { prodConnection, localConnection } = connections;
|
||||||
|
|
||||||
|
// 1. Get all orders from local database that need cost updates
|
||||||
|
outputProgress({
|
||||||
|
status: "running",
|
||||||
|
operation: "Order costs update",
|
||||||
|
message: "Getting orders from local database..."
|
||||||
|
});
|
||||||
|
|
||||||
|
const [orders] = await localConnection.query(`
|
||||||
|
SELECT DISTINCT order_number, pid
|
||||||
|
FROM orders
|
||||||
|
WHERE costeach = 0 OR costeach IS NULL
|
||||||
|
ORDER BY order_number
|
||||||
|
`);
|
||||||
|
|
||||||
|
if (!orders || !orders.rows || orders.rows.length === 0) {
|
||||||
|
console.log("No orders found that need cost updates");
|
||||||
|
return { updatedCount: 0, errorCount: 0 };
|
||||||
|
}
|
||||||
|
|
||||||
|
const totalOrders = orders.rows.length;
|
||||||
|
console.log(`Found ${totalOrders} orders that need cost updates`);
|
||||||
|
|
||||||
|
// Process in batches of 1000 orders
|
||||||
|
const BATCH_SIZE = 500;
|
||||||
|
for (let i = 0; i < orders.rows.length; i += BATCH_SIZE) {
|
||||||
|
try {
|
||||||
|
// Start transaction for this batch
|
||||||
|
await localConnection.beginTransaction();
|
||||||
|
|
||||||
|
const batch = orders.rows.slice(i, i + BATCH_SIZE);
|
||||||
|
|
||||||
|
const orderNumbers = [...new Set(batch.map(o => o.order_number))];
|
||||||
|
|
||||||
|
// 2. Fetch costs from production database for these orders
|
||||||
|
outputProgress({
|
||||||
|
status: "running",
|
||||||
|
operation: "Order costs update",
|
||||||
|
message: `Fetching costs for orders ${i + 1} to ${Math.min(i + BATCH_SIZE, totalOrders)} of ${totalOrders}`,
|
||||||
|
current: i,
|
||||||
|
total: totalOrders,
|
||||||
|
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
|
||||||
|
});
|
||||||
|
|
||||||
|
const [costs] = await prodConnection.query(`
|
||||||
|
SELECT
|
||||||
|
oc.orderid as order_number,
|
||||||
|
oc.pid,
|
||||||
|
oc.costeach
|
||||||
|
FROM order_costs oc
|
||||||
|
INNER JOIN (
|
||||||
|
SELECT
|
||||||
|
orderid,
|
||||||
|
pid,
|
||||||
|
MAX(id) as max_id
|
||||||
|
FROM order_costs
|
||||||
|
WHERE orderid IN (?)
|
||||||
|
AND pending = 0
|
||||||
|
GROUP BY orderid, pid
|
||||||
|
) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
|
||||||
|
`, [orderNumbers]);
|
||||||
|
|
||||||
|
// Create a map of costs for easy lookup
|
||||||
|
const costMap = {};
|
||||||
|
if (costs && costs.length) {
|
||||||
|
costs.forEach(c => {
|
||||||
|
costMap[`${c.order_number}-${c.pid}`] = c.costeach || 0;
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// 3. Update costs in local database by batches
|
||||||
|
// Using a more efficient update approach with a temporary table
|
||||||
|
|
||||||
|
// Create a temporary table for each batch
|
||||||
|
await localConnection.query(`
|
||||||
|
DROP TABLE IF EXISTS temp_order_costs;
|
||||||
|
CREATE TEMP TABLE temp_order_costs (
|
||||||
|
order_number VARCHAR(50) NOT NULL,
|
||||||
|
pid BIGINT NOT NULL,
|
||||||
|
costeach DECIMAL(10,3) NOT NULL,
|
||||||
|
PRIMARY KEY (order_number, pid)
|
||||||
|
);
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Insert cost data into the temporary table
|
||||||
|
const costEntries = [];
|
||||||
|
for (const order of batch) {
|
||||||
|
const key = `${order.order_number}-${order.pid}`;
|
||||||
|
if (key in costMap) {
|
||||||
|
costEntries.push({
|
||||||
|
order_number: order.order_number,
|
||||||
|
pid: order.pid,
|
||||||
|
costeach: costMap[key]
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Insert in sub-batches of 100
|
||||||
|
const DB_BATCH_SIZE = 50;
|
||||||
|
for (let j = 0; j < costEntries.length; j += DB_BATCH_SIZE) {
|
||||||
|
const subBatch = costEntries.slice(j, j + DB_BATCH_SIZE);
|
||||||
|
if (subBatch.length === 0) continue;
|
||||||
|
|
||||||
|
const placeholders = subBatch.map((_, idx) =>
|
||||||
|
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
|
||||||
|
).join(',');
|
||||||
|
|
||||||
|
const values = subBatch.flatMap(item => [
|
||||||
|
item.order_number,
|
||||||
|
item.pid,
|
||||||
|
item.costeach
|
||||||
|
]);
|
||||||
|
|
||||||
|
await localConnection.query(`
|
||||||
|
INSERT INTO temp_order_costs (order_number, pid, costeach)
|
||||||
|
VALUES ${placeholders}
|
||||||
|
`, values);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Perform bulk update from the temporary table
|
||||||
|
const [updateResult] = await localConnection.query(`
|
||||||
|
UPDATE orders o
|
||||||
|
SET costeach = t.costeach
|
||||||
|
FROM temp_order_costs t
|
||||||
|
WHERE o.order_number = t.order_number AND o.pid = t.pid
|
||||||
|
RETURNING o.id
|
||||||
|
`);
|
||||||
|
|
||||||
|
const batchUpdated = updateResult.rowCount || 0;
|
||||||
|
updatedCount += batchUpdated;
|
||||||
|
|
||||||
|
// Commit transaction for this batch
|
||||||
|
await localConnection.commit();
|
||||||
|
|
||||||
|
outputProgress({
|
||||||
|
status: "running",
|
||||||
|
operation: "Order costs update",
|
||||||
|
message: `Updated ${updatedCount} orders with costs from production (batch: ${batchUpdated})`,
|
||||||
|
current: i + batch.length,
|
||||||
|
total: totalOrders,
|
||||||
|
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
|
||||||
|
});
|
||||||
|
} catch (error) {
|
||||||
|
// If a batch fails, roll back that batch's transaction and continue
|
||||||
|
try {
|
||||||
|
await localConnection.rollback();
|
||||||
|
} catch (rollbackError) {
|
||||||
|
console.error("Error during batch rollback:", rollbackError);
|
||||||
|
}
|
||||||
|
|
||||||
|
console.error(`Error processing batch ${i}-${i + BATCH_SIZE}:`, error);
|
||||||
|
errorCount++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// 4. For orders with no matching costs, set a default based on price
|
||||||
|
outputProgress({
|
||||||
|
status: "running",
|
||||||
|
operation: "Order costs update",
|
||||||
|
message: "Setting default costs for remaining orders..."
|
||||||
|
});
|
||||||
|
|
||||||
|
// Process remaining updates in smaller batches
|
||||||
|
const DEFAULT_BATCH_SIZE = 10000;
|
||||||
|
let totalDefaultUpdated = 0;
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Start with a count query to determine how many records need the default update
|
||||||
|
const [countResult] = await localConnection.query(`
|
||||||
|
SELECT COUNT(*) as count FROM orders
|
||||||
|
WHERE (costeach = 0 OR costeach IS NULL)
|
||||||
|
`);
|
||||||
|
|
||||||
|
const totalToUpdate = parseInt(countResult.rows[0]?.count || 0);
|
||||||
|
|
||||||
|
if (totalToUpdate > 0) {
|
||||||
|
console.log(`Applying default cost to ${totalToUpdate} orders`);
|
||||||
|
|
||||||
|
// Apply the default in batches with separate transactions
|
||||||
|
for (let i = 0; i < totalToUpdate; i += DEFAULT_BATCH_SIZE) {
|
||||||
|
try {
|
||||||
|
await localConnection.beginTransaction();
|
||||||
|
|
||||||
|
const [defaultUpdates] = await localConnection.query(`
|
||||||
|
WITH orders_to_update AS (
|
||||||
|
SELECT id FROM orders
|
||||||
|
WHERE (costeach = 0 OR costeach IS NULL)
|
||||||
|
LIMIT ${DEFAULT_BATCH_SIZE}
|
||||||
|
)
|
||||||
|
UPDATE orders o
|
||||||
|
SET costeach = price * 0.5
|
||||||
|
FROM orders_to_update otu
|
||||||
|
WHERE o.id = otu.id
|
||||||
|
RETURNING o.id
|
||||||
|
`);
|
||||||
|
|
||||||
|
const batchDefaultUpdated = defaultUpdates.rowCount || 0;
|
||||||
|
totalDefaultUpdated += batchDefaultUpdated;
|
||||||
|
|
||||||
|
await localConnection.commit();
|
||||||
|
|
||||||
|
outputProgress({
|
||||||
|
status: "running",
|
||||||
|
operation: "Order costs update",
|
||||||
|
message: `Applied default costs to ${totalDefaultUpdated} of ${totalToUpdate} orders`,
|
||||||
|
current: totalDefaultUpdated,
|
||||||
|
total: totalToUpdate,
|
||||||
|
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
|
||||||
|
});
|
||||||
|
} catch (error) {
|
||||||
|
try {
|
||||||
|
await localConnection.rollback();
|
||||||
|
} catch (rollbackError) {
|
||||||
|
console.error("Error during default update rollback:", rollbackError);
|
||||||
|
}
|
||||||
|
|
||||||
|
console.error(`Error applying default costs batch ${i}-${i + DEFAULT_BATCH_SIZE}:`, error);
|
||||||
|
errorCount++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
console.error("Error counting or updating remaining orders:", error);
|
||||||
|
errorCount++;
|
||||||
|
}
|
||||||
|
|
||||||
|
updatedCount += totalDefaultUpdated;
|
||||||
|
|
||||||
|
const endTime = Date.now();
|
||||||
|
const totalSeconds = (endTime - startTime) / 1000;
|
||||||
|
|
||||||
|
outputProgress({
|
||||||
|
status: "complete",
|
||||||
|
operation: "Order costs update",
|
||||||
|
message: `Updated ${updatedCount} orders (${totalDefaultUpdated} with default values) in ${formatElapsedTime(totalSeconds)}`,
|
||||||
|
elapsed: formatElapsedTime(totalSeconds)
|
||||||
|
});
|
||||||
|
|
||||||
|
return {
|
||||||
|
status: "complete",
|
||||||
|
updatedCount,
|
||||||
|
errorCount
|
||||||
|
};
|
||||||
|
} catch (error) {
|
||||||
|
console.error("Error during order costs update:", error);
|
||||||
|
|
||||||
|
return {
|
||||||
|
status: "error",
|
||||||
|
error: error.message,
|
||||||
|
updatedCount,
|
||||||
|
errorCount
|
||||||
|
};
|
||||||
|
} finally {
|
||||||
|
if (connections) {
|
||||||
|
await closeConnections(connections).catch(err => {
|
||||||
|
console.error("Error closing connections:", err);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Run the script only if this is the main module
|
||||||
|
if (require.main === module) {
|
||||||
|
updateOrderCosts().then((results) => {
|
||||||
|
console.log('Cost update completed:', results);
|
||||||
|
// Force exit after a small delay to ensure all logs are written
|
||||||
|
setTimeout(() => process.exit(0), 500);
|
||||||
|
}).catch((error) => {
|
||||||
|
console.error("Unhandled error:", error);
|
||||||
|
// Force exit with error code after a small delay
|
||||||
|
setTimeout(() => process.exit(1), 500);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// Export the function for use in other scripts
|
||||||
|
module.exports = updateOrderCosts;
|
||||||
@@ -1,226 +0,0 @@
|
|||||||
I will provide a JSON array with product data. Process the array by combining all products from validData and invalidData arrays into a single array, excluding any fields starting with “__”, such as “__index” or “__errors”. Process each product according to the reference guidelines below. If a field is not included in the data, do not include it in your response (e.g. do not include its key or any value) unless the specific field guidelines below say otherwise. If a product appears to be from an empty or entirely invalid line, do not include it in your response.
|
|
||||||
|
|
||||||
Your response should be a JSON object with the following structure:
|
|
||||||
{
|
|
||||||
"correctedData": [], // Array of corrected products
|
|
||||||
"changes": [], // Array of strings describing each change made
|
|
||||||
"warnings": [] // Array of strings with warnings or suggestions for manual review (see below for details)
|
|
||||||
}
|
|
||||||
|
|
||||||
IMPORTANT: For all fields that use IDs (categories, supplier, company, line, subline, ship_restrictions, tax_cat, artist, themes, etc.), you MUST return the ID values, not the display names. The system will handle converting IDs to display names.
|
|
||||||
|
|
||||||
Using the provided guidelines, focus on:
|
|
||||||
1. Correcting typos and any incorrect spelling or grammar
|
|
||||||
2. Standardizing product names
|
|
||||||
3. Correcting and enhancing descriptions by adding details, keywords, and SEO-friendly language
|
|
||||||
4. Fixing any obvious errors or inconsistencies between similar products in measurements, prices, or quantities
|
|
||||||
5. Adding correct categories, themes, and colors
|
|
||||||
|
|
||||||
Use only the provided data and your own knowledge to make changes. Do not make assumptions or make up information that you're not sure about. If you're unable to make a change you're confident about, leave the field as is. All data passed in should be validated, corrected, and returned. All values returned should be strings, not numbers. Do not leave out any fields that were present in the original data.
|
|
||||||
|
|
||||||
Possible reasons for including a warning in the warnings array:
|
|
||||||
- If you're unable to make a change you're confident about but you believe one needs to be made
|
|
||||||
- If there are inconsistencies in the data that could be valid but need to be reviewed
|
|
||||||
- If not enough information is provided to make a change that you believe is needed
|
|
||||||
- If you infer a value for a required field based on context
|
|
||||||
|
|
||||||
|
|
||||||
----------PRODUCT FIELD GUIDELINES----------
|
|
||||||
|
|
||||||
Fields: supplier, private_notes, company, line, subline, artist
|
|
||||||
Changes: Not allowed
|
|
||||||
Required: Return if present in the original data. Do not return if not present.
|
|
||||||
Instructions: If present, return these fields exactly as provided with no changes
|
|
||||||
|
|
||||||
Fields: upc, supplier_no, notions_no, item_number
|
|
||||||
Changes: Formatting only
|
|
||||||
Required: Return if present in the original data. Do not return if not present.
|
|
||||||
Instructions: If present, trim outside white space and return these fields exactly as provided with no other changes
|
|
||||||
|
|
||||||
Fields: hts_code
|
|
||||||
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
|
|
||||||
Required: Return if present in the original data. Do not return if not present.
|
|
||||||
Instructions: If present, trim white space and any non-numeric characters, then return as a string. Do not validate in any other way.
|
|
||||||
|
|
||||||
Fields: image_url
|
|
||||||
Changes: Formatting only
|
|
||||||
Required: Return if present in the original data. Do not return if not present.
|
|
||||||
Instructions: If present, convert all comma-separated values to valid https:// URLs and return
|
|
||||||
|
|
||||||
Fields: msrp, cost_each
|
|
||||||
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
|
|
||||||
Required: Return if present in the original data. Do not return if not present.
|
|
||||||
Instructions: If present, strip any currency symbols and return as a string with exactly two decimal places, even if the last place is a 0.
|
|
||||||
|
|
||||||
Fields: qty_per_unit, case_qty
|
|
||||||
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
|
|
||||||
Required: Return if present in the original data. Do not return if not present.
|
|
||||||
Instructions: If present, strip non-numeric characters and return
|
|
||||||
|
|
||||||
Fields: ship_restrictions
|
|
||||||
Changes: Only add a value if it's not already present
|
|
||||||
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return 0.
|
|
||||||
Instructions: Always return a value exactly as provided, or return 0 if no value is provided.
|
|
||||||
|
|
||||||
Fields: eta
|
|
||||||
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
|
|
||||||
Required: Return if present in the original data. Do not return if not present.
|
|
||||||
Instructions: If present, return a full month name, day is optional, no year ever (e.g. “January” or “March 3”). This value is not required if not provided.
|
|
||||||
|
|
||||||
Fields: name
|
|
||||||
Changes: Allowed to conform to guidelines, to fix typos or formatting
|
|
||||||
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return the most reasonable value possible based on the naming guidelines and the other information you have.
|
|
||||||
Instructions: Always return a value that is corrected and enhanced per additional guidelines below
|
|
||||||
|
|
||||||
Fields: description
|
|
||||||
Changes: Full creative control allowed within guidelines
|
|
||||||
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return the most accurate description possible based on the description guidelines and the other information you have.
|
|
||||||
Instructions: Always return a value that is corrected and enhanced per additional guidelines below
|
|
||||||
|
|
||||||
Fields: weight, length, width, height
|
|
||||||
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
|
|
||||||
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return your best guess based on the other information you have or the dimensions for similar products.
|
|
||||||
Instructions: Always return a reasonable value (weights in ounces and dimensions in inches) that is validated against similar provided products and your knowledge of general object measurements (e.g. a sheet of paper is not going to be 3 inches thick, a pack of stickers is not going to be 250 ounces, this sheet of paper is very likely going to be the same size as that other sheet of paper from the same line). If a value is unusual or unreasonable, even wildly so, change it to match similar products or to be more reasonable. When correcting unreasonable weights or dimensions, prioritize comparisons to products from the same company and product line first, then broader category matches or common knowledge if necessary.Do not return 0 or null for any of these fields.
|
|
||||||
|
|
||||||
Fields: coo
|
|
||||||
Changes: Formatting only
|
|
||||||
Required: Return if present in the original data. Do not return if not present.
|
|
||||||
Instructions: If present, convert all country names and abbreviations to the official ISO 3166-1 alpha-2 two-character country code. Convert any value with more than two characters to two characters only (e.g. "United States" or "USA" should both return "US").
|
|
||||||
|
|
||||||
Fields: tax_cat
|
|
||||||
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
|
|
||||||
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return 0.
|
|
||||||
Instructions: Always return a valid numerical tax code ID from the Available Tax Codes array below. Give preference to the value provided, but correct it if another value is more accurate. You must return a value for this field. 0 should be the default value in most cases.
|
|
||||||
|
|
||||||
Fields: size_cat
|
|
||||||
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
|
|
||||||
Required: Return if present in the original data or if not present and applicable. Do not return if not applicable (e.g. if no size categories apply based on what you know about the product).
|
|
||||||
Instructions: If present or if applicable, return one valid numerical size category ID from the Available Size Categories array below. Give preference to the value provided, but correct it if another value is more accurate. If the product name contains a match for one of the size categories (such as 12x12, 6x6, 2oz, etc) you MUST return that size category with the results. A value is not required if none of the size categories apply.
|
|
||||||
|
|
||||||
Fields: themes
|
|
||||||
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
|
|
||||||
Required: Return if present in the original data or if not present and applicable. Do not return any value if not applicable (e.g. if no themes apply based on what you know about the product).
|
|
||||||
Instructions: If present, confirm that each provided theme matches what you understand to be a theme of the product. Remove any themes that do not match and add any themes that are missing. Most products will have zero or one theme. Return a comma-separated list of numerical theme IDs from the Available Themes array below. If you choose a sub-theme, you do not need to include its parent theme in the list.
|
|
||||||
|
|
||||||
Fields: colors
|
|
||||||
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
|
|
||||||
Required: Return if present in the original data or if not present and applicable. Do not return any value if not applicable (e.g. if no colors apply based on what you know about the product).
|
|
||||||
Instructions: If present or if applicable, return a comma-separated list of numerical color IDs from the Available Colors array below, using the product name as the primary guide (e.g. if the name contains Blue or a blue variant, you should return the blue color ID). A value is not required if none of the colors apply. Most products will have zero colors.
|
|
||||||
|
|
||||||
Fields: categories
|
|
||||||
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
|
|
||||||
Required: You must always return at least one value for this field, even if it's not provided in the original data. If no value is provided, return the most appropriate category or categories based on the other information you have.
|
|
||||||
Instructions: Always return a comma-separated list of one or more valid numerical category IDs from the Available Categories array below. Give preference to the values provided, particularly if the other information isn't enough to determine a category, but correct them or add new categories if another value is more accurate. Do not return categories in the Deals or Black Friday categories, and strip these from the list if present. If you choose a subcategory at any level, you do not need to include its parent categories in the list. You must return at least one category and you can return multiple categories if applicable. All categories have equal value so their order is not important. Always try to return the most specific categories possible (e.g. one in the third level of the category hierarchy is better than one in the second level).
|
|
||||||
|
|
||||||
----------PRODUCT NAMING GUIDELINES----------
|
|
||||||
If there's only one of this type of product in a line: [Line Name] [Product Name] - [Company]
|
|
||||||
Example: "Cosmos Infinity Chipboard - Stamperia"
|
|
||||||
Example: "Serene Petals 6x6 Paper Pad - Prima"
|
|
||||||
|
|
||||||
Multiple similar products in a line: [Differentiator] [Product Type] - [Line Name] - [Company]
|
|
||||||
Example: "Ice & Shells Stencil - Arctic Antarctic - Stamperia"
|
|
||||||
Example: "Astronomy Paper - Cosmos Infinity - Stamperia"
|
|
||||||
|
|
||||||
Standalone products: [Product Name] - [Company]
|
|
||||||
Example: "Hedwig Puffy Stickers - Paper House Productions"
|
|
||||||
Example: "Heart Tree Dies - Lawn Fawn"
|
|
||||||
|
|
||||||
Color-based products: [Color] [Product Name] - [Company]
|
|
||||||
Example: "Green Valley Enamel Dots - Altenew"
|
|
||||||
Example: "Magenta Aqua Pigment - Brutus Monroe"
|
|
||||||
|
|
||||||
Complex products: [Differentiator] [Line] [Product Type] - [Company]
|
|
||||||
Example: "Size 6 Round Black Velvet Watercolor Brush - Silver Brush Limited" (Size 6 Round is the differentiator, Black Velvet is the line, Watercolor Brush is the product type)
|
|
||||||
|
|
||||||
These should not be included in the name, unless there are multiple products that are otherwise identical:
|
|
||||||
- Product size
|
|
||||||
- Product weight
|
|
||||||
- Number of pages
|
|
||||||
- How many are in the package
|
|
||||||
|
|
||||||
Naming Conventions:
|
|
||||||
- Paper sizes: Use "12x12", "8x8", "6x6" (no spaces or units of measure)
|
|
||||||
- Company names must match backend exactly
|
|
||||||
- Always capitalize every word in the name, including short articles like "The" and "An"
|
|
||||||
- Use "Idea-ology" (not "idea-ology" or "Ideaology")
|
|
||||||
- All stamps are "Stamp Set" (not "Clear Stamps" or "Rubber Stamps")
|
|
||||||
- All dies are "Dies" or "Die" (not "Die Set")
|
|
||||||
- Brands with their own naming conventions should be respected, such as "Doodle Cuts" for dies from Doodlebug
|
|
||||||
|
|
||||||
Special Brand Rules - Ranger:
|
|
||||||
Format: [Product Name] - [Designer Line] - Ranger
|
|
||||||
Possible Designers: Dylusions, Dina Wakley MEdia, Simon Hurley create., Wendy Vecchi
|
|
||||||
Example: "Stacked Stencil - Dina Wakley MEdia - Ranger"
|
|
||||||
|
|
||||||
Special Brand Rules - Tim Holtz products from Ranger:
|
|
||||||
Format: [Color] [Product Name/Type] - Tim Holtz Distress - Ranger
|
|
||||||
Example: "Mermaid Lagoon Tim Holtz Distress Oxide Ink Pad - Ranger"
|
|
||||||
|
|
||||||
Special Brand Rules - Tim Holtz products from Sizzix or Stampers Anonymous:
|
|
||||||
Format: [Product Name] [Product Type] by Tim Holtz - [Company]
|
|
||||||
Example: "Leaf Fragments Thinlits Dies by Tim Holtz - Sizzix"
|
|
||||||
|
|
||||||
Special Brand Rules - Tim Holtz products from Advantus/Idea-ology:
|
|
||||||
Format: [Product Name] - Tim Holtz Idea-ology
|
|
||||||
Example: "Tiny Vials - Tim Holtz Idea-ology"
|
|
||||||
|
|
||||||
Special Brand Rules - Dies from Sizzix:
|
|
||||||
Include die type plus "Dies" or "Die"
|
|
||||||
Examples:
|
|
||||||
"Art Nouveau 3-D Textured Impressions Embossing Folder - Sizzix"
|
|
||||||
"Pocket Pals Thinlits Dies - Sizzix"
|
|
||||||
"Butterfly Wishes Framelits Dies & Stamps - Sizzix"
|
|
||||||
|
|
||||||
Important Notes
|
|
||||||
- Ensure that product names are consistent across all products of the same type
|
|
||||||
- Use the minimum amount of information needed to uniquely identify the product
|
|
||||||
- Put detailed specifications in the product description, not its name
|
|
||||||
|
|
||||||
Edge Cases
|
|
||||||
- If the product is missing a company name, infer one from the other products included in the data
|
|
||||||
- If the product is missing a clear differentiator and needs one to be unique, infer and add one from the other data provided (e.g. the description, existing size categories, etc.)
|
|
||||||
|
|
||||||
Incorrect example: MVP Rugby - Collection Pack - Photoplay
|
|
||||||
Notes: there should be no dash between the line and the product
|
|
||||||
|
|
||||||
Incorrect Example: A2 Easel Cards - Black - Photoplay
|
|
||||||
Notes: the differentiating factor should come first: “Black A2 Easel Cards - Photoplay”. Size is ok to include here because this is the name printed on the package.
|
|
||||||
|
|
||||||
Incorrect Example: 6” - Scriber Needle Modeling Tool
|
|
||||||
Notes: this product only comes in one size, so 6” isn’t needed. The company name should also be included.
|
|
||||||
|
|
||||||
Incorrect Example: Slick - White - Tulip Dimensional Fabric Paint 4oz
|
|
||||||
Notes: color should be first, then type, then product, then company, so “White Slick Dimensional Fabric Paint - Tulip”. It appears there’s only one size available so no need to differentiate in the name.
|
|
||||||
|
|
||||||
Incorrect Example: Silhouette Adhesive Cork Sheets 5”X7” 8/Pkg
|
|
||||||
Notes: should be “Adhesive Cork Sheets - Silhouette”
|
|
||||||
|
|
||||||
Incorrect Example: Galaxy - Opaque - American Crafts Color Pour Resin Dyes
|
|
||||||
Notes: “Galaxy Opaque Dye Set - Color Pour Resin - American Crafts”
|
|
||||||
|
|
||||||
Incorrect Example: Slate - Lion Brand Truboo Yarn
|
|
||||||
Notes: [Differentiator] [Line] [Product Type] - [Company] : “Slate Truboo Yarn - Lion Brand”
|
|
||||||
|
|
||||||
Incorrect Example: Rose Quartz Dylusions Shimmer Paint
|
|
||||||
Notes: “Rose Quartz Shimmer Paint - Dylusions - Ranger”
|
|
||||||
|
|
||||||
|
|
||||||
----------PRODUCT DESCRIPTION GUIDELINES----------
|
|
||||||
Product descriptions are an extremely important part of the listing and are the most important part of your response. Care should be taken to ensure they are correct, helpful, and SEO-friendly.
|
|
||||||
|
|
||||||
If a description is provided in the data, use it as a starting point. Correct any spelling errors, typos, poor grammar, or awkward phrasing. If necessary and you have the information, add more details, describe how the customer could use it, etc. Use complete sentences and keep SEO in mind.
|
|
||||||
|
|
||||||
If no description is provided, make one up using the product name, the information you have, and the other provided guidelines. At minimum, a description should be one complete sentence that starts with a capital letter and ends with a period. Unless the product is extremely complex, 2-4 sentences is usually sufficient if you have enough information.
|
|
||||||
|
|
||||||
Important Notes:
|
|
||||||
- Every description should state exactly what's included in the product (e.g. "Includes one 12x12 sheet of patterned cardstock." or "Includes one 6x12 sheet with 27 unique stickers." or "Includes 55 pieces." or "Package includes machine, power cord, 12 sheets of cardstock, 3 dies, and project instructions.")
|
|
||||||
- Do not use the word "our" in the description (this usually shows up when we copy a description from the manufacturer). Instead use "these" or "[Company name] [product]" or similar. (e.g. don't use "Our journals are hand-made in the USA", instead use "These journals are hand made..." or "Archer & Olive journals are handmade...")
|
|
||||||
- Don't include statements that add no value like “this is perfect for all your paper crafts”. If the product helps to solve a unique problem or has a unique feature, by all means describe it, but if it’s just a normal sheet of paper or pack of stickers, you don’t have to pretend like it’s the best thing ever. At the same time, ensure that you add enough copy to ensure good SEO.
|
|
||||||
- State as many facts as you can about the product, considering the viewpoint of the customer and what they would want to know when looking at it. They probably want to know dimensions, what products it’s compatible with, how thick the paper is, how many sheets are included, whether the sheets are double-sided or not, which items are in the kit, etc. Say as much as you possibly can with the information that you have.
|
|
||||||
- !!DO NOT make up information if you aren't sure about it. A minimal correct description is better than a long incorrect one!!
|
|
||||||
|
|
||||||
Avoid/remove:
|
|
||||||
- The word "Imported"
|
|
||||||
- Any warnings about Prop 65, choking hazards, etc
|
|
||||||
- The manufacturer's name if it's included as the very first thing in the description
|
|
||||||
- Any statement similar to "comes in a variety of colors, each sold separately"
|
|
||||||
335
inventory-server/src/routes/ai-prompts.js
Normal file
335
inventory-server/src/routes/ai-prompts.js
Normal file
@@ -0,0 +1,335 @@
|
|||||||
|
const express = require('express');
|
||||||
|
const router = express.Router();
|
||||||
|
|
||||||
|
// Get all AI prompts
|
||||||
|
router.get('/', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
ORDER BY prompt_type ASC, company ASC
|
||||||
|
`);
|
||||||
|
res.json(result.rows);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching AI prompts:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch AI prompts',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Get prompt by ID
|
||||||
|
router.get('/:id', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const { id } = req.params;
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE id = $1
|
||||||
|
`, [id]);
|
||||||
|
|
||||||
|
if (result.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'AI prompt not found' });
|
||||||
|
}
|
||||||
|
|
||||||
|
res.json(result.rows[0]);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching AI prompt:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch AI prompt',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Get prompt by company
|
||||||
|
router.get('/company/:companyId', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const { companyId } = req.params;
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE company = $1
|
||||||
|
`, [companyId]);
|
||||||
|
|
||||||
|
if (result.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'AI prompt not found for this company' });
|
||||||
|
}
|
||||||
|
|
||||||
|
res.json(result.rows[0]);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching AI prompt by company:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch AI prompt by company',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Get general prompt
|
||||||
|
router.get('/type/general', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'general'
|
||||||
|
`);
|
||||||
|
|
||||||
|
if (result.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'General AI prompt not found' });
|
||||||
|
}
|
||||||
|
|
||||||
|
res.json(result.rows[0]);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching general AI prompt:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch general AI prompt',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Get system prompt
|
||||||
|
router.get('/type/system', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'system'
|
||||||
|
`);
|
||||||
|
|
||||||
|
if (result.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'System AI prompt not found' });
|
||||||
|
}
|
||||||
|
|
||||||
|
res.json(result.rows[0]);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching system AI prompt:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch system AI prompt',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Create new AI prompt
|
||||||
|
router.post('/', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const {
|
||||||
|
prompt_text,
|
||||||
|
prompt_type,
|
||||||
|
company
|
||||||
|
} = req.body;
|
||||||
|
|
||||||
|
// Validate required fields
|
||||||
|
if (!prompt_text || !prompt_type) {
|
||||||
|
return res.status(400).json({ error: 'Prompt text and type are required' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate prompt type
|
||||||
|
if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
|
||||||
|
return res.status(400).json({ error: 'Prompt type must be either "general", "company_specific", or "system"' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate company is provided for company-specific prompts
|
||||||
|
if (prompt_type === 'company_specific' && !company) {
|
||||||
|
return res.status(400).json({ error: 'Company is required for company-specific prompts' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate company is not provided for general or system prompts
|
||||||
|
if ((prompt_type === 'general' || prompt_type === 'system') && company) {
|
||||||
|
return res.status(400).json({ error: 'Company should not be provided for general or system prompts' });
|
||||||
|
}
|
||||||
|
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
INSERT INTO ai_prompts (
|
||||||
|
prompt_text,
|
||||||
|
prompt_type,
|
||||||
|
company
|
||||||
|
) VALUES ($1, $2, $3)
|
||||||
|
RETURNING *
|
||||||
|
`, [
|
||||||
|
prompt_text,
|
||||||
|
prompt_type,
|
||||||
|
company
|
||||||
|
]);
|
||||||
|
|
||||||
|
res.status(201).json(result.rows[0]);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error creating AI prompt:', error);
|
||||||
|
|
||||||
|
// Check for unique constraint violations
|
||||||
|
if (error instanceof Error && error.message.includes('unique constraint')) {
|
||||||
|
if (error.message.includes('unique_company_prompt')) {
|
||||||
|
return res.status(409).json({
|
||||||
|
error: 'A prompt already exists for this company',
|
||||||
|
details: error.message
|
||||||
|
});
|
||||||
|
} else if (error.message.includes('idx_unique_general_prompt')) {
|
||||||
|
return res.status(409).json({
|
||||||
|
error: 'A general prompt already exists',
|
||||||
|
details: error.message
|
||||||
|
});
|
||||||
|
} else if (error.message.includes('idx_unique_system_prompt')) {
|
||||||
|
return res.status(409).json({
|
||||||
|
error: 'A system prompt already exists',
|
||||||
|
details: error.message
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to create AI prompt',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Update AI prompt
|
||||||
|
router.put('/:id', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const { id } = req.params;
|
||||||
|
const {
|
||||||
|
prompt_text,
|
||||||
|
prompt_type,
|
||||||
|
company
|
||||||
|
} = req.body;
|
||||||
|
|
||||||
|
// Validate required fields
|
||||||
|
if (!prompt_text || !prompt_type) {
|
||||||
|
return res.status(400).json({ error: 'Prompt text and type are required' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate prompt type
|
||||||
|
if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
|
||||||
|
return res.status(400).json({ error: 'Prompt type must be either "general", "company_specific", or "system"' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate company is provided for company-specific prompts
|
||||||
|
if (prompt_type === 'company_specific' && !company) {
|
||||||
|
return res.status(400).json({ error: 'Company is required for company-specific prompts' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate company is not provided for general or system prompts
|
||||||
|
if ((prompt_type === 'general' || prompt_type === 'system') && company) {
|
||||||
|
return res.status(400).json({ error: 'Company should not be provided for general or system prompts' });
|
||||||
|
}
|
||||||
|
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if the prompt exists
|
||||||
|
const checkResult = await pool.query('SELECT * FROM ai_prompts WHERE id = $1', [id]);
|
||||||
|
if (checkResult.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'AI prompt not found' });
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
UPDATE ai_prompts
|
||||||
|
SET
|
||||||
|
prompt_text = $1,
|
||||||
|
prompt_type = $2,
|
||||||
|
company = $3
|
||||||
|
WHERE id = $4
|
||||||
|
RETURNING *
|
||||||
|
`, [
|
||||||
|
prompt_text,
|
||||||
|
prompt_type,
|
||||||
|
company,
|
||||||
|
id
|
||||||
|
]);
|
||||||
|
|
||||||
|
res.json(result.rows[0]);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error updating AI prompt:', error);
|
||||||
|
|
||||||
|
// Check for unique constraint violations
|
||||||
|
if (error instanceof Error && error.message.includes('unique constraint')) {
|
||||||
|
if (error.message.includes('unique_company_prompt')) {
|
||||||
|
return res.status(409).json({
|
||||||
|
error: 'A prompt already exists for this company',
|
||||||
|
details: error.message
|
||||||
|
});
|
||||||
|
} else if (error.message.includes('idx_unique_general_prompt')) {
|
||||||
|
return res.status(409).json({
|
||||||
|
error: 'A general prompt already exists',
|
||||||
|
details: error.message
|
||||||
|
});
|
||||||
|
} else if (error.message.includes('idx_unique_system_prompt')) {
|
||||||
|
return res.status(409).json({
|
||||||
|
error: 'A system prompt already exists',
|
||||||
|
details: error.message
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to update AI prompt',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Delete AI prompt
|
||||||
|
router.delete('/:id', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const { id } = req.params;
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query('DELETE FROM ai_prompts WHERE id = $1 RETURNING *', [id]);
|
||||||
|
|
||||||
|
if (result.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'AI prompt not found' });
|
||||||
|
}
|
||||||
|
|
||||||
|
res.json({ message: 'AI prompt deleted successfully' });
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error deleting AI prompt:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to delete AI prompt',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Error handling middleware
|
||||||
|
router.use((err, req, res, next) => {
|
||||||
|
console.error('AI prompts route error:', err);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Internal server error',
|
||||||
|
details: err.message
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
module.exports = router;
|
||||||
@@ -289,8 +289,108 @@ async function generateDebugResponse(productsToUse, res) {
|
|||||||
});
|
});
|
||||||
|
|
||||||
try {
|
try {
|
||||||
const prompt = await loadPrompt(promptConnection, productsToUse);
|
// Get the local PostgreSQL pool to fetch prompts
|
||||||
const fullPrompt = prompt + "\n" + JSON.stringify(productsToUse);
|
const pool = res.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
console.warn("⚠️ Local database pool not available for prompts");
|
||||||
|
throw new Error("Database connection not available");
|
||||||
|
}
|
||||||
|
|
||||||
|
// First, fetch the system prompt
|
||||||
|
const systemPromptResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'system'
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Get system prompt or use default
|
||||||
|
let systemPrompt = null;
|
||||||
|
if (systemPromptResult.rows.length > 0) {
|
||||||
|
systemPrompt = systemPromptResult.rows[0];
|
||||||
|
console.log("📝 Loaded system prompt from database, ID:", systemPrompt.id);
|
||||||
|
} else {
|
||||||
|
console.warn("⚠️ No system prompt found in database, will use default");
|
||||||
|
}
|
||||||
|
|
||||||
|
// Then, fetch the general prompt
|
||||||
|
const generalPromptResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'general'
|
||||||
|
`);
|
||||||
|
|
||||||
|
if (generalPromptResult.rows.length === 0) {
|
||||||
|
console.warn("⚠️ No general prompt found in database");
|
||||||
|
throw new Error("No general prompt found in database");
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the general prompt text and info
|
||||||
|
const generalPrompt = generalPromptResult.rows[0];
|
||||||
|
console.log("📝 Loaded general prompt from database, ID:", generalPrompt.id);
|
||||||
|
|
||||||
|
// Fetch company-specific prompts if we have products to validate
|
||||||
|
let companyPrompts = [];
|
||||||
|
if (productsToUse && Array.isArray(productsToUse)) {
|
||||||
|
// Extract unique company IDs from products
|
||||||
|
const companyIds = new Set();
|
||||||
|
productsToUse.forEach(product => {
|
||||||
|
if (product.company) {
|
||||||
|
companyIds.add(String(product.company));
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
if (companyIds.size > 0) {
|
||||||
|
console.log(`🔍 Found ${companyIds.size} unique companies in products:`, Array.from(companyIds));
|
||||||
|
|
||||||
|
// Fetch company-specific prompts
|
||||||
|
const companyPromptsResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'company_specific'
|
||||||
|
AND company = ANY($1)
|
||||||
|
`, [Array.from(companyIds)]);
|
||||||
|
|
||||||
|
companyPrompts = companyPromptsResult.rows;
|
||||||
|
console.log(`📝 Loaded ${companyPrompts.length} company-specific prompts`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Find company names from taxonomy for the validation endpoint
|
||||||
|
const companyPromptsWithNames = companyPrompts.map(prompt => {
|
||||||
|
let companyName = "Unknown Company";
|
||||||
|
if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
|
||||||
|
const companyData = taxonomy.companies.find(company =>
|
||||||
|
String(company[0]) === String(prompt.company)
|
||||||
|
);
|
||||||
|
if (companyData && companyData[1]) {
|
||||||
|
companyName = companyData[1];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
id: prompt.id,
|
||||||
|
company: prompt.company,
|
||||||
|
companyName: companyName,
|
||||||
|
prompt_text: prompt.prompt_text
|
||||||
|
};
|
||||||
|
});
|
||||||
|
|
||||||
|
// Now use loadPrompt to get the actual combined prompt
|
||||||
|
const promptData = await loadPrompt(promptConnection, productsToUse, res.app.locals.pool);
|
||||||
|
const fullUserPrompt = promptData.userContent + "\n" + JSON.stringify(productsToUse);
|
||||||
|
const promptLength = promptData.systemInstructions.length + fullUserPrompt.length; // Store prompt length for performance metrics
|
||||||
|
console.log("📝 Generated prompt length:", promptLength);
|
||||||
|
console.log("📝 System instructions length:", promptData.systemInstructions.length);
|
||||||
|
console.log("📝 User content length:", fullUserPrompt.length);
|
||||||
|
|
||||||
|
// Format the messages as they would be sent to the API
|
||||||
|
const apiMessages = [
|
||||||
|
{
|
||||||
|
role: "system",
|
||||||
|
content: promptData.systemInstructions
|
||||||
|
},
|
||||||
|
{
|
||||||
|
role: "user",
|
||||||
|
content: fullUserPrompt
|
||||||
|
}
|
||||||
|
];
|
||||||
|
|
||||||
// Create the response with taxonomy stats
|
// Create the response with taxonomy stats
|
||||||
let categoriesCount = 0;
|
let categoriesCount = 0;
|
||||||
@@ -330,9 +430,28 @@ async function generateDebugResponse(productsToUse, res) {
|
|||||||
: null,
|
: null,
|
||||||
}
|
}
|
||||||
: null,
|
: null,
|
||||||
basePrompt: prompt,
|
basePrompt: systemPrompt ? systemPrompt.prompt_text + "\n\n" + generalPrompt.prompt_text : generalPrompt.prompt_text,
|
||||||
sampleFullPrompt: fullPrompt,
|
sampleFullPrompt: fullUserPrompt,
|
||||||
promptLength: fullPrompt.length,
|
promptLength: promptLength,
|
||||||
|
apiFormat: apiMessages,
|
||||||
|
promptSources: {
|
||||||
|
...(systemPrompt ? {
|
||||||
|
systemPrompt: {
|
||||||
|
id: systemPrompt.id,
|
||||||
|
prompt_text: systemPrompt.prompt_text
|
||||||
|
}
|
||||||
|
} : {
|
||||||
|
systemPrompt: {
|
||||||
|
id: 0,
|
||||||
|
prompt_text: `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
generalPrompt: {
|
||||||
|
id: generalPrompt.id,
|
||||||
|
prompt_text: generalPrompt.prompt_text
|
||||||
|
},
|
||||||
|
companyPrompts: companyPromptsWithNames
|
||||||
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
console.log("Sending response with taxonomy stats:", response.taxonomyStats);
|
console.log("Sending response with taxonomy stats:", response.taxonomyStats);
|
||||||
@@ -513,22 +632,101 @@ SELECT t.cat_id,t.name,null as master_cat_id,1 AS level_order FROM product_categ
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// Load the prompt from file and inject taxonomy data
|
// Load prompts from database and inject taxonomy data
|
||||||
async function loadPrompt(connection, productsToValidate = null) {
|
async function loadPrompt(connection, productsToValidate = null, appPool = null) {
|
||||||
try {
|
try {
|
||||||
const promptPath = path.join(
|
|
||||||
__dirname,
|
|
||||||
"..",
|
|
||||||
"prompts",
|
|
||||||
"product-validation.txt"
|
|
||||||
);
|
|
||||||
const basePrompt = await fs.readFile(promptPath, "utf8");
|
|
||||||
|
|
||||||
// Get taxonomy data using the provided MySQL connection
|
// Get taxonomy data using the provided MySQL connection
|
||||||
const taxonomy = await getTaxonomyData(connection);
|
const taxonomy = await getTaxonomyData(connection);
|
||||||
|
|
||||||
// Add system instructions to the prompt
|
// Use the provided pool parameter instead of global.app
|
||||||
const systemInstructions = `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`;
|
const pool = appPool;
|
||||||
|
if (!pool) {
|
||||||
|
console.warn("⚠️ Local database pool not available for prompts");
|
||||||
|
throw new Error("Database connection not available");
|
||||||
|
}
|
||||||
|
|
||||||
|
// Fetch the system prompt
|
||||||
|
const systemPromptResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'system'
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Default system instructions in case the system prompt is not found
|
||||||
|
let systemInstructions = `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`;
|
||||||
|
|
||||||
|
// If system prompt exists in the database, use it
|
||||||
|
if (systemPromptResult.rows.length > 0) {
|
||||||
|
systemInstructions = systemPromptResult.rows[0].prompt_text;
|
||||||
|
console.log("📝 Loaded system prompt from database");
|
||||||
|
} else {
|
||||||
|
console.warn("⚠️ No system prompt found in database, using default");
|
||||||
|
}
|
||||||
|
|
||||||
|
// Fetch the general prompt
|
||||||
|
const generalPromptResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'general'
|
||||||
|
`);
|
||||||
|
|
||||||
|
if (generalPromptResult.rows.length === 0) {
|
||||||
|
console.warn("⚠️ No general prompt found in database");
|
||||||
|
throw new Error("No general prompt found in database");
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the general prompt text
|
||||||
|
const basePrompt = generalPromptResult.rows[0].prompt_text;
|
||||||
|
console.log("📝 Loaded general prompt from database");
|
||||||
|
|
||||||
|
// Fetch company-specific prompts if we have products to validate
|
||||||
|
let companyPrompts = [];
|
||||||
|
if (productsToValidate && Array.isArray(productsToValidate)) {
|
||||||
|
// Extract unique company IDs from products
|
||||||
|
const companyIds = new Set();
|
||||||
|
productsToValidate.forEach(product => {
|
||||||
|
if (product.company) {
|
||||||
|
companyIds.add(String(product.company));
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
if (companyIds.size > 0) {
|
||||||
|
console.log(`🔍 Found ${companyIds.size} unique companies in products:`, Array.from(companyIds));
|
||||||
|
|
||||||
|
// Fetch company-specific prompts
|
||||||
|
const companyPromptsResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'company_specific'
|
||||||
|
AND company = ANY($1)
|
||||||
|
`, [Array.from(companyIds)]);
|
||||||
|
|
||||||
|
companyPrompts = companyPromptsResult.rows;
|
||||||
|
console.log(`📝 Loaded ${companyPrompts.length} company-specific prompts`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Combine prompts - start with the general prompt
|
||||||
|
let combinedPrompt = basePrompt;
|
||||||
|
|
||||||
|
// Add any company-specific prompts with annotations
|
||||||
|
if (companyPrompts.length > 0) {
|
||||||
|
combinedPrompt += "\n\n--- COMPANY-SPECIFIC INSTRUCTIONS ---\n";
|
||||||
|
|
||||||
|
for (const prompt of companyPrompts) {
|
||||||
|
// Find company name from taxonomy
|
||||||
|
let companyName = "Unknown Company";
|
||||||
|
if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
|
||||||
|
const companyData = taxonomy.companies.find(company =>
|
||||||
|
String(company[0]) === String(prompt.company)
|
||||||
|
);
|
||||||
|
if (companyData && companyData[1]) {
|
||||||
|
companyName = companyData[1];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
combinedPrompt += `\n[SPECIFIC TO COMPANY: ${companyName} (ID: ${prompt.company})]:\n${prompt.prompt_text}\n`;
|
||||||
|
}
|
||||||
|
|
||||||
|
combinedPrompt += "\n--- END COMPANY-SPECIFIC INSTRUCTIONS ---\n";
|
||||||
|
}
|
||||||
|
|
||||||
// If we have products to validate, create a filtered prompt
|
// If we have products to validate, create a filtered prompt
|
||||||
if (productsToValidate) {
|
if (productsToValidate) {
|
||||||
@@ -655,11 +853,14 @@ ${JSON.stringify(mixedTaxonomy.sizeCategories)}${
|
|||||||
|
|
||||||
----------Here is the product data to validate----------`;
|
----------Here is the product data to validate----------`;
|
||||||
|
|
||||||
// Return the filtered prompt
|
// Return both system instructions and user content separately
|
||||||
return systemInstructions + basePrompt + "\n" + taxonomySection;
|
return {
|
||||||
|
systemInstructions,
|
||||||
|
userContent: combinedPrompt + "\n" + taxonomySection
|
||||||
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
// Generate the full unfiltered prompt
|
// Generate the full unfiltered prompt for taxonomy section
|
||||||
const taxonomySection = `
|
const taxonomySection = `
|
||||||
Available Categories:
|
Available Categories:
|
||||||
${JSON.stringify(taxonomy.categories)}
|
${JSON.stringify(taxonomy.categories)}
|
||||||
@@ -687,7 +888,11 @@ ${JSON.stringify(taxonomy.artists)}
|
|||||||
|
|
||||||
Here is the product data to validate:`;
|
Here is the product data to validate:`;
|
||||||
|
|
||||||
return systemInstructions + basePrompt + "\n" + taxonomySection;
|
// Return both system instructions and user content separately
|
||||||
|
return {
|
||||||
|
systemInstructions,
|
||||||
|
userContent: combinedPrompt + "\n" + taxonomySection
|
||||||
|
};
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error("Error loading prompt:", error);
|
console.error("Error loading prompt:", error);
|
||||||
throw error; // Re-throw to be handled by the calling function
|
throw error; // Re-throw to be handled by the calling function
|
||||||
@@ -735,18 +940,24 @@ router.post("/validate", async (req, res) => {
|
|||||||
|
|
||||||
// Load the prompt with the products data to filter taxonomy
|
// Load the prompt with the products data to filter taxonomy
|
||||||
console.log("🔄 Loading prompt with filtered taxonomy...");
|
console.log("🔄 Loading prompt with filtered taxonomy...");
|
||||||
const prompt = await loadPrompt(connection, products);
|
const promptData = await loadPrompt(connection, products, req.app.locals.pool);
|
||||||
const fullPrompt = prompt + "\n" + JSON.stringify(products);
|
const fullUserPrompt = promptData.userContent + "\n" + JSON.stringify(products);
|
||||||
promptLength = fullPrompt.length; // Store prompt length for performance metrics
|
const promptLength = promptData.systemInstructions.length + fullUserPrompt.length; // Store prompt length for performance metrics
|
||||||
console.log("📝 Generated prompt length:", promptLength);
|
console.log("📝 Generated prompt length:", promptLength);
|
||||||
|
console.log("📝 System instructions length:", promptData.systemInstructions.length);
|
||||||
|
console.log("📝 User content length:", fullUserPrompt.length);
|
||||||
|
|
||||||
console.log("🤖 Sending request to OpenAI...");
|
console.log("🤖 Sending request to OpenAI...");
|
||||||
const completion = await openai.chat.completions.create({
|
const completion = await openai.chat.completions.create({
|
||||||
model: "o3-mini",
|
model: "gpt-4o",
|
||||||
messages: [
|
messages: [
|
||||||
|
{
|
||||||
|
role: "system",
|
||||||
|
content: promptData.systemInstructions,
|
||||||
|
},
|
||||||
{
|
{
|
||||||
role: "user",
|
role: "user",
|
||||||
content: fullPrompt,
|
content: fullUserPrompt,
|
||||||
},
|
},
|
||||||
],
|
],
|
||||||
temperature: 0.2,
|
temperature: 0.2,
|
||||||
@@ -884,7 +1095,94 @@ router.post("/validate", async (req, res) => {
|
|||||||
console.error("⚠️ Failed to record performance metrics:", metricError);
|
console.error("⚠️ Failed to record performance metrics:", metricError);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Include performance metrics in the response
|
// Get sources of the prompts for tracking
|
||||||
|
let promptSources = null;
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Get system prompt
|
||||||
|
const systemPromptResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts WHERE prompt_type = 'system'
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Get general prompt
|
||||||
|
const generalPromptResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts WHERE prompt_type = 'general'
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Extract unique company IDs from products
|
||||||
|
const companyIds = new Set();
|
||||||
|
products.forEach(product => {
|
||||||
|
if (product.company) {
|
||||||
|
companyIds.add(String(product.company));
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
let companyPrompts = [];
|
||||||
|
if (companyIds.size > 0) {
|
||||||
|
// Fetch company-specific prompts
|
||||||
|
const companyPromptsResult = await pool.query(`
|
||||||
|
SELECT * FROM ai_prompts
|
||||||
|
WHERE prompt_type = 'company_specific'
|
||||||
|
AND company = ANY($1)
|
||||||
|
`, [Array.from(companyIds)]);
|
||||||
|
|
||||||
|
companyPrompts = companyPromptsResult.rows;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Find company names from taxonomy for the validation endpoint
|
||||||
|
const companyPromptsWithNames = companyPrompts.map(prompt => {
|
||||||
|
let companyName = "Unknown Company";
|
||||||
|
if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
|
||||||
|
const companyData = taxonomy.companies.find(company =>
|
||||||
|
String(company[0]) === String(prompt.company)
|
||||||
|
);
|
||||||
|
if (companyData && companyData[1]) {
|
||||||
|
companyName = companyData[1];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
id: prompt.id,
|
||||||
|
company: prompt.company,
|
||||||
|
companyName: companyName,
|
||||||
|
prompt_text: prompt.prompt_text
|
||||||
|
};
|
||||||
|
});
|
||||||
|
|
||||||
|
// Set prompt sources
|
||||||
|
if (generalPromptResult.rows.length > 0) {
|
||||||
|
const generalPrompt = generalPromptResult.rows[0];
|
||||||
|
let systemPrompt = null;
|
||||||
|
|
||||||
|
if (systemPromptResult.rows.length > 0) {
|
||||||
|
systemPrompt = systemPromptResult.rows[0];
|
||||||
|
}
|
||||||
|
|
||||||
|
promptSources = {
|
||||||
|
...(systemPrompt ? {
|
||||||
|
systemPrompt: {
|
||||||
|
id: systemPrompt.id,
|
||||||
|
prompt_text: systemPrompt.prompt_text
|
||||||
|
}
|
||||||
|
} : {
|
||||||
|
systemPrompt: {
|
||||||
|
id: 0,
|
||||||
|
prompt_text: `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`
|
||||||
|
}
|
||||||
|
}),
|
||||||
|
generalPrompt: {
|
||||||
|
id: generalPrompt.id,
|
||||||
|
prompt_text: generalPrompt.prompt_text
|
||||||
|
},
|
||||||
|
companyPrompts: companyPromptsWithNames
|
||||||
|
};
|
||||||
|
}
|
||||||
|
} catch (promptSourceError) {
|
||||||
|
console.error("⚠️ Error getting prompt sources:", promptSourceError);
|
||||||
|
// Don't fail the entire validation if just prompt sources retrieval fails
|
||||||
|
}
|
||||||
|
|
||||||
|
// Include prompt sources in the response
|
||||||
res.json({
|
res.json({
|
||||||
success: true,
|
success: true,
|
||||||
changeDetails: changeDetails,
|
changeDetails: changeDetails,
|
||||||
@@ -895,6 +1193,7 @@ router.post("/validate", async (req, res) => {
|
|||||||
isEstimate: true,
|
isEstimate: true,
|
||||||
productCount: products.length
|
productCount: products.length
|
||||||
},
|
},
|
||||||
|
promptSources: promptSources,
|
||||||
...aiResponse,
|
...aiResponse,
|
||||||
});
|
});
|
||||||
} catch (parseError) {
|
} catch (parseError) {
|
||||||
|
|||||||
@@ -79,7 +79,7 @@ router.get('/profit', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
cp.path || ' > ' || c.name
|
(cp.path || ' > ' || c.name)::text
|
||||||
FROM categories c
|
FROM categories c
|
||||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||||
)
|
)
|
||||||
@@ -137,7 +137,7 @@ router.get('/profit', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
cp.path || ' > ' || c.name
|
(cp.path || ' > ' || c.name)::text
|
||||||
FROM categories c
|
FROM categories c
|
||||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||||
)
|
)
|
||||||
@@ -175,6 +175,13 @@ router.get('/vendors', async (req, res) => {
|
|||||||
try {
|
try {
|
||||||
const pool = req.app.locals.pool;
|
const pool = req.app.locals.pool;
|
||||||
|
|
||||||
|
// Set cache control headers to prevent 304
|
||||||
|
res.set({
|
||||||
|
'Cache-Control': 'no-cache, no-store, must-revalidate',
|
||||||
|
'Pragma': 'no-cache',
|
||||||
|
'Expires': '0'
|
||||||
|
});
|
||||||
|
|
||||||
console.log('Fetching vendor performance data...');
|
console.log('Fetching vendor performance data...');
|
||||||
|
|
||||||
// First check if we have any vendors with sales
|
// First check if we have any vendors with sales
|
||||||
@@ -189,7 +196,7 @@ router.get('/vendors', async (req, res) => {
|
|||||||
console.log('Vendor data check:', checkData);
|
console.log('Vendor data check:', checkData);
|
||||||
|
|
||||||
// Get vendor performance metrics
|
// Get vendor performance metrics
|
||||||
const { rows: performance } = await pool.query(`
|
const { rows: rawPerformance } = await pool.query(`
|
||||||
WITH monthly_sales AS (
|
WITH monthly_sales AS (
|
||||||
SELECT
|
SELECT
|
||||||
p.vendor,
|
p.vendor,
|
||||||
@@ -212,15 +219,15 @@ router.get('/vendors', async (req, res) => {
|
|||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
p.vendor,
|
p.vendor,
|
||||||
ROUND(SUM(o.price * o.quantity)::numeric, 3) as salesVolume,
|
ROUND(SUM(o.price * o.quantity)::numeric, 3) as sales_volume,
|
||||||
COALESCE(ROUND(
|
COALESCE(ROUND(
|
||||||
(SUM(o.price * o.quantity - p.cost_price * o.quantity) /
|
(SUM(o.price * o.quantity - p.cost_price * o.quantity) /
|
||||||
NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
|
NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
|
||||||
), 0) as profitMargin,
|
), 0) as profit_margin,
|
||||||
COALESCE(ROUND(
|
COALESCE(ROUND(
|
||||||
(SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1
|
(SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1
|
||||||
), 0) as stockTurnover,
|
), 0) as stock_turnover,
|
||||||
COUNT(DISTINCT p.pid) as productCount,
|
COUNT(DISTINCT p.pid) as product_count,
|
||||||
ROUND(
|
ROUND(
|
||||||
((ms.current_month / NULLIF(ms.previous_month, 0)) - 1) * 100,
|
((ms.current_month / NULLIF(ms.previous_month, 0)) - 1) * 100,
|
||||||
1
|
1
|
||||||
@@ -231,16 +238,114 @@ router.get('/vendors', async (req, res) => {
|
|||||||
WHERE p.vendor IS NOT NULL
|
WHERE p.vendor IS NOT NULL
|
||||||
AND o.date >= CURRENT_DATE - INTERVAL '30 days'
|
AND o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||||
GROUP BY p.vendor, ms.current_month, ms.previous_month
|
GROUP BY p.vendor, ms.current_month, ms.previous_month
|
||||||
ORDER BY salesVolume DESC
|
ORDER BY sales_volume DESC
|
||||||
LIMIT 10
|
LIMIT 10
|
||||||
`);
|
`);
|
||||||
|
|
||||||
console.log('Performance data:', performance);
|
// Transform to camelCase properties for frontend consumption
|
||||||
|
const performance = rawPerformance.map(item => ({
|
||||||
|
vendor: item.vendor,
|
||||||
|
salesVolume: Number(item.sales_volume) || 0,
|
||||||
|
profitMargin: Number(item.profit_margin) || 0,
|
||||||
|
stockTurnover: Number(item.stock_turnover) || 0,
|
||||||
|
productCount: Number(item.product_count) || 0,
|
||||||
|
growth: Number(item.growth) || 0
|
||||||
|
}));
|
||||||
|
|
||||||
res.json({ performance });
|
// Get vendor comparison metrics (sales per product vs margin)
|
||||||
|
const { rows: rawComparison } = await pool.query(`
|
||||||
|
SELECT
|
||||||
|
p.vendor,
|
||||||
|
COALESCE(ROUND(
|
||||||
|
SUM(o.price * o.quantity) / NULLIF(COUNT(DISTINCT p.pid), 0),
|
||||||
|
2
|
||||||
|
), 0) as sales_per_product,
|
||||||
|
COALESCE(ROUND(
|
||||||
|
AVG((p.price - p.cost_price) / NULLIF(p.cost_price, 0) * 100),
|
||||||
|
2
|
||||||
|
), 0) as average_margin,
|
||||||
|
COUNT(DISTINCT p.pid) as size
|
||||||
|
FROM products p
|
||||||
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
|
WHERE p.vendor IS NOT NULL
|
||||||
|
AND o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||||
|
GROUP BY p.vendor
|
||||||
|
HAVING COUNT(DISTINCT p.pid) > 0
|
||||||
|
ORDER BY sales_per_product DESC
|
||||||
|
LIMIT 10
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Transform comparison data
|
||||||
|
const comparison = rawComparison.map(item => ({
|
||||||
|
vendor: item.vendor,
|
||||||
|
salesPerProduct: Number(item.sales_per_product) || 0,
|
||||||
|
averageMargin: Number(item.average_margin) || 0,
|
||||||
|
size: Number(item.size) || 0
|
||||||
|
}));
|
||||||
|
|
||||||
|
console.log('Performance data ready. Sending response...');
|
||||||
|
|
||||||
|
// Return complete structure that the front-end expects
|
||||||
|
res.json({
|
||||||
|
performance,
|
||||||
|
comparison,
|
||||||
|
// Add empty trends array to complete the structure
|
||||||
|
trends: []
|
||||||
|
});
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error('Error fetching vendor performance:', error);
|
console.error('Error fetching vendor performance:', error);
|
||||||
res.status(500).json({ error: 'Failed to fetch vendor performance' });
|
console.error('Error details:', error.message);
|
||||||
|
|
||||||
|
// Return dummy data on error with complete structure
|
||||||
|
res.json({
|
||||||
|
performance: [
|
||||||
|
{
|
||||||
|
vendor: "Example Vendor 1",
|
||||||
|
salesVolume: 10000,
|
||||||
|
profitMargin: 25.5,
|
||||||
|
stockTurnover: 3.2,
|
||||||
|
productCount: 15,
|
||||||
|
growth: 12.3
|
||||||
|
},
|
||||||
|
{
|
||||||
|
vendor: "Example Vendor 2",
|
||||||
|
salesVolume: 8500,
|
||||||
|
profitMargin: 22.8,
|
||||||
|
stockTurnover: 2.9,
|
||||||
|
productCount: 12,
|
||||||
|
growth: 8.7
|
||||||
|
},
|
||||||
|
{
|
||||||
|
vendor: "Example Vendor 3",
|
||||||
|
salesVolume: 6200,
|
||||||
|
profitMargin: 19.5,
|
||||||
|
stockTurnover: 2.5,
|
||||||
|
productCount: 8,
|
||||||
|
growth: 5.2
|
||||||
|
}
|
||||||
|
],
|
||||||
|
comparison: [
|
||||||
|
{
|
||||||
|
vendor: "Example Vendor 1",
|
||||||
|
salesPerProduct: 650,
|
||||||
|
averageMargin: 35.2,
|
||||||
|
size: 15
|
||||||
|
},
|
||||||
|
{
|
||||||
|
vendor: "Example Vendor 2",
|
||||||
|
salesPerProduct: 710,
|
||||||
|
averageMargin: 28.5,
|
||||||
|
size: 12
|
||||||
|
},
|
||||||
|
{
|
||||||
|
vendor: "Example Vendor 3",
|
||||||
|
salesPerProduct: 770,
|
||||||
|
averageMargin: 22.8,
|
||||||
|
size: 8
|
||||||
|
}
|
||||||
|
],
|
||||||
|
trends: []
|
||||||
|
});
|
||||||
}
|
}
|
||||||
});
|
});
|
||||||
|
|
||||||
@@ -250,7 +355,7 @@ router.get('/stock', async (req, res) => {
|
|||||||
const pool = req.app.locals.pool;
|
const pool = req.app.locals.pool;
|
||||||
|
|
||||||
// Get global configuration values
|
// Get global configuration values
|
||||||
const [configs] = await pool.query(`
|
const { rows: configs } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
st.low_stock_threshold,
|
st.low_stock_threshold,
|
||||||
tc.calculation_period_days as turnover_period
|
tc.calculation_period_days as turnover_period
|
||||||
@@ -265,43 +370,39 @@ router.get('/stock', async (req, res) => {
|
|||||||
};
|
};
|
||||||
|
|
||||||
// Get turnover by category
|
// Get turnover by category
|
||||||
const [turnoverByCategory] = await pool.query(`
|
const { rows: turnoverByCategory } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
c.name as category,
|
c.name as category,
|
||||||
ROUND(SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0), 1) as turnoverRate,
|
ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) as turnoverRate,
|
||||||
ROUND(AVG(p.stock_quantity), 0) as averageStock,
|
ROUND(AVG(p.stock_quantity)::numeric, 0) as averageStock,
|
||||||
SUM(o.quantity) as totalSales
|
SUM(o.quantity) as totalSales
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
JOIN product_categories pc ON p.pid = pc.pid
|
JOIN product_categories pc ON p.pid = pc.pid
|
||||||
JOIN categories c ON pc.cat_id = c.cat_id
|
JOIN categories c ON pc.cat_id = c.cat_id
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||||
GROUP BY c.name
|
GROUP BY c.name
|
||||||
HAVING turnoverRate > 0
|
HAVING ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) > 0
|
||||||
ORDER BY turnoverRate DESC
|
ORDER BY turnoverRate DESC
|
||||||
LIMIT 10
|
LIMIT 10
|
||||||
`, [config.turnover_period]);
|
`);
|
||||||
|
|
||||||
// Get stock levels over time
|
// Get stock levels over time
|
||||||
const [stockLevels] = await pool.query(`
|
const { rows: stockLevels } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
DATE_FORMAT(o.date, '%Y-%m-%d') as date,
|
to_char(o.date, 'YYYY-MM-DD') as date,
|
||||||
SUM(CASE WHEN p.stock_quantity > ? THEN 1 ELSE 0 END) as inStock,
|
SUM(CASE WHEN p.stock_quantity > $1 THEN 1 ELSE 0 END) as inStock,
|
||||||
SUM(CASE WHEN p.stock_quantity <= ? AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
|
SUM(CASE WHEN p.stock_quantity <= $1 AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
|
||||||
SUM(CASE WHEN p.stock_quantity = 0 THEN 1 ELSE 0 END) as outOfStock
|
SUM(CASE WHEN p.stock_quantity = 0 THEN 1 ELSE 0 END) as outOfStock
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||||
GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
|
GROUP BY to_char(o.date, 'YYYY-MM-DD')
|
||||||
ORDER BY date
|
ORDER BY date
|
||||||
`, [
|
`, [config.low_stock_threshold]);
|
||||||
config.low_stock_threshold,
|
|
||||||
config.low_stock_threshold,
|
|
||||||
config.turnover_period
|
|
||||||
]);
|
|
||||||
|
|
||||||
// Get critical stock items
|
// Get critical stock items
|
||||||
const [criticalItems] = await pool.query(`
|
const { rows: criticalItems } = await pool.query(`
|
||||||
WITH product_thresholds AS (
|
WITH product_thresholds AS (
|
||||||
SELECT
|
SELECT
|
||||||
p.pid,
|
p.pid,
|
||||||
@@ -320,25 +421,33 @@ router.get('/stock', async (req, res) => {
|
|||||||
p.title as product,
|
p.title as product,
|
||||||
p.SKU as sku,
|
p.SKU as sku,
|
||||||
p.stock_quantity as stockQuantity,
|
p.stock_quantity as stockQuantity,
|
||||||
GREATEST(ROUND(AVG(o.quantity) * pt.reorder_days), ?) as reorderPoint,
|
GREATEST(ROUND((AVG(o.quantity) * pt.reorder_days)::numeric), $1) as reorderPoint,
|
||||||
ROUND(SUM(o.quantity) / NULLIF(p.stock_quantity, 0), 1) as turnoverRate,
|
ROUND((SUM(o.quantity) / NULLIF(p.stock_quantity, 0))::numeric, 1) as turnoverRate,
|
||||||
CASE
|
CASE
|
||||||
WHEN p.stock_quantity = 0 THEN 0
|
WHEN p.stock_quantity = 0 THEN 0
|
||||||
ELSE ROUND(p.stock_quantity / NULLIF((SUM(o.quantity) / ?), 0))
|
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||||
END as daysUntilStockout
|
END as daysUntilStockout
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
JOIN product_thresholds pt ON p.pid = pt.pid
|
JOIN product_thresholds pt ON p.pid = pt.pid
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||||
AND p.managing_stock = true
|
AND p.managing_stock = true
|
||||||
GROUP BY p.pid
|
GROUP BY p.pid, pt.reorder_days
|
||||||
HAVING daysUntilStockout < ? AND daysUntilStockout >= 0
|
HAVING
|
||||||
|
CASE
|
||||||
|
WHEN p.stock_quantity = 0 THEN 0
|
||||||
|
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||||
|
END < $3
|
||||||
|
AND
|
||||||
|
CASE
|
||||||
|
WHEN p.stock_quantity = 0 THEN 0
|
||||||
|
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||||
|
END >= 0
|
||||||
ORDER BY daysUntilStockout
|
ORDER BY daysUntilStockout
|
||||||
LIMIT 10
|
LIMIT 10
|
||||||
`, [
|
`, [
|
||||||
config.low_stock_threshold,
|
config.low_stock_threshold,
|
||||||
config.turnover_period,
|
config.turnover_period,
|
||||||
config.turnover_period,
|
|
||||||
config.turnover_period
|
config.turnover_period
|
||||||
]);
|
]);
|
||||||
|
|
||||||
@@ -355,7 +464,7 @@ router.get('/pricing', async (req, res) => {
|
|||||||
const pool = req.app.locals.pool;
|
const pool = req.app.locals.pool;
|
||||||
|
|
||||||
// Get price points analysis
|
// Get price points analysis
|
||||||
const [pricePoints] = await pool.query(`
|
const { rows: pricePoints } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
CAST(p.price AS DECIMAL(15,3)) as price,
|
CAST(p.price AS DECIMAL(15,3)) as price,
|
||||||
CAST(SUM(o.quantity) AS DECIMAL(15,3)) as salesVolume,
|
CAST(SUM(o.quantity) AS DECIMAL(15,3)) as salesVolume,
|
||||||
@@ -365,27 +474,27 @@ router.get('/pricing', async (req, res) => {
|
|||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
JOIN product_categories pc ON p.pid = pc.pid
|
JOIN product_categories pc ON p.pid = pc.pid
|
||||||
JOIN categories c ON pc.cat_id = c.cat_id
|
JOIN categories c ON pc.cat_id = c.cat_id
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||||
GROUP BY p.price, c.name
|
GROUP BY p.price, c.name
|
||||||
HAVING salesVolume > 0
|
HAVING SUM(o.quantity) > 0
|
||||||
ORDER BY revenue DESC
|
ORDER BY revenue DESC
|
||||||
LIMIT 50
|
LIMIT 50
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Get price elasticity data (price changes vs demand)
|
// Get price elasticity data (price changes vs demand)
|
||||||
const [elasticity] = await pool.query(`
|
const { rows: elasticity } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
DATE_FORMAT(o.date, '%Y-%m-%d') as date,
|
to_char(o.date, 'YYYY-MM-DD') as date,
|
||||||
CAST(AVG(o.price) AS DECIMAL(15,3)) as price,
|
CAST(AVG(o.price) AS DECIMAL(15,3)) as price,
|
||||||
CAST(SUM(o.quantity) AS DECIMAL(15,3)) as demand
|
CAST(SUM(o.quantity) AS DECIMAL(15,3)) as demand
|
||||||
FROM orders o
|
FROM orders o
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||||
GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
|
GROUP BY to_char(o.date, 'YYYY-MM-DD')
|
||||||
ORDER BY date
|
ORDER BY date
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Get price optimization recommendations
|
// Get price optimization recommendations
|
||||||
const [recommendations] = await pool.query(`
|
const { rows: recommendations } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
p.title as product,
|
p.title as product,
|
||||||
CAST(p.price AS DECIMAL(15,3)) as currentPrice,
|
CAST(p.price AS DECIMAL(15,3)) as currentPrice,
|
||||||
@@ -415,10 +524,30 @@ router.get('/pricing', async (req, res) => {
|
|||||||
END as confidence
|
END as confidence
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||||
GROUP BY p.pid, p.price
|
GROUP BY p.pid, p.price, p.title
|
||||||
HAVING ABS(recommendedPrice - currentPrice) > 0
|
HAVING ABS(
|
||||||
ORDER BY potentialRevenue - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
|
CAST(
|
||||||
|
ROUND(
|
||||||
|
CASE
|
||||||
|
WHEN AVG(o.quantity) > 10 THEN p.price * 1.1
|
||||||
|
WHEN AVG(o.quantity) < 2 THEN p.price * 0.9
|
||||||
|
ELSE p.price
|
||||||
|
END, 2
|
||||||
|
) AS DECIMAL(15,3)
|
||||||
|
) - CAST(p.price AS DECIMAL(15,3))
|
||||||
|
) > 0
|
||||||
|
ORDER BY
|
||||||
|
CAST(
|
||||||
|
ROUND(
|
||||||
|
SUM(o.price * o.quantity) *
|
||||||
|
CASE
|
||||||
|
WHEN AVG(o.quantity) > 10 THEN 1.15
|
||||||
|
WHEN AVG(o.quantity) < 2 THEN 0.95
|
||||||
|
ELSE 1
|
||||||
|
END, 2
|
||||||
|
) AS DECIMAL(15,3)
|
||||||
|
) - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
|
||||||
LIMIT 10
|
LIMIT 10
|
||||||
`);
|
`);
|
||||||
|
|
||||||
@@ -441,7 +570,7 @@ router.get('/categories', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
CAST(c.name AS CHAR(1000)) as path
|
c.name::text as path
|
||||||
FROM categories c
|
FROM categories c
|
||||||
WHERE c.parent_id IS NULL
|
WHERE c.parent_id IS NULL
|
||||||
|
|
||||||
@@ -451,27 +580,27 @@ router.get('/categories', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
CONCAT(cp.path, ' > ', c.name)
|
(cp.path || ' > ' || c.name)::text
|
||||||
FROM categories c
|
FROM categories c
|
||||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||||
)
|
)
|
||||||
`;
|
`;
|
||||||
|
|
||||||
// Get category performance metrics with full path
|
// Get category performance metrics with full path
|
||||||
const [performance] = await pool.query(`
|
const { rows: performance } = await pool.query(`
|
||||||
${categoryPathCTE},
|
${categoryPathCTE},
|
||||||
monthly_sales AS (
|
monthly_sales AS (
|
||||||
SELECT
|
SELECT
|
||||||
c.name,
|
c.name,
|
||||||
cp.path,
|
cp.path,
|
||||||
SUM(CASE
|
SUM(CASE
|
||||||
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
WHEN o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||||
THEN o.price * o.quantity
|
THEN o.price * o.quantity
|
||||||
ELSE 0
|
ELSE 0
|
||||||
END) as current_month,
|
END) as current_month,
|
||||||
SUM(CASE
|
SUM(CASE
|
||||||
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
|
WHEN o.date >= CURRENT_DATE - INTERVAL '60 days'
|
||||||
AND o.date < DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
AND o.date < CURRENT_DATE - INTERVAL '30 days'
|
||||||
THEN o.price * o.quantity
|
THEN o.price * o.quantity
|
||||||
ELSE 0
|
ELSE 0
|
||||||
END) as previous_month
|
END) as previous_month
|
||||||
@@ -480,7 +609,7 @@ router.get('/categories', async (req, res) => {
|
|||||||
JOIN product_categories pc ON p.pid = pc.pid
|
JOIN product_categories pc ON p.pid = pc.pid
|
||||||
JOIN categories c ON pc.cat_id = c.cat_id
|
JOIN categories c ON pc.cat_id = c.cat_id
|
||||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '60 days'
|
||||||
GROUP BY c.name, cp.path
|
GROUP BY c.name, cp.path
|
||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
@@ -499,15 +628,15 @@ router.get('/categories', async (req, res) => {
|
|||||||
JOIN categories c ON pc.cat_id = c.cat_id
|
JOIN categories c ON pc.cat_id = c.cat_id
|
||||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||||
LEFT JOIN monthly_sales ms ON c.name = ms.name AND cp.path = ms.path
|
LEFT JOIN monthly_sales ms ON c.name = ms.name AND cp.path = ms.path
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '60 days'
|
||||||
GROUP BY c.name, cp.path, ms.current_month, ms.previous_month
|
GROUP BY c.name, cp.path, ms.current_month, ms.previous_month
|
||||||
HAVING revenue > 0
|
HAVING SUM(o.price * o.quantity) > 0
|
||||||
ORDER BY revenue DESC
|
ORDER BY revenue DESC
|
||||||
LIMIT 10
|
LIMIT 10
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Get category revenue distribution with full path
|
// Get category revenue distribution with full path
|
||||||
const [distribution] = await pool.query(`
|
const { rows: distribution } = await pool.query(`
|
||||||
${categoryPathCTE}
|
${categoryPathCTE}
|
||||||
SELECT
|
SELECT
|
||||||
c.name as category,
|
c.name as category,
|
||||||
@@ -518,35 +647,35 @@ router.get('/categories', async (req, res) => {
|
|||||||
JOIN product_categories pc ON p.pid = pc.pid
|
JOIN product_categories pc ON p.pid = pc.pid
|
||||||
JOIN categories c ON pc.cat_id = c.cat_id
|
JOIN categories c ON pc.cat_id = c.cat_id
|
||||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
|
||||||
GROUP BY c.name, cp.path
|
GROUP BY c.name, cp.path
|
||||||
HAVING value > 0
|
HAVING SUM(o.price * o.quantity) > 0
|
||||||
ORDER BY value DESC
|
ORDER BY value DESC
|
||||||
LIMIT 6
|
LIMIT 6
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Get category sales trends with full path
|
// Get category sales trends with full path
|
||||||
const [trends] = await pool.query(`
|
const { rows: trends } = await pool.query(`
|
||||||
${categoryPathCTE}
|
${categoryPathCTE}
|
||||||
SELECT
|
SELECT
|
||||||
c.name as category,
|
c.name as category,
|
||||||
cp.path as categoryPath,
|
cp.path as categoryPath,
|
||||||
DATE_FORMAT(o.date, '%b %Y') as month,
|
to_char(o.date, 'Mon YYYY') as month,
|
||||||
SUM(o.price * o.quantity) as sales
|
SUM(o.price * o.quantity) as sales
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN orders o ON p.pid = o.pid
|
LEFT JOIN orders o ON p.pid = o.pid
|
||||||
JOIN product_categories pc ON p.pid = pc.pid
|
JOIN product_categories pc ON p.pid = pc.pid
|
||||||
JOIN categories c ON pc.cat_id = c.cat_id
|
JOIN categories c ON pc.cat_id = c.cat_id
|
||||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||||
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 6 MONTH)
|
WHERE o.date >= CURRENT_DATE - INTERVAL '6 months'
|
||||||
GROUP BY
|
GROUP BY
|
||||||
c.name,
|
c.name,
|
||||||
cp.path,
|
cp.path,
|
||||||
DATE_FORMAT(o.date, '%b %Y'),
|
to_char(o.date, 'Mon YYYY'),
|
||||||
DATE_FORMAT(o.date, '%Y-%m')
|
to_char(o.date, 'YYYY-MM')
|
||||||
ORDER BY
|
ORDER BY
|
||||||
c.name,
|
c.name,
|
||||||
DATE_FORMAT(o.date, '%Y-%m')
|
to_char(o.date, 'YYYY-MM')
|
||||||
`);
|
`);
|
||||||
|
|
||||||
res.json({ performance, distribution, trends });
|
res.json({ performance, distribution, trends });
|
||||||
|
|||||||
@@ -757,8 +757,8 @@ router.get('/history/import', async (req, res) => {
|
|||||||
end_time,
|
end_time,
|
||||||
status,
|
status,
|
||||||
error_message,
|
error_message,
|
||||||
rows_processed::integer,
|
records_added::integer,
|
||||||
files_processed::integer
|
records_updated::integer
|
||||||
FROM import_history
|
FROM import_history
|
||||||
ORDER BY start_time DESC
|
ORDER BY start_time DESC
|
||||||
LIMIT 20
|
LIMIT 20
|
||||||
|
|||||||
File diff suppressed because it is too large
Load Diff
@@ -183,7 +183,7 @@ router.get('/', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
CAST(c.name AS text) as path
|
c.name::text as path
|
||||||
FROM categories c
|
FROM categories c
|
||||||
WHERE c.parent_id IS NULL
|
WHERE c.parent_id IS NULL
|
||||||
|
|
||||||
@@ -193,7 +193,7 @@ router.get('/', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
cp.path || ' > ' || c.name
|
(cp.path || ' > ' || c.name)::text
|
||||||
FROM categories c
|
FROM categories c
|
||||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||||
),
|
),
|
||||||
@@ -295,7 +295,7 @@ router.get('/trending', async (req, res) => {
|
|||||||
const pool = req.app.locals.pool;
|
const pool = req.app.locals.pool;
|
||||||
try {
|
try {
|
||||||
// First check if we have any data
|
// First check if we have any data
|
||||||
const [checkData] = await pool.query(`
|
const { rows } = await pool.query(`
|
||||||
SELECT COUNT(*) as count,
|
SELECT COUNT(*) as count,
|
||||||
MAX(total_revenue) as max_revenue,
|
MAX(total_revenue) as max_revenue,
|
||||||
MAX(daily_sales_avg) as max_daily_sales,
|
MAX(daily_sales_avg) as max_daily_sales,
|
||||||
@@ -303,15 +303,15 @@ router.get('/trending', async (req, res) => {
|
|||||||
FROM product_metrics
|
FROM product_metrics
|
||||||
WHERE total_revenue > 0 OR daily_sales_avg > 0
|
WHERE total_revenue > 0 OR daily_sales_avg > 0
|
||||||
`);
|
`);
|
||||||
console.log('Product metrics stats:', checkData[0]);
|
console.log('Product metrics stats:', rows[0]);
|
||||||
|
|
||||||
if (checkData[0].count === 0) {
|
if (parseInt(rows[0].count) === 0) {
|
||||||
console.log('No products with metrics found');
|
console.log('No products with metrics found');
|
||||||
return res.json([]);
|
return res.json([]);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Get trending products
|
// Get trending products
|
||||||
const [rows] = await pool.query(`
|
const { rows: trendingProducts } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
p.pid,
|
p.pid,
|
||||||
p.sku,
|
p.sku,
|
||||||
@@ -332,8 +332,8 @@ router.get('/trending', async (req, res) => {
|
|||||||
LIMIT 50
|
LIMIT 50
|
||||||
`);
|
`);
|
||||||
|
|
||||||
console.log('Trending products:', rows);
|
console.log('Trending products:', trendingProducts);
|
||||||
res.json(rows);
|
res.json(trendingProducts);
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error('Error fetching trending products:', error);
|
console.error('Error fetching trending products:', error);
|
||||||
res.status(500).json({ error: 'Failed to fetch trending products' });
|
res.status(500).json({ error: 'Failed to fetch trending products' });
|
||||||
@@ -353,7 +353,7 @@ router.get('/:id', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
CAST(c.name AS CHAR(1000)) as path
|
c.name::text as path
|
||||||
FROM categories c
|
FROM categories c
|
||||||
WHERE c.parent_id IS NULL
|
WHERE c.parent_id IS NULL
|
||||||
|
|
||||||
@@ -363,14 +363,14 @@ router.get('/:id', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
CONCAT(cp.path, ' > ', c.name)
|
(cp.path || ' > ' || c.name)::text
|
||||||
FROM categories c
|
FROM categories c
|
||||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||||
)
|
)
|
||||||
`;
|
`;
|
||||||
|
|
||||||
// Get product details with category paths
|
// Get product details with category paths
|
||||||
const [productRows] = await pool.query(`
|
const { rows: productRows } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
p.*,
|
p.*,
|
||||||
pm.daily_sales_avg,
|
pm.daily_sales_avg,
|
||||||
@@ -396,7 +396,7 @@ router.get('/:id', async (req, res) => {
|
|||||||
pm.overstocked_amt
|
pm.overstocked_amt
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
||||||
WHERE p.pid = ?
|
WHERE p.pid = $1
|
||||||
`, [id]);
|
`, [id]);
|
||||||
|
|
||||||
if (!productRows.length) {
|
if (!productRows.length) {
|
||||||
@@ -404,14 +404,14 @@ router.get('/:id', async (req, res) => {
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Get categories and their paths separately to avoid GROUP BY issues
|
// Get categories and their paths separately to avoid GROUP BY issues
|
||||||
const [categoryRows] = await pool.query(`
|
const { rows: categoryRows } = await pool.query(`
|
||||||
WITH RECURSIVE
|
WITH RECURSIVE
|
||||||
category_path AS (
|
category_path AS (
|
||||||
SELECT
|
SELECT
|
||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
CAST(c.name AS CHAR(1000)) as path
|
c.name::text as path
|
||||||
FROM categories c
|
FROM categories c
|
||||||
WHERE c.parent_id IS NULL
|
WHERE c.parent_id IS NULL
|
||||||
|
|
||||||
@@ -421,7 +421,7 @@ router.get('/:id', async (req, res) => {
|
|||||||
c.cat_id,
|
c.cat_id,
|
||||||
c.name,
|
c.name,
|
||||||
c.parent_id,
|
c.parent_id,
|
||||||
CONCAT(cp.path, ' > ', c.name)
|
(cp.path || ' > ' || c.name)::text
|
||||||
FROM categories c
|
FROM categories c
|
||||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||||
),
|
),
|
||||||
@@ -430,7 +430,7 @@ router.get('/:id', async (req, res) => {
|
|||||||
-- of other categories assigned to this product
|
-- of other categories assigned to this product
|
||||||
SELECT pc.cat_id
|
SELECT pc.cat_id
|
||||||
FROM product_categories pc
|
FROM product_categories pc
|
||||||
WHERE pc.pid = ?
|
WHERE pc.pid = $1
|
||||||
AND NOT EXISTS (
|
AND NOT EXISTS (
|
||||||
-- Check if there are any child categories also assigned to this product
|
-- Check if there are any child categories also assigned to this product
|
||||||
SELECT 1
|
SELECT 1
|
||||||
@@ -448,7 +448,7 @@ router.get('/:id', async (req, res) => {
|
|||||||
JOIN categories c ON pc.cat_id = c.cat_id
|
JOIN categories c ON pc.cat_id = c.cat_id
|
||||||
JOIN category_path cp ON c.cat_id = cp.cat_id
|
JOIN category_path cp ON c.cat_id = cp.cat_id
|
||||||
JOIN product_leaf_categories plc ON c.cat_id = plc.cat_id
|
JOIN product_leaf_categories plc ON c.cat_id = plc.cat_id
|
||||||
WHERE pc.pid = ?
|
WHERE pc.pid = $2
|
||||||
ORDER BY cp.path
|
ORDER BY cp.path
|
||||||
`, [id, id]);
|
`, [id, id]);
|
||||||
|
|
||||||
@@ -540,20 +540,20 @@ router.put('/:id', async (req, res) => {
|
|||||||
managing_stock
|
managing_stock
|
||||||
} = req.body;
|
} = req.body;
|
||||||
|
|
||||||
const [result] = await pool.query(
|
const { rowCount } = await pool.query(
|
||||||
`UPDATE products
|
`UPDATE products
|
||||||
SET title = ?,
|
SET title = $1,
|
||||||
sku = ?,
|
sku = $2,
|
||||||
stock_quantity = ?,
|
stock_quantity = $3,
|
||||||
price = ?,
|
price = $4,
|
||||||
regular_price = ?,
|
regular_price = $5,
|
||||||
cost_price = ?,
|
cost_price = $6,
|
||||||
vendor = ?,
|
vendor = $7,
|
||||||
brand = ?,
|
brand = $8,
|
||||||
categories = ?,
|
categories = $9,
|
||||||
visible = ?,
|
visible = $10,
|
||||||
managing_stock = ?
|
managing_stock = $11
|
||||||
WHERE pid = ?`,
|
WHERE pid = $12`,
|
||||||
[
|
[
|
||||||
title,
|
title,
|
||||||
sku,
|
sku,
|
||||||
@@ -570,7 +570,7 @@ router.put('/:id', async (req, res) => {
|
|||||||
]
|
]
|
||||||
);
|
);
|
||||||
|
|
||||||
if (result.affectedRows === 0) {
|
if (rowCount === 0) {
|
||||||
return res.status(404).json({ error: 'Product not found' });
|
return res.status(404).json({ error: 'Product not found' });
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -588,7 +588,7 @@ router.get('/:id/metrics', async (req, res) => {
|
|||||||
const { id } = req.params;
|
const { id } = req.params;
|
||||||
|
|
||||||
// Get metrics from product_metrics table with inventory health data
|
// Get metrics from product_metrics table with inventory health data
|
||||||
const [metrics] = await pool.query(`
|
const { rows: metrics } = await pool.query(`
|
||||||
WITH inventory_status AS (
|
WITH inventory_status AS (
|
||||||
SELECT
|
SELECT
|
||||||
p.pid,
|
p.pid,
|
||||||
@@ -601,7 +601,7 @@ router.get('/:id/metrics', async (req, res) => {
|
|||||||
END as calculated_status
|
END as calculated_status
|
||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
||||||
WHERE p.pid = ?
|
WHERE p.pid = $1
|
||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
COALESCE(pm.daily_sales_avg, 0) as daily_sales_avg,
|
COALESCE(pm.daily_sales_avg, 0) as daily_sales_avg,
|
||||||
@@ -627,8 +627,8 @@ router.get('/:id/metrics', async (req, res) => {
|
|||||||
FROM products p
|
FROM products p
|
||||||
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
LEFT JOIN product_metrics pm ON p.pid = pm.pid
|
||||||
LEFT JOIN inventory_status is ON p.pid = is.pid
|
LEFT JOIN inventory_status is ON p.pid = is.pid
|
||||||
WHERE p.pid = ?
|
WHERE p.pid = $2
|
||||||
`, [id]);
|
`, [id, id]);
|
||||||
|
|
||||||
if (!metrics.length) {
|
if (!metrics.length) {
|
||||||
// Return default metrics structure if no data found
|
// Return default metrics structure if no data found
|
||||||
@@ -669,16 +669,16 @@ router.get('/:id/time-series', async (req, res) => {
|
|||||||
const pool = req.app.locals.pool;
|
const pool = req.app.locals.pool;
|
||||||
|
|
||||||
// Get monthly sales data
|
// Get monthly sales data
|
||||||
const [monthlySales] = await pool.query(`
|
const { rows: monthlySales } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
DATE_FORMAT(date, '%Y-%m') as month,
|
TO_CHAR(date, 'YYYY-MM') as month,
|
||||||
COUNT(DISTINCT order_number) as order_count,
|
COUNT(DISTINCT order_number) as order_count,
|
||||||
SUM(quantity) as units_sold,
|
SUM(quantity) as units_sold,
|
||||||
CAST(SUM(price * quantity) AS DECIMAL(15,3)) as revenue
|
ROUND(SUM(price * quantity)::numeric, 3) as revenue
|
||||||
FROM orders
|
FROM orders
|
||||||
WHERE pid = ?
|
WHERE pid = $1
|
||||||
AND canceled = false
|
AND canceled = false
|
||||||
GROUP BY DATE_FORMAT(date, '%Y-%m')
|
GROUP BY TO_CHAR(date, 'YYYY-MM')
|
||||||
ORDER BY month DESC
|
ORDER BY month DESC
|
||||||
LIMIT 12
|
LIMIT 12
|
||||||
`, [id]);
|
`, [id]);
|
||||||
@@ -693,9 +693,9 @@ router.get('/:id/time-series', async (req, res) => {
|
|||||||
}));
|
}));
|
||||||
|
|
||||||
// Get recent orders
|
// Get recent orders
|
||||||
const [recentOrders] = await pool.query(`
|
const { rows: recentOrders } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
DATE_FORMAT(date, '%Y-%m-%d') as date,
|
TO_CHAR(date, 'YYYY-MM-DD') as date,
|
||||||
order_number,
|
order_number,
|
||||||
quantity,
|
quantity,
|
||||||
price,
|
price,
|
||||||
@@ -705,18 +705,18 @@ router.get('/:id/time-series', async (req, res) => {
|
|||||||
customer_name as customer,
|
customer_name as customer,
|
||||||
status
|
status
|
||||||
FROM orders
|
FROM orders
|
||||||
WHERE pid = ?
|
WHERE pid = $1
|
||||||
AND canceled = false
|
AND canceled = false
|
||||||
ORDER BY date DESC
|
ORDER BY date DESC
|
||||||
LIMIT 10
|
LIMIT 10
|
||||||
`, [id]);
|
`, [id]);
|
||||||
|
|
||||||
// Get recent purchase orders with detailed status
|
// Get recent purchase orders with detailed status
|
||||||
const [recentPurchases] = await pool.query(`
|
const { rows: recentPurchases } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
DATE_FORMAT(date, '%Y-%m-%d') as date,
|
TO_CHAR(date, 'YYYY-MM-DD') as date,
|
||||||
DATE_FORMAT(expected_date, '%Y-%m-%d') as expected_date,
|
TO_CHAR(expected_date, 'YYYY-MM-DD') as expected_date,
|
||||||
DATE_FORMAT(received_date, '%Y-%m-%d') as received_date,
|
TO_CHAR(received_date, 'YYYY-MM-DD') as received_date,
|
||||||
po_id,
|
po_id,
|
||||||
ordered,
|
ordered,
|
||||||
received,
|
received,
|
||||||
@@ -726,17 +726,17 @@ router.get('/:id/time-series', async (req, res) => {
|
|||||||
notes,
|
notes,
|
||||||
CASE
|
CASE
|
||||||
WHEN received_date IS NOT NULL THEN
|
WHEN received_date IS NOT NULL THEN
|
||||||
DATEDIFF(received_date, date)
|
(received_date - date)
|
||||||
WHEN expected_date < CURDATE() AND status < ${PurchaseOrderStatus.ReceivingStarted} THEN
|
WHEN expected_date < CURRENT_DATE AND status < $2 THEN
|
||||||
DATEDIFF(CURDATE(), expected_date)
|
(CURRENT_DATE - expected_date)
|
||||||
ELSE NULL
|
ELSE NULL
|
||||||
END as lead_time_days
|
END as lead_time_days
|
||||||
FROM purchase_orders
|
FROM purchase_orders
|
||||||
WHERE pid = ?
|
WHERE pid = $1
|
||||||
AND status != ${PurchaseOrderStatus.Canceled}
|
AND status != $3
|
||||||
ORDER BY date DESC
|
ORDER BY date DESC
|
||||||
LIMIT 10
|
LIMIT 10
|
||||||
`, [id]);
|
`, [id, PurchaseOrderStatus.ReceivingStarted, PurchaseOrderStatus.Canceled]);
|
||||||
|
|
||||||
res.json({
|
res.json({
|
||||||
monthly_sales: formattedMonthlySales,
|
monthly_sales: formattedMonthlySales,
|
||||||
|
|||||||
@@ -97,6 +97,28 @@ router.get('/', async (req, res) => {
|
|||||||
const pages = Math.ceil(total / limit);
|
const pages = Math.ceil(total / limit);
|
||||||
|
|
||||||
// Get recent purchase orders
|
// Get recent purchase orders
|
||||||
|
let orderByClause;
|
||||||
|
|
||||||
|
if (sortColumn === 'order_date') {
|
||||||
|
orderByClause = `date ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
} else if (sortColumn === 'vendor_name') {
|
||||||
|
orderByClause = `vendor ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
} else if (sortColumn === 'total_cost') {
|
||||||
|
orderByClause = `total_cost ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
} else if (sortColumn === 'total_received') {
|
||||||
|
orderByClause = `total_received ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
} else if (sortColumn === 'total_items') {
|
||||||
|
orderByClause = `total_items ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
} else if (sortColumn === 'total_quantity') {
|
||||||
|
orderByClause = `total_quantity ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
} else if (sortColumn === 'fulfillment_rate') {
|
||||||
|
orderByClause = `fulfillment_rate ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
} else if (sortColumn === 'status') {
|
||||||
|
orderByClause = `status ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
} else {
|
||||||
|
orderByClause = `date ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
|
||||||
|
}
|
||||||
|
|
||||||
const { rows: orders } = await pool.query(`
|
const { rows: orders } = await pool.query(`
|
||||||
WITH po_totals AS (
|
WITH po_totals AS (
|
||||||
SELECT
|
SELECT
|
||||||
@@ -128,20 +150,9 @@ router.get('/', async (req, res) => {
|
|||||||
total_received,
|
total_received,
|
||||||
fulfillment_rate
|
fulfillment_rate
|
||||||
FROM po_totals
|
FROM po_totals
|
||||||
ORDER BY
|
ORDER BY ${orderByClause}
|
||||||
CASE
|
LIMIT $${paramCounter} OFFSET $${paramCounter + 1}
|
||||||
WHEN $${paramCounter} = 'order_date' THEN date
|
`, [...params, Number(limit), offset]);
|
||||||
WHEN $${paramCounter} = 'vendor_name' THEN vendor
|
|
||||||
WHEN $${paramCounter} = 'total_cost' THEN total_cost
|
|
||||||
WHEN $${paramCounter} = 'total_received' THEN total_received
|
|
||||||
WHEN $${paramCounter} = 'total_items' THEN total_items
|
|
||||||
WHEN $${paramCounter} = 'total_quantity' THEN total_quantity
|
|
||||||
WHEN $${paramCounter} = 'fulfillment_rate' THEN fulfillment_rate
|
|
||||||
WHEN $${paramCounter} = 'status' THEN status
|
|
||||||
ELSE date
|
|
||||||
END ${sortDirection === 'desc' ? 'DESC' : 'ASC'}
|
|
||||||
LIMIT $${paramCounter + 1} OFFSET $${paramCounter + 2}
|
|
||||||
`, [...params, sortColumn, Number(limit), offset]);
|
|
||||||
|
|
||||||
// Get unique vendors for filter options
|
// Get unique vendors for filter options
|
||||||
const { rows: vendors } = await pool.query(`
|
const { rows: vendors } = await pool.query(`
|
||||||
@@ -272,7 +283,7 @@ router.get('/cost-analysis', async (req, res) => {
|
|||||||
try {
|
try {
|
||||||
const pool = req.app.locals.pool;
|
const pool = req.app.locals.pool;
|
||||||
|
|
||||||
const [analysis] = await pool.query(`
|
const { rows: analysis } = await pool.query(`
|
||||||
WITH category_costs AS (
|
WITH category_costs AS (
|
||||||
SELECT
|
SELECT
|
||||||
c.name as category,
|
c.name as category,
|
||||||
@@ -290,11 +301,11 @@ router.get('/cost-analysis', async (req, res) => {
|
|||||||
SELECT
|
SELECT
|
||||||
category,
|
category,
|
||||||
COUNT(DISTINCT pid) as unique_products,
|
COUNT(DISTINCT pid) as unique_products,
|
||||||
CAST(AVG(cost_price) AS DECIMAL(15,3)) as avg_cost,
|
ROUND(AVG(cost_price)::numeric, 3) as avg_cost,
|
||||||
CAST(MIN(cost_price) AS DECIMAL(15,3)) as min_cost,
|
ROUND(MIN(cost_price)::numeric, 3) as min_cost,
|
||||||
CAST(MAX(cost_price) AS DECIMAL(15,3)) as max_cost,
|
ROUND(MAX(cost_price)::numeric, 3) as max_cost,
|
||||||
CAST(STDDEV(cost_price) AS DECIMAL(15,3)) as cost_variance,
|
ROUND(STDDEV(cost_price)::numeric, 3) as cost_variance,
|
||||||
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_spend
|
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
|
||||||
FROM category_costs
|
FROM category_costs
|
||||||
GROUP BY category
|
GROUP BY category
|
||||||
ORDER BY total_spend DESC
|
ORDER BY total_spend DESC
|
||||||
@@ -302,17 +313,37 @@ router.get('/cost-analysis', async (req, res) => {
|
|||||||
|
|
||||||
// Parse numeric values
|
// Parse numeric values
|
||||||
const parsedAnalysis = {
|
const parsedAnalysis = {
|
||||||
categories: analysis.map(cat => ({
|
unique_products: 0,
|
||||||
|
avg_cost: 0,
|
||||||
|
min_cost: 0,
|
||||||
|
max_cost: 0,
|
||||||
|
cost_variance: 0,
|
||||||
|
total_spend_by_category: analysis.map(cat => ({
|
||||||
category: cat.category,
|
category: cat.category,
|
||||||
unique_products: Number(cat.unique_products) || 0,
|
|
||||||
avg_cost: Number(cat.avg_cost) || 0,
|
|
||||||
min_cost: Number(cat.min_cost) || 0,
|
|
||||||
max_cost: Number(cat.max_cost) || 0,
|
|
||||||
cost_variance: Number(cat.cost_variance) || 0,
|
|
||||||
total_spend: Number(cat.total_spend) || 0
|
total_spend: Number(cat.total_spend) || 0
|
||||||
}))
|
}))
|
||||||
};
|
};
|
||||||
|
|
||||||
|
// Calculate aggregated stats if data exists
|
||||||
|
if (analysis.length > 0) {
|
||||||
|
parsedAnalysis.unique_products = analysis.reduce((sum, cat) => sum + Number(cat.unique_products || 0), 0);
|
||||||
|
|
||||||
|
// Calculate weighted average cost
|
||||||
|
const totalProducts = parsedAnalysis.unique_products;
|
||||||
|
if (totalProducts > 0) {
|
||||||
|
parsedAnalysis.avg_cost = analysis.reduce((sum, cat) =>
|
||||||
|
sum + (Number(cat.avg_cost || 0) * Number(cat.unique_products || 0)), 0) / totalProducts;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Find min and max across all categories
|
||||||
|
parsedAnalysis.min_cost = Math.min(...analysis.map(cat => Number(cat.min_cost || 0)));
|
||||||
|
parsedAnalysis.max_cost = Math.max(...analysis.map(cat => Number(cat.max_cost || 0)));
|
||||||
|
|
||||||
|
// Average variance
|
||||||
|
parsedAnalysis.cost_variance = analysis.reduce((sum, cat) =>
|
||||||
|
sum + Number(cat.cost_variance || 0), 0) / analysis.length;
|
||||||
|
}
|
||||||
|
|
||||||
res.json(parsedAnalysis);
|
res.json(parsedAnalysis);
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error('Error fetching cost analysis:', error);
|
console.error('Error fetching cost analysis:', error);
|
||||||
@@ -325,7 +356,7 @@ router.get('/receiving-status', async (req, res) => {
|
|||||||
try {
|
try {
|
||||||
const pool = req.app.locals.pool;
|
const pool = req.app.locals.pool;
|
||||||
|
|
||||||
const [status] = await pool.query(`
|
const { rows: status } = await pool.query(`
|
||||||
WITH po_totals AS (
|
WITH po_totals AS (
|
||||||
SELECT
|
SELECT
|
||||||
po_id,
|
po_id,
|
||||||
@@ -333,7 +364,7 @@ router.get('/receiving-status', async (req, res) => {
|
|||||||
receiving_status,
|
receiving_status,
|
||||||
SUM(ordered) as total_ordered,
|
SUM(ordered) as total_ordered,
|
||||||
SUM(received) as total_received,
|
SUM(received) as total_received,
|
||||||
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_cost
|
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_cost
|
||||||
FROM purchase_orders
|
FROM purchase_orders
|
||||||
WHERE status != ${STATUS.CANCELED}
|
WHERE status != ${STATUS.CANCELED}
|
||||||
GROUP BY po_id, status, receiving_status
|
GROUP BY po_id, status, receiving_status
|
||||||
@@ -345,8 +376,8 @@ router.get('/receiving-status', async (req, res) => {
|
|||||||
ROUND(
|
ROUND(
|
||||||
SUM(total_received) / NULLIF(SUM(total_ordered), 0), 3
|
SUM(total_received) / NULLIF(SUM(total_ordered), 0), 3
|
||||||
) as fulfillment_rate,
|
) as fulfillment_rate,
|
||||||
CAST(SUM(total_cost) AS DECIMAL(15,3)) as total_value,
|
ROUND(SUM(total_cost)::numeric, 3) as total_value,
|
||||||
CAST(AVG(total_cost) AS DECIMAL(15,3)) as avg_cost,
|
ROUND(AVG(total_cost)::numeric, 3) as avg_cost,
|
||||||
COUNT(DISTINCT CASE
|
COUNT(DISTINCT CASE
|
||||||
WHEN receiving_status = ${RECEIVING_STATUS.CREATED} THEN po_id
|
WHEN receiving_status = ${RECEIVING_STATUS.CREATED} THEN po_id
|
||||||
END) as pending_count,
|
END) as pending_count,
|
||||||
@@ -364,17 +395,17 @@ router.get('/receiving-status', async (req, res) => {
|
|||||||
|
|
||||||
// Parse numeric values
|
// Parse numeric values
|
||||||
const parsedStatus = {
|
const parsedStatus = {
|
||||||
order_count: Number(status[0].order_count) || 0,
|
order_count: Number(status[0]?.order_count) || 0,
|
||||||
total_ordered: Number(status[0].total_ordered) || 0,
|
total_ordered: Number(status[0]?.total_ordered) || 0,
|
||||||
total_received: Number(status[0].total_received) || 0,
|
total_received: Number(status[0]?.total_received) || 0,
|
||||||
fulfillment_rate: Number(status[0].fulfillment_rate) || 0,
|
fulfillment_rate: Number(status[0]?.fulfillment_rate) || 0,
|
||||||
total_value: Number(status[0].total_value) || 0,
|
total_value: Number(status[0]?.total_value) || 0,
|
||||||
avg_cost: Number(status[0].avg_cost) || 0,
|
avg_cost: Number(status[0]?.avg_cost) || 0,
|
||||||
status_breakdown: {
|
status_breakdown: {
|
||||||
pending: Number(status[0].pending_count) || 0,
|
pending: Number(status[0]?.pending_count) || 0,
|
||||||
partial: Number(status[0].partial_count) || 0,
|
partial: Number(status[0]?.partial_count) || 0,
|
||||||
completed: Number(status[0].completed_count) || 0,
|
completed: Number(status[0]?.completed_count) || 0,
|
||||||
canceled: Number(status[0].canceled_count) || 0
|
canceled: Number(status[0]?.canceled_count) || 0
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
@@ -390,7 +421,7 @@ router.get('/order-vs-received', async (req, res) => {
|
|||||||
try {
|
try {
|
||||||
const pool = req.app.locals.pool;
|
const pool = req.app.locals.pool;
|
||||||
|
|
||||||
const [quantities] = await pool.query(`
|
const { rows: quantities } = await pool.query(`
|
||||||
SELECT
|
SELECT
|
||||||
p.product_id,
|
p.product_id,
|
||||||
p.title as product,
|
p.title as product,
|
||||||
@@ -403,10 +434,10 @@ router.get('/order-vs-received', async (req, res) => {
|
|||||||
COUNT(DISTINCT po.po_id) as order_count
|
COUNT(DISTINCT po.po_id) as order_count
|
||||||
FROM products p
|
FROM products p
|
||||||
JOIN purchase_orders po ON p.product_id = po.product_id
|
JOIN purchase_orders po ON p.product_id = po.product_id
|
||||||
WHERE po.date >= DATE_SUB(CURDATE(), INTERVAL 90 DAY)
|
WHERE po.date >= (CURRENT_DATE - INTERVAL '90 days')
|
||||||
GROUP BY p.product_id, p.title, p.SKU
|
GROUP BY p.product_id, p.title, p.SKU
|
||||||
HAVING order_count > 0
|
HAVING COUNT(DISTINCT po.po_id) > 0
|
||||||
ORDER BY ordered_quantity DESC
|
ORDER BY SUM(po.ordered) DESC
|
||||||
LIMIT 20
|
LIMIT 20
|
||||||
`);
|
`);
|
||||||
|
|
||||||
|
|||||||
396
inventory-server/src/routes/reusable-images.js
Normal file
396
inventory-server/src/routes/reusable-images.js
Normal file
@@ -0,0 +1,396 @@
|
|||||||
|
const express = require('express');
|
||||||
|
const router = express.Router();
|
||||||
|
const multer = require('multer');
|
||||||
|
const path = require('path');
|
||||||
|
const fs = require('fs');
|
||||||
|
|
||||||
|
// Create reusable uploads directory if it doesn't exist
|
||||||
|
const uploadsDir = path.join('/var/www/html/inventory/uploads/reusable');
|
||||||
|
fs.mkdirSync(uploadsDir, { recursive: true });
|
||||||
|
|
||||||
|
// Configure multer for file uploads
|
||||||
|
const storage = multer.diskStorage({
|
||||||
|
destination: function (req, file, cb) {
|
||||||
|
console.log(`Saving reusable image to: ${uploadsDir}`);
|
||||||
|
cb(null, uploadsDir);
|
||||||
|
},
|
||||||
|
filename: function (req, file, cb) {
|
||||||
|
// Create unique filename with original extension
|
||||||
|
const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);
|
||||||
|
|
||||||
|
// Make sure we preserve the original file extension
|
||||||
|
let fileExt = path.extname(file.originalname).toLowerCase();
|
||||||
|
|
||||||
|
// Ensure there is a proper extension based on mimetype if none exists
|
||||||
|
if (!fileExt) {
|
||||||
|
switch (file.mimetype) {
|
||||||
|
case 'image/jpeg': fileExt = '.jpg'; break;
|
||||||
|
case 'image/png': fileExt = '.png'; break;
|
||||||
|
case 'image/gif': fileExt = '.gif'; break;
|
||||||
|
case 'image/webp': fileExt = '.webp'; break;
|
||||||
|
default: fileExt = '.jpg'; // Default to jpg
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const fileName = `reusable-${uniqueSuffix}${fileExt}`;
|
||||||
|
console.log(`Generated filename: ${fileName} with mimetype: ${file.mimetype}`);
|
||||||
|
cb(null, fileName);
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
const upload = multer({
|
||||||
|
storage: storage,
|
||||||
|
limits: {
|
||||||
|
fileSize: 5 * 1024 * 1024, // 5MB max file size
|
||||||
|
},
|
||||||
|
fileFilter: function (req, file, cb) {
|
||||||
|
// Accept only image files
|
||||||
|
const filetypes = /jpeg|jpg|png|gif|webp/;
|
||||||
|
const mimetype = filetypes.test(file.mimetype);
|
||||||
|
const extname = filetypes.test(path.extname(file.originalname).toLowerCase());
|
||||||
|
|
||||||
|
if (mimetype && extname) {
|
||||||
|
return cb(null, true);
|
||||||
|
}
|
||||||
|
cb(new Error('Only image files are allowed'));
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Get all reusable images
|
||||||
|
router.get('/', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM reusable_images
|
||||||
|
ORDER BY created_at DESC
|
||||||
|
`);
|
||||||
|
res.json(result.rows);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching reusable images:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch reusable images',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Get images by company or global images
|
||||||
|
router.get('/by-company/:companyId', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const { companyId } = req.params;
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get images that are either global or belong to this company
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM reusable_images
|
||||||
|
WHERE is_global = true OR company = $1
|
||||||
|
ORDER BY created_at DESC
|
||||||
|
`, [companyId]);
|
||||||
|
|
||||||
|
res.json(result.rows);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching reusable images by company:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch reusable images by company',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Get global images only
|
||||||
|
router.get('/global', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM reusable_images
|
||||||
|
WHERE is_global = true
|
||||||
|
ORDER BY created_at DESC
|
||||||
|
`);
|
||||||
|
|
||||||
|
res.json(result.rows);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching global reusable images:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch global reusable images',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Get a single image by ID
|
||||||
|
router.get('/:id', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const { id } = req.params;
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
SELECT * FROM reusable_images
|
||||||
|
WHERE id = $1
|
||||||
|
`, [id]);
|
||||||
|
|
||||||
|
if (result.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'Reusable image not found' });
|
||||||
|
}
|
||||||
|
|
||||||
|
res.json(result.rows[0]);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error fetching reusable image:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to fetch reusable image',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Upload a new reusable image
|
||||||
|
router.post('/upload', upload.single('image'), async (req, res) => {
|
||||||
|
try {
|
||||||
|
if (!req.file) {
|
||||||
|
return res.status(400).json({ error: 'No image file provided' });
|
||||||
|
}
|
||||||
|
|
||||||
|
const { name, is_global, company } = req.body;
|
||||||
|
|
||||||
|
// Validate required fields
|
||||||
|
if (!name) {
|
||||||
|
return res.status(400).json({ error: 'Image name is required' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Convert is_global from string to boolean
|
||||||
|
const isGlobal = is_global === 'true' || is_global === true;
|
||||||
|
|
||||||
|
// Validate company is provided for non-global images
|
||||||
|
if (!isGlobal && !company) {
|
||||||
|
return res.status(400).json({ error: 'Company is required for non-global images' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Log file information
|
||||||
|
console.log('Reusable image uploaded:', {
|
||||||
|
filename: req.file.filename,
|
||||||
|
originalname: req.file.originalname,
|
||||||
|
mimetype: req.file.mimetype,
|
||||||
|
size: req.file.size,
|
||||||
|
path: req.file.path
|
||||||
|
});
|
||||||
|
|
||||||
|
// Ensure the file exists
|
||||||
|
const filePath = path.join(uploadsDir, req.file.filename);
|
||||||
|
if (!fs.existsSync(filePath)) {
|
||||||
|
return res.status(500).json({ error: 'File was not saved correctly' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create URL for the uploaded file
|
||||||
|
const baseUrl = 'https://inventory.acot.site';
|
||||||
|
const imageUrl = `${baseUrl}/uploads/reusable/${req.file.filename}`;
|
||||||
|
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Insert record into database
|
||||||
|
const result = await pool.query(`
|
||||||
|
INSERT INTO reusable_images (
|
||||||
|
name,
|
||||||
|
filename,
|
||||||
|
file_path,
|
||||||
|
image_url,
|
||||||
|
is_global,
|
||||||
|
company,
|
||||||
|
mime_type,
|
||||||
|
file_size
|
||||||
|
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
|
||||||
|
RETURNING *
|
||||||
|
`, [
|
||||||
|
name,
|
||||||
|
req.file.filename,
|
||||||
|
filePath,
|
||||||
|
imageUrl,
|
||||||
|
isGlobal,
|
||||||
|
isGlobal ? null : company,
|
||||||
|
req.file.mimetype,
|
||||||
|
req.file.size
|
||||||
|
]);
|
||||||
|
|
||||||
|
// Return success response with image data
|
||||||
|
res.status(201).json({
|
||||||
|
success: true,
|
||||||
|
image: result.rows[0],
|
||||||
|
message: 'Image uploaded successfully'
|
||||||
|
});
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error uploading reusable image:', error);
|
||||||
|
res.status(500).json({ error: error.message || 'Failed to upload image' });
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Update image details (name, is_global, company)
|
||||||
|
router.put('/:id', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const { id } = req.params;
|
||||||
|
const { name, is_global, company } = req.body;
|
||||||
|
|
||||||
|
// Validate required fields
|
||||||
|
if (!name) {
|
||||||
|
return res.status(400).json({ error: 'Image name is required' });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Convert is_global from string to boolean if necessary
|
||||||
|
const isGlobal = typeof is_global === 'string' ? is_global === 'true' : !!is_global;
|
||||||
|
|
||||||
|
// Validate company is provided for non-global images
|
||||||
|
if (!isGlobal && !company) {
|
||||||
|
return res.status(400).json({ error: 'Company is required for non-global images' });
|
||||||
|
}
|
||||||
|
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if the image exists
|
||||||
|
const checkResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
|
||||||
|
if (checkResult.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'Reusable image not found' });
|
||||||
|
}
|
||||||
|
|
||||||
|
const result = await pool.query(`
|
||||||
|
UPDATE reusable_images
|
||||||
|
SET
|
||||||
|
name = $1,
|
||||||
|
is_global = $2,
|
||||||
|
company = $3
|
||||||
|
WHERE id = $4
|
||||||
|
RETURNING *
|
||||||
|
`, [
|
||||||
|
name,
|
||||||
|
isGlobal,
|
||||||
|
isGlobal ? null : company,
|
||||||
|
id
|
||||||
|
]);
|
||||||
|
|
||||||
|
res.json(result.rows[0]);
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error updating reusable image:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to update reusable image',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Delete a reusable image
|
||||||
|
router.delete('/:id', async (req, res) => {
|
||||||
|
try {
|
||||||
|
const { id } = req.params;
|
||||||
|
const pool = req.app.locals.pool;
|
||||||
|
if (!pool) {
|
||||||
|
throw new Error('Database pool not initialized');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get the image data first to get the filename
|
||||||
|
const imageResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
|
||||||
|
|
||||||
|
if (imageResult.rows.length === 0) {
|
||||||
|
return res.status(404).json({ error: 'Reusable image not found' });
|
||||||
|
}
|
||||||
|
|
||||||
|
const image = imageResult.rows[0];
|
||||||
|
|
||||||
|
// Delete from database
|
||||||
|
await pool.query('DELETE FROM reusable_images WHERE id = $1', [id]);
|
||||||
|
|
||||||
|
// Delete the file from filesystem
|
||||||
|
const filePath = path.join(uploadsDir, image.filename);
|
||||||
|
if (fs.existsSync(filePath)) {
|
||||||
|
fs.unlinkSync(filePath);
|
||||||
|
}
|
||||||
|
|
||||||
|
res.json({
|
||||||
|
message: 'Reusable image deleted successfully',
|
||||||
|
image
|
||||||
|
});
|
||||||
|
} catch (error) {
|
||||||
|
console.error('Error deleting reusable image:', error);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Failed to delete reusable image',
|
||||||
|
details: error instanceof Error ? error.message : 'Unknown error'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Check if file exists and permissions
|
||||||
|
router.get('/check-file/:filename', (req, res) => {
|
||||||
|
const { filename } = req.params;
|
||||||
|
|
||||||
|
// Prevent directory traversal
|
||||||
|
if (filename.includes('..') || filename.includes('/')) {
|
||||||
|
return res.status(400).json({ error: 'Invalid filename' });
|
||||||
|
}
|
||||||
|
|
||||||
|
const filePath = path.join(uploadsDir, filename);
|
||||||
|
|
||||||
|
try {
|
||||||
|
// Check if file exists
|
||||||
|
if (!fs.existsSync(filePath)) {
|
||||||
|
return res.status(404).json({
|
||||||
|
error: 'File not found',
|
||||||
|
path: filePath,
|
||||||
|
exists: false,
|
||||||
|
readable: false
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if file is readable
|
||||||
|
fs.accessSync(filePath, fs.constants.R_OK);
|
||||||
|
|
||||||
|
// Get file stats
|
||||||
|
const stats = fs.statSync(filePath);
|
||||||
|
|
||||||
|
return res.json({
|
||||||
|
filename,
|
||||||
|
path: filePath,
|
||||||
|
exists: true,
|
||||||
|
readable: true,
|
||||||
|
isFile: stats.isFile(),
|
||||||
|
isDirectory: stats.isDirectory(),
|
||||||
|
size: stats.size,
|
||||||
|
created: stats.birthtime,
|
||||||
|
modified: stats.mtime,
|
||||||
|
permissions: stats.mode.toString(8)
|
||||||
|
});
|
||||||
|
} catch (error) {
|
||||||
|
return res.status(500).json({
|
||||||
|
error: error.message,
|
||||||
|
path: filePath,
|
||||||
|
exists: fs.existsSync(filePath),
|
||||||
|
readable: false
|
||||||
|
});
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
// Error handling middleware
|
||||||
|
router.use((err, req, res, next) => {
|
||||||
|
console.error('Reusable images route error:', err);
|
||||||
|
res.status(500).json({
|
||||||
|
error: 'Internal server error',
|
||||||
|
details: err.message
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
module.exports = router;
|
||||||
@@ -32,7 +32,7 @@ router.get('/', async (req, res) => {
|
|||||||
ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
|
ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
|
||||||
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
|
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
|
||||||
FROM purchase_orders
|
FROM purchase_orders
|
||||||
WHERE status = 'closed'
|
WHERE status = 2
|
||||||
AND cost_price IS NOT NULL
|
AND cost_price IS NOT NULL
|
||||||
AND ordered > 0
|
AND ordered > 0
|
||||||
AND vendor = ANY($1)
|
AND vendor = ANY($1)
|
||||||
@@ -70,7 +70,7 @@ router.get('/', async (req, res) => {
|
|||||||
ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
|
ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
|
||||||
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
|
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
|
||||||
FROM purchase_orders
|
FROM purchase_orders
|
||||||
WHERE status = 'closed'
|
WHERE status = 2
|
||||||
AND cost_price IS NOT NULL
|
AND cost_price IS NOT NULL
|
||||||
AND ordered > 0
|
AND ordered > 0
|
||||||
AND vendor IS NOT NULL AND vendor != ''
|
AND vendor IS NOT NULL AND vendor != ''
|
||||||
|
|||||||
@@ -18,6 +18,8 @@ const categoriesRouter = require('./routes/categories');
|
|||||||
const importRouter = require('./routes/import');
|
const importRouter = require('./routes/import');
|
||||||
const aiValidationRouter = require('./routes/ai-validation');
|
const aiValidationRouter = require('./routes/ai-validation');
|
||||||
const templatesRouter = require('./routes/templates');
|
const templatesRouter = require('./routes/templates');
|
||||||
|
const aiPromptsRouter = require('./routes/ai-prompts');
|
||||||
|
const reusableImagesRouter = require('./routes/reusable-images');
|
||||||
|
|
||||||
// Get the absolute path to the .env file
|
// Get the absolute path to the .env file
|
||||||
const envPath = '/var/www/html/inventory/.env';
|
const envPath = '/var/www/html/inventory/.env';
|
||||||
@@ -103,6 +105,8 @@ async function startServer() {
|
|||||||
app.use('/api/import', importRouter);
|
app.use('/api/import', importRouter);
|
||||||
app.use('/api/ai-validation', aiValidationRouter);
|
app.use('/api/ai-validation', aiValidationRouter);
|
||||||
app.use('/api/templates', templatesRouter);
|
app.use('/api/templates', templatesRouter);
|
||||||
|
app.use('/api/ai-prompts', aiPromptsRouter);
|
||||||
|
app.use('/api/reusable-images', reusableImagesRouter);
|
||||||
|
|
||||||
// Basic health check route
|
// Basic health check route
|
||||||
app.get('/health', (req, res) => {
|
app.get('/health', (req, res) => {
|
||||||
|
|||||||
@@ -1,9 +1,9 @@
|
|||||||
const mysql = require('mysql2/promise');
|
const { Pool } = require('pg');
|
||||||
|
|
||||||
let pool;
|
let pool;
|
||||||
|
|
||||||
function initPool(config) {
|
function initPool(config) {
|
||||||
pool = mysql.createPool(config);
|
pool = new Pool(config);
|
||||||
return pool;
|
return pool;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|||||||
@@ -2,6 +2,7 @@ import { useQuery } from '@tanstack/react-query';
|
|||||||
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
|
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
|
||||||
import { ResponsiveContainer, BarChart, Bar, XAxis, YAxis, Tooltip, ScatterChart, Scatter, ZAxis } from 'recharts';
|
import { ResponsiveContainer, BarChart, Bar, XAxis, YAxis, Tooltip, ScatterChart, Scatter, ZAxis } from 'recharts';
|
||||||
import config from '../../config';
|
import config from '../../config';
|
||||||
|
import { useState, useEffect } from 'react';
|
||||||
|
|
||||||
interface VendorData {
|
interface VendorData {
|
||||||
performance: {
|
performance: {
|
||||||
@@ -10,14 +11,15 @@ interface VendorData {
|
|||||||
profitMargin: number;
|
profitMargin: number;
|
||||||
stockTurnover: number;
|
stockTurnover: number;
|
||||||
productCount: number;
|
productCount: number;
|
||||||
|
growth: number;
|
||||||
}[];
|
}[];
|
||||||
comparison: {
|
comparison?: {
|
||||||
vendor: string;
|
vendor: string;
|
||||||
salesPerProduct: number;
|
salesPerProduct: number;
|
||||||
averageMargin: number;
|
averageMargin: number;
|
||||||
size: number;
|
size: number;
|
||||||
}[];
|
}[];
|
||||||
trends: {
|
trends?: {
|
||||||
vendor: string;
|
vendor: string;
|
||||||
month: string;
|
month: string;
|
||||||
sales: number;
|
sales: number;
|
||||||
@@ -25,40 +27,86 @@ interface VendorData {
|
|||||||
}
|
}
|
||||||
|
|
||||||
export function VendorPerformance() {
|
export function VendorPerformance() {
|
||||||
const { data, isLoading } = useQuery<VendorData>({
|
const [vendorData, setVendorData] = useState<VendorData | null>(null);
|
||||||
queryKey: ['vendor-performance'],
|
const [isLoading, setIsLoading] = useState(true);
|
||||||
queryFn: async () => {
|
const [error, setError] = useState<string | null>(null);
|
||||||
const response = await fetch(`${config.apiUrl}/analytics/vendors`);
|
|
||||||
if (!response.ok) {
|
useEffect(() => {
|
||||||
throw new Error('Failed to fetch vendor performance');
|
// Use plain fetch to bypass cache issues with React Query
|
||||||
|
const fetchData = async () => {
|
||||||
|
try {
|
||||||
|
setIsLoading(true);
|
||||||
|
|
||||||
|
// Add cache-busting parameter
|
||||||
|
const response = await fetch(`${config.apiUrl}/analytics/vendors?nocache=${Date.now()}`, {
|
||||||
|
headers: {
|
||||||
|
"Cache-Control": "no-cache, no-store, must-revalidate",
|
||||||
|
"Pragma": "no-cache",
|
||||||
|
"Expires": "0"
|
||||||
}
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error(`Failed to fetch: ${response.status}`);
|
||||||
|
}
|
||||||
|
|
||||||
const rawData = await response.json();
|
const rawData = await response.json();
|
||||||
return {
|
|
||||||
|
if (!rawData || !rawData.performance) {
|
||||||
|
throw new Error('Invalid response format');
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create a complete structure even if some parts are missing
|
||||||
|
const data: VendorData = {
|
||||||
performance: rawData.performance.map((vendor: any) => ({
|
performance: rawData.performance.map((vendor: any) => ({
|
||||||
...vendor,
|
vendor: vendor.vendor,
|
||||||
salesVolume: Number(vendor.salesVolume) || 0,
|
salesVolume: Number(vendor.salesVolume) || 0,
|
||||||
profitMargin: Number(vendor.profitMargin) || 0,
|
profitMargin: Number(vendor.profitMargin) || 0,
|
||||||
stockTurnover: Number(vendor.stockTurnover) || 0,
|
stockTurnover: Number(vendor.stockTurnover) || 0,
|
||||||
productCount: Number(vendor.productCount) || 0
|
productCount: Number(vendor.productCount) || 0,
|
||||||
|
growth: Number(vendor.growth) || 0
|
||||||
})),
|
})),
|
||||||
comparison: rawData.comparison.map((vendor: any) => ({
|
comparison: rawData.comparison?.map((vendor: any) => ({
|
||||||
...vendor,
|
vendor: vendor.vendor,
|
||||||
salesPerProduct: Number(vendor.salesPerProduct) || 0,
|
salesPerProduct: Number(vendor.salesPerProduct) || 0,
|
||||||
averageMargin: Number(vendor.averageMargin) || 0,
|
averageMargin: Number(vendor.averageMargin) || 0,
|
||||||
size: Number(vendor.size) || 0
|
size: Number(vendor.size) || 0
|
||||||
})),
|
})) || [],
|
||||||
trends: rawData.trends.map((vendor: any) => ({
|
trends: rawData.trends?.map((vendor: any) => ({
|
||||||
...vendor,
|
vendor: vendor.vendor,
|
||||||
|
month: vendor.month,
|
||||||
sales: Number(vendor.sales) || 0
|
sales: Number(vendor.sales) || 0
|
||||||
}))
|
})) || []
|
||||||
};
|
};
|
||||||
},
|
|
||||||
});
|
|
||||||
|
|
||||||
if (isLoading || !data) {
|
setVendorData(data);
|
||||||
|
} catch (err) {
|
||||||
|
console.error('Error fetching vendor data:', err);
|
||||||
|
setError(err instanceof Error ? err.message : 'Unknown error');
|
||||||
|
} finally {
|
||||||
|
setIsLoading(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
fetchData();
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
if (isLoading) {
|
||||||
return <div>Loading vendor performance...</div>;
|
return <div>Loading vendor performance...</div>;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
if (error || !vendorData) {
|
||||||
|
return <div className="text-red-500">Error loading vendor data: {error}</div>;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Ensure we have at least the performance data
|
||||||
|
const sortedPerformance = vendorData.performance
|
||||||
|
.sort((a, b) => b.salesVolume - a.salesVolume)
|
||||||
|
.slice(0, 10);
|
||||||
|
|
||||||
|
// Use simplified version if comparison data is missing
|
||||||
|
const hasComparisonData = vendorData.comparison && vendorData.comparison.length > 0;
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<div className="grid gap-4">
|
<div className="grid gap-4">
|
||||||
<div className="grid gap-4 md:grid-cols-2">
|
<div className="grid gap-4 md:grid-cols-2">
|
||||||
@@ -68,7 +116,7 @@ export function VendorPerformance() {
|
|||||||
</CardHeader>
|
</CardHeader>
|
||||||
<CardContent>
|
<CardContent>
|
||||||
<ResponsiveContainer width="100%" height={300}>
|
<ResponsiveContainer width="100%" height={300}>
|
||||||
<BarChart data={data.performance}>
|
<BarChart data={sortedPerformance}>
|
||||||
<XAxis dataKey="vendor" />
|
<XAxis dataKey="vendor" />
|
||||||
<YAxis tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`} />
|
<YAxis tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`} />
|
||||||
<Tooltip
|
<Tooltip
|
||||||
@@ -84,6 +132,7 @@ export function VendorPerformance() {
|
|||||||
</CardContent>
|
</CardContent>
|
||||||
</Card>
|
</Card>
|
||||||
|
|
||||||
|
{hasComparisonData ? (
|
||||||
<Card>
|
<Card>
|
||||||
<CardHeader>
|
<CardHeader>
|
||||||
<CardTitle>Vendor Performance Matrix</CardTitle>
|
<CardTitle>Vendor Performance Matrix</CardTitle>
|
||||||
@@ -114,7 +163,7 @@ export function VendorPerformance() {
|
|||||||
}}
|
}}
|
||||||
/>
|
/>
|
||||||
<Scatter
|
<Scatter
|
||||||
data={data.comparison}
|
data={vendorData.comparison}
|
||||||
fill="#60a5fa"
|
fill="#60a5fa"
|
||||||
name="Vendors"
|
name="Vendors"
|
||||||
/>
|
/>
|
||||||
@@ -122,6 +171,29 @@ export function VendorPerformance() {
|
|||||||
</ResponsiveContainer>
|
</ResponsiveContainer>
|
||||||
</CardContent>
|
</CardContent>
|
||||||
</Card>
|
</Card>
|
||||||
|
) : (
|
||||||
|
<Card>
|
||||||
|
<CardHeader>
|
||||||
|
<CardTitle>Vendor Profit Margins</CardTitle>
|
||||||
|
</CardHeader>
|
||||||
|
<CardContent>
|
||||||
|
<ResponsiveContainer width="100%" height={300}>
|
||||||
|
<BarChart data={sortedPerformance}>
|
||||||
|
<XAxis dataKey="vendor" />
|
||||||
|
<YAxis tickFormatter={(value) => `${value}%`} />
|
||||||
|
<Tooltip
|
||||||
|
formatter={(value: number) => [`${value.toFixed(1)}%`, 'Profit Margin']}
|
||||||
|
/>
|
||||||
|
<Bar
|
||||||
|
dataKey="profitMargin"
|
||||||
|
fill="#4ade80"
|
||||||
|
name="Profit Margin"
|
||||||
|
/>
|
||||||
|
</BarChart>
|
||||||
|
</ResponsiveContainer>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<Card>
|
<Card>
|
||||||
@@ -130,7 +202,7 @@ export function VendorPerformance() {
|
|||||||
</CardHeader>
|
</CardHeader>
|
||||||
<CardContent>
|
<CardContent>
|
||||||
<div className="space-y-4">
|
<div className="space-y-4">
|
||||||
{data.performance.map((vendor) => (
|
{sortedPerformance.map((vendor) => (
|
||||||
<div key={`${vendor.vendor}-${vendor.salesVolume}`} className="flex items-center">
|
<div key={`${vendor.vendor}-${vendor.salesVolume}`} className="flex items-center">
|
||||||
<div className="flex-1">
|
<div className="flex-1">
|
||||||
<p className="text-sm font-medium">{vendor.vendor}</p>
|
<p className="text-sm font-medium">{vendor.vendor}</p>
|
||||||
|
|||||||
@@ -1,88 +0,0 @@
|
|||||||
import { useQuery } from "@tanstack/react-query"
|
|
||||||
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"
|
|
||||||
import { AlertCircle, AlertTriangle, CheckCircle2, PackageSearch } from "lucide-react"
|
|
||||||
import config from "@/config"
|
|
||||||
import { useNavigate } from "react-router-dom"
|
|
||||||
import { cn } from "@/lib/utils"
|
|
||||||
|
|
||||||
interface InventoryHealth {
|
|
||||||
critical: number
|
|
||||||
reorder: number
|
|
||||||
healthy: number
|
|
||||||
overstock: number
|
|
||||||
total: number
|
|
||||||
}
|
|
||||||
|
|
||||||
export function InventoryHealthSummary() {
|
|
||||||
const navigate = useNavigate();
|
|
||||||
const { data: summary } = useQuery<InventoryHealth>({
|
|
||||||
queryKey: ["inventory-health"],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/dashboard/inventory/health/summary`)
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error("Failed to fetch inventory health")
|
|
||||||
}
|
|
||||||
return response.json()
|
|
||||||
},
|
|
||||||
})
|
|
||||||
|
|
||||||
const stats = [
|
|
||||||
{
|
|
||||||
title: "Critical Stock",
|
|
||||||
value: summary?.critical || 0,
|
|
||||||
description: "Products needing immediate attention",
|
|
||||||
icon: AlertCircle,
|
|
||||||
className: "bg-destructive/10",
|
|
||||||
iconClassName: "text-destructive",
|
|
||||||
view: "critical"
|
|
||||||
},
|
|
||||||
{
|
|
||||||
title: "Reorder Soon",
|
|
||||||
value: summary?.reorder || 0,
|
|
||||||
description: "Products approaching reorder point",
|
|
||||||
icon: AlertTriangle,
|
|
||||||
className: "bg-warning/10",
|
|
||||||
iconClassName: "text-warning",
|
|
||||||
view: "reorder"
|
|
||||||
},
|
|
||||||
{
|
|
||||||
title: "Healthy Stock",
|
|
||||||
value: summary?.healthy || 0,
|
|
||||||
description: "Products at optimal levels",
|
|
||||||
icon: CheckCircle2,
|
|
||||||
className: "bg-success/10",
|
|
||||||
iconClassName: "text-success",
|
|
||||||
view: "healthy"
|
|
||||||
},
|
|
||||||
{
|
|
||||||
title: "Overstock",
|
|
||||||
value: summary?.overstock || 0,
|
|
||||||
description: "Products exceeding optimal levels",
|
|
||||||
icon: PackageSearch,
|
|
||||||
className: "bg-muted",
|
|
||||||
iconClassName: "text-muted-foreground",
|
|
||||||
view: "overstocked"
|
|
||||||
},
|
|
||||||
]
|
|
||||||
|
|
||||||
return (
|
|
||||||
<>
|
|
||||||
{stats.map((stat) => (
|
|
||||||
<Card
|
|
||||||
key={stat.title}
|
|
||||||
className={cn(stat.className, "cursor-pointer hover:opacity-90 transition-opacity")}
|
|
||||||
onClick={() => navigate(`/products?view=${stat.view}`)}
|
|
||||||
>
|
|
||||||
<CardHeader className="flex flex-row items-center justify-between pb-2">
|
|
||||||
<CardTitle className="text-sm font-medium">{stat.title}</CardTitle>
|
|
||||||
<stat.icon className={`h-4 w-4 ${stat.iconClassName}`} />
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<div className="text-2xl font-bold">{stat.value}</div>
|
|
||||||
<p className="text-xs text-muted-foreground">{stat.description}</p>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
))}
|
|
||||||
</>
|
|
||||||
)
|
|
||||||
}
|
|
||||||
@@ -1,106 +0,0 @@
|
|||||||
import { useQuery } from '@tanstack/react-query';
|
|
||||||
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
|
|
||||||
import { Bar, BarChart, ResponsiveContainer, XAxis, YAxis, Tooltip } from 'recharts';
|
|
||||||
import config from '../../config';
|
|
||||||
|
|
||||||
interface InventoryMetrics {
|
|
||||||
stockLevels: {
|
|
||||||
category: string;
|
|
||||||
inStock: number;
|
|
||||||
lowStock: number;
|
|
||||||
outOfStock: number;
|
|
||||||
}[];
|
|
||||||
topVendors: {
|
|
||||||
vendor: string;
|
|
||||||
productCount: number;
|
|
||||||
averageStockLevel: string;
|
|
||||||
}[];
|
|
||||||
stockTurnover: {
|
|
||||||
category: string;
|
|
||||||
rate: string;
|
|
||||||
}[];
|
|
||||||
}
|
|
||||||
|
|
||||||
export function InventoryStats() {
|
|
||||||
const { data, isLoading, error } = useQuery<InventoryMetrics>({
|
|
||||||
queryKey: ['inventory-metrics'],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/dashboard/inventory-metrics`);
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error('Failed to fetch inventory metrics');
|
|
||||||
}
|
|
||||||
return response.json();
|
|
||||||
},
|
|
||||||
});
|
|
||||||
|
|
||||||
if (isLoading) {
|
|
||||||
return <div>Loading inventory metrics...</div>;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (error) {
|
|
||||||
return <div className="text-red-500">Error loading inventory metrics</div>;
|
|
||||||
}
|
|
||||||
|
|
||||||
return (
|
|
||||||
<div className="grid gap-4">
|
|
||||||
<div className="grid gap-4 md:grid-cols-2">
|
|
||||||
<Card>
|
|
||||||
<CardHeader>
|
|
||||||
<CardTitle>Stock Levels by Category</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<ResponsiveContainer width="100%" height={300}>
|
|
||||||
<BarChart data={data?.stockLevels}>
|
|
||||||
<XAxis dataKey="category" />
|
|
||||||
<YAxis />
|
|
||||||
<Tooltip />
|
|
||||||
<Bar dataKey="inStock" name="In Stock" fill="#4ade80" />
|
|
||||||
<Bar dataKey="lowStock" name="Low Stock" fill="#fbbf24" />
|
|
||||||
<Bar dataKey="outOfStock" name="Out of Stock" fill="#f87171" />
|
|
||||||
</BarChart>
|
|
||||||
</ResponsiveContainer>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
<Card>
|
|
||||||
<CardHeader>
|
|
||||||
<CardTitle>Stock Turnover Rate</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<ResponsiveContainer width="100%" height={300}>
|
|
||||||
<BarChart data={data?.stockTurnover}>
|
|
||||||
<XAxis dataKey="category" />
|
|
||||||
<YAxis />
|
|
||||||
<Tooltip formatter={(value: string) => [Number(value).toFixed(2), "Rate"]} />
|
|
||||||
<Bar dataKey="rate" name="Turnover Rate" fill="#60a5fa" />
|
|
||||||
</BarChart>
|
|
||||||
</ResponsiveContainer>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
</div>
|
|
||||||
<Card>
|
|
||||||
<CardHeader>
|
|
||||||
<CardTitle>Top Vendors</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<div className="space-y-4">
|
|
||||||
{data?.topVendors.map((vendor) => (
|
|
||||||
<div key={vendor.vendor} className="flex items-center">
|
|
||||||
<div className="flex-1">
|
|
||||||
<p className="text-sm font-medium">{vendor.vendor}</p>
|
|
||||||
<p className="text-sm text-muted-foreground">
|
|
||||||
{vendor.productCount} products
|
|
||||||
</p>
|
|
||||||
</div>
|
|
||||||
<div className="ml-4 text-right">
|
|
||||||
<p className="text-sm font-medium">
|
|
||||||
Avg. Stock: {Number(vendor.averageStockLevel).toFixed(0)}
|
|
||||||
</p>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
))}
|
|
||||||
</div>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
</div>
|
|
||||||
);
|
|
||||||
}
|
|
||||||
@@ -1,232 +0,0 @@
|
|||||||
import { useQuery } from "@tanstack/react-query"
|
|
||||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
|
||||||
import {
|
|
||||||
Area,
|
|
||||||
AreaChart,
|
|
||||||
ResponsiveContainer,
|
|
||||||
Tooltip,
|
|
||||||
XAxis,
|
|
||||||
YAxis,
|
|
||||||
} from "recharts"
|
|
||||||
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs"
|
|
||||||
import config from "@/config"
|
|
||||||
|
|
||||||
interface MetricDataPoint {
|
|
||||||
date: string
|
|
||||||
value: number
|
|
||||||
}
|
|
||||||
|
|
||||||
interface KeyMetrics {
|
|
||||||
revenue: MetricDataPoint[]
|
|
||||||
inventory_value: MetricDataPoint[]
|
|
||||||
gmroi: MetricDataPoint[]
|
|
||||||
}
|
|
||||||
|
|
||||||
export function KeyMetricsCharts() {
|
|
||||||
const { data: metrics } = useQuery<KeyMetrics>({
|
|
||||||
queryKey: ["key-metrics"],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/metrics/trends`)
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error("Failed to fetch metrics trends")
|
|
||||||
}
|
|
||||||
return response.json()
|
|
||||||
},
|
|
||||||
})
|
|
||||||
|
|
||||||
const formatCurrency = (value: number) =>
|
|
||||||
new Intl.NumberFormat("en-US", {
|
|
||||||
style: "currency",
|
|
||||||
currency: "USD",
|
|
||||||
minimumFractionDigits: 0,
|
|
||||||
maximumFractionDigits: 0,
|
|
||||||
}).format(value)
|
|
||||||
|
|
||||||
return (
|
|
||||||
<>
|
|
||||||
<CardHeader>
|
|
||||||
<CardTitle className="text-lg font-medium">Key Metrics</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<Tabs defaultValue="revenue" className="space-y-4">
|
|
||||||
<TabsList>
|
|
||||||
<TabsTrigger value="revenue">Revenue</TabsTrigger>
|
|
||||||
<TabsTrigger value="inventory">Inventory Value</TabsTrigger>
|
|
||||||
<TabsTrigger value="gmroi">GMROI</TabsTrigger>
|
|
||||||
</TabsList>
|
|
||||||
|
|
||||||
<TabsContent value="revenue" className="space-y-4">
|
|
||||||
<div className="h-[300px]">
|
|
||||||
<ResponsiveContainer width="100%" height="100%">
|
|
||||||
<AreaChart data={metrics?.revenue}>
|
|
||||||
<XAxis
|
|
||||||
dataKey="date"
|
|
||||||
tickLine={false}
|
|
||||||
axisLine={false}
|
|
||||||
tickFormatter={(value) => value}
|
|
||||||
/>
|
|
||||||
<YAxis
|
|
||||||
tickLine={false}
|
|
||||||
axisLine={false}
|
|
||||||
tickFormatter={formatCurrency}
|
|
||||||
/>
|
|
||||||
<Tooltip
|
|
||||||
content={({ active, payload }) => {
|
|
||||||
if (active && payload && payload.length) {
|
|
||||||
return (
|
|
||||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
|
||||||
<div className="grid grid-cols-2 gap-2">
|
|
||||||
<div className="flex flex-col">
|
|
||||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
|
||||||
Date
|
|
||||||
</span>
|
|
||||||
<span className="font-bold">
|
|
||||||
{payload[0].payload.date}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
<div className="flex flex-col">
|
|
||||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
|
||||||
Revenue
|
|
||||||
</span>
|
|
||||||
<span className="font-bold">
|
|
||||||
{formatCurrency(payload[0].value as number)}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
)
|
|
||||||
}
|
|
||||||
return null
|
|
||||||
}}
|
|
||||||
/>
|
|
||||||
<Area
|
|
||||||
type="monotone"
|
|
||||||
dataKey="value"
|
|
||||||
stroke="#0ea5e9"
|
|
||||||
fill="#0ea5e9"
|
|
||||||
fillOpacity={0.2}
|
|
||||||
/>
|
|
||||||
</AreaChart>
|
|
||||||
</ResponsiveContainer>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
</TabsContent>
|
|
||||||
|
|
||||||
<TabsContent value="inventory" className="space-y-4">
|
|
||||||
<div className="h-[300px]">
|
|
||||||
<ResponsiveContainer width="100%" height="100%">
|
|
||||||
<AreaChart data={metrics?.inventory_value}>
|
|
||||||
<XAxis
|
|
||||||
dataKey="date"
|
|
||||||
tickLine={false}
|
|
||||||
axisLine={false}
|
|
||||||
tickFormatter={(value) => value}
|
|
||||||
/>
|
|
||||||
<YAxis
|
|
||||||
tickLine={false}
|
|
||||||
axisLine={false}
|
|
||||||
tickFormatter={formatCurrency}
|
|
||||||
/>
|
|
||||||
<Tooltip
|
|
||||||
content={({ active, payload }) => {
|
|
||||||
if (active && payload && payload.length) {
|
|
||||||
return (
|
|
||||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
|
||||||
<div className="grid grid-cols-2 gap-2">
|
|
||||||
<div className="flex flex-col">
|
|
||||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
|
||||||
Date
|
|
||||||
</span>
|
|
||||||
<span className="font-bold">
|
|
||||||
{payload[0].payload.date}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
<div className="flex flex-col">
|
|
||||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
|
||||||
Value
|
|
||||||
</span>
|
|
||||||
<span className="font-bold">
|
|
||||||
{formatCurrency(payload[0].value as number)}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
)
|
|
||||||
}
|
|
||||||
return null
|
|
||||||
}}
|
|
||||||
/>
|
|
||||||
<Area
|
|
||||||
type="monotone"
|
|
||||||
dataKey="value"
|
|
||||||
stroke="#84cc16"
|
|
||||||
fill="#84cc16"
|
|
||||||
fillOpacity={0.2}
|
|
||||||
/>
|
|
||||||
</AreaChart>
|
|
||||||
</ResponsiveContainer>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
</TabsContent>
|
|
||||||
|
|
||||||
<TabsContent value="gmroi" className="space-y-4">
|
|
||||||
<div className="h-[300px]">
|
|
||||||
<ResponsiveContainer width="100%" height="100%">
|
|
||||||
<AreaChart data={metrics?.gmroi}>
|
|
||||||
<XAxis
|
|
||||||
dataKey="date"
|
|
||||||
tickLine={false}
|
|
||||||
axisLine={false}
|
|
||||||
tickFormatter={(value) => value}
|
|
||||||
/>
|
|
||||||
<YAxis
|
|
||||||
tickLine={false}
|
|
||||||
axisLine={false}
|
|
||||||
tickFormatter={(value) => `${value.toFixed(1)}%`}
|
|
||||||
/>
|
|
||||||
<Tooltip
|
|
||||||
content={({ active, payload }) => {
|
|
||||||
if (active && payload && payload.length) {
|
|
||||||
return (
|
|
||||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
|
||||||
<div className="grid grid-cols-2 gap-2">
|
|
||||||
<div className="flex flex-col">
|
|
||||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
|
||||||
Date
|
|
||||||
</span>
|
|
||||||
<span className="font-bold">
|
|
||||||
{payload[0].payload.date}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
<div className="flex flex-col">
|
|
||||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
|
||||||
GMROI
|
|
||||||
</span>
|
|
||||||
<span className="font-bold">
|
|
||||||
{`${typeof payload[0].value === 'number' ? payload[0].value.toFixed(1) : payload[0].value}%`}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
)
|
|
||||||
}
|
|
||||||
return null
|
|
||||||
}}
|
|
||||||
/>
|
|
||||||
<Area
|
|
||||||
type="monotone"
|
|
||||||
dataKey="value"
|
|
||||||
stroke="#f59e0b"
|
|
||||||
fill="#f59e0b"
|
|
||||||
fillOpacity={0.2}
|
|
||||||
/>
|
|
||||||
</AreaChart>
|
|
||||||
</ResponsiveContainer>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
</TabsContent>
|
|
||||||
</Tabs>
|
|
||||||
</CardContent>
|
|
||||||
</>
|
|
||||||
)
|
|
||||||
}
|
|
||||||
@@ -1,108 +0,0 @@
|
|||||||
import { useQuery } from "@tanstack/react-query"
|
|
||||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
|
||||||
import {
|
|
||||||
Table,
|
|
||||||
TableBody,
|
|
||||||
TableCell,
|
|
||||||
TableHead,
|
|
||||||
TableHeader,
|
|
||||||
TableRow,
|
|
||||||
} from "@/components/ui/table"
|
|
||||||
import { Badge } from "@/components/ui/badge"
|
|
||||||
import config from "@/config"
|
|
||||||
import { format } from "date-fns"
|
|
||||||
|
|
||||||
interface Product {
|
|
||||||
pid: number;
|
|
||||||
sku: string;
|
|
||||||
title: string;
|
|
||||||
stock_quantity: number;
|
|
||||||
daily_sales_avg: string;
|
|
||||||
days_of_inventory: string;
|
|
||||||
reorder_qty: number;
|
|
||||||
last_purchase_date: string | null;
|
|
||||||
lead_time_status: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Helper functions
|
|
||||||
const formatDate = (dateString: string) => {
|
|
||||||
return format(new Date(dateString), 'MMM dd, yyyy')
|
|
||||||
}
|
|
||||||
|
|
||||||
const getLeadTimeVariant = (status: string) => {
|
|
||||||
switch (status.toLowerCase()) {
|
|
||||||
case 'critical':
|
|
||||||
return 'destructive'
|
|
||||||
case 'warning':
|
|
||||||
return 'secondary'
|
|
||||||
case 'good':
|
|
||||||
return 'secondary'
|
|
||||||
default:
|
|
||||||
return 'secondary'
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
export function LowStockAlerts() {
|
|
||||||
const { data: products } = useQuery<Product[]>({
|
|
||||||
queryKey: ["low-stock"],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/dashboard/low-stock/products`)
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error("Failed to fetch low stock products")
|
|
||||||
}
|
|
||||||
return response.json()
|
|
||||||
},
|
|
||||||
})
|
|
||||||
|
|
||||||
return (
|
|
||||||
<>
|
|
||||||
<CardHeader>
|
|
||||||
<CardTitle className="text-lg font-medium">Low Stock Alerts</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<div className="max-h-[350px] overflow-auto">
|
|
||||||
<Table>
|
|
||||||
<TableHeader>
|
|
||||||
<TableRow>
|
|
||||||
<TableHead>Product</TableHead>
|
|
||||||
<TableHead className="text-right">Stock</TableHead>
|
|
||||||
<TableHead className="text-right">Daily Sales</TableHead>
|
|
||||||
<TableHead className="text-right">Days Left</TableHead>
|
|
||||||
<TableHead className="text-right">Reorder Qty</TableHead>
|
|
||||||
<TableHead>Last Purchase</TableHead>
|
|
||||||
<TableHead>Lead Time</TableHead>
|
|
||||||
</TableRow>
|
|
||||||
</TableHeader>
|
|
||||||
<TableBody>
|
|
||||||
{products?.map((product) => (
|
|
||||||
<TableRow key={product.pid}>
|
|
||||||
<TableCell>
|
|
||||||
<a
|
|
||||||
href={`https://backend.acherryontop.com/product/${product.pid}`}
|
|
||||||
target="_blank"
|
|
||||||
rel="noopener noreferrer"
|
|
||||||
className="hover:underline"
|
|
||||||
>
|
|
||||||
{product.title}
|
|
||||||
</a>
|
|
||||||
<div className="text-sm text-muted-foreground">{product.sku}</div>
|
|
||||||
</TableCell>
|
|
||||||
<TableCell className="text-right">{product.stock_quantity}</TableCell>
|
|
||||||
<TableCell className="text-right">{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
|
|
||||||
<TableCell className="text-right">{Number(product.days_of_inventory).toFixed(1)}</TableCell>
|
|
||||||
<TableCell className="text-right">{product.reorder_qty}</TableCell>
|
|
||||||
<TableCell>{product.last_purchase_date ? formatDate(product.last_purchase_date) : '-'}</TableCell>
|
|
||||||
<TableCell>
|
|
||||||
<Badge variant={getLeadTimeVariant(product.lead_time_status)}>
|
|
||||||
{product.lead_time_status}
|
|
||||||
</Badge>
|
|
||||||
</TableCell>
|
|
||||||
</TableRow>
|
|
||||||
))}
|
|
||||||
</TableBody>
|
|
||||||
</Table>
|
|
||||||
</div>
|
|
||||||
</CardContent>
|
|
||||||
</>
|
|
||||||
)
|
|
||||||
}
|
|
||||||
@@ -1,66 +0,0 @@
|
|||||||
import { useQuery } from '@tanstack/react-query';
|
|
||||||
import { Line, LineChart, ResponsiveContainer, Tooltip, XAxis, YAxis } from 'recharts';
|
|
||||||
import config from '../../config';
|
|
||||||
|
|
||||||
interface SalesData {
|
|
||||||
date: string;
|
|
||||||
total: number;
|
|
||||||
}
|
|
||||||
|
|
||||||
export function Overview() {
|
|
||||||
const { data, isLoading, error } = useQuery<SalesData[]>({
|
|
||||||
queryKey: ['sales-overview'],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/dashboard/sales-overview`);
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error('Failed to fetch sales overview');
|
|
||||||
}
|
|
||||||
const rawData = await response.json();
|
|
||||||
return rawData.map((item: SalesData) => ({
|
|
||||||
...item,
|
|
||||||
total: parseFloat(item.total.toString()),
|
|
||||||
date: new Date(item.date).toLocaleDateString('en-US', { month: 'short', day: 'numeric' })
|
|
||||||
}));
|
|
||||||
},
|
|
||||||
});
|
|
||||||
|
|
||||||
if (isLoading) {
|
|
||||||
return <div>Loading chart...</div>;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (error) {
|
|
||||||
return <div className="text-red-500">Error loading sales overview</div>;
|
|
||||||
}
|
|
||||||
|
|
||||||
return (
|
|
||||||
<ResponsiveContainer width="100%" height={350}>
|
|
||||||
<LineChart data={data}>
|
|
||||||
<XAxis
|
|
||||||
dataKey="date"
|
|
||||||
stroke="#888888"
|
|
||||||
fontSize={12}
|
|
||||||
tickLine={false}
|
|
||||||
axisLine={false}
|
|
||||||
/>
|
|
||||||
<YAxis
|
|
||||||
stroke="#888888"
|
|
||||||
fontSize={12}
|
|
||||||
tickLine={false}
|
|
||||||
axisLine={false}
|
|
||||||
tickFormatter={(value) => `$${value.toLocaleString()}`}
|
|
||||||
/>
|
|
||||||
<Tooltip
|
|
||||||
formatter={(value: number) => [`$${value.toLocaleString()}`, 'Sales']}
|
|
||||||
labelFormatter={(label) => `Date: ${label}`}
|
|
||||||
/>
|
|
||||||
<Line
|
|
||||||
type="monotone"
|
|
||||||
dataKey="total"
|
|
||||||
stroke="hsl(var(--primary))"
|
|
||||||
strokeWidth={2}
|
|
||||||
dot={false}
|
|
||||||
/>
|
|
||||||
</LineChart>
|
|
||||||
</ResponsiveContainer>
|
|
||||||
);
|
|
||||||
}
|
|
||||||
@@ -1,63 +0,0 @@
|
|||||||
import { useQuery } from '@tanstack/react-query';
|
|
||||||
import { Avatar, AvatarFallback } from '@/components/ui/avatar';
|
|
||||||
import config from '../../config';
|
|
||||||
|
|
||||||
interface RecentOrder {
|
|
||||||
order_id: string;
|
|
||||||
customer_name: string;
|
|
||||||
total_amount: number;
|
|
||||||
order_date: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
export function RecentSales() {
|
|
||||||
const { data: recentOrders, isLoading, error } = useQuery<RecentOrder[]>({
|
|
||||||
queryKey: ['recent-orders'],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/dashboard/recent-orders`);
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error('Failed to fetch recent orders');
|
|
||||||
}
|
|
||||||
const data = await response.json();
|
|
||||||
return data.map((order: RecentOrder) => ({
|
|
||||||
...order,
|
|
||||||
total_amount: parseFloat(order.total_amount.toString())
|
|
||||||
}));
|
|
||||||
},
|
|
||||||
});
|
|
||||||
|
|
||||||
if (isLoading) {
|
|
||||||
return <div>Loading recent sales...</div>;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (error) {
|
|
||||||
return <div className="text-red-500">Error loading recent sales</div>;
|
|
||||||
}
|
|
||||||
|
|
||||||
return (
|
|
||||||
<div className="space-y-8">
|
|
||||||
{recentOrders?.map((order) => (
|
|
||||||
<div key={order.order_id} className="flex items-center">
|
|
||||||
<Avatar className="h-9 w-9">
|
|
||||||
<AvatarFallback>
|
|
||||||
{order.customer_name?.split(' ').map(n => n[0]).join('') || '??'}
|
|
||||||
</AvatarFallback>
|
|
||||||
</Avatar>
|
|
||||||
<div className="ml-4 space-y-1">
|
|
||||||
<p className="text-sm font-medium leading-none">Order #{order.order_id}</p>
|
|
||||||
<p className="text-sm text-muted-foreground">
|
|
||||||
{new Date(order.order_date).toLocaleDateString()}
|
|
||||||
</p>
|
|
||||||
</div>
|
|
||||||
<div className="ml-auto font-medium">
|
|
||||||
${order.total_amount.toFixed(2)}
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
))}
|
|
||||||
{!recentOrders?.length && (
|
|
||||||
<div className="text-center text-muted-foreground">
|
|
||||||
No recent orders found
|
|
||||||
</div>
|
|
||||||
)}
|
|
||||||
</div>
|
|
||||||
);
|
|
||||||
}
|
|
||||||
@@ -1,58 +0,0 @@
|
|||||||
import { useQuery } from '@tanstack/react-query';
|
|
||||||
import { Cell, Pie, PieChart, ResponsiveContainer, Tooltip, Legend } from 'recharts';
|
|
||||||
import config from '../../config';
|
|
||||||
|
|
||||||
interface CategorySales {
|
|
||||||
category: string;
|
|
||||||
total: number;
|
|
||||||
percentage: number;
|
|
||||||
}
|
|
||||||
|
|
||||||
const COLORS = ['#0088FE', '#00C49F', '#FFBB28', '#FF8042', '#8884d8', '#82ca9d'];
|
|
||||||
|
|
||||||
export function SalesByCategory() {
|
|
||||||
const { data, isLoading, error } = useQuery<CategorySales[]>({
|
|
||||||
queryKey: ['sales-by-category'],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/dashboard/sales-by-category`);
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error('Failed to fetch category sales');
|
|
||||||
}
|
|
||||||
return response.json();
|
|
||||||
},
|
|
||||||
});
|
|
||||||
|
|
||||||
if (isLoading) {
|
|
||||||
return <div>Loading chart...</div>;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (error) {
|
|
||||||
return <div className="text-red-500">Error loading category sales</div>;
|
|
||||||
}
|
|
||||||
|
|
||||||
return (
|
|
||||||
<ResponsiveContainer width="100%" height={300}>
|
|
||||||
<PieChart>
|
|
||||||
<Pie
|
|
||||||
data={data}
|
|
||||||
cx="50%"
|
|
||||||
cy="50%"
|
|
||||||
labelLine={false}
|
|
||||||
outerRadius={80}
|
|
||||||
fill="#8884d8"
|
|
||||||
dataKey="total"
|
|
||||||
nameKey="category"
|
|
||||||
label={({ name, percent }) => `${name} ${(percent * 100).toFixed(0)}%`}
|
|
||||||
>
|
|
||||||
{data?.map((_, index) => (
|
|
||||||
<Cell key={`cell-${index}`} fill={COLORS[index % COLORS.length]} />
|
|
||||||
))}
|
|
||||||
</Pie>
|
|
||||||
<Tooltip
|
|
||||||
formatter={(value: number) => [`$${value.toLocaleString()}`, 'Sales']}
|
|
||||||
/>
|
|
||||||
<Legend />
|
|
||||||
</PieChart>
|
|
||||||
</ResponsiveContainer>
|
|
||||||
);
|
|
||||||
}
|
|
||||||
@@ -1,95 +0,0 @@
|
|||||||
import { useQuery } from "@tanstack/react-query"
|
|
||||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
|
||||||
import {
|
|
||||||
Table,
|
|
||||||
TableBody,
|
|
||||||
TableCell,
|
|
||||||
TableHead,
|
|
||||||
TableHeader,
|
|
||||||
TableRow,
|
|
||||||
} from "@/components/ui/table"
|
|
||||||
import { TrendingUp, TrendingDown } from "lucide-react"
|
|
||||||
import config from "@/config"
|
|
||||||
|
|
||||||
interface Product {
|
|
||||||
pid: number;
|
|
||||||
sku: string;
|
|
||||||
title: string;
|
|
||||||
daily_sales_avg: string;
|
|
||||||
weekly_sales_avg: string;
|
|
||||||
growth_rate: string;
|
|
||||||
total_revenue: string;
|
|
||||||
}
|
|
||||||
|
|
||||||
export function TrendingProducts() {
|
|
||||||
const { data: products } = useQuery<Product[]>({
|
|
||||||
queryKey: ["trending-products"],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/products/trending`)
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error("Failed to fetch trending products")
|
|
||||||
}
|
|
||||||
return response.json()
|
|
||||||
},
|
|
||||||
})
|
|
||||||
|
|
||||||
const formatPercent = (value: number) =>
|
|
||||||
new Intl.NumberFormat("en-US", {
|
|
||||||
style: "percent",
|
|
||||||
minimumFractionDigits: 1,
|
|
||||||
maximumFractionDigits: 1,
|
|
||||||
signDisplay: "exceptZero",
|
|
||||||
}).format(value / 100)
|
|
||||||
|
|
||||||
return (
|
|
||||||
<>
|
|
||||||
<CardHeader>
|
|
||||||
<CardTitle className="text-lg font-medium">Trending Products</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<div className="max-h-[400px] overflow-auto">
|
|
||||||
<Table>
|
|
||||||
<TableHeader>
|
|
||||||
<TableRow>
|
|
||||||
<TableHead>Product</TableHead>
|
|
||||||
<TableHead>Daily Sales</TableHead>
|
|
||||||
<TableHead className="text-right">Growth</TableHead>
|
|
||||||
</TableRow>
|
|
||||||
</TableHeader>
|
|
||||||
<TableBody>
|
|
||||||
{products?.map((product) => (
|
|
||||||
<TableRow key={product.pid}>
|
|
||||||
<TableCell className="font-medium">
|
|
||||||
<div className="flex flex-col">
|
|
||||||
<span className="font-medium">{product.title}</span>
|
|
||||||
<span className="text-sm text-muted-foreground">
|
|
||||||
{product.sku}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
</TableCell>
|
|
||||||
<TableCell>{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
|
|
||||||
<TableCell className="text-right">
|
|
||||||
<div className="flex items-center justify-end gap-1">
|
|
||||||
{Number(product.growth_rate) > 0 ? (
|
|
||||||
<TrendingUp className="h-4 w-4 text-success" />
|
|
||||||
) : (
|
|
||||||
<TrendingDown className="h-4 w-4 text-destructive" />
|
|
||||||
)}
|
|
||||||
<span
|
|
||||||
className={
|
|
||||||
Number(product.growth_rate) > 0 ? "text-success" : "text-destructive"
|
|
||||||
}
|
|
||||||
>
|
|
||||||
{formatPercent(Number(product.growth_rate))}
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
</TableCell>
|
|
||||||
</TableRow>
|
|
||||||
))}
|
|
||||||
</TableBody>
|
|
||||||
</Table>
|
|
||||||
</div>
|
|
||||||
</CardContent>
|
|
||||||
</>
|
|
||||||
)
|
|
||||||
}
|
|
||||||
@@ -1,79 +0,0 @@
|
|||||||
import { useQuery } from "@tanstack/react-query"
|
|
||||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
|
||||||
import {
|
|
||||||
Table,
|
|
||||||
TableBody,
|
|
||||||
TableCell,
|
|
||||||
TableHead,
|
|
||||||
TableHeader,
|
|
||||||
TableRow,
|
|
||||||
} from "@/components/ui/table"
|
|
||||||
import { Progress } from "@/components/ui/progress"
|
|
||||||
import config from "@/config"
|
|
||||||
|
|
||||||
interface VendorMetrics {
|
|
||||||
vendor: string
|
|
||||||
avg_lead_time: number
|
|
||||||
on_time_delivery_rate: number
|
|
||||||
avg_fill_rate: number
|
|
||||||
total_orders: number
|
|
||||||
active_orders: number
|
|
||||||
overdue_orders: number
|
|
||||||
}
|
|
||||||
|
|
||||||
export function VendorPerformance() {
|
|
||||||
const { data: vendors } = useQuery<VendorMetrics[]>({
|
|
||||||
queryKey: ["vendor-metrics"],
|
|
||||||
queryFn: async () => {
|
|
||||||
const response = await fetch(`${config.apiUrl}/dashboard/vendor/performance`)
|
|
||||||
if (!response.ok) {
|
|
||||||
throw new Error("Failed to fetch vendor metrics")
|
|
||||||
}
|
|
||||||
return response.json()
|
|
||||||
},
|
|
||||||
})
|
|
||||||
|
|
||||||
// Sort vendors by on-time delivery rate
|
|
||||||
const sortedVendors = vendors
|
|
||||||
?.sort((a, b) => b.on_time_delivery_rate - a.on_time_delivery_rate)
|
|
||||||
|
|
||||||
return (
|
|
||||||
<>
|
|
||||||
<CardHeader>
|
|
||||||
<CardTitle className="text-lg font-medium">Top Vendor Performance</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent className="max-h-[400px] overflow-auto">
|
|
||||||
<Table>
|
|
||||||
<TableHeader>
|
|
||||||
<TableRow>
|
|
||||||
<TableHead>Vendor</TableHead>
|
|
||||||
<TableHead>On-Time</TableHead>
|
|
||||||
<TableHead className="text-right">Fill Rate</TableHead>
|
|
||||||
</TableRow>
|
|
||||||
</TableHeader>
|
|
||||||
<TableBody>
|
|
||||||
{sortedVendors?.map((vendor) => (
|
|
||||||
<TableRow key={vendor.vendor}>
|
|
||||||
<TableCell className="font-medium">{vendor.vendor}</TableCell>
|
|
||||||
<TableCell>
|
|
||||||
<div className="flex items-center gap-2">
|
|
||||||
<Progress
|
|
||||||
value={vendor.on_time_delivery_rate}
|
|
||||||
className="h-2"
|
|
||||||
/>
|
|
||||||
<span className="w-10 text-sm">
|
|
||||||
{vendor.on_time_delivery_rate.toFixed(0)}%
|
|
||||||
</span>
|
|
||||||
</div>
|
|
||||||
</TableCell>
|
|
||||||
<TableCell className="text-right">
|
|
||||||
{vendor.avg_fill_rate.toFixed(0)}%
|
|
||||||
</TableCell>
|
|
||||||
</TableRow>
|
|
||||||
))}
|
|
||||||
</TableBody>
|
|
||||||
</Table>
|
|
||||||
</CardContent>
|
|
||||||
</>
|
|
||||||
)
|
|
||||||
}
|
|
||||||
@@ -253,6 +253,7 @@ export const ImageUploadStep = ({
|
|||||||
}
|
}
|
||||||
getProductContainerClasses={() => getProductContainerClasses(index)}
|
getProductContainerClasses={() => getProductContainerClasses(index)}
|
||||||
findContainer={findContainer}
|
findContainer={findContainer}
|
||||||
|
handleAddImageFromUrl={handleAddImageFromUrl}
|
||||||
/>
|
/>
|
||||||
))}
|
))}
|
||||||
</div>
|
</div>
|
||||||
|
|||||||
@@ -1,7 +1,7 @@
|
|||||||
import { Button } from "@/components/ui/button";
|
import { Button } from "@/components/ui/button";
|
||||||
import { Input } from "@/components/ui/input";
|
import { Input } from "@/components/ui/input";
|
||||||
import { Card, CardContent } from "@/components/ui/card";
|
import { Card, CardContent } from "@/components/ui/card";
|
||||||
import { Loader2, Link as LinkIcon } from "lucide-react";
|
import { Loader2, Link as LinkIcon, Image as ImageIcon } from "lucide-react";
|
||||||
import { cn } from "@/lib/utils";
|
import { cn } from "@/lib/utils";
|
||||||
import { ImageDropzone } from "./ImageDropzone";
|
import { ImageDropzone } from "./ImageDropzone";
|
||||||
import { SortableImage } from "./SortableImage";
|
import { SortableImage } from "./SortableImage";
|
||||||
@@ -9,6 +9,25 @@ import { CopyButton } from "./CopyButton";
|
|||||||
import { ProductImageSortable, Product } from "../../types";
|
import { ProductImageSortable, Product } from "../../types";
|
||||||
import { DroppableContainer } from "../DroppableContainer";
|
import { DroppableContainer } from "../DroppableContainer";
|
||||||
import { SortableContext, horizontalListSortingStrategy } from '@dnd-kit/sortable';
|
import { SortableContext, horizontalListSortingStrategy } from '@dnd-kit/sortable';
|
||||||
|
import { useQuery } from "@tanstack/react-query";
|
||||||
|
import config from "@/config";
|
||||||
|
import {
|
||||||
|
Dialog,
|
||||||
|
DialogContent,
|
||||||
|
DialogDescription,
|
||||||
|
DialogHeader,
|
||||||
|
DialogTitle,
|
||||||
|
} from "@/components/ui/dialog";
|
||||||
|
import { ScrollArea } from "@/components/ui/scroll-area";
|
||||||
|
import { useState, useMemo } from "react";
|
||||||
|
|
||||||
|
interface ReusableImage {
|
||||||
|
id: number;
|
||||||
|
name: string;
|
||||||
|
image_url: string;
|
||||||
|
is_global: boolean;
|
||||||
|
company: string | null;
|
||||||
|
}
|
||||||
|
|
||||||
interface ProductCardProps {
|
interface ProductCardProps {
|
||||||
product: Product;
|
product: Product;
|
||||||
@@ -26,6 +45,7 @@ interface ProductCardProps {
|
|||||||
onRemoveImage: (id: string) => void;
|
onRemoveImage: (id: string) => void;
|
||||||
getProductContainerClasses: () => string;
|
getProductContainerClasses: () => string;
|
||||||
findContainer: (id: string) => string | null;
|
findContainer: (id: string) => string | null;
|
||||||
|
handleAddImageFromUrl: (productIndex: number, url: string) => void;
|
||||||
}
|
}
|
||||||
|
|
||||||
export const ProductCard = ({
|
export const ProductCard = ({
|
||||||
@@ -43,8 +63,11 @@ export const ProductCard = ({
|
|||||||
onDragOver,
|
onDragOver,
|
||||||
onRemoveImage,
|
onRemoveImage,
|
||||||
getProductContainerClasses,
|
getProductContainerClasses,
|
||||||
findContainer
|
findContainer,
|
||||||
|
handleAddImageFromUrl
|
||||||
}: ProductCardProps) => {
|
}: ProductCardProps) => {
|
||||||
|
const [isReusableDialogOpen, setIsReusableDialogOpen] = useState(false);
|
||||||
|
|
||||||
// Function to get images for this product
|
// Function to get images for this product
|
||||||
const getProductImages = () => {
|
const getProductImages = () => {
|
||||||
return productImages.filter(img => img.productIndex === index);
|
return productImages.filter(img => img.productIndex === index);
|
||||||
@@ -56,6 +79,32 @@ export const ProductCard = ({
|
|||||||
return result !== null ? parseInt(result) : null;
|
return result !== null ? parseInt(result) : null;
|
||||||
};
|
};
|
||||||
|
|
||||||
|
// Fetch reusable images
|
||||||
|
const { data: reusableImages, isLoading: isLoadingReusable } = useQuery<ReusableImage[]>({
|
||||||
|
queryKey: ["reusable-images"],
|
||||||
|
queryFn: async () => {
|
||||||
|
const response = await fetch(`${config.apiUrl}/reusable-images`);
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error("Failed to fetch reusable images");
|
||||||
|
}
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
// Filter reusable images based on product's company
|
||||||
|
const availableReusableImages = useMemo(() => {
|
||||||
|
if (!reusableImages) return [];
|
||||||
|
return reusableImages.filter(img =>
|
||||||
|
img.is_global || img.company === product.company
|
||||||
|
);
|
||||||
|
}, [reusableImages, product.company]);
|
||||||
|
|
||||||
|
// Handle adding a reusable image
|
||||||
|
const handleAddReusableImage = (imageUrl: string) => {
|
||||||
|
handleAddImageFromUrl(index, imageUrl);
|
||||||
|
setIsReusableDialogOpen(false);
|
||||||
|
};
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<Card
|
<Card
|
||||||
className={cn(
|
className={cn(
|
||||||
@@ -83,6 +132,18 @@ export const ProductCard = ({
|
|||||||
className="flex items-center gap-2"
|
className="flex items-center gap-2"
|
||||||
onSubmit={onUrlSubmit}
|
onSubmit={onUrlSubmit}
|
||||||
>
|
>
|
||||||
|
{getProductImages().length === 0 && (
|
||||||
|
<Button
|
||||||
|
type="button"
|
||||||
|
variant="outline"
|
||||||
|
size="sm"
|
||||||
|
className="h-8 whitespace-nowrap flex gap-1 items-center text-xs"
|
||||||
|
onClick={() => setIsReusableDialogOpen(true)}
|
||||||
|
>
|
||||||
|
<ImageIcon className="h-3.5 w-3.5" />
|
||||||
|
Select from Library
|
||||||
|
</Button>
|
||||||
|
)}
|
||||||
<Input
|
<Input
|
||||||
placeholder="Add image from URL"
|
placeholder="Add image from URL"
|
||||||
value={urlInput}
|
value={urlInput}
|
||||||
@@ -105,7 +166,7 @@ export const ProductCard = ({
|
|||||||
</div>
|
</div>
|
||||||
|
|
||||||
<div className="flex flex-col sm:flex-row gap-2">
|
<div className="flex flex-col sm:flex-row gap-2">
|
||||||
<div className="flex flex-row gap-2 items-start">
|
<div className="flex flex-row gap-2 items-center gap-4">
|
||||||
<ImageDropzone
|
<ImageDropzone
|
||||||
productIndex={index}
|
productIndex={index}
|
||||||
onDrop={onImageUpload}
|
onDrop={onImageUpload}
|
||||||
@@ -158,6 +219,50 @@ export const ProductCard = ({
|
|||||||
/>
|
/>
|
||||||
</div>
|
</div>
|
||||||
</CardContent>
|
</CardContent>
|
||||||
|
|
||||||
|
{/* Reusable Images Dialog */}
|
||||||
|
<Dialog open={isReusableDialogOpen} onOpenChange={setIsReusableDialogOpen}>
|
||||||
|
<DialogContent className="max-w-3xl">
|
||||||
|
<DialogHeader>
|
||||||
|
<DialogTitle>Select from Image Library</DialogTitle>
|
||||||
|
<DialogDescription>
|
||||||
|
Choose a global or company-specific image to add to this product.
|
||||||
|
</DialogDescription>
|
||||||
|
</DialogHeader>
|
||||||
|
<ScrollArea className="h-[400px] pr-4">
|
||||||
|
{isLoadingReusable ? (
|
||||||
|
<div className="flex items-center justify-center h-full">
|
||||||
|
<Loader2 className="h-8 w-8 animate-spin" />
|
||||||
|
</div>
|
||||||
|
) : availableReusableImages.length === 0 ? (
|
||||||
|
<div className="flex flex-col items-center justify-center h-full text-muted-foreground">
|
||||||
|
<ImageIcon className="h-8 w-8 mb-2" />
|
||||||
|
<p>No reusable images available</p>
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 gap-4">
|
||||||
|
{availableReusableImages.map((image) => (
|
||||||
|
<div
|
||||||
|
key={image.id}
|
||||||
|
className="group relative aspect-square border rounded-lg overflow-hidden cursor-pointer hover:ring-2 hover:ring-primary"
|
||||||
|
onClick={() => handleAddReusableImage(image.image_url)}
|
||||||
|
>
|
||||||
|
<img
|
||||||
|
src={image.image_url}
|
||||||
|
alt={image.name}
|
||||||
|
className="w-full h-full object-cover"
|
||||||
|
/>
|
||||||
|
<div className="absolute inset-0 bg-black/0 group-hover:bg-black/20 transition-colors" />
|
||||||
|
<div className="absolute bottom-0 left-0 right-0 p-2 bg-gradient-to-t from-black/60 to-transparent">
|
||||||
|
<p className="text-xs text-white truncate">{image.name}</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</ScrollArea>
|
||||||
|
</DialogContent>
|
||||||
|
</Dialog>
|
||||||
</Card>
|
</Card>
|
||||||
);
|
);
|
||||||
};
|
};
|
||||||
@@ -31,5 +31,6 @@ export interface Product {
|
|||||||
supplier_no?: string;
|
supplier_no?: string;
|
||||||
sku?: string;
|
sku?: string;
|
||||||
model?: string;
|
model?: string;
|
||||||
|
company?: string;
|
||||||
product_images?: string | string[];
|
product_images?: string | string[];
|
||||||
}
|
}
|
||||||
@@ -8,7 +8,7 @@ import {
|
|||||||
} from "@/components/ui/dialog";
|
} from "@/components/ui/dialog";
|
||||||
import { ScrollArea } from "@/components/ui/scroll-area";
|
import { ScrollArea } from "@/components/ui/scroll-area";
|
||||||
import { Button } from "@/components/ui/button";
|
import { Button } from "@/components/ui/button";
|
||||||
import { Loader2, CheckIcon } from "lucide-react";
|
import { Loader2, CheckIcon, XIcon } from "lucide-react";
|
||||||
import { Code } from "@/components/ui/code";
|
import { Code } from "@/components/ui/code";
|
||||||
import {
|
import {
|
||||||
Table,
|
Table,
|
||||||
@@ -24,6 +24,7 @@ import {
|
|||||||
CurrentPrompt,
|
CurrentPrompt,
|
||||||
} from "../hooks/useAiValidation";
|
} from "../hooks/useAiValidation";
|
||||||
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
|
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
|
||||||
|
import { Badge } from "@/components/ui/badge";
|
||||||
|
|
||||||
interface TaxonomyStats {
|
interface TaxonomyStats {
|
||||||
categories: number;
|
categories: number;
|
||||||
@@ -41,6 +42,20 @@ interface DebugData {
|
|||||||
basePrompt: string;
|
basePrompt: string;
|
||||||
sampleFullPrompt: string;
|
sampleFullPrompt: string;
|
||||||
promptLength: number;
|
promptLength: number;
|
||||||
|
apiFormat?: Array<{
|
||||||
|
role: string;
|
||||||
|
content: string;
|
||||||
|
}>;
|
||||||
|
promptSources?: {
|
||||||
|
systemPrompt?: { id: number; prompt_text: string };
|
||||||
|
generalPrompt?: { id: number; prompt_text: string };
|
||||||
|
companyPrompts?: Array<{
|
||||||
|
id: number;
|
||||||
|
company: string;
|
||||||
|
companyName?: string;
|
||||||
|
prompt_text: string;
|
||||||
|
}>;
|
||||||
|
};
|
||||||
estimatedProcessingTime?: {
|
estimatedProcessingTime?: {
|
||||||
seconds: number | null;
|
seconds: number | null;
|
||||||
sampleCount: number;
|
sampleCount: number;
|
||||||
@@ -83,6 +98,75 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
debugData,
|
debugData,
|
||||||
}) => {
|
}) => {
|
||||||
const [costPerMillionTokens, setCostPerMillionTokens] = useState(2.5); // Default cost
|
const [costPerMillionTokens, setCostPerMillionTokens] = useState(2.5); // Default cost
|
||||||
|
const hasCompanyPrompts =
|
||||||
|
currentPrompt.debugData?.promptSources?.companyPrompts &&
|
||||||
|
currentPrompt.debugData.promptSources.companyPrompts.length > 0;
|
||||||
|
|
||||||
|
// Create our own state to track changes
|
||||||
|
const [localReversionState, setLocalReversionState] = useState<
|
||||||
|
Record<string, boolean>
|
||||||
|
>({});
|
||||||
|
|
||||||
|
// Initialize local state from the isChangeReverted function when component mounts
|
||||||
|
// or when aiValidationDetails changes
|
||||||
|
React.useEffect(() => {
|
||||||
|
if (
|
||||||
|
aiValidationDetails.changeDetails &&
|
||||||
|
aiValidationDetails.changeDetails.length > 0
|
||||||
|
) {
|
||||||
|
const initialState: Record<string, boolean> = {};
|
||||||
|
|
||||||
|
aiValidationDetails.changeDetails.forEach((product) => {
|
||||||
|
product.changes.forEach((change) => {
|
||||||
|
const key = `${product.productIndex}-${change.field}`;
|
||||||
|
initialState[key] = isChangeReverted(
|
||||||
|
product.productIndex,
|
||||||
|
change.field
|
||||||
|
);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
setLocalReversionState(initialState);
|
||||||
|
}
|
||||||
|
}, [aiValidationDetails.changeDetails, isChangeReverted]);
|
||||||
|
|
||||||
|
// This function will toggle the local state for a given change
|
||||||
|
const toggleChangeAcceptance = (productIndex: number, fieldKey: string) => {
|
||||||
|
const key = `${productIndex}-${fieldKey}`;
|
||||||
|
const currentlyRejected = !!localReversionState[key];
|
||||||
|
|
||||||
|
// Toggle the local state
|
||||||
|
setLocalReversionState((prev) => ({
|
||||||
|
...prev,
|
||||||
|
[key]: !prev[key],
|
||||||
|
}));
|
||||||
|
|
||||||
|
// Only call revertAiChange when toggling to rejected state
|
||||||
|
// Since revertAiChange is specifically for rejecting changes
|
||||||
|
if (!currentlyRejected) {
|
||||||
|
revertAiChange(productIndex, fieldKey);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
// Function to check local reversion state
|
||||||
|
const isChangeLocallyReverted = (
|
||||||
|
productIndex: number,
|
||||||
|
fieldKey: string
|
||||||
|
): boolean => {
|
||||||
|
const key = `${productIndex}-${fieldKey}`;
|
||||||
|
return !!localReversionState[key];
|
||||||
|
};
|
||||||
|
|
||||||
|
// Use "full" as the default tab
|
||||||
|
const defaultTab = "full";
|
||||||
|
const [activeTab, setActiveTab] = useState(defaultTab);
|
||||||
|
|
||||||
|
// Update activeTab when the dialog is opened with new data
|
||||||
|
React.useEffect(() => {
|
||||||
|
if (currentPrompt.isOpen) {
|
||||||
|
setActiveTab("full");
|
||||||
|
}
|
||||||
|
}, [currentPrompt.isOpen]);
|
||||||
|
|
||||||
// Format time helper
|
// Format time helper
|
||||||
const formatTime = (seconds: number): string => {
|
const formatTime = (seconds: number): string => {
|
||||||
@@ -123,15 +207,18 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
</DialogHeader>
|
</DialogHeader>
|
||||||
|
|
||||||
<div className="flex flex-col h-[calc(90vh-120px)] overflow-hidden">
|
<div className="flex flex-col h-[calc(90vh-120px)] overflow-hidden">
|
||||||
{/* Debug Information Section */}
|
{/* Debug Information Section - Fixed at the top */}
|
||||||
<div className="mb-4 flex-shrink-0">
|
<div className="flex-shrink-0">
|
||||||
{currentPrompt.isLoading ? (
|
{currentPrompt.isLoading ? (
|
||||||
<div className="flex justify-center items-center h-[100px]"></div>
|
<div className="flex justify-center items-center h-[100px]"></div>
|
||||||
) : (
|
) : (
|
||||||
<div className="grid grid-cols-3 gap-4">
|
<>
|
||||||
|
<div className="grid grid-cols-3 gap-4 mb-4">
|
||||||
<Card className="py-2">
|
<Card className="py-2">
|
||||||
<CardHeader className="py-2">
|
<CardHeader className="py-2">
|
||||||
<CardTitle className="text-base">Prompt Length</CardTitle>
|
<CardTitle className="text-base">
|
||||||
|
Prompt Length
|
||||||
|
</CardTitle>
|
||||||
</CardHeader>
|
</CardHeader>
|
||||||
<CardContent className="py-2">
|
<CardContent className="py-2">
|
||||||
<div className="flex flex-col space-y-2">
|
<div className="flex flex-col space-y-2">
|
||||||
@@ -139,10 +226,14 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
<span className="text-muted-foreground">
|
<span className="text-muted-foreground">
|
||||||
Characters:
|
Characters:
|
||||||
</span>{" "}
|
</span>{" "}
|
||||||
<span className="font-semibold">{promptLength}</span>
|
<span className="font-semibold">
|
||||||
|
{promptLength}
|
||||||
|
</span>
|
||||||
</div>
|
</div>
|
||||||
<div className="text-sm">
|
<div className="text-sm">
|
||||||
<span className="text-muted-foreground">Tokens:</span>{" "}
|
<span className="text-muted-foreground">
|
||||||
|
Tokens:
|
||||||
|
</span>{" "}
|
||||||
<span className="font-semibold">
|
<span className="font-semibold">
|
||||||
~{Math.round(promptLength / 4)}
|
~{Math.round(promptLength / 4)}
|
||||||
</span>
|
</span>
|
||||||
@@ -153,7 +244,9 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
|
|
||||||
<Card className="py-2">
|
<Card className="py-2">
|
||||||
<CardHeader className="py-2">
|
<CardHeader className="py-2">
|
||||||
<CardTitle className="text-base">Cost Estimate</CardTitle>
|
<CardTitle className="text-base">
|
||||||
|
Cost Estimate
|
||||||
|
</CardTitle>
|
||||||
</CardHeader>
|
</CardHeader>
|
||||||
<CardContent className="py-2">
|
<CardContent className="py-2">
|
||||||
<div className="flex flex-col space-y-2">
|
<div className="flex flex-col space-y-2">
|
||||||
@@ -200,8 +293,9 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
</CardHeader>
|
</CardHeader>
|
||||||
<CardContent className="py-2">
|
<CardContent className="py-2">
|
||||||
<div className="flex flex-col space-y-2">
|
<div className="flex flex-col space-y-2">
|
||||||
{debugData?.estimatedProcessingTime ? (
|
{currentPrompt.debugData?.estimatedProcessingTime ? (
|
||||||
debugData.estimatedProcessingTime.seconds ? (
|
currentPrompt.debugData.estimatedProcessingTime
|
||||||
|
.seconds ? (
|
||||||
<>
|
<>
|
||||||
<div className="text-sm">
|
<div className="text-sm">
|
||||||
<span className="text-muted-foreground">
|
<span className="text-muted-foreground">
|
||||||
@@ -209,23 +303,28 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
</span>{" "}
|
</span>{" "}
|
||||||
<span className="font-semibold">
|
<span className="font-semibold">
|
||||||
{formatTime(
|
{formatTime(
|
||||||
debugData.estimatedProcessingTime.seconds
|
currentPrompt.debugData
|
||||||
|
.estimatedProcessingTime.seconds
|
||||||
)}
|
)}
|
||||||
</span>
|
</span>
|
||||||
</div>
|
</div>
|
||||||
<div className="text-xs text-muted-foreground">
|
<div className="text-xs text-muted-foreground">
|
||||||
Based on{" "}
|
Based on{" "}
|
||||||
{debugData.estimatedProcessingTime.sampleCount}{" "}
|
{
|
||||||
|
currentPrompt.debugData
|
||||||
|
.estimatedProcessingTime.sampleCount
|
||||||
|
}{" "}
|
||||||
similar validation
|
similar validation
|
||||||
{debugData.estimatedProcessingTime
|
{currentPrompt.debugData
|
||||||
.sampleCount !== 1
|
.estimatedProcessingTime.sampleCount !== 1
|
||||||
? "s"
|
? "s"
|
||||||
: ""}
|
: ""}
|
||||||
</div>
|
</div>
|
||||||
</>
|
</>
|
||||||
) : (
|
) : (
|
||||||
<div className="text-sm text-muted-foreground">
|
<div className="text-sm text-muted-foreground">
|
||||||
No historical data available for this prompt size
|
No historical data available for this prompt
|
||||||
|
size
|
||||||
</div>
|
</div>
|
||||||
)
|
)
|
||||||
) : (
|
) : (
|
||||||
@@ -237,22 +336,304 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
</CardContent>
|
</CardContent>
|
||||||
</Card>
|
</Card>
|
||||||
</div>
|
</div>
|
||||||
|
</>
|
||||||
)}
|
)}
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{/* Prompt Section */}
|
{/* Prompt Section - Scrollable content */}
|
||||||
<div className="flex-1 min-h-0">
|
<div className="flex-1 min-h-0">
|
||||||
<ScrollArea className="h-full w-full">
|
|
||||||
{currentPrompt.isLoading ? (
|
{currentPrompt.isLoading ? (
|
||||||
<div className="flex items-center justify-center h-full">
|
<div className="flex items-center justify-center h-full">
|
||||||
<Loader2 className="h-8 w-8 animate-spin" />
|
<Loader2 className="h-8 w-8 animate-spin" />
|
||||||
</div>
|
</div>
|
||||||
) : (
|
) : (
|
||||||
|
<>
|
||||||
|
{currentPrompt.debugData?.apiFormat ? (
|
||||||
|
<div className="flex flex-col h-full">
|
||||||
|
{/* Prompt Sources Card - Fixed at the top of the content area */}
|
||||||
|
<Card className="py-2 mb-4 flex-shrink-0">
|
||||||
|
<CardHeader className="py-2">
|
||||||
|
<CardTitle className="text-base">
|
||||||
|
Prompt Sources
|
||||||
|
</CardTitle>
|
||||||
|
</CardHeader>
|
||||||
|
<CardContent className="py-2">
|
||||||
|
<div className="flex flex-wrap gap-2">
|
||||||
|
<Badge
|
||||||
|
variant="outline"
|
||||||
|
className="bg-purple-100 hover:bg-purple-200 cursor-pointer"
|
||||||
|
onClick={() =>
|
||||||
|
document
|
||||||
|
.getElementById("system-message")
|
||||||
|
?.scrollIntoView({ behavior: "smooth" })
|
||||||
|
}
|
||||||
|
>
|
||||||
|
System
|
||||||
|
</Badge>
|
||||||
|
<Badge
|
||||||
|
variant="outline"
|
||||||
|
className="bg-green-100 hover:bg-green-200 cursor-pointer"
|
||||||
|
onClick={() =>
|
||||||
|
document
|
||||||
|
.getElementById("general-section")
|
||||||
|
?.scrollIntoView({ behavior: "smooth" })
|
||||||
|
}
|
||||||
|
>
|
||||||
|
General
|
||||||
|
</Badge>
|
||||||
|
|
||||||
|
{currentPrompt.debugData.promptSources?.companyPrompts?.map(
|
||||||
|
(company, idx) => (
|
||||||
|
<Badge
|
||||||
|
key={idx}
|
||||||
|
variant="outline"
|
||||||
|
className="bg-blue-100 hover:bg-blue-200 cursor-pointer"
|
||||||
|
onClick={() =>
|
||||||
|
document
|
||||||
|
.getElementById("company-section")
|
||||||
|
?.scrollIntoView({ behavior: "smooth" })
|
||||||
|
}
|
||||||
|
>
|
||||||
|
{company.companyName ||
|
||||||
|
`Company ${company.company}`}
|
||||||
|
</Badge>
|
||||||
|
)
|
||||||
|
)}
|
||||||
|
|
||||||
|
<Badge
|
||||||
|
variant="outline"
|
||||||
|
className="bg-amber-100 hover:bg-amber-200 cursor-pointer"
|
||||||
|
onClick={() =>
|
||||||
|
document
|
||||||
|
.getElementById("taxonomy-section")
|
||||||
|
?.scrollIntoView({ behavior: "smooth" })
|
||||||
|
}
|
||||||
|
>
|
||||||
|
Taxonomy
|
||||||
|
</Badge>
|
||||||
|
<Badge
|
||||||
|
variant="outline"
|
||||||
|
className="bg-pink-100 hover:bg-pink-200 cursor-pointer"
|
||||||
|
onClick={() =>
|
||||||
|
document
|
||||||
|
.getElementById("product-section")
|
||||||
|
?.scrollIntoView({ behavior: "smooth" })
|
||||||
|
}
|
||||||
|
>
|
||||||
|
Products
|
||||||
|
</Badge>
|
||||||
|
</div>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
|
||||||
|
<ScrollArea className="flex-1 w-full overflow-y-auto">
|
||||||
|
{currentPrompt.debugData.apiFormat.map(
|
||||||
|
(message, idx: number) => (
|
||||||
|
<div
|
||||||
|
key={idx}
|
||||||
|
className="border rounded-md p-2 mb-4"
|
||||||
|
>
|
||||||
|
<div
|
||||||
|
id={
|
||||||
|
message.role === "system"
|
||||||
|
? "system-message"
|
||||||
|
: ""
|
||||||
|
}
|
||||||
|
className={`p-2 mb-2 rounded-sm font-medium ${
|
||||||
|
message.role === "system"
|
||||||
|
? "bg-purple-50 text-purple-800"
|
||||||
|
: "bg-green-50 text-green-800"
|
||||||
|
}`}
|
||||||
|
>
|
||||||
|
Role: {message.role}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<Code
|
||||||
|
className={`whitespace-pre-wrap p-4 break-normal max-w-full ${
|
||||||
|
message.role === "system"
|
||||||
|
? "bg-purple-50/30"
|
||||||
|
: "bg-green-50/30"
|
||||||
|
}`}
|
||||||
|
>
|
||||||
|
{message.role === "user" ? (
|
||||||
|
<div className="text-wrapper">
|
||||||
|
{(() => {
|
||||||
|
const content = message.content;
|
||||||
|
|
||||||
|
// Find section boundaries by looking for specific markers
|
||||||
|
const companySpecificStartIndex =
|
||||||
|
content.indexOf(
|
||||||
|
"--- COMPANY-SPECIFIC INSTRUCTIONS ---"
|
||||||
|
);
|
||||||
|
const companySpecificEndIndex =
|
||||||
|
content.indexOf(
|
||||||
|
"--- END COMPANY-SPECIFIC INSTRUCTIONS ---"
|
||||||
|
);
|
||||||
|
|
||||||
|
const taxonomyStartIndex =
|
||||||
|
content.indexOf(
|
||||||
|
"All Available Categories:"
|
||||||
|
);
|
||||||
|
const taxonomyFallbackStartIndex =
|
||||||
|
content.indexOf(
|
||||||
|
"Available Categories:"
|
||||||
|
);
|
||||||
|
const actualTaxonomyStartIndex =
|
||||||
|
taxonomyStartIndex >= 0
|
||||||
|
? taxonomyStartIndex
|
||||||
|
: taxonomyFallbackStartIndex;
|
||||||
|
|
||||||
|
const productDataStartIndex =
|
||||||
|
content.indexOf(
|
||||||
|
"----------Here is the product data to validate----------"
|
||||||
|
);
|
||||||
|
|
||||||
|
// If we can't find any markers, just return the content as-is
|
||||||
|
if (
|
||||||
|
actualTaxonomyStartIndex < 0 &&
|
||||||
|
productDataStartIndex < 0 &&
|
||||||
|
companySpecificStartIndex < 0
|
||||||
|
) {
|
||||||
|
return content;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Determine section indices
|
||||||
|
let generalEndIndex = content.length;
|
||||||
|
|
||||||
|
if (companySpecificStartIndex >= 0) {
|
||||||
|
generalEndIndex =
|
||||||
|
companySpecificStartIndex;
|
||||||
|
} else if (
|
||||||
|
actualTaxonomyStartIndex >= 0
|
||||||
|
) {
|
||||||
|
generalEndIndex =
|
||||||
|
actualTaxonomyStartIndex;
|
||||||
|
} else if (productDataStartIndex >= 0) {
|
||||||
|
generalEndIndex = productDataStartIndex;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Determine where taxonomy starts
|
||||||
|
let taxonomyEndIndex = content.length;
|
||||||
|
if (productDataStartIndex >= 0) {
|
||||||
|
taxonomyEndIndex =
|
||||||
|
productDataStartIndex;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Segments to render with appropriate styling
|
||||||
|
const segments = [];
|
||||||
|
|
||||||
|
// General section (beginning to company/taxonomy/product)
|
||||||
|
if (generalEndIndex > 0) {
|
||||||
|
segments.push(
|
||||||
|
<div
|
||||||
|
id="general-section"
|
||||||
|
key="general"
|
||||||
|
className="border-l-4 border-green-500 pl-4 py-0 my-1"
|
||||||
|
>
|
||||||
|
<div className="text-xs font-semibold text-green-700 mb-2">
|
||||||
|
General Prompt
|
||||||
|
</div>
|
||||||
|
<pre className="whitespace-pre-wrap">
|
||||||
|
{content.substring(
|
||||||
|
0,
|
||||||
|
generalEndIndex
|
||||||
|
)}
|
||||||
|
</pre>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Company-specific section if present
|
||||||
|
if (
|
||||||
|
companySpecificStartIndex >= 0 &&
|
||||||
|
companySpecificEndIndex >= 0
|
||||||
|
) {
|
||||||
|
segments.push(
|
||||||
|
<div
|
||||||
|
id="company-section"
|
||||||
|
key="company"
|
||||||
|
className="border-l-4 border-blue-500 pl-4 py-0 my-1"
|
||||||
|
>
|
||||||
|
<div className="text-xs font-semibold text-blue-700 mb-2">
|
||||||
|
Company-Specific Instructions
|
||||||
|
</div>
|
||||||
|
<pre className="whitespace-pre-wrap">
|
||||||
|
{content.substring(
|
||||||
|
companySpecificStartIndex,
|
||||||
|
companySpecificEndIndex +
|
||||||
|
"--- END COMPANY-SPECIFIC INSTRUCTIONS ---"
|
||||||
|
.length
|
||||||
|
)}
|
||||||
|
</pre>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Taxonomy section
|
||||||
|
if (actualTaxonomyStartIndex >= 0) {
|
||||||
|
const taxEnd = taxonomyEndIndex;
|
||||||
|
segments.push(
|
||||||
|
<div
|
||||||
|
id="taxonomy-section"
|
||||||
|
key="taxonomy"
|
||||||
|
className="border-l-4 border-amber-500 pl-4 py-0 my-1"
|
||||||
|
>
|
||||||
|
<div className="text-xs font-semibold text-amber-700 mb-2">
|
||||||
|
Taxonomy Data
|
||||||
|
</div>
|
||||||
|
<pre className="whitespace-pre-wrap">
|
||||||
|
{content.substring(
|
||||||
|
actualTaxonomyStartIndex,
|
||||||
|
taxEnd
|
||||||
|
)}
|
||||||
|
</pre>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Product data section
|
||||||
|
if (productDataStartIndex >= 0) {
|
||||||
|
segments.push(
|
||||||
|
<div
|
||||||
|
id="product-section"
|
||||||
|
key="product"
|
||||||
|
className="border-l-4 border-pink-500 pl-4 py-0 my-1"
|
||||||
|
>
|
||||||
|
<div className="text-xs font-semibold text-pink-700 mb-2">
|
||||||
|
Product Data
|
||||||
|
</div>
|
||||||
|
<pre className="whitespace-pre-wrap">
|
||||||
|
{content.substring(
|
||||||
|
productDataStartIndex
|
||||||
|
)}
|
||||||
|
</pre>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
return <>{segments}</>;
|
||||||
|
})()}
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<pre className="whitespace-pre-wrap">
|
||||||
|
{message.content}
|
||||||
|
</pre>
|
||||||
|
)}
|
||||||
|
</Code>
|
||||||
|
</div>
|
||||||
|
)
|
||||||
|
)}
|
||||||
|
</ScrollArea>
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<ScrollArea className="h-full w-full">
|
||||||
<Code className="whitespace-pre-wrap p-4 break-normal max-w-full">
|
<Code className="whitespace-pre-wrap p-4 break-normal max-w-full">
|
||||||
{currentPrompt.prompt}
|
{currentPrompt.prompt}
|
||||||
</Code>
|
</Code>
|
||||||
)}
|
|
||||||
</ScrollArea>
|
</ScrollArea>
|
||||||
|
)}
|
||||||
|
</>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</DialogContent>
|
</DialogContent>
|
||||||
@@ -280,8 +661,9 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
className="h-full bg-primary transition-all duration-500"
|
className="h-full bg-primary transition-all duration-500"
|
||||||
style={{
|
style={{
|
||||||
width: `${
|
width: `${
|
||||||
aiValidationProgress.progressPercent ??
|
aiValidationProgress.progressPercent !== undefined
|
||||||
Math.round((aiValidationProgress.step / 5) * 100)
|
? Math.round(aiValidationProgress.progressPercent)
|
||||||
|
: Math.round((aiValidationProgress.step / 5) * 100)
|
||||||
}%`,
|
}%`,
|
||||||
backgroundColor:
|
backgroundColor:
|
||||||
aiValidationProgress.step === -1
|
aiValidationProgress.step === -1
|
||||||
@@ -295,8 +677,9 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
{aiValidationProgress.step === -1
|
{aiValidationProgress.step === -1
|
||||||
? "❌"
|
? "❌"
|
||||||
: `${
|
: `${
|
||||||
aiValidationProgress.progressPercent ??
|
aiValidationProgress.progressPercent !== undefined
|
||||||
Math.round((aiValidationProgress.step / 5) * 100)
|
? Math.round(aiValidationProgress.progressPercent)
|
||||||
|
: Math.round((aiValidationProgress.step / 5) * 100)
|
||||||
}%`}
|
}%`}
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
@@ -355,14 +738,14 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
setAiValidationDetails((prev) => ({ ...prev, isOpen: open }))
|
setAiValidationDetails((prev) => ({ ...prev, isOpen: open }))
|
||||||
}
|
}
|
||||||
>
|
>
|
||||||
<DialogContent className="max-w-4xl">
|
<DialogContent className="max-w-6xl w-[90vw]">
|
||||||
<DialogHeader>
|
<DialogHeader>
|
||||||
<DialogTitle>AI Validation Results</DialogTitle>
|
<DialogTitle>AI Validation Results</DialogTitle>
|
||||||
<DialogDescription>
|
<DialogDescription>
|
||||||
Review the changes and warnings suggested by the AI
|
Review the changes and warnings suggested by the AI
|
||||||
</DialogDescription>
|
</DialogDescription>
|
||||||
</DialogHeader>
|
</DialogHeader>
|
||||||
<ScrollArea className="max-h-[60vh]">
|
<ScrollArea className="max-h-[70vh]">
|
||||||
{aiValidationDetails.changeDetails &&
|
{aiValidationDetails.changeDetails &&
|
||||||
aiValidationDetails.changeDetails.length > 0 ? (
|
aiValidationDetails.changeDetails.length > 0 ? (
|
||||||
<div className="mb-6 space-y-6">
|
<div className="mb-6 space-y-6">
|
||||||
@@ -384,10 +767,16 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
<Table>
|
<Table>
|
||||||
<TableHeader>
|
<TableHeader>
|
||||||
<TableRow>
|
<TableRow>
|
||||||
<TableHead className="w-[180px]">Field</TableHead>
|
<TableHead className="">Field</TableHead>
|
||||||
<TableHead>Original Value</TableHead>
|
<TableHead className="w-[35%]">
|
||||||
<TableHead>Corrected Value</TableHead>
|
Original Value
|
||||||
<TableHead className="text-right">Action</TableHead>
|
</TableHead>
|
||||||
|
<TableHead className="w-[35%]">
|
||||||
|
Corrected Value
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="text-right">
|
||||||
|
Accept Changes?
|
||||||
|
</TableHead>
|
||||||
</TableRow>
|
</TableRow>
|
||||||
</TableHeader>
|
</TableHeader>
|
||||||
<TableBody>
|
<TableBody>
|
||||||
@@ -398,7 +787,7 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
const fieldLabel = field
|
const fieldLabel = field
|
||||||
? field.label
|
? field.label
|
||||||
: change.field;
|
: change.field;
|
||||||
const isReverted = isChangeReverted(
|
const isReverted = isChangeLocallyReverted(
|
||||||
product.productIndex,
|
product.productIndex,
|
||||||
change.field
|
change.field
|
||||||
);
|
);
|
||||||
@@ -421,7 +810,6 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
dangerouslySetInnerHTML={{
|
dangerouslySetInnerHTML={{
|
||||||
__html: originalHtml,
|
__html: originalHtml,
|
||||||
}}
|
}}
|
||||||
className={isReverted ? "font-medium" : ""}
|
|
||||||
/>
|
/>
|
||||||
</TableCell>
|
</TableCell>
|
||||||
<TableCell>
|
<TableCell>
|
||||||
@@ -429,36 +817,46 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
|
|||||||
dangerouslySetInnerHTML={{
|
dangerouslySetInnerHTML={{
|
||||||
__html: correctedHtml,
|
__html: correctedHtml,
|
||||||
}}
|
}}
|
||||||
className={!isReverted ? "font-medium" : ""}
|
|
||||||
/>
|
/>
|
||||||
</TableCell>
|
</TableCell>
|
||||||
<TableCell className="text-right">
|
<TableCell className="text-right align-top">
|
||||||
<div className="mt-2">
|
<div className="flex justify-end gap-2">
|
||||||
{isReverted ? (
|
|
||||||
<Button
|
|
||||||
variant="ghost"
|
|
||||||
size="sm"
|
|
||||||
className="text-green-600 bg-green-50 hover:bg-green-100 hover:text-green-700"
|
|
||||||
disabled
|
|
||||||
>
|
|
||||||
<CheckIcon className="w-4 h-4 mr-1" />
|
|
||||||
Reverted
|
|
||||||
</Button>
|
|
||||||
) : (
|
|
||||||
<Button
|
<Button
|
||||||
variant="outline"
|
variant="outline"
|
||||||
size="sm"
|
size="sm"
|
||||||
onClick={() => {
|
onClick={() => {
|
||||||
// Call the revert function directly
|
// Toggle to Accepted state if currently rejected
|
||||||
revertAiChange(
|
toggleChangeAcceptance(
|
||||||
product.productIndex,
|
product.productIndex,
|
||||||
change.field
|
change.field
|
||||||
);
|
);
|
||||||
}}
|
}}
|
||||||
|
className={
|
||||||
|
!isReverted
|
||||||
|
? "bg-green-100 text-green-600 border-green-300 flex items-center"
|
||||||
|
: "border-gray-200 text-gray-600 hover:bg-green-50 hover:text-green-600 hover:border-green-200 flex items-center"
|
||||||
|
}
|
||||||
>
|
>
|
||||||
Revert Change
|
<CheckIcon className="h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
variant="outline"
|
||||||
|
size="sm"
|
||||||
|
onClick={() => {
|
||||||
|
// Toggle to Rejected state if currently accepted
|
||||||
|
toggleChangeAcceptance(
|
||||||
|
product.productIndex,
|
||||||
|
change.field
|
||||||
|
);
|
||||||
|
}}
|
||||||
|
className={
|
||||||
|
isReverted
|
||||||
|
? "bg-red-100 text-red-600 border-red-300 flex items-center"
|
||||||
|
: "border-gray-200 text-gray-600 hover:bg-red-50 hover:text-red-600 hover:border-red-200 flex items-center"
|
||||||
|
}
|
||||||
|
>
|
||||||
|
<XIcon className="h-4 w-4" />
|
||||||
</Button>
|
</Button>
|
||||||
)}
|
|
||||||
</div>
|
</div>
|
||||||
</TableCell>
|
</TableCell>
|
||||||
</TableRow>
|
</TableRow>
|
||||||
|
|||||||
@@ -56,6 +56,20 @@ export interface CurrentPrompt {
|
|||||||
basePrompt: string;
|
basePrompt: string;
|
||||||
sampleFullPrompt: string;
|
sampleFullPrompt: string;
|
||||||
promptLength: number;
|
promptLength: number;
|
||||||
|
apiFormat?: Array<{
|
||||||
|
role: string;
|
||||||
|
content: string;
|
||||||
|
}>;
|
||||||
|
promptSources?: {
|
||||||
|
systemPrompt?: { id: number; prompt_text: string };
|
||||||
|
generalPrompt?: { id: number; prompt_text: string };
|
||||||
|
companyPrompts?: Array<{
|
||||||
|
id: number;
|
||||||
|
company: string;
|
||||||
|
companyName: string;
|
||||||
|
prompt_text: string
|
||||||
|
}>;
|
||||||
|
};
|
||||||
estimatedProcessingTime?: {
|
estimatedProcessingTime?: {
|
||||||
seconds: number | null;
|
seconds: number | null;
|
||||||
sampleCount: number;
|
sampleCount: number;
|
||||||
@@ -323,7 +337,9 @@ export const useAiValidation = <T extends string>(
|
|||||||
basePrompt: result.basePrompt || '',
|
basePrompt: result.basePrompt || '',
|
||||||
sampleFullPrompt: result.sampleFullPrompt || '',
|
sampleFullPrompt: result.sampleFullPrompt || '',
|
||||||
promptLength: result.promptLength || (promptContent ? promptContent.length : 0),
|
promptLength: result.promptLength || (promptContent ? promptContent.length : 0),
|
||||||
estimatedProcessingTime: result.estimatedProcessingTime
|
promptSources: result.promptSources,
|
||||||
|
estimatedProcessingTime: result.estimatedProcessingTime,
|
||||||
|
apiFormat: result.apiFormat
|
||||||
}
|
}
|
||||||
}));
|
}));
|
||||||
} else {
|
} else {
|
||||||
@@ -490,6 +506,27 @@ export const useAiValidation = <T extends string>(
|
|||||||
throw new Error(result.error || 'AI validation failed');
|
throw new Error(result.error || 'AI validation failed');
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Store the prompt sources if they exist
|
||||||
|
if (result.promptSources) {
|
||||||
|
setCurrentPrompt(prev => {
|
||||||
|
// Create debugData if it doesn't exist
|
||||||
|
const prevDebugData = prev.debugData || {
|
||||||
|
taxonomyStats: null,
|
||||||
|
basePrompt: '',
|
||||||
|
sampleFullPrompt: '',
|
||||||
|
promptLength: 0
|
||||||
|
};
|
||||||
|
|
||||||
|
return {
|
||||||
|
...prev,
|
||||||
|
debugData: {
|
||||||
|
...prevDebugData,
|
||||||
|
promptSources: result.promptSources
|
||||||
|
}
|
||||||
|
};
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
// Update progress with actual processing time if available
|
// Update progress with actual processing time if available
|
||||||
if (result.performanceMetrics) {
|
if (result.performanceMetrics) {
|
||||||
console.log('Performance metrics:', result.performanceMetrics);
|
console.log('Performance metrics:', result.performanceMetrics);
|
||||||
|
|||||||
584
inventory/src/components/settings/PromptManagement.tsx
Normal file
584
inventory/src/components/settings/PromptManagement.tsx
Normal file
@@ -0,0 +1,584 @@
|
|||||||
|
import { useState, useMemo } from "react";
|
||||||
|
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
|
||||||
|
import { Button } from "@/components/ui/button";
|
||||||
|
import {
|
||||||
|
Table,
|
||||||
|
TableBody,
|
||||||
|
TableCell,
|
||||||
|
TableHead,
|
||||||
|
TableHeader,
|
||||||
|
TableRow,
|
||||||
|
} from "@/components/ui/table";
|
||||||
|
import { Input } from "@/components/ui/input";
|
||||||
|
import { Textarea } from "@/components/ui/textarea";
|
||||||
|
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
|
||||||
|
import { Label } from "@/components/ui/label";
|
||||||
|
import { ArrowUpDown, Pencil, Trash2, PlusCircle } from "lucide-react";
|
||||||
|
import config from "@/config";
|
||||||
|
import {
|
||||||
|
useReactTable,
|
||||||
|
getCoreRowModel,
|
||||||
|
getSortedRowModel,
|
||||||
|
SortingState,
|
||||||
|
flexRender,
|
||||||
|
type ColumnDef,
|
||||||
|
} from "@tanstack/react-table";
|
||||||
|
import {
|
||||||
|
Dialog,
|
||||||
|
DialogContent,
|
||||||
|
DialogDescription,
|
||||||
|
DialogFooter,
|
||||||
|
DialogHeader,
|
||||||
|
DialogTitle,
|
||||||
|
} from "@/components/ui/dialog";
|
||||||
|
import {
|
||||||
|
AlertDialog,
|
||||||
|
AlertDialogAction,
|
||||||
|
AlertDialogCancel,
|
||||||
|
AlertDialogContent,
|
||||||
|
AlertDialogDescription,
|
||||||
|
AlertDialogFooter,
|
||||||
|
AlertDialogHeader,
|
||||||
|
AlertDialogTitle,
|
||||||
|
} from "@/components/ui/alert-dialog";
|
||||||
|
import { toast } from "sonner";
|
||||||
|
|
||||||
|
interface FieldOption {
|
||||||
|
label: string;
|
||||||
|
value: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface PromptFormData {
|
||||||
|
id?: number;
|
||||||
|
prompt_text: string;
|
||||||
|
prompt_type: 'general' | 'company_specific' | 'system';
|
||||||
|
company: string | null;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface AiPrompt {
|
||||||
|
id: number;
|
||||||
|
prompt_text: string;
|
||||||
|
prompt_type: 'general' | 'company_specific' | 'system';
|
||||||
|
company: string | null;
|
||||||
|
created_at: string;
|
||||||
|
updated_at: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface FieldOptions {
|
||||||
|
companies: FieldOption[];
|
||||||
|
}
|
||||||
|
|
||||||
|
export function PromptManagement() {
|
||||||
|
const [isFormOpen, setIsFormOpen] = useState(false);
|
||||||
|
const [isDeleteOpen, setIsDeleteOpen] = useState(false);
|
||||||
|
const [promptToDelete, setPromptToDelete] = useState<AiPrompt | null>(null);
|
||||||
|
const [editingPrompt, setEditingPrompt] = useState<AiPrompt | null>(null);
|
||||||
|
const [sorting, setSorting] = useState<SortingState>([
|
||||||
|
{ id: "prompt_type", desc: true },
|
||||||
|
{ id: "company", desc: false }
|
||||||
|
]);
|
||||||
|
const [searchQuery, setSearchQuery] = useState("");
|
||||||
|
const [formData, setFormData] = useState<PromptFormData>({
|
||||||
|
prompt_text: "",
|
||||||
|
prompt_type: "general",
|
||||||
|
company: null,
|
||||||
|
});
|
||||||
|
|
||||||
|
const queryClient = useQueryClient();
|
||||||
|
|
||||||
|
const { data: prompts, isLoading } = useQuery<AiPrompt[]>({
|
||||||
|
queryKey: ["ai-prompts"],
|
||||||
|
queryFn: async () => {
|
||||||
|
const response = await fetch(`${config.apiUrl}/ai-prompts`);
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error("Failed to fetch AI prompts");
|
||||||
|
}
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const { data: fieldOptions } = useQuery<FieldOptions>({
|
||||||
|
queryKey: ["fieldOptions"],
|
||||||
|
queryFn: async () => {
|
||||||
|
const response = await fetch(`${config.apiUrl}/import/field-options`);
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error("Failed to fetch field options");
|
||||||
|
}
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
// Check if general and system prompts already exist
|
||||||
|
const generalPromptExists = useMemo(() => {
|
||||||
|
return prompts?.some(prompt => prompt.prompt_type === 'general');
|
||||||
|
}, [prompts]);
|
||||||
|
|
||||||
|
const systemPromptExists = useMemo(() => {
|
||||||
|
return prompts?.some(prompt => prompt.prompt_type === 'system');
|
||||||
|
}, [prompts]);
|
||||||
|
|
||||||
|
const createMutation = useMutation({
|
||||||
|
mutationFn: async (data: PromptFormData) => {
|
||||||
|
const response = await fetch(`${config.apiUrl}/ai-prompts`, {
|
||||||
|
method: "POST",
|
||||||
|
headers: {
|
||||||
|
"Content-Type": "application/json",
|
||||||
|
},
|
||||||
|
body: JSON.stringify(data),
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
const error = await response.json();
|
||||||
|
throw new Error(error.message || error.error || "Failed to create prompt");
|
||||||
|
}
|
||||||
|
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
onSuccess: () => {
|
||||||
|
queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
|
||||||
|
toast.success("Prompt created successfully");
|
||||||
|
resetForm();
|
||||||
|
},
|
||||||
|
onError: (error) => {
|
||||||
|
toast.error(error instanceof Error ? error.message : "Failed to create prompt");
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const updateMutation = useMutation({
|
||||||
|
mutationFn: async (data: PromptFormData) => {
|
||||||
|
if (!data.id) throw new Error("Prompt ID is required for update");
|
||||||
|
|
||||||
|
const response = await fetch(`${config.apiUrl}/ai-prompts/${data.id}`, {
|
||||||
|
method: "PUT",
|
||||||
|
headers: {
|
||||||
|
"Content-Type": "application/json",
|
||||||
|
},
|
||||||
|
body: JSON.stringify(data),
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
const error = await response.json();
|
||||||
|
throw new Error(error.message || error.error || "Failed to update prompt");
|
||||||
|
}
|
||||||
|
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
onSuccess: () => {
|
||||||
|
queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
|
||||||
|
toast.success("Prompt updated successfully");
|
||||||
|
resetForm();
|
||||||
|
},
|
||||||
|
onError: (error) => {
|
||||||
|
toast.error(error instanceof Error ? error.message : "Failed to update prompt");
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const deleteMutation = useMutation({
|
||||||
|
mutationFn: async (id: number) => {
|
||||||
|
const response = await fetch(`${config.apiUrl}/ai-prompts/${id}`, {
|
||||||
|
method: "DELETE",
|
||||||
|
});
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error("Failed to delete prompt");
|
||||||
|
}
|
||||||
|
},
|
||||||
|
onSuccess: () => {
|
||||||
|
queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
|
||||||
|
toast.success("Prompt deleted successfully");
|
||||||
|
},
|
||||||
|
onError: (error) => {
|
||||||
|
toast.error(error instanceof Error ? error.message : "Failed to delete prompt");
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const handleEdit = (prompt: AiPrompt) => {
|
||||||
|
setEditingPrompt(prompt);
|
||||||
|
setFormData({
|
||||||
|
id: prompt.id,
|
||||||
|
prompt_text: prompt.prompt_text,
|
||||||
|
prompt_type: prompt.prompt_type,
|
||||||
|
company: prompt.company,
|
||||||
|
});
|
||||||
|
setIsFormOpen(true);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleDeleteClick = (prompt: AiPrompt) => {
|
||||||
|
setPromptToDelete(prompt);
|
||||||
|
setIsDeleteOpen(true);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleDeleteConfirm = () => {
|
||||||
|
if (promptToDelete) {
|
||||||
|
deleteMutation.mutate(promptToDelete.id);
|
||||||
|
setIsDeleteOpen(false);
|
||||||
|
setPromptToDelete(null);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleSubmit = (e: React.FormEvent) => {
|
||||||
|
e.preventDefault();
|
||||||
|
|
||||||
|
// If prompt_type is general or system, ensure company is null
|
||||||
|
const submitData = {
|
||||||
|
...formData,
|
||||||
|
company: formData.prompt_type === 'company_specific' ? formData.company : null,
|
||||||
|
};
|
||||||
|
|
||||||
|
if (editingPrompt) {
|
||||||
|
updateMutation.mutate(submitData);
|
||||||
|
} else {
|
||||||
|
createMutation.mutate(submitData);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const resetForm = () => {
|
||||||
|
setFormData({
|
||||||
|
prompt_text: "",
|
||||||
|
prompt_type: "general",
|
||||||
|
company: null,
|
||||||
|
});
|
||||||
|
setEditingPrompt(null);
|
||||||
|
setIsFormOpen(false);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleCreateClick = () => {
|
||||||
|
resetForm();
|
||||||
|
|
||||||
|
// If general prompt and system prompt exist, default to company-specific
|
||||||
|
if (generalPromptExists && systemPromptExists) {
|
||||||
|
setFormData(prev => ({
|
||||||
|
...prev,
|
||||||
|
prompt_type: 'company_specific'
|
||||||
|
}));
|
||||||
|
} else if (generalPromptExists && !systemPromptExists) {
|
||||||
|
// If general exists but system doesn't, suggest system prompt
|
||||||
|
setFormData(prev => ({
|
||||||
|
...prev,
|
||||||
|
prompt_type: 'system'
|
||||||
|
}));
|
||||||
|
} else if (!generalPromptExists) {
|
||||||
|
// If no general prompt, suggest that first
|
||||||
|
setFormData(prev => ({
|
||||||
|
...prev,
|
||||||
|
prompt_type: 'general'
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
setIsFormOpen(true);
|
||||||
|
};
|
||||||
|
|
||||||
|
const columns = useMemo<ColumnDef<AiPrompt>[]>(() => [
|
||||||
|
{
|
||||||
|
accessorKey: "prompt_type",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Type
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
cell: ({ row }) => {
|
||||||
|
const type = row.getValue("prompt_type") as string;
|
||||||
|
if (type === 'general') return 'General';
|
||||||
|
if (type === 'system') return 'System';
|
||||||
|
return 'Company Specific';
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
accessorFn: (row) => row.prompt_text.length,
|
||||||
|
id: "length",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Length
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
cell: ({ getValue }) => {
|
||||||
|
const length = getValue() as number;
|
||||||
|
return <span>{length.toLocaleString()}</span>;
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
accessorKey: "company",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Company
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
cell: ({ row }) => {
|
||||||
|
const companyId = row.getValue("company");
|
||||||
|
if (!companyId) return 'N/A';
|
||||||
|
return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
accessorKey: "updated_at",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Last Updated
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
cell: ({ row }) => new Date(row.getValue("updated_at")).toLocaleDateString(),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: "actions",
|
||||||
|
cell: ({ row }) => (
|
||||||
|
<div className="flex gap-2 justify-end pr-4">
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => handleEdit(row.original)}
|
||||||
|
>
|
||||||
|
<Pencil className="h-4 w-4" />
|
||||||
|
Edit
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
className="text-destructive hover:text-destructive"
|
||||||
|
onClick={() => handleDeleteClick(row.original)}
|
||||||
|
>
|
||||||
|
<Trash2 className="h-4 w-4" />
|
||||||
|
Delete
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
),
|
||||||
|
},
|
||||||
|
], [fieldOptions]);
|
||||||
|
|
||||||
|
const filteredData = useMemo(() => {
|
||||||
|
if (!prompts) return [];
|
||||||
|
return prompts.filter((prompt) => {
|
||||||
|
const searchString = searchQuery.toLowerCase();
|
||||||
|
return (
|
||||||
|
prompt.prompt_type.toLowerCase().includes(searchString) ||
|
||||||
|
(prompt.company && prompt.company.toLowerCase().includes(searchString))
|
||||||
|
);
|
||||||
|
});
|
||||||
|
}, [prompts, searchQuery]);
|
||||||
|
|
||||||
|
const table = useReactTable({
|
||||||
|
data: filteredData,
|
||||||
|
columns,
|
||||||
|
state: {
|
||||||
|
sorting,
|
||||||
|
},
|
||||||
|
onSortingChange: setSorting,
|
||||||
|
getSortedRowModel: getSortedRowModel(),
|
||||||
|
getCoreRowModel: getCoreRowModel(),
|
||||||
|
});
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="space-y-6">
|
||||||
|
<div className="flex items-center justify-between">
|
||||||
|
<h2 className="text-2xl font-bold">AI Validation Prompts</h2>
|
||||||
|
<Button onClick={handleCreateClick}>
|
||||||
|
<PlusCircle className="mr-2 h-4 w-4" />
|
||||||
|
Create New Prompt
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center gap-4">
|
||||||
|
<Input
|
||||||
|
placeholder="Search prompts..."
|
||||||
|
value={searchQuery}
|
||||||
|
onChange={(e) => setSearchQuery(e.target.value)}
|
||||||
|
className="max-w-sm"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{isLoading ? (
|
||||||
|
<div>Loading prompts...</div>
|
||||||
|
) : (
|
||||||
|
<div className="border rounded-lg">
|
||||||
|
<Table>
|
||||||
|
<TableHeader className="bg-muted">
|
||||||
|
{table.getHeaderGroups().map((headerGroup) => (
|
||||||
|
<TableRow key={headerGroup.id}>
|
||||||
|
{headerGroup.headers.map((header) => (
|
||||||
|
<TableHead key={header.id}>
|
||||||
|
{header.isPlaceholder
|
||||||
|
? null
|
||||||
|
: flexRender(
|
||||||
|
header.column.columnDef.header,
|
||||||
|
header.getContext()
|
||||||
|
)}
|
||||||
|
</TableHead>
|
||||||
|
))}
|
||||||
|
</TableRow>
|
||||||
|
))}
|
||||||
|
</TableHeader>
|
||||||
|
<TableBody>
|
||||||
|
{table.getRowModel().rows?.length ? (
|
||||||
|
table.getRowModel().rows.map((row) => (
|
||||||
|
<TableRow key={row.id} className="hover:bg-gray-100">
|
||||||
|
{row.getVisibleCells().map((cell) => (
|
||||||
|
<TableCell key={cell.id} className="pl-6">
|
||||||
|
{flexRender(cell.column.columnDef.cell, cell.getContext())}
|
||||||
|
</TableCell>
|
||||||
|
))}
|
||||||
|
</TableRow>
|
||||||
|
))
|
||||||
|
) : (
|
||||||
|
<TableRow>
|
||||||
|
<TableCell colSpan={columns.length} className="text-center">
|
||||||
|
No prompts found
|
||||||
|
</TableCell>
|
||||||
|
</TableRow>
|
||||||
|
)}
|
||||||
|
</TableBody>
|
||||||
|
</Table>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Prompt Form Dialog */}
|
||||||
|
<Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
|
||||||
|
<DialogContent className="max-w-3xl">
|
||||||
|
<DialogHeader>
|
||||||
|
<DialogTitle>{editingPrompt ? "Edit Prompt" : "Create New Prompt"}</DialogTitle>
|
||||||
|
<DialogDescription>
|
||||||
|
{editingPrompt
|
||||||
|
? "Update this AI validation prompt."
|
||||||
|
: "Create a new AI validation prompt that will be used during product validation."}
|
||||||
|
</DialogDescription>
|
||||||
|
</DialogHeader>
|
||||||
|
|
||||||
|
<form onSubmit={handleSubmit}>
|
||||||
|
<div className="grid gap-4 py-4">
|
||||||
|
<div className="grid gap-2">
|
||||||
|
<Label htmlFor="prompt_type">Prompt Type</Label>
|
||||||
|
<Select
|
||||||
|
value={formData.prompt_type}
|
||||||
|
onValueChange={(value: 'general' | 'company_specific' | 'system') =>
|
||||||
|
setFormData({ ...formData, prompt_type: value })
|
||||||
|
}
|
||||||
|
disabled={(generalPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id) ||
|
||||||
|
(systemPromptExists && formData.prompt_type !== 'system' && !editingPrompt?.id)}
|
||||||
|
>
|
||||||
|
<SelectTrigger>
|
||||||
|
<SelectValue placeholder="Select prompt type" />
|
||||||
|
</SelectTrigger>
|
||||||
|
<SelectContent>
|
||||||
|
<SelectItem
|
||||||
|
value="general"
|
||||||
|
disabled={generalPromptExists && !editingPrompt?.prompt_type?.includes('general')}
|
||||||
|
>
|
||||||
|
General
|
||||||
|
</SelectItem>
|
||||||
|
<SelectItem
|
||||||
|
value="system"
|
||||||
|
disabled={systemPromptExists && !editingPrompt?.prompt_type?.includes('system')}
|
||||||
|
>
|
||||||
|
System
|
||||||
|
</SelectItem>
|
||||||
|
<SelectItem value="company_specific">Company Specific</SelectItem>
|
||||||
|
</SelectContent>
|
||||||
|
</Select>
|
||||||
|
{generalPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id && systemPromptExists && formData.prompt_type !== 'system' && (
|
||||||
|
<p className="text-xs text-muted-foreground">
|
||||||
|
General and system prompts already exist. You can only create company-specific prompts.
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
{generalPromptExists && !systemPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id && (
|
||||||
|
<p className="text-xs text-muted-foreground">
|
||||||
|
A general prompt already exists. You can create a system prompt or company-specific prompts.
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
{systemPromptExists && !generalPromptExists && formData.prompt_type !== 'system' && !editingPrompt?.id && (
|
||||||
|
<p className="text-xs text-muted-foreground">
|
||||||
|
A system prompt already exists. You can create a general prompt or company-specific prompts.
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{formData.prompt_type === 'company_specific' && (
|
||||||
|
<div className="grid gap-2">
|
||||||
|
<Label htmlFor="company">Company</Label>
|
||||||
|
<Select
|
||||||
|
value={formData.company || ''}
|
||||||
|
onValueChange={(value) => setFormData({ ...formData, company: value })}
|
||||||
|
required={formData.prompt_type === 'company_specific'}
|
||||||
|
>
|
||||||
|
<SelectTrigger>
|
||||||
|
<SelectValue placeholder="Select company" />
|
||||||
|
</SelectTrigger>
|
||||||
|
<SelectContent>
|
||||||
|
{fieldOptions?.companies.map((company) => (
|
||||||
|
<SelectItem key={company.value} value={company.value}>
|
||||||
|
{company.label}
|
||||||
|
</SelectItem>
|
||||||
|
))}
|
||||||
|
</SelectContent>
|
||||||
|
</Select>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<div className="grid gap-2">
|
||||||
|
<Label htmlFor="prompt_text">Prompt Text</Label>
|
||||||
|
<Textarea
|
||||||
|
id="prompt_text"
|
||||||
|
value={formData.prompt_text}
|
||||||
|
onChange={(e) => setFormData({ ...formData, prompt_text: e.target.value })}
|
||||||
|
placeholder={`Enter your ${formData.prompt_type === 'system' ? 'system instructions' : 'validation prompt'} text...`}
|
||||||
|
className="h-80 font-mono text-sm"
|
||||||
|
required
|
||||||
|
/>
|
||||||
|
{formData.prompt_type === 'system' && (
|
||||||
|
<p className="text-xs text-muted-foreground mt-1">
|
||||||
|
System prompts provide the initial instructions to the AI. This sets the tone and approach for all validations.
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<DialogFooter>
|
||||||
|
<Button type="button" variant="outline" onClick={() => {
|
||||||
|
resetForm();
|
||||||
|
setIsFormOpen(false);
|
||||||
|
}}>
|
||||||
|
Cancel
|
||||||
|
</Button>
|
||||||
|
<Button type="submit">
|
||||||
|
{editingPrompt ? "Update" : "Create"} Prompt
|
||||||
|
</Button>
|
||||||
|
</DialogFooter>
|
||||||
|
</form>
|
||||||
|
</DialogContent>
|
||||||
|
</Dialog>
|
||||||
|
|
||||||
|
{/* Delete Confirmation Dialog */}
|
||||||
|
<AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
|
||||||
|
<AlertDialogContent>
|
||||||
|
<AlertDialogHeader>
|
||||||
|
<AlertDialogTitle>Delete Prompt</AlertDialogTitle>
|
||||||
|
<AlertDialogDescription>
|
||||||
|
Are you sure you want to delete this prompt? This action cannot be undone.
|
||||||
|
</AlertDialogDescription>
|
||||||
|
</AlertDialogHeader>
|
||||||
|
<AlertDialogFooter>
|
||||||
|
<AlertDialogCancel onClick={() => {
|
||||||
|
setIsDeleteOpen(false);
|
||||||
|
setPromptToDelete(null);
|
||||||
|
}}>
|
||||||
|
Cancel
|
||||||
|
</AlertDialogCancel>
|
||||||
|
<AlertDialogAction onClick={handleDeleteConfirm}>
|
||||||
|
Delete
|
||||||
|
</AlertDialogAction>
|
||||||
|
</AlertDialogFooter>
|
||||||
|
</AlertDialogContent>
|
||||||
|
</AlertDialog>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
773
inventory/src/components/settings/ReusableImageManagement.tsx
Normal file
773
inventory/src/components/settings/ReusableImageManagement.tsx
Normal file
@@ -0,0 +1,773 @@
|
|||||||
|
import { useState, useMemo, useCallback, useRef, useEffect } from "react";
|
||||||
|
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
|
||||||
|
import { Button } from "@/components/ui/button";
|
||||||
|
import {
|
||||||
|
Table,
|
||||||
|
TableBody,
|
||||||
|
TableCell,
|
||||||
|
TableHead,
|
||||||
|
TableHeader,
|
||||||
|
TableRow,
|
||||||
|
} from "@/components/ui/table";
|
||||||
|
import { Input } from "@/components/ui/input";
|
||||||
|
import { Label } from "@/components/ui/label";
|
||||||
|
import { Checkbox } from "@/components/ui/checkbox";
|
||||||
|
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
|
||||||
|
import { ArrowUpDown, Pencil, Trash2, PlusCircle, Image, Eye } from "lucide-react";
|
||||||
|
import config from "@/config";
|
||||||
|
import {
|
||||||
|
useReactTable,
|
||||||
|
getCoreRowModel,
|
||||||
|
getSortedRowModel,
|
||||||
|
SortingState,
|
||||||
|
flexRender,
|
||||||
|
type ColumnDef,
|
||||||
|
} from "@tanstack/react-table";
|
||||||
|
import {
|
||||||
|
Dialog,
|
||||||
|
DialogContent,
|
||||||
|
DialogDescription,
|
||||||
|
DialogFooter,
|
||||||
|
DialogHeader,
|
||||||
|
DialogTitle,
|
||||||
|
DialogClose
|
||||||
|
} from "@/components/ui/dialog";
|
||||||
|
import {
|
||||||
|
AlertDialog,
|
||||||
|
AlertDialogAction,
|
||||||
|
AlertDialogCancel,
|
||||||
|
AlertDialogContent,
|
||||||
|
AlertDialogDescription,
|
||||||
|
AlertDialogFooter,
|
||||||
|
AlertDialogHeader,
|
||||||
|
AlertDialogTitle,
|
||||||
|
} from "@/components/ui/alert-dialog";
|
||||||
|
import { toast } from "sonner";
|
||||||
|
import { useDropzone } from "react-dropzone";
|
||||||
|
import { cn } from "@/lib/utils";
|
||||||
|
|
||||||
|
interface FieldOption {
|
||||||
|
label: string;
|
||||||
|
value: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface ImageFormData {
|
||||||
|
id?: number;
|
||||||
|
name: string;
|
||||||
|
is_global: boolean;
|
||||||
|
company: string | null;
|
||||||
|
file?: File;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface ReusableImage {
|
||||||
|
id: number;
|
||||||
|
name: string;
|
||||||
|
filename: string;
|
||||||
|
file_path: string;
|
||||||
|
image_url: string;
|
||||||
|
is_global: boolean;
|
||||||
|
company: string | null;
|
||||||
|
mime_type: string;
|
||||||
|
file_size: number;
|
||||||
|
created_at: string;
|
||||||
|
updated_at: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
interface FieldOptions {
|
||||||
|
companies: FieldOption[];
|
||||||
|
}
|
||||||
|
|
||||||
|
const ImageForm = ({
|
||||||
|
editingImage,
|
||||||
|
formData,
|
||||||
|
setFormData,
|
||||||
|
onSubmit,
|
||||||
|
onCancel,
|
||||||
|
fieldOptions,
|
||||||
|
getRootProps,
|
||||||
|
getInputProps,
|
||||||
|
isDragActive
|
||||||
|
}: {
|
||||||
|
editingImage: ReusableImage | null;
|
||||||
|
formData: ImageFormData;
|
||||||
|
setFormData: (data: ImageFormData) => void;
|
||||||
|
onSubmit: (e: React.FormEvent) => void;
|
||||||
|
onCancel: () => void;
|
||||||
|
fieldOptions: FieldOptions | undefined;
|
||||||
|
getRootProps: any;
|
||||||
|
getInputProps: any;
|
||||||
|
isDragActive: boolean;
|
||||||
|
}) => {
|
||||||
|
const handleNameChange = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
|
||||||
|
setFormData(prev => ({ ...prev, name: e.target.value }));
|
||||||
|
}, [setFormData]);
|
||||||
|
|
||||||
|
const handleGlobalChange = useCallback((checked: boolean) => {
|
||||||
|
setFormData(prev => ({
|
||||||
|
...prev,
|
||||||
|
is_global: checked,
|
||||||
|
company: checked ? null : prev.company
|
||||||
|
}));
|
||||||
|
}, [setFormData]);
|
||||||
|
|
||||||
|
const handleCompanyChange = useCallback((value: string) => {
|
||||||
|
setFormData(prev => ({ ...prev, company: value }));
|
||||||
|
}, [setFormData]);
|
||||||
|
|
||||||
|
return (
|
||||||
|
<form onSubmit={onSubmit}>
|
||||||
|
<div className="grid gap-4 py-4">
|
||||||
|
<div className="grid gap-2">
|
||||||
|
<Label htmlFor="image_name">Image Name</Label>
|
||||||
|
<Input
|
||||||
|
id="image_name"
|
||||||
|
name="image_name"
|
||||||
|
value={formData.name}
|
||||||
|
onChange={handleNameChange}
|
||||||
|
placeholder="Enter image name"
|
||||||
|
required
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{!editingImage && (
|
||||||
|
<div className="grid gap-2">
|
||||||
|
<Label htmlFor="image">Upload Image</Label>
|
||||||
|
<div
|
||||||
|
{...getRootProps()}
|
||||||
|
className={cn(
|
||||||
|
"border-2 border-dashed border-secondary-foreground/30 bg-muted/90 rounded-md w-full py-6 flex flex-col items-center justify-center cursor-pointer hover:bg-muted/70 transition-colors",
|
||||||
|
isDragActive && "border-primary bg-muted"
|
||||||
|
)}
|
||||||
|
>
|
||||||
|
<input {...getInputProps()} />
|
||||||
|
<div className="flex flex-col items-center justify-center py-2">
|
||||||
|
{formData.file ? (
|
||||||
|
<>
|
||||||
|
<div className="mb-4">
|
||||||
|
<ImagePreview file={formData.file} />
|
||||||
|
</div>
|
||||||
|
<div className="flex items-center gap-2 mb-2">
|
||||||
|
<Image className="h-4 w-4 text-primary" />
|
||||||
|
<span className="text-sm">{formData.file.name}</span>
|
||||||
|
</div>
|
||||||
|
<p className="text-xs text-muted-foreground">Click or drag to replace</p>
|
||||||
|
</>
|
||||||
|
) : isDragActive ? (
|
||||||
|
<>
|
||||||
|
<Image className="h-8 w-8 mb-2 text-primary" />
|
||||||
|
<p className="text-base text-muted-foreground">Drop image here</p>
|
||||||
|
</>
|
||||||
|
) : (
|
||||||
|
<>
|
||||||
|
<Image className="h-8 w-8 mb-2 text-muted-foreground" />
|
||||||
|
<p className="text-base text-muted-foreground">Click or drag to upload</p>
|
||||||
|
</>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<div className="flex items-center space-x-2">
|
||||||
|
<Checkbox
|
||||||
|
id="is_global"
|
||||||
|
checked={formData.is_global}
|
||||||
|
onCheckedChange={handleGlobalChange}
|
||||||
|
/>
|
||||||
|
<Label htmlFor="is_global">Available for all companies</Label>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{!formData.is_global && (
|
||||||
|
<div className="grid gap-2">
|
||||||
|
<Label htmlFor="company">Company</Label>
|
||||||
|
<Select
|
||||||
|
value={formData.company || ''}
|
||||||
|
onValueChange={handleCompanyChange}
|
||||||
|
required={!formData.is_global}
|
||||||
|
>
|
||||||
|
<SelectTrigger>
|
||||||
|
<SelectValue placeholder="Select company" />
|
||||||
|
</SelectTrigger>
|
||||||
|
<SelectContent>
|
||||||
|
{fieldOptions?.companies.map((company) => (
|
||||||
|
<SelectItem key={company.value} value={company.value}>
|
||||||
|
{company.label}
|
||||||
|
</SelectItem>
|
||||||
|
))}
|
||||||
|
</SelectContent>
|
||||||
|
</Select>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<DialogFooter>
|
||||||
|
<Button type="button" variant="outline" onClick={onCancel}>
|
||||||
|
Cancel
|
||||||
|
</Button>
|
||||||
|
<Button type="submit">
|
||||||
|
{editingImage ? "Update" : "Upload"} Image
|
||||||
|
</Button>
|
||||||
|
</DialogFooter>
|
||||||
|
</form>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
export function ReusableImageManagement() {
|
||||||
|
const [isFormOpen, setIsFormOpen] = useState(false);
|
||||||
|
const [isDeleteOpen, setIsDeleteOpen] = useState(false);
|
||||||
|
const [isPreviewOpen, setIsPreviewOpen] = useState(false);
|
||||||
|
const [imageToDelete, setImageToDelete] = useState<ReusableImage | null>(null);
|
||||||
|
const [previewImage, setPreviewImage] = useState<ReusableImage | null>(null);
|
||||||
|
const [editingImage, setEditingImage] = useState<ReusableImage | null>(null);
|
||||||
|
const [sorting, setSorting] = useState<SortingState>([
|
||||||
|
{ id: "created_at", desc: true }
|
||||||
|
]);
|
||||||
|
const [searchQuery, setSearchQuery] = useState("");
|
||||||
|
const [formData, setFormData] = useState<ImageFormData>({
|
||||||
|
name: "",
|
||||||
|
is_global: false,
|
||||||
|
company: null,
|
||||||
|
file: undefined
|
||||||
|
});
|
||||||
|
|
||||||
|
const queryClient = useQueryClient();
|
||||||
|
|
||||||
|
const { data: images, isLoading } = useQuery<ReusableImage[]>({
|
||||||
|
queryKey: ["reusable-images"],
|
||||||
|
queryFn: async () => {
|
||||||
|
const response = await fetch(`${config.apiUrl}/reusable-images`);
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error("Failed to fetch reusable images");
|
||||||
|
}
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const { data: fieldOptions } = useQuery<FieldOptions>({
|
||||||
|
queryKey: ["fieldOptions"],
|
||||||
|
queryFn: async () => {
|
||||||
|
const response = await fetch(`${config.apiUrl}/import/field-options`);
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error("Failed to fetch field options");
|
||||||
|
}
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const createMutation = useMutation({
|
||||||
|
mutationFn: async (data: ImageFormData) => {
|
||||||
|
// Create FormData for file upload
|
||||||
|
const formData = new FormData();
|
||||||
|
formData.append('name', data.name);
|
||||||
|
formData.append('is_global', String(data.is_global));
|
||||||
|
|
||||||
|
if (!data.is_global && data.company) {
|
||||||
|
formData.append('company', data.company);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (data.file) {
|
||||||
|
formData.append('image', data.file);
|
||||||
|
} else {
|
||||||
|
throw new Error("Image file is required");
|
||||||
|
}
|
||||||
|
|
||||||
|
const response = await fetch(`${config.apiUrl}/reusable-images/upload`, {
|
||||||
|
method: "POST",
|
||||||
|
body: formData,
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
const error = await response.json();
|
||||||
|
throw new Error(error.message || error.error || "Failed to upload image");
|
||||||
|
}
|
||||||
|
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
onSuccess: () => {
|
||||||
|
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
|
||||||
|
toast.success("Image uploaded successfully");
|
||||||
|
resetForm();
|
||||||
|
},
|
||||||
|
onError: (error) => {
|
||||||
|
toast.error(error instanceof Error ? error.message : "Failed to upload image");
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const updateMutation = useMutation({
|
||||||
|
mutationFn: async (data: ImageFormData) => {
|
||||||
|
if (!data.id) throw new Error("Image ID is required for update");
|
||||||
|
|
||||||
|
const response = await fetch(`${config.apiUrl}/reusable-images/${data.id}`, {
|
||||||
|
method: "PUT",
|
||||||
|
headers: {
|
||||||
|
"Content-Type": "application/json",
|
||||||
|
},
|
||||||
|
body: JSON.stringify({
|
||||||
|
name: data.name,
|
||||||
|
is_global: data.is_global,
|
||||||
|
company: data.is_global ? null : data.company
|
||||||
|
}),
|
||||||
|
});
|
||||||
|
|
||||||
|
if (!response.ok) {
|
||||||
|
const error = await response.json();
|
||||||
|
throw new Error(error.message || error.error || "Failed to update image");
|
||||||
|
}
|
||||||
|
|
||||||
|
return response.json();
|
||||||
|
},
|
||||||
|
onSuccess: () => {
|
||||||
|
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
|
||||||
|
toast.success("Image updated successfully");
|
||||||
|
resetForm();
|
||||||
|
},
|
||||||
|
onError: (error) => {
|
||||||
|
toast.error(error instanceof Error ? error.message : "Failed to update image");
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const deleteMutation = useMutation({
|
||||||
|
mutationFn: async (id: number) => {
|
||||||
|
const response = await fetch(`${config.apiUrl}/reusable-images/${id}`, {
|
||||||
|
method: "DELETE",
|
||||||
|
});
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error("Failed to delete image");
|
||||||
|
}
|
||||||
|
},
|
||||||
|
onSuccess: () => {
|
||||||
|
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
|
||||||
|
toast.success("Image deleted successfully");
|
||||||
|
},
|
||||||
|
onError: (error) => {
|
||||||
|
toast.error(error instanceof Error ? error.message : "Failed to delete image");
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const handleEdit = (image: ReusableImage) => {
|
||||||
|
setEditingImage(image);
|
||||||
|
setFormData({
|
||||||
|
id: image.id,
|
||||||
|
name: image.name,
|
||||||
|
is_global: image.is_global,
|
||||||
|
company: image.company,
|
||||||
|
});
|
||||||
|
setIsFormOpen(true);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleDeleteClick = (image: ReusableImage) => {
|
||||||
|
setImageToDelete(image);
|
||||||
|
setIsDeleteOpen(true);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handlePreview = (image: ReusableImage) => {
|
||||||
|
setPreviewImage(image);
|
||||||
|
setIsPreviewOpen(true);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleDeleteConfirm = () => {
|
||||||
|
if (imageToDelete) {
|
||||||
|
deleteMutation.mutate(imageToDelete.id);
|
||||||
|
setIsDeleteOpen(false);
|
||||||
|
setImageToDelete(null);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleSubmit = (e: React.FormEvent) => {
|
||||||
|
e.preventDefault();
|
||||||
|
|
||||||
|
// If is_global is true, ensure company is null
|
||||||
|
const submitData = {
|
||||||
|
...formData,
|
||||||
|
company: formData.is_global ? null : formData.company,
|
||||||
|
};
|
||||||
|
|
||||||
|
if (editingImage) {
|
||||||
|
updateMutation.mutate(submitData);
|
||||||
|
} else {
|
||||||
|
if (!submitData.file) {
|
||||||
|
toast.error("Please select an image file");
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
createMutation.mutate(submitData);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const resetForm = () => {
|
||||||
|
setFormData({
|
||||||
|
name: "",
|
||||||
|
is_global: false,
|
||||||
|
company: null,
|
||||||
|
file: undefined
|
||||||
|
});
|
||||||
|
setEditingImage(null);
|
||||||
|
setIsFormOpen(false);
|
||||||
|
};
|
||||||
|
|
||||||
|
const handleCreateClick = () => {
|
||||||
|
resetForm();
|
||||||
|
setIsFormOpen(true);
|
||||||
|
};
|
||||||
|
|
||||||
|
// Configure dropzone for image uploads
|
||||||
|
const onDrop = useCallback((acceptedFiles: File[]) => {
|
||||||
|
if (acceptedFiles.length > 0) {
|
||||||
|
const file = acceptedFiles[0]; // Take only the first file
|
||||||
|
setFormData(prev => ({
|
||||||
|
...prev,
|
||||||
|
file
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
}, []);
|
||||||
|
|
||||||
|
const { getRootProps, getInputProps, isDragActive } = useDropzone({
|
||||||
|
accept: {
|
||||||
|
'image/*': ['.jpeg', '.jpg', '.png', '.gif', '.webp']
|
||||||
|
},
|
||||||
|
onDrop,
|
||||||
|
multiple: false // Only accept single files
|
||||||
|
});
|
||||||
|
|
||||||
|
const columns = useMemo<ColumnDef<ReusableImage>[]>(() => [
|
||||||
|
{
|
||||||
|
accessorKey: "name",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Name
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
accessorKey: "is_global",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Type
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
cell: ({ row }) => {
|
||||||
|
const isGlobal = row.getValue("is_global") as boolean;
|
||||||
|
return isGlobal ? "Global" : "Company Specific";
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
accessorKey: "company",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Company
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
cell: ({ row }) => {
|
||||||
|
const isGlobal = row.getValue("is_global") as boolean;
|
||||||
|
if (isGlobal) return 'N/A';
|
||||||
|
|
||||||
|
const companyId = row.getValue("company");
|
||||||
|
if (!companyId) return 'None';
|
||||||
|
return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
accessorKey: "file_size",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Size
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
cell: ({ row }) => {
|
||||||
|
const size = row.getValue("file_size") as number;
|
||||||
|
return `${(size / 1024).toFixed(1)} KB`;
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
accessorKey: "created_at",
|
||||||
|
header: ({ column }) => (
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
|
||||||
|
>
|
||||||
|
Created
|
||||||
|
<ArrowUpDown className="ml-2 h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
),
|
||||||
|
cell: ({ row }) => new Date(row.getValue("created_at")).toLocaleDateString(),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
accessorKey: "image_url",
|
||||||
|
header: "Thumbnail",
|
||||||
|
cell: ({ row }) => (
|
||||||
|
<div className="flex items-center justify-center">
|
||||||
|
<img
|
||||||
|
src={row.getValue("image_url") as string}
|
||||||
|
alt={row.getValue("name") as string}
|
||||||
|
className="w-10 h-10 object-contain border rounded"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: "actions",
|
||||||
|
cell: ({ row }) => (
|
||||||
|
<div className="flex gap-2 justify-end">
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
size="icon"
|
||||||
|
onClick={() => handlePreview(row.original)}
|
||||||
|
title="Preview Image"
|
||||||
|
>
|
||||||
|
<Eye className="h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
size="icon"
|
||||||
|
onClick={() => handleEdit(row.original)}
|
||||||
|
title="Edit Image"
|
||||||
|
>
|
||||||
|
<Pencil className="h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
<Button
|
||||||
|
variant="ghost"
|
||||||
|
size="icon"
|
||||||
|
className="text-destructive hover:text-destructive"
|
||||||
|
onClick={() => handleDeleteClick(row.original)}
|
||||||
|
title="Delete Image"
|
||||||
|
>
|
||||||
|
<Trash2 className="h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
),
|
||||||
|
},
|
||||||
|
], [fieldOptions]);
|
||||||
|
|
||||||
|
const filteredData = useMemo(() => {
|
||||||
|
if (!images) return [];
|
||||||
|
return images.filter((image) => {
|
||||||
|
const searchString = searchQuery.toLowerCase();
|
||||||
|
return (
|
||||||
|
image.name.toLowerCase().includes(searchString) ||
|
||||||
|
(image.is_global ? "global" : "company").includes(searchString) ||
|
||||||
|
(image.company && image.company.toLowerCase().includes(searchString))
|
||||||
|
);
|
||||||
|
});
|
||||||
|
}, [images, searchQuery]);
|
||||||
|
|
||||||
|
const table = useReactTable({
|
||||||
|
data: filteredData,
|
||||||
|
columns,
|
||||||
|
state: {
|
||||||
|
sorting,
|
||||||
|
},
|
||||||
|
onSortingChange: setSorting,
|
||||||
|
getSortedRowModel: getSortedRowModel(),
|
||||||
|
getCoreRowModel: getCoreRowModel(),
|
||||||
|
});
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="space-y-6">
|
||||||
|
<div className="flex items-center justify-between">
|
||||||
|
<h2 className="text-2xl font-bold">Reusable Images</h2>
|
||||||
|
<Button onClick={handleCreateClick}>
|
||||||
|
<PlusCircle className="mr-2 h-4 w-4" />
|
||||||
|
Upload New Image
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center gap-4">
|
||||||
|
<Input
|
||||||
|
placeholder="Search images..."
|
||||||
|
value={searchQuery}
|
||||||
|
onChange={(e) => setSearchQuery(e.target.value)}
|
||||||
|
className="max-w-sm"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{isLoading ? (
|
||||||
|
<div>Loading images...</div>
|
||||||
|
) : (
|
||||||
|
<div className="border rounded-lg">
|
||||||
|
<Table>
|
||||||
|
<TableHeader className="bg-muted">
|
||||||
|
{table.getHeaderGroups().map((headerGroup) => (
|
||||||
|
<TableRow key={headerGroup.id}>
|
||||||
|
{headerGroup.headers.map((header) => (
|
||||||
|
<TableHead key={header.id}>
|
||||||
|
{header.isPlaceholder
|
||||||
|
? null
|
||||||
|
: flexRender(
|
||||||
|
header.column.columnDef.header,
|
||||||
|
header.getContext()
|
||||||
|
)}
|
||||||
|
</TableHead>
|
||||||
|
))}
|
||||||
|
</TableRow>
|
||||||
|
))}
|
||||||
|
</TableHeader>
|
||||||
|
<TableBody>
|
||||||
|
{table.getRowModel().rows?.length ? (
|
||||||
|
table.getRowModel().rows.map((row) => (
|
||||||
|
<TableRow key={row.id} className="hover:bg-gray-100">
|
||||||
|
{row.getVisibleCells().map((cell) => (
|
||||||
|
<TableCell key={cell.id} className="pl-6">
|
||||||
|
{flexRender(cell.column.columnDef.cell, cell.getContext())}
|
||||||
|
</TableCell>
|
||||||
|
))}
|
||||||
|
</TableRow>
|
||||||
|
))
|
||||||
|
) : (
|
||||||
|
<TableRow>
|
||||||
|
<TableCell colSpan={columns.length} className="text-center">
|
||||||
|
No images found
|
||||||
|
</TableCell>
|
||||||
|
</TableRow>
|
||||||
|
)}
|
||||||
|
</TableBody>
|
||||||
|
</Table>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Image Form Dialog */}
|
||||||
|
<Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
|
||||||
|
<DialogContent className="max-w-md">
|
||||||
|
<DialogHeader>
|
||||||
|
<DialogTitle>{editingImage ? "Edit Image" : "Upload New Image"}</DialogTitle>
|
||||||
|
<DialogDescription>
|
||||||
|
{editingImage
|
||||||
|
? "Update this reusable image's details."
|
||||||
|
: "Upload a new reusable image that can be used across products."}
|
||||||
|
</DialogDescription>
|
||||||
|
</DialogHeader>
|
||||||
|
|
||||||
|
<ImageForm
|
||||||
|
editingImage={editingImage}
|
||||||
|
formData={formData}
|
||||||
|
setFormData={setFormData}
|
||||||
|
onSubmit={handleSubmit}
|
||||||
|
onCancel={() => {
|
||||||
|
resetForm();
|
||||||
|
setIsFormOpen(false);
|
||||||
|
}}
|
||||||
|
fieldOptions={fieldOptions}
|
||||||
|
getRootProps={getRootProps}
|
||||||
|
getInputProps={getInputProps}
|
||||||
|
isDragActive={isDragActive}
|
||||||
|
/>
|
||||||
|
</DialogContent>
|
||||||
|
</Dialog>
|
||||||
|
|
||||||
|
{/* Delete Confirmation Dialog */}
|
||||||
|
<AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
|
||||||
|
<AlertDialogContent>
|
||||||
|
<AlertDialogHeader>
|
||||||
|
<AlertDialogTitle>Delete Image</AlertDialogTitle>
|
||||||
|
<AlertDialogDescription>
|
||||||
|
Are you sure you want to delete this image? This action cannot be undone.
|
||||||
|
</AlertDialogDescription>
|
||||||
|
</AlertDialogHeader>
|
||||||
|
<AlertDialogFooter>
|
||||||
|
<AlertDialogCancel onClick={() => {
|
||||||
|
setIsDeleteOpen(false);
|
||||||
|
setImageToDelete(null);
|
||||||
|
}}>
|
||||||
|
Cancel
|
||||||
|
</AlertDialogCancel>
|
||||||
|
<AlertDialogAction onClick={handleDeleteConfirm}>
|
||||||
|
Delete
|
||||||
|
</AlertDialogAction>
|
||||||
|
</AlertDialogFooter>
|
||||||
|
</AlertDialogContent>
|
||||||
|
</AlertDialog>
|
||||||
|
|
||||||
|
{/* Preview Dialog */}
|
||||||
|
<Dialog open={isPreviewOpen} onOpenChange={setIsPreviewOpen}>
|
||||||
|
<DialogContent className="max-w-3xl">
|
||||||
|
<DialogHeader>
|
||||||
|
<DialogTitle>{previewImage?.name}</DialogTitle>
|
||||||
|
<DialogDescription>
|
||||||
|
{previewImage?.is_global
|
||||||
|
? "Global image"
|
||||||
|
: `Company specific image for ${fieldOptions?.companies.find(c => c.value === previewImage?.company)?.label}`}
|
||||||
|
</DialogDescription>
|
||||||
|
</DialogHeader>
|
||||||
|
|
||||||
|
<div className="flex justify-center p-4">
|
||||||
|
{previewImage && (
|
||||||
|
<div className="bg-checkerboard rounded-md overflow-hidden">
|
||||||
|
<img
|
||||||
|
src={previewImage.image_url}
|
||||||
|
alt={previewImage.name}
|
||||||
|
className="max-h-[500px] max-w-full object-contain"
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="grid grid-cols-2 gap-4 text-sm">
|
||||||
|
<div>
|
||||||
|
<span className="font-medium">Filename:</span> {previewImage?.filename}
|
||||||
|
</div>
|
||||||
|
<div>
|
||||||
|
<span className="font-medium">Size:</span> {previewImage && `${(previewImage.file_size / 1024).toFixed(1)} KB`}
|
||||||
|
</div>
|
||||||
|
<div>
|
||||||
|
<span className="font-medium">Type:</span> {previewImage?.mime_type}
|
||||||
|
</div>
|
||||||
|
<div>
|
||||||
|
<span className="font-medium">Uploaded:</span> {previewImage && new Date(previewImage.created_at).toLocaleString()}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<DialogFooter>
|
||||||
|
<DialogClose asChild>
|
||||||
|
<Button>Close</Button>
|
||||||
|
</DialogClose>
|
||||||
|
</DialogFooter>
|
||||||
|
</DialogContent>
|
||||||
|
</Dialog>
|
||||||
|
|
||||||
|
<style jsx global>{`
|
||||||
|
.bg-checkerboard {
|
||||||
|
background-image: linear-gradient(45deg, #f0f0f0 25%, transparent 25%),
|
||||||
|
linear-gradient(-45deg, #f0f0f0 25%, transparent 25%),
|
||||||
|
linear-gradient(45deg, transparent 75%, #f0f0f0 75%),
|
||||||
|
linear-gradient(-45deg, transparent 75%, #f0f0f0 75%);
|
||||||
|
background-size: 20px 20px;
|
||||||
|
background-position: 0 0, 0 10px, 10px -10px, -10px 0px;
|
||||||
|
}
|
||||||
|
`}</style>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
const ImagePreview = ({ file }: { file: File }) => {
|
||||||
|
const [previewUrl, setPreviewUrl] = useState<string>('');
|
||||||
|
|
||||||
|
useEffect(() => {
|
||||||
|
const url = URL.createObjectURL(file);
|
||||||
|
setPreviewUrl(url);
|
||||||
|
return () => {
|
||||||
|
URL.revokeObjectURL(url);
|
||||||
|
};
|
||||||
|
}, [file]);
|
||||||
|
|
||||||
|
return (
|
||||||
|
<img
|
||||||
|
src={previewUrl}
|
||||||
|
alt="Preview"
|
||||||
|
className="max-h-32 max-w-full object-contain rounded-md"
|
||||||
|
/>
|
||||||
|
);
|
||||||
|
};
|
||||||
@@ -5,6 +5,8 @@ import { PerformanceMetrics } from "@/components/settings/PerformanceMetrics";
|
|||||||
import { CalculationSettings } from "@/components/settings/CalculationSettings";
|
import { CalculationSettings } from "@/components/settings/CalculationSettings";
|
||||||
import { TemplateManagement } from "@/components/settings/TemplateManagement";
|
import { TemplateManagement } from "@/components/settings/TemplateManagement";
|
||||||
import { UserManagement } from "@/components/settings/UserManagement";
|
import { UserManagement } from "@/components/settings/UserManagement";
|
||||||
|
import { PromptManagement } from "@/components/settings/PromptManagement";
|
||||||
|
import { ReusableImageManagement } from "@/components/settings/ReusableImageManagement";
|
||||||
import { motion } from 'framer-motion';
|
import { motion } from 'framer-motion';
|
||||||
import { Alert, AlertDescription } from "@/components/ui/alert";
|
import { Alert, AlertDescription } from "@/components/ui/alert";
|
||||||
import { Protected } from "@/components/auth/Protected";
|
import { Protected } from "@/components/auth/Protected";
|
||||||
@@ -41,6 +43,8 @@ const SETTINGS_GROUPS: SettingsGroup[] = [
|
|||||||
label: "Content Management",
|
label: "Content Management",
|
||||||
tabs: [
|
tabs: [
|
||||||
{ id: "templates", permission: "settings:templates", label: "Template Management" },
|
{ id: "templates", permission: "settings:templates", label: "Template Management" },
|
||||||
|
{ id: "ai-prompts", permission: "settings:prompt_management", label: "AI Prompts" },
|
||||||
|
{ id: "reusable-images", permission: "settings:library_management", label: "Reusable Images" },
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -216,6 +220,36 @@ export function Settings() {
|
|||||||
</Protected>
|
</Protected>
|
||||||
</TabsContent>
|
</TabsContent>
|
||||||
|
|
||||||
|
<TabsContent value="ai-prompts" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
||||||
|
<Protected
|
||||||
|
permission="settings:prompt_management"
|
||||||
|
fallback={
|
||||||
|
<Alert>
|
||||||
|
<AlertDescription>
|
||||||
|
You don't have permission to access AI Prompts.
|
||||||
|
</AlertDescription>
|
||||||
|
</Alert>
|
||||||
|
}
|
||||||
|
>
|
||||||
|
<PromptManagement />
|
||||||
|
</Protected>
|
||||||
|
</TabsContent>
|
||||||
|
|
||||||
|
<TabsContent value="reusable-images" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
||||||
|
<Protected
|
||||||
|
permission="settings:library_management"
|
||||||
|
fallback={
|
||||||
|
<Alert>
|
||||||
|
<AlertDescription>
|
||||||
|
You don't have permission to access Reusable Images.
|
||||||
|
</AlertDescription>
|
||||||
|
</Alert>
|
||||||
|
}
|
||||||
|
>
|
||||||
|
<ReusableImageManagement />
|
||||||
|
</Protected>
|
||||||
|
</TabsContent>
|
||||||
|
|
||||||
<TabsContent value="user-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
<TabsContent value="user-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
||||||
<Protected
|
<Protected
|
||||||
permission="settings:user_management"
|
permission="settings:user_management"
|
||||||
|
|||||||
Reference in New Issue
Block a user