17 Commits

Author SHA1 Message Date
e5c4f617c5 Get frontend pages loading data again, remove unused components 2025-03-26 21:47:24 -04:00
8e19e6cd74 Finish fixing calculate scripts 2025-03-26 14:22:08 -04:00
749907bd30 Start migrating and fixing calculate scripts 2025-03-26 01:19:44 -04:00
108181c63d Fix more import script bugs/missing data 2025-03-25 22:23:06 -04:00
5dd779cb4a Fix purchase orders import 2025-03-25 19:12:41 -04:00
7b0e792d03 Merge branch 'master' into move-to-postgresql 2025-03-25 12:15:07 -04:00
517bbe72f4 Add in image library feature 2025-03-25 12:14:36 -04:00
87d4b9e804 Fixes/improvements for import scripts 2025-03-24 22:27:44 -04:00
75da2c6772 Get all import scripts running again 2025-03-24 21:58:00 -04:00
00a02aa788 Enhance ai validation changes dialog 2025-03-24 14:17:02 -04:00
114018080a Enhance ai debug dialog 2025-03-24 13:25:18 -04:00
228ae8b2a9 Layout/style tweaks, remove text file prompts, integrate system prompt into database/settings 2025-03-24 12:26:21 -04:00
dd4b3f7145 Add prompts table and settings page to create/read/update/delete from it, incorporate company specific prompts into ai validation 2025-03-24 11:30:15 -04:00
7eb4077224 Clean up build errors 2025-03-23 22:15:11 -04:00
d60a8cbc6e Hide debug components without permission 2025-03-23 22:06:51 -04:00
1fcbf54989 Layout/style tweaks, fix performance metrics settings page 2025-03-23 22:01:41 -04:00
ce75496770 Clean up unused permissions, take user to first page/component they can access 2025-03-23 17:18:31 -04:00
63 changed files with 8590 additions and 3486 deletions

View File

@@ -0,0 +1,342 @@
# MySQL to PostgreSQL Import Process Documentation
This document outlines the data import process from the production MySQL database to the local PostgreSQL database, focusing on column mappings, data transformations, and the overall import architecture.
## Table of Contents
1. [Overview](#overview)
2. [Import Architecture](#import-architecture)
3. [Column Mappings](#column-mappings)
- [Categories](#categories)
- [Products](#products)
- [Product Categories (Relationship)](#product-categories-relationship)
- [Orders](#orders)
- [Purchase Orders](#purchase-orders)
- [Metadata Tables](#metadata-tables)
4. [Special Calculations](#special-calculations)
5. [Implementation Notes](#implementation-notes)
## Overview
The import process extracts data from a MySQL 5.7 production database and imports it into a PostgreSQL database. It can operate in two modes:
- **Full Import**: Imports all data regardless of last sync time
- **Incremental Import**: Only imports data that has changed since the last import
The process handles four main data types:
- Categories (product categorization hierarchy)
- Products (inventory items)
- Orders (sales records)
- Purchase Orders (vendor orders)
## Import Architecture
The import process follows these steps:
1. **Establish Connection**: Creates an SSH tunnel to the production server and establishes database connections
2. **Setup Import History**: Creates a record of the current import operation
3. **Import Categories**: Processes product categories in hierarchical order
4. **Import Products**: Processes products with their attributes and category relationships
5. **Import Orders**: Processes customer orders with line items, taxes, and discounts
6. **Import Purchase Orders**: Processes vendor purchase orders with line items
7. **Record Results**: Updates the import history with results
8. **Close Connections**: Cleans up connections and resources
Each import step uses temporary tables for processing and wraps operations in transactions to ensure data consistency.
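As a rough sketch, the orchestration in the main import script looks like the following. The `import*` functions and the `IMPORT_*`/`INCREMENTAL_UPDATE` flags mirror the actual script; the connection and history helpers are collapsed into hypothetical names.
```javascript
// Hypothetical helper names (setupConnections, createImportHistory,
// finalizeImportHistory, closeConnections) stand in for the inlined logic;
// the import* functions and IMPORT_*/INCREMENTAL_UPDATE flags mirror the script.
async function runImport() {
  const { prodConnection, localConnection } = await setupConnections(); // SSH tunnel + both DBs
  const historyId = await createImportHistory(localConnection, INCREMENTAL_UPDATE);
  try {
    if (IMPORT_CATEGORIES) await importCategories(prodConnection, localConnection);
    if (IMPORT_PRODUCTS) await importProducts(prodConnection, localConnection, INCREMENTAL_UPDATE);
    if (IMPORT_ORDERS) await importOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
    if (IMPORT_PURCHASE_ORDERS) await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
    await finalizeImportHistory(localConnection, historyId, 'completed');
  } catch (error) {
    await finalizeImportHistory(localConnection, historyId, 'failed', error.message);
    throw error;
  } finally {
    await closeConnections(prodConnection, localConnection); // also tears down the SSH tunnel
  }
}
```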
## Column Mappings
### Categories
| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|---------------------------------|----------------------------------------------|
| cat_id | product_categories.cat_id | Direct mapping |
| name | product_categories.name | Direct mapping |
| type | product_categories.type | Direct mapping |
| parent_id | product_categories.master_cat_id| NULL for top-level categories (types 10, 20) |
| description | product_categories.combined_name| Direct mapping |
| status | N/A | Hard-coded 'active' |
| created_at | N/A | Current timestamp |
| updated_at | N/A | Current timestamp |
**Notes:**
- Categories are processed in hierarchical order by type: [10, 20, 11, 21, 12, 13] (see the sketch after these notes)
- Type 10/20 are top-level categories with no parent
- Types 11/21/12/13 are child categories that reference parent categories
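A minimal sketch of that ordering: the type sequence and savepoint naming mirror the import script, while the fetch/upsert helpers are hypothetical stand-ins for the inlined queries.
```javascript
// The type ordering and savepoint naming mirror the import script; the
// fetch/upsert helpers are hypothetical stand-ins for the inlined queries.
const CATEGORY_TYPE_ORDER = [10, 20, 11, 21, 12, 13]; // parents before children

async function importCategoriesInOrder(prodConnection, localConnection) {
  await localConnection.query('BEGIN');
  for (const type of CATEGORY_TYPE_ORDER) {
    await localConnection.query(`SAVEPOINT category_type_${type}`);
    const categories = await fetchCategoriesOfType(prodConnection, type); // hypothetical
    if (categories.length > 0) {
      await upsertCategories(localConnection, categories); // hypothetical
    }
    await localConnection.query(`RELEASE SAVEPOINT category_type_${type}`);
  }
  await localConnection.query('COMMIT');
}
```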
### Products
| PostgreSQL Column | MySQL Source | Transformation |
|----------------------|----------------------------------|---------------------------------------------------------------|
| pid | products.pid | Direct mapping |
| title | products.description | Direct mapping |
| description | products.notes | Direct mapping |
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
| stock_quantity | shop_inventory.available_local | Capped at 5000, minimum 0 |
| preorder_count | current_inventory.onpreorder | Default 0 |
| notions_inv_count | product_notions_b2b.inventory | Default 0 |
| price | product_current_prices.price_each| Default 0, filtered on active=1 |
| regular_price | products.sellingprice | Default 0 |
| cost_price | product_inventory | Weighted average: SUM(costeach * count) / SUM(count) when count > 0, or latest costeach |
| vendor | suppliers.companyname | Via supplier_item_data.supplier_id |
| vendor_reference | supplier_item_data | supplier_itemnumber or notions_itemnumber based on vendor |
| notions_reference | supplier_item_data.notions_itemnumber | Direct mapping |
| brand | product_categories.name | Linked via products.company |
| line | product_categories.name | Linked via products.line |
| subline | product_categories.name | Linked via products.subline |
| artist | product_categories.name | Linked via products.artist |
| categories | product_category_index | Comma-separated list of category IDs |
| created_at | products.date_created | Validated date, NULL if invalid |
| first_received | products.datein | Validated date, NULL if invalid |
| landing_cost_price | NULL | Not set |
| barcode | products.upc | Direct mapping |
| harmonized_tariff_code| products.harmonized_tariff_code | Direct mapping |
| updated_at | products.stamp | Validated date, NULL if invalid |
| visible | shop_inventory | Calculated from show + buyable > 0 |
| managing_stock | N/A | Hard-coded true |
| replenishable | Multiple fields | Complex calculation based on reorder, dates, etc. |
| permalink | N/A | Constructed URL with product ID |
| moq | supplier_item_data | notions_qty_per_unit or supplier_qty_per_unit, minimum 1 |
| uom | N/A | Hard-coded 1 |
| rating | products.rating | Direct mapping |
| reviews | products.rating_votes | Direct mapping |
| weight | products.weight | Direct mapping |
| length | products.length | Direct mapping |
| width | products.width | Direct mapping |
| height | products.height | Direct mapping |
| country_of_origin | products.country_of_origin | Direct mapping |
| location | products.location | Direct mapping |
| total_sold | order_items | SUM(qty_ordered) for all order_items where prod_pid = pid |
| baskets | mybasket | COUNT of records where mb.item = pid and qty > 0 |
| notifies | product_notify | COUNT of records where pn.pid = pid |
| date_last_sold | product_last_sold.date_sold | Validated date, NULL if invalid |
| image | N/A | Constructed from pid and image URL pattern |
| image_175 | N/A | Constructed from pid and image URL pattern |
| image_full | N/A | Constructed from pid and image URL pattern |
| options | NULL | Not set |
| tags | NULL | Not set |
**Notes:**
- Replenishable calculation:
```sql
CASE
WHEN p.reorder < 0 THEN 0
WHEN (
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
) THEN 0
ELSE 1
END
```
In business terms, a product is considered NOT replenishable only if:
- It was manually flagged as not replenishable (negative reorder value)
- OR it shows no activity across ALL metrics (no sales AND no receipts AND no refills in the past 5 years)
- Image URLs are constructed using this pattern:
```javascript
// imageUrlBase comes from PRODUCT_IMAGE_URL_BASE (default 'https://sbing.com/i/products/0000/')
// pid = product ID, iid = image ID (defaults to 1 in the import script)
const getImageUrls = (pid, iid = 1) => {
  const paddedPid = pid.toString().padStart(6, '0');
  // Only the first three digits of the padded PID are used as the directory prefix
  const prefix = paddedPid.slice(0, 3);
  const basePath = `${imageUrlBase}${prefix}/${pid}`;
  return {
    image: `${basePath}-t-${iid}.jpg`,
    image_175: `${basePath}-175x175-${iid}.jpg`,
    image_full: `${basePath}-o-${iid}.jpg`
  };
};
```
### Product Categories (Relationship)
| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| pid | products.pid | Direct mapping |
| cat_id | product_category_index.cat_id | Direct mapping, filtered by category types |
**Notes:**
- Only categories of types 10, 20, 11, 21, 12, 13 are imported
- Categories 16 and 17 are explicitly excluded
### Orders
| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| order_number | order_items.order_id | Direct mapping |
| pid | order_items.prod_pid | Direct mapping |
| sku | order_items.prod_itemnumber | Fallback to 'NO-SKU' if empty |
| date | _order.date_placed_onlydate | Via join to _order table |
| price | order_items.prod_price | Direct mapping |
| quantity | order_items.qty_ordered | Direct mapping |
| discount | Multiple sources | Complex calculation (see notes) |
| tax | order_tax_info_products.item_taxes_to_collect | Via latest order_tax_info record |
| tax_included | N/A | Hard-coded false |
| shipping | N/A | Hard-coded 0 |
| customer | _order.order_cid | Direct mapping |
| customer_name | users | CONCAT(users.firstname, ' ', users.lastname) |
| status | _order.order_status | Direct mapping |
| canceled | _order.date_cancelled | Boolean: true if date_cancelled is not '0000-00-00 00:00:00' |
| costeach | order_costs | From latest record or fallback to price * 0.5 |
**Notes:**
- Only orders with order_status >= 15 and with a valid date_placed are processed
- For incremental imports, only orders modified since last sync are processed
- Discount calculation combines three sources:
1. Base discount: order_items.prod_price_reg - order_items.prod_price
2. Promo discount: SUM of order_discount_items.amount
3. Proportional order discount: Calculation based on order subtotal proportion
```sql
(oi.base_discount +
COALESCE(ot.promo_discount, 0) +
CASE
WHEN om.summary_discount > 0 AND om.summary_subtotal > 0 THEN
ROUND((om.summary_discount * (oi.price * oi.quantity)) / NULLIF(om.summary_subtotal, 0), 2)
ELSE 0
END)::DECIMAL(10,2)
```
- Taxes are taken from the latest tax record for an order
- Cost data is taken from the latest non-pending cost record
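The "latest non-pending cost record" lookup keeps only the newest `order_costs` row per order/product pair for a batch of order IDs, roughly as follows (simplified from the orders import script):
```javascript
// Simplified from the orders import: for a batch of order IDs, keep only the
// newest non-pending order_costs row per (order, product) pair.
async function fetchLatestCosts(prodConnection, batchIds) {
  const [costs] = await prodConnection.query(`
    SELECT oc.orderid AS order_id, oc.pid, oc.costeach
    FROM order_costs oc
    INNER JOIN (
      SELECT orderid, pid, MAX(id) AS max_id
      FROM order_costs
      WHERE orderid IN (?)
        AND pending = 0
      GROUP BY orderid, pid
    ) latest ON oc.orderid = latest.orderid
            AND oc.pid = latest.pid
            AND oc.id = latest.max_id
  `, [batchIds]);
  return costs;
}
```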
### Purchase Orders
| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| po_id | po.po_id | Default 0 if NULL |
| pid | po_products.pid | Direct mapping |
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
| name | products.description | Fallback to 'Unknown Product' |
| cost_price | po_products.cost_each | Direct mapping |
| po_cost_price | po_products.cost_each | Duplicate of cost_price |
| vendor | suppliers.companyname | Fallback to 'Unknown Vendor' if empty |
| date | po.date_ordered | Fallback to po.date_created if NULL |
| expected_date | po.date_estin | Direct mapping |
| status | po.status | Default 1 if NULL |
| notes | po.short_note | Fallback to po.notes if NULL |
| ordered | po_products.qty_each | Direct mapping |
| received | N/A | Hard-coded 0 |
| receiving_status | N/A | Hard-coded 1 |
**Notes:**
- Only POs created within the last year (incremental) or the last five years (full) are processed
- For incremental imports, only POs modified since last sync are processed
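A hedged sketch of that filtering: the columns `po.date_created` (creation time) and `po.stamp` (last-modified time) are assumptions, since the exact source columns are not spelled out in this document.
```javascript
// Illustrative only: the assumed columns are po.date_created (creation time)
// and po.stamp (last-modified time); adjust to the real schema as needed.
async function fetchPurchaseOrders(prodConnection, incrementalUpdate, lastSyncTime) {
  const years = incrementalUpdate ? 1 : 5;
  const [rows] = await prodConnection.query(`
    SELECT po.*
    FROM po
    WHERE po.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL ${years} YEAR)
      ${incrementalUpdate ? 'AND po.stamp > ?' : ''}
  `, incrementalUpdate ? [lastSyncTime] : []);
  return rows;
}
```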
### Metadata Tables
#### import_history
| PostgreSQL Column | Source | Notes |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| id | Auto-increment | Primary key |
| table_name | Code | 'all_tables' for overall import |
| start_time | NOW() | Import start time |
| end_time | NOW() | Import completion time |
| duration_seconds | Calculation | Elapsed seconds |
| is_incremental | INCREMENTAL_UPDATE | Flag from config |
| records_added | Calculation | Sum from all imports |
| records_updated | Calculation | Sum from all imports |
| status | Code | 'running', 'completed', 'failed', or 'cancelled' |
| error_message | Exception | Error message if failed |
| additional_info | JSON | Configuration and results |
#### sync_status
| PostgreSQL Column | Source | Notes |
|----------------------|--------------------------------|---------------------------------------------------------------|
| table_name | Code | Name of imported table |
| last_sync_timestamp | NOW() | Timestamp of successful sync |
| last_sync_id | NULL | Not used currently |
## Special Calculations
### Date Validation
MySQL dates are validated before insertion into PostgreSQL:
```javascript
function validateDate(mysqlDate) {
if (!mysqlDate || mysqlDate === '0000-00-00' || mysqlDate === '0000-00-00 00:00:00') {
return null;
}
// Check if the date is valid
const date = new Date(mysqlDate);
return isNaN(date.getTime()) ? null : mysqlDate;
}
```
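For example, `validateDate('0000-00-00 00:00:00')` returns `null`, while a well-formed value such as `'2024-06-01 12:00:00'` is passed through unchanged.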
### Retry Mechanism
Operations that might fail temporarily are retried with exponential backoff:
```javascript
const MAX_RETRIES = 3;
const RETRY_DELAY = 5000; // base delay in ms; doubled on each attempt

async function withRetry(operation, errorMessage) {
let lastError;
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
try {
return await operation();
} catch (error) {
lastError = error;
console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
if (attempt < MAX_RETRIES) {
const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
await new Promise(resolve => setTimeout(resolve, backoffTime));
}
}
}
throw lastError;
}
```
### Progress Tracking
Progress is tracked with estimated time remaining:
```javascript
function estimateRemaining(startTime, current, total) {
if (current === 0) return "Calculating...";
const elapsedSeconds = (Date.now() - startTime) / 1000;
const itemsPerSecond = current / elapsedSeconds;
const remainingItems = total - current;
const remainingSeconds = remainingItems / itemsPerSecond;
return formatElapsedTime(remainingSeconds);
}
```
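`formatElapsedTime` (like `calculateRate`) comes from a shared progress module that is not shown here; a minimal sketch, assuming it receives a duration in seconds as `estimateRemaining` passes it, might look like this.
```javascript
// Not shown in this document: a minimal sketch, assuming the helper receives a
// duration in seconds and renders it as "1h 2m 3s".
function formatElapsedTime(seconds) {
  const total = Math.max(0, Math.round(seconds));
  const hours = Math.floor(total / 3600);
  const minutes = Math.floor((total % 3600) / 60);
  const parts = [];
  if (hours > 0) parts.push(`${hours}h`);
  if (minutes > 0) parts.push(`${minutes}m`);
  parts.push(`${total % 60}s`);
  return parts.join(' ');
}
```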
## Implementation Notes
### Transaction Management
All imports use transactions to ensure data consistency:
- **Categories**: Uses savepoints for each category type
- **Products**: Uses a single transaction for the entire import
- **Orders**: Uses a single transaction with temporary tables (see the sketch after this list)
- **Purchase Orders**: Uses a single transaction with temporary tables
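For Orders and Purchase Orders, the pattern is roughly the following hedged sketch; the real import creates several temporary tables and stages data in batches before the final upsert, which is collapsed here into a caller-supplied `stageAndMerge` function.
```javascript
// Hedged sketch: the real import creates several temp tables and stages data in
// batches; stageAndMerge stands in for that work.
async function withOrdersTransaction(localConnection, stageAndMerge) {
  await localConnection.beginTransaction();
  try {
    await localConnection.query(`
      CREATE TEMP TABLE temp_order_items (
        order_id INTEGER NOT NULL,
        pid INTEGER NOT NULL,
        price DECIMAL(10,2) NOT NULL,
        quantity INTEGER NOT NULL,
        PRIMARY KEY (order_id, pid)
      )
    `);
    await stageAndMerge(); // stage batches into temp tables, then upsert into orders
    await localConnection.query('DROP TABLE IF EXISTS temp_order_items');
    await localConnection.commit();
  } catch (error) {
    await localConnection.rollback(); // discard staged data on any failure
    throw error;
  }
}
```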
### Memory Usage Optimization
To minimize memory usage when processing large datasets:
1. Data is processed in batches (100-5000 records per batch), as sketched after this list
2. Temporary tables are used for intermediate data
3. Some queries use cursors to avoid loading all results at once
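A minimal sketch of the batching pattern behind point 1; `upsertBatch` is a hypothetical callback that builds placeholders and upserts a single slice.
```javascript
// Minimal sketch of the batching pattern; upsertBatch is a hypothetical callback
// that builds placeholders and upserts a single slice.
const BATCH_SIZE = 1000; // the real scripts use 100-5000 depending on the table

async function upsertInBatches(rows, upsertBatch) {
  for (let offset = 0; offset < rows.length; offset += BATCH_SIZE) {
    await upsertBatch(rows.slice(offset, offset + BATCH_SIZE));
  }
}
```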
### MySQL vs PostgreSQL Compatibility
The scripts handle differences between MySQL and PostgreSQL:
1. MySQL-specific syntax like `USE INDEX` is removed for PostgreSQL
2. `GROUP_CONCAT` in MySQL becomes string operations in PostgreSQL
3. Transaction syntax differences are abstracted in the connection wrapper
4. PostgreSQL's `ON CONFLICT` replaces MySQL's `ON DUPLICATE KEY UPDATE`
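As an illustration of point 4, the same upsert expressed in both dialects (the real statements cover more columns; the PostgreSQL form is the one the import scripts actually run against the local database):
```javascript
// Illustrative: the PostgreSQL form is what the import scripts run against the
// local database; the MySQL form shows the equivalent source-dialect syntax.
const mysqlUpsert = `
  INSERT INTO sync_status (table_name, last_sync_timestamp)
  VALUES ('orders', NOW())
  ON DUPLICATE KEY UPDATE last_sync_timestamp = NOW()`;

const postgresUpsert = `
  INSERT INTO sync_status (table_name, last_sync_timestamp)
  VALUES ('orders', NOW())
  ON CONFLICT (table_name) DO UPDATE SET
    last_sync_timestamp = NOW()`;
```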
### SSH Tunnel
Database connections go through an SSH tunnel for security:
```javascript
ssh.forwardOut(
"127.0.0.1",
0,
sshConfig.prodDbConfig.host,
sshConfig.prodDbConfig.port,
async (err, stream) => {
if (err) return reject(err); // avoid resolving after a forwarding error
resolve({ ssh, stream });
}
);
```
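The snippet above runs inside the ssh2 client's `ready` handler and a surrounding Promise. A hedged sketch of the full flow, assuming the `ssh2` and `mysql2` packages (mysql2 accepts a custom `stream` in its connection options) and an `sshConfig` object shaped as above:
```javascript
const { Client } = require('ssh2');
const mysql = require('mysql2/promise');

// Hypothetical wrapper: sshConfig.ssh holds the SSH connection options and
// sshConfig.prodDbConfig the MySQL credentials, as in the snippet above.
function connectThroughTunnel(sshConfig) {
  return new Promise((resolve, reject) => {
    const ssh = new Client();
    ssh
      .on('ready', () => {
        ssh.forwardOut(
          '127.0.0.1',
          0,
          sshConfig.prodDbConfig.host,
          sshConfig.prodDbConfig.port,
          async (err, stream) => {
            if (err) return reject(err);
            // Hand the forwarded stream to the MySQL client instead of a TCP socket.
            const connection = await mysql.createConnection({
              ...sshConfig.prodDbConfig,
              stream,
            });
            resolve({ ssh, connection });
          }
        );
      })
      .on('error', reject)
      .connect(sshConfig.ssh);
  });
}
```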

File diff suppressed because it is too large

View File

@@ -2,10 +2,14 @@ CREATE TABLE users (
id SERIAL PRIMARY KEY,
username VARCHAR(255) NOT NULL UNIQUE,
password VARCHAR(255) NOT NULL,
email VARCHAR UNIQUE,
is_admin BOOLEAN DEFAULT FALSE,
is_active BOOLEAN DEFAULT TRUE,
last_login TIMESTAMP WITH TIME ZONE,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
-- Function to update the updated_at timestamp
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
@@ -18,14 +22,6 @@ $$ language 'plpgsql';
-- Sequence and defined type for users table if not exists
CREATE SEQUENCE IF NOT EXISTS users_id_seq;
-- Update users table with new fields
ALTER TABLE "public"."users"
ADD COLUMN IF NOT EXISTS "email" varchar UNIQUE,
ADD COLUMN IF NOT EXISTS "is_admin" boolean DEFAULT FALSE,
ADD COLUMN IF NOT EXISTS "is_active" boolean DEFAULT TRUE,
ADD COLUMN IF NOT EXISTS "last_login" timestamp with time zone,
ADD COLUMN IF NOT EXISTS "updated_at" timestamp with time zone DEFAULT CURRENT_TIMESTAMP;
-- Create permissions table
CREATE TABLE IF NOT EXISTS "public"."permissions" (
"id" SERIAL PRIMARY KEY,
@@ -58,8 +54,7 @@ CREATE TRIGGER update_permissions_updated_at
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();
-- Insert default permissions by page
-- Core page access permissions
-- Insert default permissions by page - only the ones used in application
INSERT INTO permissions (name, code, description, category) VALUES
('Dashboard Access', 'access:dashboard', 'Can access the Dashboard page', 'Pages'),
('Products Access', 'access:products', 'Can access the Products page', 'Pages'),
@@ -73,52 +68,14 @@ INSERT INTO permissions (name, code, description, category) VALUES
('AI Validation Debug Access', 'access:ai_validation_debug', 'Can access the AI Validation Debug page', 'Pages')
ON CONFLICT (code) DO NOTHING;
-- Granular permissions for Products
INSERT INTO permissions (name, code, description, category) VALUES
('View Products', 'view:products', 'Can view product listings', 'Products'),
('Create Products', 'create:products', 'Can create new products', 'Products'),
('Edit Products', 'edit:products', 'Can edit product details', 'Products'),
('Delete Products', 'delete:products', 'Can delete products', 'Products')
ON CONFLICT (code) DO NOTHING;
-- Granular permissions for Categories
INSERT INTO permissions (name, code, description, category) VALUES
('View Categories', 'view:categories', 'Can view categories', 'Categories'),
('Create Categories', 'create:categories', 'Can create new categories', 'Categories'),
('Edit Categories', 'edit:categories', 'Can edit categories', 'Categories'),
('Delete Categories', 'delete:categories', 'Can delete categories', 'Categories')
ON CONFLICT (code) DO NOTHING;
-- Granular permissions for Vendors
INSERT INTO permissions (name, code, description, category) VALUES
('View Vendors', 'view:vendors', 'Can view vendors', 'Vendors'),
('Create Vendors', 'create:vendors', 'Can create new vendors', 'Vendors'),
('Edit Vendors', 'edit:vendors', 'Can edit vendors', 'Vendors'),
('Delete Vendors', 'delete:vendors', 'Can delete vendors', 'Vendors')
ON CONFLICT (code) DO NOTHING;
-- Granular permissions for Purchase Orders
INSERT INTO permissions (name, code, description, category) VALUES
('View Purchase Orders', 'view:purchase_orders', 'Can view purchase orders', 'Purchase Orders'),
('Create Purchase Orders', 'create:purchase_orders', 'Can create new purchase orders', 'Purchase Orders'),
('Edit Purchase Orders', 'edit:purchase_orders', 'Can edit purchase orders', 'Purchase Orders'),
('Delete Purchase Orders', 'delete:purchase_orders', 'Can delete purchase orders', 'Purchase Orders')
ON CONFLICT (code) DO NOTHING;
-- User management permissions
INSERT INTO permissions (name, code, description, category) VALUES
('View Users', 'view:users', 'Can view user accounts', 'Users'),
('Create Users', 'create:users', 'Can create user accounts', 'Users'),
('Edit Users', 'edit:users', 'Can modify user accounts', 'Users'),
('Delete Users', 'delete:users', 'Can delete user accounts', 'Users'),
('Manage Permissions', 'manage:permissions', 'Can assign permissions to users', 'Users')
ON CONFLICT (code) DO NOTHING;
-- System permissions
-- Settings section permissions
INSERT INTO permissions (name, code, description, category) VALUES
('Run Calculations', 'run:calculations', 'Can trigger system calculations', 'System'),
('Import Data', 'import:data', 'Can import data into the system', 'System'),
('System Settings', 'edit:system_settings', 'Can modify system settings', 'System')
('Data Management', 'settings:data_management', 'Access to the Data Management settings section', 'Settings'),
('Stock Management', 'settings:stock_management', 'Access to the Stock Management settings section', 'Settings'),
('Performance Metrics', 'settings:performance_metrics', 'Access to the Performance Metrics settings section', 'Settings'),
('Calculation Settings', 'settings:calculation_settings', 'Access to the Calculation Settings section', 'Settings'),
('Template Management', 'settings:templates', 'Access to the Template Management settings section', 'Settings'),
('User Management', 'settings:user_management', 'Access to the User Management settings section', 'Settings')
ON CONFLICT (code) DO NOTHING;
-- Set any existing users as admin

View File

@@ -4,7 +4,12 @@ SET session_replication_role = 'replica'; -- Disable foreign key checks tempora
-- Create function for updating timestamps
CREATE OR REPLACE FUNCTION update_updated_column() RETURNS TRIGGER AS $func$
BEGIN
NEW.updated = CURRENT_TIMESTAMP;
-- Check which table is being updated and use the appropriate column
IF TG_TABLE_NAME = 'categories' THEN
NEW.updated_at = CURRENT_TIMESTAMP;
ELSE
NEW.updated = CURRENT_TIMESTAMP;
END IF;
RETURN NEW;
END;
$func$ language plpgsql;
@@ -160,7 +165,7 @@ CREATE TABLE purchase_orders (
expected_date DATE,
pid BIGINT NOT NULL,
sku VARCHAR(50) NOT NULL,
name VARCHAR(100) NOT NULL,
name VARCHAR(255) NOT NULL,
cost_price DECIMAL(10, 3) NOT NULL,
po_cost_price DECIMAL(10, 3) NOT NULL,
status SMALLINT DEFAULT 1,
@@ -171,7 +176,7 @@ CREATE TABLE purchase_orders (
received INTEGER DEFAULT 0,
received_date DATE,
last_received_date DATE,
received_by VARCHAR(100),
received_by VARCHAR,
receiving_history JSONB,
updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (pid) REFERENCES products(pid),

View File

@@ -23,6 +23,56 @@ CREATE TABLE IF NOT EXISTS templates (
UNIQUE(company, product_type)
);
-- AI Prompts table for storing validation prompts
CREATE TABLE IF NOT EXISTS ai_prompts (
id SERIAL PRIMARY KEY,
prompt_text TEXT NOT NULL,
prompt_type TEXT NOT NULL CHECK (prompt_type IN ('general', 'company_specific', 'system')),
company TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT unique_company_prompt UNIQUE (company),
CONSTRAINT company_required_for_specific CHECK (
(prompt_type = 'general' AND company IS NULL) OR
(prompt_type = 'system' AND company IS NULL) OR
(prompt_type = 'company_specific' AND company IS NOT NULL)
)
);
-- Create a unique partial index to ensure only one general prompt
CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_general_prompt
ON ai_prompts (prompt_type)
WHERE prompt_type = 'general';
-- Create a unique partial index to ensure only one system prompt
CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_system_prompt
ON ai_prompts (prompt_type)
WHERE prompt_type = 'system';
-- Reusable Images table for storing persistent images
CREATE TABLE IF NOT EXISTS reusable_images (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL,
filename TEXT NOT NULL,
file_path TEXT NOT NULL,
image_url TEXT NOT NULL,
is_global BOOLEAN NOT NULL DEFAULT false,
company TEXT,
mime_type TEXT,
file_size INTEGER,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT company_required_for_non_global CHECK (
(is_global = true AND company IS NULL) OR
(is_global = false AND company IS NOT NULL)
)
);
-- Create index on company for efficient querying
CREATE INDEX IF NOT EXISTS idx_reusable_images_company ON reusable_images(company);
-- Create index on is_global for efficient querying
CREATE INDEX IF NOT EXISTS idx_reusable_images_is_global ON reusable_images(is_global);
-- AI Validation Performance Tracking
CREATE TABLE IF NOT EXISTS ai_validation_performance (
id SERIAL PRIMARY KEY,
@@ -50,4 +100,16 @@ $$ language 'plpgsql';
CREATE TRIGGER update_templates_updated_at
BEFORE UPDATE ON templates
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();
-- Trigger to automatically update the updated_at column for ai_prompts
CREATE TRIGGER update_ai_prompts_updated_at
BEFORE UPDATE ON ai_prompts
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();
-- Trigger to automatically update the updated_at column for reusable_images
CREATE TRIGGER update_reusable_images_updated_at
BEFORE UPDATE ON reusable_images
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();

View File

@@ -62,13 +62,24 @@ const TEMP_TABLES = [
// Add cleanup function for temporary tables
async function cleanupTemporaryTables(connection) {
// List of possible temporary tables that might exist
const tempTables = [
'temp_sales_metrics',
'temp_purchase_metrics',
'temp_forecast_dates',
'temp_daily_sales',
'temp_product_stats',
'temp_category_sales',
'temp_category_stats'
];
try {
for (const table of TEMP_TABLES) {
await connection.query(`DROP TEMPORARY TABLE IF EXISTS ${table}`);
// Drop each temporary table if it exists
for (const table of tempTables) {
await connection.query(`DROP TABLE IF EXISTS ${table}`);
}
} catch (error) {
logError(error, 'Error cleaning up temporary tables');
throw error; // Re-throw to be handled by the caller
} catch (err) {
console.error('Error cleaning up temporary tables:', err);
}
}
@@ -86,22 +97,42 @@ let isCancelled = false;
function cancelCalculation() {
isCancelled = true;
global.clearProgress();
// Format as SSE event
const event = {
progress: {
status: 'cancelled',
operation: 'Calculation cancelled',
current: 0,
total: 0,
elapsed: null,
remaining: null,
rate: 0,
timestamp: Date.now()
}
console.log('Calculation has been cancelled by user');
// Force-terminate any query that's been running for more than 5 seconds
try {
const connection = getConnection();
connection.then(async (conn) => {
try {
// Identify and terminate long-running queries from our application
await conn.query(`
SELECT pg_cancel_backend(pid)
FROM pg_stat_activity
WHERE query_start < now() - interval '5 seconds'
AND application_name LIKE '%node%'
AND query NOT LIKE '%pg_cancel_backend%'
`);
// Clean up any temporary tables
await cleanupTemporaryTables(conn);
// Release connection
conn.release();
} catch (err) {
console.error('Error during force cancellation:', err);
conn.release();
}
}).catch(err => {
console.error('Could not get connection for cancellation:', err);
});
} catch (err) {
console.error('Failed to terminate running queries:', err);
}
return {
success: true,
message: 'Calculation has been cancelled'
};
process.stdout.write(JSON.stringify(event) + '\n');
process.exit(0);
}
// Handle SIGTERM signal for cancellation
@@ -119,6 +150,15 @@ async function calculateMetrics() {
let totalPurchaseOrders = 0;
let calculateHistoryId;
// Set a maximum execution time (30 minutes)
const MAX_EXECUTION_TIME = 30 * 60 * 1000;
const timeout = setTimeout(() => {
console.error(`Calculation timed out after ${MAX_EXECUTION_TIME/1000} seconds, forcing termination`);
// Call cancel and force exit
cancelCalculation();
process.exit(1);
}, MAX_EXECUTION_TIME);
try {
// Clean up any previously running calculations
connection = await getConnection();
@@ -127,24 +167,24 @@ async function calculateMetrics() {
SET
status = 'cancelled',
end_time = NOW(),
duration_seconds = TIMESTAMPDIFF(SECOND, start_time, NOW()),
duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
error_message = 'Previous calculation was not completed properly'
WHERE status = 'running'
`);
// Get counts from all relevant tables
const [[productCount], [orderCount], [poCount]] = await Promise.all([
const [productCountResult, orderCountResult, poCountResult] = await Promise.all([
connection.query('SELECT COUNT(*) as total FROM products'),
connection.query('SELECT COUNT(*) as total FROM orders'),
connection.query('SELECT COUNT(*) as total FROM purchase_orders')
]);
totalProducts = productCount.total;
totalOrders = orderCount.total;
totalPurchaseOrders = poCount.total;
totalProducts = parseInt(productCountResult.rows[0].total);
totalOrders = parseInt(orderCountResult.rows[0].total);
totalPurchaseOrders = parseInt(poCountResult.rows[0].total);
// Create history record for this calculation
const [historyResult] = await connection.query(`
const historyResult = await connection.query(`
INSERT INTO calculate_history (
start_time,
status,
@@ -155,19 +195,19 @@ async function calculateMetrics() {
) VALUES (
NOW(),
'running',
?,
?,
?,
JSON_OBJECT(
'skip_product_metrics', ?,
'skip_time_aggregates', ?,
'skip_financial_metrics', ?,
'skip_vendor_metrics', ?,
'skip_category_metrics', ?,
'skip_brand_metrics', ?,
'skip_sales_forecasts', ?
$1,
$2,
$3,
jsonb_build_object(
'skip_product_metrics', ($4::int > 0),
'skip_time_aggregates', ($5::int > 0),
'skip_financial_metrics', ($6::int > 0),
'skip_vendor_metrics', ($7::int > 0),
'skip_category_metrics', ($8::int > 0),
'skip_brand_metrics', ($9::int > 0),
'skip_sales_forecasts', ($10::int > 0)
)
)
) RETURNING id
`, [
totalProducts,
totalOrders,
@@ -180,8 +220,7 @@ async function calculateMetrics() {
SKIP_BRAND_METRICS,
SKIP_SALES_FORECASTS
]);
calculateHistoryId = historyResult.insertId;
connection.release();
calculateHistoryId = historyResult.rows[0].id;
// Add debug logging for the progress functions
console.log('Debug - Progress functions:', {
@@ -199,6 +238,8 @@ async function calculateMetrics() {
throw err;
}
// Release the connection before getting a new one
connection.release();
isCancelled = false;
connection = await getConnection();
@@ -234,10 +275,10 @@ async function calculateMetrics() {
await connection.query(`
UPDATE calculate_history
SET
processed_products = ?,
processed_orders = ?,
processed_purchase_orders = ?
WHERE id = ?
processed_products = $1,
processed_orders = $2,
processed_purchase_orders = $3
WHERE id = $4
`, [safeProducts, safeOrders, safePurchaseOrders, calculateHistoryId]);
};
@@ -359,216 +400,6 @@ async function calculateMetrics() {
console.log('Skipping sales forecasts calculation');
}
// Calculate ABC classification
outputProgress({
status: 'running',
operation: 'Starting ABC classification',
current: processedProducts || 0,
total: totalProducts || 0,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
rate: calculateRate(startTime, processedProducts || 0),
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
if (isCancelled) return {
processedProducts: processedProducts || 0,
processedOrders: processedOrders || 0,
processedPurchaseOrders: 0,
success: false
};
const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };
// First, create and populate the rankings table with an index
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
await connection.query(`
CREATE TEMPORARY TABLE temp_revenue_ranks (
pid BIGINT NOT NULL,
total_revenue DECIMAL(10,3),
rank_num INT,
total_count INT,
PRIMARY KEY (pid),
INDEX (rank_num)
) ENGINE=MEMORY
`);
outputProgress({
status: 'running',
operation: 'Creating revenue rankings',
current: processedProducts || 0,
total: totalProducts || 0,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
rate: calculateRate(startTime, processedProducts || 0),
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
if (isCancelled) return {
processedProducts: processedProducts || 0,
processedOrders: processedOrders || 0,
processedPurchaseOrders: 0,
success: false
};
await connection.query(`
INSERT INTO temp_revenue_ranks
SELECT
pid,
total_revenue,
@rank := @rank + 1 as rank_num,
@total_count := @rank as total_count
FROM (
SELECT pid, total_revenue
FROM product_metrics
WHERE total_revenue > 0
ORDER BY total_revenue DESC
) ranked,
(SELECT @rank := 0) r
`);
// Get total count for percentage calculation
const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
const totalCount = rankingCount[0].total_count || 1;
const max_rank = totalCount; // Store max_rank for use in classification
outputProgress({
status: 'running',
operation: 'Updating ABC classifications',
current: processedProducts || 0,
total: totalProducts || 0,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
rate: calculateRate(startTime, processedProducts || 0),
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
if (isCancelled) return {
processedProducts: processedProducts || 0,
processedOrders: processedOrders || 0,
processedPurchaseOrders: 0,
success: false
};
// ABC classification progress tracking
let abcProcessedCount = 0;
const batchSize = 5000;
let lastProgressUpdate = Date.now();
const progressUpdateInterval = 1000; // Update every second
while (true) {
if (isCancelled) return {
processedProducts: Number(processedProducts) || 0,
processedOrders: Number(processedOrders) || 0,
processedPurchaseOrders: 0,
success: false
};
// First get a batch of PIDs that need updating
const [pids] = await connection.query(`
SELECT pm.pid
FROM product_metrics pm
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
WHERE pm.abc_class IS NULL
OR pm.abc_class !=
CASE
WHEN tr.rank_num IS NULL THEN 'C'
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
ELSE 'C'
END
LIMIT ?
`, [max_rank, abcThresholds.a_threshold,
max_rank, abcThresholds.b_threshold,
batchSize]);
if (pids.length === 0) {
break;
}
// Then update just those PIDs
const [result] = await connection.query(`
UPDATE product_metrics pm
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
SET pm.abc_class =
CASE
WHEN tr.rank_num IS NULL THEN 'C'
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
ELSE 'C'
END,
pm.last_calculated_at = NOW()
WHERE pm.pid IN (?)
`, [max_rank, abcThresholds.a_threshold,
max_rank, abcThresholds.b_threshold,
pids.map(row => row.pid)]);
abcProcessedCount += result.affectedRows;
// Calculate progress ensuring valid numbers
const currentProgress = Math.floor(totalProducts * (0.99 + (abcProcessedCount / (totalCount || 1)) * 0.01));
processedProducts = Number(currentProgress) || processedProducts || 0;
// Only update progress at most once per second
const now = Date.now();
if (now - lastProgressUpdate >= progressUpdateInterval) {
const progress = ensureValidProgress(processedProducts, totalProducts);
outputProgress({
status: 'running',
operation: 'ABC classification progress',
current: progress.current,
total: progress.total,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, progress.current, progress.total),
rate: calculateRate(startTime, progress.current),
percentage: progress.percentage,
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
lastProgressUpdate = now;
}
// Update database progress
await updateProgress(processedProducts, processedOrders, processedPurchaseOrders);
// Small delay between batches to allow other transactions
await new Promise(resolve => setTimeout(resolve, 100));
}
// Clean up
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
const endTime = Date.now();
const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
// Update calculate_status for ABC classification
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('abc_classification', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
`);
// Final progress update with guaranteed valid numbers
const finalProgress = ensureValidProgress(totalProducts, totalProducts);
@@ -578,14 +409,14 @@ async function calculateMetrics() {
operation: 'Metrics calculation complete',
current: finalProgress.current,
total: finalProgress.total,
elapsed: formatElapsedTime(startTime),
elapsed: global.formatElapsedTime(startTime),
remaining: '0s',
rate: calculateRate(startTime, finalProgress.current),
rate: global.calculateRate(startTime, finalProgress.current),
percentage: '100',
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: totalElapsedSeconds
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
@@ -601,13 +432,13 @@ async function calculateMetrics() {
UPDATE calculate_history
SET
end_time = NOW(),
duration_seconds = ?,
processed_products = ?,
processed_orders = ?,
processed_purchase_orders = ?,
duration_seconds = $1,
processed_products = $2,
processed_orders = $3,
processed_purchase_orders = $4,
status = 'completed'
WHERE id = ?
`, [totalElapsedSeconds,
WHERE id = $5
`, [Math.round((Date.now() - startTime) / 1000),
finalStats.processedProducts,
finalStats.processedOrders,
finalStats.processedPurchaseOrders,
@@ -616,6 +447,11 @@ async function calculateMetrics() {
// Clear progress file on successful completion
global.clearProgress();
return {
success: true,
message: 'Calculation completed successfully',
duration: Math.round((Date.now() - startTime) / 1000)
};
} catch (error) {
const endTime = Date.now();
const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
@@ -625,13 +461,13 @@ async function calculateMetrics() {
UPDATE calculate_history
SET
end_time = NOW(),
duration_seconds = ?,
processed_products = ?,
processed_orders = ?,
processed_purchase_orders = ?,
status = ?,
error_message = ?
WHERE id = ?
duration_seconds = $1,
processed_products = $2,
processed_orders = $3,
processed_purchase_orders = $4,
status = $5,
error_message = $6
WHERE id = $7
`, [
totalElapsedSeconds,
processedProducts || 0, // Ensure we have a valid number
@@ -677,17 +513,38 @@ async function calculateMetrics() {
}
throw error;
} finally {
// Clear the timeout to prevent forced termination
clearTimeout(timeout);
// Always clean up and release connection
if (connection) {
// Ensure temporary tables are cleaned up
await cleanupTemporaryTables(connection);
connection.release();
try {
await cleanupTemporaryTables(connection);
connection.release();
} catch (err) {
console.error('Error in final cleanup:', err);
}
}
// Close the connection pool when we're done
await closePool();
}
} catch (error) {
success = false;
logError(error, 'Error in metrics calculation');
console.error('Error in metrics calculation', error);
try {
if (connection) {
await connection.query(`
UPDATE calculate_history
SET
status = 'error',
end_time = NOW(),
duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
error_message = $1
WHERE id = $2
`, [error.message.substring(0, 500), calculateHistoryId]);
}
} catch (updateError) {
console.error('Error updating calculation history:', updateError);
}
throw error;
}
}

View File

@@ -10,9 +10,9 @@ const importPurchaseOrders = require('./import/purchase-orders');
dotenv.config({ path: path.join(__dirname, "../.env") });
// Constants to control which imports run
const IMPORT_CATEGORIES = true;
const IMPORT_PRODUCTS = true;
const IMPORT_ORDERS = true;
const IMPORT_CATEGORIES = false;
const IMPORT_PRODUCTS = false;
const IMPORT_ORDERS = false;
const IMPORT_PURCHASE_ORDERS = true;
// Add flag for incremental updates
@@ -120,27 +120,38 @@ async function main() {
`);
// Create import history record for the overall session
const [historyResult] = await localConnection.query(`
INSERT INTO import_history (
table_name,
start_time,
is_incremental,
status,
additional_info
) VALUES (
'all_tables',
NOW(),
$1::boolean,
'running',
jsonb_build_object(
'categories_enabled', $2::boolean,
'products_enabled', $3::boolean,
'orders_enabled', $4::boolean,
'purchase_orders_enabled', $5::boolean
)
) RETURNING id
`, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
importHistoryId = historyResult.rows[0].id;
try {
const [historyResult] = await localConnection.query(`
INSERT INTO import_history (
table_name,
start_time,
is_incremental,
status,
additional_info
) VALUES (
'all_tables',
NOW(),
$1::boolean,
'running',
jsonb_build_object(
'categories_enabled', $2::boolean,
'products_enabled', $3::boolean,
'orders_enabled', $4::boolean,
'purchase_orders_enabled', $5::boolean
)
) RETURNING id
`, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
importHistoryId = historyResult.rows[0].id;
} catch (error) {
console.error("Error creating import history record:", error);
outputProgress({
status: "error",
operation: "Import process",
message: "Failed to create import history record",
error: error.message
});
throw error;
}
const results = {
categories: null,
@@ -158,8 +169,8 @@ async function main() {
if (isImportCancelled) throw new Error("Import cancelled");
completedSteps++;
console.log('Categories import result:', results.categories);
totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0);
totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0);
totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0) || 0;
totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0) || 0;
}
if (IMPORT_PRODUCTS) {
@@ -167,8 +178,8 @@ async function main() {
if (isImportCancelled) throw new Error("Import cancelled");
completedSteps++;
console.log('Products import result:', results.products);
totalRecordsAdded += parseInt(results.products?.recordsAdded || 0);
totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0);
totalRecordsAdded += parseInt(results.products?.recordsAdded || 0) || 0;
totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0) || 0;
}
if (IMPORT_ORDERS) {
@@ -176,17 +187,34 @@ async function main() {
if (isImportCancelled) throw new Error("Import cancelled");
completedSteps++;
console.log('Orders import result:', results.orders);
totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0);
totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0);
totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0) || 0;
totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0) || 0;
}
if (IMPORT_PURCHASE_ORDERS) {
results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
if (isImportCancelled) throw new Error("Import cancelled");
completedSteps++;
console.log('Purchase orders import result:', results.purchaseOrders);
totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0);
totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0);
try {
results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
if (isImportCancelled) throw new Error("Import cancelled");
completedSteps++;
console.log('Purchase orders import result:', results.purchaseOrders);
// Handle potential error status
if (results.purchaseOrders?.status === 'error') {
console.error('Purchase orders import had an error:', results.purchaseOrders.error);
} else {
totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0) || 0;
totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0) || 0;
}
} catch (error) {
console.error('Error during purchase orders import:', error);
// Continue with other imports, don't fail the whole process
results.purchaseOrders = {
status: 'error',
error: error.message,
recordsAdded: 0,
recordsUpdated: 0
};
}
}
const endTime = Date.now();
@@ -214,8 +242,8 @@ async function main() {
WHERE id = $12
`, [
totalElapsedSeconds,
totalRecordsAdded,
totalRecordsUpdated,
parseInt(totalRecordsAdded) || 0,
parseInt(totalRecordsUpdated) || 0,
IMPORT_CATEGORIES,
IMPORT_PRODUCTS,
IMPORT_ORDERS,

View File

@@ -47,42 +47,18 @@ async function importCategories(prodConnection, localConnection) {
continue;
}
console.log(`\nProcessing ${categories.length} type ${type} categories`);
if (type === 10) {
console.log("Type 10 categories:", JSON.stringify(categories, null, 2));
}
console.log(`Processing ${categories.length} type ${type} categories`);
// For types that can have parents (11, 21, 12, 13), verify parent existence
// For types that can have parents (11, 21, 12, 13), we'll proceed directly
// No need to check for parent existence since we process in hierarchical order
let categoriesToInsert = categories;
if (![10, 20].includes(type)) {
// Get all parent IDs
const parentIds = [
...new Set(
categories
.filter(c => c && c.parent_id !== null)
.map(c => c.parent_id)
),
];
console.log(`Processing ${categories.length} type ${type} categories with ${parentIds.length} unique parent IDs`);
console.log('Parent IDs:', parentIds);
// No need to check for parent existence - we trust they exist since they were just inserted
categoriesToInsert = categories;
}
if (categoriesToInsert.length === 0) {
console.log(
`No valid categories of type ${type} to insert`
);
console.log(`No valid categories of type ${type} to insert`);
await localConnection.query(`RELEASE SAVEPOINT category_type_${type}`);
continue;
}
console.log(
`Inserting ${categoriesToInsert.length} type ${type} categories`
);
// PostgreSQL upsert query with parameterized values
const values = categoriesToInsert.flatMap((cat) => [
cat.cat_id,
@@ -95,14 +71,10 @@ async function importCategories(prodConnection, localConnection) {
new Date()
]);
console.log('Attempting to insert/update with values:', JSON.stringify(values, null, 2));
const placeholders = categoriesToInsert
.map((_, i) => `($${i * 8 + 1}, $${i * 8 + 2}, $${i * 8 + 3}, $${i * 8 + 4}, $${i * 8 + 5}, $${i * 8 + 6}, $${i * 8 + 7}, $${i * 8 + 8})`)
.join(',');
console.log('Using placeholders:', placeholders);
// Insert categories with ON CONFLICT clause for PostgreSQL
const query = `
WITH inserted_categories AS (
@@ -129,17 +101,14 @@ async function importCategories(prodConnection, localConnection) {
COUNT(*) FILTER (WHERE is_insert) as inserted,
COUNT(*) FILTER (WHERE NOT is_insert) as updated
FROM inserted_categories`;
console.log('Executing query:', query);
const result = await localConnection.query(query, values);
console.log('Query result:', result);
// Get the first result since query returns an array
const queryResult = Array.isArray(result) ? result[0] : result;
if (!queryResult || !queryResult.rows || !queryResult.rows[0]) {
console.error('Query failed to return results. Result:', queryResult);
console.error('Query failed to return results');
throw new Error('Query did not return expected results');
}
@@ -173,6 +142,14 @@ async function importCategories(prodConnection, localConnection) {
// Commit the entire transaction - we'll do this even if we have skipped categories
await localConnection.query('COMMIT');
// Update sync status
await localConnection.query(`
INSERT INTO sync_status (table_name, last_sync_timestamp)
VALUES ('categories', NOW())
ON CONFLICT (table_name) DO UPDATE SET
last_sync_timestamp = NOW()
`);
outputProgress({
status: "complete",
operation: "Categories import completed",

View File

@@ -26,6 +26,9 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
let cumulativeProcessedOrders = 0;
try {
// Begin transaction
await localConnection.beginTransaction();
// Get last sync info
const [syncInfo] = await localConnection.query(
"SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'orders'"
@@ -38,7 +41,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
const [[{ total }]] = await prodConnection.query(`
SELECT COUNT(*) as total
FROM order_items oi
USE INDEX (PRIMARY)
JOIN _order o ON oi.order_id = o.order_id
WHERE o.order_status >= 15
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
@@ -78,7 +80,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
COALESCE(oi.prod_price_reg - oi.prod_price, 0) as base_discount,
oi.stamp as last_modified
FROM order_items oi
USE INDEX (PRIMARY)
JOIN _order o ON oi.order_id = o.order_id
WHERE o.order_status >= 15
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
@@ -105,15 +106,15 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
console.log('Orders: Found', orderItems.length, 'order items to process');
// Create tables in PostgreSQL for debugging
// Create tables in PostgreSQL for data processing
await localConnection.query(`
DROP TABLE IF EXISTS debug_order_items;
DROP TABLE IF EXISTS debug_order_meta;
DROP TABLE IF EXISTS debug_order_discounts;
DROP TABLE IF EXISTS debug_order_taxes;
DROP TABLE IF EXISTS debug_order_costs;
DROP TABLE IF EXISTS temp_order_items;
DROP TABLE IF EXISTS temp_order_meta;
DROP TABLE IF EXISTS temp_order_discounts;
DROP TABLE IF EXISTS temp_order_taxes;
DROP TABLE IF EXISTS temp_order_costs;
CREATE TABLE debug_order_items (
CREATE TEMP TABLE temp_order_items (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
SKU VARCHAR(50) NOT NULL,
@@ -123,7 +124,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
PRIMARY KEY (order_id, pid)
);
CREATE TABLE debug_order_meta (
CREATE TEMP TABLE temp_order_meta (
order_id INTEGER NOT NULL,
date DATE NOT NULL,
customer VARCHAR(100) NOT NULL,
@@ -135,26 +136,29 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
PRIMARY KEY (order_id)
);
CREATE TABLE debug_order_discounts (
CREATE TEMP TABLE temp_order_discounts (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
discount DECIMAL(10,2) NOT NULL,
PRIMARY KEY (order_id, pid)
);
CREATE TABLE debug_order_taxes (
CREATE TEMP TABLE temp_order_taxes (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
tax DECIMAL(10,2) NOT NULL,
PRIMARY KEY (order_id, pid)
);
CREATE TABLE debug_order_costs (
CREATE TEMP TABLE temp_order_costs (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
costeach DECIMAL(10,3) DEFAULT 0.000,
PRIMARY KEY (order_id, pid)
);
CREATE INDEX idx_temp_order_items_pid ON temp_order_items(pid);
CREATE INDEX idx_temp_order_meta_order_id ON temp_order_meta(order_id);
`);
// Insert order items in batches
@@ -168,7 +172,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
]);
await localConnection.query(`
INSERT INTO debug_order_items (order_id, pid, SKU, price, quantity, base_discount)
INSERT INTO temp_order_items (order_id, pid, SKU, price, quantity, base_discount)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
SKU = EXCLUDED.SKU,
@@ -202,6 +206,14 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
const METADATA_BATCH_SIZE = 2000;
const PG_BATCH_SIZE = 200;
// Add a helper function for title case conversion
function toTitleCase(str) {
if (!str) return '';
return str.toLowerCase().split(' ').map(word => {
return word.charAt(0).toUpperCase() + word.slice(1);
}).join(' ');
}
const processMetadataBatch = async (batchIds) => {
const [orders] = await prodConnection.query(`
SELECT
@@ -231,7 +243,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
order.order_id,
order.date,
order.customer,
order.customer_name || '',
toTitleCase(order.customer_name) || '',
order.status,
order.canceled,
order.summary_discount || 0,
@@ -239,7 +251,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
]);
await localConnection.query(`
INSERT INTO debug_order_meta (
INSERT INTO temp_order_meta (
order_id, date, customer, customer_name, status, canceled,
summary_discount, summary_subtotal
)
@@ -281,7 +293,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
]);
await localConnection.query(`
INSERT INTO debug_order_discounts (order_id, pid, discount)
INSERT INTO temp_order_discounts (order_id, pid, discount)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
discount = EXCLUDED.discount
@@ -321,7 +333,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
]);
await localConnection.query(`
INSERT INTO debug_order_taxes (order_id, pid, tax)
INSERT INTO temp_order_taxes (order_id, pid, tax)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
tax = EXCLUDED.tax
@@ -330,14 +342,23 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
};
const processCostsBatch = async (batchIds) => {
// Modified query to ensure one row per order_id/pid by using a subquery
const [costs] = await prodConnection.query(`
SELECT
oc.orderid as order_id,
oc.pid,
oc.costeach
FROM order_costs oc
WHERE oc.orderid IN (?)
AND oc.pending = 0
INNER JOIN (
SELECT
orderid,
pid,
MAX(id) as max_id
FROM order_costs
WHERE orderid IN (?)
AND pending = 0
GROUP BY orderid, pid
) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
`, [batchIds]);
if (costs.length === 0) return;
@@ -357,7 +378,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
]);
await localConnection.query(`
INSERT INTO debug_order_costs (order_id, pid, costeach)
INSERT INTO temp_order_costs (order_id, pid, costeach)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
costeach = EXCLUDED.costeach
@@ -416,11 +437,12 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
oi.pid,
SUM(COALESCE(od.discount, 0)) as promo_discount,
COALESCE(ot.tax, 0) as total_tax,
COALESCE(oi.price * 0.5, 0) as costeach
FROM debug_order_items oi
LEFT JOIN debug_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
LEFT JOIN debug_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
GROUP BY oi.order_id, oi.pid, ot.tax
COALESCE(oc.costeach, oi.price * 0.5) as costeach
FROM temp_order_items oi
LEFT JOIN temp_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
LEFT JOIN temp_order_costs oc ON oi.order_id = oc.order_id AND oi.pid = oc.pid
GROUP BY oi.order_id, oi.pid, ot.tax, oc.costeach
)
SELECT
oi.order_id as order_number,
@@ -447,11 +469,11 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
FROM (
SELECT DISTINCT ON (order_id, pid)
order_id, pid, SKU, price, quantity, base_discount
FROM debug_order_items
FROM temp_order_items
WHERE order_id = ANY($1)
ORDER BY order_id, pid
) oi
JOIN debug_order_meta om ON oi.order_id = om.order_id
JOIN temp_order_meta om ON oi.order_id = om.order_id
LEFT JOIN order_totals ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
ORDER BY oi.order_id, oi.pid
`, [subBatchIds]);
@@ -478,8 +500,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
const subBatch = validOrders.slice(k, k + FINAL_BATCH_SIZE);
const placeholders = subBatch.map((_, idx) => {
const base = idx * 14; // 14 columns (removed updated)
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14})`;
const base = idx * 15; // 15 columns including costeach
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
}).join(',');
const batchValues = subBatch.flatMap(o => [
@@ -496,7 +518,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
o.customer,
o.customer_name,
o.status,
o.canceled
o.canceled,
o.costeach
]);
const [result] = await localConnection.query(`
@@ -504,7 +527,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
INSERT INTO orders (
order_number, pid, sku, date, price, quantity, discount,
tax, tax_included, shipping, customer, customer_name,
status, canceled
status, canceled, costeach
)
VALUES ${placeholders}
ON CONFLICT (order_number, pid) DO UPDATE SET
@@ -519,7 +542,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
customer = EXCLUDED.customer,
customer_name = EXCLUDED.customer_name,
status = EXCLUDED.status,
canceled = EXCLUDED.canceled
canceled = EXCLUDED.canceled,
costeach = EXCLUDED.costeach
RETURNING xmax = 0 as inserted
)
SELECT
@@ -529,8 +553,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
`, batchValues);
const { inserted, updated } = result.rows[0];
recordsAdded += inserted;
recordsUpdated += updated;
recordsAdded += parseInt(inserted) || 0;
recordsUpdated += parseInt(updated) || 0;
importedCount += subBatch.length;
}
@@ -555,19 +579,39 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
ON CONFLICT (table_name) DO UPDATE SET
last_sync_timestamp = NOW()
`);
// Cleanup temporary tables
await localConnection.query(`
DROP TABLE IF EXISTS temp_order_items;
DROP TABLE IF EXISTS temp_order_meta;
DROP TABLE IF EXISTS temp_order_discounts;
DROP TABLE IF EXISTS temp_order_taxes;
DROP TABLE IF EXISTS temp_order_costs;
`);
// Commit transaction
await localConnection.commit();
return {
status: "complete",
totalImported: Math.floor(importedCount),
recordsAdded: recordsAdded || 0,
recordsUpdated: Math.floor(recordsUpdated),
totalSkipped: skippedOrders.size,
missingProducts: missingProducts.size,
totalImported: Math.floor(importedCount) || 0,
recordsAdded: parseInt(recordsAdded) || 0,
recordsUpdated: parseInt(recordsUpdated) || 0,
totalSkipped: skippedOrders.size || 0,
missingProducts: missingProducts.size || 0,
incrementalUpdate,
lastSyncTime
};
} catch (error) {
console.error("Error during orders import:", error);
// Rollback transaction
try {
await localConnection.rollback();
} catch (rollbackError) {
console.error("Error during rollback:", rollbackError);
}
throw error;
}
}
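Note on the upsert above: it distinguishes inserts from updates with `RETURNING xmax = 0`, since a row created by the INSERT branch has `xmax = 0` while a row rewritten by `ON CONFLICT ... DO UPDATE` does not. A minimal standalone sketch of the pattern, using an illustrative `items` table rather than the project's schema:

```js
const { Pool } = require('pg');

// Upsert one row and report whether it was inserted or updated.
async function upsertItem(pool, id, name) {
  const result = await pool.query(`
    INSERT INTO items (id, name)
    VALUES ($1, $2)
    ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name
    RETURNING (xmax = 0) AS inserted
  `, [id, name]);
  return result.rows[0].inserted ? 'inserted' : 'updated';
}
```

Aggregating that boolean over a batch is what lets the import report `recordsAdded` and `recordsUpdated` from a single round trip.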

View File

@@ -1,10 +1,13 @@
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');
const BATCH_SIZE = 100; // Smaller batch size for better progress tracking
const BATCH_SIZE = 1000; // Larger batch size for faster imports
const MAX_RETRIES = 3;
const RETRY_DELAY = 5000; // 5 seconds
const dotenv = require("dotenv");
const path = require("path");
dotenv.config({ path: path.join(__dirname, "../../.env") });
// Utility functions
const imageUrlBase = 'https://sbing.com/i/products/0000/';
const imageUrlBase = process.env.PRODUCT_IMAGE_URL_BASE || 'https://sbing.com/i/products/0000/';
const getImageUrls = (pid, iid = 1) => {
const paddedPid = pid.toString().padStart(6, '0');
// Use padded PID only for the first 3 digits
@@ -18,7 +21,7 @@ const getImageUrls = (pid, iid = 1) => {
};
};
// Add helper function for retrying operations
// Add helper function for retrying operations with exponential backoff
async function withRetry(operation, errorMessage) {
let lastError;
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
@@ -28,7 +31,8 @@ async function withRetry(operation, errorMessage) {
lastError = error;
console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
if (attempt < MAX_RETRIES) {
await new Promise(resolve => setTimeout(resolve, RETRY_DELAY));
const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
await new Promise(resolve => setTimeout(resolve, backoffTime));
}
}
}
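The retry helper above doubles its delay after each failed attempt. Reassembled from the fragments in this hunk, the whole function looks roughly like this (the trailing throw is assumed, since the diff does not show the function's tail):

```js
const MAX_RETRIES = 3;
const RETRY_DELAY = 5000; // base delay in milliseconds

// Run an async operation, retrying with exponentially growing delays: 5s, 10s, 20s, ...
async function withRetry(operation, errorMessage) {
  let lastError;
  for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
    try {
      return await operation();
    } catch (error) {
      lastError = error;
      console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
      if (attempt < MAX_RETRIES) {
        const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
        await new Promise(resolve => setTimeout(resolve, backoffTime));
      }
    }
  }
  throw lastError; // assumed: surface the final error after all attempts fail
}
```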
@@ -140,10 +144,12 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
CASE
WHEN p.reorder < 0 THEN 0
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
WHEN (
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
) THEN 0
ELSE 1
END AS replenishable,
@@ -155,7 +161,11 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
COALESCE(p.sellingprice, 0) AS regular_price,
CASE
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
THEN (
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
FROM product_inventory
WHERE pid = p.pid AND count > 0
)
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
END AS cost_price,
NULL as landing_cost_price,
@@ -183,7 +193,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
p.country_of_origin,
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
p.totalsold AS total_sold,
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
pls.date_sold as date_last_sold,
GROUP_CONCAT(DISTINCT CASE
WHEN pc.cat_id IS NOT NULL
@@ -233,7 +243,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
row.pid,
row.title,
row.description,
row.itemnumber || '',
row.sku || '',
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
row.preorder_count,
row.notions_inv_count,
@@ -335,10 +345,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
CASE
WHEN p.reorder < 0 THEN 0
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
WHEN (
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
) THEN 0
ELSE 1
END AS replenishable,
@@ -350,7 +362,11 @@ async function materializeCalculations(prodConnection, localConnection, incremen
COALESCE(p.sellingprice, 0) AS regular_price,
CASE
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
THEN (
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
FROM product_inventory
WHERE pid = p.pid AND count > 0
)
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
END AS cost_price,
NULL as landing_cost_price,
@@ -378,7 +394,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
p.country_of_origin,
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
p.totalsold AS total_sold,
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
pls.date_sold as date_last_sold,
GROUP_CONCAT(DISTINCT CASE
WHEN pc.cat_id IS NOT NULL
@@ -432,7 +448,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
row.pid,
row.title,
row.description,
row.itemnumber || '',
row.sku || '',
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
row.preorder_count,
row.notions_inv_count,
@@ -772,32 +788,44 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
recordsAdded += parseInt(result.rows[0].inserted, 10) || 0;
recordsUpdated += parseInt(result.rows[0].updated, 10) || 0;
// Process category relationships for each product in the batch
// Process category relationships in batches
const allCategories = [];
for (const row of batch) {
if (row.categories) {
const categoryIds = row.categories.split(',').filter(id => id && id.trim());
if (categoryIds.length > 0) {
const catPlaceholders = categoryIds.map((_, idx) =>
`($${idx * 2 + 1}, $${idx * 2 + 2})`
).join(',');
const catValues = categoryIds.flatMap(catId => [row.pid, parseInt(catId.trim(), 10)]);
// First delete existing relationships for this product
await localConnection.query(
'DELETE FROM product_categories WHERE pid = $1',
[row.pid]
);
// Then insert the new relationships
await localConnection.query(`
INSERT INTO product_categories (pid, cat_id)
VALUES ${catPlaceholders}
ON CONFLICT (pid, cat_id) DO NOTHING
`, catValues);
categoryIds.forEach(catId => {
allCategories.push([row.pid, parseInt(catId.trim(), 10)]);
});
}
}
}
// If we have categories to process
if (allCategories.length > 0) {
// First get all products in this batch
const productIds = batch.map(p => p.pid);
// Delete all existing relationships for products in this batch
await localConnection.query(
'DELETE FROM product_categories WHERE pid = ANY($1)',
[productIds]
);
// Insert all new relationships in one batch
const catPlaceholders = allCategories.map((_, idx) =>
`($${idx * 2 + 1}, $${idx * 2 + 2})`
).join(',');
const catValues = allCategories.flat();
await localConnection.query(`
INSERT INTO product_categories (pid, cat_id)
VALUES ${catPlaceholders}
ON CONFLICT (pid, cat_id) DO NOTHING
`, catValues);
}
outputProgress({
status: "running",
operation: "Products import",
@@ -816,6 +844,14 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
// Commit the transaction
await localConnection.commit();
// Update sync status
await localConnection.query(`
INSERT INTO sync_status (table_name, last_sync_timestamp)
VALUES ('products', NOW())
ON CONFLICT (table_name) DO UPDATE SET
last_sync_timestamp = NOW()
`);
return {
status: 'complete',
recordsAdded,

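The cost_price expression in the product queries above moves from a plain AVG(costeach) to a quantity-weighted average, so large lots pull the unit cost toward their price. A small sketch with made-up lot sizes (the table and columns mirror the query above; the wrapper function is illustrative):

```js
// Two inventory lots: 10 units at $2.00 and 30 units at $4.00.
//   Simple average of costeach:  (2 + 4) / 2        = 3.00
//   Weighted by on-hand count:   (10*2 + 30*4) / 40 = 3.50
async function weightedCost(prodConnection, pid) {
  const [rows] = await prodConnection.query(`
    SELECT ROUND(SUM(costeach * count) / SUM(count), 5) AS cost_price
    FROM product_inventory
    WHERE pid = ? AND count > 0
  `, [pid]);
  return rows[0].cost_price;
}
```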
File diff suppressed because it is too large

View File

@@ -32,12 +32,12 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
}
// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);
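The change just above is one instance of a conversion repeated throughout the calculate scripts: mysql2 resolves query() to a [rows, fields] tuple, while node-postgres resolves to a result object whose rows property holds the data and returns COUNT(*) (a bigint) as a string. A minimal side-by-side sketch (the query is illustrative):

```js
// Illustrative only: the two drivers expose query results differently.
async function countOrders(mysqlConnection, pgPool) {
  // mysql2/promise: destructure rows from the [rows, fields] tuple
  const [rows] = await mysqlConnection.query('SELECT COUNT(*) AS count FROM orders');
  const mysqlCount = rows[0].count;

  // node-postgres: read result.rows; COUNT(*) arrives as a string and needs parseInt
  const result = await pgPool.query('SELECT COUNT(*) AS count FROM orders');
  const pgCount = parseInt(result.rows[0].count, 10);

  return { mysqlCount, pgCount };
}
```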
outputProgress({
status: 'running',
@@ -98,14 +98,14 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
SUM(o.quantity * (o.price - COALESCE(o.discount, 0) - p.cost_price)) as period_margin,
COUNT(DISTINCT DATE(o.date)) as period_days,
CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH) THEN 'current'
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH) THEN 'previous'
WHEN o.date >= CURRENT_DATE - INTERVAL '3 months' THEN 'current'
WHEN o.date BETWEEN CURRENT_DATE - INTERVAL '15 months'
AND CURRENT_DATE - INTERVAL '12 months' THEN 'previous'
END as period_type
FROM filtered_products p
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
AND o.date >= CURRENT_DATE - INTERVAL '15 months'
GROUP BY p.brand, period_type
),
brand_data AS (
@@ -165,15 +165,16 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
LEFT JOIN sales_periods sp ON bd.brand = sp.brand
GROUP BY bd.brand, bd.product_count, bd.active_products, bd.total_stock_units,
bd.total_stock_cost, bd.total_stock_retail, bd.total_revenue, bd.avg_margin
ON DUPLICATE KEY UPDATE
product_count = VALUES(product_count),
active_products = VALUES(active_products),
total_stock_units = VALUES(total_stock_units),
total_stock_cost = VALUES(total_stock_cost),
total_stock_retail = VALUES(total_stock_retail),
total_revenue = VALUES(total_revenue),
avg_margin = VALUES(avg_margin),
growth_rate = VALUES(growth_rate),
ON CONFLICT (brand) DO UPDATE
SET
product_count = EXCLUDED.product_count,
active_products = EXCLUDED.active_products,
total_stock_units = EXCLUDED.total_stock_units,
total_stock_cost = EXCLUDED.total_stock_cost,
total_stock_retail = EXCLUDED.total_stock_retail,
total_revenue = EXCLUDED.total_revenue,
avg_margin = EXCLUDED.avg_margin,
growth_rate = EXCLUDED.growth_rate,
last_calculated_at = CURRENT_TIMESTAMP
`);
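The upsert rewrite above follows the general MySQL-to-PostgreSQL mapping: ON DUPLICATE KEY UPDATE col = VALUES(col) becomes ON CONFLICT (...) DO UPDATE SET col = EXCLUDED.col, and the conflict target must name columns covered by a unique constraint. A minimal sketch of the shape (the unique constraint on brand is assumed from the conflict target used above; the column list is trimmed for brevity):

```js
// MySQL form, for reference:
//   INSERT INTO brand_metrics (brand, total_revenue) VALUES (?, ?)
//   ON DUPLICATE KEY UPDATE total_revenue = VALUES(total_revenue);

// PostgreSQL form: EXCLUDED refers to the row that was proposed for insertion.
async function upsertBrandRevenue(pool, brand, totalRevenue) {
  await pool.query(`
    INSERT INTO brand_metrics (brand, total_revenue)
    VALUES ($1, $2)
    ON CONFLICT (brand) DO UPDATE
    SET total_revenue = EXCLUDED.total_revenue
  `, [brand, totalRevenue]);
}
```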
@@ -230,8 +231,8 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
monthly_metrics AS (
SELECT
p.brand,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
COUNT(DISTINCT p.valid_pid) as product_count,
COUNT(DISTINCT p.active_pid) as active_products,
SUM(p.valid_stock) as total_stock_units,
@@ -255,19 +256,20 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
END as avg_margin
FROM filtered_products p
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY p.brand, YEAR(o.date), MONTH(o.date)
WHERE o.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.brand, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
)
SELECT *
FROM monthly_metrics
ON DUPLICATE KEY UPDATE
product_count = VALUES(product_count),
active_products = VALUES(active_products),
total_stock_units = VALUES(total_stock_units),
total_stock_cost = VALUES(total_stock_cost),
total_stock_retail = VALUES(total_stock_retail),
total_revenue = VALUES(total_revenue),
avg_margin = VALUES(avg_margin)
ON CONFLICT (brand, year, month) DO UPDATE
SET
product_count = EXCLUDED.product_count,
active_products = EXCLUDED.active_products,
total_stock_units = EXCLUDED.total_stock_units,
total_stock_cost = EXCLUDED.total_stock_cost,
total_stock_retail = EXCLUDED.total_stock_retail,
total_revenue = EXCLUDED.total_revenue,
avg_margin = EXCLUDED.avg_margin
`);
processedCount = Math.floor(totalProducts * 0.99);
@@ -294,7 +296,8 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('brand_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
return {

View File

@@ -32,12 +32,12 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
}
// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);
outputProgress({
status: 'running',
@@ -76,12 +76,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
LEFT JOIN product_categories pc ON c.cat_id = pc.cat_id
LEFT JOIN products p ON pc.pid = p.pid
GROUP BY c.cat_id, c.status
ON DUPLICATE KEY UPDATE
product_count = VALUES(product_count),
active_products = VALUES(active_products),
total_value = VALUES(total_value),
status = VALUES(status),
last_calculated_at = VALUES(last_calculated_at)
ON CONFLICT (category_id) DO UPDATE
SET
product_count = EXCLUDED.product_count,
active_products = EXCLUDED.active_products,
total_value = EXCLUDED.total_value,
status = EXCLUDED.status,
last_calculated_at = EXCLUDED.last_calculated_at
`);
processedCount = Math.floor(totalProducts * 0.90);
@@ -127,17 +128,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
(tc.category_id IS NULL AND tc.vendor = p.vendor) OR
(tc.category_id IS NULL AND tc.vendor IS NULL)
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL COALESCE(tc.calculation_period_days, 30) DAY)
AND o.date >= CURRENT_DATE - (COALESCE(tc.calculation_period_days, 30) || ' days')::INTERVAL
GROUP BY pc.cat_id
)
UPDATE category_metrics cm
JOIN category_sales cs ON cm.category_id = cs.cat_id
LEFT JOIN turnover_config tc ON
(tc.category_id = cm.category_id AND tc.vendor IS NULL) OR
(tc.category_id IS NULL AND tc.vendor IS NULL)
UPDATE category_metrics
SET
cm.avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
cm.turnover_rate = CASE
avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
turnover_rate = CASE
WHEN cs.avg_stock > 0 AND cs.active_days > 0
THEN LEAST(
(cs.units_sold / cs.avg_stock) * (365.0 / cs.active_days),
@@ -145,7 +142,9 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
)
ELSE 0
END,
cm.last_calculated_at = NOW()
last_calculated_at = NOW()
FROM category_sales cs
WHERE category_id = cs.cat_id
`);
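The statement above also shows the second recurring rewrite: PostgreSQL has no UPDATE ... JOIN, so the joined relation moves into a FROM clause, the join condition into WHERE, and the SET list drops the target table's alias. A reduced sketch, with category_sales standing in for the CTE used above and a trimmed column list:

```js
// MySQL form, for reference:
//   UPDATE category_metrics cm
//   JOIN category_sales cs ON cm.category_id = cs.cat_id
//   SET cm.avg_margin = cs.margin;

// PostgreSQL form: SET uses unqualified target columns; the join moves to FROM/WHERE.
async function updateCategoryMargins(pool) {
  await pool.query(`
    UPDATE category_metrics cm
    SET avg_margin = cs.margin,
        last_calculated_at = NOW()
    FROM category_sales cs
    WHERE cm.category_id = cs.cat_id
  `);
}
```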
processedCount = Math.floor(totalProducts * 0.95);
@@ -184,9 +183,9 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
FROM product_categories pc
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
AND o.date >= CURRENT_DATE - INTERVAL '3 months'
GROUP BY pc.cat_id
),
previous_period AS (
@@ -198,26 +197,26 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
FROM product_categories pc
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
WHERE o.canceled = false
AND o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
AND o.date BETWEEN CURRENT_DATE - INTERVAL '15 months'
AND CURRENT_DATE - INTERVAL '12 months'
GROUP BY pc.cat_id
),
trend_data AS (
SELECT
pc.cat_id,
MONTH(o.date) as month,
EXTRACT(MONTH FROM o.date) as month,
SUM(o.quantity * (o.price - COALESCE(o.discount, 0)) /
(1 + COALESCE(ss.seasonality_factor, 0))) as revenue,
COUNT(DISTINCT DATE(o.date)) as days_in_month
FROM product_categories pc
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
GROUP BY pc.cat_id, MONTH(o.date)
AND o.date >= CURRENT_DATE - INTERVAL '15 months'
GROUP BY pc.cat_id, EXTRACT(MONTH FROM o.date)
),
trend_stats AS (
SELECT
@@ -261,16 +260,42 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
AND o.date >= CURRENT_DATE - INTERVAL '3 months'
GROUP BY pc.cat_id
),
combined_metrics AS (
SELECT
COALESCE(cp.cat_id, pp.cat_id) as category_id,
CASE
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
WHEN ta.trend_slope IS NOT NULL THEN
GREATEST(
-100.0,
LEAST(
(ta.trend_slope / NULLIF(ta.avg_daily_revenue, 0)) * 365 * 100,
999.99
)
)
ELSE
GREATEST(
-100.0,
LEAST(
((COALESCE(cp.revenue, 0) - pp.revenue) /
NULLIF(ABS(pp.revenue), 0)) * 100.0,
999.99
)
)
END as growth_rate,
mc.avg_margin
FROM current_period cp
FULL OUTER JOIN previous_period pp ON cp.cat_id = pp.cat_id
LEFT JOIN trend_analysis ta ON COALESCE(cp.cat_id, pp.cat_id) = ta.cat_id
LEFT JOIN margin_calc mc ON COALESCE(cp.cat_id, pp.cat_id) = mc.cat_id
)
UPDATE category_metrics cm
LEFT JOIN current_period cp ON cm.category_id = cp.cat_id
LEFT JOIN previous_period pp ON cm.category_id = pp.cat_id
LEFT JOIN trend_analysis ta ON cm.category_id = ta.cat_id
LEFT JOIN margin_calc mc ON cm.category_id = mc.cat_id
SET
cm.growth_rate = CASE
growth_rate = CASE
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
WHEN ta.trend_slope IS NOT NULL THEN
@@ -291,9 +316,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
)
)
END,
cm.avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
cm.last_calculated_at = NOW()
WHERE cp.cat_id IS NOT NULL OR pp.cat_id IS NOT NULL
avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
last_calculated_at = NOW()
FROM current_period cp
FULL OUTER JOIN previous_period pp ON cp.cat_id = pp.cat_id
LEFT JOIN trend_analysis ta ON COALESCE(cp.cat_id, pp.cat_id) = ta.cat_id
LEFT JOIN margin_calc mc ON COALESCE(cp.cat_id, pp.cat_id) = mc.cat_id
WHERE cm.category_id = COALESCE(cp.cat_id, pp.cat_id)
`);
processedCount = Math.floor(totalProducts * 0.97);
@@ -335,8 +364,8 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
)
SELECT
pc.cat_id,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
COUNT(DISTINCT p.pid) as product_count,
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.pid END) as active_products,
SUM(p.stock_quantity * p.cost_price) as total_value,
@@ -364,15 +393,16 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY pc.cat_id, YEAR(o.date), MONTH(o.date)
ON DUPLICATE KEY UPDATE
product_count = VALUES(product_count),
active_products = VALUES(active_products),
total_value = VALUES(total_value),
total_revenue = VALUES(total_revenue),
avg_margin = VALUES(avg_margin),
turnover_rate = VALUES(turnover_rate)
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY pc.cat_id, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
ON CONFLICT (category_id, year, month) DO UPDATE
SET
product_count = EXCLUDED.product_count,
active_products = EXCLUDED.active_products,
total_value = EXCLUDED.total_value,
total_revenue = EXCLUDED.total_revenue,
avg_margin = EXCLUDED.avg_margin,
turnover_rate = EXCLUDED.turnover_rate
`);
processedCount = Math.floor(totalProducts * 0.99);
@@ -414,20 +444,20 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
)
WITH date_ranges AS (
SELECT
DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY) as period_start,
CURRENT_DATE - INTERVAL '30 days' as period_start,
CURRENT_DATE as period_end
UNION ALL
SELECT
DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY),
DATE_SUB(CURRENT_DATE, INTERVAL 31 DAY)
CURRENT_DATE - INTERVAL '90 days',
CURRENT_DATE - INTERVAL '31 days'
UNION ALL
SELECT
DATE_SUB(CURRENT_DATE, INTERVAL 180 DAY),
DATE_SUB(CURRENT_DATE, INTERVAL 91 DAY)
CURRENT_DATE - INTERVAL '180 days',
CURRENT_DATE - INTERVAL '91 days'
UNION ALL
SELECT
DATE_SUB(CURRENT_DATE, INTERVAL 365 DAY),
DATE_SUB(CURRENT_DATE, INTERVAL 181 DAY)
CURRENT_DATE - INTERVAL '365 days',
CURRENT_DATE - INTERVAL '181 days'
),
sales_data AS (
SELECT
@@ -466,12 +496,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
END as avg_price,
NOW() as last_calculated_at
FROM sales_data
ON DUPLICATE KEY UPDATE
avg_daily_sales = VALUES(avg_daily_sales),
total_sold = VALUES(total_sold),
num_products = VALUES(num_products),
avg_price = VALUES(avg_price),
last_calculated_at = VALUES(last_calculated_at)
ON CONFLICT (category_id, brand, period_start, period_end) DO UPDATE
SET
avg_daily_sales = EXCLUDED.avg_daily_sales,
total_sold = EXCLUDED.total_sold,
num_products = EXCLUDED.num_products,
avg_price = EXCLUDED.avg_price,
last_calculated_at = EXCLUDED.last_calculated_at
`);
processedCount = Math.floor(totalProducts * 1.0);
@@ -498,7 +529,8 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('category_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
return {

View File

@@ -32,13 +32,13 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
}
// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
AND DATE(o.date) >= CURRENT_DATE - INTERVAL '12 months'
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);
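Date handling in this file and the others changes in three recurring ways: DATE_SUB/DATE_ADD become interval arithmetic, YEAR()/MONTH() become EXTRACT, and DATEDIFF becomes either a plain date subtraction or an epoch difference. A condensed sketch of the equivalences (the literals are illustrative):

```js
async function dateConversionExamples(pool) {
  const { rows } = await pool.query(`
    SELECT
      -- MySQL: DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
      CURRENT_DATE - INTERVAL '12 months'                  AS twelve_months_ago,
      -- MySQL: YEAR(d), MONTH(d)
      EXTRACT(YEAR  FROM TIMESTAMP '2025-03-26 00:00:00')  AS year_part,
      EXTRACT(MONTH FROM TIMESTAMP '2025-03-26 00:00:00')  AS month_part,
      -- MySQL: DATEDIFF(later, earlier); dates subtract to whole days,
      -- timestamps subtract to an interval (EXTRACT(EPOCH ...) / 86400 for days)
      DATE '2025-03-26' - DATE '2025-03-01'                AS whole_days,
      EXTRACT(EPOCH FROM (TIMESTAMP '2025-03-26 12:00:00'
                        - TIMESTAMP '2025-03-01 00:00:00')) / 86400.0 AS fractional_days
  `);
  return rows[0];
}
```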
outputProgress({
status: 'running',
@@ -67,27 +67,28 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
MIN(o.date) as first_sale_date,
MAX(o.date) as last_sale_date,
DATEDIFF(MAX(o.date), MIN(o.date)) + 1 as calculation_period_days,
EXTRACT(DAY FROM (MAX(o.date)::timestamp with time zone - MIN(o.date)::timestamp with time zone)) + 1 as calculation_period_days,
COUNT(DISTINCT DATE(o.date)) as active_days
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
GROUP BY p.pid
AND DATE(o.date) >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.pid, p.cost_price, p.stock_quantity
)
UPDATE product_metrics pm
JOIN product_financials pf ON pm.pid = pf.pid
SET
pm.inventory_value = COALESCE(pf.inventory_value, 0),
pm.total_revenue = COALESCE(pf.total_revenue, 0),
pm.cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0),
pm.gross_profit = COALESCE(pf.gross_profit, 0),
pm.gmroi = CASE
inventory_value = COALESCE(pf.inventory_value, 0),
total_revenue = COALESCE(pf.total_revenue, 0),
cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0),
gross_profit = COALESCE(pf.gross_profit, 0),
gmroi = CASE
WHEN COALESCE(pf.inventory_value, 0) > 0 AND pf.active_days > 0 THEN
(COALESCE(pf.gross_profit, 0) * (365.0 / pf.active_days)) / COALESCE(pf.inventory_value, 0)
ELSE 0
END,
pm.last_calculated_at = CURRENT_TIMESTAMP
last_calculated_at = CURRENT_TIMESTAMP
FROM product_financials pf
WHERE pm.pid = pf.pid
`);
processedCount = Math.floor(totalProducts * 0.65);
@@ -119,8 +120,8 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
WITH monthly_financials AS (
SELECT
p.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
p.cost_price * p.stock_quantity as inventory_value,
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
COUNT(DISTINCT DATE(o.date)) as active_days,
@@ -129,19 +130,20 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
)
UPDATE product_time_aggregates pta
JOIN monthly_financials mf ON pta.pid = mf.pid
AND pta.year = mf.year
AND pta.month = mf.month
SET
pta.inventory_value = COALESCE(mf.inventory_value, 0),
pta.gmroi = CASE
inventory_value = COALESCE(mf.inventory_value, 0),
gmroi = CASE
WHEN COALESCE(mf.inventory_value, 0) > 0 AND mf.active_days > 0 THEN
(COALESCE(mf.gross_profit, 0) * (365.0 / mf.active_days)) / COALESCE(mf.inventory_value, 0)
ELSE 0
END
FROM monthly_financials mf
WHERE pta.pid = mf.pid
AND pta.year = mf.year
AND pta.month = mf.month
`);
processedCount = Math.floor(totalProducts * 0.70);
@@ -168,7 +170,8 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('financial_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
return {

View File

@@ -10,20 +10,21 @@ function sanitizeValue(value) {
}
async function calculateProductMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
const connection = await getConnection();
let connection;
let success = false;
let processedOrders = 0;
const BATCH_SIZE = 5000;
try {
connection = await getConnection();
// Skip flags for optional calculation phases (0 = run the phase)
const SKIP_PRODUCT_BASE_METRICS = 0;
const SKIP_PRODUCT_TIME_AGGREGATES = 0;
// Get total product count if not provided
if (!totalProducts) {
const [productCount] = await connection.query('SELECT COUNT(*) as count FROM products');
totalProducts = productCount[0].count;
const productCount = await connection.query('SELECT COUNT(*) as count FROM products');
totalProducts = parseInt(productCount.rows[0].count);
}
if (isCancelled) {
@@ -52,19 +53,20 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
// First ensure all products have a metrics record
await connection.query(`
INSERT IGNORE INTO product_metrics (pid, last_calculated_at)
INSERT INTO product_metrics (pid, last_calculated_at)
SELECT pid, NOW()
FROM products
ON CONFLICT (pid) DO NOTHING
`);
// Get threshold settings once
const [thresholds] = await connection.query(`
const thresholds = await connection.query(`
SELECT critical_days, reorder_days, overstock_days, low_stock_threshold
FROM stock_thresholds
WHERE category_id IS NULL AND vendor IS NULL
LIMIT 1
`);
const defaultThresholds = thresholds[0];
const defaultThresholds = thresholds.rows[0];
// Calculate base product metrics
if (!SKIP_PRODUCT_BASE_METRICS) {
@@ -85,16 +87,43 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
});
// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);
// Clear temporary tables
await connection.query('TRUNCATE TABLE temp_sales_metrics');
await connection.query('TRUNCATE TABLE temp_purchase_metrics');
await connection.query('DROP TABLE IF EXISTS temp_sales_metrics');
await connection.query('DROP TABLE IF EXISTS temp_purchase_metrics');
// Create temp_sales_metrics
await connection.query(`
CREATE TEMPORARY TABLE temp_sales_metrics (
pid BIGINT NOT NULL,
daily_sales_avg DECIMAL(10,3),
weekly_sales_avg DECIMAL(10,3),
monthly_sales_avg DECIMAL(10,3),
total_revenue DECIMAL(10,3),
avg_margin_percent DECIMAL(10,3),
first_sale_date DATE,
last_sale_date DATE,
PRIMARY KEY (pid)
)
`);
// Create temp_purchase_metrics
await connection.query(`
CREATE TEMPORARY TABLE temp_purchase_metrics (
pid BIGINT NOT NULL,
avg_lead_time_days DOUBLE PRECISION,
last_purchase_date DATE,
first_received_date DATE,
last_received_date DATE,
PRIMARY KEY (pid)
)
`);
// Populate temp_sales_metrics with base stats and sales averages
await connection.query(`
@@ -115,98 +144,131 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
AND o.canceled = false
AND o.date >= DATE_SUB(CURDATE(), INTERVAL 90 DAY)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY p.pid
`);
// Populate temp_purchase_metrics
await connection.query(`
INSERT INTO temp_purchase_metrics
SELECT
p.pid,
AVG(DATEDIFF(po.received_date, po.date)) as avg_lead_time_days,
MAX(po.date) as last_purchase_date,
MIN(po.received_date) as first_received_date,
MAX(po.received_date) as last_received_date
FROM products p
LEFT JOIN purchase_orders po ON p.pid = po.pid
AND po.received_date IS NOT NULL
AND po.date >= DATE_SUB(CURDATE(), INTERVAL 365 DAY)
GROUP BY p.pid
`);
// Populate temp_purchase_metrics with timeout protection
await Promise.race([
connection.query(`
INSERT INTO temp_purchase_metrics
SELECT
p.pid,
AVG(
CASE
WHEN po.received_date IS NOT NULL AND po.date IS NOT NULL
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
ELSE NULL
END
) as avg_lead_time_days,
MAX(po.date) as last_purchase_date,
MIN(po.received_date) as first_received_date,
MAX(po.received_date) as last_received_date
FROM products p
LEFT JOIN purchase_orders po ON p.pid = po.pid
AND po.received_date IS NOT NULL
AND po.date IS NOT NULL
AND po.date >= CURRENT_DATE - INTERVAL '365 days'
GROUP BY p.pid
`),
new Promise((_, reject) =>
setTimeout(() => reject(new Error('Timeout: temp_purchase_metrics query took too long')), 60000)
)
]).catch(async (err) => {
logError(err, 'Error populating temp_purchase_metrics, falling back to default lead times');
// Insert fallback rows with a default 30-day lead time for any products not already populated
await connection.query(`
INSERT INTO temp_purchase_metrics
SELECT
p.pid,
30.0 as avg_lead_time_days,
NULL as last_purchase_date,
NULL as first_received_date,
NULL as last_received_date
FROM products p
LEFT JOIN temp_purchase_metrics tpm ON p.pid = tpm.pid
WHERE tpm.pid IS NULL
`);
});
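One caveat about the timeout guard above: Promise.race only stops awaiting the slow promise, it does not cancel the query on the server, so the catch branch has to tolerate the original insert finishing later (which is why it only fills rows that are still missing). A reduced sketch of the pattern with the timer cleaned up on completion (the 60-second limit mirrors the code above; the wrapper name is illustrative):

```js
// Await a promise, but give up waiting after `ms` milliseconds.
// The underlying work keeps running; only this await is released early.
function withTimeout(promise, ms, label) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Timeout: ${label} took too long`)), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage sketch:
//   await withTimeout(connection.query('INSERT INTO temp_purchase_metrics ...'), 60000,
//                     'temp_purchase_metrics query');
```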
// Process updates in batches
let lastPid = 0;
while (true) {
let batchCount = 0;
const MAX_BATCHES = 1000; // Safety limit for number of batches to prevent infinite loops
while (batchCount < MAX_BATCHES) {
if (isCancelled) break;
const [batch] = await connection.query(
'SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?',
batchCount++;
const batch = await connection.query(
'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
[lastPid, BATCH_SIZE]
);
if (batch.length === 0) break;
if (batch.rows.length === 0) break;
// Process the entire batch in a single efficient query
await connection.query(`
UPDATE product_metrics pm
JOIN products p ON pm.pid = p.pid
LEFT JOIN temp_sales_metrics sm ON pm.pid = sm.pid
LEFT JOIN temp_purchase_metrics lm ON pm.pid = lm.pid
SET
pm.inventory_value = p.stock_quantity * NULLIF(p.cost_price, 0),
pm.daily_sales_avg = COALESCE(sm.daily_sales_avg, 0),
pm.weekly_sales_avg = COALESCE(sm.weekly_sales_avg, 0),
pm.monthly_sales_avg = COALESCE(sm.monthly_sales_avg, 0),
pm.total_revenue = COALESCE(sm.total_revenue, 0),
pm.avg_margin_percent = COALESCE(sm.avg_margin_percent, 0),
pm.first_sale_date = sm.first_sale_date,
pm.last_sale_date = sm.last_sale_date,
pm.avg_lead_time_days = COALESCE(lm.avg_lead_time_days, 30),
pm.days_of_inventory = CASE
inventory_value = p.stock_quantity * NULLIF(p.cost_price, 0),
daily_sales_avg = COALESCE(sm.daily_sales_avg, 0),
weekly_sales_avg = COALESCE(sm.weekly_sales_avg, 0),
monthly_sales_avg = COALESCE(sm.monthly_sales_avg, 0),
total_revenue = COALESCE(sm.total_revenue, 0),
avg_margin_percent = COALESCE(sm.avg_margin_percent, 0),
first_sale_date = sm.first_sale_date,
last_sale_date = sm.last_sale_date,
avg_lead_time_days = COALESCE(lm.avg_lead_time_days, 30),
days_of_inventory = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0
THEN FLOOR(p.stock_quantity / NULLIF(sm.daily_sales_avg, 0))
ELSE NULL
END,
pm.weeks_of_inventory = CASE
weeks_of_inventory = CASE
WHEN COALESCE(sm.weekly_sales_avg, 0) > 0
THEN FLOOR(p.stock_quantity / NULLIF(sm.weekly_sales_avg, 0))
ELSE NULL
END,
pm.stock_status = CASE
stock_status = CASE
WHEN p.stock_quantity <= 0 THEN 'Out of Stock'
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 AND p.stock_quantity <= ? THEN 'Low Stock'
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 AND p.stock_quantity <= $1 THEN 'Low Stock'
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 THEN 'In Stock'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ? THEN 'Critical'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ? THEN 'Reorder'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ? THEN 'Overstocked'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= $2 THEN 'Critical'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= $3 THEN 'Reorder'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > $4 THEN 'Overstocked'
ELSE 'Healthy'
END,
pm.safety_stock = CASE
safety_stock = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
CEIL(sm.daily_sales_avg * SQRT(COALESCE(lm.avg_lead_time_days, 30)) * 1.96)
ELSE ?
CEIL(sm.daily_sales_avg * SQRT(ABS(COALESCE(lm.avg_lead_time_days, 30))) * 1.96)
ELSE $5
END,
pm.reorder_point = CASE
reorder_point = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
CEIL(sm.daily_sales_avg * COALESCE(lm.avg_lead_time_days, 30)) +
CEIL(sm.daily_sales_avg * SQRT(COALESCE(lm.avg_lead_time_days, 30)) * 1.96)
ELSE ?
CEIL(sm.daily_sales_avg * SQRT(ABS(COALESCE(lm.avg_lead_time_days, 30))) * 1.96)
ELSE $6
END,
pm.reorder_qty = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND NULLIF(p.cost_price, 0) IS NOT NULL THEN
reorder_qty = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND NULLIF(p.cost_price, 0) IS NOT NULL AND NULLIF(p.cost_price, 0) > 0 THEN
GREATEST(
CEIL(SQRT((2 * (sm.daily_sales_avg * 365) * 25) / (NULLIF(p.cost_price, 0) * 0.25))),
?
CEIL(SQRT(ABS((2 * (sm.daily_sales_avg * 365) * 25) / (NULLIF(p.cost_price, 0) * 0.25)))),
$7
)
ELSE ?
ELSE $8
END,
pm.overstocked_amt = CASE
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ?
THEN GREATEST(0, p.stock_quantity - CEIL(sm.daily_sales_avg * ?))
overstocked_amt = CASE
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > $9
THEN GREATEST(0, p.stock_quantity - CEIL(sm.daily_sales_avg * $10))
ELSE 0
END,
pm.last_calculated_at = NOW()
WHERE p.pid IN (${batch.map(() => '?').join(',')})
last_calculated_at = NOW()
FROM products p
LEFT JOIN temp_sales_metrics sm ON p.pid = sm.pid
LEFT JOIN temp_purchase_metrics lm ON p.pid = lm.pid
WHERE p.pid = ANY($11::bigint[])
AND pm.pid = p.pid
`,
[
defaultThresholds.low_stock_threshold,
@@ -219,12 +281,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
defaultThresholds.low_stock_threshold,
defaultThresholds.overstock_days,
defaultThresholds.overstock_days,
...batch.map(row => row.pid)
]
);
batch.rows.map(row => row.pid)
]);
lastPid = batch[batch.length - 1].pid;
processedCount += batch.length;
lastPid = batch.rows[batch.rows.length - 1].pid;
processedCount += batch.rows.length;
outputProgress({
status: 'running',
@@ -243,54 +304,59 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
});
}
// Calculate forecast accuracy and bias in batches
lastPid = 0;
while (true) {
if (isCancelled) break;
const [batch] = await connection.query(
'SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?',
[lastPid, BATCH_SIZE]
);
if (batch.length === 0) break;
await connection.query(`
UPDATE product_metrics pm
JOIN (
SELECT
sf.pid,
AVG(CASE
WHEN o.quantity > 0
THEN ABS(sf.forecast_units - o.quantity) / o.quantity * 100
ELSE 100
END) as avg_forecast_error,
AVG(CASE
WHEN o.quantity > 0
THEN (sf.forecast_units - o.quantity) / o.quantity * 100
ELSE 0
END) as avg_forecast_bias,
MAX(sf.forecast_date) as last_forecast_date
FROM sales_forecasts sf
JOIN orders o ON sf.pid = o.pid
AND DATE(o.date) = sf.forecast_date
WHERE o.canceled = false
AND sf.forecast_date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
AND sf.pid IN (?)
GROUP BY sf.pid
) fa ON pm.pid = fa.pid
SET
pm.forecast_accuracy = GREATEST(0, 100 - LEAST(fa.avg_forecast_error, 100)),
pm.forecast_bias = GREATEST(-100, LEAST(fa.avg_forecast_bias, 100)),
pm.last_forecast_date = fa.last_forecast_date,
pm.last_calculated_at = NOW()
WHERE pm.pid IN (?)
`, [batch.map(row => row.pid), batch.map(row => row.pid)]);
lastPid = batch[batch.length - 1].pid;
// Warn if the loop stopped because it hit the MAX_BATCHES safety limit
if (batchCount >= MAX_BATCHES) {
logError(new Error(`Reached maximum batch count (${MAX_BATCHES}). Process may have entered an infinite loop.`), 'Batch processing safety limit reached');
}
}
// Calculate forecast accuracy and bias in batches
lastPid = 0;
while (true) {
if (isCancelled) break;
const batch = await connection.query(
'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
[lastPid, BATCH_SIZE]
);
if (batch.rows.length === 0) break;
await connection.query(`
UPDATE product_metrics pm
SET
forecast_accuracy = GREATEST(0, 100 - LEAST(fa.avg_forecast_error, 100)),
forecast_bias = GREATEST(-100, LEAST(fa.avg_forecast_bias, 100)),
last_forecast_date = fa.last_forecast_date,
last_calculated_at = NOW()
FROM (
SELECT
sf.pid,
AVG(CASE
WHEN o.quantity > 0
THEN ABS(sf.forecast_quantity - o.quantity) / o.quantity * 100
ELSE 100
END) as avg_forecast_error,
AVG(CASE
WHEN o.quantity > 0
THEN (sf.forecast_quantity - o.quantity) / o.quantity * 100
ELSE 0
END) as avg_forecast_bias,
MAX(sf.forecast_date) as last_forecast_date
FROM sales_forecasts sf
JOIN orders o ON sf.pid = o.pid
AND DATE(o.date) = sf.forecast_date
WHERE o.canceled = false
AND sf.forecast_date >= CURRENT_DATE - INTERVAL '90 days'
AND sf.pid = ANY($1::bigint[])
GROUP BY sf.pid
) fa
WHERE pm.pid = fa.pid
`, [batch.rows.map(row => row.pid)]);
lastPid = batch.rows[batch.rows.length - 1].pid;
}
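Both batch loops above page through products by primary key instead of OFFSET: each pass fetches pids greater than the last one seen, which stays fast as the table grows and gives a natural stopping condition. A minimal sketch of that loop shape with the pg driver (processBatch is a placeholder for the per-batch work):

```js
const BATCH_SIZE = 5000;

// Keyset pagination: walk the products table in pid order without OFFSET.
async function forEachProductBatch(pool, processBatch) {
  let lastPid = 0;
  while (true) {
    const { rows } = await pool.query(
      'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
      [lastPid, BATCH_SIZE]
    );
    if (rows.length === 0) break;          // no more products
    await processBatch(rows.map(r => r.pid));
    lastPid = rows[rows.length - 1].pid;   // advance the cursor
  }
}
```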
// Calculate product time aggregates
if (!SKIP_PRODUCT_TIME_AGGREGATES) {
outputProgress({
@@ -326,11 +392,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
)
SELECT
p.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
SUM(o.quantity) as total_quantity_sold,
SUM(o.quantity * o.price) as total_revenue,
SUM(o.quantity * p.cost_price) as total_cost,
SUM(o.price * o.quantity) as total_revenue,
SUM(p.cost_price * o.quantity) as total_cost,
COUNT(DISTINCT o.order_number) as order_count,
AVG(o.price) as avg_price,
CASE
@@ -346,17 +412,18 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
END as gmroi
FROM products p
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
ON DUPLICATE KEY UPDATE
total_quantity_sold = VALUES(total_quantity_sold),
total_revenue = VALUES(total_revenue),
total_cost = VALUES(total_cost),
order_count = VALUES(order_count),
avg_price = VALUES(avg_price),
profit_margin = VALUES(profit_margin),
inventory_value = VALUES(inventory_value),
gmroi = VALUES(gmroi)
WHERE o.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
ON CONFLICT (pid, year, month) DO UPDATE
SET
total_quantity_sold = EXCLUDED.total_quantity_sold,
total_revenue = EXCLUDED.total_revenue,
total_cost = EXCLUDED.total_cost,
order_count = EXCLUDED.order_count,
avg_price = EXCLUDED.avg_price,
profit_margin = EXCLUDED.profit_margin,
inventory_value = EXCLUDED.inventory_value,
gmroi = EXCLUDED.gmroi
`);
processedCount = Math.floor(totalProducts * 0.6);
@@ -418,11 +485,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
success
};
const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };
const abcConfig = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
const abcThresholds = abcConfig.rows[0] || { a_threshold: 20, b_threshold: 50 };
// First, create and populate the rankings table with an index
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
await connection.query('DROP TABLE IF EXISTS temp_revenue_ranks');
await connection.query(`
CREATE TEMPORARY TABLE temp_revenue_ranks (
pid BIGINT NOT NULL,
@@ -431,12 +498,12 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
dense_rank_num INT,
percentile DECIMAL(5,2),
total_count INT,
PRIMARY KEY (pid),
INDEX (rank_num),
INDEX (dense_rank_num),
INDEX (percentile)
) ENGINE=MEMORY
PRIMARY KEY (pid)
)
`);
await connection.query('CREATE INDEX ON temp_revenue_ranks (rank_num)');
await connection.query('CREATE INDEX ON temp_revenue_ranks (dense_rank_num)');
await connection.query('CREATE INDEX ON temp_revenue_ranks (percentile)');
// Calculate rankings with proper tie handling
await connection.query(`
@@ -463,58 +530,74 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
`);
// Get total count for percentage calculation
const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
const totalCount = rankingCount[0].total_count || 1;
const max_rank = totalCount;
const rankingCount = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
const totalCount = parseInt(rankingCount.rows[0].total_count) || 1;
// Process updates in batches
let abcProcessedCount = 0;
const batchSize = 5000;
const maxPid = await connection.query('SELECT MAX(pid) as max_pid FROM products');
const maxProductId = parseInt(maxPid.rows[0].max_pid);
while (true) {
while (abcProcessedCount < maxProductId) {
if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0, // This module doesn't process POs
processedPurchaseOrders: 0,
success
};
// Get a batch of PIDs that need updating
const [pids] = await connection.query(`
const pids = await connection.query(`
SELECT pm.pid
FROM product_metrics pm
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
WHERE pm.abc_class IS NULL
OR pm.abc_class !=
CASE
WHEN tr.pid IS NULL THEN 'C'
WHEN tr.percentile <= ? THEN 'A'
WHEN tr.percentile <= ? THEN 'B'
ELSE 'C'
END
LIMIT ?
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, batchSize]);
WHERE pm.pid > $1
AND (pm.abc_class IS NULL
OR pm.abc_class !=
CASE
WHEN tr.pid IS NULL THEN 'C'
WHEN tr.percentile <= $2 THEN 'A'
WHEN tr.percentile <= $3 THEN 'B'
ELSE 'C'
END)
ORDER BY pm.pid
LIMIT $4
`, [abcProcessedCount, abcThresholds.a_threshold, abcThresholds.b_threshold, batchSize]);
if (pids.length === 0) break;
if (pids.rows.length === 0) break;
const pidValues = pids.rows.map(row => row.pid);
await connection.query(`
UPDATE product_metrics pm
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
SET pm.abc_class =
SET abc_class =
CASE
WHEN tr.pid IS NULL THEN 'C'
WHEN tr.percentile <= ? THEN 'A'
WHEN tr.percentile <= ? THEN 'B'
WHEN tr.percentile <= $1 THEN 'A'
WHEN tr.percentile <= $2 THEN 'B'
ELSE 'C'
END,
pm.last_calculated_at = NOW()
WHERE pm.pid IN (?)
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, pids.map(row => row.pid)]);
last_calculated_at = NOW()
FROM (SELECT pid, percentile FROM temp_revenue_ranks) tr
WHERE pm.pid = tr.pid AND pm.pid = ANY($3::bigint[])
OR (pm.pid = ANY($3::bigint[]) AND tr.pid IS NULL)
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, pidValues]);
// Now update turnover rate with proper handling of zero inventory periods
await connection.query(`
UPDATE product_metrics pm
JOIN (
SET
turnover_rate = CASE
WHEN sales.avg_nonzero_stock > 0 AND sales.active_days > 0
THEN LEAST(
(sales.total_sold / sales.avg_nonzero_stock) * (365.0 / sales.active_days),
999.99
)
ELSE 0
END,
last_calculated_at = NOW()
FROM (
SELECT
o.pid,
SUM(o.quantity) as total_sold,
@@ -526,22 +609,33 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
FROM orders o
JOIN products p ON o.pid = p.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
AND o.pid IN (?)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
AND o.pid = ANY($1::bigint[])
GROUP BY o.pid
) sales ON pm.pid = sales.pid
SET
pm.turnover_rate = CASE
WHEN sales.avg_nonzero_stock > 0 AND sales.active_days > 0
THEN LEAST(
(sales.total_sold / sales.avg_nonzero_stock) * (365.0 / sales.active_days),
999.99
)
ELSE 0
END,
pm.last_calculated_at = NOW()
WHERE pm.pid IN (?)
`, [pids.map(row => row.pid), pids.map(row => row.pid)]);
) sales
WHERE pm.pid = sales.pid
`, [pidValues]);
abcProcessedCount = pids.rows[pids.rows.length - 1].pid;
// Calculate progress proportionally to total products
processedCount = Math.floor(totalProducts * (0.60 + (abcProcessedCount / maxProductId) * 0.2));
outputProgress({
status: 'running',
operation: 'ABC classification progress',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
}
// If we get here, everything completed successfully
@@ -551,7 +645,8 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('product_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
return {
@@ -566,7 +661,16 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
logError(error, 'Error calculating product metrics');
throw error;
} finally {
// Always clean up temporary tables, even if an error occurred
if (connection) {
try {
await connection.query('DROP TABLE IF EXISTS temp_sales_metrics');
await connection.query('DROP TABLE IF EXISTS temp_purchase_metrics');
} catch (err) {
console.error('Error cleaning up temporary tables:', err);
}
// Make sure to release the connection
connection.release();
}
}

View File

@@ -32,13 +32,13 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
}
// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);
outputProgress({
status: 'running',
@@ -69,15 +69,15 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO temp_forecast_dates
SELECT
DATE_ADD(CURRENT_DATE, INTERVAL n DAY) as forecast_date,
DAYOFWEEK(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as day_of_week,
MONTH(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as month
CURRENT_DATE + (n || ' days')::INTERVAL as forecast_date,
EXTRACT(DOW FROM CURRENT_DATE + (n || ' days')::INTERVAL) + 1 as day_of_week,
EXTRACT(MONTH FROM CURRENT_DATE + (n || ' days')::INTERVAL) as month
FROM (
SELECT a.N + b.N * 10 as n
SELECT a.n + b.n * 10 as n
FROM
(SELECT 0 as N UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
(SELECT 0 as n UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) a,
(SELECT 0 as N UNION SELECT 1 UNION SELECT 2) b
(SELECT 0 as n UNION SELECT 1 UNION SELECT 2) b
ORDER BY n
LIMIT 31
) numbers
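The forecast-date series above keeps the MySQL-era cross join of digit tables, which as written yields 30 values (0 through 29), so the LIMIT 31 never takes effect. In PostgreSQL the same series is more naturally built with generate_series; a sketch of that alternative (not what the patch does, and the explicit column list on temp_forecast_dates is assumed):

```js
// Alternative approach: generate the next 31 calendar days with generate_series.
async function buildForecastDates(pool) {
  await pool.query(`
    INSERT INTO temp_forecast_dates (forecast_date, day_of_week, month)
    SELECT
      CURRENT_DATE + s.n                         AS forecast_date,
      EXTRACT(DOW FROM CURRENT_DATE + s.n) + 1   AS day_of_week,
      EXTRACT(MONTH FROM CURRENT_DATE + s.n)     AS month
    FROM generate_series(0, 30) AS s(n)
  `);
}
```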
@@ -109,17 +109,17 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
// Create temporary table for daily sales stats
await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_daily_sales AS
CREATE TEMPORARY TABLE temp_daily_sales AS
SELECT
o.pid,
DAYOFWEEK(o.date) as day_of_week,
EXTRACT(DOW FROM o.date) + 1 as day_of_week,
SUM(o.quantity) as daily_quantity,
SUM(o.price * o.quantity) as daily_revenue,
COUNT(DISTINCT DATE(o.date)) as day_count
FROM orders o
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
GROUP BY o.pid, DAYOFWEEK(o.date)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY o.pid, EXTRACT(DOW FROM o.date) + 1
`);
processedCount = Math.floor(totalProducts * 0.94);
@@ -148,7 +148,7 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
// Create temporary table for product stats
await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_product_stats AS
CREATE TEMPORARY TABLE temp_product_stats AS
SELECT
pid,
AVG(daily_revenue) as overall_avg_revenue,
@@ -186,10 +186,9 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
INSERT INTO sales_forecasts (
pid,
forecast_date,
forecast_units,
forecast_revenue,
forecast_quantity,
confidence_level,
last_calculated_at
created_at
)
WITH daily_stats AS (
SELECT
@@ -223,29 +222,9 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 1.0 THEN 0.9
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 0.5 THEN 0.95
ELSE 1.0
END,
2
END
)
) as forecast_units,
GREATEST(0,
ROUND(
COALESCE(
CASE
WHEN ds.data_points >= 4 THEN ds.avg_daily_revenue
ELSE ps.overall_avg_revenue
END *
(1 + COALESCE(sf.seasonality_factor, 0)) *
CASE
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.5 THEN 0.85
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.0 THEN 0.9
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 0.5 THEN 0.95
ELSE 1.0
END,
0
),
2
)
) as forecast_revenue,
) as forecast_quantity,
CASE
WHEN ds.total_days >= 60 AND ds.daily_variance_ratio < 0.5 THEN 90
WHEN ds.total_days >= 60 THEN 85
@@ -255,17 +234,18 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
WHEN ds.total_days >= 14 THEN 65
ELSE 60
END as confidence_level,
NOW() as last_calculated_at
NOW() as created_at
FROM daily_stats ds
JOIN temp_product_stats ps ON ds.pid = ps.pid
CROSS JOIN temp_forecast_dates fd
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor
ON DUPLICATE KEY UPDATE
forecast_units = VALUES(forecast_units),
forecast_revenue = VALUES(forecast_revenue),
confidence_level = VALUES(confidence_level),
last_calculated_at = NOW()
GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor,
ds.avg_daily_qty, ds.std_daily_qty, ds.total_days, ds.daily_variance_ratio
ON CONFLICT (pid, forecast_date) DO UPDATE
SET
forecast_quantity = EXCLUDED.forecast_quantity,
confidence_level = EXCLUDED.confidence_level,
created_at = NOW()
`);
processedCount = Math.floor(totalProducts * 0.98);
@@ -294,22 +274,22 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
// Create temporary table for category stats
await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_sales AS
CREATE TEMPORARY TABLE temp_category_sales AS
SELECT
pc.cat_id,
DAYOFWEEK(o.date) as day_of_week,
EXTRACT(DOW FROM o.date) + 1 as day_of_week,
SUM(o.quantity) as daily_quantity,
SUM(o.price * o.quantity) as daily_revenue,
COUNT(DISTINCT DATE(o.date)) as day_count
FROM orders o
JOIN product_categories pc ON o.pid = pc.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
GROUP BY pc.cat_id, DAYOFWEEK(o.date)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY pc.cat_id, EXTRACT(DOW FROM o.date) + 1
`);
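The + 1 on every EXTRACT(DOW ...) call keeps PostgreSQL's day numbering aligned with MySQL's DAYOFWEEK(), which the surrounding forecast logic was written against: MySQL numbers Sunday=1 through Saturday=7, while Postgres DOW numbers Sunday=0 through Saturday=6. A tiny compatibility sketch (the helper is illustrative):

```js
// MySQL:      DAYOFWEEK('2025-03-23')              -> 1  (Sunday = 1 ... Saturday = 7)
// PostgreSQL: EXTRACT(DOW FROM DATE '2025-03-23')  -> 0  (Sunday = 0 ... Saturday = 6)
// So EXTRACT(DOW FROM d) + 1 reproduces MySQL's numbering.
async function dayOfWeekCompat(pool, d) {
  const { rows } = await pool.query(
    'SELECT EXTRACT(DOW FROM $1::date) + 1 AS day_of_week',
    [d]
  );
  return Number(rows[0].day_of_week);
}
```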
await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_stats AS
CREATE TEMPORARY TABLE temp_category_stats AS
SELECT
cat_id,
AVG(daily_revenue) as overall_avg_revenue,
@@ -350,10 +330,10 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
forecast_units,
forecast_revenue,
confidence_level,
last_calculated_at
created_at
)
SELECT
cs.cat_id as category_id,
cs.cat_id::bigint as category_id,
fd.forecast_date,
GREATEST(0,
AVG(cs.daily_quantity) *
@@ -366,7 +346,7 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
ELSE ct.overall_avg_revenue
END *
(1 + COALESCE(sf.seasonality_factor, 0)) *
(0.95 + (RAND() * 0.1)),
(0.95 + (random() * 0.1)),
0
)
) as forecast_revenue,
@@ -376,27 +356,34 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
WHEN ct.total_days >= 14 THEN 70
ELSE 60
END as confidence_level,
NOW() as last_calculated_at
NOW() as created_at
FROM temp_category_sales cs
JOIN temp_category_stats ct ON cs.cat_id = ct.cat_id
CROSS JOIN temp_forecast_dates fd
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
GROUP BY cs.cat_id, fd.forecast_date, ct.overall_avg_revenue, ct.total_days, sf.seasonality_factor
GROUP BY
cs.cat_id,
fd.forecast_date,
ct.overall_avg_revenue,
ct.total_days,
sf.seasonality_factor,
sf.month
HAVING AVG(cs.daily_quantity) > 0
ON DUPLICATE KEY UPDATE
forecast_units = VALUES(forecast_units),
forecast_revenue = VALUES(forecast_revenue),
confidence_level = VALUES(confidence_level),
last_calculated_at = NOW()
ON CONFLICT (category_id, forecast_date) DO UPDATE
SET
forecast_units = EXCLUDED.forecast_units,
forecast_revenue = EXCLUDED.forecast_revenue,
confidence_level = EXCLUDED.confidence_level,
created_at = NOW()
`);
// Clean up temporary tables
await connection.query(`
DROP TEMPORARY TABLE IF EXISTS temp_forecast_dates;
DROP TEMPORARY TABLE IF EXISTS temp_daily_sales;
DROP TEMPORARY TABLE IF EXISTS temp_product_stats;
DROP TEMPORARY TABLE IF EXISTS temp_category_sales;
DROP TEMPORARY TABLE IF EXISTS temp_category_stats;
DROP TABLE IF EXISTS temp_forecast_dates;
DROP TABLE IF EXISTS temp_daily_sales;
DROP TABLE IF EXISTS temp_product_stats;
DROP TABLE IF EXISTS temp_category_sales;
DROP TABLE IF EXISTS temp_category_stats;
`);
processedCount = Math.floor(totalProducts * 1.0);
@@ -423,7 +410,8 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('sales_forecasts', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
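One assumption worth calling out for the upsert above: PostgreSQL's `ON CONFLICT (module_name)` clause only resolves against an existing unique constraint or index on that column, and this diff does not show one being created. If the metrics schema doesn't already define it, something like the following sketch would be needed (the constraint name is hypothetical):

```js
// Assumed, not shown in this diff: calculate_status.module_name must be unique
// for ON CONFLICT (module_name) DO UPDATE to have a valid conflict target.
await connection.query(`
  ALTER TABLE calculate_status
  ADD CONSTRAINT calculate_status_module_name_key UNIQUE (module_name)
`);
```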
return {

View File

@@ -32,12 +32,12 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
}
// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);
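The `parseInt` above reflects two driver differences that recur throughout these scripts: node-postgres resolves `query()` to a single result object rather than mysql2's `[rows, fields]` tuple, and it returns `COUNT(*)` (a Postgres `bigint`) as a string. A commented sketch with illustrative values:

```js
// mysql2/promise (old): query() resolves to [rows, fields]
//   const [rows] = await connection.query('SELECT COUNT(*) AS count FROM orders');
//   rows[0].count                      // -> 12345 (number)
// node-postgres (new): query() resolves to a result object, and bigint comes back as a string
//   const result = await connection.query('SELECT COUNT(*) AS count FROM orders');
//   result.rows[0].count               // -> "12345" (string)
//   parseInt(result.rows[0].count, 10) // -> 12345
```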
outputProgress({
status: 'running',
@@ -75,8 +75,8 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
WITH monthly_sales AS (
SELECT
o.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
SUM(o.quantity) as total_quantity_sold,
SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) as total_revenue,
SUM(COALESCE(p.cost_price, 0) * o.quantity) as total_cost,
@@ -93,17 +93,17 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
FROM orders o
JOIN products p ON o.pid = p.pid
WHERE o.canceled = false
GROUP BY o.pid, YEAR(o.date), MONTH(o.date)
GROUP BY o.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
),
monthly_stock AS (
SELECT
pid,
YEAR(date) as year,
MONTH(date) as month,
EXTRACT(YEAR FROM date::timestamp with time zone) as year,
EXTRACT(MONTH FROM date::timestamp with time zone) as month,
SUM(received) as stock_received,
SUM(ordered) as stock_ordered
FROM purchase_orders
GROUP BY pid, YEAR(date), MONTH(date)
GROUP BY pid, EXTRACT(YEAR FROM date::timestamp with time zone), EXTRACT(MONTH FROM date::timestamp with time zone)
),
base_products AS (
SELECT
@@ -197,17 +197,18 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
AND s.year = ms.year
AND s.month = ms.month
)
ON DUPLICATE KEY UPDATE
total_quantity_sold = VALUES(total_quantity_sold),
total_revenue = VALUES(total_revenue),
total_cost = VALUES(total_cost),
order_count = VALUES(order_count),
stock_received = VALUES(stock_received),
stock_ordered = VALUES(stock_ordered),
avg_price = VALUES(avg_price),
profit_margin = VALUES(profit_margin),
inventory_value = VALUES(inventory_value),
gmroi = VALUES(gmroi)
ON CONFLICT (pid, year, month) DO UPDATE
SET
total_quantity_sold = EXCLUDED.total_quantity_sold,
total_revenue = EXCLUDED.total_revenue,
total_cost = EXCLUDED.total_cost,
order_count = EXCLUDED.order_count,
stock_received = EXCLUDED.stock_received,
stock_ordered = EXCLUDED.stock_ordered,
avg_price = EXCLUDED.avg_price,
profit_margin = EXCLUDED.profit_margin,
inventory_value = EXCLUDED.inventory_value,
gmroi = EXCLUDED.gmroi
`);
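The extra columns added to `GROUP BY` above (`p.cost_price`, `p.stock_quantity`) come from PostgreSQL enforcing SQL-standard grouping: any selected column that isn't wrapped in an aggregate must appear in `GROUP BY`, whereas MySQL (when `ONLY_FULL_GROUP_BY` is not enforced) silently returns a value from some row in the group. A toy illustration, not taken from this schema:

```js
// Rejected by PostgreSQL ("column p.cost_price must appear in the GROUP BY clause or be used in an aggregate function"):
//   SELECT o.pid, p.cost_price, SUM(o.quantity) FROM orders o JOIN products p USING (pid) GROUP BY o.pid;
// Accepted:
//   SELECT o.pid, p.cost_price, SUM(o.quantity) FROM orders o JOIN products p USING (pid) GROUP BY o.pid, p.cost_price;
```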
processedCount = Math.floor(totalProducts * 0.60);
@@ -237,23 +238,23 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
// Update with financial metrics
await connection.query(`
UPDATE product_time_aggregates pta
JOIN (
SET inventory_value = COALESCE(fin.inventory_value, 0)
FROM (
SELECT
p.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
p.cost_price * p.stock_quantity as inventory_value,
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
COUNT(DISTINCT DATE(o.date)) as active_days
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
) fin ON pta.pid = fin.pid
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
) fin
WHERE pta.pid = fin.pid
AND pta.year = fin.year
AND pta.month = fin.month
SET
pta.inventory_value = COALESCE(fin.inventory_value, 0)
`);
processedCount = Math.floor(totalProducts * 0.65);
@@ -280,7 +281,8 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('time_aggregates', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
return {

View File

@@ -1,4 +1,4 @@
const mysql = require('mysql2/promise');
const { Pool } = require('pg');
const path = require('path');
require('dotenv').config({ path: path.resolve(__dirname, '../../..', '.env') });
@@ -8,36 +8,24 @@ const dbConfig = {
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
waitForConnections: true,
connectionLimit: 10,
queueLimit: 0,
port: process.env.DB_PORT || 5432,
ssl: process.env.DB_SSL === 'true',
// Add performance optimizations
namedPlaceholders: true,
maxPreparedStatements: 256,
enableKeepAlive: true,
keepAliveInitialDelay: 0,
// Add memory optimizations
flags: [
'FOUND_ROWS',
'LONG_PASSWORD',
'PROTOCOL_41',
'TRANSACTIONS',
'SECURE_CONNECTION',
'MULTI_RESULTS',
'PS_MULTI_RESULTS',
'PLUGIN_AUTH',
'CONNECT_ATTRS',
'PLUGIN_AUTH_LENENC_CLIENT_DATA',
'SESSION_TRACK',
'MULTI_STATEMENTS'
]
max: 10, // connection pool max size
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 60000
};
// Create a single pool instance to be reused
const pool = mysql.createPool(dbConfig);
const pool = new Pool(dbConfig);
// Add event handlers for pool
pool.on('error', (err, client) => {
console.error('Unexpected error on idle client', err);
});
async function getConnection() {
return await pool.getConnection();
return await pool.connect();
}
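For callers of this module, a minimal usage sketch of the new pg-based pool (the require path and export names are assumptions; only `getConnection` and `closePool` are visible in this diff):

```js
const { getConnection } = require('./db'); // hypothetical path to this module

async function example() {
  const client = await getConnection(); // a pg PoolClient
  try {
    const result = await client.query('SELECT NOW() AS now');
    console.log(result.rows[0].now); // pg resolves to { rows, rowCount, ... }
  } finally {
    client.release(); // always return the client to the pool, as with mysql2's connection.release()
  }
}
```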
async function closePool() {

View File

@@ -33,7 +33,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
}
// Get counts of records that will be processed
const [[orderCount], [poCount]] = await Promise.all([
const [orderCountResult, poCountResult] = await Promise.all([
connection.query(`
SELECT COUNT(*) as count
FROM orders o
@@ -45,8 +45,8 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
WHERE po.status != 0
`)
]);
processedOrders = orderCount.count;
processedPurchaseOrders = poCount.count;
processedOrders = parseInt(orderCountResult.rows[0].count);
processedPurchaseOrders = parseInt(poCountResult.rows[0].count);
outputProgress({
status: 'running',
@@ -66,7 +66,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
// First ensure all vendors exist in vendor_details
await connection.query(`
INSERT IGNORE INTO vendor_details (vendor, status, created_at, updated_at)
INSERT INTO vendor_details (vendor, status, created_at, updated_at)
SELECT DISTINCT
vendor,
'active' as status,
@@ -74,6 +74,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
NOW() as updated_at
FROM products
WHERE vendor IS NOT NULL
ON CONFLICT (vendor) DO NOTHING
`);
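Two notes on the conversion above. MySQL's `INSERT IGNORE` downgrades duplicate-key (and some other) errors to warnings, while `ON CONFLICT (vendor) DO NOTHING` only skips rows that hit that specific conflict target; and the conflict target itself presumes a unique constraint on `vendor_details.vendor`, which this diff doesn't show. A sketch of the assumed constraint (name is hypothetical):

```js
// Assumed, not shown in this diff:
await connection.query(`
  ALTER TABLE vendor_details
  ADD CONSTRAINT vendor_details_vendor_key UNIQUE (vendor)
`);
```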
processedCount = Math.floor(totalProducts * 0.8);
@@ -128,7 +129,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
FROM products p
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.vendor
),
vendor_po AS (
@@ -138,12 +139,15 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
COUNT(DISTINCT po.id) as total_orders,
AVG(CASE
WHEN po.receiving_status = 40
THEN DATEDIFF(po.received_date, po.date)
AND po.received_date IS NOT NULL
AND po.date IS NOT NULL
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
ELSE NULL
END) as avg_lead_time_days,
SUM(po.ordered * po.po_cost_price) as total_purchase_value
FROM products p
JOIN purchase_orders po ON p.pid = po.pid
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
WHERE po.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.vendor
),
vendor_products AS (
@@ -188,20 +192,21 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
LEFT JOIN vendor_po vp ON vs.vendor = vp.vendor
LEFT JOIN vendor_products vpr ON vs.vendor = vpr.vendor
WHERE vs.vendor IS NOT NULL
ON DUPLICATE KEY UPDATE
total_revenue = VALUES(total_revenue),
total_orders = VALUES(total_orders),
total_late_orders = VALUES(total_late_orders),
avg_lead_time_days = VALUES(avg_lead_time_days),
on_time_delivery_rate = VALUES(on_time_delivery_rate),
order_fill_rate = VALUES(order_fill_rate),
avg_order_value = VALUES(avg_order_value),
active_products = VALUES(active_products),
total_products = VALUES(total_products),
total_purchase_value = VALUES(total_purchase_value),
avg_margin_percent = VALUES(avg_margin_percent),
status = VALUES(status),
last_calculated_at = VALUES(last_calculated_at)
ON CONFLICT (vendor) DO UPDATE
SET
total_revenue = EXCLUDED.total_revenue,
total_orders = EXCLUDED.total_orders,
total_late_orders = EXCLUDED.total_late_orders,
avg_lead_time_days = EXCLUDED.avg_lead_time_days,
on_time_delivery_rate = EXCLUDED.on_time_delivery_rate,
order_fill_rate = EXCLUDED.order_fill_rate,
avg_order_value = EXCLUDED.avg_order_value,
active_products = EXCLUDED.active_products,
total_products = EXCLUDED.total_products,
total_purchase_value = EXCLUDED.total_purchase_value,
avg_margin_percent = EXCLUDED.avg_margin_percent,
status = EXCLUDED.status,
last_calculated_at = EXCLUDED.last_calculated_at
`);
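One behavioral difference hidden in the lead-time conversion above: MySQL's `DATEDIFF` compares only the date parts and returns whole days, while `EXTRACT(EPOCH FROM ...) / 86400.0` yields fractional days whenever the timestamps carry a time of day. An illustration with made-up values:

```js
// MySQL:    DATEDIFF('2025-03-12 18:00', '2025-03-10 06:00')  -> 2    (date parts only)
// Postgres: EXTRACT(EPOCH FROM ('2025-03-12 18:00'::timestamptz
//                             - '2025-03-10 06:00'::timestamptz)) / 86400.0  -> 2.5
// Wrapping the Postgres expression in FLOOR(...) would restore whole-day behavior if desired.
```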
processedCount = Math.floor(totalProducts * 0.9);
@@ -244,23 +249,23 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
WITH monthly_orders AS (
SELECT
p.vendor,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
COUNT(DISTINCT o.id) as total_orders,
SUM(o.quantity * o.price) as total_revenue,
SUM(o.quantity * (o.price - p.cost_price)) as total_margin
FROM products p
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
AND p.vendor IS NOT NULL
GROUP BY p.vendor, YEAR(o.date), MONTH(o.date)
GROUP BY p.vendor, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
),
monthly_po AS (
SELECT
p.vendor,
YEAR(po.date) as year,
MONTH(po.date) as month,
EXTRACT(YEAR FROM po.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM po.date::timestamp with time zone) as month,
COUNT(DISTINCT po.id) as total_po,
COUNT(DISTINCT CASE
WHEN po.receiving_status = 40 AND po.received_date > po.expected_date
@@ -268,14 +273,17 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
END) as late_orders,
AVG(CASE
WHEN po.receiving_status = 40
THEN DATEDIFF(po.received_date, po.date)
AND po.received_date IS NOT NULL
AND po.date IS NOT NULL
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
ELSE NULL
END) as avg_lead_time_days,
SUM(po.ordered * po.po_cost_price) as total_purchase_value
FROM products p
JOIN purchase_orders po ON p.pid = po.pid
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
WHERE po.date >= CURRENT_DATE - INTERVAL '12 months'
AND p.vendor IS NOT NULL
GROUP BY p.vendor, YEAR(po.date), MONTH(po.date)
GROUP BY p.vendor, EXTRACT(YEAR FROM po.date::timestamp with time zone), EXTRACT(MONTH FROM po.date::timestamp with time zone)
)
SELECT
mo.vendor,
@@ -311,13 +319,14 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
AND mp.year = mo.year
AND mp.month = mo.month
WHERE mo.vendor IS NULL
ON DUPLICATE KEY UPDATE
total_orders = VALUES(total_orders),
late_orders = VALUES(late_orders),
avg_lead_time_days = VALUES(avg_lead_time_days),
total_purchase_value = VALUES(total_purchase_value),
total_revenue = VALUES(total_revenue),
avg_margin_percent = VALUES(avg_margin_percent)
ON CONFLICT (vendor, year, month) DO UPDATE
SET
total_orders = EXCLUDED.total_orders,
late_orders = EXCLUDED.late_orders,
avg_lead_time_days = EXCLUDED.avg_lead_time_days,
total_purchase_value = EXCLUDED.total_purchase_value,
total_revenue = EXCLUDED.total_revenue,
avg_margin_percent = EXCLUDED.avg_margin_percent
`);
processedCount = Math.floor(totalProducts * 0.95);
@@ -344,7 +353,8 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('vendor_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
return {

View File

@@ -184,7 +184,7 @@ async function resetDatabase() {
SELECT string_agg(tablename, ', ') as tables
FROM pg_tables
WHERE schemaname = 'public'
AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history');
AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history', 'ai_prompts', 'ai_validation_performance', 'templates');
`);
if (!tablesResult.rows[0].tables) {

View File

@@ -100,6 +100,9 @@ async function resetMetrics() {
client = new Client(dbConfig);
await client.connect();
// Explicitly begin a transaction
await client.query('BEGIN');
// First verify current state
const initialTables = await client.query(`
SELECT tablename as name
@@ -124,6 +127,7 @@ async function resetMetrics() {
for (const table of [...METRICS_TABLES].reverse()) {
try {
// Use NOWAIT to avoid hanging if there's a lock
await client.query(`DROP TABLE IF EXISTS "${table}" CASCADE`);
// Verify the table was actually dropped
@@ -142,13 +146,23 @@ async function resetMetrics() {
operation: 'Table dropped',
message: `Successfully dropped table: ${table}`
});
// Commit after each table drop to ensure locks are released
await client.query('COMMIT');
// Start a new transaction for the next table
await client.query('BEGIN');
// Re-disable foreign key constraints for the new transaction
await client.query('SET session_replication_role = \'replica\'');
} catch (err) {
outputProgress({
status: 'error',
operation: 'Drop table error',
message: `Error dropping table ${table}: ${err.message}`
});
throw err;
await client.query('ROLLBACK');
// Re-start transaction for next table
await client.query('BEGIN');
await client.query('SET session_replication_role = \'replica\'');
}
}
@@ -164,6 +178,11 @@ async function resetMetrics() {
throw new Error(`Failed to drop all tables. Remaining tables: ${afterDrop.rows.map(t => t.name).join(', ')}`);
}
// Make sure we have a fresh transaction here
await client.query('COMMIT');
await client.query('BEGIN');
await client.query('SET session_replication_role = \'replica\'');
// Read metrics schema
outputProgress({
operation: 'Reading schema',
@@ -220,6 +239,13 @@ async function resetMetrics() {
rowCount: result.rowCount
}
});
// Commit every 10 statements to avoid long-running transactions
if (i > 0 && i % 10 === 0) {
await client.query('COMMIT');
await client.query('BEGIN');
await client.query('SET session_replication_role = \'replica\'');
}
} catch (sqlError) {
outputProgress({
status: 'error',
@@ -230,10 +256,17 @@ async function resetMetrics() {
statementNumber: i + 1
}
});
await client.query('ROLLBACK');
throw sqlError;
}
}
// Final commit for any pending statements
await client.query('COMMIT');
// Start new transaction for final checks
await client.query('BEGIN');
// Re-enable foreign key checks after all tables are created
await client.query('SET session_replication_role = \'origin\'');
@@ -269,9 +302,11 @@ async function resetMetrics() {
operation: 'Final table check',
message: `All database tables: ${finalCheck.rows.map(t => t.name).join(', ')}`
});
await client.query('ROLLBACK');
throw new Error(`Failed to create metrics tables: ${missingMetricsTables.join(', ')}`);
}
// Commit final transaction
await client.query('COMMIT');
outputProgress({
@@ -288,7 +323,11 @@ async function resetMetrics() {
});
if (client) {
await client.query('ROLLBACK');
try {
await client.query('ROLLBACK');
} catch (rollbackError) {
console.error('Error during rollback:', rollbackError);
}
// Make sure to re-enable foreign key checks even if there's an error
await client.query('SET session_replication_role = \'origin\'').catch(() => {});
}
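For context on the `session_replication_role` toggling above: setting it to `'replica'` suppresses ordinary triggers for the session, including the internal triggers that enforce foreign keys, and `'origin'` restores normal behavior, which is why the script switches it back before the final checks. A compressed sketch of the pattern:

```js
await client.query("SET session_replication_role = 'replica'"); // FK enforcement and user triggers off for this session
// ... drop and recreate metrics tables ...
await client.query("SET session_replication_role = 'origin'");  // restore normal enforcement
```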

View File

@@ -0,0 +1,337 @@
/**
* This script updates the costeach values for existing orders from the original MySQL database
* without needing to run the full import process.
*/
const dotenv = require("dotenv");
const path = require("path");
const fs = require("fs");
const { setupConnections, closeConnections } = require('./import/utils');
const { outputProgress, formatElapsedTime } = require('./metrics/utils/progress');
dotenv.config({ path: path.join(__dirname, "../.env") });
// SSH configuration
const sshConfig = {
ssh: {
host: process.env.PROD_SSH_HOST,
port: process.env.PROD_SSH_PORT || 22,
username: process.env.PROD_SSH_USER,
privateKey: process.env.PROD_SSH_KEY_PATH
? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
: undefined,
compress: true, // Enable SSH compression
},
prodDbConfig: {
// MySQL config for production
host: process.env.PROD_DB_HOST || "localhost",
user: process.env.PROD_DB_USER,
password: process.env.PROD_DB_PASSWORD,
database: process.env.PROD_DB_NAME,
port: process.env.PROD_DB_PORT || 3306,
timezone: 'Z',
},
localDbConfig: {
// PostgreSQL config for local
host: process.env.DB_HOST,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
port: process.env.DB_PORT || 5432,
ssl: process.env.DB_SSL === 'true',
connectionTimeoutMillis: 60000,
idleTimeoutMillis: 30000,
max: 10 // connection pool max size
}
};
async function updateOrderCosts() {
const startTime = Date.now();
let connections;
let updatedCount = 0;
let errorCount = 0;
try {
outputProgress({
status: "running",
operation: "Order costs update",
message: "Initializing SSH tunnel..."
});
connections = await setupConnections(sshConfig);
const { prodConnection, localConnection } = connections;
// 1. Get all orders from local database that need cost updates
outputProgress({
status: "running",
operation: "Order costs update",
message: "Getting orders from local database..."
});
const [orders] = await localConnection.query(`
SELECT DISTINCT order_number, pid
FROM orders
WHERE costeach = 0 OR costeach IS NULL
ORDER BY order_number
`);
if (!orders || !orders.rows || orders.rows.length === 0) {
console.log("No orders found that need cost updates");
return { updatedCount: 0, errorCount: 0 };
}
const totalOrders = orders.rows.length;
console.log(`Found ${totalOrders} orders that need cost updates`);
// Process in batches of 500 orders
const BATCH_SIZE = 500;
for (let i = 0; i < orders.rows.length; i += BATCH_SIZE) {
try {
// Start transaction for this batch
await localConnection.beginTransaction();
const batch = orders.rows.slice(i, i + BATCH_SIZE);
const orderNumbers = [...new Set(batch.map(o => o.order_number))];
// 2. Fetch costs from production database for these orders
outputProgress({
status: "running",
operation: "Order costs update",
message: `Fetching costs for orders ${i + 1} to ${Math.min(i + BATCH_SIZE, totalOrders)} of ${totalOrders}`,
current: i,
total: totalOrders,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
});
const [costs] = await prodConnection.query(`
SELECT
oc.orderid as order_number,
oc.pid,
oc.costeach
FROM order_costs oc
INNER JOIN (
SELECT
orderid,
pid,
MAX(id) as max_id
FROM order_costs
WHERE orderid IN (?)
AND pending = 0
GROUP BY orderid, pid
) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
`, [orderNumbers]);
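The `IN (?)` placeholder above works because this query runs against the production MySQL connection: mysql2 expands an array bound to a single `?` into a comma-separated list. node-postgres does not do this, which is why the PostgreSQL-side queries in this changeset use `$n` placeholders or `= ANY($1)` instead. A minimal comparison (values are illustrative; table and column names are taken from the script above):

```js
// mysql2 (production side): the bound array is expanded for IN (?)
await prodConnection.query('SELECT * FROM order_costs WHERE orderid IN (?)', [[101, 102, 103]]);

// node-postgres (local side): pass the array itself and match with = ANY($1)
await localConnection.query('SELECT * FROM orders WHERE order_number = ANY($1)', [['101', '102', '103']]);
```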
// Create a map of costs for easy lookup
const costMap = {};
if (costs && costs.length) {
costs.forEach(c => {
costMap[`${c.order_number}-${c.pid}`] = c.costeach || 0;
});
}
// 3. Update costs in local database by batches
// Using a more efficient update approach with a temporary table
// Create a temporary table for each batch
await localConnection.query(`
DROP TABLE IF EXISTS temp_order_costs;
CREATE TEMP TABLE temp_order_costs (
order_number VARCHAR(50) NOT NULL,
pid BIGINT NOT NULL,
costeach DECIMAL(10,3) NOT NULL,
PRIMARY KEY (order_number, pid)
);
`);
// Insert cost data into the temporary table
const costEntries = [];
for (const order of batch) {
const key = `${order.order_number}-${order.pid}`;
if (key in costMap) {
costEntries.push({
order_number: order.order_number,
pid: order.pid,
costeach: costMap[key]
});
}
}
// Insert in sub-batches of 50
const DB_BATCH_SIZE = 50;
for (let j = 0; j < costEntries.length; j += DB_BATCH_SIZE) {
const subBatch = costEntries.slice(j, j + DB_BATCH_SIZE);
if (subBatch.length === 0) continue;
const placeholders = subBatch.map((_, idx) =>
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
).join(',');
const values = subBatch.flatMap(item => [
item.order_number,
item.pid,
item.costeach
]);
await localConnection.query(`
INSERT INTO temp_order_costs (order_number, pid, costeach)
VALUES ${placeholders}
`, values);
}
// Perform bulk update from the temporary table
const [updateResult] = await localConnection.query(`
UPDATE orders o
SET costeach = t.costeach
FROM temp_order_costs t
WHERE o.order_number = t.order_number AND o.pid = t.pid
RETURNING o.id
`);
const batchUpdated = updateResult.rowCount || 0;
updatedCount += batchUpdated;
// Commit transaction for this batch
await localConnection.commit();
outputProgress({
status: "running",
operation: "Order costs update",
message: `Updated ${updatedCount} orders with costs from production (batch: ${batchUpdated})`,
current: i + batch.length,
total: totalOrders,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
});
} catch (error) {
// If a batch fails, roll back that batch's transaction and continue
try {
await localConnection.rollback();
} catch (rollbackError) {
console.error("Error during batch rollback:", rollbackError);
}
console.error(`Error processing batch ${i}-${i + BATCH_SIZE}:`, error);
errorCount++;
}
}
// 4. For orders with no matching costs, set a default based on price
outputProgress({
status: "running",
operation: "Order costs update",
message: "Setting default costs for remaining orders..."
});
// Process remaining updates in smaller batches
const DEFAULT_BATCH_SIZE = 10000;
let totalDefaultUpdated = 0;
try {
// Start with a count query to determine how many records need the default update
const [countResult] = await localConnection.query(`
SELECT COUNT(*) as count FROM orders
WHERE (costeach = 0 OR costeach IS NULL)
`);
const totalToUpdate = parseInt(countResult.rows[0]?.count || 0);
if (totalToUpdate > 0) {
console.log(`Applying default cost to ${totalToUpdate} orders`);
// Apply the default in batches with separate transactions
for (let i = 0; i < totalToUpdate; i += DEFAULT_BATCH_SIZE) {
try {
await localConnection.beginTransaction();
const [defaultUpdates] = await localConnection.query(`
WITH orders_to_update AS (
SELECT id FROM orders
WHERE (costeach = 0 OR costeach IS NULL)
LIMIT ${DEFAULT_BATCH_SIZE}
)
UPDATE orders o
SET costeach = price * 0.5
FROM orders_to_update otu
WHERE o.id = otu.id
RETURNING o.id
`);
const batchDefaultUpdated = defaultUpdates.rowCount || 0;
totalDefaultUpdated += batchDefaultUpdated;
await localConnection.commit();
outputProgress({
status: "running",
operation: "Order costs update",
message: `Applied default costs to ${totalDefaultUpdated} of ${totalToUpdate} orders`,
current: totalDefaultUpdated,
total: totalToUpdate,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
});
} catch (error) {
try {
await localConnection.rollback();
} catch (rollbackError) {
console.error("Error during default update rollback:", rollbackError);
}
console.error(`Error applying default costs batch ${i}-${i + DEFAULT_BATCH_SIZE}:`, error);
errorCount++;
}
}
}
} catch (error) {
console.error("Error counting or updating remaining orders:", error);
errorCount++;
}
updatedCount += totalDefaultUpdated;
const endTime = Date.now();
const totalSeconds = (endTime - startTime) / 1000;
outputProgress({
status: "complete",
operation: "Order costs update",
message: `Updated ${updatedCount} orders (${totalDefaultUpdated} with default values) in ${formatElapsedTime(totalSeconds)}`,
elapsed: formatElapsedTime(totalSeconds)
});
return {
status: "complete",
updatedCount,
errorCount
};
} catch (error) {
console.error("Error during order costs update:", error);
return {
status: "error",
error: error.message,
updatedCount,
errorCount
};
} finally {
if (connections) {
await closeConnections(connections).catch(err => {
console.error("Error closing connections:", err);
});
}
}
}
// Run the script only if this is the main module
if (require.main === module) {
updateOrderCosts().then((results) => {
console.log('Cost update completed:', results);
// Force exit after a small delay to ensure all logs are written
setTimeout(() => process.exit(0), 500);
}).catch((error) => {
console.error("Unhandled error:", error);
// Force exit with error code after a small delay
setTimeout(() => process.exit(1), 500);
});
}
// Export the function for use in other scripts
module.exports = updateOrderCosts;
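Since the file path isn't shown in this diff, the following invocation is only a sketch: the script can be run directly (it exits on its own via the `require.main` guard) or awaited from another script through the exported function.

```js
// From the command line (hypothetical path):
//   node server/scripts/update-order-costs.js
// From another script:
const updateOrderCosts = require('./update-order-costs'); // hypothetical path
updateOrderCosts().then(results => console.log('Cost update finished:', results));
```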

View File

@@ -1,226 +0,0 @@
I will provide a JSON array with product data. Process the array by combining all products from validData and invalidData arrays into a single array, excluding any fields starting with “__”, such as “__index” or “__errors”. Process each product according to the reference guidelines below. If a field is not included in the data, do not include it in your response (e.g. do not include its key or any value) unless the specific field guidelines below say otherwise. If a product appears to be from an empty or entirely invalid line, do not include it in your response.
Your response should be a JSON object with the following structure:
{
"correctedData": [], // Array of corrected products
"changes": [], // Array of strings describing each change made
"warnings": [] // Array of strings with warnings or suggestions for manual review (see below for details)
}
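For illustration only, a response for a single corrected product might look like the following (all IDs and values here are hypothetical, not taken from real data):
{
  "correctedData": [
    { "name": "Cosmos Infinity Chipboard - Stamperia", "description": "Includes one sheet of laser-cut chipboard embellishments.", "msrp": "14.99", "categories": "12,48" }
  ],
  "changes": ["Corrected 'Chipbord' to 'Chipboard' in the product name"],
  "warnings": ["msrp differs noticeably from similar Stamperia chipboard items; review recommended"]
}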
IMPORTANT: For all fields that use IDs (categories, supplier, company, line, subline, ship_restrictions, tax_cat, artist, themes, etc.), you MUST return the ID values, not the display names. The system will handle converting IDs to display names.
Using the provided guidelines, focus on:
1. Correcting typos and any incorrect spelling or grammar
2. Standardizing product names
3. Correcting and enhancing descriptions by adding details, keywords, and SEO-friendly language
4. Fixing any obvious errors or inconsistencies between similar products in measurements, prices, or quantities
5. Adding correct categories, themes, and colors
Use only the provided data and your own knowledge to make changes. Do not make assumptions or make up information that you're not sure about. If you're unable to make a change you're confident about, leave the field as is. All data passed in should be validated, corrected, and returned. All values returned should be strings, not numbers. Do not leave out any fields that were present in the original data.
Possible reasons for including a warning in the warnings array:
- If you're unable to make a change you're confident about but you believe one needs to be made
- If there are inconsistencies in the data that could be valid but need to be reviewed
- If not enough information is provided to make a change that you believe is needed
- If you infer a value for a required field based on context
----------PRODUCT FIELD GUIDELINES----------
Fields: supplier, private_notes, company, line, subline, artist
Changes: Not allowed
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, return these fields exactly as provided with no changes
Fields: upc, supplier_no, notions_no, item_number
Changes: Formatting only
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, trim outside white space and return these fields exactly as provided with no other changes
Fields: hts_code
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, trim white space and any non-numeric characters, then return as a string. Do not validate in any other way.
Fields: image_url
Changes: Formatting only
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, convert all comma-separated values to valid https:// URLs and return
Fields: msrp, cost_each
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, strip any currency symbols and return as a string with exactly two decimal places, even if the last place is a 0.
Fields: qty_per_unit, case_qty
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, strip non-numeric characters and return
Fields: ship_restrictions
Changes: Only add a value if it's not already present
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return 0.
Instructions: Always return a value exactly as provided, or return 0 if no value is provided.
Fields: eta
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, return a full month name, day is optional, no year ever (e.g. “January” or “March 3”). This value is not required if not provided.
Fields: name
Changes: Allowed to conform to guidelines, to fix typos or formatting
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return the most reasonable value possible based on the naming guidelines and the other information you have.
Instructions: Always return a value that is corrected and enhanced per additional guidelines below
Fields: description
Changes: Full creative control allowed within guidelines
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return the most accurate description possible based on the description guidelines and the other information you have.
Instructions: Always return a value that is corrected and enhanced per additional guidelines below
Fields: weight, length, width, height
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return your best guess based on the other information you have or the dimensions for similar products.
Instructions: Always return a reasonable value (weights in ounces and dimensions in inches) that is validated against similar provided products and your knowledge of general object measurements (e.g. a sheet of paper is not going to be 3 inches thick, a pack of stickers is not going to be 250 ounces, this sheet of paper is very likely going to be the same size as that other sheet of paper from the same line). If a value is unusual or unreasonable, even wildly so, change it to match similar products or to be more reasonable. When correcting unreasonable weights or dimensions, prioritize comparisons to products from the same company and product line first, then broader category matches or common knowledge if necessary. Do not return 0 or null for any of these fields.
Fields: coo
Changes: Formatting only
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, convert all country names and abbreviations to the official ISO 3166-1 alpha-2 two-character country code. Convert any value with more than two characters to two characters only (e.g. "United States" or "USA" should both return "US").
Fields: tax_cat
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return 0.
Instructions: Always return a valid numerical tax code ID from the Available Tax Codes array below. Give preference to the value provided, but correct it if another value is more accurate. You must return a value for this field. 0 should be the default value in most cases.
Fields: size_cat
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: Return if present in the original data or if not present and applicable. Do not return if not applicable (e.g. if no size categories apply based on what you know about the product).
Instructions: If present or if applicable, return one valid numerical size category ID from the Available Size Categories array below. Give preference to the value provided, but correct it if another value is more accurate. If the product name contains a match for one of the size categories (such as 12x12, 6x6, 2oz, etc) you MUST return that size category with the results. A value is not required if none of the size categories apply.
Fields: themes
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: Return if present in the original data or if not present and applicable. Do not return any value if not applicable (e.g. if no themes apply based on what you know about the product).
Instructions: If present, confirm that each provided theme matches what you understand to be a theme of the product. Remove any themes that do not match and add any themes that are missing. Most products will have zero or one theme. Return a comma-separated list of numerical theme IDs from the Available Themes array below. If you choose a sub-theme, you do not need to include its parent theme in the list.
Fields: colors
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: Return if present in the original data or if not present and applicable. Do not return any value if not applicable (e.g. if no colors apply based on what you know about the product).
Instructions: If present or if applicable, return a comma-separated list of numerical color IDs from the Available Colors array below, using the product name as the primary guide (e.g. if the name contains Blue or a blue variant, you should return the blue color ID). A value is not required if none of the colors apply. Most products will have zero colors.
Fields: categories
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: You must always return at least one value for this field, even if it's not provided in the original data. If no value is provided, return the most appropriate category or categories based on the other information you have.
Instructions: Always return a comma-separated list of one or more valid numerical category IDs from the Available Categories array below. Give preference to the values provided, particularly if the other information isn't enough to determine a category, but correct them or add new categories if another value is more accurate. Do not return categories in the Deals or Black Friday categories, and strip these from the list if present. If you choose a subcategory at any level, you do not need to include its parent categories in the list. You must return at least one category and you can return multiple categories if applicable. All categories have equal value so their order is not important. Always try to return the most specific categories possible (e.g. one in the third level of the category hierarchy is better than one in the second level).
----------PRODUCT NAMING GUIDELINES----------
If there's only one of this type of product in a line: [Line Name] [Product Name] - [Company]
Example: "Cosmos Infinity Chipboard - Stamperia"
Example: "Serene Petals 6x6 Paper Pad - Prima"
Multiple similar products in a line: [Differentiator] [Product Type] - [Line Name] - [Company]
Example: "Ice & Shells Stencil - Arctic Antarctic - Stamperia"
Example: "Astronomy Paper - Cosmos Infinity - Stamperia"
Standalone products: [Product Name] - [Company]
Example: "Hedwig Puffy Stickers - Paper House Productions"
Example: "Heart Tree Dies - Lawn Fawn"
Color-based products: [Color] [Product Name] - [Company]
Example: "Green Valley Enamel Dots - Altenew"
Example: "Magenta Aqua Pigment - Brutus Monroe"
Complex products: [Differentiator] [Line] [Product Type] - [Company]
Example: "Size 6 Round Black Velvet Watercolor Brush - Silver Brush Limited" (Size 6 Round is the differentiator, Black Velvet is the line, Watercolor Brush is the product type)
These should not be included in the name, unless there are multiple products that are otherwise identical:
- Product size
- Product weight
- Number of pages
- How many are in the package
Naming Conventions:
- Paper sizes: Use "12x12", "8x8", "6x6" (no spaces or units of measure)
- Company names must match backend exactly
- Always capitalize every word in the name, including short articles like "The" and "An"
- Use "Idea-ology" (not "idea-ology" or "Ideaology")
- All stamps are "Stamp Set" (not "Clear Stamps" or "Rubber Stamps")
- All dies are "Dies" or "Die" (not "Die Set")
- Brands with their own naming conventions should be respected, such as "Doodle Cuts" for dies from Doodlebug
Special Brand Rules - Ranger:
Format: [Product Name] - [Designer Line] - Ranger
Possible Designers: Dylusions, Dina Wakley MEdia, Simon Hurley create., Wendy Vecchi
Example: "Stacked Stencil - Dina Wakley MEdia - Ranger"
Special Brand Rules - Tim Holtz products from Ranger:
Format: [Color] [Product Name/Type] - Tim Holtz Distress - Ranger
Example: "Mermaid Lagoon Tim Holtz Distress Oxide Ink Pad - Ranger"
Special Brand Rules - Tim Holtz products from Sizzix or Stampers Anonymous:
Format: [Product Name] [Product Type] by Tim Holtz - [Company]
Example: "Leaf Fragments Thinlits Dies by Tim Holtz - Sizzix"
Special Brand Rules - Tim Holtz products from Advantus/Idea-ology:
Format: [Product Name] - Tim Holtz Idea-ology
Example: "Tiny Vials - Tim Holtz Idea-ology"
Special Brand Rules - Dies from Sizzix:
Include die type plus "Dies" or "Die"
Examples:
"Art Nouveau 3-D Textured Impressions Embossing Folder - Sizzix"
"Pocket Pals Thinlits Dies - Sizzix"
"Butterfly Wishes Framelits Dies & Stamps - Sizzix"
Important Notes
- Ensure that product names are consistent across all products of the same type
- Use the minimum amount of information needed to uniquely identify the product
- Put detailed specifications in the product description, not its name
Edge Cases
- If the product is missing a company name, infer one from the other products included in the data
- If the product is missing a clear differentiator and needs one to be unique, infer and add one from the other data provided (e.g. the description, existing size categories, etc.)
Incorrect Example: MVP Rugby - Collection Pack - Photoplay
Notes: there should be no dash between the line and the product
Incorrect Example: A2 Easel Cards - Black - Photoplay
Notes: the differentiating factor should come first: “Black A2 Easel Cards - Photoplay”. Size is ok to include here because this is the name printed on the package.
Incorrect Example: 6” - Scriber Needle Modeling Tool
Notes: this product only comes in one size, so 6” isn't needed. The company name should also be included.
Incorrect Example: Slick - White - Tulip Dimensional Fabric Paint 4oz
Notes: color should be first, then type, then product, then company, so “White Slick Dimensional Fabric Paint - Tulip”. It appears there's only one size available, so there is no need to differentiate in the name.
Incorrect Example: Silhouette Adhesive Cork Sheets 5”X7” 8/Pkg
Notes: should be “Adhesive Cork Sheets - Silhouette”
Incorrect Example: Galaxy - Opaque - American Crafts Color Pour Resin Dyes
Notes: “Galaxy Opaque Dye Set - Color Pour Resin - American Crafts”
Incorrect Example: Slate - Lion Brand Truboo Yarn
Notes: [Differentiator] [Line] [Product Type] - [Company] : “Slate Truboo Yarn - Lion Brand”
Incorrect Example: Rose Quartz Dylusions Shimmer Paint
Notes: “Rose Quartz Shimmer Paint - Dylusions - Ranger”
----------PRODUCT DESCRIPTION GUIDELINES----------
Product descriptions are an extremely important part of the listing and are the most important part of your response. Care should be taken to ensure they are correct, helpful, and SEO-friendly.
If a description is provided in the data, use it as a starting point. Correct any spelling errors, typos, poor grammar, or awkward phrasing. If necessary and you have the information, add more details, describe how the customer could use it, etc. Use complete sentences and keep SEO in mind.
If no description is provided, make one up using the product name, the information you have, and the other provided guidelines. At minimum, a description should be one complete sentence that starts with a capital letter and ends with a period. Unless the product is extremely complex, 2-4 sentences is usually sufficient if you have enough information.
Important Notes:
- Every description should state exactly what's included in the product (e.g. "Includes one 12x12 sheet of patterned cardstock." or "Includes one 6x12 sheet with 27 unique stickers." or "Includes 55 pieces." or "Package includes machine, power cord, 12 sheets of cardstock, 3 dies, and project instructions.")
- Do not use the word "our" in the description (this usually shows up when we copy a description from the manufacturer). Instead use "these" or "[Company name] [product]" or similar. (e.g. don't use "Our journals are hand-made in the USA", instead use "These journals are hand made..." or "Archer & Olive journals are handmade...")
- Don't include statements that add no value like “this is perfect for all your paper crafts”. If the product helps to solve a unique problem or has a unique feature, by all means describe it, but if it's just a normal sheet of paper or pack of stickers, you don't have to pretend it's the best thing ever. At the same time, make sure you add enough copy for good SEO.
- State as many facts as you can about the product, considering the viewpoint of the customer and what they would want to know when looking at it. They probably want to know dimensions, what products it's compatible with, how thick the paper is, how many sheets are included, whether the sheets are double-sided or not, which items are in the kit, etc. Say as much as you possibly can with the information that you have.
- !!DO NOT make up information if you aren't sure about it. A minimal correct description is better than a long incorrect one!!
Avoid/remove:
- The word "Imported"
- Any warnings about Prop 65, choking hazards, etc
- The manufacturer's name if it's included as the very first thing in the description
- Any statement similar to "comes in a variety of colors, each sold separately"

View File

@@ -0,0 +1,335 @@
const express = require('express');
const router = express.Router();
// Get all AI prompts
router.get('/', async (req, res) => {
try {
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM ai_prompts
ORDER BY prompt_type ASC, company ASC
`);
res.json(result.rows);
} catch (error) {
console.error('Error fetching AI prompts:', error);
res.status(500).json({
error: 'Failed to fetch AI prompts',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get prompt by ID
router.get('/:id', async (req, res) => {
try {
const { id } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM ai_prompts
WHERE id = $1
`, [id]);
if (result.rows.length === 0) {
return res.status(404).json({ error: 'AI prompt not found' });
}
res.json(result.rows[0]);
} catch (error) {
console.error('Error fetching AI prompt:', error);
res.status(500).json({
error: 'Failed to fetch AI prompt',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get prompt by company
router.get('/company/:companyId', async (req, res) => {
try {
const { companyId } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM ai_prompts
WHERE company = $1
`, [companyId]);
if (result.rows.length === 0) {
return res.status(404).json({ error: 'AI prompt not found for this company' });
}
res.json(result.rows[0]);
} catch (error) {
console.error('Error fetching AI prompt by company:', error);
res.status(500).json({
error: 'Failed to fetch AI prompt by company',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get general prompt
router.get('/type/general', async (req, res) => {
try {
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'general'
`);
if (result.rows.length === 0) {
return res.status(404).json({ error: 'General AI prompt not found' });
}
res.json(result.rows[0]);
} catch (error) {
console.error('Error fetching general AI prompt:', error);
res.status(500).json({
error: 'Failed to fetch general AI prompt',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get system prompt
router.get('/type/system', async (req, res) => {
try {
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'system'
`);
if (result.rows.length === 0) {
return res.status(404).json({ error: 'System AI prompt not found' });
}
res.json(result.rows[0]);
} catch (error) {
console.error('Error fetching system AI prompt:', error);
res.status(500).json({
error: 'Failed to fetch system AI prompt',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Create new AI prompt
router.post('/', async (req, res) => {
try {
const {
prompt_text,
prompt_type,
company
} = req.body;
// Validate required fields
if (!prompt_text || !prompt_type) {
return res.status(400).json({ error: 'Prompt text and type are required' });
}
// Validate prompt type
if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
return res.status(400).json({ error: 'Prompt type must be one of "general", "company_specific", or "system"' });
}
// Validate company is provided for company-specific prompts
if (prompt_type === 'company_specific' && !company) {
return res.status(400).json({ error: 'Company is required for company-specific prompts' });
}
// Validate company is not provided for general or system prompts
if ((prompt_type === 'general' || prompt_type === 'system') && company) {
return res.status(400).json({ error: 'Company should not be provided for general or system prompts' });
}
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
INSERT INTO ai_prompts (
prompt_text,
prompt_type,
company
) VALUES ($1, $2, $3)
RETURNING *
`, [
prompt_text,
prompt_type,
company
]);
res.status(201).json(result.rows[0]);
} catch (error) {
console.error('Error creating AI prompt:', error);
// Check for unique constraint violations
if (error instanceof Error && error.message.includes('unique constraint')) {
if (error.message.includes('unique_company_prompt')) {
return res.status(409).json({
error: 'A prompt already exists for this company',
details: error.message
});
} else if (error.message.includes('idx_unique_general_prompt')) {
return res.status(409).json({
error: 'A general prompt already exists',
details: error.message
});
} else if (error.message.includes('idx_unique_system_prompt')) {
return res.status(409).json({
error: 'A system prompt already exists',
details: error.message
});
}
}
res.status(500).json({
error: 'Failed to create AI prompt',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Update AI prompt
router.put('/:id', async (req, res) => {
try {
const { id } = req.params;
const {
prompt_text,
prompt_type,
company
} = req.body;
// Validate required fields
if (!prompt_text || !prompt_type) {
return res.status(400).json({ error: 'Prompt text and type are required' });
}
// Validate prompt type
if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
return res.status(400).json({ error: 'Prompt type must be one of "general", "company_specific", or "system"' });
}
// Validate company is provided for company-specific prompts
if (prompt_type === 'company_specific' && !company) {
return res.status(400).json({ error: 'Company is required for company-specific prompts' });
}
// Validate company is not provided for general or system prompts
if ((prompt_type === 'general' || prompt_type === 'system') && company) {
return res.status(400).json({ error: 'Company should not be provided for general or system prompts' });
}
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Check if the prompt exists
const checkResult = await pool.query('SELECT * FROM ai_prompts WHERE id = $1', [id]);
if (checkResult.rows.length === 0) {
return res.status(404).json({ error: 'AI prompt not found' });
}
const result = await pool.query(`
UPDATE ai_prompts
SET
prompt_text = $1,
prompt_type = $2,
company = $3
WHERE id = $4
RETURNING *
`, [
prompt_text,
prompt_type,
company,
id
]);
res.json(result.rows[0]);
} catch (error) {
console.error('Error updating AI prompt:', error);
// Check for unique constraint violations
if (error instanceof Error && error.message.includes('unique constraint')) {
if (error.message.includes('unique_company_prompt')) {
return res.status(409).json({
error: 'A prompt already exists for this company',
details: error.message
});
} else if (error.message.includes('idx_unique_general_prompt')) {
return res.status(409).json({
error: 'A general prompt already exists',
details: error.message
});
} else if (error.message.includes('idx_unique_system_prompt')) {
return res.status(409).json({
error: 'A system prompt already exists',
details: error.message
});
}
}
res.status(500).json({
error: 'Failed to update AI prompt',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Delete AI prompt
router.delete('/:id', async (req, res) => {
try {
const { id } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query('DELETE FROM ai_prompts WHERE id = $1 RETURNING *', [id]);
if (result.rows.length === 0) {
return res.status(404).json({ error: 'AI prompt not found' });
}
res.json({ message: 'AI prompt deleted successfully' });
} catch (error) {
console.error('Error deleting AI prompt:', error);
res.status(500).json({
error: 'Failed to delete AI prompt',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Error handling middleware
router.use((err, req, res, next) => {
console.error('AI prompts route error:', err);
res.status(500).json({
error: 'Internal server error',
details: err.message
});
});
module.exports = router;
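A minimal client-side sketch for the routes above (the mount point is an assumption; this diff doesn't show where the router is attached):

```js
// Assuming app.use('/api/ai-prompts', router) somewhere in the server setup:
async function createGeneralPrompt() {
  const res = await fetch('/api/ai-prompts', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt_text: 'Focus on paper weight and sheet counts.', prompt_type: 'general' }),
  });
  return res.json(); // 201 with the created row, or 409 if a general prompt already exists
}
```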

View File

@@ -289,8 +289,108 @@ async function generateDebugResponse(productsToUse, res) {
});
try {
const prompt = await loadPrompt(promptConnection, productsToUse);
const fullPrompt = prompt + "\n" + JSON.stringify(productsToUse);
// Get the local PostgreSQL pool to fetch prompts
const pool = res.app.locals.pool;
if (!pool) {
console.warn("⚠️ Local database pool not available for prompts");
throw new Error("Database connection not available");
}
// First, fetch the system prompt
const systemPromptResult = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'system'
`);
// Get system prompt or use default
let systemPrompt = null;
if (systemPromptResult.rows.length > 0) {
systemPrompt = systemPromptResult.rows[0];
console.log("📝 Loaded system prompt from database, ID:", systemPrompt.id);
} else {
console.warn("⚠️ No system prompt found in database, will use default");
}
// Then, fetch the general prompt
const generalPromptResult = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'general'
`);
if (generalPromptResult.rows.length === 0) {
console.warn("⚠️ No general prompt found in database");
throw new Error("No general prompt found in database");
}
// Get the general prompt text and info
const generalPrompt = generalPromptResult.rows[0];
console.log("📝 Loaded general prompt from database, ID:", generalPrompt.id);
// Fetch company-specific prompts if we have products to validate
let companyPrompts = [];
if (productsToUse && Array.isArray(productsToUse)) {
// Extract unique company IDs from products
const companyIds = new Set();
productsToUse.forEach(product => {
if (product.company) {
companyIds.add(String(product.company));
}
});
if (companyIds.size > 0) {
console.log(`🔍 Found ${companyIds.size} unique companies in products:`, Array.from(companyIds));
// Fetch company-specific prompts
const companyPromptsResult = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'company_specific'
AND company = ANY($1)
`, [Array.from(companyIds)]);
companyPrompts = companyPromptsResult.rows;
console.log(`📝 Loaded ${companyPrompts.length} company-specific prompts`);
}
}
// Find company names from taxonomy for the validation endpoint
const companyPromptsWithNames = companyPrompts.map(prompt => {
let companyName = "Unknown Company";
if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
const companyData = taxonomy.companies.find(company =>
String(company[0]) === String(prompt.company)
);
if (companyData && companyData[1]) {
companyName = companyData[1];
}
}
return {
id: prompt.id,
company: prompt.company,
companyName: companyName,
prompt_text: prompt.prompt_text
};
});
// Now use loadPrompt to get the actual combined prompt
const promptData = await loadPrompt(promptConnection, productsToUse, res.app.locals.pool);
const fullUserPrompt = promptData.userContent + "\n" + JSON.stringify(productsToUse);
const promptLength = promptData.systemInstructions.length + fullUserPrompt.length; // Store prompt length for performance metrics
console.log("📝 Generated prompt length:", promptLength);
console.log("📝 System instructions length:", promptData.systemInstructions.length);
console.log("📝 User content length:", fullUserPrompt.length);
// Format the messages as they would be sent to the API
const apiMessages = [
{
role: "system",
content: promptData.systemInstructions
},
{
role: "user",
content: fullUserPrompt
}
];
// Create the response with taxonomy stats
let categoriesCount = 0;
@@ -330,9 +430,28 @@ async function generateDebugResponse(productsToUse, res) {
: null,
}
: null,
basePrompt: prompt,
sampleFullPrompt: fullPrompt,
promptLength: fullPrompt.length,
basePrompt: systemPrompt ? systemPrompt.prompt_text + "\n\n" + generalPrompt.prompt_text : generalPrompt.prompt_text,
sampleFullPrompt: fullUserPrompt,
promptLength: promptLength,
apiFormat: apiMessages,
promptSources: {
...(systemPrompt ? {
systemPrompt: {
id: systemPrompt.id,
prompt_text: systemPrompt.prompt_text
}
} : {
systemPrompt: {
id: 0,
prompt_text: `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`
}
}),
generalPrompt: {
id: generalPrompt.id,
prompt_text: generalPrompt.prompt_text
},
companyPrompts: companyPromptsWithNames
}
};
console.log("Sending response with taxonomy stats:", response.taxonomyStats);
@@ -513,22 +632,101 @@ SELECT t.cat_id,t.name,null as master_cat_id,1 AS level_order FROM product_categ
}
}
// Load the prompt from file and inject taxonomy data
async function loadPrompt(connection, productsToValidate = null) {
// Load prompts from database and inject taxonomy data
async function loadPrompt(connection, productsToValidate = null, appPool = null) {
try {
const promptPath = path.join(
__dirname,
"..",
"prompts",
"product-validation.txt"
);
const basePrompt = await fs.readFile(promptPath, "utf8");
// Get taxonomy data using the provided MySQL connection
const taxonomy = await getTaxonomyData(connection);
// Add system instructions to the prompt
const systemInstructions = `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`;
// Use the provided pool parameter instead of global.app
const pool = appPool;
if (!pool) {
console.warn("⚠️ Local database pool not available for prompts");
throw new Error("Database connection not available");
}
// Fetch the system prompt
const systemPromptResult = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'system'
`);
// Default system instructions in case the system prompt is not found
let systemInstructions = `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`;
// If system prompt exists in the database, use it
if (systemPromptResult.rows.length > 0) {
systemInstructions = systemPromptResult.rows[0].prompt_text;
console.log("📝 Loaded system prompt from database");
} else {
console.warn("⚠️ No system prompt found in database, using default");
}
// Fetch the general prompt
const generalPromptResult = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'general'
`);
if (generalPromptResult.rows.length === 0) {
console.warn("⚠️ No general prompt found in database");
throw new Error("No general prompt found in database");
}
// Get the general prompt text
const basePrompt = generalPromptResult.rows[0].prompt_text;
console.log("📝 Loaded general prompt from database");
// Fetch company-specific prompts if we have products to validate
let companyPrompts = [];
if (productsToValidate && Array.isArray(productsToValidate)) {
// Extract unique company IDs from products
const companyIds = new Set();
productsToValidate.forEach(product => {
if (product.company) {
companyIds.add(String(product.company));
}
});
if (companyIds.size > 0) {
console.log(`🔍 Found ${companyIds.size} unique companies in products:`, Array.from(companyIds));
// Fetch company-specific prompts
const companyPromptsResult = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'company_specific'
AND company = ANY($1)
`, [Array.from(companyIds)]);
companyPrompts = companyPromptsResult.rows;
console.log(`📝 Loaded ${companyPrompts.length} company-specific prompts`);
}
}
// Combine prompts - start with the general prompt
let combinedPrompt = basePrompt;
// Add any company-specific prompts with annotations
if (companyPrompts.length > 0) {
combinedPrompt += "\n\n--- COMPANY-SPECIFIC INSTRUCTIONS ---\n";
for (const prompt of companyPrompts) {
// Find company name from taxonomy
let companyName = "Unknown Company";
if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
const companyData = taxonomy.companies.find(company =>
String(company[0]) === String(prompt.company)
);
if (companyData && companyData[1]) {
companyName = companyData[1];
}
}
combinedPrompt += `\n[SPECIFIC TO COMPANY: ${companyName} (ID: ${prompt.company})]:\n${prompt.prompt_text}\n`;
}
combinedPrompt += "\n--- END COMPANY-SPECIFIC INSTRUCTIONS ---\n";
}
// If we have products to validate, create a filtered prompt
if (productsToValidate) {
@@ -655,11 +853,14 @@ ${JSON.stringify(mixedTaxonomy.sizeCategories)}${
----------Here is the product data to validate----------`;
// Return the filtered prompt
return systemInstructions + basePrompt + "\n" + taxonomySection;
// Return both system instructions and user content separately
return {
systemInstructions,
userContent: combinedPrompt + "\n" + taxonomySection
};
}
// Generate the full unfiltered prompt
// Generate the full unfiltered prompt for taxonomy section
const taxonomySection = `
Available Categories:
${JSON.stringify(taxonomy.categories)}
@@ -687,7 +888,11 @@ ${JSON.stringify(taxonomy.artists)}
Here is the product data to validate:`;
return systemInstructions + basePrompt + "\n" + taxonomySection;
// Return both system instructions and user content separately
return {
systemInstructions,
userContent: combinedPrompt + "\n" + taxonomySection
};
} catch (error) {
console.error("Error loading prompt:", error);
throw error; // Re-throw to be handled by the calling function
@@ -735,18 +940,24 @@ router.post("/validate", async (req, res) => {
// Load the prompt with the products data to filter taxonomy
console.log("🔄 Loading prompt with filtered taxonomy...");
const prompt = await loadPrompt(connection, products);
const fullPrompt = prompt + "\n" + JSON.stringify(products);
promptLength = fullPrompt.length; // Store prompt length for performance metrics
const promptData = await loadPrompt(connection, products, req.app.locals.pool);
const fullUserPrompt = promptData.userContent + "\n" + JSON.stringify(products);
const promptLength = promptData.systemInstructions.length + fullUserPrompt.length; // Store prompt length for performance metrics
console.log("📝 Generated prompt length:", promptLength);
console.log("📝 System instructions length:", promptData.systemInstructions.length);
console.log("📝 User content length:", fullUserPrompt.length);
console.log("🤖 Sending request to OpenAI...");
const completion = await openai.chat.completions.create({
model: "o3-mini",
model: "gpt-4o",
messages: [
{
role: "system",
content: promptData.systemInstructions,
},
{
role: "user",
content: fullPrompt,
content: fullUserPrompt,
},
],
temperature: 0.2,
@@ -884,7 +1095,94 @@ router.post("/validate", async (req, res) => {
console.error("⚠️ Failed to record performance metrics:", metricError);
}
// Include performance metrics in the response
// Get sources of the prompts for tracking
let promptSources = null;
try {
// Get system prompt
const systemPromptResult = await pool.query(`
SELECT * FROM ai_prompts WHERE prompt_type = 'system'
`);
// Get general prompt
const generalPromptResult = await pool.query(`
SELECT * FROM ai_prompts WHERE prompt_type = 'general'
`);
// Extract unique company IDs from products
const companyIds = new Set();
products.forEach(product => {
if (product.company) {
companyIds.add(String(product.company));
}
});
let companyPrompts = [];
if (companyIds.size > 0) {
// Fetch company-specific prompts
const companyPromptsResult = await pool.query(`
SELECT * FROM ai_prompts
WHERE prompt_type = 'company_specific'
AND company = ANY($1)
`, [Array.from(companyIds)]);
companyPrompts = companyPromptsResult.rows;
}
// Find company names from taxonomy for the validation endpoint
const companyPromptsWithNames = companyPrompts.map(prompt => {
let companyName = "Unknown Company";
if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
const companyData = taxonomy.companies.find(company =>
String(company[0]) === String(prompt.company)
);
if (companyData && companyData[1]) {
companyName = companyData[1];
}
}
return {
id: prompt.id,
company: prompt.company,
companyName: companyName,
prompt_text: prompt.prompt_text
};
});
// Set prompt sources
if (generalPromptResult.rows.length > 0) {
const generalPrompt = generalPromptResult.rows[0];
let systemPrompt = null;
if (systemPromptResult.rows.length > 0) {
systemPrompt = systemPromptResult.rows[0];
}
promptSources = {
...(systemPrompt ? {
systemPrompt: {
id: systemPrompt.id,
prompt_text: systemPrompt.prompt_text
}
} : {
systemPrompt: {
id: 0,
prompt_text: `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`
}
}),
generalPrompt: {
id: generalPrompt.id,
prompt_text: generalPrompt.prompt_text
},
companyPrompts: companyPromptsWithNames
};
}
} catch (promptSourceError) {
console.error("⚠️ Error getting prompt sources:", promptSourceError);
// Don't fail the entire validation response if only the prompt-source lookup fails
}
// Include prompt sources in the response
res.json({
success: true,
changeDetails: changeDetails,
@@ -895,6 +1193,7 @@ router.post("/validate", async (req, res) => {
isEstimate: true,
productCount: products.length
},
promptSources: promptSources,
...aiResponse,
});
} catch (parseError) {
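Note: the queries in this route read from an ai_prompts table whose migration is not part of the hunks shown here. As a point of reference, a minimal sketch of a compatible definition, assuming only the columns the code actually touches (id, prompt_type, company, prompt_text); the real table may carry more columns or constraints.
// Hypothetical shape of the ai_prompts table, inferred from the queries above.
const createAiPrompts = `
  CREATE TABLE IF NOT EXISTS ai_prompts (
    id          SERIAL PRIMARY KEY,
    prompt_type TEXT NOT NULL CHECK (prompt_type IN ('system', 'general', 'company_specific')),
    company     TEXT,        -- only meaningful when prompt_type = 'company_specific'
    prompt_text TEXT NOT NULL
  )
`;
// e.g. await pool.query(createAiPrompts);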


@@ -79,7 +79,7 @@ router.get('/profit', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
cp.path || ' > ' || c.name
(cp.path || ' > ' || c.name)::text
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
)
@@ -137,7 +137,7 @@ router.get('/profit', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
cp.path || ' > ' || c.name
(cp.path || ' > ' || c.name)::text
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
)
@@ -175,6 +175,13 @@ router.get('/vendors', async (req, res) => {
try {
const pool = req.app.locals.pool;
// Set cache-control headers so the client never gets a 304 Not Modified response
res.set({
'Cache-Control': 'no-cache, no-store, must-revalidate',
'Pragma': 'no-cache',
'Expires': '0'
});
console.log('Fetching vendor performance data...');
// First check if we have any vendors with sales
@@ -189,7 +196,7 @@ router.get('/vendors', async (req, res) => {
console.log('Vendor data check:', checkData);
// Get vendor performance metrics
const { rows: performance } = await pool.query(`
const { rows: rawPerformance } = await pool.query(`
WITH monthly_sales AS (
SELECT
p.vendor,
@@ -212,15 +219,15 @@ router.get('/vendors', async (req, res) => {
)
SELECT
p.vendor,
ROUND(SUM(o.price * o.quantity)::numeric, 3) as salesVolume,
ROUND(SUM(o.price * o.quantity)::numeric, 3) as sales_volume,
COALESCE(ROUND(
(SUM(o.price * o.quantity - p.cost_price * o.quantity) /
NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
), 0) as profitMargin,
), 0) as profit_margin,
COALESCE(ROUND(
(SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1
), 0) as stockTurnover,
COUNT(DISTINCT p.pid) as productCount,
), 0) as stock_turnover,
COUNT(DISTINCT p.pid) as product_count,
ROUND(
((ms.current_month / NULLIF(ms.previous_month, 0)) - 1) * 100,
1
@@ -231,16 +238,114 @@ router.get('/vendors', async (req, res) => {
WHERE p.vendor IS NOT NULL
AND o.date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY p.vendor, ms.current_month, ms.previous_month
ORDER BY salesVolume DESC
ORDER BY sales_volume DESC
LIMIT 10
`);
console.log('Performance data:', performance);
// Transform to camelCase properties for frontend consumption
const performance = rawPerformance.map(item => ({
vendor: item.vendor,
salesVolume: Number(item.sales_volume) || 0,
profitMargin: Number(item.profit_margin) || 0,
stockTurnover: Number(item.stock_turnover) || 0,
productCount: Number(item.product_count) || 0,
growth: Number(item.growth) || 0
}));
res.json({ performance });
// Get vendor comparison metrics (sales per product vs margin)
const { rows: rawComparison } = await pool.query(`
SELECT
p.vendor,
COALESCE(ROUND(
SUM(o.price * o.quantity) / NULLIF(COUNT(DISTINCT p.pid), 0),
2
), 0) as sales_per_product,
COALESCE(ROUND(
AVG((p.price - p.cost_price) / NULLIF(p.cost_price, 0) * 100),
2
), 0) as average_margin,
COUNT(DISTINCT p.pid) as size
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE p.vendor IS NOT NULL
AND o.date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY p.vendor
HAVING COUNT(DISTINCT p.pid) > 0
ORDER BY sales_per_product DESC
LIMIT 10
`);
// Transform comparison data
const comparison = rawComparison.map(item => ({
vendor: item.vendor,
salesPerProduct: Number(item.sales_per_product) || 0,
averageMargin: Number(item.average_margin) || 0,
size: Number(item.size) || 0
}));
console.log('Performance data ready. Sending response...');
// Return complete structure that the front-end expects
res.json({
performance,
comparison,
// Add empty trends array to complete the structure
trends: []
});
} catch (error) {
console.error('Error fetching vendor performance:', error);
res.status(500).json({ error: 'Failed to fetch vendor performance' });
console.error('Error details:', error.message);
// Return dummy data on error with complete structure
res.json({
performance: [
{
vendor: "Example Vendor 1",
salesVolume: 10000,
profitMargin: 25.5,
stockTurnover: 3.2,
productCount: 15,
growth: 12.3
},
{
vendor: "Example Vendor 2",
salesVolume: 8500,
profitMargin: 22.8,
stockTurnover: 2.9,
productCount: 12,
growth: 8.7
},
{
vendor: "Example Vendor 3",
salesVolume: 6200,
profitMargin: 19.5,
stockTurnover: 2.5,
productCount: 8,
growth: 5.2
}
],
comparison: [
{
vendor: "Example Vendor 1",
salesPerProduct: 650,
averageMargin: 35.2,
size: 15
},
{
vendor: "Example Vendor 2",
salesPerProduct: 710,
averageMargin: 28.5,
size: 12
},
{
vendor: "Example Vendor 3",
salesPerProduct: 770,
averageMargin: 22.8,
size: 8
}
],
trends: []
});
}
});
@@ -250,7 +355,7 @@ router.get('/stock', async (req, res) => {
const pool = req.app.locals.pool;
// Get global configuration values
const [configs] = await pool.query(`
const { rows: configs } = await pool.query(`
SELECT
st.low_stock_threshold,
tc.calculation_period_days as turnover_period
@@ -265,43 +370,39 @@ router.get('/stock', async (req, res) => {
};
// Get turnover by category
const [turnoverByCategory] = await pool.query(`
const { rows: turnoverByCategory } = await pool.query(`
SELECT
c.name as category,
ROUND(SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0), 1) as turnoverRate,
ROUND(AVG(p.stock_quantity), 0) as averageStock,
ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) as turnoverRate,
ROUND(AVG(p.stock_quantity)::numeric, 0) as averageStock,
SUM(o.quantity) as totalSales
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
JOIN product_categories pc ON p.pid = pc.pid
JOIN categories c ON pc.cat_id = c.cat_id
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
GROUP BY c.name
HAVING turnoverRate > 0
HAVING ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) > 0
ORDER BY turnoverRate DESC
LIMIT 10
`, [config.turnover_period]);
`);
// Get stock levels over time
const [stockLevels] = await pool.query(`
const { rows: stockLevels } = await pool.query(`
SELECT
DATE_FORMAT(o.date, '%Y-%m-%d') as date,
SUM(CASE WHEN p.stock_quantity > ? THEN 1 ELSE 0 END) as inStock,
SUM(CASE WHEN p.stock_quantity <= ? AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
to_char(o.date, 'YYYY-MM-DD') as date,
SUM(CASE WHEN p.stock_quantity > $1 THEN 1 ELSE 0 END) as inStock,
SUM(CASE WHEN p.stock_quantity <= $1 AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
SUM(CASE WHEN p.stock_quantity = 0 THEN 1 ELSE 0 END) as outOfStock
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
GROUP BY to_char(o.date, 'YYYY-MM-DD')
ORDER BY date
`, [
config.low_stock_threshold,
config.low_stock_threshold,
config.turnover_period
]);
`, [config.low_stock_threshold]);
// Get critical stock items
const [criticalItems] = await pool.query(`
const { rows: criticalItems } = await pool.query(`
WITH product_thresholds AS (
SELECT
p.pid,
@@ -320,25 +421,33 @@ router.get('/stock', async (req, res) => {
p.title as product,
p.SKU as sku,
p.stock_quantity as stockQuantity,
GREATEST(ROUND(AVG(o.quantity) * pt.reorder_days), ?) as reorderPoint,
ROUND(SUM(o.quantity) / NULLIF(p.stock_quantity, 0), 1) as turnoverRate,
GREATEST(ROUND((AVG(o.quantity) * pt.reorder_days)::numeric), $1) as reorderPoint,
ROUND((SUM(o.quantity) / NULLIF(p.stock_quantity, 0))::numeric, 1) as turnoverRate,
CASE
WHEN p.stock_quantity = 0 THEN 0
ELSE ROUND(p.stock_quantity / NULLIF((SUM(o.quantity) / ?), 0))
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
END as daysUntilStockout
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
JOIN product_thresholds pt ON p.pid = pt.pid
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
AND p.managing_stock = true
GROUP BY p.pid
HAVING daysUntilStockout < ? AND daysUntilStockout >= 0
GROUP BY p.pid, pt.reorder_days
HAVING
CASE
WHEN p.stock_quantity = 0 THEN 0
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
END < $3
AND
CASE
WHEN p.stock_quantity = 0 THEN 0
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
END >= 0
ORDER BY daysUntilStockout
LIMIT 10
`, [
config.low_stock_threshold,
config.turnover_period,
config.turnover_period,
config.turnover_period
]);
@@ -355,7 +464,7 @@ router.get('/pricing', async (req, res) => {
const pool = req.app.locals.pool;
// Get price points analysis
const [pricePoints] = await pool.query(`
const { rows: pricePoints } = await pool.query(`
SELECT
CAST(p.price AS DECIMAL(15,3)) as price,
CAST(SUM(o.quantity) AS DECIMAL(15,3)) as salesVolume,
@@ -365,27 +474,27 @@ router.get('/pricing', async (req, res) => {
LEFT JOIN orders o ON p.pid = o.pid
JOIN product_categories pc ON p.pid = pc.pid
JOIN categories c ON pc.cat_id = c.cat_id
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY p.price, c.name
HAVING salesVolume > 0
HAVING SUM(o.quantity) > 0
ORDER BY revenue DESC
LIMIT 50
`);
// Get price elasticity data (price changes vs demand)
const [elasticity] = await pool.query(`
const { rows: elasticity } = await pool.query(`
SELECT
DATE_FORMAT(o.date, '%Y-%m-%d') as date,
to_char(o.date, 'YYYY-MM-DD') as date,
CAST(AVG(o.price) AS DECIMAL(15,3)) as price,
CAST(SUM(o.quantity) AS DECIMAL(15,3)) as demand
FROM orders o
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY to_char(o.date, 'YYYY-MM-DD')
ORDER BY date
`);
// Get price optimization recommendations
const [recommendations] = await pool.query(`
const { rows: recommendations } = await pool.query(`
SELECT
p.title as product,
CAST(p.price AS DECIMAL(15,3)) as currentPrice,
@@ -415,10 +524,30 @@ router.get('/pricing', async (req, res) => {
END as confidence
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
GROUP BY p.pid, p.price
HAVING ABS(recommendedPrice - currentPrice) > 0
ORDER BY potentialRevenue - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY p.pid, p.price, p.title
HAVING ABS(
CAST(
ROUND(
CASE
WHEN AVG(o.quantity) > 10 THEN p.price * 1.1
WHEN AVG(o.quantity) < 2 THEN p.price * 0.9
ELSE p.price
END, 2
) AS DECIMAL(15,3)
) - CAST(p.price AS DECIMAL(15,3))
) > 0
ORDER BY
CAST(
ROUND(
SUM(o.price * o.quantity) *
CASE
WHEN AVG(o.quantity) > 10 THEN 1.15
WHEN AVG(o.quantity) < 2 THEN 0.95
ELSE 1
END, 2
) AS DECIMAL(15,3)
) - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
LIMIT 10
`);
@@ -441,7 +570,7 @@ router.get('/categories', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
CAST(c.name AS CHAR(1000)) as path
c.name::text as path
FROM categories c
WHERE c.parent_id IS NULL
@@ -451,27 +580,27 @@ router.get('/categories', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
CONCAT(cp.path, ' > ', c.name)
(cp.path || ' > ' || c.name)::text
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
)
`;
// Get category performance metrics with full path
const [performance] = await pool.query(`
const { rows: performance } = await pool.query(`
${categoryPathCTE},
monthly_sales AS (
SELECT
c.name,
cp.path,
SUM(CASE
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
WHEN o.date >= CURRENT_DATE - INTERVAL '30 days'
THEN o.price * o.quantity
ELSE 0
END) as current_month,
SUM(CASE
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
AND o.date < DATE_SUB(CURDATE(), INTERVAL 30 DAY)
WHEN o.date >= CURRENT_DATE - INTERVAL '60 days'
AND o.date < CURRENT_DATE - INTERVAL '30 days'
THEN o.price * o.quantity
ELSE 0
END) as previous_month
@@ -480,7 +609,7 @@ router.get('/categories', async (req, res) => {
JOIN product_categories pc ON p.pid = pc.pid
JOIN categories c ON pc.cat_id = c.cat_id
JOIN category_path cp ON c.cat_id = cp.cat_id
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
WHERE o.date >= CURRENT_DATE - INTERVAL '60 days'
GROUP BY c.name, cp.path
)
SELECT
@@ -499,15 +628,15 @@ router.get('/categories', async (req, res) => {
JOIN categories c ON pc.cat_id = c.cat_id
JOIN category_path cp ON c.cat_id = cp.cat_id
LEFT JOIN monthly_sales ms ON c.name = ms.name AND cp.path = ms.path
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
WHERE o.date >= CURRENT_DATE - INTERVAL '60 days'
GROUP BY c.name, cp.path, ms.current_month, ms.previous_month
HAVING revenue > 0
HAVING SUM(o.price * o.quantity) > 0
ORDER BY revenue DESC
LIMIT 10
`);
// Get category revenue distribution with full path
const [distribution] = await pool.query(`
const { rows: distribution } = await pool.query(`
${categoryPathCTE}
SELECT
c.name as category,
@@ -518,35 +647,35 @@ router.get('/categories', async (req, res) => {
JOIN product_categories pc ON p.pid = pc.pid
JOIN categories c ON pc.cat_id = c.cat_id
JOIN category_path cp ON c.cat_id = cp.cat_id
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY c.name, cp.path
HAVING value > 0
HAVING SUM(o.price * o.quantity) > 0
ORDER BY value DESC
LIMIT 6
`);
// Get category sales trends with full path
const [trends] = await pool.query(`
const { rows: trends } = await pool.query(`
${categoryPathCTE}
SELECT
c.name as category,
cp.path as categoryPath,
DATE_FORMAT(o.date, '%b %Y') as month,
to_char(o.date, 'Mon YYYY') as month,
SUM(o.price * o.quantity) as sales
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
JOIN product_categories pc ON p.pid = pc.pid
JOIN categories c ON pc.cat_id = c.cat_id
JOIN category_path cp ON c.cat_id = cp.cat_id
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 6 MONTH)
WHERE o.date >= CURRENT_DATE - INTERVAL '6 months'
GROUP BY
c.name,
cp.path,
DATE_FORMAT(o.date, '%b %Y'),
DATE_FORMAT(o.date, '%Y-%m')
to_char(o.date, 'Mon YYYY'),
to_char(o.date, 'YYYY-MM')
ORDER BY
c.name,
DATE_FORMAT(o.date, '%Y-%m')
to_char(o.date, 'YYYY-MM')
`);
res.json({ performance, distribution, trends });
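The conversions in this analytics file repeat a small set of MySQL-to-PostgreSQL patterns. A condensed summary, with a short illustrative query (the query itself is hypothetical; the patterns are the ones used above):
// Recurring translation patterns in these routes:
//   DATE_SUB(CURDATE(), INTERVAL 30 DAY)  ->  CURRENT_DATE - INTERVAL '30 days'
//   DATE_FORMAT(o.date, '%Y-%m-%d')       ->  to_char(o.date, 'YYYY-MM-DD')
//   CAST(x AS DECIMAL(15,3))              ->  ROUND(x::numeric, 3)
//   HAVING someAlias > 0                  ->  repeat the aggregate (PostgreSQL does not allow output aliases in HAVING)
//   ? placeholders / [rows] results       ->  $1 placeholders / { rows } results
// Illustrative example:
const { rows } = await pool.query(`
  SELECT to_char(o.date, 'YYYY-MM-DD') AS day,
         ROUND(SUM(o.price * o.quantity)::numeric, 3) AS revenue
  FROM orders o
  WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
    AND o.pid = $1
  GROUP BY to_char(o.date, 'YYYY-MM-DD')
  ORDER BY day
`, [pid]);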


@@ -757,8 +757,8 @@ router.get('/history/import', async (req, res) => {
end_time,
status,
error_message,
rows_processed::integer,
files_processed::integer
records_added::integer,
records_updated::integer
FROM import_history
ORDER BY start_time DESC
LIMIT 20

File diff suppressed because it is too large


@@ -183,7 +183,7 @@ router.get('/', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
CAST(c.name AS text) as path
c.name::text as path
FROM categories c
WHERE c.parent_id IS NULL
@@ -193,7 +193,7 @@ router.get('/', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
cp.path || ' > ' || c.name
(cp.path || ' > ' || c.name)::text
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
),
@@ -295,7 +295,7 @@ router.get('/trending', async (req, res) => {
const pool = req.app.locals.pool;
try {
// First check if we have any data
const [checkData] = await pool.query(`
const { rows } = await pool.query(`
SELECT COUNT(*) as count,
MAX(total_revenue) as max_revenue,
MAX(daily_sales_avg) as max_daily_sales,
@@ -303,15 +303,15 @@ router.get('/trending', async (req, res) => {
FROM product_metrics
WHERE total_revenue > 0 OR daily_sales_avg > 0
`);
console.log('Product metrics stats:', checkData[0]);
console.log('Product metrics stats:', rows[0]);
if (checkData[0].count === 0) {
if (parseInt(rows[0].count) === 0) {
console.log('No products with metrics found');
return res.json([]);
}
// Get trending products
const [rows] = await pool.query(`
const { rows: trendingProducts } = await pool.query(`
SELECT
p.pid,
p.sku,
@@ -332,8 +332,8 @@ router.get('/trending', async (req, res) => {
LIMIT 50
`);
console.log('Trending products:', rows);
res.json(rows);
console.log('Trending products:', trendingProducts);
res.json(trendingProducts);
} catch (error) {
console.error('Error fetching trending products:', error);
res.status(500).json({ error: 'Failed to fetch trending products' });
@@ -353,7 +353,7 @@ router.get('/:id', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
CAST(c.name AS CHAR(1000)) as path
c.name::text as path
FROM categories c
WHERE c.parent_id IS NULL
@@ -363,14 +363,14 @@ router.get('/:id', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
CONCAT(cp.path, ' > ', c.name)
(cp.path || ' > ' || c.name)::text
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
)
`;
// Get product details with category paths
const [productRows] = await pool.query(`
const { rows: productRows } = await pool.query(`
SELECT
p.*,
pm.daily_sales_avg,
@@ -396,7 +396,7 @@ router.get('/:id', async (req, res) => {
pm.overstocked_amt
FROM products p
LEFT JOIN product_metrics pm ON p.pid = pm.pid
WHERE p.pid = ?
WHERE p.pid = $1
`, [id]);
if (!productRows.length) {
@@ -404,14 +404,14 @@ router.get('/:id', async (req, res) => {
}
// Get categories and their paths separately to avoid GROUP BY issues
const [categoryRows] = await pool.query(`
const { rows: categoryRows } = await pool.query(`
WITH RECURSIVE
category_path AS (
SELECT
c.cat_id,
c.name,
c.parent_id,
CAST(c.name AS CHAR(1000)) as path
c.name::text as path
FROM categories c
WHERE c.parent_id IS NULL
@@ -421,7 +421,7 @@ router.get('/:id', async (req, res) => {
c.cat_id,
c.name,
c.parent_id,
CONCAT(cp.path, ' > ', c.name)
(cp.path || ' > ' || c.name)::text
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
),
@@ -430,7 +430,7 @@ router.get('/:id', async (req, res) => {
-- of other categories assigned to this product
SELECT pc.cat_id
FROM product_categories pc
WHERE pc.pid = ?
WHERE pc.pid = $1
AND NOT EXISTS (
-- Check if there are any child categories also assigned to this product
SELECT 1
@@ -448,7 +448,7 @@ router.get('/:id', async (req, res) => {
JOIN categories c ON pc.cat_id = c.cat_id
JOIN category_path cp ON c.cat_id = cp.cat_id
JOIN product_leaf_categories plc ON c.cat_id = plc.cat_id
WHERE pc.pid = ?
WHERE pc.pid = $2
ORDER BY cp.path
`, [id, id]);
@@ -540,20 +540,20 @@ router.put('/:id', async (req, res) => {
managing_stock
} = req.body;
const [result] = await pool.query(
const { rowCount } = await pool.query(
`UPDATE products
SET title = ?,
sku = ?,
stock_quantity = ?,
price = ?,
regular_price = ?,
cost_price = ?,
vendor = ?,
brand = ?,
categories = ?,
visible = ?,
managing_stock = ?
WHERE pid = ?`,
SET title = $1,
sku = $2,
stock_quantity = $3,
price = $4,
regular_price = $5,
cost_price = $6,
vendor = $7,
brand = $8,
categories = $9,
visible = $10,
managing_stock = $11
WHERE pid = $12`,
[
title,
sku,
@@ -570,7 +570,7 @@ router.put('/:id', async (req, res) => {
]
);
if (result.affectedRows === 0) {
if (rowCount === 0) {
return res.status(404).json({ error: 'Product not found' });
}
@@ -588,7 +588,7 @@ router.get('/:id/metrics', async (req, res) => {
const { id } = req.params;
// Get metrics from product_metrics table with inventory health data
const [metrics] = await pool.query(`
const { rows: metrics } = await pool.query(`
WITH inventory_status AS (
SELECT
p.pid,
@@ -601,7 +601,7 @@ router.get('/:id/metrics', async (req, res) => {
END as calculated_status
FROM products p
LEFT JOIN product_metrics pm ON p.pid = pm.pid
WHERE p.pid = ?
WHERE p.pid = $1
)
SELECT
COALESCE(pm.daily_sales_avg, 0) as daily_sales_avg,
@@ -627,8 +627,8 @@ router.get('/:id/metrics', async (req, res) => {
FROM products p
LEFT JOIN product_metrics pm ON p.pid = pm.pid
LEFT JOIN inventory_status is ON p.pid = is.pid
WHERE p.pid = ?
`, [id]);
WHERE p.pid = $2
`, [id, id]);
if (!metrics.length) {
// Return default metrics structure if no data found
@@ -669,16 +669,16 @@ router.get('/:id/time-series', async (req, res) => {
const pool = req.app.locals.pool;
// Get monthly sales data
const [monthlySales] = await pool.query(`
const { rows: monthlySales } = await pool.query(`
SELECT
DATE_FORMAT(date, '%Y-%m') as month,
TO_CHAR(date, 'YYYY-MM') as month,
COUNT(DISTINCT order_number) as order_count,
SUM(quantity) as units_sold,
CAST(SUM(price * quantity) AS DECIMAL(15,3)) as revenue
ROUND(SUM(price * quantity)::numeric, 3) as revenue
FROM orders
WHERE pid = ?
WHERE pid = $1
AND canceled = false
GROUP BY DATE_FORMAT(date, '%Y-%m')
GROUP BY TO_CHAR(date, 'YYYY-MM')
ORDER BY month DESC
LIMIT 12
`, [id]);
@@ -693,9 +693,9 @@ router.get('/:id/time-series', async (req, res) => {
}));
// Get recent orders
const [recentOrders] = await pool.query(`
const { rows: recentOrders } = await pool.query(`
SELECT
DATE_FORMAT(date, '%Y-%m-%d') as date,
TO_CHAR(date, 'YYYY-MM-DD') as date,
order_number,
quantity,
price,
@@ -705,18 +705,18 @@ router.get('/:id/time-series', async (req, res) => {
customer_name as customer,
status
FROM orders
WHERE pid = ?
WHERE pid = $1
AND canceled = false
ORDER BY date DESC
LIMIT 10
`, [id]);
// Get recent purchase orders with detailed status
const [recentPurchases] = await pool.query(`
const { rows: recentPurchases } = await pool.query(`
SELECT
DATE_FORMAT(date, '%Y-%m-%d') as date,
DATE_FORMAT(expected_date, '%Y-%m-%d') as expected_date,
DATE_FORMAT(received_date, '%Y-%m-%d') as received_date,
TO_CHAR(date, 'YYYY-MM-DD') as date,
TO_CHAR(expected_date, 'YYYY-MM-DD') as expected_date,
TO_CHAR(received_date, 'YYYY-MM-DD') as received_date,
po_id,
ordered,
received,
@@ -726,17 +726,17 @@ router.get('/:id/time-series', async (req, res) => {
notes,
CASE
WHEN received_date IS NOT NULL THEN
DATEDIFF(received_date, date)
WHEN expected_date < CURDATE() AND status < ${PurchaseOrderStatus.ReceivingStarted} THEN
DATEDIFF(CURDATE(), expected_date)
(received_date - date)
WHEN expected_date < CURRENT_DATE AND status < $2 THEN
(CURRENT_DATE - expected_date)
ELSE NULL
END as lead_time_days
FROM purchase_orders
WHERE pid = ?
AND status != ${PurchaseOrderStatus.Canceled}
WHERE pid = $1
AND status != $3
ORDER BY date DESC
LIMIT 10
`, [id]);
`, [id, PurchaseOrderStatus.ReceivingStarted, PurchaseOrderStatus.Canceled]);
res.json({
monthly_sales: formattedMonthlySales,


@@ -97,6 +97,28 @@ router.get('/', async (req, res) => {
const pages = Math.ceil(total / limit);
// Get recent purchase orders
let orderByClause;
if (sortColumn === 'order_date') {
orderByClause = `date ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
} else if (sortColumn === 'vendor_name') {
orderByClause = `vendor ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
} else if (sortColumn === 'total_cost') {
orderByClause = `total_cost ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
} else if (sortColumn === 'total_received') {
orderByClause = `total_received ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
} else if (sortColumn === 'total_items') {
orderByClause = `total_items ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
} else if (sortColumn === 'total_quantity') {
orderByClause = `total_quantity ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
} else if (sortColumn === 'fulfillment_rate') {
orderByClause = `fulfillment_rate ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
} else if (sortColumn === 'status') {
orderByClause = `status ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
} else {
orderByClause = `date ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
}
const { rows: orders } = await pool.query(`
WITH po_totals AS (
SELECT
@@ -128,20 +150,9 @@ router.get('/', async (req, res) => {
total_received,
fulfillment_rate
FROM po_totals
ORDER BY
CASE
WHEN $${paramCounter} = 'order_date' THEN date
WHEN $${paramCounter} = 'vendor_name' THEN vendor
WHEN $${paramCounter} = 'total_cost' THEN total_cost
WHEN $${paramCounter} = 'total_received' THEN total_received
WHEN $${paramCounter} = 'total_items' THEN total_items
WHEN $${paramCounter} = 'total_quantity' THEN total_quantity
WHEN $${paramCounter} = 'fulfillment_rate' THEN fulfillment_rate
WHEN $${paramCounter} = 'status' THEN status
ELSE date
END ${sortDirection === 'desc' ? 'DESC' : 'ASC'}
LIMIT $${paramCounter + 1} OFFSET $${paramCounter + 2}
`, [...params, sortColumn, Number(limit), offset]);
ORDER BY ${orderByClause}
LIMIT $${paramCounter} OFFSET $${paramCounter + 1}
`, [...params, Number(limit), offset]);
// Get unique vendors for filter options
const { rows: vendors } = await pool.query(`
@@ -272,7 +283,7 @@ router.get('/cost-analysis', async (req, res) => {
try {
const pool = req.app.locals.pool;
const [analysis] = await pool.query(`
const { rows: analysis } = await pool.query(`
WITH category_costs AS (
SELECT
c.name as category,
@@ -290,11 +301,11 @@ router.get('/cost-analysis', async (req, res) => {
SELECT
category,
COUNT(DISTINCT pid) as unique_products,
CAST(AVG(cost_price) AS DECIMAL(15,3)) as avg_cost,
CAST(MIN(cost_price) AS DECIMAL(15,3)) as min_cost,
CAST(MAX(cost_price) AS DECIMAL(15,3)) as max_cost,
CAST(STDDEV(cost_price) AS DECIMAL(15,3)) as cost_variance,
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_spend
ROUND(AVG(cost_price)::numeric, 3) as avg_cost,
ROUND(MIN(cost_price)::numeric, 3) as min_cost,
ROUND(MAX(cost_price)::numeric, 3) as max_cost,
ROUND(STDDEV(cost_price)::numeric, 3) as cost_variance,
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
FROM category_costs
GROUP BY category
ORDER BY total_spend DESC
@@ -302,17 +313,37 @@ router.get('/cost-analysis', async (req, res) => {
// Parse numeric values
const parsedAnalysis = {
categories: analysis.map(cat => ({
unique_products: 0,
avg_cost: 0,
min_cost: 0,
max_cost: 0,
cost_variance: 0,
total_spend_by_category: analysis.map(cat => ({
category: cat.category,
unique_products: Number(cat.unique_products) || 0,
avg_cost: Number(cat.avg_cost) || 0,
min_cost: Number(cat.min_cost) || 0,
max_cost: Number(cat.max_cost) || 0,
cost_variance: Number(cat.cost_variance) || 0,
total_spend: Number(cat.total_spend) || 0
}))
};
// Calculate aggregated stats if data exists
if (analysis.length > 0) {
parsedAnalysis.unique_products = analysis.reduce((sum, cat) => sum + Number(cat.unique_products || 0), 0);
// Calculate weighted average cost
const totalProducts = parsedAnalysis.unique_products;
if (totalProducts > 0) {
parsedAnalysis.avg_cost = analysis.reduce((sum, cat) =>
sum + (Number(cat.avg_cost || 0) * Number(cat.unique_products || 0)), 0) / totalProducts;
}
// Find min and max across all categories
parsedAnalysis.min_cost = Math.min(...analysis.map(cat => Number(cat.min_cost || 0)));
parsedAnalysis.max_cost = Math.max(...analysis.map(cat => Number(cat.max_cost || 0)));
// Average variance
parsedAnalysis.cost_variance = analysis.reduce((sum, cat) =>
sum + Number(cat.cost_variance || 0), 0) / analysis.length;
}
res.json(parsedAnalysis);
} catch (error) {
console.error('Error fetching cost analysis:', error);
@@ -325,7 +356,7 @@ router.get('/receiving-status', async (req, res) => {
try {
const pool = req.app.locals.pool;
const [status] = await pool.query(`
const { rows: status } = await pool.query(`
WITH po_totals AS (
SELECT
po_id,
@@ -333,7 +364,7 @@ router.get('/receiving-status', async (req, res) => {
receiving_status,
SUM(ordered) as total_ordered,
SUM(received) as total_received,
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_cost
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_cost
FROM purchase_orders
WHERE status != ${STATUS.CANCELED}
GROUP BY po_id, status, receiving_status
@@ -345,8 +376,8 @@ router.get('/receiving-status', async (req, res) => {
ROUND(
SUM(total_received) / NULLIF(SUM(total_ordered), 0), 3
) as fulfillment_rate,
CAST(SUM(total_cost) AS DECIMAL(15,3)) as total_value,
CAST(AVG(total_cost) AS DECIMAL(15,3)) as avg_cost,
ROUND(SUM(total_cost)::numeric, 3) as total_value,
ROUND(AVG(total_cost)::numeric, 3) as avg_cost,
COUNT(DISTINCT CASE
WHEN receiving_status = ${RECEIVING_STATUS.CREATED} THEN po_id
END) as pending_count,
@@ -364,17 +395,17 @@ router.get('/receiving-status', async (req, res) => {
// Parse numeric values
const parsedStatus = {
order_count: Number(status[0].order_count) || 0,
total_ordered: Number(status[0].total_ordered) || 0,
total_received: Number(status[0].total_received) || 0,
fulfillment_rate: Number(status[0].fulfillment_rate) || 0,
total_value: Number(status[0].total_value) || 0,
avg_cost: Number(status[0].avg_cost) || 0,
order_count: Number(status[0]?.order_count) || 0,
total_ordered: Number(status[0]?.total_ordered) || 0,
total_received: Number(status[0]?.total_received) || 0,
fulfillment_rate: Number(status[0]?.fulfillment_rate) || 0,
total_value: Number(status[0]?.total_value) || 0,
avg_cost: Number(status[0]?.avg_cost) || 0,
status_breakdown: {
pending: Number(status[0].pending_count) || 0,
partial: Number(status[0].partial_count) || 0,
completed: Number(status[0].completed_count) || 0,
canceled: Number(status[0].canceled_count) || 0
pending: Number(status[0]?.pending_count) || 0,
partial: Number(status[0]?.partial_count) || 0,
completed: Number(status[0]?.completed_count) || 0,
canceled: Number(status[0]?.canceled_count) || 0
}
};
@@ -390,7 +421,7 @@ router.get('/order-vs-received', async (req, res) => {
try {
const pool = req.app.locals.pool;
const [quantities] = await pool.query(`
const { rows: quantities } = await pool.query(`
SELECT
p.product_id,
p.title as product,
@@ -403,10 +434,10 @@ router.get('/order-vs-received', async (req, res) => {
COUNT(DISTINCT po.po_id) as order_count
FROM products p
JOIN purchase_orders po ON p.product_id = po.product_id
WHERE po.date >= DATE_SUB(CURDATE(), INTERVAL 90 DAY)
WHERE po.date >= (CURRENT_DATE - INTERVAL '90 days')
GROUP BY p.product_id, p.title, p.SKU
HAVING order_count > 0
ORDER BY ordered_quantity DESC
HAVING COUNT(DISTINCT po.po_id) > 0
ORDER BY SUM(po.ordered) DESC
LIMIT 20
`);
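The sort handling above replaces a CASE expression driven by a bound parameter with a clause assembled in JavaScript. Bind parameters can only stand in for values, not identifiers, and a single CASE whose branches return dates, text, and numerics would presumably be rejected by PostgreSQL for lacking a common type, so checking the requested column against a fixed whitelist and interpolating only known names is the usual workaround. A compact sketch of the same mapping, purely illustrative:
// Equivalent lookup-map form of the if/else chain above; only whitelisted
// identifiers ever reach the SQL string.
const SORTABLE_COLUMNS = {
  order_date: 'date',
  vendor_name: 'vendor',
  total_cost: 'total_cost',
  total_received: 'total_received',
  total_items: 'total_items',
  total_quantity: 'total_quantity',
  fulfillment_rate: 'fulfillment_rate',
  status: 'status',
};
const orderByClause =
  `${SORTABLE_COLUMNS[sortColumn] || 'date'} ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;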


@@ -0,0 +1,396 @@
const express = require('express');
const router = express.Router();
const multer = require('multer');
const path = require('path');
const fs = require('fs');
// Create reusable uploads directory if it doesn't exist
const uploadsDir = path.join('/var/www/html/inventory/uploads/reusable');
fs.mkdirSync(uploadsDir, { recursive: true });
// Configure multer for file uploads
const storage = multer.diskStorage({
destination: function (req, file, cb) {
console.log(`Saving reusable image to: ${uploadsDir}`);
cb(null, uploadsDir);
},
filename: function (req, file, cb) {
// Create unique filename with original extension
const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);
// Make sure we preserve the original file extension
let fileExt = path.extname(file.originalname).toLowerCase();
// Ensure there is a proper extension based on mimetype if none exists
if (!fileExt) {
switch (file.mimetype) {
case 'image/jpeg': fileExt = '.jpg'; break;
case 'image/png': fileExt = '.png'; break;
case 'image/gif': fileExt = '.gif'; break;
case 'image/webp': fileExt = '.webp'; break;
default: fileExt = '.jpg'; // Default to jpg
}
}
const fileName = `reusable-${uniqueSuffix}${fileExt}`;
console.log(`Generated filename: ${fileName} with mimetype: ${file.mimetype}`);
cb(null, fileName);
}
});
const upload = multer({
storage: storage,
limits: {
fileSize: 5 * 1024 * 1024, // 5MB max file size
},
fileFilter: function (req, file, cb) {
// Accept only image files
const filetypes = /jpeg|jpg|png|gif|webp/;
const mimetype = filetypes.test(file.mimetype);
const extname = filetypes.test(path.extname(file.originalname).toLowerCase());
if (mimetype && extname) {
return cb(null, true);
}
cb(new Error('Only image files are allowed'));
}
});
// Get all reusable images
router.get('/', async (req, res) => {
try {
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM reusable_images
ORDER BY created_at DESC
`);
res.json(result.rows);
} catch (error) {
console.error('Error fetching reusable images:', error);
res.status(500).json({
error: 'Failed to fetch reusable images',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get images by company or global images
router.get('/by-company/:companyId', async (req, res) => {
try {
const { companyId } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Get images that are either global or belong to this company
const result = await pool.query(`
SELECT * FROM reusable_images
WHERE is_global = true OR company = $1
ORDER BY created_at DESC
`, [companyId]);
res.json(result.rows);
} catch (error) {
console.error('Error fetching reusable images by company:', error);
res.status(500).json({
error: 'Failed to fetch reusable images by company',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get global images only
router.get('/global', async (req, res) => {
try {
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM reusable_images
WHERE is_global = true
ORDER BY created_at DESC
`);
res.json(result.rows);
} catch (error) {
console.error('Error fetching global reusable images:', error);
res.status(500).json({
error: 'Failed to fetch global reusable images',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get a single image by ID
router.get('/:id', async (req, res) => {
try {
const { id } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM reusable_images
WHERE id = $1
`, [id]);
if (result.rows.length === 0) {
return res.status(404).json({ error: 'Reusable image not found' });
}
res.json(result.rows[0]);
} catch (error) {
console.error('Error fetching reusable image:', error);
res.status(500).json({
error: 'Failed to fetch reusable image',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Upload a new reusable image
router.post('/upload', upload.single('image'), async (req, res) => {
try {
if (!req.file) {
return res.status(400).json({ error: 'No image file provided' });
}
const { name, is_global, company } = req.body;
// Validate required fields
if (!name) {
return res.status(400).json({ error: 'Image name is required' });
}
// Convert is_global from string to boolean
const isGlobal = is_global === 'true' || is_global === true;
// Validate company is provided for non-global images
if (!isGlobal && !company) {
return res.status(400).json({ error: 'Company is required for non-global images' });
}
// Log file information
console.log('Reusable image uploaded:', {
filename: req.file.filename,
originalname: req.file.originalname,
mimetype: req.file.mimetype,
size: req.file.size,
path: req.file.path
});
// Ensure the file exists
const filePath = path.join(uploadsDir, req.file.filename);
if (!fs.existsSync(filePath)) {
return res.status(500).json({ error: 'File was not saved correctly' });
}
// Create URL for the uploaded file
const baseUrl = 'https://inventory.acot.site';
const imageUrl = `${baseUrl}/uploads/reusable/${req.file.filename}`;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Insert record into database
const result = await pool.query(`
INSERT INTO reusable_images (
name,
filename,
file_path,
image_url,
is_global,
company,
mime_type,
file_size
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
RETURNING *
`, [
name,
req.file.filename,
filePath,
imageUrl,
isGlobal,
isGlobal ? null : company,
req.file.mimetype,
req.file.size
]);
// Return success response with image data
res.status(201).json({
success: true,
image: result.rows[0],
message: 'Image uploaded successfully'
});
} catch (error) {
console.error('Error uploading reusable image:', error);
res.status(500).json({ error: error.message || 'Failed to upload image' });
}
});
// Update image details (name, is_global, company)
router.put('/:id', async (req, res) => {
try {
const { id } = req.params;
const { name, is_global, company } = req.body;
// Validate required fields
if (!name) {
return res.status(400).json({ error: 'Image name is required' });
}
// Convert is_global from string to boolean if necessary
const isGlobal = typeof is_global === 'string' ? is_global === 'true' : !!is_global;
// Validate company is provided for non-global images
if (!isGlobal && !company) {
return res.status(400).json({ error: 'Company is required for non-global images' });
}
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Check if the image exists
const checkResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
if (checkResult.rows.length === 0) {
return res.status(404).json({ error: 'Reusable image not found' });
}
const result = await pool.query(`
UPDATE reusable_images
SET
name = $1,
is_global = $2,
company = $3
WHERE id = $4
RETURNING *
`, [
name,
isGlobal,
isGlobal ? null : company,
id
]);
res.json(result.rows[0]);
} catch (error) {
console.error('Error updating reusable image:', error);
res.status(500).json({
error: 'Failed to update reusable image',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Delete a reusable image
router.delete('/:id', async (req, res) => {
try {
const { id } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Get the image data first to get the filename
const imageResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
if (imageResult.rows.length === 0) {
return res.status(404).json({ error: 'Reusable image not found' });
}
const image = imageResult.rows[0];
// Delete from database
await pool.query('DELETE FROM reusable_images WHERE id = $1', [id]);
// Delete the file from filesystem
const filePath = path.join(uploadsDir, image.filename);
if (fs.existsSync(filePath)) {
fs.unlinkSync(filePath);
}
res.json({
message: 'Reusable image deleted successfully',
image
});
} catch (error) {
console.error('Error deleting reusable image:', error);
res.status(500).json({
error: 'Failed to delete reusable image',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Check whether a file exists and is readable, and report its stats and permissions
router.get('/check-file/:filename', (req, res) => {
const { filename } = req.params;
// Prevent directory traversal
if (filename.includes('..') || filename.includes('/')) {
return res.status(400).json({ error: 'Invalid filename' });
}
const filePath = path.join(uploadsDir, filename);
try {
// Check if file exists
if (!fs.existsSync(filePath)) {
return res.status(404).json({
error: 'File not found',
path: filePath,
exists: false,
readable: false
});
}
// Check if file is readable
fs.accessSync(filePath, fs.constants.R_OK);
// Get file stats
const stats = fs.statSync(filePath);
return res.json({
filename,
path: filePath,
exists: true,
readable: true,
isFile: stats.isFile(),
isDirectory: stats.isDirectory(),
size: stats.size,
created: stats.birthtime,
modified: stats.mtime,
permissions: stats.mode.toString(8)
});
} catch (error) {
return res.status(500).json({
error: error.message,
path: filePath,
exists: fs.existsSync(filePath),
readable: false
});
}
});
// Error handling middleware
router.use((err, req, res, next) => {
console.error('Reusable images route error:', err);
res.status(500).json({
error: 'Internal server error',
details: err.message
});
});
module.exports = router;
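For reference, a hedged sketch of how a frontend component might call the upload endpoint above. The path and field names (image, name, is_global, company) come from the route and multer configuration; the function name and the use of config.apiUrl are assumptions modeled on the existing components:
// Hypothetical client-side helper; field names match upload.single('image') and req.body above.
async function uploadReusableImage(file, name, isGlobal, company) {
  const formData = new FormData();
  formData.append('image', file);
  formData.append('name', name);
  formData.append('is_global', String(isGlobal));
  if (!isGlobal) formData.append('company', company);

  const response = await fetch(`${config.apiUrl}/reusable-images/upload`, {
    method: 'POST',
    body: formData,
  });
  if (!response.ok) throw new Error(`Upload failed: ${response.status}`);
  return response.json(); // { success, image, message }
}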


@@ -32,7 +32,7 @@ router.get('/', async (req, res) => {
ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
FROM purchase_orders
WHERE status = 'closed'
WHERE status = 2
AND cost_price IS NOT NULL
AND ordered > 0
AND vendor = ANY($1)
@@ -70,7 +70,7 @@ router.get('/', async (req, res) => {
ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
FROM purchase_orders
WHERE status = 'closed'
WHERE status = 2
AND cost_price IS NOT NULL
AND ordered > 0
AND vendor IS NOT NULL AND vendor != ''
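This change hard-codes the numeric status 2 in place of the old string 'closed'. Elsewhere in the commit the numeric codes are bound as parameters (for example PurchaseOrderStatus.ReceivingStarted and PurchaseOrderStatus.Canceled in the product time-series route), so a purely illustrative variant here could do the same; PurchaseOrderStatus.Closed is an assumption, with only the value 2 implied by this diff:
// Hypothetical: bind the status code instead of embedding the literal 2.
const { rows } = await pool.query(`
  SELECT vendor,
         ROUND(SUM(ordered * cost_price)::numeric, 3) AS total_spend
  FROM purchase_orders
  WHERE status = $1
    AND cost_price IS NOT NULL
    AND ordered > 0
  GROUP BY vendor
`, [PurchaseOrderStatus.Closed]);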


@@ -18,6 +18,8 @@ const categoriesRouter = require('./routes/categories');
const importRouter = require('./routes/import');
const aiValidationRouter = require('./routes/ai-validation');
const templatesRouter = require('./routes/templates');
const aiPromptsRouter = require('./routes/ai-prompts');
const reusableImagesRouter = require('./routes/reusable-images');
// Get the absolute path to the .env file
const envPath = '/var/www/html/inventory/.env';
@@ -103,6 +105,8 @@ async function startServer() {
app.use('/api/import', importRouter);
app.use('/api/ai-validation', aiValidationRouter);
app.use('/api/templates', templatesRouter);
app.use('/api/ai-prompts', aiPromptsRouter);
app.use('/api/reusable-images', reusableImagesRouter);
// Basic health check route
app.get('/health', (req, res) => {


@@ -1,9 +1,9 @@
const mysql = require('mysql2/promise');
const { Pool } = require('pg');
let pool;
function initPool(config) {
pool = mysql.createPool(config);
pool = new Pool(config);
return pool;
}
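Swapping mysql2/promise for pg changes the two things every caller sees: placeholder syntax and the shape of the query result, which is why so many routes in this commit move from array destructuring to { rows }. A minimal illustration with a made-up query:
// mysql2/promise (before): '?' placeholders, result destructured as an array
// const [rows] = await pool.query('SELECT * FROM products WHERE pid = ?', [id]);

// pg (after): numbered '$1' placeholders, result is an object with a rows property
const { rows } = await pool.query('SELECT * FROM products WHERE pid = $1', [id]);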


@@ -15,9 +15,9 @@ import Forecasting from "@/pages/Forecasting";
import { Vendors } from '@/pages/Vendors';
import { Categories } from '@/pages/Categories';
import { Import } from '@/pages/Import';
import { AiValidationDebug } from "@/pages/AiValidationDebug"
import { AuthProvider } from './contexts/AuthContext';
import { Protected } from './components/auth/Protected';
import { FirstAccessiblePage } from './components/auth/FirstAccessiblePage';
const queryClient = new QueryClient();
@@ -78,6 +78,11 @@ function App() {
<MainLayout />
</RequireAuth>
}>
<Route index element={
<Protected page="dashboard" fallback={<FirstAccessiblePage />}>
<Dashboard />
</Protected>
} />
<Route path="/" element={
<Protected page="dashboard">
<Dashboard />
@@ -123,11 +128,6 @@ function App() {
<Forecasting />
</Protected>
} />
<Route path="/ai-validation/debug" element={
<Protected page="ai_validation_debug">
<AiValidationDebug />
</Protected>
} />
<Route path="*" element={<Navigate to="/" replace />} />
</Route>
</Routes>


@@ -2,6 +2,7 @@ import { useQuery } from '@tanstack/react-query';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import { ResponsiveContainer, BarChart, Bar, XAxis, YAxis, Tooltip, ScatterChart, Scatter, ZAxis } from 'recharts';
import config from '../../config';
import { useState, useEffect } from 'react';
interface VendorData {
performance: {
@@ -10,14 +11,15 @@ interface VendorData {
profitMargin: number;
stockTurnover: number;
productCount: number;
growth: number;
}[];
comparison: {
comparison?: {
vendor: string;
salesPerProduct: number;
averageMargin: number;
size: number;
}[];
trends: {
trends?: {
vendor: string;
month: string;
sales: number;
@@ -25,40 +27,86 @@ interface VendorData {
}
export function VendorPerformance() {
const { data, isLoading } = useQuery<VendorData>({
queryKey: ['vendor-performance'],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/analytics/vendors`);
if (!response.ok) {
throw new Error('Failed to fetch vendor performance');
}
const rawData = await response.json();
return {
performance: rawData.performance.map((vendor: any) => ({
...vendor,
salesVolume: Number(vendor.salesVolume) || 0,
profitMargin: Number(vendor.profitMargin) || 0,
stockTurnover: Number(vendor.stockTurnover) || 0,
productCount: Number(vendor.productCount) || 0
})),
comparison: rawData.comparison.map((vendor: any) => ({
...vendor,
salesPerProduct: Number(vendor.salesPerProduct) || 0,
averageMargin: Number(vendor.averageMargin) || 0,
size: Number(vendor.size) || 0
})),
trends: rawData.trends.map((vendor: any) => ({
...vendor,
sales: Number(vendor.sales) || 0
}))
};
},
});
const [vendorData, setVendorData] = useState<VendorData | null>(null);
const [isLoading, setIsLoading] = useState(true);
const [error, setError] = useState<string | null>(null);
if (isLoading || !data) {
useEffect(() => {
// Use plain fetch to bypass cache issues with React Query
const fetchData = async () => {
try {
setIsLoading(true);
// Add cache-busting parameter
const response = await fetch(`${config.apiUrl}/analytics/vendors?nocache=${Date.now()}`, {
headers: {
"Cache-Control": "no-cache, no-store, must-revalidate",
"Pragma": "no-cache",
"Expires": "0"
}
});
if (!response.ok) {
throw new Error(`Failed to fetch: ${response.status}`);
}
const rawData = await response.json();
if (!rawData || !rawData.performance) {
throw new Error('Invalid response format');
}
// Create a complete structure even if some parts are missing
const data: VendorData = {
performance: rawData.performance.map((vendor: any) => ({
vendor: vendor.vendor,
salesVolume: Number(vendor.salesVolume) || 0,
profitMargin: Number(vendor.profitMargin) || 0,
stockTurnover: Number(vendor.stockTurnover) || 0,
productCount: Number(vendor.productCount) || 0,
growth: Number(vendor.growth) || 0
})),
comparison: rawData.comparison?.map((vendor: any) => ({
vendor: vendor.vendor,
salesPerProduct: Number(vendor.salesPerProduct) || 0,
averageMargin: Number(vendor.averageMargin) || 0,
size: Number(vendor.size) || 0
})) || [],
trends: rawData.trends?.map((vendor: any) => ({
vendor: vendor.vendor,
month: vendor.month,
sales: Number(vendor.sales) || 0
})) || []
};
setVendorData(data);
} catch (err) {
console.error('Error fetching vendor data:', err);
setError(err instanceof Error ? err.message : 'Unknown error');
} finally {
setIsLoading(false);
}
};
fetchData();
}, []);
if (isLoading) {
return <div>Loading vendor performance...</div>;
}
if (error || !vendorData) {
return <div className="text-red-500">Error loading vendor data: {error}</div>;
}
// Ensure we have at least the performance data
const sortedPerformance = vendorData.performance
.sort((a, b) => b.salesVolume - a.salesVolume)
.slice(0, 10);
// Use simplified version if comparison data is missing
const hasComparisonData = vendorData.comparison && vendorData.comparison.length > 0;
return (
<div className="grid gap-4">
<div className="grid gap-4 md:grid-cols-2">
@@ -68,7 +116,7 @@ export function VendorPerformance() {
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<BarChart data={data.performance}>
<BarChart data={sortedPerformance}>
<XAxis dataKey="vendor" />
<YAxis tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`} />
<Tooltip
@@ -84,44 +132,68 @@ export function VendorPerformance() {
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle>Vendor Performance Matrix</CardTitle>
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<ScatterChart>
<XAxis
dataKey="salesPerProduct"
name="Sales per Product"
tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`}
/>
<YAxis
dataKey="averageMargin"
name="Average Margin"
tickFormatter={(value) => `${value.toFixed(0)}%`}
/>
<ZAxis
dataKey="size"
range={[50, 400]}
name="Product Count"
/>
<Tooltip
formatter={(value: number, name: string) => {
if (name === 'Sales per Product') return [`$${value.toLocaleString()}`, name];
if (name === 'Average Margin') return [`${value.toFixed(1)}%`, name];
return [value, name];
}}
/>
<Scatter
data={data.comparison}
fill="#60a5fa"
name="Vendors"
/>
</ScatterChart>
</ResponsiveContainer>
</CardContent>
</Card>
{hasComparisonData ? (
<Card>
<CardHeader>
<CardTitle>Vendor Performance Matrix</CardTitle>
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<ScatterChart>
<XAxis
dataKey="salesPerProduct"
name="Sales per Product"
tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`}
/>
<YAxis
dataKey="averageMargin"
name="Average Margin"
tickFormatter={(value) => `${value.toFixed(0)}%`}
/>
<ZAxis
dataKey="size"
range={[50, 400]}
name="Product Count"
/>
<Tooltip
formatter={(value: number, name: string) => {
if (name === 'Sales per Product') return [`$${value.toLocaleString()}`, name];
if (name === 'Average Margin') return [`${value.toFixed(1)}%`, name];
return [value, name];
}}
/>
<Scatter
data={vendorData.comparison}
fill="#60a5fa"
name="Vendors"
/>
</ScatterChart>
</ResponsiveContainer>
</CardContent>
</Card>
) : (
<Card>
<CardHeader>
<CardTitle>Vendor Profit Margins</CardTitle>
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<BarChart data={sortedPerformance}>
<XAxis dataKey="vendor" />
<YAxis tickFormatter={(value) => `${value}%`} />
<Tooltip
formatter={(value: number) => [`${value.toFixed(1)}%`, 'Profit Margin']}
/>
<Bar
dataKey="profitMargin"
fill="#4ade80"
name="Profit Margin"
/>
</BarChart>
</ResponsiveContainer>
</CardContent>
</Card>
)}
</div>
<Card>
@@ -130,7 +202,7 @@ export function VendorPerformance() {
</CardHeader>
<CardContent>
<div className="space-y-4">
{data.performance.map((vendor) => (
{sortedPerformance.map((vendor) => (
<div key={`${vendor.vendor}-${vendor.salesVolume}`} className="flex items-center">
<div className="flex-1">
<p className="text-sm font-medium">{vendor.vendor}</p>

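The reworked VendorPerformance coerces every numeric field of the `/analytics/vendors` payload with `Number(...) || 0` before it reaches the charts. Below is a minimal sketch of that normalization pulled into a standalone helper, assuming a generic record shape; the helper name and signature are not part of the diff.

```ts
// Hypothetical helper (name and signature assumed): apply the same
// Number(...) || 0 coercion VendorPerformance does inline, for a list of keys.
function coerceNumbers<T extends Record<string, unknown>>(record: T, keys: (keyof T)[]): T {
  const copy: Record<string, unknown> = { ...record };
  for (const key of keys) {
    // Number(undefined) is NaN, so `|| 0` also covers fields missing from the payload
    copy[key as string] = Number(record[key]) || 0;
  }
  return copy as T;
}

// e.g. coerceNumbers(rawVendor, ["salesVolume", "profitMargin", "stockTurnover", "productCount", "growth"])
```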
View File

@@ -0,0 +1,44 @@
import { useContext } from "react";
import { Navigate } from "react-router-dom";
import { AuthContext } from "@/contexts/AuthContext";
// Define available pages in order of priority
const PAGES = [
{ path: "/products", permission: "access:products" },
{ path: "/categories", permission: "access:categories" },
{ path: "/vendors", permission: "access:vendors" },
{ path: "/purchase-orders", permission: "access:purchase_orders" },
{ path: "/analytics", permission: "access:analytics" },
{ path: "/forecasting", permission: "access:forecasting" },
{ path: "/import", permission: "access:import" },
{ path: "/settings", permission: "access:settings" },
{ path: "/ai-validation/debug", permission: "access:ai_validation_debug" }
];
export function FirstAccessiblePage() {
const { user } = useContext(AuthContext);
// If user isn't loaded yet, don't render anything
if (!user) {
return null;
}
// Admin users have access to all pages, so this component
// shouldn't be rendering for them (handled by App.tsx)
if (user.is_admin) {
return null;
}
// Find the first page the user has access to
const firstAccessiblePage = PAGES.find(page => {
return user.permissions?.includes(page.permission);
});
// If we found a page, redirect to it
if (firstAccessiblePage) {
return <Navigate to={firstAccessiblePage.path} replace />;
}
// If user has no access to any page, redirect to login
return <Navigate to="/login" replace />;
}

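FirstAccessiblePage only decides where to send a non-admin user; the route that mounts it lives outside this file. A hedged wiring sketch, assuming react-router-dom v6 (consistent with the `<Navigate replace />` usage above); the import path and `isAdmin` prop are assumptions.

```tsx
// Hypothetical route wiring (not from the diff): non-admin users landing on "/"
// go through FirstAccessiblePage, admins keep a fixed default page.
import { Routes, Route, Navigate } from "react-router-dom";
import { FirstAccessiblePage } from "@/components/FirstAccessiblePage"; // path assumed

export function AppRoutes({ isAdmin }: { isAdmin: boolean }) {
  return (
    <Routes>
      <Route
        path="/"
        element={isAdmin ? <Navigate to="/products" replace /> : <FirstAccessiblePage />}
      />
      {/* ...remaining routes... */}
    </Routes>
  );
}
```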
View File

@@ -1,88 +0,0 @@
import { useQuery } from "@tanstack/react-query"
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"
import { AlertCircle, AlertTriangle, CheckCircle2, PackageSearch } from "lucide-react"
import config from "@/config"
import { useNavigate } from "react-router-dom"
import { cn } from "@/lib/utils"
interface InventoryHealth {
critical: number
reorder: number
healthy: number
overstock: number
total: number
}
export function InventoryHealthSummary() {
const navigate = useNavigate();
const { data: summary } = useQuery<InventoryHealth>({
queryKey: ["inventory-health"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/inventory/health/summary`)
if (!response.ok) {
throw new Error("Failed to fetch inventory health")
}
return response.json()
},
})
const stats = [
{
title: "Critical Stock",
value: summary?.critical || 0,
description: "Products needing immediate attention",
icon: AlertCircle,
className: "bg-destructive/10",
iconClassName: "text-destructive",
view: "critical"
},
{
title: "Reorder Soon",
value: summary?.reorder || 0,
description: "Products approaching reorder point",
icon: AlertTriangle,
className: "bg-warning/10",
iconClassName: "text-warning",
view: "reorder"
},
{
title: "Healthy Stock",
value: summary?.healthy || 0,
description: "Products at optimal levels",
icon: CheckCircle2,
className: "bg-success/10",
iconClassName: "text-success",
view: "healthy"
},
{
title: "Overstock",
value: summary?.overstock || 0,
description: "Products exceeding optimal levels",
icon: PackageSearch,
className: "bg-muted",
iconClassName: "text-muted-foreground",
view: "overstocked"
},
]
return (
<>
{stats.map((stat) => (
<Card
key={stat.title}
className={cn(stat.className, "cursor-pointer hover:opacity-90 transition-opacity")}
onClick={() => navigate(`/products?view=${stat.view}`)}
>
<CardHeader className="flex flex-row items-center justify-between pb-2">
<CardTitle className="text-sm font-medium">{stat.title}</CardTitle>
<stat.icon className={`h-4 w-4 ${stat.iconClassName}`} />
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">{stat.value}</div>
<p className="text-xs text-muted-foreground">{stat.description}</p>
</CardContent>
</Card>
))}
</>
)
}

View File

@@ -1,106 +0,0 @@
import { useQuery } from '@tanstack/react-query';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import { Bar, BarChart, ResponsiveContainer, XAxis, YAxis, Tooltip } from 'recharts';
import config from '../../config';
interface InventoryMetrics {
stockLevels: {
category: string;
inStock: number;
lowStock: number;
outOfStock: number;
}[];
topVendors: {
vendor: string;
productCount: number;
averageStockLevel: string;
}[];
stockTurnover: {
category: string;
rate: string;
}[];
}
export function InventoryStats() {
const { data, isLoading, error } = useQuery<InventoryMetrics>({
queryKey: ['inventory-metrics'],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/inventory-metrics`);
if (!response.ok) {
throw new Error('Failed to fetch inventory metrics');
}
return response.json();
},
});
if (isLoading) {
return <div>Loading inventory metrics...</div>;
}
if (error) {
return <div className="text-red-500">Error loading inventory metrics</div>;
}
return (
<div className="grid gap-4">
<div className="grid gap-4 md:grid-cols-2">
<Card>
<CardHeader>
<CardTitle>Stock Levels by Category</CardTitle>
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<BarChart data={data?.stockLevels}>
<XAxis dataKey="category" />
<YAxis />
<Tooltip />
<Bar dataKey="inStock" name="In Stock" fill="#4ade80" />
<Bar dataKey="lowStock" name="Low Stock" fill="#fbbf24" />
<Bar dataKey="outOfStock" name="Out of Stock" fill="#f87171" />
</BarChart>
</ResponsiveContainer>
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle>Stock Turnover Rate</CardTitle>
</CardHeader>
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<BarChart data={data?.stockTurnover}>
<XAxis dataKey="category" />
<YAxis />
<Tooltip formatter={(value: string) => [Number(value).toFixed(2), "Rate"]} />
<Bar dataKey="rate" name="Turnover Rate" fill="#60a5fa" />
</BarChart>
</ResponsiveContainer>
</CardContent>
</Card>
</div>
<Card>
<CardHeader>
<CardTitle>Top Vendors</CardTitle>
</CardHeader>
<CardContent>
<div className="space-y-4">
{data?.topVendors.map((vendor) => (
<div key={vendor.vendor} className="flex items-center">
<div className="flex-1">
<p className="text-sm font-medium">{vendor.vendor}</p>
<p className="text-sm text-muted-foreground">
{vendor.productCount} products
</p>
</div>
<div className="ml-4 text-right">
<p className="text-sm font-medium">
Avg. Stock: {Number(vendor.averageStockLevel).toFixed(0)}
</p>
</div>
</div>
))}
</div>
</CardContent>
</Card>
</div>
);
}

View File

@@ -1,232 +0,0 @@
import { useQuery } from "@tanstack/react-query"
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
import {
Area,
AreaChart,
ResponsiveContainer,
Tooltip,
XAxis,
YAxis,
} from "recharts"
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs"
import config from "@/config"
interface MetricDataPoint {
date: string
value: number
}
interface KeyMetrics {
revenue: MetricDataPoint[]
inventory_value: MetricDataPoint[]
gmroi: MetricDataPoint[]
}
export function KeyMetricsCharts() {
const { data: metrics } = useQuery<KeyMetrics>({
queryKey: ["key-metrics"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/metrics/trends`)
if (!response.ok) {
throw new Error("Failed to fetch metrics trends")
}
return response.json()
},
})
const formatCurrency = (value: number) =>
new Intl.NumberFormat("en-US", {
style: "currency",
currency: "USD",
minimumFractionDigits: 0,
maximumFractionDigits: 0,
}).format(value)
return (
<>
<CardHeader>
<CardTitle className="text-lg font-medium">Key Metrics</CardTitle>
</CardHeader>
<CardContent>
<Tabs defaultValue="revenue" className="space-y-4">
<TabsList>
<TabsTrigger value="revenue">Revenue</TabsTrigger>
<TabsTrigger value="inventory">Inventory Value</TabsTrigger>
<TabsTrigger value="gmroi">GMROI</TabsTrigger>
</TabsList>
<TabsContent value="revenue" className="space-y-4">
<div className="h-[300px]">
<ResponsiveContainer width="100%" height="100%">
<AreaChart data={metrics?.revenue}>
<XAxis
dataKey="date"
tickLine={false}
axisLine={false}
tickFormatter={(value) => value}
/>
<YAxis
tickLine={false}
axisLine={false}
tickFormatter={formatCurrency}
/>
<Tooltip
content={({ active, payload }) => {
if (active && payload && payload.length) {
return (
<div className="rounded-lg border bg-background p-2 shadow-sm">
<div className="grid grid-cols-2 gap-2">
<div className="flex flex-col">
<span className="text-[0.70rem] uppercase text-muted-foreground">
Date
</span>
<span className="font-bold">
{payload[0].payload.date}
</span>
</div>
<div className="flex flex-col">
<span className="text-[0.70rem] uppercase text-muted-foreground">
Revenue
</span>
<span className="font-bold">
{formatCurrency(payload[0].value as number)}
</span>
</div>
</div>
</div>
)
}
return null
}}
/>
<Area
type="monotone"
dataKey="value"
stroke="#0ea5e9"
fill="#0ea5e9"
fillOpacity={0.2}
/>
</AreaChart>
</ResponsiveContainer>
</div>
</TabsContent>
<TabsContent value="inventory" className="space-y-4">
<div className="h-[300px]">
<ResponsiveContainer width="100%" height="100%">
<AreaChart data={metrics?.inventory_value}>
<XAxis
dataKey="date"
tickLine={false}
axisLine={false}
tickFormatter={(value) => value}
/>
<YAxis
tickLine={false}
axisLine={false}
tickFormatter={formatCurrency}
/>
<Tooltip
content={({ active, payload }) => {
if (active && payload && payload.length) {
return (
<div className="rounded-lg border bg-background p-2 shadow-sm">
<div className="grid grid-cols-2 gap-2">
<div className="flex flex-col">
<span className="text-[0.70rem] uppercase text-muted-foreground">
Date
</span>
<span className="font-bold">
{payload[0].payload.date}
</span>
</div>
<div className="flex flex-col">
<span className="text-[0.70rem] uppercase text-muted-foreground">
Value
</span>
<span className="font-bold">
{formatCurrency(payload[0].value as number)}
</span>
</div>
</div>
</div>
)
}
return null
}}
/>
<Area
type="monotone"
dataKey="value"
stroke="#84cc16"
fill="#84cc16"
fillOpacity={0.2}
/>
</AreaChart>
</ResponsiveContainer>
</div>
</TabsContent>
<TabsContent value="gmroi" className="space-y-4">
<div className="h-[300px]">
<ResponsiveContainer width="100%" height="100%">
<AreaChart data={metrics?.gmroi}>
<XAxis
dataKey="date"
tickLine={false}
axisLine={false}
tickFormatter={(value) => value}
/>
<YAxis
tickLine={false}
axisLine={false}
tickFormatter={(value) => `${value.toFixed(1)}%`}
/>
<Tooltip
content={({ active, payload }) => {
if (active && payload && payload.length) {
return (
<div className="rounded-lg border bg-background p-2 shadow-sm">
<div className="grid grid-cols-2 gap-2">
<div className="flex flex-col">
<span className="text-[0.70rem] uppercase text-muted-foreground">
Date
</span>
<span className="font-bold">
{payload[0].payload.date}
</span>
</div>
<div className="flex flex-col">
<span className="text-[0.70rem] uppercase text-muted-foreground">
GMROI
</span>
<span className="font-bold">
{`${typeof payload[0].value === 'number' ? payload[0].value.toFixed(1) : payload[0].value}%`}
</span>
</div>
</div>
</div>
)
}
return null
}}
/>
<Area
type="monotone"
dataKey="value"
stroke="#f59e0b"
fill="#f59e0b"
fillOpacity={0.2}
/>
</AreaChart>
</ResponsiveContainer>
</div>
</TabsContent>
</Tabs>
</CardContent>
</>
)
}

View File

@@ -1,108 +0,0 @@
import { useQuery } from "@tanstack/react-query"
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table"
import { Badge } from "@/components/ui/badge"
import config from "@/config"
import { format } from "date-fns"
interface Product {
pid: number;
sku: string;
title: string;
stock_quantity: number;
daily_sales_avg: string;
days_of_inventory: string;
reorder_qty: number;
last_purchase_date: string | null;
lead_time_status: string;
}
// Helper functions
const formatDate = (dateString: string) => {
return format(new Date(dateString), 'MMM dd, yyyy')
}
const getLeadTimeVariant = (status: string) => {
switch (status.toLowerCase()) {
case 'critical':
return 'destructive'
case 'warning':
return 'secondary'
case 'good':
return 'secondary'
default:
return 'secondary'
}
}
export function LowStockAlerts() {
const { data: products } = useQuery<Product[]>({
queryKey: ["low-stock"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/low-stock/products`)
if (!response.ok) {
throw new Error("Failed to fetch low stock products")
}
return response.json()
},
})
return (
<>
<CardHeader>
<CardTitle className="text-lg font-medium">Low Stock Alerts</CardTitle>
</CardHeader>
<CardContent>
<div className="max-h-[350px] overflow-auto">
<Table>
<TableHeader>
<TableRow>
<TableHead>Product</TableHead>
<TableHead className="text-right">Stock</TableHead>
<TableHead className="text-right">Daily Sales</TableHead>
<TableHead className="text-right">Days Left</TableHead>
<TableHead className="text-right">Reorder Qty</TableHead>
<TableHead>Last Purchase</TableHead>
<TableHead>Lead Time</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{products?.map((product) => (
<TableRow key={product.pid}>
<TableCell>
<a
href={`https://backend.acherryontop.com/product/${product.pid}`}
target="_blank"
rel="noopener noreferrer"
className="hover:underline"
>
{product.title}
</a>
<div className="text-sm text-muted-foreground">{product.sku}</div>
</TableCell>
<TableCell className="text-right">{product.stock_quantity}</TableCell>
<TableCell className="text-right">{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
<TableCell className="text-right">{Number(product.days_of_inventory).toFixed(1)}</TableCell>
<TableCell className="text-right">{product.reorder_qty}</TableCell>
<TableCell>{product.last_purchase_date ? formatDate(product.last_purchase_date) : '-'}</TableCell>
<TableCell>
<Badge variant={getLeadTimeVariant(product.lead_time_status)}>
{product.lead_time_status}
</Badge>
</TableCell>
</TableRow>
))}
</TableBody>
</Table>
</div>
</CardContent>
</>
)
}

View File

@@ -1,66 +0,0 @@
import { useQuery } from '@tanstack/react-query';
import { Line, LineChart, ResponsiveContainer, Tooltip, XAxis, YAxis } from 'recharts';
import config from '../../config';
interface SalesData {
date: string;
total: number;
}
export function Overview() {
const { data, isLoading, error } = useQuery<SalesData[]>({
queryKey: ['sales-overview'],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/sales-overview`);
if (!response.ok) {
throw new Error('Failed to fetch sales overview');
}
const rawData = await response.json();
return rawData.map((item: SalesData) => ({
...item,
total: parseFloat(item.total.toString()),
date: new Date(item.date).toLocaleDateString('en-US', { month: 'short', day: 'numeric' })
}));
},
});
if (isLoading) {
return <div>Loading chart...</div>;
}
if (error) {
return <div className="text-red-500">Error loading sales overview</div>;
}
return (
<ResponsiveContainer width="100%" height={350}>
<LineChart data={data}>
<XAxis
dataKey="date"
stroke="#888888"
fontSize={12}
tickLine={false}
axisLine={false}
/>
<YAxis
stroke="#888888"
fontSize={12}
tickLine={false}
axisLine={false}
tickFormatter={(value) => `$${value.toLocaleString()}`}
/>
<Tooltip
formatter={(value: number) => [`$${value.toLocaleString()}`, 'Sales']}
labelFormatter={(label) => `Date: ${label}`}
/>
<Line
type="monotone"
dataKey="total"
stroke="hsl(var(--primary))"
strokeWidth={2}
dot={false}
/>
</LineChart>
</ResponsiveContainer>
);
}

View File

@@ -1,63 +0,0 @@
import { useQuery } from '@tanstack/react-query';
import { Avatar, AvatarFallback } from '@/components/ui/avatar';
import config from '../../config';
interface RecentOrder {
order_id: string;
customer_name: string;
total_amount: number;
order_date: string;
}
export function RecentSales() {
const { data: recentOrders, isLoading, error } = useQuery<RecentOrder[]>({
queryKey: ['recent-orders'],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/recent-orders`);
if (!response.ok) {
throw new Error('Failed to fetch recent orders');
}
const data = await response.json();
return data.map((order: RecentOrder) => ({
...order,
total_amount: parseFloat(order.total_amount.toString())
}));
},
});
if (isLoading) {
return <div>Loading recent sales...</div>;
}
if (error) {
return <div className="text-red-500">Error loading recent sales</div>;
}
return (
<div className="space-y-8">
{recentOrders?.map((order) => (
<div key={order.order_id} className="flex items-center">
<Avatar className="h-9 w-9">
<AvatarFallback>
{order.customer_name?.split(' ').map(n => n[0]).join('') || '??'}
</AvatarFallback>
</Avatar>
<div className="ml-4 space-y-1">
<p className="text-sm font-medium leading-none">Order #{order.order_id}</p>
<p className="text-sm text-muted-foreground">
{new Date(order.order_date).toLocaleDateString()}
</p>
</div>
<div className="ml-auto font-medium">
${order.total_amount.toFixed(2)}
</div>
</div>
))}
{!recentOrders?.length && (
<div className="text-center text-muted-foreground">
No recent orders found
</div>
)}
</div>
);
}

View File

@@ -1,58 +0,0 @@
import { useQuery } from '@tanstack/react-query';
import { Cell, Pie, PieChart, ResponsiveContainer, Tooltip, Legend } from 'recharts';
import config from '../../config';
interface CategorySales {
category: string;
total: number;
percentage: number;
}
const COLORS = ['#0088FE', '#00C49F', '#FFBB28', '#FF8042', '#8884d8', '#82ca9d'];
export function SalesByCategory() {
const { data, isLoading, error } = useQuery<CategorySales[]>({
queryKey: ['sales-by-category'],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/sales-by-category`);
if (!response.ok) {
throw new Error('Failed to fetch category sales');
}
return response.json();
},
});
if (isLoading) {
return <div>Loading chart...</div>;
}
if (error) {
return <div className="text-red-500">Error loading category sales</div>;
}
return (
<ResponsiveContainer width="100%" height={300}>
<PieChart>
<Pie
data={data}
cx="50%"
cy="50%"
labelLine={false}
outerRadius={80}
fill="#8884d8"
dataKey="total"
nameKey="category"
label={({ name, percent }) => `${name} ${(percent * 100).toFixed(0)}%`}
>
{data?.map((_, index) => (
<Cell key={`cell-${index}`} fill={COLORS[index % COLORS.length]} />
))}
</Pie>
<Tooltip
formatter={(value: number) => [`$${value.toLocaleString()}`, 'Sales']}
/>
<Legend />
</PieChart>
</ResponsiveContainer>
);
}

View File

@@ -1,95 +0,0 @@
import { useQuery } from "@tanstack/react-query"
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table"
import { TrendingUp, TrendingDown } from "lucide-react"
import config from "@/config"
interface Product {
pid: number;
sku: string;
title: string;
daily_sales_avg: string;
weekly_sales_avg: string;
growth_rate: string;
total_revenue: string;
}
export function TrendingProducts() {
const { data: products } = useQuery<Product[]>({
queryKey: ["trending-products"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/products/trending`)
if (!response.ok) {
throw new Error("Failed to fetch trending products")
}
return response.json()
},
})
const formatPercent = (value: number) =>
new Intl.NumberFormat("en-US", {
style: "percent",
minimumFractionDigits: 1,
maximumFractionDigits: 1,
signDisplay: "exceptZero",
}).format(value / 100)
return (
<>
<CardHeader>
<CardTitle className="text-lg font-medium">Trending Products</CardTitle>
</CardHeader>
<CardContent>
<div className="max-h-[400px] overflow-auto">
<Table>
<TableHeader>
<TableRow>
<TableHead>Product</TableHead>
<TableHead>Daily Sales</TableHead>
<TableHead className="text-right">Growth</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{products?.map((product) => (
<TableRow key={product.pid}>
<TableCell className="font-medium">
<div className="flex flex-col">
<span className="font-medium">{product.title}</span>
<span className="text-sm text-muted-foreground">
{product.sku}
</span>
</div>
</TableCell>
<TableCell>{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
<TableCell className="text-right">
<div className="flex items-center justify-end gap-1">
{Number(product.growth_rate) > 0 ? (
<TrendingUp className="h-4 w-4 text-success" />
) : (
<TrendingDown className="h-4 w-4 text-destructive" />
)}
<span
className={
Number(product.growth_rate) > 0 ? "text-success" : "text-destructive"
}
>
{formatPercent(Number(product.growth_rate))}
</span>
</div>
</TableCell>
</TableRow>
))}
</TableBody>
</Table>
</div>
</CardContent>
</>
)
}

View File

@@ -1,79 +0,0 @@
import { useQuery } from "@tanstack/react-query"
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table"
import { Progress } from "@/components/ui/progress"
import config from "@/config"
interface VendorMetrics {
vendor: string
avg_lead_time: number
on_time_delivery_rate: number
avg_fill_rate: number
total_orders: number
active_orders: number
overdue_orders: number
}
export function VendorPerformance() {
const { data: vendors } = useQuery<VendorMetrics[]>({
queryKey: ["vendor-metrics"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/vendor/performance`)
if (!response.ok) {
throw new Error("Failed to fetch vendor metrics")
}
return response.json()
},
})
// Sort vendors by on-time delivery rate
const sortedVendors = vendors
?.sort((a, b) => b.on_time_delivery_rate - a.on_time_delivery_rate)
return (
<>
<CardHeader>
<CardTitle className="text-lg font-medium">Top Vendor Performance</CardTitle>
</CardHeader>
<CardContent className="max-h-[400px] overflow-auto">
<Table>
<TableHeader>
<TableRow>
<TableHead>Vendor</TableHead>
<TableHead>On-Time</TableHead>
<TableHead className="text-right">Fill Rate</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{sortedVendors?.map((vendor) => (
<TableRow key={vendor.vendor}>
<TableCell className="font-medium">{vendor.vendor}</TableCell>
<TableCell>
<div className="flex items-center gap-2">
<Progress
value={vendor.on_time_delivery_rate}
className="h-2"
/>
<span className="w-10 text-sm">
{vendor.on_time_delivery_rate.toFixed(0)}%
</span>
</div>
</TableCell>
<TableCell className="text-right">
{vendor.avg_fill_rate.toFixed(0)}%
</TableCell>
</TableRow>
))}
</TableBody>
</Table>
</CardContent>
</>
)
}

View File

@@ -160,6 +160,7 @@ export function AppSidebar() {
</SidebarGroupContent>
</SidebarGroup>
</SidebarContent>
<SidebarSeparator />
<SidebarFooter>
<SidebarMenu>
<SidebarMenuItem>

View File

@@ -253,6 +253,7 @@ export const ImageUploadStep = ({
}
getProductContainerClasses={() => getProductContainerClasses(index)}
findContainer={findContainer}
handleAddImageFromUrl={handleAddImageFromUrl}
/>
))}
</div>

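This hunk only threads `handleAddImageFromUrl` down to each ProductCard; the handler itself is defined elsewhere in ImageUploadStep and is not shown here. A sketch of what such a handler could look like, with the image-entry field names guessed from how ProductCard filters by `productIndex` and removes by `id`.

```ts
// Assumed shape only — the real handler is not part of this hunk.
import type { Dispatch, SetStateAction } from "react";

interface UrlImageEntry {
  id: string;          // ids must be unique for the sortable containers
  productIndex: number;
  url: string;
}

export function makeAddImageFromUrl(
  setProductImages: Dispatch<SetStateAction<UrlImageEntry[]>>
) {
  return (productIndex: number, url: string) => {
    setProductImages((prev) => [
      ...prev,
      { id: `url-${productIndex}-${Date.now()}`, productIndex, url },
    ]);
  };
}
```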
View File

@@ -1,7 +1,7 @@
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Card, CardContent } from "@/components/ui/card";
import { Loader2, Link as LinkIcon } from "lucide-react";
import { Loader2, Link as LinkIcon, Image as ImageIcon } from "lucide-react";
import { cn } from "@/lib/utils";
import { ImageDropzone } from "./ImageDropzone";
import { SortableImage } from "./SortableImage";
@@ -9,6 +9,25 @@ import { CopyButton } from "./CopyButton";
import { ProductImageSortable, Product } from "../../types";
import { DroppableContainer } from "../DroppableContainer";
import { SortableContext, horizontalListSortingStrategy } from '@dnd-kit/sortable';
import { useQuery } from "@tanstack/react-query";
import config from "@/config";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
import { useState, useMemo } from "react";
interface ReusableImage {
id: number;
name: string;
image_url: string;
is_global: boolean;
company: string | null;
}
interface ProductCardProps {
product: Product;
@@ -26,6 +45,7 @@ interface ProductCardProps {
onRemoveImage: (id: string) => void;
getProductContainerClasses: () => string;
findContainer: (id: string) => string | null;
handleAddImageFromUrl: (productIndex: number, url: string) => void;
}
export const ProductCard = ({
@@ -43,8 +63,11 @@ export const ProductCard = ({
onDragOver,
onRemoveImage,
getProductContainerClasses,
findContainer
findContainer,
handleAddImageFromUrl
}: ProductCardProps) => {
const [isReusableDialogOpen, setIsReusableDialogOpen] = useState(false);
// Function to get images for this product
const getProductImages = () => {
return productImages.filter(img => img.productIndex === index);
@@ -56,6 +79,32 @@ export const ProductCard = ({
return result !== null ? parseInt(result) : null;
};
// Fetch reusable images
const { data: reusableImages, isLoading: isLoadingReusable } = useQuery<ReusableImage[]>({
queryKey: ["reusable-images"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/reusable-images`);
if (!response.ok) {
throw new Error("Failed to fetch reusable images");
}
return response.json();
},
});
// Filter reusable images based on product's company
const availableReusableImages = useMemo(() => {
if (!reusableImages) return [];
return reusableImages.filter(img =>
img.is_global || img.company === product.company
);
}, [reusableImages, product.company]);
// Handle adding a reusable image
const handleAddReusableImage = (imageUrl: string) => {
handleAddImageFromUrl(index, imageUrl);
setIsReusableDialogOpen(false);
};
return (
<Card
className={cn(
@@ -83,6 +132,18 @@ export const ProductCard = ({
className="flex items-center gap-2"
onSubmit={onUrlSubmit}
>
{getProductImages().length === 0 && (
<Button
type="button"
variant="outline"
size="sm"
className="h-8 whitespace-nowrap flex gap-1 items-center text-xs"
onClick={() => setIsReusableDialogOpen(true)}
>
<ImageIcon className="h-3.5 w-3.5" />
Select from Library
</Button>
)}
<Input
placeholder="Add image from URL"
value={urlInput}
@@ -105,7 +166,7 @@ export const ProductCard = ({
</div>
<div className="flex flex-col sm:flex-row gap-2">
<div className="flex flex-row gap-2 items-start">
<div className="flex flex-row gap-2 items-center gap-4">
<ImageDropzone
productIndex={index}
onDrop={onImageUpload}
@@ -158,6 +219,50 @@ export const ProductCard = ({
/>
</div>
</CardContent>
{/* Reusable Images Dialog */}
<Dialog open={isReusableDialogOpen} onOpenChange={setIsReusableDialogOpen}>
<DialogContent className="max-w-3xl">
<DialogHeader>
<DialogTitle>Select from Image Library</DialogTitle>
<DialogDescription>
Choose a global or company-specific image to add to this product.
</DialogDescription>
</DialogHeader>
<ScrollArea className="h-[400px] pr-4">
{isLoadingReusable ? (
<div className="flex items-center justify-center h-full">
<Loader2 className="h-8 w-8 animate-spin" />
</div>
) : availableReusableImages.length === 0 ? (
<div className="flex flex-col items-center justify-center h-full text-muted-foreground">
<ImageIcon className="h-8 w-8 mb-2" />
<p>No reusable images available</p>
</div>
) : (
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 gap-4">
{availableReusableImages.map((image) => (
<div
key={image.id}
className="group relative aspect-square border rounded-lg overflow-hidden cursor-pointer hover:ring-2 hover:ring-primary"
onClick={() => handleAddReusableImage(image.image_url)}
>
<img
src={image.image_url}
alt={image.name}
className="w-full h-full object-cover"
/>
<div className="absolute inset-0 bg-black/0 group-hover:bg-black/20 transition-colors" />
<div className="absolute bottom-0 left-0 right-0 p-2 bg-gradient-to-t from-black/60 to-transparent">
<p className="text-xs text-white truncate">{image.name}</p>
</div>
</div>
))}
</div>
)}
</ScrollArea>
</DialogContent>
</Dialog>
</Card>
);
};

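ProductCard scopes the image library to entries that are global or belong to the product's company. The same rule, extracted as a standalone predicate that is easier to unit test; the function name is an assumption.

```ts
// Same rule ProductCard applies inline with useMemo, extracted for testing.
export function isImageAvailableForCompany(
  image: { is_global: boolean; company: string | null },
  productCompany?: string
): boolean {
  return image.is_global || image.company === productCompany;
}

// isImageAvailableForCompany({ is_global: true,  company: null },   "ACME")  -> true
// isImageAvailableForCompany({ is_global: false, company: "ACME" }, "ACME")  -> true
// isImageAvailableForCompany({ is_global: false, company: "ACME" }, "Other") -> false
```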
View File

@@ -31,5 +31,6 @@ export interface Product {
supplier_no?: string;
sku?: string;
model?: string;
company?: string;
product_images?: string | string[];
}

View File

@@ -0,0 +1,80 @@
import React, { useState } from 'react';
import { AiValidationDialogs } from './components/AiValidationDialogs';
import { Product } from '../../../../types/products';
import {
AiValidationProgress,
AiValidationDetails,
CurrentPrompt as AiValidationCurrentPrompt
} from './hooks/useAiValidation';
const ValidationStepNew: React.FC = () => {
const [aiValidationProgress, setAiValidationProgress] = useState<AiValidationProgress>({
isOpen: false,
status: 'idle',
step: 0
});
const [aiValidationDetails, setAiValidationDetails] = useState<AiValidationDetails>({
changes: [],
warnings: [],
changeDetails: [],
isOpen: false
});
const [currentPrompt, setCurrentPrompt] = useState<AiValidationCurrentPrompt>({
isOpen: false,
prompt: '',
isLoading: true,
});
// Track reversion state (for internal use)
const [reversionState, setReversionState] = useState<Record<string, boolean>>({});
const [fieldData] = useState<Product[]>([]);
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const revertAiChange = (productIndex: number, fieldKey: string) => {
const key = `${productIndex}-${fieldKey}`;
setReversionState(prev => ({
...prev,
[key]: true
}));
};
const isChangeReverted = (productIndex: number, fieldKey: string): boolean => {
const key = `${productIndex}-${fieldKey}`;
return !!reversionState[key];
};
const getFieldDisplayValueWithHighlight = (
_fieldKey: string,
originalValue: any,
correctedValue: any
) => {
return {
originalHtml: String(originalValue),
correctedHtml: String(correctedValue)
};
};
return (
<div>
<AiValidationDialogs
aiValidationProgress={aiValidationProgress}
aiValidationDetails={aiValidationDetails}
currentPrompt={currentPrompt}
setAiValidationProgress={setAiValidationProgress}
setAiValidationDetails={setAiValidationDetails}
setCurrentPrompt={setCurrentPrompt}
revertAiChange={revertAiChange}
isChangeReverted={isChangeReverted}
getFieldDisplayValueWithHighlight={getFieldDisplayValueWithHighlight}
fields={fieldData}
debugData={currentPrompt.debugData}
/>
</div>
);
};
export default ValidationStepNew;
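The stub above simply stringifies both values, so the results dialog shows no highlighting. Below is a hedged sketch of a richer variant that wraps differing words in `<mark>` for `dangerouslySetInnerHTML`; word-level diffing and the lack of HTML escaping are simplifications, not the project's actual implementation.

```ts
// Hypothetical highlighting variant (real code should HTML-escape each word first).
const getFieldDisplayValueWithHighlight = (
  _fieldKey: string,
  originalValue: unknown,
  correctedValue: unknown
) => {
  const originalWords = String(originalValue ?? "").split(/\s+/);
  const correctedWords = String(correctedValue ?? "").split(/\s+/);
  // Mark any word that does not appear anywhere in the other value.
  const highlight = (words: string[], other: Set<string>) =>
    words.map((w) => (other.has(w) ? w : `<mark>${w}</mark>`)).join(" ");
  return {
    originalHtml: highlight(originalWords, new Set(correctedWords)),
    correctedHtml: highlight(correctedWords, new Set(originalWords)),
  };
};
```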

View File

@@ -1,23 +1,87 @@
import React from 'react';
import { Dialog, DialogContent, DialogHeader, DialogTitle, DialogDescription } from '@/components/ui/dialog';
import { ScrollArea } from '@/components/ui/scroll-area';
import { Button } from '@/components/ui/button';
import { Loader2, CheckIcon } from 'lucide-react';
import { Code } from '@/components/ui/code';
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from '@/components/ui/table';
import { AiValidationDetails, AiValidationProgress, CurrentPrompt } from '../hooks/useAiValidation';
import React, { useState } from "react";
import {
Dialog,
DialogContent,
DialogHeader,
DialogTitle,
DialogDescription,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
import { Button } from "@/components/ui/button";
import { Loader2, CheckIcon, XIcon } from "lucide-react";
import { Code } from "@/components/ui/code";
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table";
import {
AiValidationDetails,
AiValidationProgress,
CurrentPrompt,
} from "../hooks/useAiValidation";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";
interface TaxonomyStats {
categories: number;
themes: number;
colors: number;
taxCodes: number;
sizeCategories: number;
suppliers: number;
companies: number;
artists: number;
}
interface DebugData {
taxonomyStats: TaxonomyStats | null;
basePrompt: string;
sampleFullPrompt: string;
promptLength: number;
apiFormat?: Array<{
role: string;
content: string;
}>;
promptSources?: {
systemPrompt?: { id: number; prompt_text: string };
generalPrompt?: { id: number; prompt_text: string };
companyPrompts?: Array<{
id: number;
company: string;
companyName?: string;
prompt_text: string;
}>;
};
estimatedProcessingTime?: {
seconds: number | null;
sampleCount: number;
};
}
interface AiValidationDialogsProps {
aiValidationProgress: AiValidationProgress;
aiValidationDetails: AiValidationDetails;
currentPrompt: CurrentPrompt;
setAiValidationProgress: React.Dispatch<React.SetStateAction<AiValidationProgress>>;
setAiValidationDetails: React.Dispatch<React.SetStateAction<AiValidationDetails>>;
setAiValidationProgress: React.Dispatch<
React.SetStateAction<AiValidationProgress>
>;
setAiValidationDetails: React.Dispatch<
React.SetStateAction<AiValidationDetails>
>;
setCurrentPrompt: React.Dispatch<React.SetStateAction<CurrentPrompt>>;
revertAiChange: (productIndex: number, fieldKey: string) => void;
isChangeReverted: (productIndex: number, fieldKey: string) => boolean;
getFieldDisplayValueWithHighlight: (fieldKey: string, originalValue: any, correctedValue: any) => { originalHtml: string, correctedHtml: string };
getFieldDisplayValueWithHighlight: (
fieldKey: string,
originalValue: any,
correctedValue: any
) => { originalHtml: string; correctedHtml: string };
fields: readonly any[];
debugData?: DebugData;
}
export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
@@ -30,41 +94,558 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
revertAiChange,
isChangeReverted,
getFieldDisplayValueWithHighlight,
fields
fields,
debugData,
}) => {
const [costPerMillionTokens, setCostPerMillionTokens] = useState(2.5); // Default cost
const hasCompanyPrompts =
currentPrompt.debugData?.promptSources?.companyPrompts &&
currentPrompt.debugData.promptSources.companyPrompts.length > 0;
// Create our own state to track changes
const [localReversionState, setLocalReversionState] = useState<
Record<string, boolean>
>({});
// Initialize local state from the isChangeReverted function when component mounts
// or when aiValidationDetails changes
React.useEffect(() => {
if (
aiValidationDetails.changeDetails &&
aiValidationDetails.changeDetails.length > 0
) {
const initialState: Record<string, boolean> = {};
aiValidationDetails.changeDetails.forEach((product) => {
product.changes.forEach((change) => {
const key = `${product.productIndex}-${change.field}`;
initialState[key] = isChangeReverted(
product.productIndex,
change.field
);
});
});
setLocalReversionState(initialState);
}
}, [aiValidationDetails.changeDetails, isChangeReverted]);
// This function will toggle the local state for a given change
const toggleChangeAcceptance = (productIndex: number, fieldKey: string) => {
const key = `${productIndex}-${fieldKey}`;
const currentlyRejected = !!localReversionState[key];
// Toggle the local state
setLocalReversionState((prev) => ({
...prev,
[key]: !prev[key],
}));
// Only call revertAiChange when toggling to rejected state
// Since revertAiChange is specifically for rejecting changes
if (!currentlyRejected) {
revertAiChange(productIndex, fieldKey);
}
};
// Function to check local reversion state
const isChangeLocallyReverted = (
productIndex: number,
fieldKey: string
): boolean => {
const key = `${productIndex}-${fieldKey}`;
return !!localReversionState[key];
};
// Use "full" as the default tab
const defaultTab = "full";
const [activeTab, setActiveTab] = useState(defaultTab);
// Update activeTab when the dialog is opened with new data
React.useEffect(() => {
if (currentPrompt.isOpen) {
setActiveTab("full");
}
}, [currentPrompt.isOpen]);
// Format time helper
const formatTime = (seconds: number): string => {
if (seconds < 60) {
return `${Math.round(seconds)} seconds`;
} else {
const minutes = Math.floor(seconds / 60);
const remainingSeconds = Math.round(seconds % 60);
return `${minutes}m ${remainingSeconds}s`;
}
};
// Calculate token costs
const calculateTokenCost = (promptLength: number): number => {
const estimatedTokens = Math.round(promptLength / 4);
return (estimatedTokens / 1_000_000) * costPerMillionTokens * 100; // In cents
};
// Use the prompt length from the current prompt
const promptLength = currentPrompt.prompt ? currentPrompt.prompt.length : 0;
return (
<>
{/* Current Prompt Dialog */}
<Dialog
open={currentPrompt.isOpen}
onOpenChange={(open) => setCurrentPrompt(prev => ({ ...prev, isOpen: open }))}
{/* Current Prompt Dialog with Debug Info */}
<Dialog
open={currentPrompt.isOpen}
onOpenChange={(open) =>
setCurrentPrompt((prev) => ({ ...prev, isOpen: open }))
}
>
<DialogContent className="max-w-4xl h-[80vh]">
<DialogContent className="max-w-4xl max-h-[90vh] overflow-hidden">
<DialogHeader>
<DialogTitle>Current AI Prompt</DialogTitle>
<DialogDescription>
This is the exact prompt that would be sent to the AI for validation
This is the current prompt that would be sent to the AI for
validation
</DialogDescription>
</DialogHeader>
<ScrollArea className="flex-1">
{currentPrompt.isLoading ? (
<div className="flex items-center justify-center h-full">
<Loader2 className="h-8 w-8 animate-spin" />
</div>
) : (
<Code className="whitespace-pre-wrap p-4">{currentPrompt.prompt}</Code>
)}
</ScrollArea>
<div className="flex flex-col h-[calc(90vh-120px)] overflow-hidden">
{/* Debug Information Section - Fixed at the top */}
<div className="flex-shrink-0">
{currentPrompt.isLoading ? (
<div className="flex justify-center items-center h-[100px]"></div>
) : (
<>
<div className="grid grid-cols-3 gap-4 mb-4">
<Card className="py-2">
<CardHeader className="py-2">
<CardTitle className="text-base">
Prompt Length
</CardTitle>
</CardHeader>
<CardContent className="py-2">
<div className="flex flex-col space-y-2">
<div className="text-sm">
<span className="text-muted-foreground">
Characters:
</span>{" "}
<span className="font-semibold">
{promptLength}
</span>
</div>
<div className="text-sm">
<span className="text-muted-foreground">
Tokens:
</span>{" "}
<span className="font-semibold">
~{Math.round(promptLength / 4)}
</span>
</div>
</div>
</CardContent>
</Card>
<Card className="py-2">
<CardHeader className="py-2">
<CardTitle className="text-base">
Cost Estimate
</CardTitle>
</CardHeader>
<CardContent className="py-2">
<div className="flex flex-col space-y-2">
<div className="flex flex-row items-center">
<label
htmlFor="costPerMillion"
className="text-sm text-muted-foreground"
>
$
</label>
<input
id="costPerMillion"
className="w-[40px] px-1 border rounded-md text-sm"
defaultValue={costPerMillionTokens.toFixed(2)}
onChange={(e) => {
const value = parseFloat(e.target.value);
if (!isNaN(value)) {
setCostPerMillionTokens(value);
}
}}
/>
<label
htmlFor="costPerMillion"
className="text-sm text-muted-foreground ml-1"
>
per million input tokens
</label>
</div>
<div className="text-sm">
<span className="text-muted-foreground">Cost:</span>{" "}
<span className="font-semibold">
{calculateTokenCost(promptLength).toFixed(1)}¢
</span>
</div>
</div>
</CardContent>
</Card>
<Card className="py-2">
<CardHeader className="py-2">
<CardTitle className="text-base">
Processing Time
</CardTitle>
</CardHeader>
<CardContent className="py-2">
<div className="flex flex-col space-y-2">
{currentPrompt.debugData?.estimatedProcessingTime ? (
currentPrompt.debugData.estimatedProcessingTime
.seconds ? (
<>
<div className="text-sm">
<span className="text-muted-foreground">
Estimated time:
</span>{" "}
<span className="font-semibold">
{formatTime(
currentPrompt.debugData
.estimatedProcessingTime.seconds
)}
</span>
</div>
<div className="text-xs text-muted-foreground">
Based on{" "}
{
currentPrompt.debugData
.estimatedProcessingTime.sampleCount
}{" "}
similar validation
{currentPrompt.debugData
.estimatedProcessingTime.sampleCount !== 1
? "s"
: ""}
</div>
</>
) : (
<div className="text-sm text-muted-foreground">
No historical data available for this prompt
size
</div>
)
) : (
<div className="text-sm text-muted-foreground">
No processing time data available
</div>
)}
</div>
</CardContent>
</Card>
</div>
</>
)}
</div>
{/* Prompt Section - Scrollable content */}
<div className="flex-1 min-h-0">
{currentPrompt.isLoading ? (
<div className="flex items-center justify-center h-full">
<Loader2 className="h-8 w-8 animate-spin" />
</div>
) : (
<>
{currentPrompt.debugData?.apiFormat ? (
<div className="flex flex-col h-full">
{/* Prompt Sources Card - Fixed at the top of the content area */}
<Card className="py-2 mb-4 flex-shrink-0">
<CardHeader className="py-2">
<CardTitle className="text-base">
Prompt Sources
</CardTitle>
</CardHeader>
<CardContent className="py-2">
<div className="flex flex-wrap gap-2">
<Badge
variant="outline"
className="bg-purple-100 hover:bg-purple-200 cursor-pointer"
onClick={() =>
document
.getElementById("system-message")
?.scrollIntoView({ behavior: "smooth" })
}
>
System
</Badge>
<Badge
variant="outline"
className="bg-green-100 hover:bg-green-200 cursor-pointer"
onClick={() =>
document
.getElementById("general-section")
?.scrollIntoView({ behavior: "smooth" })
}
>
General
</Badge>
{currentPrompt.debugData.promptSources?.companyPrompts?.map(
(company, idx) => (
<Badge
key={idx}
variant="outline"
className="bg-blue-100 hover:bg-blue-200 cursor-pointer"
onClick={() =>
document
.getElementById("company-section")
?.scrollIntoView({ behavior: "smooth" })
}
>
{company.companyName ||
`Company ${company.company}`}
</Badge>
)
)}
<Badge
variant="outline"
className="bg-amber-100 hover:bg-amber-200 cursor-pointer"
onClick={() =>
document
.getElementById("taxonomy-section")
?.scrollIntoView({ behavior: "smooth" })
}
>
Taxonomy
</Badge>
<Badge
variant="outline"
className="bg-pink-100 hover:bg-pink-200 cursor-pointer"
onClick={() =>
document
.getElementById("product-section")
?.scrollIntoView({ behavior: "smooth" })
}
>
Products
</Badge>
</div>
</CardContent>
</Card>
<ScrollArea className="flex-1 w-full overflow-y-auto">
{currentPrompt.debugData.apiFormat.map(
(message, idx: number) => (
<div
key={idx}
className="border rounded-md p-2 mb-4"
>
<div
id={
message.role === "system"
? "system-message"
: ""
}
className={`p-2 mb-2 rounded-sm font-medium ${
message.role === "system"
? "bg-purple-50 text-purple-800"
: "bg-green-50 text-green-800"
}`}
>
Role: {message.role}
</div>
<Code
className={`whitespace-pre-wrap p-4 break-normal max-w-full ${
message.role === "system"
? "bg-purple-50/30"
: "bg-green-50/30"
}`}
>
{message.role === "user" ? (
<div className="text-wrapper">
{(() => {
const content = message.content;
// Find section boundaries by looking for specific markers
const companySpecificStartIndex =
content.indexOf(
"--- COMPANY-SPECIFIC INSTRUCTIONS ---"
);
const companySpecificEndIndex =
content.indexOf(
"--- END COMPANY-SPECIFIC INSTRUCTIONS ---"
);
const taxonomyStartIndex =
content.indexOf(
"All Available Categories:"
);
const taxonomyFallbackStartIndex =
content.indexOf(
"Available Categories:"
);
const actualTaxonomyStartIndex =
taxonomyStartIndex >= 0
? taxonomyStartIndex
: taxonomyFallbackStartIndex;
const productDataStartIndex =
content.indexOf(
"----------Here is the product data to validate----------"
);
// If we can't find any markers, just return the content as-is
if (
actualTaxonomyStartIndex < 0 &&
productDataStartIndex < 0 &&
companySpecificStartIndex < 0
) {
return content;
}
// Determine section indices
let generalEndIndex = content.length;
if (companySpecificStartIndex >= 0) {
generalEndIndex =
companySpecificStartIndex;
} else if (
actualTaxonomyStartIndex >= 0
) {
generalEndIndex =
actualTaxonomyStartIndex;
} else if (productDataStartIndex >= 0) {
generalEndIndex = productDataStartIndex;
}
// Determine where taxonomy starts
let taxonomyEndIndex = content.length;
if (productDataStartIndex >= 0) {
taxonomyEndIndex =
productDataStartIndex;
}
// Segments to render with appropriate styling
const segments = [];
// General section (beginning to company/taxonomy/product)
if (generalEndIndex > 0) {
segments.push(
<div
id="general-section"
key="general"
className="border-l-4 border-green-500 pl-4 py-0 my-1"
>
<div className="text-xs font-semibold text-green-700 mb-2">
General Prompt
</div>
<pre className="whitespace-pre-wrap">
{content.substring(
0,
generalEndIndex
)}
</pre>
</div>
);
}
// Company-specific section if present
if (
companySpecificStartIndex >= 0 &&
companySpecificEndIndex >= 0
) {
segments.push(
<div
id="company-section"
key="company"
className="border-l-4 border-blue-500 pl-4 py-0 my-1"
>
<div className="text-xs font-semibold text-blue-700 mb-2">
Company-Specific Instructions
</div>
<pre className="whitespace-pre-wrap">
{content.substring(
companySpecificStartIndex,
companySpecificEndIndex +
"--- END COMPANY-SPECIFIC INSTRUCTIONS ---"
.length
)}
</pre>
</div>
);
}
// Taxonomy section
if (actualTaxonomyStartIndex >= 0) {
const taxEnd = taxonomyEndIndex;
segments.push(
<div
id="taxonomy-section"
key="taxonomy"
className="border-l-4 border-amber-500 pl-4 py-0 my-1"
>
<div className="text-xs font-semibold text-amber-700 mb-2">
Taxonomy Data
</div>
<pre className="whitespace-pre-wrap">
{content.substring(
actualTaxonomyStartIndex,
taxEnd
)}
</pre>
</div>
);
}
// Product data section
if (productDataStartIndex >= 0) {
segments.push(
<div
id="product-section"
key="product"
className="border-l-4 border-pink-500 pl-4 py-0 my-1"
>
<div className="text-xs font-semibold text-pink-700 mb-2">
Product Data
</div>
<pre className="whitespace-pre-wrap">
{content.substring(
productDataStartIndex
)}
</pre>
</div>
);
}
return <>{segments}</>;
})()}
</div>
) : (
<pre className="whitespace-pre-wrap">
{message.content}
</pre>
)}
</Code>
</div>
)
)}
</ScrollArea>
</div>
) : (
<ScrollArea className="h-full w-full">
<Code className="whitespace-pre-wrap p-4 break-normal max-w-full">
{currentPrompt.prompt}
</Code>
</ScrollArea>
)}
</>
)}
</div>
</div>
</DialogContent>
</Dialog>
{/* AI Validation Progress Dialog */}
<Dialog
open={aiValidationProgress.isOpen}
<Dialog
open={aiValidationProgress.isOpen}
onOpenChange={(open) => {
// Only allow closing if validation failed
if (!open && aiValidationProgress.step === -1) {
setAiValidationProgress(prev => ({ ...prev, isOpen: false }));
setAiValidationProgress((prev) => ({ ...prev, isOpen: false }));
}
}}
>
@@ -76,17 +657,30 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
<div className="flex items-center gap-4">
<div className="flex-1">
<div className="h-2 w-full bg-secondary rounded-full overflow-hidden">
<div
className="h-full bg-primary transition-all duration-500"
style={{
width: `${aiValidationProgress.progressPercent ?? Math.round((aiValidationProgress.step / 5) * 100)}%`,
backgroundColor: aiValidationProgress.step === -1 ? 'var(--destructive)' : undefined
<div
className="h-full bg-primary transition-all duration-500"
style={{
width: `${
aiValidationProgress.progressPercent !== undefined
? Math.round(aiValidationProgress.progressPercent)
: Math.round((aiValidationProgress.step / 5) * 100)
}%`,
backgroundColor:
aiValidationProgress.step === -1
? "var(--destructive)"
: undefined,
}}
/>
</div>
</div>
<div className="text-sm text-muted-foreground w-12 text-right">
{aiValidationProgress.step === -1 ? '❌' : `${aiValidationProgress.progressPercent ?? Math.round((aiValidationProgress.step / 5) * 100)}%`}
{aiValidationProgress.step === -1
? "❌"
: `${
aiValidationProgress.progressPercent !== undefined
? Math.round(aiValidationProgress.progressPercent)
: Math.round((aiValidationProgress.step / 5) * 100)
}%`}
</div>
</div>
<p className="text-center text-sm text-muted-foreground">
@@ -94,32 +688,43 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
</p>
{(() => {
// Only show time remaining if we have an estimate and are in progress
return aiValidationProgress.estimatedSeconds &&
aiValidationProgress.elapsedSeconds !== undefined &&
aiValidationProgress.step > 0 &&
return (
aiValidationProgress.estimatedSeconds &&
aiValidationProgress.elapsedSeconds !== undefined &&
aiValidationProgress.step > 0 &&
aiValidationProgress.step < 5 && (
<div className="text-center text-sm">
{(() => {
// Calculate time remaining using the elapsed seconds
const elapsedSeconds = aiValidationProgress.elapsedSeconds;
const totalEstimatedSeconds = aiValidationProgress.estimatedSeconds;
const remainingSeconds = Math.max(0, totalEstimatedSeconds - elapsedSeconds);
// Format time remaining
if (remainingSeconds < 60) {
return `Approximately ${Math.round(remainingSeconds)} seconds remaining`;
} else {
const minutes = Math.floor(remainingSeconds / 60);
const seconds = Math.round(remainingSeconds % 60);
return `Approximately ${minutes}m ${seconds}s remaining`;
}
})()}
{aiValidationProgress.promptLength && (
<p className="mt-1 text-xs text-muted-foreground">
Prompt length: {aiValidationProgress.promptLength.toLocaleString()} characters
</p>
)}
</div>
<div className="text-center text-sm">
{(() => {
// Calculate time remaining using the elapsed seconds
const elapsedSeconds =
aiValidationProgress.elapsedSeconds;
const totalEstimatedSeconds =
aiValidationProgress.estimatedSeconds;
const remainingSeconds = Math.max(
0,
totalEstimatedSeconds - elapsedSeconds
);
// Format time remaining
if (remainingSeconds < 60) {
return `Approximately ${Math.round(
remainingSeconds
)} seconds remaining`;
} else {
const minutes = Math.floor(remainingSeconds / 60);
const seconds = Math.round(remainingSeconds % 60);
return `Approximately ${minutes}m ${seconds}s remaining`;
}
})()}
{aiValidationProgress.promptLength && (
<p className="mt-1 text-xs text-muted-foreground">
Prompt length:{" "}
{aiValidationProgress.promptLength.toLocaleString()}{" "}
characters
</p>
)}
</div>
)
);
})()}
</div>
@@ -127,26 +732,33 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
</Dialog>
{/* AI Validation Results Dialog */}
<Dialog
open={aiValidationDetails.isOpen}
onOpenChange={(open) => setAiValidationDetails(prev => ({ ...prev, isOpen: open }))}
<Dialog
open={aiValidationDetails.isOpen}
onOpenChange={(open) =>
setAiValidationDetails((prev) => ({ ...prev, isOpen: open }))
}
>
<DialogContent className="max-w-4xl">
<DialogContent className="max-w-6xl w-[90vw]">
<DialogHeader>
<DialogTitle>AI Validation Results</DialogTitle>
<DialogDescription>
Review the changes and warnings suggested by the AI
</DialogDescription>
</DialogHeader>
<ScrollArea className="max-h-[60vh]">
{aiValidationDetails.changeDetails && aiValidationDetails.changeDetails.length > 0 ? (
<ScrollArea className="max-h-[70vh]">
{aiValidationDetails.changeDetails &&
aiValidationDetails.changeDetails.length > 0 ? (
<div className="mb-6 space-y-6">
<h3 className="font-semibold text-lg">Detailed Changes:</h3>
{aiValidationDetails.changeDetails.map((product, i) => {
// Find the title change if it exists
const titleChange = product.changes.find(c => c.field === 'title');
const titleValue = titleChange ? titleChange.corrected : product.title;
const titleChange = product.changes.find(
(c) => c.field === "title"
);
const titleValue = titleChange
? titleChange.corrected
: product.title;
return (
<div key={`product-${i}`} className="border rounded-md p-4">
<h4 className="font-medium text-base mb-3">
@@ -155,64 +767,96 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
<Table>
<TableHeader>
<TableRow>
<TableHead className="w-[180px]">Field</TableHead>
<TableHead>Original Value</TableHead>
<TableHead>Corrected Value</TableHead>
<TableHead className="text-right">Action</TableHead>
<TableHead className="">Field</TableHead>
<TableHead className="w-[35%]">
Original Value
</TableHead>
<TableHead className="w-[35%]">
Corrected Value
</TableHead>
<TableHead className="text-right">
Accept Changes?
</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{product.changes.map((change, j) => {
const field = fields.find(f => f.key === change.field);
const fieldLabel = field ? field.label : change.field;
const isReverted = isChangeReverted(product.productIndex, change.field);
// Get highlighted differences
const { originalHtml, correctedHtml } = getFieldDisplayValueWithHighlight(
change.field,
change.original,
change.corrected
const field = fields.find(
(f) => f.key === change.field
);
const fieldLabel = field
? field.label
: change.field;
const isReverted = isChangeLocallyReverted(
product.productIndex,
change.field
);
// Get highlighted differences
const { originalHtml, correctedHtml } =
getFieldDisplayValueWithHighlight(
change.field,
change.original,
change.corrected
);
return (
<TableRow key={`change-${j}`}>
<TableCell className="font-medium">{fieldLabel}</TableCell>
<TableCell className="font-medium">
{fieldLabel}
</TableCell>
<TableCell>
<div
dangerouslySetInnerHTML={{ __html: originalHtml }}
className={isReverted ? "font-medium" : ""}
<div
dangerouslySetInnerHTML={{
__html: originalHtml,
}}
/>
</TableCell>
<TableCell>
<div
dangerouslySetInnerHTML={{ __html: correctedHtml }}
className={!isReverted ? "font-medium" : ""}
<div
dangerouslySetInnerHTML={{
__html: correctedHtml,
}}
/>
</TableCell>
<TableCell className="text-right">
<div className="mt-2">
{isReverted ? (
<Button
variant="ghost"
size="sm"
className="text-green-600 bg-green-50 hover:bg-green-100 hover:text-green-700"
disabled
>
<CheckIcon className="w-4 h-4 mr-1" />
Reverted
</Button>
) : (
<Button
variant="outline"
size="sm"
onClick={() => {
// Call the revert function directly
revertAiChange(product.productIndex, change.field);
}}
>
Revert Change
</Button>
)}
<TableCell className="text-right align-top">
<div className="flex justify-end gap-2">
<Button
variant="outline"
size="sm"
onClick={() => {
// Toggle to Accepted state if currently rejected
toggleChangeAcceptance(
product.productIndex,
change.field
);
}}
className={
!isReverted
? "bg-green-100 text-green-600 border-green-300 flex items-center"
: "border-gray-200 text-gray-600 hover:bg-green-50 hover:text-green-600 hover:border-green-200 flex items-center"
}
>
<CheckIcon className="h-4 w-4" />
</Button>
<Button
variant="outline"
size="sm"
onClick={() => {
// Toggle to Rejected state if currently accepted
toggleChangeAcceptance(
product.productIndex,
change.field
);
}}
className={
isReverted
? "bg-red-100 text-red-600 border-red-300 flex items-center"
: "border-gray-200 text-gray-600 hover:bg-red-50 hover:text-red-600 hover:border-red-200 flex items-center"
}
>
<XIcon className="h-4 w-4" />
</Button>
</div>
</TableCell>
</TableRow>
@@ -226,12 +870,17 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
</div>
) : (
<div className="py-8 text-center text-muted-foreground">
{aiValidationDetails.warnings && aiValidationDetails.warnings.length > 0 ? (
{aiValidationDetails.warnings &&
aiValidationDetails.warnings.length > 0 ? (
<div>
<p className="mb-4">No changes were made, but the AI provided some warnings:</p>
<p className="mb-4">
No changes were made, but the AI provided some warnings:
</p>
<ul className="list-disc pl-8 text-left">
{aiValidationDetails.warnings.map((warning, i) => (
<li key={`warning-${i}`} className="mb-2">{warning}</li>
<li key={`warning-${i}`} className="mb-2">
{warning}
</li>
))}
</ul>
</div>
@@ -245,4 +894,4 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
</Dialog>
</>
);
};
};

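The cost card estimates tokens as characters ÷ 4 and prices them per million input tokens. The same arithmetic as `calculateTokenCost`, written standalone with a worked example; the 4-characters-per-token ratio is the dialog's rough heuristic, not a real tokenizer.

```ts
// Standalone version of the dialog's cost estimate (returns cents).
function estimateCostCents(promptChars: number, dollarsPerMillionTokens: number): number {
  const estimatedTokens = Math.round(promptChars / 4); // rough heuristic, not a tokenizer
  return (estimatedTokens / 1_000_000) * dollarsPerMillionTokens * 100;
}

// A 120,000-character prompt at $2.50 per million input tokens:
// ~30,000 tokens -> 0.03 * 2.5 = $0.075 -> estimateCostCents(120_000, 2.5) === 7.5 (cents)
```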
View File

@@ -18,6 +18,7 @@ import { useUpcValidation } from '../hooks/useUpcValidation'
import { useProductLinesFetching } from '../hooks/useProductLinesFetching'
import UpcValidationTableAdapter from './UpcValidationTableAdapter'
import { Skeleton } from '@/components/ui/skeleton'
import { Protected } from '@/components/auth/Protected'
/**
* ValidationContainer component - the main wrapper for the validation step
*
@@ -1149,15 +1150,17 @@ const ValidationContainer = <T extends string>({
)}
<div className="flex items-center gap-2">
{/* Show Prompt Button */}
<Button
variant="outline"
onClick={aiValidation.showCurrentPrompt}
disabled={aiValidation.isAiValidating}
className="flex items-center gap-1"
>
<FileText className="h-4 w-4" />
Show Prompt
</Button>
<Protected permission="admin:debug">
<Button
variant="outline"
onClick={aiValidation.showCurrentPrompt}
disabled={aiValidation.isAiValidating}
className="flex items-center gap-1"
>
<FileText className="h-4 w-4" />
Show Prompt
</Button>
</Protected>
{/* AI Validate Button */}
<Button
@@ -1200,6 +1203,7 @@ const ValidationContainer = <T extends string>({
isChangeReverted={aiValidation.isChangeReverted}
getFieldDisplayValueWithHighlight={aiValidation.getFieldDisplayValueWithHighlight}
fields={fields}
debugData={aiValidation.currentPrompt.debugData}
/>
{/* Product Search Dialog */}
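
The Show Prompt button above is now gated behind `<Protected permission="admin:debug">`. For context, a minimal sketch of what a permission gate like `Protected` could look like is shown below, assuming it reads the current user from `AuthContext` and that `user.is_admin` and `user.permissions` exist (both are referenced elsewhere in this changeset); the real component lives in `@/components/auth/Protected` and may differ.

```tsx
// Editor's sketch only — not part of this diff. Assumes AuthContext exposes a
// user object with an is_admin flag and a permissions string array.
import { useContext, type ReactNode } from "react";
import { AuthContext } from "@/contexts/AuthContext";

interface ProtectedProps {
  permission: string;
  fallback?: ReactNode;
  children: ReactNode;
}

export function Protected({ permission, fallback = null, children }: ProtectedProps) {
  const { user } = useContext(AuthContext);
  // Admins see everything; other users need the specific permission string.
  const allowed = Boolean(user?.is_admin || user?.permissions?.includes(permission));
  return <>{allowed ? children : fallback}</>;
}
```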

View File

@@ -42,6 +42,39 @@ export interface CurrentPrompt {
isOpen: boolean;
prompt: string | null;
isLoading: boolean;
debugData?: {
taxonomyStats: {
categories: number;
themes: number;
colors: number;
taxCodes: number;
sizeCategories: number;
suppliers: number;
companies: number;
artists: number;
} | null;
basePrompt: string;
sampleFullPrompt: string;
promptLength: number;
apiFormat?: Array<{
role: string;
content: string;
}>;
promptSources?: {
systemPrompt?: { id: number; prompt_text: string };
generalPrompt?: { id: number; prompt_text: string };
companyPrompts?: Array<{
id: number;
company: string;
companyName: string;
prompt_text: string
}>;
};
estimatedProcessingTime?: {
seconds: number | null;
sampleCount: number;
};
};
}
// Declare global interface for the timer
@@ -250,7 +283,11 @@ export const useAiValidation = <T extends string>(
// Function to show current prompt
const showCurrentPrompt = useCallback(async () => {
try {
setCurrentPrompt(prev => ({
  ...prev,
  isLoading: true,
  isOpen: true
}));
// Debug log the data being sent
console.log('Sending products data:', {
@@ -272,7 +309,7 @@ export const useAiValidation = <T extends string>(
});
// Use POST to send products in request body
const response = await fetch(`${getApiUrl()}/ai-validation/debug`, {
const response = await fetch(`${await getApiUrl()}/ai-validation/debug`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
@@ -294,7 +331,16 @@ export const useAiValidation = <T extends string>(
setCurrentPrompt(prev => ({
...prev,
prompt: promptContent,
isLoading: false,
debugData: {
taxonomyStats: result.taxonomyStats || null,
basePrompt: result.basePrompt || '',
sampleFullPrompt: result.sampleFullPrompt || '',
promptLength: result.promptLength || (promptContent ? promptContent.length : 0),
promptSources: result.promptSources,
estimatedProcessingTime: result.estimatedProcessingTime,
apiFormat: result.apiFormat
}
}));
} else {
throw new Error('No prompt returned from server');
@@ -460,6 +506,27 @@ export const useAiValidation = <T extends string>(
throw new Error(result.error || 'AI validation failed');
}
// Store the prompt sources if they exist
if (result.promptSources) {
setCurrentPrompt(prev => {
// Create debugData if it doesn't exist
const prevDebugData = prev.debugData || {
taxonomyStats: null,
basePrompt: '',
sampleFullPrompt: '',
promptLength: 0
};
return {
...prev,
debugData: {
...prevDebugData,
promptSources: result.promptSources
}
};
});
}
// Update progress with actual processing time if available
if (result.performanceMetrics) {
console.log('Performance metrics:', result.performanceMetrics);
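
The hunks above extend `CurrentPrompt` with an optional `debugData` payload and populate it from the `/ai-validation/debug` response. As a rough illustration of how a debug dialog might consume that shape, here is a hedged sketch; `summarizePromptDebug` is a hypothetical helper and the import path is assumed.

```ts
// Editor's sketch only — field names mirror the debugData shape declared above.
import type { CurrentPrompt } from "../hooks/useAiValidation"; // assumed path

function summarizePromptDebug(current: CurrentPrompt): string[] {
  const lines: string[] = [];
  const debug = current.debugData;
  if (!debug) return lines;

  lines.push(`Prompt length: ${debug.promptLength} characters`);
  if (debug.taxonomyStats) {
    lines.push(`Taxonomy categories: ${debug.taxonomyStats.categories}`);
  }
  if (debug.estimatedProcessingTime?.seconds != null) {
    lines.push(`Estimated processing time: ${debug.estimatedProcessingTime.seconds}s`);
  }
  for (const p of debug.promptSources?.companyPrompts ?? []) {
    lines.push(`Company prompt #${p.id} (${p.companyName})`);
  }
  return lines;
}
```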

View File

@@ -133,8 +133,9 @@ export function PerformanceMetrics() {
}
};
function getCategoryName(_cat_id: number): import("react").ReactNode {
throw new Error('Function not implemented.');
function getCategoryName(cat_id: number): import("react").ReactNode {
// Simple implementation that just returns the ID as a string
return `Category ${cat_id}`;
}
return (
@@ -217,15 +218,19 @@ export function PerformanceMetrics() {
</TableRow>
</TableHeader>
<TableBody>
{abcConfigs.map((config) => (
{abcConfigs && abcConfigs.length > 0 ? abcConfigs.map((config) => (
<TableRow key={`${config.cat_id}-${config.vendor}`}>
<TableCell>{config.cat_id ? getCategoryName(config.cat_id) : 'Global'}</TableCell>
<TableCell>{config.vendor || 'All Vendors'}</TableCell>
<TableCell className="text-right">{config.a_threshold}%</TableCell>
<TableCell className="text-right">{config.b_threshold}%</TableCell>
<TableCell className="text-right">{config.classification_period_days}</TableCell>
<TableCell className="text-right">{config.a_threshold !== undefined ? `${config.a_threshold}%` : '0%'}</TableCell>
<TableCell className="text-right">{config.b_threshold !== undefined ? `${config.b_threshold}%` : '0%'}</TableCell>
<TableCell className="text-right">{config.classification_period_days || 0}</TableCell>
</TableRow>
))}
)) : (
<TableRow>
<TableCell colSpan={5} className="text-center py-4">No ABC configurations available</TableCell>
</TableRow>
)}
</TableBody>
</Table>
<Button onClick={handleUpdateABCConfig}>
@@ -253,14 +258,26 @@ export function PerformanceMetrics() {
</TableRow>
</TableHeader>
<TableBody>
{turnoverConfigs.map((config) => (
{turnoverConfigs && turnoverConfigs.length > 0 ? turnoverConfigs.map((config) => (
<TableRow key={`${config.cat_id}-${config.vendor}`}>
<TableCell>{config.cat_id ? getCategoryName(config.cat_id) : 'Global'}</TableCell>
<TableCell>{config.vendor || 'All Vendors'}</TableCell>
<TableCell className="text-right">{config.calculation_period_days}</TableCell>
<TableCell className="text-right">{config.target_rate.toFixed(2)}</TableCell>
<TableCell className="text-right">
{config.target_rate !== undefined && config.target_rate !== null
? (typeof config.target_rate === 'number'
? config.target_rate.toFixed(2)
: (isNaN(parseFloat(String(config.target_rate)))
? '0.00'
: parseFloat(String(config.target_rate)).toFixed(2)))
: '0.00'}
</TableCell>
</TableRow>
))}
)) : (
<TableRow>
<TableCell colSpan={4} className="text-center py-4">No turnover configurations available</TableCell>
</TableRow>
)}
</TableBody>
</Table>
<Button onClick={handleUpdateTurnoverConfig}>
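
The cells above guard against undefined and non-numeric `a_threshold`, `b_threshold`, and `target_rate` values inline, which makes the JSX fairly dense. One possible refactor, sketched here with a hypothetical `formatRate` helper, would centralize that defensive formatting; this is an editor's illustration, not part of the diff.

```ts
// Editor's sketch only. Coerces numbers or numeric strings, falling back to "0.00".
function formatRate(value: unknown, digits = 2): string {
  const n = typeof value === "number" ? value : parseFloat(String(value));
  return Number.isNaN(n) ? (0).toFixed(digits) : n.toFixed(digits);
}

// Usage in a cell: {formatRate(config.target_rate)} instead of the nested ternaries.
```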

View File

@@ -0,0 +1,584 @@
import { useState, useMemo } from "react";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { Button } from "@/components/ui/button";
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table";
import { Input } from "@/components/ui/input";
import { Textarea } from "@/components/ui/textarea";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { Label } from "@/components/ui/label";
import { ArrowUpDown, Pencil, Trash2, PlusCircle } from "lucide-react";
import config from "@/config";
import {
useReactTable,
getCoreRowModel,
getSortedRowModel,
SortingState,
flexRender,
type ColumnDef,
} from "@tanstack/react-table";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import {
AlertDialog,
AlertDialogAction,
AlertDialogCancel,
AlertDialogContent,
AlertDialogDescription,
AlertDialogFooter,
AlertDialogHeader,
AlertDialogTitle,
} from "@/components/ui/alert-dialog";
import { toast } from "sonner";
interface FieldOption {
label: string;
value: string;
}
interface PromptFormData {
id?: number;
prompt_text: string;
prompt_type: 'general' | 'company_specific' | 'system';
company: string | null;
}
interface AiPrompt {
id: number;
prompt_text: string;
prompt_type: 'general' | 'company_specific' | 'system';
company: string | null;
created_at: string;
updated_at: string;
}
interface FieldOptions {
companies: FieldOption[];
}
export function PromptManagement() {
const [isFormOpen, setIsFormOpen] = useState(false);
const [isDeleteOpen, setIsDeleteOpen] = useState(false);
const [promptToDelete, setPromptToDelete] = useState<AiPrompt | null>(null);
const [editingPrompt, setEditingPrompt] = useState<AiPrompt | null>(null);
const [sorting, setSorting] = useState<SortingState>([
{ id: "prompt_type", desc: true },
{ id: "company", desc: false }
]);
const [searchQuery, setSearchQuery] = useState("");
const [formData, setFormData] = useState<PromptFormData>({
prompt_text: "",
prompt_type: "general",
company: null,
});
const queryClient = useQueryClient();
const { data: prompts, isLoading } = useQuery<AiPrompt[]>({
queryKey: ["ai-prompts"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/ai-prompts`);
if (!response.ok) {
throw new Error("Failed to fetch AI prompts");
}
return response.json();
},
});
const { data: fieldOptions } = useQuery<FieldOptions>({
queryKey: ["fieldOptions"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/import/field-options`);
if (!response.ok) {
throw new Error("Failed to fetch field options");
}
return response.json();
},
});
// Check if general and system prompts already exist
const generalPromptExists = useMemo(() => {
return prompts?.some(prompt => prompt.prompt_type === 'general');
}, [prompts]);
const systemPromptExists = useMemo(() => {
return prompts?.some(prompt => prompt.prompt_type === 'system');
}, [prompts]);
const createMutation = useMutation({
mutationFn: async (data: PromptFormData) => {
const response = await fetch(`${config.apiUrl}/ai-prompts`, {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify(data),
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || error.error || "Failed to create prompt");
}
return response.json();
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
toast.success("Prompt created successfully");
resetForm();
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to create prompt");
},
});
const updateMutation = useMutation({
mutationFn: async (data: PromptFormData) => {
if (!data.id) throw new Error("Prompt ID is required for update");
const response = await fetch(`${config.apiUrl}/ai-prompts/${data.id}`, {
method: "PUT",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify(data),
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || error.error || "Failed to update prompt");
}
return response.json();
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
toast.success("Prompt updated successfully");
resetForm();
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to update prompt");
},
});
const deleteMutation = useMutation({
mutationFn: async (id: number) => {
const response = await fetch(`${config.apiUrl}/ai-prompts/${id}`, {
method: "DELETE",
});
if (!response.ok) {
throw new Error("Failed to delete prompt");
}
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
toast.success("Prompt deleted successfully");
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to delete prompt");
},
});
const handleEdit = (prompt: AiPrompt) => {
setEditingPrompt(prompt);
setFormData({
id: prompt.id,
prompt_text: prompt.prompt_text,
prompt_type: prompt.prompt_type,
company: prompt.company,
});
setIsFormOpen(true);
};
const handleDeleteClick = (prompt: AiPrompt) => {
setPromptToDelete(prompt);
setIsDeleteOpen(true);
};
const handleDeleteConfirm = () => {
if (promptToDelete) {
deleteMutation.mutate(promptToDelete.id);
setIsDeleteOpen(false);
setPromptToDelete(null);
}
};
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
// If prompt_type is general or system, ensure company is null
const submitData = {
...formData,
company: formData.prompt_type === 'company_specific' ? formData.company : null,
};
if (editingPrompt) {
updateMutation.mutate(submitData);
} else {
createMutation.mutate(submitData);
}
};
const resetForm = () => {
setFormData({
prompt_text: "",
prompt_type: "general",
company: null,
});
setEditingPrompt(null);
setIsFormOpen(false);
};
const handleCreateClick = () => {
resetForm();
// If general prompt and system prompt exist, default to company-specific
if (generalPromptExists && systemPromptExists) {
setFormData(prev => ({
...prev,
prompt_type: 'company_specific'
}));
} else if (generalPromptExists && !systemPromptExists) {
// If general exists but system doesn't, suggest system prompt
setFormData(prev => ({
...prev,
prompt_type: 'system'
}));
} else if (!generalPromptExists) {
// If no general prompt, suggest that first
setFormData(prev => ({
...prev,
prompt_type: 'general'
}));
}
setIsFormOpen(true);
};
const columns = useMemo<ColumnDef<AiPrompt>[]>(() => [
{
accessorKey: "prompt_type",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Type
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => {
const type = row.getValue("prompt_type") as string;
if (type === 'general') return 'General';
if (type === 'system') return 'System';
return 'Company Specific';
},
},
{
accessorFn: (row) => row.prompt_text.length,
id: "length",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Length
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ getValue }) => {
const length = getValue() as number;
return <span>{length.toLocaleString()}</span>;
},
},
{
accessorKey: "company",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Company
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => {
const companyId = row.getValue("company");
if (!companyId) return 'N/A';
return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
},
},
{
accessorKey: "updated_at",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Last Updated
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => new Date(row.getValue("updated_at")).toLocaleDateString(),
},
{
id: "actions",
cell: ({ row }) => (
<div className="flex gap-2 justify-end pr-4">
<Button
variant="ghost"
onClick={() => handleEdit(row.original)}
>
<Pencil className="h-4 w-4" />
Edit
</Button>
<Button
variant="ghost"
className="text-destructive hover:text-destructive"
onClick={() => handleDeleteClick(row.original)}
>
<Trash2 className="h-4 w-4" />
Delete
</Button>
</div>
),
},
], [fieldOptions]);
const filteredData = useMemo(() => {
if (!prompts) return [];
return prompts.filter((prompt) => {
const searchString = searchQuery.toLowerCase();
return (
prompt.prompt_type.toLowerCase().includes(searchString) ||
(prompt.company && prompt.company.toLowerCase().includes(searchString))
);
});
}, [prompts, searchQuery]);
const table = useReactTable({
data: filteredData,
columns,
state: {
sorting,
},
onSortingChange: setSorting,
getSortedRowModel: getSortedRowModel(),
getCoreRowModel: getCoreRowModel(),
});
return (
<div className="space-y-6">
<div className="flex items-center justify-between">
<h2 className="text-2xl font-bold">AI Validation Prompts</h2>
<Button onClick={handleCreateClick}>
<PlusCircle className="mr-2 h-4 w-4" />
Create New Prompt
</Button>
</div>
<div className="flex items-center gap-4">
<Input
placeholder="Search prompts..."
value={searchQuery}
onChange={(e) => setSearchQuery(e.target.value)}
className="max-w-sm"
/>
</div>
{isLoading ? (
<div>Loading prompts...</div>
) : (
<div className="border rounded-lg">
<Table>
<TableHeader className="bg-muted">
{table.getHeaderGroups().map((headerGroup) => (
<TableRow key={headerGroup.id}>
{headerGroup.headers.map((header) => (
<TableHead key={header.id}>
{header.isPlaceholder
? null
: flexRender(
header.column.columnDef.header,
header.getContext()
)}
</TableHead>
))}
</TableRow>
))}
</TableHeader>
<TableBody>
{table.getRowModel().rows?.length ? (
table.getRowModel().rows.map((row) => (
<TableRow key={row.id} className="hover:bg-gray-100">
{row.getVisibleCells().map((cell) => (
<TableCell key={cell.id} className="pl-6">
{flexRender(cell.column.columnDef.cell, cell.getContext())}
</TableCell>
))}
</TableRow>
))
) : (
<TableRow>
<TableCell colSpan={columns.length} className="text-center">
No prompts found
</TableCell>
</TableRow>
)}
</TableBody>
</Table>
</div>
)}
{/* Prompt Form Dialog */}
<Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
<DialogContent className="max-w-3xl">
<DialogHeader>
<DialogTitle>{editingPrompt ? "Edit Prompt" : "Create New Prompt"}</DialogTitle>
<DialogDescription>
{editingPrompt
? "Update this AI validation prompt."
: "Create a new AI validation prompt that will be used during product validation."}
</DialogDescription>
</DialogHeader>
<form onSubmit={handleSubmit}>
<div className="grid gap-4 py-4">
<div className="grid gap-2">
<Label htmlFor="prompt_type">Prompt Type</Label>
<Select
value={formData.prompt_type}
onValueChange={(value: 'general' | 'company_specific' | 'system') =>
setFormData({ ...formData, prompt_type: value })
}
disabled={(generalPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id) ||
(systemPromptExists && formData.prompt_type !== 'system' && !editingPrompt?.id)}
>
<SelectTrigger>
<SelectValue placeholder="Select prompt type" />
</SelectTrigger>
<SelectContent>
<SelectItem
value="general"
disabled={generalPromptExists && !editingPrompt?.prompt_type?.includes('general')}
>
General
</SelectItem>
<SelectItem
value="system"
disabled={systemPromptExists && !editingPrompt?.prompt_type?.includes('system')}
>
System
</SelectItem>
<SelectItem value="company_specific">Company Specific</SelectItem>
</SelectContent>
</Select>
{generalPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id && systemPromptExists && formData.prompt_type !== 'system' && (
<p className="text-xs text-muted-foreground">
General and system prompts already exist. You can only create company-specific prompts.
</p>
)}
{generalPromptExists && !systemPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id && (
<p className="text-xs text-muted-foreground">
A general prompt already exists. You can create a system prompt or company-specific prompts.
</p>
)}
{systemPromptExists && !generalPromptExists && formData.prompt_type !== 'system' && !editingPrompt?.id && (
<p className="text-xs text-muted-foreground">
A system prompt already exists. You can create a general prompt or company-specific prompts.
</p>
)}
</div>
{formData.prompt_type === 'company_specific' && (
<div className="grid gap-2">
<Label htmlFor="company">Company</Label>
<Select
value={formData.company || ''}
onValueChange={(value) => setFormData({ ...formData, company: value })}
required={formData.prompt_type === 'company_specific'}
>
<SelectTrigger>
<SelectValue placeholder="Select company" />
</SelectTrigger>
<SelectContent>
{fieldOptions?.companies.map((company) => (
<SelectItem key={company.value} value={company.value}>
{company.label}
</SelectItem>
))}
</SelectContent>
</Select>
</div>
)}
<div className="grid gap-2">
<Label htmlFor="prompt_text">Prompt Text</Label>
<Textarea
id="prompt_text"
value={formData.prompt_text}
onChange={(e) => setFormData({ ...formData, prompt_text: e.target.value })}
placeholder={`Enter your ${formData.prompt_type === 'system' ? 'system instructions' : 'validation prompt'} text...`}
className="h-80 font-mono text-sm"
required
/>
{formData.prompt_type === 'system' && (
<p className="text-xs text-muted-foreground mt-1">
System prompts provide the initial instructions to the AI. This sets the tone and approach for all validations.
</p>
)}
</div>
</div>
<DialogFooter>
<Button type="button" variant="outline" onClick={() => {
resetForm();
setIsFormOpen(false);
}}>
Cancel
</Button>
<Button type="submit">
{editingPrompt ? "Update" : "Create"} Prompt
</Button>
</DialogFooter>
</form>
</DialogContent>
</Dialog>
{/* Delete Confirmation Dialog */}
<AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
<AlertDialogContent>
<AlertDialogHeader>
<AlertDialogTitle>Delete Prompt</AlertDialogTitle>
<AlertDialogDescription>
Are you sure you want to delete this prompt? This action cannot be undone.
</AlertDialogDescription>
</AlertDialogHeader>
<AlertDialogFooter>
<AlertDialogCancel onClick={() => {
setIsDeleteOpen(false);
setPromptToDelete(null);
}}>
Cancel
</AlertDialogCancel>
<AlertDialogAction onClick={handleDeleteConfirm}>
Delete
</AlertDialogAction>
</AlertDialogFooter>
</AlertDialogContent>
</AlertDialog>
</div>
);
}

View File

@@ -0,0 +1,773 @@
import { useState, useMemo, useCallback, useRef, useEffect } from "react";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { Button } from "@/components/ui/button";
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { Checkbox } from "@/components/ui/checkbox";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { ArrowUpDown, Pencil, Trash2, PlusCircle, Image, Eye } from "lucide-react";
import config from "@/config";
import {
useReactTable,
getCoreRowModel,
getSortedRowModel,
SortingState,
flexRender,
type ColumnDef,
} from "@tanstack/react-table";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
DialogClose
} from "@/components/ui/dialog";
import {
AlertDialog,
AlertDialogAction,
AlertDialogCancel,
AlertDialogContent,
AlertDialogDescription,
AlertDialogFooter,
AlertDialogHeader,
AlertDialogTitle,
} from "@/components/ui/alert-dialog";
import { toast } from "sonner";
import { useDropzone } from "react-dropzone";
import { cn } from "@/lib/utils";
interface FieldOption {
label: string;
value: string;
}
interface ImageFormData {
id?: number;
name: string;
is_global: boolean;
company: string | null;
file?: File;
}
interface ReusableImage {
id: number;
name: string;
filename: string;
file_path: string;
image_url: string;
is_global: boolean;
company: string | null;
mime_type: string;
file_size: number;
created_at: string;
updated_at: string;
}
interface FieldOptions {
companies: FieldOption[];
}
const ImageForm = ({
editingImage,
formData,
setFormData,
onSubmit,
onCancel,
fieldOptions,
getRootProps,
getInputProps,
isDragActive
}: {
editingImage: ReusableImage | null;
formData: ImageFormData;
// Typed as a state dispatcher so the handlers below can pass functional updates.
setFormData: React.Dispatch<React.SetStateAction<ImageFormData>>;
onSubmit: (e: React.FormEvent) => void;
onCancel: () => void;
fieldOptions: FieldOptions | undefined;
getRootProps: any;
getInputProps: any;
isDragActive: boolean;
}) => {
const handleNameChange = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
setFormData(prev => ({ ...prev, name: e.target.value }));
}, [setFormData]);
const handleGlobalChange = useCallback((checked: boolean) => {
setFormData(prev => ({
...prev,
is_global: checked,
company: checked ? null : prev.company
}));
}, [setFormData]);
const handleCompanyChange = useCallback((value: string) => {
setFormData(prev => ({ ...prev, company: value }));
}, [setFormData]);
return (
<form onSubmit={onSubmit}>
<div className="grid gap-4 py-4">
<div className="grid gap-2">
<Label htmlFor="image_name">Image Name</Label>
<Input
id="image_name"
name="image_name"
value={formData.name}
onChange={handleNameChange}
placeholder="Enter image name"
required
/>
</div>
{!editingImage && (
<div className="grid gap-2">
<Label htmlFor="image">Upload Image</Label>
<div
{...getRootProps()}
className={cn(
"border-2 border-dashed border-secondary-foreground/30 bg-muted/90 rounded-md w-full py-6 flex flex-col items-center justify-center cursor-pointer hover:bg-muted/70 transition-colors",
isDragActive && "border-primary bg-muted"
)}
>
<input {...getInputProps()} />
<div className="flex flex-col items-center justify-center py-2">
{formData.file ? (
<>
<div className="mb-4">
<ImagePreview file={formData.file} />
</div>
<div className="flex items-center gap-2 mb-2">
<Image className="h-4 w-4 text-primary" />
<span className="text-sm">{formData.file.name}</span>
</div>
<p className="text-xs text-muted-foreground">Click or drag to replace</p>
</>
) : isDragActive ? (
<>
<Image className="h-8 w-8 mb-2 text-primary" />
<p className="text-base text-muted-foreground">Drop image here</p>
</>
) : (
<>
<Image className="h-8 w-8 mb-2 text-muted-foreground" />
<p className="text-base text-muted-foreground">Click or drag to upload</p>
</>
)}
</div>
</div>
</div>
)}
<div className="flex items-center space-x-2">
<Checkbox
id="is_global"
checked={formData.is_global}
onCheckedChange={handleGlobalChange}
/>
<Label htmlFor="is_global">Available for all companies</Label>
</div>
{!formData.is_global && (
<div className="grid gap-2">
<Label htmlFor="company">Company</Label>
<Select
value={formData.company || ''}
onValueChange={handleCompanyChange}
required={!formData.is_global}
>
<SelectTrigger>
<SelectValue placeholder="Select company" />
</SelectTrigger>
<SelectContent>
{fieldOptions?.companies.map((company) => (
<SelectItem key={company.value} value={company.value}>
{company.label}
</SelectItem>
))}
</SelectContent>
</Select>
</div>
)}
</div>
<DialogFooter>
<Button type="button" variant="outline" onClick={onCancel}>
Cancel
</Button>
<Button type="submit">
{editingImage ? "Update" : "Upload"} Image
</Button>
</DialogFooter>
</form>
);
};
export function ReusableImageManagement() {
const [isFormOpen, setIsFormOpen] = useState(false);
const [isDeleteOpen, setIsDeleteOpen] = useState(false);
const [isPreviewOpen, setIsPreviewOpen] = useState(false);
const [imageToDelete, setImageToDelete] = useState<ReusableImage | null>(null);
const [previewImage, setPreviewImage] = useState<ReusableImage | null>(null);
const [editingImage, setEditingImage] = useState<ReusableImage | null>(null);
const [sorting, setSorting] = useState<SortingState>([
{ id: "created_at", desc: true }
]);
const [searchQuery, setSearchQuery] = useState("");
const [formData, setFormData] = useState<ImageFormData>({
name: "",
is_global: false,
company: null,
file: undefined
});
const queryClient = useQueryClient();
const { data: images, isLoading } = useQuery<ReusableImage[]>({
queryKey: ["reusable-images"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/reusable-images`);
if (!response.ok) {
throw new Error("Failed to fetch reusable images");
}
return response.json();
},
});
const { data: fieldOptions } = useQuery<FieldOptions>({
queryKey: ["fieldOptions"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/import/field-options`);
if (!response.ok) {
throw new Error("Failed to fetch field options");
}
return response.json();
},
});
const createMutation = useMutation({
mutationFn: async (data: ImageFormData) => {
// Create FormData for file upload
const formData = new FormData();
formData.append('name', data.name);
formData.append('is_global', String(data.is_global));
if (!data.is_global && data.company) {
formData.append('company', data.company);
}
if (data.file) {
formData.append('image', data.file);
} else {
throw new Error("Image file is required");
}
const response = await fetch(`${config.apiUrl}/reusable-images/upload`, {
method: "POST",
body: formData,
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || error.error || "Failed to upload image");
}
return response.json();
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
toast.success("Image uploaded successfully");
resetForm();
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to upload image");
},
});
const updateMutation = useMutation({
mutationFn: async (data: ImageFormData) => {
if (!data.id) throw new Error("Image ID is required for update");
const response = await fetch(`${config.apiUrl}/reusable-images/${data.id}`, {
method: "PUT",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
name: data.name,
is_global: data.is_global,
company: data.is_global ? null : data.company
}),
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || error.error || "Failed to update image");
}
return response.json();
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
toast.success("Image updated successfully");
resetForm();
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to update image");
},
});
const deleteMutation = useMutation({
mutationFn: async (id: number) => {
const response = await fetch(`${config.apiUrl}/reusable-images/${id}`, {
method: "DELETE",
});
if (!response.ok) {
throw new Error("Failed to delete image");
}
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
toast.success("Image deleted successfully");
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to delete image");
},
});
const handleEdit = (image: ReusableImage) => {
setEditingImage(image);
setFormData({
id: image.id,
name: image.name,
is_global: image.is_global,
company: image.company,
});
setIsFormOpen(true);
};
const handleDeleteClick = (image: ReusableImage) => {
setImageToDelete(image);
setIsDeleteOpen(true);
};
const handlePreview = (image: ReusableImage) => {
setPreviewImage(image);
setIsPreviewOpen(true);
};
const handleDeleteConfirm = () => {
if (imageToDelete) {
deleteMutation.mutate(imageToDelete.id);
setIsDeleteOpen(false);
setImageToDelete(null);
}
};
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
// If is_global is true, ensure company is null
const submitData = {
...formData,
company: formData.is_global ? null : formData.company,
};
if (editingImage) {
updateMutation.mutate(submitData);
} else {
if (!submitData.file) {
toast.error("Please select an image file");
return;
}
createMutation.mutate(submitData);
}
};
const resetForm = () => {
setFormData({
name: "",
is_global: false,
company: null,
file: undefined
});
setEditingImage(null);
setIsFormOpen(false);
};
const handleCreateClick = () => {
resetForm();
setIsFormOpen(true);
};
// Configure dropzone for image uploads
const onDrop = useCallback((acceptedFiles: File[]) => {
if (acceptedFiles.length > 0) {
const file = acceptedFiles[0]; // Take only the first file
setFormData(prev => ({
...prev,
file
}));
}
}, []);
const { getRootProps, getInputProps, isDragActive } = useDropzone({
accept: {
'image/*': ['.jpeg', '.jpg', '.png', '.gif', '.webp']
},
onDrop,
multiple: false // Only accept single files
});
const columns = useMemo<ColumnDef<ReusableImage>[]>(() => [
{
accessorKey: "name",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Name
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
},
{
accessorKey: "is_global",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Type
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => {
const isGlobal = row.getValue("is_global") as boolean;
return isGlobal ? "Global" : "Company Specific";
},
},
{
accessorKey: "company",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Company
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => {
const isGlobal = row.getValue("is_global") as boolean;
if (isGlobal) return 'N/A';
const companyId = row.getValue("company");
if (!companyId) return 'None';
return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
},
},
{
accessorKey: "file_size",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Size
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => {
const size = row.getValue("file_size") as number;
return `${(size / 1024).toFixed(1)} KB`;
},
},
{
accessorKey: "created_at",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Created
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => new Date(row.getValue("created_at")).toLocaleDateString(),
},
{
accessorKey: "image_url",
header: "Thumbnail",
cell: ({ row }) => (
<div className="flex items-center justify-center">
<img
src={row.getValue("image_url") as string}
alt={row.getValue("name") as string}
className="w-10 h-10 object-contain border rounded"
/>
</div>
),
},
{
id: "actions",
cell: ({ row }) => (
<div className="flex gap-2 justify-end">
<Button
variant="ghost"
size="icon"
onClick={() => handlePreview(row.original)}
title="Preview Image"
>
<Eye className="h-4 w-4" />
</Button>
<Button
variant="ghost"
size="icon"
onClick={() => handleEdit(row.original)}
title="Edit Image"
>
<Pencil className="h-4 w-4" />
</Button>
<Button
variant="ghost"
size="icon"
className="text-destructive hover:text-destructive"
onClick={() => handleDeleteClick(row.original)}
title="Delete Image"
>
<Trash2 className="h-4 w-4" />
</Button>
</div>
),
},
], [fieldOptions]);
const filteredData = useMemo(() => {
if (!images) return [];
return images.filter((image) => {
const searchString = searchQuery.toLowerCase();
return (
image.name.toLowerCase().includes(searchString) ||
(image.is_global ? "global" : "company").includes(searchString) ||
(image.company && image.company.toLowerCase().includes(searchString))
);
});
}, [images, searchQuery]);
const table = useReactTable({
data: filteredData,
columns,
state: {
sorting,
},
onSortingChange: setSorting,
getSortedRowModel: getSortedRowModel(),
getCoreRowModel: getCoreRowModel(),
});
return (
<div className="space-y-6">
<div className="flex items-center justify-between">
<h2 className="text-2xl font-bold">Reusable Images</h2>
<Button onClick={handleCreateClick}>
<PlusCircle className="mr-2 h-4 w-4" />
Upload New Image
</Button>
</div>
<div className="flex items-center gap-4">
<Input
placeholder="Search images..."
value={searchQuery}
onChange={(e) => setSearchQuery(e.target.value)}
className="max-w-sm"
/>
</div>
{isLoading ? (
<div>Loading images...</div>
) : (
<div className="border rounded-lg">
<Table>
<TableHeader className="bg-muted">
{table.getHeaderGroups().map((headerGroup) => (
<TableRow key={headerGroup.id}>
{headerGroup.headers.map((header) => (
<TableHead key={header.id}>
{header.isPlaceholder
? null
: flexRender(
header.column.columnDef.header,
header.getContext()
)}
</TableHead>
))}
</TableRow>
))}
</TableHeader>
<TableBody>
{table.getRowModel().rows?.length ? (
table.getRowModel().rows.map((row) => (
<TableRow key={row.id} className="hover:bg-gray-100">
{row.getVisibleCells().map((cell) => (
<TableCell key={cell.id} className="pl-6">
{flexRender(cell.column.columnDef.cell, cell.getContext())}
</TableCell>
))}
</TableRow>
))
) : (
<TableRow>
<TableCell colSpan={columns.length} className="text-center">
No images found
</TableCell>
</TableRow>
)}
</TableBody>
</Table>
</div>
)}
{/* Image Form Dialog */}
<Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
<DialogContent className="max-w-md">
<DialogHeader>
<DialogTitle>{editingImage ? "Edit Image" : "Upload New Image"}</DialogTitle>
<DialogDescription>
{editingImage
? "Update this reusable image's details."
: "Upload a new reusable image that can be used across products."}
</DialogDescription>
</DialogHeader>
<ImageForm
editingImage={editingImage}
formData={formData}
setFormData={setFormData}
onSubmit={handleSubmit}
onCancel={() => {
resetForm();
setIsFormOpen(false);
}}
fieldOptions={fieldOptions}
getRootProps={getRootProps}
getInputProps={getInputProps}
isDragActive={isDragActive}
/>
</DialogContent>
</Dialog>
{/* Delete Confirmation Dialog */}
<AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
<AlertDialogContent>
<AlertDialogHeader>
<AlertDialogTitle>Delete Image</AlertDialogTitle>
<AlertDialogDescription>
Are you sure you want to delete this image? This action cannot be undone.
</AlertDialogDescription>
</AlertDialogHeader>
<AlertDialogFooter>
<AlertDialogCancel onClick={() => {
setIsDeleteOpen(false);
setImageToDelete(null);
}}>
Cancel
</AlertDialogCancel>
<AlertDialogAction onClick={handleDeleteConfirm}>
Delete
</AlertDialogAction>
</AlertDialogFooter>
</AlertDialogContent>
</AlertDialog>
{/* Preview Dialog */}
<Dialog open={isPreviewOpen} onOpenChange={setIsPreviewOpen}>
<DialogContent className="max-w-3xl">
<DialogHeader>
<DialogTitle>{previewImage?.name}</DialogTitle>
<DialogDescription>
{previewImage?.is_global
? "Global image"
: `Company specific image for ${fieldOptions?.companies.find(c => c.value === previewImage?.company)?.label}`}
</DialogDescription>
</DialogHeader>
<div className="flex justify-center p-4">
{previewImage && (
<div className="bg-checkerboard rounded-md overflow-hidden">
<img
src={previewImage.image_url}
alt={previewImage.name}
className="max-h-[500px] max-w-full object-contain"
/>
</div>
)}
</div>
<div className="grid grid-cols-2 gap-4 text-sm">
<div>
<span className="font-medium">Filename:</span> {previewImage?.filename}
</div>
<div>
<span className="font-medium">Size:</span> {previewImage && `${(previewImage.file_size / 1024).toFixed(1)} KB`}
</div>
<div>
<span className="font-medium">Type:</span> {previewImage?.mime_type}
</div>
<div>
<span className="font-medium">Uploaded:</span> {previewImage && new Date(previewImage.created_at).toLocaleString()}
</div>
</div>
<DialogFooter>
<DialogClose asChild>
<Button>Close</Button>
</DialogClose>
</DialogFooter>
</DialogContent>
</Dialog>
<style>{`
.bg-checkerboard {
background-image: linear-gradient(45deg, #f0f0f0 25%, transparent 25%),
linear-gradient(-45deg, #f0f0f0 25%, transparent 25%),
linear-gradient(45deg, transparent 75%, #f0f0f0 75%),
linear-gradient(-45deg, transparent 75%, #f0f0f0 75%);
background-size: 20px 20px;
background-position: 0 0, 0 10px, 10px -10px, -10px 0px;
}
`}</style>
</div>
);
}
const ImagePreview = ({ file }: { file: File }) => {
const [previewUrl, setPreviewUrl] = useState<string>('');
useEffect(() => {
const url = URL.createObjectURL(file);
setPreviewUrl(url);
return () => {
URL.revokeObjectURL(url);
};
}, [file]);
return (
<img
src={previewUrl}
alt="Preview"
className="max-h-32 max-w-full object-contain rounded-md"
/>
);
};

View File

@@ -1,5 +1,5 @@
import { useState, useEffect, useContext } from "react";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Alert, AlertDescription, AlertTitle } from "@/components/ui/alert";
import { Button } from "@/components/ui/button";
import { UserList } from "./UserList";
@@ -32,7 +32,7 @@ interface PermissionCategory {
}
export function UserManagement() {
const { token, fetchCurrentUser, user } = useContext(AuthContext);
const { token, fetchCurrentUser } = useContext(AuthContext);
const [users, setUsers] = useState<User[]>([]);
const [selectedUser, setSelectedUser] = useState<User | null>(null);
const [isAddingUser, setIsAddingUser] = useState(false);
@@ -199,7 +199,7 @@ export function UserManagement() {
// Check if permissions are objects (from the form) and convert to IDs for the API
if (userData.permissions.length > 0 && typeof userData.permissions[0] === 'object') {
// The backend expects permission IDs, not just the code strings
formattedUserData.permissions = userData.permissions.map(p => p.id);
formattedUserData.permissions = userData.permissions.map((p: { id: any; }) => p.id);
}
}
@@ -334,9 +334,6 @@ export function UserManagement() {
<CardHeader className="flex flex-row items-center justify-between">
<div>
<CardTitle>User Management</CardTitle>
<CardDescription>
Manage users and their permissions
</CardDescription>
</div>
<Button onClick={handleAddUser}>
Add User

View File

@@ -1,200 +0,0 @@
import { useEffect, useState } from "react"
import { Button } from "@/components/ui/button"
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"
import { ScrollArea } from "@/components/ui/scroll-area"
import { Code } from "@/components/ui/code"
import { useToast } from "@/hooks/use-toast"
import { Loader2 } from "lucide-react"
import config from "@/config"
interface TaxonomyStats {
categories: number
themes: number
colors: number
taxCodes: number
sizeCategories: number
suppliers: number
companies: number
artists: number
}
interface DebugData {
taxonomyStats: TaxonomyStats | null
basePrompt: string
sampleFullPrompt: string
promptLength: number
estimatedProcessingTime?: {
seconds: number | null
sampleCount: number
}
}
export function AiValidationDebug() {
const [isLoading, setIsLoading] = useState(false)
const [debugData, setDebugData] = useState<DebugData | null>(null)
const { toast } = useToast()
const fetchDebugData = async () => {
setIsLoading(true)
try {
// Use a sample product to avoid loading full taxonomy
const sampleProduct = {
title: "Sample Product",
description: "A sample product for testing",
SKU: "SAMPLE-001",
price: "9.99",
cost_each: "5.00",
qty_per_unit: "1",
case_qty: "12"
}
const response = await fetch(`${config.apiUrl}/ai-validation/debug`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({ products: [sampleProduct] })
})
if (!response.ok) {
throw new Error('Failed to fetch debug data')
}
const data = await response.json()
setDebugData(data)
} catch (error) {
console.error('Error fetching debug data:', error)
toast({
variant: "destructive",
title: "Error",
description: error instanceof Error ? error.message : "Failed to fetch debug data"
})
} finally {
setIsLoading(false)
}
}
useEffect(() => {
fetchDebugData()
}, [])
return (
<div className="container mx-auto py-6 space-y-6">
<div className="flex items-center justify-between">
<h1 className="text-3xl font-bold tracking-tight">AI Validation Debug</h1>
<div className="space-x-4">
<Button
variant="outline"
onClick={fetchDebugData}
disabled={isLoading}
>
{isLoading && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
Refresh Data
</Button>
</div>
</div>
{debugData && (
<div className="grid grid-cols-1 md:grid-cols-2 gap-6">
<Card>
<CardHeader>
<CardTitle>Taxonomy Stats</CardTitle>
</CardHeader>
<CardContent>
{debugData.taxonomyStats ? (
<div className="space-y-2">
<div>Categories: {debugData.taxonomyStats.categories}</div>
<div>Themes: {debugData.taxonomyStats.themes}</div>
<div>Colors: {debugData.taxonomyStats.colors}</div>
<div>Tax Codes: {debugData.taxonomyStats.taxCodes}</div>
<div>Size Categories: {debugData.taxonomyStats.sizeCategories}</div>
<div>Suppliers: {debugData.taxonomyStats.suppliers}</div>
<div>Companies: {debugData.taxonomyStats.companies}</div>
<div>Artists: {debugData.taxonomyStats.artists}</div>
</div>
) : (
<div>No taxonomy data available</div>
)}
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle>Prompt Length</CardTitle>
</CardHeader>
<CardContent>
<div className="space-y-4">
<div className="space-y-2">
<div>Characters: {debugData.promptLength}</div>
<div>Tokens (est.): ~{Math.round(debugData.promptLength / 4)}</div>
</div>
<div className="space-y-2">
<label htmlFor="costPerMillion" className="text-sm text-muted-foreground">
Cost per million tokens ($)
</label>
<input
id="costPerMillion"
type="number"
className="w-full px-3 py-2 border rounded-md"
defaultValue="2.50"
onChange={(e) => {
const costPerMillion = parseFloat(e.target.value)
if (!isNaN(costPerMillion)) {
const tokens = Math.round(debugData.promptLength / 4)
const cost = (tokens / 1_000_000) * costPerMillion * 100 // Convert to cents
const costElement = document.getElementById('tokenCost')
if (costElement) {
costElement.textContent = cost.toFixed(1)
}
}
}}
/>
<div className="text-sm">
Cost: <span id="tokenCost">{((Math.round(debugData.promptLength / 4) / 1_000_000) * 3 * 100).toFixed(1)}</span>¢
</div>
</div>
{debugData.estimatedProcessingTime && (
<div className="mt-4 p-3 bg-muted rounded-md">
<h3 className="text-sm font-medium mb-2">Processing Time Estimate</h3>
{debugData.estimatedProcessingTime.seconds ? (
<div className="space-y-1">
<div className="text-sm">
Estimated time: {formatTime(debugData.estimatedProcessingTime.seconds)}
</div>
<div className="text-xs text-muted-foreground">
Based on {debugData.estimatedProcessingTime.sampleCount} similar validation{debugData.estimatedProcessingTime.sampleCount !== 1 ? 's' : ''}
</div>
</div>
) : (
<div className="text-sm text-muted-foreground">No historical data available for this prompt size</div>
)}
</div>
)}
</div>
</CardContent>
</Card>
<Card className="col-span-full">
<CardHeader>
<CardTitle>Full Sample Prompt</CardTitle>
</CardHeader>
<CardContent>
<ScrollArea className="h-[500px] w-full rounded-md border p-4">
<Code className="whitespace-pre-wrap">{debugData.sampleFullPrompt}</Code>
</ScrollArea>
</CardContent>
</Card>
</div>
)}
</div>
)
}
// Helper function to format time in a human-readable way
function formatTime(seconds: number): string {
if (seconds < 60) {
return `${Math.round(seconds)} seconds`;
} else {
const minutes = Math.floor(seconds / 60);
const remainingSeconds = Math.round(seconds % 60);
return `${minutes}m ${remainingSeconds}s`;
}
}

View File

@@ -5,87 +5,266 @@ import { PerformanceMetrics } from "@/components/settings/PerformanceMetrics";
import { CalculationSettings } from "@/components/settings/CalculationSettings";
import { TemplateManagement } from "@/components/settings/TemplateManagement";
import { UserManagement } from "@/components/settings/UserManagement";
import { PromptManagement } from "@/components/settings/PromptManagement";
import { ReusableImageManagement } from "@/components/settings/ReusableImageManagement";
import { motion } from 'framer-motion';
import { Alert, AlertDescription } from "@/components/ui/alert";
import { Protected } from "@/components/auth/Protected";
import { useContext, useMemo } from "react";
import { AuthContext } from "@/contexts/AuthContext";
import { Separator } from "@/components/ui/separator";
// Define types for settings structure
interface SettingsTab {
id: string;
permission: string;
label: string;
}
interface SettingsGroup {
id: string;
label: string;
tabs: SettingsTab[];
}
// Define available settings tabs with their permission requirements and groups
const SETTINGS_GROUPS: SettingsGroup[] = [
{
id: "inventory",
label: "Inventory Settings",
tabs: [
{ id: "stock-management", permission: "settings:stock_management", label: "Stock Management" },
{ id: "performance-metrics", permission: "settings:performance_metrics", label: "Performance Metrics" },
{ id: "calculation-settings", permission: "settings:calculation_settings", label: "Calculation Settings" },
]
},
{
id: "content",
label: "Content Management",
tabs: [
{ id: "templates", permission: "settings:templates", label: "Template Management" },
{ id: "ai-prompts", permission: "settings:prompt_management", label: "AI Prompts" },
{ id: "reusable-images", permission: "settings:library_management", label: "Reusable Images" },
]
},
{
id: "system",
label: "System",
tabs: [
{ id: "user-management", permission: "settings:user_management", label: "User Management" },
{ id: "data-management", permission: "settings:data_management", label: "Data Management" },
]
}
];
// Flatten tabs for easier access
const SETTINGS_TABS = SETTINGS_GROUPS.flatMap(group => group.tabs);
export function Settings() {
const { user } = useContext(AuthContext);
// Determine the first tab the user has access to
const defaultTab = useMemo(() => {
// Admin users have access to all tabs
if (user?.is_admin) {
return SETTINGS_TABS[0].id;
}
// Find the first tab the user has permission to access
const firstAccessibleTab = SETTINGS_TABS.find(tab =>
user?.permissions?.includes(tab.permission)
);
// Return the ID of the first accessible tab, or first tab as fallback
return firstAccessibleTab?.id || SETTINGS_TABS[0].id;
}, [user]);
// Check if user has access to any tab
const hasAccessToAnyTab = useMemo(() => {
if (user?.is_admin) return true;
return SETTINGS_TABS.some(tab => user?.permissions?.includes(tab.permission));
}, [user]);
// If user doesn't have access to any tabs, show a helpful message
if (!hasAccessToAnyTab) {
return (
<motion.div layout className="container mx-auto py-6">
<div className="mb-6">
<h1 className="text-3xl font-bold">Settings</h1>
</div>
<Alert>
<AlertDescription>
You don't have permission to access any settings. Please contact an administrator for assistance.
</AlertDescription>
</Alert>
</motion.div>
);
}
// Function to check if the user has access to any tab in a group
const hasAccessToGroup = (group: SettingsGroup): boolean => {
if (user?.is_admin) return true;
return group.tabs.some(tab => user?.permissions?.includes(tab.permission));
};
return (
<motion.div layout className="container mx-auto py-6">
<div className="mb-6">
<h1 className="text-3xl font-bold">Settings</h1>
</div>
<Tabs defaultValue="data-management" className="space-y-4">
<TabsList>
<TabsTrigger value="data-management">Data Management</TabsTrigger>
<TabsTrigger value="stock-management">Stock Management</TabsTrigger>
<TabsTrigger value="performance-metrics">
Performance Metrics
</TabsTrigger>
<Protected permission="edit:system_settings">
<TabsTrigger value="calculation-settings">
Calculation Settings
</TabsTrigger>
</Protected>
<TabsTrigger value="templates">
Template Management
</TabsTrigger>
<Protected
permission="view:users"
fallback={null}
>
<TabsTrigger value="user-management">
User Management
</TabsTrigger>
</Protected>
</TabsList>
<Tabs defaultValue={defaultTab} orientation="vertical" className="flex flex-row min-h-[500px]">
<div className="w-60 border-r pr-8">
<TabsList className="flex flex-col h-auto justify-start items-stretch p-0 bg-transparent">
{SETTINGS_GROUPS.map((group) => (
hasAccessToGroup(group) && (
<div key={group.id} className="">
<h3 className="font-semibold text-sm px-3 py-2 bg-muted border text-foreground rounded-md mb-2">
{group.label}
</h3>
<div className="space-y-1 pl-1">
{group.tabs.map((tab) => (
<Protected key={tab.id} permission={tab.permission}>
<TabsTrigger
value={tab.id}
className="w-full justify-start px-3 py-2 text-sm font-normal text-muted-foreground data-[state=active]:font-medium data-[state=active]:text-accent-foreground data-[state=active]:shadow-none rounded-md data-[state=active]:underline"
>
{tab.label}
</TabsTrigger>
</Protected>
))}
</div>
{/* Only add separator if not the last group */}
{group.id !== SETTINGS_GROUPS[SETTINGS_GROUPS.length - 1].id && (
<Separator className="mt-4 mb-4 opacity-70" />
)}
</div>
)
))}
</TabsList>
</div>
<TabsContent value="data-management">
<DataManagement />
</TabsContent>
<div className="pl-8 w-full">
<TabsContent value="data-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:data_management"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Data Management.
</AlertDescription>
</Alert>
}
>
<DataManagement />
</Protected>
</TabsContent>
<TabsContent value="stock-management">
<StockManagement />
</TabsContent>
<TabsContent value="stock-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:stock_management"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Stock Management.
</AlertDescription>
</Alert>
}
>
<StockManagement />
</Protected>
</TabsContent>
<TabsContent value="performance-metrics">
<PerformanceMetrics />
</TabsContent>
<TabsContent value="performance-metrics" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:performance_metrics"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Performance Metrics.
</AlertDescription>
</Alert>
}
>
<PerformanceMetrics />
</Protected>
</TabsContent>
<TabsContent value="calculation-settings">
<Protected
permission="edit:system_settings"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Calculation Settings.
</AlertDescription>
</Alert>
}
>
<CalculationSettings />
</Protected>
</TabsContent>
<TabsContent value="calculation-settings" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:calculation_settings"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Calculation Settings.
</AlertDescription>
</Alert>
}
>
<CalculationSettings />
</Protected>
</TabsContent>
<TabsContent value="templates">
<TemplateManagement />
</TabsContent>
<TabsContent value="user-management">
<Protected
permission="view:users"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access User Management.
</AlertDescription>
</Alert>
}
>
<UserManagement />
</Protected>
</TabsContent>
<TabsContent value="templates" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:templates"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Template Management.
</AlertDescription>
</Alert>
}
>
<TemplateManagement />
</Protected>
</TabsContent>
<TabsContent value="ai-prompts" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:prompt_management"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access AI Prompts.
</AlertDescription>
</Alert>
}
>
<PromptManagement />
</Protected>
</TabsContent>
<TabsContent value="reusable-images" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:library_management"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Reusable Images.
</AlertDescription>
</Alert>
}
>
<ReusableImageManagement />
</Protected>
</TabsContent>
<TabsContent value="user-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:user_management"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access User Management.
</AlertDescription>
</Alert>
}
>
<UserManagement />
</Protected>
</TabsContent>
</div>
</Tabs>
</motion.div>
);

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -166,6 +166,12 @@ export default defineConfig(function (_a) {
});
},
},
"/uploads": {
target: "https://inventory.kent.pw",
changeOrigin: true,
secure: false,
rewrite: function (path) { return path; },
},
},
},
build: {
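
The compiled config above adds a `/uploads` proxy entry so uploaded images resolve against the production host during development. For reference, a hedged sketch of the equivalent TypeScript source is shown below, assuming the entry sits under the usual `server.proxy` block of a Vite config; the surrounding structure is illustrative, only the target and flags come from the diff.

```ts
// Editor's sketch only — server.proxy nesting assumed from standard Vite configuration.
import { defineConfig } from "vite";

export default defineConfig({
  server: {
    proxy: {
      "/uploads": {
        target: "https://inventory.kent.pw",
        changeOrigin: true,
        secure: false,
        rewrite: (path) => path,
      },
    },
  },
});
```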