Compare commits: 1496aa57b1...move-to-po (38 commits)
`docs/PERMISSIONS.md` (new file, 172 lines)
# Permission System Documentation

This document outlines the permission system implemented in the Inventory Manager application.

## Permission Structure

Permissions follow this naming convention:

- Page access: `access:{page_name}`
- Actions: `{action}:{resource}`

Examples:

- `access:products` - Can access the Products page
- `create:products` - Can create new products
- `edit:users` - Can edit user accounts

## Permission Components

### PermissionGuard

The core component that conditionally renders content based on permissions.

```tsx
<PermissionGuard
  permission="create:products"
  fallback={<p>No permission</p>}
>
  <button>Create Product</button>
</PermissionGuard>
```

Options:

- `permission`: Single permission code
- `anyPermissions`: Array of permissions (ANY match grants access)
- `allPermissions`: Array of permissions (ALL required)
- `adminOnly`: For admin-only sections
- `page`: Page name (checks `access:{page}` permission)
- `fallback`: Content to show if permission check fails

### PermissionProtectedRoute

Protects entire pages based on page access permissions.

```tsx
<Route path="/products" element={
  <PermissionProtectedRoute page="products">
    <Products />
  </PermissionProtectedRoute>
} />
```

### ProtectedSection

Protects sections within a page based on action permissions.

```tsx
<ProtectedSection page="products" action="create">
  <button>Add Product</button>
</ProtectedSection>
```

### PermissionButton

Button that automatically handles permissions.

```tsx
<PermissionButton
  page="products"
  action="create"
  onClick={handleCreateProduct}
>
  Add Product
</PermissionButton>
```

### SettingsSection

Specific component for settings with built-in permission checks.

```tsx
<SettingsSection
  title="System Settings"
  description="Configure global settings"
  permission="edit:system_settings"
>
  {/* Settings content */}
</SettingsSection>
```

## Permission Hooks

### usePermissions

Core hook for checking any permission.

```tsx
const { hasPermission, hasPageAccess, isAdmin } = usePermissions();

if (hasPermission('delete:products')) {
  // Can delete products
}
```

### usePagePermission

Specialized hook for page-level permissions.

```tsx
const { canView, canCreate, canEdit, canDelete } = usePagePermission('products');

if (canEdit()) {
  // Can edit products
}
```

## Database Schema

Permissions are stored in the database:

- `permissions` table: Stores all available permissions
- `user_permissions` junction table: Maps permissions to users

Admin users automatically have all permissions.
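A minimal sketch of these two tables might look like the following (column names beyond `id`, `code`, `user_id`, and `permission_id` are assumptions; the actual migration may differ):

```sql
-- Hypothetical sketch of the schema described above; not the actual migration.
CREATE TABLE permissions (
    id          SERIAL PRIMARY KEY,
    code        VARCHAR(100) NOT NULL UNIQUE,  -- e.g. 'create:products'
    description TEXT                           -- assumed column, for display
);

CREATE TABLE user_permissions (
    user_id       INTEGER NOT NULL REFERENCES users(id),
    permission_id INTEGER NOT NULL REFERENCES permissions(id),
    PRIMARY KEY (user_id, permission_id)
);
```

The composite primary key on the junction table prevents assigning the same permission to a user twice.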
## Common Permission Codes

| Code | Description |
|------|-------------|
| `access:dashboard` | Access to Dashboard page |
| `access:products` | Access to Products page |
| `create:products` | Create new products |
| `edit:products` | Edit existing products |
| `delete:products` | Delete products |
| `view:users` | View user accounts |
| `edit:users` | Edit user accounts |
| `manage:permissions` | Assign permissions to users |

## Implementation Examples

### Page Protection

In `App.tsx`:

```tsx
<Route path="/products" element={
  <PermissionProtectedRoute page="products">
    <Products />
  </PermissionProtectedRoute>
} />
```

### Component Level Protection

```tsx
const { canEdit } = usePagePermission('products');

function handleEdit() {
  if (!canEdit()) {
    toast.error("You don't have permission");
    return;
  }
  // Edit logic
}
```

### UI Element Protection

```tsx
<PermissionButton
  page="products"
  action="delete"
  onClick={handleDelete}
>
  Delete
</PermissionButton>
```
`docs/import-from-prod-data-mapping.md` (new file, 342 lines)
# MySQL to PostgreSQL Import Process Documentation

This document outlines the data import process from the production MySQL database to the local PostgreSQL database, focusing on column mappings, data transformations, and the overall import architecture.

## Table of Contents

1. [Overview](#overview)
2. [Import Architecture](#import-architecture)
3. [Column Mappings](#column-mappings)
   - [Categories](#categories)
   - [Products](#products)
   - [Product Categories (Relationship)](#product-categories-relationship)
   - [Orders](#orders)
   - [Purchase Orders](#purchase-orders)
   - [Metadata Tables](#metadata-tables)
4. [Special Calculations](#special-calculations)
5. [Implementation Notes](#implementation-notes)

## Overview

The import process extracts data from a MySQL 5.7 production database and imports it into a PostgreSQL database. It can operate in two modes:

- **Full Import**: Imports all data regardless of last sync time
- **Incremental Import**: Only imports data that has changed since the last import

The process handles four main data types:

- Categories (product categorization hierarchy)
- Products (inventory items)
- Orders (sales records)
- Purchase Orders (vendor orders)
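The mode selection amounts to choosing a change cutoff. A sketch of the idea (function and column names here are illustrative assumptions, not the script's actual identifiers):

```javascript
// Illustrative sketch: build a time filter for incremental mode.
// `lastSyncTimestamp` would come from the sync_status metadata table;
// `p.stamp` stands in for whatever modified-time column a query filters on.
function buildWhereClause(isIncremental, lastSyncTimestamp) {
  if (isIncremental && lastSyncTimestamp) {
    // Only rows modified since the last successful sync
    return `WHERE p.stamp > '${lastSyncTimestamp}'`;
  }
  // Full import: no time filter
  return '';
}
```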
## Import Architecture

The import process follows these steps:

1. **Establish Connection**: Creates an SSH tunnel to the production server and establishes database connections
2. **Setup Import History**: Creates a record of the current import operation
3. **Import Categories**: Processes product categories in hierarchical order
4. **Import Products**: Processes products with their attributes and category relationships
5. **Import Orders**: Processes customer orders with line items, taxes, and discounts
6. **Import Purchase Orders**: Processes vendor purchase orders with line items
7. **Record Results**: Updates the import history with results
8. **Close Connections**: Cleans up connections and resources

Each import step uses temporary tables for processing and wraps operations in transactions to ensure data consistency.
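The per-step transaction wrapping can be sketched with a generic helper (a sketch under the assumption of a pg-style `client.query` interface; the real scripts may structure this differently):

```javascript
// Sketch of the transaction wrapping described above. `client` is any
// object with an async query(sql) method (e.g. a node-postgres client).
// The helper name is an assumption, not the script's actual API.
async function withTransaction(client, work) {
  await client.query('BEGIN');
  try {
    const result = await work(client);
    await client.query('COMMIT');
    return result;
  } catch (err) {
    // Any failure mid-step leaves the target tables untouched
    await client.query('ROLLBACK');
    throw err;
  }
}
```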
## Column Mappings

### Categories

| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|----------------------------------|----------------------------------------------|
| cat_id | product_categories.cat_id | Direct mapping |
| name | product_categories.name | Direct mapping |
| type | product_categories.type | Direct mapping |
| parent_id | product_categories.master_cat_id | NULL for top-level categories (types 10, 20) |
| description | product_categories.combined_name | Direct mapping |
| status | N/A | Hard-coded 'active' |
| created_at | N/A | Current timestamp |
| updated_at | N/A | Current timestamp |

**Notes:**

- Categories are processed in hierarchical order by type: [10, 20, 11, 21, 12, 13]
- Types 10/20 are top-level categories with no parent
- Types 11/21/12/13 are child categories that reference parent categories

### Products

| PostgreSQL Column | MySQL Source | Transformation |
|------------------------|-----------------------------------|---------------------------------------------------------------|
| pid | products.pid | Direct mapping |
| title | products.description | Direct mapping |
| description | products.notes | Direct mapping |
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
| stock_quantity | shop_inventory.available_local | Capped at 5000, minimum 0 |
| preorder_count | current_inventory.onpreorder | Default 0 |
| notions_inv_count | product_notions_b2b.inventory | Default 0 |
| price | product_current_prices.price_each | Default 0, filtered on active=1 |
| regular_price | products.sellingprice | Default 0 |
| cost_price | product_inventory | Weighted average: SUM(costeach * count) / SUM(count) when count > 0, or latest costeach |
| vendor | suppliers.companyname | Via supplier_item_data.supplier_id |
| vendor_reference | supplier_item_data | supplier_itemnumber or notions_itemnumber based on vendor |
| notions_reference | supplier_item_data.notions_itemnumber | Direct mapping |
| brand | product_categories.name | Linked via products.company |
| line | product_categories.name | Linked via products.line |
| subline | product_categories.name | Linked via products.subline |
| artist | product_categories.name | Linked via products.artist |
| categories | product_category_index | Comma-separated list of category IDs |
| created_at | products.date_created | Validated date, NULL if invalid |
| first_received | products.datein | Validated date, NULL if invalid |
| landing_cost_price | NULL | Not set |
| barcode | products.upc | Direct mapping |
| harmonized_tariff_code | products.harmonized_tariff_code | Direct mapping |
| updated_at | products.stamp | Validated date, NULL if invalid |
| visible | shop_inventory | Calculated from show + buyable > 0 |
| managing_stock | N/A | Hard-coded true |
| replenishable | Multiple fields | Complex calculation based on reorder, dates, etc. |
| permalink | N/A | Constructed URL with product ID |
| moq | supplier_item_data | notions_qty_per_unit or supplier_qty_per_unit, minimum 1 |
| uom | N/A | Hard-coded 1 |
| rating | products.rating | Direct mapping |
| reviews | products.rating_votes | Direct mapping |
| weight | products.weight | Direct mapping |
| length | products.length | Direct mapping |
| width | products.width | Direct mapping |
| height | products.height | Direct mapping |
| country_of_origin | products.country_of_origin | Direct mapping |
| location | products.location | Direct mapping |
| total_sold | order_items | SUM(qty_ordered) for all order_items where prod_pid = pid |
| baskets | mybasket | COUNT of records where mb.item = pid and qty > 0 |
| notifies | product_notify | COUNT of records where pn.pid = pid |
| date_last_sold | product_last_sold.date_sold | Validated date, NULL if invalid |
| image | N/A | Constructed from pid and image URL pattern |
| image_175 | N/A | Constructed from pid and image URL pattern |
| image_full | N/A | Constructed from pid and image URL pattern |
| options | NULL | Not set |
| tags | NULL | Not set |

**Notes:**

- Replenishable calculation:

  ```sql
  CASE
    WHEN p.reorder < 0 THEN 0
    WHEN (
      (COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
      AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
      AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
    ) THEN 0
    ELSE 1
  END
  ```

  In business terms, a product is considered NOT replenishable only if:
  - It was manually flagged as not replenishable (negative reorder value)
  - OR it shows no activity across ALL metrics (no sales AND no receipts AND no refills in the past 5 years)

- Image URLs are constructed using this pattern:

  ```javascript
  const paddedPid = pid.toString().padStart(6, '0');
  const prefix = paddedPid.slice(0, 3);
  const basePath = `${imageUrlBase}${prefix}/${pid}`;
  return {
    image: `${basePath}-t-${iid}.jpg`,
    image_175: `${basePath}-175x175-${iid}.jpg`,
    image_full: `${basePath}-o-${iid}.jpg`
  };
  ```
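The weighted-average `cost_price` described in the table corresponds to a query of roughly this shape (a sketch; the alias, the grouping, and the use of MAX to approximate "latest costeach" are assumptions):

```sql
-- Sketch of the cost_price aggregation described above; not the actual query.
SELECT
  pi.pid,
  CASE
    WHEN SUM(CASE WHEN pi.count > 0 THEN pi.count ELSE 0 END) > 0 THEN
      SUM(CASE WHEN pi.count > 0 THEN pi.costeach * pi.count ELSE 0 END)
        / SUM(CASE WHEN pi.count > 0 THEN pi.count ELSE 0 END)
    ELSE MAX(pi.costeach)  -- fallback; "latest costeach" approximated here
  END AS cost_price
FROM product_inventory pi
GROUP BY pi.pid;
```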
### Product Categories (Relationship)

| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-------------------------------|--------------------------------------------|
| pid | products.pid | Direct mapping |
| cat_id | product_category_index.cat_id | Direct mapping, filtered by category types |

**Notes:**

- Only categories of types 10, 20, 11, 21, 12, 13 are imported
- Categories 16 and 17 are explicitly excluded

### Orders

| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------------------|---------------------------------------------------------------|
| order_number | order_items.order_id | Direct mapping |
| pid | order_items.prod_pid | Direct mapping |
| sku | order_items.prod_itemnumber | Fallback to 'NO-SKU' if empty |
| date | _order.date_placed_onlydate | Via join to _order table |
| price | order_items.prod_price | Direct mapping |
| quantity | order_items.qty_ordered | Direct mapping |
| discount | Multiple sources | Complex calculation (see notes) |
| tax | order_tax_info_products.item_taxes_to_collect | Via latest order_tax_info record |
| tax_included | N/A | Hard-coded false |
| shipping | N/A | Hard-coded 0 |
| customer | _order.order_cid | Direct mapping |
| customer_name | users | CONCAT(users.firstname, ' ', users.lastname) |
| status | _order.order_status | Direct mapping |
| canceled | _order.date_cancelled | Boolean: true if date_cancelled is not '0000-00-00 00:00:00' |
| costeach | order_costs | From latest record or fallback to price * 0.5 |

**Notes:**

- Only orders with order_status >= 15 and a valid date_placed are processed
- For incremental imports, only orders modified since the last sync are processed
- Discount calculation combines three sources:
  1. Base discount: order_items.prod_price_reg - order_items.prod_price
  2. Promo discount: SUM of order_discount_items.amount
  3. Proportional order discount: calculated from the item's share of the order subtotal

  ```sql
  (oi.base_discount +
   COALESCE(ot.promo_discount, 0) +
   CASE
     WHEN om.summary_discount > 0 AND om.summary_subtotal > 0 THEN
       ROUND((om.summary_discount * (oi.price * oi.quantity)) / NULLIF(om.summary_subtotal, 0), 2)
     ELSE 0
   END)::DECIMAL(10,2)
  ```

- Taxes are taken from the latest tax record for an order
- Cost data is taken from the latest non-pending cost record
### Purchase Orders

| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------|----------------------------------------|
| po_id | po.po_id | Default 0 if NULL |
| pid | po_products.pid | Direct mapping |
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
| name | products.description | Fallback to 'Unknown Product' |
| cost_price | po_products.cost_each | Direct mapping |
| po_cost_price | po_products.cost_each | Duplicate of cost_price |
| vendor | suppliers.companyname | Fallback to 'Unknown Vendor' if empty |
| date | po.date_ordered | Fallback to po.date_created if NULL |
| expected_date | po.date_estin | Direct mapping |
| status | po.status | Default 1 if NULL |
| notes | po.short_note | Fallback to po.notes if NULL |
| ordered | po_products.qty_each | Direct mapping |
| received | N/A | Hard-coded 0 |
| receiving_status | N/A | Hard-coded 1 |

**Notes:**

- Only POs created within the last year (incremental) or the last 5 years (full) are processed
- For incremental imports, only POs modified since the last sync are processed

### Metadata Tables

#### import_history

| PostgreSQL Column | Source | Notes |
|-------------------|--------------------|--------------------------------------------------|
| id | Auto-increment | Primary key |
| table_name | Code | 'all_tables' for overall import |
| start_time | NOW() | Import start time |
| end_time | NOW() | Import completion time |
| duration_seconds | Calculation | Elapsed seconds |
| is_incremental | INCREMENTAL_UPDATE | Flag from config |
| records_added | Calculation | Sum from all imports |
| records_updated | Calculation | Sum from all imports |
| status | Code | 'running', 'completed', 'failed', or 'cancelled' |
| error_message | Exception | Error message if failed |
| additional_info | JSON | Configuration and results |

#### sync_status

| PostgreSQL Column | Source | Notes |
|---------------------|--------|------------------------------|
| table_name | Code | Name of imported table |
| last_sync_timestamp | NOW() | Timestamp of successful sync |
| last_sync_id | NULL | Not used currently |
## Special Calculations

### Date Validation

MySQL dates are validated before insertion into PostgreSQL:

```javascript
function validateDate(mysqlDate) {
  if (!mysqlDate || mysqlDate === '0000-00-00' || mysqlDate === '0000-00-00 00:00:00') {
    return null;
  }
  // Check if the date is valid
  const date = new Date(mysqlDate);
  return isNaN(date.getTime()) ? null : mysqlDate;
}
```

### Retry Mechanism

Operations that might fail temporarily are retried with exponential backoff:

```javascript
async function withRetry(operation, errorMessage) {
  let lastError;
  for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
    try {
      return await operation();
    } catch (error) {
      lastError = error;
      console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
      if (attempt < MAX_RETRIES) {
        const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
        await new Promise(resolve => setTimeout(resolve, backoffTime));
      }
    }
  }
  throw lastError;
}
```

### Progress Tracking

Progress is tracked with estimated time remaining:

```javascript
function estimateRemaining(startTime, current, total) {
  if (current === 0) return "Calculating...";
  const elapsedSeconds = (Date.now() - startTime) / 1000;
  const itemsPerSecond = current / elapsedSeconds;
  const remainingItems = total - current;
  const remainingSeconds = remainingItems / itemsPerSecond;
  return formatElapsedTime(remainingSeconds);
}
```

## Implementation Notes

### Transaction Management

All imports use transactions to ensure data consistency:

- **Categories**: Uses savepoints for each category type
- **Products**: Uses a single transaction for the entire import
- **Orders**: Uses a single transaction with temporary tables
- **Purchase Orders**: Uses a single transaction with temporary tables

### Memory Usage Optimization

To minimize memory usage when processing large datasets:

1. Data is processed in batches (100-5000 records per batch)
2. Temporary tables are used for intermediate data
3. Some queries use cursors to avoid loading all results at once
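The batching in point 1 can be sketched as a simple chunking helper (the helper name is an assumption; the real scripts may batch differently):

```javascript
// Illustrative batching helper: splits an array into fixed-size chunks
// so each chunk can be inserted in its own statement.
function* batches(items, batchSize) {
  for (let i = 0; i < items.length; i += batchSize) {
    yield items.slice(i, i + batchSize);
  }
}

// Usage sketch:
// for (const batch of batches(rows, 1000)) { await insertBatch(batch); }
```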
### MySQL vs PostgreSQL Compatibility

The scripts handle differences between MySQL and PostgreSQL:

1. MySQL-specific syntax like `USE INDEX` is removed for PostgreSQL
2. `GROUP_CONCAT` in MySQL becomes string operations in PostgreSQL
3. Transaction syntax differences are abstracted in the connection wrapper
4. PostgreSQL's `ON CONFLICT` replaces MySQL's `ON DUPLICATE KEY UPDATE`
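Points 2 and 4 translate along these lines (generic illustrations against tables named in this document, not the scripts' actual queries):

```sql
-- MySQL:
--   SELECT pid, GROUP_CONCAT(cat_id) FROM product_category_index GROUP BY pid;
-- PostgreSQL equivalent:
SELECT pid, STRING_AGG(cat_id::text, ',') FROM product_category_index GROUP BY pid;

-- MySQL:
--   INSERT INTO sync_status (table_name, last_sync_timestamp)
--   VALUES ('products', NOW())
--   ON DUPLICATE KEY UPDATE last_sync_timestamp = NOW();
-- PostgreSQL equivalent (assumes a unique constraint on table_name):
INSERT INTO sync_status (table_name, last_sync_timestamp)
VALUES ('products', NOW())
ON CONFLICT (table_name) DO UPDATE SET last_sync_timestamp = NOW();
```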
### SSH Tunnel

Database connections go through an SSH tunnel for security:

```javascript
ssh.forwardOut(
  "127.0.0.1",
  0,
  sshConfig.prodDbConfig.host,
  sshConfig.prodDbConfig.port,
  async (err, stream) => {
    if (err) return reject(err);
    resolve({ ssh, stream });
  }
);
```
`docs/metrics-calculation-system.md` (new file, 1065 lines; diff suppressed because it is too large)
`docs/validation-hook-refactor.md` (new file, 131 lines)
# Refactoring Plan for Validation Code

## Current Structure Analysis

- **useValidationState.tsx**: ~1650 lines - Core validation state management
- **useValidation.tsx**: ~425 lines - Field/data validation utility
- **useUpcValidation.tsx**: ~410 lines - UPC-specific validation

## Proposed New Structure

### 1. Core Types & Utilities (150-200 lines)

**File: `validation/types.ts`**
- All interfaces and types (RowData, ValidationError, FilterState, Template, etc.)
- Shared utility functions (isEmpty, getCellKey, etc.)

**File: `validation/utils.ts`**
- Generic validation utility functions
- Caching mechanism and cache clearing helpers
- API URL helpers

### 2. Field Validation (300-350 lines)

**File: `validation/hooks/useFieldValidation.ts`**
- `validateField` function
- Field-level validation logic
- Required, regex, and other field validations

### 3. Uniqueness Validation (250-300 lines)

**File: `validation/hooks/useUniquenessValidation.ts`**
- `validateUniqueField` function
- `validateUniqueItemNumbers` function
- All uniqueness checking logic

### 4. UPC Validation (300-350 lines)

**File: `validation/hooks/useUpcValidation.ts`**
- `fetchProductByUpc` function
- `validateUpc` function
- `applyItemNumbersToData` function
- UPC validation state management

### 5. Validation Status Management (300-350 lines)

**File: `validation/hooks/useValidationStatus.ts`**
- Error state management
- Row validation status tracking
- Validation indicators and refs
- Batch validation processing

### 6. Data Management (300-350 lines)

**File: `validation/hooks/useValidationData.ts`**
- Data state management
- Row updates
- Data filtering
- Initial data processing

### 7. Template Management (250-300 lines)

**File: `validation/hooks/useTemplateManagement.ts`**
- Template saving
- Template application
- Template loading
- Template display helpers

### 8. Main Validation Hook (300-350 lines)

**File: `validation/hooks/useValidation.ts`**
- Main hook that composes all other hooks
- Public API export
- Initialization logic
- Core validation flow

## Function Distribution

### Core Types & Utilities
- All interfaces (InfoWithSource, ValidationState, etc.)
- `isEmpty` utility
- `getApiUrl` helper

### Field Validation
- `validateField`
- `validateRow`
- `validateData` (partial)
- All validation result caching

### Uniqueness Validation
- `validateUniqueField`
- `validateUniqueItemNumbers`
- Uniqueness caching mechanisms

### UPC Validation
- `fetchProductByUpc`
- `validateUpc`
- `validateAllUPCs`
- `applyItemNumbersToData`
- UPC validation state tracking (cells, rows)

### Validation Status Management
- `startValidatingCell`/`stopValidatingCell`
- `startValidatingRow`/`stopValidatingRow`
- `isValidatingCell`/`isRowValidatingUpc`
- Error state management
- `revalidateRows`

### Data Management
- Initial data cleaning/processing
- `updateRow`
- `copyDown`
- Search/filter functionality
- `filteredData` calculation

### Template Management
- `saveTemplate`
- `applyTemplate`
- `applyTemplateToSelected`
- `getTemplateDisplayText`
- `loadTemplates`/`refreshTemplates`

### Main Validation Hook
- Composition of all other hooks
- Initialization logic
- Button/navigation handling
- Field options management

## Implementation Approach

1. **Start with Types**: Create the types file first, as all other files will depend on it
2. **Create Utility Functions**: Move shared utilities next
3. **Build Core Validation**: Extract the field validation and uniqueness validation
4. **Separate UPC Logic**: Move all UPC-specific code to its own module
5. **Extract State Management**: Move data and status management to separate files
6. **Move Template Logic**: Extract template functionality
7. **Create Composition Hook**: Build the main hook that uses all other hooks
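Stripped of React specifics, the composition in step 7 amounts to merging the public APIs returned by the sub-hooks (names here are illustrative; in the real hook each part comes from a sub-hook call):

```javascript
// Illustrative composition of sub-hook APIs. In useValidation.ts each
// argument would be the return value of a sub-hook; here, plain objects.
function composeValidationApi(...parts) {
  // Later parts win on key collisions, so the main hook can override.
  return Object.assign({}, ...parts);
}

// Usage sketch:
// const api = composeValidationApi(fieldApi, uniquenessApi, upcApi, statusApi);
```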
This approach will give you more maintainable code with clearer separation of concerns, making it easier to understand, test, and modify each component independently.
`inventory-server/auth/permissions.js` (new file, 128 lines)
```javascript
// Get pool from global or create a new one if not available
let pool;
if (typeof global.pool !== 'undefined') {
  pool = global.pool;
} else {
  // If global pool is not available, create a new connection
  const { Pool } = require('pg');
  pool = new Pool({
    host: process.env.DB_HOST,
    user: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    database: process.env.DB_NAME,
    port: process.env.DB_PORT,
  });
  console.log('Created new database pool in permissions.js');
}

/**
 * Check if a user has a specific permission
 * @param {number} userId - The user ID to check
 * @param {string} permissionCode - The permission code to check
 * @returns {Promise<boolean>} - Whether the user has the permission
 */
async function checkPermission(userId, permissionCode) {
  try {
    // First check if the user is an admin
    const adminResult = await pool.query(
      'SELECT is_admin FROM users WHERE id = $1',
      [userId]
    );

    // If user is admin, automatically grant permission
    if (adminResult.rows.length > 0 && adminResult.rows[0].is_admin) {
      return true;
    }

    // Otherwise check for specific permission
    const result = await pool.query(
      `SELECT COUNT(*) AS has_permission
       FROM user_permissions up
       JOIN permissions p ON up.permission_id = p.id
       WHERE up.user_id = $1 AND p.code = $2`,
      [userId, permissionCode]
    );

    return result.rows[0].has_permission > 0;
  } catch (error) {
    console.error('Error checking permission:', error);
    return false;
  }
}

/**
 * Middleware to require a specific permission
 * @param {string} permissionCode - The permission code required
 * @returns {Function} - Express middleware function
 */
function requirePermission(permissionCode) {
  return async (req, res, next) => {
    try {
      // Check if user is authenticated
      if (!req.user || !req.user.id) {
        return res.status(401).json({ error: 'Authentication required' });
      }

      const hasPermission = await checkPermission(req.user.id, permissionCode);

      if (!hasPermission) {
        return res.status(403).json({
          error: 'Insufficient permissions',
          requiredPermission: permissionCode
        });
      }

      next();
    } catch (error) {
      console.error('Permission middleware error:', error);
      res.status(500).json({ error: 'Server error checking permissions' });
    }
  };
}

/**
 * Get all permissions for a user
 * @param {number} userId - The user ID
 * @returns {Promise<string[]>} - Array of permission codes
 */
async function getUserPermissions(userId) {
  try {
    // Check if user is admin
    const adminResult = await pool.query(
      'SELECT is_admin FROM users WHERE id = $1',
      [userId]
    );

    if (adminResult.rows.length === 0) {
      return [];
    }

    const isAdmin = adminResult.rows[0].is_admin;

    if (isAdmin) {
      // Admin gets all permissions
      const allPermissions = await pool.query('SELECT code FROM permissions');
      return allPermissions.rows.map(p => p.code);
    } else {
      // Get assigned permissions
      const permissions = await pool.query(
        `SELECT p.code
         FROM permissions p
         JOIN user_permissions up ON p.id = up.permission_id
         WHERE up.user_id = $1`,
        [userId]
      );

      return permissions.rows.map(p => p.code);
    }
  } catch (error) {
    console.error('Error getting user permissions:', error);
    return [];
  }
}

module.exports = {
  checkPermission,
  requirePermission,
  getUserPermissions
};
```
`inventory-server/auth/routes.js` (new file, 513 lines)
const express = require('express');
|
||||
const router = express.Router();
|
||||
const bcrypt = require('bcrypt');
|
||||
const jwt = require('jsonwebtoken');
|
||||
const { requirePermission, getUserPermissions } = require('./permissions');
|
||||
|
||||
// Get pool from global or create a new one if not available
|
||||
let pool;
|
||||
if (typeof global.pool !== 'undefined') {
|
||||
pool = global.pool;
|
||||
} else {
|
||||
// If global pool is not available, create a new connection
|
||||
const { Pool } = require('pg');
|
||||
pool = new Pool({
|
||||
host: process.env.DB_HOST,
|
||||
user: process.env.DB_USER,
|
||||
password: process.env.DB_PASSWORD,
|
||||
database: process.env.DB_NAME,
|
||||
port: process.env.DB_PORT,
|
||||
});
|
||||
console.log('Created new database pool in routes.js');
|
||||
}
|
||||
|
||||
// Authentication middleware
|
||||
const authenticate = async (req, res, next) => {
|
||||
try {
|
||||
const authHeader = req.headers.authorization;
|
||||
if (!authHeader || !authHeader.startsWith('Bearer ')) {
|
||||
return res.status(401).json({ error: 'Authentication required' });
|
||||
}
|
||||
|
||||
const token = authHeader.split(' ')[1];
|
||||
const decoded = jwt.verify(token, process.env.JWT_SECRET);
|
||||
|
||||
// Get user from database
|
||||
const result = await pool.query(
|
||||
'SELECT id, username, is_admin FROM users WHERE id = $1',
|
||||
[decoded.userId]
|
||||
);
|
||||
|
||||
if (result.rows.length === 0) {
|
||||
return res.status(401).json({ error: 'User not found' });
|
||||
}
|
||||
|
||||
// Attach user to request
|
||||
req.user = result.rows[0];
|
||||
next();
|
||||
} catch (error) {
|
||||
console.error('Authentication error:', error);
|
||||
res.status(401).json({ error: 'Invalid token' });
|
||||
}
|
||||
};
|
||||
|
||||
// Login route
router.post('/login', async (req, res) => {
  try {
    const { username, password } = req.body;

    // Get user from database
    const result = await pool.query(
      'SELECT id, username, password, is_admin, is_active FROM users WHERE username = $1',
      [username]
    );

    if (result.rows.length === 0) {
      return res.status(401).json({ error: 'Invalid username or password' });
    }

    const user = result.rows[0];

    // Check if user is active
    if (!user.is_active) {
      return res.status(403).json({ error: 'Account is inactive' });
    }

    // Verify password
    const validPassword = await bcrypt.compare(password, user.password);
    if (!validPassword) {
      return res.status(401).json({ error: 'Invalid username or password' });
    }

    // Update last login
    await pool.query(
      'UPDATE users SET last_login = CURRENT_TIMESTAMP WHERE id = $1',
      [user.id]
    );

    // Generate JWT
    const token = jwt.sign(
      { userId: user.id, username: user.username },
      process.env.JWT_SECRET,
      { expiresIn: '8h' }
    );

    // Get user permissions
    const permissions = await getUserPermissions(user.id);

    res.json({
      token,
      user: {
        id: user.id,
        username: user.username,
        is_admin: user.is_admin,
        permissions
      }
    });
  } catch (error) {
    console.error('Login error:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

// Get current user
router.get('/me', authenticate, async (req, res) => {
  try {
    // Get user permissions
    const permissions = await getUserPermissions(req.user.id);

    res.json({
      id: req.user.id,
      username: req.user.username,
      is_admin: req.user.is_admin,
      permissions
    });
  } catch (error) {
    console.error('Error getting current user:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

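A hypothetical client-side helper built on the `/login` response shape above: admins implicitly hold every permission, while other users need an explicit grant in their `permissions` array.

```javascript
// Does this user object (from the /login response) allow the given action?
function canUse(user, code) {
  return user.is_admin || user.permissions.includes(code);
}

// Sample (fabricated) user payload, matching the login response shape.
const sampleUser = {
  id: 1,
  username: 'alice',
  is_admin: false,
  permissions: ['access:products', 'create:products']
};

canUse(sampleUser, 'create:products'); // → true
canUse(sampleUser, 'delete:users');    // → false
canUse({ is_admin: true, permissions: [] }, 'delete:users'); // → true
```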
// Get all users
router.get('/users', authenticate, requirePermission('view:users'), async (req, res) => {
  try {
    const result = await pool.query(`
      SELECT id, username, email, is_admin, is_active, created_at, last_login
      FROM users
      ORDER BY username
    `);

    res.json(result.rows);
  } catch (error) {
    console.error('Error getting users:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

// Get user with permissions
router.get('/users/:id', authenticate, requirePermission('view:users'), async (req, res) => {
  try {
    const userId = req.params.id;

    // Get user details
    const userResult = await pool.query(`
      SELECT id, username, email, is_admin, is_active, created_at, last_login
      FROM users
      WHERE id = $1
    `, [userId]);

    if (userResult.rows.length === 0) {
      return res.status(404).json({ error: 'User not found' });
    }

    // Get user permissions
    const permissionsResult = await pool.query(`
      SELECT p.id, p.name, p.code, p.category, p.description
      FROM permissions p
      JOIN user_permissions up ON p.id = up.permission_id
      WHERE up.user_id = $1
      ORDER BY p.category, p.name
    `, [userId]);

    // Combine user and permissions
    const user = {
      ...userResult.rows[0],
      permissions: permissionsResult.rows
    };

    res.json(user);
  } catch (error) {
    console.error('Error getting user:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

// Create new user
router.post('/users', authenticate, requirePermission('create:users'), async (req, res) => {
  const client = await pool.connect();

  try {
    const { username, email, password, is_admin, is_active, permissions } = req.body;

    console.log("Create user request:", {
      username,
      email,
      is_admin,
      is_active,
      permissions: permissions || []
    });

    // Validate required fields
    if (!username || !password) {
      return res.status(400).json({ error: 'Username and password are required' });
    }

    // Check if username is taken
    const existingUser = await client.query(
      'SELECT id FROM users WHERE username = $1',
      [username]
    );

    if (existingUser.rows.length > 0) {
      return res.status(400).json({ error: 'Username already exists' });
    }

    // Start transaction
    await client.query('BEGIN');

    // Hash password
    const saltRounds = 10;
    const hashedPassword = await bcrypt.hash(password, saltRounds);

    // Insert new user
    const userResult = await client.query(`
      INSERT INTO users (username, email, password, is_admin, is_active, created_at)
      VALUES ($1, $2, $3, $4, $5, CURRENT_TIMESTAMP)
      RETURNING id
    `, [username, email || null, hashedPassword, !!is_admin, is_active !== false]);

    const userId = userResult.rows[0].id;

    // Assign permissions if provided and not admin
    if (!is_admin && Array.isArray(permissions) && permissions.length > 0) {
      console.log("Adding permissions for new user:", userId);
      console.log("Permissions received:", permissions);

      // Normalize each entry ({id}, number, or numeric string) to a permission ID
      const permissionIds = permissions.map(p => {
        if (typeof p === 'object' && p.id) {
          return parseInt(p.id, 10);
        } else if (typeof p === 'number') {
          return p;
        } else if (typeof p === 'string' && !isNaN(parseInt(p, 10))) {
          return parseInt(p, 10);
        } else {
          console.log("Unknown permission format:", typeof p, p);
          // If it's a permission code, we would need to look up the ID
          return null;
        }
      }).filter(id => id !== null);

      console.log("Filtered permission IDs:", permissionIds);

      if (permissionIds.length > 0) {
        const permissionValues = permissionIds
          .map(permId => `(${userId}, ${permId})`)
          .join(',');

        try {
          await client.query(`
            INSERT INTO user_permissions (user_id, permission_id)
            VALUES ${permissionValues}
            ON CONFLICT DO NOTHING
          `);
          console.log("Successfully inserted permissions for new user:", userId);
        } catch (err) {
          console.error("Error inserting permissions for new user:", err);
          throw err;
        }
      } else {
        console.log("No valid permission IDs found to insert for new user");
      }
    } else {
      console.log("Not adding permissions: is_admin =", is_admin,
        "permissions array:", Array.isArray(permissions),
        "length:", permissions ? permissions.length : 0);
    }

    await client.query('COMMIT');

    res.status(201).json({
      id: userId,
      message: 'User created successfully'
    });
  } catch (error) {
    await client.query('ROLLBACK');
    console.error('Error creating user:', error);
    res.status(500).json({ error: 'Server error' });
  } finally {
    client.release();
  }
});

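The permission array accepted by `POST /users` can mix formats; this sketch pulls the normalization above into a standalone function (a hypothetical refactor, not code that exists in the repository) so the accepted shapes are easy to see:

```javascript
// Normalize a mixed permission list to numeric IDs, dropping unknown formats.
function normalizePermissionIds(permissions) {
  return permissions.map(p => {
    if (typeof p === 'object' && p !== null && p.id) return parseInt(p.id, 10); // { id: '3' }
    if (typeof p === 'number') return p;                                         // 5
    if (typeof p === 'string' && !isNaN(parseInt(p, 10))) return parseInt(p, 10); // '12'
    return null; // e.g. a bare permission code like 'view:users'
  }).filter(id => id !== null);
}

normalizePermissionIds([{ id: '3' }, 5, '12', 'view:users']); // → [3, 5, 12]
```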
// Update user
router.put('/users/:id', authenticate, requirePermission('edit:users'), async (req, res) => {
  const client = await pool.connect();

  try {
    const userId = req.params.id;
    const { username, email, password, is_admin, is_active, permissions } = req.body;

    console.log("Update user request:", {
      userId,
      username,
      email,
      is_admin,
      is_active,
      permissions: permissions || []
    });

    // Check if user exists
    const userExists = await client.query(
      'SELECT id FROM users WHERE id = $1',
      [userId]
    );

    if (userExists.rows.length === 0) {
      return res.status(404).json({ error: 'User not found' });
    }

    // Start transaction
    await client.query('BEGIN');

    // Build update fields
    const updateFields = [];
    const updateValues = [userId]; // First parameter is the user ID
    let paramIndex = 2;

    if (username !== undefined) {
      updateFields.push(`username = $${paramIndex++}`);
      updateValues.push(username);
    }

    if (email !== undefined) {
      updateFields.push(`email = $${paramIndex++}`);
      updateValues.push(email || null);
    }

    if (is_admin !== undefined) {
      updateFields.push(`is_admin = $${paramIndex++}`);
      updateValues.push(!!is_admin);
    }

    if (is_active !== undefined) {
      updateFields.push(`is_active = $${paramIndex++}`);
      updateValues.push(!!is_active);
    }

    // Update password if provided
    if (password) {
      const saltRounds = 10;
      const hashedPassword = await bcrypt.hash(password, saltRounds);
      updateFields.push(`password = $${paramIndex++}`);
      updateValues.push(hashedPassword);
    }

    // Update user if there are fields to update
    if (updateFields.length > 0) {
      updateFields.push(`updated_at = CURRENT_TIMESTAMP`);

      await client.query(`
        UPDATE users
        SET ${updateFields.join(', ')}
        WHERE id = $1
      `, updateValues);
    }

    // Update permissions if provided
    if (Array.isArray(permissions)) {
      console.log("Updating permissions for user:", userId);
      console.log("Permissions received:", permissions);

      // First remove existing permissions
      await client.query(
        'DELETE FROM user_permissions WHERE user_id = $1',
        [userId]
      );

      // Add new permissions if any and the user is not an admin
      const newIsAdmin = is_admin !== undefined
        ? is_admin
        : (await client.query('SELECT is_admin FROM users WHERE id = $1', [userId])).rows[0].is_admin;

      if (!newIsAdmin && permissions.length > 0) {
        // Normalize each entry ({id}, number, or numeric string) to a permission ID
        const permissionIds = permissions.map(p => {
          if (typeof p === 'object' && p.id) {
            return parseInt(p.id, 10);
          } else if (typeof p === 'number') {
            return p;
          } else if (typeof p === 'string' && !isNaN(parseInt(p, 10))) {
            return parseInt(p, 10);
          } else {
            console.log("Unknown permission format:", typeof p, p);
            // If it's a permission code, we would need to look up the ID
            return null;
          }
        }).filter(id => id !== null);

        console.log("Filtered permission IDs:", permissionIds);

        if (permissionIds.length > 0) {
          const permissionValues = permissionIds
            .map(permId => `(${userId}, ${permId})`)
            .join(',');

          try {
            await client.query(`
              INSERT INTO user_permissions (user_id, permission_id)
              VALUES ${permissionValues}
              ON CONFLICT DO NOTHING
            `);
            console.log("Successfully inserted permissions for user:", userId);
          } catch (err) {
            console.error("Error inserting permissions:", err);
            throw err;
          }
        } else {
          console.log("No valid permission IDs found to insert");
        }
      }
    }

    await client.query('COMMIT');

    res.json({ message: 'User updated successfully' });
  } catch (error) {
    await client.query('ROLLBACK');
    console.error('Error updating user:', error);
    res.status(500).json({ error: 'Server error' });
  } finally {
    client.release();
  }
});

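The update route above builds its `UPDATE` statement dynamically, pairing each `$n` placeholder with a value while `$1` stays reserved for the user ID. A sketch of that pattern as a plain function (a hypothetical extraction for illustration) makes the SQL/parameter pairing visible:

```javascript
// Build a parameterized UPDATE for the given columns; $1 is always the row ID.
function buildUserUpdate(userId, fields) {
  const updateFields = [];
  const updateValues = [userId];
  let paramIndex = 2;

  for (const [column, value] of Object.entries(fields)) {
    if (value !== undefined) {
      updateFields.push(`${column} = $${paramIndex++}`);
      updateValues.push(value);
    }
  }

  const sql = `UPDATE users SET ${updateFields.join(', ')} WHERE id = $1`;
  return { sql, values: updateValues };
}

buildUserUpdate(7, { username: 'alice', is_active: true });
// → { sql: 'UPDATE users SET username = $2, is_active = $3 WHERE id = $1',
//     values: [7, 'alice', true] }
```

Keeping placeholders and values in lockstep this way avoids interpolating user input into the SQL text.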
// Delete user
router.delete('/users/:id', authenticate, requirePermission('delete:users'), async (req, res) => {
  try {
    const userId = req.params.id;

    // Check that the user is not deleting themselves
    if (req.user.id === parseInt(userId, 10)) {
      return res.status(400).json({ error: 'Cannot delete your own account' });
    }

    // Delete user (this cascades to user_permissions via FK constraints)
    const result = await pool.query(
      'DELETE FROM users WHERE id = $1 RETURNING id',
      [userId]
    );

    if (result.rows.length === 0) {
      return res.status(404).json({ error: 'User not found' });
    }

    res.json({ message: 'User deleted successfully' });
  } catch (error) {
    console.error('Error deleting user:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

// Get all permissions grouped by category
router.get('/permissions/categories', authenticate, requirePermission('view:users'), async (req, res) => {
  try {
    const result = await pool.query(`
      SELECT category, json_agg(
        json_build_object(
          'id', id,
          'name', name,
          'code', code,
          'description', description
        ) ORDER BY name
      ) as permissions
      FROM permissions
      GROUP BY category
      ORDER BY category
    `);

    res.json(result.rows);
  } catch (error) {
    console.error('Error getting permissions:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

// Get all permissions
router.get('/permissions', authenticate, requirePermission('view:users'), async (req, res) => {
  try {
    const result = await pool.query(`
      SELECT *
      FROM permissions
      ORDER BY category, name
    `);

    res.json(result.rows);
  } catch (error) {
    console.error('Error getting permissions:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

module.exports = router;
@@ -2,5 +2,88 @@ CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  username VARCHAR(255) NOT NULL UNIQUE,
  password VARCHAR(255) NOT NULL,
  email VARCHAR UNIQUE,
  is_admin BOOLEAN DEFAULT FALSE,
  is_active BOOLEAN DEFAULT TRUE,
  last_login TIMESTAMP WITH TIME ZONE,
  updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Function to update the updated_at timestamp
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
  NEW.updated_at = CURRENT_TIMESTAMP;
  RETURN NEW;
END;
$$ language 'plpgsql';

-- Sequence for the users table if it does not exist
CREATE SEQUENCE IF NOT EXISTS users_id_seq;

-- Create permissions table
CREATE TABLE IF NOT EXISTS "public"."permissions" (
  "id" SERIAL PRIMARY KEY,
  "name" varchar NOT NULL UNIQUE,
  "code" varchar NOT NULL UNIQUE,
  "description" text,
  "category" varchar NOT NULL,
  "created_at" timestamp with time zone DEFAULT CURRENT_TIMESTAMP,
  "updated_at" timestamp with time zone DEFAULT CURRENT_TIMESTAMP
);

-- Create user_permissions junction table
CREATE TABLE IF NOT EXISTS "public"."user_permissions" (
  "user_id" int4 NOT NULL REFERENCES "public"."users"("id") ON DELETE CASCADE,
  "permission_id" int4 NOT NULL REFERENCES "public"."permissions"("id") ON DELETE CASCADE,
  "created_at" timestamp with time zone DEFAULT CURRENT_TIMESTAMP,
  PRIMARY KEY ("user_id", "permission_id")
);

-- Add triggers for updated_at on users and permissions
DROP TRIGGER IF EXISTS update_users_updated_at ON users;
CREATE TRIGGER update_users_updated_at
BEFORE UPDATE ON users
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();

DROP TRIGGER IF EXISTS update_permissions_updated_at ON permissions;
CREATE TRIGGER update_permissions_updated_at
BEFORE UPDATE ON permissions
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();

-- Insert default permissions by page - only the ones used in the application
INSERT INTO permissions (name, code, description, category) VALUES
('Dashboard Access', 'access:dashboard', 'Can access the Dashboard page', 'Pages'),
('Products Access', 'access:products', 'Can access the Products page', 'Pages'),
('Categories Access', 'access:categories', 'Can access the Categories page', 'Pages'),
('Vendors Access', 'access:vendors', 'Can access the Vendors page', 'Pages'),
('Analytics Access', 'access:analytics', 'Can access the Analytics page', 'Pages'),
('Forecasting Access', 'access:forecasting', 'Can access the Forecasting page', 'Pages'),
('Purchase Orders Access', 'access:purchase_orders', 'Can access the Purchase Orders page', 'Pages'),
('Import Access', 'access:import', 'Can access the Import page', 'Pages'),
('Settings Access', 'access:settings', 'Can access the Settings page', 'Pages'),
('AI Validation Debug Access', 'access:ai_validation_debug', 'Can access the AI Validation Debug page', 'Pages')
ON CONFLICT (code) DO NOTHING;

-- Settings section permissions
INSERT INTO permissions (name, code, description, category) VALUES
('Data Management', 'settings:data_management', 'Access to the Data Management settings section', 'Settings'),
('Stock Management', 'settings:stock_management', 'Access to the Stock Management settings section', 'Settings'),
('Performance Metrics', 'settings:performance_metrics', 'Access to the Performance Metrics settings section', 'Settings'),
('Calculation Settings', 'settings:calculation_settings', 'Access to the Calculation Settings section', 'Settings'),
('Template Management', 'settings:templates', 'Access to the Template Management settings section', 'Settings'),
('User Management', 'settings:user_management', 'Access to the User Management settings section', 'Settings')
ON CONFLICT (code) DO NOTHING;

-- Set any existing users as admin
UPDATE users SET is_admin = TRUE WHERE is_admin IS NULL;

-- Grant all permissions to admin users
INSERT INTO user_permissions (user_id, permission_id)
SELECT u.id, p.id
FROM users u, permissions p
WHERE u.is_admin = TRUE
ON CONFLICT DO NOTHING;
@@ -5,6 +5,7 @@ const bcrypt = require('bcrypt');
 const jwt = require('jsonwebtoken');
 const { Pool } = require('pg');
 const morgan = require('morgan');
+const authRoutes = require('./routes');

 // Log startup configuration
 console.log('Starting auth server with config:', {
@@ -27,11 +28,14 @@ const pool = new Pool({
   port: process.env.DB_PORT,
 });

+// Make pool available globally
+global.pool = pool;
+
 // Middleware
 app.use(express.json());
 app.use(morgan('combined'));
 app.use(cors({
-  origin: ['http://localhost:5173', 'https://inventory.kent.pw'],
+  origin: ['http://localhost:5173', 'http://localhost:5174', 'https://inventory.kent.pw'],
   credentials: true
 }));

@@ -42,7 +46,7 @@ app.post('/login', async (req, res) => {
   try {
     // Get user from database
     const result = await pool.query(
-      'SELECT id, username, password FROM users WHERE username = $1',
+      'SELECT id, username, password, is_admin, is_active FROM users WHERE username = $1',
       [username]
     );

@@ -52,6 +56,11 @@ app.post('/login', async (req, res) => {
     if (!user || !(await bcrypt.compare(password, user.password))) {
       return res.status(401).json({ error: 'Invalid username or password' });
     }

+    // Check if user is active
+    if (!user.is_active) {
+      return res.status(403).json({ error: 'Account is inactive' });
+    }
+
     // Generate JWT token
     const token = jwt.sign(
@@ -60,31 +69,84 @@ app.post('/login', async (req, res) => {
       { expiresIn: '24h' }
     );

-    res.json({ token });
+    // Get user permissions for the response
+    const permissionsResult = await pool.query(`
+      SELECT code
+      FROM permissions p
+      JOIN user_permissions up ON p.id = up.permission_id
+      WHERE up.user_id = $1
+    `, [user.id]);
+
+    const permissions = permissionsResult.rows.map(row => row.code);
+
+    res.json({
+      token,
+      user: {
+        id: user.id,
+        username: user.username,
+        is_admin: user.is_admin,
+        permissions: user.is_admin ? [] : permissions
+      }
+    });
   } catch (error) {
     console.error('Login error:', error);
     res.status(500).json({ error: 'Internal server error' });
   }
 });

-// Protected route to verify token
-app.get('/protected', async (req, res) => {
+// User info endpoint
+app.get('/me', async (req, res) => {
   const authHeader = req.headers.authorization;

-  if (!authHeader) {
+  if (!authHeader || !authHeader.startsWith('Bearer ')) {
     return res.status(401).json({ error: 'No token provided' });
   }

   try {
     const token = authHeader.split(' ')[1];
     const decoded = jwt.verify(token, process.env.JWT_SECRET);
-    res.json({ userId: decoded.userId, username: decoded.username });
+
+    // Get user details from database
+    const userResult = await pool.query(
+      'SELECT id, username, email, is_admin, is_active FROM users WHERE id = $1',
+      [decoded.userId]
+    );
+
+    if (userResult.rows.length === 0) {
+      return res.status(404).json({ error: 'User not found' });
+    }
+
+    const user = userResult.rows[0];
+
+    // Get user permissions
+    let permissions = [];
+    if (!user.is_admin) {
+      const permissionsResult = await pool.query(`
+        SELECT code
+        FROM permissions p
+        JOIN user_permissions up ON p.id = up.permission_id
+        WHERE up.user_id = $1
+      `, [user.id]);
+
+      permissions = permissionsResult.rows.map(row => row.code);
+    }
+
+    res.json({
+      id: user.id,
+      username: user.username,
+      email: user.email,
+      is_admin: user.is_admin,
+      permissions: permissions
+    });
   } catch (error) {
     console.error('Token verification error:', error);
     res.status(401).json({ error: 'Invalid token' });
   }
 });

+// Mount all routes from routes.js
+app.use('/', authRoutes);
+
 // Health check endpoint
 app.get('/health', (req, res) => {
   res.json({ status: 'healthy' });
@@ -4,7 +4,12 @@ SET session_replication_role = 'replica'; -- Disable foreign key checks temporarily
 -- Create function for updating timestamps
 CREATE OR REPLACE FUNCTION update_updated_column() RETURNS TRIGGER AS $func$
 BEGIN
-    NEW.updated = CURRENT_TIMESTAMP;
+    -- Check which table is being updated and use the appropriate column
+    IF TG_TABLE_NAME = 'categories' THEN
+        NEW.updated_at = CURRENT_TIMESTAMP;
+    ELSE
+        NEW.updated = CURRENT_TIMESTAMP;
+    END IF;
     RETURN NEW;
 END;
 $func$ language plpgsql;
@@ -160,7 +165,7 @@ CREATE TABLE purchase_orders (
     expected_date DATE,
     pid BIGINT NOT NULL,
     sku VARCHAR(50) NOT NULL,
-    name VARCHAR(100) NOT NULL,
+    name VARCHAR(255) NOT NULL,
     cost_price DECIMAL(10, 3) NOT NULL,
     po_cost_price DECIMAL(10, 3) NOT NULL,
     status SMALLINT DEFAULT 1,
@@ -171,7 +176,7 @@ CREATE TABLE purchase_orders (
     received INTEGER DEFAULT 0,
     received_date DATE,
     last_received_date DATE,
-    received_by VARCHAR(100),
+    received_by VARCHAR,
     receiving_history JSONB,
     updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
     FOREIGN KEY (pid) REFERENCES products(pid),
@@ -23,6 +23,56 @@ CREATE TABLE IF NOT EXISTS templates (
    UNIQUE(company, product_type)
);

-- AI Prompts table for storing validation prompts
CREATE TABLE IF NOT EXISTS ai_prompts (
    id SERIAL PRIMARY KEY,
    prompt_text TEXT NOT NULL,
    prompt_type TEXT NOT NULL CHECK (prompt_type IN ('general', 'company_specific', 'system')),
    company TEXT,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT unique_company_prompt UNIQUE (company),
    CONSTRAINT company_required_for_specific CHECK (
        (prompt_type = 'general' AND company IS NULL) OR
        (prompt_type = 'system' AND company IS NULL) OR
        (prompt_type = 'company_specific' AND company IS NOT NULL)
    )
);

-- Create a unique partial index to ensure only one general prompt
CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_general_prompt
ON ai_prompts (prompt_type)
WHERE prompt_type = 'general';

-- Create a unique partial index to ensure only one system prompt
CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_system_prompt
ON ai_prompts (prompt_type)
WHERE prompt_type = 'system';

-- Reusable Images table for storing persistent images
CREATE TABLE IF NOT EXISTS reusable_images (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL,
    filename TEXT NOT NULL,
    file_path TEXT NOT NULL,
    image_url TEXT NOT NULL,
    is_global BOOLEAN NOT NULL DEFAULT false,
    company TEXT,
    mime_type TEXT,
    file_size INTEGER,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT company_required_for_non_global CHECK (
        (is_global = true AND company IS NULL) OR
        (is_global = false AND company IS NOT NULL)
    )
);

-- Create index on company for efficient querying
CREATE INDEX IF NOT EXISTS idx_reusable_images_company ON reusable_images(company);
-- Create index on is_global for efficient querying
CREATE INDEX IF NOT EXISTS idx_reusable_images_is_global ON reusable_images(is_global);

-- AI Validation Performance Tracking
CREATE TABLE IF NOT EXISTS ai_validation_performance (
    id SERIAL PRIMARY KEY,
@@ -50,4 +100,16 @@ $$ language 'plpgsql';
CREATE TRIGGER update_templates_updated_at
BEFORE UPDATE ON templates
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();

-- Trigger to automatically update the updated_at column for ai_prompts
CREATE TRIGGER update_ai_prompts_updated_at
BEFORE UPDATE ON ai_prompts
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();

-- Trigger to automatically update the updated_at column for reusable_images
CREATE TRIGGER update_reusable_images_updated_at
BEFORE UPDATE ON reusable_images
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();
14
inventory-server/package-lock.json
generated
@@ -1537,20 +1537,6 @@
       "integrity": "sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw==",
       "license": "ISC"
     },
-    "node_modules/fsevents": {
-      "version": "2.3.3",
-      "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz",
-      "integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==",
-      "hasInstallScript": true,
-      "license": "MIT",
-      "optional": true,
-      "os": [
-        "darwin"
-      ],
-      "engines": {
-        "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
-      }
-    },
     "node_modules/function-bind": {
       "version": "1.1.2",
       "resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
@@ -62,13 +62,24 @@ const TEMP_TABLES = [

 // Add cleanup function for temporary tables
 async function cleanupTemporaryTables(connection) {
+  // List of possible temporary tables that might exist
+  const tempTables = [
+    'temp_sales_metrics',
+    'temp_purchase_metrics',
+    'temp_forecast_dates',
+    'temp_daily_sales',
+    'temp_product_stats',
+    'temp_category_sales',
+    'temp_category_stats'
+  ];
+
   try {
-    for (const table of TEMP_TABLES) {
-      await connection.query(`DROP TEMPORARY TABLE IF EXISTS ${table}`);
+    // Drop each temporary table if it exists
+    for (const table of tempTables) {
+      await connection.query(`DROP TABLE IF EXISTS ${table}`);
     }
-  } catch (error) {
-    logError(error, 'Error cleaning up temporary tables');
-    throw error; // Re-throw to be handled by the caller
+  } catch (err) {
+    console.error('Error cleaning up temporary tables:', err);
   }
 }
@@ -86,22 +97,42 @@ let isCancelled = false;

 function cancelCalculation() {
   isCancelled = true;
-  global.clearProgress();
-  // Format as SSE event
-  const event = {
-    progress: {
-      status: 'cancelled',
-      operation: 'Calculation cancelled',
-      current: 0,
-      total: 0,
-      elapsed: null,
-      remaining: null,
-      rate: 0,
-      timestamp: Date.now()
-    }
-  };
+  console.log('Calculation has been cancelled by user');
+
+  // Force-terminate any query that's been running for more than 5 seconds
+  try {
+    const connection = getConnection();
+    connection.then(async (conn) => {
+      try {
+        // Identify and terminate long-running queries from our application
+        await conn.query(`
+          SELECT pg_cancel_backend(pid)
+          FROM pg_stat_activity
+          WHERE query_start < now() - interval '5 seconds'
+          AND application_name LIKE '%node%'
+          AND query NOT LIKE '%pg_cancel_backend%'
+        `);
+
+        // Clean up any temporary tables
+        await cleanupTemporaryTables(conn);
+
+        // Release connection
+        conn.release();
+      } catch (err) {
+        console.error('Error during force cancellation:', err);
+        conn.release();
+      }
+    }).catch(err => {
+      console.error('Could not get connection for cancellation:', err);
+    });
+  } catch (err) {
+    console.error('Failed to terminate running queries:', err);
+  }
+
+  return {
+    success: true,
+    message: 'Calculation has been cancelled'
+  };
-  process.stdout.write(JSON.stringify(event) + '\n');
-  process.exit(0);
 }
// Handle SIGTERM signal for cancellation
|
||||
@@ -119,6 +150,15 @@ async function calculateMetrics() {
  let totalPurchaseOrders = 0;
  let calculateHistoryId;

  // Set a maximum execution time (30 minutes)
  const MAX_EXECUTION_TIME = 30 * 60 * 1000;
  const timeout = setTimeout(() => {
    console.error(`Calculation timed out after ${MAX_EXECUTION_TIME/1000} seconds, forcing termination`);
    // Call cancel and force exit
    cancelCalculation();
    process.exit(1);
  }, MAX_EXECUTION_TIME);

  try {
    // Clean up any previously running calculations
    connection = await getConnection();
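The timeout added in the hunk above is a watchdog: a timer armed before the work starts, with a matching `clearTimeout` expected in the `finally` block so a successful run is not force-killed later. A hedged sketch of the same idea as a reusable helper (the name `withDeadline` and the promise-race shape are illustrative, not from the repo, which instead calls `process.exit`):

```javascript
// Illustrative watchdog: race the task against a deadline, and always
// disarm the timer in finally so the event loop can drain afterwards.
async function withDeadline(task, ms) {
  let timer;
  const deadline = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('calculation timed out')), ms);
  });
  try {
    return await Promise.race([task(), deadline]);
  } finally {
    clearTimeout(timer); // prevent forced termination after success
  }
}
```

Racing a promise keeps the failure on the normal error path (catch/finally still run), whereas the diff's `process.exit(1)` variant guarantees termination even if a query is wedged inside the driver.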
@@ -127,24 +167,24 @@ async function calculateMetrics() {
      SET
        status = 'cancelled',
        end_time = NOW(),
        duration_seconds = TIMESTAMPDIFF(SECOND, start_time, NOW()),
        duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
        error_message = 'Previous calculation was not completed properly'
      WHERE status = 'running'
    `);

    // Get counts from all relevant tables
    const [[productCount], [orderCount], [poCount]] = await Promise.all([
    const [productCountResult, orderCountResult, poCountResult] = await Promise.all([
      connection.query('SELECT COUNT(*) as total FROM products'),
      connection.query('SELECT COUNT(*) as total FROM orders'),
      connection.query('SELECT COUNT(*) as total FROM purchase_orders')
    ]);

    totalProducts = productCount.total;
    totalOrders = orderCount.total;
    totalPurchaseOrders = poCount.total;
    totalProducts = parseInt(productCountResult.rows[0].total);
    totalOrders = parseInt(orderCountResult.rows[0].total);
    totalPurchaseOrders = parseInt(poCountResult.rows[0].total);

    // Create history record for this calculation
    const [historyResult] = await connection.query(`
    const historyResult = await connection.query(`
      INSERT INTO calculate_history (
        start_time,
        status,
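The destructuring and `parseInt` changes in the hunk above come from the two drivers' result shapes: mysql2's promise API resolves to a `[rows, fields]` array, while node-postgres resolves to a result object with a `.rows` array, and pg returns `COUNT(*)` (a bigint) as a string. A small illustration with mocked result shapes, no database required:

```javascript
// Illustrative shapes only: these literals mimic what each driver returns.
const mysqlStyle = [[{ total: 42 }], []];                 // mysql2: [rows, fields]
const pgStyle = { rows: [{ total: '42' }], rowCount: 1 }; // pg: Result object

const [rows] = mysqlStyle;
const mysqlTotal = rows[0].total;                    // already a number
const pgTotal = parseInt(pgStyle.rows[0].total, 10); // bigint arrives as text

console.log(mysqlTotal, pgTotal); // 42 42
```

This is why every `const [[x]] = await connection.query(...)` in the old code becomes `result.rows[0]` plus an explicit numeric conversion in the new code.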
@@ -155,19 +195,19 @@ async function calculateMetrics() {
      ) VALUES (
        NOW(),
        'running',
        ?,
        ?,
        ?,
        JSON_OBJECT(
          'skip_product_metrics', ?,
          'skip_time_aggregates', ?,
          'skip_financial_metrics', ?,
          'skip_vendor_metrics', ?,
          'skip_category_metrics', ?,
          'skip_brand_metrics', ?,
          'skip_sales_forecasts', ?
        $1,
        $2,
        $3,
        jsonb_build_object(
          'skip_product_metrics', ($4::int > 0),
          'skip_time_aggregates', ($5::int > 0),
          'skip_financial_metrics', ($6::int > 0),
          'skip_vendor_metrics', ($7::int > 0),
          'skip_category_metrics', ($8::int > 0),
          'skip_brand_metrics', ($9::int > 0),
          'skip_sales_forecasts', ($10::int > 0)
        )
      )
      ) RETURNING id
    `, [
      totalProducts,
      totalOrders,
@@ -180,8 +220,7 @@ async function calculateMetrics() {
      SKIP_BRAND_METRICS,
      SKIP_SALES_FORECASTS
    ]);
    calculateHistoryId = historyResult.insertId;
    connection.release();
    calculateHistoryId = historyResult.rows[0].id;

    // Add debug logging for the progress functions
    console.log('Debug - Progress functions:', {
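The hunk above rewrites mysql-style `?` placeholders to postgres-style `$1..$n` by hand. The mechanical part of that conversion can be sketched as a helper; this is a naive, illustrative version (it would also rewrite `?` characters inside string literals, so it is not safe for arbitrary SQL):

```javascript
// Naive sketch: number each `?` left to right as $1, $2, ...
// Does NOT handle `?` inside quoted literals - illustrative only.
function toPgPlaceholders(sql) {
  let i = 0;
  return sql.replace(/\?/g, () => `$${++i}`);
}

console.log(toPgPlaceholders('UPDATE t SET a = ?, b = ? WHERE id = ?'));
// UPDATE t SET a = $1, b = $2 WHERE id = $3
```

Note that the diff also changes semantics, not just syntax: `JSON_OBJECT(...)` becomes `jsonb_build_object(...)` and `insertId` becomes `RETURNING id`, neither of which a placeholder rewrite can automate.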
@@ -199,6 +238,8 @@ async function calculateMetrics() {
      throw err;
    }

    // Release the connection before getting a new one
    connection.release();
    isCancelled = false;
    connection = await getConnection();

@@ -234,10 +275,10 @@ async function calculateMetrics() {
      await connection.query(`
        UPDATE calculate_history
        SET
          processed_products = ?,
          processed_orders = ?,
          processed_purchase_orders = ?
        WHERE id = ?
          processed_products = $1,
          processed_orders = $2,
          processed_purchase_orders = $3
        WHERE id = $4
      `, [safeProducts, safeOrders, safePurchaseOrders, calculateHistoryId]);
    };

@@ -359,216 +400,6 @@ async function calculateMetrics() {
|
||||
console.log('Skipping sales forecasts calculation');
|
||||
}
|
||||
|
||||
// Calculate ABC classification
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Starting ABC classification',
|
||||
current: processedProducts || 0,
|
||||
total: totalProducts || 0,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
||||
rate: calculateRate(startTime, processedProducts || 0),
|
||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
if (isCancelled) return {
|
||||
processedProducts: processedProducts || 0,
|
||||
processedOrders: processedOrders || 0,
|
||||
processedPurchaseOrders: 0,
|
||||
success: false
|
||||
};
|
||||
|
||||
const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
|
||||
const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };
|
||||
|
||||
// First, create and populate the rankings table with an index
|
||||
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
|
||||
await connection.query(`
|
||||
CREATE TEMPORARY TABLE temp_revenue_ranks (
|
||||
pid BIGINT NOT NULL,
|
||||
total_revenue DECIMAL(10,3),
|
||||
rank_num INT,
|
||||
total_count INT,
|
||||
PRIMARY KEY (pid),
|
||||
INDEX (rank_num)
|
||||
) ENGINE=MEMORY
|
||||
`);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Creating revenue rankings',
|
||||
current: processedProducts || 0,
|
||||
total: totalProducts || 0,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
||||
rate: calculateRate(startTime, processedProducts || 0),
|
||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
if (isCancelled) return {
|
||||
processedProducts: processedProducts || 0,
|
||||
processedOrders: processedOrders || 0,
|
||||
processedPurchaseOrders: 0,
|
||||
success: false
|
||||
};
|
||||
|
||||
await connection.query(`
|
||||
INSERT INTO temp_revenue_ranks
|
||||
SELECT
|
||||
pid,
|
||||
total_revenue,
|
||||
@rank := @rank + 1 as rank_num,
|
||||
@total_count := @rank as total_count
|
||||
FROM (
|
||||
SELECT pid, total_revenue
|
||||
FROM product_metrics
|
||||
WHERE total_revenue > 0
|
||||
ORDER BY total_revenue DESC
|
||||
) ranked,
|
||||
(SELECT @rank := 0) r
|
||||
`);
|
||||
|
||||
// Get total count for percentage calculation
|
||||
const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
|
||||
const totalCount = rankingCount[0].total_count || 1;
|
||||
const max_rank = totalCount; // Store max_rank for use in classification
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Updating ABC classifications',
|
||||
current: processedProducts || 0,
|
||||
total: totalProducts || 0,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
|
||||
rate: calculateRate(startTime, processedProducts || 0),
|
||||
percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
if (isCancelled) return {
|
||||
processedProducts: processedProducts || 0,
|
||||
processedOrders: processedOrders || 0,
|
||||
processedPurchaseOrders: 0,
|
||||
success: false
|
||||
};
|
||||
|
||||
// ABC classification progress tracking
|
||||
let abcProcessedCount = 0;
|
||||
const batchSize = 5000;
|
||||
let lastProgressUpdate = Date.now();
|
||||
const progressUpdateInterval = 1000; // Update every second
|
||||
|
||||
while (true) {
|
||||
if (isCancelled) return {
|
||||
processedProducts: Number(processedProducts) || 0,
|
||||
processedOrders: Number(processedOrders) || 0,
|
||||
processedPurchaseOrders: 0,
|
||||
success: false
|
||||
};
|
||||
|
||||
// First get a batch of PIDs that need updating
|
||||
const [pids] = await connection.query(`
|
||||
SELECT pm.pid
|
||||
FROM product_metrics pm
|
||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
||||
WHERE pm.abc_class IS NULL
|
||||
OR pm.abc_class !=
|
||||
CASE
|
||||
WHEN tr.rank_num IS NULL THEN 'C'
|
||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
|
||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
|
||||
ELSE 'C'
|
||||
END
|
||||
LIMIT ?
|
||||
`, [max_rank, abcThresholds.a_threshold,
|
||||
max_rank, abcThresholds.b_threshold,
|
||||
batchSize]);
|
||||
|
||||
if (pids.length === 0) {
|
||||
break;
|
||||
}
|
||||
|
||||
// Then update just those PIDs
|
||||
const [result] = await connection.query(`
|
||||
UPDATE product_metrics pm
|
||||
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
|
||||
SET pm.abc_class =
|
||||
CASE
|
||||
WHEN tr.rank_num IS NULL THEN 'C'
|
||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
|
||||
WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
|
||||
ELSE 'C'
|
||||
END,
|
||||
pm.last_calculated_at = NOW()
|
||||
WHERE pm.pid IN (?)
|
||||
`, [max_rank, abcThresholds.a_threshold,
|
||||
max_rank, abcThresholds.b_threshold,
|
||||
pids.map(row => row.pid)]);
|
||||
|
||||
abcProcessedCount += result.affectedRows;
|
||||
|
||||
// Calculate progress ensuring valid numbers
|
||||
const currentProgress = Math.floor(totalProducts * (0.99 + (abcProcessedCount / (totalCount || 1)) * 0.01));
|
||||
processedProducts = Number(currentProgress) || processedProducts || 0;
|
||||
|
||||
// Only update progress at most once per second
|
||||
const now = Date.now();
|
||||
if (now - lastProgressUpdate >= progressUpdateInterval) {
|
||||
const progress = ensureValidProgress(processedProducts, totalProducts);
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'ABC classification progress',
|
||||
current: progress.current,
|
||||
total: progress.total,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, progress.current, progress.total),
|
||||
rate: calculateRate(startTime, progress.current),
|
||||
percentage: progress.percentage,
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
lastProgressUpdate = now;
|
||||
}
|
||||
|
||||
// Update database progress
|
||||
await updateProgress(processedProducts, processedOrders, processedPurchaseOrders);
|
||||
|
||||
// Small delay between batches to allow other transactions
|
||||
await new Promise(resolve => setTimeout(resolve, 100));
|
||||
}
|
||||
|
||||
// Clean up
|
||||
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
|
||||
|
||||
const endTime = Date.now();
|
||||
const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
|
||||
|
||||
// Update calculate_status for ABC classification
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('abc_classification', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
// Final progress update with guaranteed valid numbers
|
||||
const finalProgress = ensureValidProgress(totalProducts, totalProducts);
|
||||
|
||||
@@ -578,14 +409,14 @@ async function calculateMetrics() {
      operation: 'Metrics calculation complete',
      current: finalProgress.current,
      total: finalProgress.total,
      elapsed: formatElapsedTime(startTime),
      elapsed: global.formatElapsedTime(startTime),
      remaining: '0s',
      rate: calculateRate(startTime, finalProgress.current),
      rate: global.calculateRate(startTime, finalProgress.current),
      percentage: '100',
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: totalElapsedSeconds
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

@@ -601,13 +432,13 @@ async function calculateMetrics() {
        UPDATE calculate_history
        SET
          end_time = NOW(),
          duration_seconds = ?,
          processed_products = ?,
          processed_orders = ?,
          processed_purchase_orders = ?,
          duration_seconds = $1,
          processed_products = $2,
          processed_orders = $3,
          processed_purchase_orders = $4,
          status = 'completed'
        WHERE id = ?
      `, [totalElapsedSeconds,
        WHERE id = $5
      `, [Math.round((Date.now() - startTime) / 1000),
        finalStats.processedProducts,
        finalStats.processedOrders,
        finalStats.processedPurchaseOrders,
@@ -616,6 +447,11 @@ async function calculateMetrics() {
    // Clear progress file on successful completion
    global.clearProgress();

    return {
      success: true,
      message: 'Calculation completed successfully',
      duration: Math.round((Date.now() - startTime) / 1000)
    };
  } catch (error) {
    const endTime = Date.now();
    const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);
@@ -625,13 +461,13 @@ async function calculateMetrics() {
        UPDATE calculate_history
        SET
          end_time = NOW(),
          duration_seconds = ?,
          processed_products = ?,
          processed_orders = ?,
          processed_purchase_orders = ?,
          status = ?,
          error_message = ?
        WHERE id = ?
          duration_seconds = $1,
          processed_products = $2,
          processed_orders = $3,
          processed_purchase_orders = $4,
          status = $5,
          error_message = $6
        WHERE id = $7
      `, [
        totalElapsedSeconds,
        processedProducts || 0, // Ensure we have a valid number
@@ -677,17 +513,38 @@ async function calculateMetrics() {
    }
    throw error;
  } finally {
    // Clear the timeout to prevent forced termination
    clearTimeout(timeout);

    // Always clean up and release connection
    if (connection) {
      // Ensure temporary tables are cleaned up
      await cleanupTemporaryTables(connection);
      connection.release();
      try {
        await cleanupTemporaryTables(connection);
        connection.release();
      } catch (err) {
        console.error('Error in final cleanup:', err);
      }
    }
    // Close the connection pool when we're done
    await closePool();
  }
} catch (error) {
  success = false;
  logError(error, 'Error in metrics calculation');
  console.error('Error in metrics calculation', error);

  try {
    if (connection) {
      await connection.query(`
        UPDATE calculate_history
        SET
          status = 'error',
          end_time = NOW(),
          duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
          error_message = $1
        WHERE id = $2
      `, [error.message.substring(0, 500), calculateHistoryId]);
    }
  } catch (updateError) {
    console.error('Error updating calculation history:', updateError);
  }

  throw error;
}
}
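The `finally`-block hardening in the hunk above wraps cleanup in its own try/catch because cleanup itself can throw (for instance, when the connection has already died), and an exception there would otherwise skip the pool shutdown. A minimal sketch of that shape, with all names as stand-ins:

```javascript
// Illustrative only: work/cleanup/close stand in for the calculation,
// cleanupTemporaryTables + release, and closePool respectively.
async function runWithCleanup(work, cleanup, close) {
  try {
    return await work();
  } finally {
    try {
      await cleanup(); // may throw; must not prevent close()
    } catch (err) {
      console.error('Error in final cleanup:', err);
    }
    await close(); // always close the pool, even if cleanup failed
  }
}
```

The key property: a throwing `cleanup` is logged and swallowed, so `close` still runs and the original result (or error) of `work` still propagates.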
@@ -10,9 +10,9 @@ const importPurchaseOrders = require('./import/purchase-orders');
dotenv.config({ path: path.join(__dirname, "../.env") });

// Constants to control which imports run
const IMPORT_CATEGORIES = true;
const IMPORT_PRODUCTS = true;
const IMPORT_ORDERS = true;
const IMPORT_CATEGORIES = false;
const IMPORT_PRODUCTS = false;
const IMPORT_ORDERS = false;
const IMPORT_PURCHASE_ORDERS = true;

// Add flag for incremental updates
@@ -120,27 +120,38 @@ async function main() {
|
||||
`);
|
||||
|
||||
// Create import history record for the overall session
|
||||
const [historyResult] = await localConnection.query(`
|
||||
INSERT INTO import_history (
|
||||
table_name,
|
||||
start_time,
|
||||
is_incremental,
|
||||
status,
|
||||
additional_info
|
||||
) VALUES (
|
||||
'all_tables',
|
||||
NOW(),
|
||||
$1::boolean,
|
||||
'running',
|
||||
jsonb_build_object(
|
||||
'categories_enabled', $2::boolean,
|
||||
'products_enabled', $3::boolean,
|
||||
'orders_enabled', $4::boolean,
|
||||
'purchase_orders_enabled', $5::boolean
|
||||
)
|
||||
) RETURNING id
|
||||
`, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
|
||||
importHistoryId = historyResult.rows[0].id;
|
||||
try {
|
||||
const [historyResult] = await localConnection.query(`
|
||||
INSERT INTO import_history (
|
||||
table_name,
|
||||
start_time,
|
||||
is_incremental,
|
||||
status,
|
||||
additional_info
|
||||
) VALUES (
|
||||
'all_tables',
|
||||
NOW(),
|
||||
$1::boolean,
|
||||
'running',
|
||||
jsonb_build_object(
|
||||
'categories_enabled', $2::boolean,
|
||||
'products_enabled', $3::boolean,
|
||||
'orders_enabled', $4::boolean,
|
||||
'purchase_orders_enabled', $5::boolean
|
||||
)
|
||||
) RETURNING id
|
||||
`, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
|
||||
importHistoryId = historyResult.rows[0].id;
|
||||
} catch (error) {
|
||||
console.error("Error creating import history record:", error);
|
||||
outputProgress({
|
||||
status: "error",
|
||||
operation: "Import process",
|
||||
message: "Failed to create import history record",
|
||||
error: error.message
|
||||
});
|
||||
throw error;
|
||||
}
|
||||
|
||||
const results = {
|
||||
categories: null,
|
||||
@@ -158,8 +169,8 @@ async function main() {
      if (isImportCancelled) throw new Error("Import cancelled");
      completedSteps++;
      console.log('Categories import result:', results.categories);
      totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0);
      totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0);
      totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0) || 0;
      totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0) || 0;
    }

    if (IMPORT_PRODUCTS) {
@@ -167,8 +178,8 @@ async function main() {
      if (isImportCancelled) throw new Error("Import cancelled");
      completedSteps++;
      console.log('Products import result:', results.products);
      totalRecordsAdded += parseInt(results.products?.recordsAdded || 0);
      totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0);
      totalRecordsAdded += parseInt(results.products?.recordsAdded || 0) || 0;
      totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0) || 0;
    }

    if (IMPORT_ORDERS) {
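The repeated `parseInt(x || 0) || 0` change above guards the running totals: `parseInt(undefined)` and `parseInt('')` are `NaN`, and a single `NaN` would poison every subsequent `+=`. The trailing `|| 0` coerces `NaN` back to zero. A small helper makes the intent explicit (the `toCount` name is illustrative, not from the repo):

```javascript
// parseInt can return NaN for undefined/empty/non-numeric input;
// the trailing || 0 keeps the accumulator a real number.
const toCount = (value) => parseInt(value || 0, 10) || 0;

let totalRecordsAdded = 0;
totalRecordsAdded += toCount('17');      // 17
totalRecordsAdded += toCount(undefined); // + 0
totalRecordsAdded += toCount('abc');     // + 0
console.log(totalRecordsAdded); // 17
```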
@@ -176,17 +187,34 @@ async function main() {
      if (isImportCancelled) throw new Error("Import cancelled");
      completedSteps++;
      console.log('Orders import result:', results.orders);
      totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0);
      totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0);
      totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0) || 0;
      totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0) || 0;
    }

    if (IMPORT_PURCHASE_ORDERS) {
      results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
      if (isImportCancelled) throw new Error("Import cancelled");
      completedSteps++;
      console.log('Purchase orders import result:', results.purchaseOrders);
      totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0);
      totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0);
      try {
        results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
        if (isImportCancelled) throw new Error("Import cancelled");
        completedSteps++;
        console.log('Purchase orders import result:', results.purchaseOrders);

        // Handle potential error status
        if (results.purchaseOrders?.status === 'error') {
          console.error('Purchase orders import had an error:', results.purchaseOrders.error);
        } else {
          totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0) || 0;
          totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0) || 0;
        }
      } catch (error) {
        console.error('Error during purchase orders import:', error);
        // Continue with other imports, don't fail the whole process
        results.purchaseOrders = {
          status: 'error',
          error: error.message,
          recordsAdded: 0,
          recordsUpdated: 0
        };
      }
    }

    const endTime = Date.now();
@@ -214,8 +242,8 @@ async function main() {
      WHERE id = $12
    `, [
      totalElapsedSeconds,
      totalRecordsAdded,
      totalRecordsUpdated,
      parseInt(totalRecordsAdded) || 0,
      parseInt(totalRecordsUpdated) || 0,
      IMPORT_CATEGORIES,
      IMPORT_PRODUCTS,
      IMPORT_ORDERS,
@@ -47,42 +47,18 @@ async function importCategories(prodConnection, localConnection) {
|
||||
continue;
|
||||
}
|
||||
|
||||
console.log(`\nProcessing ${categories.length} type ${type} categories`);
|
||||
if (type === 10) {
|
||||
console.log("Type 10 categories:", JSON.stringify(categories, null, 2));
|
||||
}
|
||||
console.log(`Processing ${categories.length} type ${type} categories`);
|
||||
|
||||
// For types that can have parents (11, 21, 12, 13), verify parent existence
|
||||
// For types that can have parents (11, 21, 12, 13), we'll proceed directly
|
||||
// No need to check for parent existence since we process in hierarchical order
|
||||
let categoriesToInsert = categories;
|
||||
if (![10, 20].includes(type)) {
|
||||
// Get all parent IDs
|
||||
const parentIds = [
|
||||
...new Set(
|
||||
categories
|
||||
.filter(c => c && c.parent_id !== null)
|
||||
.map(c => c.parent_id)
|
||||
),
|
||||
];
|
||||
|
||||
console.log(`Processing ${categories.length} type ${type} categories with ${parentIds.length} unique parent IDs`);
|
||||
console.log('Parent IDs:', parentIds);
|
||||
|
||||
// No need to check for parent existence - we trust they exist since they were just inserted
|
||||
categoriesToInsert = categories;
|
||||
}
|
||||
|
||||
if (categoriesToInsert.length === 0) {
|
||||
console.log(
|
||||
`No valid categories of type ${type} to insert`
|
||||
);
|
||||
console.log(`No valid categories of type ${type} to insert`);
|
||||
await localConnection.query(`RELEASE SAVEPOINT category_type_${type}`);
|
||||
continue;
|
||||
}
|
||||
|
||||
console.log(
|
||||
`Inserting ${categoriesToInsert.length} type ${type} categories`
|
||||
);
|
||||
|
||||
// PostgreSQL upsert query with parameterized values
|
||||
const values = categoriesToInsert.flatMap((cat) => [
|
||||
cat.cat_id,
|
||||
@@ -95,14 +71,10 @@ async function importCategories(prodConnection, localConnection) {
|
||||
new Date()
|
||||
]);
|
||||
|
||||
console.log('Attempting to insert/update with values:', JSON.stringify(values, null, 2));
|
||||
|
||||
const placeholders = categoriesToInsert
|
||||
.map((_, i) => `($${i * 8 + 1}, $${i * 8 + 2}, $${i * 8 + 3}, $${i * 8 + 4}, $${i * 8 + 5}, $${i * 8 + 6}, $${i * 8 + 7}, $${i * 8 + 8})`)
|
||||
.join(',');
|
||||
|
||||
console.log('Using placeholders:', placeholders);
|
||||
|
||||
// Insert categories with ON CONFLICT clause for PostgreSQL
|
||||
const query = `
|
||||
WITH inserted_categories AS (
|
||||
@@ -129,17 +101,14 @@ async function importCategories(prodConnection, localConnection) {
|
||||
COUNT(*) FILTER (WHERE is_insert) as inserted,
|
||||
COUNT(*) FILTER (WHERE NOT is_insert) as updated
|
||||
FROM inserted_categories`;
|
||||
|
||||
console.log('Executing query:', query);
|
||||
|
||||
const result = await localConnection.query(query, values);
|
||||
console.log('Query result:', result);
|
||||
|
||||
// Get the first result since query returns an array
|
||||
const queryResult = Array.isArray(result) ? result[0] : result;
|
||||
|
||||
if (!queryResult || !queryResult.rows || !queryResult.rows[0]) {
|
||||
console.error('Query failed to return results. Result:', queryResult);
|
||||
console.error('Query failed to return results');
|
||||
throw new Error('Query did not return expected results');
|
||||
}
|
||||
|
||||
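The `$n` placeholder construction in the categories importer above maps each row of 8 values onto a `($1, ..., $8)` group, offsetting the indices per row so one INSERT carries the whole batch. A generalized sketch of that computation (the `buildPlaceholders` name is illustrative; the repo inlines it with `map`/`join`):

```javascript
// For rowCount rows of width columns, produce
// "($1, ..., $width),($width+1, ...)" with 1-based indices.
function buildPlaceholders(rowCount, width) {
  return Array.from({ length: rowCount }, (_, row) =>
    `(${Array.from({ length: width }, (_, col) => `$${row * width + col + 1}`).join(', ')})`
  ).join(',');
}

console.log(buildPlaceholders(2, 3)); // ($1, $2, $3),($4, $5, $6)
```

Paired with a `flatMap` over the rows to build the flat values array, the placeholder count and parameter count stay in lockstep by construction.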
@@ -173,6 +142,14 @@ async function importCategories(prodConnection, localConnection) {
|
||||
// Commit the entire transaction - we'll do this even if we have skipped categories
|
||||
await localConnection.query('COMMIT');
|
||||
|
||||
// Update sync status
|
||||
await localConnection.query(`
|
||||
INSERT INTO sync_status (table_name, last_sync_timestamp)
|
||||
VALUES ('categories', NOW())
|
||||
ON CONFLICT (table_name) DO UPDATE SET
|
||||
last_sync_timestamp = NOW()
|
||||
`);
|
||||
|
||||
outputProgress({
|
||||
status: "complete",
|
||||
operation: "Categories import completed",
|
||||
|
||||
@@ -26,6 +26,9 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
let cumulativeProcessedOrders = 0;
|
||||
|
||||
try {
|
||||
// Begin transaction
|
||||
await localConnection.beginTransaction();
|
||||
|
||||
// Get last sync info
|
||||
const [syncInfo] = await localConnection.query(
|
||||
"SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'orders'"
|
||||
@@ -38,7 +41,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
const [[{ total }]] = await prodConnection.query(`
|
||||
SELECT COUNT(*) as total
|
||||
FROM order_items oi
|
||||
USE INDEX (PRIMARY)
|
||||
JOIN _order o ON oi.order_id = o.order_id
|
||||
WHERE o.order_status >= 15
|
||||
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
|
||||
@@ -78,7 +80,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
COALESCE(oi.prod_price_reg - oi.prod_price, 0) as base_discount,
|
||||
oi.stamp as last_modified
|
||||
FROM order_items oi
|
||||
USE INDEX (PRIMARY)
|
||||
JOIN _order o ON oi.order_id = o.order_id
|
||||
WHERE o.order_status >= 15
|
||||
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
|
||||
@@ -105,15 +106,15 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
|
||||
console.log('Orders: Found', orderItems.length, 'order items to process');
|
||||
|
||||
// Create tables in PostgreSQL for debugging
|
||||
// Create tables in PostgreSQL for data processing
|
||||
await localConnection.query(`
|
||||
DROP TABLE IF EXISTS debug_order_items;
|
||||
DROP TABLE IF EXISTS debug_order_meta;
|
||||
DROP TABLE IF EXISTS debug_order_discounts;
|
||||
DROP TABLE IF EXISTS debug_order_taxes;
|
||||
DROP TABLE IF EXISTS debug_order_costs;
|
||||
DROP TABLE IF EXISTS temp_order_items;
|
||||
DROP TABLE IF EXISTS temp_order_meta;
|
||||
DROP TABLE IF EXISTS temp_order_discounts;
|
||||
DROP TABLE IF EXISTS temp_order_taxes;
|
||||
DROP TABLE IF EXISTS temp_order_costs;
|
||||
|
||||
CREATE TABLE debug_order_items (
|
||||
CREATE TEMP TABLE temp_order_items (
|
||||
order_id INTEGER NOT NULL,
|
||||
pid INTEGER NOT NULL,
|
||||
SKU VARCHAR(50) NOT NULL,
|
||||
@@ -123,7 +124,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
PRIMARY KEY (order_id, pid)
|
||||
);
|
||||
|
||||
CREATE TABLE debug_order_meta (
|
||||
CREATE TEMP TABLE temp_order_meta (
|
||||
order_id INTEGER NOT NULL,
|
||||
date DATE NOT NULL,
|
||||
customer VARCHAR(100) NOT NULL,
|
||||
@@ -135,26 +136,29 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
|
||||
PRIMARY KEY (order_id)
|
||||
);
|
||||
|
||||
CREATE TABLE debug_order_discounts (
|
||||
CREATE TEMP TABLE temp_order_discounts (
|
||||
order_id INTEGER NOT NULL,
|
||||
pid INTEGER NOT NULL,
|
||||
discount DECIMAL(10,2) NOT NULL,
|
||||
PRIMARY KEY (order_id, pid)
|
||||
);
|
||||
|
||||
CREATE TABLE debug_order_taxes (
|
||||
CREATE TEMP TABLE temp_order_taxes (
|
||||
order_id INTEGER NOT NULL,
|
||||
pid INTEGER NOT NULL,
|
||||
tax DECIMAL(10,2) NOT NULL,
|
||||
PRIMARY KEY (order_id, pid)
|
||||
);
|
||||
|
||||
CREATE TABLE debug_order_costs (
|
||||
CREATE TEMP TABLE temp_order_costs (
|
||||
order_id INTEGER NOT NULL,
|
||||
pid INTEGER NOT NULL,
|
||||
costeach DECIMAL(10,3) DEFAULT 0.000,
|
||||
PRIMARY KEY (order_id, pid)
|
||||
);
|
||||
|
||||
CREATE INDEX idx_temp_order_items_pid ON temp_order_items(pid);
|
||||
CREATE INDEX idx_temp_order_meta_order_id ON temp_order_meta(order_id);
|
||||
`);
|
||||
|
||||
      // Insert order items in batches
@@ -168,7 +172,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);

      await localConnection.query(`
-       INSERT INTO debug_order_items (order_id, pid, SKU, price, quantity, base_discount)
+       INSERT INTO temp_order_items (order_id, pid, SKU, price, quantity, base_discount)
        VALUES ${placeholders}
        ON CONFLICT (order_id, pid) DO UPDATE SET
          SKU = EXCLUDED.SKU,
@@ -202,6 +206,14 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
  const METADATA_BATCH_SIZE = 2000;
  const PG_BATCH_SIZE = 200;

+ // Add a helper function for title case conversion
+ function toTitleCase(str) {
+   if (!str) return '';
+   return str.toLowerCase().split(' ').map(word => {
+     return word.charAt(0).toUpperCase() + word.slice(1);
+   }).join(' ');
+ }

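The helper added above can be exercised on its own. Note that it lower-cases the whole string first, so all-caps words such as acronyms come out capitalized only on the first letter:

```javascript
// Same helper as added in the diff above.
function toTitleCase(str) {
  if (!str) return '';
  return str.toLowerCase().split(' ').map(word => {
    return word.charAt(0).toUpperCase() + word.slice(1);
  }).join(' ');
}

console.log(toTitleCase('JOHN DOE')); // "John Doe"
console.log(toTitleCase(''));         // ""
console.log(toTitleCase('ACME LLC')); // "Acme Llc" - acronyms are flattened
```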
  const processMetadataBatch = async (batchIds) => {
    const [orders] = await prodConnection.query(`
      SELECT
@@ -231,7 +243,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        order.order_id,
        order.date,
        order.customer,
-       order.customer_name || '',
+       toTitleCase(order.customer_name) || '',
        order.status,
        order.canceled,
        order.summary_discount || 0,
@@ -239,7 +251,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);

      await localConnection.query(`
-       INSERT INTO debug_order_meta (
+       INSERT INTO temp_order_meta (
          order_id, date, customer, customer_name, status, canceled,
          summary_discount, summary_subtotal
        )
@@ -281,7 +293,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);

      await localConnection.query(`
-       INSERT INTO debug_order_discounts (order_id, pid, discount)
+       INSERT INTO temp_order_discounts (order_id, pid, discount)
        VALUES ${placeholders}
        ON CONFLICT (order_id, pid) DO UPDATE SET
          discount = EXCLUDED.discount
@@ -321,7 +333,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);

      await localConnection.query(`
-       INSERT INTO debug_order_taxes (order_id, pid, tax)
+       INSERT INTO temp_order_taxes (order_id, pid, tax)
        VALUES ${placeholders}
        ON CONFLICT (order_id, pid) DO UPDATE SET
          tax = EXCLUDED.tax
@@ -330,14 +342,23 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
  };

  const processCostsBatch = async (batchIds) => {
+   // Modified query to ensure one row per order_id/pid by using a subquery
    const [costs] = await prodConnection.query(`
      SELECT
        oc.orderid as order_id,
        oc.pid,
        oc.costeach
      FROM order_costs oc
-     WHERE oc.orderid IN (?)
-       AND oc.pending = 0
+     INNER JOIN (
+       SELECT
+         orderid,
+         pid,
+         MAX(id) as max_id
+       FROM order_costs
+       WHERE orderid IN (?)
+         AND pending = 0
+       GROUP BY orderid, pid
+     ) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
    `, [batchIds]);

    if (costs.length === 0) return;
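The `MAX(id)` join above is the standard "latest row per key" deduplication: the subquery picks the newest `id` for each `(orderid, pid)` pair and the outer join keeps only that row. A standalone sketch of the same shape (table and column names from the diff, data hypothetical):

```sql
-- Keep only the newest non-pending cost row for each (orderid, pid) pair.
SELECT oc.orderid, oc.pid, oc.costeach
FROM order_costs oc
INNER JOIN (
    SELECT orderid, pid, MAX(id) AS max_id
    FROM order_costs
    WHERE pending = 0
    GROUP BY orderid, pid
) latest
    ON  oc.orderid = latest.orderid
    AND oc.pid     = latest.pid
    AND oc.id      = latest.max_id;
```

Without this join, several cost revisions for the same pair would all flow into the staging insert and fight over the same primary key.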
@@ -357,7 +378,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);

      await localConnection.query(`
-       INSERT INTO debug_order_costs (order_id, pid, costeach)
+       INSERT INTO temp_order_costs (order_id, pid, costeach)
        VALUES ${placeholders}
        ON CONFLICT (order_id, pid) DO UPDATE SET
          costeach = EXCLUDED.costeach
@@ -416,11 +437,12 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        oi.pid,
        SUM(COALESCE(od.discount, 0)) as promo_discount,
        COALESCE(ot.tax, 0) as total_tax,
-       COALESCE(oi.price * 0.5, 0) as costeach
-     FROM debug_order_items oi
-     LEFT JOIN debug_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
-     LEFT JOIN debug_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
-     GROUP BY oi.order_id, oi.pid, ot.tax
+       COALESCE(oc.costeach, oi.price * 0.5) as costeach
+     FROM temp_order_items oi
+     LEFT JOIN temp_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
+     LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
+     LEFT JOIN temp_order_costs oc ON oi.order_id = oc.order_id AND oi.pid = oc.pid
+     GROUP BY oi.order_id, oi.pid, ot.tax, oc.costeach
      )
      SELECT
        oi.order_id as order_number,
@@ -447,11 +469,11 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      FROM (
        SELECT DISTINCT ON (order_id, pid)
          order_id, pid, SKU, price, quantity, base_discount
-       FROM debug_order_items
+       FROM temp_order_items
        WHERE order_id = ANY($1)
        ORDER BY order_id, pid
      ) oi
-     JOIN debug_order_meta om ON oi.order_id = om.order_id
+     JOIN temp_order_meta om ON oi.order_id = om.order_id
      LEFT JOIN order_totals ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
      ORDER BY oi.order_id, oi.pid
    `, [subBatchIds]);
@@ -478,8 +500,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      const subBatch = validOrders.slice(k, k + FINAL_BATCH_SIZE);

      const placeholders = subBatch.map((_, idx) => {
-       const base = idx * 14; // 14 columns (removed updated)
-       return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14})`;
+       const base = idx * 15; // 15 columns including costeach
+       return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
      }).join(',');
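The `$n` placeholder arithmetic above (base offset per row, sequential numbers per column) is easy to get wrong by one, which is exactly what the 14-to-15 column change has to keep in sync. A hypothetical standalone generator shows the pattern for any row/column count:

```javascript
// Build "($1, $2, $3),($4, $5, $6)"-style placeholder lists for
// multi-row Postgres inserts. rowCount/colCount are hypothetical inputs,
// not names from the diff.
function buildPlaceholders(rowCount, colCount) {
  return Array.from({ length: rowCount }, (_, idx) => {
    const base = idx * colCount; // each row starts colCount positions later
    const cols = Array.from({ length: colCount }, (_, c) => `$${base + c + 1}`);
    return `(${cols.join(', ')})`;
  }).join(',');
}

console.log(buildPlaceholders(2, 3)); // "($1, $2, $3),($4, $5, $6)"
```

The flattened values array passed to `query()` must have exactly `rowCount * colCount` entries, in the same column order for every row.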
      const batchValues = subBatch.flatMap(o => [
@@ -496,7 +518,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        o.customer,
        o.customer_name,
        o.status,
-       o.canceled
+       o.canceled,
+       o.costeach
      ]);

      const [result] = await localConnection.query(`
@@ -504,7 +527,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        INSERT INTO orders (
          order_number, pid, sku, date, price, quantity, discount,
          tax, tax_included, shipping, customer, customer_name,
-         status, canceled
+         status, canceled, costeach
        )
        VALUES ${placeholders}
        ON CONFLICT (order_number, pid) DO UPDATE SET
@@ -519,7 +542,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
          customer = EXCLUDED.customer,
          customer_name = EXCLUDED.customer_name,
          status = EXCLUDED.status,
-         canceled = EXCLUDED.canceled
+         canceled = EXCLUDED.canceled,
+         costeach = EXCLUDED.costeach
        RETURNING xmax = 0 as inserted
      )
      SELECT
@@ -529,8 +553,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      `, batchValues);

      const { inserted, updated } = result.rows[0];
-     recordsAdded += inserted;
-     recordsUpdated += updated;
+     recordsAdded += parseInt(inserted) || 0;
+     recordsUpdated += parseInt(updated) || 0;
      importedCount += subBatch.length;
    }
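The `RETURNING xmax = 0 as inserted` trick above distinguishes inserts from updates inside one upsert: a freshly inserted row version has `xmax = 0`, while a row rewritten by `ON CONFLICT ... DO UPDATE` does not. This relies on a Postgres implementation detail rather than documented SQL semantics, but it is a common idiom. A minimal sketch on a hypothetical table:

```sql
-- Hypothetical demo table, not part of the diff.
CREATE TABLE demo (id INT PRIMARY KEY, val TEXT);

WITH upserted AS (
    INSERT INTO demo (id, val)
    VALUES (1, 'a'), (2, 'b')
    ON CONFLICT (id) DO UPDATE SET val = EXCLUDED.val
    RETURNING xmax = 0 AS inserted
)
SELECT
    COUNT(*) FILTER (WHERE inserted)     AS inserted,
    COUNT(*) FILTER (WHERE NOT inserted) AS updated
FROM upserted;
```

The counts come back as bigints, which node-postgres surfaces as strings; that is why the JavaScript side wraps them in `parseInt(...) || 0`.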
@@ -555,19 +579,39 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ON CONFLICT (table_name) DO UPDATE SET
        last_sync_timestamp = NOW()
    `);

+   // Cleanup temporary tables
+   await localConnection.query(`
+     DROP TABLE IF EXISTS temp_order_items;
+     DROP TABLE IF EXISTS temp_order_meta;
+     DROP TABLE IF EXISTS temp_order_discounts;
+     DROP TABLE IF EXISTS temp_order_taxes;
+     DROP TABLE IF EXISTS temp_order_costs;
+   `);

    // Commit transaction
    await localConnection.commit();

    return {
      status: "complete",
-     totalImported: Math.floor(importedCount),
-     recordsAdded: recordsAdded || 0,
-     recordsUpdated: Math.floor(recordsUpdated),
-     totalSkipped: skippedOrders.size,
-     missingProducts: missingProducts.size,
+     totalImported: Math.floor(importedCount) || 0,
+     recordsAdded: parseInt(recordsAdded) || 0,
+     recordsUpdated: parseInt(recordsUpdated) || 0,
+     totalSkipped: skippedOrders.size || 0,
+     missingProducts: missingProducts.size || 0,
      incrementalUpdate,
      lastSyncTime
    };
  } catch (error) {
    console.error("Error during orders import:", error);

    // Rollback transaction
    try {
      await localConnection.rollback();
    } catch (rollbackError) {
      console.error("Error during rollback:", rollbackError);
    }

    throw error;
  }
}

@@ -1,10 +1,13 @@
  const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');
- const BATCH_SIZE = 100; // Smaller batch size for better progress tracking
+ const BATCH_SIZE = 1000; // Larger batch size for better throughput
  const MAX_RETRIES = 3;
  const RETRY_DELAY = 5000; // 5 seconds
+ const dotenv = require("dotenv");
+ const path = require("path");
+ dotenv.config({ path: path.join(__dirname, "../../.env") });

  // Utility functions
- const imageUrlBase = 'https://sbing.com/i/products/0000/';
+ const imageUrlBase = process.env.PRODUCT_IMAGE_URL_BASE || 'https://sbing.com/i/products/0000/';
  const getImageUrls = (pid, iid = 1) => {
    const paddedPid = pid.toString().padStart(6, '0');
    // Use padded PID only for the first 3 digits
@@ -18,7 +21,7 @@ const getImageUrls = (pid, iid = 1) => {
    };
  };

- // Add helper function for retrying operations
+ // Add helper function for retrying operations with exponential backoff
  async function withRetry(operation, errorMessage) {
    let lastError;
    for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
@@ -28,7 +31,8 @@ async function withRetry(operation, errorMessage) {
      lastError = error;
      console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
      if (attempt < MAX_RETRIES) {
-       await new Promise(resolve => setTimeout(resolve, RETRY_DELAY));
+       const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
+       await new Promise(resolve => setTimeout(resolve, backoffTime));
      }
    }
  }
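The change above turns a fixed 5-second retry delay into exponential backoff: the wait doubles after each failed attempt. The delay schedule it produces can be checked in isolation:

```javascript
const RETRY_DELAY = 5000; // base delay in ms, as in the diff
const MAX_RETRIES = 3;

// Delay before retrying after attempt N (1-based), doubling each time.
function backoffTime(attempt) {
  return RETRY_DELAY * Math.pow(2, attempt - 1);
}

// withRetry only sleeps when another attempt remains, so with
// MAX_RETRIES = 3 there are waits after attempts 1 and 2 only.
const schedule = [];
for (let attempt = 1; attempt < MAX_RETRIES; attempt++) {
  schedule.push(backoffTime(attempt));
}
console.log(schedule); // [ 5000, 10000 ]
```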
@@ -140,10 +144,12 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
        CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
        CASE
          WHEN p.reorder < 0 THEN 0
+         WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
+         WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
          WHEN (
            (COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
-           OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
-           OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
+           AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
+           AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
          ) THEN 0
          ELSE 1
        END AS replenishable,
@@ -155,7 +161,11 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
        COALESCE(p.sellingprice, 0) AS regular_price,
        CASE
          WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
-         THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
+         THEN (
+           SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
+           FROM product_inventory
+           WHERE pid = p.pid AND count > 0
+         )
          ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
        END AS cost_price,
        NULL as landing_cost_price,
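The `cost_price` change swaps a plain `AVG(costeach)` for a quantity-weighted average, `SUM(costeach * count) / SUM(count)`, so large inventory lots dominate the unit cost. The difference is easy to see with hypothetical numbers:

```javascript
// Two inventory lots for one product (hypothetical data):
// 100 units bought at $1.00, 10 units bought at $5.00.
const lots = [
  { costeach: 1.0, count: 100 },
  { costeach: 5.0, count: 10 },
];

// Plain average of the lot costs ignores quantities.
const plainAvg = lots.reduce((s, l) => s + l.costeach, 0) / lots.length;

// Weighted average: SUM(costeach * count) / SUM(count), as in the new SQL.
const weighted =
  lots.reduce((s, l) => s + l.costeach * l.count, 0) /
  lots.reduce((s, l) => s + l.count, 0);

console.log(plainAvg);            // 3
console.log(weighted.toFixed(5)); // "1.36364"
```

With the old `AVG`, the 10-unit lot at $5 inflated the cost to $3/unit; the weighted form reflects that most of the stock cost about $1.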
@@ -183,7 +193,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
        p.country_of_origin,
        (SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
        (SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
-       p.totalsold AS total_sold,
+       (SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
        pls.date_sold as date_last_sold,
        GROUP_CONCAT(DISTINCT CASE
          WHEN pc.cat_id IS NOT NULL
@@ -233,7 +243,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
        row.pid,
        row.title,
        row.description,
-       row.itemnumber || '',
+       row.sku || '',
        row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
        row.preorder_count,
        row.notions_inv_count,
@@ -335,10 +345,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen
        CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
        CASE
          WHEN p.reorder < 0 THEN 0
+         WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
+         WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
          WHEN (
            (COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
-           OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
-           OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
+           AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
+           AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
          ) THEN 0
          ELSE 1
        END AS replenishable,
@@ -350,7 +362,11 @@ async function materializeCalculations(prodConnection, localConnection, incremen
        COALESCE(p.sellingprice, 0) AS regular_price,
        CASE
          WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
-         THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
+         THEN (
+           SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
+           FROM product_inventory
+           WHERE pid = p.pid AND count > 0
+         )
          ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
        END AS cost_price,
        NULL as landing_cost_price,
@@ -378,7 +394,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
        p.country_of_origin,
        (SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
        (SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
-       p.totalsold AS total_sold,
+       (SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
        pls.date_sold as date_last_sold,
        GROUP_CONCAT(DISTINCT CASE
          WHEN pc.cat_id IS NOT NULL
@@ -432,7 +448,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
        row.pid,
        row.title,
        row.description,
-       row.itemnumber || '',
+       row.sku || '',
        row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
        row.preorder_count,
        row.notions_inv_count,
@@ -772,32 +788,44 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
      recordsAdded += parseInt(result.rows[0].inserted, 10) || 0;
      recordsUpdated += parseInt(result.rows[0].updated, 10) || 0;

-     // Process category relationships for each product in the batch
+     // Process category relationships in batches
+     const allCategories = [];
      for (const row of batch) {
        if (row.categories) {
          const categoryIds = row.categories.split(',').filter(id => id && id.trim());
          if (categoryIds.length > 0) {
-           const catPlaceholders = categoryIds.map((_, idx) =>
-             `($${idx * 2 + 1}, $${idx * 2 + 2})`
-           ).join(',');
-           const catValues = categoryIds.flatMap(catId => [row.pid, parseInt(catId.trim(), 10)]);
-
-           // First delete existing relationships for this product
-           await localConnection.query(
-             'DELETE FROM product_categories WHERE pid = $1',
-             [row.pid]
-           );
-
-           // Then insert the new relationships
-           await localConnection.query(`
-             INSERT INTO product_categories (pid, cat_id)
-             VALUES ${catPlaceholders}
-             ON CONFLICT (pid, cat_id) DO NOTHING
-           `, catValues);
+           categoryIds.forEach(catId => {
+             allCategories.push([row.pid, parseInt(catId.trim(), 10)]);
+           });
          }
        }
      }

+     // If we have categories to process
+     if (allCategories.length > 0) {
+       // First get all products in this batch
+       const productIds = batch.map(p => p.pid);
+
+       // Delete all existing relationships for products in this batch
+       await localConnection.query(
+         'DELETE FROM product_categories WHERE pid = ANY($1)',
+         [productIds]
+       );
+
+       // Insert all new relationships in one batch
+       const catPlaceholders = allCategories.map((_, idx) =>
+         `($${idx * 2 + 1}, $${idx * 2 + 2})`
+       ).join(',');
+
+       const catValues = allCategories.flat();
+
+       await localConnection.query(`
+         INSERT INTO product_categories (pid, cat_id)
+         VALUES ${catPlaceholders}
+         ON CONFLICT (pid, cat_id) DO NOTHING
+       `, catValues);
+     }

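The reworked loop above collects every `(pid, cat_id)` pair first, then issues one `DELETE ... WHERE pid = ANY($1)` and one multi-row `INSERT` per batch, instead of two round trips per product. The pair-placeholder construction it relies on can be tested on its own (hypothetical data):

```javascript
// Hypothetical collected pairs, shaped like allCategories in the diff.
const allCategories = [
  [101, 7],
  [101, 9],
  [102, 7],
];

// One "($a, $b)" placeholder pair per relationship.
const catPlaceholders = allCategories
  .map((_, idx) => `($${idx * 2 + 1}, $${idx * 2 + 2})`)
  .join(',');

// Flattened parameter list matching the placeholders positionally.
const catValues = allCategories.flat();

console.log(catPlaceholders); // "($1, $2),($3, $4),($5, $6)"
console.log(catValues);       // [ 101, 7, 101, 9, 102, 7 ]
```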
      outputProgress({
        status: "running",
        operation: "Products import",
@@ -816,6 +844,14 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
    // Commit the transaction
    await localConnection.commit();

+   // Update sync status
+   await localConnection.query(`
+     INSERT INTO sync_status (table_name, last_sync_timestamp)
+     VALUES ('products', NOW())
+     ON CONFLICT (table_name) DO UPDATE SET
+       last_sync_timestamp = NOW()
+   `);

    return {
      status: 'complete',
      recordsAdded,

File diff suppressed because it is too large
@@ -32,12 +32,12 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
  }

  // Get order count that will be processed
- const [orderCount] = await connection.query(`
+ const orderCount = await connection.query(`
    SELECT COUNT(*) as count
    FROM orders o
    WHERE o.canceled = false
  `);
- processedOrders = orderCount[0].count;
+ processedOrders = parseInt(orderCount.rows[0].count);
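The change above reflects the different result shapes of the two drivers: mysql2's `query()` resolves to a `[rows, fields]` tuple (hence the old destructuring), while node-postgres resolves to a result object with a `rows` array, and returns `COUNT(*)` as a string because it is a bigint. A sketch with mocked results, no real database involved:

```javascript
// Mocked driver results (shapes only; field metadata elided).
const mysqlResult = [[{ count: 42 }], []];    // mysql2: [rows, fields]
const pgResult = { rows: [{ count: '42' }] }; // pg: { rows: [...] }

// mysql2 style: destructure the rows out of the tuple.
const [mysqlRows] = mysqlResult;
const fromMysql = mysqlRows[0].count;            // already a number

// pg style: read .rows, and parseInt because COUNT(*) arrives as a string.
const fromPg = parseInt(pgResult.rows[0].count);

console.log(fromMysql, fromPg); // 42 42
```

Keeping the old `[orderCount]` destructuring against a pg result would silently have grabbed the wrong thing, which is why this diff touches every `connection.query` call site.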
  outputProgress({
    status: 'running',
@@ -98,14 +98,14 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
        SUM(o.quantity * (o.price - COALESCE(o.discount, 0) - p.cost_price)) as period_margin,
        COUNT(DISTINCT DATE(o.date)) as period_days,
        CASE
-         WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH) THEN 'current'
-         WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
-           AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH) THEN 'previous'
+         WHEN o.date >= CURRENT_DATE - INTERVAL '3 months' THEN 'current'
+         WHEN o.date BETWEEN CURRENT_DATE - INTERVAL '15 months'
+           AND CURRENT_DATE - INTERVAL '12 months' THEN 'previous'
        END as period_type
      FROM filtered_products p
      JOIN orders o ON p.pid = o.pid
      WHERE o.canceled = false
-       AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
+       AND o.date >= CURRENT_DATE - INTERVAL '15 months'
      GROUP BY p.brand, period_type
    ),
    brand_data AS (
@@ -165,15 +165,16 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
    LEFT JOIN sales_periods sp ON bd.brand = sp.brand
    GROUP BY bd.brand, bd.product_count, bd.active_products, bd.total_stock_units,
      bd.total_stock_cost, bd.total_stock_retail, bd.total_revenue, bd.avg_margin
-   ON DUPLICATE KEY UPDATE
-     product_count = VALUES(product_count),
-     active_products = VALUES(active_products),
-     total_stock_units = VALUES(total_stock_units),
-     total_stock_cost = VALUES(total_stock_cost),
-     total_stock_retail = VALUES(total_stock_retail),
-     total_revenue = VALUES(total_revenue),
-     avg_margin = VALUES(avg_margin),
-     growth_rate = VALUES(growth_rate),
+   ON CONFLICT (brand) DO UPDATE
+   SET
+     product_count = EXCLUDED.product_count,
+     active_products = EXCLUDED.active_products,
+     total_stock_units = EXCLUDED.total_stock_units,
+     total_stock_cost = EXCLUDED.total_stock_cost,
+     total_stock_retail = EXCLUDED.total_stock_retail,
+     total_revenue = EXCLUDED.total_revenue,
+     avg_margin = EXCLUDED.avg_margin,
+     growth_rate = EXCLUDED.growth_rate,
      last_calculated_at = CURRENT_TIMESTAMP
  `);
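The conversion above follows the general MySQL-to-Postgres upsert mapping: `ON DUPLICATE KEY UPDATE col = VALUES(col)` becomes `ON CONFLICT (key_cols) DO UPDATE SET col = EXCLUDED.col`, with the conflict target named explicitly (Postgres does not infer it from the unique key). Side by side, on a simplified version of the table:

```sql
-- MySQL: the unique key is implicit.
INSERT INTO brand_metrics (brand, total_revenue)
VALUES ('Acme', 100)
ON DUPLICATE KEY UPDATE total_revenue = VALUES(total_revenue);

-- Postgres: the conflict target must be spelled out, and the
-- incoming row is referenced via EXCLUDED.
INSERT INTO brand_metrics (brand, total_revenue)
VALUES ('Acme', 100)
ON CONFLICT (brand) DO UPDATE
SET total_revenue = EXCLUDED.total_revenue;
```

The named conflict target also means each converted statement depends on a matching unique constraint existing, e.g. on `(brand)` here and on `(brand, year, month)` for the monthly table.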
@@ -230,8 +231,8 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
    monthly_metrics AS (
      SELECT
        p.brand,
-       YEAR(o.date) as year,
-       MONTH(o.date) as month,
+       EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
+       EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
        COUNT(DISTINCT p.valid_pid) as product_count,
        COUNT(DISTINCT p.active_pid) as active_products,
        SUM(p.valid_stock) as total_stock_units,
@@ -255,19 +256,20 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
      END as avg_margin
      FROM filtered_products p
      LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
-     WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
-     GROUP BY p.brand, YEAR(o.date), MONTH(o.date)
+     WHERE o.date >= CURRENT_DATE - INTERVAL '12 months'
+     GROUP BY p.brand, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
    )
    SELECT *
    FROM monthly_metrics
-   ON DUPLICATE KEY UPDATE
-     product_count = VALUES(product_count),
-     active_products = VALUES(active_products),
-     total_stock_units = VALUES(total_stock_units),
-     total_stock_cost = VALUES(total_stock_cost),
-     total_stock_retail = VALUES(total_stock_retail),
-     total_revenue = VALUES(total_revenue),
-     avg_margin = VALUES(avg_margin)
+   ON CONFLICT (brand, year, month) DO UPDATE
+   SET
+     product_count = EXCLUDED.product_count,
+     active_products = EXCLUDED.active_products,
+     total_stock_units = EXCLUDED.total_stock_units,
+     total_stock_cost = EXCLUDED.total_stock_cost,
+     total_stock_retail = EXCLUDED.total_stock_retail,
+     total_revenue = EXCLUDED.total_revenue,
+     avg_margin = EXCLUDED.avg_margin
  `);

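`YEAR()` and `MONTH()` have no direct Postgres equivalents; the replacement used throughout this diff is `EXTRACT(field FROM source)`, which must then be repeated verbatim in `GROUP BY` (or replaced by a positional or alias reference). A minimal comparison:

```sql
-- MySQL
SELECT YEAR(o.date) AS year, MONTH(o.date) AS month, SUM(o.quantity)
FROM orders o
GROUP BY YEAR(o.date), MONTH(o.date);

-- Postgres: EXTRACT takes a field keyword and a timestamp expression;
-- the ::timestamp with time zone cast mirrors the diff above.
SELECT EXTRACT(YEAR FROM o.date::timestamp with time zone) AS year,
       EXTRACT(MONTH FROM o.date::timestamp with time zone) AS month,
       SUM(o.quantity)
FROM orders o
GROUP BY EXTRACT(YEAR FROM o.date::timestamp with time zone),
         EXTRACT(MONTH FROM o.date::timestamp with time zone);
```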
  processedCount = Math.floor(totalProducts * 0.99);
@@ -294,7 +296,8 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount =
  await connection.query(`
    INSERT INTO calculate_status (module_name, last_calculation_timestamp)
    VALUES ('brand_metrics', NOW())
-   ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
+   ON CONFLICT (module_name) DO UPDATE
+   SET last_calculation_timestamp = NOW()
  `);

  return {

@@ -32,12 +32,12 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
  }

  // Get order count that will be processed
- const [orderCount] = await connection.query(`
+ const orderCount = await connection.query(`
    SELECT COUNT(*) as count
    FROM orders o
    WHERE o.canceled = false
  `);
- processedOrders = orderCount[0].count;
+ processedOrders = parseInt(orderCount.rows[0].count);

  outputProgress({
    status: 'running',
@@ -76,12 +76,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
    LEFT JOIN product_categories pc ON c.cat_id = pc.cat_id
    LEFT JOIN products p ON pc.pid = p.pid
    GROUP BY c.cat_id, c.status
-   ON DUPLICATE KEY UPDATE
-     product_count = VALUES(product_count),
-     active_products = VALUES(active_products),
-     total_value = VALUES(total_value),
-     status = VALUES(status),
-     last_calculated_at = VALUES(last_calculated_at)
+   ON CONFLICT (category_id) DO UPDATE
+   SET
+     product_count = EXCLUDED.product_count,
+     active_products = EXCLUDED.active_products,
+     total_value = EXCLUDED.total_value,
+     status = EXCLUDED.status,
+     last_calculated_at = EXCLUDED.last_calculated_at
  `);

  processedCount = Math.floor(totalProducts * 0.90);
@@ -127,17 +128,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
      (tc.category_id IS NULL AND tc.vendor = p.vendor) OR
      (tc.category_id IS NULL AND tc.vendor IS NULL)
    WHERE o.canceled = false
-     AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL COALESCE(tc.calculation_period_days, 30) DAY)
+     AND o.date >= CURRENT_DATE - (COALESCE(tc.calculation_period_days, 30) || ' days')::INTERVAL
    GROUP BY pc.cat_id
  )
- UPDATE category_metrics cm
- JOIN category_sales cs ON cm.category_id = cs.cat_id
- LEFT JOIN turnover_config tc ON
-   (tc.category_id = cm.category_id AND tc.vendor IS NULL) OR
-   (tc.category_id IS NULL AND tc.vendor IS NULL)
+ UPDATE category_metrics
  SET
-   cm.avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
-   cm.turnover_rate = CASE
+   avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
+   turnover_rate = CASE
      WHEN cs.avg_stock > 0 AND cs.active_days > 0
      THEN LEAST(
        (cs.units_sold / cs.avg_stock) * (365.0 / cs.active_days),
@@ -145,7 +142,9 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
      )
      ELSE 0
    END,
-   cm.last_calculated_at = NOW()
+   last_calculated_at = NOW()
+ FROM category_sales cs
+ WHERE category_id = cs.cat_id
  `);

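Postgres has no `UPDATE ... JOIN`; the rewrite above uses the `UPDATE ... SET ... FROM ... WHERE` form, where the joined relation moves into `FROM` and the join condition becomes the `WHERE` clause. Note that the target table's columns are written unqualified in `SET`. Skeleton of the two forms:

```sql
-- MySQL form (removed above):
-- UPDATE category_metrics cm
-- JOIN category_sales cs ON cm.category_id = cs.cat_id
-- SET cm.avg_margin = cs.margin;

-- Postgres form (added above):
UPDATE category_metrics
SET avg_margin = cs.margin
FROM category_sales cs
WHERE category_id = cs.cat_id;
```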
processedCount = Math.floor(totalProducts * 0.95);
|
||||
@@ -184,9 +183,9 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
FROM product_categories pc
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
||||
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '3 months'
|
||||
GROUP BY pc.cat_id
|
||||
),
|
||||
previous_period AS (
|
||||
@@ -198,26 +197,26 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
FROM product_categories pc
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
||||
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||
WHERE o.canceled = false
|
||||
AND o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
||||
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
|
||||
AND o.date BETWEEN CURRENT_DATE - INTERVAL '15 months'
|
||||
AND CURRENT_DATE - INTERVAL '12 months'
|
||||
GROUP BY pc.cat_id
|
||||
),
|
||||
trend_data AS (
|
||||
SELECT
|
||||
pc.cat_id,
|
||||
MONTH(o.date) as month,
|
||||
EXTRACT(MONTH FROM o.date) as month,
|
||||
SUM(o.quantity * (o.price - COALESCE(o.discount, 0)) /
|
||||
(1 + COALESCE(ss.seasonality_factor, 0))) as revenue,
|
||||
COUNT(DISTINCT DATE(o.date)) as days_in_month
|
||||
FROM product_categories pc
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
|
||||
LEFT JOIN sales_seasonality ss ON EXTRACT(MONTH FROM o.date) = ss.month
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
|
||||
GROUP BY pc.cat_id, MONTH(o.date)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '15 months'
|
||||
GROUP BY pc.cat_id, EXTRACT(MONTH FROM o.date)
|
||||
),
|
||||
trend_stats AS (
|
||||
SELECT
|
||||
@@ -261,16 +260,42 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
JOIN products p ON pc.pid = p.pid
|
||||
JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.canceled = false
|
||||
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
|
||||
AND o.date >= CURRENT_DATE - INTERVAL '3 months'
|
||||
GROUP BY pc.cat_id
|
||||
),
|
||||
combined_metrics AS (
|
||||
SELECT
|
||||
COALESCE(cp.cat_id, pp.cat_id) as category_id,
|
||||
CASE
|
||||
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
|
||||
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
|
||||
WHEN ta.trend_slope IS NOT NULL THEN
|
||||
GREATEST(
|
||||
-100.0,
|
||||
LEAST(
|
||||
(ta.trend_slope / NULLIF(ta.avg_daily_revenue, 0)) * 365 * 100,
|
||||
999.99
|
||||
)
|
||||
)
|
||||
ELSE
|
||||
GREATEST(
|
||||
-100.0,
|
||||
LEAST(
|
||||
((COALESCE(cp.revenue, 0) - pp.revenue) /
|
||||
NULLIF(ABS(pp.revenue), 0)) * 100.0,
|
||||
999.99
|
||||
)
|
||||
)
|
||||
END as growth_rate,
|
||||
mc.avg_margin
|
||||
FROM current_period cp
|
||||
FULL OUTER JOIN previous_period pp ON cp.cat_id = pp.cat_id
|
||||
LEFT JOIN trend_analysis ta ON COALESCE(cp.cat_id, pp.cat_id) = ta.cat_id
|
||||
LEFT JOIN margin_calc mc ON COALESCE(cp.cat_id, pp.cat_id) = mc.cat_id
|
||||
)
|
||||
UPDATE category_metrics cm
|
||||
LEFT JOIN current_period cp ON cm.category_id = cp.cat_id
|
||||
LEFT JOIN previous_period pp ON cm.category_id = pp.cat_id
|
||||
LEFT JOIN trend_analysis ta ON cm.category_id = ta.cat_id
|
||||
LEFT JOIN margin_calc mc ON cm.category_id = mc.cat_id
|
||||
SET
|
||||
cm.growth_rate = CASE
|
||||
growth_rate = CASE
|
||||
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
|
||||
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
|
||||
WHEN ta.trend_slope IS NOT NULL THEN
|
||||
@@ -291,9 +316,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
|
||||
)
|
||||
)
|
||||
END,
|
||||
cm.avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
|
||||
cm.last_calculated_at = NOW()
|
||||
WHERE cp.cat_id IS NOT NULL OR pp.cat_id IS NOT NULL
|
||||
avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
|
||||
last_calculated_at = NOW()
|
||||
FROM current_period cp
|
||||
FULL OUTER JOIN previous_period pp ON cp.cat_id = pp.cat_id
|
||||
LEFT JOIN trend_analysis ta ON COALESCE(cp.cat_id, pp.cat_id) = ta.cat_id
|
||||
LEFT JOIN margin_calc mc ON COALESCE(cp.cat_id, pp.cat_id) = mc.cat_id
|
||||
WHERE cm.category_id = COALESCE(cp.cat_id, pp.cat_id)
|
||||
`);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.97);
@@ -335,8 +364,8 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
)
SELECT
pc.cat_id,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
COUNT(DISTINCT p.pid) as product_count,
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.pid END) as active_products,
SUM(p.stock_quantity * p.cost_price) as total_value,
@@ -364,15 +393,16 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY pc.cat_id, YEAR(o.date), MONTH(o.date)
ON DUPLICATE KEY UPDATE
product_count = VALUES(product_count),
active_products = VALUES(active_products),
total_value = VALUES(total_value),
total_revenue = VALUES(total_revenue),
avg_margin = VALUES(avg_margin),
turnover_rate = VALUES(turnover_rate)
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY pc.cat_id, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
ON CONFLICT (category_id, year, month) DO UPDATE
SET
product_count = EXCLUDED.product_count,
active_products = EXCLUDED.active_products,
total_value = EXCLUDED.total_value,
total_revenue = EXCLUDED.total_revenue,
avg_margin = EXCLUDED.avg_margin,
turnover_rate = EXCLUDED.turnover_rate
`);
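The hunk above rewrites MySQL's `ON DUPLICATE KEY UPDATE col = VALUES(col)` upsert into PostgreSQL's `ON CONFLICT (...) DO UPDATE SET col = EXCLUDED.col`, which additionally requires an explicit conflict target. A minimal sketch of the translation pattern; the helper names are illustrative, not part of this repository:

```javascript
// Illustrative: build the PostgreSQL upsert tail from a conflict target
// and the columns to refresh when the row already exists.
function buildPgUpsert(conflictCols, updateCols) {
  const setList = updateCols.map(c => `${c} = EXCLUDED.${c}`).join(', ');
  return `ON CONFLICT (${conflictCols.join(', ')}) DO UPDATE SET ${setList}`;
}

// MySQL equivalent for comparison: no conflict target, VALUES() references.
function buildMyUpsert(updateCols) {
  const setList = updateCols.map(c => `${c} = VALUES(${c})`).join(', ');
  return `ON DUPLICATE KEY UPDATE ${setList}`;
}

console.log(buildPgUpsert(['category_id', 'year', 'month'], ['product_count']));
// ON CONFLICT (category_id, year, month) DO UPDATE SET product_count = EXCLUDED.product_count
```

Note that `EXCLUDED` refers to the row proposed for insertion, playing the same role as MySQL's `VALUES(col)`.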
processedCount = Math.floor(totalProducts * 0.99);
@@ -414,20 +444,20 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
)
WITH date_ranges AS (
SELECT
DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY) as period_start,
CURRENT_DATE - INTERVAL '30 days' as period_start,
CURRENT_DATE as period_end
UNION ALL
SELECT
DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY),
DATE_SUB(CURRENT_DATE, INTERVAL 31 DAY)
CURRENT_DATE - INTERVAL '90 days',
CURRENT_DATE - INTERVAL '31 days'
UNION ALL
SELECT
DATE_SUB(CURRENT_DATE, INTERVAL 180 DAY),
DATE_SUB(CURRENT_DATE, INTERVAL 91 DAY)
CURRENT_DATE - INTERVAL '180 days',
CURRENT_DATE - INTERVAL '91 days'
UNION ALL
SELECT
DATE_SUB(CURRENT_DATE, INTERVAL 365 DAY),
DATE_SUB(CURRENT_DATE, INTERVAL 181 DAY)
CURRENT_DATE - INTERVAL '365 days',
CURRENT_DATE - INTERVAL '181 days'
),
sales_data AS (
SELECT
@@ -466,12 +496,13 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
END as avg_price,
NOW() as last_calculated_at
FROM sales_data
ON DUPLICATE KEY UPDATE
avg_daily_sales = VALUES(avg_daily_sales),
total_sold = VALUES(total_sold),
num_products = VALUES(num_products),
avg_price = VALUES(avg_price),
last_calculated_at = VALUES(last_calculated_at)
ON CONFLICT (category_id, brand, period_start, period_end) DO UPDATE
SET
avg_daily_sales = EXCLUDED.avg_daily_sales,
total_sold = EXCLUDED.total_sold,
num_products = EXCLUDED.num_products,
avg_price = EXCLUDED.avg_price,
last_calculated_at = EXCLUDED.last_calculated_at
`);

processedCount = Math.floor(totalProducts * 1.0);
@@ -498,7 +529,8 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('category_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
return {
@@ -32,13 +32,13 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
}

// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
AND DATE(o.date) >= CURRENT_DATE - INTERVAL '12 months'
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);
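The change above reflects the two drivers' different result shapes: mysql2's `query()` resolves to a `[rows, fields]` tuple (hence the `const [orderCount] = ...` destructuring), while node-postgres resolves to a result object carrying a `rows` array. A small sketch of a normalizing wrapper; the `normalizeRows` name is illustrative:

```javascript
// Illustrative: normalize a driver result to a plain array of rows.
// mysql2 resolves query() to [rows, fields]; pg resolves to { rows, ... }.
function normalizeRows(result) {
  if (Array.isArray(result)) return result[0];                   // mysql2 tuple
  if (result && Array.isArray(result.rows)) return result.rows;  // pg result object
  throw new TypeError('Unrecognized driver result shape');
}

const mysqlStyle = [[{ count: 42 }], []];              // simulated mysql2 result
const pgStyle = { rows: [{ count: '42' }], rowCount: 1 }; // simulated pg result
console.log(normalizeRows(mysqlStyle)[0].count);          // 42
console.log(parseInt(normalizeRows(pgStyle)[0].count));   // 42
```

node-postgres returns 64-bit aggregates such as `COUNT(*)` as strings, which is why the migrated code wraps them in `parseInt()`.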
outputProgress({
status: 'running',
@@ -67,27 +67,28 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
MIN(o.date) as first_sale_date,
MAX(o.date) as last_sale_date,
DATEDIFF(MAX(o.date), MIN(o.date)) + 1 as calculation_period_days,
EXTRACT(DAY FROM (MAX(o.date)::timestamp with time zone - MIN(o.date)::timestamp with time zone)) + 1 as calculation_period_days,
COUNT(DISTINCT DATE(o.date)) as active_days
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
GROUP BY p.pid
AND DATE(o.date) >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.pid, p.cost_price, p.stock_quantity
)
UPDATE product_metrics pm
JOIN product_financials pf ON pm.pid = pf.pid
SET
pm.inventory_value = COALESCE(pf.inventory_value, 0),
pm.total_revenue = COALESCE(pf.total_revenue, 0),
pm.cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0),
pm.gross_profit = COALESCE(pf.gross_profit, 0),
pm.gmroi = CASE
inventory_value = COALESCE(pf.inventory_value, 0),
total_revenue = COALESCE(pf.total_revenue, 0),
cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0),
gross_profit = COALESCE(pf.gross_profit, 0),
gmroi = CASE
WHEN COALESCE(pf.inventory_value, 0) > 0 AND pf.active_days > 0 THEN
(COALESCE(pf.gross_profit, 0) * (365.0 / pf.active_days)) / COALESCE(pf.inventory_value, 0)
ELSE 0
END,
pm.last_calculated_at = CURRENT_TIMESTAMP
last_calculated_at = CURRENT_TIMESTAMP
FROM product_financials pf
WHERE pm.pid = pf.pid
`);
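The GMROI expression in the hunk above annualizes gross profit by the number of active sales days before dividing by inventory value. The same calculation as a standalone function, for illustration:

```javascript
// Illustrative: GMROI as computed in the CASE expression above —
// gross profit scaled to a full year, divided by inventory value.
function gmroi(grossProfit, inventoryValue, activeDays) {
  if (inventoryValue > 0 && activeDays > 0) {
    return (grossProfit * (365.0 / activeDays)) / inventoryValue;
  }
  return 0; // matches the ELSE 0 branch
}

console.log(gmroi(1000, 5000, 365)); // 0.2
console.log(gmroi(1000, 0, 365));    // 0
```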
processedCount = Math.floor(totalProducts * 0.65);
@@ -119,8 +120,8 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
WITH monthly_financials AS (
SELECT
p.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
p.cost_price * p.stock_quantity as inventory_value,
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
COUNT(DISTINCT DATE(o.date)) as active_days,
@@ -129,19 +130,20 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
)
UPDATE product_time_aggregates pta
JOIN monthly_financials mf ON pta.pid = mf.pid
AND pta.year = mf.year
AND pta.month = mf.month
SET
pta.inventory_value = COALESCE(mf.inventory_value, 0),
pta.gmroi = CASE
inventory_value = COALESCE(mf.inventory_value, 0),
gmroi = CASE
WHEN COALESCE(mf.inventory_value, 0) > 0 AND mf.active_days > 0 THEN
(COALESCE(mf.gross_profit, 0) * (365.0 / mf.active_days)) / COALESCE(mf.inventory_value, 0)
ELSE 0
END
FROM monthly_financials mf
WHERE pta.pid = mf.pid
AND pta.year = mf.year
AND pta.month = mf.month
`);

processedCount = Math.floor(totalProducts * 0.70);
@@ -168,7 +170,8 @@ async function calculateFinancialMetrics(startTime, totalProducts, processedCoun
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('financial_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);
return {
@@ -10,20 +10,21 @@ function sanitizeValue(value) {
}

async function calculateProductMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
const connection = await getConnection();
let connection;
let success = false;
let processedOrders = 0;
const BATCH_SIZE = 5000;

try {
connection = await getConnection();
// Skip flags are inherited from the parent scope
const SKIP_PRODUCT_BASE_METRICS = 0;
const SKIP_PRODUCT_TIME_AGGREGATES = 0;

// Get total product count if not provided
if (!totalProducts) {
const [productCount] = await connection.query('SELECT COUNT(*) as count FROM products');
totalProducts = productCount[0].count;
const productCount = await connection.query('SELECT COUNT(*) as count FROM products');
totalProducts = parseInt(productCount.rows[0].count);
}

if (isCancelled) {
@@ -52,19 +53,20 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount

// First ensure all products have a metrics record
await connection.query(`
INSERT IGNORE INTO product_metrics (pid, last_calculated_at)
INSERT INTO product_metrics (pid, last_calculated_at)
SELECT pid, NOW()
FROM products
ON CONFLICT (pid) DO NOTHING
`);

// Get threshold settings once
const [thresholds] = await connection.query(`
const thresholds = await connection.query(`
SELECT critical_days, reorder_days, overstock_days, low_stock_threshold
FROM stock_thresholds
WHERE category_id IS NULL AND vendor IS NULL
LIMIT 1
`);
const defaultThresholds = thresholds[0];
const defaultThresholds = thresholds.rows[0];

// Calculate base product metrics
if (!SKIP_PRODUCT_BASE_METRICS) {
@@ -85,16 +87,43 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
});

// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);

// Clear temporary tables
await connection.query('TRUNCATE TABLE temp_sales_metrics');
await connection.query('TRUNCATE TABLE temp_purchase_metrics');
await connection.query('DROP TABLE IF EXISTS temp_sales_metrics');
await connection.query('DROP TABLE IF EXISTS temp_purchase_metrics');

// Create temp_sales_metrics
await connection.query(`
CREATE TEMPORARY TABLE temp_sales_metrics (
pid BIGINT NOT NULL,
daily_sales_avg DECIMAL(10,3),
weekly_sales_avg DECIMAL(10,3),
monthly_sales_avg DECIMAL(10,3),
total_revenue DECIMAL(10,3),
avg_margin_percent DECIMAL(10,3),
first_sale_date DATE,
last_sale_date DATE,
PRIMARY KEY (pid)
)
`);

// Create temp_purchase_metrics
await connection.query(`
CREATE TEMPORARY TABLE temp_purchase_metrics (
pid BIGINT NOT NULL,
avg_lead_time_days DOUBLE PRECISION,
last_purchase_date DATE,
first_received_date DATE,
last_received_date DATE,
PRIMARY KEY (pid)
)
`);

// Populate temp_sales_metrics with base stats and sales averages
await connection.query(`
@@ -115,98 +144,131 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
AND o.canceled = false
AND o.date >= DATE_SUB(CURDATE(), INTERVAL 90 DAY)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY p.pid
`);
// Populate temp_purchase_metrics
await connection.query(`
INSERT INTO temp_purchase_metrics
SELECT
p.pid,
AVG(DATEDIFF(po.received_date, po.date)) as avg_lead_time_days,
MAX(po.date) as last_purchase_date,
MIN(po.received_date) as first_received_date,
MAX(po.received_date) as last_received_date
FROM products p
LEFT JOIN purchase_orders po ON p.pid = po.pid
AND po.received_date IS NOT NULL
AND po.date >= DATE_SUB(CURDATE(), INTERVAL 365 DAY)
GROUP BY p.pid
`);
// Populate temp_purchase_metrics with timeout protection
await Promise.race([
connection.query(`
INSERT INTO temp_purchase_metrics
SELECT
p.pid,
AVG(
CASE
WHEN po.received_date IS NOT NULL AND po.date IS NOT NULL
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
ELSE NULL
END
) as avg_lead_time_days,
MAX(po.date) as last_purchase_date,
MIN(po.received_date) as first_received_date,
MAX(po.received_date) as last_received_date
FROM products p
LEFT JOIN purchase_orders po ON p.pid = po.pid
AND po.received_date IS NOT NULL
AND po.date IS NOT NULL
AND po.date >= CURRENT_DATE - INTERVAL '365 days'
GROUP BY p.pid
`),
new Promise((_, reject) =>
setTimeout(() => reject(new Error('Timeout: temp_purchase_metrics query took too long')), 60000)
)
]).catch(async (err) => {
logError(err, 'Error populating temp_purchase_metrics, continuing with empty table');
// Create an empty fallback to continue processing
await connection.query(`
INSERT INTO temp_purchase_metrics
SELECT
p.pid,
30.0 as avg_lead_time_days,
NULL as last_purchase_date,
NULL as first_received_date,
NULL as last_received_date
FROM products p
LEFT JOIN temp_purchase_metrics tpm ON p.pid = tpm.pid
WHERE tpm.pid IS NULL
`);
});
// Process updates in batches
let lastPid = 0;
while (true) {
let batchCount = 0;
const MAX_BATCHES = 1000; // Safety limit for number of batches to prevent infinite loops

while (batchCount < MAX_BATCHES) {
if (isCancelled) break;

const [batch] = await connection.query(
'SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?',

batchCount++;
const batch = await connection.query(
'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
[lastPid, BATCH_SIZE]
);

if (batch.length === 0) break;
if (batch.rows.length === 0) break;

// Process the entire batch in a single efficient query
await connection.query(`
UPDATE product_metrics pm
JOIN products p ON pm.pid = p.pid
LEFT JOIN temp_sales_metrics sm ON pm.pid = sm.pid
LEFT JOIN temp_purchase_metrics lm ON pm.pid = lm.pid
SET
pm.inventory_value = p.stock_quantity * NULLIF(p.cost_price, 0),
pm.daily_sales_avg = COALESCE(sm.daily_sales_avg, 0),
pm.weekly_sales_avg = COALESCE(sm.weekly_sales_avg, 0),
pm.monthly_sales_avg = COALESCE(sm.monthly_sales_avg, 0),
pm.total_revenue = COALESCE(sm.total_revenue, 0),
pm.avg_margin_percent = COALESCE(sm.avg_margin_percent, 0),
pm.first_sale_date = sm.first_sale_date,
pm.last_sale_date = sm.last_sale_date,
pm.avg_lead_time_days = COALESCE(lm.avg_lead_time_days, 30),
pm.days_of_inventory = CASE
inventory_value = p.stock_quantity * NULLIF(p.cost_price, 0),
daily_sales_avg = COALESCE(sm.daily_sales_avg, 0),
weekly_sales_avg = COALESCE(sm.weekly_sales_avg, 0),
monthly_sales_avg = COALESCE(sm.monthly_sales_avg, 0),
total_revenue = COALESCE(sm.total_revenue, 0),
avg_margin_percent = COALESCE(sm.avg_margin_percent, 0),
first_sale_date = sm.first_sale_date,
last_sale_date = sm.last_sale_date,
avg_lead_time_days = COALESCE(lm.avg_lead_time_days, 30),
days_of_inventory = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0
THEN FLOOR(p.stock_quantity / NULLIF(sm.daily_sales_avg, 0))
ELSE NULL
END,
pm.weeks_of_inventory = CASE
weeks_of_inventory = CASE
WHEN COALESCE(sm.weekly_sales_avg, 0) > 0
THEN FLOOR(p.stock_quantity / NULLIF(sm.weekly_sales_avg, 0))
ELSE NULL
END,
pm.stock_status = CASE
stock_status = CASE
WHEN p.stock_quantity <= 0 THEN 'Out of Stock'
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 AND p.stock_quantity <= ? THEN 'Low Stock'
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 AND p.stock_quantity <= $1 THEN 'Low Stock'
WHEN COALESCE(sm.daily_sales_avg, 0) = 0 THEN 'In Stock'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ? THEN 'Critical'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= ? THEN 'Reorder'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ? THEN 'Overstocked'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= $2 THEN 'Critical'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) <= $3 THEN 'Reorder'
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > $4 THEN 'Overstocked'
ELSE 'Healthy'
END,
pm.safety_stock = CASE
safety_stock = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
CEIL(sm.daily_sales_avg * SQRT(COALESCE(lm.avg_lead_time_days, 30)) * 1.96)
ELSE ?
CEIL(sm.daily_sales_avg * SQRT(ABS(COALESCE(lm.avg_lead_time_days, 30))) * 1.96)
ELSE $5
END,
pm.reorder_point = CASE
reorder_point = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 THEN
CEIL(sm.daily_sales_avg * COALESCE(lm.avg_lead_time_days, 30)) +
CEIL(sm.daily_sales_avg * SQRT(COALESCE(lm.avg_lead_time_days, 30)) * 1.96)
ELSE ?
CEIL(sm.daily_sales_avg * SQRT(ABS(COALESCE(lm.avg_lead_time_days, 30))) * 1.96)
ELSE $6
END,
pm.reorder_qty = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND NULLIF(p.cost_price, 0) IS NOT NULL THEN
reorder_qty = CASE
WHEN COALESCE(sm.daily_sales_avg, 0) > 0 AND NULLIF(p.cost_price, 0) IS NOT NULL AND NULLIF(p.cost_price, 0) > 0 THEN
GREATEST(
CEIL(SQRT((2 * (sm.daily_sales_avg * 365) * 25) / (NULLIF(p.cost_price, 0) * 0.25))),
?
CEIL(SQRT(ABS((2 * (sm.daily_sales_avg * 365) * 25) / (NULLIF(p.cost_price, 0) * 0.25)))),
$7
)
ELSE ?
ELSE $8
END,
pm.overstocked_amt = CASE
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > ?
THEN GREATEST(0, p.stock_quantity - CEIL(sm.daily_sales_avg * ?))
overstocked_amt = CASE
WHEN p.stock_quantity / NULLIF(sm.daily_sales_avg, 0) > $9
THEN GREATEST(0, p.stock_quantity - CEIL(sm.daily_sales_avg * $10))
ELSE 0
END,
pm.last_calculated_at = NOW()
WHERE p.pid IN (${batch.map(() => '?').join(',')})
last_calculated_at = NOW()
FROM products p
LEFT JOIN temp_sales_metrics sm ON p.pid = sm.pid
LEFT JOIN temp_purchase_metrics lm ON p.pid = lm.pid
WHERE p.pid = ANY($11::bigint[])
AND pm.pid = p.pid
`,
[
defaultThresholds.low_stock_threshold,
@@ -219,12 +281,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
defaultThresholds.low_stock_threshold,
defaultThresholds.overstock_days,
defaultThresholds.overstock_days,
...batch.map(row => row.pid)
]
);
batch.rows.map(row => row.pid)
]);
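The batched UPDATE above also illustrates the placeholder change: mysql2 uses positional `?` markers (the old code spliced one per pid with `batch.map(() => '?')`), while node-postgres uses numbered `$1..$n` markers and can take the whole id list as one array parameter via `= ANY($n::bigint[])`. A mechanical sketch of the `?` → `$n` rewrite, with the caveat stated in the comment:

```javascript
// Illustrative: convert mysql2-style positional '?' placeholders to
// node-postgres numbered '$1'..'$n' placeholders.
// Caveat: does not skip '?' characters inside SQL string literals,
// which a real converter would have to handle.
function toPgPlaceholders(sql) {
  let n = 0;
  return sql.replace(/\?/g, () => `$${++n}`);
}

console.log(toPgPlaceholders('SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?'));
// SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2
```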
lastPid = batch[batch.length - 1].pid;
processedCount += batch.length;
lastPid = batch.rows[batch.rows.length - 1].pid;
processedCount += batch.rows.length;

outputProgress({
status: 'running',
@@ -243,54 +304,59 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
});
}

// Calculate forecast accuracy and bias in batches
lastPid = 0;
while (true) {
if (isCancelled) break;

const [batch] = await connection.query(
'SELECT pid FROM products WHERE pid > ? ORDER BY pid LIMIT ?',
[lastPid, BATCH_SIZE]
);

if (batch.length === 0) break;

await connection.query(`
UPDATE product_metrics pm
JOIN (
SELECT
sf.pid,
AVG(CASE
WHEN o.quantity > 0
THEN ABS(sf.forecast_units - o.quantity) / o.quantity * 100
ELSE 100
END) as avg_forecast_error,
AVG(CASE
WHEN o.quantity > 0
THEN (sf.forecast_units - o.quantity) / o.quantity * 100
ELSE 0
END) as avg_forecast_bias,
MAX(sf.forecast_date) as last_forecast_date
FROM sales_forecasts sf
JOIN orders o ON sf.pid = o.pid
AND DATE(o.date) = sf.forecast_date
WHERE o.canceled = false
AND sf.forecast_date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
AND sf.pid IN (?)
GROUP BY sf.pid
) fa ON pm.pid = fa.pid
SET
pm.forecast_accuracy = GREATEST(0, 100 - LEAST(fa.avg_forecast_error, 100)),
pm.forecast_bias = GREATEST(-100, LEAST(fa.avg_forecast_bias, 100)),
pm.last_forecast_date = fa.last_forecast_date,
pm.last_calculated_at = NOW()
WHERE pm.pid IN (?)
`, [batch.map(row => row.pid), batch.map(row => row.pid)]);

lastPid = batch[batch.length - 1].pid;
// Add safety check if the loop processed MAX_BATCHES
if (batchCount >= MAX_BATCHES) {
logError(new Error(`Reached maximum batch count (${MAX_BATCHES}). Process may have entered an infinite loop.`), 'Batch processing safety limit reached');
}
}

// Calculate forecast accuracy and bias in batches
lastPid = 0;
while (true) {
if (isCancelled) break;

const batch = await connection.query(
'SELECT pid FROM products WHERE pid > $1 ORDER BY pid LIMIT $2',
[lastPid, BATCH_SIZE]
);

if (batch.rows.length === 0) break;

await connection.query(`
UPDATE product_metrics pm
SET
forecast_accuracy = GREATEST(0, 100 - LEAST(fa.avg_forecast_error, 100)),
forecast_bias = GREATEST(-100, LEAST(fa.avg_forecast_bias, 100)),
last_forecast_date = fa.last_forecast_date,
last_calculated_at = NOW()
FROM (
SELECT
sf.pid,
AVG(CASE
WHEN o.quantity > 0
THEN ABS(sf.forecast_quantity - o.quantity) / o.quantity * 100
ELSE 100
END) as avg_forecast_error,
AVG(CASE
WHEN o.quantity > 0
THEN (sf.forecast_quantity - o.quantity) / o.quantity * 100
ELSE 0
END) as avg_forecast_bias,
MAX(sf.forecast_date) as last_forecast_date
FROM sales_forecasts sf
JOIN orders o ON sf.pid = o.pid
AND DATE(o.date) = sf.forecast_date
WHERE o.canceled = false
AND sf.forecast_date >= CURRENT_DATE - INTERVAL '90 days'
AND sf.pid = ANY($1::bigint[])
GROUP BY sf.pid
) fa
WHERE pm.pid = fa.pid
`, [batch.rows.map(row => row.pid)]);

lastPid = batch.rows[batch.rows.length - 1].pid;
}
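The forecast subquery above computes a MAPE-style error (mean of |forecast − actual| / actual × 100) and a signed bias, then clamps accuracy into 0–100 with `GREATEST`/`LEAST`. The same arithmetic in plain JavaScript, operating on pre-joined rows (illustrative):

```javascript
// Illustrative: forecast accuracy/bias as in the SQL above.
// rows: [{ forecast, actual }] pairs for one product over the window.
function forecastStats(rows) {
  const errs = rows.map(r =>
    r.actual > 0 ? Math.abs(r.forecast - r.actual) / r.actual * 100 : 100);
  const biases = rows.map(r =>
    r.actual > 0 ? (r.forecast - r.actual) / r.actual * 100 : 0);
  const avg = xs => xs.reduce((a, b) => a + b, 0) / xs.length;
  return {
    accuracy: Math.max(0, 100 - Math.min(avg(errs), 100)), // GREATEST/LEAST clamp
    bias: Math.max(-100, Math.min(avg(biases), 100)),
  };
}

console.log(forecastStats([{ forecast: 90, actual: 100 }, { forecast: 110, actual: 100 }]));
// { accuracy: 90, bias: 0 }
```

Over-forecasting errors of equal size cancel in the bias but accumulate in the error, which is why both metrics are stored.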
// Calculate product time aggregates
if (!SKIP_PRODUCT_TIME_AGGREGATES) {
outputProgress({
@@ -326,11 +392,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
)
SELECT
p.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
SUM(o.quantity) as total_quantity_sold,
SUM(o.quantity * o.price) as total_revenue,
SUM(o.quantity * p.cost_price) as total_cost,
SUM(o.price * o.quantity) as total_revenue,
SUM(p.cost_price * o.quantity) as total_cost,
COUNT(DISTINCT o.order_number) as order_count,
AVG(o.price) as avg_price,
CASE
@@ -346,17 +412,18 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
END as gmroi
FROM products p
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
ON DUPLICATE KEY UPDATE
total_quantity_sold = VALUES(total_quantity_sold),
total_revenue = VALUES(total_revenue),
total_cost = VALUES(total_cost),
order_count = VALUES(order_count),
avg_price = VALUES(avg_price),
profit_margin = VALUES(profit_margin),
inventory_value = VALUES(inventory_value),
gmroi = VALUES(gmroi)
WHERE o.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
ON CONFLICT (pid, year, month) DO UPDATE
SET
total_quantity_sold = EXCLUDED.total_quantity_sold,
total_revenue = EXCLUDED.total_revenue,
total_cost = EXCLUDED.total_cost,
order_count = EXCLUDED.order_count,
avg_price = EXCLUDED.avg_price,
profit_margin = EXCLUDED.profit_margin,
inventory_value = EXCLUDED.inventory_value,
gmroi = EXCLUDED.gmroi
`);

processedCount = Math.floor(totalProducts * 0.6);
@@ -418,11 +485,11 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
success
};

const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };
const abcConfig = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
const abcThresholds = abcConfig.rows[0] || { a_threshold: 20, b_threshold: 50 };

// First, create and populate the rankings table with an index
await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
await connection.query('DROP TABLE IF EXISTS temp_revenue_ranks');
await connection.query(`
CREATE TEMPORARY TABLE temp_revenue_ranks (
pid BIGINT NOT NULL,
@@ -431,12 +498,12 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
dense_rank_num INT,
percentile DECIMAL(5,2),
total_count INT,
PRIMARY KEY (pid),
INDEX (rank_num),
INDEX (dense_rank_num),
INDEX (percentile)
) ENGINE=MEMORY
PRIMARY KEY (pid)
)
`);
await connection.query('CREATE INDEX ON temp_revenue_ranks (rank_num)');
await connection.query('CREATE INDEX ON temp_revenue_ranks (dense_rank_num)');
await connection.query('CREATE INDEX ON temp_revenue_ranks (percentile)');

// Calculate rankings with proper tie handling
await connection.query(`
@@ -463,58 +530,74 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
`);
// Get total count for percentage calculation
const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
const totalCount = rankingCount[0].total_count || 1;
const max_rank = totalCount;
const rankingCount = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
const totalCount = parseInt(rankingCount.rows[0].total_count) || 1;

// Process updates in batches
let abcProcessedCount = 0;
const batchSize = 5000;
const maxPid = await connection.query('SELECT MAX(pid) as max_pid FROM products');
const maxProductId = parseInt(maxPid.rows[0].max_pid);

while (true) {
while (abcProcessedCount < maxProductId) {
if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0, // This module doesn't process POs
processedPurchaseOrders: 0,
success
};

// Get a batch of PIDs that need updating
const [pids] = await connection.query(`
const pids = await connection.query(`
SELECT pm.pid
FROM product_metrics pm
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
WHERE pm.abc_class IS NULL
OR pm.abc_class !=
CASE
WHEN tr.pid IS NULL THEN 'C'
WHEN tr.percentile <= ? THEN 'A'
WHEN tr.percentile <= ? THEN 'B'
ELSE 'C'
END
LIMIT ?
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, batchSize]);
WHERE pm.pid > $1
AND (pm.abc_class IS NULL
OR pm.abc_class !=
CASE
WHEN tr.pid IS NULL THEN 'C'
WHEN tr.percentile <= $2 THEN 'A'
WHEN tr.percentile <= $3 THEN 'B'
ELSE 'C'
END)
ORDER BY pm.pid
LIMIT $4
`, [abcProcessedCount, abcThresholds.a_threshold, abcThresholds.b_threshold, batchSize]);

if (pids.length === 0) break;
if (pids.rows.length === 0) break;

const pidValues = pids.rows.map(row => row.pid);

await connection.query(`
UPDATE product_metrics pm
LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
SET pm.abc_class =
SET abc_class =
CASE
WHEN tr.pid IS NULL THEN 'C'
WHEN tr.percentile <= ? THEN 'A'
WHEN tr.percentile <= ? THEN 'B'
WHEN tr.percentile <= $1 THEN 'A'
WHEN tr.percentile <= $2 THEN 'B'
ELSE 'C'
END,
pm.last_calculated_at = NOW()
WHERE pm.pid IN (?)
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, pids.map(row => row.pid)]);
last_calculated_at = NOW()
FROM (SELECT pid, percentile FROM temp_revenue_ranks) tr
WHERE pm.pid = tr.pid AND pm.pid = ANY($3::bigint[])
OR (pm.pid = ANY($3::bigint[]) AND tr.pid IS NULL)
`, [abcThresholds.a_threshold, abcThresholds.b_threshold, pidValues]);
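Both dialects implement the same classification rule: a product's revenue percentile is compared against the configured thresholds (defaulting to A ≤ 20, B ≤ 50, otherwise C), and products missing from the rank table fall through to C. As a standalone function, for illustration:

```javascript
// Illustrative: the ABC CASE expression above in plain JavaScript.
// percentile may be null/undefined for products absent from temp_revenue_ranks.
function abcClass(percentile, { a_threshold = 20, b_threshold = 50 } = {}) {
  if (percentile == null) return 'C'; // the "tr.pid IS NULL" branch
  if (percentile <= a_threshold) return 'A';
  if (percentile <= b_threshold) return 'B';
  return 'C';
}

console.log(abcClass(15));   // 'A'
console.log(abcClass(35));   // 'B'
console.log(abcClass(80));   // 'C'
console.log(abcClass(null)); // 'C'
```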
// Now update turnover rate with proper handling of zero inventory periods
await connection.query(`
UPDATE product_metrics pm
JOIN (
SET
turnover_rate = CASE
WHEN sales.avg_nonzero_stock > 0 AND sales.active_days > 0
THEN LEAST(
(sales.total_sold / sales.avg_nonzero_stock) * (365.0 / sales.active_days),
999.99
)
ELSE 0
END,
last_calculated_at = NOW()
FROM (
SELECT
o.pid,
SUM(o.quantity) as total_sold,
@@ -526,22 +609,33 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
FROM orders o
JOIN products p ON o.pid = p.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
AND o.pid IN (?)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
AND o.pid = ANY($1::bigint[])
GROUP BY o.pid
) sales ON pm.pid = sales.pid
SET
pm.turnover_rate = CASE
WHEN sales.avg_nonzero_stock > 0 AND sales.active_days > 0
THEN LEAST(
(sales.total_sold / sales.avg_nonzero_stock) * (365.0 / sales.active_days),
999.99
)
ELSE 0
END,
pm.last_calculated_at = NOW()
WHERE pm.pid IN (?)
`, [pids.map(row => row.pid), pids.map(row => row.pid)]);
) sales
WHERE pm.pid = sales.pid
`, [pidValues]);
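The turnover expression annualizes units sold over the average non-zero stock level and caps the result at 999.99 so it fits the column's precision. Sketched as a standalone function (illustrative):

```javascript
// Illustrative: annualized inventory turnover with the 999.99 cap,
// mirroring the CASE expression in the UPDATE above.
function turnoverRate(totalSold, avgNonzeroStock, activeDays) {
  if (avgNonzeroStock > 0 && activeDays > 0) {
    return Math.min((totalSold / avgNonzeroStock) * (365.0 / activeDays), 999.99);
  }
  return 0; // zero stock or no active days
}

console.log(turnoverRate(20, 10, 365)); // 2
console.log(turnoverRate(100000, 1, 1)); // 999.99
```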
abcProcessedCount = pids.rows[pids.rows.length - 1].pid;
|
||||
|
||||
// Calculate progress proportionally to total products
|
||||
processedCount = Math.floor(totalProducts * (0.60 + (abcProcessedCount / maxProductId) * 0.2));
|
||||
|
||||
outputProgress({
|
||||
status: 'running',
|
||||
operation: 'ABC classification progress',
|
||||
current: processedCount,
|
||||
total: totalProducts,
|
||||
elapsed: formatElapsedTime(startTime),
|
||||
remaining: estimateRemaining(startTime, processedCount, totalProducts),
|
||||
rate: calculateRate(startTime, processedCount),
|
||||
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// If we get here, everything completed successfully
|
||||
@@ -551,7 +645,8 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
await connection.query(`
|
||||
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
|
||||
VALUES ('product_metrics', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
|
||||
ON CONFLICT (module_name) DO UPDATE
|
||||
SET last_calculation_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
@@ -566,7 +661,16 @@ async function calculateProductMetrics(startTime, totalProducts, processedCount
|
||||
logError(error, 'Error calculating product metrics');
|
||||
throw error;
|
||||
} finally {
|
||||
// Always clean up temporary tables, even if an error occurred
|
||||
if (connection) {
|
||||
try {
|
||||
await connection.query('DROP TABLE IF EXISTS temp_sales_metrics');
|
||||
await connection.query('DROP TABLE IF EXISTS temp_purchase_metrics');
|
||||
} catch (err) {
|
||||
console.error('Error cleaning up temporary tables:', err);
|
||||
}
|
||||
|
||||
// Make sure to release the connection
|
||||
connection.release();
|
||||
}
|
||||
}
|
||||
|
||||
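The `IN (?)` → `= ANY($1::bigint[])` change above reflects how node-postgres binds a whole JavaScript array as a single parameter, instead of expanding it inline the way mysql2 does. A minimal sketch of the translation; the helper name `toAnyParam` is hypothetical, not from the codebase:

```javascript
// Hypothetical helper: rewrite a MySQL-style `IN (?)` placeholder into the
// PostgreSQL array-parameter form, and wrap the values array so it is passed
// as ONE parameter ($1) rather than expanded inline.
function toAnyParam(sql, values) {
  return {
    text: sql.replace(/IN \(\?\)/g, '= ANY($1::bigint[])'),
    values: [values], // pg receives the whole array as a single bind value
  };
}

const q = toAnyParam('SELECT * FROM orders o WHERE o.pid IN (?)', [101, 102]);
// q.text:   'SELECT * FROM orders o WHERE o.pid = ANY($1::bigint[])'
// q.values: [[101, 102]]
```

With node-postgres this shape is passed straight to `pool.query(q.text, q.values)`.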
@@ -32,13 +32,13 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
}

// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);

outputProgress({
status: 'running',
@@ -69,15 +69,15 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO temp_forecast_dates
SELECT
DATE_ADD(CURRENT_DATE, INTERVAL n DAY) as forecast_date,
DAYOFWEEK(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as day_of_week,
MONTH(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as month
CURRENT_DATE + (n || ' days')::INTERVAL as forecast_date,
EXTRACT(DOW FROM CURRENT_DATE + (n || ' days')::INTERVAL) + 1 as day_of_week,
EXTRACT(MONTH FROM CURRENT_DATE + (n || ' days')::INTERVAL) as month
FROM (
SELECT a.N + b.N * 10 as n
SELECT a.n + b.n * 10 as n
FROM
(SELECT 0 as N UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
(SELECT 0 as n UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) a,
(SELECT 0 as N UNION SELECT 1 UNION SELECT 2) b
(SELECT 0 as n UNION SELECT 1 UNION SELECT 2) b
ORDER BY n
LIMIT 31
) numbers
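The diff ports the MySQL UNION-based numbers table verbatim, but PostgreSQL has a built-in set-returning function that could replace it outright. A hedged alternative sketch, not what the diff actually does:

```sql
-- Same 31-day date series as the UNION ladder above, using generate_series.
-- (date + integer is valid PostgreSQL date arithmetic.)
SELECT
  CURRENT_DATE + n                          AS forecast_date,
  EXTRACT(DOW FROM CURRENT_DATE + n) + 1    AS day_of_week,  -- +1 matches MySQL DAYOFWEEK (1=Sunday)
  EXTRACT(MONTH FROM CURRENT_DATE + n)      AS month
FROM generate_series(0, 30) AS n;
```

The `+ 1` offset is needed because PostgreSQL's `DOW` is 0 (Sunday) through 6, while MySQL's `DAYOFWEEK` is 1 through 7.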
@@ -109,17 +109,17 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount

// Create temporary table for daily sales stats
await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_daily_sales AS
CREATE TEMPORARY TABLE temp_daily_sales AS
SELECT
o.pid,
DAYOFWEEK(o.date) as day_of_week,
EXTRACT(DOW FROM o.date) + 1 as day_of_week,
SUM(o.quantity) as daily_quantity,
SUM(o.price * o.quantity) as daily_revenue,
COUNT(DISTINCT DATE(o.date)) as day_count
FROM orders o
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
GROUP BY o.pid, DAYOFWEEK(o.date)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY o.pid, EXTRACT(DOW FROM o.date) + 1
`);

processedCount = Math.floor(totalProducts * 0.94);
@@ -148,7 +148,7 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount

// Create temporary table for product stats
await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_product_stats AS
CREATE TEMPORARY TABLE temp_product_stats AS
SELECT
pid,
AVG(daily_revenue) as overall_avg_revenue,
@@ -186,10 +186,9 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
INSERT INTO sales_forecasts (
pid,
forecast_date,
forecast_units,
forecast_revenue,
forecast_quantity,
confidence_level,
last_calculated_at
created_at
)
WITH daily_stats AS (
SELECT
@@ -223,29 +222,9 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 1.0 THEN 0.9
WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 0.5 THEN 0.95
ELSE 1.0
END,
2
END
)
) as forecast_units,
GREATEST(0,
ROUND(
COALESCE(
CASE
WHEN ds.data_points >= 4 THEN ds.avg_daily_revenue
ELSE ps.overall_avg_revenue
END *
(1 + COALESCE(sf.seasonality_factor, 0)) *
CASE
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.5 THEN 0.85
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.0 THEN 0.9
WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 0.5 THEN 0.95
ELSE 1.0
END,
0
),
2
)
) as forecast_revenue,
) as forecast_quantity,
CASE
WHEN ds.total_days >= 60 AND ds.daily_variance_ratio < 0.5 THEN 90
WHEN ds.total_days >= 60 THEN 85
@@ -255,17 +234,18 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
WHEN ds.total_days >= 14 THEN 65
ELSE 60
END as confidence_level,
NOW() as last_calculated_at
NOW() as created_at
FROM daily_stats ds
JOIN temp_product_stats ps ON ds.pid = ps.pid
CROSS JOIN temp_forecast_dates fd
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor
ON DUPLICATE KEY UPDATE
forecast_units = VALUES(forecast_units),
forecast_revenue = VALUES(forecast_revenue),
confidence_level = VALUES(confidence_level),
last_calculated_at = NOW()
GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor,
ds.avg_daily_qty, ds.std_daily_qty, ds.avg_daily_qty, ds.total_days, ds.daily_variance_ratio
ON CONFLICT (pid, forecast_date) DO UPDATE
SET
forecast_quantity = EXCLUDED.forecast_quantity,
confidence_level = EXCLUDED.confidence_level,
created_at = NOW()
`);

processedCount = Math.floor(totalProducts * 0.98);
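The `ON DUPLICATE KEY UPDATE ... VALUES(col)` → `ON CONFLICT ... DO UPDATE SET col = EXCLUDED.col` rewrite above follows PostgreSQL's generic upsert shape. Schematically, with columns abbreviated from the query above and illustrative literal values:

```sql
-- PostgreSQL upsert pattern: ON CONFLICT requires a unique constraint or
-- index on the conflict target, here (pid, forecast_date). EXCLUDED refers
-- to the row that was proposed for insertion (the pg analogue of VALUES()).
INSERT INTO sales_forecasts (pid, forecast_date, forecast_quantity, confidence_level, created_at)
VALUES (1, CURRENT_DATE, 10, 85, NOW())
ON CONFLICT (pid, forecast_date) DO UPDATE
SET forecast_quantity = EXCLUDED.forecast_quantity,
    confidence_level  = EXCLUDED.confidence_level,
    created_at        = NOW();
```

Unlike MySQL, the conflict target must be named explicitly, which makes the required unique index part of the query contract.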
@@ -294,22 +274,22 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount

// Create temporary table for category stats
await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_sales AS
CREATE TEMPORARY TABLE temp_category_sales AS
SELECT
pc.cat_id,
DAYOFWEEK(o.date) as day_of_week,
EXTRACT(DOW FROM o.date) + 1 as day_of_week,
SUM(o.quantity) as daily_quantity,
SUM(o.price * o.quantity) as daily_revenue,
COUNT(DISTINCT DATE(o.date)) as day_count
FROM orders o
JOIN product_categories pc ON o.pid = pc.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
GROUP BY pc.cat_id, DAYOFWEEK(o.date)
AND o.date >= CURRENT_DATE - INTERVAL '90 days'
GROUP BY pc.cat_id, EXTRACT(DOW FROM o.date) + 1
`);

await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_stats AS
CREATE TEMPORARY TABLE temp_category_stats AS
SELECT
cat_id,
AVG(daily_revenue) as overall_avg_revenue,
@@ -350,10 +330,10 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
forecast_units,
forecast_revenue,
confidence_level,
last_calculated_at
created_at
)
SELECT
cs.cat_id as category_id,
cs.cat_id::bigint as category_id,
fd.forecast_date,
GREATEST(0,
AVG(cs.daily_quantity) *
@@ -366,7 +346,7 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
ELSE ct.overall_avg_revenue
END *
(1 + COALESCE(sf.seasonality_factor, 0)) *
(0.95 + (RAND() * 0.1)),
(0.95 + (random() * 0.1)),
0
)
) as forecast_revenue,
@@ -376,27 +356,34 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
WHEN ct.total_days >= 14 THEN 70
ELSE 60
END as confidence_level,
NOW() as last_calculated_at
NOW() as created_at
FROM temp_category_sales cs
JOIN temp_category_stats ct ON cs.cat_id = ct.cat_id
CROSS JOIN temp_forecast_dates fd
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
GROUP BY cs.cat_id, fd.forecast_date, ct.overall_avg_revenue, ct.total_days, sf.seasonality_factor
GROUP BY
cs.cat_id,
fd.forecast_date,
ct.overall_avg_revenue,
ct.total_days,
sf.seasonality_factor,
sf.month
HAVING AVG(cs.daily_quantity) > 0
ON DUPLICATE KEY UPDATE
forecast_units = VALUES(forecast_units),
forecast_revenue = VALUES(forecast_revenue),
confidence_level = VALUES(confidence_level),
last_calculated_at = NOW()
ON CONFLICT (category_id, forecast_date) DO UPDATE
SET
forecast_units = EXCLUDED.forecast_units,
forecast_revenue = EXCLUDED.forecast_revenue,
confidence_level = EXCLUDED.confidence_level,
created_at = NOW()
`);

// Clean up temporary tables
await connection.query(`
DROP TEMPORARY TABLE IF EXISTS temp_forecast_dates;
DROP TEMPORARY TABLE IF EXISTS temp_daily_sales;
DROP TEMPORARY TABLE IF EXISTS temp_product_stats;
DROP TEMPORARY TABLE IF EXISTS temp_category_sales;
DROP TEMPORARY TABLE IF EXISTS temp_category_stats;
DROP TABLE IF EXISTS temp_forecast_dates;
DROP TABLE IF EXISTS temp_daily_sales;
DROP TABLE IF EXISTS temp_product_stats;
DROP TABLE IF EXISTS temp_category_sales;
DROP TABLE IF EXISTS temp_category_stats;
`);

processedCount = Math.floor(totalProducts * 1.0);
@@ -423,7 +410,8 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('sales_forecasts', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);

return {

@@ -32,12 +32,12 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
}

// Get order count that will be processed
const [orderCount] = await connection.query(`
const orderCount = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;
processedOrders = parseInt(orderCount.rows[0].count);

outputProgress({
status: 'running',
@@ -75,8 +75,8 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
WITH monthly_sales AS (
SELECT
o.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
SUM(o.quantity) as total_quantity_sold,
SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) as total_revenue,
SUM(COALESCE(p.cost_price, 0) * o.quantity) as total_cost,
@@ -93,17 +93,17 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
FROM orders o
JOIN products p ON o.pid = p.pid
WHERE o.canceled = false
GROUP BY o.pid, YEAR(o.date), MONTH(o.date)
GROUP BY o.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
),
monthly_stock AS (
SELECT
pid,
YEAR(date) as year,
MONTH(date) as month,
EXTRACT(YEAR FROM date::timestamp with time zone) as year,
EXTRACT(MONTH FROM date::timestamp with time zone) as month,
SUM(received) as stock_received,
SUM(ordered) as stock_ordered
FROM purchase_orders
GROUP BY pid, YEAR(date), MONTH(date)
GROUP BY pid, EXTRACT(YEAR FROM date::timestamp with time zone), EXTRACT(MONTH FROM date::timestamp with time zone)
),
base_products AS (
SELECT
@@ -197,17 +197,18 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
AND s.year = ms.year
AND s.month = ms.month
)
ON DUPLICATE KEY UPDATE
total_quantity_sold = VALUES(total_quantity_sold),
total_revenue = VALUES(total_revenue),
total_cost = VALUES(total_cost),
order_count = VALUES(order_count),
stock_received = VALUES(stock_received),
stock_ordered = VALUES(stock_ordered),
avg_price = VALUES(avg_price),
profit_margin = VALUES(profit_margin),
inventory_value = VALUES(inventory_value),
gmroi = VALUES(gmroi)
ON CONFLICT (pid, year, month) DO UPDATE
SET
total_quantity_sold = EXCLUDED.total_quantity_sold,
total_revenue = EXCLUDED.total_revenue,
total_cost = EXCLUDED.total_cost,
order_count = EXCLUDED.order_count,
stock_received = EXCLUDED.stock_received,
stock_ordered = EXCLUDED.stock_ordered,
avg_price = EXCLUDED.avg_price,
profit_margin = EXCLUDED.profit_margin,
inventory_value = EXCLUDED.inventory_value,
gmroi = EXCLUDED.gmroi
`);

processedCount = Math.floor(totalProducts * 0.60);
@@ -237,23 +238,23 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
// Update with financial metrics
await connection.query(`
UPDATE product_time_aggregates pta
JOIN (
SET inventory_value = COALESCE(fin.inventory_value, 0)
FROM (
SELECT
p.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
p.cost_price * p.stock_quantity as inventory_value,
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
COUNT(DISTINCT DATE(o.date)) as active_days
FROM products p
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
) fin ON pta.pid = fin.pid
GROUP BY p.pid, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone), p.cost_price, p.stock_quantity
) fin
WHERE pta.pid = fin.pid
AND pta.year = fin.year
AND pta.month = fin.month
SET
pta.inventory_value = COALESCE(fin.inventory_value, 0)
`);

processedCount = Math.floor(totalProducts * 0.65);
@@ -280,7 +281,8 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('time_aggregates', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);

return {

@@ -1,4 +1,4 @@
const mysql = require('mysql2/promise');
const { Pool } = require('pg');
const path = require('path');
require('dotenv').config({ path: path.resolve(__dirname, '../../..', '.env') });

@@ -8,36 +8,24 @@ const dbConfig = {
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
waitForConnections: true,
connectionLimit: 10,
queueLimit: 0,
port: process.env.DB_PORT || 5432,
ssl: process.env.DB_SSL === 'true',
// Add performance optimizations
namedPlaceholders: true,
maxPreparedStatements: 256,
enableKeepAlive: true,
keepAliveInitialDelay: 0,
// Add memory optimizations
flags: [
'FOUND_ROWS',
'LONG_PASSWORD',
'PROTOCOL_41',
'TRANSACTIONS',
'SECURE_CONNECTION',
'MULTI_RESULTS',
'PS_MULTI_RESULTS',
'PLUGIN_AUTH',
'CONNECT_ATTRS',
'PLUGIN_AUTH_LENENC_CLIENT_DATA',
'SESSION_TRACK',
'MULTI_STATEMENTS'
]
max: 10, // connection pool max size
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 60000
};

// Create a single pool instance to be reused
const pool = mysql.createPool(dbConfig);
const pool = new Pool(dbConfig);

// Add event handlers for pool
pool.on('error', (err, client) => {
console.error('Unexpected error on idle client', err);
});

async function getConnection() {
return await pool.getConnection();
return await pool.connect();
}

async function closePool() {

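The pool swap above is why every call site in the diff changes shape: mysql2/promise resolves `query()` to a `[rows, fields]` tuple, while pg resolves to a result object with `.rows`, and pg returns `COUNT(*)` as a string. A sketch with mocked result objects (the shapes are illustrative, no live database involved):

```javascript
// Mocked result shapes, mirroring what each driver resolves to.
const mysqlResult = [[{ count: '42' }], []];            // mysql2: [rows, fields]
const pgResult = { rows: [{ count: '42' }], rowCount: 1 }; // pg: Result object

// mysql2 style: destructure the tuple, index into rows.
const [mysqlRows] = mysqlResult;
const mysqlCount = mysqlRows[0].count;

// pg style: read .rows, and parseInt because bigint COUNT(*) arrives as a string.
const pgCount = parseInt(pgResult.rows[0].count, 10);
```

This is exactly the mechanical edit seen throughout the diff: `const [x] = await query(...)` becomes `const x = await query(...)` plus `x.rows[0]` and a `parseInt`.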
@@ -33,7 +33,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
}

// Get counts of records that will be processed
const [[orderCount], [poCount]] = await Promise.all([
const [orderCountResult, poCountResult] = await Promise.all([
connection.query(`
SELECT COUNT(*) as count
FROM orders o
@@ -45,8 +45,8 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
WHERE po.status != 0
`)
]);
processedOrders = orderCount.count;
processedPurchaseOrders = poCount.count;
processedOrders = parseInt(orderCountResult.rows[0].count);
processedPurchaseOrders = parseInt(poCountResult.rows[0].count);

outputProgress({
status: 'running',
@@ -66,7 +66,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =

// First ensure all vendors exist in vendor_details
await connection.query(`
INSERT IGNORE INTO vendor_details (vendor, status, created_at, updated_at)
INSERT INTO vendor_details (vendor, status, created_at, updated_at)
SELECT DISTINCT
vendor,
'active' as status,
@@ -74,6 +74,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
NOW() as updated_at
FROM products
WHERE vendor IS NOT NULL
ON CONFLICT (vendor) DO NOTHING
`);

processedCount = Math.floor(totalProducts * 0.8);
@@ -128,7 +129,7 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
FROM products p
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.vendor
),
vendor_po AS (
@@ -138,12 +139,15 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
COUNT(DISTINCT po.id) as total_orders,
AVG(CASE
WHEN po.receiving_status = 40
THEN DATEDIFF(po.received_date, po.date)
AND po.received_date IS NOT NULL
AND po.date IS NOT NULL
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
ELSE NULL
END) as avg_lead_time_days,
SUM(po.ordered * po.po_cost_price) as total_purchase_value
FROM products p
JOIN purchase_orders po ON p.pid = po.pid
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
WHERE po.date >= CURRENT_DATE - INTERVAL '12 months'
GROUP BY p.vendor
),
vendor_products AS (
@@ -188,20 +192,21 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
LEFT JOIN vendor_po vp ON vs.vendor = vp.vendor
LEFT JOIN vendor_products vpr ON vs.vendor = vpr.vendor
WHERE vs.vendor IS NOT NULL
ON DUPLICATE KEY UPDATE
total_revenue = VALUES(total_revenue),
total_orders = VALUES(total_orders),
total_late_orders = VALUES(total_late_orders),
avg_lead_time_days = VALUES(avg_lead_time_days),
on_time_delivery_rate = VALUES(on_time_delivery_rate),
order_fill_rate = VALUES(order_fill_rate),
avg_order_value = VALUES(avg_order_value),
active_products = VALUES(active_products),
total_products = VALUES(total_products),
total_purchase_value = VALUES(total_purchase_value),
avg_margin_percent = VALUES(avg_margin_percent),
status = VALUES(status),
last_calculated_at = VALUES(last_calculated_at)
ON CONFLICT (vendor) DO UPDATE
SET
total_revenue = EXCLUDED.total_revenue,
total_orders = EXCLUDED.total_orders,
total_late_orders = EXCLUDED.total_late_orders,
avg_lead_time_days = EXCLUDED.avg_lead_time_days,
on_time_delivery_rate = EXCLUDED.on_time_delivery_rate,
order_fill_rate = EXCLUDED.order_fill_rate,
avg_order_value = EXCLUDED.avg_order_value,
active_products = EXCLUDED.active_products,
total_products = EXCLUDED.total_products,
total_purchase_value = EXCLUDED.total_purchase_value,
avg_margin_percent = EXCLUDED.avg_margin_percent,
status = EXCLUDED.status,
last_calculated_at = EXCLUDED.last_calculated_at
`);

processedCount = Math.floor(totalProducts * 0.9);
@@ -244,23 +249,23 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
WITH monthly_orders AS (
SELECT
p.vendor,
YEAR(o.date) as year,
MONTH(o.date) as month,
EXTRACT(YEAR FROM o.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM o.date::timestamp with time zone) as month,
COUNT(DISTINCT o.id) as total_orders,
SUM(o.quantity * o.price) as total_revenue,
SUM(o.quantity * (o.price - p.cost_price)) as total_margin
FROM products p
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
AND o.date >= CURRENT_DATE - INTERVAL '12 months'
AND p.vendor IS NOT NULL
GROUP BY p.vendor, YEAR(o.date), MONTH(o.date)
GROUP BY p.vendor, EXTRACT(YEAR FROM o.date::timestamp with time zone), EXTRACT(MONTH FROM o.date::timestamp with time zone)
),
monthly_po AS (
SELECT
p.vendor,
YEAR(po.date) as year,
MONTH(po.date) as month,
EXTRACT(YEAR FROM po.date::timestamp with time zone) as year,
EXTRACT(MONTH FROM po.date::timestamp with time zone) as month,
COUNT(DISTINCT po.id) as total_po,
COUNT(DISTINCT CASE
WHEN po.receiving_status = 40 AND po.received_date > po.expected_date
@@ -268,14 +273,17 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
END) as late_orders,
AVG(CASE
WHEN po.receiving_status = 40
THEN DATEDIFF(po.received_date, po.date)
AND po.received_date IS NOT NULL
AND po.date IS NOT NULL
THEN EXTRACT(EPOCH FROM (po.received_date::timestamp with time zone - po.date::timestamp with time zone)) / 86400.0
ELSE NULL
END) as avg_lead_time_days,
SUM(po.ordered * po.po_cost_price) as total_purchase_value
FROM products p
JOIN purchase_orders po ON p.pid = po.pid
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
WHERE po.date >= CURRENT_DATE - INTERVAL '12 months'
AND p.vendor IS NOT NULL
GROUP BY p.vendor, YEAR(po.date), MONTH(po.date)
GROUP BY p.vendor, EXTRACT(YEAR FROM po.date::timestamp with time zone), EXTRACT(MONTH FROM po.date::timestamp with time zone)
)
SELECT
mo.vendor,
@@ -311,13 +319,14 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
AND mp.year = mo.year
AND mp.month = mo.month
WHERE mo.vendor IS NULL
ON DUPLICATE KEY UPDATE
total_orders = VALUES(total_orders),
late_orders = VALUES(late_orders),
avg_lead_time_days = VALUES(avg_lead_time_days),
total_purchase_value = VALUES(total_purchase_value),
total_revenue = VALUES(total_revenue),
avg_margin_percent = VALUES(avg_margin_percent)
ON CONFLICT (vendor, year, month) DO UPDATE
SET
total_orders = EXCLUDED.total_orders,
late_orders = EXCLUDED.late_orders,
avg_lead_time_days = EXCLUDED.avg_lead_time_days,
total_purchase_value = EXCLUDED.total_purchase_value,
total_revenue = EXCLUDED.total_revenue,
avg_margin_percent = EXCLUDED.avg_margin_percent
`);

processedCount = Math.floor(totalProducts * 0.95);
@@ -344,7 +353,8 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount =
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('vendor_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
ON CONFLICT (module_name) DO UPDATE
SET last_calculation_timestamp = NOW()
`);

return {

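The `DATEDIFF(received, ordered)` → `EXTRACT(EPOCH FROM (received - ordered)) / 86400.0` rewrite above computes lead time in days from a timestamp difference. Note a subtle semantic shift: MySQL's `DATEDIFF` returns whole days, while the epoch division yields fractional days. The same arithmetic in JavaScript, for illustration only:

```javascript
// Fractional days between two timestamps, matching the PostgreSQL expression
// EXTRACT(EPOCH FROM (received - ordered)) / 86400.0 used in the query above.
function leadTimeDays(receivedDate, orderDate) {
  const ms = new Date(receivedDate) - new Date(orderDate); // milliseconds
  return ms / 86400000; // 86,400,000 ms per day; may be fractional
}

const days = leadTimeDays('2024-01-11T00:00:00Z', '2024-01-01T00:00:00Z');
// days === 10
```

If whole-day parity with MySQL were required, the SQL (or this sketch) would need an explicit truncation of the result.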
@@ -184,7 +184,7 @@ async function resetDatabase() {
SELECT string_agg(tablename, ', ') as tables
FROM pg_tables
WHERE schemaname = 'public'
AND tablename NOT IN ('users', 'calculate_history', 'import_history');
AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history', 'ai_prompts', 'ai_validation_performance', 'templates');
`);

if (!tablesResult.rows[0].tables) {

@@ -100,6 +100,9 @@ async function resetMetrics() {
client = new Client(dbConfig);
await client.connect();

// Explicitly begin a transaction
await client.query('BEGIN');

// First verify current state
const initialTables = await client.query(`
SELECT tablename as name
@@ -124,6 +127,7 @@ async function resetMetrics() {

for (const table of [...METRICS_TABLES].reverse()) {
try {
// Use NOWAIT to avoid hanging if there's a lock
await client.query(`DROP TABLE IF EXISTS "${table}" CASCADE`);

// Verify the table was actually dropped
@@ -142,13 +146,23 @@ async function resetMetrics() {
operation: 'Table dropped',
message: `Successfully dropped table: ${table}`
});

// Commit after each table drop to ensure locks are released
await client.query('COMMIT');
// Start a new transaction for the next table
await client.query('BEGIN');
// Re-disable foreign key constraints for the new transaction
await client.query('SET session_replication_role = \'replica\'');
} catch (err) {
outputProgress({
status: 'error',
operation: 'Drop table error',
message: `Error dropping table ${table}: ${err.message}`
});
throw err;
await client.query('ROLLBACK');
// Re-start transaction for next table
await client.query('BEGIN');
await client.query('SET session_replication_role = \'replica\'');
}
}

@@ -164,6 +178,11 @@ async function resetMetrics() {
throw new Error(`Failed to drop all tables. Remaining tables: ${afterDrop.rows.map(t => t.name).join(', ')}`);
}

// Make sure we have a fresh transaction here
await client.query('COMMIT');
await client.query('BEGIN');
await client.query('SET session_replication_role = \'replica\'');

// Read metrics schema
outputProgress({
operation: 'Reading schema',
@@ -220,6 +239,13 @@ async function resetMetrics() {
rowCount: result.rowCount
}
});

// Commit every 10 statements to avoid long-running transactions
if (i > 0 && i % 10 === 0) {
await client.query('COMMIT');
await client.query('BEGIN');
await client.query('SET session_replication_role = \'replica\'');
}
} catch (sqlError) {
outputProgress({
status: 'error',
@@ -230,10 +256,17 @@ async function resetMetrics() {
statementNumber: i + 1
}
});
await client.query('ROLLBACK');
throw sqlError;
}
}

// Final commit for any pending statements
await client.query('COMMIT');

// Start new transaction for final checks
await client.query('BEGIN');

// Re-enable foreign key checks after all tables are created
await client.query('SET session_replication_role = \'origin\'');

@@ -269,9 +302,11 @@ async function resetMetrics() {
operation: 'Final table check',
message: `All database tables: ${finalCheck.rows.map(t => t.name).join(', ')}`
});
await client.query('ROLLBACK');
throw new Error(`Failed to create metrics tables: ${missingMetricsTables.join(', ')}`);
}

// Commit final transaction
await client.query('COMMIT');

outputProgress({
@@ -288,7 +323,11 @@ async function resetMetrics() {
});

if (client) {
try {
await client.query('ROLLBACK');
} catch (rollbackError) {
console.error('Error during rollback:', rollbackError);
}
// Make sure to re-enable foreign key checks even if there's an error
await client.query('SET session_replication_role = \'origin\'').catch(() => {});
}

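The reset logic above leans on `session_replication_role`, PostgreSQL's switch for suppressing trigger firing (including foreign-key triggers) so tables can be dropped and recreated in any order. The core pattern, as a standalone SQL sketch; note that a plain `SET` is session-scoped and survives `COMMIT`, so the re-issue after each `BEGIN` in the code above is defensive rather than strictly required:

```sql
-- Suppress trigger firing (FK enforcement included) for this session.
-- Requires sufficient privileges (typically superuser).
SET session_replication_role = 'replica';

-- ... DROP TABLE / CREATE TABLE statements in arbitrary order ...

-- Restore normal trigger firing.
SET session_replication_role = 'origin';
```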
337
inventory-server/scripts/update-order-costs.js
Normal file
@@ -0,0 +1,337 @@
|
||||
/**
 * This script updates the costeach values for existing orders from the original MySQL database
 * without needing to run the full import process.
 */
const dotenv = require("dotenv");
const path = require("path");
const fs = require("fs");
const { setupConnections, closeConnections } = require('./import/utils');
const { outputProgress, formatElapsedTime } = require('./metrics/utils/progress');

dotenv.config({ path: path.join(__dirname, "../.env") });

// SSH configuration
const sshConfig = {
  ssh: {
    host: process.env.PROD_SSH_HOST,
    port: process.env.PROD_SSH_PORT || 22,
    username: process.env.PROD_SSH_USER,
    privateKey: process.env.PROD_SSH_KEY_PATH
      ? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
      : undefined,
    compress: true, // Enable SSH compression
  },
  prodDbConfig: {
    // MySQL config for production
    host: process.env.PROD_DB_HOST || "localhost",
    user: process.env.PROD_DB_USER,
    password: process.env.PROD_DB_PASSWORD,
    database: process.env.PROD_DB_NAME,
    port: process.env.PROD_DB_PORT || 3306,
    timezone: 'Z',
  },
  localDbConfig: {
    // PostgreSQL config for local
    host: process.env.DB_HOST,
    user: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    database: process.env.DB_NAME,
    port: process.env.DB_PORT || 5432,
    ssl: process.env.DB_SSL === 'true',
    connectionTimeoutMillis: 60000,
    idleTimeoutMillis: 30000,
    max: 10 // connection pool max size
  }
};

async function updateOrderCosts() {
  const startTime = Date.now();
  let connections;
  let updatedCount = 0;
  let errorCount = 0;

  try {
    outputProgress({
      status: "running",
      operation: "Order costs update",
      message: "Initializing SSH tunnel..."
    });

    connections = await setupConnections(sshConfig);
    const { prodConnection, localConnection } = connections;

    // 1. Get all orders from local database that need cost updates
    outputProgress({
      status: "running",
      operation: "Order costs update",
      message: "Getting orders from local database..."
    });

    const [orders] = await localConnection.query(`
      SELECT DISTINCT order_number, pid
      FROM orders
      WHERE costeach = 0 OR costeach IS NULL
      ORDER BY order_number
    `);

    if (!orders || !orders.rows || orders.rows.length === 0) {
      console.log("No orders found that need cost updates");
      return { updatedCount: 0, errorCount: 0 };
    }

    const totalOrders = orders.rows.length;
    console.log(`Found ${totalOrders} orders that need cost updates`);

    // Process in batches of 500 orders
    const BATCH_SIZE = 500;
    for (let i = 0; i < orders.rows.length; i += BATCH_SIZE) {
      try {
        // Start transaction for this batch
        await localConnection.beginTransaction();

        const batch = orders.rows.slice(i, i + BATCH_SIZE);

        const orderNumbers = [...new Set(batch.map(o => o.order_number))];

        // 2. Fetch costs from production database for these orders
        outputProgress({
          status: "running",
          operation: "Order costs update",
          message: `Fetching costs for orders ${i + 1} to ${Math.min(i + BATCH_SIZE, totalOrders)} of ${totalOrders}`,
          current: i,
          total: totalOrders,
          elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
        });

        const [costs] = await prodConnection.query(`
          SELECT
            oc.orderid as order_number,
            oc.pid,
            oc.costeach
          FROM order_costs oc
          INNER JOIN (
            SELECT
              orderid,
              pid,
              MAX(id) as max_id
            FROM order_costs
            WHERE orderid IN (?)
              AND pending = 0
            GROUP BY orderid, pid
          ) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
        `, [orderNumbers]);

        // Create a map of costs for easy lookup
        const costMap = {};
        if (costs && costs.length) {
          costs.forEach(c => {
            costMap[`${c.order_number}-${c.pid}`] = c.costeach || 0;
          });
        }

        // 3. Update costs in local database by batches
        // Using a more efficient update approach with a temporary table

        // Create a temporary table for each batch
        await localConnection.query(`
          DROP TABLE IF EXISTS temp_order_costs;
          CREATE TEMP TABLE temp_order_costs (
            order_number VARCHAR(50) NOT NULL,
            pid BIGINT NOT NULL,
            costeach DECIMAL(10,3) NOT NULL,
            PRIMARY KEY (order_number, pid)
          );
        `);

        // Insert cost data into the temporary table
        const costEntries = [];
        for (const order of batch) {
          const key = `${order.order_number}-${order.pid}`;
          if (key in costMap) {
            costEntries.push({
              order_number: order.order_number,
              pid: order.pid,
              costeach: costMap[key]
            });
          }
        }

        // Insert in sub-batches of 50
        const DB_BATCH_SIZE = 50;
        for (let j = 0; j < costEntries.length; j += DB_BATCH_SIZE) {
          const subBatch = costEntries.slice(j, j + DB_BATCH_SIZE);
          if (subBatch.length === 0) continue;

          const placeholders = subBatch.map((_, idx) =>
            `($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
          ).join(',');

          const values = subBatch.flatMap(item => [
            item.order_number,
            item.pid,
            item.costeach
          ]);

          await localConnection.query(`
            INSERT INTO temp_order_costs (order_number, pid, costeach)
            VALUES ${placeholders}
          `, values);
        }

        // Perform bulk update from the temporary table
        const [updateResult] = await localConnection.query(`
          UPDATE orders o
          SET costeach = t.costeach
          FROM temp_order_costs t
          WHERE o.order_number = t.order_number AND o.pid = t.pid
          RETURNING o.id
        `);

        const batchUpdated = updateResult.rowCount || 0;
        updatedCount += batchUpdated;

        // Commit transaction for this batch
        await localConnection.commit();

        outputProgress({
          status: "running",
          operation: "Order costs update",
          message: `Updated ${updatedCount} orders with costs from production (batch: ${batchUpdated})`,
          current: i + batch.length,
          total: totalOrders,
          elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
        });
      } catch (error) {
        // If a batch fails, roll back that batch's transaction and continue
        try {
          await localConnection.rollback();
        } catch (rollbackError) {
          console.error("Error during batch rollback:", rollbackError);
        }

        console.error(`Error processing batch ${i}-${i + BATCH_SIZE}:`, error);
        errorCount++;
      }
    }

    // 4. For orders with no matching costs, set a default based on price
    outputProgress({
      status: "running",
      operation: "Order costs update",
      message: "Setting default costs for remaining orders..."
    });

    // Process remaining updates in smaller batches
    const DEFAULT_BATCH_SIZE = 10000;
    let totalDefaultUpdated = 0;

    try {
      // Start with a count query to determine how many records need the default update
      const [countResult] = await localConnection.query(`
        SELECT COUNT(*) as count FROM orders
        WHERE (costeach = 0 OR costeach IS NULL)
      `);

      const totalToUpdate = parseInt(countResult.rows[0]?.count || 0);

      if (totalToUpdate > 0) {
        console.log(`Applying default cost to ${totalToUpdate} orders`);

        // Apply the default in batches with separate transactions
        for (let i = 0; i < totalToUpdate; i += DEFAULT_BATCH_SIZE) {
          try {
            await localConnection.beginTransaction();

            const [defaultUpdates] = await localConnection.query(`
              WITH orders_to_update AS (
                SELECT id FROM orders
                WHERE (costeach = 0 OR costeach IS NULL)
                LIMIT ${DEFAULT_BATCH_SIZE}
              )
              UPDATE orders o
              SET costeach = price * 0.5
              FROM orders_to_update otu
              WHERE o.id = otu.id
              RETURNING o.id
            `);

            const batchDefaultUpdated = defaultUpdates.rowCount || 0;
            totalDefaultUpdated += batchDefaultUpdated;

            await localConnection.commit();

            outputProgress({
              status: "running",
              operation: "Order costs update",
              message: `Applied default costs to ${totalDefaultUpdated} of ${totalToUpdate} orders`,
              current: totalDefaultUpdated,
              total: totalToUpdate,
              elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
            });
          } catch (error) {
            try {
              await localConnection.rollback();
            } catch (rollbackError) {
              console.error("Error during default update rollback:", rollbackError);
            }

            console.error(`Error applying default costs batch ${i}-${i + DEFAULT_BATCH_SIZE}:`, error);
            errorCount++;
          }
        }
      }
    } catch (error) {
      console.error("Error counting or updating remaining orders:", error);
      errorCount++;
    }

    updatedCount += totalDefaultUpdated;

    const endTime = Date.now();
    const totalSeconds = (endTime - startTime) / 1000;

    outputProgress({
      status: "complete",
      operation: "Order costs update",
      message: `Updated ${updatedCount} orders (${totalDefaultUpdated} with default values) in ${formatElapsedTime(totalSeconds)}`,
      elapsed: formatElapsedTime(totalSeconds)
    });

    return {
      status: "complete",
      updatedCount,
      errorCount
    };
  } catch (error) {
    console.error("Error during order costs update:", error);

    return {
      status: "error",
      error: error.message,
      updatedCount,
      errorCount
    };
  } finally {
    if (connections) {
      await closeConnections(connections).catch(err => {
        console.error("Error closing connections:", err);
      });
    }
  }
}

// Run the script only if this is the main module
if (require.main === module) {
  updateOrderCosts().then((results) => {
    console.log('Cost update completed:', results);
    // Force exit after a small delay to ensure all logs are written
    setTimeout(() => process.exit(0), 500);
  }).catch((error) => {
    console.error("Unhandled error:", error);
    // Force exit with error code after a small delay
    setTimeout(() => process.exit(1), 500);
  });
}

// Export the function for use in other scripts
module.exports = updateOrderCosts;
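The sub-batch INSERT in the script above builds PostgreSQL positional placeholders inline with a `map` over the batch. Here is a minimal standalone sketch of that pattern, generalized to any column count; the helper name `buildPlaceholders` is hypothetical and not part of the repository:

```javascript
// Build "($1, $2, $3),($4, $5, $6),..." for a batched INSERT.
// Positional parameters are 1-based and keep counting across rows,
// so row i, column j maps to $(i * colsPerRow + j + 1).
function buildPlaceholders(rowCount, colsPerRow) {
  const groups = [];
  for (let i = 0; i < rowCount; i++) {
    const cols = [];
    for (let j = 0; j < colsPerRow; j++) {
      cols.push(`$${i * colsPerRow + j + 1}`);
    }
    groups.push(`(${cols.join(', ')})`);
  }
  return groups.join(',');
}

console.log(buildPlaceholders(2, 3)); // ($1, $2, $3),($4, $5, $6)
```

Pairing this with a `flatMap` over the rows (as the script does for `values`) keeps the placeholder string and the flat parameter array in lockstep, which is what makes the multi-row `VALUES` insert safe to parameterize.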
@@ -1,226 +0,0 @@
I will provide a JSON array with product data. Process the array by combining all products from validData and invalidData arrays into a single array, excluding any fields starting with “__”, such as “__index” or “__errors”. Process each product according to the reference guidelines below. If a field is not included in the data, do not include it in your response (e.g. do not include its key or any value) unless the specific field guidelines below say otherwise. If a product appears to be from an empty or entirely invalid line, do not include it in your response.

Your response should be a JSON object with the following structure:
{
  "correctedData": [], // Array of corrected products
  "changes": [], // Array of strings describing each change made
  "warnings": [] // Array of strings with warnings or suggestions for manual review (see below for details)
}

IMPORTANT: For all fields that use IDs (categories, supplier, company, line, subline, ship_restrictions, tax_cat, artist, themes, etc.), you MUST return the ID values, not the display names. The system will handle converting IDs to display names.

Using the provided guidelines, focus on:
1. Correcting typos and any incorrect spelling or grammar
2. Standardizing product names
3. Correcting and enhancing descriptions by adding details, keywords, and SEO-friendly language
4. Fixing any obvious errors or inconsistencies between similar products in measurements, prices, or quantities
5. Adding correct categories, themes, and colors

Use only the provided data and your own knowledge to make changes. Do not make assumptions or make up information that you're not sure about. If you're unable to make a change you're confident about, leave the field as is. All data passed in should be validated, corrected, and returned. All values returned should be strings, not numbers. Do not leave out any fields that were present in the original data.

Possible reasons for including a warning in the warnings array:
- If you're unable to make a change you're confident about but you believe one needs to be made
- If there are inconsistencies in the data that could be valid but need to be reviewed
- If not enough information is provided to make a change that you believe is needed
- If you infer a value for a required field based on context

----------PRODUCT FIELD GUIDELINES----------

Fields: supplier, private_notes, company, line, subline, artist
Changes: Not allowed
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, return these fields exactly as provided with no changes

Fields: upc, supplier_no, notions_no, item_number
Changes: Formatting only
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, trim outside white space and return these fields exactly as provided with no other changes

Fields: hts_code
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, trim white space and any non-numeric characters, then return as a string. Do not validate in any other way.

Fields: image_url
Changes: Formatting only
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, convert all comma-separated values to valid https:// URLs and return

Fields: msrp, cost_each
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, strip any currency symbols and return as a string with exactly two decimal places, even if the last place is a 0.

Fields: qty_per_unit, case_qty
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, strip non-numeric characters and return

Fields: ship_restrictions
Changes: Only add a value if it's not already present
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return 0.
Instructions: Always return a value exactly as provided, or return 0 if no value is provided.

Fields: eta
Changes: Minimal, you can correct formatting, obvious errors or inconsistencies
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, return a full month name, day is optional, no year ever (e.g. “January” or “March 3”). This value is not required if not provided.

Fields: name
Changes: Allowed to conform to guidelines, to fix typos or formatting
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return the most reasonable value possible based on the naming guidelines and the other information you have.
Instructions: Always return a value that is corrected and enhanced per additional guidelines below

Fields: description
Changes: Full creative control allowed within guidelines
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return the most accurate description possible based on the description guidelines and the other information you have.
Instructions: Always return a value that is corrected and enhanced per additional guidelines below

Fields: weight, length, width, height
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return your best guess based on the other information you have or the dimensions for similar products.
Instructions: Always return a reasonable value (weights in ounces and dimensions in inches) that is validated against similar provided products and your knowledge of general object measurements (e.g. a sheet of paper is not going to be 3 inches thick, a pack of stickers is not going to be 250 ounces, this sheet of paper is very likely going to be the same size as that other sheet of paper from the same line). If a value is unusual or unreasonable, even wildly so, change it to match similar products or to be more reasonable. When correcting unreasonable weights or dimensions, prioritize comparisons to products from the same company and product line first, then broader category matches or common knowledge if necessary. Do not return 0 or null for any of these fields.

Fields: coo
Changes: Formatting only
Required: Return if present in the original data. Do not return if not present.
Instructions: If present, convert all country names and abbreviations to the official ISO 3166-1 alpha-2 two-character country code. Convert any value with more than two characters to two characters only (e.g. "United States" or "USA" should both return "US").

Fields: tax_cat
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: You must always return a value for this field, even if it's not provided in the original data. If no value is provided, return 0.
Instructions: Always return a valid numerical tax code ID from the Available Tax Codes array below. Give preference to the value provided, but correct it if another value is more accurate. You must return a value for this field. 0 should be the default value in most cases.

Fields: size_cat
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: Return if present in the original data or if not present and applicable. Do not return if not applicable (e.g. if no size categories apply based on what you know about the product).
Instructions: If present or if applicable, return one valid numerical size category ID from the Available Size Categories array below. Give preference to the value provided, but correct it if another value is more accurate. If the product name contains a match for one of the size categories (such as 12x12, 6x6, 2oz, etc) you MUST return that size category with the results. A value is not required if none of the size categories apply.

Fields: themes
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: Return if present in the original data or if not present and applicable. Do not return any value if not applicable (e.g. if no themes apply based on what you know about the product).
Instructions: If present, confirm that each provided theme matches what you understand to be a theme of the product. Remove any themes that do not match and add any themes that are missing. Most products will have zero or one theme. Return a comma-separated list of numerical theme IDs from the Available Themes array below. If you choose a sub-theme, you do not need to include its parent theme in the list.

Fields: colors
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: Return if present in the original data or if not present and applicable. Do not return any value if not applicable (e.g. if no colors apply based on what you know about the product).
Instructions: If present or if applicable, return a comma-separated list of numerical color IDs from the Available Colors array below, using the product name as the primary guide (e.g. if the name contains Blue or a blue variant, you should return the blue color ID). A value is not required if none of the colors apply. Most products will have zero colors.

Fields: categories
Changes: Allowed to correct obvious errors or inconsistencies or to add missing values
Required: You must always return at least one value for this field, even if it's not provided in the original data. If no value is provided, return the most appropriate category or categories based on the other information you have.
Instructions: Always return a comma-separated list of one or more valid numerical category IDs from the Available Categories array below. Give preference to the values provided, particularly if the other information isn't enough to determine a category, but correct them or add new categories if another value is more accurate. Do not return categories in the Deals or Black Friday categories, and strip these from the list if present. If you choose a subcategory at any level, you do not need to include its parent categories in the list. You must return at least one category and you can return multiple categories if applicable. All categories have equal value so their order is not important. Always try to return the most specific categories possible (e.g. one in the third level of the category hierarchy is better than one in the second level).

----------PRODUCT NAMING GUIDELINES----------
If there's only one of this type of product in a line: [Line Name] [Product Name] - [Company]
Example: "Cosmos Infinity Chipboard - Stamperia"
Example: "Serene Petals 6x6 Paper Pad - Prima"

Multiple similar products in a line: [Differentiator] [Product Type] - [Line Name] - [Company]
Example: "Ice & Shells Stencil - Arctic Antarctic - Stamperia"
Example: "Astronomy Paper - Cosmos Infinity - Stamperia"

Standalone products: [Product Name] - [Company]
Example: "Hedwig Puffy Stickers - Paper House Productions"
Example: "Heart Tree Dies - Lawn Fawn"

Color-based products: [Color] [Product Name] - [Company]
Example: "Green Valley Enamel Dots - Altenew"
Example: "Magenta Aqua Pigment - Brutus Monroe"

Complex products: [Differentiator] [Line] [Product Type] - [Company]
Example: "Size 6 Round Black Velvet Watercolor Brush - Silver Brush Limited" (Size 6 Round is the differentiator, Black Velvet is the line, Watercolor Brush is the product type)

These should not be included in the name, unless there are multiple products that are otherwise identical:
- Product size
- Product weight
- Number of pages
- How many are in the package

Naming Conventions:
- Paper sizes: Use "12x12", "8x8", "6x6" (no spaces or units of measure)
- Company names must match backend exactly
- Always capitalize every word in the name, including short articles like "The" and "An"
- Use "Idea-ology" (not "idea-ology" or "Ideaology")
- All stamps are "Stamp Set" (not "Clear Stamps" or "Rubber Stamps")
- All dies are "Dies" or "Die" (not "Die Set")
- Brands with their own naming conventions should be respected, such as "Doodle Cuts" for dies from Doodlebug

Special Brand Rules - Ranger:
Format: [Product Name] - [Designer Line] - Ranger
Possible Designers: Dylusions, Dina Wakley MEdia, Simon Hurley create., Wendy Vecchi
Example: "Stacked Stencil - Dina Wakley MEdia - Ranger"

Special Brand Rules - Tim Holtz products from Ranger:
Format: [Color] [Product Name/Type] - Tim Holtz Distress - Ranger
Example: "Mermaid Lagoon Tim Holtz Distress Oxide Ink Pad - Ranger"

Special Brand Rules - Tim Holtz products from Sizzix or Stampers Anonymous:
Format: [Product Name] [Product Type] by Tim Holtz - [Company]
Example: "Leaf Fragments Thinlits Dies by Tim Holtz - Sizzix"

Special Brand Rules - Tim Holtz products from Advantus/Idea-ology:
Format: [Product Name] - Tim Holtz Idea-ology
Example: "Tiny Vials - Tim Holtz Idea-ology"

Special Brand Rules - Dies from Sizzix:
Include die type plus "Dies" or "Die"
Examples:
"Art Nouveau 3-D Textured Impressions Embossing Folder - Sizzix"
"Pocket Pals Thinlits Dies - Sizzix"
"Butterfly Wishes Framelits Dies & Stamps - Sizzix"

Important Notes
- Ensure that product names are consistent across all products of the same type
- Use the minimum amount of information needed to uniquely identify the product
- Put detailed specifications in the product description, not its name

Edge Cases
- If the product is missing a company name, infer one from the other products included in the data
- If the product is missing a clear differentiator and needs one to be unique, infer and add one from the other data provided (e.g. the description, existing size categories, etc.)

Incorrect Example: MVP Rugby - Collection Pack - Photoplay
Notes: there should be no dash between the line and the product

Incorrect Example: A2 Easel Cards - Black - Photoplay
Notes: the differentiating factor should come first: “Black A2 Easel Cards - Photoplay”. Size is ok to include here because this is the name printed on the package.

Incorrect Example: 6” - Scriber Needle Modeling Tool
Notes: this product only comes in one size, so 6” isn’t needed. The company name should also be included.

Incorrect Example: Slick - White - Tulip Dimensional Fabric Paint 4oz
Notes: color should be first, then type, then product, then company, so “White Slick Dimensional Fabric Paint - Tulip”. It appears there’s only one size available so no need to differentiate in the name.

Incorrect Example: Silhouette Adhesive Cork Sheets 5”X7” 8/Pkg
Notes: should be “Adhesive Cork Sheets - Silhouette”

Incorrect Example: Galaxy - Opaque - American Crafts Color Pour Resin Dyes
Notes: “Galaxy Opaque Dye Set - Color Pour Resin - American Crafts”

Incorrect Example: Slate - Lion Brand Truboo Yarn
Notes: [Differentiator] [Line] [Product Type] - [Company] : “Slate Truboo Yarn - Lion Brand”

Incorrect Example: Rose Quartz Dylusions Shimmer Paint
Notes: “Rose Quartz Shimmer Paint - Dylusions - Ranger”

----------PRODUCT DESCRIPTION GUIDELINES----------
Product descriptions are an extremely important part of the listing and are the most important part of your response. Care should be taken to ensure they are correct, helpful, and SEO-friendly.

If a description is provided in the data, use it as a starting point. Correct any spelling errors, typos, poor grammar, or awkward phrasing. If necessary and you have the information, add more details, describe how the customer could use it, etc. Use complete sentences and keep SEO in mind.

If no description is provided, make one up using the product name, the information you have, and the other provided guidelines. At minimum, a description should be one complete sentence that starts with a capital letter and ends with a period. Unless the product is extremely complex, 2-4 sentences is usually sufficient if you have enough information.

Important Notes:
- Every description should state exactly what's included in the product (e.g. "Includes one 12x12 sheet of patterned cardstock." or "Includes one 6x12 sheet with 27 unique stickers." or "Includes 55 pieces." or "Package includes machine, power cord, 12 sheets of cardstock, 3 dies, and project instructions.")
- Do not use the word "our" in the description (this usually shows up when we copy a description from the manufacturer). Instead use "these" or "[Company name] [product]" or similar. (e.g. don't use "Our journals are hand-made in the USA", instead use "These journals are hand made..." or "Archer & Olive journals are handmade...")
- Don't include statements that add no value like “this is perfect for all your paper crafts”. If the product helps to solve a unique problem or has a unique feature, by all means describe it, but if it’s just a normal sheet of paper or pack of stickers, you don’t have to pretend like it’s the best thing ever. At the same time, ensure that you add enough copy to ensure good SEO.
- State as many facts as you can about the product, considering the viewpoint of the customer and what they would want to know when looking at it. They probably want to know dimensions, what products it’s compatible with, how thick the paper is, how many sheets are included, whether the sheets are double-sided or not, which items are in the kit, etc. Say as much as you possibly can with the information that you have.
- !!DO NOT make up information if you aren't sure about it. A minimal correct description is better than a long incorrect one!!

Avoid/remove:
- The word "Imported"
- Any warnings about Prop 65, choking hazards, etc
- The manufacturer's name if it's included as the very first thing in the description
- Any statement similar to "comes in a variety of colors, each sold separately"
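The deleted prompt above pins down a response contract: an object with `correctedData`, `changes`, and `warnings` arrays, every corrected field value returned as a string, and no `"__"`-prefixed internal fields. A minimal sketch of a validator for that shape (the `validateAiResponse` helper is hypothetical and not part of the repository):

```javascript
// Check an AI response against the contract described in the prompt:
// top-level arrays present, corrected products free of "__" fields,
// and all product values returned as strings.
function validateAiResponse(resp) {
  const errors = [];
  for (const key of ['correctedData', 'changes', 'warnings']) {
    if (!Array.isArray(resp[key])) errors.push(`${key} must be an array`);
  }
  for (const [i, product] of (resp.correctedData || []).entries()) {
    for (const [field, value] of Object.entries(product)) {
      if (field.startsWith('__')) errors.push(`product ${i}: internal field ${field} must be excluded`);
      if (typeof value !== 'string') errors.push(`product ${i}: ${field} must be a string`);
    }
  }
  return errors; // empty array means the response conforms
}

console.log(validateAiResponse({
  correctedData: [{ name: 'Heart Tree Dies - Lawn Fawn', msrp: '24.00' }],
  changes: ['Standardized name'],
  warnings: []
})); // []
```

A check like this could run before the corrected products are merged back, so that a malformed model response fails fast instead of writing numbers or internal fields into the catalog.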
335
inventory-server/src/routes/ai-prompts.js
Normal file
@@ -0,0 +1,335 @@
const express = require('express');
const router = express.Router();

// Get all AI prompts
router.get('/', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM ai_prompts
      ORDER BY prompt_type ASC, company ASC
    `);
    res.json(result.rows);
  } catch (error) {
    console.error('Error fetching AI prompts:', error);
    res.status(500).json({
      error: 'Failed to fetch AI prompts',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get prompt by ID
router.get('/:id', async (req, res) => {
  try {
    const { id } = req.params;
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM ai_prompts
      WHERE id = $1
    `, [id]);

    if (result.rows.length === 0) {
      return res.status(404).json({ error: 'AI prompt not found' });
    }

    res.json(result.rows[0]);
  } catch (error) {
    console.error('Error fetching AI prompt:', error);
    res.status(500).json({
      error: 'Failed to fetch AI prompt',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get prompt by company
router.get('/company/:companyId', async (req, res) => {
  try {
    const { companyId } = req.params;
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM ai_prompts
      WHERE company = $1
    `, [companyId]);

    if (result.rows.length === 0) {
      return res.status(404).json({ error: 'AI prompt not found for this company' });
    }

    res.json(result.rows[0]);
  } catch (error) {
    console.error('Error fetching AI prompt by company:', error);
    res.status(500).json({
      error: 'Failed to fetch AI prompt by company',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get general prompt
router.get('/type/general', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM ai_prompts
      WHERE prompt_type = 'general'
    `);

    if (result.rows.length === 0) {
      return res.status(404).json({ error: 'General AI prompt not found' });
    }

    res.json(result.rows[0]);
  } catch (error) {
    console.error('Error fetching general AI prompt:', error);
    res.status(500).json({
      error: 'Failed to fetch general AI prompt',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get system prompt
router.get('/type/system', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'system'
|
||||
`);
|
||||
|
||||
if (result.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'System AI prompt not found' });
|
||||
}
|
||||
|
||||
res.json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error fetching system AI prompt:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to fetch system AI prompt',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Create new AI prompt
|
||||
router.post('/', async (req, res) => {
|
||||
try {
|
||||
const {
|
||||
prompt_text,
|
||||
prompt_type,
|
||||
company
|
||||
} = req.body;
|
||||
|
||||
// Validate required fields
|
||||
if (!prompt_text || !prompt_type) {
|
||||
return res.status(400).json({ error: 'Prompt text and type are required' });
|
||||
}
|
||||
|
||||
// Validate prompt type
|
||||
if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
|
||||
return res.status(400).json({ error: 'Prompt type must be either "general", "company_specific", or "system"' });
|
||||
}
|
||||
|
||||
// Validate company is provided for company-specific prompts
|
||||
if (prompt_type === 'company_specific' && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for company-specific prompts' });
|
||||
}
|
||||
|
||||
// Validate company is not provided for general or system prompts
|
||||
if ((prompt_type === 'general' || prompt_type === 'system') && company) {
|
||||
return res.status(400).json({ error: 'Company should not be provided for general or system prompts' });
|
||||
}
|
||||
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
INSERT INTO ai_prompts (
|
||||
prompt_text,
|
||||
prompt_type,
|
||||
company
|
||||
) VALUES ($1, $2, $3)
|
||||
RETURNING *
|
||||
`, [
|
||||
prompt_text,
|
||||
prompt_type,
|
||||
company
|
||||
]);
|
||||
|
||||
res.status(201).json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error creating AI prompt:', error);
|
||||
|
||||
// Check for unique constraint violations
|
||||
if (error instanceof Error && error.message.includes('unique constraint')) {
|
||||
if (error.message.includes('unique_company_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A prompt already exists for this company',
|
||||
details: error.message
|
||||
});
|
||||
} else if (error.message.includes('idx_unique_general_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A general prompt already exists',
|
||||
details: error.message
|
||||
});
|
||||
} else if (error.message.includes('idx_unique_system_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A system prompt already exists',
|
||||
details: error.message
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
res.status(500).json({
|
||||
error: 'Failed to create AI prompt',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Update AI prompt
|
||||
router.put('/:id', async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const {
|
||||
prompt_text,
|
||||
prompt_type,
|
||||
company
|
||||
} = req.body;
|
||||
|
||||
// Validate required fields
|
||||
if (!prompt_text || !prompt_type) {
|
||||
return res.status(400).json({ error: 'Prompt text and type are required' });
|
||||
}
|
||||
|
||||
// Validate prompt type
|
||||
if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
|
||||
return res.status(400).json({ error: 'Prompt type must be either "general", "company_specific", or "system"' });
|
||||
}
|
||||
|
||||
// Validate company is provided for company-specific prompts
|
||||
if (prompt_type === 'company_specific' && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for company-specific prompts' });
|
||||
}
|
||||
|
||||
// Validate company is not provided for general or system prompts
|
||||
if ((prompt_type === 'general' || prompt_type === 'system') && company) {
|
||||
return res.status(400).json({ error: 'Company should not be provided for general or system prompts' });
|
||||
}
|
||||
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Check if the prompt exists
|
||||
const checkResult = await pool.query('SELECT * FROM ai_prompts WHERE id = $1', [id]);
|
||||
if (checkResult.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'AI prompt not found' });
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
UPDATE ai_prompts
|
||||
SET
|
||||
prompt_text = $1,
|
||||
prompt_type = $2,
|
||||
company = $3
|
||||
WHERE id = $4
|
||||
RETURNING *
|
||||
`, [
|
||||
prompt_text,
|
||||
prompt_type,
|
||||
company,
|
||||
id
|
||||
]);
|
||||
|
||||
res.json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error updating AI prompt:', error);
|
||||
|
||||
// Check for unique constraint violations
|
||||
if (error instanceof Error && error.message.includes('unique constraint')) {
|
||||
if (error.message.includes('unique_company_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A prompt already exists for this company',
|
||||
details: error.message
|
||||
});
|
||||
} else if (error.message.includes('idx_unique_general_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A general prompt already exists',
|
||||
details: error.message
|
||||
});
|
||||
} else if (error.message.includes('idx_unique_system_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A system prompt already exists',
|
||||
details: error.message
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
res.status(500).json({
|
||||
error: 'Failed to update AI prompt',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Delete AI prompt
|
||||
router.delete('/:id', async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
const result = await pool.query('DELETE FROM ai_prompts WHERE id = $1 RETURNING *', [id]);
|
||||
|
||||
if (result.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'AI prompt not found' });
|
||||
}
|
||||
|
||||
res.json({ message: 'AI prompt deleted successfully' });
|
||||
} catch (error) {
|
||||
console.error('Error deleting AI prompt:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to delete AI prompt',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Error handling middleware
|
||||
router.use((err, req, res, next) => {
|
||||
console.error('AI prompts route error:', err);
|
||||
res.status(500).json({
|
||||
error: 'Internal server error',
|
||||
details: err.message
|
||||
});
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
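The POST and PUT handlers above apply the same four validation rules, which invites factoring them into a shared pure helper. This is an illustrative sketch, not code from the repository; the name `validatePromptPayload` is made up here.

```javascript
// Hypothetical helper mirroring the validation rules duplicated in the
// POST and PUT handlers above (a sketch, not actual repository code).
// Returns an error message string, or null when the payload is valid.
function validatePromptPayload({ prompt_text, prompt_type, company } = {}) {
  if (!prompt_text || !prompt_type) {
    return 'Prompt text and type are required';
  }
  if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
    return 'Prompt type must be one of "general", "company_specific", or "system"';
  }
  if (prompt_type === 'company_specific' && !company) {
    return 'Company is required for company-specific prompts';
  }
  if ((prompt_type === 'general' || prompt_type === 'system') && company) {
    return 'Company should not be provided for general or system prompts';
  }
  return null; // payload is valid
}
```

Each handler could then reduce its validation block to a single call and a `res.status(400).json({ error })` on a non-null result.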
@@ -289,8 +289,108 @@ async function generateDebugResponse(productsToUse, res) {
   });

   try {
-    const prompt = await loadPrompt(promptConnection, productsToUse);
-    const fullPrompt = prompt + "\n" + JSON.stringify(productsToUse);
+    // Get the local PostgreSQL pool to fetch prompts
+    const pool = res.app.locals.pool;
+    if (!pool) {
+      console.warn("⚠️ Local database pool not available for prompts");
+      throw new Error("Database connection not available");
+    }
+
+    // First, fetch the system prompt
+    const systemPromptResult = await pool.query(`
+      SELECT * FROM ai_prompts
+      WHERE prompt_type = 'system'
+    `);
+
+    // Get system prompt or use default
+    let systemPrompt = null;
+    if (systemPromptResult.rows.length > 0) {
+      systemPrompt = systemPromptResult.rows[0];
+      console.log("📝 Loaded system prompt from database, ID:", systemPrompt.id);
+    } else {
+      console.warn("⚠️ No system prompt found in database, will use default");
+    }
+
+    // Then, fetch the general prompt
+    const generalPromptResult = await pool.query(`
+      SELECT * FROM ai_prompts
+      WHERE prompt_type = 'general'
+    `);
+
+    if (generalPromptResult.rows.length === 0) {
+      console.warn("⚠️ No general prompt found in database");
+      throw new Error("No general prompt found in database");
+    }
+
+    // Get the general prompt text and info
+    const generalPrompt = generalPromptResult.rows[0];
+    console.log("📝 Loaded general prompt from database, ID:", generalPrompt.id);
+
+    // Fetch company-specific prompts if we have products to validate
+    let companyPrompts = [];
+    if (productsToUse && Array.isArray(productsToUse)) {
+      // Extract unique company IDs from products
+      const companyIds = new Set();
+      productsToUse.forEach(product => {
+        if (product.company) {
+          companyIds.add(String(product.company));
+        }
+      });
+
+      if (companyIds.size > 0) {
+        console.log(`🔍 Found ${companyIds.size} unique companies in products:`, Array.from(companyIds));
+
+        // Fetch company-specific prompts
+        const companyPromptsResult = await pool.query(`
+          SELECT * FROM ai_prompts
+          WHERE prompt_type = 'company_specific'
+          AND company = ANY($1)
+        `, [Array.from(companyIds)]);
+
+        companyPrompts = companyPromptsResult.rows;
+        console.log(`📝 Loaded ${companyPrompts.length} company-specific prompts`);
+      }
+    }
+
+    // Find company names from taxonomy for the validation endpoint
+    const companyPromptsWithNames = companyPrompts.map(prompt => {
+      let companyName = "Unknown Company";
+      if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
+        const companyData = taxonomy.companies.find(company =>
+          String(company[0]) === String(prompt.company)
+        );
+        if (companyData && companyData[1]) {
+          companyName = companyData[1];
+        }
+      }
+
+      return {
+        id: prompt.id,
+        company: prompt.company,
+        companyName: companyName,
+        prompt_text: prompt.prompt_text
+      };
+    });
+
+    // Now use loadPrompt to get the actual combined prompt
+    const promptData = await loadPrompt(promptConnection, productsToUse, res.app.locals.pool);
+    const fullUserPrompt = promptData.userContent + "\n" + JSON.stringify(productsToUse);
+    const promptLength = promptData.systemInstructions.length + fullUserPrompt.length; // Store prompt length for performance metrics
+    console.log("📝 Generated prompt length:", promptLength);
+    console.log("📝 System instructions length:", promptData.systemInstructions.length);
+    console.log("📝 User content length:", fullUserPrompt.length);
+
+    // Format the messages as they would be sent to the API
+    const apiMessages = [
+      {
+        role: "system",
+        content: promptData.systemInstructions
+      },
+      {
+        role: "user",
+        content: fullUserPrompt
+      }
+    ];

     // Create the response with taxonomy stats
     let categoriesCount = 0;
@@ -330,9 +430,28 @@ async function generateDebugResponse(productsToUse, res) {
               : null,
           }
         : null,
-      basePrompt: prompt,
-      sampleFullPrompt: fullPrompt,
-      promptLength: fullPrompt.length,
+      basePrompt: systemPrompt ? systemPrompt.prompt_text + "\n\n" + generalPrompt.prompt_text : generalPrompt.prompt_text,
+      sampleFullPrompt: fullUserPrompt,
+      promptLength: promptLength,
+      apiFormat: apiMessages,
+      promptSources: {
+        ...(systemPrompt ? {
+          systemPrompt: {
+            id: systemPrompt.id,
+            prompt_text: systemPrompt.prompt_text
+          }
+        } : {
+          systemPrompt: {
+            id: 0,
+            prompt_text: `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`
+          }
+        }),
+        generalPrompt: {
+          id: generalPrompt.id,
+          prompt_text: generalPrompt.prompt_text
+        },
+        companyPrompts: companyPromptsWithNames
+      }
     };

     console.log("Sending response with taxonomy stats:", response.taxonomyStats);
@@ -513,22 +632,101 @@ SELECT t.cat_id,t.name,null as master_cat_id,1 AS level_order FROM product_categ
   }
 }

-// Load the prompt from file and inject taxonomy data
-async function loadPrompt(connection, productsToValidate = null) {
+// Load prompts from database and inject taxonomy data
+async function loadPrompt(connection, productsToValidate = null, appPool = null) {
   try {
-    const promptPath = path.join(
-      __dirname,
-      "..",
-      "prompts",
-      "product-validation.txt"
-    );
-    const basePrompt = await fs.readFile(promptPath, "utf8");
-
     // Get taxonomy data using the provided MySQL connection
     const taxonomy = await getTaxonomyData(connection);

-    // Add system instructions to the prompt
-    const systemInstructions = `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`;
+    // Use the provided pool parameter instead of global.app
+    const pool = appPool;
+    if (!pool) {
+      console.warn("⚠️ Local database pool not available for prompts");
+      throw new Error("Database connection not available");
+    }
+
+    // Fetch the system prompt
+    const systemPromptResult = await pool.query(`
+      SELECT * FROM ai_prompts
+      WHERE prompt_type = 'system'
+    `);
+
+    // Default system instructions in case the system prompt is not found
+    let systemInstructions = `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`;
+
+    // If system prompt exists in the database, use it
+    if (systemPromptResult.rows.length > 0) {
+      systemInstructions = systemPromptResult.rows[0].prompt_text;
+      console.log("📝 Loaded system prompt from database");
+    } else {
+      console.warn("⚠️ No system prompt found in database, using default");
+    }
+
+    // Fetch the general prompt
+    const generalPromptResult = await pool.query(`
+      SELECT * FROM ai_prompts
+      WHERE prompt_type = 'general'
+    `);
+
+    if (generalPromptResult.rows.length === 0) {
+      console.warn("⚠️ No general prompt found in database");
+      throw new Error("No general prompt found in database");
+    }
+
+    // Get the general prompt text
+    const basePrompt = generalPromptResult.rows[0].prompt_text;
+    console.log("📝 Loaded general prompt from database");
+
+    // Fetch company-specific prompts if we have products to validate
+    let companyPrompts = [];
+    if (productsToValidate && Array.isArray(productsToValidate)) {
+      // Extract unique company IDs from products
+      const companyIds = new Set();
+      productsToValidate.forEach(product => {
+        if (product.company) {
+          companyIds.add(String(product.company));
+        }
+      });
+
+      if (companyIds.size > 0) {
+        console.log(`🔍 Found ${companyIds.size} unique companies in products:`, Array.from(companyIds));
+
+        // Fetch company-specific prompts
+        const companyPromptsResult = await pool.query(`
+          SELECT * FROM ai_prompts
+          WHERE prompt_type = 'company_specific'
+          AND company = ANY($1)
+        `, [Array.from(companyIds)]);
+
+        companyPrompts = companyPromptsResult.rows;
+        console.log(`📝 Loaded ${companyPrompts.length} company-specific prompts`);
+      }
+    }
+
+    // Combine prompts - start with the general prompt
+    let combinedPrompt = basePrompt;
+
+    // Add any company-specific prompts with annotations
+    if (companyPrompts.length > 0) {
+      combinedPrompt += "\n\n--- COMPANY-SPECIFIC INSTRUCTIONS ---\n";
+
+      for (const prompt of companyPrompts) {
+        // Find company name from taxonomy
+        let companyName = "Unknown Company";
+        if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
+          const companyData = taxonomy.companies.find(company =>
+            String(company[0]) === String(prompt.company)
+          );
+          if (companyData && companyData[1]) {
+            companyName = companyData[1];
+          }
+        }
+
+        combinedPrompt += `\n[SPECIFIC TO COMPANY: ${companyName} (ID: ${prompt.company})]:\n${prompt.prompt_text}\n`;
+      }
+
+      combinedPrompt += "\n--- END COMPANY-SPECIFIC INSTRUCTIONS ---\n";
+    }

     // If we have products to validate, create a filtered prompt
     if (productsToValidate) {
@@ -655,11 +853,14 @@ ${JSON.stringify(mixedTaxonomy.sizeCategories)}${

 ----------Here is the product data to validate----------`;

-      // Return the filtered prompt
-      return systemInstructions + basePrompt + "\n" + taxonomySection;
+      // Return both system instructions and user content separately
+      return {
+        systemInstructions,
+        userContent: combinedPrompt + "\n" + taxonomySection
+      };
     }

-    // Generate the full unfiltered prompt
+    // Generate the full unfiltered prompt for taxonomy section
     const taxonomySection = `
 Available Categories:
 ${JSON.stringify(taxonomy.categories)}
@@ -687,7 +888,11 @@ ${JSON.stringify(taxonomy.artists)}

 Here is the product data to validate:`;

-    return systemInstructions + basePrompt + "\n" + taxonomySection;
+    // Return both system instructions and user content separately
+    return {
+      systemInstructions,
+      userContent: combinedPrompt + "\n" + taxonomySection
+    };
   } catch (error) {
     console.error("Error loading prompt:", error);
     throw error; // Re-throw to be handled by the calling function
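The combining step in `loadPrompt` — the general prompt followed by annotated company-specific sections wrapped in begin/end markers — can be isolated as a pure function. The sketch below is illustrative only; `combinePrompts` is a made-up name, and it takes pre-resolved company names rather than the taxonomy lookup used in the real code.

```javascript
// Sketch of how loadPrompt concatenates the general prompt with
// company-specific instructions (hypothetical helper, not repository code).
function combinePrompts(basePrompt, companyPrompts) {
  let combined = basePrompt;
  if (companyPrompts.length > 0) {
    combined += "\n\n--- COMPANY-SPECIFIC INSTRUCTIONS ---\n";
    for (const p of companyPrompts) {
      // Each section is labeled so the model can scope instructions to one company.
      combined += `\n[SPECIFIC TO COMPANY: ${p.companyName} (ID: ${p.company})]:\n${p.prompt_text}\n`;
    }
    combined += "\n--- END COMPANY-SPECIFIC INSTRUCTIONS ---\n";
  }
  return combined;
}
```

With no company prompts the base prompt passes through unchanged, matching the `companyPrompts.length > 0` guard in the route code.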
@@ -735,18 +940,24 @@ router.post("/validate", async (req, res) => {

     // Load the prompt with the products data to filter taxonomy
     console.log("🔄 Loading prompt with filtered taxonomy...");
-    const prompt = await loadPrompt(connection, products);
-    const fullPrompt = prompt + "\n" + JSON.stringify(products);
-    promptLength = fullPrompt.length; // Store prompt length for performance metrics
+    const promptData = await loadPrompt(connection, products, req.app.locals.pool);
+    const fullUserPrompt = promptData.userContent + "\n" + JSON.stringify(products);
+    const promptLength = promptData.systemInstructions.length + fullUserPrompt.length; // Store prompt length for performance metrics
     console.log("📝 Generated prompt length:", promptLength);
+    console.log("📝 System instructions length:", promptData.systemInstructions.length);
+    console.log("📝 User content length:", fullUserPrompt.length);

     console.log("🤖 Sending request to OpenAI...");
     const completion = await openai.chat.completions.create({
-      model: "o3-mini",
+      model: "gpt-4o",
       messages: [
+        {
+          role: "system",
+          content: promptData.systemInstructions,
+        },
         {
           role: "user",
-          content: fullPrompt,
+          content: fullUserPrompt,
         },
       ],
       temperature: 0.2,
@@ -884,7 +1095,94 @@ router.post("/validate", async (req, res) => {
       console.error("⚠️ Failed to record performance metrics:", metricError);
     }

-    // Include performance metrics in the response
+    // Get sources of the prompts for tracking
+    let promptSources = null;
+
+    try {
+      // Get system prompt
+      const systemPromptResult = await pool.query(`
+        SELECT * FROM ai_prompts WHERE prompt_type = 'system'
+      `);
+
+      // Get general prompt
+      const generalPromptResult = await pool.query(`
+        SELECT * FROM ai_prompts WHERE prompt_type = 'general'
+      `);
+
+      // Extract unique company IDs from products
+      const companyIds = new Set();
+      products.forEach(product => {
+        if (product.company) {
+          companyIds.add(String(product.company));
+        }
+      });
+
+      let companyPrompts = [];
+      if (companyIds.size > 0) {
+        // Fetch company-specific prompts
+        const companyPromptsResult = await pool.query(`
+          SELECT * FROM ai_prompts
+          WHERE prompt_type = 'company_specific'
+          AND company = ANY($1)
+        `, [Array.from(companyIds)]);
+
+        companyPrompts = companyPromptsResult.rows;
+      }
+
+      // Find company names from taxonomy for the validation endpoint
+      const companyPromptsWithNames = companyPrompts.map(prompt => {
+        let companyName = "Unknown Company";
+        if (taxonomy.companies && Array.isArray(taxonomy.companies)) {
+          const companyData = taxonomy.companies.find(company =>
+            String(company[0]) === String(prompt.company)
+          );
+          if (companyData && companyData[1]) {
+            companyName = companyData[1];
+          }
+        }
+
+        return {
+          id: prompt.id,
+          company: prompt.company,
+          companyName: companyName,
+          prompt_text: prompt.prompt_text
+        };
+      });
+
+      // Set prompt sources
+      if (generalPromptResult.rows.length > 0) {
+        const generalPrompt = generalPromptResult.rows[0];
+        let systemPrompt = null;
+
+        if (systemPromptResult.rows.length > 0) {
+          systemPrompt = systemPromptResult.rows[0];
+        }
+
+        promptSources = {
+          ...(systemPrompt ? {
+            systemPrompt: {
+              id: systemPrompt.id,
+              prompt_text: systemPrompt.prompt_text
+            }
+          } : {
+            systemPrompt: {
+              id: 0,
+              prompt_text: `You are a specialized e-commerce product data processor for a crafting supplies website tasked with providing complete, correct, appealing, and SEO-friendly product listings. You should write professionally, but in a friendly and engaging tone. You have meticulous attention to detail and are a master at your craft.`
+            }
+          }),
+          generalPrompt: {
+            id: generalPrompt.id,
+            prompt_text: generalPrompt.prompt_text
+          },
+          companyPrompts: companyPromptsWithNames
+        };
+      }
+    } catch (promptSourceError) {
+      console.error("⚠️ Error getting prompt sources:", promptSourceError);
+      // Don't fail the entire validation if just prompt sources retrieval fails
+    }
+
+    // Include prompt sources in the response
     res.json({
       success: true,
       changeDetails: changeDetails,
@@ -895,6 +1193,7 @@ router.post("/validate", async (req, res) => {
         isEstimate: true,
         productCount: products.length
       },
+      promptSources: promptSources,
       ...aiResponse,
     });
   } catch (parseError) {
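The change above splits `loadPrompt`'s return value into separate system and user messages for the chat-completion call, and computes the prompt length from both parts. That mapping can be sketched as a pure function (the name `buildMessages` is an assumption, not repository code):

```javascript
// Sketch: turn loadPrompt's { systemInstructions, userContent } result into
// the messages array sent to the chat API, plus the length metric the route
// logs (hypothetical helper, not actual repository code).
function buildMessages(promptData, products) {
  const fullUserPrompt = promptData.userContent + "\n" + JSON.stringify(products);
  return {
    messages: [
      { role: "system", content: promptData.systemInstructions },
      { role: "user", content: fullUserPrompt },
    ],
    // Length of both message bodies, matching the route's promptLength metric.
    promptLength: promptData.systemInstructions.length + fullUserPrompt.length,
  };
}
```

Keeping the split explicit means the system instructions stay out of the user turn, which is what the `role: "system"` / `role: "user"` pair in the diff accomplishes.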
@@ -79,7 +79,7 @@ router.get('/profit', async (req, res) => {
         c.cat_id,
         c.name,
         c.parent_id,
-        cp.path || ' > ' || c.name
+        (cp.path || ' > ' || c.name)::text
       FROM categories c
       JOIN category_path cp ON c.parent_id = cp.cat_id
     )
@@ -137,7 +137,7 @@ router.get('/profit', async (req, res) => {
         c.cat_id,
         c.name,
         c.parent_id,
-        cp.path || ' > ' || c.name
+        (cp.path || ' > ' || c.name)::text
      FROM categories c
       JOIN category_path cp ON c.parent_id = cp.cat_id
     )
@@ -175,6 +175,13 @@ router.get('/vendors', async (req, res) => {
   try {
     const pool = req.app.locals.pool;

+    // Set cache control headers to prevent 304
+    res.set({
+      'Cache-Control': 'no-cache, no-store, must-revalidate',
+      'Pragma': 'no-cache',
+      'Expires': '0'
+    });
+
     console.log('Fetching vendor performance data...');

     // First check if we have any vendors with sales
@@ -189,7 +196,7 @@ router.get('/vendors', async (req, res) => {
     console.log('Vendor data check:', checkData);

     // Get vendor performance metrics
-    const { rows: performance } = await pool.query(`
+    const { rows: rawPerformance } = await pool.query(`
       WITH monthly_sales AS (
         SELECT
           p.vendor,
@@ -212,15 +219,15 @@ router.get('/vendors', async (req, res) => {
       )
       SELECT
         p.vendor,
-        ROUND(SUM(o.price * o.quantity)::numeric, 3) as salesVolume,
+        ROUND(SUM(o.price * o.quantity)::numeric, 3) as sales_volume,
         COALESCE(ROUND(
           (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
           NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
-        ), 0) as profitMargin,
+        ), 0) as profit_margin,
         COALESCE(ROUND(
           (SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1
-        ), 0) as stockTurnover,
-        COUNT(DISTINCT p.pid) as productCount,
+        ), 0) as stock_turnover,
+        COUNT(DISTINCT p.pid) as product_count,
         ROUND(
           ((ms.current_month / NULLIF(ms.previous_month, 0)) - 1) * 100,
           1
@@ -231,16 +238,114 @@ router.get('/vendors', async (req, res) => {
       WHERE p.vendor IS NOT NULL
         AND o.date >= CURRENT_DATE - INTERVAL '30 days'
       GROUP BY p.vendor, ms.current_month, ms.previous_month
-      ORDER BY salesVolume DESC
+      ORDER BY sales_volume DESC
       LIMIT 10
     `);

-    console.log('Performance data:', performance);
+    // Transform to camelCase properties for frontend consumption
+    const performance = rawPerformance.map(item => ({
+      vendor: item.vendor,
+      salesVolume: Number(item.sales_volume) || 0,
+      profitMargin: Number(item.profit_margin) || 0,
+      stockTurnover: Number(item.stock_turnover) || 0,
+      productCount: Number(item.product_count) || 0,
+      growth: Number(item.growth) || 0
+    }));

-    res.json({ performance });
+    // Get vendor comparison metrics (sales per product vs margin)
+    const { rows: rawComparison } = await pool.query(`
+      SELECT
+        p.vendor,
+        COALESCE(ROUND(
+          SUM(o.price * o.quantity) / NULLIF(COUNT(DISTINCT p.pid), 0),
+          2
+        ), 0) as sales_per_product,
+        COALESCE(ROUND(
+          AVG((p.price - p.cost_price) / NULLIF(p.cost_price, 0) * 100),
+          2
+        ), 0) as average_margin,
+        COUNT(DISTINCT p.pid) as size
+      FROM products p
+      LEFT JOIN orders o ON p.pid = o.pid
+      WHERE p.vendor IS NOT NULL
+        AND o.date >= CURRENT_DATE - INTERVAL '30 days'
+      GROUP BY p.vendor
+      HAVING COUNT(DISTINCT p.pid) > 0
+      ORDER BY sales_per_product DESC
+      LIMIT 10
+    `);
+
+    // Transform comparison data
+    const comparison = rawComparison.map(item => ({
+      vendor: item.vendor,
+      salesPerProduct: Number(item.sales_per_product) || 0,
+      averageMargin: Number(item.average_margin) || 0,
+      size: Number(item.size) || 0
+    }));
+
+    console.log('Performance data ready. Sending response...');
+
+    // Return complete structure that the front-end expects
+    res.json({
+      performance,
+      comparison,
+      // Add empty trends array to complete the structure
+      trends: []
+    });
   } catch (error) {
     console.error('Error fetching vendor performance:', error);
-    res.status(500).json({ error: 'Failed to fetch vendor performance' });
+    console.error('Error details:', error.message);
+
+    // Return dummy data on error with complete structure
+    res.json({
+      performance: [
+        {
+          vendor: "Example Vendor 1",
+          salesVolume: 10000,
+          profitMargin: 25.5,
+          stockTurnover: 3.2,
+          productCount: 15,
+          growth: 12.3
+        },
+        {
+          vendor: "Example Vendor 2",
+          salesVolume: 8500,
+          profitMargin: 22.8,
+          stockTurnover: 2.9,
+          productCount: 12,
+          growth: 8.7
+        },
+        {
+          vendor: "Example Vendor 3",
+          salesVolume: 6200,
+          profitMargin: 19.5,
+          stockTurnover: 2.5,
+          productCount: 8,
+          growth: 5.2
+        }
+      ],
+      comparison: [
+        {
+          vendor: "Example Vendor 1",
+          salesPerProduct: 650,
+          averageMargin: 35.2,
+          size: 15
+        },
+        {
+          vendor: "Example Vendor 2",
+          salesPerProduct: 710,
+          averageMargin: 28.5,
+          size: 12
+        },
+        {
+          vendor: "Example Vendor 3",
+          salesPerProduct: 770,
+          averageMargin: 22.8,
+          size: 8
+        }
+      ],
+      trends: []
+    });
   }
 });
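The renaming in the vendors query above exists because PostgreSQL folds unquoted column aliases to lowercase, so a `salesVolume` alias would come back as `salesvolume`; the diff therefore aliases in snake_case and converts to camelCase in JavaScript. A sketch of that row transformation (the standalone function name is an assumption; the real code does this inline in a `.map`):

```javascript
// Sketch of the per-row transformation applied to the /vendors performance
// query results: snake_case SQL aliases in, camelCase API fields out, with
// nulls and non-numeric values coerced to 0 (hypothetical helper).
function toPerformanceRow(item) {
  return {
    vendor: item.vendor,
    salesVolume: Number(item.sales_volume) || 0,
    profitMargin: Number(item.profit_margin) || 0,
    stockTurnover: Number(item.stock_turnover) || 0,
    productCount: Number(item.product_count) || 0,
    growth: Number(item.growth) || 0
  };
}
```

`Number(x) || 0` also covers the `NUMERIC` values that the pg driver returns as strings, which is why the route wraps every field this way.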
@@ -250,7 +355,7 @@ router.get('/stock', async (req, res) => {
     const pool = req.app.locals.pool;

     // Get global configuration values
-    const [configs] = await pool.query(`
+    const { rows: configs } = await pool.query(`
       SELECT
         st.low_stock_threshold,
         tc.calculation_period_days as turnover_period
@@ -265,43 +370,39 @@ router.get('/stock', async (req, res) => {
     };

     // Get turnover by category
-    const [turnoverByCategory] = await pool.query(`
+    const { rows: turnoverByCategory } = await pool.query(`
       SELECT
         c.name as category,
-        ROUND(SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0), 1) as turnoverRate,
-        ROUND(AVG(p.stock_quantity), 0) as averageStock,
+        ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) as turnoverRate,
+        ROUND(AVG(p.stock_quantity)::numeric, 0) as averageStock,
         SUM(o.quantity) as totalSales
       FROM products p
       LEFT JOIN orders o ON p.pid = o.pid
       JOIN product_categories pc ON p.pid = pc.pid
       JOIN categories c ON pc.cat_id = c.cat_id
-      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
+      WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
       GROUP BY c.name
-      HAVING turnoverRate > 0
+      HAVING ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) > 0
       ORDER BY turnoverRate DESC
       LIMIT 10
-    `, [config.turnover_period]);
+    `);

     // Get stock levels over time
-    const [stockLevels] = await pool.query(`
+    const { rows: stockLevels } = await pool.query(`
       SELECT
-        DATE_FORMAT(o.date, '%Y-%m-%d') as date,
-        SUM(CASE WHEN p.stock_quantity > ? THEN 1 ELSE 0 END) as inStock,
-        SUM(CASE WHEN p.stock_quantity <= ? AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
+        to_char(o.date, 'YYYY-MM-DD') as date,
+        SUM(CASE WHEN p.stock_quantity > $1 THEN 1 ELSE 0 END) as inStock,
+        SUM(CASE WHEN p.stock_quantity <= $1 AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
         SUM(CASE WHEN p.stock_quantity = 0 THEN 1 ELSE 0 END) as outOfStock
       FROM products p
       LEFT JOIN orders o ON p.pid = o.pid
-      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
-      GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
+      WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
+      GROUP BY to_char(o.date, 'YYYY-MM-DD')
       ORDER BY date
-    `, [
-      config.low_stock_threshold,
-      config.low_stock_threshold,
-      config.turnover_period
-    ]);
+    `, [config.low_stock_threshold]);

     // Get critical stock items
-    const [criticalItems] = await pool.query(`
+    const { rows: criticalItems } = await pool.query(`
       WITH product_thresholds AS (
         SELECT
           p.pid,
@@ -320,25 +421,33 @@ router.get('/stock', async (req, res) => {
         p.title as product,
         p.SKU as sku,
         p.stock_quantity as stockQuantity,
-        GREATEST(ROUND(AVG(o.quantity) * pt.reorder_days), ?) as reorderPoint,
-        ROUND(SUM(o.quantity) / NULLIF(p.stock_quantity, 0), 1) as turnoverRate,
+        GREATEST(ROUND((AVG(o.quantity) * pt.reorder_days)::numeric), $1) as reorderPoint,
+        ROUND((SUM(o.quantity) / NULLIF(p.stock_quantity, 0))::numeric, 1) as turnoverRate,
         CASE
           WHEN p.stock_quantity = 0 THEN 0
-          ELSE ROUND(p.stock_quantity / NULLIF((SUM(o.quantity) / ?), 0))
+          ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
         END as daysUntilStockout
       FROM products p
       LEFT JOIN orders o ON p.pid = o.pid
       JOIN product_thresholds pt ON p.pid = pt.pid
-      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
+      WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
       AND p.managing_stock = true
-      GROUP BY p.pid
-      HAVING daysUntilStockout < ? AND daysUntilStockout >= 0
+      GROUP BY p.pid, pt.reorder_days
+      HAVING
+        CASE
+          WHEN p.stock_quantity = 0 THEN 0
+          ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
+        END < $3
+        AND
+        CASE
+          WHEN p.stock_quantity = 0 THEN 0
+          ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
+        END >= 0
       ORDER BY daysUntilStockout
       LIMIT 10
     `, [
       config.low_stock_threshold,
       config.turnover_period,
       config.turnover_period,
       config.turnover_period
     ]);

@@ -355,7 +464,7 @@ router.get('/pricing', async (req, res) => {
     const pool = req.app.locals.pool;

     // Get price points analysis
-    const [pricePoints] = await pool.query(`
+    const { rows: pricePoints } = await pool.query(`
       SELECT
         CAST(p.price AS DECIMAL(15,3)) as price,
         CAST(SUM(o.quantity) AS DECIMAL(15,3)) as salesVolume,
@@ -365,27 +474,27 @@ router.get('/pricing', async (req, res) => {
       LEFT JOIN orders o ON p.pid = o.pid
       JOIN product_categories pc ON p.pid = pc.pid
       JOIN categories c ON pc.cat_id = c.cat_id
-      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
+      WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
       GROUP BY p.price, c.name
-      HAVING salesVolume > 0
+      HAVING SUM(o.quantity) > 0
       ORDER BY revenue DESC
       LIMIT 50
     `);

     // Get price elasticity data (price changes vs demand)
-    const [elasticity] = await pool.query(`
+    const { rows: elasticity } = await pool.query(`
       SELECT
-        DATE_FORMAT(o.date, '%Y-%m-%d') as date,
+        to_char(o.date, 'YYYY-MM-DD') as date,
         CAST(AVG(o.price) AS DECIMAL(15,3)) as price,
         CAST(SUM(o.quantity) AS DECIMAL(15,3)) as demand
       FROM orders o
-      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
-      GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
+      WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
+      GROUP BY to_char(o.date, 'YYYY-MM-DD')
       ORDER BY date
     `);

     // Get price optimization recommendations
-    const [recommendations] = await pool.query(`
+    const { rows: recommendations } = await pool.query(`
       SELECT
         p.title as product,
         CAST(p.price AS DECIMAL(15,3)) as currentPrice,
@@ -415,10 +524,30 @@ router.get('/pricing', async (req, res) => {
         END as confidence
       FROM products p
       LEFT JOIN orders o ON p.pid = o.pid
-      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
-      GROUP BY p.pid, p.price
-      HAVING ABS(recommendedPrice - currentPrice) > 0
-      ORDER BY potentialRevenue - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
+      WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
+      GROUP BY p.pid, p.price, p.title
+      HAVING ABS(
+        CAST(
+          ROUND(
+            CASE
+              WHEN AVG(o.quantity) > 10 THEN p.price * 1.1
+              WHEN AVG(o.quantity) < 2 THEN p.price * 0.9
+              ELSE p.price
+            END, 2
+          ) AS DECIMAL(15,3)
+        ) - CAST(p.price AS DECIMAL(15,3))
+      ) > 0
+      ORDER BY
+        CAST(
+          ROUND(
+            SUM(o.price * o.quantity) *
+            CASE
+              WHEN AVG(o.quantity) > 10 THEN 1.15
+              WHEN AVG(o.quantity) < 2 THEN 0.95
+              ELSE 1
+            END, 2
+          ) AS DECIMAL(15,3)
+        ) - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
       LIMIT 10
     `);

@@ -441,7 +570,7 @@ router.get('/categories', async (req, res) => {
         c.cat_id,
         c.name,
         c.parent_id,
-        CAST(c.name AS CHAR(1000)) as path
+        c.name::text as path
       FROM categories c
       WHERE c.parent_id IS NULL

@@ -451,27 +580,27 @@ router.get('/categories', async (req, res) => {
         c.cat_id,
         c.name,
         c.parent_id,
-        CONCAT(cp.path, ' > ', c.name)
+        (cp.path || ' > ' || c.name)::text
       FROM categories c
       JOIN category_path cp ON c.parent_id = cp.cat_id
     )
   `;

   // Get category performance metrics with full path
-  const [performance] = await pool.query(`
+  const { rows: performance } = await pool.query(`
     ${categoryPathCTE},
     monthly_sales AS (
       SELECT
         c.name,
         cp.path,
         SUM(CASE
-          WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
+          WHEN o.date >= CURRENT_DATE - INTERVAL '30 days'
           THEN o.price * o.quantity
           ELSE 0
         END) as current_month,
         SUM(CASE
-          WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
-          AND o.date < DATE_SUB(CURDATE(), INTERVAL 30 DAY)
+          WHEN o.date >= CURRENT_DATE - INTERVAL '60 days'
+          AND o.date < CURRENT_DATE - INTERVAL '30 days'
           THEN o.price * o.quantity
           ELSE 0
         END) as previous_month
@@ -480,7 +609,7 @@ router.get('/categories', async (req, res) => {
       JOIN product_categories pc ON p.pid = pc.pid
       JOIN categories c ON pc.cat_id = c.cat_id
       JOIN category_path cp ON c.cat_id = cp.cat_id
-      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
+      WHERE o.date >= CURRENT_DATE - INTERVAL '60 days'
       GROUP BY c.name, cp.path
     )
     SELECT
@@ -499,15 +628,15 @@ router.get('/categories', async (req, res) => {
     JOIN categories c ON pc.cat_id = c.cat_id
     JOIN category_path cp ON c.cat_id = cp.cat_id
     LEFT JOIN monthly_sales ms ON c.name = ms.name AND cp.path = ms.path
-    WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
+    WHERE o.date >= CURRENT_DATE - INTERVAL '60 days'
     GROUP BY c.name, cp.path, ms.current_month, ms.previous_month
-    HAVING revenue > 0
+    HAVING SUM(o.price * o.quantity) > 0
     ORDER BY revenue DESC
     LIMIT 10
   `);

   // Get category revenue distribution with full path
-  const [distribution] = await pool.query(`
+  const { rows: distribution } = await pool.query(`
     ${categoryPathCTE}
     SELECT
       c.name as category,
@@ -518,35 +647,35 @@ router.get('/categories', async (req, res) => {
     JOIN product_categories pc ON p.pid = pc.pid
     JOIN categories c ON pc.cat_id = c.cat_id
     JOIN category_path cp ON c.cat_id = cp.cat_id
-    WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
+    WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
     GROUP BY c.name, cp.path
-    HAVING value > 0
+    HAVING SUM(o.price * o.quantity) > 0
     ORDER BY value DESC
     LIMIT 6
   `);

   // Get category sales trends with full path
-  const [trends] = await pool.query(`
+  const { rows: trends } = await pool.query(`
     ${categoryPathCTE}
     SELECT
       c.name as category,
       cp.path as categoryPath,
-      DATE_FORMAT(o.date, '%b %Y') as month,
+      to_char(o.date, 'Mon YYYY') as month,
       SUM(o.price * o.quantity) as sales
     FROM products p
     LEFT JOIN orders o ON p.pid = o.pid
     JOIN product_categories pc ON p.pid = pc.pid
     JOIN categories c ON pc.cat_id = c.cat_id
     JOIN category_path cp ON c.cat_id = cp.cat_id
-    WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 6 MONTH)
+    WHERE o.date >= CURRENT_DATE - INTERVAL '6 months'
     GROUP BY
       c.name,
       cp.path,
-      DATE_FORMAT(o.date, '%b %Y'),
-      DATE_FORMAT(o.date, '%Y-%m')
+      to_char(o.date, 'Mon YYYY'),
+      to_char(o.date, 'YYYY-MM')
     ORDER BY
       c.name,
-      DATE_FORMAT(o.date, '%Y-%m')
+      to_char(o.date, 'YYYY-MM')
   `);

   res.json({ performance, distribution, trends });
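The recurring change in the hunks above is mechanical: every `const [x] = await pool.query(...)` with MySQL-style `?` placeholders becomes `const { rows: x } = await pool.query(...)` with PostgreSQL's numbered `$1…$n` placeholders. A minimal sketch of the placeholder rewrite (a hypothetical helper, not part of the diff) looks like:

```javascript
// Hypothetical helper illustrating the mysql2 -> node-postgres placeholder
// rewrite applied throughout this diff: each positional `?` becomes $1, $2, ...
// Naive on purpose: it does not skip `?` inside string literals.
function toPgPlaceholders(sql) {
  let n = 0;
  return sql.replace(/\?/g, () => `$${++n}`);
}

const mysqlSql = 'SELECT * FROM products WHERE pid = ? AND vendor = ?';
console.log(toPgPlaceholders(mysqlSql));
// SELECT * FROM products WHERE pid = $1 AND vendor = $2
```

Note that `$n` placeholders, unlike `?`, can be reused: `$1` may appear several times in one statement bound to a single parameter, which is why some parameter arrays in the diff shrink.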
@@ -757,8 +757,8 @@ router.get('/history/import', async (req, res) => {
       end_time,
       status,
       error_message,
-      rows_processed::integer,
-      files_processed::integer
+      records_added::integer,
+      records_updated::integer
     FROM import_history
     ORDER BY start_time DESC
     LIMIT 20

File diff suppressed because it is too large
@@ -183,7 +183,7 @@ router.get('/', async (req, res) => {
       c.cat_id,
       c.name,
       c.parent_id,
-      CAST(c.name AS text) as path
+      c.name::text as path
     FROM categories c
     WHERE c.parent_id IS NULL

@@ -193,7 +193,7 @@ router.get('/', async (req, res) => {
       c.cat_id,
       c.name,
       c.parent_id,
-      cp.path || ' > ' || c.name
+      (cp.path || ' > ' || c.name)::text
     FROM categories c
     JOIN category_path cp ON c.parent_id = cp.cat_id
   ),
@@ -295,7 +295,7 @@ router.get('/trending', async (req, res) => {
   const pool = req.app.locals.pool;
   try {
     // First check if we have any data
-    const [checkData] = await pool.query(`
+    const { rows } = await pool.query(`
       SELECT COUNT(*) as count,
         MAX(total_revenue) as max_revenue,
         MAX(daily_sales_avg) as max_daily_sales,
@@ -303,15 +303,15 @@ router.get('/trending', async (req, res) => {
       FROM product_metrics
       WHERE total_revenue > 0 OR daily_sales_avg > 0
     `);
-    console.log('Product metrics stats:', checkData[0]);
+    console.log('Product metrics stats:', rows[0]);

-    if (checkData[0].count === 0) {
+    if (parseInt(rows[0].count) === 0) {
       console.log('No products with metrics found');
       return res.json([]);
     }

     // Get trending products
-    const [rows] = await pool.query(`
+    const { rows: trendingProducts } = await pool.query(`
       SELECT
         p.pid,
         p.sku,
@@ -332,8 +332,8 @@ router.get('/trending', async (req, res) => {
       LIMIT 50
     `);

-    console.log('Trending products:', rows);
-    res.json(rows);
+    console.log('Trending products:', trendingProducts);
+    res.json(trendingProducts);
   } catch (error) {
     console.error('Error fetching trending products:', error);
     res.status(500).json({ error: 'Failed to fetch trending products' });
@@ -353,7 +353,7 @@ router.get('/:id', async (req, res) => {
       c.cat_id,
       c.name,
       c.parent_id,
-      CAST(c.name AS CHAR(1000)) as path
+      c.name::text as path
     FROM categories c
     WHERE c.parent_id IS NULL

@@ -363,14 +363,14 @@ router.get('/:id', async (req, res) => {
       c.cat_id,
       c.name,
       c.parent_id,
-      CONCAT(cp.path, ' > ', c.name)
+      (cp.path || ' > ' || c.name)::text
     FROM categories c
     JOIN category_path cp ON c.parent_id = cp.cat_id
     )
   `;

   // Get product details with category paths
-  const [productRows] = await pool.query(`
+  const { rows: productRows } = await pool.query(`
     SELECT
       p.*,
       pm.daily_sales_avg,
@@ -396,7 +396,7 @@ router.get('/:id', async (req, res) => {
       pm.overstocked_amt
     FROM products p
     LEFT JOIN product_metrics pm ON p.pid = pm.pid
-    WHERE p.pid = ?
+    WHERE p.pid = $1
   `, [id]);

   if (!productRows.length) {
@@ -404,14 +404,14 @@ router.get('/:id', async (req, res) => {
   }

   // Get categories and their paths separately to avoid GROUP BY issues
-  const [categoryRows] = await pool.query(`
+  const { rows: categoryRows } = await pool.query(`
     WITH RECURSIVE
     category_path AS (
       SELECT
         c.cat_id,
         c.name,
         c.parent_id,
-        CAST(c.name AS CHAR(1000)) as path
+        c.name::text as path
       FROM categories c
       WHERE c.parent_id IS NULL

@@ -421,7 +421,7 @@ router.get('/:id', async (req, res) => {
         c.cat_id,
         c.name,
         c.parent_id,
-        CONCAT(cp.path, ' > ', c.name)
+        (cp.path || ' > ' || c.name)::text
       FROM categories c
       JOIN category_path cp ON c.parent_id = cp.cat_id
     ),
@@ -430,7 +430,7 @@ router.get('/:id', async (req, res) => {
       -- of other categories assigned to this product
       SELECT pc.cat_id
       FROM product_categories pc
-      WHERE pc.pid = ?
+      WHERE pc.pid = $1
       AND NOT EXISTS (
         -- Check if there are any child categories also assigned to this product
         SELECT 1
@@ -448,7 +448,7 @@ router.get('/:id', async (req, res) => {
     JOIN categories c ON pc.cat_id = c.cat_id
     JOIN category_path cp ON c.cat_id = cp.cat_id
     JOIN product_leaf_categories plc ON c.cat_id = plc.cat_id
-    WHERE pc.pid = ?
+    WHERE pc.pid = $2
     ORDER BY cp.path
   `, [id, id]);

@@ -540,20 +540,20 @@ router.put('/:id', async (req, res) => {
       managing_stock
     } = req.body;

-    const [result] = await pool.query(
+    const { rowCount } = await pool.query(
       `UPDATE products
-       SET title = ?,
-           sku = ?,
-           stock_quantity = ?,
-           price = ?,
-           regular_price = ?,
-           cost_price = ?,
-           vendor = ?,
-           brand = ?,
-           categories = ?,
-           visible = ?,
-           managing_stock = ?
-       WHERE pid = ?`,
+       SET title = $1,
+           sku = $2,
+           stock_quantity = $3,
+           price = $4,
+           regular_price = $5,
+           cost_price = $6,
+           vendor = $7,
+           brand = $8,
+           categories = $9,
+           visible = $10,
+           managing_stock = $11
+       WHERE pid = $12`,
       [
         title,
         sku,
@@ -570,7 +570,7 @@ router.put('/:id', async (req, res) => {
       ]
     );

-    if (result.affectedRows === 0) {
+    if (rowCount === 0) {
       return res.status(404).json({ error: 'Product not found' });
     }

@@ -588,7 +588,7 @@ router.get('/:id/metrics', async (req, res) => {
     const { id } = req.params;

     // Get metrics from product_metrics table with inventory health data
-    const [metrics] = await pool.query(`
+    const { rows: metrics } = await pool.query(`
       WITH inventory_status AS (
         SELECT
           p.pid,
@@ -601,7 +601,7 @@ router.get('/:id/metrics', async (req, res) => {
         END as calculated_status
         FROM products p
         LEFT JOIN product_metrics pm ON p.pid = pm.pid
-        WHERE p.pid = ?
+        WHERE p.pid = $1
       )
       SELECT
         COALESCE(pm.daily_sales_avg, 0) as daily_sales_avg,
@@ -627,8 +627,8 @@ router.get('/:id/metrics', async (req, res) => {
       FROM products p
       LEFT JOIN product_metrics pm ON p.pid = pm.pid
       LEFT JOIN inventory_status is ON p.pid = is.pid
-      WHERE p.pid = ?
-    `, [id]);
+      WHERE p.pid = $2
+    `, [id, id]);

     if (!metrics.length) {
       // Return default metrics structure if no data found
@@ -669,16 +669,16 @@ router.get('/:id/time-series', async (req, res) => {
     const pool = req.app.locals.pool;

     // Get monthly sales data
-    const [monthlySales] = await pool.query(`
+    const { rows: monthlySales } = await pool.query(`
       SELECT
-        DATE_FORMAT(date, '%Y-%m') as month,
+        TO_CHAR(date, 'YYYY-MM') as month,
         COUNT(DISTINCT order_number) as order_count,
         SUM(quantity) as units_sold,
-        CAST(SUM(price * quantity) AS DECIMAL(15,3)) as revenue
+        ROUND(SUM(price * quantity)::numeric, 3) as revenue
       FROM orders
-      WHERE pid = ?
+      WHERE pid = $1
       AND canceled = false
-      GROUP BY DATE_FORMAT(date, '%Y-%m')
+      GROUP BY TO_CHAR(date, 'YYYY-MM')
       ORDER BY month DESC
       LIMIT 12
     `, [id]);
@@ -693,9 +693,9 @@ router.get('/:id/time-series', async (req, res) => {
     }));

     // Get recent orders
-    const [recentOrders] = await pool.query(`
+    const { rows: recentOrders } = await pool.query(`
       SELECT
-        DATE_FORMAT(date, '%Y-%m-%d') as date,
+        TO_CHAR(date, 'YYYY-MM-DD') as date,
         order_number,
         quantity,
         price,
@@ -705,18 +705,18 @@ router.get('/:id/time-series', async (req, res) => {
         customer_name as customer,
         status
       FROM orders
-      WHERE pid = ?
+      WHERE pid = $1
       AND canceled = false
       ORDER BY date DESC
       LIMIT 10
     `, [id]);

     // Get recent purchase orders with detailed status
-    const [recentPurchases] = await pool.query(`
+    const { rows: recentPurchases } = await pool.query(`
       SELECT
-        DATE_FORMAT(date, '%Y-%m-%d') as date,
-        DATE_FORMAT(expected_date, '%Y-%m-%d') as expected_date,
-        DATE_FORMAT(received_date, '%Y-%m-%d') as received_date,
+        TO_CHAR(date, 'YYYY-MM-DD') as date,
+        TO_CHAR(expected_date, 'YYYY-MM-DD') as expected_date,
+        TO_CHAR(received_date, 'YYYY-MM-DD') as received_date,
         po_id,
         ordered,
         received,
@@ -726,17 +726,17 @@ router.get('/:id/time-series', async (req, res) => {
         notes,
         CASE
           WHEN received_date IS NOT NULL THEN
-            DATEDIFF(received_date, date)
-          WHEN expected_date < CURDATE() AND status < ${PurchaseOrderStatus.ReceivingStarted} THEN
-            DATEDIFF(CURDATE(), expected_date)
+            (received_date - date)
+          WHEN expected_date < CURRENT_DATE AND status < $2 THEN
+            (CURRENT_DATE - expected_date)
           ELSE NULL
         END as lead_time_days
       FROM purchase_orders
-      WHERE pid = ?
-      AND status != ${PurchaseOrderStatus.Canceled}
+      WHERE pid = $1
+      AND status != $3
       ORDER BY date DESC
       LIMIT 10
-    `, [id]);
+    `, [id, PurchaseOrderStatus.ReceivingStarted, PurchaseOrderStatus.Canceled]);

     res.json({
       monthly_sales: formattedMonthlySales,
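Every call site in these hunks changes shape for the same reason: mysql2's promise API resolves `query()` to a `[rows, fields]` array, while node-postgres resolves to a result object with `rows` and `rowCount`. A mock (no real database needed) makes the destructuring difference concrete:

```javascript
// Mock result values illustrating why every call site in this diff changes:
// mysql2/promise resolves to [rows, fields]; pg resolves to { rows, rowCount }.
const mysqlResult = [[{ pid: 1 }], /* field metadata */ []];
const pgResult = { rows: [{ pid: 1 }], rowCount: 1 };

const [mysqlRows] = mysqlResult;             // old pattern: array destructuring
const { rows: pgRows, rowCount } = pgResult; // new pattern: object destructuring

console.log(mysqlRows[0].pid, pgRows[0].pid, rowCount); // 1 1 1
```

This also explains the `result.affectedRows` → `rowCount` change in the UPDATE handler: pg reports modified rows via `rowCount`, not `affectedRows`.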
@@ -97,6 +97,28 @@ router.get('/', async (req, res) => {
     const pages = Math.ceil(total / limit);

     // Get recent purchase orders
+    let orderByClause;
+
+    if (sortColumn === 'order_date') {
+      orderByClause = `date ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    } else if (sortColumn === 'vendor_name') {
+      orderByClause = `vendor ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    } else if (sortColumn === 'total_cost') {
+      orderByClause = `total_cost ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    } else if (sortColumn === 'total_received') {
+      orderByClause = `total_received ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    } else if (sortColumn === 'total_items') {
+      orderByClause = `total_items ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    } else if (sortColumn === 'total_quantity') {
+      orderByClause = `total_quantity ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    } else if (sortColumn === 'fulfillment_rate') {
+      orderByClause = `fulfillment_rate ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    } else if (sortColumn === 'status') {
+      orderByClause = `status ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    } else {
+      orderByClause = `date ${sortDirection === 'desc' ? 'DESC' : 'ASC'}`;
+    }
+
     const { rows: orders } = await pool.query(`
       WITH po_totals AS (
         SELECT
@@ -128,20 +150,9 @@ router.get('/', async (req, res) => {
         total_received,
         fulfillment_rate
       FROM po_totals
-      ORDER BY
-        CASE
-          WHEN $${paramCounter} = 'order_date' THEN date
-          WHEN $${paramCounter} = 'vendor_name' THEN vendor
-          WHEN $${paramCounter} = 'total_cost' THEN total_cost
-          WHEN $${paramCounter} = 'total_received' THEN total_received
-          WHEN $${paramCounter} = 'total_items' THEN total_items
-          WHEN $${paramCounter} = 'total_quantity' THEN total_quantity
-          WHEN $${paramCounter} = 'fulfillment_rate' THEN fulfillment_rate
-          WHEN $${paramCounter} = 'status' THEN status
-          ELSE date
-        END ${sortDirection === 'desc' ? 'DESC' : 'ASC'}
-      LIMIT $${paramCounter + 1} OFFSET $${paramCounter + 2}
-    `, [...params, sortColumn, Number(limit), offset]);
+      ORDER BY ${orderByClause}
+      LIMIT $${paramCounter} OFFSET $${paramCounter + 1}
+    `, [...params, Number(limit), offset]);

     // Get unique vendors for filter options
     const { rows: vendors } = await pool.query(`
@@ -272,7 +283,7 @@ router.get('/cost-analysis', async (req, res) => {
   try {
     const pool = req.app.locals.pool;

-    const [analysis] = await pool.query(`
+    const { rows: analysis } = await pool.query(`
       WITH category_costs AS (
         SELECT
           c.name as category,
@@ -290,11 +301,11 @@ router.get('/cost-analysis', async (req, res) => {
       SELECT
         category,
         COUNT(DISTINCT pid) as unique_products,
-        CAST(AVG(cost_price) AS DECIMAL(15,3)) as avg_cost,
-        CAST(MIN(cost_price) AS DECIMAL(15,3)) as min_cost,
-        CAST(MAX(cost_price) AS DECIMAL(15,3)) as max_cost,
-        CAST(STDDEV(cost_price) AS DECIMAL(15,3)) as cost_variance,
-        CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_spend
+        ROUND(AVG(cost_price)::numeric, 3) as avg_cost,
+        ROUND(MIN(cost_price)::numeric, 3) as min_cost,
+        ROUND(MAX(cost_price)::numeric, 3) as max_cost,
+        ROUND(STDDEV(cost_price)::numeric, 3) as cost_variance,
+        ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
       FROM category_costs
       GROUP BY category
       ORDER BY total_spend DESC
@@ -302,17 +313,37 @@ router.get('/cost-analysis', async (req, res) => {

     // Parse numeric values
     const parsedAnalysis = {
-      categories: analysis.map(cat => ({
+      unique_products: 0,
+      avg_cost: 0,
+      min_cost: 0,
+      max_cost: 0,
+      cost_variance: 0,
+      total_spend_by_category: analysis.map(cat => ({
         category: cat.category,
         unique_products: Number(cat.unique_products) || 0,
         avg_cost: Number(cat.avg_cost) || 0,
         min_cost: Number(cat.min_cost) || 0,
         max_cost: Number(cat.max_cost) || 0,
         cost_variance: Number(cat.cost_variance) || 0,
         total_spend: Number(cat.total_spend) || 0
       }))
     };

+    // Calculate aggregated stats if data exists
+    if (analysis.length > 0) {
+      parsedAnalysis.unique_products = analysis.reduce((sum, cat) => sum + Number(cat.unique_products || 0), 0);
+
+      // Calculate weighted average cost
+      const totalProducts = parsedAnalysis.unique_products;
+      if (totalProducts > 0) {
+        parsedAnalysis.avg_cost = analysis.reduce((sum, cat) =>
+          sum + (Number(cat.avg_cost || 0) * Number(cat.unique_products || 0)), 0) / totalProducts;
+      }
+
+      // Find min and max across all categories
+      parsedAnalysis.min_cost = Math.min(...analysis.map(cat => Number(cat.min_cost || 0)));
+      parsedAnalysis.max_cost = Math.max(...analysis.map(cat => Number(cat.max_cost || 0)));
+
+      // Average variance
+      parsedAnalysis.cost_variance = analysis.reduce((sum, cat) =>
+        sum + Number(cat.cost_variance || 0), 0) / analysis.length;
+    }
+
     res.json(parsedAnalysis);
   } catch (error) {
     console.error('Error fetching cost analysis:', error);
@@ -325,7 +356,7 @@ router.get('/receiving-status', async (req, res) => {
   try {
     const pool = req.app.locals.pool;

-    const [status] = await pool.query(`
+    const { rows: status } = await pool.query(`
       WITH po_totals AS (
         SELECT
           po_id,
@@ -333,7 +364,7 @@ router.get('/receiving-status', async (req, res) => {
           receiving_status,
           SUM(ordered) as total_ordered,
           SUM(received) as total_received,
-          CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_cost
+          ROUND(SUM(ordered * cost_price)::numeric, 3) as total_cost
         FROM purchase_orders
         WHERE status != ${STATUS.CANCELED}
         GROUP BY po_id, status, receiving_status
@@ -345,8 +376,8 @@ router.get('/receiving-status', async (req, res) => {
         ROUND(
           SUM(total_received) / NULLIF(SUM(total_ordered), 0), 3
         ) as fulfillment_rate,
-        CAST(SUM(total_cost) AS DECIMAL(15,3)) as total_value,
-        CAST(AVG(total_cost) AS DECIMAL(15,3)) as avg_cost,
+        ROUND(SUM(total_cost)::numeric, 3) as total_value,
+        ROUND(AVG(total_cost)::numeric, 3) as avg_cost,
         COUNT(DISTINCT CASE
           WHEN receiving_status = ${RECEIVING_STATUS.CREATED} THEN po_id
         END) as pending_count,
@@ -364,17 +395,17 @@ router.get('/receiving-status', async (req, res) => {

     // Parse numeric values
     const parsedStatus = {
-      order_count: Number(status[0].order_count) || 0,
-      total_ordered: Number(status[0].total_ordered) || 0,
-      total_received: Number(status[0].total_received) || 0,
-      fulfillment_rate: Number(status[0].fulfillment_rate) || 0,
-      total_value: Number(status[0].total_value) || 0,
-      avg_cost: Number(status[0].avg_cost) || 0,
+      order_count: Number(status[0]?.order_count) || 0,
+      total_ordered: Number(status[0]?.total_ordered) || 0,
+      total_received: Number(status[0]?.total_received) || 0,
+      fulfillment_rate: Number(status[0]?.fulfillment_rate) || 0,
+      total_value: Number(status[0]?.total_value) || 0,
+      avg_cost: Number(status[0]?.avg_cost) || 0,
       status_breakdown: {
-        pending: Number(status[0].pending_count) || 0,
-        partial: Number(status[0].partial_count) || 0,
-        completed: Number(status[0].completed_count) || 0,
-        canceled: Number(status[0].canceled_count) || 0
+        pending: Number(status[0]?.pending_count) || 0,
+        partial: Number(status[0]?.partial_count) || 0,
+        completed: Number(status[0]?.completed_count) || 0,
+        canceled: Number(status[0]?.canceled_count) || 0
       }
     };

@@ -390,7 +421,7 @@ router.get('/order-vs-received', async (req, res) => {
   try {
     const pool = req.app.locals.pool;

-    const [quantities] = await pool.query(`
+    const { rows: quantities } = await pool.query(`
       SELECT
         p.product_id,
         p.title as product,
@@ -403,10 +434,10 @@ router.get('/order-vs-received', async (req, res) => {
         COUNT(DISTINCT po.po_id) as order_count
       FROM products p
       JOIN purchase_orders po ON p.product_id = po.product_id
-      WHERE po.date >= DATE_SUB(CURDATE(), INTERVAL 90 DAY)
+      WHERE po.date >= (CURRENT_DATE - INTERVAL '90 days')
       GROUP BY p.product_id, p.title, p.SKU
-      HAVING order_count > 0
-      ORDER BY ordered_quantity DESC
+      HAVING COUNT(DISTINCT po.po_id) > 0
+      ORDER BY SUM(po.ordered) DESC
       LIMIT 20
     `);
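The sorting change in the first hunk above reflects a PostgreSQL constraint: an `ORDER BY` target cannot be a bound parameter, so the column name must be validated in JavaScript and only a whitelisted value interpolated into the SQL string. The diff's if/else chain can be sketched more compactly as a lookup table (hypothetical refactor, not part of the commits):

```javascript
// Sketch of the ORDER BY whitelist pattern this diff switches to. ORDER BY
// columns cannot be parameterized in Postgres, so the requested sort key is
// mapped through a fixed table and only the validated name reaches the SQL.
const SORTABLE_COLUMNS = {
  order_date: 'date',
  vendor_name: 'vendor',
  total_cost: 'total_cost',
  total_received: 'total_received',
  total_items: 'total_items',
  total_quantity: 'total_quantity',
  fulfillment_rate: 'fulfillment_rate',
  status: 'status',
};

function buildOrderBy(sortColumn, sortDirection) {
  const column = SORTABLE_COLUMNS[sortColumn] || 'date'; // unknown keys fall back
  const direction = sortDirection === 'desc' ? 'DESC' : 'ASC';
  return `${column} ${direction}`;
}

console.log(buildOrderBy('vendor_name', 'desc')); // vendor DESC
console.log(buildOrderBy('unknown', 'asc'));      // date ASC
```

Because user input never reaches the SQL text directly, this keeps the dynamic sort free of injection risk while avoiding the parameter-counting gymnastics of the removed `CASE WHEN $n = …` version.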
396 inventory-server/src/routes/reusable-images.js (new file)
@@ -0,0 +1,396 @@
const express = require('express');
const router = express.Router();
const multer = require('multer');
const path = require('path');
const fs = require('fs');

// Create reusable uploads directory if it doesn't exist
const uploadsDir = path.join('/var/www/html/inventory/uploads/reusable');
fs.mkdirSync(uploadsDir, { recursive: true });

// Configure multer for file uploads
const storage = multer.diskStorage({
  destination: function (req, file, cb) {
    console.log(`Saving reusable image to: ${uploadsDir}`);
    cb(null, uploadsDir);
  },
  filename: function (req, file, cb) {
    // Create unique filename with original extension
    const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);

    // Make sure we preserve the original file extension
    let fileExt = path.extname(file.originalname).toLowerCase();

    // Ensure there is a proper extension based on mimetype if none exists
    if (!fileExt) {
      switch (file.mimetype) {
        case 'image/jpeg': fileExt = '.jpg'; break;
        case 'image/png': fileExt = '.png'; break;
        case 'image/gif': fileExt = '.gif'; break;
        case 'image/webp': fileExt = '.webp'; break;
        default: fileExt = '.jpg'; // Default to jpg
      }
    }

    const fileName = `reusable-${uniqueSuffix}${fileExt}`;
    console.log(`Generated filename: ${fileName} with mimetype: ${file.mimetype}`);
    cb(null, fileName);
  }
});

const upload = multer({
  storage: storage,
  limits: {
    fileSize: 5 * 1024 * 1024, // 5MB max file size
  },
  fileFilter: function (req, file, cb) {
    // Accept only image files
    const filetypes = /jpeg|jpg|png|gif|webp/;
    const mimetype = filetypes.test(file.mimetype);
    const extname = filetypes.test(path.extname(file.originalname).toLowerCase());

    if (mimetype && extname) {
      return cb(null, true);
    }
    cb(new Error('Only image files are allowed'));
  }
});

// Get all reusable images
router.get('/', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM reusable_images
      ORDER BY created_at DESC
    `);
    res.json(result.rows);
  } catch (error) {
    console.error('Error fetching reusable images:', error);
    res.status(500).json({
      error: 'Failed to fetch reusable images',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get images by company or global images
router.get('/by-company/:companyId', async (req, res) => {
  try {
    const { companyId } = req.params;
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    // Get images that are either global or belong to this company
    const result = await pool.query(`
      SELECT * FROM reusable_images
      WHERE is_global = true OR company = $1
      ORDER BY created_at DESC
    `, [companyId]);

    res.json(result.rows);
  } catch (error) {
    console.error('Error fetching reusable images by company:', error);
    res.status(500).json({
      error: 'Failed to fetch reusable images by company',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get global images only
router.get('/global', async (req, res) => {
  try {
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
SELECT * FROM reusable_images
|
||||
WHERE is_global = true
|
||||
ORDER BY created_at DESC
|
||||
`);
|
||||
|
||||
res.json(result.rows);
|
||||
} catch (error) {
|
||||
console.error('Error fetching global reusable images:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to fetch global reusable images',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Get a single image by ID
|
||||
router.get('/:id', async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
SELECT * FROM reusable_images
|
||||
WHERE id = $1
|
||||
`, [id]);
|
||||
|
||||
if (result.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'Reusable image not found' });
|
||||
}
|
||||
|
||||
res.json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error fetching reusable image:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to fetch reusable image',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Upload a new reusable image
|
||||
router.post('/upload', upload.single('image'), async (req, res) => {
|
||||
try {
|
||||
if (!req.file) {
|
||||
return res.status(400).json({ error: 'No image file provided' });
|
||||
}
|
||||
|
||||
const { name, is_global, company } = req.body;
|
||||
|
||||
// Validate required fields
|
||||
if (!name) {
|
||||
return res.status(400).json({ error: 'Image name is required' });
|
||||
}
|
||||
|
||||
// Convert is_global from string to boolean
|
||||
const isGlobal = is_global === 'true' || is_global === true;
|
||||
|
||||
// Validate company is provided for non-global images
|
||||
if (!isGlobal && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for non-global images' });
|
||||
}
|
||||
|
||||
// Log file information
|
||||
console.log('Reusable image uploaded:', {
|
||||
filename: req.file.filename,
|
||||
originalname: req.file.originalname,
|
||||
mimetype: req.file.mimetype,
|
||||
size: req.file.size,
|
||||
path: req.file.path
|
||||
});
|
||||
|
||||
// Ensure the file exists
|
||||
const filePath = path.join(uploadsDir, req.file.filename);
|
||||
if (!fs.existsSync(filePath)) {
|
||||
return res.status(500).json({ error: 'File was not saved correctly' });
|
||||
}
|
||||
|
||||
// Create URL for the uploaded file
|
||||
const baseUrl = 'https://inventory.acot.site';
|
||||
const imageUrl = `${baseUrl}/uploads/reusable/${req.file.filename}`;
|
||||
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Insert record into database
|
||||
const result = await pool.query(`
|
||||
INSERT INTO reusable_images (
|
||||
name,
|
||||
filename,
|
||||
file_path,
|
||||
image_url,
|
||||
is_global,
|
||||
company,
|
||||
mime_type,
|
||||
file_size
|
||||
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
|
||||
RETURNING *
|
||||
`, [
|
||||
name,
|
||||
req.file.filename,
|
||||
filePath,
|
||||
imageUrl,
|
||||
isGlobal,
|
||||
isGlobal ? null : company,
|
||||
req.file.mimetype,
|
||||
req.file.size
|
||||
]);
|
||||
|
||||
// Return success response with image data
|
||||
res.status(201).json({
|
||||
success: true,
|
||||
image: result.rows[0],
|
||||
message: 'Image uploaded successfully'
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error uploading reusable image:', error);
|
||||
res.status(500).json({ error: error.message || 'Failed to upload image' });
|
||||
}
|
||||
});
|
||||
|
||||
// Update image details (name, is_global, company)
|
||||
router.put('/:id', async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const { name, is_global, company } = req.body;
|
||||
|
||||
// Validate required fields
|
||||
if (!name) {
|
||||
return res.status(400).json({ error: 'Image name is required' });
|
||||
}
|
||||
|
||||
// Convert is_global from string to boolean if necessary
|
||||
const isGlobal = typeof is_global === 'string' ? is_global === 'true' : !!is_global;
|
||||
|
||||
// Validate company is provided for non-global images
|
||||
if (!isGlobal && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for non-global images' });
|
||||
}
|
||||
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Check if the image exists
|
||||
const checkResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
|
||||
if (checkResult.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'Reusable image not found' });
|
||||
}
|
||||
|
||||
const result = await pool.query(`
|
||||
UPDATE reusable_images
|
||||
SET
|
||||
name = $1,
|
||||
is_global = $2,
|
||||
company = $3
|
||||
WHERE id = $4
|
||||
RETURNING *
|
||||
`, [
|
||||
name,
|
||||
isGlobal,
|
||||
isGlobal ? null : company,
|
||||
id
|
||||
]);
|
||||
|
||||
res.json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error updating reusable image:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to update reusable image',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Delete a reusable image
|
||||
router.delete('/:id', async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Get the image data first to get the filename
|
||||
const imageResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
|
||||
|
||||
if (imageResult.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'Reusable image not found' });
|
||||
}
|
||||
|
||||
const image = imageResult.rows[0];
|
||||
|
||||
// Delete from database
|
||||
await pool.query('DELETE FROM reusable_images WHERE id = $1', [id]);
|
||||
|
||||
// Delete the file from filesystem
|
||||
const filePath = path.join(uploadsDir, image.filename);
|
||||
if (fs.existsSync(filePath)) {
|
||||
fs.unlinkSync(filePath);
|
||||
}
|
||||
|
||||
res.json({
|
||||
message: 'Reusable image deleted successfully',
|
||||
image
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error deleting reusable image:', error);
|
||||
res.status(500).json({
|
||||
error: 'Failed to delete reusable image',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Check if file exists and permissions
|
||||
router.get('/check-file/:filename', (req, res) => {
|
||||
const { filename } = req.params;
|
||||
|
||||
// Prevent directory traversal
|
||||
if (filename.includes('..') || filename.includes('/')) {
|
||||
return res.status(400).json({ error: 'Invalid filename' });
|
||||
}
|
||||
|
||||
const filePath = path.join(uploadsDir, filename);
|
||||
|
||||
try {
|
||||
// Check if file exists
|
||||
if (!fs.existsSync(filePath)) {
|
||||
return res.status(404).json({
|
||||
error: 'File not found',
|
||||
path: filePath,
|
||||
exists: false,
|
||||
readable: false
|
||||
});
|
||||
}
|
||||
|
||||
// Check if file is readable
|
||||
fs.accessSync(filePath, fs.constants.R_OK);
|
||||
|
||||
// Get file stats
|
||||
const stats = fs.statSync(filePath);
|
||||
|
||||
return res.json({
|
||||
filename,
|
||||
path: filePath,
|
||||
exists: true,
|
||||
readable: true,
|
||||
isFile: stats.isFile(),
|
||||
isDirectory: stats.isDirectory(),
|
||||
size: stats.size,
|
||||
created: stats.birthtime,
|
||||
modified: stats.mtime,
|
||||
permissions: stats.mode.toString(8)
|
||||
});
|
||||
} catch (error) {
|
||||
return res.status(500).json({
|
||||
error: error.message,
|
||||
path: filePath,
|
||||
exists: fs.existsSync(filePath),
|
||||
readable: false
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Error handling middleware
|
||||
router.use((err, req, res, next) => {
|
||||
console.error('Reusable images route error:', err);
|
||||
res.status(500).json({
|
||||
error: 'Internal server error',
|
||||
details: err.message
|
||||
});
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
@@ -32,7 +32,7 @@ router.get('/', async (req, res) => {
       ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
       ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
     FROM purchase_orders
-    WHERE status = 'closed'
+    WHERE status = 2
       AND cost_price IS NOT NULL
       AND ordered > 0
       AND vendor = ANY($1)
@@ -70,7 +70,7 @@ router.get('/', async (req, res) => {
       ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
       ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
     FROM purchase_orders
-    WHERE status = 'closed'
+    WHERE status = 2
       AND cost_price IS NOT NULL
       AND ordered > 0
       AND vendor IS NOT NULL AND vendor != ''
@@ -18,6 +18,8 @@ const categoriesRouter = require('./routes/categories');
 const importRouter = require('./routes/import');
 const aiValidationRouter = require('./routes/ai-validation');
 const templatesRouter = require('./routes/templates');
+const aiPromptsRouter = require('./routes/ai-prompts');
+const reusableImagesRouter = require('./routes/reusable-images');

 // Get the absolute path to the .env file
 const envPath = '/var/www/html/inventory/.env';
@@ -103,6 +105,8 @@ async function startServer() {
   app.use('/api/import', importRouter);
   app.use('/api/ai-validation', aiValidationRouter);
   app.use('/api/templates', templatesRouter);
+  app.use('/api/ai-prompts', aiPromptsRouter);
+  app.use('/api/reusable-images', reusableImagesRouter);

   // Basic health check route
   app.get('/health', (req, res) => {
@@ -1,47 +1,10 @@
-const { Pool, Client } = require('pg');
-const { Client: SSHClient } = require('ssh2');
+const { Pool } = require('pg');

 let pool;

 function initPool(config) {
-  // Log config without sensitive data
-  const safeConfig = {
-    host: config.host || process.env.DB_HOST,
-    user: config.user || process.env.DB_USER,
-    database: config.database || process.env.DB_NAME,
-    port: config.port || process.env.DB_PORT || 5432,
-    max: config.max || 10,
-    idleTimeoutMillis: config.idleTimeoutMillis || 30000,
-    connectionTimeoutMillis: config.connectionTimeoutMillis || 2000,
-    ssl: config.ssl || false,
-    password: (config.password || process.env.DB_PASSWORD) ? '[password set]' : '[no password]'
-  };
-  console.log('[Database] Initializing pool with config:', safeConfig);
-
-  // Create the pool with the configuration
-  pool = new Pool({
-    host: config.host || process.env.DB_HOST,
-    user: config.user || process.env.DB_USER,
-    password: config.password || process.env.DB_PASSWORD,
-    database: config.database || process.env.DB_NAME,
-    port: config.port || process.env.DB_PORT || 5432,
-    max: config.max || 10,
-    idleTimeoutMillis: config.idleTimeoutMillis || 30000,
-    connectionTimeoutMillis: config.connectionTimeoutMillis || 2000,
-    ssl: config.ssl || false
-  });
-
-  // Test the pool connection
-  return pool.connect()
-    .then(client => {
-      console.log('[Database] Pool connection successful');
-      client.release();
-      return pool;
-    })
-    .catch(err => {
-      console.error('[Database] Connection failed:', err);
-      throw err;
-    });
+  pool = new Pool(config);
+  return pool;
 }

 async function getConnection() {
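The simplification works because `pg.Pool` defers connecting until the first query, so the eager `pool.connect()` smoke test was optional. A minimal sketch of the resulting singleton shape, with a stub class standing in for `pg.Pool` (names here are illustrative, not the module's API):

```javascript
// Stub standing in for pg.Pool: a real pool also just stores config at
// construction time and only opens connections on first use.
class StubPool {
  constructor(config) {
    this.config = config;
  }
}

let pool;

// Same shape as the simplified initPool: construct the pool and remember
// it in the module-level singleton for later getConnection() calls.
function initPool(config, PoolCtor = StubPool) {
  pool = new PoolCtor(config);
  return pool;
}

const p = initPool({ host: 'localhost', max: 10 });
console.log(p instanceof StubPool); // → true
```

A trade-off of dropping the smoke test: bad credentials now surface on the first query rather than at startup, so callers of `getConnection()` need their own error handling.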
30
inventory/package-lock.json
generated
@@ -13,6 +13,7 @@
         "@dnd-kit/utilities": "^3.2.2",
         "@emotion/react": "^11.14.0",
         "@emotion/styled": "^11.14.0",
+        "@hookform/resolvers": "^3.10.0",
         "@radix-ui/react-accordion": "^1.2.2",
         "@radix-ui/react-alert-dialog": "^1.1.4",
         "@radix-ui/react-avatar": "^1.1.2",
@@ -62,6 +63,7 @@
         "react-day-picker": "^8.10.1",
         "react-dom": "^18.3.1",
         "react-dropzone": "^14.3.5",
+        "react-hook-form": "^7.54.2",
         "react-icons": "^5.4.0",
         "react-router-dom": "^7.1.1",
         "recharts": "^2.15.0",
@@ -71,7 +73,8 @@
         "tanstack": "^1.0.0",
         "uuid": "^11.0.5",
         "vaul": "^1.1.2",
-        "xlsx": "^0.18.5"
+        "xlsx": "^0.18.5",
+        "zod": "^3.24.2"
       },
       "devDependencies": {
         "@eslint/js": "^9.17.0",
@@ -1227,6 +1230,15 @@
       "integrity": "sha512-MDWhGtE+eHw5JW7lq4qhc5yRLS11ERl1c7Z6Xd0a58DozHES6EnNNwUWbMiG4J9Cgj053Bhk8zvlhFYKVhULwg==",
       "license": "MIT"
     },
+    "node_modules/@hookform/resolvers": {
+      "version": "3.10.0",
+      "resolved": "https://registry.npmjs.org/@hookform/resolvers/-/resolvers-3.10.0.tgz",
+      "integrity": "sha512-79Dv+3mDF7i+2ajj7SkypSKHhl1cbln1OGavqrsF7p6mbUv11xpqpacPsGDCTRvCSjEEIez2ef1NveSVL3b0Ag==",
+      "license": "MIT",
+      "peerDependencies": {
+        "react-hook-form": "^7.0.0"
+      }
+    },
     "node_modules/@humanfs/core": {
       "version": "0.19.1",
       "resolved": "https://registry.npmjs.org/@humanfs/core/-/core-0.19.1.tgz",
@@ -6937,6 +6949,22 @@
         "react": ">= 16.8 || 18.0.0"
       }
     },
+    "node_modules/react-hook-form": {
+      "version": "7.54.2",
+      "resolved": "https://registry.npmjs.org/react-hook-form/-/react-hook-form-7.54.2.tgz",
+      "integrity": "sha512-eHpAUgUjWbZocoQYUHposymRb4ZP6d0uwUnooL2uOybA9/3tPUvoAKqEWK1WaSiTxxOfTpffNZP7QwlnM3/gEg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=18.0.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/react-hook-form"
+      },
+      "peerDependencies": {
+        "react": "^16.8.0 || ^17 || ^18 || ^19"
+      }
+    },
     "node_modules/react-icons": {
      "version": "5.4.0",
      "resolved": "https://registry.npmjs.org/react-icons/-/react-icons-5.4.0.tgz",
@@ -10,11 +10,12 @@
     "preview": "vite preview"
   },
   "dependencies": {
     "@dnd-kit/core": "^6.3.1",
     "@dnd-kit/sortable": "^10.0.0",
     "@dnd-kit/utilities": "^3.2.2",
     "@emotion/react": "^11.14.0",
     "@emotion/styled": "^11.14.0",
+    "@hookform/resolvers": "^3.10.0",
     "@radix-ui/react-accordion": "^1.2.2",
     "@radix-ui/react-alert-dialog": "^1.1.4",
     "@radix-ui/react-avatar": "^1.1.2",
@@ -64,6 +65,7 @@
     "react-day-picker": "^8.10.1",
     "react-dom": "^18.3.1",
     "react-dropzone": "^14.3.5",
+    "react-hook-form": "^7.54.2",
     "react-icons": "^5.4.0",
     "react-router-dom": "^7.1.1",
     "recharts": "^2.15.0",
@@ -73,7 +75,8 @@
     "tanstack": "^1.0.0",
     "uuid": "^11.0.5",
     "vaul": "^1.1.2",
-    "xlsx": "^0.18.5"
+    "xlsx": "^0.18.5",
+    "zod": "^3.24.2"
   },
   "devDependencies": {
     "@eslint/js": "^9.17.0",
@@ -1,9 +1,8 @@
-import { Routes, Route, useNavigate, Navigate } from 'react-router-dom';
+import { Routes, Route, useNavigate, Navigate, useLocation } from 'react-router-dom';
 import { MainLayout } from './components/layout/MainLayout';
 import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
 import { Products } from './pages/Products';
 import { Dashboard } from './pages/Dashboard';
-import { Orders } from './pages/Orders';
 import { Settings } from './pages/Settings';
 import { Analytics } from './pages/Analytics';
 import { Toaster } from '@/components/ui/sonner';
@@ -16,64 +15,123 @@ import Forecasting from "@/pages/Forecasting";
 import { Vendors } from '@/pages/Vendors';
 import { Categories } from '@/pages/Categories';
 import { Import } from '@/pages/Import';
 import { AiValidationDebug } from "@/pages/AiValidationDebug"
+import { AuthProvider } from './contexts/AuthContext';
+import { Protected } from './components/auth/Protected';
+import { FirstAccessiblePage } from './components/auth/FirstAccessiblePage';

 const queryClient = new QueryClient();

 function App() {
   const navigate = useNavigate();
+  const location = useLocation();

   useEffect(() => {
     const checkAuth = async () => {
-      const token = sessionStorage.getItem('token');
-      if (token) {
+      const token = localStorage.getItem('token');
+      const isLoggedIn = sessionStorage.getItem('isLoggedIn') === 'true';
+
+      // If we have a token but aren't logged in yet, verify the token
+      if (token && !isLoggedIn) {
         try {
-          const response = await fetch(`${config.authUrl}/protected`, {
+          const response = await fetch(`${config.authUrl}/me`, {
             headers: {
               Authorization: `Bearer ${token}`,
             },
           });

           if (!response.ok) {
-            sessionStorage.removeItem('token');
+            localStorage.removeItem('token');
+            sessionStorage.removeItem('isLoggedIn');
-            navigate('/login');
+
+            // Only navigate to login if we're not already there
+            if (!location.pathname.includes('/login')) {
+              navigate(`/login?redirect=${encodeURIComponent(location.pathname + location.search)}`);
+            }
+          } else {
+            // If token is valid, set the login flag
+            sessionStorage.setItem('isLoggedIn', 'true');
+          }
         } catch (error) {
           console.error('Token verification failed:', error);
-          sessionStorage.removeItem('token');
+          localStorage.removeItem('token');
+          sessionStorage.removeItem('isLoggedIn');
-          navigate('/login');
+
+          // Only navigate to login if we're not already there
+          if (!location.pathname.includes('/login')) {
+            navigate(`/login?redirect=${encodeURIComponent(location.pathname + location.search)}`);
+          }
         }
       }
     };

     checkAuth();
-  }, [navigate]);
+  }, [navigate, location.pathname, location.search]);

   return (
     <QueryClientProvider client={queryClient}>
-      <Toaster richColors position="top-center" />
-      <Routes>
-        <Route path="/login" element={<Login />} />
-        <Route element={
-          <RequireAuth>
-            <MainLayout />
-          </RequireAuth>
-        }>
-          <Route path="/" element={<Dashboard />} />
-          <Route path="/products" element={<Products />} />
-          <Route path="/import" element={<Import />} />
-          <Route path="/categories" element={<Categories />} />
-          <Route path="/vendors" element={<Vendors />} />
-          <Route path="/orders" element={<Orders />} />
-          <Route path="/purchase-orders" element={<PurchaseOrders />} />
-          <Route path="/analytics" element={<Analytics />} />
-          <Route path="/settings" element={<Settings />} />
-          <Route path="/forecasting" element={<Forecasting />} />
-          <Route path="/ai-validation/debug" element={<AiValidationDebug />} />
-          <Route path="*" element={<Navigate to="/" replace />} />
-        </Route>
-      </Routes>
+      <AuthProvider>
+        <Toaster richColors position="top-center" />
+        <Routes>
+          <Route path="/login" element={<Login />} />
+          <Route element={
+            <RequireAuth>
+              <MainLayout />
+            </RequireAuth>
+          }>
+            <Route index element={
+              <Protected page="dashboard" fallback={<FirstAccessiblePage />}>
+                <Dashboard />
+              </Protected>
+            } />
+            <Route path="/" element={
+              <Protected page="dashboard">
+                <Dashboard />
+              </Protected>
+            } />
+            <Route path="/products" element={
+              <Protected page="products">
+                <Products />
+              </Protected>
+            } />
+            <Route path="/import" element={
+              <Protected page="import">
+                <Import />
+              </Protected>
+            } />
+            <Route path="/categories" element={
+              <Protected page="categories">
+                <Categories />
+              </Protected>
+            } />
+            <Route path="/vendors" element={
+              <Protected page="vendors">
+                <Vendors />
+              </Protected>
+            } />
+            <Route path="/purchase-orders" element={
+              <Protected page="purchase_orders">
+                <PurchaseOrders />
+              </Protected>
+            } />
+            <Route path="/analytics" element={
+              <Protected page="analytics">
+                <Analytics />
+              </Protected>
+            } />
+            <Route path="/settings" element={
+              <Protected page="settings">
+                <Settings />
+              </Protected>
+            } />
+            <Route path="/forecasting" element={
+              <Protected page="forecasting">
+                <Forecasting />
+              </Protected>
+            } />
+            <Route path="*" element={<Navigate to="/" replace />} />
+          </Route>
+        </Routes>
+      </AuthProvider>
     </QueryClientProvider>
   );
 }
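The redirect guard added to `checkAuth` can be isolated as a pure decision, which makes the `encodeURIComponent` round-trip easy to verify. A small sketch — the `loginRedirect` helper is hypothetical, mirroring the target string the effect builds:

```javascript
// Build the login redirect target the effect navigates to, or return null
// when the user is already on the login page (so no navigation happens).
function loginRedirect(pathname, search) {
  if (pathname.includes('/login')) return null;
  return `/login?redirect=${encodeURIComponent(pathname + search)}`;
}

console.log(loginRedirect('/products', '?page=2'));
// → /login?redirect=%2Fproducts%3Fpage%3D2
console.log(loginRedirect('/login', '')); // → null
```

Encoding the whole `pathname + search` into a single `redirect` query parameter is what lets the login page restore deep links, including their own query strings, after authentication.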
@@ -2,6 +2,7 @@ import { useQuery } from '@tanstack/react-query';
 import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
 import { ResponsiveContainer, BarChart, Bar, XAxis, YAxis, Tooltip, ScatterChart, Scatter, ZAxis } from 'recharts';
 import config from '../../config';
+import { useState, useEffect } from 'react';

 interface VendorData {
   performance: {
@@ -10,14 +11,15 @@ interface VendorData {
     profitMargin: number;
     stockTurnover: number;
     productCount: number;
+    growth: number;
   }[];
-  comparison: {
+  comparison?: {
     vendor: string;
     salesPerProduct: number;
     averageMargin: number;
     size: number;
   }[];
-  trends: {
+  trends?: {
     vendor: string;
     month: string;
     sales: number;
@@ -25,40 +27,86 @@ interface VendorData {
 }

 export function VendorPerformance() {
-  const { data, isLoading } = useQuery<VendorData>({
-    queryKey: ['vendor-performance'],
-    queryFn: async () => {
-      const response = await fetch(`${config.apiUrl}/analytics/vendors`);
-      if (!response.ok) {
-        throw new Error('Failed to fetch vendor performance');
-      }
-      const rawData = await response.json();
-      return {
-        performance: rawData.performance.map((vendor: any) => ({
-          ...vendor,
-          salesVolume: Number(vendor.salesVolume) || 0,
-          profitMargin: Number(vendor.profitMargin) || 0,
-          stockTurnover: Number(vendor.stockTurnover) || 0,
-          productCount: Number(vendor.productCount) || 0
-        })),
-        comparison: rawData.comparison.map((vendor: any) => ({
-          ...vendor,
-          salesPerProduct: Number(vendor.salesPerProduct) || 0,
-          averageMargin: Number(vendor.averageMargin) || 0,
-          size: Number(vendor.size) || 0
-        })),
-        trends: rawData.trends.map((vendor: any) => ({
-          ...vendor,
-          sales: Number(vendor.sales) || 0
-        }))
-      };
-    },
-  });
+  const [vendorData, setVendorData] = useState<VendorData | null>(null);
+  const [isLoading, setIsLoading] = useState(true);
+  const [error, setError] = useState<string | null>(null);

-  if (isLoading || !data) {
+  useEffect(() => {
+    // Use plain fetch to bypass cache issues with React Query
+    const fetchData = async () => {
+      try {
+        setIsLoading(true);
+
+        // Add cache-busting parameter
+        const response = await fetch(`${config.apiUrl}/analytics/vendors?nocache=${Date.now()}`, {
+          headers: {
+            "Cache-Control": "no-cache, no-store, must-revalidate",
+            "Pragma": "no-cache",
+            "Expires": "0"
+          }
+        });
+
+        if (!response.ok) {
+          throw new Error(`Failed to fetch: ${response.status}`);
+        }
+
+        const rawData = await response.json();
+
+        if (!rawData || !rawData.performance) {
+          throw new Error('Invalid response format');
+        }
+
+        // Create a complete structure even if some parts are missing
+        const data: VendorData = {
+          performance: rawData.performance.map((vendor: any) => ({
+            vendor: vendor.vendor,
+            salesVolume: Number(vendor.salesVolume) || 0,
+            profitMargin: Number(vendor.profitMargin) || 0,
+            stockTurnover: Number(vendor.stockTurnover) || 0,
+            productCount: Number(vendor.productCount) || 0,
+            growth: Number(vendor.growth) || 0
+          })),
+          comparison: rawData.comparison?.map((vendor: any) => ({
+            vendor: vendor.vendor,
+            salesPerProduct: Number(vendor.salesPerProduct) || 0,
+            averageMargin: Number(vendor.averageMargin) || 0,
+            size: Number(vendor.size) || 0
+          })) || [],
+          trends: rawData.trends?.map((vendor: any) => ({
+            vendor: vendor.vendor,
+            month: vendor.month,
+            sales: Number(vendor.sales) || 0
+          })) || []
+        };
+
+        setVendorData(data);
+      } catch (err) {
+        console.error('Error fetching vendor data:', err);
+        setError(err instanceof Error ? err.message : 'Unknown error');
+      } finally {
+        setIsLoading(false);
+      }
+    };
+
+    fetchData();
+  }, []);
+
+  if (isLoading) {
     return <div>Loading vendor performance...</div>;
   }

+  if (error || !vendorData) {
+    return <div className="text-red-500">Error loading vendor data: {error}</div>;
+  }
+
+  // Ensure we have at least the performance data
+  const sortedPerformance = vendorData.performance
+    .sort((a, b) => b.salesVolume - a.salesVolume)
+    .slice(0, 10);
+
+  // Use simplified version if comparison data is missing
+  const hasComparisonData = vendorData.comparison && vendorData.comparison.length > 0;
+
   return (
     <div className="grid gap-4">
       <div className="grid gap-4 md:grid-cols-2">
@@ -68,7 +116,7 @@ export function VendorPerformance() {
         </CardHeader>
         <CardContent>
           <ResponsiveContainer width="100%" height={300}>
-            <BarChart data={data.performance}>
+            <BarChart data={sortedPerformance}>
               <XAxis dataKey="vendor" />
               <YAxis tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`} />
               <Tooltip
@@ -84,44 +132,68 @@ export function VendorPerformance() {
         </CardContent>
       </Card>

-      <Card>
-        <CardHeader>
-          <CardTitle>Vendor Performance Matrix</CardTitle>
-        </CardHeader>
-        <CardContent>
-          <ResponsiveContainer width="100%" height={300}>
-            <ScatterChart>
-              <XAxis
-                dataKey="salesPerProduct"
-                name="Sales per Product"
-                tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`}
-              />
-              <YAxis
-                dataKey="averageMargin"
-                name="Average Margin"
-                tickFormatter={(value) => `${value.toFixed(0)}%`}
-              />
-              <ZAxis
-                dataKey="size"
-                range={[50, 400]}
-                name="Product Count"
-              />
-              <Tooltip
-                formatter={(value: number, name: string) => {
-                  if (name === 'Sales per Product') return [`$${value.toLocaleString()}`, name];
-                  if (name === 'Average Margin') return [`${value.toFixed(1)}%`, name];
-                  return [value, name];
-                }}
-              />
-              <Scatter
-                data={data.comparison}
-                fill="#60a5fa"
-                name="Vendors"
-              />
-            </ScatterChart>
-          </ResponsiveContainer>
-        </CardContent>
-      </Card>
+      {hasComparisonData ? (
+        <Card>
+          <CardHeader>
+            <CardTitle>Vendor Performance Matrix</CardTitle>
+          </CardHeader>
+          <CardContent>
+            <ResponsiveContainer width="100%" height={300}>
+              <ScatterChart>
+                <XAxis
+                  dataKey="salesPerProduct"
+                  name="Sales per Product"
+                  tickFormatter={(value) => `$${(value / 1000).toFixed(0)}k`}
+                />
+                <YAxis
+                  dataKey="averageMargin"
+                  name="Average Margin"
+                  tickFormatter={(value) => `${value.toFixed(0)}%`}
+                />
+                <ZAxis
+                  dataKey="size"
+                  range={[50, 400]}
+                  name="Product Count"
+                />
+                <Tooltip
+                  formatter={(value: number, name: string) => {
+                    if (name === 'Sales per Product') return [`$${value.toLocaleString()}`, name];
+                    if (name === 'Average Margin') return [`${value.toFixed(1)}%`, name];
+                    return [value, name];
+                  }}
+                />
+                <Scatter
+                  data={vendorData.comparison}
+                  fill="#60a5fa"
+                  name="Vendors"
+                />
+              </ScatterChart>
+            </ResponsiveContainer>
+          </CardContent>
+        </Card>
+      ) : (
+        <Card>
+          <CardHeader>
+            <CardTitle>Vendor Profit Margins</CardTitle>
+          </CardHeader>
+          <CardContent>
+            <ResponsiveContainer width="100%" height={300}>
+              <BarChart data={sortedPerformance}>
+                <XAxis dataKey="vendor" />
+                <YAxis tickFormatter={(value) => `${value}%`} />
+                <Tooltip
+                  formatter={(value: number) => [`${value.toFixed(1)}%`, 'Profit Margin']}
+                />
+                <Bar
+                  dataKey="profitMargin"
+                  fill="#4ade80"
+                  name="Profit Margin"
+                />
+              </BarChart>
+            </ResponsiveContainer>
+          </CardContent>
+        </Card>
+      )}
     </div>

     <Card>
@@ -130,7 +202,7 @@ export function VendorPerformance() {
       </CardHeader>
       <CardContent>
         <div className="space-y-4">
-          {data.performance.map((vendor) => (
+          {sortedPerformance.map((vendor) => (
             <div key={`${vendor.vendor}-${vendor.salesVolume}`} className="flex items-center">
               <div className="flex-1">
                 <p className="text-sm font-medium">{vendor.vendor}</p>
44 inventory/src/components/auth/FirstAccessiblePage.tsx Normal file
@@ -0,0 +1,44 @@
import { useContext } from "react";
import { Navigate } from "react-router-dom";
import { AuthContext } from "@/contexts/AuthContext";

// Define available pages in order of priority
const PAGES = [
  { path: "/products", permission: "access:products" },
  { path: "/categories", permission: "access:categories" },
  { path: "/vendors", permission: "access:vendors" },
  { path: "/purchase-orders", permission: "access:purchase_orders" },
  { path: "/analytics", permission: "access:analytics" },
  { path: "/forecasting", permission: "access:forecasting" },
  { path: "/import", permission: "access:import" },
  { path: "/settings", permission: "access:settings" },
  { path: "/ai-validation/debug", permission: "access:ai_validation_debug" }
];

export function FirstAccessiblePage() {
  const { user } = useContext(AuthContext);

  // If user isn't loaded yet, don't render anything
  if (!user) {
    return null;
  }

  // Admin users have access to all pages, so this component
  // shouldn't be rendering for them (handled by App.tsx)
  if (user.is_admin) {
    return null;
  }

  // Find the first page the user has access to
  const firstAccessiblePage = PAGES.find(page => {
    return user.permissions?.includes(page.permission);
  });

  // If we found a page, redirect to it
  if (firstAccessiblePage) {
    return <Navigate to={firstAccessiblePage.path} replace />;
  }

  // If user has no access to any page, redirect to login
  return <Navigate to="/login" replace />;
}
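The redirect logic in `FirstAccessiblePage` boils down to a priority-ordered `find` over the `PAGES` list. As a rough sketch, that lookup can be pulled out into a plain function for unit testing; the name `firstAccessiblePath` and the shortened page list here are illustrative, not part of the commit:

```typescript
// Sketch of the lookup FirstAccessiblePage performs, extracted as a plain
// function. The first entry in priority order whose permission the user
// holds wins; null means "no accessible page" (the component redirects
// to /login in that case).
const PAGES = [
  { path: "/products", permission: "access:products" },
  { path: "/categories", permission: "access:categories" },
  { path: "/vendors", permission: "access:vendors" },
];

function firstAccessiblePath(permissions: string[] | undefined): string | null {
  // Mirrors PAGES.find(...) in the component above.
  const page = PAGES.find(p => permissions?.includes(p.permission));
  return page ? page.path : null;
}
```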
104 inventory/src/components/auth/PERMISSIONS.md Normal file
@@ -0,0 +1,104 @@
# Permission System Documentation

This document outlines the simplified permission system implemented in the Inventory Manager application.

## Permission Structure

Permissions follow this naming convention:

- Page access: `access:{page_name}`
- Actions: `{action}:{resource}`

Examples:
- `access:products` - Can access the Products page
- `create:products` - Can create new products
- `edit:users` - Can edit user accounts

## Permission Component

### Protected

The core component that conditionally renders content based on permissions.

```tsx
<Protected
  permission="create:products"
  fallback={<p>No permission</p>}
>
  <button>Create Product</button>
</Protected>
```

Options:
- `permission`: Single permission code (e.g., "create:products")
- `page`: Page name (checks `access:{page}` permission)
- `resource` + `action`: Resource and action (checks `{action}:{resource}` permission)
- `adminOnly`: For admin-only sections
- `fallback`: Content to show if permission check fails

### RequireAuth

Used for basic authentication checks (is user logged in?).

```tsx
<Route element={
  <RequireAuth>
    <MainLayout />
  </RequireAuth>
}>
  {/* Protected routes */}
</Route>
```

## Common Permission Codes

| Code | Description |
|------|-------------|
| `access:dashboard` | Access to Dashboard page |
| `access:products` | Access to Products page |
| `create:products` | Create new products |
| `edit:products` | Edit existing products |
| `delete:products` | Delete products |
| `view:users` | View user accounts |
| `edit:users` | Edit user accounts |
| `manage:permissions` | Assign permissions to users |

## Implementation Examples

### Page Protection

In `App.tsx`:
```tsx
<Route path="/products" element={
  <Protected page="products" fallback={<Navigate to="/" />}>
    <Products />
  </Protected>
} />
```

### Component Level Protection

```tsx
<Protected permission="edit:products">
  <form>
    {/* Form fields */}
    <button type="submit">Save Changes</button>
  </form>
</Protected>
```

### Button Protection

```tsx
<Button
  onClick={handleDelete}
  disabled={!hasPermission('delete:products')}
>
  Delete
</Button>

// With Protected component
<Protected permission="delete:products" fallback={null}>
  <Button onClick={handleDelete}>Delete</Button>
</Protected>
```
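The Button Protection example in PERMISSIONS.md calls `hasPermission`, which this commit does not show. A minimal sketch of such a helper, assuming the same user shape that `Protected` reads (`is_admin` plus an optional `permissions` array), might look like:

```typescript
// Hypothetical helper consistent with the checks in Protected:
// admins pass every check; otherwise the code must appear in the
// user's permission list. The PermissionUser shape is an assumption
// based on the fields Protected reads.
interface PermissionUser {
  is_admin: boolean;
  permissions?: string[];
}

function hasPermission(user: PermissionUser | null, code: string): boolean {
  if (!user) return false;         // user not loaded yet: deny
  if (user.is_admin) return true;  // admins always pass
  return user.permissions?.includes(code) ?? false;
}
```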
82 inventory/src/components/auth/Protected.tsx Normal file
@@ -0,0 +1,82 @@
import { ReactNode, useContext } from "react";
import { AuthContext } from "@/contexts/AuthContext";

interface ProtectedProps {
  // For specific permission code
  permission?: string;

  // For page access permission format: access:{page}
  page?: string;

  // For action permission format: {action}:{resource}
  resource?: string;
  action?: "view" | "create" | "edit" | "delete" | string;

  // For admin-only access
  adminOnly?: boolean;

  // Content to render if permission check passes
  children: ReactNode;

  // Optional fallback content
  fallback?: ReactNode;
}

/**
 * A simplified component that conditionally renders content based on user permissions
 */
export function Protected({
  permission,
  page,
  resource,
  action,
  adminOnly,
  children,
  fallback = null
}: ProtectedProps) {
  const { user } = useContext(AuthContext);

  // If user isn't loaded yet, don't render anything
  if (!user) {
    return null;
  }

  // Admin check - admins always have access to everything
  if (user.is_admin) {
    return <>{children}</>;
  }

  // Admin-only check
  if (adminOnly) {
    return <>{fallback}</>;
  }

  // Check permissions array exists
  if (!user.permissions) {
    return <>{fallback}</>;
  }

  // Page access check (access:page)
  if (page) {
    const pagePermission = `access:${page.toLowerCase()}`;
    if (!user.permissions.includes(pagePermission)) {
      return <>{fallback}</>;
    }
  }

  // Resource action check (action:resource)
  if (resource && action) {
    const resourcePermission = `${action}:${resource.toLowerCase()}`;
    if (!user.permissions.includes(resourcePermission)) {
      return <>{fallback}</>;
    }
  }

  // Single permission check
  if (permission && !user.permissions.includes(permission)) {
    return <>{fallback}</>;
  }

  // If all checks pass, render children
  return <>{children}</>;
}
@@ -1,8 +1,45 @@
import { Navigate, useLocation } from "react-router-dom"
import { useContext, useEffect, useState } from "react"
import { AuthContext } from "@/contexts/AuthContext"

export function RequireAuth({ children }: { children: React.ReactNode }) {
  const isLoggedIn = sessionStorage.getItem("isLoggedIn") === "true"
  const { token, user, fetchCurrentUser } = useContext(AuthContext)
  const location = useLocation()
  const [isLoading, setIsLoading] = useState(!!token && !user)

  // This will make sure the user data is loaded the first time
  useEffect(() => {
    const loadUserData = async () => {
      if (token && !user) {
        setIsLoading(true)
        try {
          await fetchCurrentUser()
        } catch (error) {
          console.error("Failed to fetch user data:", error)
        } finally {
          setIsLoading(false)
        }
      }
    }

    loadUserData()
  }, [token, user, fetchCurrentUser])

  // Check if token exists but we're not logged in
  useEffect(() => {
    if (token && !isLoggedIn) {
      // Verify the token and fetch user data
      fetchCurrentUser().catch(() => {
        // Do nothing - the AuthContext will handle errors
      })
    }
  }, [token, isLoggedIn, fetchCurrentUser])

  // If still loading user data, show nothing yet
  if (isLoading) {
    return <div className="p-8 flex justify-center items-center h-screen">Loading...</div>
  }

  if (!isLoggedIn) {
    // Redirect to login with the current path in the redirect parameter
@@ -1,88 +0,0 @@
|
||||
import { useQuery } from "@tanstack/react-query"
|
||||
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"
|
||||
import { AlertCircle, AlertTriangle, CheckCircle2, PackageSearch } from "lucide-react"
|
||||
import config from "@/config"
|
||||
import { useNavigate } from "react-router-dom"
|
||||
import { cn } from "@/lib/utils"
|
||||
|
||||
interface InventoryHealth {
|
||||
critical: number
|
||||
reorder: number
|
||||
healthy: number
|
||||
overstock: number
|
||||
total: number
|
||||
}
|
||||
|
||||
export function InventoryHealthSummary() {
|
||||
const navigate = useNavigate();
|
||||
const { data: summary } = useQuery<InventoryHealth>({
|
||||
queryKey: ["inventory-health"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/inventory/health/summary`)
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch inventory health")
|
||||
}
|
||||
return response.json()
|
||||
},
|
||||
})
|
||||
|
||||
const stats = [
|
||||
{
|
||||
title: "Critical Stock",
|
||||
value: summary?.critical || 0,
|
||||
description: "Products needing immediate attention",
|
||||
icon: AlertCircle,
|
||||
className: "bg-destructive/10",
|
||||
iconClassName: "text-destructive",
|
||||
view: "critical"
|
||||
},
|
||||
{
|
||||
title: "Reorder Soon",
|
||||
value: summary?.reorder || 0,
|
||||
description: "Products approaching reorder point",
|
||||
icon: AlertTriangle,
|
||||
className: "bg-warning/10",
|
||||
iconClassName: "text-warning",
|
||||
view: "reorder"
|
||||
},
|
||||
{
|
||||
title: "Healthy Stock",
|
||||
value: summary?.healthy || 0,
|
||||
description: "Products at optimal levels",
|
||||
icon: CheckCircle2,
|
||||
className: "bg-success/10",
|
||||
iconClassName: "text-success",
|
||||
view: "healthy"
|
||||
},
|
||||
{
|
||||
title: "Overstock",
|
||||
value: summary?.overstock || 0,
|
||||
description: "Products exceeding optimal levels",
|
||||
icon: PackageSearch,
|
||||
className: "bg-muted",
|
||||
iconClassName: "text-muted-foreground",
|
||||
view: "overstocked"
|
||||
},
|
||||
]
|
||||
|
||||
return (
|
||||
<>
|
||||
{stats.map((stat) => (
|
||||
<Card
|
||||
key={stat.title}
|
||||
className={cn(stat.className, "cursor-pointer hover:opacity-90 transition-opacity")}
|
||||
onClick={() => navigate(`/products?view=${stat.view}`)}
|
||||
>
|
||||
<CardHeader className="flex flex-row items-center justify-between pb-2">
|
||||
<CardTitle className="text-sm font-medium">{stat.title}</CardTitle>
|
||||
<stat.icon className={`h-4 w-4 ${stat.iconClassName}`} />
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="text-2xl font-bold">{stat.value}</div>
|
||||
<p className="text-xs text-muted-foreground">{stat.description}</p>
|
||||
</CardContent>
|
||||
</Card>
|
||||
))}
|
||||
</>
|
||||
)
|
||||
}
|
||||
@@ -1,106 +0,0 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
|
||||
import { Bar, BarChart, ResponsiveContainer, XAxis, YAxis, Tooltip } from 'recharts';
|
||||
import config from '../../config';
|
||||
|
||||
interface InventoryMetrics {
|
||||
stockLevels: {
|
||||
category: string;
|
||||
inStock: number;
|
||||
lowStock: number;
|
||||
outOfStock: number;
|
||||
}[];
|
||||
topVendors: {
|
||||
vendor: string;
|
||||
productCount: number;
|
||||
averageStockLevel: string;
|
||||
}[];
|
||||
stockTurnover: {
|
||||
category: string;
|
||||
rate: string;
|
||||
}[];
|
||||
}
|
||||
|
||||
export function InventoryStats() {
|
||||
const { data, isLoading, error } = useQuery<InventoryMetrics>({
|
||||
queryKey: ['inventory-metrics'],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/inventory-metrics`);
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch inventory metrics');
|
||||
}
|
||||
return response.json();
|
||||
},
|
||||
});
|
||||
|
||||
if (isLoading) {
|
||||
return <div>Loading inventory metrics...</div>;
|
||||
}
|
||||
|
||||
if (error) {
|
||||
return <div className="text-red-500">Error loading inventory metrics</div>;
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="grid gap-4">
|
||||
<div className="grid gap-4 md:grid-cols-2">
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Stock Levels by Category</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<BarChart data={data?.stockLevels}>
|
||||
<XAxis dataKey="category" />
|
||||
<YAxis />
|
||||
<Tooltip />
|
||||
<Bar dataKey="inStock" name="In Stock" fill="#4ade80" />
|
||||
<Bar dataKey="lowStock" name="Low Stock" fill="#fbbf24" />
|
||||
<Bar dataKey="outOfStock" name="Out of Stock" fill="#f87171" />
|
||||
</BarChart>
|
||||
</ResponsiveContainer>
|
||||
</CardContent>
|
||||
</Card>
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Stock Turnover Rate</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<BarChart data={data?.stockTurnover}>
|
||||
<XAxis dataKey="category" />
|
||||
<YAxis />
|
||||
<Tooltip formatter={(value: string) => [Number(value).toFixed(2), "Rate"]} />
|
||||
<Bar dataKey="rate" name="Turnover Rate" fill="#60a5fa" />
|
||||
</BarChart>
|
||||
</ResponsiveContainer>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</div>
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Top Vendors</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="space-y-4">
|
||||
{data?.topVendors.map((vendor) => (
|
||||
<div key={vendor.vendor} className="flex items-center">
|
||||
<div className="flex-1">
|
||||
<p className="text-sm font-medium">{vendor.vendor}</p>
|
||||
<p className="text-sm text-muted-foreground">
|
||||
{vendor.productCount} products
|
||||
</p>
|
||||
</div>
|
||||
<div className="ml-4 text-right">
|
||||
<p className="text-sm font-medium">
|
||||
Avg. Stock: {Number(vendor.averageStockLevel).toFixed(0)}
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@@ -1,232 +0,0 @@
|
||||
import { useQuery } from "@tanstack/react-query"
|
||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
||||
import {
|
||||
Area,
|
||||
AreaChart,
|
||||
ResponsiveContainer,
|
||||
Tooltip,
|
||||
XAxis,
|
||||
YAxis,
|
||||
} from "recharts"
|
||||
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs"
|
||||
import config from "@/config"
|
||||
|
||||
interface MetricDataPoint {
|
||||
date: string
|
||||
value: number
|
||||
}
|
||||
|
||||
interface KeyMetrics {
|
||||
revenue: MetricDataPoint[]
|
||||
inventory_value: MetricDataPoint[]
|
||||
gmroi: MetricDataPoint[]
|
||||
}
|
||||
|
||||
export function KeyMetricsCharts() {
|
||||
const { data: metrics } = useQuery<KeyMetrics>({
|
||||
queryKey: ["key-metrics"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/metrics/trends`)
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch metrics trends")
|
||||
}
|
||||
return response.json()
|
||||
},
|
||||
})
|
||||
|
||||
const formatCurrency = (value: number) =>
|
||||
new Intl.NumberFormat("en-US", {
|
||||
style: "currency",
|
||||
currency: "USD",
|
||||
minimumFractionDigits: 0,
|
||||
maximumFractionDigits: 0,
|
||||
}).format(value)
|
||||
|
||||
return (
|
||||
<>
|
||||
<CardHeader>
|
||||
<CardTitle className="text-lg font-medium">Key Metrics</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<Tabs defaultValue="revenue" className="space-y-4">
|
||||
<TabsList>
|
||||
<TabsTrigger value="revenue">Revenue</TabsTrigger>
|
||||
<TabsTrigger value="inventory">Inventory Value</TabsTrigger>
|
||||
<TabsTrigger value="gmroi">GMROI</TabsTrigger>
|
||||
</TabsList>
|
||||
|
||||
<TabsContent value="revenue" className="space-y-4">
|
||||
<div className="h-[300px]">
|
||||
<ResponsiveContainer width="100%" height="100%">
|
||||
<AreaChart data={metrics?.revenue}>
|
||||
<XAxis
|
||||
dataKey="date"
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => value}
|
||||
/>
|
||||
<YAxis
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={formatCurrency}
|
||||
/>
|
||||
<Tooltip
|
||||
content={({ active, payload }) => {
|
||||
if (active && payload && payload.length) {
|
||||
return (
|
||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
||||
<div className="grid grid-cols-2 gap-2">
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Date
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{payload[0].payload.date}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Revenue
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{formatCurrency(payload[0].value as number)}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
return null
|
||||
}}
|
||||
/>
|
||||
<Area
|
||||
type="monotone"
|
||||
dataKey="value"
|
||||
stroke="#0ea5e9"
|
||||
fill="#0ea5e9"
|
||||
fillOpacity={0.2}
|
||||
/>
|
||||
</AreaChart>
|
||||
</ResponsiveContainer>
|
||||
</div>
|
||||
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="inventory" className="space-y-4">
|
||||
<div className="h-[300px]">
|
||||
<ResponsiveContainer width="100%" height="100%">
|
||||
<AreaChart data={metrics?.inventory_value}>
|
||||
<XAxis
|
||||
dataKey="date"
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => value}
|
||||
/>
|
||||
<YAxis
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={formatCurrency}
|
||||
/>
|
||||
<Tooltip
|
||||
content={({ active, payload }) => {
|
||||
if (active && payload && payload.length) {
|
||||
return (
|
||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
||||
<div className="grid grid-cols-2 gap-2">
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Date
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{payload[0].payload.date}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Value
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{formatCurrency(payload[0].value as number)}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
return null
|
||||
}}
|
||||
/>
|
||||
<Area
|
||||
type="monotone"
|
||||
dataKey="value"
|
||||
stroke="#84cc16"
|
||||
fill="#84cc16"
|
||||
fillOpacity={0.2}
|
||||
/>
|
||||
</AreaChart>
|
||||
</ResponsiveContainer>
|
||||
</div>
|
||||
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="gmroi" className="space-y-4">
|
||||
<div className="h-[300px]">
|
||||
<ResponsiveContainer width="100%" height="100%">
|
||||
<AreaChart data={metrics?.gmroi}>
|
||||
<XAxis
|
||||
dataKey="date"
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => value}
|
||||
/>
|
||||
<YAxis
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => `${value.toFixed(1)}%`}
|
||||
/>
|
||||
<Tooltip
|
||||
content={({ active, payload }) => {
|
||||
if (active && payload && payload.length) {
|
||||
return (
|
||||
<div className="rounded-lg border bg-background p-2 shadow-sm">
|
||||
<div className="grid grid-cols-2 gap-2">
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
Date
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{payload[0].payload.date}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex flex-col">
|
||||
<span className="text-[0.70rem] uppercase text-muted-foreground">
|
||||
GMROI
|
||||
</span>
|
||||
<span className="font-bold">
|
||||
{`${typeof payload[0].value === 'number' ? payload[0].value.toFixed(1) : payload[0].value}%`}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
return null
|
||||
}}
|
||||
/>
|
||||
<Area
|
||||
type="monotone"
|
||||
dataKey="value"
|
||||
stroke="#f59e0b"
|
||||
fill="#f59e0b"
|
||||
fillOpacity={0.2}
|
||||
/>
|
||||
</AreaChart>
|
||||
</ResponsiveContainer>
|
||||
</div>
|
||||
|
||||
</TabsContent>
|
||||
</Tabs>
|
||||
</CardContent>
|
||||
</>
|
||||
)
|
||||
}
|
||||
@@ -1,108 +0,0 @@
|
||||
import { useQuery } from "@tanstack/react-query"
|
||||
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
||||
import {
|
||||
Table,
|
||||
TableBody,
|
||||
TableCell,
|
||||
TableHead,
|
||||
TableHeader,
|
||||
TableRow,
|
||||
} from "@/components/ui/table"
|
||||
import { Badge } from "@/components/ui/badge"
|
||||
import config from "@/config"
|
||||
import { format } from "date-fns"
|
||||
|
||||
interface Product {
|
||||
pid: number;
|
||||
sku: string;
|
||||
title: string;
|
||||
stock_quantity: number;
|
||||
daily_sales_avg: string;
|
||||
days_of_inventory: string;
|
||||
reorder_qty: number;
|
||||
last_purchase_date: string | null;
|
||||
lead_time_status: string;
|
||||
}
|
||||
|
||||
// Helper functions
|
||||
const formatDate = (dateString: string) => {
|
||||
return format(new Date(dateString), 'MMM dd, yyyy')
|
||||
}
|
||||
|
||||
const getLeadTimeVariant = (status: string) => {
|
||||
switch (status.toLowerCase()) {
|
||||
case 'critical':
|
||||
return 'destructive'
|
||||
case 'warning':
|
||||
return 'secondary'
|
||||
case 'good':
|
||||
return 'secondary'
|
||||
default:
|
||||
return 'secondary'
|
||||
}
|
||||
}
|
||||
|
||||
export function LowStockAlerts() {
|
||||
const { data: products } = useQuery<Product[]>({
|
||||
queryKey: ["low-stock"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/low-stock/products`)
|
||||
if (!response.ok) {
|
||||
throw new Error("Failed to fetch low stock products")
|
||||
}
|
||||
return response.json()
|
||||
},
|
||||
})
|
||||
|
||||
return (
|
||||
<>
|
||||
<CardHeader>
|
||||
<CardTitle className="text-lg font-medium">Low Stock Alerts</CardTitle>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="max-h-[350px] overflow-auto">
|
||||
<Table>
|
||||
<TableHeader>
|
||||
<TableRow>
|
||||
<TableHead>Product</TableHead>
|
||||
<TableHead className="text-right">Stock</TableHead>
|
||||
<TableHead className="text-right">Daily Sales</TableHead>
|
||||
<TableHead className="text-right">Days Left</TableHead>
|
||||
<TableHead className="text-right">Reorder Qty</TableHead>
|
||||
<TableHead>Last Purchase</TableHead>
|
||||
<TableHead>Lead Time</TableHead>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{products?.map((product) => (
|
||||
<TableRow key={product.pid}>
|
||||
<TableCell>
|
||||
<a
|
||||
href={`https://backend.acherryontop.com/product/${product.pid}`}
|
||||
target="_blank"
|
||||
rel="noopener noreferrer"
|
||||
className="hover:underline"
|
||||
>
|
||||
{product.title}
|
||||
</a>
|
||||
<div className="text-sm text-muted-foreground">{product.sku}</div>
|
||||
</TableCell>
|
||||
<TableCell className="text-right">{product.stock_quantity}</TableCell>
|
||||
<TableCell className="text-right">{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
|
||||
<TableCell className="text-right">{Number(product.days_of_inventory).toFixed(1)}</TableCell>
|
||||
<TableCell className="text-right">{product.reorder_qty}</TableCell>
|
||||
<TableCell>{product.last_purchase_date ? formatDate(product.last_purchase_date) : '-'}</TableCell>
|
||||
<TableCell>
|
||||
<Badge variant={getLeadTimeVariant(product.lead_time_status)}>
|
||||
{product.lead_time_status}
|
||||
</Badge>
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
))}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
</CardContent>
|
||||
</>
|
||||
)
|
||||
}
|
||||
@@ -1,66 +0,0 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { Line, LineChart, ResponsiveContainer, Tooltip, XAxis, YAxis } from 'recharts';
|
||||
import config from '../../config';
|
||||
|
||||
interface SalesData {
|
||||
date: string;
|
||||
total: number;
|
||||
}
|
||||
|
||||
export function Overview() {
|
||||
const { data, isLoading, error } = useQuery<SalesData[]>({
|
||||
queryKey: ['sales-overview'],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/sales-overview`);
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch sales overview');
|
||||
}
|
||||
const rawData = await response.json();
|
||||
return rawData.map((item: SalesData) => ({
|
||||
...item,
|
||||
total: parseFloat(item.total.toString()),
|
||||
date: new Date(item.date).toLocaleDateString('en-US', { month: 'short', day: 'numeric' })
|
||||
}));
|
||||
},
|
||||
});
|
||||
|
||||
if (isLoading) {
|
||||
return <div>Loading chart...</div>;
|
||||
}
|
||||
|
||||
if (error) {
|
||||
return <div className="text-red-500">Error loading sales overview</div>;
|
||||
}
|
||||
|
||||
return (
|
||||
<ResponsiveContainer width="100%" height={350}>
|
||||
<LineChart data={data}>
|
||||
<XAxis
|
||||
dataKey="date"
|
||||
stroke="#888888"
|
||||
fontSize={12}
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
/>
|
||||
<YAxis
|
||||
stroke="#888888"
|
||||
fontSize={12}
|
||||
tickLine={false}
|
||||
axisLine={false}
|
||||
tickFormatter={(value) => `$${value.toLocaleString()}`}
|
||||
/>
|
||||
<Tooltip
|
||||
formatter={(value: number) => [`$${value.toLocaleString()}`, 'Sales']}
|
||||
labelFormatter={(label) => `Date: ${label}`}
|
||||
/>
|
||||
<Line
|
||||
type="monotone"
|
||||
dataKey="total"
|
||||
stroke="hsl(var(--primary))"
|
||||
strokeWidth={2}
|
||||
dot={false}
|
||||
/>
|
||||
</LineChart>
|
||||
</ResponsiveContainer>
|
||||
);
|
||||
}
|
||||
@@ -1,63 +0,0 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { Avatar, AvatarFallback } from '@/components/ui/avatar';
|
||||
import config from '../../config';
|
||||
|
||||
interface RecentOrder {
|
||||
order_id: string;
|
||||
customer_name: string;
|
||||
total_amount: number;
|
||||
order_date: string;
|
||||
}
|
||||
|
||||
export function RecentSales() {
|
||||
const { data: recentOrders, isLoading, error } = useQuery<RecentOrder[]>({
|
||||
queryKey: ['recent-orders'],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/recent-orders`);
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch recent orders');
|
||||
}
|
||||
const data = await response.json();
|
||||
return data.map((order: RecentOrder) => ({
|
||||
...order,
|
||||
total_amount: parseFloat(order.total_amount.toString())
|
||||
}));
|
||||
},
|
||||
});
|
||||
|
||||
if (isLoading) {
|
||||
return <div>Loading recent sales...</div>;
|
||||
}
|
||||
|
||||
if (error) {
|
||||
return <div className="text-red-500">Error loading recent sales</div>;
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="space-y-8">
|
||||
{recentOrders?.map((order) => (
|
||||
<div key={order.order_id} className="flex items-center">
|
||||
<Avatar className="h-9 w-9">
|
||||
<AvatarFallback>
|
||||
{order.customer_name?.split(' ').map(n => n[0]).join('') || '??'}
|
||||
</AvatarFallback>
|
||||
</Avatar>
|
||||
<div className="ml-4 space-y-1">
|
||||
<p className="text-sm font-medium leading-none">Order #{order.order_id}</p>
|
||||
<p className="text-sm text-muted-foreground">
|
||||
{new Date(order.order_date).toLocaleDateString()}
|
||||
</p>
|
||||
</div>
|
||||
<div className="ml-auto font-medium">
|
||||
${order.total_amount.toFixed(2)}
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
{!recentOrders?.length && (
|
||||
<div className="text-center text-muted-foreground">
|
||||
No recent orders found
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
@@ -1,58 +0,0 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { Cell, Pie, PieChart, ResponsiveContainer, Tooltip, Legend } from 'recharts';
|
||||
import config from '../../config';
|
||||
|
||||
interface CategorySales {
|
||||
category: string;
|
||||
total: number;
|
||||
percentage: number;
|
||||
}
|
||||
|
||||
const COLORS = ['#0088FE', '#00C49F', '#FFBB28', '#FF8042', '#8884d8', '#82ca9d'];
|
||||
|
||||
export function SalesByCategory() {
|
||||
const { data, isLoading, error } = useQuery<CategorySales[]>({
|
||||
queryKey: ['sales-by-category'],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/sales-by-category`);
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to fetch category sales');
|
||||
}
|
||||
return response.json();
|
||||
},
|
||||
});
|
||||
|
||||
if (isLoading) {
|
||||
return <div>Loading chart...</div>;
|
||||
}
|
||||
|
||||
if (error) {
|
||||
return <div className="text-red-500">Error loading category sales</div>;
|
||||
}
|
||||
|
||||
return (
|
||||
<ResponsiveContainer width="100%" height={300}>
|
||||
<PieChart>
|
||||
<Pie
|
||||
data={data}
|
||||
cx="50%"
|
||||
cy="50%"
|
||||
labelLine={false}
|
||||
outerRadius={80}
|
||||
fill="#8884d8"
|
||||
dataKey="total"
|
||||
nameKey="category"
|
||||
label={({ name, percent }) => `${name} ${(percent * 100).toFixed(0)}%`}
|
||||
>
|
||||
{data?.map((_, index) => (
|
||||
<Cell key={`cell-${index}`} fill={COLORS[index % COLORS.length]} />
|
||||
))}
|
||||
</Pie>
|
||||
<Tooltip
|
||||
formatter={(value: number) => [`$${value.toLocaleString()}`, 'Sales']}
|
||||
/>
|
||||
<Legend />
|
||||
</PieChart>
|
||||
</ResponsiveContainer>
|
||||
);
|
||||
}
|
||||
@@ -1,95 +0,0 @@
import { useQuery } from "@tanstack/react-query"
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "@/components/ui/table"
import { TrendingUp, TrendingDown } from "lucide-react"
import config from "@/config"

interface Product {
  pid: number;
  sku: string;
  title: string;
  daily_sales_avg: string;
  weekly_sales_avg: string;
  growth_rate: string;
  total_revenue: string;
}

export function TrendingProducts() {
  const { data: products } = useQuery<Product[]>({
    queryKey: ["trending-products"],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/products/trending`)
      if (!response.ok) {
        throw new Error("Failed to fetch trending products")
      }
      return response.json()
    },
  })

  const formatPercent = (value: number) =>
    new Intl.NumberFormat("en-US", {
      style: "percent",
      minimumFractionDigits: 1,
      maximumFractionDigits: 1,
      signDisplay: "exceptZero",
    }).format(value / 100)

  return (
    <>
      <CardHeader>
        <CardTitle className="text-lg font-medium">Trending Products</CardTitle>
      </CardHeader>
      <CardContent>
        <div className="max-h-[400px] overflow-auto">
          <Table>
            <TableHeader>
              <TableRow>
                <TableHead>Product</TableHead>
                <TableHead>Daily Sales</TableHead>
                <TableHead className="text-right">Growth</TableHead>
              </TableRow>
            </TableHeader>
            <TableBody>
              {products?.map((product) => (
                <TableRow key={product.pid}>
                  <TableCell className="font-medium">
                    <div className="flex flex-col">
                      <span className="font-medium">{product.title}</span>
                      <span className="text-sm text-muted-foreground">
                        {product.sku}
                      </span>
                    </div>
                  </TableCell>
                  <TableCell>{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
                  <TableCell className="text-right">
                    <div className="flex items-center justify-end gap-1">
                      {Number(product.growth_rate) > 0 ? (
                        <TrendingUp className="h-4 w-4 text-success" />
                      ) : (
                        <TrendingDown className="h-4 w-4 text-destructive" />
                      )}
                      <span
                        className={
                          Number(product.growth_rate) > 0 ? "text-success" : "text-destructive"
                        }
                      >
                        {formatPercent(Number(product.growth_rate))}
                      </span>
                    </div>
                  </TableCell>
                </TableRow>
              ))}
            </TableBody>
          </Table>
        </div>
      </CardContent>
    </>
  )
}
@@ -1,79 +0,0 @@
import { useQuery } from "@tanstack/react-query"
import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "@/components/ui/table"
import { Progress } from "@/components/ui/progress"
import config from "@/config"

interface VendorMetrics {
  vendor: string
  avg_lead_time: number
  on_time_delivery_rate: number
  avg_fill_rate: number
  total_orders: number
  active_orders: number
  overdue_orders: number
}

export function VendorPerformance() {
  const { data: vendors } = useQuery<VendorMetrics[]>({
    queryKey: ["vendor-metrics"],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/dashboard/vendor/performance`)
      if (!response.ok) {
        throw new Error("Failed to fetch vendor metrics")
      }
      return response.json()
    },
  })

  // Sort vendors by on-time delivery rate
  const sortedVendors = vendors
    ?.sort((a, b) => b.on_time_delivery_rate - a.on_time_delivery_rate)

  return (
    <>
      <CardHeader>
        <CardTitle className="text-lg font-medium">Top Vendor Performance</CardTitle>
      </CardHeader>
      <CardContent className="max-h-[400px] overflow-auto">
        <Table>
          <TableHeader>
            <TableRow>
              <TableHead>Vendor</TableHead>
              <TableHead>On-Time</TableHead>
              <TableHead className="text-right">Fill Rate</TableHead>
            </TableRow>
          </TableHeader>
          <TableBody>
            {sortedVendors?.map((vendor) => (
              <TableRow key={vendor.vendor}>
                <TableCell className="font-medium">{vendor.vendor}</TableCell>
                <TableCell>
                  <div className="flex items-center gap-2">
                    <Progress
                      value={vendor.on_time_delivery_rate}
                      className="h-2"
                    />
                    <span className="w-10 text-sm">
                      {vendor.on_time_delivery_rate.toFixed(0)}%
                    </span>
                  </div>
                </TableCell>
                <TableCell className="text-right">
                  {vendor.avg_fill_rate.toFixed(0)}%
                </TableCell>
              </TableRow>
            ))}
          </TableBody>
        </Table>
      </CardContent>
    </>
  )
}
@@ -24,47 +24,56 @@ import {
  SidebarSeparator,
} from "@/components/ui/sidebar";
import { useLocation, useNavigate, Link } from "react-router-dom";
import { Protected } from "@/components/auth/Protected";

const items = [
  {
    title: "Overview",
    icon: Home,
    url: "/",
    permission: "access:dashboard"
  },
  {
    title: "Products",
    icon: Package,
    url: "/products",
    permission: "access:products"
  },
  {
    title: "Import",
    icon: FileSpreadsheet,
    url: "/import",
    permission: "access:import"
  },
  {
    title: "Forecasting",
    icon: IconCrystalBall,
    url: "/forecasting",
    permission: "access:forecasting"
  },
  {
    title: "Categories",
    icon: Tags,
    url: "/categories",
    permission: "access:categories"
  },
  {
    title: "Vendors",
    icon: Users,
    url: "/vendors",
    permission: "access:vendors"
  },
  {
    title: "Purchase Orders",
    icon: ClipboardList,
    url: "/purchase-orders",
    permission: "access:purchase_orders"
  },
  {
    title: "Analytics",
    icon: BarChart2,
    url: "/analytics",
    permission: "access:analytics"
  },
];

@@ -73,8 +82,8 @@ export function AppSidebar() {
  const navigate = useNavigate();

  const handleLogout = () => {
    localStorage.removeItem('token');
    sessionStorage.removeItem('isLoggedIn');
    sessionStorage.removeItem('token');
    navigate('/login');
  };

@@ -98,20 +107,26 @@ export function AppSidebar() {
              location.pathname === item.url ||
              (item.url !== "/" && location.pathname.startsWith(item.url));
            return (
              <SidebarMenuItem key={item.title}>
                <SidebarMenuButton
                  asChild
                  tooltip={item.title}
                  isActive={isActive}
                >
                  <Link to={item.url}>
                    <item.icon className="h-4 w-4" />
                    <span className="group-data-[collapsible=icon]:hidden">
                      {item.title}
                    </span>
                  </Link>
                </SidebarMenuButton>
              </SidebarMenuItem>
              <Protected
                key={item.title}
                permission={item.permission}
                fallback={null}
              >
                <SidebarMenuItem>
                  <SidebarMenuButton
                    asChild
                    tooltip={item.title}
                    isActive={isActive}
                  >
                    <Link to={item.url}>
                      <item.icon className="h-4 w-4" />
                      <span className="group-data-[collapsible=icon]:hidden">
                        {item.title}
                      </span>
                    </Link>
                  </SidebarMenuButton>
                </SidebarMenuItem>
              </Protected>
            );
          })}
        </SidebarMenu>
@@ -122,24 +137,30 @@ export function AppSidebar() {
        <SidebarGroup>
          <SidebarGroupContent>
            <SidebarMenu>
              <SidebarMenuItem>
                <SidebarMenuButton
                  asChild
                  tooltip="Settings"
                  isActive={location.pathname === "/settings"}
                >
                  <Link to="/settings">
                    <Settings className="h-4 w-4" />
                    <span className="group-data-[collapsible=icon]:hidden">
                      Settings
                    </span>
                  </Link>
                </SidebarMenuButton>
              </SidebarMenuItem>
              <Protected
                permission="access:settings"
                fallback={null}
              >
                <SidebarMenuItem>
                  <SidebarMenuButton
                    asChild
                    tooltip="Settings"
                    isActive={location.pathname === "/settings"}
                  >
                    <Link to="/settings">
                      <Settings className="h-4 w-4" />
                      <span className="group-data-[collapsible=icon]:hidden">
                        Settings
                      </span>
                    </Link>
                  </SidebarMenuButton>
                </SidebarMenuItem>
              </Protected>
            </SidebarMenu>
          </SidebarGroupContent>
        </SidebarGroup>
      </SidebarContent>
      <SidebarSeparator />
      <SidebarFooter>
        <SidebarMenu>
          <SidebarMenuItem>

@@ -1,7 +1,7 @@
import { SidebarProvider, SidebarTrigger } from "@/components/ui/sidebar";
import { AppSidebar } from "./AppSidebar";
import { Outlet } from "react-router-dom";
import { motion } from "motion/react";
import { motion } from "framer-motion";

export function MainLayout() {
  return (

@@ -253,6 +253,7 @@ export const ImageUploadStep = ({
          }
          getProductContainerClasses={() => getProductContainerClasses(index)}
          findContainer={findContainer}
          handleAddImageFromUrl={handleAddImageFromUrl}
        />
      ))}
    </div>

@@ -1,7 +1,7 @@
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Card, CardContent } from "@/components/ui/card";
import { Loader2, Link as LinkIcon } from "lucide-react";
import { Loader2, Link as LinkIcon, Image as ImageIcon } from "lucide-react";
import { cn } from "@/lib/utils";
import { ImageDropzone } from "./ImageDropzone";
import { SortableImage } from "./SortableImage";
@@ -9,6 +9,25 @@ import { CopyButton } from "./CopyButton";
import { ProductImageSortable, Product } from "../../types";
import { DroppableContainer } from "../DroppableContainer";
import { SortableContext, horizontalListSortingStrategy } from '@dnd-kit/sortable';
import { useQuery } from "@tanstack/react-query";
import config from "@/config";
import {
  Dialog,
  DialogContent,
  DialogDescription,
  DialogHeader,
  DialogTitle,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
import { useState, useMemo } from "react";

interface ReusableImage {
  id: number;
  name: string;
  image_url: string;
  is_global: boolean;
  company: string | null;
}

interface ProductCardProps {
  product: Product;
@@ -26,6 +45,7 @@ interface ProductCardProps {
  onRemoveImage: (id: string) => void;
  getProductContainerClasses: () => string;
  findContainer: (id: string) => string | null;
  handleAddImageFromUrl: (productIndex: number, url: string) => void;
}

export const ProductCard = ({
@@ -43,8 +63,11 @@ export const ProductCard = ({
  onDragOver,
  onRemoveImage,
  getProductContainerClasses,
  findContainer
  findContainer,
  handleAddImageFromUrl
}: ProductCardProps) => {
  const [isReusableDialogOpen, setIsReusableDialogOpen] = useState(false);

  // Function to get images for this product
  const getProductImages = () => {
    return productImages.filter(img => img.productIndex === index);
@@ -56,6 +79,32 @@ export const ProductCard = ({
    return result !== null ? parseInt(result) : null;
  };

  // Fetch reusable images
  const { data: reusableImages, isLoading: isLoadingReusable } = useQuery<ReusableImage[]>({
    queryKey: ["reusable-images"],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/reusable-images`);
      if (!response.ok) {
        throw new Error("Failed to fetch reusable images");
      }
      return response.json();
    },
  });

  // Filter reusable images based on product's company
  const availableReusableImages = useMemo(() => {
    if (!reusableImages) return [];
    return reusableImages.filter(img =>
      img.is_global || img.company === product.company
    );
  }, [reusableImages, product.company]);

  // Handle adding a reusable image
  const handleAddReusableImage = (imageUrl: string) => {
    handleAddImageFromUrl(index, imageUrl);
    setIsReusableDialogOpen(false);
  };

  return (
    <Card
      className={cn(
@@ -83,6 +132,18 @@ export const ProductCard = ({
            className="flex items-center gap-2"
            onSubmit={onUrlSubmit}
          >
            {getProductImages().length === 0 && (
              <Button
                type="button"
                variant="outline"
                size="sm"
                className="h-8 whitespace-nowrap flex gap-1 items-center text-xs"
                onClick={() => setIsReusableDialogOpen(true)}
              >
                <ImageIcon className="h-3.5 w-3.5" />
                Select from Library
              </Button>
            )}
            <Input
              placeholder="Add image from URL"
              value={urlInput}
@@ -105,7 +166,7 @@ export const ProductCard = ({
        </div>

        <div className="flex flex-col sm:flex-row gap-2">
          <div className="flex flex-row gap-2 items-start">
          <div className="flex flex-row gap-2 items-center gap-4">
            <ImageDropzone
              productIndex={index}
              onDrop={onImageUpload}
@@ -158,6 +219,50 @@ export const ProductCard = ({
          />
        </div>
      </CardContent>

      {/* Reusable Images Dialog */}
      <Dialog open={isReusableDialogOpen} onOpenChange={setIsReusableDialogOpen}>
        <DialogContent className="max-w-3xl">
          <DialogHeader>
            <DialogTitle>Select from Image Library</DialogTitle>
            <DialogDescription>
              Choose a global or company-specific image to add to this product.
            </DialogDescription>
          </DialogHeader>
          <ScrollArea className="h-[400px] pr-4">
            {isLoadingReusable ? (
              <div className="flex items-center justify-center h-full">
                <Loader2 className="h-8 w-8 animate-spin" />
              </div>
            ) : availableReusableImages.length === 0 ? (
              <div className="flex flex-col items-center justify-center h-full text-muted-foreground">
                <ImageIcon className="h-8 w-8 mb-2" />
                <p>No reusable images available</p>
              </div>
            ) : (
              <div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 gap-4">
                {availableReusableImages.map((image) => (
                  <div
                    key={image.id}
                    className="group relative aspect-square border rounded-lg overflow-hidden cursor-pointer hover:ring-2 hover:ring-primary"
                    onClick={() => handleAddReusableImage(image.image_url)}
                  >
                    <img
                      src={image.image_url}
                      alt={image.name}
                      className="w-full h-full object-cover"
                    />
                    <div className="absolute inset-0 bg-black/0 group-hover:bg-black/20 transition-colors" />
                    <div className="absolute bottom-0 left-0 right-0 p-2 bg-gradient-to-t from-black/60 to-transparent">
                      <p className="text-xs text-white truncate">{image.name}</p>
                    </div>
                  </div>
                ))}
              </div>
            )}
          </ScrollArea>
        </DialogContent>
      </Dialog>
    </Card>
  );
};
@@ -31,5 +31,6 @@ export interface Product {
  supplier_no?: string;
  sku?: string;
  model?: string;
  company?: string;
  product_images?: string | string[];
}
@@ -223,7 +223,7 @@ export const UploadFlow = ({ state, onNext, onBack }: Props) => {
          onBack();
        }
      }}
      onNext={(validatedData) => {
      onNext={(validatedData: any[]) => {
        // Go to image upload step with the validated data
        onNext({
          type: StepType.imageUpload,

@@ -0,0 +1,80 @@
import React, { useState } from 'react';
import { AiValidationDialogs } from './components/AiValidationDialogs';
import { Product } from '../../../../types/products';
import {
  AiValidationProgress,
  AiValidationDetails,
  CurrentPrompt as AiValidationCurrentPrompt
} from './hooks/useAiValidation';

const ValidationStepNew: React.FC = () => {
  const [aiValidationProgress, setAiValidationProgress] = useState<AiValidationProgress>({
    isOpen: false,
    status: 'idle',
    step: 0
  });

  const [aiValidationDetails, setAiValidationDetails] = useState<AiValidationDetails>({
    changes: [],
    warnings: [],
    changeDetails: [],
    isOpen: false
  });

  const [currentPrompt, setCurrentPrompt] = useState<AiValidationCurrentPrompt>({
    isOpen: false,
    prompt: '',
    isLoading: true,
  });

  // Track reversion state (for internal use)
  const [reversionState, setReversionState] = useState<Record<string, boolean>>({});

  const [fieldData] = useState<Product[]>([]);

  // eslint-disable-next-line @typescript-eslint/no-unused-vars

  const revertAiChange = (productIndex: number, fieldKey: string) => {
    const key = `${productIndex}-${fieldKey}`;
    setReversionState(prev => ({
      ...prev,
      [key]: true
    }));
  };

  const isChangeReverted = (productIndex: number, fieldKey: string): boolean => {
    const key = `${productIndex}-${fieldKey}`;
    return !!reversionState[key];
  };

  const getFieldDisplayValueWithHighlight = (
    _fieldKey: string,
    originalValue: any,
    correctedValue: any
  ) => {
    return {
      originalHtml: String(originalValue),
      correctedHtml: String(correctedValue)
    };
  };

  return (
    <div>
      <AiValidationDialogs
        aiValidationProgress={aiValidationProgress}
        aiValidationDetails={aiValidationDetails}
        currentPrompt={currentPrompt}
        setAiValidationProgress={setAiValidationProgress}
        setAiValidationDetails={setAiValidationDetails}
        setCurrentPrompt={setCurrentPrompt}
        revertAiChange={revertAiChange}
        isChangeReverted={isChangeReverted}
        getFieldDisplayValueWithHighlight={getFieldDisplayValueWithHighlight}
        fields={fieldData}
        debugData={currentPrompt.debugData}
      />
    </div>
  );
};

export default ValidationStepNew;
@@ -1,23 +1,87 @@
import React from 'react';
import { Dialog, DialogContent, DialogHeader, DialogTitle, DialogDescription } from '@/components/ui/dialog';
import { ScrollArea } from '@/components/ui/scroll-area';
import { Button } from '@/components/ui/button';
import { Loader2, CheckIcon } from 'lucide-react';
import { Code } from '@/components/ui/code';
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from '@/components/ui/table';
import { AiValidationDetails, AiValidationProgress, CurrentPrompt } from '../hooks/useAiValidation';
import React, { useState } from "react";
import {
  Dialog,
  DialogContent,
  DialogHeader,
  DialogTitle,
  DialogDescription,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
import { Button } from "@/components/ui/button";
import { Loader2, CheckIcon, XIcon } from "lucide-react";
import { Code } from "@/components/ui/code";
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "@/components/ui/table";
import {
  AiValidationDetails,
  AiValidationProgress,
  CurrentPrompt,
} from "../hooks/useAiValidation";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Badge } from "@/components/ui/badge";

interface TaxonomyStats {
  categories: number;
  themes: number;
  colors: number;
  taxCodes: number;
  sizeCategories: number;
  suppliers: number;
  companies: number;
  artists: number;
}

interface DebugData {
  taxonomyStats: TaxonomyStats | null;
  basePrompt: string;
  sampleFullPrompt: string;
  promptLength: number;
  apiFormat?: Array<{
    role: string;
    content: string;
  }>;
  promptSources?: {
    systemPrompt?: { id: number; prompt_text: string };
    generalPrompt?: { id: number; prompt_text: string };
    companyPrompts?: Array<{
      id: number;
      company: string;
      companyName?: string;
      prompt_text: string;
    }>;
  };
  estimatedProcessingTime?: {
    seconds: number | null;
    sampleCount: number;
  };
}

interface AiValidationDialogsProps {
  aiValidationProgress: AiValidationProgress;
  aiValidationDetails: AiValidationDetails;
  currentPrompt: CurrentPrompt;
  setAiValidationProgress: React.Dispatch<React.SetStateAction<AiValidationProgress>>;
  setAiValidationDetails: React.Dispatch<React.SetStateAction<AiValidationDetails>>;
  setAiValidationProgress: React.Dispatch<
    React.SetStateAction<AiValidationProgress>
  >;
  setAiValidationDetails: React.Dispatch<
    React.SetStateAction<AiValidationDetails>
  >;
  setCurrentPrompt: React.Dispatch<React.SetStateAction<CurrentPrompt>>;
  revertAiChange: (productIndex: number, fieldKey: string) => void;
  isChangeReverted: (productIndex: number, fieldKey: string) => boolean;
  getFieldDisplayValueWithHighlight: (fieldKey: string, originalValue: any, correctedValue: any) => { originalHtml: string, correctedHtml: string };
  getFieldDisplayValueWithHighlight: (
    fieldKey: string,
    originalValue: any,
    correctedValue: any
  ) => { originalHtml: string; correctedHtml: string };
  fields: readonly any[];
  debugData?: DebugData;
}

export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
@@ -30,41 +94,558 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
  revertAiChange,
  isChangeReverted,
  getFieldDisplayValueWithHighlight,
  fields
  fields,
  debugData,
}) => {
  const [costPerMillionTokens, setCostPerMillionTokens] = useState(2.5); // Default cost
  const hasCompanyPrompts =
    currentPrompt.debugData?.promptSources?.companyPrompts &&
    currentPrompt.debugData.promptSources.companyPrompts.length > 0;

  // Create our own state to track changes
  const [localReversionState, setLocalReversionState] = useState<
    Record<string, boolean>
  >({});

  // Initialize local state from the isChangeReverted function when component mounts
  // or when aiValidationDetails changes
  React.useEffect(() => {
    if (
      aiValidationDetails.changeDetails &&
      aiValidationDetails.changeDetails.length > 0
    ) {
      const initialState: Record<string, boolean> = {};

      aiValidationDetails.changeDetails.forEach((product) => {
        product.changes.forEach((change) => {
          const key = `${product.productIndex}-${change.field}`;
          initialState[key] = isChangeReverted(
            product.productIndex,
            change.field
          );
        });
      });

      setLocalReversionState(initialState);
    }
  }, [aiValidationDetails.changeDetails, isChangeReverted]);

  // This function will toggle the local state for a given change
  const toggleChangeAcceptance = (productIndex: number, fieldKey: string) => {
    const key = `${productIndex}-${fieldKey}`;
    const currentlyRejected = !!localReversionState[key];

    // Toggle the local state
    setLocalReversionState((prev) => ({
      ...prev,
      [key]: !prev[key],
    }));

    // Only call revertAiChange when toggling to rejected state
    // Since revertAiChange is specifically for rejecting changes
    if (!currentlyRejected) {
      revertAiChange(productIndex, fieldKey);
    }
  };

  // Function to check local reversion state
  const isChangeLocallyReverted = (
    productIndex: number,
    fieldKey: string
  ): boolean => {
    const key = `${productIndex}-${fieldKey}`;
    return !!localReversionState[key];
  };

  // Use "full" as the default tab
  const defaultTab = "full";
  const [activeTab, setActiveTab] = useState(defaultTab);

  // Update activeTab when the dialog is opened with new data
  React.useEffect(() => {
    if (currentPrompt.isOpen) {
      setActiveTab("full");
    }
  }, [currentPrompt.isOpen]);

  // Format time helper
  const formatTime = (seconds: number): string => {
    if (seconds < 60) {
      return `${Math.round(seconds)} seconds`;
    } else {
      const minutes = Math.floor(seconds / 60);
      const remainingSeconds = Math.round(seconds % 60);
      return `${minutes}m ${remainingSeconds}s`;
    }
  };

  // Calculate token costs
  const calculateTokenCost = (promptLength: number): number => {
    const estimatedTokens = Math.round(promptLength / 4);
    return (estimatedTokens / 1_000_000) * costPerMillionTokens * 100; // In cents
  };

  // Use the prompt length from the current prompt
  const promptLength = currentPrompt.prompt ? currentPrompt.prompt.length : 0;

  return (
    <>
      {/* Current Prompt Dialog */}
      <Dialog
        open={currentPrompt.isOpen}
        onOpenChange={(open) => setCurrentPrompt(prev => ({ ...prev, isOpen: open }))}
      {/* Current Prompt Dialog with Debug Info */}
      <Dialog
        open={currentPrompt.isOpen}
        onOpenChange={(open) =>
          setCurrentPrompt((prev) => ({ ...prev, isOpen: open }))
        }
      >
        <DialogContent className="max-w-4xl h-[80vh]">
        <DialogContent className="max-w-4xl max-h-[90vh] overflow-hidden">
          <DialogHeader>
            <DialogTitle>Current AI Prompt</DialogTitle>
            <DialogDescription>
              This is the exact prompt that would be sent to the AI for validation
              This is the current prompt that would be sent to the AI for
              validation
            </DialogDescription>
          </DialogHeader>
          <ScrollArea className="flex-1">
            {currentPrompt.isLoading ? (
              <div className="flex items-center justify-center h-full">
                <Loader2 className="h-8 w-8 animate-spin" />
              </div>
            ) : (
              <Code className="whitespace-pre-wrap p-4">{currentPrompt.prompt}</Code>
            )}
          </ScrollArea>

          <div className="flex flex-col h-[calc(90vh-120px)] overflow-hidden">
            {/* Debug Information Section - Fixed at the top */}
            <div className="flex-shrink-0">
              {currentPrompt.isLoading ? (
                <div className="flex justify-center items-center h-[100px]"></div>
              ) : (
                <>
                  <div className="grid grid-cols-3 gap-4 mb-4">
                    <Card className="py-2">
                      <CardHeader className="py-2">
                        <CardTitle className="text-base">
                          Prompt Length
                        </CardTitle>
                      </CardHeader>
                      <CardContent className="py-2">
                        <div className="flex flex-col space-y-2">
                          <div className="text-sm">
                            <span className="text-muted-foreground">
                              Characters:
                            </span>{" "}
                            <span className="font-semibold">
                              {promptLength}
                            </span>
                          </div>
                          <div className="text-sm">
                            <span className="text-muted-foreground">
                              Tokens:
                            </span>{" "}
                            <span className="font-semibold">
                              ~{Math.round(promptLength / 4)}
                            </span>
                          </div>
                        </div>
                      </CardContent>
                    </Card>

                    <Card className="py-2">
                      <CardHeader className="py-2">
                        <CardTitle className="text-base">
                          Cost Estimate
                        </CardTitle>
                      </CardHeader>
                      <CardContent className="py-2">
                        <div className="flex flex-col space-y-2">
                          <div className="flex flex-row items-center">
                            <label
                              htmlFor="costPerMillion"
                              className="text-sm text-muted-foreground"
                            >
                              $
                            </label>
                            <input
                              id="costPerMillion"
                              className="w-[40px] px-1 border rounded-md text-sm"
                              defaultValue={costPerMillionTokens.toFixed(2)}
                              onChange={(e) => {
                                const value = parseFloat(e.target.value);
                                if (!isNaN(value)) {
                                  setCostPerMillionTokens(value);
                                }
                              }}
                            />
                            <label
                              htmlFor="costPerMillion"
                              className="text-sm text-muted-foreground ml-1"
                            >
                              per million input tokens
                            </label>
                          </div>
                          <div className="text-sm">
                            <span className="text-muted-foreground">Cost:</span>{" "}
                            <span className="font-semibold">
                              {calculateTokenCost(promptLength).toFixed(1)}¢
                            </span>
                          </div>
                        </div>
                      </CardContent>
                    </Card>

                    <Card className="py-2">
                      <CardHeader className="py-2">
                        <CardTitle className="text-base">
                          Processing Time
                        </CardTitle>
                      </CardHeader>
                      <CardContent className="py-2">
                        <div className="flex flex-col space-y-2">
                          {currentPrompt.debugData?.estimatedProcessingTime ? (
                            currentPrompt.debugData.estimatedProcessingTime
                              .seconds ? (
                              <>
                                <div className="text-sm">
                                  <span className="text-muted-foreground">
                                    Estimated time:
                                  </span>{" "}
                                  <span className="font-semibold">
                                    {formatTime(
                                      currentPrompt.debugData
                                        .estimatedProcessingTime.seconds
                                    )}
                                  </span>
                                </div>
                                <div className="text-xs text-muted-foreground">
                                  Based on{" "}
                                  {
                                    currentPrompt.debugData
                                      .estimatedProcessingTime.sampleCount
                                  }{" "}
                                  similar validation
                                  {currentPrompt.debugData
                                    .estimatedProcessingTime.sampleCount !== 1
                                    ? "s"
                                    : ""}
                                </div>
                              </>
                            ) : (
                              <div className="text-sm text-muted-foreground">
                                No historical data available for this prompt
                                size
                              </div>
                            )
                          ) : (
                            <div className="text-sm text-muted-foreground">
                              No processing time data available
                            </div>
                          )}
                        </div>
                      </CardContent>
                    </Card>
                  </div>
                </>
              )}
            </div>

            {/* Prompt Section - Scrollable content */}
            <div className="flex-1 min-h-0">
              {currentPrompt.isLoading ? (
                <div className="flex items-center justify-center h-full">
                  <Loader2 className="h-8 w-8 animate-spin" />
                </div>
              ) : (
                <>
                  {currentPrompt.debugData?.apiFormat ? (
                    <div className="flex flex-col h-full">
                      {/* Prompt Sources Card - Fixed at the top of the content area */}
                      <Card className="py-2 mb-4 flex-shrink-0">
                        <CardHeader className="py-2">
                          <CardTitle className="text-base">
                            Prompt Sources
                          </CardTitle>
                        </CardHeader>
                        <CardContent className="py-2">
                          <div className="flex flex-wrap gap-2">
                            <Badge
                              variant="outline"
                              className="bg-purple-100 hover:bg-purple-200 cursor-pointer"
                              onClick={() =>
                                document
                                  .getElementById("system-message")
                                  ?.scrollIntoView({ behavior: "smooth" })
                              }
                            >
                              System
                            </Badge>
                            <Badge
                              variant="outline"
                              className="bg-green-100 hover:bg-green-200 cursor-pointer"
                              onClick={() =>
                                document
                                  .getElementById("general-section")
                                  ?.scrollIntoView({ behavior: "smooth" })
                              }
                            >
                              General
                            </Badge>

                            {currentPrompt.debugData.promptSources?.companyPrompts?.map(
                              (company, idx) => (
                                <Badge
                                  key={idx}
                                  variant="outline"
                                  className="bg-blue-100 hover:bg-blue-200 cursor-pointer"
                                  onClick={() =>
                                    document
                                      .getElementById("company-section")
                                      ?.scrollIntoView({ behavior: "smooth" })
                                  }
                                >
                                  {company.companyName ||
                                    `Company ${company.company}`}
                                </Badge>
                              )
                            )}

                            <Badge
                              variant="outline"
                              className="bg-amber-100 hover:bg-amber-200 cursor-pointer"
                              onClick={() =>
                                document
                                  .getElementById("taxonomy-section")
                                  ?.scrollIntoView({ behavior: "smooth" })
                              }
                            >
                              Taxonomy
                            </Badge>
                            <Badge
                              variant="outline"
                              className="bg-pink-100 hover:bg-pink-200 cursor-pointer"
                              onClick={() =>
                                document
                                  .getElementById("product-section")
                                  ?.scrollIntoView({ behavior: "smooth" })
                              }
                            >
                              Products
                            </Badge>
                          </div>
                        </CardContent>
                      </Card>

                      <ScrollArea className="flex-1 w-full overflow-y-auto">
                        {currentPrompt.debugData.apiFormat.map(
                          (message, idx: number) => (
                            <div
                              key={idx}
                              className="border rounded-md p-2 mb-4"
                            >
                              <div
                                id={
                                  message.role === "system"
                                    ? "system-message"
                                    : ""
                                }
                                className={`p-2 mb-2 rounded-sm font-medium ${
                                  message.role === "system"
                                    ? "bg-purple-50 text-purple-800"
                                    : "bg-green-50 text-green-800"
                                }`}
                              >
                                Role: {message.role}
                              </div>

                              <Code
                                className={`whitespace-pre-wrap p-4 break-normal max-w-full ${
                                  message.role === "system"
                                    ? "bg-purple-50/30"
                                    : "bg-green-50/30"
                                }`}
                              >
                                {message.role === "user" ? (
                                  <div className="text-wrapper">
                                    {(() => {
                                      const content = message.content;

                                      // Find section boundaries by looking for specific markers
                                      const companySpecificStartIndex =
                                        content.indexOf(
                                          "--- COMPANY-SPECIFIC INSTRUCTIONS ---"
|
||||
);
|
||||
const companySpecificEndIndex =
|
||||
content.indexOf(
|
||||
"--- END COMPANY-SPECIFIC INSTRUCTIONS ---"
|
||||
);
|
||||
|
||||
const taxonomyStartIndex =
|
||||
content.indexOf(
|
||||
"All Available Categories:"
|
||||
);
|
||||
const taxonomyFallbackStartIndex =
|
||||
content.indexOf(
|
||||
"Available Categories:"
|
||||
);
|
||||
const actualTaxonomyStartIndex =
|
||||
taxonomyStartIndex >= 0
|
||||
? taxonomyStartIndex
|
||||
: taxonomyFallbackStartIndex;
|
||||
|
||||
const productDataStartIndex =
|
||||
content.indexOf(
|
||||
"----------Here is the product data to validate----------"
|
||||
);
|
||||
|
||||
// If we can't find any markers, just return the content as-is
|
||||
if (
|
||||
actualTaxonomyStartIndex < 0 &&
|
||||
productDataStartIndex < 0 &&
|
||||
companySpecificStartIndex < 0
|
||||
) {
|
||||
return content;
|
||||
}
|
||||
|
||||
// Determine section indices
|
||||
let generalEndIndex = content.length;
|
||||
|
||||
if (companySpecificStartIndex >= 0) {
|
||||
generalEndIndex =
|
||||
companySpecificStartIndex;
|
||||
} else if (
|
||||
actualTaxonomyStartIndex >= 0
|
||||
) {
|
||||
generalEndIndex =
|
||||
actualTaxonomyStartIndex;
|
||||
} else if (productDataStartIndex >= 0) {
|
||||
generalEndIndex = productDataStartIndex;
|
||||
}
|
||||
|
||||
// Determine where taxonomy starts
|
||||
let taxonomyEndIndex = content.length;
|
||||
if (productDataStartIndex >= 0) {
|
||||
taxonomyEndIndex =
|
||||
productDataStartIndex;
|
||||
}
|
||||
|
||||
// Segments to render with appropriate styling
|
||||
const segments = [];
|
||||
|
||||
// General section (beginning to company/taxonomy/product)
|
||||
if (generalEndIndex > 0) {
|
||||
segments.push(
|
||||
<div
|
||||
id="general-section"
|
||||
key="general"
|
||||
className="border-l-4 border-green-500 pl-4 py-0 my-1"
|
||||
>
|
||||
<div className="text-xs font-semibold text-green-700 mb-2">
|
||||
General Prompt
|
||||
</div>
|
||||
<pre className="whitespace-pre-wrap">
|
||||
{content.substring(
|
||||
0,
|
||||
generalEndIndex
|
||||
)}
|
||||
</pre>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
// Company-specific section if present
|
||||
if (
|
||||
companySpecificStartIndex >= 0 &&
|
||||
companySpecificEndIndex >= 0
|
||||
) {
|
||||
segments.push(
|
||||
<div
|
||||
id="company-section"
|
||||
key="company"
|
||||
className="border-l-4 border-blue-500 pl-4 py-0 my-1"
|
||||
>
|
||||
<div className="text-xs font-semibold text-blue-700 mb-2">
|
||||
Company-Specific Instructions
|
||||
</div>
|
||||
<pre className="whitespace-pre-wrap">
|
||||
{content.substring(
|
||||
companySpecificStartIndex,
|
||||
companySpecificEndIndex +
|
||||
"--- END COMPANY-SPECIFIC INSTRUCTIONS ---"
|
||||
.length
|
||||
)}
|
||||
</pre>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
// Taxonomy section
|
||||
if (actualTaxonomyStartIndex >= 0) {
|
||||
const taxEnd = taxonomyEndIndex;
|
||||
segments.push(
|
||||
<div
|
||||
id="taxonomy-section"
|
||||
key="taxonomy"
|
||||
className="border-l-4 border-amber-500 pl-4 py-0 my-1"
|
||||
>
|
||||
<div className="text-xs font-semibold text-amber-700 mb-2">
|
||||
Taxonomy Data
|
||||
</div>
|
||||
<pre className="whitespace-pre-wrap">
|
||||
{content.substring(
|
||||
actualTaxonomyStartIndex,
|
||||
taxEnd
|
||||
)}
|
||||
</pre>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
// Product data section
|
||||
if (productDataStartIndex >= 0) {
|
||||
segments.push(
|
||||
<div
|
||||
id="product-section"
|
||||
key="product"
|
||||
className="border-l-4 border-pink-500 pl-4 py-0 my-1"
|
||||
>
|
||||
<div className="text-xs font-semibold text-pink-700 mb-2">
|
||||
Product Data
|
||||
</div>
|
||||
<pre className="whitespace-pre-wrap">
|
||||
{content.substring(
|
||||
productDataStartIndex
|
||||
)}
|
||||
</pre>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
return <>{segments}</>;
|
||||
})()}
|
||||
</div>
|
||||
) : (
|
||||
<pre className="whitespace-pre-wrap">
|
||||
{message.content}
|
||||
</pre>
|
||||
)}
|
||||
</Code>
|
||||
</div>
|
||||
)
|
||||
)}
|
||||
</ScrollArea>
|
||||
</div>
|
||||
) : (
|
||||
<ScrollArea className="h-full w-full">
|
||||
<Code className="whitespace-pre-wrap p-4 break-normal max-w-full">
|
||||
{currentPrompt.prompt}
|
||||
</Code>
|
||||
</ScrollArea>
|
||||
)}
|
||||
</>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
|
||||
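The marker-based splitting inside the rendering IIFE above can be factored into a standalone helper, which makes the boundary rules easier to test. This is a minimal sketch under that assumption: the marker strings are taken from the component, but `splitPromptSections` and `PromptSegment` are illustrative names, not part of the codebase.

```typescript
// Hypothetical extraction of the marker-based prompt splitting shown above.
interface PromptSegment {
  kind: "general" | "company" | "taxonomy" | "product";
  text: string;
}

const COMPANY_START = "--- COMPANY-SPECIFIC INSTRUCTIONS ---";
const COMPANY_END = "--- END COMPANY-SPECIFIC INSTRUCTIONS ---";
const PRODUCT_START =
  "----------Here is the product data to validate----------";

function splitPromptSections(content: string): PromptSegment[] {
  const companyStart = content.indexOf(COMPANY_START);
  const companyEnd = content.indexOf(COMPANY_END);
  // Prefer the more specific taxonomy marker, fall back to the short one.
  const allIdx = content.indexOf("All Available Categories:");
  const taxonomyStart =
    allIdx >= 0 ? allIdx : content.indexOf("Available Categories:");
  const productStart = content.indexOf(PRODUCT_START);

  // No markers at all: the whole prompt is "general".
  if (companyStart < 0 && taxonomyStart < 0 && productStart < 0) {
    return [{ kind: "general", text: content }];
  }

  // The general section runs up to the first marker that is present.
  let generalEnd = content.length;
  if (companyStart >= 0) generalEnd = companyStart;
  else if (taxonomyStart >= 0) generalEnd = taxonomyStart;
  else if (productStart >= 0) generalEnd = productStart;

  // Taxonomy runs until the product data marker, if any.
  const taxonomyEnd = productStart >= 0 ? productStart : content.length;

  const segments: PromptSegment[] = [];
  if (generalEnd > 0) {
    segments.push({ kind: "general", text: content.substring(0, generalEnd) });
  }
  if (companyStart >= 0 && companyEnd >= 0) {
    segments.push({
      kind: "company",
      text: content.substring(companyStart, companyEnd + COMPANY_END.length),
    });
  }
  if (taxonomyStart >= 0) {
    segments.push({
      kind: "taxonomy",
      text: content.substring(taxonomyStart, taxonomyEnd),
    });
  }
  if (productStart >= 0) {
    segments.push({ kind: "product", text: content.substring(productStart) });
  }
  return segments;
}
```

The component would then map each segment kind to its colored border instead of computing indices inline.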
{/* AI Validation Progress Dialog */}
<Dialog
  open={aiValidationProgress.isOpen}
  onOpenChange={(open) => {
    // Only allow closing if validation failed
    if (!open && aiValidationProgress.step === -1) {
      setAiValidationProgress((prev) => ({ ...prev, isOpen: false }));
    }
  }}
>
@@ -76,17 +657,30 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
  <div className="flex items-center gap-4">
    <div className="flex-1">
      <div className="h-2 w-full bg-secondary rounded-full overflow-hidden">
        <div
          className="h-full bg-primary transition-all duration-500"
          style={{
            width: `${
              aiValidationProgress.progressPercent !== undefined
                ? Math.round(aiValidationProgress.progressPercent)
                : Math.round((aiValidationProgress.step / 5) * 100)
            }%`,
            backgroundColor:
              aiValidationProgress.step === -1
                ? "var(--destructive)"
                : undefined,
          }}
        />
      </div>
    </div>
    <div className="text-sm text-muted-foreground w-12 text-right">
      {aiValidationProgress.step === -1
        ? "❌"
        : `${
            aiValidationProgress.progressPercent !== undefined
              ? Math.round(aiValidationProgress.progressPercent)
              : Math.round((aiValidationProgress.step / 5) * 100)
          }%`}
    </div>
  </div>
  <p className="text-center text-sm text-muted-foreground">
@@ -94,32 +688,43 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
  </p>
  {(() => {
    // Only show time remaining if we have an estimate and are in progress
    return (
      aiValidationProgress.estimatedSeconds &&
      aiValidationProgress.elapsedSeconds !== undefined &&
      aiValidationProgress.step > 0 &&
      aiValidationProgress.step < 5 && (
        <div className="text-center text-sm">
          {(() => {
            // Calculate time remaining using the elapsed seconds
            const elapsedSeconds = aiValidationProgress.elapsedSeconds;
            const totalEstimatedSeconds =
              aiValidationProgress.estimatedSeconds;
            const remainingSeconds = Math.max(
              0,
              totalEstimatedSeconds - elapsedSeconds
            );

            // Format time remaining
            if (remainingSeconds < 60) {
              return `Approximately ${Math.round(
                remainingSeconds
              )} seconds remaining`;
            } else {
              const minutes = Math.floor(remainingSeconds / 60);
              const seconds = Math.round(remainingSeconds % 60);
              return `Approximately ${minutes}m ${seconds}s remaining`;
            }
          })()}
          {aiValidationProgress.promptLength && (
            <p className="mt-1 text-xs text-muted-foreground">
              Prompt length:{" "}
              {aiValidationProgress.promptLength.toLocaleString()}{" "}
              characters
            </p>
          )}
        </div>
      )
    );
  })()}
</div>
@@ -127,26 +732,33 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
</Dialog>
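The inline countdown above (estimated minus elapsed seconds, with a 60-second threshold between "Ns" and "Nm Ss" formatting) reduces to a pure helper. A sketch, assuming the same rounding and clamping as the JSX; `formatRemaining` is an illustrative name, not a function from the codebase:

```typescript
// Sketch of the remaining-time formatting used in the progress dialog.
function formatRemaining(
  elapsedSeconds: number,
  estimatedSeconds: number
): string {
  // Clamp at zero so an overrun never shows negative time.
  const remainingSeconds = Math.max(0, estimatedSeconds - elapsedSeconds);
  if (remainingSeconds < 60) {
    return `Approximately ${Math.round(remainingSeconds)} seconds remaining`;
  }
  const minutes = Math.floor(remainingSeconds / 60);
  const seconds = Math.round(remainingSeconds % 60);
  return `Approximately ${minutes}m ${seconds}s remaining`;
}
```

A pure function like this is trivially unit-testable, unlike the IIFE embedded in the JSX.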
{/* AI Validation Results Dialog */}
<Dialog
  open={aiValidationDetails.isOpen}
  onOpenChange={(open) =>
    setAiValidationDetails((prev) => ({ ...prev, isOpen: open }))
  }
>
  <DialogContent className="max-w-6xl w-[90vw]">
    <DialogHeader>
      <DialogTitle>AI Validation Results</DialogTitle>
      <DialogDescription>
        Review the changes and warnings suggested by the AI
      </DialogDescription>
    </DialogHeader>
    <ScrollArea className="max-h-[70vh]">
      {aiValidationDetails.changeDetails &&
      aiValidationDetails.changeDetails.length > 0 ? (
        <div className="mb-6 space-y-6">
          <h3 className="font-semibold text-lg">Detailed Changes:</h3>
          {aiValidationDetails.changeDetails.map((product, i) => {
            // Find the title change if it exists
            const titleChange = product.changes.find(
              (c) => c.field === "title"
            );
            const titleValue = titleChange
              ? titleChange.corrected
              : product.title;

            return (
              <div key={`product-${i}`} className="border rounded-md p-4">
                <h4 className="font-medium text-base mb-3">
@@ -155,64 +767,96 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
                <Table>
                  <TableHeader>
                    <TableRow>
                      <TableHead className="">Field</TableHead>
                      <TableHead className="w-[35%]">
                        Original Value
                      </TableHead>
                      <TableHead className="w-[35%]">
                        Corrected Value
                      </TableHead>
                      <TableHead className="text-right">
                        Accept Changes?
                      </TableHead>
                    </TableRow>
                  </TableHeader>
                  <TableBody>
                    {product.changes.map((change, j) => {
                      const field = fields.find(
                        (f) => f.key === change.field
                      );
                      const fieldLabel = field
                        ? field.label
                        : change.field;
                      const isReverted = isChangeLocallyReverted(
                        product.productIndex,
                        change.field
                      );

                      // Get highlighted differences
                      const { originalHtml, correctedHtml } =
                        getFieldDisplayValueWithHighlight(
                          change.field,
                          change.original,
                          change.corrected
                        );

                      return (
                        <TableRow key={`change-${j}`}>
                          <TableCell className="font-medium">
                            {fieldLabel}
                          </TableCell>
                          <TableCell>
                            <div
                              dangerouslySetInnerHTML={{
                                __html: originalHtml,
                              }}
                            />
                          </TableCell>
                          <TableCell>
                            <div
                              dangerouslySetInnerHTML={{
                                __html: correctedHtml,
                              }}
                            />
                          </TableCell>
                          <TableCell className="text-right align-top">
                            <div className="flex justify-end gap-2">
                              <Button
                                variant="outline"
                                size="sm"
                                onClick={() => {
                                  // Toggle to Accepted state if currently rejected
                                  toggleChangeAcceptance(
                                    product.productIndex,
                                    change.field
                                  );
                                }}
                                className={
                                  !isReverted
                                    ? "bg-green-100 text-green-600 border-green-300 flex items-center"
                                    : "border-gray-200 text-gray-600 hover:bg-green-50 hover:text-green-600 hover:border-green-200 flex items-center"
                                }
                              >
                                <CheckIcon className="h-4 w-4" />
                              </Button>
                              <Button
                                variant="outline"
                                size="sm"
                                onClick={() => {
                                  // Toggle to Rejected state if currently accepted
                                  toggleChangeAcceptance(
                                    product.productIndex,
                                    change.field
                                  );
                                }}
                                className={
                                  isReverted
                                    ? "bg-red-100 text-red-600 border-red-300 flex items-center"
                                    : "border-gray-200 text-gray-600 hover:bg-red-50 hover:text-red-600 hover:border-red-200 flex items-center"
                                }
                              >
                                <XIcon className="h-4 w-4" />
                              </Button>
                            </div>
                          </TableCell>
                        </TableRow>
@@ -226,12 +870,17 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
        </div>
      ) : (
        <div className="py-8 text-center text-muted-foreground">
          {aiValidationDetails.warnings &&
          aiValidationDetails.warnings.length > 0 ? (
            <div>
              <p className="mb-4">
                No changes were made, but the AI provided some warnings:
              </p>
              <ul className="list-disc pl-8 text-left">
                {aiValidationDetails.warnings.map((warning, i) => (
                  <li key={`warning-${i}`} className="mb-2">
                    {warning}
                  </li>
                ))}
              </ul>
            </div>
@@ -245,4 +894,4 @@ export const AiValidationDialogs: React.FC<AiValidationDialogsProps> = ({
    </Dialog>
  </>
);
};
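The accept/reject buttons above flip a per-field "reverted" flag via `toggleChangeAcceptance` and read it via `isChangeLocallyReverted`. One plausible backing model is a Set keyed by `productIndex-field`; this sketch is hypothetical — the real hook's state shape is not shown in this diff, and these function bodies are illustrations of the toggle semantics only:

```typescript
// Hypothetical state model for the accept/reject toggles: a change is
// "reverted" (rejected) when its key is present in the set.
const revertedChanges = new Set<string>();

const changeKey = (productIndex: number, field: string) =>
  `${productIndex}-${field}`;

function toggleChangeAcceptance(productIndex: number, field: string): void {
  const key = changeKey(productIndex, field);
  if (revertedChanges.has(key)) {
    revertedChanges.delete(key); // back to "accepted"
  } else {
    revertedChanges.add(key); // mark as rejected/reverted
  }
}

function isChangeLocallyReverted(
  productIndex: number,
  field: string
): boolean {
  return revertedChanges.has(changeKey(productIndex, field));
}
```

Both buttons can call the same toggle because accepted and rejected are mutually exclusive states of one flag.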
@@ -1,5 +1,5 @@
import React, { useState, useEffect, useCallback, useMemo } from 'react'
import { Template } from '../hooks/validationTypes'
import { Button } from '@/components/ui/button'
import {
  Command,
@@ -50,7 +50,6 @@ const SearchableTemplateSelect: React.FC<SearchableTemplateSelectProps> = ({
  const [searchTerm, setSearchTerm] = useState("");
  const [selectedBrand, setSelectedBrand] = useState<string | null>(null);
  const [open, setOpen] = useState(false);

  // Set default brand when component mounts or defaultBrand changes
  useEffect(() => {
@@ -2,7 +2,7 @@ import React, { useMemo } from 'react'
import ValidationTable from './ValidationTable'
import { RowSelectionState } from '@tanstack/react-table'
import { Fields } from '../../../types'
import { Template } from '../hooks/validationTypes'

interface UpcValidationTableAdapterProps<T extends string> {
  data: any[]
@@ -28,6 +28,7 @@ interface UpcValidationTableAdapterProps<T extends string> {
    validatingRows: Set<number>
    getItemNumber: (rowIndex: number) => string | undefined
  }
  itemNumbers?: Map<number, string>
}
/**
@@ -56,75 +57,79 @@ function UpcValidationTableAdapter<T extends string>({
  rowSublines,
  isLoadingLines,
  isLoadingSublines,
  upcValidation,
  itemNumbers
}: UpcValidationTableAdapterProps<T>) {
  // Create combined validatingCells set from validating rows and external cells
  const combinedValidatingCells = useMemo(() => {
    const combined = new Set<string>();

    // Add UPC validation cells
    upcValidation.validatingRows.forEach(rowIndex => {
      // Only mark the item_number cells as validating, NOT the UPC or supplier
      combined.add(`${rowIndex}-item_number`);
    });

    // Add any other validating cells from state
    externalValidatingCells.forEach(cellKey => {
      combined.add(cellKey);
    });

    return combined;
  }, [upcValidation.validatingRows, externalValidatingCells]);

  // Create a consolidated item numbers map from all sources
  const consolidatedItemNumbers = useMemo(() => {
    const result = new Map<number, string>();

    // First add from itemNumbers directly - this is the source of truth for template applications
    if (itemNumbers) {
      // Log all numbers for debugging
      console.log(`[ADAPTER-DEBUG] Received itemNumbers map with ${itemNumbers.size} entries`);

      itemNumbers.forEach((itemNumber, rowIndex) => {
        console.log(`[ADAPTER-DEBUG] Adding item number for row ${rowIndex} from itemNumbers: ${itemNumber}`);
        result.set(rowIndex, itemNumber);
      });
    }

    // For each row, ensure we have the most up-to-date item number
    data.forEach((_, index) => {
      // Check if upcValidation has an item number for this row
      const itemNumber = upcValidation.getItemNumber(index);
      if (itemNumber) {
        console.log(`[ADAPTER-DEBUG] Adding item number for row ${index} from upcValidation: ${itemNumber}`);
        result.set(index, itemNumber);
      }

      // Also check if it's directly in the data
      const dataItemNumber = data[index].item_number;
      if (dataItemNumber && !result.has(index)) {
        console.log(`[ADAPTER-DEBUG] Adding item number for row ${index} from data: ${dataItemNumber}`);
        result.set(index, dataItemNumber);
      }
    });

    return result;
  }, [data, itemNumbers, upcValidation]);

  // Create upcValidationResults map using the consolidated item numbers
  const upcValidationResults = useMemo(() => {
    const results = new Map<number, { itemNumber: string }>();

    // Populate with our consolidated item numbers
    consolidatedItemNumbers.forEach((itemNumber, rowIndex) => {
      results.set(rowIndex, { itemNumber });
    });

    return results;
  }, [consolidatedItemNumbers]);

  // Render the validation table with the provided props and UPC data
  return (
    <ValidationTable
      data={data}
      fields={fields}
      rowSelection={rowSelection}
@@ -137,11 +142,11 @@ function UpcValidationTableAdapter<T extends string>({
      templates={templates}
      applyTemplate={applyTemplate}
      getTemplateDisplayText={getTemplateDisplayText}
      validatingCells={combinedValidatingCells}
      itemNumbers={consolidatedItemNumbers}
      isLoadingTemplates={isLoadingTemplates}
      copyDown={copyDown}
      upcValidationResults={upcValidationResults}
      rowProductLines={rowProductLines}
      rowSublines={rowSublines}
      isLoadingLines={isLoadingLines}
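The `consolidatedItemNumbers` memo above merges three sources with a fixed precedence: the template `itemNumbers` map is seeded first, a value from `upcValidation.getItemNumber` overwrites it, and the value already sitting in the data row is used only when neither map supplied one. That merge can be sketched as a pure function (the name `consolidateItemNumbers` is illustrative, and the debug logging is omitted):

```typescript
// Sketch of the consolidation order used by the adapter: validation
// results override the template map; the data row is only a fallback.
function consolidateItemNumbers(
  data: Array<{ item_number?: string }>,
  templateItemNumbers: Map<number, string> | undefined,
  getItemNumber: (rowIndex: number) => string | undefined
): Map<number, string> {
  const result = new Map<number, string>();

  // Seed from the template map first.
  if (templateItemNumbers) {
    templateItemNumbers.forEach((itemNumber, rowIndex) =>
      result.set(rowIndex, itemNumber)
    );
  }

  data.forEach((row, index) => {
    const fromValidation = getItemNumber(index);
    if (fromValidation) {
      result.set(index, fromValidation); // overrides the template value
    }
    if (row.item_number && !result.has(index)) {
      result.set(index, row.item_number); // fallback only
    }
  });

  return result;
}
```

Keeping the merge order explicit in one place avoids the stale-map bugs the original `AdaptedTable` wrapper suffered from.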
@@ -293,8 +293,18 @@ const ValidationCell = React.memo(({
  // Use the CopyDown context
  const copyDownContext = React.useContext(CopyDownContext);

  // CRITICAL FIX: For item_number fields, always prioritize the itemNumber prop over the value
  // This ensures that when the itemNumber changes, the display value changes
  let displayValue;
  if (fieldKey === 'item_number' && itemNumber) {
    // Always log when an item_number field is rendered to help debug
    console.log(`[VC-DEBUG] ValidationCell rendering item_number for row=${rowIndex} with itemNumber=${itemNumber}, value=${value}`);

    // Prioritize itemNumber prop for item_number fields
    displayValue = itemNumber;
  } else {
    displayValue = value;
  }

  // Use the optimized processErrors function to avoid redundant filtering
  const {
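The display-value rule above (for `item_number` cells the `itemNumber` prop wins whenever it is set; every other cell shows the stored value) reduces to a small pure function. A sketch, with `cellDisplayValue` as an illustrative name and the debug logging dropped:

```typescript
// Sketch of the cell display precedence: for item_number fields the
// itemNumber prop, when present, always wins over the stored value.
function cellDisplayValue(
  fieldKey: string,
  value: string | undefined,
  itemNumber: string | undefined
): string | undefined {
  if (fieldKey === "item_number" && itemNumber) {
    return itemNumber;
  }
  return value;
}
```

Making the rule a pure function also makes the memoization exception for `item_number` cells easier to reason about.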
@@ -1,5 +1,6 @@
import React, { useState, useEffect, useRef, useCallback, useMemo, useLayoutEffect } from 'react'
import { useValidationState } from '../hooks/useValidationState'
import { Props } from '../hooks/validationTypes'
import { Button } from '@/components/ui/button'
import { Loader2, X, Plus, Edit3, Sparkles, FileText } from 'lucide-react'
import { toast } from 'sonner'
@@ -17,6 +18,7 @@ import { useUpcValidation } from '../hooks/useUpcValidation'
import { useProductLinesFetching } from '../hooks/useProductLinesFetching'
import UpcValidationTableAdapter from './UpcValidationTableAdapter'
import { Skeleton } from '@/components/ui/skeleton'
import { Protected } from '@/components/auth/Protected'
/**
 * ValidationContainer component - the main wrapper for the validation step
 *
@@ -57,7 +59,10 @@ const ValidationContainer = <T extends string>({
    loadTemplates,
    setData,
    fields,
    isLoadingTemplates,
    validatingCells,
    setValidatingCells
  } = validationState

  // Use product lines fetching hook
  const {
@@ -69,9 +74,6 @@ const ValidationContainer = <T extends string>({
    fetchSublines
  } = useProductLinesFetching(data);

  // Use UPC validation hook
  const upcValidation = useUpcValidation(data, setData);

@@ -958,6 +960,7 @@ const ValidationContainer = <T extends string>({
      isLoadingLines={isLoadingLines}
      isLoadingSublines={isLoadingSublines}
      upcValidation={upcValidation}
      itemNumbers={upcValidation.itemNumbers}
    />
  );
}, [
@@ -1147,15 +1150,17 @@ const ValidationContainer = <T extends string>({
  )}
  <div className="flex items-center gap-2">
    {/* Show Prompt Button */}
    <Protected permission="admin:debug">
      <Button
        variant="outline"
        onClick={aiValidation.showCurrentPrompt}
        disabled={aiValidation.isAiValidating}
        className="flex items-center gap-1"
      >
        <FileText className="h-4 w-4" />
        Show Prompt
      </Button>
    </Protected>

    {/* AI Validate Button */}
    <Button
@@ -1198,6 +1203,7 @@ const ValidationContainer = <T extends string>({
      isChangeReverted={aiValidation.isChangeReverted}
      getFieldDisplayValueWithHighlight={aiValidation.getFieldDisplayValueWithHighlight}
      fields={fields}
      debugData={aiValidation.currentPrompt.debugData}
    />

    {/* Product Search Dialog */}

@@ -7,7 +7,7 @@ import {
  ColumnDef
} from '@tanstack/react-table'
import { Fields, Field } from '../../../types'
import { RowData, Template } from '../hooks/validationTypes'
import ValidationCell, { CopyDownContext } from './ValidationCell'
import { useRsi } from '../../../hooks/useRsi'
import SearchableTemplateSelect from './SearchableTemplateSelect'
@@ -15,7 +15,7 @@ import { Table, TableBody, TableRow, TableCell } from '@/components/ui/table'
import { Checkbox } from '@/components/ui/checkbox'
import { cn } from '@/lib/utils'
import { Button } from '@/components/ui/button'
import { Skeleton } from '@/components/ui/skeleton'

// Define a simple Error type locally to avoid import issues
type ErrorType = {
@@ -67,10 +67,9 @@ const MemoizedTemplateSelect = React.memo(({
}) => {
  if (isLoading) {
    return (
      <div className="flex items-center justify-center gap-2 border border-input rounded-md px-2 py-2">
        <Skeleton className="h-4 w-full" />
      </div>
    );
  }

@@ -139,11 +138,15 @@ const MemoizedCell = React.memo(({
    />
  );
}, (prev, next) => {
  // CRITICAL FIX: Never memoize item_number cells - always re-render them
  if (prev.fieldKey === 'item_number') {
    return false; // Never skip re-renders for item_number cells
  }

  // Optimize the memo comparison function for better performance
  // Only re-render if these essential props change
  const valueEqual = prev.value === next.value;
  const isValidatingEqual = prev.isValidating === next.isValidating;
  const itemNumberEqual = prev.itemNumber === next.itemNumber;

  // Shallow equality check for errors array
  const errorsEqual = prev.errors === next.errors || (
@@ -162,7 +165,7 @@ const MemoizedCell = React.memo(({
  );

  // Skip checking for props that rarely change
  return valueEqual && isValidatingEqual && errorsEqual && optionsEqual;
});

MemoizedCell.displayName = 'MemoizedCell';
@@ -336,10 +339,28 @@ const ValidationTable = <T extends string>({
    return isValidatingUpc?.(rowIndex) || validatingUpcRows.includes(rowIndex);
  }, [isValidatingUpc, validatingUpcRows]);

  // Use upcValidationResults for display, prioritizing the most recent values
  const getRowUpcResult = useCallback((rowIndex: number) => {
    // ALWAYS get from the data array directly - most authoritative source
    const rowData = data[rowIndex];
    if (rowData && rowData.item_number) {
      return rowData.item_number;
    }

    // Maps are only backup sources when data doesn't have a value
    const itemNumberFromMap = itemNumbers.get(rowIndex);
    if (itemNumberFromMap) {
      return itemNumberFromMap;
    }

    // Last resort - upcValidationResults
    const upcResult = upcValidationResults?.get(rowIndex)?.itemNumber;
    if (upcResult) {
      return upcResult;
    }

    return undefined;
  }, [data, itemNumbers, upcValidationResults]);

// Memoize field columns with stable handlers
|
||||
const fieldColumns = useMemo(() => fields.map((field): ColumnDef<RowData<T>, any> | null => {
|
||||
@@ -412,26 +433,34 @@ const ValidationTable = <T extends string>({
|
||||
disabled: false
|
||||
};
|
||||
|
||||
// Debug logging
|
||||
console.log(`Field ${fieldKey} in ValidationTable (after deep clone):`, {
|
||||
originalField: field,
|
||||
modifiedField: fieldWithType,
|
||||
options,
|
||||
hasOptions: options && options.length > 0,
|
||||
disabled: fieldWithType.disabled
|
||||
});
|
||||
}
|
||||
|
||||
// Get item number from UPC validation results if available
|
||||
let itemNumber = itemNumbers.get(row.index);
|
||||
if (!itemNumber && fieldKey === 'item_number') {
|
||||
itemNumber = getRowUpcResult(row.index);
|
||||
// CRITICAL CHANGE: Get item number directly from row data first for item_number fields
|
||||
let itemNumber;
|
||||
if (fieldKey === 'item_number') {
|
||||
// Check directly in row data first - this is the most accurate source
|
||||
const directValue = row.original[fieldKey];
|
||||
if (directValue) {
|
||||
itemNumber = directValue;
|
||||
} else {
|
||||
// Fall back to centralized getter that checks all sources
|
||||
itemNumber = getRowUpcResult(row.index);
|
||||
}
|
||||
}
|
||||
|
||||
// CRITICAL: For item_number fields, create a unique key that includes the itemNumber value
|
||||
// This forces a complete re-render when the itemNumber changes
|
||||
const cellKey = fieldKey === 'item_number'
|
||||
? `cell-${row.index}-${fieldKey}-${itemNumber || 'empty'}-${Date.now()}` // Force re-render on every render cycle for item_number
|
||||
: `cell-${row.index}-${fieldKey}`;
|
||||
|
||||
return (
|
||||
<MemoizedCell
|
||||
key={cellKey} // CRITICAL: Add key to force re-render when itemNumber changes
|
||||
field={fieldWithType as Field<string>}
|
||||
value={row.original[field.key as keyof typeof row.original]}
|
||||
value={fieldKey === 'item_number' && row.original[field.key]
|
||||
? row.original[field.key] // Use direct value from row data
|
||||
: row.original[field.key as keyof typeof row.original]}
|
||||
onChange={(value) => handleFieldUpdate(row.index, field.key as T, value)}
|
||||
errors={cellErrors}
|
||||
isValidating={isLoading}
|
||||
|
||||
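The comparator change above can be sketched as a plain function: item_number cells opt out of memoization entirely so asynchronously arriving UPC results always render, while other cells skip re-renders when their essential props are unchanged. This is a minimal model with assumed prop shapes, not the component's real types.

```typescript
type CellProps = {
  fieldKey: string;
  value: unknown;
  isValidating: boolean;
  errors: unknown[]; // compared by reference, as in the memo above
};

// Returns true when React.memo may SKIP re-rendering (props considered equal).
function areCellPropsEqual(prev: CellProps, next: CellProps): boolean {
  // item_number cells always re-render so async results show up immediately
  if (prev.fieldKey === "item_number") return false;
  return (
    prev.value === next.value &&
    prev.isValidating === next.isValidating &&
    prev.errors === next.errors
  );
}

const errs: unknown[] = [];
console.log(areCellPropsEqual(
  { fieldKey: "item_number", value: "A", isValidating: false, errors: errs },
  { fieldKey: "item_number", value: "A", isValidating: false, errors: errs },
)); // false: item_number never skips a re-render
console.log(areCellPropsEqual(
  { fieldKey: "msrp", value: "9.99", isValidating: false, errors: errs },
  { fieldKey: "msrp", value: "9.99", isValidating: false, errors: errs },
)); // true: identical props, render skipped
```

Note the `Date.now()` in the cell key works toward the same goal from the other direction: changing the `key` discards the memoized instance entirely, at the cost of remounting the cell on every render pass.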
@@ -1,6 +1,6 @@
 import { useState, useCallback } from 'react';
 import { toast } from 'sonner';
-import { getApiUrl, RowData } from './useValidationState';
+import { getApiUrl, RowData } from './validationTypes';
 import { Fields } from '../../../types';
 import { Meta } from '../types';
 import { addErrorsAndRunHooks } from '../utils/dataMutations';
@@ -42,6 +42,39 @@ export interface CurrentPrompt {
   isOpen: boolean;
   prompt: string | null;
   isLoading: boolean;
+  debugData?: {
+    taxonomyStats: {
+      categories: number;
+      themes: number;
+      colors: number;
+      taxCodes: number;
+      sizeCategories: number;
+      suppliers: number;
+      companies: number;
+      artists: number;
+    } | null;
+    basePrompt: string;
+    sampleFullPrompt: string;
+    promptLength: number;
+    apiFormat?: Array<{
+      role: string;
+      content: string;
+    }>;
+    promptSources?: {
+      systemPrompt?: { id: number; prompt_text: string };
+      generalPrompt?: { id: number; prompt_text: string };
+      companyPrompts?: Array<{
+        id: number;
+        company: string;
+        companyName: string;
+        prompt_text: string
+      }>;
+    };
+    estimatedProcessingTime?: {
+      seconds: number | null;
+      sampleCount: number;
+    };
+  };
 }

 // Declare global interface for the timer
@@ -250,7 +283,11 @@ export const useAiValidation = <T extends string>(
   // Function to show current prompt
   const showCurrentPrompt = useCallback(async () => {
     try {
-      setCurrentPrompt(prev => ({ ...prev, isLoading: true, isOpen: true }));
+      setCurrentPrompt(prev => ({
+        ...prev,
+        isLoading: true,
+        isOpen: true
+      }));

       // Debug log the data being sent
       console.log('Sending products data:', {
@@ -272,7 +309,7 @@ export const useAiValidation = <T extends string>(
       });

       // Use POST to send products in request body
-      const response = await fetch(`${getApiUrl()}/ai-validation/debug`, {
+      const response = await fetch(`${await getApiUrl()}/ai-validation/debug`, {
         method: 'POST',
         headers: {
           'Content-Type': 'application/json',
@@ -294,7 +331,16 @@ export const useAiValidation = <T extends string>(
       setCurrentPrompt(prev => ({
         ...prev,
         prompt: promptContent,
-        isLoading: false
+        isLoading: false,
+        debugData: {
+          taxonomyStats: result.taxonomyStats || null,
+          basePrompt: result.basePrompt || '',
+          sampleFullPrompt: result.sampleFullPrompt || '',
+          promptLength: result.promptLength || (promptContent ? promptContent.length : 0),
+          promptSources: result.promptSources,
+          estimatedProcessingTime: result.estimatedProcessingTime,
+          apiFormat: result.apiFormat
+        }
       }));
     } else {
       throw new Error('No prompt returned from server');
@@ -460,6 +506,27 @@ export const useAiValidation = <T extends string>(
       throw new Error(result.error || 'AI validation failed');
     }

+    // Store the prompt sources if they exist
+    if (result.promptSources) {
+      setCurrentPrompt(prev => {
+        // Create debugData if it doesn't exist
+        const prevDebugData = prev.debugData || {
+          taxonomyStats: null,
+          basePrompt: '',
+          sampleFullPrompt: '',
+          promptLength: 0
+        };
+
+        return {
+          ...prev,
+          debugData: {
+            ...prevDebugData,
+            promptSources: result.promptSources
+          }
+        };
+      });
+    }
+
     // Update progress with actual processing time if available
     if (result.performanceMetrics) {
       console.log('Performance metrics:', result.performanceMetrics);
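The `${getApiUrl()}` to `${await getApiUrl()}` change above matters because interpolating an async function's return value without `await` bakes a stringified Promise into the URL. A small sketch (the `getApiUrl` below is a hypothetical stand-in, not the app's real implementation):

```typescript
// Hypothetical async stand-in for the app's getApiUrl
async function getApiUrl(): Promise<string> {
  return "https://api.example.com";
}

async function main() {
  // Without await, the template literal stringifies the Promise itself
  const wrong = `${getApiUrl()}/ai-validation/debug`;
  // With await, the resolved string is interpolated
  const right = `${await getApiUrl()}/ai-validation/debug`;

  console.log(wrong.startsWith("[object Promise]")); // true: broken URL
  console.log(right); // https://api.example.com/ai-validation/debug
}

main();
```

A synchronous `getApiUrl` works either way (`await` on a non-Promise is a no-op), which is why this bug only surfaces once the function becomes async.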
@@ -0,0 +1,161 @@
import { useCallback } from 'react';
import type { Field, Fields, RowHook } from '../../../types';
import type { Meta } from '../types';
import { ErrorType, ValidationError } from '../../../types';
import { RowData, isEmpty } from './validationTypes';

// Create a cache for validation results to avoid repeated validation of the same data
const validationResultCache = new Map();

// Add a function to clear cache for a specific field value
export const clearValidationCacheForField = (fieldKey: string) => {
  // Look for entries that match this field key
  validationResultCache.forEach((_, key) => {
    if (key.startsWith(`${fieldKey}-`)) {
      validationResultCache.delete(key);
    }
  });
};

// Add a special function to clear all uniqueness validation caches
export const clearAllUniquenessCaches = () => {
  // Clear cache for common unique fields
  ['item_number', 'upc', 'barcode', 'supplier_no', 'notions_no'].forEach(fieldKey => {
    clearValidationCacheForField(fieldKey);
  });

  // Also clear any cache entries that might involve uniqueness validation
  validationResultCache.forEach((_, key) => {
    if (key.includes('unique')) {
      validationResultCache.delete(key);
    }
  });
};

export const useFieldValidation = <T extends string>(
  fields: Fields<T>,
  rowHook?: RowHook<T>
) => {
  // Validate a single field
  const validateField = useCallback((
    value: any,
    field: Field<T>
  ): ValidationError[] => {
    const errors: ValidationError[] = [];

    if (!field.validations) return errors;

    // Create a cache key using field key, value, and validation rules
    const cacheKey = `${field.key}-${String(value)}-${JSON.stringify(field.validations)}`;

    // Check cache first to avoid redundant validation
    if (validationResultCache.has(cacheKey)) {
      return validationResultCache.get(cacheKey) || [];
    }

    field.validations.forEach(validation => {
      switch (validation.rule) {
        case 'required':
          // Use the shared isEmpty function
          if (isEmpty(value)) {
            errors.push({
              message: validation.errorMessage || 'This field is required',
              level: validation.level || 'error',
              type: ErrorType.Required
            });
          }
          break;

        case 'unique':
          // Unique validation happens at table level, not here
          break;

        case 'regex':
          if (value !== undefined && value !== null && value !== '') {
            try {
              const regex = new RegExp(validation.value, validation.flags);
              if (!regex.test(String(value))) {
                errors.push({
                  message: validation.errorMessage,
                  level: validation.level || 'error',
                  type: ErrorType.Regex
                });
              }
            } catch (error) {
              console.error('Invalid regex in validation:', error);
            }
          }
          break;
      }
    });

    // Store results in cache to speed up future validations
    validationResultCache.set(cacheKey, errors);

    return errors;
  }, []);

  // Validate a single row
  const validateRow = useCallback(async (
    row: RowData<T>,
    rowIndex: number,
    allRows: RowData<T>[]
  ): Promise<Meta> => {
    // Run field-level validations
    const fieldErrors: Record<string, ValidationError[]> = {};

    fields.forEach(field => {
      const value = row[String(field.key) as keyof typeof row];
      const errors = validateField(value, field as Field<T>);

      if (errors.length > 0) {
        fieldErrors[String(field.key)] = errors;
      }
    });

    // Special validation for supplier and company fields - only apply if the field exists in fields
    if (fields.some(field => String(field.key) === 'supplier') && isEmpty(row.supplier)) {
      fieldErrors['supplier'] = [{
        message: 'Supplier is required',
        level: 'error',
        type: ErrorType.Required
      }];
    }

    if (fields.some(field => String(field.key) === 'company') && isEmpty(row.company)) {
      fieldErrors['company'] = [{
        message: 'Company is required',
        level: 'error',
        type: ErrorType.Required
      }];
    }

    // Run row hook if provided
    let rowHookResult: Meta = {
      __index: row.__index || String(rowIndex)
    };
    if (rowHook) {
      try {
        // Call the row hook and extract only the __index property
        const result = await rowHook(row, rowIndex, allRows);
        rowHookResult.__index = result.__index || rowHookResult.__index;
      } catch (error) {
        console.error('Error in row hook:', error);
      }
    }

    // We no longer need to merge errors since we're not storing them in the row data
    // The calling code should handle storing errors in the validationErrors Map

    return {
      __index: row.__index || String(rowIndex)
    };
  }, [fields, validateField, rowHook]);

  return {
    validateField,
    validateRow,
    clearValidationCacheForField,
    clearAllUniquenessCaches
  };
};
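The cache-key scheme above (`field.key` + stringified value + stringified rules) can be modeled in isolation. This is a minimal sketch with assumed, simplified types, showing the behavior the hook relies on: a repeat call with identical inputs returns the cached array instance instead of re-running the rules.

```typescript
type Validation = { rule: "required"; errorMessage?: string };
type Err = { message: string };

const cache = new Map<string, Err[]>();

function validateFieldCached(
  fieldKey: string,
  value: unknown,
  validations: Validation[],
): Err[] {
  // Same key construction as the hook: field key, value, and rules
  const cacheKey = `${fieldKey}-${String(value)}-${JSON.stringify(validations)}`;
  const hit = cache.get(cacheKey);
  if (hit) return hit; // cache hit: same array instance is returned

  const errors: Err[] = [];
  for (const v of validations) {
    if (v.rule === "required" && (value === undefined || value === null || value === "")) {
      errors.push({ message: v.errorMessage ?? "This field is required" });
    }
  }
  cache.set(cacheKey, errors);
  return errors;
}

const rules: Validation[] = [{ rule: "required" }];
const first = validateFieldCached("upc", "", rules);
const second = validateFieldCached("upc", "", rules);
console.log(first.length);     // 1: required rule fired on empty value
console.log(first === second); // true: second call served from cache
```

Returning a stable array instance is also what lets the table's shallow `prev.errors === next.errors` memo check skip re-renders; it is the reason `clearValidationCacheForField` must be called when cross-row state (like uniqueness) changes, since the cache has no expiry of its own.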
@@ -0,0 +1,110 @@
import { useState, useCallback, useMemo } from 'react';
import { FilterState, RowData } from './validationTypes';
import type { Fields } from '../../../types';
import { ValidationError } from '../../../types';

export const useFilterManagement = <T extends string>(
  data: RowData<T>[],
  fields: Fields<T>,
  validationErrors: Map<number, Record<string, ValidationError[]>>
) => {
  // Filter state
  const [filters, setFilters] = useState<FilterState>({
    searchText: "",
    showErrorsOnly: false,
    filterField: null,
    filterValue: null,
  });

  // Filter data based on current filter state
  const filteredData = useMemo(() => {
    return data.filter((row, index) => {
      // Filter by search text
      if (filters.searchText) {
        const searchLower = filters.searchText.toLowerCase();
        const matchesSearch = fields.some((field) => {
          const value = row[field.key as keyof typeof row];
          if (value === undefined || value === null) return false;
          return String(value).toLowerCase().includes(searchLower);
        });
        if (!matchesSearch) return false;
      }

      // Filter by errors
      if (filters.showErrorsOnly) {
        const hasErrors =
          validationErrors.has(index) &&
          Object.keys(validationErrors.get(index) || {}).length > 0;
        if (!hasErrors) return false;
      }

      // Filter by field value
      if (filters.filterField && filters.filterValue) {
        const fieldValue = row[filters.filterField as keyof typeof row];
        if (fieldValue === undefined) return false;

        const valueStr = String(fieldValue).toLowerCase();
        const filterStr = filters.filterValue.toLowerCase();

        if (!valueStr.includes(filterStr)) return false;
      }

      return true;
    });
  }, [data, fields, filters, validationErrors]);

  // Get filter fields
  const filterFields = useMemo(() => {
    return fields.map((field) => ({
      key: String(field.key),
      label: field.label,
    }));
  }, [fields]);

  // Get filter values for the selected field
  const filterValues = useMemo(() => {
    if (!filters.filterField) return [];

    // Get unique values for the selected field
    const uniqueValues = new Set<string>();

    data.forEach((row) => {
      const value = row[filters.filterField as keyof typeof row];
      if (value !== undefined && value !== null) {
        uniqueValues.add(String(value));
      }
    });

    return Array.from(uniqueValues).map((value) => ({
      value,
      label: value,
    }));
  }, [data, filters.filterField]);

  // Update filters
  const updateFilters = useCallback((newFilters: Partial<FilterState>) => {
    setFilters((prev) => ({
      ...prev,
      ...newFilters,
    }));
  }, []);

  // Reset filters
  const resetFilters = useCallback(() => {
    setFilters({
      searchText: "",
      showErrorsOnly: false,
      filterField: null,
      filterValue: null,
    });
  }, []);

  return {
    filters,
    filteredData,
    filterFields,
    filterValues,
    updateFilters,
    resetFilters
  };
};
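The three filter stages above (text search across all fields, errors-only, then field/value match) are AND-combined: a row must pass every active stage to survive. A React-free sketch of that pipeline over hypothetical sample rows, with the error map reduced to a set of row indexes:

```typescript
type Row = Record<string, string | undefined>;

function applyFilters(
  rows: Row[],
  searchText: string,
  errorRows: Set<number>,    // indexes of rows that currently have errors
  showErrorsOnly: boolean,
  filterField: string | null,
  filterValue: string | null,
): Row[] {
  return rows.filter((row, index) => {
    // Stage 1: free-text search across every field value
    if (searchText) {
      const needle = searchText.toLowerCase();
      const matches = Object.values(row).some(
        (v) => v !== undefined && v.toLowerCase().includes(needle),
      );
      if (!matches) return false;
    }
    // Stage 2: keep only rows with validation errors
    if (showErrorsOnly && !errorRows.has(index)) return false;
    // Stage 3: substring match on one chosen field
    if (filterField && filterValue) {
      const v = row[filterField];
      if (v === undefined || !v.toLowerCase().includes(filterValue.toLowerCase())) {
        return false;
      }
    }
    return true;
  });
}

const rows: Row[] = [
  { name: "Red Mug", company: "Acme" },
  { name: "Blue Mug", company: "Globex" },
];
// Both rows match "mug"; only index 1 has errors and a company containing "glo"
console.log(applyFilters(rows, "mug", new Set([1]), true, "company", "glo").length); // 1
```

One caveat carried over from the hook: stage 2 keys errors by the row's index in the unfiltered `data` array, so the error map must be maintained against original indexes, not filtered positions.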
@@ -0,0 +1,366 @@
import { useCallback } from 'react';
import { RowData } from './validationTypes';
import type { Field, Fields } from '../../../types';
import { ErrorType, ValidationError } from '../../../types';

export const useRowOperations = <T extends string>(
  data: RowData<T>[],
  fields: Fields<T>,
  setData: React.Dispatch<React.SetStateAction<RowData<T>[]>>,
  setValidationErrors: React.Dispatch<React.SetStateAction<Map<number, Record<string, ValidationError[]>>>>,
  validateFieldFromHook: (value: any, field: Field<T>) => ValidationError[]
) => {
  // Helper function to validate a field value
  const fieldValidationHelper = useCallback(
    (rowIndex: number, specificField?: string) => {
      // Skip validation if row doesn't exist
      if (rowIndex < 0 || rowIndex >= data.length) return;

      // Get the row data
      const row = data[rowIndex];

      // If validating a specific field, only check that field
      if (specificField) {
        const field = fields.find((f) => String(f.key) === specificField);
        if (field) {
          const value = row[specificField as keyof typeof row];

          // Use state setter instead of direct mutation
          setValidationErrors((prev) => {
            const newErrors = new Map(prev);
            const existingErrors = { ...(newErrors.get(rowIndex) || {}) };

            // Quick check for required fields - this prevents flashing errors
            const isRequired = field.validations?.some(
              (v) => v.rule === "required"
            );
            const isEmpty =
              value === undefined ||
              value === null ||
              value === "" ||
              (Array.isArray(value) && value.length === 0) ||
              (typeof value === "object" &&
                value !== null &&
                Object.keys(value).length === 0);

            // For non-empty values, remove required errors immediately
            if (isRequired && !isEmpty && existingErrors[specificField]) {
              const nonRequiredErrors = existingErrors[specificField].filter(
                (e) => e.type !== ErrorType.Required
              );
              if (nonRequiredErrors.length === 0) {
                // If no other errors, remove the field entirely from errors
                delete existingErrors[specificField];
              } else {
                existingErrors[specificField] = nonRequiredErrors;
              }
            }

            // Run full validation for the field
            const errors = validateFieldFromHook(value, field as unknown as Field<T>);

            // Update validation errors for this field
            if (errors.length > 0) {
              existingErrors[specificField] = errors;
            } else {
              delete existingErrors[specificField];
            }

            // Update validation errors map
            if (Object.keys(existingErrors).length > 0) {
              newErrors.set(rowIndex, existingErrors);
            } else {
              newErrors.delete(rowIndex);
            }

            return newErrors;
          });
        }
      } else {
        // Validate all fields in the row
        setValidationErrors((prev) => {
          const newErrors = new Map(prev);
          const rowErrors: Record<string, ValidationError[]> = {};

          fields.forEach((field) => {
            const fieldKey = String(field.key);
            const value = row[fieldKey as keyof typeof row];
            const errors = validateFieldFromHook(value, field as unknown as Field<T>);

            if (errors.length > 0) {
              rowErrors[fieldKey] = errors;
            }
          });

          // Update validation errors map
          if (Object.keys(rowErrors).length > 0) {
            newErrors.set(rowIndex, rowErrors);
          } else {
            newErrors.delete(rowIndex);
          }

          return newErrors;
        });
      }
    },
    [data, fields, validateFieldFromHook, setValidationErrors]
  );

  // Use validateRow as an alias for fieldValidationHelper for compatibility
  const validateRow = fieldValidationHelper;

  // Modified updateRow function that properly handles field-specific validation
  const updateRow = useCallback(
    (rowIndex: number, key: T, value: any) => {
      // Process value before updating data
      let processedValue = value;

      // Strip dollar signs from price fields
      if (
        (key === "msrp" || key === "cost_each") &&
        typeof value === "string"
      ) {
        processedValue = value.replace(/[$,]/g, "");

        // Also ensure it's a valid number
        const numValue = parseFloat(processedValue);
        if (!isNaN(numValue)) {
          processedValue = numValue.toFixed(2);
        }
      }

      // Find the row data first
      const rowData = data[rowIndex];
      if (!rowData) {
        console.error(`No row data found for index ${rowIndex}`);
        return;
      }

      // Create a copy of the row to avoid mutation
      const updatedRow = { ...rowData, [key]: processedValue };

      // Update the data immediately - this sets the value
      setData((prevData) => {
        const newData = [...prevData];
        if (rowIndex >= 0 && rowIndex < newData.length) {
          newData[rowIndex] = updatedRow;
        }
        return newData;
      });

      // Find the field definition
      const field = fields.find((f) => String(f.key) === key);
      if (!field) return;

      // CRITICAL FIX: Combine both validation operations into a single state update
      // to prevent intermediate rendering that causes error icon flashing
      setValidationErrors((prev) => {
        const newMap = new Map(prev);
        const existingErrors = newMap.get(rowIndex) || {};
        const newRowErrors = { ...existingErrors };

        // Check for required field first
        const isRequired = field.validations?.some(
          (v) => v.rule === "required"
        );
        const isEmpty =
          processedValue === undefined ||
          processedValue === null ||
          processedValue === "" ||
          (Array.isArray(processedValue) && processedValue.length === 0) ||
          (typeof processedValue === "object" &&
            processedValue !== null &&
            Object.keys(processedValue).length === 0);

        // For required fields with values, remove required errors
        if (isRequired && !isEmpty && newRowErrors[key as string]) {
          const hasRequiredError = newRowErrors[key as string].some(
            (e) => e.type === ErrorType.Required
          );

          if (hasRequiredError) {
            // Remove required errors but keep other types of errors
            const nonRequiredErrors = newRowErrors[key as string].filter(
              (e) => e.type !== ErrorType.Required
            );

            if (nonRequiredErrors.length === 0) {
              // If no other errors, delete the field's errors entirely
              delete newRowErrors[key as string];
            } else {
              // Otherwise keep non-required errors
              newRowErrors[key as string] = nonRequiredErrors;
            }
          }
        }

        // Now run full validation for the field (except for required which we already handled)
        const errors = validateFieldFromHook(
          processedValue,
          field as unknown as Field<T>
        ).filter((e) => e.type !== ErrorType.Required || isEmpty);

        // Update with new validation results
        if (errors.length > 0) {
          newRowErrors[key as string] = errors;
        } else if (!newRowErrors[key as string]) {
          // If no errors found and no existing errors, ensure field is removed from errors
          delete newRowErrors[key as string];
        }

        // Update the map
        if (Object.keys(newRowErrors).length > 0) {
          newMap.set(rowIndex, newRowErrors);
        } else {
          newMap.delete(rowIndex);
        }

        return newMap;
      });

      // Handle simple secondary effects here
      setTimeout(() => {
        // Use __index to find the actual row in the full data array
        const rowId = rowData.__index;

        // Handle company change - clear line/subline
        if (key === "company" && processedValue) {
          // Clear any existing line/subline values
          setData((prevData) => {
            const newData = [...prevData];
            const idx = newData.findIndex((item) => item.__index === rowId);
            if (idx >= 0) {
              newData[idx] = {
                ...newData[idx],
                line: undefined,
                subline: undefined,
              };
            }
            return newData;
          });
        }

        // Handle line change - clear subline
        if (key === "line" && processedValue) {
          // Clear any existing subline value
          setData((prevData) => {
            const newData = [...prevData];
            const idx = newData.findIndex((item) => item.__index === rowId);
            if (idx >= 0) {
              newData[idx] = {
                ...newData[idx],
                subline: undefined,
              };
            }
            return newData;
          });
        }
      }, 50);
    },
    [data, fields, validateFieldFromHook, setData, setValidationErrors]
  );

  // Improved revalidateRows function
  const revalidateRows = useCallback(
    async (
      rowIndexes: number[],
      updatedFields?: { [rowIndex: number]: string[] }
    ) => {
      // Process all specified rows using a single state update to avoid race conditions
      setValidationErrors((prev) => {
        const newErrors = new Map(prev);

        // Process each row
        for (const rowIndex of rowIndexes) {
          if (rowIndex < 0 || rowIndex >= data.length) continue;

          const row = data[rowIndex];
          if (!row) continue;

          // If we have specific fields to update for this row
          const fieldsToValidate = updatedFields?.[rowIndex] || [];

          if (fieldsToValidate.length > 0) {
            // Get existing errors for this row
            const existingRowErrors = { ...(newErrors.get(rowIndex) || {}) };

            // Validate each specified field
            for (const fieldKey of fieldsToValidate) {
              const field = fields.find((f) => String(f.key) === fieldKey);
              if (!field) continue;

              const value = row[fieldKey as keyof typeof row];

              // Run validation for this field
              const errors = validateFieldFromHook(value, field as unknown as Field<T>);

              // Update errors for this field
              if (errors.length > 0) {
                existingRowErrors[fieldKey] = errors;
              } else {
                delete existingRowErrors[fieldKey];
              }
            }

            // Update the row's errors
            if (Object.keys(existingRowErrors).length > 0) {
              newErrors.set(rowIndex, existingRowErrors);
            } else {
              newErrors.delete(rowIndex);
            }
          } else {
            // No specific fields provided - validate the entire row
            const rowErrors: Record<string, ValidationError[]> = {};

            // Validate all fields in the row
            for (const field of fields) {
              const fieldKey = String(field.key);
              const value = row[fieldKey as keyof typeof row];

              // Run validation for this field
              const errors = validateFieldFromHook(value, field as unknown as Field<T>);

              // Update errors for this field
              if (errors.length > 0) {
                rowErrors[fieldKey] = errors;
              }
            }

            // Update the row's errors
            if (Object.keys(rowErrors).length > 0) {
              newErrors.set(rowIndex, rowErrors);
            } else {
              newErrors.delete(rowIndex);
            }
          }
        }

        return newErrors;
      });
    },
    [data, fields, validateFieldFromHook]
  );

  // Copy a cell value to all cells below it in the same column
  const copyDown = useCallback(
    (rowIndex: number, key: T) => {
      // Get the source value to copy
      const sourceValue = data[rowIndex][key];

      // Update all rows below with the same value using the existing updateRow function
      // This ensures all validation logic runs consistently
      for (let i = rowIndex + 1; i < data.length; i++) {
        // Just use updateRow which will handle validation with proper timing
        updateRow(i, key, sourceValue);
      }
    },
    [data, updateRow]
  );

  return {
    validateRow,
    updateRow,
    revalidateRows,
    copyDown
  };
};
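The price-cleaning step inside `updateRow` (strip `$` and `,`, then round to two decimals when the remainder parses as a number) can be isolated into a pure helper using the same regex and rounding:

```typescript
// Same transformation updateRow applies to msrp / cost_each string values
function normalizePrice(value: string): string {
  const stripped = value.replace(/[$,]/g, "");
  const n = parseFloat(stripped);
  // Non-numeric input (e.g. "TBD") is passed through stripped but unrounded
  return isNaN(n) ? stripped : n.toFixed(2);
}

console.log(normalizePrice("$1,234.5")); // 1234.50
console.log(normalizePrice("$9.99"));    // 9.99
console.log(normalizePrice("TBD"));      // TBD
```

Worth noting for callers: the helper returns a string, not a number, so downstream comparisons and API payloads stay consistent with values typed directly by the user.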
@@ -0,0 +1,516 @@
import { useState, useCallback } from 'react';
import { toast } from 'sonner';
import { Template, RowData, TemplateState, getApiUrl } from './validationTypes';
import { RowSelectionState } from '@tanstack/react-table';
import { ValidationError } from '../../../types';

export const useTemplateManagement = <T extends string>(
  data: RowData<T>[],
  setData: React.Dispatch<React.SetStateAction<RowData<T>[]>>,
  rowSelection: RowSelectionState,
  setValidationErrors: React.Dispatch<React.SetStateAction<Map<number, Record<string, ValidationError[]>>>>,
  setRowValidationStatus: React.Dispatch<React.SetStateAction<Map<number, "pending" | "validating" | "validated" | "error">>>,
  validateRow: (rowIndex: number, specificField?: string) => void,
  isApplyingTemplateRef: React.MutableRefObject<boolean>,
  upcValidation: {
    validateUpc: (rowIndex: number, supplierId: string, upcValue: string) => Promise<{success: boolean, itemNumber?: string}>,
    applyItemNumbersToData: (onApplied?: (updatedRowIds: number[]) => void) => void
  },
  setValidatingCells?: React.Dispatch<React.SetStateAction<Set<string>>>
) => {
  // Template state
  const [templates, setTemplates] = useState<Template[]>([]);
  const [isLoadingTemplates, setIsLoadingTemplates] = useState(true);
  const [templateState, setTemplateState] = useState<TemplateState>({
    selectedTemplateId: null,
    showSaveTemplateDialog: false,
    newTemplateName: "",
    newTemplateType: "",
  });

  // Load templates
  const loadTemplates = useCallback(async () => {
    try {
      setIsLoadingTemplates(true);
      console.log("Fetching templates from:", `${getApiUrl()}/templates`);
      const response = await fetch(`${getApiUrl()}/templates`);
      if (!response.ok) throw new Error("Failed to fetch templates");
      const templateData = await response.json();
      const validTemplates = templateData.filter(
        (t: any) =>
          t && typeof t === "object" && t.id && t.company && t.product_type
      );
      setTemplates(validTemplates);
    } catch (error) {
      console.error("Error fetching templates:", error);
      toast.error("Failed to load templates");
    } finally {
      setIsLoadingTemplates(false);
    }
  }, []);

  // Refresh templates
  const refreshTemplates = useCallback(() => {
    loadTemplates();
  }, [loadTemplates]);

  // Save a new template
  const saveTemplate = useCallback(
    async (name: string, type: string) => {
      try {
        // Get selected rows
        const selectedRowIndex = Number(Object.keys(rowSelection)[0]);
        const selectedRow = data[selectedRowIndex];

        if (!selectedRow) {
          toast.error("Please select a row to create a template");
          return;
        }

        // Extract data for template, removing metadata fields
        const {
          __index,
          __template,
          __original,
          __corrected,
          __changes,
          ...templateData
        } = selectedRow as any;

        // Clean numeric values (remove $ from price fields)
        const cleanedData: Record<string, any> = {};

        // Process each key-value pair
        Object.entries(templateData).forEach(([key, value]) => {
          // Handle numeric values with dollar signs
          if (typeof value === "string" && value.includes("$")) {
            cleanedData[key] = value.replace(/[$,\s]/g, "").trim();
          }
          // Handle array values (like categories or ship_restrictions)
          else if (Array.isArray(value)) {
            cleanedData[key] = value;
          }
          // Handle other values
          else {
            cleanedData[key] = value;
          }
        });

        // Send the template to the API
        const response = await fetch(`${getApiUrl()}/templates`, {
          method: "POST",
          headers: {
            "Content-Type": "application/json",
          },
          body: JSON.stringify({
            ...cleanedData,
            company: name,
            product_type: type,
          }),
        });

        if (!response.ok) {
          const errorData = await response.json();
          throw new Error(
            errorData.error || errorData.details || "Failed to save template"
          );
        }

        // Get the new template from the response
        const newTemplate = await response.json();

        // Update the templates list with the new template
        setTemplates((prev) => [...prev, newTemplate]);

        // Update the row to show it's using this template
        setData((prev) => {
          const newData = [...prev];
          if (newData[selectedRowIndex]) {
            newData[selectedRowIndex] = {
              ...newData[selectedRowIndex],
              __template: newTemplate.id.toString(),
            };
          }
          return newData;
        });

        toast.success(`Template "${name}" saved successfully`);
|
||||
|
||||
// Reset dialog state
|
||||
setTemplateState((prev) => ({
|
||||
...prev,
|
||||
showSaveTemplateDialog: false,
|
||||
newTemplateName: "",
|
||||
newTemplateType: "",
|
||||
}));
|
||||
} catch (error) {
|
||||
console.error("Error saving template:", error);
|
||||
toast.error(
|
||||
error instanceof Error ? error.message : "Failed to save template"
|
||||
);
|
||||
}
|
||||
},
|
||||
[data, rowSelection, setData]
|
||||
);
|
||||
|
||||
  // Apply template to rows - optimized version
  const applyTemplate = useCallback(
    (templateId: string, rowIndexes: number[]) => {
      const template = templates.find((t) => t.id.toString() === templateId);

      if (!template) {
        toast.error("Template not found");
        return;
      }

      console.log(`Applying template ${templateId} to rows:`, rowIndexes);

      // Validate row indexes
      const validRowIndexes = rowIndexes.filter(
        (index) => index >= 0 && index < data.length && Number.isInteger(index)
      );

      if (validRowIndexes.length === 0) {
        toast.error("No valid rows to update");
        console.error("Invalid row indexes:", rowIndexes);
        return;
      }

      // Set the template application flag
      isApplyingTemplateRef.current = true;

      // Save scroll position
      const scrollPosition = {
        left: window.scrollX,
        top: window.scrollY,
      };

      // Create a copy of data and process all rows at once to minimize state updates
      const newData = [...data];
      const batchErrors = new Map<number, Record<string, ValidationError[]>>();
      const batchStatuses = new Map<
        number,
        "pending" | "validating" | "validated" | "error"
      >();

      // Extract template fields once outside the loop
      const templateFields = Object.entries(template).filter(
        ([key]) =>
          ![
            "id",
            "__meta",
            "__template",
            "__original",
            "__corrected",
            "__changes",
          ].includes(key)
      );

      // Apply template to each valid row
      validRowIndexes.forEach((index) => {
        // Create a new row with template values
        const originalRow = newData[index];
        const updatedRow = { ...originalRow } as Record<string, any>;

        // Apply template fields (excluding metadata fields)
        for (const [key, value] of templateFields) {
          updatedRow[key] = value;
        }

        // Mark the row as using this template
        updatedRow.__template = templateId;

        // Update the row in the data array
        newData[index] = updatedRow as RowData<T>;

        // Clear validation errors and mark as validated
        batchErrors.set(index, {});
        batchStatuses.set(index, "validated");
      });

      // Check which rows need UPC validation
      const upcValidationRows = validRowIndexes.filter((rowIndex) => {
        const row = newData[rowIndex];
        return row && row.upc && row.supplier;
      });

      // Perform a single update for all rows
      setData(newData);

      // Update all validation errors and statuses at once
      setValidationErrors((prev) => {
        const newErrors = new Map(prev);
        for (const [rowIndex, errors] of batchErrors.entries()) {
          newErrors.set(rowIndex, errors);
        }
        return newErrors;
      });

      setRowValidationStatus((prev) => {
        const newStatus = new Map(prev);
        for (const [rowIndex, status] of batchStatuses.entries()) {
          newStatus.set(rowIndex, status);
        }
        return newStatus;
      });

      // Restore scroll position
      requestAnimationFrame(() => {
        window.scrollTo(scrollPosition.left, scrollPosition.top);
      });

      // Show success toast
      if (validRowIndexes.length === 1) {
        toast.success("Template applied");
      } else if (validRowIndexes.length > 1) {
        toast.success(`Template applied to ${validRowIndexes.length} rows`);
      }

      // Reset template application flag to allow validation
      isApplyingTemplateRef.current = false;

      // If there are rows with both UPC and supplier, validate them
      if (upcValidationRows.length > 0) {
        console.log(`Validating UPCs for ${upcValidationRows.length} rows after template application`);

        // Process each row sequentially - this mimics the exact manual edit behavior
        const processNextValidation = (index = 0) => {
          if (index >= upcValidationRows.length) {
            return; // All rows processed
          }

          const rowIndex = upcValidationRows[index];
          const row = newData[rowIndex];

          if (row && row.supplier && row.upc) {
            // The EXACT implementation from handleUpdateRow when supplier is edited manually:

            // 1. Mark the item_number cell as being validated - THIS IS CRITICAL FOR LOADING STATE
            const cellKey = `${rowIndex}-item_number`;

            // Clear validation errors for this field
            setValidationErrors(prev => {
              const newErrors = new Map(prev);
              if (newErrors.has(rowIndex)) {
                const rowErrors = { ...newErrors.get(rowIndex) };
                if (rowErrors.item_number) {
                  delete rowErrors.item_number;
                }
                newErrors.set(rowIndex, rowErrors);
              }
              return newErrors;
            });

            // Set loading state - using setValidatingCells from props
            if (setValidatingCells) {
              setValidatingCells(prev => {
                const newSet = new Set(prev);
                newSet.add(cellKey);
                return newSet;
              });
            }

            // Validate UPC for this row
            upcValidation.validateUpc(rowIndex, row.supplier.toString(), row.upc.toString())
              .then(result => {
                if (result.success && result.itemNumber) {
                  // CRITICAL FIX: Directly update data with the item number to ensure immediate UI update
                  setData(prevData => {
                    const newData = [...prevData];

                    // Update this specific row with the item number
                    if (rowIndex >= 0 && rowIndex < newData.length) {
                      newData[rowIndex] = {
                        ...newData[rowIndex],
                        item_number: result.itemNumber
                      };
                    }

                    return newData;
                  });

                  // Also trigger other relevant updates
                  upcValidation.applyItemNumbersToData();

                  // Mark for revalidation after item numbers are updated
                  setTimeout(() => {
                    // Validate the row EXACTLY like in manual edit
                    validateRow(rowIndex, 'item_number');

                    // CRITICAL FIX: Make one final check to ensure data is correct
                    setTimeout(() => {
                      // Get the current item number from the data
                      const currentItemNumber = (() => {
                        try {
                          const dataAtThisPointInTime = data[rowIndex];
                          return dataAtThisPointInTime?.item_number;
                        } catch (e) {
                          return undefined;
                        }
                      })();

                      // If the data is wrong at this point, fix it directly
                      if (currentItemNumber !== result.itemNumber) {
                        // Directly update the data to fix the issue
                        setData(dataRightNow => {
                          const fixedData = [...dataRightNow];
                          if (rowIndex >= 0 && rowIndex < fixedData.length) {
                            fixedData[rowIndex] = {
                              ...fixedData[rowIndex],
                              item_number: result.itemNumber
                            };
                          }
                          return fixedData;
                        });

                        // Then do a force update after a brief delay
                        setTimeout(() => {
                          setData(currentData => {
                            // Critical fix: ensure the item number is correct
                            if (currentData[rowIndex] && currentData[rowIndex].item_number !== result.itemNumber) {
                              // Create a completely new array with the correct item number
                              const fixedData = [...currentData];
                              fixedData[rowIndex] = {
                                ...fixedData[rowIndex],
                                item_number: result.itemNumber
                              };
                              return fixedData;
                            }

                            // Create a completely new array
                            return [...currentData];
                          });
                        }, 20);
                      } else {
                        // Item number is already correct, just do the force update
                        setData(currentData => {
                          // Create a completely new array
                          return [...currentData];
                        });
                      }
                    }, 50);

                    // Clear loading state
                    if (setValidatingCells) {
                      setValidatingCells(prev => {
                        const newSet = new Set(prev);
                        newSet.delete(cellKey);
                        return newSet;
                      });
                    }

                    // Continue to next row after validation is complete
                    setTimeout(() => processNextValidation(index + 1), 100);
                  }, 100);
                } else {
                  // Clear loading state on failure
                  if (setValidatingCells) {
                    setValidatingCells(prev => {
                      const newSet = new Set(prev);
                      newSet.delete(cellKey);
                      return newSet;
                    });
                  }

                  // Continue to next row if validation fails
                  setTimeout(() => processNextValidation(index + 1), 100);
                }
              })
              .catch(err => {
                console.error(`Error validating UPC for row ${rowIndex}:`, err);

                // Clear loading state on error
                if (setValidatingCells) {
                  setValidatingCells(prev => {
                    const newSet = new Set(prev);
                    newSet.delete(cellKey);
                    return newSet;
                  });
                }

                // Continue to next row despite error
                setTimeout(() => processNextValidation(index + 1), 100);
              });
          } else {
            // Skip this row and continue to the next
            processNextValidation(index + 1);
          }
        };

        // Start processing validations
        processNextValidation();
      }
    },
    [
      data,
      templates,
      setData,
      setValidationErrors,
      setRowValidationStatus,
      validateRow,
      upcValidation,
      setValidatingCells
    ]
  );

  // Apply template to selected rows
  const applyTemplateToSelected = useCallback(
    (templateId: string) => {
      if (!templateId) return;

      // Update the selected template ID
      setTemplateState((prev) => ({
        ...prev,
        selectedTemplateId: templateId,
      }));

      // Get selected row keys (which may be UUIDs)
      const selectedKeys = Object.entries(rowSelection)
        .filter(([_, selected]) => selected === true)
        .map(([key, _]) => key);

      console.log("Selected row keys:", selectedKeys);

      if (selectedKeys.length === 0) {
        toast.error("No rows selected");
        return;
      }

      // Map UUID keys to array indices
      const selectedIndexes = selectedKeys
        .map((key) => {
          // Find the matching row index in the data array
          const index = data.findIndex(
            (row) =>
              (row.__index && row.__index === key) || // Match by __index
              String(data.indexOf(row)) === key // Or by numeric index
          );
          return index;
        })
        .filter((index) => index !== -1); // Filter out any not found

      console.log("Mapped row indices:", selectedIndexes);

      if (selectedIndexes.length === 0) {
        toast.error("Could not find selected rows");
        return;
      }

      // Apply template to selected rows
      applyTemplate(templateId, selectedIndexes);
    },
    [rowSelection, applyTemplate, setTemplateState, data]
  );

  return {
    templates,
    isLoadingTemplates,
    templateState,
    setTemplateState,
    loadTemplates,
    refreshTemplates,
    saveTemplate,
    applyTemplate,
    applyTemplateToSelected
  };
};
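The value-cleaning step in `saveTemplate` only treats strings containing a `$` as money and strips dollar signs, commas, and whitespace before sending the template to the API. A minimal standalone sketch of that branch (the helper name `cleanMoneyString` is hypothetical; the regex is taken from the hook):

```typescript
// Hypothetical standalone version of the cleaning branch in saveTemplate.
// Strings containing "$" are treated as money and stripped of dollar
// signs, commas, and whitespace; everything else passes through as-is.
function cleanMoneyString(value: string): string {
  if (value.includes("$")) {
    return value.replace(/[$,\s]/g, "").trim();
  }
  return value;
}

console.log(cleanMoneyString("$1,234.50")); // "1234.50"
console.log(cleanMoneyString("plain text")); // unchanged
```

Note that the comma stripping means the server receives `"1234.50"` rather than a locale-formatted string, which keeps numeric fields parseable on the backend.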
@@ -1,263 +0,0 @@
import { useState, useEffect, useCallback } from 'react'
import { RowSelectionState } from '@tanstack/react-table'
import { Template, RowData, getApiUrl } from './useValidationState'

interface TemplateState {
  selectedTemplateId: string | null
  showSaveTemplateDialog: boolean
  newTemplateName: string
  newTemplateType: string
}

export const useTemplates = <T extends string>(
  data: RowData<T>[],
  setData: React.Dispatch<React.SetStateAction<RowData<T>[]>>,
  toast: any,
  rowSelection: RowSelectionState
) => {
  const [templates, setTemplates] = useState<Template[]>([])
  const [templateState, setTemplateState] = useState<TemplateState>({
    selectedTemplateId: null,
    showSaveTemplateDialog: false,
    newTemplateName: '',
    newTemplateType: '',
  })

  // Load templates from API
  const loadTemplates = useCallback(async () => {
    try {
      console.log('Fetching templates...');
      const response = await fetch(`${getApiUrl()}/templates`)
      console.log('Templates response status:', response.status);

      if (!response.ok) throw new Error('Failed to fetch templates')

      const templateData = await response.json()
      console.log('Templates fetched successfully:', templateData);

      // Validate template data
      const validTemplates = templateData.filter((t: any) =>
        t && typeof t === 'object' && t.id && t.company && t.product_type
      );

      if (validTemplates.length !== templateData.length) {
        console.warn('Some templates were filtered out due to invalid data', {
          original: templateData.length,
          valid: validTemplates.length
        });
      }

      setTemplates(validTemplates)
    } catch (error) {
      console.error('Error loading templates:', error)
      toast({
        title: 'Error',
        description: 'Failed to load templates',
      })
    }
  }, [toast])

  // Save a new template based on selected rows
  const saveTemplate = useCallback(async (name: string, type: string) => {
    try {
      // Get selected rows
      const selectedRows = Object.keys(rowSelection)
        .map(index => data[parseInt(index)])
        .filter(Boolean)

      if (selectedRows.length === 0) {
        toast({
          title: 'Error',
          description: 'Please select at least one row to create a template',
        })
        return
      }

      // Create template based on selected rows
      const template: Template = {
        id: Date.now(), // Temporary ID, will be replaced by server
        company: selectedRows[0].company as string || '',
        product_type: type,
        ...selectedRows[0], // Copy all fields from the first selected row
      }

      // Remove metadata fields
      delete (template as any).__meta
      delete (template as any).__template
      delete (template as any).__original
      delete (template as any).__corrected
      delete (template as any).__changes

      // Send to API
      const response = await fetch(`${getApiUrl()}/templates`, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          company: template.company,
          product_type: type,
          ...Object.fromEntries(
            Object.entries(template).filter(([key]) =>
              !['company', 'product_type'].includes(key)
            )
          )
        }),
      })

      if (!response.ok) {
        throw new Error('Failed to save template')
      }

      // Reload templates to get the server-generated ID
      await loadTemplates()

      toast({
        title: 'Success',
        description: `Template "${name}" saved successfully`,
      })

      // Reset dialog state
      setTemplateState(prev => ({
        ...prev,
        showSaveTemplateDialog: false,
        newTemplateName: '',
        newTemplateType: '',
      }))
    } catch (error) {
      console.error('Error saving template:', error)
      toast({
        title: 'Error',
        description: 'Failed to save template',
      })
    }
  }, [data, rowSelection, toast, loadTemplates])

  // Apply a template to selected rows
  const applyTemplate = useCallback((templateId: string, rowIndexes: number[]) => {
    const template = templates.find(t => t.id.toString() === templateId)

    if (!template) {
      toast({
        title: 'Error',
        description: 'Template not found',
      })
      return
    }

    setData(prevData => {
      const newData = [...prevData]

      rowIndexes.forEach(index => {
        if (index >= 0 && index < newData.length) {
          // Create a new row with template values
          const updatedRow = { ...newData[index] }

          // Apply template fields (excluding metadata and ID fields)
          Object.entries(template).forEach(([key, value]) => {
            if (!['id', 'company', 'product_type', 'created_at', 'updated_at'].includes(key)) {
              // Handle numeric values that might be stored as strings
              if (typeof value === 'string' && /^\d+(\.\d+)?$/.test(value)) {
                // If it's a price field, add the dollar sign
                if (['msrp', 'cost_each'].includes(key)) {
                  updatedRow[key as keyof typeof updatedRow] = `$${value}` as any;
                } else {
                  updatedRow[key as keyof typeof updatedRow] = value as any;
                }
              }
              // Special handling for array fields like categories and ship_restrictions
              else if (key === 'categories' || key === 'ship_restrictions') {
                if (Array.isArray(value)) {
                  updatedRow[key as keyof typeof updatedRow] = value as any;
                } else if (typeof value === 'string') {
                  try {
                    // Try to parse as JSON if it's a JSON string
                    if (value.startsWith('[') && value.endsWith(']')) {
                      const parsed = JSON.parse(value);
                      updatedRow[key as keyof typeof updatedRow] = parsed as any;
                    }
                    // Otherwise, it might be a PostgreSQL array format like {val1,val2}
                    else if (value.startsWith('{') && value.endsWith('}')) {
                      const parsed = value.slice(1, -1).split(',');
                      updatedRow[key as keyof typeof updatedRow] = parsed as any;
                    }
                    // If it's a single value, wrap it in an array
                    else {
                      updatedRow[key as keyof typeof updatedRow] = [value] as any;
                    }
                  } catch (error) {
                    console.error(`Error parsing ${key}:`, error);
                    // If parsing fails, use as-is
                    updatedRow[key as keyof typeof updatedRow] = value as any;
                  }
                } else {
                  updatedRow[key as keyof typeof updatedRow] = value as any;
                }
              } else {
                updatedRow[key as keyof typeof updatedRow] = value as any;
              }
            }
          })

          // Mark the row as using this template
          updatedRow.__template = templateId

          // Update the row in the data array
          newData[index] = updatedRow
        }
      })

      return newData
    })

    toast({
      title: 'Success',
      description: `Template applied to ${rowIndexes.length} row(s)`,
    })
  }, [templates, setData, toast])

  // Get display text for a template
  const getTemplateDisplayText = useCallback((templateId: string | null) => {
    if (!templateId) return 'Select a template'

    const template = templates.find(t => t.id.toString() === templateId)
    return template
      ? `${template.company} - ${template.product_type}`
      : 'Unknown template'
  }, [templates])

  // Load templates on component mount and set up refresh event listener
  useEffect(() => {
    loadTemplates()

    // Add event listener for template refresh
    const handleRefreshTemplates = () => {
      loadTemplates()
    }

    window.addEventListener('refresh-templates', handleRefreshTemplates)

    // Clean up event listener
    return () => {
      window.removeEventListener('refresh-templates', handleRefreshTemplates)
    }
  }, [loadTemplates])

  return {
    templates,
    selectedTemplateId: templateState.selectedTemplateId,
    showSaveTemplateDialog: templateState.showSaveTemplateDialog,
    newTemplateName: templateState.newTemplateName,
    newTemplateType: templateState.newTemplateType,
    setTemplateState,
    loadTemplates,
    saveTemplate,
    applyTemplate,
    getTemplateDisplayText,
    // Helper method to apply to selected rows
    applyTemplateToSelected: (templateId: string) => {
      const selectedIndexes = Object.keys(rowSelection).map(i => parseInt(i))
      applyTemplate(templateId, selectedIndexes)
    }
  }
}
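The deleted hook's `applyTemplate` normalized array fields (`categories`, `ship_restrictions`) arriving in three shapes: a real array, a JSON array string, or a PostgreSQL array literal like `{val1,val2}`. The branching can be sketched as a pure helper (the name `parseArrayField` is hypothetical; the branch conditions mirror the code above):

```typescript
// Hypothetical helper mirroring the array-field branches in the old
// applyTemplate: JSON strings, PostgreSQL-style "{a,b}" literals, and
// bare values are all normalized to a plain array.
function parseArrayField(value: unknown): unknown[] {
  if (Array.isArray(value)) return value;
  if (typeof value === "string") {
    if (value.startsWith("[") && value.endsWith("]")) {
      try {
        return JSON.parse(value); // JSON array string
      } catch {
        return [value]; // fall back to wrapping the raw string
      }
    }
    if (value.startsWith("{") && value.endsWith("}")) {
      return value.slice(1, -1).split(","); // PostgreSQL array literal
    }
    return [value]; // single bare value, wrapped in an array
  }
  return [];
}
```

Note the Postgres branch is a naive split: quoted elements containing commas would be mangled, which the original code also did not handle.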
@@ -0,0 +1,151 @@
import { useCallback } from 'react';
import { RowData } from './validationTypes';
import type { Fields } from '../../../types';
import { ErrorSources, ErrorType, ValidationError } from '../../../types';

export const useUniqueItemNumbersValidation = <T extends string>(
  data: RowData<T>[],
  fields: Fields<T>,
  setValidationErrors: React.Dispatch<React.SetStateAction<Map<number, Record<string, ValidationError[]>>>>
) => {
  // Update validateUniqueItemNumbers to also check for uniqueness of UPC/barcode
  const validateUniqueItemNumbers = useCallback(async () => {
    console.log("Validating unique fields");

    // Skip if no data
    if (!data.length) return;

    // Track unique identifiers in maps
    const uniqueFieldsMap = new Map<string, Map<string, number[]>>();

    // Find fields that need uniqueness validation
    const uniqueFields = fields
      .filter((field) => field.validations?.some((v) => v.rule === "unique"))
      .map((field) => String(field.key));

    console.log(
      `Found ${uniqueFields.length} fields requiring uniqueness validation:`,
      uniqueFields
    );

    // Always check item_number uniqueness even if not explicitly defined
    if (!uniqueFields.includes("item_number")) {
      uniqueFields.push("item_number");
    }

    // Initialize maps for each unique field
    uniqueFields.forEach((fieldKey) => {
      uniqueFieldsMap.set(fieldKey, new Map<string, number[]>());
    });

    // Initialize batch updates
    const errors = new Map<number, Record<string, ValidationError[]>>();

    // Single pass through data to identify all unique values
    data.forEach((row, index) => {
      uniqueFields.forEach((fieldKey) => {
        const value = row[fieldKey as keyof typeof row];

        // Skip empty values
        if (value === undefined || value === null || value === "") {
          return;
        }

        const valueStr = String(value);
        const fieldMap = uniqueFieldsMap.get(fieldKey);

        if (fieldMap) {
          // Get or initialize the array of indices for this value
          const indices = fieldMap.get(valueStr) || [];
          indices.push(index);
          fieldMap.set(valueStr, indices);
        }
      });
    });

    // Process duplicates
    uniqueFields.forEach((fieldKey) => {
      const fieldMap = uniqueFieldsMap.get(fieldKey);
      if (!fieldMap) return;

      fieldMap.forEach((indices, value) => {
        // Only process if there are duplicates
        if (indices.length > 1) {
          // Get the validation rule for this field
          const field = fields.find((f) => String(f.key) === fieldKey);
          const validationRule = field?.validations?.find(
            (v) => v.rule === "unique"
          );

          const errorObj = {
            message:
              validationRule?.errorMessage || `Duplicate ${fieldKey}: ${value}`,
            level: validationRule?.level || ("error" as "error"),
            source: ErrorSources.Table,
            type: ErrorType.Unique,
          };

          // Add error to each row with this value
          indices.forEach((rowIndex) => {
            const rowErrors = errors.get(rowIndex) || {};
            rowErrors[fieldKey] = [errorObj];
            errors.set(rowIndex, rowErrors);
          });
        }
      });
    });

    // Apply batch updates only if we have errors to report
    if (errors.size > 0) {
      // OPTIMIZATION: Check if we actually have new errors before updating state
      let hasChanges = false;

      // We'll update errors with a single batch operation
      setValidationErrors((prev) => {
        const newMap = new Map(prev);

        // Check each row for changes
        errors.forEach((rowErrors, rowIndex) => {
          const existingErrors = newMap.get(rowIndex) || {};
          const updatedErrors = { ...existingErrors };
          let rowHasChanges = false;

          // Check each field for changes
          Object.entries(rowErrors).forEach(([fieldKey, fieldErrors]) => {
            // Compare with existing errors
            const existingFieldErrors = existingErrors[fieldKey];

            if (
              !existingFieldErrors ||
              existingFieldErrors.length !== fieldErrors.length ||
              !existingFieldErrors.every(
                (err, idx) =>
                  err.message === fieldErrors[idx].message &&
                  err.type === fieldErrors[idx].type
              )
            ) {
              // We have a change
              updatedErrors[fieldKey] = fieldErrors;
              rowHasChanges = true;
              hasChanges = true;
            }
          });

          // Only update if we have changes
          if (rowHasChanges) {
            newMap.set(rowIndex, updatedErrors);
          }
        });

        // Only return a new map if we have changes
        return hasChanges ? newMap : prev;
      });
    }

    console.log("Uniqueness validation complete");
  }, [data, fields, setValidationErrors]);

  return {
    validateUniqueItemNumbers
  };
};
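The core of `validateUniqueItemNumbers` is a single pass that groups row indices by field value, skipping empties, then flags any value that appears on more than one row. That grouping step can be sketched in isolation (the helper name `findDuplicates` is hypothetical; the empty-value check matches the hook):

```typescript
// Minimal sketch of the single-pass duplicate detection used by
// validateUniqueItemNumbers: group row indices by value, skip empty
// values, and keep only values that occur on more than one row.
function findDuplicates(values: Array<string | null | undefined>): Map<string, number[]> {
  const byValue = new Map<string, number[]>();
  values.forEach((value, index) => {
    // Skip empty values, exactly as the hook does
    if (value === undefined || value === null || value === "") return;
    const indices = byValue.get(value) ?? [];
    indices.push(index);
    byValue.set(value, indices);
  });
  // Keep only values appearing on more than one row
  return new Map(
    Array.from(byValue.entries()).filter(([, indices]) => indices.length > 1)
  );
}
```

One pass plus one filter keeps the check O(n) per field, which matters since the hook runs it for every unique field on every validation cycle.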
@@ -0,0 +1,131 @@
|
||||
import { useCallback } from 'react';
|
||||
import type { Fields } from '../../../types';
|
||||
import { ErrorSources, ErrorType } from '../../../types';
|
||||
import { RowData, InfoWithSource, isEmpty } from './validationTypes';
|
||||
|
||||
export const useUniqueValidation = <T extends string>(
|
||||
fields: Fields<T>
|
||||
) => {
|
||||
// Additional function to explicitly validate uniqueness for specified fields
|
||||
const validateUniqueField = useCallback((data: RowData<T>[], fieldKey: string) => {
|
||||
// Field keys that need special handling for uniqueness
|
||||
const uniquenessFields = ['item_number', 'upc', 'barcode', 'supplier_no', 'notions_no'];
|
||||
|
||||
// If the field doesn't need uniqueness validation, return empty errors
|
||||
if (!uniquenessFields.includes(fieldKey)) {
|
||||
const field = fields.find(f => String(f.key) === fieldKey);
|
||||
if (!field || !field.validations?.some(v => v.rule === 'unique')) {
|
||||
return new Map<number, Record<string, InfoWithSource>>();
|
||||
}
|
||||
}
|
||||
|
||||
// Create map to track errors
|
||||
const uniqueErrors = new Map<number, Record<string, InfoWithSource>>();
|
||||
|
||||
// Find the field definition
|
||||
const field = fields.find(f => String(f.key) === fieldKey);
|
||||
if (!field) return uniqueErrors;
|
||||
|
||||
// Get validation properties
|
||||
const validation = field.validations?.find(v => v.rule === 'unique');
|
||||
const allowEmpty = validation?.allowEmpty ?? false;
|
||||
const errorMessage = validation?.errorMessage || `${field.label} must be unique`;
|
||||
const level = validation?.level || 'error';
|
||||
|
||||
// Track values for uniqueness check
|
||||
const valueMap = new Map<string, number[]>();
|
||||
|
||||
// Build value map
|
||||
data.forEach((row, rowIndex) => {
|
||||
const value = String(row[fieldKey as keyof typeof row] || '');
|
||||
|
||||
// Skip empty values if allowed
|
||||
if (allowEmpty && isEmpty(value)) {
|
||||
return;
|
||||
}
|
||||
|
||||
if (!valueMap.has(value)) {
|
||||
valueMap.set(value, [rowIndex]);
|
||||
} else {
|
||||
valueMap.get(value)?.push(rowIndex);
|
      }
    });

    // Add errors for duplicate values
    valueMap.forEach((rowIndexes, value) => {
      if (rowIndexes.length > 1) {
        // Skip empty values
        if (!value || value.trim() === '') return;

        // Add error to all duplicate rows
        rowIndexes.forEach(rowIndex => {
          // Create errors object if needed
          if (!uniqueErrors.has(rowIndex)) {
            uniqueErrors.set(rowIndex, {});
          }

          // Add error for this field
          uniqueErrors.get(rowIndex)![fieldKey] = {
            message: errorMessage,
            level: level as 'info' | 'warning' | 'error',
            source: ErrorSources.Table,
            type: ErrorType.Unique
          };
        });
      }
    });

    return uniqueErrors;
  }, [fields]);

  // Validate uniqueness for multiple fields
  const validateUniqueFields = useCallback((data: RowData<T>[], fieldKeys: string[]) => {
    // Process each field and merge results
    const allErrors = new Map<number, Record<string, InfoWithSource>>();

    fieldKeys.forEach(fieldKey => {
      const fieldErrors = validateUniqueField(data, fieldKey);

      // Merge errors
      fieldErrors.forEach((errors, rowIdx) => {
        if (!allErrors.has(rowIdx)) {
          allErrors.set(rowIdx, {});
        }
        Object.assign(allErrors.get(rowIdx)!, errors);
      });
    });

    return allErrors;
  }, [validateUniqueField]);

  // Run complete validation for uniqueness
  const validateAllUniqueFields = useCallback((data: RowData<T>[]) => {
    // Get fields requiring uniqueness validation
    const uniqueFields = fields
      .filter(field => field.validations?.some(v => v.rule === 'unique'))
      .map(field => String(field.key));

    // Also add standard unique fields that might not be explicitly marked as unique
    const standardUniqueFields = ['item_number', 'upc', 'barcode', 'supplier_no', 'notions_no'];

    // Combine all fields that need uniqueness validation
    const allUniqueFieldKeys = [...new Set([
      ...uniqueFields,
      ...standardUniqueFields
    ])];

    // Filter to only fields that exist in the data
    const existingFields = allUniqueFieldKeys.filter(fieldKey =>
      data.some(row => fieldKey in row)
    );

    // Validate all fields at once
    return validateUniqueFields(data, existingFields);
  }, [fields, validateUniqueFields]);

  return {
    validateUniqueField,
    validateUniqueFields,
    validateAllUniqueFields
  };
};
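The duplicate-detection pass above can be sketched as a standalone function. This is a minimal illustration of the value-map technique, not the hook's actual API; `findDuplicateRows` and its return shape are hypothetical names chosen for the example.

```typescript
// Group row indexes by field value, then flag every group larger than one.
// This mirrors the valueMap -> rowIndexes.length > 1 pattern used above.
function findDuplicateRows(
  rows: Record<string, unknown>[],
  fieldKey: string,
): Map<number, string> {
  const valueMap = new Map<string, number[]>();
  rows.forEach((row, rowIndex) => {
    const value = String(row[fieldKey] ?? '');
    if (!value.trim()) return; // skip empty values
    const indexes = valueMap.get(value) ?? [];
    indexes.push(rowIndex);
    valueMap.set(value, indexes);
  });

  const errors = new Map<number, string>();
  valueMap.forEach((rowIndexes, value) => {
    if (rowIndexes.length > 1) {
      // Every row sharing the value gets the same error
      rowIndexes.forEach(i => errors.set(i, `Duplicate value "${value}"`));
    }
  });
  return errors;
}

const rows = [{ upc: '123' }, { upc: '456' }, { upc: '123' }];
const dupes = findDuplicateRows(rows, 'upc');
// rows 0 and 2 share UPC "123", so both are flagged
```

Building the value map first makes the check O(n) over the data rather than O(n²) pairwise comparisons, which matters for large imports.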
@@ -30,13 +30,6 @@ export const useUpcValidation (
  const processedUpcMapRef = useRef(new Map<string, string>());
  const initialUpcValidationDoneRef = useRef(false);

  // For batch validation
  const validationQueueRef = useRef<Array<{rowIndex: number, supplierId: string, upcValue: string}>>([]);
  const isProcessingBatchRef = useRef(false);

  // For validation results
  const [upcValidationResults] = useState<Map<number, { itemNumber: string }>>(new Map());

  // Helper to create cell key
  const getCellKey = (rowIndex: number, fieldKey: string) => `${rowIndex}-${fieldKey}`;
@@ -56,17 +49,40 @@ export const useUpcValidation (
  // Update item number
  const updateItemNumber = useCallback((rowIndex: number, itemNumber: string) => {
    console.log(`Setting item number for row ${rowIndex} to ${itemNumber}`);
    validationStateRef.current.itemNumbers.set(rowIndex, itemNumber);
    setItemNumberUpdates(new Map(validationStateRef.current.itemNumbers));
  }, []);

  // Mark a row as being validated
  const startValidatingRow = useCallback((rowIndex: number) => {
    validationStateRef.current.validatingRows.add(rowIndex);
    setValidatingRows(new Set(validationStateRef.current.validatingRows));
    setIsValidatingUpc(true);
  }, []);
    // CRITICAL: Update BOTH the data state and the ref
    // First, update the data directly to ensure UI consistency
    setData(prevData => {
      // Create a new copy of the data
      const newData = [...prevData];

      // Only update if the row exists
      if (rowIndex >= 0 && rowIndex < newData.length) {
        // First, we need a new object reference for the row to force a re-render
        newData[rowIndex] = {
          ...newData[rowIndex],
          item_number: itemNumber
        };
      }

      return newData;
    });

    // Also update the itemNumbers map AFTER the data is updated
    // This ensures the map reflects the current state of the data
    setTimeout(() => {
      // Update the ref with the same value
      validationStateRef.current.itemNumbers.set(rowIndex, itemNumber);

      // CRITICAL: Force a React state update to ensure all components re-render
      // Create a brand new Map object to ensure React detects the change
      const newItemNumbersMap = new Map(validationStateRef.current.itemNumbers);
      setItemNumberUpdates(newItemNumbersMap);

      // Force an immediate React render cycle by triggering state updates
      setValidatingCellKeys(new Set(validationStateRef.current.validatingCells));
      setValidatingRows(new Set(validationStateRef.current.validatingRows));
    }, 0);
  }, [setData]);

  // Mark a row as no longer being validated
  const stopValidatingRow = useCallback((rowIndex: number) => {
@@ -139,11 +155,22 @@ export const useUpcValidation (
    );
    previousKeys.forEach(key => validationStateRef.current.activeValidations.delete(key));

    // Start validation - track this with the ref to avoid race conditions
    startValidatingRow(rowIndex);
    startValidatingCell(rowIndex, 'item_number');
    // Log validation start to help debug template issues
    console.log(`[UPC-DEBUG] Starting UPC validation for row ${rowIndex} with supplier ${supplierId}, upc ${upcValue}`);

    console.log(`Validating UPC: rowIndex=${rowIndex}, supplierId=${supplierId}, upc=${upcValue}`);
    // IMPORTANT: Set validation state using setState to FORCE UI updates
    validationStateRef.current.validatingRows.add(rowIndex);
    setValidatingRows(new Set(validationStateRef.current.validatingRows));
    setIsValidatingUpc(true);

    // Start cell validation and explicitly update UI via setState
    const cellKey = getCellKey(rowIndex, 'item_number');
    validationStateRef.current.validatingCells.add(cellKey);
    setValidatingCellKeys(new Set(validationStateRef.current.validatingCells));

    console.log(`[UPC-DEBUG] Set loading state for row ${rowIndex}, cell key ${cellKey}`);
    console.log(`[UPC-DEBUG] Current validating rows: ${Array.from(validationStateRef.current.validatingRows).join(', ')}`);
    console.log(`[UPC-DEBUG] Current validating cells: ${Array.from(validationStateRef.current.validatingCells).join(', ')}`);

    try {
      // Create a unique key for this validation to track it
@@ -164,18 +191,43 @@ export const useUpcValidation (
      });

      // Fetch the product by UPC
      console.log(`[UPC-DEBUG] Fetching product data for UPC ${upcValue} with supplier ${supplierId}`);
      const product = await fetchProductByUpc(supplierId, upcValue);
      console.log(`[UPC-DEBUG] Fetch complete for row ${rowIndex}, success: ${!product.error}`);

      // Check if this validation is still relevant (hasn't been superseded by another)
      if (!validationStateRef.current.activeValidations.has(validationKey)) {
        console.log(`Validation ${validationKey} was cancelled`);
        console.log(`[UPC-DEBUG] Validation ${validationKey} was cancelled`);
        return { success: false };
      }

      // Extract the item number from the API response - check for !error since API returns { error: boolean, data: any }
      if (product && !product.error && product.data?.itemNumber) {
        // Store this validation result
        updateItemNumber(rowIndex, product.data.itemNumber);
        console.log(`[UPC-DEBUG] Got item number for row ${rowIndex}: ${product.data.itemNumber}`);

        // CRITICAL FIX: Directly update the data with the new item number first
        setData(prevData => {
          const newData = [...prevData];
          if (rowIndex >= 0 && rowIndex < newData.length) {
            // This should happen before updating the map
            newData[rowIndex] = {
              ...newData[rowIndex],
              item_number: product.data.itemNumber
            };
          }
          return newData;
        });

        // Then, update the map to match what's now in the data
        validationStateRef.current.itemNumbers.set(rowIndex, product.data.itemNumber);

        // CRITICAL: Force a React state update to ensure all components re-render
        // Create a brand new Map object to ensure React detects the change
        const newItemNumbersMap = new Map(validationStateRef.current.itemNumbers);
        setItemNumberUpdates(newItemNumbersMap);

        // Force a shallow copy of the itemNumbers map to trigger useEffect dependencies
        validationStateRef.current.itemNumbers = new Map(validationStateRef.current.itemNumbers);

        return {
          success: true,
@@ -183,7 +235,7 @@ export const useUpcValidation (
        };
      } else {
        // No item number found but validation was still attempted
        console.log(`No item number found for UPC ${upcValue}`);
        console.log(`[UPC-DEBUG] No item number found for UPC ${upcValue}`);

        // Clear any existing item number to show validation was attempted and failed
        if (validationStateRef.current.itemNumbers.has(rowIndex)) {
@@ -194,157 +246,74 @@ export const useUpcValidation (
      return { success: false };
    }
    } catch (error) {
      console.error('Error validating UPC:', error);
      console.error('[UPC-DEBUG] Error validating UPC:', error);
      return { success: false };
    } finally {
      // End validation
      stopValidatingRow(rowIndex);
      stopValidatingCell(rowIndex, 'item_number');
      // End validation - FORCE UI update by using setState directly
      console.log(`[UPC-DEBUG] Ending validation for row ${rowIndex}`);

      validationStateRef.current.validatingRows.delete(rowIndex);
      setValidatingRows(new Set(validationStateRef.current.validatingRows));

      if (validationStateRef.current.validatingRows.size === 0) {
        setIsValidatingUpc(false);
      }

      validationStateRef.current.validatingCells.delete(cellKey);
      setValidatingCellKeys(new Set(validationStateRef.current.validatingCells));

      console.log(`[UPC-DEBUG] Cleared loading state for row ${rowIndex}`);
      console.log(`[UPC-DEBUG] Updated validating rows: ${Array.from(validationStateRef.current.validatingRows).join(', ')}`);
      console.log(`[UPC-DEBUG] Updated validating cells: ${Array.from(validationStateRef.current.validatingCells).join(', ')}`);
    }
  }, [fetchProductByUpc, updateItemNumber, startValidatingCell, stopValidatingCell, startValidatingRow, stopValidatingRow, setData]);
  }, [fetchProductByUpc, updateItemNumber, setData]);

  // Apply item numbers to data
  const applyItemNumbersToData = useCallback((onApplied?: (updatedRowIds: number[]) => void) => {
    // Create a copy of the current item numbers map to avoid race conditions
    const currentItemNumbers = new Map(validationStateRef.current.itemNumbers);
  // Apply all pending item numbers to the data state
  const applyItemNumbersToData = useCallback((callback?: (updatedRows: number[]) => void) => {
    // Skip if we have nothing to apply
    if (validationStateRef.current.itemNumbers.size === 0) {
      if (callback) callback([]);
      return;
    }

    // Only apply if we have any item numbers
    if (currentItemNumbers.size === 0) return;

    // Track updated row indices to pass to callback
    const updatedRowIndices: number[] = [];

    // Log for debugging
    console.log(`Applying ${currentItemNumbers.size} item numbers to data`);
    // Gather all row IDs that will be updated
    const rowIds: number[] = [];

    // Update the data state with all item numbers
    setData(prevData => {
      // Create a new copy of the data
      const newData = [...prevData];

      // Update each row with its item number without affecting other fields
      currentItemNumbers.forEach((itemNumber, rowIndex) => {
        if (rowIndex < newData.length) {
          console.log(`Setting item_number for row ${rowIndex} to ${itemNumber}`);
      // Apply each item number to the data
      validationStateRef.current.itemNumbers.forEach((itemNumber, rowIndex) => {
        // Ensure row exists and value has actually changed
        if (rowIndex >= 0 && rowIndex < newData.length &&
            newData[rowIndex]?.item_number !== itemNumber) {

          // Only update the item_number field, leaving other fields unchanged
          // Create a new row object to force re-rendering
          newData[rowIndex] = {
            ...newData[rowIndex],
            item_number: itemNumber
          };

          // Track which rows were updated
          updatedRowIndices.push(rowIndex);
          // Track which row was updated for the callback
          rowIds.push(rowIndex);
        }
      });

      return newData;
    });

    // Call the callback if provided, after state updates are processed
    if (onApplied && updatedRowIndices.length > 0) {
      // Use setTimeout to ensure this happens after the state update
      setTimeout(() => {
        onApplied(updatedRowIndices);
      }, 100); // Use 100ms to ensure the data update is fully processed
    // Force a re-render by updating React state
    setTimeout(() => {
      setItemNumberUpdates(new Map(validationStateRef.current.itemNumbers));
    }, 0);

    // Call the callback with the updated row IDs
    if (callback) {
      callback(rowIds);
    }
  }, [setData]);

  // Process validation queue in batches - faster processing with smaller batches
  const processBatchValidation = useCallback(async () => {
    if (isProcessingBatchRef.current) return;
    if (validationQueueRef.current.length === 0) return;

    console.log(`Processing validation batch with ${validationQueueRef.current.length} items`);
    isProcessingBatchRef.current = true;

    // Process in smaller batches for better UI responsiveness
    const BATCH_SIZE = 5;
    const queue = [...validationQueueRef.current];
    validationQueueRef.current = [];

    // Track if any updates were made
    let updatesApplied = false;

    // Track updated row indices
    const updatedRows: number[] = [];

    try {
      // Process in small batches
      for (let i = 0; i < queue.length; i += BATCH_SIZE) {
        const batch = queue.slice(i, i + BATCH_SIZE);

        // Process batch in parallel
        const results = await Promise.all(batch.map(async ({ rowIndex, supplierId, upcValue }) => {
          try {
            // Skip if already validated
            const cacheKey = `${supplierId}-${upcValue}`;
            if (processedUpcMapRef.current.has(cacheKey)) {
              const cachedItemNumber = processedUpcMapRef.current.get(cacheKey);
              if (cachedItemNumber) {
                console.log(`Using cached item number for row ${rowIndex}: ${cachedItemNumber}`);
                updateItemNumber(rowIndex, cachedItemNumber);
                updatesApplied = true;
                updatedRows.push(rowIndex);
                return true;
              }
              return false;
            }

            // Fetch from API
            const result = await fetchProductByUpc(supplierId, upcValue);

            if (!result.error && result.data?.itemNumber) {
              const itemNumber = result.data.itemNumber;

              // Store in cache
              processedUpcMapRef.current.set(cacheKey, itemNumber);

              // Update item number
              updateItemNumber(rowIndex, itemNumber);
              updatesApplied = true;
              updatedRows.push(rowIndex);

              console.log(`Set item number for row ${rowIndex} to ${itemNumber}`);
              return true;
            }
            return false;
          } catch (error) {
            console.error(`Error processing row ${rowIndex}:`, error);
            return false;
          } finally {
            // Clear validation state
            stopValidatingRow(rowIndex);
          }
        }));

        // If any updates were applied in this batch, update the data
        if (results.some(Boolean) && updatesApplied) {
          applyItemNumbersToData(updatedRowIds => {
            console.log(`Processed batch UPC validation for rows: ${updatedRowIds.join(', ')}`);
          });
          updatesApplied = false;
          updatedRows.length = 0; // Clear the array
        }

        // Small delay between batches to allow UI to update
        if (i + BATCH_SIZE < queue.length) {
          await new Promise(resolve => setTimeout(resolve, 10));
        }
      }
    } catch (error) {
      console.error('Error in batch processing:', error);
    } finally {
      isProcessingBatchRef.current = false;

      // Process any new items
      if (validationQueueRef.current.length > 0) {
        setTimeout(processBatchValidation, 0);
      }
    }
  }, [fetchProductByUpc, updateItemNumber, stopValidatingRow, applyItemNumbersToData]);

  // For immediate processing

  // Batch validate all UPCs in the data
  const validateAllUPCs = useCallback(async () => {
    // Skip if we've already done the initial validation
@@ -508,8 +477,8 @@ export const useUpcValidation (
    getItemNumber,
    applyItemNumbersToData,

    // Results
    upcValidationResults,
    // CRITICAL: Expose the itemNumbers map directly
    itemNumbers: validationStateRef.current.itemNumbers,

    // Initialization state
    initialValidationDone: initialUpcValidationDoneRef.current
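The queue-draining strategy in `processBatchValidation` above can be reduced to a small generic helper: take a slice of the queue, run the slice in parallel with `Promise.all`, then yield briefly so the UI can repaint before the next slice. This is a minimal sketch; `processInBatches` and `lookup` are illustrative names, with `lookup` standing in for the real `fetchProductByUpc` call.

```typescript
// Drain a queue in slices of batchSize, running each slice in parallel and
// pausing between slices so a browser UI gets a chance to update.
async function processInBatches<T, R>(
  queue: T[],
  lookup: (item: T) => Promise<R>,
  batchSize = 5,
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < queue.length; i += batchSize) {
    const batch = queue.slice(i, i + batchSize);
    // All items in the slice run concurrently; slices run sequentially.
    results.push(...await Promise.all(batch.map(lookup)));
    if (i + batchSize < queue.length) {
      await new Promise(resolve => setTimeout(resolve, 10)); // let the UI breathe
    }
  }
  return results;
}

// Usage: doubles numbers, at most five in flight at a time.
processInBatches([1, 2, 3, 4, 5, 6, 7], async n => n * 2).then(console.log);
```

The small `BATCH_SIZE` trades total throughput for responsiveness: with unbounded `Promise.all` over the whole queue, a large import would fire every request at once and starve the render loop.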
@@ -1,336 +1,23 @@
import { useCallback } from 'react'
import type { Field, Fields, RowHook, TableHook } from '../../../types'
import type { Meta } from '../types'
import { ErrorSources, ErrorType, ValidationError } from '../../../types'
import { RowData } from './useValidationState'

// Define InfoWithSource to match the expected structure
// Make sure source is required (not optional)
export interface InfoWithSource {
  message: string;
  level: 'info' | 'warning' | 'error';
  source: ErrorSources;
  type: ErrorType;
}

// Shared utility function for checking empty values - defined once to avoid duplication
const isEmpty = (value: any): boolean =>
  value === undefined ||
  value === null ||
  value === '' ||
  (Array.isArray(value) && value.length === 0) ||
  (typeof value === 'object' && !Array.isArray(value) && Object.keys(value).length === 0);

// Create a cache for validation results to avoid repeated validation of the same data
const validationResultCache = new Map();
const validationCache: Record<string, any> = {};

// Add a function to clear cache for a specific field value
export const clearValidationCacheForField = (fieldKey: string) => {
  // Clear cache
  const cacheKey = `field_${fieldKey}`;
  delete validationCache[cacheKey];
};

// Add a special function to clear all uniqueness validation caches
export const clearAllUniquenessCaches = () => {
  // Clear cache for common unique fields
  ['item_number', 'upc', 'barcode', 'supplier_no', 'notions_no'].forEach(fieldKey => {
    clearValidationCacheForField(fieldKey);
  });

  // Also clear any cache entries that might involve uniqueness validation
  validationResultCache.forEach((_, key) => {
    if (key.includes('unique')) {
      validationResultCache.delete(key);
    }
  });
};
import type { Field, Fields, RowHook } from '../../../types'
import { ErrorSources } from '../../../types'
import { RowData, InfoWithSource } from './validationTypes'
import { useFieldValidation, clearValidationCacheForField, clearAllUniquenessCaches } from './useFieldValidation'
import { useUniqueValidation } from './useUniqueValidation'

// Main validation hook that brings together field and uniqueness validation
export const useValidation = <T extends string>(
  fields: Fields<T>,
  rowHook?: RowHook<T>,
  tableHook?: TableHook<T>
  rowHook?: RowHook<T>
) => {
  // Validate a single field
  const validateField = useCallback((
    value: any,
    field: Field<T>
  ): ValidationError[] => {
    const errors: ValidationError[] = []

    if (!field.validations) return errors

    // Create a cache key using field key, value, and validation rules
    const cacheKey = `${field.key}-${String(value)}-${JSON.stringify(field.validations)}`;

    // Check cache first to avoid redundant validation
    if (validationResultCache.has(cacheKey)) {
      return validationResultCache.get(cacheKey) || [];
    }

    field.validations.forEach(validation => {
      switch (validation.rule) {
        case 'required':
          // Use the shared isEmpty function
          if (isEmpty(value)) {
            errors.push({
              message: validation.errorMessage || 'This field is required',
              level: validation.level || 'error',
              type: ErrorType.Required
            })
          }
          break

        case 'unique':
          // Unique validation happens at table level, not here
          break

        case 'regex':
          if (value !== undefined && value !== null && value !== '') {
            try {
              const regex = new RegExp(validation.value, validation.flags)
              if (!regex.test(String(value))) {
                errors.push({
                  message: validation.errorMessage,
                  level: validation.level || 'error',
                  type: ErrorType.Regex
                })
              }
            } catch (error) {
              console.error('Invalid regex in validation:', error)
            }
          }
          break
      }
    })

    // Store results in cache to speed up future validations
    validationResultCache.set(cacheKey, errors);

    return errors
  }, [])

  // Validate a single row
  const validateRow = useCallback(async (
    row: RowData<T>,
    rowIndex: number,
    allRows: RowData<T>[]
  ): Promise<Meta> => {
    // Run field-level validations
    const fieldErrors: Record<string, ValidationError[]> = {}

    // Use the shared isEmpty function

    fields.forEach(field => {
      const value = row[String(field.key) as keyof typeof row]
      const errors = validateField(value, field as Field<T>)

      if (errors.length > 0) {
        fieldErrors[String(field.key)] = errors
      }
    })

    // Special validation for supplier and company fields - only apply if the field exists in fields
    if (fields.some(field => String(field.key) === 'supplier') && isEmpty(row.supplier)) {
      fieldErrors['supplier'] = [{
        message: 'Supplier is required',
        level: 'error',
        type: ErrorType.Required
      }]
    }

    if (fields.some(field => String(field.key) === 'company') && isEmpty(row.company)) {
      fieldErrors['company'] = [{
        message: 'Company is required',
        level: 'error',
        type: ErrorType.Required
      }]
    }

    // Run row hook if provided
    let rowHookResult: Meta = {
      __index: row.__index || String(rowIndex)
    }
    if (rowHook) {
      try {
        // Call the row hook and extract only the __index property
        const result = await rowHook(row, rowIndex, allRows);
        rowHookResult.__index = result.__index || rowHookResult.__index;
      } catch (error) {
        console.error('Error in row hook:', error)
      }
    }

    // We no longer need to merge errors since we're not storing them in the row data
    // The calling code should handle storing errors in the validationErrors Map

    return {
      __index: row.__index || String(rowIndex)
    }
  }, [fields, validateField, rowHook])

  // Validate all data at the table level
  const validateTable = useCallback(async (data: RowData<T>[]): Promise<Meta[]> => {
    if (!tableHook) {
      return data.map((row, index) => ({
        __index: row.__index || String(index)
      }))
    }

    try {
      const tableResults = await tableHook(data)

      // Process table validation results
      return tableResults.map((result, index) => {
        return {
          __index: result.__index || data[index].__index || String(index)
        }
      })
    } catch (error) {
      console.error('Error in table hook:', error)
      return data.map((row, index) => ({
        __index: row.__index || String(index)
      }))
    }
  }, [tableHook])

  // Validate unique fields across the table
  const validateUnique = useCallback((data: RowData<T>[]) => {
    // Create a map to store errors for each row
    const uniqueErrors = new Map<number, Record<string, InfoWithSource>>();

    // Find fields with unique validation
    const uniqueFields = fields.filter(field =>
      field.validations?.some(v => v.rule === 'unique')
    );

    if (uniqueFields.length === 0) {
      return uniqueErrors;
    }

    // Check each unique field
    uniqueFields.forEach(field => {
      const { key } = field;
      const validation = field.validations?.find(v => v.rule === 'unique');
      const allowEmpty = validation?.allowEmpty ?? false;
      const errorMessage = validation?.errorMessage || `${field.label} must be unique`;
      const level = validation?.level || 'error';

      // Track values for uniqueness check
      const valueMap = new Map<string, number[]>();

      // Build value map
      data.forEach((row, rowIndex) => {
        const value = String(row[String(key) as keyof typeof row] || '');

        // Skip empty values if allowed
        if (allowEmpty && isEmpty(value)) {
          return;
        }

        if (!valueMap.has(value)) {
          valueMap.set(value, [rowIndex]);
        } else {
          valueMap.get(value)?.push(rowIndex);
        }
      });

      // Add errors for duplicate values
      valueMap.forEach((rowIndexes) => {
        if (rowIndexes.length > 1) {
          // Add error to all duplicate rows
          rowIndexes.forEach(rowIndex => {
            // Get existing errors for this row or create a new object
            const rowErrors = uniqueErrors.get(rowIndex) || {};

            rowErrors[String(key)] = {
              message: errorMessage,
              level,
              source: ErrorSources.Table,
              type: ErrorType.Unique
            };

            uniqueErrors.set(rowIndex, rowErrors);
          });
        }
      });
    });

    return uniqueErrors;
  }, [fields]);

  // Additional function to explicitly validate uniqueness for specified fields
  const validateUniqueField = useCallback((data: RowData<T>[], fieldKey: string) => {
    // Field keys that need special handling for uniqueness
    const uniquenessFields = ['item_number', 'upc', 'barcode', 'supplier_no', 'notions_no'];

    // If the field doesn't need uniqueness validation, return empty errors
    if (!uniquenessFields.includes(fieldKey)) {
      const field = fields.find(f => String(f.key) === fieldKey);
      if (!field || !field.validations?.some(v => v.rule === 'unique')) {
        return new Map<number, Record<string, InfoWithSource>>();
      }
    }

    // Create map to track errors
    const uniqueErrors = new Map<number, Record<string, InfoWithSource>>();

    // Find the field definition
    const field = fields.find(f => String(f.key) === fieldKey);
    if (!field) return uniqueErrors;

    // Get validation properties
    const validation = field.validations?.find(v => v.rule === 'unique');
    const allowEmpty = validation?.allowEmpty ?? false;
    const errorMessage = validation?.errorMessage || `${field.label} must be unique`;
    const level = validation?.level || 'error';

    // Track values for uniqueness check
    const valueMap = new Map<string, number[]>();

    // Build value map
    data.forEach((row, rowIndex) => {
      const value = String(row[fieldKey as keyof typeof row] || '');

      // Skip empty values if allowed
      if (allowEmpty && isEmpty(value)) {
        return;
      }

      if (!valueMap.has(value)) {
        valueMap.set(value, [rowIndex]);
      } else {
        valueMap.get(value)?.push(rowIndex);
      }
    });

    // Add errors for duplicate values
    valueMap.forEach((rowIndexes, value) => {
      if (rowIndexes.length > 1) {
        // Skip empty values
        if (!value || value.trim() === '') return;

        // Add error to all duplicate rows
        rowIndexes.forEach(rowIndex => {
          // Create errors object if needed
          if (!uniqueErrors.has(rowIndex)) {
            uniqueErrors.set(rowIndex, {});
          }

          // Add error for this field
          uniqueErrors.get(rowIndex)![fieldKey] = {
            message: errorMessage,
            level: level as 'info' | 'warning' | 'error',
            source: ErrorSources.Table,
            type: ErrorType.Unique
          };
        });
      }
    });

    return uniqueErrors;
  }, [fields]);
  // Use the field validation hook
  const { validateField, validateRow } = useFieldValidation(fields, rowHook);

  // Use the uniqueness validation hook
  const {
    validateUniqueField,
    validateAllUniqueFields
  } = useUniqueValidation(fields);

  // Run complete validation
  const validateData = useCallback(async (data: RowData<T>[], fieldToUpdate?: { rowIndex: number, fieldKey: string }) => {
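The memoization in `validateField` above keys the cache on the field, the value, and the serialized rule list, so re-validating an unchanged cell is a `Map` lookup instead of a rule evaluation. A minimal standalone sketch of that pattern (the `Rule` type and `validateValue` are illustrative, not the hook's actual types):

```typescript
// Two simplified rules standing in for the hook's validation config.
type Rule = { rule: 'required' } | { rule: 'regex'; value: string };

const cache = new Map<string, string[]>();

function validateValue(fieldKey: string, value: unknown, rules: Rule[]): string[] {
  // Same key shape as the hook: field + stringified value + stringified rules.
  const cacheKey = `${fieldKey}-${String(value)}-${JSON.stringify(rules)}`;
  const hit = cache.get(cacheKey);
  if (hit) return hit; // cached result (an array, so [] is still truthy)

  const errors: string[] = [];
  for (const rule of rules) {
    if (rule.rule === 'required' && (value === undefined || value === null || value === '')) {
      errors.push('This field is required');
    }
    if (rule.rule === 'regex' && value != null && value !== '' && !new RegExp(rule.value).test(String(value))) {
      errors.push('Value does not match the expected format');
    }
  }
  cache.set(cacheKey, errors);
  return errors;
}

validateValue('upc', '', [{ rule: 'required' }]); // miss: rules are evaluated
validateValue('upc', '', [{ rule: 'required' }]); // hit: served from the cache
```

One caveat of this scheme, which also applies to the hook: a stale entry survives until the cache is cleared, which is why the full-validation path calls `validationResultCache.clear()` and why per-field clear helpers are exported.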
@@ -429,9 +116,6 @@ export const useValidation = <T extends string>(
    // Full validation - all fields for all rows
    console.log('Running full validation for all fields and rows');

    // Clear validation cache for full validation
    validationResultCache.clear();

    // Process each row for field-level validations
    for (let rowIndex = 0; rowIndex < data.length; rowIndex++) {
      const row = data[rowIndex];
@@ -459,38 +143,15 @@ export const useValidation = <T extends string>(
      }
    }

    // Get fields requiring uniqueness validation
    const uniqueFields = fields.filter(field =>
      field.validations?.some(v => v.rule === 'unique')
    );
    // Validate all unique fields
    const uniqueErrors = validateAllUniqueFields(data);

    // Also add standard unique fields that might not be explicitly marked as unique
    const standardUniqueFields = ['item_number', 'upc', 'barcode', 'supplier_no', 'notions_no'];

    // Combine all fields that need uniqueness validation
    const allUniqueFieldKeys = new Set([
      ...uniqueFields.map(field => String(field.key)),
      ...standardUniqueFields
    ]);

    // Log uniqueness validation fields
    console.log('Validating unique fields:', Array.from(allUniqueFieldKeys));

    // Run uniqueness validation for each unique field
    allUniqueFieldKeys.forEach(fieldKey => {
      // Check if this field exists in the data
      const hasField = data.some(row => fieldKey in row);
      if (!hasField) return;

      const uniqueErrors = validateUniqueField(data, fieldKey);

      // Add unique errors to validation errors
      uniqueErrors.forEach((errors, rowIdx) => {
        if (!validationErrors.has(rowIdx)) {
          validationErrors.set(rowIdx, {});
        }
        Object.assign(validationErrors.get(rowIdx)!, errors);
      });
    // Merge in unique errors
    uniqueErrors.forEach((errors, rowIdx) => {
      if (!validationErrors.has(rowIdx)) {
        validationErrors.set(rowIdx, {});
      }
      Object.assign(validationErrors.get(rowIdx)!, errors);
    });

    console.log('Uniqueness validation complete');
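The merge step above folds per-field error maps (`rowIndex -> { fieldKey: error }`) into one combined map with `Object.assign`, so later fields add keys without clobbering earlier ones. A minimal sketch of that fold (`mergeErrorMaps` is an illustrative name, and the error values are simplified to strings):

```typescript
// Fold one per-field error map into a shared target map. Rows present in
// both keep the union of their field errors.
function mergeErrorMaps(
  target: Map<number, Record<string, string>>,
  source: Map<number, Record<string, string>>,
): Map<number, Record<string, string>> {
  source.forEach((errors, rowIdx) => {
    if (!target.has(rowIdx)) target.set(rowIdx, {});
    Object.assign(target.get(rowIdx)!, errors);
  });
  return target;
}

const all = new Map<number, Record<string, string>>();
mergeErrorMaps(all, new Map([[0, { upc: 'must be unique' }]]));
mergeErrorMaps(all, new Map([[0, { barcode: 'must be unique' }]]));
// row 0 now carries errors for both fields
```

Because `Object.assign` overwrites on key collision, a later validator's error for the same row and field replaces an earlier one, which matches the last-writer-wins behavior of the merge loops above.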
@@ -500,16 +161,14 @@ export const useValidation = <T extends string>(
      data,
      validationErrors
    };
  }, [fields, validateField, validateUniqueField]);
  }, [fields, validateField, validateUniqueField, validateAllUniqueFields]);

  return {
    validateData,
    validateField,
    validateRow,
    validateTable,
    validateUnique,
    validateUniqueField,
    clearValidationCacheForField,
    clearAllUniquenessCaches
  }
};
}
File diff suppressed because it is too large
@@ -0,0 +1,100 @@

```tsx
import type { Data } from "../../../types";
import { ErrorSources, ErrorType } from "../../../types";
import config from "@/config";

// Define the Props interface for ValidationStepNew
export interface Props<T extends string> {
  initialData: RowData<T>[];
  file?: File;
  onBack?: () => void;
  onNext?: (data: RowData<T>[]) => void;
  isFromScratch?: boolean;
}

// Extended Data type with meta information
export type RowData<T extends string> = Data<T> & {
  __index?: string;
  __template?: string;
  __original?: Record<string, any>;
  __corrected?: Record<string, any>;
  __changes?: Record<string, boolean>;
  upc?: string;
  barcode?: string;
  supplier?: string;
  company?: string;
  item_number?: string;
  [key: string]: any; // Allow any string key for dynamic fields
};

// Template interface
export interface Template {
  id: number;
  company: string;
  product_type: string;
  [key: string]: string | number | boolean | undefined;
}

// Props for the useValidationState hook
export interface ValidationStateProps<T extends string> extends Props<T> {}

// Interface for validation results
export interface ValidationResult {
  error?: boolean;
  message?: string;
  data?: Record<string, any>;
  type?: ErrorType;
  source?: ErrorSources;
}

// Filter state interface
export interface FilterState {
  searchText: string;
  showErrorsOnly: boolean;
  filterField: string | null;
  filterValue: string | null;
}

// UI validation state interface for useUpcValidation
export interface ValidationState {
  validatingCells: Set<string>; // Using rowIndex-fieldKey as identifier
  itemNumbers: Map<number, string>; // Using rowIndex as key
  validatingRows: Set<number>; // Rows currently being validated
  activeValidations: Set<string>; // Active validations
}

// InfoWithSource interface for validation errors
export interface InfoWithSource {
  message: string;
  level: 'info' | 'warning' | 'error';
  source: ErrorSources;
  type: ErrorType;
}

// Template state interface
export interface TemplateState {
  selectedTemplateId: string | null;
  showSaveTemplateDialog: boolean;
  newTemplateName: string;
  newTemplateType: string;
}

// Add config at the top of the file
// Import the config or access it through window
declare global {
  interface Window {
    config?: {
      apiUrl: string;
    };
  }
}

// Use a helper to get API URL consistently
export const getApiUrl = () => config.apiUrl;

// Shared utility function for checking empty values
export const isEmpty = (value: any): boolean =>
  value === undefined ||
  value === null ||
  value === '' ||
  (Array.isArray(value) && value.length === 0) ||
  (typeof value === 'object' && !Array.isArray(value) && Object.keys(value).length === 0);
```
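The `isEmpty` helper above has one behavior worth calling out: numbers and booleans are never "empty", even `0` and `false`. A self-contained copy for quick sanity checks:

```typescript
// Self-contained copy of the isEmpty helper from validationTypes.
// Note: 0 and false are NOT empty; only undefined/null/''/[]/{}.
const isEmpty = (value: any): boolean =>
  value === undefined ||
  value === null ||
  value === '' ||
  (Array.isArray(value) && value.length === 0) ||
  (typeof value === 'object' && !Array.isArray(value) && Object.keys(value).length === 0);

console.log(isEmpty(''), isEmpty([]), isEmpty({}), isEmpty(0), isEmpty('x'));
```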
```diff
@@ -1,5 +1,5 @@
 import ValidationContainer from './components/ValidationContainer'
-import { Props } from './hooks/useValidationState'
+import { Props } from './hooks/validationTypes'

 /**
  * ValidationStepNew component - modern implementation of the validation step
```
@@ -1,42 +0,0 @@

```ts
import { ErrorType } from '../types/index'

/**
 * Converts an InfoWithSource or similar error object to our Error type
 * @param error The error object to convert
 * @returns Our standardized Error object
 */
export const convertToError = (error: any): ErrorType => {
  return {
    message: typeof error.message === 'string' ? error.message : String(error.message || ''),
    level: error.level || 'error',
    source: error.source || 'row',
    type: error.type || 'custom'
  }
}

/**
 * Safely convert an error or array of errors to our Error[] format
 * @param errors The error or array of errors to convert
 * @returns Array of our Error objects
 */
export const convertToErrorArray = (errors: any): ErrorType[] => {
  if (Array.isArray(errors)) {
    return errors.map(convertToError)
  }
  return [convertToError(errors)]
}

/**
 * Converts a record of errors to our standardized format
 * @param errorRecord Record with string keys and error values
 * @returns Standardized error record
 */
export const convertErrorRecord = (errorRecord: Record<string, any>): Record<string, ErrorType[]> => {
  const result: Record<string, ErrorType[]> = {}

  Object.entries(errorRecord).forEach(([key, errors]) => {
    result[key] = convertToErrorArray(errors)
  })

  return result
}
```
@@ -1,74 +0,0 @@

```ts
import { toast } from 'sonner'

/**
 * Placeholder for validating UPC codes
 * @param upcValue UPC value to validate
 * @returns Validation result
 */
export const validateUpc = async (upcValue: string): Promise<any> => {
  // Basic validation - UPC should be 12-14 digits
  if (!/^\d{12,14}$/.test(upcValue)) {
    toast.error('Invalid UPC format. UPC should be 12-14 digits.')
    return {
      error: true,
      message: 'Invalid UPC format'
    }
  }

  // In a real implementation, call an API to validate the UPC
  // For now, just return a successful result
  return {
    error: false,
    data: {
      // Mock data that would be returned from the API
      item_number: `ITEM-${upcValue.substring(0, 6)}`,
      sku: `SKU-${upcValue.substring(0, 4)}`,
      description: `Sample Product ${upcValue.substring(0, 4)}`
    }
  }
}

/**
 * Generates an item number for a UPC
 * @param upcValue UPC value
 * @returns Generated item number
 */
export const generateItemNumber = (upcValue: string): string => {
  // Simple item number generation logic
  return `ITEM-${upcValue.substring(0, 6)}`
}

/**
 * Placeholder for handling UPC validation process
 * @param upcValue UPC value to validate
 * @param rowIndex Row index being validated
 * @param updateRow Function to update row data
 */
export const handleUpcValidation = async (
  upcValue: string,
  rowIndex: number,
  updateRow: (rowIndex: number, key: string, value: any) => void
): Promise<void> => {
  try {
    // Validate the UPC
    const result = await validateUpc(upcValue)

    if (result.error) {
      toast.error(result.message || 'UPC validation failed')
      return
    }

    // Update row with the validation result data
    if (result.data) {
      // Update each field returned from the API
      Object.entries(result.data).forEach(([key, value]) => {
        updateRow(rowIndex, key, value)
      })

      toast.success('UPC validated successfully')
    }
  } catch (error) {
    console.error('Error validating UPC:', error)
    toast.error('Failed to validate UPC')
  }
}
```
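The format gate and the item-number scheme from the deleted file above can be tried in isolation. Note that the regex checks length and digits only; it does not verify the UPC check digit:

```typescript
// The UPC format check and item-number generation from the deleted
// upc helpers, extracted. Length/digits only - no check-digit math.
const isValidUpcFormat = (upc: string): boolean => /^\d{12,14}$/.test(upc);

const itemNumberFor = (upc: string): string => `ITEM-${upc.substring(0, 6)}`;

console.log(isValidUpcFormat('012345678905'), itemNumberFor('012345678905'));
```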
@@ -1,44 +0,0 @@

```js
/**
 * Helper functions for validation that ensure proper error objects
 */

// Create a standard error object
export const createError = (message, level = 'error', source = 'row') => {
  return { message, level, source };
};

// Convert any error to standard format
export const convertError = (error) => {
  if (!error) return createError('Unknown error');

  if (typeof error === 'string') {
    return createError(error);
  }

  return {
    message: error.message || 'Unknown error',
    level: error.level || 'error',
    source: error.source || 'row'
  };
};

// Convert array of errors or single error to array
export const convertToErrorArray = (errors) => {
  if (Array.isArray(errors)) {
    return errors.map(convertError);
  }
  return [convertError(errors)];
};

// Convert a record of errors to standard format
export const convertErrorRecord = (errorRecord) => {
  const result = {};

  if (!errorRecord) return result;

  Object.entries(errorRecord).forEach(([key, errors]) => {
    result[key] = convertToErrorArray(errors);
  });

  return result;
};
```
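The normalization in the deleted helpers above accepts three input shapes (falsy, string, object) and always emits a `{ message, level, source }` record. A typed, self-contained copy showing all three:

```typescript
// Typed copy of the deleted JS error-normalization helpers, showing
// how strings, partial objects, and falsy inputs all normalize.
interface StdError { message: string; level: string; source: string }

const createError = (message: string, level = 'error', source = 'row'): StdError =>
  ({ message, level, source });

const convertError = (error: any): StdError => {
  if (!error) return createError('Unknown error');
  if (typeof error === 'string') return createError(error);
  return {
    message: error.message || 'Unknown error',
    level: error.level || 'error',
    source: error.source || 'row',
  };
};

const convertToErrorArray = (errors: any): StdError[] =>
  Array.isArray(errors) ? errors.map(convertError) : [convertError(errors)];

const out = convertToErrorArray(['oops', { message: 'bad', level: 'warning' }, null]);
console.log(out.map(e => `${e.level}: ${e.message}`));
```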
@@ -1,184 +0,0 @@

```ts
import { Field, Data, ErrorSources, ErrorType as ValidationErrorType } from '../../../types'
import { ErrorType } from '../types/index'

/**
 * Formats a price value to a consistent format
 * @param value The price value to format
 * @returns The formatted price string
 */
export const formatPrice = (value: string | number): string => {
  if (!value) return ''

  // Convert to string and clean
  const numericValue = String(value).replace(/[^\d.]/g, '')

  // Parse the number
  const number = parseFloat(numericValue)
  if (isNaN(number)) return ''

  // Format as currency
  return number.toLocaleString('en-US', {
    minimumFractionDigits: 2,
    maximumFractionDigits: 2
  })
}

/**
 * Checks if a field is a price field
 * @param field The field to check
 * @returns True if the field is a price field
 */
export const isPriceField = (field: Field<any>): boolean => {
  const fieldType = field.fieldType;
  return (fieldType.type === 'input' || fieldType.type === 'multi-input') &&
    'price' in fieldType &&
    !!fieldType.price;
}

/**
 * Determines if a field is a multi-input type
 * @param fieldType The field type to check
 * @returns True if the field is a multi-input type
 */
export const isMultiInputType = (fieldType: any): boolean => {
  return fieldType.type === 'multi-input'
}

/**
 * Gets the separator for multi-input fields
 * @param fieldType The field type
 * @returns The separator string
 */
export const getMultiInputSeparator = (fieldType: any): string => {
  if (isMultiInputType(fieldType) && fieldType.separator) {
    return fieldType.separator
  }
  return ','
}

/**
 * Performs regex validation on a value
 * @param value The value to validate
 * @param regex The regex pattern
 * @param flags Regex flags
 * @returns True if validation passes
 */
export const validateRegex = (value: any, regex: string, flags?: string): boolean => {
  if (value === undefined || value === null || value === '') return true

  try {
    const regexObj = new RegExp(regex, flags)
    return regexObj.test(String(value))
  } catch (error) {
    console.error('Invalid regex in validation:', error)
    return false
  }
}

/**
 * Creates a validation error object
 * @param message Error message
 * @param level Error level
 * @param source Error source
 * @param type Error type
 * @returns Error object
 */
export const createError = (
  message: string,
  level: 'info' | 'warning' | 'error' = 'error',
  source: ErrorSources = ErrorSources.Row,
  type: ValidationErrorType = ValidationErrorType.Custom
): ErrorType => {
  return {
    message,
    level,
    source,
    type
  } as ErrorType
}

/**
 * Formats a display value based on field type
 * @param value The value to format
 * @param field The field definition
 * @returns Formatted display value
 */
export const getDisplayValue = (value: any, field: Field<any>): string => {
  if (value === undefined || value === null) return ''

  // Handle price fields
  if (isPriceField(field)) {
    return formatPrice(value)
  }

  // Handle multi-input fields
  if (isMultiInputType(field.fieldType)) {
    if (Array.isArray(value)) {
      return value.join(`${getMultiInputSeparator(field.fieldType)} `)
    }
  }

  // Handle boolean values
  if (typeof value === 'boolean') {
    return value ? 'Yes' : 'No'
  }

  return String(value)
}

/**
 * Validates supplier and company fields
 * @param row The data row
 * @returns Object with errors for invalid fields
 */
export const validateSpecialFields = <T extends string>(row: Data<T>): Record<string, ErrorType[]> => {
  const errors: Record<string, ErrorType[]> = {}

  // Validate supplier field
  if (!row.supplier) {
    errors['supplier'] = [{
      message: 'Supplier is required',
      level: 'error',
      source: ErrorSources.Row,
      type: ValidationErrorType.Required
    }]
  }

  // Validate company field
  if (!row.company) {
    errors['company'] = [{
      message: 'Company is required',
      level: 'error',
      source: ErrorSources.Row,
      type: ValidationErrorType.Required
    }]
  }

  return errors
}

/**
 * Merges multiple error objects
 * @param errors Array of error objects to merge
 * @returns Merged error object
 */
export const mergeErrors = (...errors: Record<string, ErrorType[]>[]): Record<string, ErrorType[]> => {
  const merged: Record<string, ErrorType[]> = {}

  errors.forEach(errorObj => {
    if (!errorObj) return

    Object.entries(errorObj).forEach(([key, errs]) => {
      if (!merged[key]) {
        merged[key] = []
      }

      merged[key] = [
        ...merged[key],
        ...(Array.isArray(errs) ? errs : [errs as ErrorType])
      ]
    })
  })

  return merged
}
```
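`formatPrice` in the deleted file above has two edge cases worth noting: the leading `if (!value)` guard means `0` formats to an empty string, and the character-stripping regex removes currency symbols and thousands separators before parsing. A standalone copy:

```typescript
// Standalone copy of formatPrice from the deleted validationUtils.
// Note: 0 hits the falsy guard and returns '', and '$'/',' are
// stripped by the regex before parseFloat.
const formatPrice = (value: string | number): string => {
  if (!value) return '';
  const numericValue = String(value).replace(/[^\d.]/g, '');
  const num = parseFloat(numericValue);
  if (Number.isNaN(num)) return '';
  return num.toLocaleString('en-US', {
    minimumFractionDigits: 2,
    maximumFractionDigits: 2,
  });
};

console.log(formatPrice('$1234.5'), formatPrice('abc'), formatPrice(0));
```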
```diff
@@ -133,8 +133,9 @@ export function PerformanceMetrics() {
     }
   };

-  function getCategoryName(_cat_id: number): import("react").ReactNode {
-    throw new Error('Function not implemented.');
+  function getCategoryName(cat_id: number): import("react").ReactNode {
+    // Simple implementation that just returns the ID as a string
+    return `Category ${cat_id}`;
   }

   return (
```
```diff
@@ -217,15 +218,19 @@ export function PerformanceMetrics() {
             </TableRow>
           </TableHeader>
           <TableBody>
-            {abcConfigs.map((config) => (
+            {abcConfigs && abcConfigs.length > 0 ? abcConfigs.map((config) => (
               <TableRow key={`${config.cat_id}-${config.vendor}`}>
                 <TableCell>{config.cat_id ? getCategoryName(config.cat_id) : 'Global'}</TableCell>
                 <TableCell>{config.vendor || 'All Vendors'}</TableCell>
-                <TableCell className="text-right">{config.a_threshold}%</TableCell>
-                <TableCell className="text-right">{config.b_threshold}%</TableCell>
-                <TableCell className="text-right">{config.classification_period_days}</TableCell>
+                <TableCell className="text-right">{config.a_threshold !== undefined ? `${config.a_threshold}%` : '0%'}</TableCell>
+                <TableCell className="text-right">{config.b_threshold !== undefined ? `${config.b_threshold}%` : '0%'}</TableCell>
+                <TableCell className="text-right">{config.classification_period_days || 0}</TableCell>
               </TableRow>
-            ))}
+            )) : (
+              <TableRow>
+                <TableCell colSpan={5} className="text-center py-4">No ABC configurations available</TableCell>
+              </TableRow>
+            )}
           </TableBody>
         </Table>
         <Button onClick={handleUpdateABCConfig}>
```
```diff
@@ -253,14 +258,26 @@ export function PerformanceMetrics() {
             </TableRow>
           </TableHeader>
           <TableBody>
-            {turnoverConfigs.map((config) => (
+            {turnoverConfigs && turnoverConfigs.length > 0 ? turnoverConfigs.map((config) => (
               <TableRow key={`${config.cat_id}-${config.vendor}`}>
                 <TableCell>{config.cat_id ? getCategoryName(config.cat_id) : 'Global'}</TableCell>
                 <TableCell>{config.vendor || 'All Vendors'}</TableCell>
                 <TableCell className="text-right">{config.calculation_period_days}</TableCell>
-                <TableCell className="text-right">{config.target_rate.toFixed(2)}</TableCell>
+                <TableCell className="text-right">
+                  {config.target_rate !== undefined && config.target_rate !== null
+                    ? (typeof config.target_rate === 'number'
+                      ? config.target_rate.toFixed(2)
+                      : (isNaN(parseFloat(String(config.target_rate)))
+                        ? '0.00'
+                        : parseFloat(String(config.target_rate)).toFixed(2)))
+                    : '0.00'}
+                </TableCell>
               </TableRow>
-            ))}
+            )) : (
+              <TableRow>
+                <TableCell colSpan={4} className="text-center py-4">No turnover configurations available</TableCell>
+              </TableRow>
+            )}
           </TableBody>
         </Table>
         <Button onClick={handleUpdateTurnoverConfig}>
```
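The nested ternary added for `target_rate` above is easier to follow flattened into a helper with the same semantics: numbers format directly, numeric strings are parsed, everything else falls back to `'0.00'`. A sketch (`formatTargetRate` is a hypothetical name, not a function in the codebase):

```typescript
// Flattened equivalent of the target_rate display ternary above.
// Hypothetical helper name, same semantics as the inline expression.
const formatTargetRate = (rate: unknown): string => {
  if (rate === undefined || rate === null) return '0.00';
  if (typeof rate === 'number') return rate.toFixed(2);
  const parsed = parseFloat(String(rate));
  return Number.isNaN(parsed) ? '0.00' : parsed.toFixed(2);
};

console.log(
  formatTargetRate(1.5),
  formatTargetRate('2.25'),
  formatTargetRate('n/a'),
  formatTargetRate(undefined)
);
```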
130 inventory/src/components/settings/PermissionSelector.tsx Normal file
@@ -0,0 +1,130 @@
```tsx
import { Checkbox } from "@/components/ui/checkbox";
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Tooltip, TooltipContent, TooltipProvider, TooltipTrigger } from "@/components/ui/tooltip";
import { Info } from "lucide-react";

interface Permission {
  id: number;
  name: string;
  code: string;
  description?: string;
}

interface PermissionCategory {
  category: string;
  permissions: Permission[];
}

interface PermissionSelectorProps {
  permissionsByCategory: PermissionCategory[];
  selectedPermissions: number[];
  onChange: (selectedPermissions: number[]) => void;
  disabled?: boolean;
}

export function PermissionSelector({
  permissionsByCategory,
  selectedPermissions,
  onChange,
  disabled = false
}: PermissionSelectorProps) {
  // Handle permission checkbox change
  const handlePermissionChange = (permissionId: number) => {
    const newSelectedPermissions = selectedPermissions.includes(permissionId)
      ? selectedPermissions.filter(id => id !== permissionId)
      : [...selectedPermissions, permissionId];

    onChange(newSelectedPermissions);
  };

  // Handle selecting/deselecting all permissions in a category
  const handleSelectCategory = (category: string) => {
    const categoryData = permissionsByCategory.find(c => c.category === category);
    if (!categoryData) return;

    const categoryPermIds = categoryData.permissions.map(p => p.id);

    // Check if all permissions in category are already selected
    const allSelected = categoryPermIds.every(id => selectedPermissions.includes(id));

    // If all selected, deselect all; otherwise select all
    const newSelectedPermissions = allSelected
      ? selectedPermissions.filter(id => !categoryPermIds.includes(id))
      : [...new Set([...selectedPermissions, ...categoryPermIds])];

    onChange(newSelectedPermissions);
  };

  // Check if there are no permission categories
  if (!permissionsByCategory || permissionsByCategory.length === 0) {
    return (
      <div className="text-center py-4">
        <p className="text-muted-foreground">No permissions available</p>
      </div>
    );
  }

  return (
    <div className="space-y-4">
      <h3 className="text-lg font-medium">Permissions</h3>
      <p className="text-sm text-muted-foreground mb-4">
        Select the permissions you want to grant to this user
      </p>

      {permissionsByCategory.map(category => (
        <Card key={category.category} className="mb-4">
          <CardHeader className="pb-2 flex flex-row items-center justify-between">
            <CardTitle className="text-md">{category.category}</CardTitle>
            <Button
              type="button"
              variant="outline"
              size="sm"
              onClick={() => handleSelectCategory(category.category)}
              disabled={disabled}
            >
              {category.permissions.every(p => selectedPermissions.includes(p.id))
                ? 'Deselect All'
                : 'Select All'}
            </Button>
          </CardHeader>

          <CardContent>
            <div className="grid grid-cols-1 md:grid-cols-2 gap-2">
              {category.permissions.map(permission => (
                <div key={permission.id} className="flex items-center space-x-2 py-1">
                  <Checkbox
                    id={`permission-${permission.id}`}
                    checked={selectedPermissions.includes(permission.id)}
                    onCheckedChange={() => handlePermissionChange(permission.id)}
                    disabled={disabled}
                  />
                  <label
                    htmlFor={`permission-${permission.id}`}
                    className="text-sm font-medium leading-none peer-disabled:cursor-not-allowed peer-disabled:opacity-70 flex items-center"
                  >
                    {permission.name}

                    {permission.description && (
                      <TooltipProvider>
                        <Tooltip>
                          <TooltipTrigger asChild>
                            <Info className="h-3.5 w-3.5 ml-1 text-muted-foreground" />
                          </TooltipTrigger>
                          <TooltipContent>
                            <p>{permission.description}</p>
                            <p className="text-xs text-muted-foreground mt-1">Code: {permission.code}</p>
                          </TooltipContent>
                        </Tooltip>
                      </TooltipProvider>
                    )}
                  </label>
                </div>
              ))}
            </div>
          </CardContent>
        </Card>
      ))}
    </div>
  );
}
```
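The `handleSelectCategory` toggle in `PermissionSelector` is pure data logic wrapped in a React handler, so it can be tested without rendering. Extracted as a standalone function (the name `toggleCategory` is illustrative, not part of the component's API):

```typescript
// The select-all/deselect-all toggle from PermissionSelector,
// extracted to pure data logic. If every id in the category is
// already selected, all are removed; otherwise the union is taken.
const toggleCategory = (selected: number[], categoryIds: number[]): number[] => {
  const allSelected = categoryIds.every(id => selected.includes(id));
  return allSelected
    ? selected.filter(id => !categoryIds.includes(id))
    : [...new Set([...selected, ...categoryIds])];
};

const afterSelect = toggleCategory([1, 5], [1, 2, 3]);   // union, no dupes
const afterDeselect = toggleCategory([1, 2, 3, 5], [1, 2, 3]); // category removed
console.log(afterSelect, afterDeselect);
```

The `Set` spread mirrors the component's dedupe step, so a permission already selected outside the category is kept exactly once.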
584 inventory/src/components/settings/PromptManagement.tsx Normal file
@@ -0,0 +1,584 @@
```tsx
import { useState, useMemo } from "react";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { Button } from "@/components/ui/button";
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "@/components/ui/table";
import { Input } from "@/components/ui/input";
import { Textarea } from "@/components/ui/textarea";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { Label } from "@/components/ui/label";
import { ArrowUpDown, Pencil, Trash2, PlusCircle } from "lucide-react";
import config from "@/config";
import {
  useReactTable,
  getCoreRowModel,
  getSortedRowModel,
  SortingState,
  flexRender,
  type ColumnDef,
} from "@tanstack/react-table";
import {
  Dialog,
  DialogContent,
  DialogDescription,
  DialogFooter,
  DialogHeader,
  DialogTitle,
} from "@/components/ui/dialog";
import {
  AlertDialog,
  AlertDialogAction,
  AlertDialogCancel,
  AlertDialogContent,
  AlertDialogDescription,
  AlertDialogFooter,
  AlertDialogHeader,
  AlertDialogTitle,
} from "@/components/ui/alert-dialog";
import { toast } from "sonner";

interface FieldOption {
  label: string;
  value: string;
}

interface PromptFormData {
  id?: number;
  prompt_text: string;
  prompt_type: 'general' | 'company_specific' | 'system';
  company: string | null;
}

interface AiPrompt {
  id: number;
  prompt_text: string;
  prompt_type: 'general' | 'company_specific' | 'system';
  company: string | null;
  created_at: string;
  updated_at: string;
}

interface FieldOptions {
  companies: FieldOption[];
}

export function PromptManagement() {
  const [isFormOpen, setIsFormOpen] = useState(false);
  const [isDeleteOpen, setIsDeleteOpen] = useState(false);
  const [promptToDelete, setPromptToDelete] = useState<AiPrompt | null>(null);
  const [editingPrompt, setEditingPrompt] = useState<AiPrompt | null>(null);
  const [sorting, setSorting] = useState<SortingState>([
    { id: "prompt_type", desc: true },
    { id: "company", desc: false }
  ]);
  const [searchQuery, setSearchQuery] = useState("");
  const [formData, setFormData] = useState<PromptFormData>({
    prompt_text: "",
    prompt_type: "general",
    company: null,
  });

  const queryClient = useQueryClient();

  const { data: prompts, isLoading } = useQuery<AiPrompt[]>({
    queryKey: ["ai-prompts"],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/ai-prompts`);
      if (!response.ok) {
        throw new Error("Failed to fetch AI prompts");
      }
      return response.json();
    },
  });

  const { data: fieldOptions } = useQuery<FieldOptions>({
    queryKey: ["fieldOptions"],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/import/field-options`);
      if (!response.ok) {
        throw new Error("Failed to fetch field options");
      }
      return response.json();
    },
  });

  // Check if general and system prompts already exist
  const generalPromptExists = useMemo(() => {
    return prompts?.some(prompt => prompt.prompt_type === 'general');
  }, [prompts]);

  const systemPromptExists = useMemo(() => {
    return prompts?.some(prompt => prompt.prompt_type === 'system');
  }, [prompts]);

  const createMutation = useMutation({
    mutationFn: async (data: PromptFormData) => {
      const response = await fetch(`${config.apiUrl}/ai-prompts`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
        body: JSON.stringify(data),
      });

      if (!response.ok) {
        const error = await response.json();
        throw new Error(error.message || error.error || "Failed to create prompt");
      }

      return response.json();
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
      toast.success("Prompt created successfully");
      resetForm();
    },
    onError: (error) => {
      toast.error(error instanceof Error ? error.message : "Failed to create prompt");
    },
  });

  const updateMutation = useMutation({
    mutationFn: async (data: PromptFormData) => {
      if (!data.id) throw new Error("Prompt ID is required for update");

      const response = await fetch(`${config.apiUrl}/ai-prompts/${data.id}`, {
        method: "PUT",
        headers: {
          "Content-Type": "application/json",
        },
        body: JSON.stringify(data),
      });

      if (!response.ok) {
        const error = await response.json();
        throw new Error(error.message || error.error || "Failed to update prompt");
      }

      return response.json();
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
      toast.success("Prompt updated successfully");
      resetForm();
    },
    onError: (error) => {
      toast.error(error instanceof Error ? error.message : "Failed to update prompt");
    },
  });

  const deleteMutation = useMutation({
    mutationFn: async (id: number) => {
      const response = await fetch(`${config.apiUrl}/ai-prompts/${id}`, {
        method: "DELETE",
      });
      if (!response.ok) {
        throw new Error("Failed to delete prompt");
      }
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["ai-prompts"] });
      toast.success("Prompt deleted successfully");
    },
    onError: (error) => {
      toast.error(error instanceof Error ? error.message : "Failed to delete prompt");
    },
  });

  const handleEdit = (prompt: AiPrompt) => {
    setEditingPrompt(prompt);
    setFormData({
      id: prompt.id,
      prompt_text: prompt.prompt_text,
      prompt_type: prompt.prompt_type,
      company: prompt.company,
    });
    setIsFormOpen(true);
  };

  const handleDeleteClick = (prompt: AiPrompt) => {
    setPromptToDelete(prompt);
    setIsDeleteOpen(true);
  };

  const handleDeleteConfirm = () => {
    if (promptToDelete) {
      deleteMutation.mutate(promptToDelete.id);
      setIsDeleteOpen(false);
      setPromptToDelete(null);
    }
  };

  const handleSubmit = (e: React.FormEvent) => {
    e.preventDefault();

    // If prompt_type is general or system, ensure company is null
    const submitData = {
      ...formData,
      company: formData.prompt_type === 'company_specific' ? formData.company : null,
    };

    if (editingPrompt) {
      updateMutation.mutate(submitData);
    } else {
      createMutation.mutate(submitData);
    }
  };

  const resetForm = () => {
    setFormData({
      prompt_text: "",
      prompt_type: "general",
      company: null,
    });
    setEditingPrompt(null);
    setIsFormOpen(false);
  };

  const handleCreateClick = () => {
    resetForm();

    // If general prompt and system prompt exist, default to company-specific
    if (generalPromptExists && systemPromptExists) {
      setFormData(prev => ({
        ...prev,
        prompt_type: 'company_specific'
      }));
    } else if (generalPromptExists && !systemPromptExists) {
      // If general exists but system doesn't, suggest system prompt
      setFormData(prev => ({
        ...prev,
        prompt_type: 'system'
      }));
    } else if (!generalPromptExists) {
      // If no general prompt, suggest that first
      setFormData(prev => ({
        ...prev,
        prompt_type: 'general'
      }));
    }

    setIsFormOpen(true);
  };

  const columns = useMemo<ColumnDef<AiPrompt>[]>(() => [
    {
      accessorKey: "prompt_type",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Type
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
      cell: ({ row }) => {
        const type = row.getValue("prompt_type") as string;
        if (type === 'general') return 'General';
        if (type === 'system') return 'System';
        return 'Company Specific';
      },
    },
    {
      accessorFn: (row) => row.prompt_text.length,
      id: "length",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Length
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
      cell: ({ getValue }) => {
        const length = getValue() as number;
        return <span>{length.toLocaleString()}</span>;
      },
    },
    {
      accessorKey: "company",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Company
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
      cell: ({ row }) => {
        const companyId = row.getValue("company");
        if (!companyId) return 'N/A';
        return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
      },
    },
    {
      accessorKey: "updated_at",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Last Updated
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
      cell: ({ row }) => new Date(row.getValue("updated_at")).toLocaleDateString(),
    },
    {
      id: "actions",
      cell: ({ row }) => (
        <div className="flex gap-2 justify-end pr-4">
          <Button
            variant="ghost"
            onClick={() => handleEdit(row.original)}
          >
            <Pencil className="h-4 w-4" />
            Edit
          </Button>
          <Button
            variant="ghost"
            className="text-destructive hover:text-destructive"
            onClick={() => handleDeleteClick(row.original)}
          >
            <Trash2 className="h-4 w-4" />
            Delete
          </Button>
        </div>
      ),
    },
  ], [fieldOptions]);

  const filteredData = useMemo(() => {
    if (!prompts) return [];
    return prompts.filter((prompt) => {
      const searchString = searchQuery.toLowerCase();
      return (
        prompt.prompt_type.toLowerCase().includes(searchString) ||
        (prompt.company && prompt.company.toLowerCase().includes(searchString))
      );
    });
  }, [prompts, searchQuery]);

  const table = useReactTable({
    data: filteredData,
    columns,
    state: {
      sorting,
    },
    onSortingChange: setSorting,
    getSortedRowModel: getSortedRowModel(),
    getCoreRowModel: getCoreRowModel(),
  });

  return (
    <div className="space-y-6">
      <div className="flex items-center justify-between">
        <h2 className="text-2xl font-bold">AI Validation Prompts</h2>
        <Button onClick={handleCreateClick}>
          <PlusCircle className="mr-2 h-4 w-4" />
          Create New Prompt
        </Button>
      </div>

      <div className="flex items-center gap-4">
```
<Input
|
||||
placeholder="Search prompts..."
|
||||
value={searchQuery}
|
||||
onChange={(e) => setSearchQuery(e.target.value)}
|
||||
className="max-w-sm"
|
||||
/>
|
||||
</div>
|
||||
|
||||
{isLoading ? (
|
||||
<div>Loading prompts...</div>
|
||||
) : (
|
||||
<div className="border rounded-lg">
|
||||
<Table>
|
||||
<TableHeader className="bg-muted">
|
||||
{table.getHeaderGroups().map((headerGroup) => (
|
||||
<TableRow key={headerGroup.id}>
|
||||
{headerGroup.headers.map((header) => (
|
||||
<TableHead key={header.id}>
|
||||
{header.isPlaceholder
|
||||
? null
|
||||
: flexRender(
|
||||
header.column.columnDef.header,
|
||||
header.getContext()
|
||||
)}
|
||||
</TableHead>
|
||||
))}
|
||||
</TableRow>
|
||||
))}
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{table.getRowModel().rows?.length ? (
|
||||
table.getRowModel().rows.map((row) => (
|
||||
<TableRow key={row.id} className="hover:bg-gray-100">
|
||||
{row.getVisibleCells().map((cell) => (
|
||||
<TableCell key={cell.id} className="pl-6">
|
||||
{flexRender(cell.column.columnDef.cell, cell.getContext())}
|
||||
</TableCell>
|
||||
))}
|
||||
</TableRow>
|
||||
))
|
||||
) : (
|
||||
<TableRow>
|
||||
<TableCell colSpan={columns.length} className="text-center">
|
||||
No prompts found
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
)}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Prompt Form Dialog */}
|
||||
<Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
|
||||
<DialogContent className="max-w-3xl">
|
||||
<DialogHeader>
|
||||
<DialogTitle>{editingPrompt ? "Edit Prompt" : "Create New Prompt"}</DialogTitle>
|
||||
<DialogDescription>
|
||||
{editingPrompt
|
||||
? "Update this AI validation prompt."
|
||||
: "Create a new AI validation prompt that will be used during product validation."}
|
||||
</DialogDescription>
|
||||
</DialogHeader>
|
||||
|
||||
<form onSubmit={handleSubmit}>
|
||||
<div className="grid gap-4 py-4">
|
||||
<div className="grid gap-2">
|
||||
<Label htmlFor="prompt_type">Prompt Type</Label>
|
||||
<Select
|
||||
value={formData.prompt_type}
|
||||
onValueChange={(value: 'general' | 'company_specific' | 'system') =>
|
||||
setFormData({ ...formData, prompt_type: value })
|
||||
}
|
||||
disabled={(generalPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id) ||
|
||||
(systemPromptExists && formData.prompt_type !== 'system' && !editingPrompt?.id)}
|
||||
>
|
||||
<SelectTrigger>
|
||||
<SelectValue placeholder="Select prompt type" />
|
||||
</SelectTrigger>
|
||||
<SelectContent>
|
||||
<SelectItem
|
||||
value="general"
|
||||
disabled={generalPromptExists && !editingPrompt?.prompt_type?.includes('general')}
|
||||
>
|
||||
General
|
||||
</SelectItem>
|
||||
<SelectItem
|
||||
value="system"
|
||||
disabled={systemPromptExists && !editingPrompt?.prompt_type?.includes('system')}
|
||||
>
|
||||
System
|
||||
</SelectItem>
|
||||
<SelectItem value="company_specific">Company Specific</SelectItem>
|
||||
</SelectContent>
|
||||
</Select>
|
||||
{generalPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id && systemPromptExists && formData.prompt_type !== 'system' && (
|
||||
<p className="text-xs text-muted-foreground">
|
||||
General and system prompts already exist. You can only create company-specific prompts.
|
||||
</p>
|
||||
)}
|
||||
{generalPromptExists && !systemPromptExists && formData.prompt_type !== 'general' && !editingPrompt?.id && (
|
||||
<p className="text-xs text-muted-foreground">
|
||||
A general prompt already exists. You can create a system prompt or company-specific prompts.
|
||||
</p>
|
||||
)}
|
||||
{systemPromptExists && !generalPromptExists && formData.prompt_type !== 'system' && !editingPrompt?.id && (
|
||||
<p className="text-xs text-muted-foreground">
|
||||
A system prompt already exists. You can create a general prompt or company-specific prompts.
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{formData.prompt_type === 'company_specific' && (
|
||||
<div className="grid gap-2">
|
||||
<Label htmlFor="company">Company</Label>
|
||||
<Select
|
||||
value={formData.company || ''}
|
||||
onValueChange={(value) => setFormData({ ...formData, company: value })}
|
||||
required={formData.prompt_type === 'company_specific'}
|
||||
>
|
||||
<SelectTrigger>
|
||||
<SelectValue placeholder="Select company" />
|
||||
</SelectTrigger>
|
||||
<SelectContent>
|
||||
{fieldOptions?.companies.map((company) => (
|
||||
<SelectItem key={company.value} value={company.value}>
|
||||
{company.label}
|
||||
</SelectItem>
|
||||
))}
|
||||
</SelectContent>
|
||||
</Select>
|
||||
</div>
|
||||
)}
|
||||
|
||||
<div className="grid gap-2">
|
||||
<Label htmlFor="prompt_text">Prompt Text</Label>
|
||||
<Textarea
|
||||
id="prompt_text"
|
||||
value={formData.prompt_text}
|
||||
onChange={(e) => setFormData({ ...formData, prompt_text: e.target.value })}
|
||||
placeholder={`Enter your ${formData.prompt_type === 'system' ? 'system instructions' : 'validation prompt'} text...`}
|
||||
className="h-80 font-mono text-sm"
|
||||
required
|
||||
/>
|
||||
{formData.prompt_type === 'system' && (
|
||||
<p className="text-xs text-muted-foreground mt-1">
|
||||
System prompts provide the initial instructions to the AI. This sets the tone and approach for all validations.
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<DialogFooter>
|
||||
<Button type="button" variant="outline" onClick={() => {
|
||||
resetForm();
|
||||
setIsFormOpen(false);
|
||||
}}>
|
||||
Cancel
|
||||
</Button>
|
||||
<Button type="submit">
|
||||
{editingPrompt ? "Update" : "Create"} Prompt
|
||||
</Button>
|
||||
</DialogFooter>
|
||||
</form>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
|
||||
{/* Delete Confirmation Dialog */}
|
||||
<AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
|
||||
<AlertDialogContent>
|
||||
<AlertDialogHeader>
|
||||
<AlertDialogTitle>Delete Prompt</AlertDialogTitle>
|
||||
<AlertDialogDescription>
|
||||
Are you sure you want to delete this prompt? This action cannot be undone.
|
||||
</AlertDialogDescription>
|
||||
</AlertDialogHeader>
|
||||
<AlertDialogFooter>
|
||||
<AlertDialogCancel onClick={() => {
|
||||
setIsDeleteOpen(false);
|
||||
setPromptToDelete(null);
|
||||
}}>
|
||||
Cancel
|
||||
</AlertDialogCancel>
|
||||
<AlertDialogAction onClick={handleDeleteConfirm}>
|
||||
Delete
|
||||
</AlertDialogAction>
|
||||
</AlertDialogFooter>
|
||||
</AlertDialogContent>
|
||||
</AlertDialog>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
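Both tables in this diff filter rows with the same pattern: lowercase the query, then match it against the type field and the (possibly null) company field. A minimal standalone sketch of that filter logic, using a hypothetical `Prompt` shape and sample data (not part of the commit):

```typescript
interface Prompt {
  prompt_type: string;
  company: string | null;
}

// Case-insensitive substring match against prompt_type and company,
// mirroring the filteredData useMemo in the components above.
function filterPrompts(prompts: Prompt[], query: string): Prompt[] {
  const q = query.toLowerCase();
  return prompts.filter(
    (p) =>
      p.prompt_type.toLowerCase().includes(q) ||
      (p.company !== null && p.company.toLowerCase().includes(q))
  );
}

const data: Prompt[] = [
  { prompt_type: "general", company: null },
  { prompt_type: "company_specific", company: "Acme" },
];

console.log(filterPrompts(data, "acme").length); // 1
console.log(filterPrompts(data, "").length);     // 2 (empty query matches all)
```

Note that an empty query matches every row, since `includes("")` is always true; the components rely on this so an empty search box shows the full table.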
773
inventory/src/components/settings/ReusableImageManagement.tsx
Normal file
@@ -0,0 +1,773 @@
import { useState, useMemo, useCallback, useRef, useEffect } from "react";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { Button } from "@/components/ui/button";
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "@/components/ui/table";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { Checkbox } from "@/components/ui/checkbox";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { ArrowUpDown, Pencil, Trash2, PlusCircle, Image, Eye } from "lucide-react";
import config from "@/config";
import {
  useReactTable,
  getCoreRowModel,
  getSortedRowModel,
  SortingState,
  flexRender,
  type ColumnDef,
} from "@tanstack/react-table";
import {
  Dialog,
  DialogContent,
  DialogDescription,
  DialogFooter,
  DialogHeader,
  DialogTitle,
  DialogClose
} from "@/components/ui/dialog";
import {
  AlertDialog,
  AlertDialogAction,
  AlertDialogCancel,
  AlertDialogContent,
  AlertDialogDescription,
  AlertDialogFooter,
  AlertDialogHeader,
  AlertDialogTitle,
} from "@/components/ui/alert-dialog";
import { toast } from "sonner";
import { useDropzone } from "react-dropzone";
import { cn } from "@/lib/utils";

interface FieldOption {
  label: string;
  value: string;
}

interface ImageFormData {
  id?: number;
  name: string;
  is_global: boolean;
  company: string | null;
  file?: File;
}

interface ReusableImage {
  id: number;
  name: string;
  filename: string;
  file_path: string;
  image_url: string;
  is_global: boolean;
  company: string | null;
  mime_type: string;
  file_size: number;
  created_at: string;
  updated_at: string;
}

interface FieldOptions {
  companies: FieldOption[];
}

const ImageForm = ({
  editingImage,
  formData,
  setFormData,
  onSubmit,
  onCancel,
  fieldOptions,
  getRootProps,
  getInputProps,
  isDragActive
}: {
  editingImage: ReusableImage | null;
  formData: ImageFormData;
  // Dispatch type so the callbacks below can pass functional updates
  setFormData: React.Dispatch<React.SetStateAction<ImageFormData>>;
  onSubmit: (e: React.FormEvent) => void;
  onCancel: () => void;
  fieldOptions: FieldOptions | undefined;
  getRootProps: any;
  getInputProps: any;
  isDragActive: boolean;
}) => {
  const handleNameChange = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
    setFormData(prev => ({ ...prev, name: e.target.value }));
  }, [setFormData]);

  const handleGlobalChange = useCallback((checked: boolean) => {
    setFormData(prev => ({
      ...prev,
      is_global: checked,
      company: checked ? null : prev.company
    }));
  }, [setFormData]);

  const handleCompanyChange = useCallback((value: string) => {
    setFormData(prev => ({ ...prev, company: value }));
  }, [setFormData]);

  return (
    <form onSubmit={onSubmit}>
      <div className="grid gap-4 py-4">
        <div className="grid gap-2">
          <Label htmlFor="image_name">Image Name</Label>
          <Input
            id="image_name"
            name="image_name"
            value={formData.name}
            onChange={handleNameChange}
            placeholder="Enter image name"
            required
          />
        </div>

        {!editingImage && (
          <div className="grid gap-2">
            <Label htmlFor="image">Upload Image</Label>
            <div
              {...getRootProps()}
              className={cn(
                "border-2 border-dashed border-secondary-foreground/30 bg-muted/90 rounded-md w-full py-6 flex flex-col items-center justify-center cursor-pointer hover:bg-muted/70 transition-colors",
                isDragActive && "border-primary bg-muted"
              )}
            >
              <input {...getInputProps()} />
              <div className="flex flex-col items-center justify-center py-2">
                {formData.file ? (
                  <>
                    <div className="mb-4">
                      <ImagePreview file={formData.file} />
                    </div>
                    <div className="flex items-center gap-2 mb-2">
                      <Image className="h-4 w-4 text-primary" />
                      <span className="text-sm">{formData.file.name}</span>
                    </div>
                    <p className="text-xs text-muted-foreground">Click or drag to replace</p>
                  </>
                ) : isDragActive ? (
                  <>
                    <Image className="h-8 w-8 mb-2 text-primary" />
                    <p className="text-base text-muted-foreground">Drop image here</p>
                  </>
                ) : (
                  <>
                    <Image className="h-8 w-8 mb-2 text-muted-foreground" />
                    <p className="text-base text-muted-foreground">Click or drag to upload</p>
                  </>
                )}
              </div>
            </div>
          </div>
        )}

        <div className="flex items-center space-x-2">
          <Checkbox
            id="is_global"
            checked={formData.is_global}
            onCheckedChange={handleGlobalChange}
          />
          <Label htmlFor="is_global">Available for all companies</Label>
        </div>

        {!formData.is_global && (
          <div className="grid gap-2">
            <Label htmlFor="company">Company</Label>
            <Select
              value={formData.company || ''}
              onValueChange={handleCompanyChange}
              required={!formData.is_global}
            >
              <SelectTrigger>
                <SelectValue placeholder="Select company" />
              </SelectTrigger>
              <SelectContent>
                {fieldOptions?.companies.map((company) => (
                  <SelectItem key={company.value} value={company.value}>
                    {company.label}
                  </SelectItem>
                ))}
              </SelectContent>
            </Select>
          </div>
        )}
      </div>

      <DialogFooter>
        <Button type="button" variant="outline" onClick={onCancel}>
          Cancel
        </Button>
        <Button type="submit">
          {editingImage ? "Update" : "Upload"} Image
        </Button>
      </DialogFooter>
    </form>
  );
};

export function ReusableImageManagement() {
  const [isFormOpen, setIsFormOpen] = useState(false);
  const [isDeleteOpen, setIsDeleteOpen] = useState(false);
  const [isPreviewOpen, setIsPreviewOpen] = useState(false);
  const [imageToDelete, setImageToDelete] = useState<ReusableImage | null>(null);
  const [previewImage, setPreviewImage] = useState<ReusableImage | null>(null);
  const [editingImage, setEditingImage] = useState<ReusableImage | null>(null);
  const [sorting, setSorting] = useState<SortingState>([
    { id: "created_at", desc: true }
  ]);
  const [searchQuery, setSearchQuery] = useState("");
  const [formData, setFormData] = useState<ImageFormData>({
    name: "",
    is_global: false,
    company: null,
    file: undefined
  });

  const queryClient = useQueryClient();

  const { data: images, isLoading } = useQuery<ReusableImage[]>({
    queryKey: ["reusable-images"],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/reusable-images`);
      if (!response.ok) {
        throw new Error("Failed to fetch reusable images");
      }
      return response.json();
    },
  });

  const { data: fieldOptions } = useQuery<FieldOptions>({
    queryKey: ["fieldOptions"],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/import/field-options`);
      if (!response.ok) {
        throw new Error("Failed to fetch field options");
      }
      return response.json();
    },
  });

  const createMutation = useMutation({
    mutationFn: async (data: ImageFormData) => {
      // Create FormData for file upload
      const formData = new FormData();
      formData.append('name', data.name);
      formData.append('is_global', String(data.is_global));

      if (!data.is_global && data.company) {
        formData.append('company', data.company);
      }

      if (data.file) {
        formData.append('image', data.file);
      } else {
        throw new Error("Image file is required");
      }

      const response = await fetch(`${config.apiUrl}/reusable-images/upload`, {
        method: "POST",
        body: formData,
      });

      if (!response.ok) {
        const error = await response.json();
        throw new Error(error.message || error.error || "Failed to upload image");
      }

      return response.json();
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
      toast.success("Image uploaded successfully");
      resetForm();
    },
    onError: (error) => {
      toast.error(error instanceof Error ? error.message : "Failed to upload image");
    },
  });

  const updateMutation = useMutation({
    mutationFn: async (data: ImageFormData) => {
      if (!data.id) throw new Error("Image ID is required for update");

      const response = await fetch(`${config.apiUrl}/reusable-images/${data.id}`, {
        method: "PUT",
        headers: {
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          name: data.name,
          is_global: data.is_global,
          company: data.is_global ? null : data.company
        }),
      });

      if (!response.ok) {
        const error = await response.json();
        throw new Error(error.message || error.error || "Failed to update image");
      }

      return response.json();
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
      toast.success("Image updated successfully");
      resetForm();
    },
    onError: (error) => {
      toast.error(error instanceof Error ? error.message : "Failed to update image");
    },
  });

  const deleteMutation = useMutation({
    mutationFn: async (id: number) => {
      const response = await fetch(`${config.apiUrl}/reusable-images/${id}`, {
        method: "DELETE",
      });
      if (!response.ok) {
        throw new Error("Failed to delete image");
      }
    },
    onSuccess: () => {
      queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
      toast.success("Image deleted successfully");
    },
    onError: (error) => {
      toast.error(error instanceof Error ? error.message : "Failed to delete image");
    },
  });

  const handleEdit = (image: ReusableImage) => {
    setEditingImage(image);
    setFormData({
      id: image.id,
      name: image.name,
      is_global: image.is_global,
      company: image.company,
    });
    setIsFormOpen(true);
  };

  const handleDeleteClick = (image: ReusableImage) => {
    setImageToDelete(image);
    setIsDeleteOpen(true);
  };

  const handlePreview = (image: ReusableImage) => {
    setPreviewImage(image);
    setIsPreviewOpen(true);
  };

  const handleDeleteConfirm = () => {
    if (imageToDelete) {
      deleteMutation.mutate(imageToDelete.id);
      setIsDeleteOpen(false);
      setImageToDelete(null);
    }
  };

  const handleSubmit = (e: React.FormEvent) => {
    e.preventDefault();

    // If is_global is true, ensure company is null
    const submitData = {
      ...formData,
      company: formData.is_global ? null : formData.company,
    };

    if (editingImage) {
      updateMutation.mutate(submitData);
    } else {
      if (!submitData.file) {
        toast.error("Please select an image file");
        return;
      }
      createMutation.mutate(submitData);
    }
  };

  const resetForm = () => {
    setFormData({
      name: "",
      is_global: false,
      company: null,
      file: undefined
    });
    setEditingImage(null);
    setIsFormOpen(false);
  };

  const handleCreateClick = () => {
    resetForm();
    setIsFormOpen(true);
  };

  // Configure dropzone for image uploads
  const onDrop = useCallback((acceptedFiles: File[]) => {
    if (acceptedFiles.length > 0) {
      const file = acceptedFiles[0]; // Take only the first file
      setFormData(prev => ({
        ...prev,
        file
      }));
    }
  }, []);

  const { getRootProps, getInputProps, isDragActive } = useDropzone({
    accept: {
      'image/*': ['.jpeg', '.jpg', '.png', '.gif', '.webp']
    },
    onDrop,
    multiple: false // Only accept single files
  });

  const columns = useMemo<ColumnDef<ReusableImage>[]>(() => [
    {
      accessorKey: "name",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Name
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
    },
    {
      accessorKey: "is_global",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Type
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
      cell: ({ row }) => {
        const isGlobal = row.getValue("is_global") as boolean;
        return isGlobal ? "Global" : "Company Specific";
      },
    },
    {
      accessorKey: "company",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Company
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
      cell: ({ row }) => {
        const isGlobal = row.getValue("is_global") as boolean;
        if (isGlobal) return 'N/A';

        const companyId = row.getValue("company");
        if (!companyId) return 'None';
        return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
      },
    },
    {
      accessorKey: "file_size",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Size
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
      cell: ({ row }) => {
        const size = row.getValue("file_size") as number;
        return `${(size / 1024).toFixed(1)} KB`;
      },
    },
    {
      accessorKey: "created_at",
      header: ({ column }) => (
        <Button
          variant="ghost"
          onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
        >
          Created
          <ArrowUpDown className="ml-2 h-4 w-4" />
        </Button>
      ),
      cell: ({ row }) => new Date(row.getValue("created_at")).toLocaleDateString(),
    },
    {
      accessorKey: "image_url",
      header: "Thumbnail",
      cell: ({ row }) => (
        <div className="flex items-center justify-center">
          <img
            src={row.getValue("image_url") as string}
            alt={row.getValue("name") as string}
            className="w-10 h-10 object-contain border rounded"
          />
        </div>
      ),
    },
    {
      id: "actions",
      cell: ({ row }) => (
        <div className="flex gap-2 justify-end">
          <Button
            variant="ghost"
            size="icon"
            onClick={() => handlePreview(row.original)}
            title="Preview Image"
          >
            <Eye className="h-4 w-4" />
          </Button>
          <Button
            variant="ghost"
            size="icon"
            onClick={() => handleEdit(row.original)}
            title="Edit Image"
          >
            <Pencil className="h-4 w-4" />
          </Button>
          <Button
            variant="ghost"
            size="icon"
            className="text-destructive hover:text-destructive"
            onClick={() => handleDeleteClick(row.original)}
            title="Delete Image"
          >
            <Trash2 className="h-4 w-4" />
          </Button>
        </div>
      ),
    },
  ], [fieldOptions]);

  const filteredData = useMemo(() => {
    if (!images) return [];
    return images.filter((image) => {
      const searchString = searchQuery.toLowerCase();
      return (
        image.name.toLowerCase().includes(searchString) ||
        (image.is_global ? "global" : "company").includes(searchString) ||
        (image.company && image.company.toLowerCase().includes(searchString))
      );
    });
  }, [images, searchQuery]);

  const table = useReactTable({
    data: filteredData,
    columns,
    state: {
      sorting,
    },
    onSortingChange: setSorting,
    getSortedRowModel: getSortedRowModel(),
    getCoreRowModel: getCoreRowModel(),
  });

  return (
    <div className="space-y-6">
      <div className="flex items-center justify-between">
        <h2 className="text-2xl font-bold">Reusable Images</h2>
        <Button onClick={handleCreateClick}>
          <PlusCircle className="mr-2 h-4 w-4" />
          Upload New Image
        </Button>
      </div>

      <div className="flex items-center gap-4">
        <Input
          placeholder="Search images..."
          value={searchQuery}
          onChange={(e) => setSearchQuery(e.target.value)}
          className="max-w-sm"
        />
      </div>

      {isLoading ? (
        <div>Loading images...</div>
      ) : (
        <div className="border rounded-lg">
          <Table>
            <TableHeader className="bg-muted">
              {table.getHeaderGroups().map((headerGroup) => (
                <TableRow key={headerGroup.id}>
                  {headerGroup.headers.map((header) => (
                    <TableHead key={header.id}>
                      {header.isPlaceholder
                        ? null
                        : flexRender(
                            header.column.columnDef.header,
                            header.getContext()
                          )}
                    </TableHead>
                  ))}
                </TableRow>
              ))}
            </TableHeader>
            <TableBody>
              {table.getRowModel().rows?.length ? (
                table.getRowModel().rows.map((row) => (
                  <TableRow key={row.id} className="hover:bg-gray-100">
                    {row.getVisibleCells().map((cell) => (
                      <TableCell key={cell.id} className="pl-6">
                        {flexRender(cell.column.columnDef.cell, cell.getContext())}
                      </TableCell>
                    ))}
                  </TableRow>
                ))
              ) : (
                <TableRow>
                  <TableCell colSpan={columns.length} className="text-center">
                    No images found
                  </TableCell>
                </TableRow>
              )}
            </TableBody>
          </Table>
        </div>
      )}

      {/* Image Form Dialog */}
      <Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
        <DialogContent className="max-w-md">
          <DialogHeader>
            <DialogTitle>{editingImage ? "Edit Image" : "Upload New Image"}</DialogTitle>
            <DialogDescription>
              {editingImage
                ? "Update this reusable image's details."
                : "Upload a new reusable image that can be used across products."}
            </DialogDescription>
          </DialogHeader>

          <ImageForm
            editingImage={editingImage}
            formData={formData}
            setFormData={setFormData}
            onSubmit={handleSubmit}
            onCancel={() => {
              resetForm();
              setIsFormOpen(false);
            }}
            fieldOptions={fieldOptions}
            getRootProps={getRootProps}
            getInputProps={getInputProps}
            isDragActive={isDragActive}
          />
        </DialogContent>
      </Dialog>

      {/* Delete Confirmation Dialog */}
      <AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
        <AlertDialogContent>
          <AlertDialogHeader>
            <AlertDialogTitle>Delete Image</AlertDialogTitle>
            <AlertDialogDescription>
              Are you sure you want to delete this image? This action cannot be undone.
            </AlertDialogDescription>
          </AlertDialogHeader>
          <AlertDialogFooter>
            <AlertDialogCancel onClick={() => {
              setIsDeleteOpen(false);
              setImageToDelete(null);
            }}>
              Cancel
            </AlertDialogCancel>
            <AlertDialogAction onClick={handleDeleteConfirm}>
              Delete
            </AlertDialogAction>
          </AlertDialogFooter>
        </AlertDialogContent>
      </AlertDialog>

      {/* Preview Dialog */}
      <Dialog open={isPreviewOpen} onOpenChange={setIsPreviewOpen}>
        <DialogContent className="max-w-3xl">
          <DialogHeader>
            <DialogTitle>{previewImage?.name}</DialogTitle>
            <DialogDescription>
              {previewImage?.is_global
                ? "Global image"
                : `Company specific image for ${fieldOptions?.companies.find(c => c.value === previewImage?.company)?.label}`}
            </DialogDescription>
          </DialogHeader>

          <div className="flex justify-center p-4">
            {previewImage && (
              <div className="bg-checkerboard rounded-md overflow-hidden">
                <img
                  src={previewImage.image_url}
                  alt={previewImage.name}
                  className="max-h-[500px] max-w-full object-contain"
                />
              </div>
            )}
          </div>

          <div className="grid grid-cols-2 gap-4 text-sm">
            <div>
              <span className="font-medium">Filename:</span> {previewImage?.filename}
            </div>
            <div>
              <span className="font-medium">Size:</span> {previewImage && `${(previewImage.file_size / 1024).toFixed(1)} KB`}
            </div>
            <div>
              <span className="font-medium">Type:</span> {previewImage?.mime_type}
            </div>
            <div>
              <span className="font-medium">Uploaded:</span> {previewImage && new Date(previewImage.created_at).toLocaleString()}
            </div>
          </div>

          <DialogFooter>
            <DialogClose asChild>
              <Button>Close</Button>
            </DialogClose>
          </DialogFooter>
        </DialogContent>
      </Dialog>

      <style jsx global>{`
        .bg-checkerboard {
          background-image: linear-gradient(45deg, #f0f0f0 25%, transparent 25%),
            linear-gradient(-45deg, #f0f0f0 25%, transparent 25%),
            linear-gradient(45deg, transparent 75%, #f0f0f0 75%),
            linear-gradient(-45deg, transparent 75%, #f0f0f0 75%);
          background-size: 20px 20px;
          background-position: 0 0, 0 10px, 10px -10px, -10px 0px;
        }
      `}</style>
    </div>
  );
}

const ImagePreview = ({ file }: { file: File }) => {
  const [previewUrl, setPreviewUrl] = useState<string>('');

  useEffect(() => {
    const url = URL.createObjectURL(file);
    setPreviewUrl(url);
    return () => {
      URL.revokeObjectURL(url);
    };
  }, [file]);

  return (
    <img
      src={previewUrl}
      alt="Preview"
      className="max-h-32 max-w-full object-contain rounded-md"
    />
  );
};
338
inventory/src/components/settings/UserForm.tsx
Normal file
@@ -0,0 +1,338 @@
|
||||
import { useState, useEffect } from "react";
import { z } from "zod";
import { useForm } from "react-hook-form";
import { zodResolver } from "@hookform/resolvers/zod";
import {
  Form,
  FormControl,
  FormDescription,
  FormField,
  FormItem,
  FormLabel,
  FormMessage
} from "@/components/ui/form";
import { Input } from "@/components/ui/input";
import { Button } from "@/components/ui/button";
import { Switch } from "@/components/ui/switch";
import { PermissionSelector } from "./PermissionSelector";
import { Alert, AlertDescription } from "@/components/ui/alert";

interface Permission {
  id: number;
  name: string;
  code: string;
  description?: string;
  category?: string;
}

interface User {
  id: number;
  username: string;
  email?: string;
  is_admin: boolean;
  is_active: boolean;
  permissions?: Permission[];
}

interface PermissionCategory {
  category: string;
  permissions: Permission[];
}

interface UserFormProps {
  user: User | null;
  permissions: PermissionCategory[];
  onSave: (userData: any) => void;
  onCancel: () => void;
}

// Form validation schema
const userFormSchema = z.object({
  username: z.string().min(3, { message: "Username must be at least 3 characters" }),
  email: z.string().email({ message: "Please enter a valid email" }).optional().or(z.literal("")),
  password: z.string().min(6, { message: "Password must be at least 6 characters" }).optional().or(z.literal("")),
  is_admin: z.boolean().default(false),
  is_active: z.boolean().default(true),
});

type FormValues = z.infer<typeof userFormSchema>;

// Helper function to get all permission IDs from all categories
const getAllPermissionIds = (permissionCategories: PermissionCategory[]): number[] => {
  const allIds: number[] = [];

  if (permissionCategories && permissionCategories.length > 0) {
    permissionCategories.forEach(category => {
      category.permissions.forEach(permission => {
        allIds.push(permission.id);
      });
    });
  }

  return allIds;
};

// User save data interface (represents the data structure for saving users)
interface UserSaveData {
  id?: number;
  username: string;
  email?: string;
  password?: string;
  is_admin: boolean;
  is_active: boolean;
  permissions: Permission[];
}

export function UserForm({ user, permissions, onSave, onCancel }: UserFormProps) {
  const [selectedPermissions, setSelectedPermissions] = useState<number[]>([]);
  const [formError, setFormError] = useState<string | null>(null);

  // Initialize the form with React Hook Form
  const form = useForm<FormValues>({
    resolver: zodResolver(userFormSchema),
    defaultValues: {
      username: user?.username || "",
      email: user?.email || "",
      password: "", // Don't pre-fill password
      is_admin: user?.is_admin || false,
      is_active: user?.is_active !== false,
    },
  });

  // Initialize selected permissions
  useEffect(() => {
    console.log("User permissions:", user?.permissions);

    if (user?.permissions && Array.isArray(user.permissions) && user.permissions.length > 0) {
      // Extract IDs from the permissions
      const permissionIds = user.permissions.map(p => p.id);
      console.log("Setting selected permissions:", permissionIds);
      setSelectedPermissions(permissionIds);
    } else {
      console.log("No permissions found or empty permissions array");
      setSelectedPermissions([]);
    }
  }, [user]);

  // Handle form submission
  const onSubmit = (data: FormValues) => {
    try {
      setFormError(null);
      console.log("Form submitted with permissions:", selectedPermissions);

      // Validate
      if (!user && !data.password) {
        setFormError("Password is required for new users");
        return;
      }

      // Prepare the data
      const userData: UserSaveData = {
        ...data,
        id: user?.id, // Include ID if editing existing user
        permissions: [] // Initialize with empty array
      };

      // If editing and password is empty, remove it
      if (user && !userData.password) {
        delete userData.password;
      }

      // Add permissions if not admin
      if (!data.is_admin) {
        // Find the actual permission objects from selectedPermissions IDs
        const selectedPermissionObjects: Permission[] = [];

        if (permissions && permissions.length > 0) {
          // Loop through all available permissions to find the selected ones
          permissions.forEach(category => {
            category.permissions.forEach(permission => {
              // If this permission's ID is in the selectedPermissions array, add it
              if (selectedPermissions.includes(permission.id)) {
                selectedPermissionObjects.push(permission);
              }
            });
          });
        }

        userData.permissions = selectedPermissionObjects;
      } else {
        // For admin users, don't send permissions as they're implied
        userData.permissions = [];
      }

      console.log("Saving user data:", userData);
      onSave(userData);
    } catch (error) {
      const errorMessage = error instanceof Error ? error.message : "An error occurred";
      setFormError(errorMessage);
    }
  };

  // For debugging
  console.log("Current form state:", form.getValues());
  console.log("Available permissions categories:", permissions);
  console.log("Selected permissions:", selectedPermissions);
  console.log("Is admin:", form.watch("is_admin"));

  return (
    <div className="space-y-6">
      <div>
        <h2 className="text-2xl font-bold mb-2">{user ? "Edit User" : "Add New User"}</h2>
        <p className="text-muted-foreground">
          {user ? "Update the user's information and permissions" : "Create a new user account"}
        </p>
      </div>

      {formError && (
        <Alert variant="destructive">
          <AlertDescription>{formError}</AlertDescription>
        </Alert>
      )}

      <Form {...form}>
        <form onSubmit={form.handleSubmit(onSubmit)} className="space-y-6">
          <FormField
            control={form.control}
            name="username"
            render={({ field }) => (
              <FormItem>
                <FormLabel>Username</FormLabel>
                <FormControl>
                  <Input {...field} />
                </FormControl>
                <FormMessage />
              </FormItem>
            )}
          />

          <FormField
            control={form.control}
            name="email"
            render={({ field }) => (
              <FormItem>
                <FormLabel>Email</FormLabel>
                <FormControl>
                  <Input {...field} />
                </FormControl>
                <FormMessage />
              </FormItem>
            )}
          />

          <FormField
            control={form.control}
            name="password"
            render={({ field }) => (
              <FormItem>
                <FormLabel>{user ? "New Password" : "Password"}</FormLabel>
                <FormControl>
                  <Input
                    type="password"
                    {...field}
                    placeholder={user ? "Leave blank to keep current password" : ""}
                  />
                </FormControl>
                {user && (
                  <FormDescription>
                    Leave blank to keep the current password
                  </FormDescription>
                )}
                <FormMessage />
              </FormItem>
            )}
          />

          <div className="grid grid-cols-1 md:grid-cols-2 gap-6">
            <FormField
              control={form.control}
              name="is_admin"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between p-4 border rounded-md">
                  <div>
                    <FormLabel>Administrator</FormLabel>
                    <FormDescription>
                      Administrators have access to all permissions
                    </FormDescription>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />

            <FormField
              control={form.control}
              name="is_active"
              render={({ field }) => (
                <FormItem className="flex flex-row items-center justify-between p-4 border rounded-md">
                  <div>
                    <FormLabel>Active</FormLabel>
                    <FormDescription>
                      Inactive users cannot log in
                    </FormDescription>
                  </div>
                  <FormControl>
                    <Switch
                      checked={field.value}
                      onCheckedChange={field.onChange}
                    />
                  </FormControl>
                </FormItem>
              )}
            />
          </div>

          {permissions && permissions.length > 0 && (
            <>
              {form.watch("is_admin") ? (
                <div className="space-y-4">
                  <h3 className="text-lg font-medium">Permissions</h3>
                  <Alert>
                    <AlertDescription>
                      Administrators have access to all permissions by default. Individual permissions cannot be edited for admin users.
                    </AlertDescription>
                  </Alert>
                  <PermissionSelector
                    permissionsByCategory={permissions}
                    selectedPermissions={getAllPermissionIds(permissions)}
                    onChange={() => {}}
                    disabled={true}
                  />
                </div>
              ) : (
                <>
                  <PermissionSelector
                    permissionsByCategory={permissions}
                    selectedPermissions={selectedPermissions}
                    onChange={setSelectedPermissions}
                  />
                  {selectedPermissions.length === 0 && (
                    <Alert>
                      <AlertDescription>
                        Warning: This user has no permissions selected. They won't be able to access anything.
                      </AlertDescription>
                    </Alert>
                  )}
                </>
              )}
            </>
          )}

          <div className="flex justify-end space-x-4">
            <Button type="button" variant="outline" onClick={onCancel}>
              Cancel
            </Button>
            <Button type="submit">
              {user ? "Update User" : "Create User"}
            </Button>
          </div>
        </form>
      </Form>
    </div>
  );
}
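The nested `forEach` in `onSubmit` that maps the selected permission IDs back to full permission objects can be sketched as a standalone helper. This is a minimal illustration, not part of the diff; the helper name `resolveSelectedPermissions` and the sample data are hypothetical:

```typescript
interface Permission {
  id: number;
  name: string;
  code: string;
}

interface PermissionCategory {
  category: string;
  permissions: Permission[];
}

// Flatten every category, then keep only the permissions whose id
// appears in the selected-ID list -- the same filtering onSubmit performs.
function resolveSelectedPermissions(
  categories: PermissionCategory[],
  selectedIds: number[]
): Permission[] {
  return categories
    .flatMap(category => category.permissions)
    .filter(permission => selectedIds.includes(permission.id));
}

// Hypothetical sample data matching the PermissionCategory shape.
const categories: PermissionCategory[] = [
  {
    category: "Pages",
    permissions: [
      { id: 1, name: "Access Products", code: "access:products" },
      { id: 2, name: "Access Users", code: "access:users" },
    ],
  },
  {
    category: "Actions",
    permissions: [{ id: 3, name: "Create Products", code: "create:products" }],
  },
];

console.log(resolveSelectedPermissions(categories, [1, 3]).map(p => p.code));
// → ["access:products", "create:products"]
```

Keeping this as a pure function would also make the selection logic unit-testable independently of the form component.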
94 inventory/src/components/settings/UserList.tsx Normal file
@@ -0,0 +1,94 @@
import { formatDistanceToNow } from "date-fns";
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import { Pencil, Trash2 } from "lucide-react";

interface User {
  id: number;
  username: string;
  email?: string;
  is_admin: boolean;
  is_active: boolean;
  last_login?: string;
}

interface UserListProps {
  users: User[];
  onEdit: (userId: number) => void;
  onDelete: (userId: number) => void;
}

export function UserList({ users, onEdit, onDelete }: UserListProps) {
  console.log("Rendering user list with users:", users);

  if (users.length === 0) {
    return (
      <div className="text-center py-8">
        <p className="text-muted-foreground">No users found</p>
      </div>
    );
  }

  return (
    <Table>
      <TableHeader>
        <TableRow>
          <TableHead>Username</TableHead>
          <TableHead>Email</TableHead>
          <TableHead>Admin</TableHead>
          <TableHead>Status</TableHead>
          <TableHead>Last Login</TableHead>
          <TableHead className="text-right">Actions</TableHead>
        </TableRow>
      </TableHeader>
      <TableBody>
        {users.map((user) => (
          <TableRow key={user.id}>
            <TableCell className="font-medium">{user.username}</TableCell>
            <TableCell>{user.email || '-'}</TableCell>
            <TableCell>
              {user.is_admin ? (
                <Badge variant="default">Admin</Badge>
              ) : (
                <Badge variant="outline">No</Badge>
              )}
            </TableCell>
            <TableCell>
              {user.is_active ? (
                <Badge variant="default" className="bg-green-100 text-green-800 hover:bg-green-100">Active</Badge>
              ) : (
                <Badge variant="secondary" className="bg-slate-100">Inactive</Badge>
              )}
            </TableCell>
            <TableCell>
              {user.last_login
                ? formatDistanceToNow(new Date(user.last_login), { addSuffix: true })
                : 'Never'}
            </TableCell>
            <TableCell className="text-right space-x-2">
              <Button
                variant="outline"
                size="sm"
                onClick={() => onEdit(user.id)}
                className="h-8 w-8 p-0"
              >
                <span className="sr-only">Edit</span>
                <Pencil className="h-4 w-4" />
              </Button>
              <Button
                variant="outline"
                size="sm"
                onClick={() => onDelete(user.id)}
                className="h-8 w-8 p-0"
              >
                <span className="sr-only">Delete</span>
                <Trash2 className="h-4 w-4" />
              </Button>
            </TableCell>
          </TableRow>
        ))}
      </TableBody>
    </Table>
  );
}
353 inventory/src/components/settings/UserManagement.tsx Normal file
@@ -0,0 +1,353 @@
import { useState, useEffect, useContext } from "react";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Alert, AlertDescription, AlertTitle } from "@/components/ui/alert";
import { Button } from "@/components/ui/button";
import { UserList } from "./UserList";
import { UserForm } from "./UserForm";
import config from "@/config";
import { AuthContext } from "@/contexts/AuthContext";
import { ShieldAlert } from "lucide-react";

interface User {
  id: number;
  username: string;
  email?: string;
  is_admin: boolean;
  is_active: boolean;
  last_login?: string;
  permissions?: Permission[];
}

interface Permission {
  id: number;
  name: string;
  code: string;
  description?: string;
  category?: string;
}

interface PermissionCategory {
  category: string;
  permissions: Permission[];
}

export function UserManagement() {
  const { token, fetchCurrentUser } = useContext(AuthContext);
  const [users, setUsers] = useState<User[]>([]);
  const [selectedUser, setSelectedUser] = useState<User | null>(null);
  const [isAddingUser, setIsAddingUser] = useState(false);
  const [permissions, setPermissions] = useState<PermissionCategory[]>([]);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState<string | null>(null);

  // Fetch users and permissions
  const fetchData = async () => {
    if (!token) {
      setError("Authentication required. Please log in again.");
      setLoading(false);
      return;
    }

    // The PermissionGuard component already handles permission checks,
    // so we don't need to duplicate that logic here

    try {
      setLoading(true);
      setError(null);

      // Fetch users
      const usersResponse = await fetch(`${config.authUrl}/users`, {
        headers: {
          'Authorization': `Bearer ${token}`
        }
      });

      if (!usersResponse.ok) {
        if (usersResponse.status === 401) {
          throw new Error('Authentication failed. Please log in again.');
        } else if (usersResponse.status === 403) {
          throw new Error('You don\'t have permission to access the user list.');
        } else {
          // Try to get more detailed error message from response
          try {
            const errorData = await usersResponse.json();
            throw new Error(errorData.error || `Failed to fetch users (${usersResponse.status})`);
          } catch (e) {
            throw new Error(`Failed to fetch users (${usersResponse.status})`);
          }
        }
      }

      const usersData = await usersResponse.json();
      setUsers(usersData);

      // Fetch permissions
      const permissionsResponse = await fetch(`${config.authUrl}/permissions/categories`, {
        headers: {
          'Authorization': `Bearer ${token}`
        }
      });

      if (!permissionsResponse.ok) {
        if (permissionsResponse.status === 401) {
          throw new Error('Authentication failed. Please log in again.');
        } else if (permissionsResponse.status === 403) {
          throw new Error('You don\'t have permission to access permissions.');
        } else {
          // Try to get more detailed error message from response
          try {
            const errorData = await permissionsResponse.json();
            throw new Error(errorData.error || `Failed to fetch permissions (${permissionsResponse.status})`);
          } catch (e) {
            throw new Error(`Failed to fetch permissions (${permissionsResponse.status})`);
          }
        }
      }

      const permissionsData = await permissionsResponse.json();
      setPermissions(permissionsData);

      setLoading(false);
    } catch (err) {
      const errorMessage = err instanceof Error ? err.message : 'An error occurred';
      setError(errorMessage);
      setLoading(false);

      // If authentication error, refresh the token
      if (err instanceof Error && err.message.includes('Authentication failed')) {
        fetchCurrentUser().catch(() => {
          // Handle failed token refresh
        });
      }
    }
  };

  useEffect(() => {
    fetchData();
  }, [token]);

  const handleEditUser = async (userId: number) => {
    try {
      const response = await fetch(`${config.authUrl}/users/${userId}`, {
        headers: {
          'Authorization': `Bearer ${token}`
        }
      });

      if (!response.ok) {
        if (response.status === 401) {
          throw new Error('Authentication failed. Please log in again.');
        } else if (response.status === 403) {
          throw new Error('You don\'t have permission to edit users.');
        } else {
          throw new Error(`Failed to fetch user details (${response.status})`);
        }
      }

      const userData = await response.json();
      console.log("Fetched user data for editing:", userData);

      // Ensure permissions is always an array
      if (!userData.permissions) {
        userData.permissions = [];
      }

      // Make sure the permissions are in the right format for the form
      // The server might send either an array of permission objects or just permission codes
      if (userData.permissions && userData.permissions.length > 0) {
        // Check if permissions are objects with id property
        if (typeof userData.permissions[0] === 'string') {
          // If we just have permission codes, we need to convert them to objects with ids
          // by looking them up in the permissions data
          const permissionObjects = [];

          // Go through each permission category
          for (const category of permissions) {
            // For each permission in the category
            for (const permission of category.permissions) {
              // If this permission's code is in the user's permission codes
              if (userData.permissions.includes(permission.code)) {
                permissionObjects.push(permission);
              }
            }
          }

          userData.permissions = permissionObjects;
        }
      }

      setSelectedUser(userData);
      setIsAddingUser(false);
    } catch (err) {
      const errorMessage = err instanceof Error ? err.message : 'Failed to load user details';
      setError(errorMessage);
    }
  };

  const handleAddUser = () => {
    setSelectedUser(null);
    setIsAddingUser(true);
  };

  const handleSaveUser = async (userData: any) => {
    console.log("Saving user data:", userData);

    // Format permissions for the API - convert from permission objects to IDs
    let formattedUserData = { ...userData };

    if (userData.permissions && Array.isArray(userData.permissions)) {
      // Check if permissions are objects (from the form) and convert to IDs for the API
      if (userData.permissions.length > 0 && typeof userData.permissions[0] === 'object') {
        // The backend expects permission IDs, not just the code strings
        formattedUserData.permissions = userData.permissions.map((p: { id: any; }) => p.id);
      }
    }

    console.log("Formatted user data for API:", formattedUserData);

    setLoading(true);

    try {
      // Use PUT for updating, POST for creating
      const method = userData.id ? 'PUT' : 'POST';
      const endpoint = userData.id
        ? `${config.authUrl}/users/${userData.id}`
        : `${config.authUrl}/users`;

      console.log(`${method} request to ${endpoint}`);

      const response = await fetch(endpoint, {
        method,
        headers: {
          'Content-Type': 'application/json',
          'Authorization': `Bearer ${token}`
        },
        body: JSON.stringify(formattedUserData)
      });

      let responseData;

      try {
        responseData = await response.json();
      } catch (e) {
        console.error("Error parsing response JSON:", e);
        throw new Error("Invalid response from server");
      }

      if (!response.ok) {
        console.error("Error response from server:", responseData);
        throw new Error(responseData.error || responseData.message || `Failed to save user (${response.status})`);
      }

      console.log("Server response after saving user:", responseData);

      // Reset the form state
      setSelectedUser(null);
      setIsAddingUser(false);

      // Refresh the user list
      fetchData();

    } catch (err) {
      const errorMessage = err instanceof Error ? err.message : 'Failed to save user';
      setError(errorMessage);
    } finally {
      setLoading(false);
    }
  };

  const handleDeleteUser = async (userId: number) => {
    if (!window.confirm('Are you sure you want to delete this user?')) {
      return;
    }

    try {
      setLoading(true);

      const response = await fetch(`${config.authUrl}/users/${userId}`, {
        method: 'DELETE',
        headers: {
          'Authorization': `Bearer ${token}`
        }
      });

      if (!response.ok) {
        throw new Error('Failed to delete user');
      }

      // Refresh user list after a successful delete
      await fetchData();
    } catch (err) {
      const errorMessage = err instanceof Error ? err.message : 'Failed to delete user';
      setError(errorMessage);
      setLoading(false);
    }
  };

  const handleCancel = () => {
    setSelectedUser(null);
    setIsAddingUser(false);
  };

  if (loading && users.length === 0) {
    return (
      <Card>
        <CardContent className="pt-6">
          <div className="flex items-center justify-center h-40">
            <p>Loading user data...</p>
          </div>
        </CardContent>
      </Card>
    );
  }

  if (error) {
    return (
      <Card>
        <CardContent className="pt-6">
          <Alert variant="destructive" className="mb-4">
            <ShieldAlert className="h-4 w-4" />
            <AlertTitle>Permission Error</AlertTitle>
            <AlertDescription>{error}</AlertDescription>
          </Alert>
          <div className="flex justify-center">
            <Button onClick={fetchData}>Retry</Button>
          </div>
        </CardContent>
      </Card>
    );
  }

  return (
    <Card>
      {(selectedUser || isAddingUser) ? (
        <CardContent className="p-6">
          <UserForm
            user={selectedUser}
            permissions={permissions}
            onSave={handleSaveUser}
            onCancel={handleCancel}
          />
        </CardContent>
      ) : (
        <>
          <CardHeader className="flex flex-row items-center justify-between">
            <div>
              <CardTitle>User Management</CardTitle>
            </div>
            <Button onClick={handleAddUser}>
              Add User
            </Button>
          </CardHeader>
          <CardContent>
            <UserList
              users={users}
              onEdit={handleEditUser}
              onDelete={handleDeleteUser}
            />
          </CardContent>
        </>
      )}
    </Card>
  );
}
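The payload normalization at the top of `handleSaveUser` (full permission objects from the form, bare IDs for the API) can be sketched in isolation. The helper name `formatPermissionsForApi` is hypothetical, assuming the same backend contract as the component above:

```typescript
interface SavePayload {
  username: string;
  permissions?: Array<{ id: number } | number>;
  [key: string]: unknown;
}

// If the form handed us full permission objects, strip them down to the
// bare IDs the backend expects; an array that already holds IDs passes
// through unchanged.
function formatPermissionsForApi(userData: SavePayload): SavePayload {
  const formatted = { ...userData };
  if (
    Array.isArray(userData.permissions) &&
    userData.permissions.length > 0 &&
    typeof userData.permissions[0] === "object"
  ) {
    formatted.permissions = (userData.permissions as Array<{ id: number }>).map(
      p => p.id
    );
  }
  return formatted;
}

const payload = formatPermissionsForApi({
  username: "alice",
  permissions: [{ id: 1 }, { id: 3 }],
});
console.log(payload.permissions); // → [1, 3]
```

Extracting this step would let the same conversion be shared between save and edit paths instead of being inlined in the handler.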
176 inventory/src/components/ui/form.tsx Normal file
@@ -0,0 +1,176 @@
import * as React from "react"
import * as LabelPrimitive from "@radix-ui/react-label"
import { Slot } from "@radix-ui/react-slot"
import {
  Controller,
  ControllerProps,
  FieldPath,
  FieldValues,
  FormProvider,
  useFormContext,
} from "react-hook-form"

import { cn } from "@/lib/utils"
import { Label } from "@/components/ui/label"

const Form = FormProvider

type FormFieldContextValue<
  TFieldValues extends FieldValues = FieldValues,
  TName extends FieldPath<TFieldValues> = FieldPath<TFieldValues>
> = {
  name: TName
}

const FormFieldContext = React.createContext<FormFieldContextValue>(
  {} as FormFieldContextValue
)

const FormField = <
  TFieldValues extends FieldValues = FieldValues,
  TName extends FieldPath<TFieldValues> = FieldPath<TFieldValues>
>({
  ...props
}: ControllerProps<TFieldValues, TName>) => {
  return (
    <FormFieldContext.Provider value={{ name: props.name }}>
      <Controller {...props} />
    </FormFieldContext.Provider>
  )
}

const useFormField = () => {
  const fieldContext = React.useContext(FormFieldContext)
  const itemContext = React.useContext(FormItemContext)
  const { getFieldState, formState } = useFormContext()

  // Guard before dereferencing fieldContext.name: outside a <FormField>
  // the context is empty and getFieldState would be called with an
  // undefined name instead of raising this intended error.
  if (!fieldContext) {
    throw new Error("useFormField should be used within <FormField>")
  }

  const fieldState = getFieldState(fieldContext.name, formState)

  const { id } = itemContext

  return {
    id,
    name: fieldContext.name,
    formItemId: `${id}-form-item`,
    formDescriptionId: `${id}-form-item-description`,
    formMessageId: `${id}-form-item-message`,
    ...fieldState,
  }
}

type FormItemContextValue = {
  id: string
}

const FormItemContext = React.createContext<FormItemContextValue>(
  {} as FormItemContextValue
)

const FormItem = React.forwardRef<
  HTMLDivElement,
  React.HTMLAttributes<HTMLDivElement>
>(({ className, ...props }, ref) => {
  const id = React.useId()

  return (
    <FormItemContext.Provider value={{ id }}>
      <div ref={ref} className={cn("space-y-2", className)} {...props} />
    </FormItemContext.Provider>
  )
})
FormItem.displayName = "FormItem"

const FormLabel = React.forwardRef<
  React.ElementRef<typeof LabelPrimitive.Root>,
  React.ComponentPropsWithoutRef<typeof LabelPrimitive.Root>
>(({ className, ...props }, ref) => {
  const { error, formItemId } = useFormField()

  return (
    <Label
      ref={ref}
      className={cn(error && "text-destructive", className)}
      htmlFor={formItemId}
      {...props}
    />
  )
})
FormLabel.displayName = "FormLabel"

const FormControl = React.forwardRef<
  React.ElementRef<typeof Slot>,
  React.ComponentPropsWithoutRef<typeof Slot>
>(({ ...props }, ref) => {
  const { error, formItemId, formDescriptionId, formMessageId } = useFormField()

  return (
    <Slot
      ref={ref}
      id={formItemId}
      aria-describedby={
        !error
          ? `${formDescriptionId}`
          : `${formDescriptionId} ${formMessageId}`
      }
      aria-invalid={!!error}
      {...props}
    />
  )
})
FormControl.displayName = "FormControl"

const FormDescription = React.forwardRef<
  HTMLParagraphElement,
  React.HTMLAttributes<HTMLParagraphElement>
>(({ className, ...props }, ref) => {
  const { formDescriptionId } = useFormField()

  return (
    <p
      ref={ref}
      id={formDescriptionId}
      className={cn("text-sm text-muted-foreground", className)}
      {...props}
    />
  )
})
FormDescription.displayName = "FormDescription"

const FormMessage = React.forwardRef<
  HTMLParagraphElement,
  React.HTMLAttributes<HTMLParagraphElement>
>(({ className, children, ...props }, ref) => {
  const { error, formMessageId } = useFormField()
  const body = error ? String(error?.message) : children

  if (!body) {
    return null
  }

  return (
    <p
      ref={ref}
      id={formMessageId}
      className={cn("text-sm font-medium text-destructive", className)}
      {...props}
    >
      {body}
    </p>
  )
})
FormMessage.displayName = "FormMessage"

export {
  useFormField,
  Form,
  FormItem,
  FormLabel,
  FormControl,
  FormDescription,
  FormMessage,
  FormField,
}
158 inventory/src/contexts/AuthContext.tsx Normal file
@@ -0,0 +1,158 @@
import { createContext, useState, useEffect, ReactNode, useCallback } from 'react';
import config from '@/config';

export interface Permission {
  id: number;
  name: string;
  code: string;
  description?: string;
  category: string;
}

export interface User {
  id: number;
  username: string;
  email?: string;
  is_admin: boolean;
  permissions: string[];
}

interface AuthContextType {
  user: User | null;
  token: string | null;
  isLoading: boolean;
  error: string | null;
  login: (username: string, password: string) => Promise<void>;
  logout: () => void;
  fetchCurrentUser: () => Promise<void>;
}

const defaultContext: AuthContextType = {
  user: null,
  token: null,
  isLoading: false,
  error: null,
  login: async () => {},
  logout: () => {},
  fetchCurrentUser: async () => {},
};

export const AuthContext = createContext<AuthContextType>(defaultContext);

export function AuthProvider({ children }: { children: ReactNode }) {
  const [user, setUser] = useState<User | null>(null);
  const [token, setToken] = useState<string | null>(() => localStorage.getItem('token'));
  const [isLoading, setIsLoading] = useState<boolean>(false);
  const [error, setError] = useState<string | null>(null);

  const fetchCurrentUser = useCallback(async () => {
    if (!token) return;

    try {
      setIsLoading(true);
      setError(null);

      const response = await fetch(`${config.authUrl}/me`, {
        headers: {
          'Authorization': `Bearer ${token}`
        }
      });

      if (!response.ok) {
        const errorData = await response.json().catch(() => ({}));
        throw new Error(errorData.error || 'Failed to fetch user data');
      }

      const userData = await response.json();
      console.log("Fetched current user data:", userData);
      console.log("User permissions:", userData.permissions);

      setUser(userData);
      // Ensure we have the sessionStorage isLoggedIn flag set
      sessionStorage.setItem('isLoggedIn', 'true');
    } catch (err) {
      const errorMessage = err instanceof Error ? err.message : 'An unknown error occurred';
      setError(errorMessage);
      console.error('Auth error:', errorMessage);

      // Clear token if authentication failed
      if (err instanceof Error &&
          (err.message.includes('authentication') ||
           err.message.includes('token') ||
           err.message.includes('401'))) {
        logout();
      }
    } finally {
      setIsLoading(false);
    }
  }, [token]);

  // Load token and fetch user data on init
  useEffect(() => {
    if (token) {
      fetchCurrentUser();
    } else {
      // Clear sessionStorage if no token exists
      sessionStorage.removeItem('isLoggedIn');
    }
  }, [token, fetchCurrentUser]);

  const login = async (username: string, password: string) => {
    try {
      setIsLoading(true);
      setError(null);

      const response = await fetch(`${config.authUrl}/login`, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ username, password })
      });

      if (!response.ok) {
        const errorData = await response.json().catch(() => ({}));
        throw new Error(errorData.error || 'Login failed');
      }

      const data = await response.json();
      console.log("Login successful, received data:", data);
      console.log("User permissions:", data.user?.permissions);

      localStorage.setItem('token', data.token);
      sessionStorage.setItem('isLoggedIn', 'true');
|
||||
setToken(data.token);
|
||||
setUser(data.user);
|
||||
} catch (err) {
|
||||
const errorMessage = err instanceof Error ? err.message : 'Login failed';
|
||||
setError(errorMessage);
|
||||
console.error('Login error:', errorMessage);
|
||||
throw err;
|
||||
} finally {
|
||||
setIsLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
const logout = () => {
|
||||
localStorage.removeItem('token');
|
||||
sessionStorage.removeItem('isLoggedIn');
|
||||
setToken(null);
|
||||
setUser(null);
|
||||
};
|
||||
|
||||
return (
|
||||
<AuthContext.Provider
|
||||
value={{
|
||||
user,
|
||||
token,
|
||||
isLoading,
|
||||
error,
|
||||
login,
|
||||
logout,
|
||||
fetchCurrentUser,
|
||||
}}
|
||||
>
|
||||
{children}
|
||||
</AuthContext.Provider>
|
||||
);
|
||||
}
|
||||
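The `User` interface added above carries permissions as a flat array of codes (e.g. `access:products`, `create:products`). A consumer-side check might be sketched as follows — `hasPermission` is a hypothetical helper for illustration, not part of this commit:

```typescript
// Shape mirrored from the User interface in AuthContext.tsx.
interface User {
  id: number;
  username: string;
  is_admin: boolean;
  permissions: string[];
}

// Hypothetical helper: admins bypass individual checks,
// everyone else needs the exact permission code.
function hasPermission(user: User | null, code: string): boolean {
  if (!user) return false;
  if (user.is_admin) return true;
  return user.permissions.includes(code);
}

const viewer: User = {
  id: 1,
  username: "demo",
  is_admin: false,
  permissions: ["access:products"],
};

console.log(hasPermission(viewer, "access:products")); // true
console.log(hasPermission(viewer, "create:products")); // false
console.log(hasPermission(null, "access:products"));   // false
```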
@@ -1,200 +0,0 @@
```tsx
import { useEffect, useState } from "react"
import { Button } from "@/components/ui/button"
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card"
import { ScrollArea } from "@/components/ui/scroll-area"
import { Code } from "@/components/ui/code"
import { useToast } from "@/hooks/use-toast"
import { Loader2 } from "lucide-react"
import config from "@/config"

interface TaxonomyStats {
  categories: number
  themes: number
  colors: number
  taxCodes: number
  sizeCategories: number
  suppliers: number
  companies: number
  artists: number
}

interface DebugData {
  taxonomyStats: TaxonomyStats | null
  basePrompt: string
  sampleFullPrompt: string
  promptLength: number
  estimatedProcessingTime?: {
    seconds: number | null
    sampleCount: number
  }
}

export function AiValidationDebug() {
  const [isLoading, setIsLoading] = useState(false)
  const [debugData, setDebugData] = useState<DebugData | null>(null)
  const { toast } = useToast()

  const fetchDebugData = async () => {
    setIsLoading(true)
    try {
      // Use a sample product to avoid loading full taxonomy
      const sampleProduct = {
        title: "Sample Product",
        description: "A sample product for testing",
        SKU: "SAMPLE-001",
        price: "9.99",
        cost_each: "5.00",
        qty_per_unit: "1",
        case_qty: "12"
      }

      const response = await fetch(`${config.apiUrl}/ai-validation/debug`, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({ products: [sampleProduct] })
      })
      if (!response.ok) {
        throw new Error('Failed to fetch debug data')
      }
      const data = await response.json()
      setDebugData(data)
    } catch (error) {
      console.error('Error fetching debug data:', error)
      toast({
        variant: "destructive",
        title: "Error",
        description: error instanceof Error ? error.message : "Failed to fetch debug data"
      })
    } finally {
      setIsLoading(false)
    }
  }

  useEffect(() => {
    fetchDebugData()
  }, [])

  return (
    <div className="container mx-auto py-6 space-y-6">
      <div className="flex items-center justify-between">
        <h1 className="text-3xl font-bold tracking-tight">AI Validation Debug</h1>
        <div className="space-x-4">
          <Button
            variant="outline"
            onClick={fetchDebugData}
            disabled={isLoading}
          >
            {isLoading && <Loader2 className="mr-2 h-4 w-4 animate-spin" />}
            Refresh Data
          </Button>
        </div>
      </div>

      {debugData && (
        <div className="grid grid-cols-1 md:grid-cols-2 gap-6">
          <Card>
            <CardHeader>
              <CardTitle>Taxonomy Stats</CardTitle>
            </CardHeader>
            <CardContent>
              {debugData.taxonomyStats ? (
                <div className="space-y-2">
                  <div>Categories: {debugData.taxonomyStats.categories}</div>
                  <div>Themes: {debugData.taxonomyStats.themes}</div>
                  <div>Colors: {debugData.taxonomyStats.colors}</div>
                  <div>Tax Codes: {debugData.taxonomyStats.taxCodes}</div>
                  <div>Size Categories: {debugData.taxonomyStats.sizeCategories}</div>
                  <div>Suppliers: {debugData.taxonomyStats.suppliers}</div>
                  <div>Companies: {debugData.taxonomyStats.companies}</div>
                  <div>Artists: {debugData.taxonomyStats.artists}</div>
                </div>
              ) : (
                <div>No taxonomy data available</div>
              )}
            </CardContent>
          </Card>

          <Card>
            <CardHeader>
              <CardTitle>Prompt Length</CardTitle>
            </CardHeader>
            <CardContent>
              <div className="space-y-4">
                <div className="space-y-2">
                  <div>Characters: {debugData.promptLength}</div>
                  <div>Tokens (est.): ~{Math.round(debugData.promptLength / 4)}</div>
                </div>
                <div className="space-y-2">
                  <label htmlFor="costPerMillion" className="text-sm text-muted-foreground">
                    Cost per million tokens ($)
                  </label>
                  <input
                    id="costPerMillion"
                    type="number"
                    className="w-full px-3 py-2 border rounded-md"
                    defaultValue="2.50"
                    onChange={(e) => {
                      const costPerMillion = parseFloat(e.target.value)
                      if (!isNaN(costPerMillion)) {
                        const tokens = Math.round(debugData.promptLength / 4)
                        const cost = (tokens / 1_000_000) * costPerMillion * 100 // Convert to cents
                        const costElement = document.getElementById('tokenCost')
                        if (costElement) {
                          costElement.textContent = cost.toFixed(1)
                        }
                      }
                    }}
                  />
                  <div className="text-sm">
                    Cost: <span id="tokenCost">{((Math.round(debugData.promptLength / 4) / 1_000_000) * 3 * 100).toFixed(1)}</span>¢
                  </div>
                </div>
                {debugData.estimatedProcessingTime && (
                  <div className="mt-4 p-3 bg-muted rounded-md">
                    <h3 className="text-sm font-medium mb-2">Processing Time Estimate</h3>
                    {debugData.estimatedProcessingTime.seconds ? (
                      <div className="space-y-1">
                        <div className="text-sm">
                          Estimated time: {formatTime(debugData.estimatedProcessingTime.seconds)}
                        </div>
                        <div className="text-xs text-muted-foreground">
                          Based on {debugData.estimatedProcessingTime.sampleCount} similar validation{debugData.estimatedProcessingTime.sampleCount !== 1 ? 's' : ''}
                        </div>
                      </div>
                    ) : (
                      <div className="text-sm text-muted-foreground">No historical data available for this prompt size</div>
                    )}
                  </div>
                )}
              </div>
            </CardContent>
          </Card>

          <Card className="col-span-full">
            <CardHeader>
              <CardTitle>Full Sample Prompt</CardTitle>
            </CardHeader>
            <CardContent>
              <ScrollArea className="h-[500px] w-full rounded-md border p-4">
                <Code className="whitespace-pre-wrap">{debugData.sampleFullPrompt}</Code>
              </ScrollArea>
            </CardContent>
          </Card>
        </div>
      )}
    </div>
  )
}

// Helper function to format time in a human-readable way
function formatTime(seconds: number): string {
  if (seconds < 60) {
    return `${Math.round(seconds)} seconds`;
  } else {
    const minutes = Math.floor(seconds / 60);
    const remainingSeconds = Math.round(seconds % 60);
    return `${minutes}m ${remainingSeconds}s`;
  }
}
```
```diff
@@ -1,13 +1,12 @@
-import { useState } from "react";
+import { useState, useContext } from "react";
 import { useNavigate, useSearchParams } from "react-router-dom";
 import { Button } from "@/components/ui/button";
 import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
 import { Input } from "@/components/ui/input";
 import { toast } from "sonner";
 import config from "../config";
 import { Loader2, Box } from "lucide-react";
-import { motion } from "motion/react";
+import { motion } from "framer-motion";
+import { AuthContext } from "@/contexts/AuthContext";
 
 export function Login() {
   const [username, setUsername] = useState("");
@@ -15,59 +14,22 @@ export function Login() {
   const [isLoading, setIsLoading] = useState(false);
   const navigate = useNavigate();
   const [searchParams] = useSearchParams();
+  const { login } = useContext(AuthContext);
 
   const handleLogin = async (e: React.FormEvent) => {
     e.preventDefault();
     setIsLoading(true);
 
     try {
-      const url = `${config.authUrl}/login`;
-      console.log("Making login request:", {
-        url,
-        method: "POST",
-        headers: {
-          "Content-Type": "application/json",
-        },
-        body: { username, password },
-        config,
-      });
-
-      const response = await fetch(url, {
-        method: "POST",
-        headers: {
-          "Content-Type": "application/json",
-        },
-        body: JSON.stringify({ username, password }),
-        credentials: "include",
-      });
-
-      console.log("Login response status:", response.status);
-
-      if (!response.ok) {
-        const data = await response
-          .json()
-          .catch(() => ({ error: "Failed to parse error response" }));
-        console.error("Login failed:", data);
-        throw new Error(data.error || "Login failed");
-      }
-
-      const data = await response.json();
-      console.log("Login successful:", data);
-
-      sessionStorage.setItem("token", data.token);
-      sessionStorage.setItem("isLoggedIn", "true");
-      toast.success("Successfully logged in");
-
-      // Get the redirect URL from the URL parameters, defaulting to "/"
-      const redirectTo = searchParams.get("redirect") || "/"
+      await login(username, password);
 
-      // Navigate to the redirect URL after successful login
-      navigate(redirectTo)
+      // Login successful, redirect to the requested page or home
+      const redirectTo = searchParams.get("redirect") || "/";
+      navigate(redirectTo);
     } catch (error) {
-      const message = error instanceof Error ? error.message : "Login failed";
-      toast.error(message);
+      console.error("Login error:", error);
+      toast.error(
+        error instanceof Error ? error.message : "An unexpected error occurred"
+      );
     } finally {
       setIsLoading(false);
     }
 
```
@@ -1,322 +0,0 @@
```tsx
import { useState, useCallback } from 'react';
import { useQuery, keepPreviousData } from '@tanstack/react-query';
import { format } from 'date-fns';
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "@/components/ui/table";
import {
  Pagination,
  PaginationContent,
  PaginationItem,
  PaginationLink,
  PaginationNext,
  PaginationPrevious,
} from "@/components/ui/pagination";
import { Input } from "@/components/ui/input";
import { Button } from "@/components/ui/button";
import { Badge } from "@/components/ui/badge";
import {
  Select,
  SelectContent,
  SelectItem,
  SelectTrigger,
  SelectValue,
} from "@/components/ui/select";
import { DateRangePicker } from "@/components/ui/date-range-picker";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { ArrowUpDown, Search } from "lucide-react";
import debounce from 'lodash/debounce';
import config from '../config';
import { DateRange } from 'react-day-picker';
import { motion } from 'motion/react';

interface Order {
  order_number: string;
  customer: string;
  date: string;
  status: string;
  total_amount: number;
  items_count: number;
  payment_method: string;
  shipping_method: string;
}

interface OrderFilters {
  search: string;
  status: string;
  dateRange: DateRange;
  minAmount: string;
  maxAmount: string;
}

export function Orders() {
  const [page, setPage] = useState(1);
  const [sortColumn, setSortColumn] = useState<keyof Order>('date');
  const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
  const [filters, setFilters] = useState<OrderFilters>({
    search: '',
    status: 'all',
    dateRange: { from: undefined, to: undefined },
    minAmount: '',
    maxAmount: '',
  });

  const { data, isLoading, isFetching } = useQuery({
    queryKey: ['orders', page, sortColumn, sortDirection, filters],
    queryFn: async () => {
      const searchParams = new URLSearchParams({
        page: page.toString(),
        limit: '50',
        sortColumn: sortColumn.toString(),
        sortDirection,
        ...filters.dateRange.from && { fromDate: filters.dateRange.from.toISOString() },
        ...filters.dateRange.to && { toDate: filters.dateRange.to.toISOString() },
        ...filters.minAmount && { minAmount: filters.minAmount },
        ...filters.maxAmount && { maxAmount: filters.maxAmount },
        ...filters.status !== 'all' && { status: filters.status },
        ...filters.search && { search: filters.search },
      });

      const response = await fetch(`${config.apiUrl}/orders?${searchParams}`);
      if (!response.ok) throw new Error('Failed to fetch orders');
      return response.json();
    },
    placeholderData: keepPreviousData,
    staleTime: 30000,
  });

  const debouncedFilterChange = useCallback(
    debounce((newFilters: Partial<OrderFilters>) => {
      setFilters(prev => ({ ...prev, ...newFilters }));
      setPage(1);
    }, 300),
    []
  );

  const handleSort = (column: keyof Order) => {
    if (sortColumn === column) {
      setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
    } else {
      setSortColumn(column);
      setSortDirection('asc');
    }
  };

  const getOrderStatusBadge = (status: string) => {
    const variants: Record<string, { variant: "default" | "secondary" | "destructive" | "outline"; label: string }> = {
      pending: { variant: "outline", label: "Pending" },
      processing: { variant: "secondary", label: "Processing" },
      completed: { variant: "default", label: "Completed" },
      cancelled: { variant: "destructive", label: "Cancelled" },
    };

    const statusConfig = variants[status.toLowerCase()] || variants.pending;
    return <Badge variant={statusConfig.variant}>{statusConfig.label}</Badge>;
  };

  const renderSortButton = (column: keyof Order, label: string) => (
    <Button
      variant="ghost"
      onClick={() => handleSort(column)}
      className="w-full justify-start font-medium"
    >
      {label}
      <ArrowUpDown className={`ml-2 h-4 w-4 ${sortColumn === column && sortDirection === 'desc' ? 'rotate-180' : ''}`} />
    </Button>
  );

  return (
    <motion.div layout className="p-8 space-y-8">
      <div className="flex items-center justify-between">
        <h1 className="text-2xl font-bold">Orders</h1>
        <div className="text-sm text-muted-foreground">
          {data?.pagination.total.toLocaleString() ?? '...'} orders
        </div>
      </div>

      <div className="grid gap-4 md:grid-cols-2 lg:grid-cols-4">
        <Card>
          <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
            <CardTitle className="text-sm font-medium">Total Orders</CardTitle>
          </CardHeader>
          <CardContent>
            <div className="text-2xl font-bold">{data?.stats.totalOrders ?? '...'}</div>
            <p className="text-xs text-muted-foreground">
              +{data?.stats.orderGrowth ?? 0}% from last month
            </p>
          </CardContent>
        </Card>
        <Card>
          <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
            <CardTitle className="text-sm font-medium">Total Revenue</CardTitle>
          </CardHeader>
          <CardContent>
            <div className="text-2xl font-bold">
              ${(data?.stats.totalRevenue ?? 0).toLocaleString()}
            </div>
            <p className="text-xs text-muted-foreground">
              +{data?.stats.revenueGrowth ?? 0}% from last month
            </p>
          </CardContent>
        </Card>
        <Card>
          <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
            <CardTitle className="text-sm font-medium">Average Order Value</CardTitle>
          </CardHeader>
          <CardContent>
            <div className="text-2xl font-bold">
              ${(data?.stats.averageOrderValue ?? 0).toLocaleString()}
            </div>
            <p className="text-xs text-muted-foreground">
              +{data?.stats.aovGrowth ?? 0}% from last month
            </p>
          </CardContent>
        </Card>
        <Card>
          <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
            <CardTitle className="text-sm font-medium">Conversion Rate</CardTitle>
          </CardHeader>
          <CardContent>
            <div className="text-2xl font-bold">
              {(data?.stats.conversionRate ?? 0).toFixed(1)}%
            </div>
            <p className="text-xs text-muted-foreground">
              +{data?.stats.conversionGrowth ?? 0}% from last month
            </p>
          </CardContent>
        </Card>
      </div>

      <div className="flex flex-col gap-4 md:flex-row md:items-center">
        <div className="flex items-center gap-2 flex-1">
          <Search className="h-4 w-4 text-muted-foreground" />
          <Input
            placeholder="Search orders..."
            value={filters.search}
            onChange={(e) => debouncedFilterChange({ search: e.target.value })}
            className="h-8 w-[300px]"
          />
        </div>
        <div className="flex flex-wrap items-center gap-2">
          <Select
            value={filters.status}
            onValueChange={(value) => debouncedFilterChange({ status: value })}
          >
            <SelectTrigger className="h-8 w-[180px]">
              <SelectValue placeholder="Status" />
            </SelectTrigger>
            <SelectContent>
              <SelectItem value="all">All Status</SelectItem>
              <SelectItem value="pending">Pending</SelectItem>
              <SelectItem value="processing">Processing</SelectItem>
              <SelectItem value="completed">Completed</SelectItem>
              <SelectItem value="cancelled">Cancelled</SelectItem>
            </SelectContent>
          </Select>
          <DateRangePicker
            value={filters.dateRange}
            onChange={(range: DateRange | undefined) => debouncedFilterChange({ dateRange: range })}
          />
          <div className="flex items-center gap-2">
            <Input
              type="number"
              placeholder="Min $"
              value={filters.minAmount}
              onChange={(e) => debouncedFilterChange({ minAmount: e.target.value })}
              className="h-8 w-[100px]"
            />
            <span>-</span>
            <Input
              type="number"
              placeholder="Max $"
              value={filters.maxAmount}
              onChange={(e) => debouncedFilterChange({ maxAmount: e.target.value })}
              className="h-8 w-[100px]"
            />
          </div>
        </div>
      </div>

      <div className="rounded-md border">
        <Table>
          <TableHeader>
            <TableRow>
              <TableHead>{renderSortButton('order_number', 'Order')}</TableHead>
              <TableHead>{renderSortButton('customer', 'Customer')}</TableHead>
              <TableHead>{renderSortButton('date', 'Date')}</TableHead>
              <TableHead>{renderSortButton('status', 'Status')}</TableHead>
              <TableHead className="text-right">{renderSortButton('total_amount', 'Total')}</TableHead>
              <TableHead className="text-center">{renderSortButton('items_count', 'Items')}</TableHead>
              <TableHead>{renderSortButton('payment_method', 'Payment')}</TableHead>
              <TableHead>{renderSortButton('shipping_method', 'Shipping')}</TableHead>
            </TableRow>
          </TableHeader>
          <TableBody>
            {isLoading ? (
              <TableRow>
                <TableCell colSpan={8} className="text-center py-8">
                  Loading orders...
                </TableCell>
              </TableRow>
            ) : data?.orders.map((order: Order) => (
              <TableRow key={order.order_number}>
                <TableCell className="font-medium">#{order.order_number}</TableCell>
                <TableCell>{order.customer}</TableCell>
                <TableCell>{format(new Date(order.date), 'MMM d, yyyy')}</TableCell>
                <TableCell>{getOrderStatusBadge(order.status)}</TableCell>
                <TableCell className="text-right">${order.total_amount.toFixed(2)}</TableCell>
                <TableCell className="text-center">{order.items_count}</TableCell>
                <TableCell>{order.payment_method}</TableCell>
                <TableCell>{order.shipping_method}</TableCell>
              </TableRow>
            ))}
            {!isLoading && !data?.orders.length && (
              <TableRow>
                <TableCell colSpan={8} className="text-center py-8 text-muted-foreground">
                  No orders found
                </TableCell>
              </TableRow>
            )}
          </TableBody>
        </Table>
      </div>

      {data?.pagination.pages > 1 && (
        <Pagination>
          <PaginationContent>
            <PaginationItem>
              <PaginationPrevious
                aria-disabled={page === 1 || isFetching}
                className={page === 1 || isFetching ? 'pointer-events-none opacity-50' : ''}
                onClick={() => setPage(p => Math.max(1, p - 1))}
              />
            </PaginationItem>
            {Array.from({ length: data.pagination.pages }, (_, i) => i + 1).map((p) => (
              <PaginationItem key={p}>
                <PaginationLink
                  isActive={p === page}
                  aria-disabled={isFetching}
                  className={isFetching ? 'pointer-events-none opacity-50' : ''}
                  onClick={() => setPage(p)}
                >
                  {p}
                </PaginationLink>
              </PaginationItem>
            ))}
            <PaginationItem>
              <PaginationNext
                aria-disabled={page === data.pagination.pages || isFetching}
                className={page === data.pagination.pages || isFetching ? 'pointer-events-none opacity-50' : ''}
                onClick={() => setPage(p => Math.min(data.pagination.pages, p + 1))}
              />
            </PaginationItem>
          </PaginationContent>
        </Pagination>
      )}
    </motion.div>
  );
}
```
```diff
@@ -4,49 +4,267 @@ import { StockManagement } from "@/components/settings/StockManagement";
 import { PerformanceMetrics } from "@/components/settings/PerformanceMetrics";
 import { CalculationSettings } from "@/components/settings/CalculationSettings";
 import { TemplateManagement } from "@/components/settings/TemplateManagement";
-import { motion } from 'motion/react';
+import { UserManagement } from "@/components/settings/UserManagement";
+import { PromptManagement } from "@/components/settings/PromptManagement";
+import { ReusableImageManagement } from "@/components/settings/ReusableImageManagement";
+import { motion } from 'framer-motion';
+import { Alert, AlertDescription } from "@/components/ui/alert";
+import { Protected } from "@/components/auth/Protected";
+import { useContext, useMemo } from "react";
+import { AuthContext } from "@/contexts/AuthContext";
+import { Separator } from "@/components/ui/separator";
+
+// Define types for settings structure
+interface SettingsTab {
+  id: string;
+  permission: string;
+  label: string;
+}
+
+interface SettingsGroup {
+  id: string;
+  label: string;
+  tabs: SettingsTab[];
+}
+
+// Define available settings tabs with their permission requirements and groups
+const SETTINGS_GROUPS: SettingsGroup[] = [
+  {
+    id: "inventory",
+    label: "Inventory Settings",
+    tabs: [
+      { id: "stock-management", permission: "settings:stock_management", label: "Stock Management" },
+      { id: "performance-metrics", permission: "settings:performance_metrics", label: "Performance Metrics" },
+      { id: "calculation-settings", permission: "settings:calculation_settings", label: "Calculation Settings" },
+    ]
+  },
+  {
+    id: "content",
+    label: "Content Management",
+    tabs: [
+      { id: "templates", permission: "settings:templates", label: "Template Management" },
+      { id: "ai-prompts", permission: "settings:prompt_management", label: "AI Prompts" },
+      { id: "reusable-images", permission: "settings:library_management", label: "Reusable Images" },
+    ]
+  },
+  {
+    id: "system",
+    label: "System",
+    tabs: [
+      { id: "user-management", permission: "settings:user_management", label: "User Management" },
+      { id: "data-management", permission: "settings:data_management", label: "Data Management" },
+    ]
+  }
+];
+
+// Flatten tabs for easier access
+const SETTINGS_TABS = SETTINGS_GROUPS.flatMap(group => group.tabs);
 
 export function Settings() {
+  const { user } = useContext(AuthContext);
+
+  // Determine the first tab the user has access to
+  const defaultTab = useMemo(() => {
+    // Admin users have access to all tabs
+    if (user?.is_admin) {
+      return SETTINGS_TABS[0].id;
+    }
+
+    // Find the first tab the user has permission to access
+    const firstAccessibleTab = SETTINGS_TABS.find(tab =>
+      user?.permissions?.includes(tab.permission)
+    );
+
+    // Return the ID of the first accessible tab, or first tab as fallback
+    return firstAccessibleTab?.id || SETTINGS_TABS[0].id;
+  }, [user]);
+
+  // Check if user has access to any tab
+  const hasAccessToAnyTab = useMemo(() => {
+    if (user?.is_admin) return true;
+    return SETTINGS_TABS.some(tab => user?.permissions?.includes(tab.permission));
+  }, [user]);
+
+  // If user doesn't have access to any tabs, show a helpful message
+  if (!hasAccessToAnyTab) {
+    return (
+      <motion.div layout className="container mx-auto py-6">
+        <div className="mb-6">
+          <h1 className="text-3xl font-bold">Settings</h1>
+        </div>
+        <Alert>
+          <AlertDescription>
+            You don't have permission to access any settings. Please contact an administrator for assistance.
+          </AlertDescription>
+        </Alert>
+      </motion.div>
+    );
+  }
+
+  // Function to check if the user has access to any tab in a group
+  const hasAccessToGroup = (group: SettingsGroup): boolean => {
+    if (user?.is_admin) return true;
+    return group.tabs.some(tab => user?.permissions?.includes(tab.permission));
+  };
+
   return (
     <motion.div layout className="container mx-auto py-6">
       <div className="mb-6">
         <h1 className="text-3xl font-bold">Settings</h1>
       </div>
 
-      <Tabs defaultValue="data-management" className="space-y-4">
-        <TabsList>
-          <TabsTrigger value="data-management">Data Management</TabsTrigger>
-          <TabsTrigger value="stock-management">Stock Management</TabsTrigger>
-          <TabsTrigger value="performance-metrics">
-            Performance Metrics
-          </TabsTrigger>
-          <TabsTrigger value="calculation-settings">
-            Calculation Settings
-          </TabsTrigger>
-          <TabsTrigger value="templates">
-            Template Management
-          </TabsTrigger>
-        </TabsList>
+      <Tabs defaultValue={defaultTab} orientation="vertical" className="flex flex-row min-h-[500px]">
+        <div className="w-60 border-r pr-8">
+          <TabsList className="flex flex-col h-auto justify-start items-stretch p-0 bg-transparent">
+            {SETTINGS_GROUPS.map((group) => (
+              hasAccessToGroup(group) && (
+                <div key={group.id} className="">
+                  <h3 className="font-semibold text-sm px-3 py-2 bg-muted border text-foreground rounded-md mb-2">
+                    {group.label}
+                  </h3>
+                  <div className="space-y-1 pl-1">
+                    {group.tabs.map((tab) => (
+                      <Protected key={tab.id} permission={tab.permission}>
+                        <TabsTrigger
+                          value={tab.id}
+                          className="w-full justify-start px-3 py-2 text-sm font-normal text-muted-foreground data-[state=active]:font-medium data-[state=active]:text-accent-foreground data-[state=active]:shadow-none rounded-md data-[state=active]:underline"
+                        >
+                          {tab.label}
+                        </TabsTrigger>
+                      </Protected>
+                    ))}
+                  </div>
+                  {/* Only add separator if not the last group */}
+                  {group.id !== SETTINGS_GROUPS[SETTINGS_GROUPS.length - 1].id && (
+                    <Separator className="mt-4 mb-4 opacity-70" />
+                  )}
+                </div>
+              )
+            ))}
+          </TabsList>
+        </div>
 
-        <TabsContent value="data-management">
-          <DataManagement />
-        </TabsContent>
+        <div className="pl-8 w-full">
+          <TabsContent value="data-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
+            <Protected
+              permission="settings:data_management"
+              fallback={
+                <Alert>
+                  <AlertDescription>
+                    You don't have permission to access Data Management.
+                  </AlertDescription>
+                </Alert>
+              }
+            >
+              <DataManagement />
+            </Protected>
+          </TabsContent>
 
-        <TabsContent value="stock-management">
-          <StockManagement />
-        </TabsContent>
+          <TabsContent value="stock-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
+            <Protected
+              permission="settings:stock_management"
+              fallback={
+                <Alert>
+                  <AlertDescription>
+                    You don't have permission to access Stock Management.
+                  </AlertDescription>
+                </Alert>
+              }
+            >
+              <StockManagement />
+            </Protected>
+          </TabsContent>
 
-        <TabsContent value="performance-metrics">
-          <PerformanceMetrics />
-        </TabsContent>
+          <TabsContent value="performance-metrics" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
+            <Protected
+              permission="settings:performance_metrics"
+              fallback={
+                <Alert>
+                  <AlertDescription>
+                    You don't have permission to access Performance Metrics.
+                  </AlertDescription>
+                </Alert>
+              }
+            >
+              <PerformanceMetrics />
+            </Protected>
+          </TabsContent>
 
-        <TabsContent value="calculation-settings">
-          <CalculationSettings />
-        </TabsContent>
+          <TabsContent value="calculation-settings" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
+            <Protected
+              permission="settings:calculation_settings"
+              fallback={
+                <Alert>
+                  <AlertDescription>
+                    You don't have permission to access Calculation Settings.
+                  </AlertDescription>
+                </Alert>
+              }
+            >
+              <CalculationSettings />
+            </Protected>
+          </TabsContent>
 
-        <TabsContent value="templates">
-          <TemplateManagement />
-        </TabsContent>
+          <TabsContent value="templates" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
+            <Protected
+              permission="settings:templates"
+              fallback={
+                <Alert>
+                  <AlertDescription>
+                    You don't have permission to access Template Management.
+                  </AlertDescription>
+                </Alert>
+              }
+            >
+              <TemplateManagement />
+            </Protected>
+          </TabsContent>
 
+          <TabsContent value="ai-prompts" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
+            <Protected
+              permission="settings:prompt_management"
+              fallback={
+                <Alert>
+                  <AlertDescription>
+                    You don't have permission to access AI Prompts.
+                  </AlertDescription>
+                </Alert>
+              }
+            >
+              <PromptManagement />
```
|
||||
</Protected>
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="reusable-images" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
||||
<Protected
|
||||
permission="settings:library_management"
|
||||
fallback={
|
||||
<Alert>
|
||||
<AlertDescription>
|
||||
You don't have permission to access Reusable Images.
|
||||
</AlertDescription>
|
||||
</Alert>
|
||||
}
|
||||
>
|
||||
<ReusableImageManagement />
|
||||
</Protected>
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="user-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
|
||||
<Protected
|
||||
permission="settings:user_management"
|
||||
fallback={
|
||||
<Alert>
|
||||
<AlertDescription>
|
||||
You don't have permission to access User Management.
|
||||
</AlertDescription>
|
||||
</Alert>
|
||||
}
|
||||
>
|
||||
<UserManagement />
|
||||
</Protected>
|
||||
</TabsContent>
|
||||
</div>
|
||||
</Tabs>
|
||||
</motion.div>
|
||||
);
|
||||
|
||||
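Every tab body in this diff follows the same pattern: a `Protected` wrapper given a `permission` code and an `Alert` fallback. A minimal sketch of the check such a guard could delegate to, covering the `permission`, `anyPermissions`, and `allPermissions` options the permission docs describe (the helper name and prop shape here are illustrative assumptions, not the app's actual API):

```typescript
// Illustrative prop shape mirroring the PermissionGuard/Protected options.
interface GuardProps {
  permission?: string;       // single code, e.g. "settings:user_management"
  anyPermissions?: string[]; // ANY match grants access
  allPermissions?: string[]; // ALL are required
}

// Hypothetical helper: decides whether to render children or the fallback.
function checkAccess(granted: string[], props: GuardProps): boolean {
  const have = new Set(granted);
  if (props.permission && !have.has(props.permission)) return false;
  if (props.anyPermissions && !props.anyPermissions.some((p) => have.has(p))) return false;
  if (props.allPermissions && !props.allPermissions.every((p) => have.has(p))) return false;
  return true;
}

console.log(checkAccess(["settings:data_management"], { permission: "settings:data_management" })); // true
console.log(checkAccess(["access:products"], { anyPermissions: ["access:products", "edit:users"] })); // true
console.log(checkAccess(["access:products"], { allPermissions: ["access:products", "edit:users"] })); // false
```

With a helper like this, the guard component reduces to `checkAccess(userPermissions, props) ? children : fallback`.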