Compare commits

6 Commits: add-compan... → 108181c63d

| Author | SHA1 | Date |
|---|---|---|
| | 108181c63d | |
| | 5dd779cb4a | |
| | 7b0e792d03 | |
| | 517bbe72f4 | |
| | 87d4b9e804 | |
| | 75da2c6772 | |

docs/import-from-prod-data-mapping.md (new file, 342 lines)
@@ -0,0 +1,342 @@
# MySQL to PostgreSQL Import Process Documentation

This document outlines the data import process from the production MySQL database to the local PostgreSQL database, focusing on column mappings, data transformations, and the overall import architecture.

## Table of Contents

1. [Overview](#overview)
2. [Import Architecture](#import-architecture)
3. [Column Mappings](#column-mappings)
   - [Categories](#categories)
   - [Products](#products)
   - [Product Categories (Relationship)](#product-categories-relationship)
   - [Orders](#orders)
   - [Purchase Orders](#purchase-orders)
   - [Metadata Tables](#metadata-tables)
4. [Special Calculations](#special-calculations)
5. [Implementation Notes](#implementation-notes)

## Overview

The import process extracts data from a MySQL 5.7 production database and imports it into a PostgreSQL database. It can operate in two modes:

- **Full Import**: Imports all data regardless of last sync time
- **Incremental Import**: Only imports data that has changed since the last import

The process handles four main data types:

- Categories (product categorization hierarchy)
- Products (inventory items)
- Orders (sales records)
- Purchase Orders (vendor orders)

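In practice the mode switch boils down to whether a cutoff clause is appended to each extraction query. The helper below is an illustrative sketch, not the actual code; the `stamp` column name follows the products mapping later in this document, and the helper name is an assumption:

```javascript
// Sketch only: derive a WHERE-clause fragment for incremental runs.
// Full imports (or a first run with no recorded sync) get no filter.
function buildDateFilter(lastSyncTimestamp, incremental) {
  if (!incremental || !lastSyncTimestamp) {
    return { clause: "", params: [] };
  }
  // Incremental runs only pull rows stamped after the last successful sync.
  return { clause: " AND stamp > ?", params: [lastSyncTimestamp] };
}
```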
## Import Architecture

The import process follows these steps:

1. **Establish Connection**: Creates an SSH tunnel to the production server and establishes database connections
2. **Setup Import History**: Creates a record of the current import operation
3. **Import Categories**: Processes product categories in hierarchical order
4. **Import Products**: Processes products with their attributes and category relationships
5. **Import Orders**: Processes customer orders with line items, taxes, and discounts
6. **Import Purchase Orders**: Processes vendor purchase orders with line items
7. **Record Results**: Updates the import history with results
8. **Close Connections**: Cleans up connections and resources

Each import step uses temporary tables for processing and wraps operations in transactions to ensure data consistency.

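The import steps (3-6) can be sketched as a sequential driver. This outline is illustrative only: the `(prodConnection, localConnection, incremental)` signature mirrors the real `importPurchaseOrders` call shown in the compare view, but the driver itself and the step list it consumes are assumptions:

```javascript
// Illustrative driver: run each import step in order, collecting per-step
// results. Steps are passed in as [name, fn] pairs so the sketch stays
// self-contained; each fn is assumed to manage its own transaction.
async function runImport(prodConnection, localConnection, incremental, steps) {
  const results = {};
  let completedSteps = 0;
  for (const [name, step] of steps) {
    results[name] = await step(prodConnection, localConnection, incremental);
    completedSteps++;
  }
  return { results, completedSteps };
}
```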
## Column Mappings

### Categories

| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|---------------------------------|----------------------------------------------|
| cat_id | product_categories.cat_id | Direct mapping |
| name | product_categories.name | Direct mapping |
| type | product_categories.type | Direct mapping |
| parent_id | product_categories.master_cat_id | NULL for top-level categories (types 10, 20) |
| description | product_categories.combined_name | Direct mapping |
| status | N/A | Hard-coded 'active' |
| created_at | N/A | Current timestamp |
| updated_at | N/A | Current timestamp |

**Notes:**

- Categories are processed in hierarchical order by type: [10, 20, 11, 21, 12, 13]
- Types 10/20 are top-level categories with no parent
- Types 11/21/12/13 are child categories that reference parent categories

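The reason the type order matters is that a child type is only processed after every type that can hold its parent, so `parent_id` lookups always resolve. A minimal sketch of that invariant (not production code):

```javascript
// The processing order documented above: parents before children.
const TYPE_ORDER = [10, 20, 11, 21, 12, 13];
const TOP_LEVEL_TYPES = [10, 20];

// Top-level categories (10, 20) have no parent; every other imported type
// references a parent that was inserted in an earlier pass.
function parentMustExistBefore(categoryType) {
  return !TOP_LEVEL_TYPES.includes(categoryType);
}
```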
### Products

| PostgreSQL Column | MySQL Source | Transformation |
|----------------------|----------------------------------|---------------------------------------------------------------|
| pid | products.pid | Direct mapping |
| title | products.description | Direct mapping |
| description | products.notes | Direct mapping |
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
| stock_quantity | shop_inventory.available_local | Capped at 5000, minimum 0 |
| preorder_count | current_inventory.onpreorder | Default 0 |
| notions_inv_count | product_notions_b2b.inventory | Default 0 |
| price | product_current_prices.price_each | Default 0, filtered on active=1 |
| regular_price | products.sellingprice | Default 0 |
| cost_price | product_inventory | Weighted average: SUM(costeach * count) / SUM(count) when count > 0, or latest costeach |
| vendor | suppliers.companyname | Via supplier_item_data.supplier_id |
| vendor_reference | supplier_item_data | supplier_itemnumber or notions_itemnumber based on vendor |
| notions_reference | supplier_item_data.notions_itemnumber | Direct mapping |
| brand | product_categories.name | Linked via products.company |
| line | product_categories.name | Linked via products.line |
| subline | product_categories.name | Linked via products.subline |
| artist | product_categories.name | Linked via products.artist |
| categories | product_category_index | Comma-separated list of category IDs |
| created_at | products.date_created | Validated date, NULL if invalid |
| first_received | products.datein | Validated date, NULL if invalid |
| landing_cost_price | NULL | Not set |
| barcode | products.upc | Direct mapping |
| harmonized_tariff_code | products.harmonized_tariff_code | Direct mapping |
| updated_at | products.stamp | Validated date, NULL if invalid |
| visible | shop_inventory | Calculated from show + buyable > 0 |
| managing_stock | N/A | Hard-coded true |
| replenishable | Multiple fields | Complex calculation based on reorder, dates, etc. |
| permalink | N/A | Constructed URL with product ID |
| moq | supplier_item_data | notions_qty_per_unit or supplier_qty_per_unit, minimum 1 |
| uom | N/A | Hard-coded 1 |
| rating | products.rating | Direct mapping |
| reviews | products.rating_votes | Direct mapping |
| weight | products.weight | Direct mapping |
| length | products.length | Direct mapping |
| width | products.width | Direct mapping |
| height | products.height | Direct mapping |
| country_of_origin | products.country_of_origin | Direct mapping |
| location | products.location | Direct mapping |
| total_sold | order_items | SUM(qty_ordered) for all order_items where prod_pid = pid |
| baskets | mybasket | COUNT of records where mb.item = pid and qty > 0 |
| notifies | product_notify | COUNT of records where pn.pid = pid |
| date_last_sold | product_last_sold.date_sold | Validated date, NULL if invalid |
| image | N/A | Constructed from pid and image URL pattern |
| image_175 | N/A | Constructed from pid and image URL pattern |
| image_full | N/A | Constructed from pid and image URL pattern |
| options | NULL | Not set |
| tags | NULL | Not set |

**Notes:**

- Replenishable calculation:

```sql
CASE
  WHEN p.reorder < 0 THEN 0
  WHEN (
    (COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
    AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
    AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
  ) THEN 0
  ELSE 1
END
```

  In business terms, a product is considered NOT replenishable only if:

  - It was manually flagged as not replenishable (negative reorder value)
  - OR it shows no activity across ALL metrics (no sales AND no receipts AND no refills in the past 5 years)

- Image URLs are constructed using this pattern:

```javascript
function buildImageUrls(pid, iid, imageUrlBase) {
  // Pad the product ID to six digits; the first three digits become the
  // directory prefix, while the path itself uses the raw pid
  const paddedPid = pid.toString().padStart(6, '0');
  const prefix = paddedPid.slice(0, 3);
  const basePath = `${imageUrlBase}${prefix}/${pid}`;
  return {
    image: `${basePath}-t-${iid}.jpg`,
    image_175: `${basePath}-175x175-${iid}.jpg`,
    image_full: `${basePath}-o-${iid}.jpg`
  };
}
```

### Product Categories (Relationship)

| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| pid | products.pid | Direct mapping |
| cat_id | product_category_index.cat_id | Direct mapping, filtered by category types |

**Notes:**

- Only categories of types 10, 20, 11, 21, 12, 13 are imported
- Categories 16 and 17 are explicitly excluded

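The type filter above amounts to a simple allow-list; a minimal sketch (the `links` shape is an illustrative assumption):

```javascript
// Keep only relationship rows whose category type is one of the six
// imported types; 16, 17, and anything else are dropped.
const IMPORTED_TYPES = new Set([10, 20, 11, 21, 12, 13]);

function filterCategoryLinks(links) {
  return links.filter(link => IMPORTED_TYPES.has(link.type));
}
```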
### Orders

| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| order_number | order_items.order_id | Direct mapping |
| pid | order_items.prod_pid | Direct mapping |
| sku | order_items.prod_itemnumber | Fallback to 'NO-SKU' if empty |
| date | _order.date_placed_onlydate | Via join to _order table |
| price | order_items.prod_price | Direct mapping |
| quantity | order_items.qty_ordered | Direct mapping |
| discount | Multiple sources | Complex calculation (see notes) |
| tax | order_tax_info_products.item_taxes_to_collect | Via latest order_tax_info record |
| tax_included | N/A | Hard-coded false |
| shipping | N/A | Hard-coded 0 |
| customer | _order.order_cid | Direct mapping |
| customer_name | users | CONCAT(users.firstname, ' ', users.lastname) |
| status | _order.order_status | Direct mapping |
| canceled | _order.date_cancelled | Boolean: true if date_cancelled is not '0000-00-00 00:00:00' |
| costeach | order_costs | From latest record or fallback to price * 0.5 |

**Notes:**

- Only orders with order_status >= 15 and with a valid date_placed are processed
- For incremental imports, only orders modified since last sync are processed
- Discount calculation combines three sources:
  1. Base discount: order_items.prod_price_reg - order_items.prod_price
  2. Promo discount: SUM of order_discount_items.amount
  3. Proportional order discount: Calculation based on order subtotal proportion

```sql
(oi.base_discount +
 COALESCE(ot.promo_discount, 0) +
 CASE
   WHEN om.summary_discount > 0 AND om.summary_subtotal > 0 THEN
     ROUND((om.summary_discount * (oi.price * oi.quantity)) / NULLIF(om.summary_subtotal, 0), 2)
   ELSE 0
 END)::DECIMAL(10,2)
```

- Taxes are taken from the latest tax record for an order
- Cost data is taken from the latest non-pending cost record

### Purchase Orders

| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| po_id | po.po_id | Default 0 if NULL |
| pid | po_products.pid | Direct mapping |
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
| name | products.description | Fallback to 'Unknown Product' |
| cost_price | po_products.cost_each | Direct mapping |
| po_cost_price | po_products.cost_each | Duplicate of cost_price |
| vendor | suppliers.companyname | Fallback to 'Unknown Vendor' if empty |
| date | po.date_ordered | Fallback to po.date_created if NULL |
| expected_date | po.date_estin | Direct mapping |
| status | po.status | Default 1 if NULL |
| notes | po.short_note | Fallback to po.notes if NULL |
| ordered | po_products.qty_each | Direct mapping |
| received | N/A | Hard-coded 0 |
| receiving_status | N/A | Hard-coded 1 |

**Notes:**

- Only POs created within the last 1 year (incremental) or 5 years (full) are processed
- For incremental imports, only POs modified since last sync are processed

### Metadata Tables

#### import_history

| PostgreSQL Column | Source | Notes |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| id | Auto-increment | Primary key |
| table_name | Code | 'all_tables' for overall import |
| start_time | NOW() | Import start time |
| end_time | NOW() | Import completion time |
| duration_seconds | Calculation | Elapsed seconds |
| is_incremental | INCREMENTAL_UPDATE | Flag from config |
| records_added | Calculation | Sum from all imports |
| records_updated | Calculation | Sum from all imports |
| status | Code | 'running', 'completed', 'failed', or 'cancelled' |
| error_message | Exception | Error message if failed |
| additional_info | JSON | Configuration and results |

#### sync_status

| PostgreSQL Column | Source | Notes |
|----------------------|--------------------------------|---------------------------------------------------------------|
| table_name | Code | Name of imported table |
| last_sync_timestamp | NOW() | Timestamp of successful sync |
| last_sync_id | NULL | Not used currently |

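Each `sync_status` row is maintained with an upsert keyed on `table_name`. A minimal sketch of the statement, assuming a node-postgres-style parameterized query (not the verbatim production code):

```javascript
// Build the sync_status upsert for one imported table. On conflict the
// existing row's timestamp is advanced rather than inserting a duplicate.
function syncStatusUpsert(tableName) {
  return {
    text: `
      INSERT INTO sync_status (table_name, last_sync_timestamp)
      VALUES ($1, NOW())
      ON CONFLICT (table_name) DO UPDATE SET
        last_sync_timestamp = EXCLUDED.last_sync_timestamp`,
    values: [tableName],
  };
}
```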
## Special Calculations

### Date Validation

MySQL dates are validated before insertion into PostgreSQL:

```javascript
function validateDate(mysqlDate) {
  if (!mysqlDate || mysqlDate === '0000-00-00' || mysqlDate === '0000-00-00 00:00:00') {
    return null;
  }
  // Check if the date is valid
  const date = new Date(mysqlDate);
  return isNaN(date.getTime()) ? null : mysqlDate;
}
```

### Retry Mechanism

Operations that might fail temporarily are retried with exponential backoff:

```javascript
// MAX_RETRIES and RETRY_DELAY (ms) are module-level configuration constants
async function withRetry(operation, errorMessage) {
  let lastError;
  for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
    try {
      return await operation();
    } catch (error) {
      lastError = error;
      console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
      if (attempt < MAX_RETRIES) {
        // Exponential backoff: RETRY_DELAY, then 2x, 4x, ...
        const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
        await new Promise(resolve => setTimeout(resolve, backoffTime));
      }
    }
  }
  throw lastError;
}
```

### Progress Tracking

Progress is tracked with estimated time remaining:

```javascript
// formatElapsedTime is an external formatting helper
function estimateRemaining(startTime, current, total) {
  if (current === 0) return "Calculating...";
  const elapsedSeconds = (Date.now() - startTime) / 1000;
  const itemsPerSecond = current / elapsedSeconds;
  const remainingItems = total - current;
  const remainingSeconds = remainingItems / itemsPerSecond;
  return formatElapsedTime(remainingSeconds);
}
```

## Implementation Notes

### Transaction Management

All imports use transactions to ensure data consistency:

- **Categories**: Uses savepoints for each category type
- **Products**: Uses a single transaction for the entire import
- **Orders**: Uses a single transaction with temporary tables
- **Purchase Orders**: Uses a single transaction with temporary tables

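The savepoint approach for categories means one failing category type can be rolled back without discarding the types already inserted in the same transaction. A sketch of that pattern, assuming a node-postgres-like `query` method (the helper itself is illustrative, not the production code):

```javascript
// Run `work` inside a savepoint: release it on success, roll back to it on
// failure, and rethrow so the caller can decide whether to continue.
async function withSavepoint(connection, name, work) {
  await connection.query(`SAVEPOINT ${name}`);
  try {
    const result = await work();
    await connection.query(`RELEASE SAVEPOINT ${name}`);
    return result;
  } catch (error) {
    await connection.query(`ROLLBACK TO SAVEPOINT ${name}`);
    throw error;
  }
}
```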
### Memory Usage Optimization

To minimize memory usage when processing large datasets:

1. Data is processed in batches (100-5000 records per batch)
2. Temporary tables are used for intermediate data
3. Some queries use cursors to avoid loading all results at once

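Point 1 above amounts to slicing each result set into fixed-size chunks before inserting. A minimal sketch of that batching helper:

```javascript
// Slice a large record array into fixed-size batches so each insert stays
// within the 100-5000 record range; the last batch holds the remainder.
function toBatches(records, batchSize) {
  const batches = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}
```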
### MySQL vs PostgreSQL Compatibility

The scripts handle differences between MySQL and PostgreSQL:

1. MySQL-specific syntax like `USE INDEX` is removed for PostgreSQL
2. `GROUP_CONCAT` in MySQL becomes string operations in PostgreSQL
3. Transaction syntax differences are abstracted in the connection wrapper
4. PostgreSQL's `ON CONFLICT` replaces MySQL's `ON DUPLICATE KEY UPDATE`

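As an illustration of point 2, PostgreSQL's closest equivalent of `GROUP_CONCAT(col)` is `STRING_AGG(col, ',')`. The naive regex rewrite below only shows the kind of translation involved; the real scripts may handle this differently (and non-text columns would still need a cast in PostgreSQL):

```javascript
// Sketch: rewrite MySQL GROUP_CONCAT(expr) calls as
// PostgreSQL STRING_AGG(expr, ',') in a query string.
function translateGroupConcat(sql) {
  return sql.replace(/GROUP_CONCAT\(\s*([^)]+?)\s*\)/gi, "STRING_AGG($1, ',')");
}
```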
### SSH Tunnel

Database connections go through an SSH tunnel for security:

```javascript
ssh.forwardOut(
  "127.0.0.1",
  0,
  sshConfig.prodDbConfig.host,
  sshConfig.prodDbConfig.port,
  async (err, stream) => {
    // Return early on error so resolve() is not also called
    if (err) return reject(err);
    resolve({ ssh, stream });
  }
);
```

```diff
@@ -4,7 +4,12 @@ SET session_replication_role = 'replica'; -- Disable foreign key checks tempora
 -- Create function for updating timestamps
 CREATE OR REPLACE FUNCTION update_updated_column() RETURNS TRIGGER AS $func$
 BEGIN
+    -- Check which table is being updated and use the appropriate column
+    IF TG_TABLE_NAME = 'categories' THEN
+        NEW.updated_at = CURRENT_TIMESTAMP;
+    ELSE
         NEW.updated = CURRENT_TIMESTAMP;
+    END IF;
     RETURN NEW;
 END;
 $func$ language plpgsql;
@@ -160,7 +165,7 @@ CREATE TABLE purchase_orders (
     expected_date DATE,
     pid BIGINT NOT NULL,
     sku VARCHAR(50) NOT NULL,
-    name VARCHAR(100) NOT NULL,
+    name VARCHAR(255) NOT NULL,
     cost_price DECIMAL(10, 3) NOT NULL,
     po_cost_price DECIMAL(10, 3) NOT NULL,
     status SMALLINT DEFAULT 1,
@@ -171,7 +176,7 @@ CREATE TABLE purchase_orders (
     received INTEGER DEFAULT 0,
     received_date DATE,
     last_received_date DATE,
-    received_by VARCHAR(100),
+    received_by VARCHAR,
     receiving_history JSONB,
     updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
     FOREIGN KEY (pid) REFERENCES products(pid),
```

```diff
@@ -49,6 +49,30 @@ CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_system_prompt
 ON ai_prompts (prompt_type)
 WHERE prompt_type = 'system';

+-- Reusable Images table for storing persistent images
+CREATE TABLE IF NOT EXISTS reusable_images (
+    id SERIAL PRIMARY KEY,
+    name TEXT NOT NULL,
+    filename TEXT NOT NULL,
+    file_path TEXT NOT NULL,
+    image_url TEXT NOT NULL,
+    is_global BOOLEAN NOT NULL DEFAULT false,
+    company TEXT,
+    mime_type TEXT,
+    file_size INTEGER,
+    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
+    updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
+    CONSTRAINT company_required_for_non_global CHECK (
+        (is_global = true AND company IS NULL) OR
+        (is_global = false AND company IS NOT NULL)
+    )
+);
+
+-- Create index on company for efficient querying
+CREATE INDEX IF NOT EXISTS idx_reusable_images_company ON reusable_images(company);
+-- Create index on is_global for efficient querying
+CREATE INDEX IF NOT EXISTS idx_reusable_images_is_global ON reusable_images(is_global);
+
 -- AI Validation Performance Tracking
 CREATE TABLE IF NOT EXISTS ai_validation_performance (
     id SERIAL PRIMARY KEY,
@@ -83,3 +107,9 @@ CREATE TRIGGER update_ai_prompts_updated_at
 BEFORE UPDATE ON ai_prompts
 FOR EACH ROW
 EXECUTE FUNCTION update_updated_at_column();
+
+-- Trigger to automatically update the updated_at column for reusable_images
+CREATE TRIGGER update_reusable_images_updated_at
+BEFORE UPDATE ON reusable_images
+FOR EACH ROW
+EXECUTE FUNCTION update_updated_at_column();
```

```diff
@@ -10,9 +10,9 @@ const importPurchaseOrders = require('./import/purchase-orders');
 dotenv.config({ path: path.join(__dirname, "../.env") });

 // Constants to control which imports run
-const IMPORT_CATEGORIES = true;
-const IMPORT_PRODUCTS = true;
-const IMPORT_ORDERS = true;
+const IMPORT_CATEGORIES = false;
+const IMPORT_PRODUCTS = false;
+const IMPORT_ORDERS = false;
 const IMPORT_PURCHASE_ORDERS = true;

 // Add flag for incremental updates
@@ -120,6 +120,7 @@ async function main() {
   `);

   // Create import history record for the overall session
+  try {
     const [historyResult] = await localConnection.query(`
       INSERT INTO import_history (
         table_name,
@@ -141,6 +142,16 @@ async function main() {
       ) RETURNING id
     `, [INCREMENTAL_UPDATE, IMPORT_CATEGORIES, IMPORT_PRODUCTS, IMPORT_ORDERS, IMPORT_PURCHASE_ORDERS]);
     importHistoryId = historyResult.rows[0].id;
+  } catch (error) {
+    console.error("Error creating import history record:", error);
+    outputProgress({
+      status: "error",
+      operation: "Import process",
+      message: "Failed to create import history record",
+      error: error.message
+    });
+    throw error;
+  }

   const results = {
     categories: null,
@@ -158,8 +169,8 @@ async function main() {
     if (isImportCancelled) throw new Error("Import cancelled");
     completedSteps++;
     console.log('Categories import result:', results.categories);
-    totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0);
-    totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0);
+    totalRecordsAdded += parseInt(results.categories?.recordsAdded || 0) || 0;
+    totalRecordsUpdated += parseInt(results.categories?.recordsUpdated || 0) || 0;
   }

   if (IMPORT_PRODUCTS) {
@@ -167,8 +178,8 @@ async function main() {
     if (isImportCancelled) throw new Error("Import cancelled");
     completedSteps++;
     console.log('Products import result:', results.products);
-    totalRecordsAdded += parseInt(results.products?.recordsAdded || 0);
-    totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0);
+    totalRecordsAdded += parseInt(results.products?.recordsAdded || 0) || 0;
+    totalRecordsUpdated += parseInt(results.products?.recordsUpdated || 0) || 0;
   }

   if (IMPORT_ORDERS) {
@@ -176,17 +187,34 @@ async function main() {
     if (isImportCancelled) throw new Error("Import cancelled");
     completedSteps++;
     console.log('Orders import result:', results.orders);
-    totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0);
-    totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0);
+    totalRecordsAdded += parseInt(results.orders?.recordsAdded || 0) || 0;
+    totalRecordsUpdated += parseInt(results.orders?.recordsUpdated || 0) || 0;
   }

   if (IMPORT_PURCHASE_ORDERS) {
+    try {
       results.purchaseOrders = await importPurchaseOrders(prodConnection, localConnection, INCREMENTAL_UPDATE);
       if (isImportCancelled) throw new Error("Import cancelled");
       completedSteps++;
       console.log('Purchase orders import result:', results.purchaseOrders);
-      totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0);
-      totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0);
+
+      // Handle potential error status
+      if (results.purchaseOrders?.status === 'error') {
+        console.error('Purchase orders import had an error:', results.purchaseOrders.error);
+      } else {
+        totalRecordsAdded += parseInt(results.purchaseOrders?.recordsAdded || 0) || 0;
+        totalRecordsUpdated += parseInt(results.purchaseOrders?.recordsUpdated || 0) || 0;
+      }
+    } catch (error) {
+      console.error('Error during purchase orders import:', error);
+      // Continue with other imports, don't fail the whole process
+      results.purchaseOrders = {
+        status: 'error',
+        error: error.message,
+        recordsAdded: 0,
+        recordsUpdated: 0
+      };
+    }
   }

   const endTime = Date.now();
@@ -214,8 +242,8 @@ async function main() {
       WHERE id = $12
     `, [
       totalElapsedSeconds,
-      totalRecordsAdded,
-      totalRecordsUpdated,
+      parseInt(totalRecordsAdded) || 0,
+      parseInt(totalRecordsUpdated) || 0,
       IMPORT_CATEGORIES,
       IMPORT_PRODUCTS,
       IMPORT_ORDERS,
```

```diff
@@ -47,42 +47,18 @@ async function importCategories(prodConnection, localConnection) {
       continue;
     }

-    console.log(`\nProcessing ${categories.length} type ${type} categories`);
-    if (type === 10) {
-      console.log("Type 10 categories:", JSON.stringify(categories, null, 2));
-    }
+    console.log(`Processing ${categories.length} type ${type} categories`);

-    // For types that can have parents (11, 21, 12, 13), verify parent existence
+    // For types that can have parents (11, 21, 12, 13), we'll proceed directly
+    // No need to check for parent existence since we process in hierarchical order
     let categoriesToInsert = categories;
-    if (![10, 20].includes(type)) {
-      // Get all parent IDs
-      const parentIds = [
-        ...new Set(
-          categories
-            .filter(c => c && c.parent_id !== null)
-            .map(c => c.parent_id)
-        ),
-      ];
-
-      console.log(`Processing ${categories.length} type ${type} categories with ${parentIds.length} unique parent IDs`);
-      console.log('Parent IDs:', parentIds);
-
-      // No need to check for parent existence - we trust they exist since they were just inserted
-      categoriesToInsert = categories;
-    }
-
     if (categoriesToInsert.length === 0) {
-      console.log(
-        `No valid categories of type ${type} to insert`
-      );
+      console.log(`No valid categories of type ${type} to insert`);
       await localConnection.query(`RELEASE SAVEPOINT category_type_${type}`);
       continue;
     }

-    console.log(
-      `Inserting ${categoriesToInsert.length} type ${type} categories`
-    );
-
     // PostgreSQL upsert query with parameterized values
     const values = categoriesToInsert.flatMap((cat) => [
       cat.cat_id,
@@ -95,14 +71,10 @@ async function importCategories(prodConnection, localConnection) {
       new Date()
     ]);

-    console.log('Attempting to insert/update with values:', JSON.stringify(values, null, 2));
-
     const placeholders = categoriesToInsert
       .map((_, i) => `($${i * 8 + 1}, $${i * 8 + 2}, $${i * 8 + 3}, $${i * 8 + 4}, $${i * 8 + 5}, $${i * 8 + 6}, $${i * 8 + 7}, $${i * 8 + 8})`)
       .join(',');

-    console.log('Using placeholders:', placeholders);
-
     // Insert categories with ON CONFLICT clause for PostgreSQL
     const query = `
       WITH inserted_categories AS (
@@ -130,16 +102,13 @@ async function importCategories(prodConnection, localConnection) {
       COUNT(*) FILTER (WHERE NOT is_insert) as updated
     FROM inserted_categories`;

-    console.log('Executing query:', query);
-
     const result = await localConnection.query(query, values);
-    console.log('Query result:', result);

     // Get the first result since query returns an array
     const queryResult = Array.isArray(result) ? result[0] : result;

     if (!queryResult || !queryResult.rows || !queryResult.rows[0]) {
-      console.error('Query failed to return results. Result:', queryResult);
+      console.error('Query failed to return results');
       throw new Error('Query did not return expected results');
     }

@@ -173,6 +142,14 @@ async function importCategories(prodConnection, localConnection) {
     // Commit the entire transaction - we'll do this even if we have skipped categories
     await localConnection.query('COMMIT');
+
+    // Update sync status
+    await localConnection.query(`
+      INSERT INTO sync_status (table_name, last_sync_timestamp)
+      VALUES ('categories', NOW())
+      ON CONFLICT (table_name) DO UPDATE SET
```
|
||||||
|
last_sync_timestamp = NOW()
|
||||||
|
`);
|
||||||
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: "complete",
|
status: "complete",
|
||||||
operation: "Categories import completed",
|
operation: "Categories import completed",
|
||||||
|
|||||||
@@ -26,6 +26,9 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
   let cumulativeProcessedOrders = 0;
 
   try {
+    // Begin transaction
+    await localConnection.beginTransaction();
+
     // Get last sync info
     const [syncInfo] = await localConnection.query(
       "SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'orders'"
@@ -38,7 +41,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
     const [[{ total }]] = await prodConnection.query(`
       SELECT COUNT(*) as total
       FROM order_items oi
-      USE INDEX (PRIMARY)
       JOIN _order o ON oi.order_id = o.order_id
       WHERE o.order_status >= 15
         AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
@@ -78,7 +80,6 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
         COALESCE(oi.prod_price_reg - oi.prod_price, 0) as base_discount,
         oi.stamp as last_modified
       FROM order_items oi
-      USE INDEX (PRIMARY)
       JOIN _order o ON oi.order_id = o.order_id
       WHERE o.order_status >= 15
         AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
@@ -105,15 +106,15 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
 
     console.log('Orders: Found', orderItems.length, 'order items to process');
 
-    // Create tables in PostgreSQL for debugging
+    // Create tables in PostgreSQL for data processing
     await localConnection.query(`
-      DROP TABLE IF EXISTS debug_order_items;
-      DROP TABLE IF EXISTS debug_order_meta;
-      DROP TABLE IF EXISTS debug_order_discounts;
-      DROP TABLE IF EXISTS debug_order_taxes;
-      DROP TABLE IF EXISTS debug_order_costs;
+      DROP TABLE IF EXISTS temp_order_items;
+      DROP TABLE IF EXISTS temp_order_meta;
+      DROP TABLE IF EXISTS temp_order_discounts;
+      DROP TABLE IF EXISTS temp_order_taxes;
+      DROP TABLE IF EXISTS temp_order_costs;
 
-      CREATE TABLE debug_order_items (
+      CREATE TEMP TABLE temp_order_items (
        order_id INTEGER NOT NULL,
        pid INTEGER NOT NULL,
        SKU VARCHAR(50) NOT NULL,
@@ -123,7 +124,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        PRIMARY KEY (order_id, pid)
      );
 
-      CREATE TABLE debug_order_meta (
+      CREATE TEMP TABLE temp_order_meta (
        order_id INTEGER NOT NULL,
        date DATE NOT NULL,
        customer VARCHAR(100) NOT NULL,
@@ -135,26 +136,29 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        PRIMARY KEY (order_id)
      );
 
-      CREATE TABLE debug_order_discounts (
+      CREATE TEMP TABLE temp_order_discounts (
        order_id INTEGER NOT NULL,
        pid INTEGER NOT NULL,
        discount DECIMAL(10,2) NOT NULL,
        PRIMARY KEY (order_id, pid)
      );
 
-      CREATE TABLE debug_order_taxes (
+      CREATE TEMP TABLE temp_order_taxes (
        order_id INTEGER NOT NULL,
        pid INTEGER NOT NULL,
        tax DECIMAL(10,2) NOT NULL,
        PRIMARY KEY (order_id, pid)
      );
 
-      CREATE TABLE debug_order_costs (
+      CREATE TEMP TABLE temp_order_costs (
        order_id INTEGER NOT NULL,
        pid INTEGER NOT NULL,
        costeach DECIMAL(10,3) DEFAULT 0.000,
        PRIMARY KEY (order_id, pid)
      );
 
+      CREATE INDEX idx_temp_order_items_pid ON temp_order_items(pid);
+      CREATE INDEX idx_temp_order_meta_order_id ON temp_order_meta(order_id);
    `);
 
    // Insert order items in batches
@@ -168,7 +172,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);
 
      await localConnection.query(`
-        INSERT INTO debug_order_items (order_id, pid, SKU, price, quantity, base_discount)
+        INSERT INTO temp_order_items (order_id, pid, SKU, price, quantity, base_discount)
        VALUES ${placeholders}
        ON CONFLICT (order_id, pid) DO UPDATE SET
          SKU = EXCLUDED.SKU,
@@ -202,6 +206,14 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
    const METADATA_BATCH_SIZE = 2000;
    const PG_BATCH_SIZE = 200;
 
+    // Add a helper function for title case conversion
+    function toTitleCase(str) {
+      if (!str) return '';
+      return str.toLowerCase().split(' ').map(word => {
+        return word.charAt(0).toUpperCase() + word.slice(1);
+      }).join(' ');
+    }
+
    const processMetadataBatch = async (batchIds) => {
      const [orders] = await prodConnection.query(`
        SELECT
@@ -231,7 +243,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        order.order_id,
        order.date,
        order.customer,
-        order.customer_name || '',
+        toTitleCase(order.customer_name) || '',
        order.status,
        order.canceled,
        order.summary_discount || 0,
@@ -239,7 +251,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);
 
      await localConnection.query(`
-        INSERT INTO debug_order_meta (
+        INSERT INTO temp_order_meta (
          order_id, date, customer, customer_name, status, canceled,
          summary_discount, summary_subtotal
        )
@@ -281,7 +293,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);
 
      await localConnection.query(`
-        INSERT INTO debug_order_discounts (order_id, pid, discount)
+        INSERT INTO temp_order_discounts (order_id, pid, discount)
        VALUES ${placeholders}
        ON CONFLICT (order_id, pid) DO UPDATE SET
          discount = EXCLUDED.discount
@@ -321,7 +333,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);
 
      await localConnection.query(`
-        INSERT INTO debug_order_taxes (order_id, pid, tax)
+        INSERT INTO temp_order_taxes (order_id, pid, tax)
        VALUES ${placeholders}
        ON CONFLICT (order_id, pid) DO UPDATE SET
          tax = EXCLUDED.tax
@@ -330,14 +342,23 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
    };
 
    const processCostsBatch = async (batchIds) => {
+      // Modified query to ensure one row per order_id/pid by using a subquery
      const [costs] = await prodConnection.query(`
        SELECT
          oc.orderid as order_id,
          oc.pid,
          oc.costeach
        FROM order_costs oc
-        WHERE oc.orderid IN (?)
-        AND oc.pending = 0
+        INNER JOIN (
+          SELECT
+            orderid,
+            pid,
+            MAX(id) as max_id
+          FROM order_costs
+          WHERE orderid IN (?)
+          AND pending = 0
+          GROUP BY orderid, pid
+        ) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
      `, [batchIds]);
 
      if (costs.length === 0) return;
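The `processCostsBatch` change above replaces a flat `WHERE oc.orderid IN (?)` filter with a self-join on `MAX(id)`, so only the most recent `order_costs` row survives for each `(orderid, pid)` pair. The same dedup rule can be sketched in plain JavaScript (the row shape and sample data are hypothetical):

```javascript
// Keep only the row with the highest id per (orderid, pid) pair —
// the in-memory equivalent of the MAX(id) self-join in the SQL above.
function latestCosts(rows) {
  const latest = new Map();
  for (const row of rows) {
    const key = `${row.orderid}:${row.pid}`;
    const prev = latest.get(key);
    if (!prev || row.id > prev.id) latest.set(key, row);
  }
  return [...latest.values()];
}

const rows = [
  { id: 1, orderid: 10, pid: 7, costeach: 2.5 },
  { id: 4, orderid: 10, pid: 7, costeach: 2.75 }, // later correction wins
  { id: 2, orderid: 10, pid: 9, costeach: 1.0 },
];
console.log(latestCosts(rows)); // keeps the id 4 and id 2 rows
```

Without this dedup, duplicate `(order_id, pid)` rows from `order_costs` would collide on the temp table's primary key during insert.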
@@ -357,7 +378,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      ]);
 
      await localConnection.query(`
-        INSERT INTO debug_order_costs (order_id, pid, costeach)
+        INSERT INTO temp_order_costs (order_id, pid, costeach)
        VALUES ${placeholders}
        ON CONFLICT (order_id, pid) DO UPDATE SET
          costeach = EXCLUDED.costeach
@@ -416,11 +437,12 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        oi.pid,
        SUM(COALESCE(od.discount, 0)) as promo_discount,
        COALESCE(ot.tax, 0) as total_tax,
-        COALESCE(oi.price * 0.5, 0) as costeach
-      FROM debug_order_items oi
-      LEFT JOIN debug_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
-      LEFT JOIN debug_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
-      GROUP BY oi.order_id, oi.pid, ot.tax
+        COALESCE(oc.costeach, oi.price * 0.5) as costeach
+      FROM temp_order_items oi
+      LEFT JOIN temp_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
+      LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
+      LEFT JOIN temp_order_costs oc ON oi.order_id = oc.order_id AND oi.pid = oc.pid
+      GROUP BY oi.order_id, oi.pid, ot.tax, oc.costeach
      )
      SELECT
        oi.order_id as order_number,
@@ -447,11 +469,11 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      FROM (
        SELECT DISTINCT ON (order_id, pid)
          order_id, pid, SKU, price, quantity, base_discount
-        FROM debug_order_items
+        FROM temp_order_items
        WHERE order_id = ANY($1)
        ORDER BY order_id, pid
      ) oi
-      JOIN debug_order_meta om ON oi.order_id = om.order_id
+      JOIN temp_order_meta om ON oi.order_id = om.order_id
      LEFT JOIN order_totals ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
      ORDER BY oi.order_id, oi.pid
    `, [subBatchIds]);
@@ -478,8 +500,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      const subBatch = validOrders.slice(k, k + FINAL_BATCH_SIZE);
 
      const placeholders = subBatch.map((_, idx) => {
-        const base = idx * 14; // 14 columns (removed updated)
-        return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14})`;
+        const base = idx * 15; // 15 columns including costeach
+        return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
      }).join(',');
 
      const batchValues = subBatch.flatMap(o => [
@@ -496,7 +518,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        o.customer,
        o.customer_name,
        o.status,
-        o.canceled
+        o.canceled,
+        o.costeach
      ]);
 
      const [result] = await localConnection.query(`
@@ -504,7 +527,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        INSERT INTO orders (
          order_number, pid, sku, date, price, quantity, discount,
          tax, tax_included, shipping, customer, customer_name,
-          status, canceled
+          status, canceled, costeach
        )
        VALUES ${placeholders}
        ON CONFLICT (order_number, pid) DO UPDATE SET
@@ -519,7 +542,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
          customer = EXCLUDED.customer,
          customer_name = EXCLUDED.customer_name,
          status = EXCLUDED.status,
-          canceled = EXCLUDED.canceled
+          canceled = EXCLUDED.canceled,
+          costeach = EXCLUDED.costeach
        RETURNING xmax = 0 as inserted
        )
        SELECT
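The upsert above ends with `RETURNING xmax = 0 as inserted`, a PostgreSQL idiom: the system column `xmax` is zero for a freshly inserted row and non-zero for a row rewritten by `ON CONFLICT ... DO UPDATE`, so the flag distinguishes inserts from updates in a single statement. Driver-side, the per-row flags can be tallied like this (a sketch; the row shape is hypothetical):

```javascript
// Count inserts vs. updates from the boolean flag returned by
// RETURNING xmax = 0 AS inserted on each upserted row.
function tallyUpsert(rows) {
  let inserted = 0;
  let updated = 0;
  for (const { inserted: isInsert } of rows) {
    if (isInsert) inserted++;
    else updated++;
  }
  return { inserted, updated };
}

console.log(tallyUpsert([{ inserted: true }, { inserted: false }, { inserted: true }]));
```

The script instead aggregates the flags in SQL with `COUNT(*) FILTER (...)`, which avoids shipping every row back to Node.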
@@ -529,8 +553,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
      `, batchValues);
 
      const { inserted, updated } = result.rows[0];
-      recordsAdded += inserted;
-      recordsUpdated += updated;
+      recordsAdded += parseInt(inserted) || 0;
+      recordsUpdated += parseInt(updated) || 0;
      importedCount += subBatch.length;
    }
 
@@ -556,18 +580,38 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
        last_sync_timestamp = NOW()
    `);
 
+    // Cleanup temporary tables
+    await localConnection.query(`
+      DROP TABLE IF EXISTS temp_order_items;
+      DROP TABLE IF EXISTS temp_order_meta;
+      DROP TABLE IF EXISTS temp_order_discounts;
+      DROP TABLE IF EXISTS temp_order_taxes;
+      DROP TABLE IF EXISTS temp_order_costs;
+    `);
+
+    // Commit transaction
+    await localConnection.commit();
+
    return {
      status: "complete",
-      totalImported: Math.floor(importedCount),
-      recordsAdded: recordsAdded || 0,
-      recordsUpdated: Math.floor(recordsUpdated),
-      totalSkipped: skippedOrders.size,
-      missingProducts: missingProducts.size,
+      totalImported: Math.floor(importedCount) || 0,
+      recordsAdded: parseInt(recordsAdded) || 0,
+      recordsUpdated: parseInt(recordsUpdated) || 0,
+      totalSkipped: skippedOrders.size || 0,
+      missingProducts: missingProducts.size || 0,
      incrementalUpdate,
      lastSyncTime
    };
  } catch (error) {
    console.error("Error during orders import:", error);
 
+    // Rollback transaction
+    try {
+      await localConnection.rollback();
+    } catch (rollbackError) {
+      console.error("Error during rollback:", rollbackError);
+    }
+
    throw error;
  }
}
@@ -1,10 +1,13 @@
 const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');
-const BATCH_SIZE = 100; // Smaller batch size for better progress tracking
+const BATCH_SIZE = 1000; // Smaller batch size for better progress tracking
 const MAX_RETRIES = 3;
 const RETRY_DELAY = 5000; // 5 seconds
+const dotenv = require("dotenv");
+const path = require("path");
+dotenv.config({ path: path.join(__dirname, "../../.env") });
 
 // Utility functions
-const imageUrlBase = 'https://sbing.com/i/products/0000/';
+const imageUrlBase = process.env.PRODUCT_IMAGE_URL_BASE || 'https://sbing.com/i/products/0000/';
 const getImageUrls = (pid, iid = 1) => {
   const paddedPid = pid.toString().padStart(6, '0');
   // Use padded PID only for the first 3 digits
@@ -18,7 +21,7 @@ const getImageUrls = (pid, iid = 1) => {
   };
 };
 
-// Add helper function for retrying operations
+// Add helper function for retrying operations with exponential backoff
 async function withRetry(operation, errorMessage) {
   let lastError;
   for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
@@ -28,7 +31,8 @@ async function withRetry(operation, errorMessage) {
       lastError = error;
       console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
       if (attempt < MAX_RETRIES) {
-        await new Promise(resolve => setTimeout(resolve, RETRY_DELAY));
+        const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
+        await new Promise(resolve => setTimeout(resolve, backoffTime));
       }
     }
   }
 }
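With this change the fixed `RETRY_DELAY` becomes the base of an exponential backoff: attempt 1 waits `RETRY_DELAY`, attempt 2 twice that, attempt 3 four times, and so on. A self-contained sketch of the resulting helper (delays shortened for illustration; the final `throw lastError` is assumed from surrounding code the hunk does not show):

```javascript
const MAX_RETRIES = 3;
const RETRY_DELAY = 10; // ms in this sketch; 5000 in the real script

// Retry an async operation, doubling the wait between attempts.
async function withRetry(operation, errorMessage) {
  let lastError;
  for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
    try {
      return await operation();
    } catch (error) {
      lastError = error;
      console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES})`);
      if (attempt < MAX_RETRIES) {
        const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
        await new Promise(resolve => setTimeout(resolve, backoffTime));
      }
    }
  }
  throw lastError; // assumed: surface the last failure after all retries
}
```

Backoff gives a transient failure (a dropped SSH tunnel, a busy MySQL server) progressively more time to clear instead of hammering it at a fixed interval.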
@@ -140,10 +144,12 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
       CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
       CASE
         WHEN p.reorder < 0 THEN 0
+        WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
+        WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
         WHEN (
           (COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
-          OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
-          OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
+          AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
+          AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
         ) THEN 0
         ELSE 1
       END AS replenishable,
@@ -155,7 +161,11 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
       COALESCE(p.sellingprice, 0) AS regular_price,
       CASE
         WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
-        THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
+        THEN (
+          SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
+          FROM product_inventory
+          WHERE pid = p.pid AND count > 0
+        )
         ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
       END AS cost_price,
       NULL as landing_cost_price,
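The `cost_price` hunk above swaps a simple `AVG(costeach)` for the quantity-weighted average `SUM(costeach * count) / SUM(count)`. When lot sizes are uneven the two can differ substantially, which is the point of the change; a sketch with hypothetical inventory lots:

```javascript
// Simple average treats every lot equally, regardless of how many
// units it holds.
function simpleAverage(lots) {
  return lots.reduce((s, l) => s + l.costeach, 0) / lots.length;
}

// Weighted average reflects the true per-unit cost of on-hand stock,
// mirroring SUM(costeach * count) / SUM(count) in the SQL above.
function weightedAverage(lots) {
  const totalCost = lots.reduce((s, l) => s + l.costeach * l.count, 0);
  const totalCount = lots.reduce((s, l) => s + l.count, 0);
  return totalCost / totalCount;
}

const lots = [
  { costeach: 1.0, count: 90 }, // large cheap lot
  { costeach: 5.0, count: 10 }, // small expensive lot
];
console.log(simpleAverage(lots));   // → 3
console.log(weightedAverage(lots)); // → 1.4
```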
@@ -183,7 +193,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
|||||||
p.country_of_origin,
|
p.country_of_origin,
|
||||||
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
||||||
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
||||||
p.totalsold AS total_sold,
|
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
|
||||||
pls.date_sold as date_last_sold,
|
pls.date_sold as date_last_sold,
|
||||||
GROUP_CONCAT(DISTINCT CASE
|
GROUP_CONCAT(DISTINCT CASE
|
||||||
WHEN pc.cat_id IS NOT NULL
|
WHEN pc.cat_id IS NOT NULL
|
||||||
@@ -233,7 +243,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
|
|||||||
row.pid,
|
row.pid,
|
||||||
row.title,
|
row.title,
|
||||||
row.description,
|
row.description,
|
||||||
row.itemnumber || '',
|
row.sku || '',
|
||||||
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
||||||
row.preorder_count,
|
row.preorder_count,
|
||||||
row.notions_inv_count,
|
row.notions_inv_count,
|
||||||
@@ -335,10 +345,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
|||||||
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
||||||
CASE
|
CASE
|
||||||
WHEN p.reorder < 0 THEN 0
|
WHEN p.reorder < 0 THEN 0
|
||||||
|
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
|
||||||
|
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
|
||||||
WHEN (
|
WHEN (
|
||||||
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
|
||||||
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||||
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
|
||||||
) THEN 0
|
) THEN 0
|
||||||
ELSE 1
|
ELSE 1
|
||||||
END AS replenishable,
|
END AS replenishable,
|
||||||
@@ -350,7 +362,11 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
|||||||
COALESCE(p.sellingprice, 0) AS regular_price,
|
COALESCE(p.sellingprice, 0) AS regular_price,
|
||||||
CASE
|
CASE
|
||||||
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||||
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
|
THEN (
|
||||||
|
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
|
||||||
|
FROM product_inventory
|
||||||
|
WHERE pid = p.pid AND count > 0
|
||||||
|
)
|
||||||
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
||||||
END AS cost_price,
|
END AS cost_price,
|
||||||
NULL as landing_cost_price,
|
NULL as landing_cost_price,
|
||||||
@@ -378,7 +394,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
|||||||
p.country_of_origin,
|
p.country_of_origin,
|
||||||
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
||||||
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
||||||
p.totalsold AS total_sold,
|
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
|
||||||
pls.date_sold as date_last_sold,
|
pls.date_sold as date_last_sold,
|
||||||
GROUP_CONCAT(DISTINCT CASE
|
GROUP_CONCAT(DISTINCT CASE
|
||||||
WHEN pc.cat_id IS NOT NULL
|
WHEN pc.cat_id IS NOT NULL
|
||||||
@@ -432,7 +448,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
|
|||||||
row.pid,
|
row.pid,
|
||||||
row.title,
|
row.title,
|
||||||
row.description,
|
row.description,
|
||||||
row.itemnumber || '',
|
row.sku || '',
|
||||||
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
|
||||||
row.preorder_count,
|
row.preorder_count,
|
||||||
row.notions_inv_count,
|
row.notions_inv_count,
|
||||||
@@ -772,31 +788,43 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
|
|||||||
recordsAdded += parseInt(result.rows[0].inserted, 10) || 0;
|
recordsAdded += parseInt(result.rows[0].inserted, 10) || 0;
|
||||||
recordsUpdated += parseInt(result.rows[0].updated, 10) || 0;
|
recordsUpdated += parseInt(result.rows[0].updated, 10) || 0;
|
||||||
|
|
```diff
-      // Process category relationships for each product in the batch
+      // Process category relationships in batches
+      const allCategories = [];
       for (const row of batch) {
         if (row.categories) {
           const categoryIds = row.categories.split(',').filter(id => id && id.trim());
           if (categoryIds.length > 0) {
-            const catPlaceholders = categoryIds.map((_, idx) =>
-              `($${idx * 2 + 1}, $${idx * 2 + 2})`
-            ).join(',');
-            const catValues = categoryIds.flatMap(catId => [row.pid, parseInt(catId.trim(), 10)]);
-
-            // First delete existing relationships for this product
-            await localConnection.query(
-              'DELETE FROM product_categories WHERE pid = $1',
-              [row.pid]
-            );
-
-            // Then insert the new relationships
-            await localConnection.query(`
-              INSERT INTO product_categories (pid, cat_id)
-              VALUES ${catPlaceholders}
-              ON CONFLICT (pid, cat_id) DO NOTHING
-            `, catValues);
+            categoryIds.forEach(catId => {
+              allCategories.push([row.pid, parseInt(catId.trim(), 10)]);
+            });
           }
         }
       }
+
+      // If we have categories to process
+      if (allCategories.length > 0) {
+        // First get all products in this batch
+        const productIds = batch.map(p => p.pid);
+
+        // Delete all existing relationships for products in this batch
+        await localConnection.query(
+          'DELETE FROM product_categories WHERE pid = ANY($1)',
+          [productIds]
+        );
+
+        // Insert all new relationships in one batch
+        const catPlaceholders = allCategories.map((_, idx) =>
+          `($${idx * 2 + 1}, $${idx * 2 + 2})`
+        ).join(',');
+
+        const catValues = allCategories.flat();
+
+        await localConnection.query(`
+          INSERT INTO product_categories (pid, cat_id)
+          VALUES ${catPlaceholders}
+          ON CONFLICT (pid, cat_id) DO NOTHING
+        `, catValues);
+      }
 
       outputProgress({
         status: "running",
```
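The rewritten hunk above collects every (pid, cat_id) pair first, then issues a single `DELETE ... WHERE pid = ANY($1)` and one multi-row `INSERT` instead of two queries per product. The placeholder string it builds can be sketched in isolation; `buildPairPlaceholders` is an illustrative helper, not a function from this repo:

```javascript
// Hypothetical helper mirroring the placeholder pattern in the hunk above:
// for N value pairs it yields "($1, $2),($3, $4),..." for a multi-row INSERT.
function buildPairPlaceholders(pairs) {
  return pairs.map((_, idx) => `($${idx * 2 + 1}, $${idx * 2 + 2})`).join(',');
}

const allCategories = [[101, 7], [101, 9], [102, 7]];
console.log(buildPairPlaceholders(allCategories)); // ($1, $2),($3, $4),($5, $6)
console.log(allCategories.flat());                 // [ 101, 7, 101, 9, 102, 7 ]
```

The flattened array lines up positionally with the generated `$n` placeholders, which is what lets a single parameterized query carry the whole batch.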
```diff
@@ -816,6 +844,14 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
     // Commit the transaction
     await localConnection.commit();
 
+    // Update sync status
+    await localConnection.query(`
+      INSERT INTO sync_status (table_name, last_sync_timestamp)
+      VALUES ('products', NOW())
+      ON CONFLICT (table_name) DO UPDATE SET
+        last_sync_timestamp = NOW()
+    `);
+
     return {
       status: 'complete',
       recordsAdded,
```
*(File diff suppressed because it is too large.)*
```diff
@@ -184,7 +184,7 @@ async function resetDatabase() {
       SELECT string_agg(tablename, ', ') as tables
       FROM pg_tables
       WHERE schemaname = 'public'
-      AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history');
+      AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history', 'ai_prompts', 'ai_validation_performance', 'templates');
     `);
 
     if (!tablesResult.rows[0].tables) {
```
**inventory-server/scripts/update-order-costs.js** — new file, 337 lines

```js
/**
 * This script updates the costeach values for existing orders from the original MySQL database
 * without needing to run the full import process.
 */
const dotenv = require("dotenv");
const path = require("path");
const fs = require("fs");
const { setupConnections, closeConnections } = require('./import/utils');
const { outputProgress, formatElapsedTime } = require('./metrics/utils/progress');

dotenv.config({ path: path.join(__dirname, "../.env") });

// SSH configuration
const sshConfig = {
  ssh: {
    host: process.env.PROD_SSH_HOST,
    port: process.env.PROD_SSH_PORT || 22,
    username: process.env.PROD_SSH_USER,
    privateKey: process.env.PROD_SSH_KEY_PATH
      ? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
      : undefined,
    compress: true, // Enable SSH compression
  },
  prodDbConfig: {
    // MySQL config for production
    host: process.env.PROD_DB_HOST || "localhost",
    user: process.env.PROD_DB_USER,
    password: process.env.PROD_DB_PASSWORD,
    database: process.env.PROD_DB_NAME,
    port: process.env.PROD_DB_PORT || 3306,
    timezone: 'Z',
  },
  localDbConfig: {
    // PostgreSQL config for local
    host: process.env.DB_HOST,
    user: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    database: process.env.DB_NAME,
    port: process.env.DB_PORT || 5432,
    ssl: process.env.DB_SSL === 'true',
    connectionTimeoutMillis: 60000,
    idleTimeoutMillis: 30000,
    max: 10 // connection pool max size
  }
};

async function updateOrderCosts() {
  const startTime = Date.now();
  let connections;
  let updatedCount = 0;
  let errorCount = 0;

  try {
    outputProgress({
      status: "running",
      operation: "Order costs update",
      message: "Initializing SSH tunnel..."
    });

    connections = await setupConnections(sshConfig);
    const { prodConnection, localConnection } = connections;

    // 1. Get all orders from local database that need cost updates
    outputProgress({
      status: "running",
      operation: "Order costs update",
      message: "Getting orders from local database..."
    });

    const [orders] = await localConnection.query(`
      SELECT DISTINCT order_number, pid
      FROM orders
      WHERE costeach = 0 OR costeach IS NULL
      ORDER BY order_number
    `);

    if (!orders || !orders.rows || orders.rows.length === 0) {
      console.log("No orders found that need cost updates");
      return { updatedCount: 0, errorCount: 0 };
    }

    const totalOrders = orders.rows.length;
    console.log(`Found ${totalOrders} orders that need cost updates`);

    // Process in batches of 500 orders
    const BATCH_SIZE = 500;
    for (let i = 0; i < orders.rows.length; i += BATCH_SIZE) {
      try {
        // Start transaction for this batch
        await localConnection.beginTransaction();

        const batch = orders.rows.slice(i, i + BATCH_SIZE);

        const orderNumbers = [...new Set(batch.map(o => o.order_number))];

        // 2. Fetch costs from production database for these orders
        outputProgress({
          status: "running",
          operation: "Order costs update",
          message: `Fetching costs for orders ${i + 1} to ${Math.min(i + BATCH_SIZE, totalOrders)} of ${totalOrders}`,
          current: i,
          total: totalOrders,
          elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
        });

        const [costs] = await prodConnection.query(`
          SELECT
            oc.orderid as order_number,
            oc.pid,
            oc.costeach
          FROM order_costs oc
          INNER JOIN (
            SELECT
              orderid,
              pid,
              MAX(id) as max_id
            FROM order_costs
            WHERE orderid IN (?)
              AND pending = 0
            GROUP BY orderid, pid
          ) latest ON oc.orderid = latest.orderid AND oc.pid = latest.pid AND oc.id = latest.max_id
        `, [orderNumbers]);

        // Create a map of costs for easy lookup
        const costMap = {};
        if (costs && costs.length) {
          costs.forEach(c => {
            costMap[`${c.order_number}-${c.pid}`] = c.costeach || 0;
          });
        }

        // 3. Update costs in local database by batches
        // Using a more efficient update approach with a temporary table

        // Create a temporary table for each batch
        await localConnection.query(`
          DROP TABLE IF EXISTS temp_order_costs;
          CREATE TEMP TABLE temp_order_costs (
            order_number VARCHAR(50) NOT NULL,
            pid BIGINT NOT NULL,
            costeach DECIMAL(10,3) NOT NULL,
            PRIMARY KEY (order_number, pid)
          );
        `);

        // Insert cost data into the temporary table
        const costEntries = [];
        for (const order of batch) {
          const key = `${order.order_number}-${order.pid}`;
          if (key in costMap) {
            costEntries.push({
              order_number: order.order_number,
              pid: order.pid,
              costeach: costMap[key]
            });
          }
        }

        // Insert in sub-batches of 50
        const DB_BATCH_SIZE = 50;
        for (let j = 0; j < costEntries.length; j += DB_BATCH_SIZE) {
          const subBatch = costEntries.slice(j, j + DB_BATCH_SIZE);
          if (subBatch.length === 0) continue;

          const placeholders = subBatch.map((_, idx) =>
            `($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
          ).join(',');

          const values = subBatch.flatMap(item => [
            item.order_number,
            item.pid,
            item.costeach
          ]);

          await localConnection.query(`
            INSERT INTO temp_order_costs (order_number, pid, costeach)
            VALUES ${placeholders}
          `, values);
        }

        // Perform bulk update from the temporary table
        const [updateResult] = await localConnection.query(`
          UPDATE orders o
          SET costeach = t.costeach
          FROM temp_order_costs t
          WHERE o.order_number = t.order_number AND o.pid = t.pid
          RETURNING o.id
        `);

        const batchUpdated = updateResult.rowCount || 0;
        updatedCount += batchUpdated;

        // Commit transaction for this batch
        await localConnection.commit();

        outputProgress({
          status: "running",
          operation: "Order costs update",
          message: `Updated ${updatedCount} orders with costs from production (batch: ${batchUpdated})`,
          current: i + batch.length,
          total: totalOrders,
          elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
        });
      } catch (error) {
        // If a batch fails, roll back that batch's transaction and continue
        try {
          await localConnection.rollback();
        } catch (rollbackError) {
          console.error("Error during batch rollback:", rollbackError);
        }

        console.error(`Error processing batch ${i}-${i + BATCH_SIZE}:`, error);
        errorCount++;
      }
    }

    // 4. For orders with no matching costs, set a default based on price
    outputProgress({
      status: "running",
      operation: "Order costs update",
      message: "Setting default costs for remaining orders..."
    });

    // Process remaining updates in smaller batches
    const DEFAULT_BATCH_SIZE = 10000;
    let totalDefaultUpdated = 0;

    try {
      // Start with a count query to determine how many records need the default update
      const [countResult] = await localConnection.query(`
        SELECT COUNT(*) as count FROM orders
        WHERE (costeach = 0 OR costeach IS NULL)
      `);

      const totalToUpdate = parseInt(countResult.rows[0]?.count || 0);

      if (totalToUpdate > 0) {
        console.log(`Applying default cost to ${totalToUpdate} orders`);

        // Apply the default in batches with separate transactions
        for (let i = 0; i < totalToUpdate; i += DEFAULT_BATCH_SIZE) {
          try {
            await localConnection.beginTransaction();

            const [defaultUpdates] = await localConnection.query(`
              WITH orders_to_update AS (
                SELECT id FROM orders
                WHERE (costeach = 0 OR costeach IS NULL)
                LIMIT ${DEFAULT_BATCH_SIZE}
              )
              UPDATE orders o
              SET costeach = price * 0.5
              FROM orders_to_update otu
              WHERE o.id = otu.id
              RETURNING o.id
            `);

            const batchDefaultUpdated = defaultUpdates.rowCount || 0;
            totalDefaultUpdated += batchDefaultUpdated;

            await localConnection.commit();

            outputProgress({
              status: "running",
              operation: "Order costs update",
              message: `Applied default costs to ${totalDefaultUpdated} of ${totalToUpdate} orders`,
              current: totalDefaultUpdated,
              total: totalToUpdate,
              elapsed: formatElapsedTime((Date.now() - startTime) / 1000)
            });
          } catch (error) {
            try {
              await localConnection.rollback();
            } catch (rollbackError) {
              console.error("Error during default update rollback:", rollbackError);
            }

            console.error(`Error applying default costs batch ${i}-${i + DEFAULT_BATCH_SIZE}:`, error);
            errorCount++;
          }
        }
      }
    } catch (error) {
      console.error("Error counting or updating remaining orders:", error);
      errorCount++;
    }

    updatedCount += totalDefaultUpdated;

    const endTime = Date.now();
    const totalSeconds = (endTime - startTime) / 1000;

    outputProgress({
      status: "complete",
      operation: "Order costs update",
      message: `Updated ${updatedCount} orders (${totalDefaultUpdated} with default values) in ${formatElapsedTime(totalSeconds)}`,
      elapsed: formatElapsedTime(totalSeconds)
    });

    return {
      status: "complete",
      updatedCount,
      errorCount
    };
  } catch (error) {
    console.error("Error during order costs update:", error);

    return {
      status: "error",
      error: error.message,
      updatedCount,
      errorCount
    };
  } finally {
    if (connections) {
      await closeConnections(connections).catch(err => {
        console.error("Error closing connections:", err);
      });
    }
  }
}

// Run the script only if this is the main module
if (require.main === module) {
  updateOrderCosts().then((results) => {
    console.log('Cost update completed:', results);
    // Force exit after a small delay to ensure all logs are written
    setTimeout(() => process.exit(0), 500);
  }).catch((error) => {
    console.error("Unhandled error:", error);
    // Force exit with error code after a small delay
    setTimeout(() => process.exit(1), 500);
  });
}

// Export the function for use in other scripts
module.exports = updateOrderCosts;
```
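Every phase of the script above walks its worklist in fixed-size slices, each wrapped in its own transaction so that one failed batch rolls back without discarding the others. The slicing itself reduces to a small helper; `chunk` is an illustrative sketch, not a function exported by the script:

```javascript
// Hypothetical sketch of the slice-based batching used throughout the script above.
function chunk(items, batchSize) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Five items with batchSize 2 yield three batches; the last one is short.
console.log(chunk([1, 2, 3, 4, 5], 2)); // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
```

Keeping the per-batch transaction boundary aligned with these slices is what lets the `catch` blocks increment `errorCount` and continue instead of aborting the whole run.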
```diff
@@ -757,8 +757,8 @@ router.get('/history/import', async (req, res) => {
         end_time,
         status,
         error_message,
-        rows_processed::integer,
-        files_processed::integer
+        records_added::integer,
+        records_updated::integer
       FROM import_history
       ORDER BY start_time DESC
       LIMIT 20
```
**inventory-server/src/routes/reusable-images.js** — new file, 396 lines

```js
const express = require('express');
const router = express.Router();
const multer = require('multer');
const path = require('path');
const fs = require('fs');

// Create reusable uploads directory if it doesn't exist
const uploadsDir = path.join('/var/www/html/inventory/uploads/reusable');
fs.mkdirSync(uploadsDir, { recursive: true });

// Configure multer for file uploads
const storage = multer.diskStorage({
  destination: function (req, file, cb) {
    console.log(`Saving reusable image to: ${uploadsDir}`);
    cb(null, uploadsDir);
  },
  filename: function (req, file, cb) {
    // Create unique filename with original extension
    const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);

    // Make sure we preserve the original file extension
    let fileExt = path.extname(file.originalname).toLowerCase();

    // Ensure there is a proper extension based on mimetype if none exists
    if (!fileExt) {
      switch (file.mimetype) {
        case 'image/jpeg': fileExt = '.jpg'; break;
        case 'image/png': fileExt = '.png'; break;
        case 'image/gif': fileExt = '.gif'; break;
        case 'image/webp': fileExt = '.webp'; break;
        default: fileExt = '.jpg'; // Default to jpg
      }
    }

    const fileName = `reusable-${uniqueSuffix}${fileExt}`;
    console.log(`Generated filename: ${fileName} with mimetype: ${file.mimetype}`);
    cb(null, fileName);
  }
});

const upload = multer({
  storage: storage,
  limits: {
    fileSize: 5 * 1024 * 1024, // 5MB max file size
  },
  fileFilter: function (req, file, cb) {
    // Accept only image files
    const filetypes = /jpeg|jpg|png|gif|webp/;
    const mimetype = filetypes.test(file.mimetype);
    const extname = filetypes.test(path.extname(file.originalname).toLowerCase());

    if (mimetype && extname) {
      return cb(null, true);
    }
    cb(new Error('Only image files are allowed'));
  }
});

// Get all reusable images
router.get('/', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM reusable_images
      ORDER BY created_at DESC
    `);
    res.json(result.rows);
  } catch (error) {
    console.error('Error fetching reusable images:', error);
    res.status(500).json({
      error: 'Failed to fetch reusable images',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get images by company or global images
router.get('/by-company/:companyId', async (req, res) => {
  try {
    const { companyId } = req.params;
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    // Get images that are either global or belong to this company
    const result = await pool.query(`
      SELECT * FROM reusable_images
      WHERE is_global = true OR company = $1
      ORDER BY created_at DESC
    `, [companyId]);

    res.json(result.rows);
  } catch (error) {
    console.error('Error fetching reusable images by company:', error);
    res.status(500).json({
      error: 'Failed to fetch reusable images by company',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get global images only
router.get('/global', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM reusable_images
      WHERE is_global = true
      ORDER BY created_at DESC
    `);

    res.json(result.rows);
  } catch (error) {
    console.error('Error fetching global reusable images:', error);
    res.status(500).json({
      error: 'Failed to fetch global reusable images',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Get a single image by ID
router.get('/:id', async (req, res) => {
  try {
    const { id } = req.params;
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    const result = await pool.query(`
      SELECT * FROM reusable_images
      WHERE id = $1
    `, [id]);

    if (result.rows.length === 0) {
      return res.status(404).json({ error: 'Reusable image not found' });
    }

    res.json(result.rows[0]);
  } catch (error) {
    console.error('Error fetching reusable image:', error);
    res.status(500).json({
      error: 'Failed to fetch reusable image',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Upload a new reusable image
router.post('/upload', upload.single('image'), async (req, res) => {
  try {
    if (!req.file) {
      return res.status(400).json({ error: 'No image file provided' });
    }

    const { name, is_global, company } = req.body;

    // Validate required fields
    if (!name) {
      return res.status(400).json({ error: 'Image name is required' });
    }

    // Convert is_global from string to boolean
    const isGlobal = is_global === 'true' || is_global === true;

    // Validate company is provided for non-global images
    if (!isGlobal && !company) {
      return res.status(400).json({ error: 'Company is required for non-global images' });
    }

    // Log file information
    console.log('Reusable image uploaded:', {
      filename: req.file.filename,
      originalname: req.file.originalname,
      mimetype: req.file.mimetype,
      size: req.file.size,
      path: req.file.path
    });

    // Ensure the file exists
    const filePath = path.join(uploadsDir, req.file.filename);
    if (!fs.existsSync(filePath)) {
      return res.status(500).json({ error: 'File was not saved correctly' });
    }

    // Create URL for the uploaded file
    const baseUrl = 'https://inventory.acot.site';
    const imageUrl = `${baseUrl}/uploads/reusable/${req.file.filename}`;

    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    // Insert record into database
    const result = await pool.query(`
      INSERT INTO reusable_images (
        name,
        filename,
        file_path,
        image_url,
        is_global,
        company,
        mime_type,
        file_size
      ) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
      RETURNING *
    `, [
      name,
      req.file.filename,
      filePath,
      imageUrl,
      isGlobal,
      isGlobal ? null : company,
      req.file.mimetype,
      req.file.size
    ]);

    // Return success response with image data
    res.status(201).json({
      success: true,
      image: result.rows[0],
      message: 'Image uploaded successfully'
    });

  } catch (error) {
    console.error('Error uploading reusable image:', error);
    res.status(500).json({ error: error.message || 'Failed to upload image' });
  }
});

// Update image details (name, is_global, company)
router.put('/:id', async (req, res) => {
  try {
    const { id } = req.params;
    const { name, is_global, company } = req.body;

    // Validate required fields
    if (!name) {
      return res.status(400).json({ error: 'Image name is required' });
    }

    // Convert is_global from string to boolean if necessary
    const isGlobal = typeof is_global === 'string' ? is_global === 'true' : !!is_global;

    // Validate company is provided for non-global images
    if (!isGlobal && !company) {
      return res.status(400).json({ error: 'Company is required for non-global images' });
    }

    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    // Check if the image exists
    const checkResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
    if (checkResult.rows.length === 0) {
      return res.status(404).json({ error: 'Reusable image not found' });
    }

    const result = await pool.query(`
      UPDATE reusable_images
      SET
        name = $1,
        is_global = $2,
        company = $3
      WHERE id = $4
      RETURNING *
    `, [
      name,
      isGlobal,
      isGlobal ? null : company,
      id
    ]);

    res.json(result.rows[0]);
  } catch (error) {
    console.error('Error updating reusable image:', error);
    res.status(500).json({
      error: 'Failed to update reusable image',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Delete a reusable image
router.delete('/:id', async (req, res) => {
  try {
    const { id } = req.params;
    const pool = req.app.locals.pool;
    if (!pool) {
      throw new Error('Database pool not initialized');
    }

    // Get the image data first to get the filename
    const imageResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);

    if (imageResult.rows.length === 0) {
      return res.status(404).json({ error: 'Reusable image not found' });
    }

    const image = imageResult.rows[0];

    // Delete from database
    await pool.query('DELETE FROM reusable_images WHERE id = $1', [id]);

    // Delete the file from filesystem
    const filePath = path.join(uploadsDir, image.filename);
    if (fs.existsSync(filePath)) {
      fs.unlinkSync(filePath);
    }

    res.json({
      message: 'Reusable image deleted successfully',
      image
    });
  } catch (error) {
    console.error('Error deleting reusable image:', error);
    res.status(500).json({
      error: 'Failed to delete reusable image',
      details: error instanceof Error ? error.message : 'Unknown error'
    });
  }
});

// Check if file exists and permissions
router.get('/check-file/:filename', (req, res) => {
  const { filename } = req.params;

  // Prevent directory traversal
  if (filename.includes('..') || filename.includes('/')) {
    return res.status(400).json({ error: 'Invalid filename' });
  }

  const filePath = path.join(uploadsDir, filename);

  try {
    // Check if file exists
    if (!fs.existsSync(filePath)) {
      return res.status(404).json({
        error: 'File not found',
        path: filePath,
        exists: false,
        readable: false
      });
    }

    // Check if file is readable
    fs.accessSync(filePath, fs.constants.R_OK);

    // Get file stats
    const stats = fs.statSync(filePath);

    return res.json({
      filename,
      path: filePath,
      exists: true,
      readable: true,
      isFile: stats.isFile(),
      isDirectory: stats.isDirectory(),
      size: stats.size,
      created: stats.birthtime,
      modified: stats.mtime,
      permissions: stats.mode.toString(8)
    });
  } catch (error) {
    return res.status(500).json({
      error: error.message,
      path: filePath,
      exists: fs.existsSync(filePath),
      readable: false
    });
  }
});

// Error handling middleware
router.use((err, req, res, next) => {
  console.error('Reusable images route error:', err);
  res.status(500).json({
    error: 'Internal server error',
    details: err.message
  });
});

module.exports = router;
```
```diff
@@ -19,6 +19,7 @@ const importRouter = require('./routes/import');
 const aiValidationRouter = require('./routes/ai-validation');
 const templatesRouter = require('./routes/templates');
 const aiPromptsRouter = require('./routes/ai-prompts');
+const reusableImagesRouter = require('./routes/reusable-images');
 
 // Get the absolute path to the .env file
 const envPath = '/var/www/html/inventory/.env';
@@ -105,6 +106,7 @@ async function startServer() {
   app.use('/api/ai-validation', aiValidationRouter);
   app.use('/api/templates', templatesRouter);
   app.use('/api/ai-prompts', aiPromptsRouter);
+  app.use('/api/reusable-images', reusableImagesRouter);
 
   // Basic health check route
   app.get('/health', (req, res) => {
```
```diff
@@ -253,6 +253,7 @@ export const ImageUploadStep = ({
             }
             getProductContainerClasses={() => getProductContainerClasses(index)}
             findContainer={findContainer}
+            handleAddImageFromUrl={handleAddImageFromUrl}
           />
         ))}
       </div>
```
@@ -1,7 +1,7 @@
|
|||||||
import { Button } from "@/components/ui/button";
|
import { Button } from "@/components/ui/button";
|
||||||
import { Input } from "@/components/ui/input";
|
import { Input } from "@/components/ui/input";
|
||||||
import { Card, CardContent } from "@/components/ui/card";
|
import { Card, CardContent } from "@/components/ui/card";
|
||||||
import { Loader2, Link as LinkIcon } from "lucide-react";
|
import { Loader2, Link as LinkIcon, Image as ImageIcon } from "lucide-react";
|
||||||
import { cn } from "@/lib/utils";
|
import { cn } from "@/lib/utils";
|
||||||
import { ImageDropzone } from "./ImageDropzone";
|
import { ImageDropzone } from "./ImageDropzone";
|
||||||
import { SortableImage } from "./SortableImage";
|
import { SortableImage } from "./SortableImage";
|
||||||
@@ -9,6 +9,25 @@ import { CopyButton } from "./CopyButton";
 import { ProductImageSortable, Product } from "../../types";
 import { DroppableContainer } from "../DroppableContainer";
 import { SortableContext, horizontalListSortingStrategy } from '@dnd-kit/sortable';
+import { useQuery } from "@tanstack/react-query";
+import config from "@/config";
+import {
+Dialog,
+DialogContent,
+DialogDescription,
+DialogHeader,
+DialogTitle,
+} from "@/components/ui/dialog";
+import { ScrollArea } from "@/components/ui/scroll-area";
+import { useState, useMemo } from "react";
+
+interface ReusableImage {
+id: number;
+name: string;
+image_url: string;
+is_global: boolean;
+company: string | null;
+}

 interface ProductCardProps {
 product: Product;

@@ -26,6 +45,7 @@ interface ProductCardProps {
 onRemoveImage: (id: string) => void;
 getProductContainerClasses: () => string;
 findContainer: (id: string) => string | null;
+handleAddImageFromUrl: (productIndex: number, url: string) => void;
 }

 export const ProductCard = ({

@@ -43,8 +63,11 @@ export const ProductCard = ({
 onDragOver,
 onRemoveImage,
 getProductContainerClasses,
-findContainer
+findContainer,
+handleAddImageFromUrl
 }: ProductCardProps) => {
+const [isReusableDialogOpen, setIsReusableDialogOpen] = useState(false);
+
 // Function to get images for this product
 const getProductImages = () => {
 return productImages.filter(img => img.productIndex === index);

@@ -56,6 +79,32 @@ export const ProductCard = ({
 return result !== null ? parseInt(result) : null;
 };

+// Fetch reusable images
+const { data: reusableImages, isLoading: isLoadingReusable } = useQuery<ReusableImage[]>({
+queryKey: ["reusable-images"],
+queryFn: async () => {
+const response = await fetch(`${config.apiUrl}/reusable-images`);
+if (!response.ok) {
+throw new Error("Failed to fetch reusable images");
+}
+return response.json();
+},
+});
+
+// Filter reusable images based on product's company
+const availableReusableImages = useMemo(() => {
+if (!reusableImages) return [];
+return reusableImages.filter(img =>
+img.is_global || img.company === product.company
+);
+}, [reusableImages, product.company]);
+
+// Handle adding a reusable image
+const handleAddReusableImage = (imageUrl: string) => {
+handleAddImageFromUrl(index, imageUrl);
+setIsReusableDialogOpen(false);
+};
+
 return (
 <Card
 className={cn(

@@ -83,6 +132,18 @@ export const ProductCard = ({
 className="flex items-center gap-2"
 onSubmit={onUrlSubmit}
 >
+{getProductImages().length === 0 && (
+<Button
+type="button"
+variant="outline"
+size="sm"
+className="h-8 whitespace-nowrap flex gap-1 items-center text-xs"
+onClick={() => setIsReusableDialogOpen(true)}
+>
+<ImageIcon className="h-3.5 w-3.5" />
+Select from Library
+</Button>
+)}
 <Input
 placeholder="Add image from URL"
 value={urlInput}

@@ -105,7 +166,7 @@ export const ProductCard = ({
 </div>

 <div className="flex flex-col sm:flex-row gap-2">
-<div className="flex flex-row gap-2 items-start">
+<div className="flex flex-row gap-2 items-center gap-4">
 <ImageDropzone
 productIndex={index}
 onDrop={onImageUpload}

@@ -158,6 +219,50 @@ export const ProductCard = ({
 />
 </div>
 </CardContent>
+
+{/* Reusable Images Dialog */}
+<Dialog open={isReusableDialogOpen} onOpenChange={setIsReusableDialogOpen}>
+<DialogContent className="max-w-3xl">
+<DialogHeader>
+<DialogTitle>Select from Image Library</DialogTitle>
+<DialogDescription>
+Choose a global or company-specific image to add to this product.
+</DialogDescription>
+</DialogHeader>
+<ScrollArea className="h-[400px] pr-4">
+{isLoadingReusable ? (
+<div className="flex items-center justify-center h-full">
+<Loader2 className="h-8 w-8 animate-spin" />
+</div>
+) : availableReusableImages.length === 0 ? (
+<div className="flex flex-col items-center justify-center h-full text-muted-foreground">
+<ImageIcon className="h-8 w-8 mb-2" />
+<p>No reusable images available</p>
+</div>
+) : (
+<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 gap-4">
+{availableReusableImages.map((image) => (
+<div
+key={image.id}
+className="group relative aspect-square border rounded-lg overflow-hidden cursor-pointer hover:ring-2 hover:ring-primary"
+onClick={() => handleAddReusableImage(image.image_url)}
+>
+<img
+src={image.image_url}
+alt={image.name}
+className="w-full h-full object-cover"
+/>
+<div className="absolute inset-0 bg-black/0 group-hover:bg-black/20 transition-colors" />
+<div className="absolute bottom-0 left-0 right-0 p-2 bg-gradient-to-t from-black/60 to-transparent">
+<p className="text-xs text-white truncate">{image.name}</p>
+</div>
+</div>
+))}
+</div>
+)}
+</ScrollArea>
+</DialogContent>
+</Dialog>
 </Card>
 );
 };

@@ -31,5 +31,6 @@ export interface Product {
 supplier_no?: string;
 sku?: string;
 model?: string;
+company?: string;
 product_images?: string | string[];
 }

773 inventory/src/components/settings/ReusableImageManagement.tsx Normal file
@@ -0,0 +1,773 @@
+import { useState, useMemo, useCallback, useRef, useEffect } from "react";
+import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
+import { Button } from "@/components/ui/button";
+import {
+Table,
+TableBody,
+TableCell,
+TableHead,
+TableHeader,
+TableRow,
+} from "@/components/ui/table";
+import { Input } from "@/components/ui/input";
+import { Label } from "@/components/ui/label";
+import { Checkbox } from "@/components/ui/checkbox";
+import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
+import { ArrowUpDown, Pencil, Trash2, PlusCircle, Image, Eye } from "lucide-react";
+import config from "@/config";
+import {
+useReactTable,
+getCoreRowModel,
+getSortedRowModel,
+SortingState,
+flexRender,
+type ColumnDef,
+} from "@tanstack/react-table";
+import {
+Dialog,
+DialogContent,
+DialogDescription,
+DialogFooter,
+DialogHeader,
+DialogTitle,
+DialogClose
+} from "@/components/ui/dialog";
+import {
+AlertDialog,
+AlertDialogAction,
+AlertDialogCancel,
+AlertDialogContent,
+AlertDialogDescription,
+AlertDialogFooter,
+AlertDialogHeader,
+AlertDialogTitle,
+} from "@/components/ui/alert-dialog";
+import { toast } from "sonner";
+import { useDropzone } from "react-dropzone";
+import { cn } from "@/lib/utils";
+
+interface FieldOption {
+label: string;
+value: string;
+}
+
+interface ImageFormData {
+id?: number;
+name: string;
+is_global: boolean;
+company: string | null;
+file?: File;
+}
+
+interface ReusableImage {
+id: number;
+name: string;
+filename: string;
+file_path: string;
+image_url: string;
+is_global: boolean;
+company: string | null;
+mime_type: string;
+file_size: number;
+created_at: string;
+updated_at: string;
+}
+
+interface FieldOptions {
+companies: FieldOption[];
+}
+
+const ImageForm = ({
+editingImage,
+formData,
+setFormData,
+onSubmit,
+onCancel,
+fieldOptions,
+getRootProps,
+getInputProps,
+isDragActive
+}: {
+editingImage: ReusableImage | null;
+formData: ImageFormData;
+setFormData: (data: ImageFormData) => void;
+onSubmit: (e: React.FormEvent) => void;
+onCancel: () => void;
+fieldOptions: FieldOptions | undefined;
+getRootProps: any;
+getInputProps: any;
+isDragActive: boolean;
+}) => {
+const handleNameChange = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
+setFormData(prev => ({ ...prev, name: e.target.value }));
+}, [setFormData]);
+
+const handleGlobalChange = useCallback((checked: boolean) => {
+setFormData(prev => ({
+...prev,
+is_global: checked,
+company: checked ? null : prev.company
+}));
+}, [setFormData]);
+
+const handleCompanyChange = useCallback((value: string) => {
+setFormData(prev => ({ ...prev, company: value }));
+}, [setFormData]);
+
+return (
+<form onSubmit={onSubmit}>
+<div className="grid gap-4 py-4">
+<div className="grid gap-2">
+<Label htmlFor="image_name">Image Name</Label>
+<Input
+id="image_name"
+name="image_name"
+value={formData.name}
+onChange={handleNameChange}
+placeholder="Enter image name"
+required
+/>
+</div>
+
+{!editingImage && (
+<div className="grid gap-2">
+<Label htmlFor="image">Upload Image</Label>
+<div
+{...getRootProps()}
+className={cn(
+"border-2 border-dashed border-secondary-foreground/30 bg-muted/90 rounded-md w-full py-6 flex flex-col items-center justify-center cursor-pointer hover:bg-muted/70 transition-colors",
+isDragActive && "border-primary bg-muted"
+)}
+>
+<input {...getInputProps()} />
+<div className="flex flex-col items-center justify-center py-2">
+{formData.file ? (
+<>
+<div className="mb-4">
+<ImagePreview file={formData.file} />
+</div>
+<div className="flex items-center gap-2 mb-2">
+<Image className="h-4 w-4 text-primary" />
+<span className="text-sm">{formData.file.name}</span>
+</div>
+<p className="text-xs text-muted-foreground">Click or drag to replace</p>
+</>
+) : isDragActive ? (
+<>
+<Image className="h-8 w-8 mb-2 text-primary" />
+<p className="text-base text-muted-foreground">Drop image here</p>
+</>
+) : (
+<>
+<Image className="h-8 w-8 mb-2 text-muted-foreground" />
+<p className="text-base text-muted-foreground">Click or drag to upload</p>
+</>
+)}
+</div>
+</div>
+</div>
+)}
+
+<div className="flex items-center space-x-2">
+<Checkbox
+id="is_global"
+checked={formData.is_global}
+onCheckedChange={handleGlobalChange}
+/>
+<Label htmlFor="is_global">Available for all companies</Label>
+</div>
+
+{!formData.is_global && (
+<div className="grid gap-2">
+<Label htmlFor="company">Company</Label>
+<Select
+value={formData.company || ''}
+onValueChange={handleCompanyChange}
+required={!formData.is_global}
+>
+<SelectTrigger>
+<SelectValue placeholder="Select company" />
+</SelectTrigger>
+<SelectContent>
+{fieldOptions?.companies.map((company) => (
+<SelectItem key={company.value} value={company.value}>
+{company.label}
+</SelectItem>
+))}
+</SelectContent>
+</Select>
+</div>
+)}
+</div>
+
+<DialogFooter>
+<Button type="button" variant="outline" onClick={onCancel}>
+Cancel
+</Button>
+<Button type="submit">
+{editingImage ? "Update" : "Upload"} Image
+</Button>
+</DialogFooter>
+</form>
+);
+};
+
+export function ReusableImageManagement() {
+const [isFormOpen, setIsFormOpen] = useState(false);
+const [isDeleteOpen, setIsDeleteOpen] = useState(false);
+const [isPreviewOpen, setIsPreviewOpen] = useState(false);
+const [imageToDelete, setImageToDelete] = useState<ReusableImage | null>(null);
+const [previewImage, setPreviewImage] = useState<ReusableImage | null>(null);
+const [editingImage, setEditingImage] = useState<ReusableImage | null>(null);
+const [sorting, setSorting] = useState<SortingState>([
+{ id: "created_at", desc: true }
+]);
+const [searchQuery, setSearchQuery] = useState("");
+const [formData, setFormData] = useState<ImageFormData>({
+name: "",
+is_global: false,
+company: null,
+file: undefined
+});
+
+const queryClient = useQueryClient();
+
+const { data: images, isLoading } = useQuery<ReusableImage[]>({
+queryKey: ["reusable-images"],
+queryFn: async () => {
+const response = await fetch(`${config.apiUrl}/reusable-images`);
+if (!response.ok) {
+throw new Error("Failed to fetch reusable images");
+}
+return response.json();
+},
+});
+
+const { data: fieldOptions } = useQuery<FieldOptions>({
+queryKey: ["fieldOptions"],
+queryFn: async () => {
+const response = await fetch(`${config.apiUrl}/import/field-options`);
+if (!response.ok) {
+throw new Error("Failed to fetch field options");
+}
+return response.json();
+},
+});
+
+const createMutation = useMutation({
+mutationFn: async (data: ImageFormData) => {
+// Create FormData for file upload
+const formData = new FormData();
+formData.append('name', data.name);
+formData.append('is_global', String(data.is_global));
+
+if (!data.is_global && data.company) {
+formData.append('company', data.company);
+}
+
+if (data.file) {
+formData.append('image', data.file);
+} else {
+throw new Error("Image file is required");
+}
+
+const response = await fetch(`${config.apiUrl}/reusable-images/upload`, {
+method: "POST",
+body: formData,
+});
+
+if (!response.ok) {
+const error = await response.json();
+throw new Error(error.message || error.error || "Failed to upload image");
+}
+
+return response.json();
+},
+onSuccess: () => {
+queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
+toast.success("Image uploaded successfully");
+resetForm();
+},
+onError: (error) => {
+toast.error(error instanceof Error ? error.message : "Failed to upload image");
+},
+});
+
+const updateMutation = useMutation({
+mutationFn: async (data: ImageFormData) => {
+if (!data.id) throw new Error("Image ID is required for update");
+
+const response = await fetch(`${config.apiUrl}/reusable-images/${data.id}`, {
+method: "PUT",
+headers: {
+"Content-Type": "application/json",
+},
+body: JSON.stringify({
+name: data.name,
+is_global: data.is_global,
+company: data.is_global ? null : data.company
+}),
+});
+
+if (!response.ok) {
+const error = await response.json();
+throw new Error(error.message || error.error || "Failed to update image");
+}
+
+return response.json();
+},
+onSuccess: () => {
+queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
+toast.success("Image updated successfully");
+resetForm();
+},
+onError: (error) => {
+toast.error(error instanceof Error ? error.message : "Failed to update image");
+},
+});
+
+const deleteMutation = useMutation({
+mutationFn: async (id: number) => {
+const response = await fetch(`${config.apiUrl}/reusable-images/${id}`, {
+method: "DELETE",
+});
+if (!response.ok) {
+throw new Error("Failed to delete image");
+}
+},
+onSuccess: () => {
+queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
+toast.success("Image deleted successfully");
+},
+onError: (error) => {
+toast.error(error instanceof Error ? error.message : "Failed to delete image");
+},
+});
+
+const handleEdit = (image: ReusableImage) => {
+setEditingImage(image);
+setFormData({
+id: image.id,
+name: image.name,
+is_global: image.is_global,
+company: image.company,
+});
+setIsFormOpen(true);
+};
+
+const handleDeleteClick = (image: ReusableImage) => {
+setImageToDelete(image);
+setIsDeleteOpen(true);
+};
+
+const handlePreview = (image: ReusableImage) => {
+setPreviewImage(image);
+setIsPreviewOpen(true);
+};
+
+const handleDeleteConfirm = () => {
+if (imageToDelete) {
+deleteMutation.mutate(imageToDelete.id);
+setIsDeleteOpen(false);
+setImageToDelete(null);
+}
+};
+
+const handleSubmit = (e: React.FormEvent) => {
+e.preventDefault();
+
+// If is_global is true, ensure company is null
+const submitData = {
+...formData,
+company: formData.is_global ? null : formData.company,
+};
+
+if (editingImage) {
+updateMutation.mutate(submitData);
+} else {
+if (!submitData.file) {
+toast.error("Please select an image file");
+return;
+}
+createMutation.mutate(submitData);
+}
+};
+
+const resetForm = () => {
+setFormData({
+name: "",
+is_global: false,
+company: null,
+file: undefined
+});
+setEditingImage(null);
+setIsFormOpen(false);
+};
+
+const handleCreateClick = () => {
+resetForm();
+setIsFormOpen(true);
+};
+
+// Configure dropzone for image uploads
+const onDrop = useCallback((acceptedFiles: File[]) => {
+if (acceptedFiles.length > 0) {
+const file = acceptedFiles[0]; // Take only the first file
+setFormData(prev => ({
+...prev,
+file
+}));
+}
+}, []);
+
+const { getRootProps, getInputProps, isDragActive } = useDropzone({
+accept: {
+'image/*': ['.jpeg', '.jpg', '.png', '.gif', '.webp']
+},
+onDrop,
+multiple: false // Only accept single files
+});
+
+const columns = useMemo<ColumnDef<ReusableImage>[]>(() => [
+{
+accessorKey: "name",
+header: ({ column }) => (
+<Button
+variant="ghost"
+onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
+>
+Name
+<ArrowUpDown className="ml-2 h-4 w-4" />
+</Button>
+),
+},
+{
+accessorKey: "is_global",
+header: ({ column }) => (
+<Button
+variant="ghost"
+onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
+>
+Type
+<ArrowUpDown className="ml-2 h-4 w-4" />
+</Button>
+),
+cell: ({ row }) => {
+const isGlobal = row.getValue("is_global") as boolean;
+return isGlobal ? "Global" : "Company Specific";
+},
+},
+{
+accessorKey: "company",
+header: ({ column }) => (
+<Button
+variant="ghost"
+onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
+>
+Company
+<ArrowUpDown className="ml-2 h-4 w-4" />
+</Button>
+),
+cell: ({ row }) => {
+const isGlobal = row.getValue("is_global") as boolean;
+if (isGlobal) return 'N/A';
+
+const companyId = row.getValue("company");
+if (!companyId) return 'None';
+return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
+},
+},
+{
+accessorKey: "file_size",
+header: ({ column }) => (
+<Button
+variant="ghost"
+onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
+>
+Size
+<ArrowUpDown className="ml-2 h-4 w-4" />
+</Button>
+),
+cell: ({ row }) => {
+const size = row.getValue("file_size") as number;
+return `${(size / 1024).toFixed(1)} KB`;
+},
+},
+{
+accessorKey: "created_at",
+header: ({ column }) => (
+<Button
+variant="ghost"
+onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
+>
+Created
+<ArrowUpDown className="ml-2 h-4 w-4" />
+</Button>
+),
+cell: ({ row }) => new Date(row.getValue("created_at")).toLocaleDateString(),
+},
+{
+accessorKey: "image_url",
+header: "Thumbnail",
+cell: ({ row }) => (
+<div className="flex items-center justify-center">
+<img
+src={row.getValue("image_url") as string}
+alt={row.getValue("name") as string}
+className="w-10 h-10 object-contain border rounded"
+/>
+</div>
+),
+},
+{
+id: "actions",
+cell: ({ row }) => (
+<div className="flex gap-2 justify-end">
+<Button
+variant="ghost"
+size="icon"
+onClick={() => handlePreview(row.original)}
+title="Preview Image"
+>
+<Eye className="h-4 w-4" />
+</Button>
+<Button
+variant="ghost"
+size="icon"
+onClick={() => handleEdit(row.original)}
+title="Edit Image"
+>
+<Pencil className="h-4 w-4" />
+</Button>
+<Button
+variant="ghost"
+size="icon"
+className="text-destructive hover:text-destructive"
+onClick={() => handleDeleteClick(row.original)}
+title="Delete Image"
+>
+<Trash2 className="h-4 w-4" />
+</Button>
+</div>
+),
+},
+], [fieldOptions]);
+
+const filteredData = useMemo(() => {
+if (!images) return [];
+return images.filter((image) => {
+const searchString = searchQuery.toLowerCase();
+return (
+image.name.toLowerCase().includes(searchString) ||
+(image.is_global ? "global" : "company").includes(searchString) ||
+(image.company && image.company.toLowerCase().includes(searchString))
+);
+});
+}, [images, searchQuery]);
+
+const table = useReactTable({
+data: filteredData,
+columns,
+state: {
+sorting,
+},
+onSortingChange: setSorting,
+getSortedRowModel: getSortedRowModel(),
+getCoreRowModel: getCoreRowModel(),
+});
+
+return (
+<div className="space-y-6">
+<div className="flex items-center justify-between">
+<h2 className="text-2xl font-bold">Reusable Images</h2>
+<Button onClick={handleCreateClick}>
+<PlusCircle className="mr-2 h-4 w-4" />
+Upload New Image
+</Button>
+</div>
+
+<div className="flex items-center gap-4">
+<Input
+placeholder="Search images..."
+value={searchQuery}
+onChange={(e) => setSearchQuery(e.target.value)}
+className="max-w-sm"
+/>
+</div>
+
+{isLoading ? (
+<div>Loading images...</div>
+) : (
+<div className="border rounded-lg">
+<Table>
+<TableHeader className="bg-muted">
+{table.getHeaderGroups().map((headerGroup) => (
+<TableRow key={headerGroup.id}>
+{headerGroup.headers.map((header) => (
+<TableHead key={header.id}>
+{header.isPlaceholder
+? null
+: flexRender(
+header.column.columnDef.header,
+header.getContext()
+)}
+</TableHead>
+))}
+</TableRow>
+))}
+</TableHeader>
+<TableBody>
+{table.getRowModel().rows?.length ? (
+table.getRowModel().rows.map((row) => (
+<TableRow key={row.id} className="hover:bg-gray-100">
+{row.getVisibleCells().map((cell) => (
+<TableCell key={cell.id} className="pl-6">
+{flexRender(cell.column.columnDef.cell, cell.getContext())}
+</TableCell>
+))}
+</TableRow>
+))
+) : (
+<TableRow>
+<TableCell colSpan={columns.length} className="text-center">
+No images found
+</TableCell>
+</TableRow>
+)}
+</TableBody>
+</Table>
+</div>
+)}
+
+{/* Image Form Dialog */}
+<Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
+<DialogContent className="max-w-md">
+<DialogHeader>
+<DialogTitle>{editingImage ? "Edit Image" : "Upload New Image"}</DialogTitle>
+<DialogDescription>
+{editingImage
+? "Update this reusable image's details."
+: "Upload a new reusable image that can be used across products."}
+</DialogDescription>
+</DialogHeader>
+
+<ImageForm
+editingImage={editingImage}
+formData={formData}
+setFormData={setFormData}
+onSubmit={handleSubmit}
+onCancel={() => {
+resetForm();
+setIsFormOpen(false);
+}}
+fieldOptions={fieldOptions}
+getRootProps={getRootProps}
+getInputProps={getInputProps}
+isDragActive={isDragActive}
+/>
+</DialogContent>
+</Dialog>
+
+{/* Delete Confirmation Dialog */}
+<AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
+<AlertDialogContent>
+<AlertDialogHeader>
+<AlertDialogTitle>Delete Image</AlertDialogTitle>
+<AlertDialogDescription>
+Are you sure you want to delete this image? This action cannot be undone.
+</AlertDialogDescription>
+</AlertDialogHeader>
+<AlertDialogFooter>
+<AlertDialogCancel onClick={() => {
+setIsDeleteOpen(false);
+setImageToDelete(null);
+}}>
+Cancel
+</AlertDialogCancel>
+<AlertDialogAction onClick={handleDeleteConfirm}>
+Delete
+</AlertDialogAction>
+</AlertDialogFooter>
+</AlertDialogContent>
+</AlertDialog>
+
+{/* Preview Dialog */}
+<Dialog open={isPreviewOpen} onOpenChange={setIsPreviewOpen}>
+<DialogContent className="max-w-3xl">
+<DialogHeader>
+<DialogTitle>{previewImage?.name}</DialogTitle>
+<DialogDescription>
+{previewImage?.is_global
+? "Global image"
+: `Company specific image for ${fieldOptions?.companies.find(c => c.value === previewImage?.company)?.label}`}
+</DialogDescription>
+</DialogHeader>
+
+<div className="flex justify-center p-4">
+{previewImage && (
+<div className="bg-checkerboard rounded-md overflow-hidden">
+<img
+src={previewImage.image_url}
+alt={previewImage.name}
+className="max-h-[500px] max-w-full object-contain"
+/>
+</div>
+)}
+</div>
+
+<div className="grid grid-cols-2 gap-4 text-sm">
+<div>
+<span className="font-medium">Filename:</span> {previewImage?.filename}
+</div>
+<div>
+<span className="font-medium">Size:</span> {previewImage && `${(previewImage.file_size / 1024).toFixed(1)} KB`}
+</div>
+<div>
+<span className="font-medium">Type:</span> {previewImage?.mime_type}
+</div>
+<div>
+<span className="font-medium">Uploaded:</span> {previewImage && new Date(previewImage.created_at).toLocaleString()}
+</div>
+</div>
+
+<DialogFooter>
+<DialogClose asChild>
+<Button>Close</Button>
+</DialogClose>
+</DialogFooter>
+</DialogContent>
+</Dialog>
+
+<style jsx global>{`
+.bg-checkerboard {
+background-image: linear-gradient(45deg, #f0f0f0 25%, transparent 25%),
+linear-gradient(-45deg, #f0f0f0 25%, transparent 25%),
+linear-gradient(45deg, transparent 75%, #f0f0f0 75%),
+linear-gradient(-45deg, transparent 75%, #f0f0f0 75%);
+background-size: 20px 20px;
+background-position: 0 0, 0 10px, 10px -10px, -10px 0px;
+}
+`}</style>
+</div>
+);
+}
+
+const ImagePreview = ({ file }: { file: File }) => {
+const [previewUrl, setPreviewUrl] = useState<string>('');
+
+useEffect(() => {
+const url = URL.createObjectURL(file);
+setPreviewUrl(url);
+return () => {
+URL.revokeObjectURL(url);
+};
+}, [file]);
+
+return (
+<img
+src={previewUrl}
+alt="Preview"
+className="max-h-32 max-w-full object-contain rounded-md"
+/>
+);
+};

@@ -6,6 +6,7 @@ import { CalculationSettings } from "@/components/settings/CalculationSettings";
 import { TemplateManagement } from "@/components/settings/TemplateManagement";
 import { UserManagement } from "@/components/settings/UserManagement";
 import { PromptManagement } from "@/components/settings/PromptManagement";
+import { ReusableImageManagement } from "@/components/settings/ReusableImageManagement";
 import { motion } from 'framer-motion';
 import { Alert, AlertDescription } from "@/components/ui/alert";
 import { Protected } from "@/components/auth/Protected";
@@ -42,7 +43,8 @@ const SETTINGS_GROUPS: SettingsGroup[] = [
     label: "Content Management",
     tabs: [
       { id: "templates", permission: "settings:templates", label: "Template Management" },
-      { id: "ai-prompts", permission: "settings:templates", label: "AI Prompts" },
+      { id: "ai-prompts", permission: "settings:prompt_management", label: "AI Prompts" },
+      { id: "reusable-images", permission: "settings:library_management", label: "Reusable Images" },
     ]
   },
   {
@@ -220,7 +222,7 @@ export function Settings() {

         <TabsContent value="ai-prompts" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
           <Protected
-            permission="settings:templates"
+            permission="settings:prompt_management"
             fallback={
               <Alert>
                 <AlertDescription>
@@ -233,6 +235,21 @@ export function Settings() {
           </Protected>
         </TabsContent>

+        <TabsContent value="reusable-images" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
+          <Protected
+            permission="settings:library_management"
+            fallback={
+              <Alert>
+                <AlertDescription>
+                  You don't have permission to access Reusable Images.
+                </AlertDescription>
+              </Alert>
+            }
+          >
+            <ReusableImageManagement />
+          </Protected>
+        </TabsContent>

         <TabsContent value="user-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
           <Protected
             permission="settings:user_management"