3 Commits

| SHA1 | Message | Date |
|------------|-----------------------------------------------|----------------------------|
| 5dd779cb4a | Fix purchase orders import | 2025-03-25 19:12:41 -04:00 |
| 7b0e792d03 | Merge branch 'master' into move-to-postgresql | 2025-03-25 12:15:07 -04:00 |
| 517bbe72f4 | Add in image library feature | 2025-03-25 12:14:36 -04:00 |
13 changed files with 2484 additions and 130 deletions


@@ -0,0 +1,342 @@
# MySQL to PostgreSQL Import Process Documentation
This document outlines the data import process from the production MySQL database to the local PostgreSQL database, focusing on column mappings, data transformations, and the overall import architecture.
## Table of Contents
1. [Overview](#overview)
2. [Import Architecture](#import-architecture)
3. [Column Mappings](#column-mappings)
- [Categories](#categories)
- [Products](#products)
- [Product Categories (Relationship)](#product-categories-relationship)
- [Orders](#orders)
- [Purchase Orders](#purchase-orders)
- [Metadata Tables](#metadata-tables)
4. [Special Calculations](#special-calculations)
5. [Implementation Notes](#implementation-notes)
## Overview
The import process extracts data from a MySQL 5.7 production database and imports it into a PostgreSQL database. It can operate in two modes:
- **Full Import**: Imports all data regardless of last sync time
- **Incremental Import**: Only imports data that has changed since the last import
The process handles four main data types:
- Categories (product categorization hierarchy)
- Products (inventory items)
- Orders (sales records)
- Purchase Orders (vendor orders)
## Import Architecture
The import process follows these steps:
1. **Establish Connection**: Creates an SSH tunnel to the production server and establishes database connections
2. **Setup Import History**: Creates a record of the current import operation
3. **Import Categories**: Processes product categories in hierarchical order
4. **Import Products**: Processes products with their attributes and category relationships
5. **Import Orders**: Processes customer orders with line items, taxes, and discounts
6. **Import Purchase Orders**: Processes vendor purchase orders with line items
7. **Record Results**: Updates the import history with results
8. **Close Connections**: Cleans up connections and resources
Each import step uses temporary tables for processing and wraps operations in transactions to ensure data consistency.
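The transaction-per-step pattern described above can be sketched as follows. This is a minimal illustration, not the actual module API: the step functions and the `conn.query` interface are stand-ins.

```javascript
// Sketch of the import pipeline: each step runs inside its own transaction
// so a failure in one step never leaves partially written data behind.
// (Function and parameter names here are illustrative.)
async function runImport(conn, steps) {
  const results = [];
  for (const step of steps) {
    await conn.query('BEGIN');
    try {
      results.push(await step(conn)); // e.g. importCategories(conn)
      await conn.query('COMMIT');
    } catch (err) {
      await conn.query('ROLLBACK');   // undo this step's work on failure
      throw err;
    }
  }
  return results;
}
```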
## Column Mappings
### Categories
| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|---------------------------------|----------------------------------------------|
| cat_id | product_categories.cat_id | Direct mapping |
| name | product_categories.name | Direct mapping |
| type | product_categories.type | Direct mapping |
| parent_id | product_categories.master_cat_id| NULL for top-level categories (types 10, 20) |
| description | product_categories.combined_name| Direct mapping |
| status | N/A | Hard-coded 'active' |
| created_at | N/A | Current timestamp |
| updated_at | N/A | Current timestamp |
**Notes:**
- Categories are processed in hierarchical order by type: [10, 20, 11, 21, 12, 13]
- Type 10/20 are top-level categories with no parent
- Types 11/21/12/13 are child categories that reference parent categories
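The hierarchical ordering above can be expressed as a simple sort, so that every parent category exists before any child that references it. A sketch (the function name is illustrative):

```javascript
// Parents (types 10, 20) sort before children (types 11, 21, 12, 13),
// matching the processing order described in the notes above.
const CATEGORY_TYPE_ORDER = [10, 20, 11, 21, 12, 13];

function sortCategoriesForImport(categories) {
  return [...categories].sort(
    (a, b) =>
      CATEGORY_TYPE_ORDER.indexOf(a.type) - CATEGORY_TYPE_ORDER.indexOf(b.type)
  );
}
```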
### Products
| PostgreSQL Column | MySQL Source | Transformation |
|----------------------|----------------------------------|---------------------------------------------------------------|
| pid | products.pid | Direct mapping |
| title | products.description | Direct mapping |
| description | products.notes | Direct mapping |
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
| stock_quantity | shop_inventory.available_local | Capped at 5000, minimum 0 |
| preorder_count | current_inventory.onpreorder | Default 0 |
| notions_inv_count | product_notions_b2b.inventory | Default 0 |
| price | product_current_prices.price_each| Default 0, filtered on active=1 |
| regular_price | products.sellingprice | Default 0 |
| cost_price | product_inventory | Weighted average: SUM(costeach * count) / SUM(count) when count > 0, or latest costeach |
| vendor | suppliers.companyname | Via supplier_item_data.supplier_id |
| vendor_reference | supplier_item_data | supplier_itemnumber or notions_itemnumber based on vendor |
| notions_reference | supplier_item_data.notions_itemnumber | Direct mapping |
| brand | product_categories.name | Linked via products.company |
| line | product_categories.name | Linked via products.line |
| subline | product_categories.name | Linked via products.subline |
| artist | product_categories.name | Linked via products.artist |
| categories | product_category_index | Comma-separated list of category IDs |
| created_at | products.date_created | Validated date, NULL if invalid |
| first_received | products.datein | Validated date, NULL if invalid |
| landing_cost_price | NULL | Not set |
| barcode | products.upc | Direct mapping |
| harmonized_tariff_code| products.harmonized_tariff_code | Direct mapping |
| updated_at | products.stamp | Validated date, NULL if invalid |
| visible | shop_inventory | Calculated from show + buyable > 0 |
| managing_stock | N/A | Hard-coded true |
| replenishable | Multiple fields | Complex calculation based on reorder, dates, etc. |
| permalink | N/A | Constructed URL with product ID |
| moq | supplier_item_data | notions_qty_per_unit or supplier_qty_per_unit, minimum 1 |
| uom | N/A | Hard-coded 1 |
| rating | products.rating | Direct mapping |
| reviews | products.rating_votes | Direct mapping |
| weight | products.weight | Direct mapping |
| length | products.length | Direct mapping |
| width | products.width | Direct mapping |
| height | products.height | Direct mapping |
| country_of_origin | products.country_of_origin | Direct mapping |
| location | products.location | Direct mapping |
| total_sold | order_items | SUM(qty_ordered) for all order_items where prod_pid = pid |
| baskets | mybasket | COUNT of records where mb.item = pid and qty > 0 |
| notifies | product_notify | COUNT of records where pn.pid = pid |
| date_last_sold | product_last_sold.date_sold | Validated date, NULL if invalid |
| image | N/A | Constructed from pid and image URL pattern |
| image_175 | N/A | Constructed from pid and image URL pattern |
| image_full | N/A | Constructed from pid and image URL pattern |
| options | NULL | Not set |
| tags | NULL | Not set |
**Notes:**
- Replenishable calculation:
```sql
CASE
WHEN p.reorder < 0 THEN 0
WHEN (
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
) THEN 0
ELSE 1
END
```
In business terms, a product is considered NOT replenishable only if:
- It was manually flagged as not replenishable (negative reorder value)
- OR it shows no activity across ALL metrics (no sales AND no receipts AND no refills in the past 5 years)
- Image URLs are constructed using this pattern:
```javascript
// Wrapper name is illustrative; the import script inlines this logic.
function buildImageUrls(pid, iid, imageUrlBase) {
  const paddedPid = pid.toString().padStart(6, '0');
  const prefix = paddedPid.slice(0, 3);
  const basePath = `${imageUrlBase}${prefix}/${pid}`;
  return {
    image: `${basePath}-t-${iid}.jpg`,
    image_175: `${basePath}-175x175-${iid}.jpg`,
    image_full: `${basePath}-o-${iid}.jpg`
  };
}
```
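The replenishable `CASE` expression above also reduces to a small predicate. A sketch in plain JavaScript, assuming the field names are illustrative and MySQL zero dates have already been converted to `null`:

```javascript
// A product is NOT replenishable only if it was manually flagged
// (negative reorder) or shows no activity on ANY metric in 5 years.
function isReplenishable(p, now = new Date()) {
  if (p.reorder < 0) return false;                // manually flagged
  const cutoff = new Date(now);
  cutoff.setFullYear(cutoff.getFullYear() - 5);
  const stale = (d) => d === null || d <= cutoff; // no activity in 5 years
  // Not replenishable only when sales, receipts, AND refills are all stale
  return !(stale(p.dateSold) && stale(p.dateIn) && stale(p.dateRefill));
}
```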
### Product Categories (Relationship)
| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| pid | products.pid | Direct mapping |
| cat_id | product_category_index.cat_id | Direct mapping, filtered by category types |
**Notes:**
- Only categories of types 10, 20, 11, 21, 12, 13 are imported
- Categories 16 and 17 are explicitly excluded
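The two filters above can be sketched as a single pass. Since 16 and 17 are not in the type whitelist, this sketch assumes they are category IDs rather than types:

```javascript
// Keep only imported category types; drop explicitly excluded categories.
const IMPORTED_TYPES = new Set([10, 20, 11, 21, 12, 13]);
const EXCLUDED_CAT_IDS = new Set([16, 17]); // assumed to be cat_ids

function filterProductCategories(rows) {
  return rows.filter(
    (r) => IMPORTED_TYPES.has(r.type) && !EXCLUDED_CAT_IDS.has(r.cat_id)
  );
}
```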
### Orders
| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| order_number | order_items.order_id | Direct mapping |
| pid | order_items.prod_pid | Direct mapping |
| sku | order_items.prod_itemnumber | Fallback to 'NO-SKU' if empty |
| date | _order.date_placed_onlydate | Via join to _order table |
| price | order_items.prod_price | Direct mapping |
| quantity | order_items.qty_ordered | Direct mapping |
| discount | Multiple sources | Complex calculation (see notes) |
| tax | order_tax_info_products.item_taxes_to_collect | Via latest order_tax_info record |
| tax_included | N/A | Hard-coded false |
| shipping | N/A | Hard-coded 0 |
| customer | _order.order_cid | Direct mapping |
| customer_name | users | CONCAT(users.firstname, ' ', users.lastname) |
| status | _order.order_status | Direct mapping |
| canceled | _order.date_cancelled | Boolean: true if date_cancelled is not '0000-00-00 00:00:00' |
| costeach | order_costs | From latest record or fallback to price * 0.5 |
**Notes:**
- Only orders with order_status >= 15 and with a valid date_placed are processed
- For incremental imports, only orders modified since last sync are processed
- Discount calculation combines three sources:
1. Base discount: order_items.prod_price_reg - order_items.prod_price
2. Promo discount: SUM of order_discount_items.amount
3. Proportional order discount: Calculation based on order subtotal proportion
```sql
(oi.base_discount +
COALESCE(ot.promo_discount, 0) +
CASE
WHEN om.summary_discount > 0 AND om.summary_subtotal > 0 THEN
ROUND((om.summary_discount * (oi.price * oi.quantity)) / NULLIF(om.summary_subtotal, 0), 2)
ELSE 0
END)::DECIMAL(10,2)
```
- Taxes are taken from the latest tax record for an order
- Cost data is taken from the latest non-pending cost record
### Purchase Orders
| PostgreSQL Column | MySQL Source | Transformation |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| po_id | po.po_id | Default 0 if NULL |
| pid | po_products.pid | Direct mapping |
| sku | products.itemnumber | Fallback to 'NO-SKU' if empty |
| name | products.description | Fallback to 'Unknown Product' |
| cost_price | po_products.cost_each | Direct mapping |
| po_cost_price | po_products.cost_each | Duplicate of cost_price |
| vendor | suppliers.companyname | Fallback to 'Unknown Vendor' if empty |
| date | po.date_ordered | Fallback to po.date_created if NULL |
| expected_date | po.date_estin | Direct mapping |
| status | po.status | Default 1 if NULL |
| notes | po.short_note | Fallback to po.notes if NULL |
| ordered | po_products.qty_each | Direct mapping |
| received | N/A | Hard-coded 0 |
| receiving_status | N/A | Hard-coded 1 |
**Notes:**
- Only POs created within the last 1 year (incremental) or 5 years (full) are processed
- For incremental imports, only POs modified since last sync are processed
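The retrieval window above can be computed as a cutoff date. A small sketch (the function name is illustrative; the real queries express this as a MySQL `DATE_SUB` interval):

```javascript
// 1 year back for incremental runs, 5 years for a full import.
function poCutoffDate(incrementalUpdate, now = new Date()) {
  const cutoff = new Date(now);
  cutoff.setFullYear(cutoff.getFullYear() - (incrementalUpdate ? 1 : 5));
  return cutoff;
}
```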
### Metadata Tables
#### import_history
| PostgreSQL Column | Source | Notes |
|-------------------|-----------------------------------|---------------------------------------------------------------|
| id | Auto-increment | Primary key |
| table_name | Code | 'all_tables' for overall import |
| start_time | NOW() | Import start time |
| end_time | NOW() | Import completion time |
| duration_seconds | Calculation | Elapsed seconds |
| is_incremental | INCREMENTAL_UPDATE | Flag from config |
| records_added | Calculation | Sum from all imports |
| records_updated | Calculation | Sum from all imports |
| status | Code | 'running', 'completed', 'failed', or 'cancelled' |
| error_message | Exception | Error message if failed |
| additional_info | JSON | Configuration and results |
#### sync_status
| PostgreSQL Column | Source | Notes |
|----------------------|--------------------------------|---------------------------------------------------------------|
| table_name | Code | Name of imported table |
| last_sync_timestamp | NOW() | Timestamp of successful sync |
| last_sync_id | NULL | Not used currently |
## Special Calculations
### Date Validation
MySQL dates are validated before insertion into PostgreSQL:
```javascript
function validateDate(mysqlDate) {
if (!mysqlDate || mysqlDate === '0000-00-00' || mysqlDate === '0000-00-00 00:00:00') {
return null;
}
// Check if the date is valid
const date = new Date(mysqlDate);
return isNaN(date.getTime()) ? null : mysqlDate;
}
```
### Retry Mechanism
Operations that might fail temporarily are retried with exponential backoff:
```javascript
async function withRetry(operation, errorMessage) {
let lastError;
for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
try {
return await operation();
} catch (error) {
lastError = error;
console.error(`${errorMessage} (Attempt ${attempt}/${MAX_RETRIES}):`, error);
if (attempt < MAX_RETRIES) {
const backoffTime = RETRY_DELAY * Math.pow(2, attempt - 1);
await new Promise(resolve => setTimeout(resolve, backoffTime));
}
}
}
throw lastError;
}
```
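The backoff in `withRetry` grows geometrically: `RETRY_DELAY * 2^(attempt - 1)`, with no delay after the final attempt. The schedule can be tabulated (the 1000 ms base is an assumption for illustration; the actual `RETRY_DELAY` constant is defined elsewhere):

```javascript
// Delays between attempts 1..maxRetries; the last attempt has no delay after it.
function backoffSchedule(maxRetries, retryDelay = 1000) {
  const delays = [];
  for (let attempt = 1; attempt < maxRetries; attempt++) {
    delays.push(retryDelay * Math.pow(2, attempt - 1));
  }
  return delays;
}
```

With `maxRetries = 4` and the assumed 1000 ms base, the waits are 1 s, 2 s, and 4 s between the four attempts.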
### Progress Tracking
Progress is tracked with estimated time remaining:
```javascript
function estimateRemaining(startTime, current, total) {
if (current === 0) return "Calculating...";
const elapsedSeconds = (Date.now() - startTime) / 1000;
const itemsPerSecond = current / elapsedSeconds;
const remainingItems = total - current;
const remainingSeconds = remainingItems / itemsPerSecond;
return formatElapsedTime(remainingSeconds);
}
```
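The core of the estimate is a simple rate extrapolation. A testable variant with the current time passed in explicitly (the real function reads `Date.now()` and formats the result via `formatElapsedTime`):

```javascript
// Remaining seconds = remaining items / observed throughput.
function remainingSeconds(startTime, now, current, total) {
  if (current === 0) return null;          // not enough data yet
  const elapsedSeconds = (now - startTime) / 1000;
  const itemsPerSecond = current / elapsedSeconds;
  return (total - current) / itemsPerSecond;
}
```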
## Implementation Notes
### Transaction Management
All imports use transactions to ensure data consistency:
- **Categories**: Uses savepoints for each category type
- **Products**: Uses a single transaction for the entire import
- **Orders**: Uses a single transaction with temporary tables
- **Purchase Orders**: Uses a single transaction with temporary tables
### Memory Usage Optimization
To minimize memory usage when processing large datasets:
1. Data is processed in batches (100-5000 records per batch)
2. Temporary tables are used for intermediate data
3. Some queries use cursors to avoid loading all results at once
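The batching in point 1 follows the usual slice-and-loop pattern, so only one chunk of rows is materialized for insertion at a time. A minimal sketch:

```javascript
// Slice a large result set into fixed-size chunks for batched inserts.
function toBatches(rows, batchSize) {
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}
```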
### MySQL vs PostgreSQL Compatibility
The scripts handle differences between MySQL and PostgreSQL:
1. MySQL-specific syntax like `USE INDEX` is removed for PostgreSQL
2. `GROUP_CONCAT` in MySQL becomes string operations in PostgreSQL
3. Transaction syntax differences are abstracted in the connection wrapper
4. PostgreSQL's `ON CONFLICT` replaces MySQL's `ON DUPLICATE KEY UPDATE`
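As an example of point 2, the comma-separated `categories` column built with MySQL's `GROUP_CONCAT(DISTINCT ...)` can be emulated in JavaScript post-processing (PostgreSQL itself would use `string_agg(DISTINCT x::text, ',')`). A sketch:

```javascript
// Emulate GROUP_CONCAT(DISTINCT x): drop nulls, dedupe, join with a separator.
function groupConcatDistinct(values, separator = ',') {
  return [...new Set(values.filter((v) => v != null))].join(separator);
}
```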
### SSH Tunnel
Database connections go through an SSH tunnel for security:
```javascript
ssh.forwardOut(
"127.0.0.1",
0,
sshConfig.prodDbConfig.host,
sshConfig.prodDbConfig.port,
async (err, stream) => {
if (err) return reject(err);
resolve({ ssh, stream });
}
);
```


@@ -4,7 +4,12 @@ SET session_replication_role = 'replica'; -- Disable foreign key checks tempora
-- Create function for updating timestamps
CREATE OR REPLACE FUNCTION update_updated_column() RETURNS TRIGGER AS $func$
BEGIN
NEW.updated = CURRENT_TIMESTAMP;
-- Check which table is being updated and use the appropriate column
IF TG_TABLE_NAME = 'categories' THEN
NEW.updated_at = CURRENT_TIMESTAMP;
ELSE
NEW.updated = CURRENT_TIMESTAMP;
END IF;
RETURN NEW;
END;
$func$ language plpgsql;
@@ -160,7 +165,7 @@ CREATE TABLE purchase_orders (
expected_date DATE,
pid BIGINT NOT NULL,
sku VARCHAR(50) NOT NULL,
name VARCHAR(100) NOT NULL,
name VARCHAR(255) NOT NULL,
cost_price DECIMAL(10, 3) NOT NULL,
po_cost_price DECIMAL(10, 3) NOT NULL,
status SMALLINT DEFAULT 1,
@@ -171,7 +176,7 @@ CREATE TABLE purchase_orders (
received INTEGER DEFAULT 0,
received_date DATE,
last_received_date DATE,
received_by VARCHAR(100),
received_by VARCHAR,
receiving_history JSONB,
updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (pid) REFERENCES products(pid),


@@ -49,6 +49,30 @@ CREATE UNIQUE INDEX IF NOT EXISTS idx_unique_system_prompt
ON ai_prompts (prompt_type)
WHERE prompt_type = 'system';
-- Reusable Images table for storing persistent images
CREATE TABLE IF NOT EXISTS reusable_images (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL,
filename TEXT NOT NULL,
file_path TEXT NOT NULL,
image_url TEXT NOT NULL,
is_global BOOLEAN NOT NULL DEFAULT false,
company TEXT,
mime_type TEXT,
file_size INTEGER,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT company_required_for_non_global CHECK (
(is_global = true AND company IS NULL) OR
(is_global = false AND company IS NOT NULL)
)
);
-- Create index on company for efficient querying
CREATE INDEX IF NOT EXISTS idx_reusable_images_company ON reusable_images(company);
-- Create index on is_global for efficient querying
CREATE INDEX IF NOT EXISTS idx_reusable_images_is_global ON reusable_images(is_global);
-- AI Validation Performance Tracking
CREATE TABLE IF NOT EXISTS ai_validation_performance (
id SERIAL PRIMARY KEY,
@@ -83,3 +107,9 @@ CREATE TRIGGER update_ai_prompts_updated_at
BEFORE UPDATE ON ai_prompts
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();
-- Trigger to automatically update the updated_at column for reusable_images
CREATE TRIGGER update_reusable_images_updated_at
BEFORE UPDATE ON reusable_images
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();


@@ -142,6 +142,14 @@ async function importCategories(prodConnection, localConnection) {
// Commit the entire transaction - we'll do this even if we have skipped categories
await localConnection.query('COMMIT');
// Update sync status
await localConnection.query(`
INSERT INTO sync_status (table_name, last_sync_timestamp)
VALUES ('categories', NOW())
ON CONFLICT (table_name) DO UPDATE SET
last_sync_timestamp = NOW()
`);
outputProgress({
status: "complete",
operation: "Categories import completed",


@@ -144,10 +144,12 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
CASE
WHEN p.reorder < 0 THEN 0
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
WHEN (
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
) THEN 0
ELSE 1
END AS replenishable,
@@ -159,7 +161,11 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
COALESCE(p.sellingprice, 0) AS regular_price,
CASE
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
THEN (
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
FROM product_inventory
WHERE pid = p.pid AND count > 0
)
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
END AS cost_price,
NULL as landing_cost_price,
@@ -187,7 +193,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
p.country_of_origin,
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
p.totalsold AS total_sold,
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
pls.date_sold as date_last_sold,
GROUP_CONCAT(DISTINCT CASE
WHEN pc.cat_id IS NOT NULL
@@ -237,7 +243,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
row.pid,
row.title,
row.description,
row.itemnumber || '',
row.sku || '',
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
row.preorder_count,
row.notions_inv_count,
@@ -339,10 +345,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
CASE
WHEN p.reorder < 0 THEN 0
WHEN p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR) THEN 1
WHEN COALESCE(pnb.inventory, 0) > 0 THEN 1
WHEN (
(COALESCE(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR))
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
AND (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(CURRENT_TIMESTAMP, INTERVAL 5 YEAR))
) THEN 0
ELSE 1
END AS replenishable,
@@ -354,7 +362,11 @@ async function materializeCalculations(prodConnection, localConnection, incremen
COALESCE(p.sellingprice, 0) AS regular_price,
CASE
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
THEN (
SELECT ROUND(SUM(costeach * count) / SUM(count), 5)
FROM product_inventory
WHERE pid = p.pid AND count > 0
)
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
END AS cost_price,
NULL as landing_cost_price,
@@ -382,7 +394,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
p.country_of_origin,
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
p.totalsold AS total_sold,
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
pls.date_sold as date_last_sold,
GROUP_CONCAT(DISTINCT CASE
WHEN pc.cat_id IS NOT NULL
@@ -436,7 +448,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
row.pid,
row.title,
row.description,
row.itemnumber || '',
row.sku || '',
row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
row.preorder_count,
row.notions_inv_count,
@@ -832,6 +844,14 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
// Commit the transaction
await localConnection.commit();
// Update sync status
await localConnection.query(`
INSERT INTO sync_status (table_name, last_sync_timestamp)
VALUES ('products', NOW())
ON CONFLICT (table_name) DO UPDATE SET
last_sync_timestamp = NOW()
`);
return {
status: 'complete',
recordsAdded,


@@ -1,9 +1,56 @@
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');
/**
* Validates a date from MySQL before inserting it into PostgreSQL
* @param {string|Date|null} mysqlDate - Date string or object from MySQL
* @returns {string|null} Valid date string or null if invalid
*/
function validateDate(mysqlDate) {
// Handle null, undefined, or empty values
if (!mysqlDate) {
return null;
}
// Convert to string if it's not already
const dateStr = String(mysqlDate);
// Handle MySQL zero dates and empty values
if (dateStr === '0000-00-00' ||
dateStr === '0000-00-00 00:00:00' ||
dateStr.indexOf('0000-00-00') !== -1 ||
dateStr === '') {
return null;
}
// Check if the date is valid
const date = new Date(mysqlDate);
// If the date is invalid or suspiciously old (pre-1970), return null
if (isNaN(date.getTime()) || date.getFullYear() < 1970) {
return null;
}
return mysqlDate;
}
/**
* Imports purchase orders and receivings from a production MySQL database to a local PostgreSQL database.
* Implements FIFO allocation of receivings to purchase orders.
*
* @param {object} prodConnection - A MySQL connection to production DB
* @param {object} localConnection - A PostgreSQL connection to local DB
* @param {boolean} incrementalUpdate - Set to false for a full sync; true for incremental
* @returns {object} Information about the sync operation
*/
async function importPurchaseOrders(prodConnection, localConnection, incrementalUpdate = true) {
const startTime = Date.now();
let recordsAdded = 0;
let recordsUpdated = 0;
let totalProcessed = 0;
// Batch size constants
const PO_BATCH_SIZE = 500;
const INSERT_BATCH_SIZE = 100;
try {
// Begin transaction for the entire import process
@@ -17,75 +64,177 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
console.log('Purchase Orders: Using last sync time:', lastSyncTime);
// Create temp tables
// Create temp tables for processing
await localConnection.query(`
DROP TABLE IF EXISTS temp_purchase_orders;
CREATE TABLE temp_purchase_orders (
po_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
DROP TABLE IF EXISTS temp_receivings;
DROP TABLE IF EXISTS temp_receiving_allocations;
DROP TABLE IF EXISTS employee_names;
-- Temporary table for purchase orders
CREATE TEMP TABLE temp_purchase_orders (
po_id VARCHAR(50) NOT NULL,
pid BIGINT NOT NULL,
sku VARCHAR(50),
name VARCHAR(255),
vendor VARCHAR(255),
date TIMESTAMP WITH TIME ZONE,
expected_date TIMESTAMP WITH TIME ZONE,
expected_date DATE,
status INTEGER,
status_text VARCHAR(50),
notes TEXT,
long_note TEXT,
ordered INTEGER,
cost_price DECIMAL(10,3),
po_cost_price DECIMAL(10,3),
supplier_id INTEGER,
date_created TIMESTAMP WITH TIME ZONE,
date_ordered TIMESTAMP WITH TIME ZONE,
PRIMARY KEY (po_id, pid)
);
-- Temporary table for receivings
CREATE TEMP TABLE temp_receivings (
receiving_id VARCHAR(50) NOT NULL,
po_id VARCHAR(50),
pid BIGINT NOT NULL,
qty_each INTEGER,
cost_each DECIMAL(10,5),
received_by INTEGER,
received_date TIMESTAMP WITH TIME ZONE,
receiving_created_date TIMESTAMP WITH TIME ZONE,
supplier_id INTEGER,
status INTEGER,
status_text VARCHAR(50),
PRIMARY KEY (receiving_id, pid)
);
-- Temporary table for tracking FIFO allocations
CREATE TEMP TABLE temp_receiving_allocations (
po_id VARCHAR(50) NOT NULL,
pid BIGINT NOT NULL,
receiving_id VARCHAR(50) NOT NULL,
allocated_qty INTEGER NOT NULL,
cost_each DECIMAL(10,5) NOT NULL,
received_date TIMESTAMP WITH TIME ZONE NOT NULL,
received_by INTEGER,
PRIMARY KEY (po_id, pid, receiving_id)
);
-- Temporary table for employee names
CREATE TEMP TABLE employee_names (
employeeid INTEGER PRIMARY KEY,
firstname VARCHAR(100),
lastname VARCHAR(100)
);
-- Create indexes for efficient joins
CREATE INDEX idx_temp_po_pid ON temp_purchase_orders(pid);
CREATE INDEX idx_temp_receiving_pid ON temp_receivings(pid);
CREATE INDEX idx_temp_receiving_po_id ON temp_receivings(po_id);
`);
// First get all relevant PO IDs with basic info - Keep MySQL compatible for production
const [[{ total }]] = await prodConnection.query(`
// Map status codes to text values
const poStatusMap = {
0: 'Canceled',
1: 'Created',
10: 'Ready ESend',
11: 'Ordered',
12: 'Preordered',
13: 'Electronically Sent',
15: 'Receiving Started',
50: 'Done'
};
const receivingStatusMap = {
0: 'Canceled',
1: 'Created',
30: 'Partial Received',
40: 'Full Received',
50: 'Paid'
};
// Get time window for data retrieval
const yearInterval = incrementalUpdate ? 1 : 5;
// Fetch employee data from production
outputProgress({
status: "running",
operation: "Purchase orders import",
message: "Fetching employee data"
});
const [employees] = await prodConnection.query(`
SELECT
employeeid,
firstname,
lastname
FROM employees
`);
// Insert employee data into temp table
if (employees.length > 0) {
const employeeValues = employees.map(emp => [
emp.employeeid,
emp.firstname || '',
emp.lastname || ''
]).flat();
const placeholders = employees.map((_, idx) => {
const base = idx * 3;
return `($${base + 1}, $${base + 2}, $${base + 3})`;
}).join(',');
await localConnection.query(`
INSERT INTO employee_names (employeeid, firstname, lastname)
VALUES ${placeholders}
ON CONFLICT (employeeid) DO UPDATE SET
firstname = EXCLUDED.firstname,
lastname = EXCLUDED.lastname
`, employeeValues);
}
// 1. First, fetch all relevant POs
outputProgress({
status: "running",
operation: "Purchase orders import",
message: "Fetching purchase orders"
});
const [poCount] = await prodConnection.query(`
SELECT COUNT(*) as total
FROM (
SELECT DISTINCT pop.po_id, pop.pid
FROM po p
JOIN po_products pop ON p.po_id = pop.po_id
JOIN suppliers s ON p.supplier_id = s.supplierid
WHERE p.date_ordered >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
${incrementalUpdate ? `
AND (
p.date_updated > ?
OR p.date_ordered > ?
OR p.date_estin > ?
)
` : ''}
) all_items
FROM po p
WHERE p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL ${yearInterval} YEAR)
${incrementalUpdate ? `
AND (
p.date_updated > ?
OR p.date_ordered > ?
OR p.date_estin > ?
)
` : ''}
`, incrementalUpdate ? [lastSyncTime, lastSyncTime, lastSyncTime] : []);
console.log('Purchase Orders: Found changes:', total);
const totalPOs = poCount[0].total;
console.log(`Found ${totalPOs} relevant purchase orders`);
// Get PO list - Keep MySQL compatible for production
console.log('Fetching purchase orders in batches...');
const FETCH_BATCH_SIZE = 5000;
const INSERT_BATCH_SIZE = 200; // Process 200 records at a time for inserts
// Fetch and process POs in batches
let offset = 0;
let allProcessed = false;
let totalProcessed = 0;
let allPOsProcessed = false;
while (!allProcessed) {
console.log(`Fetching batch at offset ${offset}...`);
while (!allPOsProcessed) {
const [poList] = await prodConnection.query(`
SELECT DISTINCT
COALESCE(p.po_id, 0) as po_id,
pop.pid,
COALESCE(NULLIF(pr.itemnumber, ''), 'NO-SKU') as sku,
COALESCE(pr.description, 'Unknown Product') as name,
COALESCE(NULLIF(s.companyname, ''), 'Unknown Vendor') as vendor,
COALESCE(p.date_ordered, p.date_created) as date,
p.date_estin as expected_date,
COALESCE(p.status, 1) as status,
COALESCE(p.short_note, p.notes) as notes,
pop.qty_each as ordered,
pop.cost_each as cost_price
SELECT
p.po_id,
p.supplier_id,
s.companyname AS vendor,
p.status,
p.notes AS long_note,
p.short_note AS notes,
p.date_created,
p.date_ordered,
p.date_estin
FROM po p
JOIN po_products pop ON p.po_id = pop.po_id
JOIN products pr ON pop.pid = pr.pid
LEFT JOIN suppliers s ON p.supplier_id = s.supplierid
WHERE p.date_ordered >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
WHERE p.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL ${yearInterval} YEAR)
${incrementalUpdate ? `
AND (
p.date_updated > ?
@@ -93,28 +242,66 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
OR p.date_estin > ?
)
` : ''}
ORDER BY p.po_id, pop.pid
LIMIT ${FETCH_BATCH_SIZE} OFFSET ${offset}
ORDER BY p.po_id
LIMIT ${PO_BATCH_SIZE} OFFSET ${offset}
`, incrementalUpdate ? [lastSyncTime, lastSyncTime, lastSyncTime] : []);
if (poList.length === 0) {
allProcessed = true;
allPOsProcessed = true;
break;
}
console.log(`Processing batch of ${poList.length} purchase order items (${offset}-${offset + poList.length})`);
// Get products for these POs
const poIds = poList.map(po => po.po_id);
// Process in smaller batches for inserts
for (let i = 0; i < poList.length; i += INSERT_BATCH_SIZE) {
const batch = poList.slice(i, Math.min(i + INSERT_BATCH_SIZE, poList.length));
const [poProducts] = await prodConnection.query(`
SELECT
pp.po_id,
pp.pid,
pp.qty_each,
pp.cost_each,
COALESCE(p.itemnumber, 'NO-SKU') AS sku,
COALESCE(p.description, 'Unknown Product') AS name
FROM po_products pp
LEFT JOIN products p ON pp.pid = p.pid
WHERE pp.po_id IN (?)
`, [poIds]);
// Build complete PO records
const completePOs = [];
for (const product of poProducts) {
const po = poList.find(p => p.po_id == product.po_id);
if (!po) continue;
completePOs.push({
po_id: po.po_id.toString(),
pid: product.pid,
sku: product.sku,
name: product.name,
vendor: po.vendor || 'Unknown Vendor',
date: validateDate(po.date_ordered) || validateDate(po.date_created),
expected_date: validateDate(po.date_estin),
status: po.status,
status_text: poStatusMap[po.status] || '',
notes: po.notes || '',
long_note: po.long_note || '',
ordered: product.qty_each,
po_cost_price: product.cost_each,
supplier_id: po.supplier_id,
date_created: validateDate(po.date_created),
date_ordered: validateDate(po.date_ordered)
});
}
// Insert PO data in batches
for (let i = 0; i < completePOs.length; i += INSERT_BATCH_SIZE) {
const batch = completePOs.slice(i, i + INSERT_BATCH_SIZE);
// Create parameterized query with placeholders
const placeholders = batch.map((_, idx) => {
const base = idx * 16;
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15}, $${base + 16})`;
}).join(',');
// Create flattened values array
const values = batch.flatMap(po => [
po.po_id,
po.pid,
po.sku,
po.name,
po.vendor,
po.date,
po.expected_date,
po.status,
po.status_text,
po.notes,
po.long_note,
po.ordered,
po.po_cost_price,
po.supplier_id,
po.date_created,
po.date_ordered
]);
// Execute batch insert
await localConnection.query(`
INSERT INTO temp_purchase_orders (
po_id, pid, sku, name, vendor, date, expected_date, status, status_text,
notes, long_note, ordered, po_cost_price, supplier_id, date_created, date_ordered
)
VALUES ${placeholders}
ON CONFLICT (po_id, pid) DO UPDATE SET
sku = EXCLUDED.sku,
name = EXCLUDED.name,
vendor = EXCLUDED.vendor,
date = EXCLUDED.date,
expected_date = EXCLUDED.expected_date,
status = EXCLUDED.status,
status_text = EXCLUDED.status_text,
notes = EXCLUDED.notes,
long_note = EXCLUDED.long_note,
ordered = EXCLUDED.ordered,
po_cost_price = EXCLUDED.po_cost_price,
supplier_id = EXCLUDED.supplier_id,
date_created = EXCLUDED.date_created,
date_ordered = EXCLUDED.date_ordered
`, values);
}
// Update offset for next batch
offset += poList.length;
totalProcessed += completePOs.length;
outputProgress({
status: "running",
operation: "Purchase orders import",
message: `Processed ${offset} of ${totalPOs} purchase orders (${totalProcessed} line items)`,
current: offset,
total: totalPOs,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
remaining: estimateRemaining(startTime, offset, totalPOs),
rate: calculateRate(startTime, offset)
});
if (poList.length < PO_BATCH_SIZE) {
allPOsProcessed = true;
}
}
// Count the temp table contents
const [tempCount] = await localConnection.query(`SELECT COUNT(*) FROM temp_purchase_orders`);
const tempRowCount = parseInt(tempCount.rows[0].count);
console.log(`Successfully inserted ${tempRowCount} rows into temp_purchase_orders`);
// 2. Next, fetch all relevant receivings
outputProgress({
status: "running",
operation: "Purchase orders import",
message: "Fetching receivings data"
});
const [receivingCount] = await prodConnection.query(`
SELECT COUNT(*) as total
FROM receivings r
WHERE r.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL ${yearInterval} YEAR)
${incrementalUpdate ? `
AND (
r.date_updated > ?
OR r.date_created > ?
)
` : ''}
`, incrementalUpdate ? [lastSyncTime, lastSyncTime] : []);
const totalReceivings = receivingCount[0].total;
console.log(`Found ${totalReceivings} relevant receivings`);
// Fetch and process receivings in batches
offset = 0; // Reset offset for receivings
let allReceivingsProcessed = false;
while (!allReceivingsProcessed) {
const [receivingList] = await prodConnection.query(`
SELECT
r.receiving_id,
r.po_id,
r.supplier_id,
r.status,
r.date_created
FROM receivings r
WHERE r.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL ${yearInterval} YEAR)
${incrementalUpdate ? `
AND (
r.date_updated > ?
OR r.date_created > ?
)
` : ''}
ORDER BY r.receiving_id
LIMIT ${PO_BATCH_SIZE} OFFSET ${offset}
`, incrementalUpdate ? [lastSyncTime, lastSyncTime] : []);
if (receivingList.length === 0) {
allReceivingsProcessed = true;
break;
}
// Get products for these receivings
const receivingIds = receivingList.map(r => r.receiving_id);
const [receivingProducts] = await prodConnection.query(`
SELECT
rp.receiving_id,
rp.pid,
rp.qty_each,
rp.cost_each,
rp.received_by,
rp.received_date,
r.date_created as receiving_created_date
FROM receivings_products rp
JOIN receivings r ON rp.receiving_id = r.receiving_id
WHERE rp.receiving_id IN (?)
`, [receivingIds]);
// Build complete receiving records
const completeReceivings = [];
for (const product of receivingProducts) {
const receiving = receivingList.find(r => r.receiving_id == product.receiving_id);
if (!receiving) continue;
completeReceivings.push({
receiving_id: receiving.receiving_id.toString(),
po_id: receiving.po_id ? receiving.po_id.toString() : null,
pid: product.pid,
qty_each: product.qty_each,
cost_each: product.cost_each,
received_by: product.received_by,
received_date: validateDate(product.received_date) || validateDate(product.receiving_created_date),
receiving_created_date: validateDate(product.receiving_created_date),
supplier_id: receiving.supplier_id,
status: receiving.status,
status_text: receivingStatusMap[receiving.status] || ''
});
}
// Insert receiving data in batches
for (let i = 0; i < completeReceivings.length; i += INSERT_BATCH_SIZE) {
const batch = completeReceivings.slice(i, i + INSERT_BATCH_SIZE);
const placeholders = batch.map((_, idx) => {
const base = idx * 11;
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11})`;
}).join(',');
const values = batch.flatMap(r => [
r.receiving_id,
r.po_id,
r.pid,
r.qty_each,
r.cost_each,
r.received_by,
r.received_date,
r.receiving_created_date,
r.supplier_id,
r.status,
r.status_text
]);
await localConnection.query(`
INSERT INTO temp_receivings (
receiving_id, po_id, pid, qty_each, cost_each, received_by,
received_date, receiving_created_date, supplier_id, status, status_text
)
VALUES ${placeholders}
ON CONFLICT (receiving_id, pid) DO UPDATE SET
po_id = EXCLUDED.po_id,
qty_each = EXCLUDED.qty_each,
cost_each = EXCLUDED.cost_each,
received_by = EXCLUDED.received_by,
received_date = EXCLUDED.received_date,
receiving_created_date = EXCLUDED.receiving_created_date,
supplier_id = EXCLUDED.supplier_id,
status = EXCLUDED.status,
status_text = EXCLUDED.status_text
`, values);
}
offset += receivingList.length;
totalProcessed += completeReceivings.length;
outputProgress({
status: "running",
operation: "Purchase orders import",
message: `Processed ${offset} of ${totalReceivings} receivings (${totalProcessed} line items total)`,
current: offset,
total: totalReceivings,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
remaining: estimateRemaining(startTime, offset, totalReceivings),
rate: calculateRate(startTime, offset)
});
if (receivingList.length < PO_BATCH_SIZE) {
allReceivingsProcessed = true;
}
}
// 3. Implement FIFO allocation of receivings to purchase orders
outputProgress({
status: "running",
operation: "Purchase orders import",
message: "Allocating receivings to purchase orders using FIFO"
});
// Step 1: Handle receivings with matching PO IDs (direct allocation)
await localConnection.query(`
INSERT INTO temp_receiving_allocations (
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
)
SELECT
r.po_id,
r.pid,
r.receiving_id,
LEAST(r.qty_each, po.ordered) as allocated_qty,
r.cost_each,
COALESCE(r.received_date, NOW()) as received_date,
r.received_by
FROM temp_receivings r
JOIN temp_purchase_orders po ON r.po_id = po.po_id AND r.pid = po.pid
WHERE r.po_id IS NOT NULL
`);
// Step 2: Handle receivings without a matching PO (standalone receivings)
// Create a PO entry for each standalone receiving
await localConnection.query(`
INSERT INTO temp_purchase_orders (
po_id, pid, sku, name, vendor, date, status, status_text,
ordered, po_cost_price, supplier_id, date_created, date_ordered
)
SELECT
'R' || r.receiving_id as po_id,
r.pid,
COALESCE(p.sku, 'NO-SKU') as sku,
COALESCE(p.name, 'Unknown Product') as name,
COALESCE(
(SELECT vendor FROM temp_purchase_orders
WHERE supplier_id = r.supplier_id LIMIT 1),
'Unknown Vendor'
) as vendor,
COALESCE(r.received_date, r.receiving_created_date) as date,
NULL as status,
NULL as status_text,
NULL as ordered,
r.cost_each as po_cost_price,
r.supplier_id,
COALESCE(r.receiving_created_date, r.received_date) as date_created,
NULL as date_ordered
FROM temp_receivings r
LEFT JOIN (
SELECT DISTINCT pid, sku, name FROM temp_purchase_orders
) p ON r.pid = p.pid
WHERE r.po_id IS NULL
OR NOT EXISTS (
SELECT 1 FROM temp_purchase_orders po
WHERE po.po_id = r.po_id AND po.pid = r.pid
)
ON CONFLICT (po_id, pid) DO NOTHING
`);
// Now allocate these standalone receivings to their "virtual" POs
await localConnection.query(`
INSERT INTO temp_receiving_allocations (
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
)
SELECT
'R' || r.receiving_id as po_id,
r.pid,
r.receiving_id,
r.qty_each as allocated_qty,
r.cost_each,
COALESCE(r.received_date, NOW()) as received_date,
r.received_by
FROM temp_receivings r
WHERE r.po_id IS NULL
OR NOT EXISTS (
SELECT 1 FROM temp_purchase_orders po
WHERE po.po_id = r.po_id AND po.pid = r.pid
)
`);
// Step 3: Handle unallocated receivings vs. unfulfilled orders
// This is the complex FIFO allocation logic
await localConnection.query(`
WITH
-- Calculate remaining quantities after direct allocations
remaining_po_quantities AS (
SELECT
po.po_id,
po.pid,
po.ordered,
COALESCE(SUM(ra.allocated_qty), 0) as already_allocated,
po.ordered - COALESCE(SUM(ra.allocated_qty), 0) as remaining_qty,
po.date_ordered,
po.date_created
FROM temp_purchase_orders po
LEFT JOIN temp_receiving_allocations ra ON po.po_id = ra.po_id AND po.pid = ra.pid
WHERE po.ordered IS NOT NULL
GROUP BY po.po_id, po.pid, po.ordered, po.date_ordered, po.date_created
HAVING po.ordered > COALESCE(SUM(ra.allocated_qty), 0)
),
remaining_receiving_quantities AS (
SELECT
r.receiving_id,
r.pid,
r.qty_each,
COALESCE(SUM(ra.allocated_qty), 0) as already_allocated,
r.qty_each - COALESCE(SUM(ra.allocated_qty), 0) as remaining_qty,
r.received_date,
r.cost_each,
r.received_by
FROM temp_receivings r
LEFT JOIN temp_receiving_allocations ra ON r.receiving_id = ra.receiving_id AND r.pid = ra.pid
GROUP BY r.receiving_id, r.pid, r.qty_each, r.received_date, r.cost_each, r.received_by
HAVING r.qty_each > COALESCE(SUM(ra.allocated_qty), 0)
),
-- Rank POs by age, with a cutoff for very old POs (1 year)
ranked_pos AS (
SELECT
po.po_id,
po.pid,
po.remaining_qty,
CASE
WHEN po.date_ordered IS NULL OR po.date_ordered < NOW() - INTERVAL '1 year' THEN 2
ELSE 1
END as age_group,
ROW_NUMBER() OVER (
PARTITION BY po.pid, (CASE WHEN po.date_ordered IS NULL OR po.date_ordered < NOW() - INTERVAL '1 year' THEN 2 ELSE 1 END)
ORDER BY COALESCE(po.date_ordered, po.date_created, NOW())
) as rank_in_group
FROM remaining_po_quantities po
),
-- Rank receivings by date
ranked_receivings AS (
SELECT
r.receiving_id,
r.pid,
r.remaining_qty,
r.received_date,
r.cost_each,
r.received_by,
ROW_NUMBER() OVER (PARTITION BY r.pid ORDER BY COALESCE(r.received_date, NOW())) as rank
FROM remaining_receiving_quantities r
),
-- First allocate to recent POs
allocations_recent AS (
SELECT
po.po_id,
po.pid,
r.receiving_id,
LEAST(po.remaining_qty, r.remaining_qty) as allocated_qty,
r.cost_each,
COALESCE(r.received_date, NOW()) as received_date,
r.received_by,
po.age_group,
po.rank_in_group,
r.rank,
'recent' as allocation_type
FROM ranked_pos po
JOIN ranked_receivings r ON po.pid = r.pid
WHERE po.age_group = 1
ORDER BY po.pid, po.rank_in_group, r.rank
),
-- Then allocate to older POs
remaining_after_recent AS (
SELECT
r.receiving_id,
r.pid,
r.remaining_qty - COALESCE(SUM(a.allocated_qty), 0) as remaining_qty,
r.received_date,
r.cost_each,
r.received_by,
r.rank
FROM ranked_receivings r
LEFT JOIN allocations_recent a ON r.receiving_id = a.receiving_id AND r.pid = a.pid
GROUP BY r.receiving_id, r.pid, r.remaining_qty, r.received_date, r.cost_each, r.received_by, r.rank
HAVING r.remaining_qty > COALESCE(SUM(a.allocated_qty), 0)
),
allocations_old AS (
SELECT
po.po_id,
po.pid,
r.receiving_id,
LEAST(po.remaining_qty, r.remaining_qty) as allocated_qty,
r.cost_each,
COALESCE(r.received_date, NOW()) as received_date,
r.received_by,
po.age_group,
po.rank_in_group,
r.rank,
'old' as allocation_type
FROM ranked_pos po
JOIN remaining_after_recent r ON po.pid = r.pid
WHERE po.age_group = 2
ORDER BY po.pid, po.rank_in_group, r.rank
),
-- Combine allocations
combined_allocations AS (
SELECT * FROM allocations_recent
UNION ALL
SELECT * FROM allocations_old
)
-- Insert into allocations table
INSERT INTO temp_receiving_allocations (
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
)
SELECT
po_id, pid, receiving_id, allocated_qty, cost_each,
COALESCE(received_date, NOW()) as received_date,
received_by
FROM combined_allocations
WHERE allocated_qty > 0
`);
// 4. Generate final purchase order records with receiving data
outputProgress({
status: "running",
operation: "Purchase orders import",
message: "Generating final purchase order records"
});
const [finalResult] = await localConnection.query(`
WITH
receiving_summaries AS (
SELECT
po_id,
pid,
SUM(allocated_qty) as total_received,
JSONB_AGG(
JSONB_BUILD_OBJECT(
'receiving_id', receiving_id,
'qty', allocated_qty,
'date', COALESCE(received_date, NOW()),
'cost', cost_each,
'received_by', received_by,
'received_by_name', CASE
WHEN received_by IS NOT NULL AND received_by > 0 THEN
(SELECT CONCAT(firstname, ' ', lastname)
FROM employee_names
WHERE employeeid = received_by)
ELSE NULL
END
) ORDER BY COALESCE(received_date, NOW())
) as receiving_history,
MIN(COALESCE(received_date, NOW())) as first_received_date,
MAX(COALESCE(received_date, NOW())) as last_received_date,
STRING_AGG(
DISTINCT CASE WHEN received_by IS NOT NULL AND received_by > 0
THEN CAST(received_by AS TEXT)
ELSE NULL
END,
','
) as received_by_list,
STRING_AGG(
DISTINCT CASE
WHEN ra.received_by IS NOT NULL AND ra.received_by > 0 THEN
(SELECT CONCAT(firstname, ' ', lastname)
FROM employee_names
WHERE employeeid = ra.received_by)
ELSE NULL
END,
', '
) as received_by_names
FROM temp_receiving_allocations ra
GROUP BY po_id, pid
),
cost_averaging AS (
SELECT
ra.po_id,
ra.pid,
SUM(ra.allocated_qty * ra.cost_each) / NULLIF(SUM(ra.allocated_qty), 0) as avg_cost
FROM temp_receiving_allocations ra
GROUP BY ra.po_id, ra.pid
)
INSERT INTO purchase_orders (
po_id, vendor, date, expected_date, pid, sku, name,
cost_price, po_cost_price, status, receiving_status, notes, long_note,
ordered, received, received_date, last_received_date, received_by,
receiving_history
)
SELECT
po.po_id,
po.vendor,
CASE
WHEN po.date IS NOT NULL THEN po.date
-- For standalone receivings, try to use the receiving date from history
WHEN po.po_id LIKE 'R%' AND rs.first_received_date IS NOT NULL THEN rs.first_received_date
-- As a last resort for data integrity, use Unix epoch (Jan 1, 1970)
ELSE to_timestamp(0)
END as date,
NULLIF(po.expected_date::text, '0000-00-00')::date as expected_date,
po.pid,
po.sku,
po.name,
COALESCE(ca.avg_cost, po.po_cost_price) as cost_price,
po.po_cost_price,
CASE WHEN po.status IS NULL THEN 1 ELSE po.status END as status,
CASE
WHEN rs.total_received IS NULL THEN 1
WHEN rs.total_received = 0 THEN 1
WHEN rs.total_received < po.ordered THEN 30
WHEN rs.total_received >= po.ordered THEN 40
ELSE 1
END as receiving_status,
po.notes,
po.long_note,
COALESCE(po.ordered, 0),
COALESCE(rs.total_received, 0),
NULLIF(rs.first_received_date::text, '0000-00-00 00:00:00')::timestamp with time zone as received_date,
NULLIF(rs.last_received_date::text, '0000-00-00 00:00:00')::timestamp with time zone as last_received_date,
CASE
WHEN rs.received_by_list IS NULL THEN NULL
ELSE rs.received_by_names
END as received_by,
rs.receiving_history
FROM temp_purchase_orders po
LEFT JOIN receiving_summaries rs ON po.po_id = rs.po_id AND po.pid = rs.pid
LEFT JOIN cost_averaging ca ON po.po_id = ca.po_id AND po.pid = ca.pid
ON CONFLICT (po_id, pid) DO UPDATE SET
vendor = EXCLUDED.vendor,
date = EXCLUDED.date,
expected_date = EXCLUDED.expected_date,
sku = EXCLUDED.sku,
name = EXCLUDED.name,
cost_price = EXCLUDED.cost_price,
po_cost_price = EXCLUDED.po_cost_price,
status = EXCLUDED.status,
receiving_status = EXCLUDED.receiving_status,
notes = EXCLUDED.notes,
long_note = EXCLUDED.long_note,
ordered = EXCLUDED.ordered,
received = EXCLUDED.received,
received_date = EXCLUDED.received_date,
last_received_date = EXCLUDED.last_received_date,
received_by = EXCLUDED.received_by,
receiving_history = EXCLUDED.receiving_history,
updated = CURRENT_TIMESTAMP
RETURNING (xmax = 0) as inserted
`);
recordsAdded = finalResult.rows.filter(r => r.inserted).length;
recordsUpdated = finalResult.rows.filter(r => !r.inserted).length;
// Update sync status
await localConnection.query(`
@@ -220,7 +869,12 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
`);
// Clean up temporary tables
await localConnection.query(`
DROP TABLE IF EXISTS temp_purchase_orders;
DROP TABLE IF EXISTS temp_receivings;
DROP TABLE IF EXISTS temp_receiving_allocations;
DROP TABLE IF EXISTS employee_names;
`);
// Commit transaction
await localConnection.commit();
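The layered CTE above is dense. As a rough standalone sketch (plain JavaScript, hypothetical helper and field names, and ignoring the recent/old age-group split the real query performs), the FIFO allocation amounts to: sort open PO lines oldest-first, sort receivings oldest-first, and greedily apply each receiving to matching PO lines for the same product.

```javascript
// Simplified sketch of the FIFO allocation the SQL CTEs express.
// Not the production implementation: the real query also partitions POs
// into "recent" vs. "> 1 year old" groups and exhausts the recent group first.
function allocateFifo(poLines, receivings) {
  // Open PO lines, oldest order date first, tracking unfilled quantity
  const open = poLines
    .map(po => ({ ...po, remaining: po.ordered }))
    .sort((a, b) => a.dateOrdered - b.dateOrdered);
  const allocations = [];
  // Receivings applied oldest-first
  const queue = [...receivings].sort((a, b) => a.receivedDate - b.receivedDate);
  for (const r of queue) {
    let qty = r.qtyEach;
    for (const po of open) {
      if (qty === 0) break;
      if (po.pid !== r.pid || po.remaining === 0) continue;
      // Mirrors LEAST(po.remaining_qty, r.remaining_qty) in the SQL
      const take = Math.min(po.remaining, qty);
      allocations.push({
        poId: po.poId,
        pid: po.pid,
        receivingId: r.receivingId,
        allocatedQty: take
      });
      po.remaining -= take;
      qty -= take;
    }
    // Any qty left here is an over-receipt with no open PO line;
    // the import handles that case via the standalone "R<id>" virtual POs.
  }
  return allocations;
}
```

The SQL version expresses the same greedy matching set-wise, using `ROW_NUMBER()` rankings on both sides in place of this nested loop.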


@@ -0,0 +1,396 @@
const express = require('express');
const router = express.Router();
const multer = require('multer');
const path = require('path');
const fs = require('fs');
// Create reusable uploads directory if it doesn't exist
const uploadsDir = path.join('/var/www/html/inventory/uploads/reusable');
fs.mkdirSync(uploadsDir, { recursive: true });
// Configure multer for file uploads
const storage = multer.diskStorage({
destination: function (req, file, cb) {
console.log(`Saving reusable image to: ${uploadsDir}`);
cb(null, uploadsDir);
},
filename: function (req, file, cb) {
// Create unique filename with original extension
const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);
// Make sure we preserve the original file extension
let fileExt = path.extname(file.originalname).toLowerCase();
// Ensure there is a proper extension based on mimetype if none exists
if (!fileExt) {
switch (file.mimetype) {
case 'image/jpeg': fileExt = '.jpg'; break;
case 'image/png': fileExt = '.png'; break;
case 'image/gif': fileExt = '.gif'; break;
case 'image/webp': fileExt = '.webp'; break;
default: fileExt = '.jpg'; // Default to jpg
}
}
const fileName = `reusable-${uniqueSuffix}${fileExt}`;
console.log(`Generated filename: ${fileName} with mimetype: ${file.mimetype}`);
cb(null, fileName);
}
});
const upload = multer({
storage: storage,
limits: {
fileSize: 5 * 1024 * 1024, // 5MB max file size
},
fileFilter: function (req, file, cb) {
// Accept only image files
const filetypes = /jpeg|jpg|png|gif|webp/;
const mimetype = filetypes.test(file.mimetype);
const extname = filetypes.test(path.extname(file.originalname).toLowerCase());
if (mimetype && extname) {
return cb(null, true);
}
cb(new Error('Only image files are allowed'));
}
});
// Get all reusable images
router.get('/', async (req, res) => {
try {
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM reusable_images
ORDER BY created_at DESC
`);
res.json(result.rows);
} catch (error) {
console.error('Error fetching reusable images:', error);
res.status(500).json({
error: 'Failed to fetch reusable images',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get images by company or global images
router.get('/by-company/:companyId', async (req, res) => {
try {
const { companyId } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Get images that are either global or belong to this company
const result = await pool.query(`
SELECT * FROM reusable_images
WHERE is_global = true OR company = $1
ORDER BY created_at DESC
`, [companyId]);
res.json(result.rows);
} catch (error) {
console.error('Error fetching reusable images by company:', error);
res.status(500).json({
error: 'Failed to fetch reusable images by company',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get global images only
router.get('/global', async (req, res) => {
try {
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM reusable_images
WHERE is_global = true
ORDER BY created_at DESC
`);
res.json(result.rows);
} catch (error) {
console.error('Error fetching global reusable images:', error);
res.status(500).json({
error: 'Failed to fetch global reusable images',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Get a single image by ID
router.get('/:id', async (req, res) => {
try {
const { id } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
const result = await pool.query(`
SELECT * FROM reusable_images
WHERE id = $1
`, [id]);
if (result.rows.length === 0) {
return res.status(404).json({ error: 'Reusable image not found' });
}
res.json(result.rows[0]);
} catch (error) {
console.error('Error fetching reusable image:', error);
res.status(500).json({
error: 'Failed to fetch reusable image',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Upload a new reusable image
router.post('/upload', upload.single('image'), async (req, res) => {
try {
if (!req.file) {
return res.status(400).json({ error: 'No image file provided' });
}
const { name, is_global, company } = req.body;
// Validate required fields
if (!name) {
return res.status(400).json({ error: 'Image name is required' });
}
// Convert is_global from string to boolean
const isGlobal = is_global === 'true' || is_global === true;
// Validate company is provided for non-global images
if (!isGlobal && !company) {
return res.status(400).json({ error: 'Company is required for non-global images' });
}
// Log file information
console.log('Reusable image uploaded:', {
filename: req.file.filename,
originalname: req.file.originalname,
mimetype: req.file.mimetype,
size: req.file.size,
path: req.file.path
});
// Ensure the file exists
const filePath = path.join(uploadsDir, req.file.filename);
if (!fs.existsSync(filePath)) {
return res.status(500).json({ error: 'File was not saved correctly' });
}
// Create URL for the uploaded file
const baseUrl = 'https://inventory.acot.site';
const imageUrl = `${baseUrl}/uploads/reusable/${req.file.filename}`;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Insert record into database
const result = await pool.query(`
INSERT INTO reusable_images (
name,
filename,
file_path,
image_url,
is_global,
company,
mime_type,
file_size
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
RETURNING *
`, [
name,
req.file.filename,
filePath,
imageUrl,
isGlobal,
isGlobal ? null : company,
req.file.mimetype,
req.file.size
]);
// Return success response with image data
res.status(201).json({
success: true,
image: result.rows[0],
message: 'Image uploaded successfully'
});
} catch (error) {
console.error('Error uploading reusable image:', error);
res.status(500).json({ error: error.message || 'Failed to upload image' });
}
});
// Update image details (name, is_global, company)
router.put('/:id', async (req, res) => {
try {
const { id } = req.params;
const { name, is_global, company } = req.body;
// Validate required fields
if (!name) {
return res.status(400).json({ error: 'Image name is required' });
}
// Convert is_global from string to boolean if necessary
const isGlobal = typeof is_global === 'string' ? is_global === 'true' : !!is_global;
// Validate company is provided for non-global images
if (!isGlobal && !company) {
return res.status(400).json({ error: 'Company is required for non-global images' });
}
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Check if the image exists
const checkResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
if (checkResult.rows.length === 0) {
return res.status(404).json({ error: 'Reusable image not found' });
}
const result = await pool.query(`
UPDATE reusable_images
SET
name = $1,
is_global = $2,
company = $3
WHERE id = $4
RETURNING *
`, [
name,
isGlobal,
isGlobal ? null : company,
id
]);
res.json(result.rows[0]);
} catch (error) {
console.error('Error updating reusable image:', error);
res.status(500).json({
error: 'Failed to update reusable image',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Delete a reusable image
router.delete('/:id', async (req, res) => {
try {
const { id } = req.params;
const pool = req.app.locals.pool;
if (!pool) {
throw new Error('Database pool not initialized');
}
// Get the image data first to get the filename
const imageResult = await pool.query('SELECT * FROM reusable_images WHERE id = $1', [id]);
if (imageResult.rows.length === 0) {
return res.status(404).json({ error: 'Reusable image not found' });
}
const image = imageResult.rows[0];
// Delete from database
await pool.query('DELETE FROM reusable_images WHERE id = $1', [id]);
// Delete the file from filesystem
const filePath = path.join(uploadsDir, image.filename);
if (fs.existsSync(filePath)) {
fs.unlinkSync(filePath);
}
res.json({
message: 'Reusable image deleted successfully',
image
});
} catch (error) {
console.error('Error deleting reusable image:', error);
res.status(500).json({
error: 'Failed to delete reusable image',
details: error instanceof Error ? error.message : 'Unknown error'
});
}
});
// Check if file exists and permissions
router.get('/check-file/:filename', (req, res) => {
const { filename } = req.params;
// Prevent directory traversal
if (filename.includes('..') || filename.includes('/')) {
return res.status(400).json({ error: 'Invalid filename' });
}
const filePath = path.join(uploadsDir, filename);
try {
// Check if file exists
if (!fs.existsSync(filePath)) {
return res.status(404).json({
error: 'File not found',
path: filePath,
exists: false,
readable: false
});
}
// Check if file is readable
fs.accessSync(filePath, fs.constants.R_OK);
// Get file stats
const stats = fs.statSync(filePath);
return res.json({
filename,
path: filePath,
exists: true,
readable: true,
isFile: stats.isFile(),
isDirectory: stats.isDirectory(),
size: stats.size,
created: stats.birthtime,
modified: stats.mtime,
permissions: stats.mode.toString(8)
});
} catch (error) {
return res.status(500).json({
error: error.message,
path: filePath,
exists: fs.existsSync(filePath),
readable: false
});
}
});
// Error handling middleware
router.use((err, req, res, next) => {
console.error('Reusable images route error:', err);
res.status(500).json({
error: 'Internal server error',
details: err.message
});
});
module.exports = router;


@@ -19,6 +19,7 @@ const importRouter = require('./routes/import');
const aiValidationRouter = require('./routes/ai-validation');
const templatesRouter = require('./routes/templates');
const aiPromptsRouter = require('./routes/ai-prompts');
const reusableImagesRouter = require('./routes/reusable-images');
// Get the absolute path to the .env file
const envPath = '/var/www/html/inventory/.env';
@@ -105,6 +106,7 @@ async function startServer() {
app.use('/api/ai-validation', aiValidationRouter);
app.use('/api/templates', templatesRouter);
app.use('/api/ai-prompts', aiPromptsRouter);
app.use('/api/reusable-images', reusableImagesRouter);
// Basic health check route
app.get('/health', (req, res) => {


@@ -253,6 +253,7 @@ export const ImageUploadStep = ({
}
getProductContainerClasses={() => getProductContainerClasses(index)}
findContainer={findContainer}
handleAddImageFromUrl={handleAddImageFromUrl}
/>
))}
</div>


@@ -1,7 +1,7 @@
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Card, CardContent } from "@/components/ui/card";
import { Loader2, Link as LinkIcon, Image as ImageIcon } from "lucide-react";
import { cn } from "@/lib/utils";
import { ImageDropzone } from "./ImageDropzone";
import { SortableImage } from "./SortableImage";
@@ -9,6 +9,25 @@ import { CopyButton } from "./CopyButton";
import { ProductImageSortable, Product } from "../../types";
import { DroppableContainer } from "../DroppableContainer";
import { SortableContext, horizontalListSortingStrategy } from '@dnd-kit/sortable';
import { useQuery } from "@tanstack/react-query";
import config from "@/config";
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { ScrollArea } from "@/components/ui/scroll-area";
import { useState, useMemo } from "react";
interface ReusableImage {
id: number;
name: string;
image_url: string;
is_global: boolean;
company: string | null;
}
interface ProductCardProps {
product: Product;
@@ -26,6 +45,7 @@ interface ProductCardProps {
onRemoveImage: (id: string) => void;
getProductContainerClasses: () => string;
findContainer: (id: string) => string | null;
handleAddImageFromUrl: (productIndex: number, url: string) => void;
}
export const ProductCard = ({
@@ -43,8 +63,11 @@ export const ProductCard = ({
onDragOver,
onRemoveImage,
getProductContainerClasses,
findContainer,
handleAddImageFromUrl
}: ProductCardProps) => {
const [isReusableDialogOpen, setIsReusableDialogOpen] = useState(false);
// Function to get images for this product
const getProductImages = () => {
return productImages.filter(img => img.productIndex === index);
@@ -56,6 +79,32 @@ export const ProductCard = ({
return result !== null ? parseInt(result) : null;
};
// Fetch reusable images
const { data: reusableImages, isLoading: isLoadingReusable } = useQuery<ReusableImage[]>({
queryKey: ["reusable-images"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/reusable-images`);
if (!response.ok) {
throw new Error("Failed to fetch reusable images");
}
return response.json();
},
});
// Filter reusable images based on product's company
const availableReusableImages = useMemo(() => {
if (!reusableImages) return [];
return reusableImages.filter(img =>
img.is_global || img.company === product.company
);
}, [reusableImages, product.company]);
// Handle adding a reusable image
const handleAddReusableImage = (imageUrl: string) => {
handleAddImageFromUrl(index, imageUrl);
setIsReusableDialogOpen(false);
};
return (
<Card
className={cn(
@@ -83,6 +132,18 @@ export const ProductCard = ({
className="flex items-center gap-2"
onSubmit={onUrlSubmit}
>
{getProductImages().length === 0 && (
<Button
type="button"
variant="outline"
size="sm"
className="h-8 whitespace-nowrap flex gap-1 items-center text-xs"
onClick={() => setIsReusableDialogOpen(true)}
>
<ImageIcon className="h-3.5 w-3.5" />
Select from Library
</Button>
)}
<Input
placeholder="Add image from URL"
value={urlInput}
@@ -105,7 +166,7 @@ export const ProductCard = ({
</div>
<div className="flex flex-col sm:flex-row gap-2">
<div className="flex flex-row gap-2 items-start">
<div className="flex flex-row items-center gap-4">
<ImageDropzone
productIndex={index}
onDrop={onImageUpload}
@@ -158,6 +219,50 @@ export const ProductCard = ({
/>
</div>
</CardContent>
{/* Reusable Images Dialog */}
<Dialog open={isReusableDialogOpen} onOpenChange={setIsReusableDialogOpen}>
<DialogContent className="max-w-3xl">
<DialogHeader>
<DialogTitle>Select from Image Library</DialogTitle>
<DialogDescription>
Choose a global or company-specific image to add to this product.
</DialogDescription>
</DialogHeader>
<ScrollArea className="h-[400px] pr-4">
{isLoadingReusable ? (
<div className="flex items-center justify-center h-full">
<Loader2 className="h-8 w-8 animate-spin" />
</div>
) : availableReusableImages.length === 0 ? (
<div className="flex flex-col items-center justify-center h-full text-muted-foreground">
<ImageIcon className="h-8 w-8 mb-2" />
<p>No reusable images available</p>
</div>
) : (
<div className="grid grid-cols-2 sm:grid-cols-3 md:grid-cols-4 gap-4">
{availableReusableImages.map((image) => (
<div
key={image.id}
className="group relative aspect-square border rounded-lg overflow-hidden cursor-pointer hover:ring-2 hover:ring-primary"
onClick={() => handleAddReusableImage(image.image_url)}
>
<img
src={image.image_url}
alt={image.name}
className="w-full h-full object-cover"
/>
<div className="absolute inset-0 bg-black/0 group-hover:bg-black/20 transition-colors" />
<div className="absolute bottom-0 left-0 right-0 p-2 bg-gradient-to-t from-black/60 to-transparent">
<p className="text-xs text-white truncate">{image.name}</p>
</div>
</div>
))}
</div>
)}
</ScrollArea>
</DialogContent>
</Dialog>
</Card>
);
};

View File

@@ -31,5 +31,6 @@ export interface Product {
supplier_no?: string;
sku?: string;
model?: string;
company?: string;
product_images?: string | string[];
}

View File

@@ -0,0 +1,773 @@
import { useState, useMemo, useCallback, useRef, useEffect } from "react";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { Button } from "@/components/ui/button";
import {
Table,
TableBody,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { Checkbox } from "@/components/ui/checkbox";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { ArrowUpDown, Pencil, Trash2, PlusCircle, Image, Eye } from "lucide-react";
import config from "@/config";
import {
useReactTable,
getCoreRowModel,
getSortedRowModel,
SortingState,
flexRender,
type ColumnDef,
} from "@tanstack/react-table";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
DialogClose
} from "@/components/ui/dialog";
import {
AlertDialog,
AlertDialogAction,
AlertDialogCancel,
AlertDialogContent,
AlertDialogDescription,
AlertDialogFooter,
AlertDialogHeader,
AlertDialogTitle,
} from "@/components/ui/alert-dialog";
import { toast } from "sonner";
import { useDropzone } from "react-dropzone";
import { cn } from "@/lib/utils";
interface FieldOption {
label: string;
value: string;
}
interface ImageFormData {
id?: number;
name: string;
is_global: boolean;
company: string | null;
file?: File;
}
interface ReusableImage {
id: number;
name: string;
filename: string;
file_path: string;
image_url: string;
is_global: boolean;
company: string | null;
mime_type: string;
file_size: number;
created_at: string;
updated_at: string;
}
interface FieldOptions {
companies: FieldOption[];
}
const ImageForm = ({
editingImage,
formData,
setFormData,
onSubmit,
onCancel,
fieldOptions,
getRootProps,
getInputProps,
isDragActive
}: {
editingImage: ReusableImage | null;
formData: ImageFormData;
  setFormData: React.Dispatch<React.SetStateAction<ImageFormData>>;
onSubmit: (e: React.FormEvent) => void;
onCancel: () => void;
fieldOptions: FieldOptions | undefined;
  getRootProps: ReturnType<typeof useDropzone>["getRootProps"];
  getInputProps: ReturnType<typeof useDropzone>["getInputProps"];
isDragActive: boolean;
}) => {
const handleNameChange = useCallback((e: React.ChangeEvent<HTMLInputElement>) => {
setFormData(prev => ({ ...prev, name: e.target.value }));
}, [setFormData]);
const handleGlobalChange = useCallback((checked: boolean) => {
setFormData(prev => ({
...prev,
is_global: checked,
company: checked ? null : prev.company
}));
}, [setFormData]);
const handleCompanyChange = useCallback((value: string) => {
setFormData(prev => ({ ...prev, company: value }));
}, [setFormData]);
return (
<form onSubmit={onSubmit}>
<div className="grid gap-4 py-4">
<div className="grid gap-2">
<Label htmlFor="image_name">Image Name</Label>
<Input
id="image_name"
name="image_name"
value={formData.name}
onChange={handleNameChange}
placeholder="Enter image name"
required
/>
</div>
{!editingImage && (
<div className="grid gap-2">
<Label htmlFor="image">Upload Image</Label>
<div
{...getRootProps()}
className={cn(
"border-2 border-dashed border-secondary-foreground/30 bg-muted/90 rounded-md w-full py-6 flex flex-col items-center justify-center cursor-pointer hover:bg-muted/70 transition-colors",
isDragActive && "border-primary bg-muted"
)}
>
<input {...getInputProps()} />
<div className="flex flex-col items-center justify-center py-2">
{formData.file ? (
<>
<div className="mb-4">
<ImagePreview file={formData.file} />
</div>
<div className="flex items-center gap-2 mb-2">
<Image className="h-4 w-4 text-primary" />
<span className="text-sm">{formData.file.name}</span>
</div>
<p className="text-xs text-muted-foreground">Click or drag to replace</p>
</>
) : isDragActive ? (
<>
<Image className="h-8 w-8 mb-2 text-primary" />
<p className="text-base text-muted-foreground">Drop image here</p>
</>
) : (
<>
<Image className="h-8 w-8 mb-2 text-muted-foreground" />
<p className="text-base text-muted-foreground">Click or drag to upload</p>
</>
)}
</div>
</div>
</div>
)}
<div className="flex items-center space-x-2">
<Checkbox
id="is_global"
checked={formData.is_global}
onCheckedChange={handleGlobalChange}
/>
<Label htmlFor="is_global">Available for all companies</Label>
</div>
{!formData.is_global && (
<div className="grid gap-2">
<Label htmlFor="company">Company</Label>
<Select
value={formData.company || ''}
onValueChange={handleCompanyChange}
required={!formData.is_global}
>
<SelectTrigger>
<SelectValue placeholder="Select company" />
</SelectTrigger>
<SelectContent>
{fieldOptions?.companies.map((company) => (
<SelectItem key={company.value} value={company.value}>
{company.label}
</SelectItem>
))}
</SelectContent>
</Select>
</div>
)}
</div>
<DialogFooter>
<Button type="button" variant="outline" onClick={onCancel}>
Cancel
</Button>
<Button type="submit">
{editingImage ? "Update" : "Upload"} Image
</Button>
</DialogFooter>
</form>
);
};
export function ReusableImageManagement() {
const [isFormOpen, setIsFormOpen] = useState(false);
const [isDeleteOpen, setIsDeleteOpen] = useState(false);
const [isPreviewOpen, setIsPreviewOpen] = useState(false);
const [imageToDelete, setImageToDelete] = useState<ReusableImage | null>(null);
const [previewImage, setPreviewImage] = useState<ReusableImage | null>(null);
const [editingImage, setEditingImage] = useState<ReusableImage | null>(null);
const [sorting, setSorting] = useState<SortingState>([
{ id: "created_at", desc: true }
]);
const [searchQuery, setSearchQuery] = useState("");
const [formData, setFormData] = useState<ImageFormData>({
name: "",
is_global: false,
company: null,
file: undefined
});
const queryClient = useQueryClient();
const { data: images, isLoading } = useQuery<ReusableImage[]>({
queryKey: ["reusable-images"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/reusable-images`);
if (!response.ok) {
throw new Error("Failed to fetch reusable images");
}
return response.json();
},
});
const { data: fieldOptions } = useQuery<FieldOptions>({
queryKey: ["fieldOptions"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/import/field-options`);
if (!response.ok) {
throw new Error("Failed to fetch field options");
}
return response.json();
},
});
const createMutation = useMutation({
mutationFn: async (data: ImageFormData) => {
// Create FormData for file upload
const formData = new FormData();
formData.append('name', data.name);
formData.append('is_global', String(data.is_global));
if (!data.is_global && data.company) {
formData.append('company', data.company);
}
if (data.file) {
formData.append('image', data.file);
} else {
throw new Error("Image file is required");
}
const response = await fetch(`${config.apiUrl}/reusable-images/upload`, {
method: "POST",
body: formData,
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || error.error || "Failed to upload image");
}
return response.json();
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
toast.success("Image uploaded successfully");
resetForm();
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to upload image");
},
});
const updateMutation = useMutation({
mutationFn: async (data: ImageFormData) => {
if (!data.id) throw new Error("Image ID is required for update");
const response = await fetch(`${config.apiUrl}/reusable-images/${data.id}`, {
method: "PUT",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
name: data.name,
is_global: data.is_global,
company: data.is_global ? null : data.company
}),
});
if (!response.ok) {
const error = await response.json();
throw new Error(error.message || error.error || "Failed to update image");
}
return response.json();
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
toast.success("Image updated successfully");
resetForm();
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to update image");
},
});
const deleteMutation = useMutation({
mutationFn: async (id: number) => {
const response = await fetch(`${config.apiUrl}/reusable-images/${id}`, {
method: "DELETE",
});
if (!response.ok) {
throw new Error("Failed to delete image");
}
},
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["reusable-images"] });
toast.success("Image deleted successfully");
},
onError: (error) => {
toast.error(error instanceof Error ? error.message : "Failed to delete image");
},
});
const handleEdit = (image: ReusableImage) => {
setEditingImage(image);
setFormData({
id: image.id,
name: image.name,
is_global: image.is_global,
company: image.company,
});
setIsFormOpen(true);
};
const handleDeleteClick = (image: ReusableImage) => {
setImageToDelete(image);
setIsDeleteOpen(true);
};
const handlePreview = (image: ReusableImage) => {
setPreviewImage(image);
setIsPreviewOpen(true);
};
const handleDeleteConfirm = () => {
if (imageToDelete) {
deleteMutation.mutate(imageToDelete.id);
setIsDeleteOpen(false);
setImageToDelete(null);
}
};
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
// If is_global is true, ensure company is null
const submitData = {
...formData,
company: formData.is_global ? null : formData.company,
};
if (editingImage) {
updateMutation.mutate(submitData);
} else {
if (!submitData.file) {
toast.error("Please select an image file");
return;
}
createMutation.mutate(submitData);
}
};
const resetForm = () => {
setFormData({
name: "",
is_global: false,
company: null,
file: undefined
});
setEditingImage(null);
setIsFormOpen(false);
};
const handleCreateClick = () => {
resetForm();
setIsFormOpen(true);
};
// Configure dropzone for image uploads
const onDrop = useCallback((acceptedFiles: File[]) => {
if (acceptedFiles.length > 0) {
const file = acceptedFiles[0]; // Take only the first file
setFormData(prev => ({
...prev,
file
}));
}
}, []);
const { getRootProps, getInputProps, isDragActive } = useDropzone({
accept: {
'image/*': ['.jpeg', '.jpg', '.png', '.gif', '.webp']
},
onDrop,
multiple: false // Only accept single files
});
const columns = useMemo<ColumnDef<ReusableImage>[]>(() => [
{
accessorKey: "name",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Name
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
},
{
accessorKey: "is_global",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Type
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => {
const isGlobal = row.getValue("is_global") as boolean;
return isGlobal ? "Global" : "Company Specific";
},
},
{
accessorKey: "company",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Company
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => {
const isGlobal = row.getValue("is_global") as boolean;
if (isGlobal) return 'N/A';
      const companyId = row.getValue("company") as string | null;
if (!companyId) return 'None';
return fieldOptions?.companies.find(c => c.value === companyId)?.label || companyId;
},
},
{
accessorKey: "file_size",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Size
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => {
const size = row.getValue("file_size") as number;
return `${(size / 1024).toFixed(1)} KB`;
},
},
{
accessorKey: "created_at",
header: ({ column }) => (
<Button
variant="ghost"
onClick={() => column.toggleSorting(column.getIsSorted() === "asc")}
>
Created
<ArrowUpDown className="ml-2 h-4 w-4" />
</Button>
),
cell: ({ row }) => new Date(row.getValue("created_at")).toLocaleDateString(),
},
{
accessorKey: "image_url",
header: "Thumbnail",
cell: ({ row }) => (
<div className="flex items-center justify-center">
<img
src={row.getValue("image_url") as string}
alt={row.getValue("name") as string}
className="w-10 h-10 object-contain border rounded"
/>
</div>
),
},
{
id: "actions",
cell: ({ row }) => (
<div className="flex gap-2 justify-end">
<Button
variant="ghost"
size="icon"
onClick={() => handlePreview(row.original)}
title="Preview Image"
>
<Eye className="h-4 w-4" />
</Button>
<Button
variant="ghost"
size="icon"
onClick={() => handleEdit(row.original)}
title="Edit Image"
>
<Pencil className="h-4 w-4" />
</Button>
<Button
variant="ghost"
size="icon"
className="text-destructive hover:text-destructive"
onClick={() => handleDeleteClick(row.original)}
title="Delete Image"
>
<Trash2 className="h-4 w-4" />
</Button>
</div>
),
},
], [fieldOptions]);
const filteredData = useMemo(() => {
if (!images) return [];
return images.filter((image) => {
const searchString = searchQuery.toLowerCase();
return (
image.name.toLowerCase().includes(searchString) ||
        (image.is_global ? "global" : "company specific").includes(searchString) ||
(image.company && image.company.toLowerCase().includes(searchString))
);
});
}, [images, searchQuery]);
const table = useReactTable({
data: filteredData,
columns,
state: {
sorting,
},
onSortingChange: setSorting,
getSortedRowModel: getSortedRowModel(),
getCoreRowModel: getCoreRowModel(),
});
return (
<div className="space-y-6">
<div className="flex items-center justify-between">
<h2 className="text-2xl font-bold">Reusable Images</h2>
<Button onClick={handleCreateClick}>
<PlusCircle className="mr-2 h-4 w-4" />
Upload New Image
</Button>
</div>
<div className="flex items-center gap-4">
<Input
placeholder="Search images..."
value={searchQuery}
onChange={(e) => setSearchQuery(e.target.value)}
className="max-w-sm"
/>
</div>
{isLoading ? (
<div>Loading images...</div>
) : (
<div className="border rounded-lg">
<Table>
<TableHeader className="bg-muted">
{table.getHeaderGroups().map((headerGroup) => (
<TableRow key={headerGroup.id}>
{headerGroup.headers.map((header) => (
<TableHead key={header.id}>
{header.isPlaceholder
? null
: flexRender(
header.column.columnDef.header,
header.getContext()
)}
</TableHead>
))}
</TableRow>
))}
</TableHeader>
<TableBody>
{table.getRowModel().rows?.length ? (
table.getRowModel().rows.map((row) => (
<TableRow key={row.id} className="hover:bg-gray-100">
{row.getVisibleCells().map((cell) => (
<TableCell key={cell.id} className="pl-6">
{flexRender(cell.column.columnDef.cell, cell.getContext())}
</TableCell>
))}
</TableRow>
))
) : (
<TableRow>
<TableCell colSpan={columns.length} className="text-center">
No images found
</TableCell>
</TableRow>
)}
</TableBody>
</Table>
</div>
)}
{/* Image Form Dialog */}
<Dialog open={isFormOpen} onOpenChange={setIsFormOpen}>
<DialogContent className="max-w-md">
<DialogHeader>
<DialogTitle>{editingImage ? "Edit Image" : "Upload New Image"}</DialogTitle>
<DialogDescription>
{editingImage
? "Update this reusable image's details."
: "Upload a new reusable image that can be used across products."}
</DialogDescription>
</DialogHeader>
<ImageForm
editingImage={editingImage}
formData={formData}
setFormData={setFormData}
onSubmit={handleSubmit}
onCancel={() => {
resetForm();
setIsFormOpen(false);
}}
fieldOptions={fieldOptions}
getRootProps={getRootProps}
getInputProps={getInputProps}
isDragActive={isDragActive}
/>
</DialogContent>
</Dialog>
{/* Delete Confirmation Dialog */}
<AlertDialog open={isDeleteOpen} onOpenChange={setIsDeleteOpen}>
<AlertDialogContent>
<AlertDialogHeader>
<AlertDialogTitle>Delete Image</AlertDialogTitle>
<AlertDialogDescription>
Are you sure you want to delete this image? This action cannot be undone.
</AlertDialogDescription>
</AlertDialogHeader>
<AlertDialogFooter>
<AlertDialogCancel onClick={() => {
setIsDeleteOpen(false);
setImageToDelete(null);
}}>
Cancel
</AlertDialogCancel>
<AlertDialogAction onClick={handleDeleteConfirm}>
Delete
</AlertDialogAction>
</AlertDialogFooter>
</AlertDialogContent>
</AlertDialog>
{/* Preview Dialog */}
<Dialog open={isPreviewOpen} onOpenChange={setIsPreviewOpen}>
<DialogContent className="max-w-3xl">
<DialogHeader>
<DialogTitle>{previewImage?.name}</DialogTitle>
<DialogDescription>
{previewImage?.is_global
? "Global image"
: `Company specific image for ${fieldOptions?.companies.find(c => c.value === previewImage?.company)?.label}`}
</DialogDescription>
</DialogHeader>
<div className="flex justify-center p-4">
{previewImage && (
<div className="bg-checkerboard rounded-md overflow-hidden">
<img
src={previewImage.image_url}
alt={previewImage.name}
className="max-h-[500px] max-w-full object-contain"
/>
</div>
)}
</div>
<div className="grid grid-cols-2 gap-4 text-sm">
<div>
<span className="font-medium">Filename:</span> {previewImage?.filename}
</div>
<div>
<span className="font-medium">Size:</span> {previewImage && `${(previewImage.file_size / 1024).toFixed(1)} KB`}
</div>
<div>
<span className="font-medium">Type:</span> {previewImage?.mime_type}
</div>
<div>
<span className="font-medium">Uploaded:</span> {previewImage && new Date(previewImage.created_at).toLocaleString()}
</div>
</div>
<DialogFooter>
<DialogClose asChild>
<Button>Close</Button>
</DialogClose>
</DialogFooter>
</DialogContent>
</Dialog>
<style jsx global>{`
.bg-checkerboard {
background-image: linear-gradient(45deg, #f0f0f0 25%, transparent 25%),
linear-gradient(-45deg, #f0f0f0 25%, transparent 25%),
linear-gradient(45deg, transparent 75%, #f0f0f0 75%),
linear-gradient(-45deg, transparent 75%, #f0f0f0 75%);
background-size: 20px 20px;
background-position: 0 0, 0 10px, 10px -10px, -10px 0px;
}
`}</style>
</div>
);
}
const ImagePreview = ({ file }: { file: File }) => {
const [previewUrl, setPreviewUrl] = useState<string>('');
useEffect(() => {
const url = URL.createObjectURL(file);
setPreviewUrl(url);
return () => {
URL.revokeObjectURL(url);
};
}, [file]);
return (
<img
src={previewUrl}
alt="Preview"
className="max-h-32 max-w-full object-contain rounded-md"
/>
);
};

View File

@@ -6,6 +6,7 @@ import { CalculationSettings } from "@/components/settings/CalculationSettings";
import { TemplateManagement } from "@/components/settings/TemplateManagement";
import { UserManagement } from "@/components/settings/UserManagement";
import { PromptManagement } from "@/components/settings/PromptManagement";
import { ReusableImageManagement } from "@/components/settings/ReusableImageManagement";
import { motion } from 'framer-motion';
import { Alert, AlertDescription } from "@/components/ui/alert";
import { Protected } from "@/components/auth/Protected";
@@ -42,7 +43,8 @@ const SETTINGS_GROUPS: SettingsGroup[] = [
label: "Content Management",
tabs: [
{ id: "templates", permission: "settings:templates", label: "Template Management" },
{ id: "ai-prompts", permission: "settings:templates", label: "AI Prompts" },
{ id: "ai-prompts", permission: "settings:prompt_management", label: "AI Prompts" },
{ id: "reusable-images", permission: "settings:library_management", label: "Reusable Images" },
]
},
{
@@ -220,7 +222,7 @@ export function Settings() {
<TabsContent value="ai-prompts" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:templates"
permission="settings:prompt_management"
fallback={
<Alert>
<AlertDescription>
@@ -233,6 +235,21 @@ export function Settings() {
</Protected>
</TabsContent>
<TabsContent value="reusable-images" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:library_management"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Reusable Images.
</AlertDescription>
</Alert>
}
>
<ReusableImageManagement />
</Protected>
</TabsContent>
<TabsContent value="user-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:user_management"