Add receivings table, split PO import

2025-04-12 14:20:59 -04:00
parent ac14179bd2
commit 8508bfac93
7 changed files with 395 additions and 463 deletions

docs/split-up-pos.md Normal file

@@ -0,0 +1,112 @@
Okay, I understand completely now. The core issue is that the previous approaches tried too hard to reconcile every receipt back to a specific PO line within the `purchase_orders` table structure, which doesn't reflect the reality where receipts can be independent events. Your downstream scripts, especially `daily_snapshots` and `product_metrics`, rely on having a complete picture of *all* receivings.
Let's pivot to a model that respects both distinct data streams: **Orders (Intent)** and **Receivings (Actuals)**.
**Proposed Solution: Separate `purchase_orders` and `receivings` Tables**
This is the cleanest way to model the reality you've described.
1. **`purchase_orders` Table:**
* **Purpose:** Tracks the status and details of purchase *orders* placed. Represents the *intent* to receive goods.
* **Key Columns:** `po_id`, `pid`, `ordered` (quantity ordered), `po_cost_price`, `date` (order/created date), `expected_date`, `status` (PO lifecycle: 'ordered', 'canceled', 'done'), `vendor`, `notes`, etc.
* **Crucially:** This table *does not* need a `received` column or a `receiving_history` column derived from complex allocations. It focuses solely on the PO itself.
2. **`receivings` Table (New or Refined):**
* **Purpose:** Tracks every single line item received, regardless of whether it was linked to a PO during the receiving process. Represents the *actual* goods that arrived.
* **Key Columns:**
* `receiving_id` (Identifier for the overall receiving document/batch)
* `pid` (Product ID received)
* `received_qty` (Quantity received for this specific line)
* `cost_each` (Actual cost paid for this item on this receiving)
* `received_date` (Actual date the item was received)
* `received_by` (Employee ID/Name)
* `source_po_id` (The `po_id` entered on the receiving screen, *nullable*. Stores the original link attempt, even if it was wrong or missing)
* `source_receiving_status` (The status from the source `receivings` table: 'partial_received', 'full_received', 'paid', 'canceled')
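A minimal DDL sketch of the two tables as described above (column types, defaults, and the composite keys are assumptions to adapt to your source data; the real schema will likely carry more columns):

```sql
-- Sketch only: purchase_orders captures intent...
CREATE TABLE purchase_orders (
    po_id TEXT NOT NULL,
    pid BIGINT NOT NULL,
    ordered INTEGER NOT NULL,
    po_cost_price NUMERIC(14, 4) NOT NULL,
    date TIMESTAMP WITH TIME ZONE NOT NULL,  -- order/created date
    expected_date DATE,
    status TEXT DEFAULT 'ordered',           -- PO lifecycle only
    vendor TEXT,
    notes TEXT,
    PRIMARY KEY (po_id, pid)
);

-- ...receivings captures actuals, one row per received line item
CREATE TABLE receivings (
    receiving_id TEXT NOT NULL,
    pid BIGINT NOT NULL,
    received_qty INTEGER NOT NULL,
    cost_each NUMERIC(14, 4) NOT NULL,
    received_date TIMESTAMP WITH TIME ZONE NOT NULL,
    received_by TEXT,
    source_po_id TEXT,            -- nullable: the original link attempt, if any
    source_receiving_status TEXT,
    PRIMARY KEY (receiving_id, pid)
);
```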
**How the Import Script Changes:**
1. **Fetch POs:** Fetch data from `po` and `po_products`.
2. **Populate `purchase_orders`:**
* Insert/Update rows into `purchase_orders` based directly on the fetched PO data.
* Set `po_id`, `pid`, `ordered`, `po_cost_price`, `date` (`COALESCE(date_ordered, date_created)`), `expected_date`.
* Set `status` by mapping the source `po.status` code directly ('ordered', 'canceled', 'done', etc.).
* **No complex allocation needed here.**
3. **Fetch Receivings:** Fetch data from `receivings` and `receivings_products`.
4. **Populate `receivings`:**
* For *every* line item fetched from `receivings_products`:
* Perform necessary data validation (dates, numbers).
* Insert a new row into `receivings` with all the relevant details (`receiving_id`, `pid`, `received_qty`, `cost_each`, `received_date`, `received_by`, `source_po_id`, `source_receiving_status`).
* Use `ON CONFLICT (receiving_id, pid)` (or a similar unique key derived from your source data) `DO UPDATE SET ...` for incremental updates, or simply delete and re-insert by `receiving_id` if performance allows.
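The step-4 upsert could be sketched as follows (column names follow the proposal above; adapt the conflict key to your source data):

```sql
-- Sketch only; assumes the receivings columns proposed above
INSERT INTO receivings (
    receiving_id, pid, received_qty, cost_each,
    received_date, received_by, source_po_id, source_receiving_status
)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
ON CONFLICT (receiving_id, pid) DO UPDATE SET
    received_qty            = EXCLUDED.received_qty,
    cost_each               = EXCLUDED.cost_each,
    received_date           = EXCLUDED.received_date,
    received_by             = EXCLUDED.received_by,
    source_po_id            = EXCLUDED.source_po_id,
    source_receiving_status = EXCLUDED.source_receiving_status;
```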
**Impact on Downstream Scripts (and how to adapt):**
* **Initial Query (Active POs):**
* `SELECT ... FROM purchase_orders po WHERE po.status NOT IN ('canceled', 'done', 'paid_equivalent_status?') AND po.date >= ...`
* `active_pos`: `COUNT(DISTINCT po.po_id)` based on the filtered POs.
* `overdue_pos`: Add `AND po.expected_date < CURRENT_DATE`.
* `total_units`: `SUM(po.ordered)`. Represents total units *ordered* on active POs.
* `total_cost`: `SUM(po.ordered * po.po_cost_price)`. Cost of units *ordered*.
* `total_retail`: `SUM(po.ordered * pm.current_price)`. Retail value of units *ordered*.
* **Result:** This query now cleanly reports on the status of *orders* placed, which seems closer to its original intent. The filter `po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')` is replaced by `po.status NOT IN ('canceled', 'done', 'paid_equivalent?')`. The 90% received check is removed as `received` is not reliably tracked *on the PO* anymore.
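One possible consolidated shape for that query (the closed-status list, the one-year cutoff, and joining `product_metrics` as the source of `current_price` are all assumptions):

```sql
-- Sketch only; extend the NOT IN list with any paid-equivalent status
SELECT
    COUNT(DISTINCT po.po_id) AS active_pos,
    COUNT(DISTINCT po.po_id)
        FILTER (WHERE po.expected_date < CURRENT_DATE) AS overdue_pos,
    SUM(po.ordered)                    AS total_units,
    SUM(po.ordered * po.po_cost_price) AS total_cost,
    SUM(po.ordered * pm.current_price) AS total_retail
FROM purchase_orders po
JOIN product_metrics pm ON pm.pid = po.pid       -- assumed source of current_price
WHERE po.status NOT IN ('canceled', 'done')
  AND po.date >= CURRENT_DATE - INTERVAL '1 year'; -- placeholder cutoff
```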
* **`daily_product_snapshots`:**
* **`SalesData` CTE:** No change needed.
* **`ReceivingData` CTE:** **Must be changed.** Query the **`receivings`** table instead of `purchase_orders`.
```sql
ReceivingData AS (
SELECT
rl.pid,
COUNT(DISTINCT rl.receiving_id) as receiving_doc_count,
SUM(rl.received_qty) AS units_received,
SUM(rl.received_qty * rl.cost_each) AS cost_received
FROM public.receivings rl
WHERE rl.received_date::date = _date
-- Optional: Filter out canceled receivings if needed
-- AND rl.source_receiving_status <> 'canceled'
GROUP BY rl.pid
),
```
* **Result:** This now accurately reflects *all* units received on a given day from the definitive source.
* **`update_product_metrics`:**
* **`CurrentInfo` CTE:** No change needed (pulls from `products`).
* **`OnOrderInfo` CTE:** Needs re-evaluation. How do you want to define "On Order"?
* **Option A (Strict PO View):** `SUM(po.ordered)` from `purchase_orders po WHERE po.status NOT IN ('canceled', 'done', 'paid_equivalent?')`. This is quantity on *open orders*, ignoring fulfillment state. Simple, but might overestimate if items arrived unlinked.
* **Option B (Approximate Fulfillment):** `SUM(po.ordered)` from open POs MINUS `SUM(rl.received_qty)` from `receivings rl` where `rl.source_po_id = po.po_id` (summing only directly linked receivings). Better, but still misses fulfillment via unlinked receivings.
* **Option C (Heuristic):** `SUM(po.ordered)` from open POs MINUS `SUM(rl.received_qty)` from `receivings rl` where `rl.pid = po.pid` and `rl.received_date >= po.date`. This *tries* to account for unlinked receivings but is imprecise.
* **Recommendation:** Start with **Option A** for simplicity, clearly labeling it "Quantity on Open POs". You might need a separate process or metric for a more nuanced view of expected vs. actual pipeline.
```sql
-- Example for Option A
OnOrderInfo AS (
SELECT
pid,
SUM(ordered) AS on_order_qty, -- Total qty on open POs
SUM(ordered * po_cost_price) AS on_order_cost -- Cost of qty on open POs
FROM public.purchase_orders
WHERE status NOT IN ('canceled', 'done', 'paid_equivalent?') -- Define your open statuses
GROUP BY pid
),
```
* **`HistoricalDates` CTE:**
* `date_first_sold`, `max_order_date`: No change (queries `orders`).
* `date_first_received_calc`, `date_last_received_calc`: **Must be changed.** Query `MIN(rl.received_date)` and `MAX(rl.received_date)` from the **`receivings`** table grouped by `pid`.
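```sql
-- Sketch of the replacement lookups; the CTE name ReceivingDates is illustrative
ReceivingDates AS (
    SELECT
        rl.pid,
        MIN(rl.received_date) AS date_first_received_calc,
        MAX(rl.received_date) AS date_last_received_calc
    FROM public.receivings rl
    GROUP BY rl.pid
),
```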
* **`SnapshotAggregates` CTE:**
* `received_qty_30d`, `received_cost_30d`: These are calculated from `daily_product_snapshots`, which are now correctly sourced from `receivings`, so this part is fine.
* **Forecasting Calculations:** Will use the chosen definition of `on_order_qty`. Be aware of the implications of Option A (potentially inflated if unlinked receivings fulfill orders).
* **Result:** Metrics are calculated based on distinct order data and complete receiving data. The definition of "on order" needs careful consideration.
**Summary of this Approach:**
* **Pros:**
* Accurately models distinct order and receiving events.
* Provides a definitive source (`receivings`) for all received inventory.
* Simplifies the `purchase_orders` table and its import logic.
* Avoids complex/potentially inaccurate allocation logic for unlinked receivings within the main tables.
* Avoids synthetic records.
* Fixes downstream reporting (`daily_snapshots` receiving data).
* **Cons:**
* Requires creating/managing the `receivings` table.
* Requires modifying downstream queries (`ReceivingData`, `OnOrderInfo`, `HistoricalDates`).
* If Option A for `OnOrderInfo` isn't sufficient, calculating a precise "net quantity still expected to arrive" (true on-order minus all relevant fulfillment) becomes more complex and may require business rules or heuristics beyond the basic table structure.
This two-table approach (`purchase_orders` + `receivings`) seems the most robust and accurate way to handle your requirement for complete receiving records independent of potentially flawed PO linking. It directly addresses the shortcomings of the previous attempts.


@@ -7,7 +7,7 @@ BEGIN
     -- Check which table is being updated and use the appropriate column
     IF TG_TABLE_NAME = 'categories' THEN
         NEW.updated_at = CURRENT_TIMESTAMP;
-    ELSIF TG_TABLE_NAME IN ('products', 'orders', 'purchase_orders') THEN
+    ELSIF TG_TABLE_NAME IN ('products', 'orders', 'purchase_orders', 'receivings') THEN
         NEW.updated = CURRENT_TIMESTAMP;
     END IF;
     RETURN NEW;
@@ -159,27 +159,24 @@ CREATE INDEX idx_orders_pid_date ON orders(pid, date);
CREATE INDEX idx_orders_updated ON orders(updated);
-- Create purchase_orders table with its indexes
+-- This table now focuses solely on purchase order intent, not receivings
CREATE TABLE purchase_orders (
    id BIGSERIAL PRIMARY KEY,
    po_id TEXT NOT NULL,
    vendor TEXT NOT NULL,
-   date DATE NOT NULL,
+   date TIMESTAMP WITH TIME ZONE NOT NULL,
    expected_date DATE,
    pid BIGINT NOT NULL,
    sku TEXT NOT NULL,
    name TEXT NOT NULL,
-   cost_price NUMERIC(14, 4) NOT NULL,
    po_cost_price NUMERIC(14, 4) NOT NULL,
    status TEXT DEFAULT 'created',
-   receiving_status TEXT DEFAULT 'created',
    notes TEXT,
    long_note TEXT,
    ordered INTEGER NOT NULL,
-   received INTEGER DEFAULT 0,
-   received_date DATE,
-   last_received_date DATE,
-   received_by TEXT,
-   receiving_history JSONB,
+   supplier_id INTEGER,
+   date_created TIMESTAMP WITH TIME ZONE,
+   date_ordered TIMESTAMP WITH TIME ZONE,
    updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
    FOREIGN KEY (pid) REFERENCES products(pid) ON DELETE CASCADE,
    UNIQUE (po_id, pid)
@@ -192,21 +189,61 @@ CREATE TRIGGER update_purchase_orders_updated
    EXECUTE FUNCTION update_updated_column();
COMMENT ON COLUMN purchase_orders.name IS 'Product name from products.description';
-COMMENT ON COLUMN purchase_orders.po_cost_price IS 'Original cost from PO, before receiving adjustments';
+COMMENT ON COLUMN purchase_orders.po_cost_price IS 'Original cost from PO';
COMMENT ON COLUMN purchase_orders.status IS 'canceled, created, electronically_ready_send, ordered, preordered, electronically_sent, receiving_started, done';
-COMMENT ON COLUMN purchase_orders.receiving_status IS 'canceled, created, partial_received, full_received, paid';
-COMMENT ON COLUMN purchase_orders.receiving_history IS 'Array of receiving records with qty, date, cost, receiving_id, and alt_po flag';
CREATE INDEX idx_po_id ON purchase_orders(po_id);
CREATE INDEX idx_po_sku ON purchase_orders(sku);
CREATE INDEX idx_po_vendor ON purchase_orders(vendor);
CREATE INDEX idx_po_status ON purchase_orders(status);
-CREATE INDEX idx_po_receiving_status ON purchase_orders(receiving_status);
CREATE INDEX idx_po_expected_date ON purchase_orders(expected_date);
-CREATE INDEX idx_po_last_received_date ON purchase_orders(last_received_date);
CREATE INDEX idx_po_pid_status ON purchase_orders(pid, status);
CREATE INDEX idx_po_pid_date ON purchase_orders(pid, date);
CREATE INDEX idx_po_updated ON purchase_orders(updated);
+CREATE INDEX idx_po_supplier_id ON purchase_orders(supplier_id);
+-- Create receivings table to track actual receipt of goods
+CREATE TABLE receivings (
+    id BIGSERIAL PRIMARY KEY,
+    receiving_id TEXT NOT NULL,
+    pid BIGINT NOT NULL,
+    sku TEXT NOT NULL,
+    name TEXT NOT NULL,
+    vendor TEXT,
+    qty_each INTEGER NOT NULL,
+    qty_each_orig INTEGER,
+    cost_each NUMERIC(14, 5) NOT NULL,
+    cost_each_orig NUMERIC(14, 5),
+    received_by INTEGER,
+    received_by_name TEXT,
+    received_date TIMESTAMP WITH TIME ZONE NOT NULL,
+    receiving_created_date TIMESTAMP WITH TIME ZONE,
+    supplier_id INTEGER,
+    status TEXT DEFAULT 'created',
+    updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    FOREIGN KEY (pid) REFERENCES products(pid) ON DELETE CASCADE,
+    UNIQUE (receiving_id, pid)
+);
+-- Create trigger for receivings
+CREATE TRIGGER update_receivings_updated
+    BEFORE UPDATE ON receivings
+    FOR EACH ROW
+    EXECUTE FUNCTION update_updated_column();
+COMMENT ON COLUMN receivings.status IS 'canceled, created, partial_received, full_received, paid';
+COMMENT ON COLUMN receivings.qty_each_orig IS 'Original quantity from the source system';
+COMMENT ON COLUMN receivings.cost_each_orig IS 'Original cost from the source system';
+COMMENT ON COLUMN receivings.vendor IS 'Vendor name, same as in purchase_orders';
+CREATE INDEX idx_receivings_id ON receivings(receiving_id);
+CREATE INDEX idx_receivings_pid ON receivings(pid);
+CREATE INDEX idx_receivings_sku ON receivings(sku);
+CREATE INDEX idx_receivings_status ON receivings(status);
+CREATE INDEX idx_receivings_received_date ON receivings(received_date);
+CREATE INDEX idx_receivings_supplier_id ON receivings(supplier_id);
+CREATE INDEX idx_receivings_vendor ON receivings(vendor);
+CREATE INDEX idx_receivings_updated ON receivings(updated);
SET session_replication_role = 'origin'; -- Re-enable foreign key checks


@@ -11,10 +11,10 @@ const importHistoricalData = require('./import/historical-data');
dotenv.config({ path: path.join(__dirname, "../.env") });
// Constants to control which imports run
-const IMPORT_CATEGORIES = true;
+const IMPORT_CATEGORIES = false;
const IMPORT_PRODUCTS = true;
-const IMPORT_ORDERS = true;
-const IMPORT_PURCHASE_ORDERS = true;
+const IMPORT_ORDERS = false;
+const IMPORT_PURCHASE_ORDERS = false;
const IMPORT_HISTORICAL_DATA = false;
// Add flag for incremental updates


@@ -406,7 +406,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen
        WHERE oi.prod_pid = p.pid AND o.order_status >= 20) AS total_sold,
    pls.date_sold as date_last_sold,
    (SELECT iid FROM product_images WHERE pid = p.pid AND \`order\` = 255 LIMIT 1) AS primary_iid,
-   NULL as category_ids
+   GROUP_CONCAT(DISTINCT CASE
+       WHEN pc.cat_id IS NOT NULL
+           AND pc.type IN (10, 20, 11, 21, 12, 13)
+           AND pci.cat_id NOT IN (16, 17)
+       THEN pci.cat_id
+   END) as category_ids
FROM products p
LEFT JOIN shop_inventory si ON p.pid = si.pid AND si.store = 0
LEFT JOIN current_inventory ci ON p.pid = ci.pid


@@ -35,7 +35,7 @@ function validateDate(mysqlDate) {
/**
 * Imports purchase orders and receivings from a production MySQL database to a local PostgreSQL database.
- * Implements FIFO allocation of receivings to purchase orders.
+ * Handles these as separate data streams without complex FIFO allocation.
 *
 * @param {object} prodConnection - A MySQL connection to production DB
 * @param {object} localConnection - A PostgreSQL connection to local DB
@@ -44,8 +44,10 @@ function validateDate(mysqlDate) {
 */
async function importPurchaseOrders(prodConnection, localConnection, incrementalUpdate = true) {
    const startTime = Date.now();
-   let recordsAdded = 0;
-   let recordsUpdated = 0;
+   let poRecordsAdded = 0;
+   let poRecordsUpdated = 0;
+   let receivingRecordsAdded = 0;
+   let receivingRecordsUpdated = 0;
    let totalProcessed = 0;
    // Batch size constants
@@ -68,7 +70,6 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
    await localConnection.query(`
        DROP TABLE IF EXISTS temp_purchase_orders;
        DROP TABLE IF EXISTS temp_receivings;
-       DROP TABLE IF EXISTS temp_receiving_allocations;
        DROP TABLE IF EXISTS employee_names;
        DROP TABLE IF EXISTS temp_supplier_names;
@@ -95,32 +96,23 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
    -- Temporary table for receivings
    CREATE TEMP TABLE temp_receivings (
        receiving_id TEXT NOT NULL,
-       po_id TEXT,
        pid BIGINT NOT NULL,
+       sku TEXT,
+       name TEXT,
+       vendor TEXT,
        qty_each INTEGER,
-       cost_each NUMERIC(14, 4),
+       qty_each_orig INTEGER,
+       cost_each NUMERIC(14, 5),
+       cost_each_orig NUMERIC(14, 5),
        received_by INTEGER,
+       received_by_name TEXT,
        received_date TIMESTAMP WITH TIME ZONE,
        receiving_created_date TIMESTAMP WITH TIME ZONE,
        supplier_id INTEGER,
        status TEXT,
-       sku TEXT,
-       name TEXT,
        PRIMARY KEY (receiving_id, pid)
    );
-   -- Temporary table for tracking FIFO allocations
-   CREATE TEMP TABLE temp_receiving_allocations (
-       po_id TEXT NOT NULL,
-       pid BIGINT NOT NULL,
-       receiving_id TEXT NOT NULL,
-       allocated_qty INTEGER NOT NULL,
-       cost_each NUMERIC(14, 4) NOT NULL,
-       received_date TIMESTAMP WITH TIME ZONE NOT NULL,
-       received_by INTEGER,
-       PRIMARY KEY (po_id, pid, receiving_id)
-   );
    -- Temporary table for employee names
    CREATE TEMP TABLE employee_names (
        employeeid INTEGER PRIMARY KEY,
@@ -131,7 +123,6 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
    -- Create indexes for efficient joins
    CREATE INDEX idx_temp_po_pid ON temp_purchase_orders(pid);
    CREATE INDEX idx_temp_receiving_pid ON temp_receivings(pid);
-   CREATE INDEX idx_temp_receiving_po_id ON temp_receivings(po_id);
`);
// Map status codes to text values
@@ -194,7 +185,56 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
    `, employeeValues);
}
-// 1. First, fetch all relevant POs
+// Add this section before the PO import to create a supplier names mapping
outputProgress({
status: "running",
operation: "Purchase orders import",
message: "Fetching supplier data for vendor mapping"
});
// Fetch supplier data from production and store in a temp table
const [suppliers] = await prodConnection.query(`
SELECT
supplierid,
companyname
FROM suppliers
WHERE companyname IS NOT NULL AND companyname != ''
`);
if (suppliers.length > 0) {
// Create temp table for supplier names
await localConnection.query(`
DROP TABLE IF EXISTS temp_supplier_names;
CREATE TEMP TABLE temp_supplier_names (
supplier_id INTEGER PRIMARY KEY,
company_name TEXT NOT NULL
);
`);
// Insert supplier data in batches
for (let i = 0; i < suppliers.length; i += INSERT_BATCH_SIZE) {
const batch = suppliers.slice(i, i + INSERT_BATCH_SIZE);
const placeholders = batch.map((_, idx) => {
const base = idx * 2;
return `($${base + 1}, $${base + 2})`;
}).join(',');
const values = batch.flatMap(s => [
s.supplierid,
s.companyname || 'Unnamed Supplier'
]);
await localConnection.query(`
INSERT INTO temp_supplier_names (supplier_id, company_name)
VALUES ${placeholders}
ON CONFLICT (supplier_id) DO UPDATE SET
company_name = EXCLUDED.company_name
`, values);
}
}
// 1. Fetch and process purchase orders
outputProgress({
    status: "running",
    operation: "Purchase orders import",
@@ -392,10 +432,16 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
const [receivingList] = await prodConnection.query(`
    SELECT
        r.receiving_id,
-       r.po_id,
        r.supplier_id,
        r.status,
-       r.date_created
+       r.notes,
+       r.shipping,
+       r.total_amount,
+       r.hold,
+       r.for_storefront,
+       r.date_created,
+       r.date_paid,
+       r.date_checked
    FROM receivings r
    WHERE r.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL ${yearInterval} YEAR)
    ${incrementalUpdate ? `
@@ -421,7 +467,9 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
    rp.receiving_id,
    rp.pid,
    rp.qty_each,
+   rp.qty_each_orig,
    rp.cost_each,
+   rp.cost_each_orig,
    rp.received_by,
    rp.received_date,
    r.date_created as receiving_created_date,
@@ -439,19 +487,50 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
const receiving = receivingList.find(r => r.receiving_id == product.receiving_id);
if (!receiving) continue;
// Get employee name if available
let receivedByName = null;
if (product.received_by) {
const [employeeResult] = await localConnection.query(`
SELECT CONCAT(firstname, ' ', lastname) as full_name
FROM employee_names
WHERE employeeid = $1
`, [product.received_by]);
if (employeeResult.rows.length > 0) {
receivedByName = employeeResult.rows[0].full_name;
}
}
// Get vendor name if available
let vendorName = 'Unknown Vendor';
if (receiving.supplier_id) {
const [vendorResult] = await localConnection.query(`
SELECT company_name
FROM temp_supplier_names
WHERE supplier_id = $1
`, [receiving.supplier_id]);
if (vendorResult.rows.length > 0) {
vendorName = vendorResult.rows[0].company_name;
}
}
completeReceivings.push({
    receiving_id: receiving.receiving_id.toString(),
-   po_id: receiving.po_id ? receiving.po_id.toString() : null,
    pid: product.pid,
+   sku: product.sku,
+   name: product.name,
+   vendor: vendorName,
    qty_each: product.qty_each,
+   qty_each_orig: product.qty_each_orig,
    cost_each: product.cost_each,
+   cost_each_orig: product.cost_each_orig,
    received_by: product.received_by,
+   received_by_name: receivedByName,
    received_date: validateDate(product.received_date) || validateDate(product.receiving_created_date),
    receiving_created_date: validateDate(product.receiving_created_date),
    supplier_id: receiving.supplier_id,
-   status: receivingStatusMap[receiving.status] || 'created',
-   sku: product.sku,
-   name: product.name
+   status: receivingStatusMap[receiving.status] || 'created'
});
} }
@@ -460,43 +539,49 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
    const batch = completeReceivings.slice(i, i + INSERT_BATCH_SIZE);
    const placeholders = batch.map((_, idx) => {
-       const base = idx * 12;
-       return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12})`;
+       const base = idx * 15;
+       return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
    }).join(',');
    const values = batch.flatMap(r => [
        r.receiving_id,
-       r.po_id,
        r.pid,
+       r.sku,
+       r.name,
+       r.vendor,
        r.qty_each,
+       r.qty_each_orig,
        r.cost_each,
+       r.cost_each_orig,
        r.received_by,
+       r.received_by_name,
        r.received_date,
        r.receiving_created_date,
        r.supplier_id,
-       r.status,
-       r.sku,
-       r.name
+       r.status
    ]);
    await localConnection.query(`
        INSERT INTO temp_receivings (
-           receiving_id, po_id, pid, qty_each, cost_each, received_by,
-           received_date, receiving_created_date, supplier_id, status,
-           sku, name
+           receiving_id, pid, sku, name, vendor, qty_each, qty_each_orig,
+           cost_each, cost_each_orig, received_by, received_by_name,
+           received_date, receiving_created_date, supplier_id, status
        )
        VALUES ${placeholders}
        ON CONFLICT (receiving_id, pid) DO UPDATE SET
-           po_id = EXCLUDED.po_id,
+           sku = EXCLUDED.sku,
+           name = EXCLUDED.name,
+           vendor = EXCLUDED.vendor,
            qty_each = EXCLUDED.qty_each,
+           qty_each_orig = EXCLUDED.qty_each_orig,
            cost_each = EXCLUDED.cost_each,
+           cost_each_orig = EXCLUDED.cost_each_orig,
            received_by = EXCLUDED.received_by,
+           received_by_name = EXCLUDED.received_by_name,
            received_date = EXCLUDED.received_date,
            receiving_created_date = EXCLUDED.receiving_created_date,
            supplier_id = EXCLUDED.supplier_id,
-           status = EXCLUDED.status,
-           sku = EXCLUDED.sku,
-           name = EXCLUDED.name
+           status = EXCLUDED.status
        `, values);
    }
@@ -519,64 +604,13 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
    }
}
-// Add this section before the FIFO steps to create a supplier names mapping
+// Add this section to filter out invalid PIDs before final import
outputProgress({
    status: "running",
    operation: "Purchase orders import",
-   message: "Fetching supplier data for vendor mapping"
+   message: "Validating product IDs before final import"
});
// Fetch supplier data from production and store in a temp table
const [suppliers] = await prodConnection.query(`
SELECT
supplierid,
companyname
FROM suppliers
WHERE companyname IS NOT NULL AND companyname != ''
`);
if (suppliers.length > 0) {
// Create temp table for supplier names
await localConnection.query(`
DROP TABLE IF EXISTS temp_supplier_names;
CREATE TEMP TABLE temp_supplier_names (
supplier_id INTEGER PRIMARY KEY,
company_name TEXT NOT NULL
);
`);
// Insert supplier data in batches
for (let i = 0; i < suppliers.length; i += INSERT_BATCH_SIZE) {
const batch = suppliers.slice(i, i + INSERT_BATCH_SIZE);
const placeholders = batch.map((_, idx) => {
const base = idx * 2;
return `($${base + 1}, $${base + 2})`;
}).join(',');
const values = batch.flatMap(s => [
s.supplierid,
s.companyname || 'Unnamed Supplier'
]);
await localConnection.query(`
INSERT INTO temp_supplier_names (supplier_id, company_name)
VALUES ${placeholders}
ON CONFLICT (supplier_id) DO UPDATE SET
company_name = EXCLUDED.company_name
`, values);
}
}
// 3. Implement FIFO allocation of receivings to purchase orders
outputProgress({
status: "running",
operation: "Purchase orders import",
message: "Validating product IDs before allocation"
});
// Add this section to filter out invalid PIDs before allocation
// This will check all PIDs in our temp tables against the products table
await localConnection.query(`
    -- Create temp table to store invalid PIDs
    DROP TABLE IF EXISTS temp_invalid_pids;
@@ -614,367 +648,107 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
    console.log(`Filtered out ${filteredCount} items with invalid product IDs`);
}
-// Break FIFO allocation into steps with progress tracking
+// 3. Insert final purchase order records to the actual table
const fifoSteps = [
{
name: "Direct allocations",
query: `
INSERT INTO temp_receiving_allocations (
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
)
SELECT
r.po_id,
r.pid,
r.receiving_id,
LEAST(r.qty_each, po.ordered) as allocated_qty,
r.cost_each,
COALESCE(r.received_date, NOW()) as received_date,
r.received_by
FROM temp_receivings r
JOIN temp_purchase_orders po ON r.po_id = po.po_id AND r.pid = po.pid
WHERE r.po_id IS NOT NULL
`
},
{
name: "Handling standalone receivings",
query: `
INSERT INTO temp_purchase_orders (
po_id, pid, sku, name, vendor, date, status,
ordered, po_cost_price, supplier_id, date_created, date_ordered
)
SELECT
r.receiving_id::text as po_id,
r.pid,
r.sku,
r.name,
COALESCE(
-- First, check if we already have a vendor name from the temp_purchase_orders table
(SELECT vendor FROM temp_purchase_orders
WHERE supplier_id = r.supplier_id LIMIT 1),
-- Next, check the supplier_names mapping table we created
(SELECT company_name FROM temp_supplier_names
WHERE supplier_id = r.supplier_id),
-- If both fail, use a generic name with the supplier ID
CASE
WHEN r.supplier_id IS NOT NULL THEN 'Supplier #' || r.supplier_id::text
ELSE 'Unknown Supplier'
END
) as vendor,
COALESCE(r.received_date, r.receiving_created_date) as date,
'created' as status,
NULL as ordered,
r.cost_each as po_cost_price,
r.supplier_id,
COALESCE(r.receiving_created_date, r.received_date) as date_created,
NULL as date_ordered
FROM temp_receivings r
WHERE r.po_id IS NULL
OR NOT EXISTS (
SELECT 1 FROM temp_purchase_orders po
WHERE po.po_id = r.po_id AND po.pid = r.pid
)
ON CONFLICT (po_id, pid) DO NOTHING
`
},
{
name: "Allocating standalone receivings",
query: `
INSERT INTO temp_receiving_allocations (
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
)
SELECT
r.receiving_id::text as po_id,
r.pid,
r.receiving_id,
r.qty_each as allocated_qty,
r.cost_each,
COALESCE(r.received_date, NOW()) as received_date,
r.received_by
FROM temp_receivings r
WHERE r.po_id IS NULL
OR NOT EXISTS (
SELECT 1 FROM temp_purchase_orders po
WHERE po.po_id = r.po_id AND po.pid = r.pid
)
`
},
{
name: "FIFO allocation logic",
query: `
WITH
-- Calculate remaining quantities after direct allocations
remaining_po_quantities AS (
SELECT
po.po_id,
po.pid,
po.ordered,
COALESCE(SUM(ra.allocated_qty), 0) as already_allocated,
po.ordered - COALESCE(SUM(ra.allocated_qty), 0) as remaining_qty,
po.date_ordered,
po.date_created
FROM temp_purchase_orders po
LEFT JOIN temp_receiving_allocations ra ON po.po_id = ra.po_id AND po.pid = ra.pid
WHERE po.ordered IS NOT NULL
GROUP BY po.po_id, po.pid, po.ordered, po.date_ordered, po.date_created
HAVING po.ordered > COALESCE(SUM(ra.allocated_qty), 0)
),
remaining_receiving_quantities AS (
SELECT
r.receiving_id,
r.pid,
r.qty_each,
COALESCE(SUM(ra.allocated_qty), 0) as already_allocated,
r.qty_each - COALESCE(SUM(ra.allocated_qty), 0) as remaining_qty,
r.received_date,
r.cost_each,
r.received_by
FROM temp_receivings r
LEFT JOIN temp_receiving_allocations ra ON r.receiving_id = ra.receiving_id AND r.pid = ra.pid
GROUP BY r.receiving_id, r.pid, r.qty_each, r.received_date, r.cost_each, r.received_by
HAVING r.qty_each > COALESCE(SUM(ra.allocated_qty), 0)
),
-- Rank POs by age, with a cutoff for very old POs (1 year)
ranked_pos AS (
SELECT
po.po_id,
po.pid,
po.remaining_qty,
CASE
WHEN po.date_ordered IS NULL OR po.date_ordered < NOW() - INTERVAL '1 year' THEN 2
ELSE 1
END as age_group,
ROW_NUMBER() OVER (
PARTITION BY po.pid, (CASE WHEN po.date_ordered IS NULL OR po.date_ordered < NOW() - INTERVAL '1 year' THEN 2 ELSE 1 END)
ORDER BY COALESCE(po.date_ordered, po.date_created, NOW())
) as rank_in_group
FROM remaining_po_quantities po
),
-- Rank receivings by date
ranked_receivings AS (
SELECT
r.receiving_id,
r.pid,
r.remaining_qty,
r.received_date,
r.cost_each,
r.received_by,
ROW_NUMBER() OVER (PARTITION BY r.pid ORDER BY COALESCE(r.received_date, NOW())) as rank
FROM remaining_receiving_quantities r
),
-- First allocate to recent POs
allocations_recent AS (
SELECT
po.po_id,
po.pid,
r.receiving_id,
LEAST(po.remaining_qty, r.remaining_qty) as allocated_qty,
r.cost_each,
COALESCE(r.received_date, NOW()) as received_date,
r.received_by,
po.age_group,
po.rank_in_group,
r.rank,
'recent' as allocation_type
FROM ranked_pos po
JOIN ranked_receivings r ON po.pid = r.pid
WHERE po.age_group = 1
ORDER BY po.pid, po.rank_in_group, r.rank
),
-- Then allocate to older POs
remaining_after_recent AS (
SELECT
r.receiving_id,
r.pid,
r.remaining_qty - COALESCE(SUM(a.allocated_qty), 0) as remaining_qty,
r.received_date,
r.cost_each,
r.received_by,
r.rank
FROM ranked_receivings r
LEFT JOIN allocations_recent a ON r.receiving_id = a.receiving_id AND r.pid = a.pid
GROUP BY r.receiving_id, r.pid, r.remaining_qty, r.received_date, r.cost_each, r.received_by, r.rank
HAVING r.remaining_qty > COALESCE(SUM(a.allocated_qty), 0)
),
allocations_old AS (
SELECT
po.po_id,
po.pid,
r.receiving_id,
LEAST(po.remaining_qty, r.remaining_qty) as allocated_qty,
r.cost_each,
COALESCE(r.received_date, NOW()) as received_date,
r.received_by,
po.age_group,
po.rank_in_group,
r.rank,
'old' as allocation_type
FROM ranked_pos po
JOIN remaining_after_recent r ON po.pid = r.pid
WHERE po.age_group = 2
ORDER BY po.pid, po.rank_in_group, r.rank
),
-- Combine allocations
combined_allocations AS (
SELECT * FROM allocations_recent
UNION ALL
SELECT * FROM allocations_old
)
-- Insert into allocations table
INSERT INTO temp_receiving_allocations (
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
)
SELECT
po_id, pid, receiving_id, allocated_qty, cost_each,
COALESCE(received_date, NOW()) as received_date,
received_by
FROM combined_allocations
WHERE allocated_qty > 0
`
}
];
// Execute FIFO steps with progress tracking
for (let i = 0; i < fifoSteps.length; i++) {
const step = fifoSteps[i];
     outputProgress({
       status: "running",
       operation: "Purchase orders import",
-      message: `FIFO allocation step ${i+1}/${fifoSteps.length}: ${step.name}`,
-      current: i,
-      total: fifoSteps.length
+      message: "Inserting final purchase order records"
     });
-      await localConnection.query(step.query);
+    const [poResult] = await localConnection.query(`
-    }
-    // 4. Generate final purchase order records with receiving data
-    outputProgress({
-      status: "running",
-      operation: "Purchase orders import",
-      message: "Generating final purchase order records"
-    });
-    const [finalResult] = await localConnection.query(`
-      WITH
-      receiving_summaries AS (
-        SELECT
-          po_id,
-          pid,
-          SUM(allocated_qty) as total_received,
-          JSONB_AGG(
-            JSONB_BUILD_OBJECT(
-              'receiving_id', receiving_id,
-              'qty', allocated_qty,
-              'date', COALESCE(received_date, NOW()),
-              'cost', cost_each,
-              'received_by', received_by,
-              'received_by_name', CASE
-                WHEN received_by IS NOT NULL AND received_by > 0 THEN
-                  (SELECT CONCAT(firstname, ' ', lastname)
-                   FROM employee_names
-                   WHERE employeeid = received_by)
-                ELSE NULL
-              END
-            ) ORDER BY COALESCE(received_date, NOW())
-          ) as receiving_history,
-          MIN(COALESCE(received_date, NOW())) as first_received_date,
-          MAX(COALESCE(received_date, NOW())) as last_received_date,
-          STRING_AGG(
-            DISTINCT CASE WHEN received_by IS NOT NULL AND received_by > 0
-              THEN CAST(received_by AS TEXT)
-              ELSE NULL
-            END,
-            ','
-          ) as received_by_list,
-          STRING_AGG(
-            DISTINCT CASE
-              WHEN ra.received_by IS NOT NULL AND ra.received_by > 0 THEN
-                (SELECT CONCAT(firstname, ' ', lastname)
-                 FROM employee_names
-                 WHERE employeeid = ra.received_by)
-              ELSE NULL
-            END,
-            ', '
-          ) as received_by_names
-        FROM temp_receiving_allocations ra
-        GROUP BY po_id, pid
-      ),
-      cost_averaging AS (
-        SELECT
-          ra.po_id,
-          ra.pid,
-          SUM(ra.allocated_qty * ra.cost_each) / NULLIF(SUM(ra.allocated_qty), 0) as avg_cost
-        FROM temp_receiving_allocations ra
-        GROUP BY ra.po_id, ra.pid
-      )
       INSERT INTO purchase_orders (
         po_id, vendor, date, expected_date, pid, sku, name,
-        cost_price, po_cost_price, status, receiving_status, notes, long_note,
-        ordered, received, received_date, last_received_date, received_by,
-        receiving_history
+        po_cost_price, status, notes, long_note,
+        ordered, supplier_id, date_created, date_ordered
       )
       SELECT
-        po.po_id,
-        po.vendor,
-        CASE
-          WHEN po.date IS NOT NULL THEN po.date
-          -- For standalone receivings, try to use the receiving date from history
-          WHEN po.po_id LIKE 'R%' AND rs.first_received_date IS NOT NULL THEN rs.first_received_date
-          -- As a last resort for data integrity, use Unix epoch (Jan 1, 1970)
-          ELSE to_timestamp(0)
-        END as date,
-        NULLIF(po.expected_date::text, '0000-00-00')::date as expected_date,
-        po.pid,
-        po.sku,
-        po.name,
-        COALESCE(ca.avg_cost, po.po_cost_price) as cost_price,
-        po.po_cost_price,
-        COALESCE(po.status, 'created'),
-        CASE
-          WHEN rs.total_received IS NULL THEN 'created'
-          WHEN rs.total_received = 0 THEN 'created'
-          WHEN rs.total_received < po.ordered THEN 'partial_received'
-          WHEN rs.total_received >= po.ordered THEN 'full_received'
-          ELSE 'created'
-        END as receiving_status,
-        po.notes,
-        po.long_note,
-        COALESCE(po.ordered, 0),
-        COALESCE(rs.total_received, 0),
-        NULLIF(rs.first_received_date::text, '0000-00-00 00:00:00')::timestamp with time zone as received_date,
-        NULLIF(rs.last_received_date::text, '0000-00-00 00:00:00')::timestamp with time zone as last_received_date,
-        CASE
-          WHEN rs.received_by_list IS NULL THEN NULL
-          ELSE rs.received_by_names
-        END as received_by,
-        rs.receiving_history
-      FROM temp_purchase_orders po
-      LEFT JOIN receiving_summaries rs ON po.po_id = rs.po_id AND po.pid = rs.pid
-      LEFT JOIN cost_averaging ca ON po.po_id = ca.po_id AND po.pid = ca.pid
+        po_id,
+        vendor,
+        COALESCE(date, date_created, now()) as date,
+        expected_date,
+        pid,
+        sku,
+        name,
+        po_cost_price,
+        status,
+        notes,
+        long_note,
+        ordered,
+        supplier_id,
+        date_created,
+        date_ordered
+      FROM temp_purchase_orders
       ON CONFLICT (po_id, pid) DO UPDATE SET
         vendor = EXCLUDED.vendor,
         date = EXCLUDED.date,
         expected_date = EXCLUDED.expected_date,
         sku = EXCLUDED.sku,
         name = EXCLUDED.name,
-        cost_price = EXCLUDED.cost_price,
         po_cost_price = EXCLUDED.po_cost_price,
         status = EXCLUDED.status,
-        receiving_status = EXCLUDED.receiving_status,
         notes = EXCLUDED.notes,
         long_note = EXCLUDED.long_note,
         ordered = EXCLUDED.ordered,
-        received = EXCLUDED.received,
-        received_date = EXCLUDED.received_date,
-        last_received_date = EXCLUDED.last_received_date,
-        received_by = EXCLUDED.received_by,
-        receiving_history = EXCLUDED.receiving_history,
+        supplier_id = EXCLUDED.supplier_id,
+        date_created = EXCLUDED.date_created,
+        date_ordered = EXCLUDED.date_ordered,
         updated = CURRENT_TIMESTAMP
       RETURNING (xmax = 0) as inserted
     `);
-    recordsAdded = finalResult.rows.filter(r => r.inserted).length;
-    recordsUpdated = finalResult.rows.filter(r => !r.inserted).length;
+    poRecordsAdded = poResult.rows.filter(r => r.inserted).length;
+    poRecordsUpdated = poResult.rows.filter(r => !r.inserted).length;
+    // 4. Insert final receiving records to the actual table
+    outputProgress({
+      status: "running",
+      operation: "Purchase orders import",
+      message: "Inserting final receiving records"
+    });
+    const [receivingsResult] = await localConnection.query(`
+      INSERT INTO receivings (
+        receiving_id, pid, sku, name, vendor, qty_each, qty_each_orig,
+        cost_each, cost_each_orig, received_by, received_by_name,
+        received_date, receiving_created_date, supplier_id, status
+      )
+      SELECT
+        receiving_id,
+        pid,
+        sku,
+        name,
+        vendor,
+        qty_each,
+        qty_each_orig,
+        cost_each,
+        cost_each_orig,
+        received_by,
+        received_by_name,
+        COALESCE(received_date, receiving_created_date, now()) as received_date,
+        receiving_created_date,
+        supplier_id,
+        status
+      FROM temp_receivings
+      ON CONFLICT (receiving_id, pid) DO UPDATE SET
+        sku = EXCLUDED.sku,
+        name = EXCLUDED.name,
+        vendor = EXCLUDED.vendor,
+        qty_each = EXCLUDED.qty_each,
+        qty_each_orig = EXCLUDED.qty_each_orig,
+        cost_each = EXCLUDED.cost_each,
+        cost_each_orig = EXCLUDED.cost_each_orig,
+        received_by = EXCLUDED.received_by,
+        received_by_name = EXCLUDED.received_by_name,
+        received_date = EXCLUDED.received_date,
+        receiving_created_date = EXCLUDED.receiving_created_date,
+        supplier_id = EXCLUDED.supplier_id,
+        status = EXCLUDED.status,
+        updated = CURRENT_TIMESTAMP
+      RETURNING (xmax = 0) as inserted
+    `);
+    receivingRecordsAdded = receivingsResult.rows.filter(r => r.inserted).length;
+    receivingRecordsUpdated = receivingsResult.rows.filter(r => !r.inserted).length;
     // Update sync status
     await localConnection.query(`
@@ -988,9 +762,9 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
     await localConnection.query(`
       DROP TABLE IF EXISTS temp_purchase_orders;
       DROP TABLE IF EXISTS temp_receivings;
-      DROP TABLE IF EXISTS temp_receiving_allocations;
       DROP TABLE IF EXISTS employee_names;
       DROP TABLE IF EXISTS temp_supplier_names;
+      DROP TABLE IF EXISTS temp_invalid_pids;
     `);

     // Commit transaction
@@ -998,8 +772,12 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
     return {
       status: "complete",
-      recordsAdded: recordsAdded || 0,
-      recordsUpdated: recordsUpdated || 0,
+      recordsAdded: poRecordsAdded + receivingRecordsAdded,
+      recordsUpdated: poRecordsUpdated + receivingRecordsUpdated,
+      poRecordsAdded,
+      poRecordsUpdated,
+      receivingRecordsAdded,
+      receivingRecordsUpdated,
       totalRecords: totalProcessed
     };
   } catch (error) {
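Both upserts above use `RETURNING (xmax = 0) as inserted` to classify each row as an insert or an update in a single round trip (in Postgres, a freshly inserted tuple has `xmax = 0`). The counting done on those result rows can be factored as a small sketch (the function name is illustrative, not from the codebase):

```javascript
// Split upsert results into added vs. updated counts, given rows shaped like
// those produced by `RETURNING (xmax = 0) as inserted`.
function countUpsertResults(rows) {
  const recordsAdded = rows.filter((r) => r.inserted).length;
  return { recordsAdded, recordsUpdated: rows.length - recordsAdded };
}
```

Running it once per statement (POs, then receivings) yields the four per-table counters that the return value now exposes alongside the combined totals.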


@@ -351,7 +351,7 @@ router.get('/status/table-counts', async (req, res) => {
     const pool = req.app.locals.pool;
     const tables = [
       // Core tables
-      'products', 'categories', 'product_categories', 'orders', 'purchase_orders',
+      'products', 'categories', 'product_categories', 'orders', 'purchase_orders', 'receivings',
       // New metrics tables
       'product_metrics', 'daily_product_snapshots','brand_metrics','category_metrics','vendor_metrics',
       // Config tables
@@ -375,7 +375,7 @@ router.get('/status/table-counts', async (req, res) => {
     // Group tables by type
     const groupedCounts = {
-      core: counts.filter(c => ['products', 'categories', 'product_categories', 'orders', 'purchase_orders'].includes(c.table_name)),
+      core: counts.filter(c => ['products', 'categories', 'product_categories', 'orders', 'purchase_orders', 'receivings'].includes(c.table_name)),
       metrics: counts.filter(c => ['product_metrics', 'daily_product_snapshots','brand_metrics','category_metrics','vendor_metrics'].includes(c.table_name)),
      config: counts.filter(c => ['settings_global', 'settings_vendor', 'settings_product'].includes(c.table_name))
     };
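The route change above only adds `receivings` to the core table list; the grouping itself is a straightforward bucketing of per-table counts. A standalone sketch (the input shape `{ table_name, count }` is assumed from the route, and the constant names are illustrative):

```javascript
// Bucket per-table row counts into the three groups shown by the status page.
// Table lists mirror the route after this commit (receivings included in core).
const CORE_TABLES = ['products', 'categories', 'product_categories', 'orders', 'purchase_orders', 'receivings'];
const METRICS_TABLES = ['product_metrics', 'daily_product_snapshots', 'brand_metrics', 'category_metrics', 'vendor_metrics'];
const CONFIG_TABLES = ['settings_global', 'settings_vendor', 'settings_product'];

function groupTableCounts(counts) {
  return {
    core: counts.filter((c) => CORE_TABLES.includes(c.table_name)),
    metrics: counts.filter((c) => METRICS_TABLES.includes(c.table_name)),
    config: counts.filter((c) => CONFIG_TABLES.includes(c.table_name)),
  };
}
```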

View File

@@ -801,7 +801,7 @@ export function DataManagement() {
   );
   return (
-    <Card className="md:col-start-2 md:row-span-2 h-[550px]">
+    <Card className="md:col-start-2 md:row-span-2 h-[580px]">
       <CardHeader className="pb-3">
         <CardTitle>Table Record Counts</CardTitle>
       </CardHeader>
@@ -953,7 +953,7 @@ export function DataManagement() {
     <div className="grid gap-4 md:grid-cols-2">
       {/* Table Status */}
-      <div className="space-y-4 flex flex-col h-[550px]">
+      <div className="space-y-4 flex flex-col h-[580px]">
         <Card className="flex-1">
           <CardHeader className="pb-3">
             <CardTitle>Last Import Times</CardTitle>