Compare commits: 13 commits on branch `check-numb` (head a9dbbbf824).

docs/routes-cleanup.md (new file, 271 lines)
@@ -0,0 +1,271 @@
**Analysis of Potential Issues**

1. **Obsolete Functionality:**

    * **`config.js` Legacy Endpoints:** The endpoints `GET /config/`, `PUT /config/stock-thresholds/:id`, `PUT /config/lead-time-thresholds/:id`, `PUT /config/sales-velocity/:id`, `PUT /config/abc-classification/:id`, `PUT /config/safety-stock/:id`, and `PUT /config/turnover/:id` appear **highly likely to be obsolete**. They reference older, single-row config tables (`stock_thresholds`, etc.), while newer endpoints (`/config/global`, `/config/products`, `/config/vendors`) manage settings in more structured tables (`settings_global`, `settings_product`, `settings_vendor`). Unless specifically required for backward compatibility, these legacy endpoints should be removed to avoid confusion and potential data conflicts.

    * **`analytics.js` Forecast Endpoint (`GET /analytics/forecast`):** This endpoint uses **MySQL syntax** (`DATEDIFF`, `DATE_FORMAT`, `JSON_OBJECT`, `?` placeholders) but lives in the analytics module, which otherwise uses PostgreSQL (`req.app.locals.pool`, `date_trunc`, `::text`, `$1` placeholders). This endpoint is likely **obsolete or misplaced** and will not function correctly against the PostgreSQL database.
    * **`csv.js` Redundant Actions:**

        * `POST /csv/update` seems redundant with `POST /csv/full-update`. The latter uses the `runScript` helper and dedicated state (`activeFullUpdate`), appearing more robust. `/csv/update` might be older or incomplete.

        * `POST /csv/reset` seems redundant with `POST /csv/full-reset`. Similar reasoning applies; `/csv/full-reset` appears preferred.

    * **`products.js` Import Endpoint (`POST /products/import`):** This is **dangerous duplication**. The `/csv` module handles imports (`/csv/import`, `/csv/import-from-prod`) with locking (`activeImport`) to prevent concurrent operations. This endpoint lacks such locking and could corrupt data if run simultaneously with other CSV/reset operations. It should likely be removed.

    * **`products.js` Metrics Endpoint (`GET /products/:id/metrics`):** This is redundant. The `/metrics/:pid` endpoint provides the same, possibly more comprehensive, data directly from the `product_metrics` table. Clients should use `/metrics/:pid` instead.
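If `/analytics/forecast` above were ever ported to PostgreSQL rather than deleted, both the date functions (`DATE_FORMAT` → `to_char`, `DATEDIFF` → date subtraction) and the placeholders would need translating. A minimal sketch of the placeholder half (the helper name `toPgPlaceholders` is illustrative, not code from the repo):

```javascript
// Illustrative helper (not part of the codebase): rewrite MySQL-style "?"
// placeholders into PostgreSQL's positional "$1..$n" form. It ignores "?"
// inside string literals, so treat it as a sketch, not a migration tool.
function toPgPlaceholders(sql) {
  let n = 0;
  return sql.replace(/\?/g, () => `$${++n}`);
}

const mysqlStyle = 'SELECT pid FROM orders WHERE date >= ? AND date < ?';
const pgStyle = toPgPlaceholders(mysqlStyle);
// pgStyle === 'SELECT pid FROM orders WHERE date >= $1 AND date < $2'
```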
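The locking gap called out for `POST /products/import` comes down to a mutual-exclusion flag of the kind `csv.js` keeps in `activeImport`. A minimal in-process sketch (the function names here are illustrative, not the app's actual API):

```javascript
// Illustrative in-process lock, mirroring the activeImport pattern described
// for csv.js. Any handler that mutates product data should refuse to start
// while another task holds the lock.
let activeTask = null;

function tryStartTask(name) {
  if (activeTask) return { started: false, busyWith: activeTask };
  activeTask = name;
  return { started: true };
}

function finishTask() {
  activeTask = null;
}

// Usage: a second task is rejected until the first finishes.
const first = tryStartTask('import');   // { started: true }
const second = tryStartTask('reset');   // { started: false, busyWith: 'import' }
finishTask();
const third = tryStartTask('reset');    // { started: true }
```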
2. **Overlap or Inappropriate Duplication of Effort:**

    * **AI Prompt Getters:** `GET /ai-prompts/type/general` and `GET /ai-prompts/type/system` could potentially be handled by adding a query parameter filter to `GET /ai-prompts/` (e.g., `GET /ai-prompts?prompt_type=general`). However, dedicated endpoints for single, specific items can sometimes be simpler. This is more of a design choice than a major issue.
    * **Vendor Performance/Metrics:** There are multiple ways to get vendor performance data:

        * `GET /analytics/vendors` (uses `vendor_metrics`)

        * `GET /dashboard/vendor/performance` (uses `purchase_orders`)

        * `GET /purchase-orders/vendor-metrics` (uses `purchase_orders`)

        * `GET /vendors-aggregate/` (uses `vendor_metrics`, augmented with `purchase_orders`)

      This suggests significant overlap. The `/vendors-aggregate` endpoint seems the most comprehensive, combining pre-aggregated data with some real-time info. The others, especially `/dashboard/vendor/performance` and `/purchase-orders/vendor-metrics`, which calculate directly from `purchase_orders`, might be redundant or less performant.
    * **Product Listing:**

        * `GET /products/` lists products joining `products`, `product_metrics`, and `categories`.

        * `GET /metrics/` lists products primarily from `product_metrics`.

      They offer similar filtering/sorting. If `product_metrics` contains all necessary display fields, `GET /products/` might be partly redundant for simple listing views, although it does provide aggregated category names. Evaluate whether both full list endpoints are necessary.
    * **Image Uploads/Management:** Image handling is split:

        * `products-import.js`: Uploads temporary images for product import to `/uploads/products/`, schedules deletion.

        * `reusable-images.js`: Uploads persistent images to `/uploads/reusable/`, stores metadata in DB.

        * `products-import.js` has `/check-file` and `/list-uploads` that can see *both* directories, while `reusable-images.js` has a `/check-file` that only sees its own. This separation could be confusing. Clarify the purpose and lifecycle of images in each directory.
    * **Background Task Management (`csv.js`):** The use of `activeImport` for multiple unrelated tasks (import, reset, metrics calc) prevents concurrency, which might be too restrictive. The cancellation logic (`/cancel`) only targets `full-update`/`full-reset`, not tasks locked by `activeImport`. This needs unification.
    * **Analytics/Dashboard Base Table Queries:** Several endpoints in `analytics.js` (`/pricing`, `/categories`) and `dashboard.js` (`/best-sellers`, `/sales/metrics`, `/trending/products`, `/key-metrics`, `/inventory-health`, `/sales-overview`) query base tables (`orders`, `products`, `purchase_orders`) directly, while many others leverage pre-aggregated `_metrics` tables. This inconsistency can lead to performance differences and suggests potential for optimization by using aggregates where possible.
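The query-parameter consolidation floated for the AI prompt getters above would be small; a sketch under the assumption of a `prompt_type` column in an `ai_prompts` table (`buildPromptQuery` is hypothetical, not code from `ai_prompts.js`):

```javascript
// Hypothetical builder for GET /ai-prompts?prompt_type=...&company_id=...
// Returns a parameterized PostgreSQL query; never interpolates user input.
function buildPromptQuery(promptType, companyId) {
  const clauses = [];
  const params = [];
  if (promptType) {
    params.push(promptType);
    clauses.push(`prompt_type = $${params.length}`);
  }
  if (promptType === 'company_specific' && companyId) {
    params.push(companyId);
    clauses.push(`company_id = $${params.length}`);
  }
  const where = clauses.length ? ` WHERE ${clauses.join(' AND ')}` : '';
  return { text: `SELECT * FROM ai_prompts${where}`, params };
}

const q = buildPromptQuery('general');
// q.text === 'SELECT * FROM ai_prompts WHERE prompt_type = $1'
```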
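For the background task point, one way to loosen the single `activeImport` flag without losing safety is a lock per task type, so an import and a metrics calculation can run concurrently while two imports cannot. A sketch (names illustrative, not the app's API):

```javascript
// Illustrative per-task-type lock registry: unrelated tasks proceed in
// parallel, but a second task of the same type is refused.
const locks = new Map();

function acquire(taskType) {
  if (locks.has(taskType)) return false;
  locks.set(taskType, { startedAt: Date.now() });
  return true;
}

function release(taskType) {
  locks.delete(taskType);
}

const gotImport = acquire('import');        // true
const gotMetrics = acquire('calc-metrics'); // true: different task type
const gotImport2 = acquire('import');       // false: import already running
release('import');
const gotImport3 = acquire('import');       // true again
```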
3. **Obvious Mistakes / Data Issues:**

    * **AI Prompt Fetching:** `GET /ai-prompts/company/:companyId`, `/type/general`, and `/type/system` return `result.rows[0]`, which assumes uniqueness. If the underlying DB constraints (`unique_company_prompt`, etc.) fail or aren't present, this could silently hide data when multiple rows match. The unique-constraint handling in POST/PUT suggests this is intended and safe *if* the DB constraints are solid.
    * **Mixed Databases & SSH Tunnels:** The heavy reliance in `ai_validation.js` and `products-import.js` on connecting to a production MySQL DB via SSH tunnel, while also using a local PostgreSQL DB, adds significant architectural complexity.

    * **Inefficiency:** In `ai_validation.js` (`generateDebugResponse`), an SSH tunnel and MySQL connection (`promptTunnel`, `promptConnection`) are established but seem unused when fetching prompts (which correctly come from the PG pool `res.app.locals.pool`). This is wasted effort.

    * **Improvement:** The `getDbConnection` function in `products-import.js` implements caching/pooling for the SSH/MySQL connection. This is much better and should be used consistently wherever the production DB is accessed (e.g., in `ai_validation.js`).
    * **`products.js` Brand Filtering:** `GET /products/brands` filters brands to those having associated purchase orders with a cost >= 500. This seems arbitrary for a general list of brands and might return incomplete results depending on the use case.

    * **Type Handling:** Ensure `parseValue` handles all required types and edge cases correctly, especially for filtering complex queries in the `*-aggregate` and `metrics` routes. Explicit type casting in SQL (`::numeric`, `::text`, etc.) is generally good practice in PostgreSQL.
    * **Dummy Data:** Several `dashboard.js` endpoints return hardcoded dummy data on errors or when no data is found. While this prevents UI crashes, it can mask real issues. Ensure logging is robust when fallbacks are used.
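For the AI prompt fetching concern above, a cheap defensive measure for the "expects one row" getters is to fail loudly instead of silently returning `rows[0]`; a sketch (the helper is hypothetical):

```javascript
// Hypothetical guard for single-row getters: returns null for no match,
// the row for exactly one, and throws if DB constraints failed to enforce
// uniqueness, rather than silently hiding the extra rows.
function expectSingleRow(rows, what) {
  if (rows.length === 0) return null;
  if (rows.length > 1) {
    throw new Error(`expected at most one ${what}, found ${rows.length}`);
  }
  return rows[0];
}

const prompt = expectSingleRow([{ id: 7 }], 'general prompt'); // { id: 7 }
const missing = expectSingleRow([], 'system prompt');          // null
```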
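The caching pattern credited to `getDbConnection` can be reduced to a few lines: hold one live connection and only rebuild it after a TTL. A sketch with a stand-in `connect` function (the real helper wraps SSH tunnel plus MySQL setup; the clock is injectable so the behavior is testable):

```javascript
// Illustrative TTL cache around an expensive connection factory. `connect`
// stands in for the real SSH-tunnel + MySQL setup; `now` is injectable so
// the reuse behavior can be exercised without waiting.
function makeCachedConnection(connect, ttlMs, now = Date.now) {
  let cached = null;
  let expiresAt = 0;
  return function getConnection() {
    if (cached && now() < expiresAt) return cached;
    cached = connect();
    expiresAt = now() + ttlMs;
    return cached;
  };
}

// Usage with a fake clock: only two real connections are made.
let clock = 0;
let connects = 0;
const getConn = makeCachedConnection(() => ({ id: ++connects }), 1000, () => clock);
getConn();          // connects once
clock = 500;
getConn();          // reuses the cached connection
clock = 2000;
getConn();          // TTL expired: reconnects
// connects === 2
```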
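For the type handling point, the edge cases `parseValue` must cover can be enumerated concretely. This sketch is an assumption about its job (coercing query-string filter values), not the actual implementation:

```javascript
// Hypothetical parseValue: coerce raw query-string filter values into
// null / boolean / number / string before they reach the SQL layer.
function parseValue(raw) {
  if (raw === undefined || raw === null || raw === '' || raw === 'null') return null;
  if (raw === 'true') return true;
  if (raw === 'false') return false;
  const n = Number(raw);
  if (raw.trim() !== '' && !Number.isNaN(n)) return n;
  return raw;
}

parseValue('42');    // 42 (number)
parseValue('true');  // true (boolean)
parseValue('null');  // null
parseValue('abc');   // 'abc' (left as string)
```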
**Summary of Endpoints**

Here's a summary of the available endpoints, grouped by their likely file/module:
**1. AI Prompts (`ai_prompts.js`)**

* `GET /`: Get all AI prompts.
* `GET /:id`: Get a specific AI prompt by its ID.
* `GET /company/:companyId`: Get the AI prompt for a specific company (expects one). **(Deprecated)**
* `GET /type/general`: Get the general AI prompt (expects one). **(Deprecated)**
* `GET /type/system`: Get the system AI prompt (expects one). **(Deprecated)**
* `GET /by-type`: Get AI prompt by type (general, system, company_specific) with optional company parameter. **(New Consolidated Endpoint)**
* `POST /`: Create a new AI prompt.
* `PUT /:id`: Update an existing AI prompt.
* `DELETE /:id`: Delete an AI prompt.
**2. AI Validation (`ai_validation.js`)**

* `POST /debug`: Generate and view the structure of prompts and taxonomy data (for debugging; doesn't call OpenAI). Connects to Prod MySQL (taxonomy) and Local PG (prompts, performance).
* `POST /validate`: Validate product data using OpenAI. Connects to Prod MySQL (taxonomy) and Local PG (prompts, performance).
* `GET /test-taxonomy`: Test endpoint to query sample taxonomy data from Prod MySQL.
**3. Analytics (`analytics.js`)**

* `GET /stats`: Get overall business statistics from metrics tables.
* `GET /profit`: Get profit analysis data (by category, over time, top products) from metrics tables.
* `GET /vendors`: Get vendor performance analysis from `vendor_metrics`.
* `GET /stock`: Get stock analysis data (turnover, levels, critical items) from metrics tables.
* `GET /pricing`: Get pricing analysis (price points, elasticity, recommendations) - **uses `orders` table**.
* `GET /categories`: Get category performance analysis (revenue, profit, growth, distribution, trends) - **uses `orders` and `products` tables**.
* `GET /forecast`: (**Likely Obsolete/Broken**) Attempts to get forecast data using MySQL syntax.
**4. Brands Aggregate (`brands-aggregate.js`)**

* `GET /filter-options`: Get distinct brand names and statuses for UI filters (from `brand_metrics`).
* `GET /stats`: Get overall statistics related to brands (from `brand_metrics`).
* `GET /`: List brands with aggregated metrics, supporting filtering, sorting, pagination (from `brand_metrics`).

**5. Categories Aggregate (`categories-aggregate.js`)**

* `GET /filter-options`: Get distinct category types, statuses, and counts for UI filters (from `category_metrics` & `categories`).
* `GET /stats`: Get overall statistics related to categories (from `category_metrics` & `categories`).
* `GET /`: List categories with aggregated metrics, supporting filtering, sorting (incl. hierarchy), pagination (from `category_metrics` & `categories`).
**6. Configuration (`config.js`)**

* **(New)** `GET /global`: Get all global settings.
* **(New)** `PUT /global`: Update global settings.
* **(New)** `GET /products`: List product-specific settings with pagination/search.
* **(New)** `PUT /products/:pid`: Update/Create product-specific settings.
* **(New)** `POST /products/:pid/reset`: Reset product settings to defaults.
* **(New)** `GET /vendors`: List vendor-specific settings with pagination/search.
* **(New)** `PUT /vendors/:vendor`: Update/Create vendor-specific settings.
* **(New)** `POST /vendors/:vendor/reset`: Reset vendor settings to defaults.
* **(Legacy/Obsolete)** `GET /`: Get all config from old single-row tables.
* **(Legacy/Obsolete)** `PUT /stock-thresholds/:id`: Update old stock thresholds.
* **(Legacy/Obsolete)** `PUT /lead-time-thresholds/:id`: Update old lead time thresholds.
* **(Legacy/Obsolete)** `PUT /sales-velocity/:id`: Update old sales velocity config.
* **(Legacy/Obsolete)** `PUT /abc-classification/:id`: Update old ABC config.
* **(Legacy/Obsolete)** `PUT /safety-stock/:id`: Update old safety stock config.
* **(Legacy/Obsolete)** `PUT /turnover/:id`: Update old turnover config.
**7. CSV Operations & Background Tasks (`csv.js`)**

* `GET /:type/progress`: SSE endpoint for full update/reset progress.
* `GET /test`: Simple test endpoint.
* `GET /status`: Check status of the generic background task lock (`activeImport`).
* `GET /calculate-metrics/status`: Check status of metrics calculation.
* `GET /history/import`: Get recent import history.
* `GET /history/calculate`: Get recent metrics calculation history.
* `GET /status/modules`: Get last calculation time per module.
* `GET /status/tables`: Get last sync time per table.
* `GET /status/table-counts`: Get record counts for key tables.
* `POST /update`: (**Potentially Obsolete**) Trigger `update-csv.js` script.
* `POST /import`: Trigger `import-csv.js` script.
* `POST /cancel`: Cancel `/full-update` or `/full-reset` task.
* `POST /reset`: (**Potentially Obsolete**) Trigger `reset-db.js` script.
* `POST /reset-metrics`: Trigger `reset-metrics.js` script.
* `POST /calculate-metrics`: Trigger `calculate-metrics.js` script.
* `POST /import-from-prod`: Trigger `import-from-prod.js` script.
* `POST /full-update`: Trigger `full-update.js` script (preferred update).
* `POST /full-reset`: Trigger `full-reset.js` script (preferred reset).
**8. Dashboard (`dashboard.js`)**

* `GET /stock/metrics`: Get dashboard stock summary metrics & brand breakdown.
* `GET /purchase/metrics`: Get dashboard purchase order summary metrics & vendor breakdown.
* `GET /replenishment/metrics`: Get dashboard replenishment summary & top variants.
* `GET /forecast/metrics`: Get dashboard forecast summary, daily, and category breakdown.
* `GET /overstock/metrics`: Get dashboard overstock summary & category breakdown.
* `GET /overstock/products`: Get list of top overstocked products.
* `GET /best-sellers`: Get dashboard best-selling products, brands, categories - **uses `orders`, `products`**.
* `GET /sales/metrics`: Get dashboard sales summary for a period - **uses `orders`**.
* `GET /low-stock/products`: Get list of top low stock/critical products.
* `GET /trending/products`: Get list of trending products - **uses `orders`, `products`**.
* `GET /vendor/performance`: Get dashboard vendor performance details - **uses `purchase_orders`**.
* `GET /key-metrics`: Get dashboard summary KPIs - **uses multiple base tables**.
* `GET /inventory-health`: Get dashboard inventory health overview - **uses `products`, `product_metrics`**.
* `GET /replenish/products`: Get list of products needing replenishment (overlaps `/low-stock/products`).
* `GET /sales-overview`: Get daily sales totals for chart - **uses `orders`**.
**9. Product Import Utilities (`products-import.js`)**

* `POST /upload-image`: Upload temporary product image, schedule deletion.
* `DELETE /delete-image`: Delete temporary product image.
* `GET /field-options`: Get dropdown options for product fields from Prod MySQL (cached).
* `GET /product-lines/:companyId`: Get product lines for a company from Prod MySQL (cached).
* `GET /sublines/:lineId`: Get sublines for a line from Prod MySQL (cached).
* `GET /check-file/:filename`: Check existence/permissions of uploaded file (temp or reusable).
* `GET /list-uploads`: List files in upload directories.
* `GET /search-products`: Search products in Prod MySQL DB.
* `GET /check-upc-and-generate-sku`: Check UPC existence and generate SKU suggestion based on Prod MySQL data.
* `GET /product-categories/:pid`: Get assigned categories for a product from Prod MySQL.
**10. Product Metrics (`product-metrics.js`)**

* `GET /filter-options`: Get distinct filter values (vendor, brand, abcClass) from `product_metrics`.
* `GET /`: List detailed product metrics with filtering, sorting, pagination (primary data access).
* `GET /:pid`: Get full metrics record for a single product.

**11. Orders (`orders.js`)**

* `GET /`: List orders with summary info, filtering, sorting, pagination, and stats.
* `GET /:orderNumber`: Get details for a single order, including items.
**12. Products (`products.js`)**

* `GET /brands`: Get distinct brands (filtered by PO value).
* `GET /`: List products with core data + metrics, filtering, sorting, pagination.
* `GET /trending`: Get trending products based on `product_metrics`.
* `GET /:id`: Get details for a single product (core data + metrics).
* `POST /import`: (**Likely Obsolete/Dangerous**) Import products from CSV.
* `PUT /:id`: Update core product data.
* `GET /:id/metrics`: (**Redundant**) Get metrics for a single product.
* `GET /:id/time-series`: Get sales/PO history for a single product.
**13. Purchase Orders (`purchase-orders.js`)**

* `GET /`: List purchase orders with summary info, filtering, sorting, pagination, and summary stats.
* `GET /vendor-metrics`: Calculate vendor performance metrics from `purchase_orders`.
* `GET /cost-analysis`: Calculate cost analysis by category from `purchase_orders`.
* `GET /receiving-status`: Get summary counts based on PO receiving status.
* `GET /order-vs-received`: List product ordered vs. received quantities.
**14. Reusable Images (`reusable-images.js`)**

* `GET /`: List all reusable images.
* `GET /by-company/:companyId`: List global and company-specific images.
* `GET /global`: List only global images.
* `GET /:id`: Get a single reusable image record.
* `POST /upload`: Upload a new reusable image and create DB record.
* `PUT /:id`: Update reusable image metadata (name, global, company).
* `DELETE /:id`: Delete reusable image record and file.
* `GET /check-file/:filename`: Check existence/permissions of a reusable image file.
**15. Templates (`templates.js`)**

* `GET /`: List all product data templates.
* `GET /:company/:productType`: Get a specific template.
* `POST /`: Create a new template.
* `PUT /:id`: Update an existing template.
* `DELETE /:id`: Delete a template.

**16. Vendors Aggregate (`vendors-aggregate.js`)**

* `GET /filter-options`: Get distinct vendor names and statuses for UI filters (from `vendor_metrics`).
* `GET /stats`: Get overall statistics related to vendors (from `vendor_metrics` & `purchase_orders`).
* `GET /`: List vendors with aggregated metrics, supporting filtering, sorting, pagination (from `vendor_metrics` & `purchase_orders`).
**Recommendations:**

1. **Address Obsolete Endpoints:** Prioritize removing or confirming the necessity of the endpoints marked as obsolete/redundant (legacy config, `/analytics/forecast`, `/csv/update`, `/csv/reset`, `/products/import`, `/products/:id/metrics`).

2. **Consolidate Overlapping Functionality:** Review the multiple vendor performance and product listing endpoints. Decide on the primary method (e.g., using aggregate tables via `/vendors-aggregate` and `/metrics`) and refactor or remove the others. Clarify the image upload strategies.

3. **Standardize Data Access:** Decide whether `dashboard` and `analytics` endpoints should primarily use aggregate tables (like `/metrics`, `/brands-aggregate`, etc.) or if direct access to base tables is sometimes necessary. Aim for consistency and document the reasoning. Optimize queries hitting base tables if they must remain.

4. **Improve Background Task Management:** Refactor `csv.js` to use a unified locking mechanism (e.g., separate locks per task type) and a consistent cancellation strategy for all spawned/managed processes. Clarify the purpose of `update` vs `full-update` and `reset` vs `full-reset`.

5. **Optimize DB Connections:** Ensure the `getDbConnection` pooling/caching helper from `products-import.js` is used *consistently* across all modules interacting with the production MySQL database (especially `ai_validation.js`). Remove unnecessary tunnel creations.

6. **Review Data Integrity:** Double-check the assumptions made (e.g., uniqueness of AI prompts) and ensure database constraints enforce them. Review the `GET /products/brands` filtering logic.
## Changes Made

1. **Removed Obsolete Legacy Endpoints in `config.js`**:
   - Removed `GET /config/` endpoint
   - Removed `PUT /config/stock-thresholds/:id` endpoint
   - Removed `PUT /config/lead-time-thresholds/:id` endpoint
   - Removed `PUT /config/sales-velocity/:id` endpoint
   - Removed `PUT /config/abc-classification/:id` endpoint
   - Removed `PUT /config/safety-stock/:id` endpoint
   - Removed `PUT /config/turnover/:id` endpoint

   These endpoints were obsolete as they referenced older, single-row config tables that have been replaced by newer endpoints using the structured tables `settings_global`, `settings_product`, and `settings_vendor`.
2. **Removed MySQL Syntax `/forecast` Endpoint in `analytics.js`**:
   - Removed `GET /analytics/forecast` endpoint that was using MySQL-specific syntax incompatible with the PostgreSQL database used elsewhere in the application.

3. **Renamed and Removed Redundant Endpoints**:
   - Renamed `csv.js` to `data-management.js` while maintaining the same `/csv/*` endpoint paths for consistency
   - Removed deprecated `/csv/update` endpoint (now fully replaced by `/csv/full-update`)
   - Removed deprecated `/csv/reset` endpoint (now fully replaced by `/csv/full-reset`)
   - Removed deprecated `/products/import` endpoint (now handled by `/csv/import`)
   - Removed deprecated `/products/:id/metrics` endpoint (now handled by `/metrics/:pid`)
4. **Fixed Data Integrity Issues**:
   - Improved the `GET /products/brands` endpoint by removing the arbitrary filtering logic that only showed brands with purchase orders totaling at least $500
   - The updated endpoint now returns all distinct brands from visible products, providing more complete data
5. **Optimized Database Connections**:
   - Created a new `dbConnection.js` utility file that encapsulates the optimized database connection management logic
   - Improved the `ai-validation.js` file to use this shared connection management, eliminating unnecessary repeated tunnel creation
   - Added proper connection pooling with timeout-based connection reuse, reducing the overhead of repeatedly creating SSH tunnels
   - Added query result caching for frequently accessed data to improve performance

These changes improve maintainability by removing duplicate code, enhance consistency by standardizing on the newer endpoint patterns, and optimize performance by reducing redundant database connections.
## Additional Improvements

1. **Further Database Connection Optimizations**:
   - Extended the use of the optimized database connection utility to additional endpoints in `ai-validation.js`
   - Updated the `/validate` and `/test-taxonomy` endpoints to use `getDbConnection`
   - Ensured consistent connection management across all routes that access the production database
2. **AI Prompts Data Integrity Verification**:
   - Confirmed proper uniqueness constraints are in place in the database schema for AI prompts
   - The schema includes:
     - `unique_company_prompt` constraint ensuring only one prompt per company
     - `idx_unique_general_prompt` index ensuring only one general prompt in the system
     - `idx_unique_system_prompt` index ensuring only one system prompt in the system
   - Endpoint handlers properly handle uniqueness constraint violations with appropriate 409 Conflict responses
   - Validation ensures company-specific prompts have company IDs, while general/system prompts do not
3. **AI Prompts Endpoint Consolidation**:
   - Added a new consolidated `/by-type` endpoint that handles all types of prompts (general, system, company_specific)
   - Marked the existing separate endpoints as deprecated with console warnings
   - Maintained backward compatibility while providing a cleaner API moving forward
## Completed Items

✅ Removed obsolete legacy endpoints in `config.js`
✅ Removed MySQL syntax `/forecast` endpoint in `analytics.js`
✅ Fixed `GET /products/brands` endpoint filtering logic
✅ Created reusable database connection utility (`dbConnection.js`)
✅ Optimized database connections in `ai-validation.js`
✅ Verified data integrity in AI prompts handling
✅ Consolidated AI prompts endpoints with a unified `/by-type` endpoint
## Remaining Items

- Consider adding additional error handling and logging for database connections
- Perform load testing on the optimized database connections to ensure they handle high traffic properly
@@ -150,7 +150,7 @@ CREATE TABLE IF NOT EXISTS calculate_history (
 );

 CREATE TABLE IF NOT EXISTS calculate_status (
-    module_name module_name PRIMARY KEY,
+    module_name text PRIMARY KEY,
     last_calculation_timestamp TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP
 );
@@ -53,6 +53,28 @@ CREATE TABLE public.product_metrics (
     image_url VARCHAR, -- (e.g., products.image_175)
     is_visible BOOLEAN,
     is_replenishable BOOLEAN,
+
+    -- Additional product fields
+    barcode VARCHAR,
+    harmonized_tariff_code VARCHAR,
+    vendor_reference VARCHAR,
+    notions_reference VARCHAR,
+    line VARCHAR,
+    subline VARCHAR,
+    artist VARCHAR,
+    moq INT,
+    rating NUMERIC(10, 2),
+    reviews INT,
+    weight NUMERIC(14, 4),
+    length NUMERIC(14, 4),
+    width NUMERIC(14, 4),
+    height NUMERIC(14, 4),
+    country_of_origin VARCHAR,
+    location VARCHAR,
+    baskets INT,
+    notifies INT,
+    preorder_count INT,
+    notions_inv_count INT,

     -- Current Status (Refreshed Hourly)
     current_price NUMERIC(10, 2),
@@ -151,6 +173,9 @@ CREATE TABLE public.product_metrics (
     -- Yesterday's Metrics (Refreshed Hourly from daily_product_snapshots)
     yesterday_sales INT,

+    -- Product Status (Calculated from metrics)
+    status VARCHAR, -- Stores status values like: Critical, Reorder Soon, Healthy, Overstock, At Risk, New
+
     CONSTRAINT fk_product_metrics_pid FOREIGN KEY (pid) REFERENCES public.products(pid) ON DELETE CASCADE ON UPDATE CASCADE
 );
@@ -163,6 +188,7 @@ CREATE INDEX idx_product_metrics_revenue_30d ON public.product_metrics(revenue_3
 CREATE INDEX idx_product_metrics_sales_30d ON public.product_metrics(sales_30d DESC NULLS LAST); -- Example sorting index
 CREATE INDEX idx_product_metrics_current_stock ON public.product_metrics(current_stock);
 CREATE INDEX idx_product_metrics_sells_out_in_days ON public.product_metrics(sells_out_in_days ASC NULLS LAST); -- Example sorting index
+CREATE INDEX idx_product_metrics_status ON public.product_metrics(status); -- Index for status filtering
 -- Add new vendor, category, and brand metrics tables
 -- Drop tables in reverse order if they exist
@@ -178,6 +204,7 @@ CREATE TABLE public.category_metrics (
     parent_id INT8, -- Denormalized for convenience
     last_calculated TIMESTAMPTZ NOT NULL DEFAULT NOW(),

+    -- ROLLED-UP METRICS (includes this category + all descendants)
     -- Counts & Basic Info
     product_count INT NOT NULL DEFAULT 0, -- Total products linked
     active_product_count INT NOT NULL DEFAULT 0, -- Visible products linked
@@ -195,7 +222,24 @@ CREATE TABLE public.category_metrics (
     sales_365d INT NOT NULL DEFAULT 0, revenue_365d NUMERIC(16, 4) NOT NULL DEFAULT 0.00,
     lifetime_sales INT NOT NULL DEFAULT 0, lifetime_revenue NUMERIC(18, 4) NOT NULL DEFAULT 0.00,

-    -- Calculated KPIs (Based on 30d aggregates)
+    -- DIRECT METRICS (only products directly in this category)
+    direct_product_count INT NOT NULL DEFAULT 0, -- Products directly in this category
+    direct_active_product_count INT NOT NULL DEFAULT 0, -- Visible products directly in this category
+    direct_replenishable_product_count INT NOT NULL DEFAULT 0, -- Replenishable products directly in this category
+
+    -- Direct Current Stock Value
+    direct_current_stock_units INT NOT NULL DEFAULT 0,
+    direct_stock_cost NUMERIC(16, 4) NOT NULL DEFAULT 0.00,
+    direct_stock_retail NUMERIC(16, 4) NOT NULL DEFAULT 0.00,
+
+    -- Direct Rolling Period Aggregates
+    direct_sales_7d INT NOT NULL DEFAULT 0, direct_revenue_7d NUMERIC(16, 4) NOT NULL DEFAULT 0.00,
+    direct_sales_30d INT NOT NULL DEFAULT 0, direct_revenue_30d NUMERIC(16, 4) NOT NULL DEFAULT 0.00,
+    direct_profit_30d NUMERIC(16, 4) NOT NULL DEFAULT 0.00, direct_cogs_30d NUMERIC(16, 4) NOT NULL DEFAULT 0.00,
+    direct_sales_365d INT NOT NULL DEFAULT 0, direct_revenue_365d NUMERIC(16, 4) NOT NULL DEFAULT 0.00,
+    direct_lifetime_sales INT NOT NULL DEFAULT 0, direct_lifetime_revenue NUMERIC(18, 4) NOT NULL DEFAULT 0.00,
+
+    -- Calculated KPIs (Based on 30d aggregates) - Apply to rolled-up metrics
     avg_margin_30d NUMERIC(7, 3), -- (profit / revenue) * 100
     stock_turn_30d NUMERIC(10, 3), -- sales_units / avg_stock_units (Needs avg stock calc)
     -- growth_rate_30d NUMERIC(7, 3), -- (current 30d rev - prev 30d rev) / prev 30d rev
@@ -236,7 +280,7 @@ CREATE TABLE public.vendor_metrics (
     lifetime_sales INT NOT NULL DEFAULT 0, lifetime_revenue NUMERIC(18, 4) NOT NULL DEFAULT 0.00,

     -- Calculated KPIs (Based on 30d aggregates)
-    avg_margin_30d NUMERIC(7, 3) -- (profit / revenue) * 100
+    avg_margin_30d NUMERIC(14, 4) -- (profit / revenue) * 100
     -- Add more KPIs if needed (e.g., avg product value, sell-through rate for vendor)
 );
 CREATE INDEX idx_vendor_metrics_active_count ON public.vendor_metrics(active_product_count);
@@ -213,55 +213,55 @@ SET session_replication_role = 'origin'; -- Re-enable foreign key checks
 -- Create views for common calculations
 -- product_sales_trends view moved to metrics-schema.sql

--- Historical data tables imported from production
-CREATE TABLE imported_product_current_prices (
-    price_id BIGSERIAL PRIMARY KEY,
-    pid BIGINT NOT NULL,
-    qty_buy SMALLINT NOT NULL,
-    is_min_qty_buy BOOLEAN NOT NULL,
-    price_each NUMERIC(10,3) NOT NULL,
-    qty_limit SMALLINT NOT NULL,
-    no_promo BOOLEAN NOT NULL,
-    checkout_offer BOOLEAN NOT NULL,
-    active BOOLEAN NOT NULL,
-    date_active TIMESTAMP WITH TIME ZONE,
-    date_deactive TIMESTAMP WITH TIME ZONE,
-    updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP
-);
+-- -- Historical data tables imported from production
+-- CREATE TABLE imported_product_current_prices (
+--     price_id BIGSERIAL PRIMARY KEY,
+--     pid BIGINT NOT NULL,
+--     qty_buy SMALLINT NOT NULL,
+--     is_min_qty_buy BOOLEAN NOT NULL,
+--     price_each NUMERIC(10,3) NOT NULL,
+--     qty_limit SMALLINT NOT NULL,
+--     no_promo BOOLEAN NOT NULL,
+--     checkout_offer BOOLEAN NOT NULL,
+--     active BOOLEAN NOT NULL,
+--     date_active TIMESTAMP WITH TIME ZONE,
+--     date_deactive TIMESTAMP WITH TIME ZONE,
+--     updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP
+-- );

-CREATE INDEX idx_imported_product_current_prices_pid ON imported_product_current_prices(pid, active, qty_buy);
-CREATE INDEX idx_imported_product_current_prices_checkout ON imported_product_current_prices(checkout_offer, active);
-CREATE INDEX idx_imported_product_current_prices_deactive ON imported_product_current_prices(date_deactive, active);
-CREATE INDEX idx_imported_product_current_prices_active ON imported_product_current_prices(date_active, active);
+-- CREATE INDEX idx_imported_product_current_prices_pid ON imported_product_current_prices(pid, active, qty_buy);
+-- CREATE INDEX idx_imported_product_current_prices_checkout ON imported_product_current_prices(checkout_offer, active);
+-- CREATE INDEX idx_imported_product_current_prices_deactive ON imported_product_current_prices(date_deactive, active);
+-- CREATE INDEX idx_imported_product_current_prices_active ON imported_product_current_prices(date_active, active);
CREATE TABLE imported_daily_inventory (
|
||||
date DATE NOT NULL,
|
||||
pid BIGINT NOT NULL,
|
||||
amountsold SMALLINT NOT NULL DEFAULT 0,
|
||||
times_sold SMALLINT NOT NULL DEFAULT 0,
|
||||
qtyreceived SMALLINT NOT NULL DEFAULT 0,
|
||||
price NUMERIC(7,2) NOT NULL DEFAULT 0,
|
||||
costeach NUMERIC(7,2) NOT NULL DEFAULT 0,
|
||||
stamp TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||
updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||
PRIMARY KEY (date, pid)
|
||||
);
|
||||
-- CREATE TABLE imported_daily_inventory (
|
||||
-- date DATE NOT NULL,
|
||||
-- pid BIGINT NOT NULL,
|
||||
-- amountsold SMALLINT NOT NULL DEFAULT 0,
|
||||
-- times_sold SMALLINT NOT NULL DEFAULT 0,
|
||||
-- qtyreceived SMALLINT NOT NULL DEFAULT 0,
|
||||
-- price NUMERIC(7,2) NOT NULL DEFAULT 0,
|
||||
-- costeach NUMERIC(7,2) NOT NULL DEFAULT 0,
|
||||
-- stamp TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||
-- updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||
-- PRIMARY KEY (date, pid)
|
||||
-- );
|
||||
|
||||
CREATE INDEX idx_imported_daily_inventory_pid ON imported_daily_inventory(pid);
|
||||
-- CREATE INDEX idx_imported_daily_inventory_pid ON imported_daily_inventory(pid);
|
||||
|
||||
CREATE TABLE imported_product_stat_history (
|
||||
pid BIGINT NOT NULL,
|
||||
date DATE NOT NULL,
|
||||
score NUMERIC(10,2) NOT NULL,
|
||||
score2 NUMERIC(10,2) NOT NULL,
|
||||
qty_in_baskets SMALLINT NOT NULL,
|
||||
qty_sold SMALLINT NOT NULL,
|
||||
notifies_set SMALLINT NOT NULL,
|
||||
visibility_score NUMERIC(10,2) NOT NULL,
|
||||
health_score VARCHAR(5) NOT NULL,
|
||||
sold_view_score NUMERIC(6,3) NOT NULL,
|
||||
updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||
PRIMARY KEY (pid, date)
|
||||
);
|
||||
-- CREATE TABLE imported_product_stat_history (
|
||||
-- pid BIGINT NOT NULL,
|
||||
-- date DATE NOT NULL,
|
||||
-- score NUMERIC(10,2) NOT NULL,
|
||||
-- score2 NUMERIC(10,2) NOT NULL,
|
||||
-- qty_in_baskets SMALLINT NOT NULL,
|
||||
-- qty_sold SMALLINT NOT NULL,
|
||||
-- notifies_set SMALLINT NOT NULL,
|
||||
-- visibility_score NUMERIC(10,2) NOT NULL,
|
||||
-- health_score VARCHAR(5) NOT NULL,
|
||||
-- sold_view_score NUMERIC(6,3) NOT NULL,
|
||||
-- updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||
-- PRIMARY KEY (pid, date)
|
||||
-- );
|
||||
|
||||
CREATE INDEX idx_imported_product_stat_history_date ON imported_product_stat_history(date);
|
||||
-- CREATE INDEX idx_imported_product_stat_history_date ON imported_product_stat_history(date);
|
||||
@@ -1,7 +1,7 @@
const path = require('path');
const fs = require('fs');
const progress = require('../utils/progress'); // Assuming progress utils are here
const { getConnection, closePool } = require('../utils/db'); // Assuming db utils are here
const progress = require('../scripts/metrics-new/utils/progress'); // Assuming progress utils are here
const { getConnection, closePool } = require('../scripts/metrics-new/utils/db'); // Assuming db utils are here
const os = require('os'); // For detecting number of CPU cores

// --- Configuration ---
@@ -156,6 +156,7 @@ let currentStep = ''; // Track which step is running for cancellation message
let overallStartTime = null;
let mainTimeoutHandle = null;
let stepTimeoutHandle = null;
let combinedHistoryId = null; // ID for the combined history record

async function cancelCalculation(reason = 'cancelled by user') {
if (isCancelled) return; // Prevent multiple cancellations
@@ -181,6 +182,22 @@ async function cancelCalculation(reason = 'cancelled by user') {
AND pid <> pg_backend_pid(); -- Don't cancel self
`);
console.log(`Sent ${result.rowCount} cancellation signal(s).`);

// Update the combined history record to show cancellation
if (combinedHistoryId) {
const totalDuration = Math.round((Date.now() - overallStartTime) / 1000);
await conn.query(`
UPDATE calculate_history
SET
status = 'cancelled'::calculation_status,
end_time = NOW(),
duration_seconds = $1::integer,
error_message = $2::text
WHERE id = $3::integer;
`, [totalDuration, `Calculation ${reason} during step: ${currentStep}`, combinedHistoryId]);
console.log(`Updated combined history record ${combinedHistoryId} with cancellation status`);
}

conn.release();
} catch (err) {
console.error('Error during database query cancellation:', err.message);
@@ -349,7 +366,6 @@ async function executeSqlStep(config, progress) {
console.log(`\n--- Starting Step: ${config.name} ---`);
const stepStartTime = Date.now();
let connection = null;
let calculateHistoryId = null;

// Set timeout for this specific step
if (stepTimeoutHandle) clearTimeout(stepTimeoutHandle); // Clear previous step's timeout
@@ -383,10 +399,7 @@ async function executeSqlStep(config, progress) {
connection = await getConnection();
console.log("Database connection acquired.");

// 3. Clean up Previous Runs & Create History Record (within a transaction)
await connection.query('BEGIN');

// Ensure calculate_status table exists
// 3. Ensure calculate_status table exists
await connection.query(`
CREATE TABLE IF NOT EXISTS calculate_status (
module_name TEXT PRIMARY KEY,
@@ -394,41 +407,6 @@ async function executeSqlStep(config, progress) {
);
`);

// Ensure calculate_history table exists (basic structure)
await connection.query(`
CREATE TABLE IF NOT EXISTS calculate_history (
id SERIAL PRIMARY KEY,
start_time TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
end_time TIMESTAMP WITH TIME ZONE,
duration_seconds INTEGER,
status TEXT, -- 'running', 'completed', 'failed', 'cancelled'
error_message TEXT,
additional_info JSONB
);
`);

// Mark previous runs of this type as cancelled
await connection.query(`
UPDATE calculate_history
SET
status = 'cancelled',
end_time = NOW(),
duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
error_message = 'Previous calculation was not completed properly or was superseded.'
WHERE status = 'running' AND additional_info->>'type' = $1::text;
`, [config.historyType]);

// Create history record for this run
const historyResult = await connection.query(`
INSERT INTO calculate_history (status, additional_info)
VALUES ('running', jsonb_build_object('type', $1::text, 'sql_file', $2::text))
RETURNING id;
`, [config.historyType, config.sqlFile]);
calculateHistoryId = historyResult.rows[0].id;

await connection.query('COMMIT');
console.log(`Created history record ID: ${calculateHistoryId}`);

// 4. Initial Progress Update
progress.outputProgress({
status: 'running',
@@ -486,9 +464,7 @@ async function executeSqlStep(config, progress) {

console.log(`SQL execution finished for ${config.name}.`);

// 6. Update Status & History (within a transaction)
await connection.query('BEGIN');

// 6. Update Status table only
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ($1::text, NOW())
@@ -497,16 +473,6 @@ async function executeSqlStep(config, progress) {
`, [config.statusModule]);

const stepDuration = Math.round((Date.now() - stepStartTime) / 1000);
await connection.query(`
UPDATE calculate_history
SET
end_time = NOW(),
duration_seconds = $1::integer,
status = 'completed'
WHERE id = $2::integer;
`, [stepDuration, calculateHistoryId]);

await connection.query('COMMIT');

// 7. Final Progress Update for Step
progress.outputProgress({
@@ -540,33 +506,8 @@ async function executeSqlStep(config, progress) {
console.error(error); // Log the full error
console.error(`------------------------------------`);

// Update history with error/cancellation status
if (connection && calculateHistoryId) {
try {
// Use a separate transaction for error logging
await connection.query('ROLLBACK'); // Rollback any partial transaction from try block
await connection.query('BEGIN');
await connection.query(`
UPDATE calculate_history
SET
end_time = NOW(),
duration_seconds = $1::integer,
status = $2::text,
error_message = $3::text
WHERE id = $4::integer;
`, [errorDuration, finalStatus, errorMessage.substring(0, 1000), calculateHistoryId]); // Limit error message size
await connection.query('COMMIT');
console.log(`Updated history record ID ${calculateHistoryId} with status: ${finalStatus}`);
} catch (historyError) {
console.error("FATAL: Failed to update history record on error:", historyError);
// Cannot rollback here if already rolled back or commit failed
}
} else {
console.warn("Could not update history record on error (no connection or history ID).");
}

// Update progress file with error/cancellation
progress.outputProgress({
progress.outputProgress({
status: finalStatus,
operation: `Error in ${config.name}: ${errorMessage.split('\n')[0]}`, // Show first line of error
current: 50, total: 100, // Indicate partial completion
@@ -656,9 +597,80 @@ async function runAllCalculations() {
}
];

// Build a list of steps we will actually run
const stepsToRun = steps.filter(step => step.run);
const stepNames = stepsToRun.map(step => step.name);
const sqlFiles = stepsToRun.map(step => step.sqlFile);

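The step-selection logic in the diff above is a plain filter/map pass over a steps array. A standalone sketch (the step names and SQL file names here are hypothetical placeholders, not taken from the actual configuration):

```javascript
// Hypothetical steps array mirroring the shape the script filters on.
const steps = [
  { name: 'product_metrics', sqlFile: 'product-metrics.sql', run: true },
  { name: 'vendor_metrics', sqlFile: 'vendor-metrics.sql', run: false },
  { name: 'brand_metrics', sqlFile: 'brand-metrics.sql', run: true },
];

// Keep only the enabled steps, then project out the fields that are
// recorded in the combined history row.
const stepsToRun = steps.filter(step => step.run);
const stepNames = stepsToRun.map(step => step.name);
const sqlFiles = stepsToRun.map(step => step.sqlFile);

console.log(stepNames); // [ 'product_metrics', 'brand_metrics' ]
```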
let overallSuccess = true;
let connection = null;

try {
// Create a single history record before starting all calculations
try {
connection = await getConnection();

// Ensure calculate_history table exists (basic structure)
await connection.query(`
CREATE TABLE IF NOT EXISTS calculate_history (
id SERIAL PRIMARY KEY,
start_time TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
end_time TIMESTAMP WITH TIME ZONE,
duration_seconds INTEGER,
status TEXT, -- Will be altered to enum if needed below
error_message TEXT,
additional_info JSONB
);
`);

// Ensure the calculation_status enum type exists if needed
await connection.query(`
DO $$
BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_type WHERE typname = 'calculation_status') THEN
CREATE TYPE calculation_status AS ENUM ('running', 'completed', 'failed', 'cancelled');

-- If needed, alter the existing table to use the enum
ALTER TABLE calculate_history
ALTER COLUMN status TYPE calculation_status
USING status::calculation_status;
END IF;
END
$$;
`);

// Mark any previous running combined calculations as cancelled
await connection.query(`
UPDATE calculate_history
SET
status = 'cancelled'::calculation_status,
end_time = NOW(),
duration_seconds = EXTRACT(EPOCH FROM (NOW() - start_time))::INTEGER,
error_message = 'Previous calculation was not completed properly or was superseded.'
WHERE status = 'running'::calculation_status AND additional_info->>'type' = 'combined_metrics';
`);

// Create a single history record for this run
const historyResult = await connection.query(`
INSERT INTO calculate_history (status, additional_info)
VALUES ('running'::calculation_status, jsonb_build_object(
'type', 'combined_metrics',
'steps', $1::jsonb,
'sql_files', $2::jsonb
))
RETURNING id;
`, [JSON.stringify(stepNames), JSON.stringify(sqlFiles)]);

combinedHistoryId = historyResult.rows[0].id;
console.log(`Created combined history record ID: ${combinedHistoryId}`);

connection.release();
} catch (historyError) {
console.error('Error creating combined history record:', historyError);
if (connection) connection.release();
// Continue without history tracking if it fails
}

// First, sync the settings_product table to ensure all products have entries
progressUtils.outputProgress({
operation: 'Starting metrics calculation',
@@ -678,6 +690,9 @@ async function runAllCalculations() {
// Don't fail the entire process if settings sync fails
}

// Track completed steps
const completedSteps = [];

// Now run the calculation steps
for (const step of steps) {
if (step.run) {
@@ -686,8 +701,17 @@ async function runAllCalculations() {
overallSuccess = false; // Mark as not fully successful if steps are skipped due to cancel
continue; // Skip to next step
}

// Pass the progress utilities to the step executor
await executeSqlStep(step, progressUtils);
const result = await executeSqlStep(step, progressUtils);

if (result.success) {
completedSteps.push({
name: step.name,
duration: result.duration,
status: 'completed'
});
}
} else {
console.log(`Skipping step "${step.name}" (disabled by configuration).`);
}
@@ -696,6 +720,34 @@ async function runAllCalculations() {
// If we finished naturally (no errors thrown out)
clearTimeout(mainTimeoutHandle); // Clear the main timeout

// Update the combined history record on successful completion
if (combinedHistoryId) {
try {
connection = await getConnection();
const totalDuration = Math.round((Date.now() - overallStartTime) / 1000);

await connection.query(`
UPDATE calculate_history
SET
end_time = NOW(),
duration_seconds = $1::integer,
status = $2::calculation_status,
additional_info = additional_info || jsonb_build_object('completed_steps', $3::jsonb)
WHERE id = $4::integer;
`, [
totalDuration,
isCancelled ? 'cancelled' : 'completed',
JSON.stringify(completedSteps),
combinedHistoryId
]);

connection.release();
} catch (historyError) {
console.error('Error updating combined history record on completion:', historyError);
if (connection) connection.release();
}
}

if (isCancelled) {
console.log("\n--- Calculation finished with cancellation ---");
overallSuccess = false;
@@ -709,8 +761,34 @@ async function runAllCalculations() {
console.error("\n--- SCRIPT EXECUTION FAILED ---");
// Error details were already logged by executeSqlStep or global handlers
overallSuccess = false;
// Don't re-log the error here unless adding context
// console.error("Overall failure reason:", error.message);

// Update the combined history record on error
if (combinedHistoryId) {
try {
connection = await getConnection();
const totalDuration = Math.round((Date.now() - overallStartTime) / 1000);

await connection.query(`
UPDATE calculate_history
SET
end_time = NOW(),
duration_seconds = $1::integer,
status = $2::calculation_status,
error_message = $3::text
WHERE id = $4::integer;
`, [
totalDuration,
isCancelled ? 'cancelled' : 'failed',
error.message.substring(0, 1000),
combinedHistoryId
]);

connection.release();
} catch (historyError) {
console.error('Error updating combined history record on error:', historyError);
if (connection) connection.release();
}
}
} finally {
await closePool();
console.log(`Total execution time: ${progressUtils.formatElapsedTime(overallStartTime)}`);

@@ -38,7 +38,7 @@ const sshConfig = {
password: process.env.PROD_DB_PASSWORD,
database: process.env.PROD_DB_NAME,
port: process.env.PROD_DB_PORT || 3306,
timezone: 'Z',
timezone: '-05:00', // Production DB always stores times in EST (UTC-5) regardless of DST
},
localDbConfig: {
// PostgreSQL config for local

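The `timezone: 'Z'` to `timezone: '-05:00'` change above matters because the same wall-clock string maps to different UTC instants depending on the offset it is tagged with. A minimal illustration (the timestamp value is made up; the 5-hour difference is the point):

```javascript
// A wall-clock timestamp as it might be read from the production DB,
// which stores EST (UTC-5) year-round with no DST shifts.
const stored = '2024-06-01 12:00:00';

// Interpreted as UTC (the old 'Z' behavior) vs. as fixed UTC-5 (the new
// '-05:00' setting). Only the latter recovers the correct UTC instant.
const asUtc = new Date(stored.replace(' ', 'T') + 'Z');
const asEstFixed = new Date(stored.replace(' ', 'T') + '-05:00');

console.log(asEstFixed.getTime() - asUtc.getTime()); // 18000000 (5 hours in ms)
```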
@@ -26,10 +26,7 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
let cumulativeProcessedOrders = 0;

try {
// Begin transaction
await localConnection.beginTransaction();

// Get last sync info
// Get last sync info - NOT in a transaction anymore
const [syncInfo] = await localConnection.query(
"SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'orders'"
);
@@ -43,8 +40,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
FROM order_items oi
JOIN _order o ON oi.order_id = o.order_id
WHERE o.order_status >= 15
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
AND o.date_placed_onlydate IS NOT NULL
AND o.date_placed >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
AND o.date_placed IS NOT NULL
${incrementalUpdate ? `
AND (
o.stamp > ?
@@ -82,8 +79,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
FROM order_items oi
JOIN _order o ON oi.order_id = o.order_id
WHERE o.order_status >= 15
AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
AND o.date_placed_onlydate IS NOT NULL
AND o.date_placed >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
AND o.date_placed IS NOT NULL
${incrementalUpdate ? `
AND (
o.stamp > ?
@@ -107,91 +104,131 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
console.log('Orders: Found', orderItems.length, 'order items to process');

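The `${incrementalUpdate ? '1' : '5'}` interpolation in the queries above selects the lookback window: incremental runs fetch one year of orders, full runs five. Isolated as a tiny sketch (`orderWindowSql` is a hypothetical helper name; the clause is abbreviated):

```javascript
// Builds the date-window clause used by the order queries above.
// Incremental updates look back 1 year; full imports look back 5 years.
function orderWindowSql(incrementalUpdate) {
  return `AND o.date_placed >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)`;
}

console.log(orderWindowSql(true));  // ... INTERVAL 1 YEAR ...
console.log(orderWindowSql(false)); // ... INTERVAL 5 YEAR ...
```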
// Create tables in PostgreSQL for data processing
await localConnection.query(`
DROP TABLE IF EXISTS temp_order_items;
DROP TABLE IF EXISTS temp_order_meta;
DROP TABLE IF EXISTS temp_order_discounts;
DROP TABLE IF EXISTS temp_order_taxes;
DROP TABLE IF EXISTS temp_order_costs;

CREATE TEMP TABLE temp_order_items (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
sku TEXT NOT NULL,
price NUMERIC(14, 4) NOT NULL,
quantity INTEGER NOT NULL,
base_discount NUMERIC(14, 4) DEFAULT 0,
PRIMARY KEY (order_id, pid)
);

CREATE TEMP TABLE temp_order_meta (
order_id INTEGER NOT NULL,
date TIMESTAMP WITH TIME ZONE NOT NULL,
customer TEXT NOT NULL,
customer_name TEXT NOT NULL,
status TEXT,
canceled BOOLEAN,
summary_discount NUMERIC(14, 4) DEFAULT 0.0000,
summary_subtotal NUMERIC(14, 4) DEFAULT 0.0000,
PRIMARY KEY (order_id)
);

CREATE TEMP TABLE temp_order_discounts (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
discount NUMERIC(14, 4) NOT NULL,
PRIMARY KEY (order_id, pid)
);

CREATE TEMP TABLE temp_order_taxes (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
tax NUMERIC(14, 4) NOT NULL,
PRIMARY KEY (order_id, pid)
);

CREATE TEMP TABLE temp_order_costs (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
costeach NUMERIC(14, 4) DEFAULT 0.0000,
PRIMARY KEY (order_id, pid)
);

CREATE INDEX idx_temp_order_items_pid ON temp_order_items(pid);
CREATE INDEX idx_temp_order_meta_order_id ON temp_order_meta(order_id);
`);

// Insert order items in batches
for (let i = 0; i < orderItems.length; i += 5000) {
const batch = orderItems.slice(i, Math.min(i + 5000, orderItems.length));
const placeholders = batch.map((_, idx) =>
`($${idx * 6 + 1}, $${idx * 6 + 2}, $${idx * 6 + 3}, $${idx * 6 + 4}, $${idx * 6 + 5}, $${idx * 6 + 6})`
).join(",");
const values = batch.flatMap(item => [
item.order_id, item.prod_pid, item.SKU, item.price, item.quantity, item.base_discount
]);

// Start a transaction just for creating the temp tables
await localConnection.beginTransaction();
try {
await localConnection.query(`
INSERT INTO temp_order_items (order_id, pid, sku, price, quantity, base_discount)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
sku = EXCLUDED.sku,
price = EXCLUDED.price,
quantity = EXCLUDED.quantity,
base_discount = EXCLUDED.base_discount
`, values);
DROP TABLE IF EXISTS temp_order_items;
DROP TABLE IF EXISTS temp_order_meta;
DROP TABLE IF EXISTS temp_order_discounts;
DROP TABLE IF EXISTS temp_order_taxes;
DROP TABLE IF EXISTS temp_order_costs;
DROP TABLE IF EXISTS temp_main_discounts;
DROP TABLE IF EXISTS temp_item_discounts;

processedCount = i + batch.length;
outputProgress({
status: "running",
operation: "Orders import",
message: `Loading order items: ${processedCount} of ${totalOrderItems}`,
current: processedCount,
total: totalOrderItems,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
remaining: estimateRemaining(startTime, processedCount, totalOrderItems),
rate: calculateRate(startTime, processedCount)
});
CREATE TEMP TABLE temp_order_items (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
sku TEXT NOT NULL,
price NUMERIC(14, 4) NOT NULL,
quantity INTEGER NOT NULL,
base_discount NUMERIC(14, 4) DEFAULT 0,
PRIMARY KEY (order_id, pid)
);

CREATE TEMP TABLE temp_order_meta (
order_id INTEGER NOT NULL,
date TIMESTAMP WITH TIME ZONE NOT NULL,
customer TEXT NOT NULL,
customer_name TEXT NOT NULL,
status TEXT,
canceled BOOLEAN,
summary_discount NUMERIC(14, 4) DEFAULT 0.0000,
summary_subtotal NUMERIC(14, 4) DEFAULT 0.0000,
summary_discount_subtotal NUMERIC(14, 4) DEFAULT 0.0000,
PRIMARY KEY (order_id)
);

CREATE TEMP TABLE temp_order_discounts (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
discount NUMERIC(14, 4) NOT NULL,
PRIMARY KEY (order_id, pid)
);

CREATE TEMP TABLE temp_main_discounts (
order_id INTEGER NOT NULL,
discount_id INTEGER NOT NULL,
discount_amount_subtotal NUMERIC(14, 4) DEFAULT 0.0000,
PRIMARY KEY (order_id, discount_id)
);

CREATE TEMP TABLE temp_item_discounts (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
discount_id INTEGER NOT NULL,
amount NUMERIC(14, 4) NOT NULL,
PRIMARY KEY (order_id, pid, discount_id)
);

CREATE TEMP TABLE temp_order_taxes (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
tax NUMERIC(14, 4) NOT NULL,
PRIMARY KEY (order_id, pid)
);

CREATE TEMP TABLE temp_order_costs (
order_id INTEGER NOT NULL,
pid INTEGER NOT NULL,
costeach NUMERIC(14, 4) DEFAULT 0.0000,
PRIMARY KEY (order_id, pid)
);

CREATE INDEX idx_temp_order_items_pid ON temp_order_items(pid);
CREATE INDEX idx_temp_order_meta_order_id ON temp_order_meta(order_id);
CREATE INDEX idx_temp_order_discounts_order_pid ON temp_order_discounts(order_id, pid);
CREATE INDEX idx_temp_order_taxes_order_pid ON temp_order_taxes(order_id, pid);
CREATE INDEX idx_temp_order_costs_order_pid ON temp_order_costs(order_id, pid);
CREATE INDEX idx_temp_main_discounts_discount_id ON temp_main_discounts(discount_id);
CREATE INDEX idx_temp_item_discounts_order_pid ON temp_item_discounts(order_id, pid);
CREATE INDEX idx_temp_item_discounts_discount_id ON temp_item_discounts(discount_id);
`);
await localConnection.commit();
} catch (error) {
await localConnection.rollback();
throw error;
}

// Insert order items in batches - each batch gets its own transaction
for (let i = 0; i < orderItems.length; i += 5000) {
await localConnection.beginTransaction();
try {
const batch = orderItems.slice(i, Math.min(i + 5000, orderItems.length));
const placeholders = batch.map((_, idx) =>
`($${idx * 6 + 1}, $${idx * 6 + 2}, $${idx * 6 + 3}, $${idx * 6 + 4}, $${idx * 6 + 5}, $${idx * 6 + 6})`
).join(",");
const values = batch.flatMap(item => [
item.order_id, item.prod_pid, item.SKU, item.price, item.quantity, item.base_discount
]);

await localConnection.query(`
INSERT INTO temp_order_items (order_id, pid, sku, price, quantity, base_discount)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
sku = EXCLUDED.sku,
price = EXCLUDED.price,
quantity = EXCLUDED.quantity,
base_discount = EXCLUDED.base_discount
`, values);

await localConnection.commit();

processedCount = i + batch.length;
outputProgress({
status: "running",
operation: "Orders import",
message: `Loading order items: ${processedCount} of ${totalOrderItems}`,
current: processedCount,
total: totalOrderItems,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
remaining: estimateRemaining(startTime, processedCount, totalOrderItems),
rate: calculateRate(startTime, processedCount)
});
} catch (error) {
await localConnection.rollback();
throw error;
}
}

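The `$${idx * 6 + n}` expressions in the batch loop above generate numbered PostgreSQL placeholders for a multi-row VALUES list, paired with a flat values array. Factored into a generic standalone helper (`buildPlaceholders` is a hypothetical name, not from the script):

```javascript
// For N rows of K columns, produce "($1, $2, ...),($K+1, ...)" and the
// matching flat values array, as the batched INSERTs above do inline.
function buildPlaceholders(rows, columns) {
  const placeholders = rows.map((_, idx) =>
    '(' + columns.map((_, c) => `$${idx * columns.length + c + 1}`).join(', ') + ')'
  ).join(',');
  const values = rows.flatMap(row => columns.map(col => row[col]));
  return { placeholders, values };
}

const { placeholders, values } = buildPlaceholders(
  [{ order_id: 1, pid: 10 }, { order_id: 2, pid: 20 }],
  ['order_id', 'pid']
);
console.log(placeholders); // ($1, $2),($3, $4)
console.log(values);       // [ 1, 10, 2, 20 ]
```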
// Get unique order IDs
@@ -218,86 +255,162 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
const [orders] = await prodConnection.query(`
SELECT
o.order_id,
o.date_placed_onlydate as date,
o.date_placed as date,
o.order_cid as customer,
CONCAT(COALESCE(u.firstname, ''), ' ', COALESCE(u.lastname, '')) as customer_name,
o.order_status as status,
CASE WHEN o.date_cancelled != '0000-00-00 00:00:00' THEN 1 ELSE 0 END as canceled,
o.summary_discount,
o.summary_subtotal
o.summary_subtotal,
o.summary_discount_subtotal
FROM _order o
LEFT JOIN users u ON o.order_cid = u.cid
WHERE o.order_id IN (?)
`, [batchIds]);

// Process in sub-batches for PostgreSQL
for (let j = 0; j < orders.length; j += PG_BATCH_SIZE) {
const subBatch = orders.slice(j, j + PG_BATCH_SIZE);
if (subBatch.length === 0) continue;
await localConnection.beginTransaction();
try {
for (let j = 0; j < orders.length; j += PG_BATCH_SIZE) {
const subBatch = orders.slice(j, j + PG_BATCH_SIZE);
if (subBatch.length === 0) continue;

const placeholders = subBatch.map((_, idx) =>
`($${idx * 8 + 1}, $${idx * 8 + 2}, $${idx * 8 + 3}, $${idx * 8 + 4}, $${idx * 8 + 5}, $${idx * 8 + 6}, $${idx * 8 + 7}, $${idx * 8 + 8})`
).join(",");

const values = subBatch.flatMap(order => [
order.order_id,
new Date(order.date), // Convert to TIMESTAMP WITH TIME ZONE
order.customer,
toTitleCase(order.customer_name) || '',
order.status.toString(), // Convert status to TEXT
order.canceled,
order.summary_discount || 0,
order.summary_subtotal || 0
]);
const placeholders = subBatch.map((_, idx) =>
`($${idx * 9 + 1}, $${idx * 9 + 2}, $${idx * 9 + 3}, $${idx * 9 + 4}, $${idx * 9 + 5}, $${idx * 9 + 6}, $${idx * 9 + 7}, $${idx * 9 + 8}, $${idx * 9 + 9})`
).join(",");

const values = subBatch.flatMap(order => [
order.order_id,
new Date(order.date), // Convert to TIMESTAMP WITH TIME ZONE
order.customer,
toTitleCase(order.customer_name) || '',
order.status.toString(), // Convert status to TEXT
order.canceled,
order.summary_discount || 0,
order.summary_subtotal || 0,
order.summary_discount_subtotal || 0
]);

await localConnection.query(`
INSERT INTO temp_order_meta (
order_id, date, customer, customer_name, status, canceled,
summary_discount, summary_subtotal
)
VALUES ${placeholders}
ON CONFLICT (order_id) DO UPDATE SET
date = EXCLUDED.date,
customer = EXCLUDED.customer,
customer_name = EXCLUDED.customer_name,
status = EXCLUDED.status,
canceled = EXCLUDED.canceled,
summary_discount = EXCLUDED.summary_discount,
summary_subtotal = EXCLUDED.summary_subtotal
`, values);
await localConnection.query(`
INSERT INTO temp_order_meta (
order_id, date, customer, customer_name, status, canceled,
summary_discount, summary_subtotal, summary_discount_subtotal
)
VALUES ${placeholders}
ON CONFLICT (order_id) DO UPDATE SET
date = EXCLUDED.date,
customer = EXCLUDED.customer,
customer_name = EXCLUDED.customer_name,
status = EXCLUDED.status,
canceled = EXCLUDED.canceled,
summary_discount = EXCLUDED.summary_discount,
summary_subtotal = EXCLUDED.summary_subtotal,
summary_discount_subtotal = EXCLUDED.summary_discount_subtotal
`, values);
}
await localConnection.commit();
} catch (error) {
await localConnection.rollback();
throw error;
}
};
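The begin/try/commit/catch/rollback shape that recurs throughout these batch functions can be factored into a small wrapper. A minimal sketch with a stub connection standing in for the mysql2-style `localConnection` (`withTransaction` is a hypothetical helper name, not part of the script):

```javascript
// Runs `work` inside a transaction: commit on success, rollback and
// rethrow on failure, mirroring the per-batch pattern above.
async function withTransaction(connection, work) {
  await connection.beginTransaction();
  try {
    const result = await work();
    await connection.commit();
    return result;
  } catch (error) {
    await connection.rollback();
    throw error;
  }
}

// Stub connection that records the calls it receives, for illustration.
const calls = [];
const stubConnection = {
  beginTransaction: async () => calls.push('begin'),
  commit: async () => calls.push('commit'),
  rollback: async () => calls.push('rollback'),
};

withTransaction(stubConnection, async () => calls.push('work'))
  .then(() => console.log(calls.join(','))); // begin,work,commit
```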
|
||||
const processDiscountsBatch = async (batchIds) => {
|
||||
// First, load main discount records
|
||||
const [mainDiscounts] = await prodConnection.query(`
|
||||
SELECT order_id, discount_id, discount_amount_subtotal
|
||||
FROM order_discounts
|
||||
WHERE order_id IN (?)
|
||||
`, [batchIds]);
|
||||
|
||||
if (mainDiscounts.length > 0) {
|
||||
await localConnection.beginTransaction();
|
||||
try {
|
||||
for (let j = 0; j < mainDiscounts.length; j += PG_BATCH_SIZE) {
|
||||
const subBatch = mainDiscounts.slice(j, j + PG_BATCH_SIZE);
|
||||
if (subBatch.length === 0) continue;
|
||||
|
||||
const placeholders = subBatch.map((_, idx) =>
|
||||
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
|
||||
).join(",");
|
||||
|
||||
const values = subBatch.flatMap(d => [
|
||||
d.order_id,
|
||||
d.discount_id,
|
||||
d.discount_amount_subtotal || 0
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO temp_main_discounts (order_id, discount_id, discount_amount_subtotal)
|
||||
VALUES ${placeholders}
|
||||
ON CONFLICT (order_id, discount_id) DO UPDATE SET
|
||||
discount_amount_subtotal = EXCLUDED.discount_amount_subtotal
|
||||
`, values);
|
||||
}
|
||||
await localConnection.commit();
|
||||
} catch (error) {
|
||||
await localConnection.rollback();
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
// Then, load item discount records
|
||||
const [discounts] = await prodConnection.query(`
|
||||
SELECT order_id, pid, SUM(amount) as discount
|
||||
SELECT order_id, pid, discount_id, amount
|
||||
FROM order_discount_items
|
||||
WHERE order_id IN (?)
|
||||
GROUP BY order_id, pid
|
||||
`, [batchIds]);

if (discounts.length === 0) return;

for (let j = 0; j < discounts.length; j += PG_BATCH_SIZE) {
const subBatch = discounts.slice(j, j + PG_BATCH_SIZE);
if (subBatch.length === 0) continue;
// Process in memory to handle potential duplicates
const discountMap = new Map();
for (const d of discounts) {
const key = `${d.order_id}-${d.pid}-${d.discount_id}`;
discountMap.set(key, d);
}

const placeholders = subBatch.map((_, idx) =>
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
).join(",");

const values = subBatch.flatMap(d => [
d.order_id,
d.pid,
d.discount || 0
]);
const uniqueDiscounts = Array.from(discountMap.values());

await localConnection.beginTransaction();
try {
for (let j = 0; j < uniqueDiscounts.length; j += PG_BATCH_SIZE) {
const subBatch = uniqueDiscounts.slice(j, j + PG_BATCH_SIZE);
if (subBatch.length === 0) continue;

const placeholders = subBatch.map((_, idx) =>
`($${idx * 4 + 1}, $${idx * 4 + 2}, $${idx * 4 + 3}, $${idx * 4 + 4})`
).join(",");

const values = subBatch.flatMap(d => [
d.order_id,
d.pid,
d.discount_id,
d.amount || 0
]);

await localConnection.query(`
INSERT INTO temp_item_discounts (order_id, pid, discount_id, amount)
VALUES ${placeholders}
ON CONFLICT (order_id, pid, discount_id) DO UPDATE SET
amount = EXCLUDED.amount
`, values);
}

// Create aggregated view with a simpler, safer query that avoids duplicates
await localConnection.query(`
TRUNCATE temp_order_discounts;

INSERT INTO temp_order_discounts (order_id, pid, discount)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
discount = EXCLUDED.discount
`, values);
SELECT order_id, pid, SUM(amount) as discount
FROM temp_item_discounts
GROUP BY order_id, pid
`);

await localConnection.commit();
} catch (error) {
await localConnection.rollback();
throw error;
}
};
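The discount batch above deduplicates rows in memory (a `Map` keyed on the composite key) before the upsert, because a single multi-row `INSERT ... ON CONFLICT DO UPDATE` fails in PostgreSQL when the same conflict key appears twice in one statement ("ON CONFLICT DO UPDATE command cannot affect row a second time"). A standalone sketch of that pattern (helper name and signature are hypothetical):

```javascript
// Collapse duplicate rows by composite key so one multi-row upsert never
// targets the same (order_id, pid, discount_id) twice. Later rows win,
// matching the Map.set semantics used in the import script.
function dedupeByKey(rows, keyFields) {
  const map = new Map();
  for (const row of rows) {
    map.set(keyFields.map(f => row[f]).join("-"), row); // last write wins
  }
  return Array.from(map.values());
}
```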

@@ -318,26 +431,33 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =

if (taxes.length === 0) return;

for (let j = 0; j < taxes.length; j += PG_BATCH_SIZE) {
const subBatch = taxes.slice(j, j + PG_BATCH_SIZE);
if (subBatch.length === 0) continue;
await localConnection.beginTransaction();
try {
for (let j = 0; j < taxes.length; j += PG_BATCH_SIZE) {
const subBatch = taxes.slice(j, j + PG_BATCH_SIZE);
if (subBatch.length === 0) continue;

const placeholders = subBatch.map((_, idx) =>
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
).join(",");

const values = subBatch.flatMap(t => [
t.order_id,
t.pid,
t.tax || 0
]);
const placeholders = subBatch.map((_, idx) =>
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
).join(",");

const values = subBatch.flatMap(t => [
t.order_id,
t.pid,
t.tax || 0
]);

await localConnection.query(`
INSERT INTO temp_order_taxes (order_id, pid, tax)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
tax = EXCLUDED.tax
`, values);
await localConnection.query(`
INSERT INTO temp_order_taxes (order_id, pid, tax)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
tax = EXCLUDED.tax
`, values);
}
await localConnection.commit();
} catch (error) {
await localConnection.rollback();
throw error;
}
};

@@ -363,39 +483,45 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =

if (costs.length === 0) return;

for (let j = 0; j < costs.length; j += PG_BATCH_SIZE) {
const subBatch = costs.slice(j, j + PG_BATCH_SIZE);
if (subBatch.length === 0) continue;
await localConnection.beginTransaction();
try {
for (let j = 0; j < costs.length; j += PG_BATCH_SIZE) {
const subBatch = costs.slice(j, j + PG_BATCH_SIZE);
if (subBatch.length === 0) continue;

const placeholders = subBatch.map((_, idx) =>
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
).join(",");

const values = subBatch.flatMap(c => [
c.order_id,
c.pid,
c.costeach || 0
]);
const placeholders = subBatch.map((_, idx) =>
`($${idx * 3 + 1}, $${idx * 3 + 2}, $${idx * 3 + 3})`
).join(",");

const values = subBatch.flatMap(c => [
c.order_id,
c.pid,
c.costeach || 0
]);

await localConnection.query(`
INSERT INTO temp_order_costs (order_id, pid, costeach)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
costeach = EXCLUDED.costeach
`, values);
await localConnection.query(`
INSERT INTO temp_order_costs (order_id, pid, costeach)
VALUES ${placeholders}
ON CONFLICT (order_id, pid) DO UPDATE SET
costeach = EXCLUDED.costeach
`, values);
}
await localConnection.commit();
} catch (error) {
await localConnection.rollback();
throw error;
}
};

// Process all data types in parallel for each batch
// Process all data types SEQUENTIALLY for each batch - not in parallel
for (let i = 0; i < orderIds.length; i += METADATA_BATCH_SIZE) {
const batchIds = orderIds.slice(i, i + METADATA_BATCH_SIZE);

await Promise.all([
processMetadataBatch(batchIds),
processDiscountsBatch(batchIds),
processTaxesBatch(batchIds),
processCostsBatch(batchIds)
]);
// Run these sequentially instead of in parallel to avoid transaction conflicts
await processMetadataBatch(batchIds);
await processDiscountsBatch(batchIds);
await processTaxesBatch(batchIds);
await processCostsBatch(batchIds);
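The switch from `Promise.all` to sequential `await`s reflects a constraint of sharing one database connection: each batch helper opens its own transaction, and interleaving `BEGIN`/`COMMIT` from concurrent helpers on a single connection conflicts. A minimal sketch of the sequential-runner idea (the helper is illustrative, not from the script):

```javascript
// Run async task factories one at a time, in order. Unlike Promise.all,
// each task fully completes (including its COMMIT) before the next starts,
// so transactions on a shared connection never overlap.
async function runSequentially(tasks) {
  const results = [];
  for (const task of tasks) {
    results.push(await task()); // next task starts only after this resolves
  }
  return results;
}
```

The trade-off is throughput: `Promise.all` overlaps waiting time, while sequential execution trades speed for transactional safety on one connection.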

processedCount = i + batchIds.length;
outputProgress({
@@ -422,175 +548,201 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
const existingPids = new Set(existingProducts.rows.map(p => p.pid));

// Process in smaller batches
for (let i = 0; i < orderIds.length; i += 1000) {
const batchIds = orderIds.slice(i, i + 1000);
for (let i = 0; i < orderIds.length; i += 2000) { // Increased from 1000 to 2000
const batchIds = orderIds.slice(i, i + 2000);

// Get combined data for this batch in sub-batches
const PG_BATCH_SIZE = 100; // Process 100 records at a time
const PG_BATCH_SIZE = 200; // Increased from 100 to 200
for (let j = 0; j < batchIds.length; j += PG_BATCH_SIZE) {
const subBatchIds = batchIds.slice(j, j + PG_BATCH_SIZE);

const [orders] = await localConnection.query(`
WITH order_totals AS (
SELECT
oi.order_id,
oi.pid,
SUM(COALESCE(od.discount, 0)) as promo_discount,
COALESCE(ot.tax, 0) as total_tax,
COALESCE(oc.costeach, oi.price * 0.5) as costeach
FROM temp_order_items oi
LEFT JOIN temp_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
LEFT JOIN temp_order_costs oc ON oi.order_id = oc.order_id AND oi.pid = oc.pid
GROUP BY oi.order_id, oi.pid, ot.tax, oc.costeach
)
SELECT
oi.order_id as order_number,
oi.pid::bigint as pid,
oi.sku,
om.date,
oi.price,
oi.quantity,
(oi.base_discount +
COALESCE(ot.promo_discount, 0) +
CASE
WHEN om.summary_discount > 0 AND om.summary_subtotal > 0 THEN
ROUND((om.summary_discount * (oi.price * oi.quantity)) / NULLIF(om.summary_subtotal, 0), 2)
ELSE 0
END)::NUMERIC(14, 4) as discount,
COALESCE(ot.total_tax, 0)::NUMERIC(14, 4) as tax,
false as tax_included,
0 as shipping,
om.customer,
om.customer_name,
om.status,
om.canceled,
COALESCE(ot.costeach, oi.price * 0.5)::NUMERIC(14, 4) as costeach
FROM (
SELECT DISTINCT ON (order_id, pid)
order_id, pid, sku, price, quantity, base_discount
FROM temp_order_items
WHERE order_id = ANY($1)
ORDER BY order_id, pid
) oi
JOIN temp_order_meta om ON oi.order_id = om.order_id
LEFT JOIN order_totals ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
ORDER BY oi.order_id, oi.pid
`, [subBatchIds]);

// Filter orders and track missing products
const validOrders = [];
const processedOrderItems = new Set();
const processedOrders = new Set();

for (const order of orders.rows) {
if (!existingPids.has(order.pid)) {
missingProducts.add(order.pid);
skippedOrders.add(order.order_number);
continue;
}
validOrders.push(order);
processedOrderItems.add(`${order.order_number}-${order.pid}`);
processedOrders.add(order.order_number);
}

// Process valid orders in smaller sub-batches
const FINAL_BATCH_SIZE = 50;
for (let k = 0; k < validOrders.length; k += FINAL_BATCH_SIZE) {
const subBatch = validOrders.slice(k, k + FINAL_BATCH_SIZE);

const placeholders = subBatch.map((_, idx) => {
const base = idx * 15; // 15 columns including costeach
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
}).join(',');

const batchValues = subBatch.flatMap(o => [
o.order_number,
o.pid,
o.sku || 'NO-SKU',
o.date, // This is now a TIMESTAMP WITH TIME ZONE
o.price,
o.quantity,
o.discount,
o.tax,
o.tax_included,
o.shipping,
o.customer,
o.customer_name,
o.status.toString(), // Convert status to TEXT
o.canceled,
o.costeach
]);

const [result] = await localConnection.query(`
WITH inserted_orders AS (
INSERT INTO orders (
order_number, pid, sku, date, price, quantity, discount,
tax, tax_included, shipping, customer, customer_name,
status, canceled, costeach
)
VALUES ${placeholders}
ON CONFLICT (order_number, pid) DO UPDATE SET
sku = EXCLUDED.sku,
date = EXCLUDED.date,
price = EXCLUDED.price,
quantity = EXCLUDED.quantity,
discount = EXCLUDED.discount,
tax = EXCLUDED.tax,
tax_included = EXCLUDED.tax_included,
shipping = EXCLUDED.shipping,
customer = EXCLUDED.customer,
customer_name = EXCLUDED.customer_name,
status = EXCLUDED.status,
canceled = EXCLUDED.canceled,
costeach = EXCLUDED.costeach
RETURNING xmax = 0 as inserted
// Start a transaction for this sub-batch
await localConnection.beginTransaction();
try {
const [orders] = await localConnection.query(`
WITH order_totals AS (
SELECT
oi.order_id,
oi.pid,
-- Instead of using ARRAY_AGG which can cause duplicate issues, use SUM with a CASE
SUM(CASE
WHEN COALESCE(md.discount_amount_subtotal, 0) > 0 THEN id.amount
ELSE 0
END) as promo_discount_sum,
COALESCE(ot.tax, 0) as total_tax,
COALESCE(oc.costeach, oi.price * 0.5) as costeach
FROM temp_order_items oi
LEFT JOIN temp_item_discounts id ON oi.order_id = id.order_id AND oi.pid = id.pid
LEFT JOIN temp_main_discounts md ON id.order_id = md.order_id AND id.discount_id = md.discount_id
LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
LEFT JOIN temp_order_costs oc ON oi.order_id = oc.order_id AND oi.pid = oc.pid
WHERE oi.order_id = ANY($1)
GROUP BY oi.order_id, oi.pid, ot.tax, oc.costeach
)
SELECT
COUNT(*) FILTER (WHERE inserted) as inserted,
COUNT(*) FILTER (WHERE NOT inserted) as updated
FROM inserted_orders
`, batchValues);

const { inserted, updated } = result.rows[0];
recordsAdded += parseInt(inserted) || 0;
recordsUpdated += parseInt(updated) || 0;
importedCount += subBatch.length;
}
oi.order_id as order_number,
oi.pid::bigint as pid,
oi.sku,
om.date,
oi.price,
oi.quantity,
(
-- Part 1: Sale Savings for the Line
(oi.base_discount * oi.quantity)
+
-- Part 2: Prorated Points Discount (if applicable)
CASE
WHEN om.summary_discount_subtotal > 0 AND om.summary_subtotal > 0 THEN
COALESCE(ROUND((om.summary_discount_subtotal * (oi.price * oi.quantity)) / NULLIF(om.summary_subtotal, 0), 4), 0)
ELSE 0
END
+
-- Part 3: Specific Item-Level Discount (only if parent discount affected subtotal)
COALESCE(ot.promo_discount_sum, 0)
)::NUMERIC(14, 4) as discount,
COALESCE(ot.total_tax, 0)::NUMERIC(14, 4) as tax,
false as tax_included,
0 as shipping,
om.customer,
om.customer_name,
om.status,
om.canceled,
COALESCE(ot.costeach, oi.price * 0.5)::NUMERIC(14, 4) as costeach
FROM temp_order_items oi
JOIN temp_order_meta om ON oi.order_id = om.order_id
LEFT JOIN order_totals ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
WHERE oi.order_id = ANY($1)
ORDER BY oi.order_id, oi.pid
`, [subBatchIds]);

cumulativeProcessedOrders += processedOrders.size;
outputProgress({
status: "running",
operation: "Orders import",
message: `Importing orders: ${cumulativeProcessedOrders} of ${totalUniqueOrders}`,
current: cumulativeProcessedOrders,
total: totalUniqueOrders,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
remaining: estimateRemaining(startTime, cumulativeProcessedOrders, totalUniqueOrders),
rate: calculateRate(startTime, cumulativeProcessedOrders)
});
// Filter orders and track missing products
const validOrders = [];
const processedOrderItems = new Set();
const processedOrders = new Set();

for (const order of orders.rows) {
if (!existingPids.has(order.pid)) {
missingProducts.add(order.pid);
skippedOrders.add(order.order_number);
continue;
}
validOrders.push(order);
processedOrderItems.add(`${order.order_number}-${order.pid}`);
processedOrders.add(order.order_number);
}

// Process valid orders in smaller sub-batches
const FINAL_BATCH_SIZE = 100; // Increased from 50 to 100
for (let k = 0; k < validOrders.length; k += FINAL_BATCH_SIZE) {
const subBatch = validOrders.slice(k, k + FINAL_BATCH_SIZE);

const placeholders = subBatch.map((_, idx) => {
const base = idx * 15; // 15 columns including costeach
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
}).join(',');

const batchValues = subBatch.flatMap(o => [
o.order_number,
o.pid,
o.sku || 'NO-SKU',
o.date, // This is now a TIMESTAMP WITH TIME ZONE
o.price,
o.quantity,
o.discount,
o.tax,
o.tax_included,
o.shipping,
o.customer,
o.customer_name,
o.status.toString(), // Convert status to TEXT
o.canceled,
o.costeach
]);

const [result] = await localConnection.query(`
WITH inserted_orders AS (
INSERT INTO orders (
order_number, pid, sku, date, price, quantity, discount,
tax, tax_included, shipping, customer, customer_name,
status, canceled, costeach
)
VALUES ${placeholders}
ON CONFLICT (order_number, pid) DO UPDATE SET
sku = EXCLUDED.sku,
date = EXCLUDED.date,
price = EXCLUDED.price,
quantity = EXCLUDED.quantity,
discount = EXCLUDED.discount,
tax = EXCLUDED.tax,
tax_included = EXCLUDED.tax_included,
shipping = EXCLUDED.shipping,
customer = EXCLUDED.customer,
customer_name = EXCLUDED.customer_name,
status = EXCLUDED.status,
canceled = EXCLUDED.canceled,
costeach = EXCLUDED.costeach
RETURNING xmax = 0 as inserted
)
SELECT
COUNT(*) FILTER (WHERE inserted) as inserted,
COUNT(*) FILTER (WHERE NOT inserted) as updated
FROM inserted_orders
`, batchValues);

const { inserted, updated } = result.rows[0];
recordsAdded += parseInt(inserted) || 0;
recordsUpdated += parseInt(updated) || 0;
importedCount += subBatch.length;
}

await localConnection.commit();

cumulativeProcessedOrders += processedOrders.size;
outputProgress({
status: "running",
operation: "Orders import",
message: `Importing orders: ${cumulativeProcessedOrders} of ${totalUniqueOrders}`,
current: cumulativeProcessedOrders,
total: totalUniqueOrders,
elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
remaining: estimateRemaining(startTime, cumulativeProcessedOrders, totalUniqueOrders),
rate: calculateRate(startTime, cumulativeProcessedOrders)
});
} catch (error) {
await localConnection.rollback();
throw error;
}
}
}

// Update sync status
await localConnection.query(`
INSERT INTO sync_status (table_name, last_sync_timestamp)
VALUES ('orders', NOW())
ON CONFLICT (table_name) DO UPDATE SET
last_sync_timestamp = NOW()
`);

// Cleanup temporary tables
await localConnection.query(`
DROP TABLE IF EXISTS temp_order_items;
DROP TABLE IF EXISTS temp_order_meta;
DROP TABLE IF EXISTS temp_order_discounts;
DROP TABLE IF EXISTS temp_order_taxes;
DROP TABLE IF EXISTS temp_order_costs;
`);

// Commit transaction
await localConnection.commit();
// Start a transaction for updating sync status and dropping temp tables
await localConnection.beginTransaction();
try {
// Update sync status
await localConnection.query(`
INSERT INTO sync_status (table_name, last_sync_timestamp)
VALUES ('orders', NOW())
ON CONFLICT (table_name) DO UPDATE SET
last_sync_timestamp = NOW()
`);

// Cleanup temporary tables
await localConnection.query(`
DROP TABLE IF EXISTS temp_order_items;
DROP TABLE IF EXISTS temp_order_meta;
DROP TABLE IF EXISTS temp_order_discounts;
DROP TABLE IF EXISTS temp_order_taxes;
DROP TABLE IF EXISTS temp_order_costs;
DROP TABLE IF EXISTS temp_main_discounts;
DROP TABLE IF EXISTS temp_item_discounts;
`);

// Commit final transaction
await localConnection.commit();
} catch (error) {
await localConnection.rollback();
throw error;
}

return {
status: "complete",
@@ -604,16 +756,8 @@ async function importOrders(prodConnection, localConnection, incrementalUpdate =
};
} catch (error) {
console.error("Error during orders import:", error);

// Rollback transaction
try {
await localConnection.rollback();
} catch (rollbackError) {
console.error("Error during rollback:", rollbackError);
}

throw error;
}
}

module.exports = importOrders;
module.exports = importOrders;
@@ -98,6 +98,7 @@ async function setupTemporaryTables(connection) {
baskets INTEGER,
notifies INTEGER,
date_last_sold TIMESTAMP WITH TIME ZONE,
primary_iid INTEGER,
image TEXT,
image_175 TEXT,
image_full TEXT,
@@ -193,8 +194,12 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
p.country_of_origin,
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
(SELECT COALESCE(SUM(oi.qty_ordered), 0)
FROM order_items oi
JOIN _order o ON oi.order_id = o.order_id
WHERE oi.prod_pid = p.pid AND o.order_status >= 20) AS total_sold,
pls.date_sold as date_last_sold,
(SELECT iid FROM product_images WHERE pid = p.pid AND \`order\` = 255 LIMIT 1) AS primary_iid,
GROUP_CONCAT(DISTINCT CASE
WHEN pc.cat_id IS NOT NULL
AND pc.type IN (10, 20, 11, 21, 12, 13)
@@ -233,12 +238,12 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
const batch = prodData.slice(i, i + BATCH_SIZE);

const placeholders = batch.map((_, idx) => {
const base = idx * 47; // 47 columns
return `(${Array.from({ length: 47 }, (_, i) => `$${base + i + 1}`).join(', ')})`;
const base = idx * 48; // 48 columns
return `(${Array.from({ length: 48 }, (_, i) => `$${base + i + 1}`).join(', ')})`;
}).join(',');

const values = batch.flatMap(row => {
const imageUrls = getImageUrls(row.pid);
const imageUrls = getImageUrls(row.pid, row.primary_iid || 1);
return [
row.pid,
row.title,
@@ -282,6 +287,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
row.baskets,
row.notifies,
validateDate(row.date_last_sold),
row.primary_iid,
imageUrls.image,
imageUrls.image_175,
imageUrls.image_full,
@@ -299,7 +305,7 @@ async function importMissingProducts(prodConnection, localConnection, missingPid
landing_cost_price, barcode, harmonized_tariff_code, updated_at, visible,
managing_stock, replenishable, permalink, moq, uom, rating, reviews,
weight, length, width, height, country_of_origin, location, total_sold,
baskets, notifies, date_last_sold, image, image_175, image_full, options, tags
baskets, notifies, date_last_sold, primary_iid, image, image_175, image_full, options, tags
)
VALUES ${placeholders}
ON CONFLICT (pid) DO NOTHING
@@ -394,8 +400,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen
p.country_of_origin,
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
(SELECT COALESCE(SUM(oi.qty_ordered), 0) FROM order_items oi WHERE oi.prod_pid = p.pid) AS total_sold,
(SELECT COALESCE(SUM(oi.qty_ordered), 0)
FROM order_items oi
JOIN _order o ON oi.order_id = o.order_id
WHERE oi.prod_pid = p.pid AND o.order_status >= 20) AS total_sold,
pls.date_sold as date_last_sold,
(SELECT iid FROM product_images WHERE pid = p.pid AND \`order\` = 255 LIMIT 1) AS primary_iid,
GROUP_CONCAT(DISTINCT CASE
WHEN pc.cat_id IS NOT NULL
AND pc.type IN (10, 20, 11, 21, 12, 13)
@@ -422,9 +432,11 @@ async function materializeCalculations(prodConnection, localConnection, incremen
pcp.date_deactive > ? OR
pcp.date_active > ? OR
pnb.date_updated > ?
-- Add condition for product_images changes if needed for incremental updates
-- OR EXISTS (SELECT 1 FROM product_images pi WHERE pi.pid = p.pid AND pi.stamp > ?)
` : 'TRUE'}
GROUP BY p.pid
`, incrementalUpdate ? [lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime] : []);
`, incrementalUpdate ? [lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime /*, lastSyncTime */] : []);

outputProgress({
status: "running",
@@ -438,12 +450,12 @@ async function materializeCalculations(prodConnection, localConnection, incremen

await withRetry(async () => {
const placeholders = batch.map((_, idx) => {
const base = idx * 47; // 47 columns
return `(${Array.from({ length: 47 }, (_, i) => `$${base + i + 1}`).join(', ')})`;
const base = idx * 48; // 48 columns
return `(${Array.from({ length: 48 }, (_, i) => `$${base + i + 1}`).join(', ')})`;
}).join(',');

const values = batch.flatMap(row => {
const imageUrls = getImageUrls(row.pid);
const imageUrls = getImageUrls(row.pid, row.primary_iid || 1);
return [
row.pid,
row.title,
@@ -487,6 +499,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
row.baskets,
row.notifies,
validateDate(row.date_last_sold),
row.primary_iid,
imageUrls.image,
imageUrls.image_175,
imageUrls.image_full,
@@ -503,7 +516,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
landing_cost_price, barcode, harmonized_tariff_code, updated_at, visible,
managing_stock, replenishable, permalink, moq, uom, rating, reviews,
weight, length, width, height, country_of_origin, location, total_sold,
baskets, notifies, date_last_sold, image, image_175, image_full, options, tags
baskets, notifies, date_last_sold, primary_iid, image, image_175, image_full, options, tags
) VALUES ${placeholders}
ON CONFLICT (pid) DO UPDATE SET
title = EXCLUDED.title,
@@ -546,6 +559,7 @@ async function materializeCalculations(prodConnection, localConnection, incremen
baskets = EXCLUDED.baskets,
notifies = EXCLUDED.notifies,
date_last_sold = EXCLUDED.date_last_sold,
primary_iid = EXCLUDED.primary_iid,
image = EXCLUDED.image,
image_175 = EXCLUDED.image_175,
image_full = EXCLUDED.image_full,
@@ -644,6 +658,7 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
t.baskets,
t.notifies,
t.date_last_sold,
t.primary_iid,
t.image,
t.image_175,
t.image_full,
@@ -666,7 +681,7 @@ async function importProducts(prodConnection, localConnection, incrementalUpdate
}).join(',');

const values = batch.flatMap(row => {
const imageUrls = getImageUrls(row.pid);
const imageUrls = getImageUrls(row.pid, row.primary_iid || 1);
return [
row.pid,
row.title,

@@ -31,7 +31,8 @@ BEGIN
p.stock_quantity as current_stock, -- Use actual current stock for forecast base
p.created_at, p.first_received, p.date_last_sold,
p.moq,
p.uom
p.uom,
p.total_sold as historical_total_sold -- Add historical total_sold from products table
FROM public.products p
),
OnOrderInfo AS (
@@ -99,9 +100,30 @@ BEGIN
AVG(CASE WHEN snapshot_date BETWEEN _calculation_date - INTERVAL '29 days' AND _calculation_date THEN eod_stock_retail END) AS avg_stock_retail_30d,
AVG(CASE WHEN snapshot_date BETWEEN _calculation_date - INTERVAL '29 days' AND _calculation_date THEN eod_stock_gross END) AS avg_stock_gross_30d,

-- Lifetime (Sum over ALL available snapshots up to calculation date)
SUM(units_sold) AS lifetime_sales,
SUM(net_revenue) AS lifetime_revenue,
-- Lifetime (Using historical total from products table)
(SELECT total_sold FROM public.products WHERE public.products.pid = daily_product_snapshots.pid) AS lifetime_sales,
COALESCE(
-- Option 1: Use 30-day average price if available
CASE WHEN SUM(CASE WHEN snapshot_date >= _calculation_date - INTERVAL '29 days' AND snapshot_date <= _calculation_date THEN units_sold ELSE 0 END) > 0 THEN
(SELECT total_sold FROM public.products WHERE public.products.pid = daily_product_snapshots.pid) * (
SUM(CASE WHEN snapshot_date >= _calculation_date - INTERVAL '29 days' AND snapshot_date <= _calculation_date THEN net_revenue ELSE 0 END) /
NULLIF(SUM(CASE WHEN snapshot_date >= _calculation_date - INTERVAL '29 days' AND snapshot_date <= _calculation_date THEN units_sold ELSE 0 END), 0)
)
ELSE NULL END,
-- Option 2: Try 365-day average price if available
CASE WHEN SUM(CASE WHEN snapshot_date >= _calculation_date - INTERVAL '364 days' AND snapshot_date <= _calculation_date THEN units_sold ELSE 0 END) > 0 THEN
(SELECT total_sold FROM public.products WHERE public.products.pid = daily_product_snapshots.pid) * (
SUM(CASE WHEN snapshot_date >= _calculation_date - INTERVAL '364 days' AND snapshot_date <= _calculation_date THEN net_revenue ELSE 0 END) /
NULLIF(SUM(CASE WHEN snapshot_date >= _calculation_date - INTERVAL '364 days' AND snapshot_date <= _calculation_date THEN units_sold ELSE 0 END), 0)
)
ELSE NULL END,
-- Option 3: Use current price from products table
(SELECT total_sold * price FROM public.products WHERE public.products.pid = daily_product_snapshots.pid),
-- Option 4: Use regular price if current price might be zero
(SELECT total_sold * regular_price FROM public.products WHERE public.products.pid = daily_product_snapshots.pid),
-- Final fallback: Use accumulated revenue (less accurate for old products)
SUM(net_revenue)
) AS lifetime_revenue,

-- Yesterday (Sales for the specific _calculation_date)
SUM(CASE WHEN snapshot_date = _calculation_date THEN units_sold ELSE 0 END) as yesterday_sales
|
||||
|
||||
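The COALESCE chain above walks four estimates of lifetime revenue in priority order and takes the first one that is usable. A minimal Python sketch of the same fallback order, assuming hypothetical pre-computed inputs (these names are illustrative, not the production schema):

```python
def estimate_lifetime_revenue(total_sold, avg_price_30d, avg_price_365d,
                              current_price, regular_price, accumulated_revenue):
    """Mirror the SQL COALESCE fallback: first non-None candidate wins.

    avg_price_30d / avg_price_365d stand in for the windowed
    net_revenue / units_sold ratios; None means "no sales in window".
    """
    candidates = [
        total_sold * avg_price_30d if avg_price_30d else None,    # Option 1
        total_sold * avg_price_365d if avg_price_365d else None,  # Option 2
        total_sold * current_price if current_price is not None else None,  # Option 3
        total_sold * regular_price if regular_price is not None else None,  # Option 4
        accumulated_revenue,                                       # Final fallback
    ]
    return next(c for c in candidates if c is not None)
```

Like the SQL, the windowed averages take priority because they reflect the price the product actually sold at recently, while the product-table prices are only snapshots of the current listing.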
@@ -0,0 +1,200 @@
-- Description: Rebuilds daily product snapshots from scratch using real orders data.
-- Fixes issues with duplicated/inflated metrics.
-- Dependencies: Core import tables (products, orders, purchase_orders).
-- Frequency: One-time run to clear out problematic data.

DO $$
DECLARE
    _module_name TEXT := 'rebuild_daily_snapshots';
    _start_time TIMESTAMPTZ := clock_timestamp();
    _date DATE;
    _count INT;
    _total_records INT := 0;
    _begin_date DATE := (SELECT MIN(date)::date FROM orders WHERE date >= '2024-01-01'); -- Starting point for data rebuild
    _end_date DATE := CURRENT_DATE;
BEGIN
    RAISE NOTICE 'Beginning daily snapshots rebuild from % to %. Starting at %', _begin_date, _end_date, _start_time;

    -- First truncate the existing snapshots to ensure a clean slate
    TRUNCATE TABLE public.daily_product_snapshots;
    RAISE NOTICE 'Cleared existing snapshot data';

    -- Now rebuild the snapshots day by day
    _date := _begin_date;

    WHILE _date <= _end_date LOOP
        RAISE NOTICE 'Processing date %...', _date;

        -- Create snapshots for this date
        WITH SalesData AS (
            SELECT
                p.pid,
                p.sku,
                -- Count orders to ensure we only include products with real activity
                COUNT(o.id) as order_count,
                -- Aggregate Sales (Quantity > 0, Status not Canceled/Returned)
                COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.quantity ELSE 0 END), 0) AS units_sold,
                COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.price * o.quantity ELSE 0 END), 0.00) AS gross_revenue_unadjusted,
                COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.discount ELSE 0 END), 0.00) AS discounts,
                COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN COALESCE(o.costeach, p.landing_cost_price, p.cost_price) * o.quantity ELSE 0 END), 0.00) AS cogs,
                COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN p.regular_price * o.quantity ELSE 0 END), 0.00) AS gross_regular_revenue,

                -- Aggregate Returns (Quantity < 0 or Status = Returned)
                COALESCE(SUM(CASE WHEN o.quantity < 0 OR COALESCE(o.status, 'pending') = 'returned' THEN ABS(o.quantity) ELSE 0 END), 0) AS units_returned,
                COALESCE(SUM(CASE WHEN o.quantity < 0 OR COALESCE(o.status, 'pending') = 'returned' THEN o.price * ABS(o.quantity) ELSE 0 END), 0.00) AS returns_revenue
            FROM public.products p
            LEFT JOIN public.orders o
                ON p.pid = o.pid
                AND o.date::date = _date
            GROUP BY p.pid, p.sku
            HAVING COUNT(o.id) > 0 -- Only include products with actual orders for this date
        ),
        ReceivingData AS (
            SELECT
                po.pid,
                -- Count POs to ensure we only include products with real activity
                COUNT(po.po_id) as po_count,
                -- Calculate received quantity for this day
                COALESCE(
                    -- First try the received field from purchase_orders table (if received on this date)
                    SUM(CASE WHEN po.date::date = _date THEN po.received ELSE 0 END),

                    -- Otherwise try receiving_history JSON
                    SUM(
                        CASE
                            WHEN (rh.item->>'date')::date = _date THEN (rh.item->>'qty')::numeric
                            WHEN (rh.item->>'received_at')::date = _date THEN (rh.item->>'qty')::numeric
                            WHEN (rh.item->>'receipt_date')::date = _date THEN (rh.item->>'qty')::numeric
                            ELSE 0
                        END
                    ),
                    0
                ) AS units_received,

                COALESCE(
                    -- First try the actual cost_price from purchase_orders
                    SUM(CASE WHEN po.date::date = _date THEN po.received * po.cost_price ELSE 0 END),

                    -- Otherwise try receiving_history JSON
                    SUM(
                        CASE
                            WHEN (rh.item->>'date')::date = _date THEN (rh.item->>'qty')::numeric
                            WHEN (rh.item->>'received_at')::date = _date THEN (rh.item->>'qty')::numeric
                            WHEN (rh.item->>'receipt_date')::date = _date THEN (rh.item->>'qty')::numeric
                            ELSE 0
                        END
                        * COALESCE((rh.item->>'cost')::numeric, po.cost_price)
                    ),
                    0.00
                ) AS cost_received
            FROM public.purchase_orders po
            LEFT JOIN LATERAL jsonb_array_elements(po.receiving_history) AS rh(item) ON
                jsonb_typeof(po.receiving_history) = 'array' AND
                jsonb_array_length(po.receiving_history) > 0 AND
                (
                    (rh.item->>'date')::date = _date OR
                    (rh.item->>'received_at')::date = _date OR
                    (rh.item->>'receipt_date')::date = _date
                )
            -- Include POs with the current date or relevant receiving_history
            WHERE
                po.date::date = _date OR
                jsonb_typeof(po.receiving_history) = 'array' AND
                jsonb_array_length(po.receiving_history) > 0
            GROUP BY po.pid
            HAVING COUNT(po.po_id) > 0 OR SUM(
                CASE
                    WHEN (rh.item->>'date')::date = _date THEN (rh.item->>'qty')::numeric
                    WHEN (rh.item->>'received_at')::date = _date THEN (rh.item->>'qty')::numeric
                    WHEN (rh.item->>'receipt_date')::date = _date THEN (rh.item->>'qty')::numeric
                    ELSE 0
                END
            ) > 0
        ),
        -- Get stock quantities for the day - note this is approximate since we're using current products data
        StockData AS (
            SELECT
                p.pid,
                p.stock_quantity,
                COALESCE(p.landing_cost_price, p.cost_price, 0.00) as effective_cost_price,
                COALESCE(p.price, 0.00) as current_price,
                COALESCE(p.regular_price, 0.00) as current_regular_price
            FROM public.products p
        )
        INSERT INTO public.daily_product_snapshots (
            snapshot_date,
            pid,
            sku,
            eod_stock_quantity,
            eod_stock_cost,
            eod_stock_retail,
            eod_stock_gross,
            stockout_flag,
            units_sold,
            units_returned,
            gross_revenue,
            discounts,
            returns_revenue,
            net_revenue,
            cogs,
            gross_regular_revenue,
            profit,
            units_received,
            cost_received,
            calculation_timestamp
        )
        SELECT
            _date AS snapshot_date,
            COALESCE(sd.pid, rd.pid) AS pid,
            sd.sku,
            -- Use current stock as approximation, since historical stock data may not be available
            s.stock_quantity AS eod_stock_quantity,
            s.stock_quantity * s.effective_cost_price AS eod_stock_cost,
            s.stock_quantity * s.current_price AS eod_stock_retail,
            s.stock_quantity * s.current_regular_price AS eod_stock_gross,
            (s.stock_quantity <= 0) AS stockout_flag,
            -- Sales metrics
            COALESCE(sd.units_sold, 0),
            COALESCE(sd.units_returned, 0),
            COALESCE(sd.gross_revenue_unadjusted, 0.00),
            COALESCE(sd.discounts, 0.00),
            COALESCE(sd.returns_revenue, 0.00),
            COALESCE(sd.gross_revenue_unadjusted, 0.00) - COALESCE(sd.discounts, 0.00) AS net_revenue,
            COALESCE(sd.cogs, 0.00),
            COALESCE(sd.gross_regular_revenue, 0.00),
            (COALESCE(sd.gross_revenue_unadjusted, 0.00) - COALESCE(sd.discounts, 0.00)) - COALESCE(sd.cogs, 0.00) AS profit,
            -- Receiving metrics
            COALESCE(rd.units_received, 0),
            COALESCE(rd.cost_received, 0.00),
            _start_time
        FROM SalesData sd
        FULL OUTER JOIN ReceivingData rd ON sd.pid = rd.pid
        LEFT JOIN StockData s ON COALESCE(sd.pid, rd.pid) = s.pid
        WHERE (COALESCE(sd.order_count, 0) > 0 OR COALESCE(rd.po_count, 0) > 0);

        -- Get record count for this day
        GET DIAGNOSTICS _count = ROW_COUNT;
        _total_records := _total_records + _count;

        RAISE NOTICE 'Added % snapshot records for date %', _count, _date;

        -- Move to next day
        _date := _date + INTERVAL '1 day';
    END LOOP;

    RAISE NOTICE 'Rebuilding daily snapshots complete. Added % total records across % days.', _total_records, (_end_date - _begin_date)::integer + 1;

    -- Update the status table for daily_snapshots
    INSERT INTO public.calculate_status (module_name, last_calculation_timestamp)
    VALUES ('daily_snapshots', _start_time)
    ON CONFLICT (module_name) DO UPDATE SET last_calculation_timestamp = _start_time;

    -- Now update product_metrics based on the rebuilt snapshots
    RAISE NOTICE 'Triggering update of product_metrics table...';

    -- Call the update_product_metrics procedure directly
    -- Your system might use a different method to trigger this update
    PERFORM pg_notify('recalculate_metrics', 'product_metrics');

    RAISE NOTICE 'Rebuild complete. Duration: %', clock_timestamp() - _start_time;
END $$;
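The receiving_history handling in the rebuild script probes three possible date keys per JSON entry (`date`, `received_at`, `receipt_date`); an entry counts toward a day when the first present key matches that day. A small Python sketch of that per-entry matching, with an assumed entry shape (this is illustrative, not the production parser):

```python
from datetime import date

DATE_KEYS = ("date", "received_at", "receipt_date")  # keys the SQL CASE probes, in order

def _matches(item, target):
    """True if the first date key present on the entry equals target."""
    for key in DATE_KEYS:
        val = item.get(key)
        if val is not None and date.fromisoformat(val) == target:
            return True
    return False

def units_received_on(entries, target):
    """Sum 'qty' across receiving-history entries that match the target day."""
    return sum(float(i.get("qty", 0)) for i in entries or [] if _matches(i, target))
```

Tolerating several key spellings this way papers over inconsistent import formats, at the cost of silently ignoring entries that carry none of the three keys.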
@@ -6,6 +6,7 @@ DO $$
DECLARE
    _module_name VARCHAR := 'brand_metrics';
    _start_time TIMESTAMPTZ := clock_timestamp();
    _min_revenue NUMERIC := 50.00; -- Minimum revenue threshold for margin calculation
BEGIN
    RAISE NOTICE 'Running % calculation...', _module_name;

@@ -19,14 +20,26 @@ BEGIN
        SUM(pm.current_stock) AS current_stock_units,
        SUM(pm.current_stock_cost) AS current_stock_cost,
        SUM(pm.current_stock_retail) AS current_stock_retail,
        SUM(pm.sales_7d) AS sales_7d, SUM(pm.revenue_7d) AS revenue_7d,
        SUM(pm.sales_30d) AS sales_30d, SUM(pm.revenue_30d) AS revenue_30d,
        SUM(pm.profit_30d) AS profit_30d, SUM(pm.cogs_30d) AS cogs_30d,
        SUM(pm.sales_365d) AS sales_365d, SUM(pm.revenue_365d) AS revenue_365d,
        SUM(pm.lifetime_sales) AS lifetime_sales, SUM(pm.lifetime_revenue) AS lifetime_revenue
        -- Only include products with valid sales data in each time period
        COUNT(DISTINCT CASE WHEN pm.sales_7d > 0 THEN pm.pid END) AS products_with_sales_7d,
        SUM(CASE WHEN pm.sales_7d > 0 THEN pm.sales_7d ELSE 0 END) AS sales_7d,
        SUM(CASE WHEN pm.revenue_7d > 0 THEN pm.revenue_7d ELSE 0 END) AS revenue_7d,

        COUNT(DISTINCT CASE WHEN pm.sales_30d > 0 THEN pm.pid END) AS products_with_sales_30d,
        SUM(CASE WHEN pm.sales_30d > 0 THEN pm.sales_30d ELSE 0 END) AS sales_30d,
        SUM(CASE WHEN pm.revenue_30d > 0 THEN pm.revenue_30d ELSE 0 END) AS revenue_30d,
        SUM(CASE WHEN pm.cogs_30d > 0 THEN pm.cogs_30d ELSE 0 END) AS cogs_30d,
        SUM(CASE WHEN pm.profit_30d != 0 THEN pm.profit_30d ELSE 0 END) AS profit_30d,

        COUNT(DISTINCT CASE WHEN pm.sales_365d > 0 THEN pm.pid END) AS products_with_sales_365d,
        SUM(CASE WHEN pm.sales_365d > 0 THEN pm.sales_365d ELSE 0 END) AS sales_365d,
        SUM(CASE WHEN pm.revenue_365d > 0 THEN pm.revenue_365d ELSE 0 END) AS revenue_365d,

        COUNT(DISTINCT CASE WHEN pm.lifetime_sales > 0 THEN pm.pid END) AS products_with_lifetime_sales,
        SUM(CASE WHEN pm.lifetime_sales > 0 THEN pm.lifetime_sales ELSE 0 END) AS lifetime_sales,
        SUM(CASE WHEN pm.lifetime_revenue > 0 THEN pm.lifetime_revenue ELSE 0 END) AS lifetime_revenue
    FROM public.product_metrics pm
    JOIN public.products p ON pm.pid = p.pid
    -- WHERE p.visible = true -- Optional: filter only visible products for brand metrics?
    GROUP BY brand_group
),
AllBrands AS (
@@ -58,8 +71,14 @@ BEGIN
    COALESCE(ba.profit_30d, 0.00), COALESCE(ba.cogs_30d, 0.00),
    COALESCE(ba.sales_365d, 0), COALESCE(ba.revenue_365d, 0.00),
    COALESCE(ba.lifetime_sales, 0), COALESCE(ba.lifetime_revenue, 0.00),
    -- KPIs
    (ba.profit_30d / NULLIF(ba.revenue_30d, 0)) * 100.0
    -- KPIs - Calculate margin only for brands with significant revenue
    CASE
        WHEN COALESCE(ba.revenue_30d, 0) >= _min_revenue THEN
            -- Directly calculate margin from revenue and cogs for consistency
            -- This is mathematically equivalent to profit/revenue but more explicit
            ((COALESCE(ba.revenue_30d, 0) - COALESCE(ba.cogs_30d, 0)) / COALESCE(ba.revenue_30d, 1)) * 100.0
        ELSE NULL -- No margin for low/no revenue brands
    END
    FROM AllBrands b
    LEFT JOIN BrandAggregates ba ON b.brand_group = ba.brand_group

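The gated margin rule introduced above (return NULL below a revenue floor rather than a noisy percentage) can be sketched as a small function; the `50.0` default mirrors the script's `_min_revenue`, and the names are illustrative:

```python
def margin_30d(revenue, cogs, min_revenue=50.0):
    """Return 30-day margin %% only when revenue clears the threshold, else None.

    Mirrors the SQL CASE: brands with trivial revenue get NULL so a single
    small sale cannot produce a misleading 100%% (or -500%%) margin.
    """
    revenue = revenue or 0.0
    if revenue < min_revenue:
        return None  # low/no revenue: margin would be noise
    return (revenue - (cogs or 0.0)) / revenue * 100.0
```

Returning `None` instead of `0` matters downstream: dashboards can render "n/a" and averages over brands can skip the value rather than drag the mean toward zero.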
@@ -6,12 +6,49 @@ DO $$
DECLARE
    _module_name VARCHAR := 'category_metrics';
    _start_time TIMESTAMPTZ := clock_timestamp();
    _min_revenue NUMERIC := 50.00; -- Minimum revenue threshold for margin calculation
BEGIN
    RAISE NOTICE 'Running % calculation...', _module_name;

    WITH CategoryAggregates AS (
    WITH
    -- Identify the hierarchy depth for each category
    CategoryDepth AS (
        WITH RECURSIVE CategoryTree AS (
            -- Base case: Start with categories without parents (root categories)
            SELECT cat_id, name, parent_id, 0 AS depth
            FROM public.categories
            WHERE parent_id IS NULL

            UNION ALL

            -- Recursive step: Add child categories with incremented depth
            SELECT c.cat_id, c.name, c.parent_id, ct.depth + 1
            FROM public.categories c
            JOIN CategoryTree ct ON c.parent_id = ct.cat_id
        )
        SELECT cat_id, depth
        FROM CategoryTree
    ),
    -- For each product, find the most specific (deepest) category it belongs to
    ProductDeepestCategory AS (
        SELECT
            pc.pid,
            pc.cat_id
        FROM public.product_categories pc
        JOIN CategoryDepth cd ON pc.cat_id = cd.cat_id
        -- This is the key part: for each product, select only the category with maximum depth
        WHERE (pc.pid, cd.depth) IN (
            SELECT pc2.pid, MAX(cd2.depth)
            FROM public.product_categories pc2
            JOIN CategoryDepth cd2 ON pc2.cat_id = cd2.cat_id
            GROUP BY pc2.pid
        )
    ),
    -- Calculate metrics only at the most specific category level for each product
    -- These are the direct metrics (only products directly in this category)
    DirectCategoryMetrics AS (
        SELECT
            pc.cat_id,
            pdc.cat_id,
            -- Counts
            COUNT(DISTINCT pm.pid) AS product_count,
            COUNT(DISTINCT CASE WHEN pm.is_visible THEN pm.pid END) AS active_product_count,
@@ -20,57 +57,188 @@ BEGIN
            SUM(pm.current_stock) AS current_stock_units,
            SUM(pm.current_stock_cost) AS current_stock_cost,
            SUM(pm.current_stock_retail) AS current_stock_retail,
            -- Rolling Periods (Sum directly from product_metrics)
            SUM(pm.sales_7d) AS sales_7d, SUM(pm.revenue_7d) AS revenue_7d,
            SUM(pm.sales_30d) AS sales_30d, SUM(pm.revenue_30d) AS revenue_30d,
            SUM(pm.profit_30d) AS profit_30d, SUM(pm.cogs_30d) AS cogs_30d,
            SUM(pm.sales_365d) AS sales_365d, SUM(pm.revenue_365d) AS revenue_365d,
            SUM(pm.lifetime_sales) AS lifetime_sales, SUM(pm.lifetime_revenue) AS lifetime_revenue,
            -- Data for KPIs
            SUM(pm.avg_stock_units_30d) AS total_avg_stock_units_30d -- Sum of averages (use cautiously)
            -- Rolling Periods - Only include products with actual sales in each period
            SUM(CASE WHEN pm.sales_7d > 0 THEN pm.sales_7d ELSE 0 END) AS sales_7d,
            SUM(CASE WHEN pm.revenue_7d > 0 THEN pm.revenue_7d ELSE 0 END) AS revenue_7d,
            SUM(CASE WHEN pm.sales_30d > 0 THEN pm.sales_30d ELSE 0 END) AS sales_30d,
            SUM(CASE WHEN pm.revenue_30d > 0 THEN pm.revenue_30d ELSE 0 END) AS revenue_30d,
            SUM(CASE WHEN pm.cogs_30d > 0 THEN pm.cogs_30d ELSE 0 END) AS cogs_30d,
            SUM(CASE WHEN pm.profit_30d != 0 THEN pm.profit_30d ELSE 0 END) AS profit_30d,
            SUM(CASE WHEN pm.sales_365d > 0 THEN pm.sales_365d ELSE 0 END) AS sales_365d,
            SUM(CASE WHEN pm.revenue_365d > 0 THEN pm.revenue_365d ELSE 0 END) AS revenue_365d,
            SUM(CASE WHEN pm.lifetime_sales > 0 THEN pm.lifetime_sales ELSE 0 END) AS lifetime_sales,
            SUM(CASE WHEN pm.lifetime_revenue > 0 THEN pm.lifetime_revenue ELSE 0 END) AS lifetime_revenue,
            -- Data for KPIs - Only average stock for products with stock
            SUM(CASE WHEN pm.avg_stock_units_30d > 0 THEN pm.avg_stock_units_30d ELSE 0 END) AS total_avg_stock_units_30d
        FROM public.product_metrics pm
        JOIN public.product_categories pc ON pm.pid = pc.pid
        -- Optional: JOIN products p ON pm.pid = p.pid if needed for filtering (e.g., only visible products)
        -- WHERE p.visible = true -- Example filter
        GROUP BY pc.cat_id
        JOIN ProductDeepestCategory pdc ON pm.pid = pdc.pid
        GROUP BY pdc.cat_id
    ),
    -- Build a category lookup table for parent relationships
    CategoryHierarchyPaths AS (
        WITH RECURSIVE ParentPaths AS (
            -- Base case: All categories with their immediate parents
            SELECT
                cat_id,
                cat_id as leaf_id, -- Every category is its own leaf initially
                ARRAY[cat_id] as path
            FROM public.categories

            UNION ALL

            -- Recursive step: Walk up the parent chain
            SELECT
                c.parent_id as cat_id,
                pp.leaf_id, -- Keep the original leaf_id
                c.parent_id || pp.path as path
            FROM ParentPaths pp
            JOIN public.categories c ON pp.cat_id = c.cat_id
            WHERE c.parent_id IS NOT NULL -- Stop at root categories
        )
        -- Select distinct paths to avoid duplication
        SELECT DISTINCT cat_id, leaf_id
        FROM ParentPaths
    ),
    -- Aggregate metrics from leaf categories to their ancestors without duplication
    -- These are the rolled-up metrics (including all child categories)
    RollupMetrics AS (
        SELECT
            chp.cat_id,
            -- For each parent category, count distinct products to avoid duplication
            COUNT(DISTINCT dcm.cat_id) AS child_categories_count,
            SUM(dcm.product_count) AS rollup_product_count,
            SUM(dcm.active_product_count) AS rollup_active_product_count,
            SUM(dcm.replenishable_product_count) AS rollup_replenishable_product_count,
            SUM(dcm.current_stock_units) AS rollup_current_stock_units,
            SUM(dcm.current_stock_cost) AS rollup_current_stock_cost,
            SUM(dcm.current_stock_retail) AS rollup_current_stock_retail,
            SUM(dcm.sales_7d) AS rollup_sales_7d,
            SUM(dcm.revenue_7d) AS rollup_revenue_7d,
            SUM(dcm.sales_30d) AS rollup_sales_30d,
            SUM(dcm.revenue_30d) AS rollup_revenue_30d,
            SUM(dcm.cogs_30d) AS rollup_cogs_30d,
            SUM(dcm.profit_30d) AS rollup_profit_30d,
            SUM(dcm.sales_365d) AS rollup_sales_365d,
            SUM(dcm.revenue_365d) AS rollup_revenue_365d,
            SUM(dcm.lifetime_sales) AS rollup_lifetime_sales,
            SUM(dcm.lifetime_revenue) AS rollup_lifetime_revenue,
            SUM(dcm.total_avg_stock_units_30d) AS rollup_total_avg_stock_units_30d
        FROM CategoryHierarchyPaths chp
        JOIN DirectCategoryMetrics dcm ON chp.leaf_id = dcm.cat_id
        GROUP BY chp.cat_id
    ),
    -- Combine direct and rollup metrics
    CombinedMetrics AS (
        SELECT
            c.cat_id,
            c.name,
            c.type,
            c.parent_id,
            -- Direct metrics (just this category)
            COALESCE(dcm.product_count, 0) AS direct_product_count,
            COALESCE(dcm.active_product_count, 0) AS direct_active_product_count,
            COALESCE(dcm.replenishable_product_count, 0) AS direct_replenishable_product_count,
            COALESCE(dcm.current_stock_units, 0) AS direct_current_stock_units,
            COALESCE(dcm.current_stock_cost, 0) AS direct_current_stock_cost,
            COALESCE(dcm.current_stock_retail, 0) AS direct_current_stock_retail,
            COALESCE(dcm.sales_7d, 0) AS direct_sales_7d,
            COALESCE(dcm.revenue_7d, 0) AS direct_revenue_7d,
            COALESCE(dcm.sales_30d, 0) AS direct_sales_30d,
            COALESCE(dcm.revenue_30d, 0) AS direct_revenue_30d,
            COALESCE(dcm.cogs_30d, 0) AS direct_cogs_30d,
            COALESCE(dcm.profit_30d, 0) AS direct_profit_30d,
            COALESCE(dcm.sales_365d, 0) AS direct_sales_365d,
            COALESCE(dcm.revenue_365d, 0) AS direct_revenue_365d,
            COALESCE(dcm.lifetime_sales, 0) AS direct_lifetime_sales,
            COALESCE(dcm.lifetime_revenue, 0) AS direct_lifetime_revenue,
            COALESCE(dcm.total_avg_stock_units_30d, 0) AS direct_avg_stock_units_30d,

            -- Rolled up metrics (this category + all children)
            COALESCE(rm.rollup_product_count, 0) AS product_count,
            COALESCE(rm.rollup_active_product_count, 0) AS active_product_count,
            COALESCE(rm.rollup_replenishable_product_count, 0) AS replenishable_product_count,
            COALESCE(rm.rollup_current_stock_units, 0) AS current_stock_units,
            COALESCE(rm.rollup_current_stock_cost, 0) AS current_stock_cost,
            COALESCE(rm.rollup_current_stock_retail, 0) AS current_stock_retail,
            COALESCE(rm.rollup_sales_7d, 0) AS sales_7d,
            COALESCE(rm.rollup_revenue_7d, 0) AS revenue_7d,
            COALESCE(rm.rollup_sales_30d, 0) AS sales_30d,
            COALESCE(rm.rollup_revenue_30d, 0) AS revenue_30d,
            COALESCE(rm.rollup_cogs_30d, 0) AS cogs_30d,
            COALESCE(rm.rollup_profit_30d, 0) AS profit_30d,
            COALESCE(rm.rollup_sales_365d, 0) AS sales_365d,
            COALESCE(rm.rollup_revenue_365d, 0) AS revenue_365d,
            COALESCE(rm.rollup_lifetime_sales, 0) AS lifetime_sales,
            COALESCE(rm.rollup_lifetime_revenue, 0) AS lifetime_revenue,
            COALESCE(rm.rollup_total_avg_stock_units_30d, 0) AS total_avg_stock_units_30d
        FROM public.categories c
        LEFT JOIN DirectCategoryMetrics dcm ON c.cat_id = dcm.cat_id
        LEFT JOIN RollupMetrics rm ON c.cat_id = rm.cat_id
    )
    INSERT INTO public.category_metrics (
        category_id, category_name, category_type, parent_id, last_calculated,
        -- Store all direct and rolled up metrics
        product_count, active_product_count, replenishable_product_count,
        current_stock_units, current_stock_cost, current_stock_retail,
        sales_7d, revenue_7d, sales_30d, revenue_30d, profit_30d, cogs_30d,
        sales_365d, revenue_365d, lifetime_sales, lifetime_revenue,
        -- Also store direct metrics with direct_ prefix
        direct_product_count, direct_active_product_count, direct_replenishable_product_count,
        direct_current_stock_units, direct_stock_cost, direct_stock_retail,
        direct_sales_7d, direct_revenue_7d, direct_sales_30d, direct_revenue_30d,
        direct_profit_30d, direct_cogs_30d, direct_sales_365d, direct_revenue_365d,
        direct_lifetime_sales, direct_lifetime_revenue,
        -- KPIs
        avg_margin_30d, stock_turn_30d
    )
    SELECT
        c.cat_id,
        c.name,
        c.type,
        c.parent_id,
        cm.cat_id,
        cm.name,
        cm.type,
        cm.parent_id,
        _start_time,
        -- Base Aggregates
        COALESCE(ca.product_count, 0),
        COALESCE(ca.active_product_count, 0),
        COALESCE(ca.replenishable_product_count, 0),
        COALESCE(ca.current_stock_units, 0),
        COALESCE(ca.current_stock_cost, 0.00),
        COALESCE(ca.current_stock_retail, 0.00),
        COALESCE(ca.sales_7d, 0), COALESCE(ca.revenue_7d, 0.00),
        COALESCE(ca.sales_30d, 0), COALESCE(ca.revenue_30d, 0.00),
        COALESCE(ca.profit_30d, 0.00), COALESCE(ca.cogs_30d, 0.00),
        COALESCE(ca.sales_365d, 0), COALESCE(ca.revenue_365d, 0.00),
        COALESCE(ca.lifetime_sales, 0), COALESCE(ca.lifetime_revenue, 0.00),
        -- KPIs
        (ca.profit_30d / NULLIF(ca.revenue_30d, 0)) * 100.0,
        ca.sales_30d / NULLIF(ca.total_avg_stock_units_30d, 0) -- Simple unit-based turnover
    FROM public.categories c -- Start from categories to include those with no products yet
    LEFT JOIN CategoryAggregates ca ON c.cat_id = ca.cat_id
        -- Rolled-up metrics (total including children)
        cm.product_count,
        cm.active_product_count,
        cm.replenishable_product_count,
        cm.current_stock_units,
        cm.current_stock_cost,
        cm.current_stock_retail,
        cm.sales_7d, cm.revenue_7d,
        cm.sales_30d, cm.revenue_30d, cm.profit_30d, cm.cogs_30d,
        cm.sales_365d, cm.revenue_365d,
        cm.lifetime_sales, cm.lifetime_revenue,
        -- Direct metrics (just this category)
        cm.direct_product_count,
        cm.direct_active_product_count,
        cm.direct_replenishable_product_count,
        cm.direct_current_stock_units,
        cm.direct_current_stock_cost,
        cm.direct_current_stock_retail,
        cm.direct_sales_7d, cm.direct_revenue_7d,
        cm.direct_sales_30d, cm.direct_revenue_30d, cm.direct_profit_30d, cm.direct_cogs_30d,
        cm.direct_sales_365d, cm.direct_revenue_365d,
        cm.direct_lifetime_sales, cm.direct_lifetime_revenue,
        -- KPIs - Calculate margin only for categories with significant revenue
        CASE
            WHEN cm.revenue_30d >= _min_revenue THEN
                ((cm.revenue_30d - cm.cogs_30d) / cm.revenue_30d) * 100.0
            ELSE NULL -- No margin for low/no revenue categories
        END,
        -- Stock Turn calculation
        CASE
            WHEN cm.total_avg_stock_units_30d > 0 THEN
                cm.sales_30d / cm.total_avg_stock_units_30d
            ELSE NULL -- No stock turn if no average stock
        END
    FROM CombinedMetrics cm

    ON CONFLICT (category_id) DO UPDATE SET
        category_name = EXCLUDED.category_name,
        category_type = EXCLUDED.category_type,
        parent_id = EXCLUDED.parent_id,
        last_calculated = EXCLUDED.last_calculated,
        -- Update rolled-up metrics
        product_count = EXCLUDED.product_count,
        active_product_count = EXCLUDED.active_product_count,
        replenishable_product_count = EXCLUDED.replenishable_product_count,
@@ -82,6 +250,19 @@ BEGIN
        profit_30d = EXCLUDED.profit_30d, cogs_30d = EXCLUDED.cogs_30d,
        sales_365d = EXCLUDED.sales_365d, revenue_365d = EXCLUDED.revenue_365d,
        lifetime_sales = EXCLUDED.lifetime_sales, lifetime_revenue = EXCLUDED.lifetime_revenue,
        -- Update direct metrics
        direct_product_count = EXCLUDED.direct_product_count,
        direct_active_product_count = EXCLUDED.direct_active_product_count,
        direct_replenishable_product_count = EXCLUDED.direct_replenishable_product_count,
        direct_current_stock_units = EXCLUDED.direct_current_stock_units,
        direct_stock_cost = EXCLUDED.direct_stock_cost,
        direct_stock_retail = EXCLUDED.direct_stock_retail,
        direct_sales_7d = EXCLUDED.direct_sales_7d, direct_revenue_7d = EXCLUDED.direct_revenue_7d,
        direct_sales_30d = EXCLUDED.direct_sales_30d, direct_revenue_30d = EXCLUDED.direct_revenue_30d,
        direct_profit_30d = EXCLUDED.direct_profit_30d, direct_cogs_30d = EXCLUDED.direct_cogs_30d,
        direct_sales_365d = EXCLUDED.direct_sales_365d, direct_revenue_365d = EXCLUDED.direct_revenue_365d,
        direct_lifetime_sales = EXCLUDED.direct_lifetime_sales, direct_lifetime_revenue = EXCLUDED.direct_lifetime_revenue,
        -- Update KPIs
        avg_margin_30d = EXCLUDED.avg_margin_30d,
        stock_turn_30d = EXCLUDED.stock_turn_30d;

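The category script's two key moves are: assign each product only to its deepest category, then roll direct totals up the parent chain so a product is counted once at every ancestor. A toy Python sketch of both steps (hypothetical in-memory structures, and unlike the SQL's IN-clause, ties at equal depth collapse to a single category here):

```python
def deepest_category_per_product(product_cats, depth):
    """product_cats: (pid, cat_id) pairs; depth: cat_id -> hierarchy depth.
    Keep, for each product, the category with the greatest depth."""
    out = {}
    for pid, cat in product_cats:
        best = out.get(pid)
        if best is None or depth[cat] > depth[best]:
            out[pid] = cat
    return out

def rollup(direct, parents):
    """direct: cat_id -> metric summed at that category only.
    parents: cat_id -> parent cat_id (None at roots).
    Add each category's direct total to itself and every ancestor."""
    totals = {}
    for cat, value in direct.items():
        node = cat
        while node is not None:
            totals[node] = totals.get(node, 0) + value
            node = parents.get(node)
    return totals
```

Because each product contributes to exactly one leaf, the upward walk counts it once per ancestor, which is precisely the double-counting the `ProductDeepestCategory` / `CategoryHierarchyPaths` pair is there to prevent.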
@@ -21,11 +21,24 @@ BEGIN
        SUM(pm.current_stock_retail) AS current_stock_retail,
        SUM(pm.on_order_qty) AS on_order_units,
        SUM(pm.on_order_cost) AS on_order_cost,
        SUM(pm.sales_7d) AS sales_7d, SUM(pm.revenue_7d) AS revenue_7d,
        SUM(pm.sales_30d) AS sales_30d, SUM(pm.revenue_30d) AS revenue_30d,
        SUM(pm.profit_30d) AS profit_30d, SUM(pm.cogs_30d) AS cogs_30d,
        SUM(pm.sales_365d) AS sales_365d, SUM(pm.revenue_365d) AS revenue_365d,
        SUM(pm.lifetime_sales) AS lifetime_sales, SUM(pm.lifetime_revenue) AS lifetime_revenue
        -- Only include products with valid sales data in each time period
        COUNT(DISTINCT CASE WHEN pm.sales_7d > 0 THEN pm.pid END) AS products_with_sales_7d,
        SUM(CASE WHEN pm.sales_7d > 0 THEN pm.sales_7d ELSE 0 END) AS sales_7d,
        SUM(CASE WHEN pm.revenue_7d > 0 THEN pm.revenue_7d ELSE 0 END) AS revenue_7d,

        COUNT(DISTINCT CASE WHEN pm.sales_30d > 0 THEN pm.pid END) AS products_with_sales_30d,
        SUM(CASE WHEN pm.sales_30d > 0 THEN pm.sales_30d ELSE 0 END) AS sales_30d,
        SUM(CASE WHEN pm.revenue_30d > 0 THEN pm.revenue_30d ELSE 0 END) AS revenue_30d,
        SUM(CASE WHEN pm.cogs_30d > 0 THEN pm.cogs_30d ELSE 0 END) AS cogs_30d,
        SUM(CASE WHEN pm.profit_30d != 0 THEN pm.profit_30d ELSE 0 END) AS profit_30d,

        COUNT(DISTINCT CASE WHEN pm.sales_365d > 0 THEN pm.pid END) AS products_with_sales_365d,
        SUM(CASE WHEN pm.sales_365d > 0 THEN pm.sales_365d ELSE 0 END) AS sales_365d,
        SUM(CASE WHEN pm.revenue_365d > 0 THEN pm.revenue_365d ELSE 0 END) AS revenue_365d,

        COUNT(DISTINCT CASE WHEN pm.lifetime_sales > 0 THEN pm.pid END) AS products_with_lifetime_sales,
        SUM(CASE WHEN pm.lifetime_sales > 0 THEN pm.lifetime_sales ELSE 0 END) AS lifetime_sales,
        SUM(CASE WHEN pm.lifetime_revenue > 0 THEN pm.lifetime_revenue ELSE 0 END) AS lifetime_revenue
    FROM public.product_metrics pm
    JOIN public.products p ON pm.pid = p.pid
    WHERE p.vendor IS NOT NULL AND p.vendor <> ''

@@ -1,4 +1,4 @@
-- Description: Calculates and updates daily aggregated product data for the current day.
-- Description: Calculates and updates daily aggregated product data for recent days.
-- Uses UPSERT (INSERT ON CONFLICT UPDATE) for idempotency.
-- Dependencies: Core import tables (products, orders, purchase_orders), calculate_status table.
-- Frequency: Hourly (Run ~5-10 minutes after hourly data import completes).
@@ -8,176 +8,243 @@ DECLARE
    _module_name TEXT := 'daily_snapshots';
    _start_time TIMESTAMPTZ := clock_timestamp(); -- Time execution started
    _last_calc_time TIMESTAMPTZ;
    _target_date DATE := CURRENT_DATE; -- Always recalculate today for simplicity with hourly runs
    _target_date DATE; -- Will be set in the loop
    _total_records INT := 0;
    _has_orders BOOLEAN := FALSE;
    _process_days INT := 5; -- Number of days to check/process (today plus previous 4 days)
    _day_counter INT;
    _missing_days INT[] := ARRAY[]::INT[]; -- Array to store days with missing or incomplete data
BEGIN
    -- Get the timestamp before the last successful run of this module
    SELECT last_calculation_timestamp INTO _last_calc_time
    FROM public.calculate_status
    WHERE module_name = _module_name;

    RAISE NOTICE 'Running % for date %. Start Time: %', _module_name, _target_date, _start_time;
    RAISE NOTICE 'Running % script. Start Time: %', _module_name, _start_time;

    -- First, check which days need processing by comparing orders data with snapshot data
    FOR _day_counter IN 0..(_process_days-1) LOOP
        _target_date := CURRENT_DATE - (_day_counter * INTERVAL '1 day');

        -- Check if this date needs updating by comparing orders to snapshot data
        -- If the date has orders but not enough snapshots, or if snapshots show zero sales but orders exist, it's incomplete
        SELECT
            CASE WHEN (
                -- We have orders for this date but not enough snapshots, or snapshots with wrong total
                (EXISTS (SELECT 1 FROM public.orders WHERE date::date = _target_date) AND
                (
                    -- No snapshots exist for this date
                    NOT EXISTS (SELECT 1 FROM public.daily_product_snapshots WHERE snapshot_date = _target_date) OR
                    -- Or snapshots show zero sales but orders exist
                    (SELECT COALESCE(SUM(units_sold), 0) FROM public.daily_product_snapshots WHERE snapshot_date = _target_date) = 0 OR
                    -- Or the count of snapshot records is significantly less than distinct products in orders
                    (SELECT COUNT(*) FROM public.daily_product_snapshots WHERE snapshot_date = _target_date) <
                    (SELECT COUNT(DISTINCT pid) FROM public.orders WHERE date::date = _target_date) * 0.8
                )
                )
            ) THEN TRUE ELSE FALSE END
        INTO _has_orders;

        IF _has_orders THEN
            -- This day needs processing - add to our array
            _missing_days := _missing_days || _day_counter;
            RAISE NOTICE 'Day % needs updating (incomplete or missing data)', _target_date;
        END IF;
    END LOOP;

    -- If no days need updating, exit early
    IF array_length(_missing_days, 1) IS NULL THEN
        RAISE NOTICE 'No days need updating - all snapshot data appears complete';

        -- Still update the calculate_status to record this run
        UPDATE public.calculate_status
        SET last_calculation_timestamp = _start_time
        WHERE module_name = _module_name;

        RETURN;
    END IF;

    RAISE NOTICE 'Need to update % days with missing or incomplete data', array_length(_missing_days, 1);

-- Use CTEs to aggregate data for the target date
WITH SalesData AS (
SELECT
p.pid,
p.sku,
-- Aggregate Sales (Quantity > 0, Status not Canceled/Returned)
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.quantity ELSE 0 END), 0) AS units_sold,
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.price * o.quantity ELSE 0 END), 0.00) AS gross_revenue_unadjusted, -- Before discount
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.discount ELSE 0 END), 0.00) AS discounts,
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN COALESCE(o.costeach, p.landing_cost_price, p.cost_price) * o.quantity ELSE 0 END), 0.00) AS cogs,
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN p.regular_price * o.quantity ELSE 0 END), 0.00) AS gross_regular_revenue, -- Use current regular price for simplicity here
-- Process only the days that need updating
FOREACH _day_counter IN ARRAY _missing_days LOOP
_target_date := CURRENT_DATE - (_day_counter * INTERVAL '1 day');
RAISE NOTICE 'Processing date: %', _target_date;

-- IMPORTANT: First delete any existing data for this date to prevent duplication
DELETE FROM public.daily_product_snapshots
WHERE snapshot_date = _target_date;

-- Aggregate Returns (Quantity < 0 or Status = Returned)
COALESCE(SUM(CASE WHEN o.quantity < 0 OR COALESCE(o.status, 'pending') = 'returned' THEN ABS(o.quantity) ELSE 0 END), 0) AS units_returned,
COALESCE(SUM(CASE WHEN o.quantity < 0 OR COALESCE(o.status, 'pending') = 'returned' THEN o.price * ABS(o.quantity) ELSE 0 END), 0.00) AS returns_revenue
FROM public.products p -- Start from products to include those with no orders today
LEFT JOIN public.orders o
ON p.pid = o.pid
AND o.date::date = _target_date -- Cast to date to ensure compatibility regardless of original type
GROUP BY p.pid, p.sku
),
ReceivingData AS (
SELECT
po.pid,
-- Prioritize the actual table fields over the JSON data
COALESCE(
-- First try the received field from purchase_orders table
SUM(CASE WHEN po.date::date = _target_date THEN po.received ELSE 0 END),
-- Proceed with calculating daily metrics only for products with actual activity
WITH SalesData AS (
SELECT
p.pid,
p.sku,
-- Track number of orders to ensure we have real data
COUNT(o.id) as order_count,
-- Aggregate Sales (Quantity > 0, Status not Canceled/Returned)
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.quantity ELSE 0 END), 0) AS units_sold,
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.price * o.quantity ELSE 0 END), 0.00) AS gross_revenue_unadjusted, -- Before discount
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.discount ELSE 0 END), 0.00) AS discounts,
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN COALESCE(o.costeach, p.landing_cost_price, p.cost_price) * o.quantity ELSE 0 END), 0.00) AS cogs,
COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN p.regular_price * o.quantity ELSE 0 END), 0.00) AS gross_regular_revenue, -- Use current regular price for simplicity here

-- Aggregate Returns (Quantity < 0 or Status = Returned)
COALESCE(SUM(CASE WHEN o.quantity < 0 OR COALESCE(o.status, 'pending') = 'returned' THEN ABS(o.quantity) ELSE 0 END), 0) AS units_returned,
COALESCE(SUM(CASE WHEN o.quantity < 0 OR COALESCE(o.status, 'pending') = 'returned' THEN o.price * ABS(o.quantity) ELSE 0 END), 0.00) AS returns_revenue
FROM public.products p -- Start from products to include those with no orders today
JOIN public.orders o -- Changed to INNER JOIN to only process products with orders
ON p.pid = o.pid
AND o.date::date = _target_date -- Cast to date to ensure compatibility regardless of original type
GROUP BY p.pid, p.sku
-- No HAVING clause here - we always want to include all orders
),
ReceivingData AS (
SELECT
po.pid,
-- Track number of POs to ensure we have real data
COUNT(po.po_id) as po_count,
-- Prioritize the actual table fields over the JSON data
COALESCE(
-- First try the received field from purchase_orders table
SUM(CASE WHEN po.date::date = _target_date THEN po.received ELSE 0 END),

-- Otherwise fall back to the receiving_history JSON as secondary source
SUM(
CASE
WHEN (rh.item->>'date')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'received_at')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'receipt_date')::date = _target_date THEN (rh.item->>'qty')::numeric
ELSE 0
END
),
0
) AS units_received,

-- Otherwise fall back to the receiving_history JSON as secondary source
SUM(
CASE
WHEN (rh.item->>'date')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'received_at')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'receipt_date')::date = _target_date THEN (rh.item->>'qty')::numeric
ELSE 0
END
),
0
) AS units_received,

COALESCE(
-- First try the actual cost_price from purchase_orders
SUM(CASE WHEN po.date::date = _target_date THEN po.received * po.cost_price ELSE 0 END),

-- Otherwise fall back to receiving_history JSON
SUM(
CASE
WHEN (rh.item->>'date')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'received_at')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'receipt_date')::date = _target_date THEN (rh.item->>'qty')::numeric
ELSE 0
END
* COALESCE((rh.item->>'cost')::numeric, po.cost_price)
),
0.00
) AS cost_received
FROM public.purchase_orders po
LEFT JOIN LATERAL jsonb_array_elements(po.receiving_history) AS rh(item) ON
jsonb_typeof(po.receiving_history) = 'array' AND
jsonb_array_length(po.receiving_history) > 0 AND
(
(rh.item->>'date')::date = _target_date OR
(rh.item->>'received_at')::date = _target_date OR
(rh.item->>'receipt_date')::date = _target_date
)
-- Include POs with the current date or relevant receiving_history
WHERE
po.date::date = _target_date OR
jsonb_typeof(po.receiving_history) = 'array' AND
jsonb_array_length(po.receiving_history) > 0
GROUP BY po.pid
),
CurrentStock AS (
-- Select current stock values directly from products table
SELECT
COALESCE(
-- First try the actual cost_price from purchase_orders
SUM(CASE WHEN po.date::date = _target_date THEN po.received * po.cost_price ELSE 0 END),

-- Otherwise fall back to receiving_history JSON
SUM(
CASE
WHEN (rh.item->>'date')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'received_at')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'receipt_date')::date = _target_date THEN (rh.item->>'qty')::numeric
ELSE 0
END
* COALESCE((rh.item->>'cost')::numeric, po.cost_price)
),
0.00
) AS cost_received
FROM public.purchase_orders po
LEFT JOIN LATERAL jsonb_array_elements(po.receiving_history) AS rh(item) ON
jsonb_typeof(po.receiving_history) = 'array' AND
jsonb_array_length(po.receiving_history) > 0 AND
(
(rh.item->>'date')::date = _target_date OR
(rh.item->>'received_at')::date = _target_date OR
(rh.item->>'receipt_date')::date = _target_date
)
-- Include POs with the current date or relevant receiving_history
WHERE
po.date::date = _target_date OR
jsonb_typeof(po.receiving_history) = 'array' AND
jsonb_array_length(po.receiving_history) > 0
GROUP BY po.pid
-- CRITICAL: Only include products with actual receiving activity
HAVING COUNT(po.po_id) > 0 OR SUM(
CASE
WHEN (rh.item->>'date')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'received_at')::date = _target_date THEN (rh.item->>'qty')::numeric
WHEN (rh.item->>'receipt_date')::date = _target_date THEN (rh.item->>'qty')::numeric
ELSE 0
END
) > 0
),
CurrentStock AS (
-- Select current stock values directly from products table
SELECT
pid,
stock_quantity,
COALESCE(landing_cost_price, cost_price, 0.00) as effective_cost_price,
COALESCE(price, 0.00) as current_price,
COALESCE(regular_price, 0.00) as current_regular_price
FROM public.products
),
ProductsWithActivity AS (
-- Quick pre-filter to only process products with activity
SELECT DISTINCT pid
FROM (
SELECT pid FROM SalesData
UNION
SELECT pid FROM ReceivingData
) a
)
-- Now insert records, but ONLY for products with actual activity
INSERT INTO public.daily_product_snapshots (
snapshot_date,
pid,
stock_quantity,
COALESCE(landing_cost_price, cost_price, 0.00) as effective_cost_price,
COALESCE(price, 0.00) as current_price,
COALESCE(regular_price, 0.00) as current_regular_price
FROM public.products
)
-- Upsert into the daily snapshots table
INSERT INTO public.daily_product_snapshots (
snapshot_date,
pid,
sku,
eod_stock_quantity,
eod_stock_cost,
eod_stock_retail,
eod_stock_gross,
stockout_flag,
units_sold,
units_returned,
gross_revenue,
discounts,
returns_revenue,
net_revenue,
cogs,
gross_regular_revenue,
profit,
units_received,
cost_received,
calculation_timestamp
)
SELECT
_target_date AS snapshot_date,
p.pid,
p.sku,
-- Inventory Metrics (Using CurrentStock)
cs.stock_quantity AS eod_stock_quantity,
cs.stock_quantity * cs.effective_cost_price AS eod_stock_cost,
cs.stock_quantity * cs.current_price AS eod_stock_retail,
cs.stock_quantity * cs.current_regular_price AS eod_stock_gross,
(cs.stock_quantity <= 0) AS stockout_flag,
-- Sales Metrics (From SalesData)
COALESCE(sd.units_sold, 0),
COALESCE(sd.units_returned, 0),
COALESCE(sd.gross_revenue_unadjusted, 0.00),
COALESCE(sd.discounts, 0.00),
COALESCE(sd.returns_revenue, 0.00),
COALESCE(sd.gross_revenue_unadjusted, 0.00) - COALESCE(sd.discounts, 0.00) AS net_revenue,
COALESCE(sd.cogs, 0.00),
COALESCE(sd.gross_regular_revenue, 0.00),
(COALESCE(sd.gross_revenue_unadjusted, 0.00) - COALESCE(sd.discounts, 0.00)) - COALESCE(sd.cogs, 0.00) AS profit, -- Basic profit: Net Revenue - COGS
-- Receiving Metrics (From ReceivingData)
COALESCE(rd.units_received, 0),
COALESCE(rd.cost_received, 0.00),
_start_time -- Timestamp of this calculation run
FROM public.products p
LEFT JOIN CurrentStock cs ON p.pid = cs.pid
LEFT JOIN SalesData sd ON p.pid = sd.pid
LEFT JOIN ReceivingData rd ON p.pid = rd.pid
WHERE p.pid IS NOT NULL -- Ensure we only insert for existing products
sku,
eod_stock_quantity,
eod_stock_cost,
eod_stock_retail,
eod_stock_gross,
stockout_flag,
units_sold,
units_returned,
gross_revenue,
discounts,
returns_revenue,
net_revenue,
cogs,
gross_regular_revenue,
profit,
units_received,
cost_received,
calculation_timestamp
)
SELECT
_target_date AS snapshot_date,
COALESCE(sd.pid, rd.pid) AS pid, -- Use sales or receiving PID
COALESCE(sd.sku, p.sku) AS sku, -- Get SKU from sales data or products table
-- Inventory Metrics (Using CurrentStock)
cs.stock_quantity AS eod_stock_quantity,
cs.stock_quantity * cs.effective_cost_price AS eod_stock_cost,
cs.stock_quantity * cs.current_price AS eod_stock_retail,
cs.stock_quantity * cs.current_regular_price AS eod_stock_gross,
(cs.stock_quantity <= 0) AS stockout_flag,
-- Sales Metrics (From SalesData)
COALESCE(sd.units_sold, 0),
COALESCE(sd.units_returned, 0),
COALESCE(sd.gross_revenue_unadjusted, 0.00),
COALESCE(sd.discounts, 0.00),
COALESCE(sd.returns_revenue, 0.00),
COALESCE(sd.gross_revenue_unadjusted, 0.00) - COALESCE(sd.discounts, 0.00) AS net_revenue,
COALESCE(sd.cogs, 0.00),
COALESCE(sd.gross_regular_revenue, 0.00),
(COALESCE(sd.gross_revenue_unadjusted, 0.00) - COALESCE(sd.discounts, 0.00)) - COALESCE(sd.cogs, 0.00) AS profit, -- Basic profit: Net Revenue - COGS
-- Receiving Metrics (From ReceivingData)
COALESCE(rd.units_received, 0),
COALESCE(rd.cost_received, 0.00),
_start_time -- Timestamp of this calculation run
FROM SalesData sd
FULL OUTER JOIN ReceivingData rd ON sd.pid = rd.pid
JOIN ProductsWithActivity pwa ON COALESCE(sd.pid, rd.pid) = pwa.pid
LEFT JOIN public.products p ON COALESCE(sd.pid, rd.pid) = p.pid
LEFT JOIN CurrentStock cs ON COALESCE(sd.pid, rd.pid) = cs.pid
WHERE p.pid IS NOT NULL -- Ensure we only insert for existing products (no semicolon: the INSERT continues with ON CONFLICT)

ON CONFLICT (snapshot_date, pid) DO UPDATE SET
sku = EXCLUDED.sku,
eod_stock_quantity = EXCLUDED.eod_stock_quantity,
eod_stock_cost = EXCLUDED.eod_stock_cost,
eod_stock_retail = EXCLUDED.eod_stock_retail,
eod_stock_gross = EXCLUDED.eod_stock_gross,
stockout_flag = EXCLUDED.stockout_flag,
units_sold = EXCLUDED.units_sold,
units_returned = EXCLUDED.units_returned,
gross_revenue = EXCLUDED.gross_revenue,
discounts = EXCLUDED.discounts,
returns_revenue = EXCLUDED.returns_revenue,
net_revenue = EXCLUDED.net_revenue,
cogs = EXCLUDED.cogs,
gross_regular_revenue = EXCLUDED.gross_regular_revenue,
profit = EXCLUDED.profit,
units_received = EXCLUDED.units_received,
cost_received = EXCLUDED.cost_received,
calculation_timestamp = EXCLUDED.calculation_timestamp; -- Use the timestamp from this run
-- Get the total number of records inserted for this date
GET DIAGNOSTICS _total_records = ROW_COUNT;
RAISE NOTICE 'Created % daily snapshot records for % with sales/receiving activity', _total_records, _target_date;
END LOOP;

-- Update the status table with the timestamp from the START of this run
UPDATE public.calculate_status
SET last_calculation_timestamp = _start_time
WHERE module_name = _module_name;

RAISE NOTICE 'Finished % for date %. Duration: %', _module_name, _target_date, clock_timestamp() - _start_time;
RAISE NOTICE 'Finished % processing for multiple dates. Duration: %', _module_name, clock_timestamp() - _start_time;

END $$;
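The DO block above decides which of the last five days to rebuild by comparing each day's orders against its snapshot rows. As a minimal sketch (hypothetical helper, not part of the migration), the same heuristic can be expressed as:

```python
def day_needs_update(order_pids, snapshot_rows):
    """Return True if a day's snapshot data looks missing or incomplete.

    order_pids: product ids appearing in that day's orders.
    snapshot_rows: list of (pid, units_sold) snapshot records for that day.
    Mirrors the SQL check: a day is incomplete when orders exist but
    snapshots are absent, record zero sales, or cover fewer than 80% of
    the distinct products ordered.
    """
    if not order_pids:
        return False  # no orders -> nothing to rebuild
    if not snapshot_rows:
        return True   # orders exist but no snapshots at all
    if sum(units for _, units in snapshot_rows) == 0:
        return True   # snapshots exist but show zero sales
    return len(snapshot_rows) < len(set(order_pids)) * 0.8
```

Days flagged this way are collected into `_missing_days` and each one is deleted and rebuilt, which keeps hourly runs cheap when the data is already complete.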
@@ -28,6 +28,27 @@ BEGIN
COALESCE(p.image_175, p.image) as image_url,
p.visible as is_visible,
p.replenishable as is_replenishable,
-- Add new product fields
p.barcode,
p.harmonized_tariff_code,
p.vendor_reference,
p.notions_reference,
p.line,
p.subline,
p.artist,
p.moq,
p.rating,
p.reviews,
p.weight,
p.length,
p.width,
p.height,
p.country_of_origin,
p.location,
p.baskets,
p.notifies,
p.preorder_count,
p.notions_inv_count,
COALESCE(p.price, 0.00) as current_price,
COALESCE(p.regular_price, 0.00) as current_regular_price,
COALESCE(p.cost_price, 0.00) as current_cost_price,
@@ -36,7 +57,7 @@ BEGIN
p.created_at,
p.first_received,
p.date_last_sold,
p.moq,
p.total_sold as historical_total_sold, -- Add historical total_sold from products table
p.uom -- Assuming UOM logic is handled elsewhere or simple (e.g., 1=each)
FROM public.products p
),
@@ -110,31 +131,37 @@ BEGIN
SUM(units_sold) AS total_units_sold,
SUM(net_revenue) AS total_net_revenue,

-- Specific time windows if we have enough data
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '6 days' THEN units_sold ELSE 0 END) AS sales_7d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '6 days' THEN net_revenue ELSE 0 END) AS revenue_7d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '13 days' THEN units_sold ELSE 0 END) AS sales_14d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '13 days' THEN net_revenue ELSE 0 END) AS revenue_14d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN units_sold ELSE 0 END) AS sales_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN net_revenue ELSE 0 END) AS revenue_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN cogs ELSE 0 END) AS cogs_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN profit ELSE 0 END) AS profit_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN units_returned ELSE 0 END) AS returns_units_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN returns_revenue ELSE 0 END) AS returns_revenue_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN discounts ELSE 0 END) AS discounts_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN gross_revenue ELSE 0 END) AS gross_revenue_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN gross_regular_revenue ELSE 0 END) AS gross_regular_revenue_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND stockout_flag THEN 1 ELSE 0 END) AS stockout_days_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '364 days' THEN units_sold ELSE 0 END) AS sales_365d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '364 days' THEN net_revenue ELSE 0 END) AS revenue_365d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN units_received ELSE 0 END) AS received_qty_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN cost_received ELSE 0 END) AS received_cost_30d,
-- Specific time windows using date range boundaries precisely
-- Use _current_date - INTERVAL '6 days' to include 7 days (today + 6 previous days)
-- This ensures we count exactly the right number of days in each period
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '6 days' AND snapshot_date <= _current_date THEN units_sold ELSE 0 END) AS sales_7d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '6 days' AND snapshot_date <= _current_date THEN net_revenue ELSE 0 END) AS revenue_7d,

SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '13 days' AND snapshot_date <= _current_date THEN units_sold ELSE 0 END) AS sales_14d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '13 days' AND snapshot_date <= _current_date THEN net_revenue ELSE 0 END) AS revenue_14d,

SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN units_sold ELSE 0 END) AS sales_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN net_revenue ELSE 0 END) AS revenue_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN cogs ELSE 0 END) AS cogs_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN profit ELSE 0 END) AS profit_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN units_returned ELSE 0 END) AS returns_units_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN returns_revenue ELSE 0 END) AS returns_revenue_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN discounts ELSE 0 END) AS discounts_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN gross_revenue ELSE 0 END) AS gross_revenue_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN gross_regular_revenue ELSE 0 END) AS gross_regular_revenue_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date AND stockout_flag THEN 1 ELSE 0 END) AS stockout_days_30d,

SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '364 days' AND snapshot_date <= _current_date THEN units_sold ELSE 0 END) AS sales_365d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '364 days' AND snapshot_date <= _current_date THEN net_revenue ELSE 0 END) AS revenue_365d,

SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN units_received ELSE 0 END) AS received_qty_30d,
SUM(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN cost_received ELSE 0 END) AS received_cost_30d,

-- Averages (check for NULLIF 0 days in period if filtering dates)
AVG(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN eod_stock_quantity END) AS avg_stock_units_30d,
AVG(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN eod_stock_cost END) AS avg_stock_cost_30d,
AVG(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN eod_stock_retail END) AS avg_stock_retail_30d,
AVG(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' THEN eod_stock_gross END) AS avg_stock_gross_30d,
-- Averages for stock levels - only include dates within the specified period
AVG(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN eod_stock_quantity END) AS avg_stock_units_30d,
AVG(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN eod_stock_cost END) AS avg_stock_cost_30d,
AVG(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN eod_stock_retail END) AS avg_stock_retail_30d,
AVG(CASE WHEN snapshot_date >= _current_date - INTERVAL '29 days' AND snapshot_date <= _current_date THEN eod_stock_gross END) AS avg_stock_gross_30d,

-- Lifetime - should match total values above
SUM(units_sold) AS lifetime_sales,
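The hunk above tightens the rolling windows so `sales_7d` covers exactly today plus the six previous days (`>= today - 6 days AND <= today`). A small sketch of that window arithmetic, under the assumption of one metric value per snapshot day (the dict-based helper is illustrative, not from the migration):

```python
from datetime import date, timedelta

def window_sum(daily, today, days):
    """Sum a per-day metric over the last `days` calendar days, inclusive.

    daily: dict mapping date -> value (e.g. units_sold per snapshot day).
    Matches the SQL pattern
        snapshot_date >= today - (days - 1) AND snapshot_date <= today,
    which counts today plus the previous days-1 days -- exactly `days`
    calendar days, never a day in the future.
    """
    start = today - timedelta(days=days - 1)
    return sum(v for d, v in daily.items() if start <= d <= today)
```

Using `INTERVAL '7 days'` instead of `'6 days'` would silently widen the window to eight days, which is the off-by-one the added upper bound and comments guard against.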
@@ -150,14 +177,14 @@ BEGIN
SELECT
pid,
date_first_sold,
SUM(CASE WHEN snapshot_date BETWEEN date_first_sold AND date_first_sold + INTERVAL '6 days' THEN units_sold ELSE 0 END) AS first_7_days_sales,
SUM(CASE WHEN snapshot_date BETWEEN date_first_sold AND date_first_sold + INTERVAL '6 days' THEN net_revenue ELSE 0 END) AS first_7_days_revenue,
SUM(CASE WHEN snapshot_date BETWEEN date_first_sold AND date_first_sold + INTERVAL '29 days' THEN units_sold ELSE 0 END) AS first_30_days_sales,
SUM(CASE WHEN snapshot_date BETWEEN date_first_sold AND date_first_sold + INTERVAL '29 days' THEN net_revenue ELSE 0 END) AS first_30_days_revenue,
SUM(CASE WHEN snapshot_date BETWEEN date_first_sold AND date_first_sold + INTERVAL '59 days' THEN units_sold ELSE 0 END) AS first_60_days_sales,
SUM(CASE WHEN snapshot_date BETWEEN date_first_sold AND date_first_sold + INTERVAL '59 days' THEN net_revenue ELSE 0 END) AS first_60_days_revenue,
SUM(CASE WHEN snapshot_date BETWEEN date_first_sold AND date_first_sold + INTERVAL '89 days' THEN units_sold ELSE 0 END) AS first_90_days_sales,
SUM(CASE WHEN snapshot_date BETWEEN date_first_sold AND date_first_sold + INTERVAL '89 days' THEN net_revenue ELSE 0 END) AS first_90_days_revenue
SUM(CASE WHEN snapshot_date >= date_first_sold AND snapshot_date <= date_first_sold + INTERVAL '6 days' THEN units_sold ELSE 0 END) AS first_7_days_sales,
SUM(CASE WHEN snapshot_date >= date_first_sold AND snapshot_date <= date_first_sold + INTERVAL '6 days' THEN net_revenue ELSE 0 END) AS first_7_days_revenue,
SUM(CASE WHEN snapshot_date >= date_first_sold AND snapshot_date <= date_first_sold + INTERVAL '29 days' THEN units_sold ELSE 0 END) AS first_30_days_sales,
SUM(CASE WHEN snapshot_date >= date_first_sold AND snapshot_date <= date_first_sold + INTERVAL '29 days' THEN net_revenue ELSE 0 END) AS first_30_days_revenue,
SUM(CASE WHEN snapshot_date >= date_first_sold AND snapshot_date <= date_first_sold + INTERVAL '59 days' THEN units_sold ELSE 0 END) AS first_60_days_sales,
SUM(CASE WHEN snapshot_date >= date_first_sold AND snapshot_date <= date_first_sold + INTERVAL '59 days' THEN net_revenue ELSE 0 END) AS first_60_days_revenue,
SUM(CASE WHEN snapshot_date >= date_first_sold AND snapshot_date <= date_first_sold + INTERVAL '89 days' THEN units_sold ELSE 0 END) AS first_90_days_sales,
SUM(CASE WHEN snapshot_date >= date_first_sold AND snapshot_date <= date_first_sold + INTERVAL '89 days' THEN net_revenue ELSE 0 END) AS first_90_days_revenue
FROM public.daily_product_snapshots ds
JOIN HistoricalDates hd USING(pid)
WHERE date_first_sold IS NOT NULL
@@ -179,6 +206,9 @@ BEGIN
-- Final UPSERT into product_metrics
INSERT INTO public.product_metrics (
pid, last_calculated, sku, title, brand, vendor, image_url, is_visible, is_replenishable,
barcode, harmonized_tariff_code, vendor_reference, notions_reference, line, subline, artist,
moq, rating, reviews, weight, length, width, height, country_of_origin, location,
baskets, notifies, preorder_count, notions_inv_count,
current_price, current_regular_price, current_cost_price, current_landing_cost_price,
current_stock, current_stock_cost, current_stock_retail, current_stock_gross,
on_order_qty, on_order_cost, on_order_retail, earliest_expected_date,
@@ -203,10 +233,14 @@ BEGIN
to_order_units, forecast_lost_sales_units, forecast_lost_revenue,
stock_cover_in_days, po_cover_in_days, sells_out_in_days, replenish_date,
overstocked_units, overstocked_cost, overstocked_retail, is_old_stock,
yesterday_sales
yesterday_sales,
status -- Add status field for calculated status
)
SELECT
ci.pid, _start_time, ci.sku, ci.title, ci.brand, ci.vendor, ci.image_url, ci.is_visible, ci.is_replenishable,
ci.barcode, ci.harmonized_tariff_code, ci.vendor_reference, ci.notions_reference, ci.line, ci.subline, ci.artist,
ci.moq, ci.rating, ci.reviews, ci.weight, ci.length, ci.width, ci.height, ci.country_of_origin, ci.location,
ci.baskets, ci.notifies, ci.preorder_count, ci.notions_inv_count,
ci.current_price, ci.current_regular_price, ci.current_cost_price, ci.current_effective_cost,
ci.current_stock, ci.current_stock * ci.current_effective_cost, ci.current_stock * ci.current_price, ci.current_stock * ci.current_regular_price,
COALESCE(ooi.on_order_qty, 0), COALESCE(ooi.on_order_cost, 0.00), COALESCE(ooi.on_order_qty, 0) * ci.current_price, ooi.earliest_expected_date,
@@ -222,9 +256,25 @@ BEGIN
sa.stockout_days_30d, sa.sales_365d, sa.revenue_365d,
sa.avg_stock_units_30d, sa.avg_stock_cost_30d, sa.avg_stock_retail_30d, sa.avg_stock_gross_30d,
sa.received_qty_30d, sa.received_cost_30d,
-- Use total counts for lifetime values to ensure we have data even with limited history
COALESCE(sa.total_units_sold, sa.lifetime_sales) AS lifetime_sales,
COALESCE(sa.total_net_revenue, sa.lifetime_revenue) AS lifetime_revenue,
-- Use total_sold from products table as the source of truth for lifetime sales
-- This includes all historical data from the production database
ci.historical_total_sold AS lifetime_sales,
COALESCE(
-- Option 1: Use 30-day average price if available
CASE WHEN sa.sales_30d > 0 THEN
ci.historical_total_sold * (sa.revenue_30d / NULLIF(sa.sales_30d, 0))
ELSE NULL END,
-- Option 2: Try 365-day average price if available
CASE WHEN sa.sales_365d > 0 THEN
ci.historical_total_sold * (sa.revenue_365d / NULLIF(sa.sales_365d, 0))
ELSE NULL END,
-- Option 3: Use current price as a reasonable estimate
ci.historical_total_sold * ci.current_price,
-- Option 4: Use regular price if current price might be zero
ci.historical_total_sold * ci.current_regular_price,
-- Final fallback: Use accumulated revenue (this is less accurate for old products)
sa.total_net_revenue
) AS lifetime_revenue,
fpm.first_7_days_sales, fpm.first_7_days_revenue, fpm.first_30_days_sales, fpm.first_30_days_revenue,
fpm.first_60_days_sales, fpm.first_60_days_revenue, fpm.first_90_days_sales, fpm.first_90_days_revenue,

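The lifetime-revenue COALESCE chain in the hunk above estimates revenue from `historical_total_sold` by trying progressively weaker price signals. A minimal sketch of that fallback logic (a hypothetical helper; it simplifies slightly by treating a zero price as "unavailable", whereas the SQL's zero-defaulted prices never yield NULL):

```python
def estimate_lifetime_revenue(total_sold, sales_30d, revenue_30d,
                              sales_365d, revenue_365d,
                              current_price, regular_price,
                              accumulated_revenue):
    """Estimate lifetime revenue from historical total units sold.

    Mirrors the COALESCE chain: prefer the 30-day average selling price,
    then the 365-day average, then the current price, then the regular
    price, and finally fall back to accumulated snapshot revenue.
    """
    if sales_30d:
        return total_sold * (revenue_30d / sales_30d)
    if sales_365d:
        return total_sold * (revenue_365d / sales_365d)
    if current_price:
        return total_sold * current_price
    if regular_price:
        return total_sold * regular_price
    return accumulated_revenue
```

The ordering matters: recent average prices reflect discounts actually paid, while the list-price fallbacks can overstate revenue for long-discounted products.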
@@ -246,45 +296,314 @@ BEGIN
|
||||
(sa.sales_30d / NULLIF(ci.current_stock + sa.sales_30d, 0)) * 100 AS sell_through_30d,
|
||||
|
||||
-- Forecasting intermediate values
|
||||
(sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) AS sales_velocity_daily,
|
||||
-- CRITICAL FIX: Use safer velocity calculation to prevent extreme values
|
||||
-- Original problematic calculation: (sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0))
|
||||
-- Use available days (not stockout days) as denominator with a minimum safety value
|
||||
(sa.sales_30d /
|
||||
NULLIF(
|
||||
GREATEST(
|
||||
30.0 - sa.stockout_days_30d, -- Standard calculation
|
||||
CASE
|
||||
WHEN sa.sales_30d > 0 THEN 14.0 -- If we have sales, ensure at least 14 days denominator
|
||||
ELSE 30.0 -- If no sales, use full period
|
||||
END
|
||||
),
|
||||
0
|
||||
)
|
||||
) AS sales_velocity_daily,
|
||||
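The "safer velocity" expression above can be read outside SQL as well; this is a hypothetical JavaScript port of the same GREATEST/NULLIF logic (function and parameter names are illustrative, not from the codebase):

```javascript
// Hypothetical port of the "safer velocity" SQL expression above.
// Floors the denominator at 14 available days when there were sales, so a
// product that was stocked out 29 of 30 days cannot report a 29x velocity.
function salesVelocityDaily(sales30d, stockoutDays30d) {
  const minDays = sales30d > 0 ? 14.0 : 30.0;               // CASE WHEN sa.sales_30d > 0 ...
  const denominator = Math.max(30.0 - stockoutDays30d, minDays); // GREATEST(...)
  if (denominator === 0) return null;                        // NULLIF(..., 0)
  return sales30d / denominator;
}

// 60 units sold while out of stock 29 of 30 days:
// old formula: 60 / 1 = 60/day; safer formula: 60 / 14 ≈ 4.3/day
```

The floor only kicks in when stockout days dominate the window; a normally stocked product still divides by its true number of available days.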
s.effective_lead_time AS config_lead_time,
s.effective_days_of_stock AS config_days_of_stock,
s.effective_safety_stock AS config_safety_stock,
(s.effective_lead_time + s.effective_days_of_stock) AS planning_period_days,
(sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time AS lead_time_forecast_units,
(sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock AS days_of_stock_forecast_units,
((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock) AS planning_period_forecast_units,
(ci.current_stock + COALESCE(ooi.on_order_qty, 0) - ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time)) AS lead_time_closing_stock,
((ci.current_stock + COALESCE(ooi.on_order_qty, 0) - ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time))) - ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock) AS days_of_stock_closing_stock,
(((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0) AS replenishment_needed_raw,

-- Apply the same fix to all derived calculations
(sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time AS lead_time_forecast_units,

(sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock AS days_of_stock_forecast_units,

(sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * (s.effective_lead_time + s.effective_days_of_stock) AS planning_period_forecast_units,

(ci.current_stock + COALESCE(ooi.on_order_qty, 0) - ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time)) AS lead_time_closing_stock,

((ci.current_stock + COALESCE(ooi.on_order_qty, 0) - ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time))) - ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock) AS days_of_stock_closing_stock,

(((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0) AS replenishment_needed_raw,

-- Final Forecasting / Replenishment Metrics (apply CEILING/GREATEST/etc.)
-- Note: These calculations are nested for clarity, can be simplified in prod
CEILING(GREATEST(0, ((((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int AS replenishment_units,
(CEILING(GREATEST(0, ((((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int) * ci.current_effective_cost AS replenishment_cost,
(CEILING(GREATEST(0, ((((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int) * ci.current_price AS replenishment_retail,
(CEILING(GREATEST(0, ((((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int) * (ci.current_price - ci.current_effective_cost) AS replenishment_profit,
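In plain terms, the replenishment metrics above compute planning-period demand plus safety stock, minus what is already on hand or on order, floored at zero and rounded up. A hypothetical JavaScript sketch of that arithmetic (field names are illustrative):

```javascript
// Hypothetical sketch of replenishment_units / replenishment_cost as computed
// above: demand over (lead time + days of stock) at the daily velocity, plus
// safety stock, minus current stock and on-order quantity, floored at 0 and
// rounded up, i.e. CEILING(GREATEST(0, ...))::int.
function replenishment({ velocityDaily, leadTime, daysOfStock, safetyStock,
                         currentStock, onOrderQty, effectiveCost }) {
  const planningDemand = velocityDaily * (leadTime + daysOfStock);
  const raw = planningDemand + safetyStock - currentStock - onOrderQty;
  const units = Math.ceil(Math.max(0, raw));
  return { units, cost: units * effectiveCost };
}
```

The retail and profit variants in the SQL only swap the multiplier (current price, or price minus cost) on the same unit count.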
CEILING(GREATEST(0, ((((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int AS replenishment_units,
(CEILING(GREATEST(0, ((((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int) * ci.current_effective_cost AS replenishment_cost,
(CEILING(GREATEST(0, ((((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int) * ci.current_price AS replenishment_retail,
(CEILING(GREATEST(0, ((((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int) * (ci.current_price - ci.current_effective_cost) AS replenishment_profit,

-- Placeholder for To Order (Apply MOQ/UOM logic here if needed, otherwise equals replenishment)
CEILING(GREATEST(0, ((((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int AS to_order_units,
CEILING(GREATEST(0, ((((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)) + s.effective_safety_stock - ci.current_stock - COALESCE(ooi.on_order_qty, 0))))::int AS to_order_units,

GREATEST(0, - (ci.current_stock + COALESCE(ooi.on_order_qty, 0) - ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time))) AS forecast_lost_sales_units,
GREATEST(0, - (ci.current_stock + COALESCE(ooi.on_order_qty, 0) - ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time))) * ci.current_price AS forecast_lost_revenue,
GREATEST(0, - (ci.current_stock + COALESCE(ooi.on_order_qty, 0) - ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time))) AS forecast_lost_sales_units,
GREATEST(0, - (ci.current_stock + COALESCE(ooi.on_order_qty, 0) - ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time))) * ci.current_price AS forecast_lost_revenue,

ci.current_stock / NULLIF((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)), 0) AS stock_cover_in_days,
COALESCE(ooi.on_order_qty, 0) / NULLIF((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)), 0) AS po_cover_in_days,
(ci.current_stock + COALESCE(ooi.on_order_qty, 0)) / NULLIF((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)), 0) AS sells_out_in_days,
ci.current_stock / NULLIF((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
), 0) AS stock_cover_in_days,
COALESCE(ooi.on_order_qty, 0) / NULLIF((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
), 0) AS po_cover_in_days,
(ci.current_stock + COALESCE(ooi.on_order_qty, 0)) / NULLIF((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
), 0) AS sells_out_in_days,

-- Replenish Date: Date when stock is projected to hit safety stock, minus lead time
CASE
    WHEN (sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) > 0
    THEN _current_date + FLOOR(GREATEST(0, ci.current_stock - s.effective_safety_stock) / (sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)))::int - s.effective_lead_time
    WHEN (sa.sales_30d /
        NULLIF(
            GREATEST(
                30.0 - sa.stockout_days_30d,
                CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
            ),
            0
        )
    ) > 0
    THEN _current_date + FLOOR(GREATEST(0, ci.current_stock - s.effective_safety_stock) / (sa.sales_30d /
        NULLIF(
            GREATEST(
                30.0 - sa.stockout_days_30d,
                CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
            ),
            0
        )
    ))::int - s.effective_lead_time
    ELSE NULL
END AS replenish_date,
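The replenish-date logic above projects when stock will drain down to the safety-stock floor at the current daily velocity, then subtracts the lead time so the order is placed early enough. A hypothetical JavaScript sketch of the day-offset arithmetic (names are illustrative):

```javascript
// Hypothetical sketch of the replenish_date CASE above, expressed as a
// day offset to add to the current date: days until stock drains to the
// safety-stock floor at the daily velocity, minus the lead time.
function replenishDateOffsetDays(currentStock, safetyStock, velocityDaily, leadTime) {
  if (!(velocityDaily > 0)) return null;  // ELSE NULL branch: no velocity, no projection
  const daysToSafetyFloor = Math.floor(
    Math.max(0, currentStock - safetyStock) / velocityDaily  // FLOOR(GREATEST(0, ...) / velocity)
  );
  return daysToSafetyFloor - leadTime;    // SQL adds this to _current_date
}

// 100 on hand, safety stock 10, 3/day, 14-day lead time:
// floor(90 / 3) = 30 days to the floor, order 14 days earlier → offset 16
```

A negative offset means the order date is already in the past, i.e. the product should be reordered immediately.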

GREATEST(0, ci.current_stock - s.effective_safety_stock - (((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)))::int AS overstocked_units,
(GREATEST(0, ci.current_stock - s.effective_safety_stock - (((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)))) * ci.current_effective_cost AS overstocked_cost,
(GREATEST(0, ci.current_stock - s.effective_safety_stock - (((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_lead_time) + ((sa.sales_30d / NULLIF(30.0 - sa.stockout_days_30d, 0)) * s.effective_days_of_stock)))) * ci.current_price AS overstocked_retail,
GREATEST(0, ci.current_stock - s.effective_safety_stock - (((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)))::int AS overstocked_units,
(GREATEST(0, ci.current_stock - s.effective_safety_stock - (((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)))) * ci.current_effective_cost AS overstocked_cost,
(GREATEST(0, ci.current_stock - s.effective_safety_stock - (((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_lead_time) + ((sa.sales_30d /
    NULLIF(
        GREATEST(
            30.0 - sa.stockout_days_30d,
            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
        ),
        0
    )
) * s.effective_days_of_stock)))) * ci.current_price AS overstocked_retail,

-- Old Stock Flag
(ci.created_at::date < _current_date - INTERVAL '60 day') AND
@@ -293,7 +612,119 @@ BEGIN
COALESCE(ooi.on_order_qty, 0) = 0
AS is_old_stock,

sa.yesterday_sales
sa.yesterday_sales,

-- Calculate status using direct CASE statements (inline logic)
CASE
    -- Non-replenishable items default to Healthy
    WHEN NOT ci.is_replenishable THEN 'Healthy'

    -- Calculate lead time and thresholds
    ELSE
        CASE
            -- Check for overstock first
            WHEN GREATEST(0, ci.current_stock - s.effective_safety_stock - (((sa.sales_30d /
                NULLIF(
                    GREATEST(
                        30.0 - sa.stockout_days_30d,
                        CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
                    ),
                    0
                )
            ) * s.effective_lead_time) + ((sa.sales_30d /
                NULLIF(
                    GREATEST(
                        30.0 - sa.stockout_days_30d,
                        CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
                    ),
                    0
                )
            ) * s.effective_days_of_stock))) > 0 THEN 'Overstock'

            -- Check for Critical stock
            WHEN ci.current_stock <= 0 OR
                (ci.current_stock / NULLIF((sa.sales_30d /
                    NULLIF(
                        GREATEST(
                            30.0 - sa.stockout_days_30d,
                            CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
                        ),
                        0
                    )
                ), 0)) <= 0 THEN 'Critical'

            WHEN (ci.current_stock / NULLIF((sa.sales_30d /
                NULLIF(
                    GREATEST(
                        30.0 - sa.stockout_days_30d,
                        CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
                    ),
                    0
                )
            ), 0)) < (COALESCE(s.effective_lead_time, 30) * 0.5) THEN 'Critical'

            -- Check for reorder soon
            WHEN ((ci.current_stock + COALESCE(ooi.on_order_qty, 0)) / NULLIF((sa.sales_30d /
                NULLIF(
                    GREATEST(
                        30.0 - sa.stockout_days_30d,
                        CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
                    ),
                    0
                )
            ), 0)) < (COALESCE(s.effective_lead_time, 30) + 7) THEN
                CASE
                    WHEN (ci.current_stock / NULLIF((sa.sales_30d /
                        NULLIF(
                            GREATEST(
                                30.0 - sa.stockout_days_30d,
                                CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
                            ),
                            0
                        )
                    ), 0)) < (COALESCE(s.effective_lead_time, 30) * 0.5) THEN 'Critical'
                    ELSE 'Reorder Soon'
                END

            -- Check for 'At Risk' - old stock
            WHEN (ci.created_at::date < _current_date - INTERVAL '60 day') AND
                (COALESCE(ci.date_last_sold, hd.max_order_date) IS NULL OR COALESCE(ci.date_last_sold, hd.max_order_date) < _current_date - INTERVAL '60 day') AND
                (hd.date_last_received_calc IS NULL OR hd.date_last_received_calc < _current_date - INTERVAL '60 day') AND
                COALESCE(ooi.on_order_qty, 0) = 0 THEN 'At Risk'

            -- Check for 'At Risk' - hasn't sold in a long time
            WHEN COALESCE(ci.date_last_sold, hd.max_order_date) IS NOT NULL
                AND COALESCE(ci.date_last_sold, hd.max_order_date) < (_current_date - INTERVAL '90 days')
                AND (CASE
                    WHEN ci.created_at IS NULL AND hd.date_first_sold IS NULL THEN 0
                    WHEN ci.created_at IS NULL THEN (_current_date - hd.date_first_sold)::integer
                    WHEN hd.date_first_sold IS NULL THEN (_current_date - ci.created_at::date)::integer
                    ELSE (_current_date - LEAST(ci.created_at::date, hd.date_first_sold))::integer
                END) > 180 THEN 'At Risk'

            -- Very high stock cover is at risk too
            WHEN (ci.current_stock / NULLIF((sa.sales_30d /
                NULLIF(
                    GREATEST(
                        30.0 - sa.stockout_days_30d,
                        CASE WHEN sa.sales_30d > 0 THEN 14.0 ELSE 30.0 END
                    ),
                    0
                )
            ), 0)) > 365 THEN 'At Risk'

            -- New products (less than 30 days old)
            WHEN (CASE
                WHEN ci.created_at IS NULL AND hd.date_first_sold IS NULL THEN 0
                WHEN ci.created_at IS NULL THEN (_current_date - hd.date_first_sold)::integer
                WHEN hd.date_first_sold IS NULL THEN (_current_date - ci.created_at::date)::integer
                ELSE (_current_date - LEAST(ci.created_at::date, hd.date_first_sold))::integer
            END) <= 30 THEN 'New'

            -- If none of the above, assume Healthy
            ELSE 'Healthy'
        END
END AS status
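The status CASE above is order-dependent: the first matching branch wins, so Overstock is checked before Critical, Critical before Reorder Soon, and so on. A hypothetical, heavily simplified JavaScript sketch of that precedence (all field names are illustrative; the SQL derives these inputs inline):

```javascript
// Hypothetical, simplified sketch of the status CASE above. The ordering of
// the checks is the point: each status shadows everything below it.
function productStatus(p) {
  const lead = p.leadTime ?? 30;                        // COALESCE(s.effective_lead_time, 30)
  if (!p.isReplenishable) return 'Healthy';
  if (p.overstockedUnits > 0) return 'Overstock';
  if (p.currentStock <= 0 || p.stockCoverDays <= 0) return 'Critical';
  if (p.stockCoverDays < lead * 0.5) return 'Critical';
  if (p.sellsOutInDays < lead + 7) return 'Reorder Soon';
  if (p.isOldStock) return 'At Risk';                   // old, unsold, nothing inbound
  if (p.daysSinceLastSale > 90 && p.ageDays > 180) return 'At Risk';
  if (p.stockCoverDays > 365) return 'At Risk';         // very high cover
  if (p.ageDays <= 30) return 'New';
  return 'Healthy';
}
```

Note that in the SQL the nested inner CASE under "reorder soon" re-checks the half-lead-time Critical condition; in a linear first-match-wins form like this sketch, the earlier Critical check already covers it.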

FROM CurrentInfo ci
LEFT JOIN OnOrderInfo ooi ON ci.pid = ooi.pid
@@ -306,6 +737,9 @@ BEGIN
ON CONFLICT (pid) DO UPDATE SET
    last_calculated = EXCLUDED.last_calculated,
    sku = EXCLUDED.sku, title = EXCLUDED.title, brand = EXCLUDED.brand, vendor = EXCLUDED.vendor, image_url = EXCLUDED.image_url, is_visible = EXCLUDED.is_visible, is_replenishable = EXCLUDED.is_replenishable,
    barcode = EXCLUDED.barcode, harmonized_tariff_code = EXCLUDED.harmonized_tariff_code, vendor_reference = EXCLUDED.vendor_reference, notions_reference = EXCLUDED.notions_reference, line = EXCLUDED.line, subline = EXCLUDED.subline, artist = EXCLUDED.artist,
    moq = EXCLUDED.moq, rating = EXCLUDED.rating, reviews = EXCLUDED.reviews, weight = EXCLUDED.weight, length = EXCLUDED.length, width = EXCLUDED.width, height = EXCLUDED.height, country_of_origin = EXCLUDED.country_of_origin, location = EXCLUDED.location,
    baskets = EXCLUDED.baskets, notifies = EXCLUDED.notifies, preorder_count = EXCLUDED.preorder_count, notions_inv_count = EXCLUDED.notions_inv_count,
    current_price = EXCLUDED.current_price, current_regular_price = EXCLUDED.current_regular_price, current_cost_price = EXCLUDED.current_cost_price, current_landing_cost_price = EXCLUDED.current_landing_cost_price,
    current_stock = EXCLUDED.current_stock, current_stock_cost = EXCLUDED.current_stock_cost, current_stock_retail = EXCLUDED.current_stock_retail, current_stock_gross = EXCLUDED.current_stock_gross,
    on_order_qty = EXCLUDED.on_order_qty, on_order_cost = EXCLUDED.on_order_cost, on_order_retail = EXCLUDED.on_order_retail, earliest_expected_date = EXCLUDED.earliest_expected_date,
@@ -330,7 +764,8 @@ BEGIN
    to_order_units = EXCLUDED.to_order_units, forecast_lost_sales_units = EXCLUDED.forecast_lost_sales_units, forecast_lost_revenue = EXCLUDED.forecast_lost_revenue,
    stock_cover_in_days = EXCLUDED.stock_cover_in_days, po_cover_in_days = EXCLUDED.po_cover_in_days, sells_out_in_days = EXCLUDED.sells_out_in_days, replenish_date = EXCLUDED.replenish_date,
    overstocked_units = EXCLUDED.overstocked_units, overstocked_cost = EXCLUDED.overstocked_cost, overstocked_retail = EXCLUDED.overstocked_retail, is_old_stock = EXCLUDED.is_old_stock,
    yesterday_sales = EXCLUDED.yesterday_sales
    yesterday_sales = EXCLUDED.yesterday_sales,
    status = EXCLUDED.status
;

-- Update the status table with the timestamp from the START of this run

@@ -13,6 +13,22 @@ const dbConfig = {
    port: process.env.DB_PORT || 5432
};

// Tables to always protect from being dropped
const PROTECTED_TABLES = [
    'users',
    'permissions',
    'user_permissions',
    'calculate_history',
    'import_history',
    'ai_prompts',
    'ai_validation_performance',
    'templates',
    'reusable_images',
    'imported_daily_inventory',
    'imported_product_stat_history',
    'imported_product_current_prices'
];

// Helper function to output progress in JSON format
function outputProgress(data) {
    if (!data.status) {
@@ -33,17 +49,6 @@ const CORE_TABLES = [
    'product_categories'
];

// Config tables that must be created
const CONFIG_TABLES = [
    'stock_thresholds',
    'lead_time_thresholds',
    'sales_velocity_config',
    'abc_classification_config',
    'safety_stock_config',
    'sales_seasonality',
    'turnover_config'
];

// Split SQL into individual statements
function splitSQLStatements(sql) {
    // First, normalize line endings
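The body of `splitSQLStatements` is not shown in this diff; as an illustration only, a minimal version of such a helper might normalize line endings and then split on semicolons that fall outside single-quoted strings (this naive sketch ignores dollar-quoting and comments, which a production splitter would also need to handle):

```javascript
// Hypothetical sketch of a splitSQLStatements-style helper: normalize line
// endings, then split on semicolons that are not inside single-quoted
// strings. Dollar-quoted bodies and comments are deliberately not handled.
function splitSQLStatementsNaive(sql) {
  const normalized = sql.replace(/\r\n/g, '\n');  // normalize line endings first
  const statements = [];
  let current = '';
  let inString = false;
  for (const ch of normalized) {
    if (ch === "'") inString = !inString;
    if (ch === ';' && !inString) {
      if (current.trim()) statements.push(current.trim());
      current = '';
    } else {
      current += ch;
    }
  }
  if (current.trim()) statements.push(current.trim());
  return statements;
}
```

Running statements one at a time like this is what makes the per-statement progress and error reporting later in the script possible.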
@@ -184,8 +189,8 @@ async function resetDatabase() {
    SELECT string_agg(tablename, ', ') as tables
    FROM pg_tables
    WHERE schemaname = 'public'
    AND tablename NOT IN ('users', 'permissions', 'user_permissions', 'calculate_history', 'import_history', 'ai_prompts', 'ai_validation_performance', 'templates', 'reusable_images');
`);
    AND tablename NOT IN (SELECT unnest($1::text[]));
`, [PROTECTED_TABLES]);
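Replacing the hard-coded NOT IN list with `unnest($1::text[])` lets the single PROTECTED_TABLES constant drive the query instead of a duplicated string. A hypothetical sketch of that pattern; `client` here is any object exposing a node-postgres-compatible `query(text, values)`:

```javascript
// Hypothetical sketch of the parameterized exclusion query above. Passing the
// protected list as a text[] parameter keeps it in one constant instead of a
// hard-coded NOT IN (...) string that can drift out of sync.
async function listDroppableTables(client, protectedTables) {
  const result = await client.query(
    `SELECT tablename
       FROM pg_tables
      WHERE schemaname = 'public'
        AND tablename NOT IN (SELECT unnest($1::text[]))`,
    [protectedTables]
  );
  return result.rows.map(r => r.tablename);
}
```

Parameterizing also avoids any quoting issues that building the IN list by string concatenation would invite.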

if (!tablesResult.rows[0].tables) {
    outputProgress({
@@ -204,7 +209,7 @@ async function resetDatabase() {
// Drop all tables except users
const tables = tablesResult.rows[0].tables.split(', ');
for (const table of tables) {
    if (!['users', 'reusable_images'].includes(table)) {
    if (!PROTECTED_TABLES.includes(table)) {
        await client.query(`DROP TABLE IF EXISTS "${table}" CASCADE`);
    }
}
@@ -259,7 +264,9 @@ async function resetDatabase() {
    'category_metrics',
    'brand_metrics',
    'sales_forecasts',
    'abc_classification'
    'abc_classification',
    'daily_snapshots',
    'periodic_metrics'
    )
`);
}
@@ -301,51 +308,67 @@ async function resetDatabase() {
    }
});

for (let i = 0; i < statements.length; i++) {
    const stmt = statements[i];
    try {
        const result = await client.query(stmt);

        // Verify if table was created (if this was a CREATE TABLE statement)
        if (stmt.trim().toLowerCase().startsWith('create table')) {
            const tableName = stmt.match(/create\s+table\s+(?:if\s+not\s+exists\s+)?["]?(\w+)["]?/i)?.[1];
            if (tableName) {
                const tableExists = await client.query(`
                    SELECT COUNT(*) as count
                    FROM information_schema.tables
                    WHERE table_schema = 'public'
                    AND table_name = $1
                `, [tableName]);

                outputProgress({
                    operation: 'Table Creation Verification',
                    message: {
                        table: tableName,
                        exists: tableExists.rows[0].count > 0
                    }
                });
// Start a transaction for better error handling
await client.query('BEGIN');
try {
    for (let i = 0; i < statements.length; i++) {
        const stmt = statements[i];
        try {
            const result = await client.query(stmt);

            // Verify if table was created (if this was a CREATE TABLE statement)
            if (stmt.trim().toLowerCase().startsWith('create table')) {
                const tableName = stmt.match(/create\s+table\s+(?:if\s+not\s+exists\s+)?["]?(\w+)["]?/i)?.[1];
                if (tableName) {
                    const tableExists = await client.query(`
                        SELECT COUNT(*) as count
                        FROM information_schema.tables
                        WHERE table_schema = 'public'
                        AND table_name = $1
                    `, [tableName]);

                    outputProgress({
                        operation: 'Table Creation Verification',
                        message: {
                            table: tableName,
                            exists: tableExists.rows[0].count > 0
                        }
                    });
                }
            }

            outputProgress({
                operation: 'SQL Progress',
                message: {
                    statement: i + 1,
                    total: statements.length,
                    preview: stmt.substring(0, 100) + (stmt.length > 100 ? '...' : ''),
                    rowCount: result.rowCount
                }
            });

            // Commit in chunks of 10 statements to avoid long-running transactions
            if (i > 0 && i % 10 === 0) {
                await client.query('COMMIT');
                await client.query('BEGIN');
            }
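The chunked-commit pattern introduced above runs a long statement list inside transactions but commits every 10 statements, so no single transaction grows unbounded and a failure only rolls back the current, uncommitted chunk. A hypothetical, stripped-down sketch of just that control flow (`client` is any node-postgres-compatible object with `query(text)`):

```javascript
// Hypothetical, generic sketch of the chunked-commit pattern above: execute a
// long list of statements, committing every `chunkSize` statements. On error,
// only the current (uncommitted) chunk is rolled back before rethrowing.
async function runInChunks(client, statements, chunkSize = 10) {
  await client.query('BEGIN');
  try {
    for (let i = 0; i < statements.length; i++) {
      await client.query(statements[i]);
      if (i > 0 && i % chunkSize === 0) {
        await client.query('COMMIT');   // seal the finished chunk
        await client.query('BEGIN');    // open the next one
      }
    }
    await client.query('COMMIT');       // commit the final partial chunk
  } catch (err) {
    await client.query('ROLLBACK');
    throw err;
  }
}
```

The trade-off versus one big transaction is that already-committed chunks survive a later failure, which is acceptable here because every statement is idempotent DDL (`CREATE TABLE IF NOT EXISTS`, `DROP TABLE IF EXISTS`).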
} catch (sqlError) {
|
||||
await client.query('ROLLBACK');
|
||||
outputProgress({
|
||||
status: 'error',
|
||||
operation: 'SQL Error',
|
||||
error: sqlError.message,
|
||||
statement: stmt,
|
||||
statementNumber: i + 1
|
||||
});
|
||||
throw sqlError;
|
||||
}
|
||||
|
||||
outputProgress({
|
||||
operation: 'SQL Progress',
|
||||
message: {
|
||||
statement: i + 1,
|
||||
total: statements.length,
|
||||
preview: stmt.substring(0, 100) + (stmt.length > 100 ? '...' : ''),
|
||||
rowCount: result.rowCount
|
||||
}
|
||||
});
|
||||
} catch (sqlError) {
|
||||
outputProgress({
|
||||
status: 'error',
|
||||
operation: 'SQL Error',
|
||||
error: sqlError.message,
|
||||
statement: stmt,
|
||||
statementNumber: i + 1
|
||||
});
|
||||
throw sqlError;
|
||||
}
|
||||
// Commit the final transaction
|
||||
await client.query('COMMIT');
|
||||
} catch (error) {
|
||||
await client.query('ROLLBACK');
|
||||
throw error;
|
||||
}
|
||||
|
||||
// Verify core tables were created
|
||||
@@ -383,11 +406,25 @@ async function resetDatabase() {
|
||||
operation: 'Running config setup',
|
||||
message: 'Creating configuration tables...'
|
||||
});
|
||||
const configSchemaSQL = fs.readFileSync(
|
||||
path.join(__dirname, '../db/config-schema-new.sql'),
|
||||
'utf8'
|
||||
);
|
||||
const configSchemaPath = path.join(__dirname, '../db/config-schema-new.sql');
|
||||
|
||||
// Verify file exists
|
||||
if (!fs.existsSync(configSchemaPath)) {
|
||||
throw new Error(`Config schema file not found at: ${configSchemaPath}`);
|
||||
}
|
||||
|
||||
const configSchemaSQL = fs.readFileSync(configSchemaPath, 'utf8');
|
||||
|
||||
outputProgress({
|
||||
operation: 'Config Schema file',
|
||||
message: {
|
||||
path: configSchemaPath,
|
||||
exists: fs.existsSync(configSchemaPath),
|
||||
size: fs.statSync(configSchemaPath).size,
|
||||
firstFewLines: configSchemaSQL.split('\n').slice(0, 5).join('\n')
|
||||
}
|
||||
});
|
||||
|
||||
// Execute config schema statements one at a time
|
||||
const configStatements = splitSQLStatements(configSchemaSQL);
|
||||
outputProgress({
|
||||
@@ -401,30 +438,46 @@ async function resetDatabase() {
|
||||
}
|
||||
});
|
||||
|
||||
for (let i = 0; i < configStatements.length; i++) {
|
||||
const stmt = configStatements[i];
|
||||
try {
|
||||
const result = await client.query(stmt);
|
||||
|
||||
outputProgress({
|
||||
operation: 'Config SQL Progress',
|
||||
message: {
|
||||
statement: i + 1,
|
||||
total: configStatements.length,
|
||||
preview: stmt.substring(0, 100) + (stmt.length > 100 ? '...' : ''),
|
||||
rowCount: result.rowCount
|
||||
// Start a transaction for better error handling
|
||||
await client.query('BEGIN');
|
||||
try {
|
||||
for (let i = 0; i < configStatements.length; i++) {
|
||||
const stmt = configStatements[i];
|
||||
try {
|
||||
const result = await client.query(stmt);
|
||||
|
||||
outputProgress({
|
||||
operation: 'Config SQL Progress',
|
||||
message: {
|
||||
statement: i + 1,
|
||||
total: configStatements.length,
|
||||
preview: stmt.substring(0, 100) + (stmt.length > 100 ? '...' : ''),
|
||||
rowCount: result.rowCount
|
||||
}
|
||||
});
|
||||
|
||||
// Commit in chunks of 10 statements to avoid long-running transactions
|
||||
if (i > 0 && i % 10 === 0) {
|
||||
await client.query('COMMIT');
|
||||
await client.query('BEGIN');
|
||||
}
|
||||
});
|
||||
} catch (sqlError) {
|
||||
outputProgress({
|
||||
status: 'error',
|
||||
operation: 'Config SQL Error',
|
||||
error: sqlError.message,
|
||||
statement: stmt,
|
||||
statementNumber: i + 1
|
||||
});
|
||||
throw sqlError;
|
||||
} catch (sqlError) {
|
||||
await client.query('ROLLBACK');
|
||||
outputProgress({
|
||||
status: 'error',
|
||||
operation: 'Config SQL Error',
|
||||
error: sqlError.message,
|
||||
statement: stmt,
|
||||
statementNumber: i + 1
|
||||
});
|
||||
throw sqlError;
|
||||
}
|
||||
}
|
||||
// Commit the final transaction
|
||||
await client.query('COMMIT');
|
||||
} catch (error) {
|
||||
await client.query('ROLLBACK');
|
||||
throw error;
|
||||
}
|
||||
|
||||
// Read and execute metrics schema (metrics tables)
|
||||
@@ -432,11 +485,25 @@ async function resetDatabase() {
|
||||
operation: 'Running metrics setup',
|
||||
message: 'Creating metrics tables...'
|
||||
});
|
||||
const metricsSchemaSQL = fs.readFileSync(
|
||||
path.join(__dirname, '../db/metrics-schema-new.sql'),
|
||||
'utf8'
|
||||
);
|
||||
const metricsSchemaPath = path.join(__dirname, '../db/metrics-schema-new.sql');
|
||||
|
||||
// Verify file exists
|
||||
if (!fs.existsSync(metricsSchemaPath)) {
|
||||
throw new Error(`Metrics schema file not found at: ${metricsSchemaPath}`);
|
||||
}
|
||||
|
||||
const metricsSchemaSQL = fs.readFileSync(metricsSchemaPath, 'utf8');
|
||||
|
||||
outputProgress({
|
||||
operation: 'Metrics Schema file',
|
||||
message: {
|
||||
path: metricsSchemaPath,
|
||||
exists: fs.existsSync(metricsSchemaPath),
|
||||
size: fs.statSync(metricsSchemaPath).size,
|
||||
firstFewLines: metricsSchemaSQL.split('\n').slice(0, 5).join('\n')
|
||||
}
|
||||
});
|
||||
|
||||
// Execute metrics schema statements one at a time
|
||||
const metricsStatements = splitSQLStatements(metricsSchemaSQL);
|
||||
outputProgress({
|
||||
@@ -450,30 +517,46 @@ async function resetDatabase() {
        }
    });

    for (let i = 0; i < metricsStatements.length; i++) {
        const stmt = metricsStatements[i];
        try {
            const result = await client.query(stmt);

            outputProgress({
                operation: 'Metrics SQL Progress',
                message: {
                    statement: i + 1,
                    total: metricsStatements.length,
                    preview: stmt.substring(0, 100) + (stmt.length > 100 ? '...' : ''),
                    rowCount: result.rowCount
    // Start a transaction for better error handling
    await client.query('BEGIN');
    try {
        for (let i = 0; i < metricsStatements.length; i++) {
            const stmt = metricsStatements[i];
            try {
                const result = await client.query(stmt);

                outputProgress({
                    operation: 'Metrics SQL Progress',
                    message: {
                        statement: i + 1,
                        total: metricsStatements.length,
                        preview: stmt.substring(0, 100) + (stmt.length > 100 ? '...' : ''),
                        rowCount: result.rowCount
                    }
                });

                // Commit in chunks of 10 statements to avoid long-running transactions
                if (i > 0 && i % 10 === 0) {
                    await client.query('COMMIT');
                    await client.query('BEGIN');
                }
            });
        } catch (sqlError) {
            outputProgress({
                status: 'error',
                operation: 'Metrics SQL Error',
                error: sqlError.message,
                statement: stmt,
                statementNumber: i + 1
            });
            throw sqlError;
            } catch (sqlError) {
                await client.query('ROLLBACK');
                outputProgress({
                    status: 'error',
                    operation: 'Metrics SQL Error',
                    error: sqlError.message,
                    statement: stmt,
                    statementNumber: i + 1
                });
                throw sqlError;
            }
        }
        // Commit the final transaction
        await client.query('COMMIT');
    } catch (error) {
        await client.query('ROLLBACK');
        throw error;
    }

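The hunk above replaces one long-running transaction with commits every 10 statements. A minimal runnable sketch of that pattern, using a hypothetical synchronous stub client in place of the real PostgreSQL client (names `makeStubClient` and `runStatementsChunked` are illustrative, not from the codebase):

```javascript
// Stub client that records the verb of every query it receives.
function makeStubClient(log) {
  return { query: (sql) => { log.push(sql.split(' ')[0]); return { rowCount: 0 }; } };
}

// Execute statements, committing every `chunkSize` so errors roll back
// at most one chunk and locks are not held for the whole schema load.
function runStatementsChunked(client, statements, chunkSize = 10) {
  client.query('BEGIN');
  try {
    for (let i = 0; i < statements.length; i++) {
      client.query(statements[i]);
      if (i > 0 && i % chunkSize === 0) {
        client.query('COMMIT');
        client.query('BEGIN');
      }
    }
    client.query('COMMIT'); // commit the final (partial) chunk
  } catch (err) {
    client.query('ROLLBACK');
    throw err;
  }
}

const log = [];
runStatementsChunked(makeStubClient(log), Array(25).fill('SELECT 1'));
```

With 25 statements and a chunk size of 10, the stub records intermediate commits at statements 11 and 21 plus the final one.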
    outputProgress({
@@ -490,6 +573,14 @@ async function resetDatabase() {
        });
        process.exit(1);
    } finally {
        // Make sure to re-enable foreign key checks if they were disabled
        try {
            await client.query('SET session_replication_role = \'origin\'');
        } catch (e) {
            console.error('Error re-enabling foreign key checks:', e.message);
        }

        // Close the database connection
        await client.end();
    }
}

@@ -31,7 +31,10 @@ const PROTECTED_TABLES = [
    'ai_prompts',
    'ai_validation_performance',
    'templates',
    'reusable_images'
    'reusable_images',
    'imported_daily_inventory',
    'imported_product_stat_history',
    'imported_product_current_prices'
];

// Split SQL into individual statements

@@ -51,83 +51,67 @@ router.get('/:id', async (req, res) => {
    }
});

// Get prompt by company
router.get('/company/:companyId', async (req, res) => {
// Get prompt by type (general, system, company_specific)
router.get('/by-type', async (req, res) => {
    try {
        const { companyId } = req.params;
        const { type, company } = req.query;
        const pool = req.app.locals.pool;

        if (!pool) {
            throw new Error('Database pool not initialized');
        }

        const result = await pool.query(`
            SELECT * FROM ai_prompts
            WHERE company = $1
        `, [companyId]);

        if (result.rows.length === 0) {
            return res.status(404).json({ error: 'AI prompt not found for this company' });
        }

        res.json(result.rows[0]);
    } catch (error) {
        console.error('Error fetching AI prompt by company:', error);
        res.status(500).json({
            error: 'Failed to fetch AI prompt by company',
            details: error instanceof Error ? error.message : 'Unknown error'
        });
    }
});

// Get general prompt
router.get('/type/general', async (req, res) => {
    try {
        const pool = req.app.locals.pool;
        if (!pool) {
            throw new Error('Database pool not initialized');
        // Validate prompt type
        if (!type || !['general', 'system', 'company_specific'].includes(type)) {
            return res.status(400).json({
                error: 'Valid type query parameter is required (general, system, or company_specific)'
            });
        }

        const result = await pool.query(`
            SELECT * FROM ai_prompts
            WHERE prompt_type = 'general'
        `);

        if (result.rows.length === 0) {
            return res.status(404).json({ error: 'General AI prompt not found' });
        }

        res.json(result.rows[0]);
    } catch (error) {
        console.error('Error fetching general AI prompt:', error);
        res.status(500).json({
            error: 'Failed to fetch general AI prompt',
            details: error instanceof Error ? error.message : 'Unknown error'
        });
    }
});

// Get system prompt
router.get('/type/system', async (req, res) => {
    try {
        const pool = req.app.locals.pool;
        if (!pool) {
            throw new Error('Database pool not initialized');
        // For company_specific type, company ID is required
        if (type === 'company_specific' && !company) {
            return res.status(400).json({
                error: 'Company ID is required for company_specific prompt type'
            });
        }

        const result = await pool.query(`
            SELECT * FROM ai_prompts
            WHERE prompt_type = 'system'
        `);

        if (result.rows.length === 0) {
            return res.status(404).json({ error: 'System AI prompt not found' });
        // For general and system types, company should not be provided
        if ((type === 'general' || type === 'system') && company) {
            return res.status(400).json({
                error: 'Company ID should not be provided for general or system prompt types'
            });
        }

        // Build the query based on the type
        let query, params;
        if (type === 'company_specific') {
            query = 'SELECT * FROM ai_prompts WHERE prompt_type = $1 AND company = $2';
            params = [type, company];
        } else {
            query = 'SELECT * FROM ai_prompts WHERE prompt_type = $1';
            params = [type];
        }

        // Execute the query
        const result = await pool.query(query, params);

        // Check if any prompt was found
        if (result.rows.length === 0) {
            let errorMessage;
            if (type === 'company_specific') {
                errorMessage = `AI prompt not found for company ${company}`;
            } else {
                errorMessage = `${type.charAt(0).toUpperCase() + type.slice(1)} AI prompt not found`;
            }
            return res.status(404).json({ error: errorMessage });
        }

        // Return the first matching prompt
        res.json(result.rows[0]);
    } catch (error) {
        console.error('Error fetching system AI prompt:', error);
        console.error('Error fetching AI prompt by type:', error);
        res.status(500).json({
            error: 'Failed to fetch system AI prompt',
            error: 'Failed to fetch AI prompt',
            details: error instanceof Error ? error.message : 'Unknown error'
        });
    }

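The consolidated `/by-type` endpoint enforces three rules: the type must be one of the three known values, `company_specific` requires a company ID, and `general`/`system` must not carry one. A standalone sketch of that validation logic (the function name `validatePromptQuery` is hypothetical; the real checks live inline in the route handler):

```javascript
const PROMPT_TYPES = ['general', 'system', 'company_specific'];

// Returns an error message for an invalid (type, company) pair, or null if valid.
function validatePromptQuery(type, company) {
  if (!type || !PROMPT_TYPES.includes(type)) {
    return 'Valid type query parameter is required (general, system, or company_specific)';
  }
  if (type === 'company_specific' && !company) {
    return 'Company ID is required for company_specific prompt type';
  }
  if ((type === 'general' || type === 'system') && company) {
    return 'Company ID should not be provided for general or system prompt types';
  }
  return null;
}
```

Factoring the checks into a pure function like this makes the 400-response branches unit-testable without spinning up Express.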
@@ -6,6 +6,7 @@ const path = require("path");
const dotenv = require("dotenv");
const mysql = require('mysql2/promise');
const { Client } = require('ssh2');
const { getDbConnection } = require('../utils/dbConnection'); // Import the optimized connection function

// Ensure environment variables are loaded
dotenv.config({ path: path.join(__dirname, "../../.env") });
@@ -18,50 +19,6 @@ if (!process.env.OPENAI_API_KEY) {
    console.error("Warning: OPENAI_API_KEY is not set in environment variables");
}

// Helper function to setup SSH tunnel to production database
async function setupSshTunnel() {
    const sshConfig = {
        host: process.env.PROD_SSH_HOST,
        port: process.env.PROD_SSH_PORT || 22,
        username: process.env.PROD_SSH_USER,
        privateKey: process.env.PROD_SSH_KEY_PATH
            ? require('fs').readFileSync(process.env.PROD_SSH_KEY_PATH)
            : undefined,
        compress: true
    };

    const dbConfig = {
        host: process.env.PROD_DB_HOST || 'localhost',
        user: process.env.PROD_DB_USER,
        password: process.env.PROD_DB_PASSWORD,
        database: process.env.PROD_DB_NAME,
        port: process.env.PROD_DB_PORT || 3306,
        timezone: 'Z'
    };

    return new Promise((resolve, reject) => {
        const ssh = new Client();

        ssh.on('error', (err) => {
            console.error('SSH connection error:', err);
            reject(err);
        });

        ssh.on('ready', () => {
            ssh.forwardOut(
                '127.0.0.1',
                0,
                dbConfig.host,
                dbConfig.port,
                (err, stream) => {
                    if (err) reject(err);
                    resolve({ ssh, stream, dbConfig });
                }
            );
        }).connect(sshConfig);
    });
}

// Debug endpoint for viewing prompt
router.post("/debug", async (req, res) => {
    try {
@@ -195,16 +152,12 @@ async function generateDebugResponse(productsToUse, res) {
    // Load taxonomy data first
    console.log("Loading taxonomy data...");
    try {
        // Setup MySQL connection via SSH tunnel
        const tunnel = await setupSshTunnel();
        ssh = tunnel.ssh;
        // Use optimized database connection
        const { connection, ssh: connSsh } = await getDbConnection();
        mysqlConnection = connection;
        ssh = connSsh;

        mysqlConnection = await mysql.createConnection({
            ...tunnel.dbConfig,
            stream: tunnel.stream
        });

        console.log("MySQL connection established successfully");
        console.log("MySQL connection established successfully using optimized connection");

        taxonomy = await getTaxonomyData(mysqlConnection);
        console.log("Successfully loaded taxonomy data");
@@ -218,10 +171,6 @@ async function generateDebugResponse(productsToUse, res) {
            errno: taxonomyError.errno || null,
            sql: taxonomyError.sql || null,
        });
    } finally {
        // Make sure we close the connection
        if (mysqlConnection) await mysqlConnection.end();
        if (ssh) ssh.end();
    }

    // Verify the taxonomy data structure
@@ -282,11 +231,8 @@ async function generateDebugResponse(productsToUse, res) {
    console.log("Loading prompt...");

    // Setup a new connection for loading the prompt
    const promptTunnel = await setupSshTunnel();
    const promptConnection = await mysql.createConnection({
        ...promptTunnel.dbConfig,
        stream: promptTunnel.stream
    });
    // Use optimized connection instead of creating a new one
    const { connection: promptConnection } = await getDbConnection();

    try {
        // Get the local PostgreSQL pool to fetch prompts
@@ -296,7 +242,7 @@ async function generateDebugResponse(productsToUse, res) {
            throw new Error("Database connection not available");
        }

        // First, fetch the system prompt
        // First, fetch the system prompt using the consolidated endpoint approach
        const systemPromptResult = await pool.query(`
            SELECT * FROM ai_prompts
            WHERE prompt_type = 'system'
@@ -311,7 +257,7 @@ async function generateDebugResponse(productsToUse, res) {
            console.warn("⚠️ No system prompt found in database, will use default");
        }

        // Then, fetch the general prompt
        // Then, fetch the general prompt using the consolidated endpoint approach
        const generalPromptResult = await pool.query(`
            SELECT * FROM ai_prompts
            WHERE prompt_type = 'general'
@@ -458,7 +404,6 @@ async function generateDebugResponse(productsToUse, res) {
        return response;
    } finally {
        if (promptConnection) await promptConnection.end();
        if (promptTunnel.ssh) promptTunnel.ssh.end();
    }
} catch (error) {
    console.error("Error generating debug response:", error);
@@ -645,7 +590,7 @@ async function loadPrompt(connection, productsToValidate = null, appPool = null)
        throw new Error("Database connection not available");
    }

    // Fetch the system prompt
    // Fetch the system prompt using the consolidated endpoint approach
    const systemPromptResult = await pool.query(`
        SELECT * FROM ai_prompts
        WHERE prompt_type = 'system'
@@ -662,7 +607,7 @@ async function loadPrompt(connection, productsToValidate = null, appPool = null)
        console.warn("⚠️ No system prompt found in database, using default");
    }

    // Fetch the general prompt
    // Fetch the general prompt using the consolidated endpoint approach
    const generalPromptResult = await pool.query(`
        SELECT * FROM ai_prompts
        WHERE prompt_type = 'general'
@@ -926,15 +871,11 @@ router.post("/validate", async (req, res) => {
    let promptLength = 0; // Track prompt length for performance metrics

    try {
        // Setup MySQL connection via SSH tunnel
        console.log("🔄 Setting up connection to production database...");
        const tunnel = await setupSshTunnel();
        ssh = tunnel.ssh;

        connection = await mysql.createConnection({
            ...tunnel.dbConfig,
            stream: tunnel.stream
        });
        // Use the optimized connection utility instead of direct SSH tunnel
        console.log("🔄 Setting up connection to production database using optimized connection...");
        const { ssh: connSsh, connection: connDB } = await getDbConnection();
        ssh = connSsh;
        connection = connDB;

        console.log("🔄 MySQL connection established successfully");

@@ -1238,14 +1179,11 @@ router.get("/test-taxonomy", async (req, res) => {
    let connection = null;

    try {
        // Setup MySQL connection via SSH tunnel
        const tunnel = await setupSshTunnel();
        ssh = tunnel.ssh;

        connection = await mysql.createConnection({
            ...tunnel.dbConfig,
            stream: tunnel.stream
        });
        // Use the optimized connection utility instead of direct SSH tunnel
        console.log("🔄 Setting up connection to production database using optimized connection...");
        const { ssh: connSsh, connection: connDB } = await getDbConnection();
        ssh = connSsh;
        connection = connDB;

        console.log("MySQL connection established successfully for test");

@@ -7,37 +7,33 @@ router.get('/stats', async (req, res) => {
    const pool = req.app.locals.pool;

    const { rows: [results] } = await pool.query(`
        SELECT
            COALESCE(
                ROUND(
                    (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
                    NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
                ),
                0
            ) as profitMargin,
            COALESCE(
                ROUND(
                    (AVG(p.price / NULLIF(p.cost_price, 0) - 1) * 100)::numeric, 1
                ),
                0
            ) as averageMarkup,
            COALESCE(
                ROUND(
                    (SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 2
                ),
                0
            ) as stockTurnoverRate,
            COALESCE(COUNT(DISTINCT p.vendor), 0) as vendorCount,
            COALESCE(COUNT(DISTINCT p.categories), 0) as categoryCount,
            COALESCE(
                ROUND(
                    AVG(o.price * o.quantity)::numeric, 2
                ),
                0
            ) as averageOrderValue
        FROM products p
        LEFT JOIN orders o ON p.pid = o.pid
        WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
        WITH vendor_count AS (
            SELECT COUNT(DISTINCT vendor_name) AS count
            FROM vendor_metrics
        ),
        category_count AS (
            SELECT COUNT(DISTINCT category_id) AS count
            FROM category_metrics
        ),
        metrics_summary AS (
            SELECT
                AVG(margin_30d) AS avg_profit_margin,
                AVG(markup_30d) AS avg_markup,
                AVG(stockturn_30d) AS avg_stock_turnover,
                AVG(asp_30d) AS avg_order_value
            FROM product_metrics
            WHERE sales_30d > 0
        )
        SELECT
            COALESCE(ms.avg_profit_margin, 0) AS profitMargin,
            COALESCE(ms.avg_markup, 0) AS averageMarkup,
            COALESCE(ms.avg_stock_turnover, 0) AS stockTurnoverRate,
            COALESCE(vc.count, 0) AS vendorCount,
            COALESCE(cc.count, 0) AS categoryCount,
            COALESCE(ms.avg_order_value, 0) AS averageOrderValue
        FROM metrics_summary ms
        CROSS JOIN vendor_count vc
        CROSS JOIN category_count cc
    `);

    // Ensure all values are numbers
@@ -84,43 +80,53 @@ router.get('/profit', async (req, res) => {
        JOIN category_path cp ON c.parent_id = cp.cat_id
    )
    SELECT
        c.name as category,
        cp.path as categoryPath,
        ROUND(
            (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
            NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
        ) as profitMargin,
        ROUND(SUM(o.price * o.quantity)::numeric, 3) as revenue,
        ROUND(SUM(p.cost_price * o.quantity)::numeric, 3) as cost
    FROM products p
    LEFT JOIN orders o ON p.pid = o.pid
    JOIN product_categories pc ON p.pid = pc.pid
    JOIN categories c ON pc.cat_id = c.cat_id
    JOIN category_path cp ON c.cat_id = cp.cat_id
    WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
    GROUP BY c.name, cp.path
    ORDER BY profitMargin DESC
        cm.category_name as category,
        COALESCE(cp.path, cm.category_name) as categorypath,
        cm.avg_margin_30d as profitmargin,
        cm.revenue_30d as revenue,
        cm.cogs_30d as cost
    FROM category_metrics cm
    LEFT JOIN category_path cp ON cm.category_id = cp.cat_id
    WHERE cm.revenue_30d > 0
    ORDER BY cm.revenue_30d DESC
    LIMIT 10
    `);

    // Get profit margin trend over time
    // Get profit margin over time
    const { rows: overTime } = await pool.query(`
        SELECT
            to_char(o.date, 'YYYY-MM-DD') as date,
            ROUND(
                (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
                NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
            ) as profitMargin,
            ROUND(SUM(o.price * o.quantity)::numeric, 3) as revenue,
            ROUND(SUM(p.cost_price * o.quantity)::numeric, 3) as cost
        FROM products p
        LEFT JOIN orders o ON p.pid = o.pid
        WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
        GROUP BY to_char(o.date, 'YYYY-MM-DD')
        ORDER BY date
        WITH time_series AS (
            SELECT
                date_trunc('day', generate_series(
                    CURRENT_DATE - INTERVAL '30 days',
                    CURRENT_DATE,
                    '1 day'::interval
                ))::date AS date
        ),
        daily_profits AS (
            SELECT
                snapshot_date as date,
                SUM(net_revenue) as revenue,
                SUM(cogs) as cost,
                CASE
                    WHEN SUM(net_revenue) > 0
                    THEN (SUM(net_revenue - cogs) / SUM(net_revenue)) * 100
                    ELSE 0
                END as profit_margin
            FROM daily_product_snapshots
            WHERE snapshot_date >= CURRENT_DATE - INTERVAL '30 days'
            GROUP BY snapshot_date
        )
        SELECT
            to_char(ts.date, 'YYYY-MM-DD') as date,
            COALESCE(dp.profit_margin, 0) as profitmargin,
            COALESCE(dp.revenue, 0) as revenue,
            COALESCE(dp.cost, 0) as cost
        FROM time_series ts
        LEFT JOIN daily_profits dp ON ts.date = dp.date
        ORDER BY ts.date
    `);

    // Get top performing products with category paths
    // Get top performing products by profit margin
    const { rows: topProducts } = await pool.query(`
        WITH RECURSIVE category_path AS (
            SELECT
@@ -140,26 +146,28 @@ router.get('/profit', async (req, res) => {
                (cp.path || ' > ' || c.name)::text
            FROM categories c
            JOIN category_path cp ON c.parent_id = cp.cat_id
        ),
        product_categories AS (
            SELECT
                pc.pid,
                c.name as category,
                COALESCE(cp.path, c.name) as categorypath
            FROM product_categories pc
            JOIN categories c ON pc.cat_id = c.cat_id
            LEFT JOIN category_path cp ON c.cat_id = cp.cat_id
        )
        SELECT
            p.title as product,
            c.name as category,
            cp.path as categoryPath,
            ROUND(
                (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
                NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
            ) as profitMargin,
            ROUND(SUM(o.price * o.quantity)::numeric, 3) as revenue,
            ROUND(SUM(p.cost_price * o.quantity)::numeric, 3) as cost
        FROM products p
        LEFT JOIN orders o ON p.pid = o.pid
        JOIN product_categories pc ON p.pid = pc.pid
        JOIN categories c ON pc.cat_id = c.cat_id
        JOIN category_path cp ON c.cat_id = cp.cat_id
        WHERE o.date >= CURRENT_DATE - INTERVAL '30 days'
        GROUP BY p.pid, p.title, c.name, cp.path
        HAVING SUM(o.price * o.quantity) > 0
        ORDER BY profitMargin DESC
            pm.title as product,
            COALESCE(pc.category, 'Uncategorized') as category,
            COALESCE(pc.categorypath, 'Uncategorized') as categorypath,
            pm.margin_30d as profitmargin,
            pm.revenue_30d as revenue,
            pm.cogs_30d as cost
        FROM product_metrics pm
        LEFT JOIN product_categories pc ON pm.pid = pc.pid
        WHERE pm.revenue_30d > 100
        AND pm.margin_30d > 0
        ORDER BY pm.margin_30d DESC
        LIMIT 10
    `);

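The rewritten over-time query gap-fills with `generate_series`: it builds a contiguous date axis, then left-joins the daily snapshot aggregates onto it so days with no snapshot come back as zeros instead of disappearing from the chart. The same idea in plain JavaScript, as a sketch (function name `fillDailySeries` and the row shape are illustrative):

```javascript
// Left-join daily rows onto a contiguous date axis ending at `today`,
// emitting zero revenue/cost for days that have no row.
function fillDailySeries(rows, days, today = new Date('2024-01-31')) {
  const byDate = new Map(rows.map(r => [r.date, r]));
  const out = [];
  for (let i = days; i >= 0; i--) {
    const d = new Date(today);
    d.setUTCDate(d.getUTCDate() - i);
    const key = d.toISOString().slice(0, 10); // 'YYYY-MM-DD'
    const row = byDate.get(key);
    out.push({ date: key, revenue: row ? row.revenue : 0, cost: row ? row.cost : 0 });
  }
  return out;
}

const filled = fillDailySeries([{ date: '2024-01-30', revenue: 120, cost: 80 }], 2);
```

In the SQL version this join happens in the database; doing it application-side like this is an alternative when the metrics tables are not available.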
@@ -184,93 +192,52 @@ router.get('/vendors', async (req, res) => {

    console.log('Fetching vendor performance data...');

    // First check if we have any vendors with sales
    const { rows: [checkData] } = await pool.query(`
        SELECT COUNT(DISTINCT p.vendor) as vendor_count,
               COUNT(DISTINCT o.order_number) as order_count
        FROM products p
        LEFT JOIN orders o ON p.pid = o.pid
        WHERE p.vendor IS NOT NULL
    `);

    console.log('Vendor data check:', checkData);

    // Get vendor performance metrics
    // Get vendor performance metrics from the vendor_metrics table
    const { rows: rawPerformance } = await pool.query(`
        WITH monthly_sales AS (
            SELECT
                p.vendor,
                ROUND(SUM(CASE
                    WHEN o.date >= CURRENT_DATE - INTERVAL '30 days'
                    THEN o.price * o.quantity
                    ELSE 0
                END)::numeric, 3) as current_month,
                ROUND(SUM(CASE
                    WHEN o.date >= CURRENT_DATE - INTERVAL '60 days'
                    AND o.date < CURRENT_DATE - INTERVAL '30 days'
                    THEN o.price * o.quantity
                    ELSE 0
                END)::numeric, 3) as previous_month
            FROM products p
            LEFT JOIN orders o ON p.pid = o.pid
            WHERE p.vendor IS NOT NULL
            AND o.date >= CURRENT_DATE - INTERVAL '60 days'
            GROUP BY p.vendor
        )
        SELECT
            p.vendor,
            ROUND(SUM(o.price * o.quantity)::numeric, 3) as sales_volume,
            COALESCE(ROUND(
                (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
                NULLIF(SUM(o.price * o.quantity), 0) * 100)::numeric, 1
            ), 0) as profit_margin,
            COALESCE(ROUND(
                (SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1
            ), 0) as stock_turnover,
            COUNT(DISTINCT p.pid) as product_count,
            ROUND(
                ((ms.current_month / NULLIF(ms.previous_month, 0)) - 1) * 100,
                1
            ) as growth
        FROM products p
        LEFT JOIN orders o ON p.pid = o.pid
        LEFT JOIN monthly_sales ms ON p.vendor = ms.vendor
        WHERE p.vendor IS NOT NULL
        AND o.date >= CURRENT_DATE - INTERVAL '30 days'
        GROUP BY p.vendor, ms.current_month, ms.previous_month
        ORDER BY sales_volume DESC
        LIMIT 10
        SELECT
            vendor_name as vendor,
            revenue_30d as sales_volume,
            avg_margin_30d as profit_margin,
            COALESCE(
                sales_30d / NULLIF(current_stock_units, 0),
                0
            ) as stock_turnover,
            product_count,
            -- Use an estimate of growth based on 7-day vs 30-day revenue
            CASE
                WHEN revenue_30d > 0
                THEN ((revenue_7d * 4.0) / revenue_30d - 1) * 100
                ELSE 0
            END as growth
        FROM vendor_metrics
        WHERE revenue_30d > 0
        ORDER BY revenue_30d DESC
        LIMIT 20
    `);

    // Transform to camelCase properties for frontend consumption
    const performance = rawPerformance.map(item => ({
        vendor: item.vendor,
        salesVolume: Number(item.sales_volume) || 0,
        profitMargin: Number(item.profit_margin) || 0,
        stockTurnover: Number(item.stock_turnover) || 0,
        productCount: Number(item.product_count) || 0,
        growth: Number(item.growth) || 0
    // Format the performance data
    const performance = rawPerformance.map(vendor => ({
        vendor: vendor.vendor,
        salesVolume: Number(vendor.sales_volume) || 0,
        profitMargin: Number(vendor.profit_margin) || 0,
        stockTurnover: Number(vendor.stock_turnover) || 0,
        productCount: Number(vendor.product_count) || 0,
        growth: Number(vendor.growth) || 0
    }));

    // Get vendor comparison metrics (sales per product vs margin)
    const { rows: rawComparison } = await pool.query(`
        SELECT
            p.vendor,
            COALESCE(ROUND(
                SUM(o.price * o.quantity) / NULLIF(COUNT(DISTINCT p.pid), 0),
                2
            ), 0) as sales_per_product,
            COALESCE(ROUND(
                AVG((p.price - p.cost_price) / NULLIF(p.cost_price, 0) * 100),
                2
            ), 0) as average_margin,
            COUNT(DISTINCT p.pid) as size
        FROM products p
        LEFT JOIN orders o ON p.pid = o.pid
        WHERE p.vendor IS NOT NULL
        AND o.date >= CURRENT_DATE - INTERVAL '30 days'
        GROUP BY p.vendor
        HAVING COUNT(DISTINCT p.pid) > 0
            vendor_name as vendor,
            CASE
                WHEN active_product_count > 0
                THEN revenue_30d / active_product_count
                ELSE 0
            END as sales_per_product,
            avg_margin_30d as average_margin,
            product_count as size
        FROM vendor_metrics
        WHERE active_product_count > 0
        ORDER BY sales_per_product DESC
        LIMIT 10
    `);
@@ -294,58 +261,7 @@ router.get('/vendors', async (req, res) => {
        });
    } catch (error) {
        console.error('Error fetching vendor performance:', error);
        console.error('Error details:', error.message);

        // Return dummy data on error with complete structure
        res.json({
            performance: [
                {
                    vendor: "Example Vendor 1",
                    salesVolume: 10000,
                    profitMargin: 25.5,
                    stockTurnover: 3.2,
                    productCount: 15,
                    growth: 12.3
                },
                {
                    vendor: "Example Vendor 2",
                    salesVolume: 8500,
                    profitMargin: 22.8,
                    stockTurnover: 2.9,
                    productCount: 12,
                    growth: 8.7
                },
                {
                    vendor: "Example Vendor 3",
                    salesVolume: 6200,
                    profitMargin: 19.5,
                    stockTurnover: 2.5,
                    productCount: 8,
                    growth: 5.2
                }
            ],
            comparison: [
                {
                    vendor: "Example Vendor 1",
                    salesPerProduct: 650,
                    averageMargin: 35.2,
                    size: 15
                },
                {
                    vendor: "Example Vendor 2",
                    salesPerProduct: 710,
                    averageMargin: 28.5,
                    size: 12
                },
                {
                    vendor: "Example Vendor 3",
                    salesPerProduct: 770,
                    averageMargin: 22.8,
                    size: 8
                }
            ],
            trends: []
        });
        res.status(500).json({ error: 'Failed to fetch vendor performance data' });
    }
});

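The new vendor query's `growth` column is only an estimate: it extrapolates the trailing 7 days of revenue to a 30-day pace (`revenue_7d * 4.0` ≈ 28 days) and compares that with the actual trailing 30-day revenue. A sketch of the same arithmetic, assuming plain numeric inputs (the function name is illustrative):

```javascript
// Estimated growth: how the last week's pace compares with the trailing month.
// Mirrors the SQL CASE guard: no 30-day revenue means 0% rather than division by zero.
function estimateGrowthPct(revenue7d, revenue30d) {
  if (revenue30d <= 0) return 0;
  return ((revenue7d * 4.0) / revenue30d - 1) * 100;
}
```

A vendor whose week annualizes to exactly the trailing month reports 0%; note the ×4 multiplier slightly understates a uniform 30-day month, so small positive biases should be expected.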
@@ -353,108 +269,119 @@ router.get('/vendors', async (req, res) => {
|
||||
router.get('/stock', async (req, res) => {
|
||||
try {
|
||||
const pool = req.app.locals.pool;
|
||||
console.log('Fetching stock analysis data...');
|
||||
|
||||
// Get global configuration values
|
||||
const { rows: configs } = await pool.query(`
|
||||
SELECT
|
||||
st.low_stock_threshold,
|
||||
tc.calculation_period_days as turnover_period
|
||||
FROM stock_thresholds st
|
||||
CROSS JOIN turnover_config tc
|
||||
WHERE st.id = 1 AND tc.id = 1
|
||||
`);
|
||||
|
||||
const config = configs[0] || {
|
||||
low_stock_threshold: 5,
|
||||
turnover_period: 30
|
||||
};
|
||||
// Use the new metrics tables to get data
|
||||
|
||||
// Get turnover by category
|
||||
const { rows: turnoverByCategory } = await pool.query(`
|
||||
SELECT
|
||||
c.name as category,
|
||||
ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) as turnoverRate,
|
||||
ROUND(AVG(p.stock_quantity)::numeric, 0) as averageStock,
|
||||
SUM(o.quantity) as totalSales
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
JOIN product_categories pc ON p.pid = pc.pid
|
||||
JOIN categories c ON pc.cat_id = c.cat_id
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||
GROUP BY c.name
|
||||
HAVING ROUND((SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0))::numeric, 1) > 0
|
||||
ORDER BY turnoverRate DESC
|
||||
LIMIT 10
|
||||
`);
|
||||
|
||||
// Get stock levels over time
|
||||
const { rows: stockLevels } = await pool.query(`
|
||||
SELECT
|
||||
to_char(o.date, 'YYYY-MM-DD') as date,
|
||||
SUM(CASE WHEN p.stock_quantity > $1 THEN 1 ELSE 0 END) as inStock,
|
||||
SUM(CASE WHEN p.stock_quantity <= $1 AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
|
||||
SUM(CASE WHEN p.stock_quantity = 0 THEN 1 ELSE 0 END) as outOfStock
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||
GROUP BY to_char(o.date, 'YYYY-MM-DD')
|
||||
ORDER BY date
|
||||
`, [config.low_stock_threshold]);
|
||||
|
||||
// Get critical stock items
|
||||
const { rows: criticalItems } = await pool.query(`
|
||||
WITH product_thresholds AS (
|
||||
WITH category_metrics_with_path AS (
|
||||
WITH RECURSIVE category_path AS (
|
||||
SELECT
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
c.name::text as path
|
||||
FROM categories c
|
||||
WHERE c.parent_id IS NULL
|
||||
|
||||
UNION ALL
|
||||
|
||||
SELECT
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
(cp.path || ' > ' || c.name)::text
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
)
|
||||
SELECT
|
||||
p.pid,
|
||||
COALESCE(
|
||||
(SELECT reorder_days
|
||||
FROM stock_thresholds st
|
||||
WHERE st.vendor = p.vendor LIMIT 1),
|
||||
(SELECT reorder_days
|
||||
FROM stock_thresholds st
|
||||
WHERE st.vendor IS NULL LIMIT 1),
|
||||
14
|
||||
) as reorder_days
|
||||
FROM products p
|
||||
cm.category_id,
|
||||
cm.category_name,
|
||||
cp.path as category_path,
|
||||
cm.current_stock_units,
|
||||
cm.sales_30d,
|
||||
cm.stock_turn_30d
|
||||
FROM category_metrics cm
|
||||
LEFT JOIN category_path cp ON cm.category_id = cp.cat_id
|
||||
WHERE cm.sales_30d > 0
|
||||
)
|
||||
SELECT
|
||||
p.title as product,
|
||||
p.SKU as sku,
|
||||
p.stock_quantity as stockQuantity,
|
||||
GREATEST(ROUND((AVG(o.quantity) * pt.reorder_days)::numeric), $1) as reorderPoint,
|
||||
ROUND((SUM(o.quantity) / NULLIF(p.stock_quantity, 0))::numeric, 1) as turnoverRate,
|
||||
CASE
|
||||
WHEN p.stock_quantity = 0 THEN 0
|
||||
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||
END as daysUntilStockout
|
||||
FROM products p
|
||||
LEFT JOIN orders o ON p.pid = o.pid
|
||||
JOIN product_thresholds pt ON p.pid = pt.pid
|
||||
WHERE o.date >= CURRENT_DATE - INTERVAL '${config.turnover_period} days'
|
||||
AND p.managing_stock = true
|
||||
GROUP BY p.pid, pt.reorder_days
|
||||
HAVING
|
||||
CASE
|
||||
WHEN p.stock_quantity = 0 THEN 0
|
||||
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||
END < $3
|
||||
AND
|
||||
CASE
|
||||
WHEN p.stock_quantity = 0 THEN 0
|
||||
ELSE ROUND((p.stock_quantity / NULLIF((SUM(o.quantity) / $2), 0))::numeric)
|
||||
END >= 0
|
||||
ORDER BY daysUntilStockout
|
||||
category_name as category,
|
||||
COALESCE(stock_turn_30d, 0) as turnoverRate,
|
||||
current_stock_units as averageStock,
|
||||
sales_30d as totalSales
|
||||
FROM category_metrics_with_path
|
||||
ORDER BY stock_turn_30d DESC NULLS LAST
|
||||
LIMIT 10
|
||||
`, [
|
||||
config.low_stock_threshold,
|
||||
config.turnover_period,
|
||||
config.turnover_period
|
||||
]);
|
||||
|
||||
res.json({ turnoverByCategory, stockLevels, criticalItems });
|
||||
`);
|
||||
|
||||
    // Get stock levels over time (last 30 days)
    const { rows: stockLevels } = await pool.query(`
      WITH date_range AS (
        SELECT generate_series(
          CURRENT_DATE - INTERVAL '30 days',
          CURRENT_DATE,
          '1 day'::interval
        )::date AS date
      ),
      daily_stock_counts AS (
        SELECT
          snapshot_date,
          COUNT(DISTINCT pid) as total_products,
          COUNT(DISTINCT CASE WHEN eod_stock_quantity > 5 THEN pid END) as in_stock,
          COUNT(DISTINCT CASE WHEN eod_stock_quantity <= 5 AND eod_stock_quantity > 0 THEN pid END) as low_stock,
          COUNT(DISTINCT CASE WHEN eod_stock_quantity = 0 THEN pid END) as out_of_stock
        FROM daily_product_snapshots
        WHERE snapshot_date >= CURRENT_DATE - INTERVAL '30 days'
        GROUP BY snapshot_date
      )
      SELECT
        to_char(dr.date, 'YYYY-MM-DD') as date,
        COALESCE(dsc.in_stock, 0) as inStock,
        COALESCE(dsc.low_stock, 0) as lowStock,
        COALESCE(dsc.out_of_stock, 0) as outOfStock
      FROM date_range dr
      LEFT JOIN daily_stock_counts dsc ON dr.date = dsc.snapshot_date
      ORDER BY dr.date
    `);

    // Get critical items (products that need reordering)
    const { rows: criticalItems } = await pool.query(`
      SELECT
        pm.title as product,
        pm.sku as sku,
        pm.current_stock as stockQuantity,
        COALESCE(pm.config_safety_stock, 0) as reorderPoint,
        COALESCE(pm.stockturn_30d, 0) as turnoverRate,
        CASE
          WHEN pm.sales_velocity_daily > 0
          THEN ROUND(pm.current_stock / pm.sales_velocity_daily)
          ELSE 999
        END as daysUntilStockout
      FROM product_metrics pm
      WHERE pm.is_visible = true
        AND pm.is_replenishable = true
        AND pm.sales_30d > 0
        AND pm.current_stock <= pm.config_safety_stock * 2
      ORDER BY
        CASE
          WHEN pm.sales_velocity_daily > 0
          THEN pm.current_stock / pm.sales_velocity_daily
          ELSE 999
        END ASC,
        pm.revenue_30d DESC
      LIMIT 10
    `);

    res.json({
      turnoverByCategory,
      stockLevels,
      criticalItems
    });
  } catch (error) {
    console.error('Error fetching stock analysis:', error);
    res.status(500).json({ error: 'Failed to fetch stock analysis', details: error.message });
  }
});

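In the stockLevels query above, the `date_range` CTE built with `generate_series` exists so that days with no snapshot row still show up as zeros after the LEFT JOIN. The same gap-filling idea can be sketched in plain JavaScript (a standalone illustration, not code from this repo):

```javascript
// Illustrative sketch: dense daily series from sparse snapshots, mirroring
// the generate_series + LEFT JOIN + COALESCE pattern in the SQL above.
// `snapshots` maps ISO dates to counts; missing days come back zero-filled.
function fillDailySeries(startISO, days, snapshots) {
  const out = [];
  const start = new Date(startISO + 'T00:00:00Z');
  for (let i = 0; i < days; i++) {
    const d = new Date(start.getTime() + i * 86400000);
    const key = d.toISOString().slice(0, 10);
    const row = snapshots[key] || { inStock: 0, lowStock: 0, outOfStock: 0 };
    out.push({ date: key, ...row });
  }
  return out;
}

// Only 2024-01-02 has a snapshot; the surrounding days are zero-filled.
const series = fillDailySeries('2024-01-01', 3, {
  '2024-01-02': { inStock: 40, lowStock: 3, outOfStock: 1 },
});
```

Without the calendar scaffold (in SQL or in JS), a day with no sales and no snapshot would simply be absent from the chart data instead of reading as zero.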
@@ -685,99 +612,4 @@ router.get('/categories', async (req, res) => {
  }
});

// Forecast endpoint
// NOTE: MySQL syntax (DATEDIFF, JSON_OBJECT, DATE_FORMAT, `?` placeholders)
// against the PostgreSQL pool — flagged as obsolete/misplaced in the analysis above.
router.get('/forecast', async (req, res) => {
  try {
    const { brand, startDate, endDate } = req.query;
    const pool = req.app.locals.pool;

    const [results] = await pool.query(`
      WITH RECURSIVE category_path AS (
        SELECT
          c.cat_id,
          c.name,
          c.parent_id,
          CAST(c.name AS CHAR(1000)) as path
        FROM categories c
        WHERE c.parent_id IS NULL

        UNION ALL

        SELECT
          c.cat_id,
          c.name,
          c.parent_id,
          CONCAT(cp.path, ' > ', c.name)
        FROM categories c
        JOIN category_path cp ON c.parent_id = cp.cat_id
      ),
      category_metrics AS (
        SELECT
          c.cat_id,
          c.name as category_name,
          cp.path,
          p.brand,
          COUNT(DISTINCT p.pid) as num_products,
          CAST(COALESCE(ROUND(SUM(o.quantity) / DATEDIFF(?, ?), 2), 0) AS DECIMAL(15,3)) as avg_daily_sales,
          COALESCE(SUM(o.quantity), 0) as total_sold,
          CAST(COALESCE(ROUND(SUM(o.quantity) / COUNT(DISTINCT p.pid), 2), 0) AS DECIMAL(15,3)) as avgTotalSold,
          CAST(COALESCE(ROUND(AVG(o.price), 2), 0) AS DECIMAL(15,3)) as avg_price
        FROM categories c
        JOIN product_categories pc ON c.cat_id = pc.cat_id
        JOIN products p ON pc.pid = p.pid
        JOIN category_path cp ON c.cat_id = cp.cat_id
        LEFT JOIN product_metrics pmet ON p.pid = pmet.pid
        LEFT JOIN orders o ON p.pid = o.pid
          AND o.date BETWEEN ? AND ?
          AND o.canceled = false
        WHERE p.brand = ?
          AND pmet.first_received_date BETWEEN ? AND ?
        GROUP BY c.cat_id, c.name, cp.path, p.brand
      ),
      product_details AS (
        SELECT
          p.pid,
          p.title,
          p.SKU,
          p.stock_quantity,
          pc.cat_id,
          pmet.first_received_date,
          COALESCE(SUM(o.quantity), 0) as total_sold,
          CAST(COALESCE(ROUND(AVG(o.price), 2), 0) AS DECIMAL(15,3)) as avg_price
        FROM products p
        JOIN product_categories pc ON p.pid = pc.pid
        JOIN product_metrics pmet ON p.pid = pmet.pid
        LEFT JOIN orders o ON p.pid = o.pid
          AND o.date BETWEEN ? AND ?
          AND o.canceled = false
        WHERE p.brand = ?
          AND pmet.first_received_date BETWEEN ? AND ?
        GROUP BY p.pid, p.title, p.SKU, p.stock_quantity, pc.cat_id, pmet.first_received_date
      )
      SELECT
        cm.*,
        JSON_ARRAYAGG(
          JSON_OBJECT(
            'pid', pd.pid,
            'title', pd.title,
            'SKU', pd.SKU,
            'stock_quantity', pd.stock_quantity,
            'total_sold', pd.total_sold,
            'avg_price', pd.avg_price,
            'first_received_date', DATE_FORMAT(pd.first_received_date, '%Y-%m-%d')
          )
        ) as products
      FROM category_metrics cm
      JOIN product_details pd ON cm.cat_id = pd.cat_id
      GROUP BY cm.cat_id, cm.category_name, cm.path, cm.brand, cm.num_products, cm.avg_daily_sales, cm.total_sold, cm.avgTotalSold, cm.avg_price
      ORDER BY cm.total_sold DESC
    `, [endDate, startDate, startDate, endDate, brand, startDate, endDate, startDate, endDate, brand, startDate, endDate]);

    res.json(results);
  } catch (error) {
    console.error('Error fetching forecast data:', error);
    res.status(500).json({ error: 'Failed to fetch forecast data' });
  }
});

module.exports = router;

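As noted in the analysis at the top of this document, the forecast endpoint above uses MySQL-style `?` placeholders, while PostgreSQL's `pg` driver expects numbered `$1..$n` parameters. If the query were ever ported, the placeholder rewrite is mechanical; a naive sketch (hypothetical helper, not part of the codebase):

```javascript
// Illustrative helper: rewrite MySQL-style `?` placeholders into
// PostgreSQL's numbered `$1..$n` form. Deliberately naive — it does not
// skip `?` inside string literals, which is acceptable for trusted
// templates but not for arbitrary SQL.
function toPgPlaceholders(sql) {
  let n = 0;
  return sql.replace(/\?/g, () => `$${++n}`);
}

const pgSql = toPgPlaceholders(
  'SELECT * FROM orders WHERE date BETWEEN ? AND ? AND brand = ?'
);
```

The functions themselves (`DATEDIFF`, `DATE_FORMAT`, `JSON_OBJECT`) have no such mechanical mapping and would each need a PostgreSQL equivalent (date subtraction, `to_char`, `json_build_object`), which is why the endpoint is flagged as obsolete rather than patched.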
inventory-server/src/routes/brandsAggregate.js (new file)
@@ -0,0 +1,281 @@

const express = require('express');
const router = express.Router();
const { parseValue } = require('../utils/apiHelpers'); // Adjust path if needed

// --- Configuration & Helpers ---
const DEFAULT_PAGE_LIMIT = 50;
const MAX_PAGE_LIMIT = 200;

// Maps query keys to DB columns in brand_metrics
const COLUMN_MAP = {
  brandName: { dbCol: 'bm.brand_name', type: 'string' },
  productCount: { dbCol: 'bm.product_count', type: 'number' },
  activeProductCount: { dbCol: 'bm.active_product_count', type: 'number' },
  replenishableProductCount: { dbCol: 'bm.replenishable_product_count', type: 'number' },
  currentStockUnits: { dbCol: 'bm.current_stock_units', type: 'number' },
  currentStockCost: { dbCol: 'bm.current_stock_cost', type: 'number' },
  currentStockRetail: { dbCol: 'bm.current_stock_retail', type: 'number' },
  sales7d: { dbCol: 'bm.sales_7d', type: 'number' },
  revenue7d: { dbCol: 'bm.revenue_7d', type: 'number' },
  sales30d: { dbCol: 'bm.sales_30d', type: 'number' },
  revenue30d: { dbCol: 'bm.revenue_30d', type: 'number' },
  profit30d: { dbCol: 'bm.profit_30d', type: 'number' },
  cogs30d: { dbCol: 'bm.cogs_30d', type: 'number' },
  sales365d: { dbCol: 'bm.sales_365d', type: 'number' },
  revenue365d: { dbCol: 'bm.revenue_365d', type: 'number' },
  lifetimeSales: { dbCol: 'bm.lifetime_sales', type: 'number' },
  lifetimeRevenue: { dbCol: 'bm.lifetime_revenue', type: 'number' },
  avgMargin30d: { dbCol: 'bm.avg_margin_30d', type: 'number' },
  // Alias
  name: { dbCol: 'bm.brand_name', type: 'string' },
  // Derived status column (computed in a subquery below) for filtering
  status: { dbCol: 'brand_status', type: 'string' },
};

function getSafeColumnInfo(queryParamKey) {
  return COLUMN_MAP[queryParamKey] || null;
}

// --- Route Handlers ---

// GET /brands-aggregate/filter-options (Just brands list for now)
router.get('/filter-options', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /brands-aggregate/filter-options');
  try {
    // Get brand names
    const { rows: brandRows } = await pool.query(`
      SELECT DISTINCT brand_name FROM public.brand_metrics ORDER BY brand_name
    `);

    // Get status values - calculated on the fly since they're derived
    const { rows: statusRows } = await pool.query(`
      SELECT DISTINCT
        CASE
          WHEN active_product_count > 0 AND sales_30d > 0 THEN 'active'
          WHEN active_product_count > 0 THEN 'inactive'
          ELSE 'pending'
        END as status
      FROM public.brand_metrics
      ORDER BY status
    `);

    res.json({
      brands: brandRows.map(r => r.brand_name),
      statuses: statusRows.map(r => r.status)
    });
  } catch (error) {
    console.error('Error fetching brand filter options:', error);
    res.status(500).json({ error: 'Failed to fetch filter options' });
  }
});

// GET /brands-aggregate/stats (Overall brand stats)
router.get('/stats', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /brands-aggregate/stats');
  try {
    const { rows: [stats] } = await pool.query(`
      SELECT
        COUNT(*) AS total_brands,
        COUNT(CASE WHEN active_product_count > 0 THEN 1 END) AS active_brands,
        SUM(active_product_count) AS total_active_products,
        SUM(current_stock_cost) AS total_stock_value,
        -- Weighted Average Margin
        SUM(profit_30d) * 100.0 / NULLIF(SUM(revenue_30d), 0) AS overall_avg_margin_weighted
      FROM public.brand_metrics bm
    `);

    res.json({
      totalBrands: parseInt(stats?.total_brands || 0),
      activeBrands: parseInt(stats?.active_brands || 0),
      totalActiveProducts: parseInt(stats?.total_active_products || 0),
      totalValue: parseFloat(stats?.total_stock_value || 0),
      avgMargin: parseFloat(stats?.overall_avg_margin_weighted || 0),
    });
  } catch (error) {
    console.error('Error fetching brand stats:', error);
    res.status(500).json({ error: 'Failed to fetch brand stats.' });
  }
});

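The stats query above computes a revenue-weighted margin, `SUM(profit_30d) * 100 / SUM(revenue_30d)`, rather than averaging each brand's margin percentage. With made-up numbers, the two can differ substantially (a standalone sketch, not code from this repo):

```javascript
// Sketch with invented figures: a large low-margin brand and a small
// high-margin brand. The weighted figure reflects where the money is;
// the simple average treats both brands as equals.
const brands = [
  { revenue30d: 9000, profit30d: 900 }, // 10% margin, big brand
  { revenue30d: 1000, profit30d: 500 }, // 50% margin, small brand
];

const totalProfit = brands.reduce((s, b) => s + b.profit30d, 0);
const totalRevenue = brands.reduce((s, b) => s + b.revenue30d, 0);

// Mirrors SUM(profit_30d) * 100.0 / NULLIF(SUM(revenue_30d), 0)
const weightedMargin = (totalProfit * 100) / (totalRevenue || 1);

// Naive per-brand average of margin percentages
const simpleMargin =
  brands.reduce((s, b) => s + (b.profit30d * 100) / b.revenue30d, 0) /
  brands.length;
```

Here the weighted margin is 14% while the simple average is 30%, which is why the query weights by revenue.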
// GET /brands-aggregate/ (List brands)
router.get('/', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /brands-aggregate received query:', req.query);
  try {
    // --- Pagination ---
    let page = parseInt(req.query.page, 10) || 1;
    let limit = parseInt(req.query.limit, 10) || DEFAULT_PAGE_LIMIT;
    limit = Math.min(limit, MAX_PAGE_LIMIT);
    const offset = (page - 1) * limit;

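The pagination arithmetic above (clamped limit, derived offset) restated as a standalone function for a quick sanity check (illustrative only; the route inlines this logic rather than calling a helper):

```javascript
// Same clamp-and-offset arithmetic as the route above, with this file's
// limits (DEFAULT_PAGE_LIMIT = 50, MAX_PAGE_LIMIT = 200) as defaults.
function paginate(query, { def = 50, max = 200 } = {}) {
  const page = parseInt(query.page, 10) || 1;
  let limit = parseInt(query.limit, 10) || def;
  limit = Math.min(limit, max);
  return { page, limit, offset: (page - 1) * limit };
}

// An oversized limit is clamped to 200, and page 3 starts after two full pages.
const p = paginate({ page: '3', limit: '500' });
```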
    // --- Sorting ---
    const sortQueryKey = req.query.sort || 'brandName'; // Default sort
    const sortColumnInfo = getSafeColumnInfo(sortQueryKey);
    const sortColumn = sortColumnInfo ? sortColumnInfo.dbCol : 'bm.brand_name';
    const sortDirection = req.query.order?.toLowerCase() === 'desc' ? 'DESC' : 'ASC';
    const nullsOrder = (sortDirection === 'ASC' ? 'NULLS FIRST' : 'NULLS LAST');
    const sortClause = `ORDER BY ${sortColumn} ${sortDirection} ${nullsOrder}`;

    // --- Filtering ---
    const conditions = [];
    const params = [];
    let paramCounter = 1;
    // Build conditions based on req.query, using COLUMN_MAP and parseValue
    for (const key in req.query) {
      if (['page', 'limit', 'sort', 'order'].includes(key)) continue;

      let filterKey = key;
      let operator = '='; // Default operator
      const value = req.query[key];

      const operatorMatch = key.match(/^(.*)_(eq|ne|gt|gte|lt|lte|like|ilike|between|in)$/);
      if (operatorMatch) {
        filterKey = operatorMatch[1];
        operator = operatorMatch[2];
      }

      const columnInfo = getSafeColumnInfo(filterKey);
      if (columnInfo) {
        const dbColumn = columnInfo.dbCol;
        const valueType = columnInfo.type;
        let needsParam = true; // Declared outside the try so the catch block can see it
        try {
          let conditionFragment = '';
          switch (operator.toLowerCase()) { // Normalize operator
            case 'eq': operator = '='; break;
            case 'ne': operator = '<>'; break;
            case 'gt': operator = '>'; break;
            case 'gte': operator = '>='; break;
            case 'lt': operator = '<'; break;
            case 'lte': operator = '<='; break;
            case 'like': operator = 'LIKE'; needsParam = false; params.push(`%${parseValue(value, valueType)}%`); break;
            case 'ilike': operator = 'ILIKE'; needsParam = false; params.push(`%${parseValue(value, valueType)}%`); break;
            case 'between': {
              const [val1, val2] = String(value).split(',');
              if (val1 !== undefined && val2 !== undefined) {
                conditionFragment = `${dbColumn} BETWEEN $${paramCounter++} AND $${paramCounter++}`;
                params.push(parseValue(val1, valueType), parseValue(val2, valueType));
                needsParam = false;
              } else continue;
              break;
            }
            case 'in': {
              const inValues = String(value).split(',');
              if (inValues.length > 0) {
                const placeholders = inValues.map(() => `$${paramCounter++}`).join(', ');
                conditionFragment = `${dbColumn} IN (${placeholders})`;
                params.push(...inValues.map(v => parseValue(v, valueType)));
                needsParam = false;
              } else continue;
              break;
            }
            default: operator = '='; break;
          }

          if (needsParam) {
            conditionFragment = `${dbColumn} ${operator} $${paramCounter++}`;
            params.push(parseValue(value, valueType));
          } else if (!conditionFragment) { // For LIKE/ILIKE: value was already pushed above, so just emit the placeholder
            conditionFragment = `${dbColumn} ${operator} $${paramCounter++}`;
          }

          if (conditionFragment) {
            conditions.push(`(${conditionFragment})`);
          }
        } catch (parseError) {
          console.warn(`Skipping filter for key "${key}" due to parsing error: ${parseError.message}`);
          if (needsParam) paramCounter--; // Roll back counter if the param push failed
        }
      } else {
        console.warn(`Invalid filter key ignored: ${key}`);
      }
    }

    // --- Execute Queries ---
    const whereClause = conditions.length > 0 ? `WHERE ${conditions.join(' AND ')}` : '';

    // Status calculation similar to vendors
    const statusCase = `
      CASE
        WHEN active_product_count > 0 AND sales_30d > 0 THEN 'active'
        WHEN active_product_count > 0 THEN 'inactive'
        ELSE 'pending'
      END as brand_status
    `;

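The `brand_status` CASE above derives a three-way status from two counters: a brand with recent sales is `active`, one with stocked products but no recent sales is `inactive`, and everything else is `pending`. The same rule as a plain function (illustrative; nothing like this is exported in the repo):

```javascript
// Mirrors the SQL CASE expression used for brand_status above.
// Order matters: the active check must come before the inactive check.
function brandStatus({ activeProductCount, sales30d }) {
  if (activeProductCount > 0 && sales30d > 0) return 'active';
  if (activeProductCount > 0) return 'inactive';
  return 'pending';
}

const s = brandStatus({ activeProductCount: 5, sales30d: 0 });
```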
    const baseSql = `
      FROM (
        SELECT
          bm.*,
          ${statusCase}
        FROM public.brand_metrics bm
      ) bm
      ${whereClause}
    `;

    const countSql = `SELECT COUNT(*) AS total ${baseSql}`;
    const dataSql = `
      WITH brand_data AS (
        SELECT
          bm.*,
          ${statusCase}
        FROM public.brand_metrics bm
      )
      SELECT bm.*
      FROM brand_data bm
      ${whereClause}
      ${sortClause}
      LIMIT $${paramCounter} OFFSET $${paramCounter + 1}
    `;
    const dataParams = [...params, limit, offset];

    console.log("Count SQL:", countSql, params);
    console.log("Data SQL:", dataSql, dataParams);

    const [countResult, dataResult] = await Promise.all([
      pool.query(countSql, params),
      pool.query(dataSql, dataParams)
    ]);

    const total = parseInt(countResult.rows[0].total, 10);
    const brands = dataResult.rows.map(row => {
      // Create a new object with both snake_case and camelCase keys
      const transformedRow = { ...row }; // Start with original data

      for (const key in row) {
        // Skip null/undefined values
        if (row[key] === null || row[key] === undefined) {
          continue; // Original already has the null value
        }

        // Transform keys to match frontend expectations (add camelCase versions)
        // First handle cases like sales_7d -> sales7d
        let camelKey = key.replace(/_(\d+[a-z])/g, '$1');

        // Then handle regular snake_case -> camelCase
        camelKey = camelKey.replace(/_([a-z])/g, (_, letter) => letter.toUpperCase());
        if (camelKey !== key) { // Only add if different from original
          transformedRow[camelKey] = row[key];
        }
      }
      return transformedRow;
    });

    // --- Respond ---
    res.json({
      brands,
      pagination: { total, pages: Math.ceil(total / limit), currentPage: page, limit },
    });

  } catch (error) {
    console.error('Error fetching brand metrics list:', error);
    res.status(500).json({ error: 'Failed to fetch brand metrics.' });
  }
});

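The row-shaping loop above adds camelCase aliases in two regex passes: numeric suffixes like `_7d` are merged first, then the remaining snake_case is camelized. Isolated as a helper (hypothetical name `toCamel`; the route inlines the two `replace` calls):

```javascript
// The two-step key transform used when shaping rows above.
// Pass 1 merges numeric suffixes:  sales_7d    -> sales7d
// Pass 2 camelizes the remainder:  brand_name  -> brandName
function toCamel(key) {
  const merged = key.replace(/_(\d+[a-z])/g, '$1');
  return merged.replace(/_([a-z])/g, (_, letter) => letter.toUpperCase());
}
```

Doing the numeric pass first matters: without it, `sales_7d` would camelize to `sales_7d` unchanged (the second regex only matches `_` followed by a letter) and never line up with the `sales7d` key the frontend and COLUMN_MAP use.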
// GET /brands-aggregate/:name (Get single brand metric)
// Implement if needed; remember to URL-decode the name parameter

module.exports = router;

@@ -1,100 +0,0 @@

const express = require('express');
const router = express.Router();

// Get all categories
router.get('/', async (req, res) => {
  const pool = req.app.locals.pool;
  try {
    // Get all categories with metrics and hierarchy info
    const { rows: categories } = await pool.query(`
      SELECT
        c.cat_id,
        c.name,
        c.type,
        c.parent_id,
        c.description,
        c.status,
        p.name as parent_name,
        p.type as parent_type,
        COALESCE(cm.product_count, 0) as product_count,
        COALESCE(cm.active_products, 0) as active_products,
        ROUND(COALESCE(cm.total_value, 0)::numeric, 3) as total_value,
        COALESCE(cm.avg_margin, 0) as avg_margin,
        COALESCE(cm.turnover_rate, 0) as turnover_rate,
        COALESCE(cm.growth_rate, 0) as growth_rate
      FROM categories c
      LEFT JOIN categories p ON c.parent_id = p.cat_id
      LEFT JOIN category_metrics cm ON c.cat_id = cm.category_id
      ORDER BY
        CASE
          WHEN c.type = 10 THEN 1 -- sections first
          WHEN c.type = 11 THEN 2 -- categories second
          WHEN c.type = 12 THEN 3 -- subcategories third
          WHEN c.type = 13 THEN 4 -- subsubcategories fourth
          WHEN c.type = 20 THEN 5 -- themes fifth
          WHEN c.type = 21 THEN 6 -- subthemes last
          ELSE 7
        END,
        c.name ASC
    `);

    // Get overall stats
    const { rows: [stats] } = await pool.query(`
      SELECT
        COUNT(DISTINCT c.cat_id) as totalCategories,
        COUNT(DISTINCT CASE WHEN c.status = 'active' THEN c.cat_id END) as activeCategories,
        ROUND(COALESCE(SUM(cm.total_value), 0)::numeric, 3) as totalValue,
        COALESCE(ROUND(AVG(NULLIF(cm.avg_margin, 0))::numeric, 1), 0) as avgMargin,
        COALESCE(ROUND(AVG(NULLIF(cm.growth_rate, 0))::numeric, 1), 0) as avgGrowth
      FROM categories c
      LEFT JOIN category_metrics cm ON c.cat_id = cm.category_id
    `);

    // Get type counts for filtering
    const { rows: typeCounts } = await pool.query(`
      SELECT
        type,
        COUNT(*)::integer as count
      FROM categories
      GROUP BY type
      ORDER BY type
    `);

    res.json({
      categories: categories.map(cat => ({
        cat_id: cat.cat_id,
        name: cat.name,
        type: cat.type,
        parent_id: cat.parent_id,
        parent_name: cat.parent_name,
        parent_type: cat.parent_type,
        description: cat.description,
        status: cat.status,
        metrics: {
          product_count: parseInt(cat.product_count),
          active_products: parseInt(cat.active_products),
          total_value: parseFloat(cat.total_value),
          avg_margin: parseFloat(cat.avg_margin),
          turnover_rate: parseFloat(cat.turnover_rate),
          growth_rate: parseFloat(cat.growth_rate)
        }
      })),
      typeCounts: typeCounts.map(tc => ({
        type: tc.type,
        count: tc.count // Already cast to integer in the query
      })),
      stats: {
        totalCategories: parseInt(stats.totalcategories), // Postgres lowercases unquoted aliases
        activeCategories: parseInt(stats.activecategories),
        totalValue: parseFloat(stats.totalvalue),
        avgMargin: parseFloat(stats.avgmargin),
        avgGrowth: parseFloat(stats.avggrowth)
      }
    });
  } catch (error) {
    console.error('Error fetching categories:', error);
    res.status(500).json({ error: 'Failed to fetch categories' });
  }
});

module.exports = router;

inventory-server/src/routes/categoriesAggregate.js (new file)
@@ -0,0 +1,360 @@

const express = require('express');
const router = express.Router();
const { parseValue } = require('../utils/apiHelpers'); // Adjust path if needed

// --- Configuration & Helpers ---
const DEFAULT_PAGE_LIMIT = 50;
const MAX_PAGE_LIMIT = 5000; // Increased to allow retrieving all categories in one request

// Maps query keys to DB columns in category_metrics and categories tables
const COLUMN_MAP = {
  categoryId: { dbCol: 'cm.category_id', type: 'integer' },
  categoryName: { dbCol: 'cm.category_name', type: 'string' }, // From aggregate table
  categoryType: { dbCol: 'cm.category_type', type: 'integer' }, // From aggregate table
  parentId: { dbCol: 'cm.parent_id', type: 'integer' }, // From aggregate table
  parentName: { dbCol: 'p.name', type: 'string' }, // Requires JOIN to categories
  productCount: { dbCol: 'cm.product_count', type: 'number' },
  activeProductCount: { dbCol: 'cm.active_product_count', type: 'number' },
  replenishableProductCount: { dbCol: 'cm.replenishable_product_count', type: 'number' },
  currentStockUnits: { dbCol: 'cm.current_stock_units', type: 'number' },
  currentStockCost: { dbCol: 'cm.current_stock_cost', type: 'number' },
  currentStockRetail: { dbCol: 'cm.current_stock_retail', type: 'number' },
  sales7d: { dbCol: 'cm.sales_7d', type: 'number' },
  revenue7d: { dbCol: 'cm.revenue_7d', type: 'number' },
  sales30d: { dbCol: 'cm.sales_30d', type: 'number' },
  revenue30d: { dbCol: 'cm.revenue_30d', type: 'number' },
  profit30d: { dbCol: 'cm.profit_30d', type: 'number' },
  cogs30d: { dbCol: 'cm.cogs_30d', type: 'number' },
  sales365d: { dbCol: 'cm.sales_365d', type: 'number' },
  revenue365d: { dbCol: 'cm.revenue_365d', type: 'number' },
  lifetimeSales: { dbCol: 'cm.lifetime_sales', type: 'number' },
  lifetimeRevenue: { dbCol: 'cm.lifetime_revenue', type: 'number' },
  avgMargin30d: { dbCol: 'cm.avg_margin_30d', type: 'number' },
  stockTurn30d: { dbCol: 'cm.stock_turn_30d', type: 'number' },
  // Status from the categories table, for filtering
  status: { dbCol: 'c.status', type: 'string' },
};

function getSafeColumnInfo(queryParamKey) {
  return COLUMN_MAP[queryParamKey] || null;
}

// Type Labels (Consider moving to a shared config or fetching from DB)
const TYPE_LABELS = {
  10: 'Section', 11: 'Category', 12: 'Subcategory', 13: 'Sub-subcategory',
  1: 'Company', 2: 'Line', 3: 'Subline', 40: 'Artist', // From old schema comments
  20: 'Theme', 21: 'Subtheme' // Additional types from categories.js
};

// --- Route Handlers ---

// GET /categories-aggregate/filter-options
router.get('/filter-options', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /categories-aggregate/filter-options');
  try {
    // Fetch distinct types directly from the aggregate table if reliable,
    // or join with the categories table if the source of truth is needed
    const { rows: typeRows } = await pool.query(`
      SELECT DISTINCT category_type
      FROM public.category_metrics
      ORDER BY category_type
    `);

    const typeOptions = typeRows.map(r => ({
      value: r.category_type,
      label: TYPE_LABELS[r.category_type] || `Type ${r.category_type}` // Add labels
    }));

    // Add status options for filtering (from categories.js)
    const { rows: statusRows } = await pool.query(`
      SELECT DISTINCT status FROM public.categories ORDER BY status
    `);

    // Get type counts (from categories.js)
    const { rows: typeCounts } = await pool.query(`
      SELECT
        type,
        COUNT(*)::integer as count
      FROM categories
      GROUP BY type
      ORDER BY type
    `);

    res.json({
      types: typeOptions,
      statuses: statusRows.map(r => r.status),
      typeCounts: typeCounts.map(tc => ({
        type: tc.type,
        count: tc.count
      }))
    });
  } catch (error) {
    console.error('Error fetching category filter options:', error);
    res.status(500).json({ error: 'Failed to fetch filter options' });
  }
});

// GET /categories-aggregate/stats
router.get('/stats', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /categories-aggregate/stats');
  try {
    // Calculate stats directly from the aggregate table
    const { rows: [stats] } = await pool.query(`
      SELECT
        COUNT(*) AS total_categories,
        -- Count active based on the source categories table status
        COUNT(CASE WHEN c.status = 'active' THEN cm.category_id END) AS active_categories,
        SUM(cm.active_product_count) AS total_active_products, -- Sum from aggregates
        SUM(cm.current_stock_cost) AS total_stock_value, -- Sum from aggregates
        -- Weighted Average Margin (Revenue as weight)
        SUM(cm.profit_30d) * 100.0 / NULLIF(SUM(cm.revenue_30d), 0) AS overall_avg_margin_weighted,
        -- Simple Average Margin (less accurate if categories vary greatly in size)
        AVG(NULLIF(cm.avg_margin_30d, 0)) AS overall_avg_margin_simple
        -- Growth rate can be calculated from 30d vs previous 30d revenue if needed
      FROM public.category_metrics cm
      JOIN public.categories c ON cm.category_id = c.cat_id -- Join to check category status
    `);

    res.json({
      totalCategories: parseInt(stats?.total_categories || 0),
      activeCategories: parseInt(stats?.active_categories || 0), // Based on categories.status
      totalActiveProducts: parseInt(stats?.total_active_products || 0),
      totalValue: parseFloat(stats?.total_stock_value || 0),
      // Prefer the weighted margin; fall back to the simple average
      avgMargin: parseFloat(stats?.overall_avg_margin_weighted || stats?.overall_avg_margin_simple || 0)
      // Growth rate could be added if we implement the calculation
    });
  } catch (error) {
    console.error('Error fetching category stats:', error);
    res.status(500).json({ error: 'Failed to fetch category stats.' });
  }
});

// GET /categories-aggregate/ (List categories)
router.get('/', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /categories-aggregate received query:', req.query);
  try {
    // --- Pagination ---
    let page = parseInt(req.query.page, 10) || 1;
    let limit = parseInt(req.query.limit, 10) || DEFAULT_PAGE_LIMIT;
    limit = Math.min(limit, MAX_PAGE_LIMIT);
    const offset = (page - 1) * limit;

    // --- Sorting ---
    const sortQueryKey = req.query.sort || 'categoryName';
    const sortColumnInfo = getSafeColumnInfo(sortQueryKey);

    // Hierarchical sorting logic from categories.js
    const hierarchicalSortOrder = `
      ORDER BY
        CASE
          WHEN cm.category_type = 10 THEN 1 -- sections first
          WHEN cm.category_type = 11 THEN 2 -- categories second
          WHEN cm.category_type = 12 THEN 3 -- subcategories third
          WHEN cm.category_type = 13 THEN 4 -- subsubcategories fourth
          WHEN cm.category_type = 20 THEN 5 -- themes fifth
          WHEN cm.category_type = 21 THEN 6 -- subthemes last
          ELSE 7
        END,
        cm.category_name ASC
    `;

    // Use hierarchical sort as default
    let sortClause = hierarchicalSortOrder;

    // Override with custom sort if specified
    if (sortColumnInfo && sortQueryKey !== 'categoryName') {
      const sortColumn = sortColumnInfo.dbCol;
      const sortDirection = req.query.order?.toLowerCase() === 'desc' ? 'DESC' : 'ASC';
      const nullsOrder = (sortDirection === 'ASC' ? 'NULLS FIRST' : 'NULLS LAST');
      sortClause = `ORDER BY ${sortColumn} ${sortDirection} ${nullsOrder}`;
    }

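The hierarchical ORDER BY above can be mirrored in memory with a rank table, which is handy for checking the intended order without a database. A standalone sketch (the sample rows are made up):

```javascript
// In-memory mirror of the hierarchical sort above: rank by category type
// (sections, categories, subcategories, sub-subcategories, themes,
// subthemes, then everything else), breaking ties by name.
const TYPE_RANK = { 10: 1, 11: 2, 12: 3, 13: 4, 20: 5, 21: 6 };

function hierarchicalCompare(a, b) {
  const ra = TYPE_RANK[a.category_type] || 7;
  const rb = TYPE_RANK[b.category_type] || 7;
  return ra - rb || a.category_name.localeCompare(b.category_name);
}

// Sections (type 10) sort before subthemes (type 21), alphabetical within a type.
const sorted = [
  { category_type: 21, category_name: 'Dragons' },
  { category_type: 10, category_name: 'Miniatures' },
  { category_type: 10, category_name: 'Books' },
].sort(hierarchicalCompare);
```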
    // --- Filtering ---
    const conditions = [];
    const params = [];
    let paramCounter = 1;

    console.log("Starting to process filters from query:", req.query);

    // Add filters based on req.query using COLUMN_MAP and parseValue
    for (const key in req.query) {
      if (['page', 'limit', 'sort', 'order'].includes(key)) continue;

      let filterKey = key;
      let operator = '='; // Default operator
      const value = req.query[key];

      console.log(`Processing filter key: "${key}" with value: "${value}"`);

      const operatorMatch = key.match(/^(.*)_(eq|ne|gt|gte|lt|lte|like|ilike|between|in)$/);
      if (operatorMatch) {
        filterKey = operatorMatch[1];
        operator = operatorMatch[2];
        console.log(`Parsed filter key: "${filterKey}" with operator: "${operator}"`);
      }

      // Special case: parentName requires the join to categories p (always included below)
      const requiresJoin = filterKey === 'parentName';
      const columnInfo = getSafeColumnInfo(filterKey);

      if (columnInfo) {
        console.log(`Column info for "${filterKey}":`, columnInfo);
        const dbColumn = columnInfo.dbCol;
        const valueType = columnInfo.type;
        let needsParam = true; // Declared outside the try so the catch block can see it
        try {
          let conditionFragment = '';
          switch (operator.toLowerCase()) {
            case 'eq': operator = '='; break;
            case 'ne': operator = '<>'; break;
            case 'gt': operator = '>'; break;
            case 'gte': operator = '>='; break;
            case 'lt': operator = '<'; break;
            case 'lte': operator = '<='; break;
            case 'like': operator = 'LIKE'; needsParam = false; params.push(`%${parseValue(value, valueType)}%`); break;
            case 'ilike': operator = 'ILIKE'; needsParam = false; params.push(`%${parseValue(value, valueType)}%`); break;
            case 'between': {
              const [val1, val2] = String(value).split(',');
              if (val1 !== undefined && val2 !== undefined) {
                conditionFragment = `${dbColumn} BETWEEN $${paramCounter++} AND $${paramCounter++}`;
                params.push(parseValue(val1, valueType), parseValue(val2, valueType));
                needsParam = false;
              } else continue;
              break;
            }
            case 'in': {
              const inValues = String(value).split(',');
              if (inValues.length > 0) {
                const placeholders = inValues.map(() => `$${paramCounter++}`).join(', ');
                conditionFragment = `${dbColumn} IN (${placeholders})`;
                params.push(...inValues.map(v => parseValue(v, valueType)));
                needsParam = false;
              } else continue;
              break;
            }
            default: operator = '='; break;
          }

          if (needsParam) {
            try {
              // Special handling for categoryType to ensure an integer reaches the driver
              if (filterKey === 'categoryType') {
                console.log(`Special handling for categoryType: ${value}`);
                const numericValue = parseInt(value, 10);
                if (!isNaN(numericValue)) {
                  console.log(`Successfully converted categoryType to integer: ${numericValue}`);
                  conditionFragment = `${dbColumn} ${operator} $${paramCounter++}`;
                  params.push(numericValue);
                } else {
                  console.error(`Failed to convert categoryType to integer: "${value}"`);
                  throw new Error(`Invalid categoryType value: "${value}"`);
                }
              } else {
                // Normal handling for other fields
                const parsedValue = parseValue(value, valueType);
                console.log(`Parsed "${value}" as ${valueType}: ${parsedValue}`);
                conditionFragment = `${dbColumn} ${operator} $${paramCounter++}`;
                params.push(parsedValue);
              }
            } catch (innerError) {
              console.error(`Failed to parse "${value}" as ${valueType}:`, innerError);
              throw innerError;
            }
          } else if (!conditionFragment) { // For LIKE/ILIKE: value was already pushed in the switch, so just emit the placeholder
            conditionFragment = `${dbColumn} ${operator} $${paramCounter++}`;
          }

          if (conditionFragment) {
            console.log(`Adding condition: ${conditionFragment}`);
            conditions.push(`(${conditionFragment})`);
          }
        } catch (parseError) {
          console.error(`Skipping filter for key "${key}" due to parsing error:`, parseError);
          if (needsParam) paramCounter--; // Roll back counter if the param push failed
        }
      } else {
        console.warn(`Invalid filter key ignored: "${key}", not found in COLUMN_MAP`);
      }
    }

// --- Execute Queries ---
|
||||
const whereClause = conditions.length > 0 ? `WHERE ${conditions.join(' AND ')}` : '';
|
||||
|
||||
// Need JOIN for parent_name if sorting/filtering by it, or always include for display
|
||||
const sortColumn = sortColumnInfo?.dbCol;
|
||||
|
||||
// Always include the category and parent joins for status and parent_name
|
||||
const joinSql = `
|
||||
JOIN public.categories c ON cm.category_id = c.cat_id
|
||||
LEFT JOIN public.categories p ON cm.parent_id = p.cat_id
|
||||
`;
|
||||
|
||||
const baseSql = `
|
||||
FROM public.category_metrics cm
|
||||
${joinSql}
|
||||
${whereClause}
|
||||
`;
|
||||
|
||||
const countSql = `SELECT COUNT(*) AS total ${baseSql}`;
|
||||
const dataSql = `
|
||||
SELECT
|
||||
cm.*,
|
||||
c.status,
|
||||
c.description,
|
||||
p.name as parent_name,
|
||||
p.type as parent_type
|
||||
${baseSql}
|
||||
${sortClause}
|
||||
LIMIT $${paramCounter} OFFSET $${paramCounter + 1}
|
||||
`;
|
||||
const dataParams = [...params, limit, offset];
|
||||
|
||||
console.log("Count SQL:", countSql, params);
|
||||
console.log("Data SQL:", dataSql, dataParams);
|
||||
|
||||
const [countResult, dataResult] = await Promise.all([
|
||||
pool.query(countSql, params),
|
||||
pool.query(dataSql, dataParams)
|
||||
]);
|
||||
|
||||
const total = parseInt(countResult.rows[0].total, 10);
|
||||
const categories = dataResult.rows.map(row => {
|
||||
// Create a new object with both snake_case and camelCase keys
|
||||
const transformedRow = { ...row }; // Start with original data
|
||||
|
||||
for (const key in row) {
|
||||
// Skip null/undefined values
|
||||
if (row[key] === null || row[key] === undefined) {
|
||||
continue; // Original already has the null value
|
||||
}
|
||||
|
||||
// Transform keys to match frontend expectations (add camelCase versions)
|
||||
// First handle cases like sales_7d -> sales7d
|
||||
let camelKey = key.replace(/_(\d+[a-z])/g, '$1');
|
||||
|
||||
// Then handle regular snake_case -> camelCase
|
||||
camelKey = camelKey.replace(/_([a-z])/g, (_, letter) => letter.toUpperCase());
|
||||
if (camelKey !== key) { // Only add if different from original
|
||||
transformedRow[camelKey] = row[key];
|
||||
}
|
||||
}
|
||||
return transformedRow;
|
||||
});
|
||||
|
||||
// --- Respond ---
|
||||
res.json({
|
||||
categories,
|
||||
pagination: { total, pages: Math.ceil(total / limit), currentPage: page, limit },
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error fetching category metrics list:', error);
|
||||
res.status(500).json({ error: 'Failed to fetch category metrics.' });
|
||||
}
|
||||
});
|
||||
|
||||
module.exports = router;
|
||||
@@ -7,164 +7,317 @@ router.use((req, res, next) => {
   next();
 });
 
-// Get all configuration values
-router.get('/', async (req, res) => {
+// ===== GLOBAL SETTINGS =====
+
+// Get all global settings
+router.get('/global', async (req, res) => {
   const pool = req.app.locals.pool;
   try {
-    console.log('[Config Route] Fetching configuration values...');
+    console.log('[Config Route] Fetching global settings...');
 
-    const { rows: stockThresholds } = await pool.query('SELECT * FROM stock_thresholds WHERE id = 1');
-    console.log('[Config Route] Stock thresholds:', stockThresholds);
+    const { rows } = await pool.query('SELECT * FROM settings_global ORDER BY setting_key');
 
-    const { rows: leadTimeThresholds } = await pool.query('SELECT * FROM lead_time_thresholds WHERE id = 1');
-    console.log('[Config Route] Lead time thresholds:', leadTimeThresholds);
-
-    const { rows: salesVelocityConfig } = await pool.query('SELECT * FROM sales_velocity_config WHERE id = 1');
-    console.log('[Config Route] Sales velocity config:', salesVelocityConfig);
-
-    const { rows: abcConfig } = await pool.query('SELECT * FROM abc_classification_config WHERE id = 1');
-    console.log('[Config Route] ABC config:', abcConfig);
-
-    const { rows: safetyStockConfig } = await pool.query('SELECT * FROM safety_stock_config WHERE id = 1');
-    console.log('[Config Route] Safety stock config:', safetyStockConfig);
-
-    const { rows: turnoverConfig } = await pool.query('SELECT * FROM turnover_config WHERE id = 1');
-    console.log('[Config Route] Turnover config:', turnoverConfig);
+    console.log('[Config Route] Sending global settings:', rows);
+    res.json(rows);
+  } catch (error) {
+    console.error('[Config Route] Error fetching global settings:', error);
+    res.status(500).json({ error: 'Failed to fetch global settings', details: error.message });
+  }
+});
+
+// Update global settings
+router.put('/global', async (req, res) => {
+  const pool = req.app.locals.pool;
+  try {
+    console.log('[Config Route] Updating global settings:', req.body);
+
+    // Validate request
+    if (!Array.isArray(req.body)) {
+      return res.status(400).json({ error: 'Request body must be an array of settings' });
+    }
+
+    // Begin transaction
+    const client = await pool.connect();
+    try {
+      await client.query('BEGIN');
+
+      for (const setting of req.body) {
+        if (!setting.setting_key || !setting.setting_value) {
+          throw new Error('Each setting must have a key and value');
+        }
+
+        await client.query(
+          `UPDATE settings_global
+           SET setting_value = $1,
+               updated_at = CURRENT_TIMESTAMP
+           WHERE setting_key = $2`,
+          [setting.setting_value, setting.setting_key]
+        );
+      }
+
+      await client.query('COMMIT');
+      res.json({ success: true });
+    } catch (error) {
+      await client.query('ROLLBACK');
+      throw error;
+    } finally {
+      client.release();
+    }
+  } catch (error) {
+    console.error('[Config Route] Error updating global settings:', error);
+    res.status(500).json({ error: 'Failed to update global settings', details: error.message });
+  }
+});
+
+// ===== PRODUCT SETTINGS =====
+
+// Get product settings with pagination and search
+router.get('/products', async (req, res) => {
+  const pool = req.app.locals.pool;
+  try {
+    console.log('[Config Route] Fetching product settings...');
+
+    const page = parseInt(req.query.page) || 1;
+    const pageSize = parseInt(req.query.pageSize) || 10;
+    const offset = (page - 1) * pageSize;
+    const search = req.query.search || '';
+
+    // Get total count for pagination
+    const countQuery = search
+      ? `SELECT COUNT(*) FROM settings_product sp
+         JOIN products p ON sp.pid::text = p.pid::text
+         WHERE sp.pid::text ILIKE $1 OR p.title ILIKE $1`
+      : 'SELECT COUNT(*) FROM settings_product';
+
+    const countParams = search ? [`%${search}%`] : [];
+    const { rows: countResult } = await pool.query(countQuery, countParams);
+    const total = parseInt(countResult[0].count);
+
+    // Get paginated settings
+    const query = search
+      ? `SELECT sp.*, p.title as product_name
+         FROM settings_product sp
+         JOIN products p ON sp.pid::text = p.pid::text
+         WHERE sp.pid::text ILIKE $1 OR p.title ILIKE $1
+         ORDER BY sp.pid
+         LIMIT $2 OFFSET $3`
+      : `SELECT sp.*, p.title as product_name
+         FROM settings_product sp
+         JOIN products p ON sp.pid::text = p.pid::text
+         ORDER BY sp.pid
+         LIMIT $1 OFFSET $2`;
+
+    const queryParams = search
+      ? [`%${search}%`, pageSize, offset]
+      : [pageSize, offset];
+
+    const { rows } = await pool.query(query, queryParams);
+
     const response = {
-      stockThresholds: stockThresholds[0],
-      leadTimeThresholds: leadTimeThresholds[0],
-      salesVelocityConfig: salesVelocityConfig[0],
-      abcConfig: abcConfig[0],
-      safetyStockConfig: safetyStockConfig[0],
-      turnoverConfig: turnoverConfig[0]
+      items: rows,
+      total,
+      page,
+      pageSize
     };
 
-    console.log('[Config Route] Sending response:', response);
+    console.log(`[Config Route] Sending ${rows.length} product settings`);
     res.json(response);
   } catch (error) {
-    console.error('[Config Route] Error fetching configuration:', error);
-    res.status(500).json({ error: 'Failed to fetch configuration', details: error.message });
+    console.error('[Config Route] Error fetching product settings:', error);
+    res.status(500).json({ error: 'Failed to fetch product settings', details: error.message });
   }
 });
 
-// Update stock thresholds
-router.put('/stock-thresholds/:id', async (req, res) => {
+// Update product settings
+router.put('/products/:pid', async (req, res) => {
   const pool = req.app.locals.pool;
   try {
-    const { critical_days, reorder_days, overstock_days, low_stock_threshold, min_reorder_quantity } = req.body;
-    const { rows } = await pool.query(
-      `UPDATE stock_thresholds
-       SET critical_days = $1,
-           reorder_days = $2,
-           overstock_days = $3,
-           low_stock_threshold = $4,
-           min_reorder_quantity = $5
-       WHERE id = $6`,
-      [critical_days, reorder_days, overstock_days, low_stock_threshold, min_reorder_quantity, req.params.id]
+    const { pid } = req.params;
+    const { lead_time_days, days_of_stock, safety_stock, forecast_method, exclude_from_forecast } = req.body;
+
+    console.log(`[Config Route] Updating product settings for ${pid}:`, req.body);
+
+    // Check if product exists
+    const { rows: checkProduct } = await pool.query(
+      'SELECT 1 FROM settings_product WHERE pid::text = $1',
+      [pid]
+    );
+
+    if (checkProduct.length === 0) {
+      // Insert if it doesn't exist
+      await pool.query(
+        `INSERT INTO settings_product
+         (pid, lead_time_days, days_of_stock, safety_stock, forecast_method, exclude_from_forecast)
+         VALUES ($1, $2, $3, $4, $5, $6)`,
+        [pid, lead_time_days, days_of_stock, safety_stock, forecast_method, exclude_from_forecast]
+      );
+    } else {
+      // Update if it exists
+      await pool.query(
+        `UPDATE settings_product
+         SET lead_time_days = $2,
+             days_of_stock = $3,
+             safety_stock = $4,
+             forecast_method = $5,
+             exclude_from_forecast = $6,
+             updated_at = CURRENT_TIMESTAMP
+         WHERE pid::text = $1`,
+        [pid, lead_time_days, days_of_stock, safety_stock, forecast_method, exclude_from_forecast]
+      );
+    }
 
     res.json({ success: true });
   } catch (error) {
-    console.error('[Config Route] Error updating stock thresholds:', error);
-    res.status(500).json({ error: 'Failed to update stock thresholds' });
+    console.error(`[Config Route] Error updating product settings for ${req.params.pid}:`, error);
+    res.status(500).json({ error: 'Failed to update product settings', details: error.message });
   }
 });
 
-// Update lead time thresholds
-router.put('/lead-time-thresholds/:id', async (req, res) => {
+// Reset product settings to defaults
+router.post('/products/:pid/reset', async (req, res) => {
   const pool = req.app.locals.pool;
   try {
-    const { target_days, warning_days, critical_days } = req.body;
-    const { rows } = await pool.query(
-      `UPDATE lead_time_thresholds
-       SET target_days = $1,
-           warning_days = $2,
-           critical_days = $3
-       WHERE id = $4`,
-      [target_days, warning_days, critical_days, req.params.id]
+    const { pid } = req.params;
+
+    console.log(`[Config Route] Resetting product settings for ${pid}`);
+
+    // Reset by setting everything to null/default
+    await pool.query(
+      `UPDATE settings_product
+       SET lead_time_days = NULL,
+           days_of_stock = NULL,
+           safety_stock = 0,
+           forecast_method = NULL,
+           exclude_from_forecast = false,
+           updated_at = CURRENT_TIMESTAMP
+       WHERE pid::text = $1`,
+      [pid]
     );
 
     res.json({ success: true });
   } catch (error) {
-    console.error('[Config Route] Error updating lead time thresholds:', error);
-    res.status(500).json({ error: 'Failed to update lead time thresholds' });
+    console.error(`[Config Route] Error resetting product settings for ${req.params.pid}:`, error);
+    res.status(500).json({ error: 'Failed to reset product settings', details: error.message });
   }
 });
 
-// Update sales velocity config
-router.put('/sales-velocity/:id', async (req, res) => {
+// ===== VENDOR SETTINGS =====
+
+// Get vendor settings with pagination and search
+router.get('/vendors', async (req, res) => {
   const pool = req.app.locals.pool;
   try {
-    const { daily_window_days, weekly_window_days, monthly_window_days } = req.body;
-    const { rows } = await pool.query(
-      `UPDATE sales_velocity_config
-       SET daily_window_days = $1,
-           weekly_window_days = $2,
-           monthly_window_days = $3
-       WHERE id = $4`,
-      [daily_window_days, weekly_window_days, monthly_window_days, req.params.id]
-    );
-    res.json({ success: true });
+    console.log('[Config Route] Fetching vendor settings...');
+
+    const page = parseInt(req.query.page) || 1;
+    const pageSize = parseInt(req.query.pageSize) || 10;
+    const offset = (page - 1) * pageSize;
+    const search = req.query.search || '';
+
+    // Get total count for pagination
+    const countQuery = search
+      ? 'SELECT COUNT(*) FROM settings_vendor WHERE vendor ILIKE $1'
+      : 'SELECT COUNT(*) FROM settings_vendor';
+
+    const countParams = search ? [`%${search}%`] : [];
+    const { rows: countResult } = await pool.query(countQuery, countParams);
+    const total = parseInt(countResult[0].count);
+
+    // Get paginated settings
+    const query = search
+      ? `SELECT * FROM settings_vendor
+         WHERE vendor ILIKE $1
+         ORDER BY vendor
+         LIMIT $2 OFFSET $3`
+      : `SELECT * FROM settings_vendor
+         ORDER BY vendor
+         LIMIT $1 OFFSET $2`;
+
+    const queryParams = search
+      ? [`%${search}%`, pageSize, offset]
+      : [pageSize, offset];
+
+    const { rows } = await pool.query(query, queryParams);
+
+    const response = {
+      items: rows,
+      total,
+      page,
+      pageSize
+    };
+
+    console.log(`[Config Route] Sending ${rows.length} vendor settings`);
+    res.json(response);
   } catch (error) {
-    console.error('[Config Route] Error updating sales velocity config:', error);
-    res.status(500).json({ error: 'Failed to update sales velocity config' });
+    console.error('[Config Route] Error fetching vendor settings:', error);
+    res.status(500).json({ error: 'Failed to fetch vendor settings', details: error.message });
   }
 });
 
-// Update ABC classification config
-router.put('/abc-classification/:id', async (req, res) => {
+// Update vendor settings
+router.put('/vendors/:vendor', async (req, res) => {
   const pool = req.app.locals.pool;
   try {
-    const { a_threshold, b_threshold, classification_period_days } = req.body;
-    const { rows } = await pool.query(
-      `UPDATE abc_classification_config
-       SET a_threshold = $1,
-           b_threshold = $2,
-           classification_period_days = $3
-       WHERE id = $4`,
-      [a_threshold, b_threshold, classification_period_days, req.params.id]
+    const vendor = req.params.vendor;
+    const { default_lead_time_days, default_days_of_stock } = req.body;
+
+    console.log(`[Config Route] Updating vendor settings for ${vendor}:`, req.body);
+
+    // Check if vendor exists
+    const { rows: checkVendor } = await pool.query(
+      'SELECT 1 FROM settings_vendor WHERE vendor = $1',
+      [vendor]
+    );
+
+    if (checkVendor.length === 0) {
+      // Insert if it doesn't exist
+      await pool.query(
+        `INSERT INTO settings_vendor
+         (vendor, default_lead_time_days, default_days_of_stock)
+         VALUES ($1, $2, $3)`,
+        [vendor, default_lead_time_days, default_days_of_stock]
+      );
+    } else {
+      // Update if it exists
+      await pool.query(
+        `UPDATE settings_vendor
+         SET default_lead_time_days = $2,
+             default_days_of_stock = $3,
+             updated_at = CURRENT_TIMESTAMP
+         WHERE vendor = $1`,
+        [vendor, default_lead_time_days, default_days_of_stock]
+      );
+    }
 
     res.json({ success: true });
   } catch (error) {
-    console.error('[Config Route] Error updating ABC classification config:', error);
-    res.status(500).json({ error: 'Failed to update ABC classification config' });
+    console.error(`[Config Route] Error updating vendor settings for ${req.params.vendor}:`, error);
+    res.status(500).json({ error: 'Failed to update vendor settings', details: error.message });
   }
 });
 
-// Update safety stock config
-router.put('/safety-stock/:id', async (req, res) => {
+// Reset vendor settings to defaults
+router.post('/vendors/:vendor/reset', async (req, res) => {
   const pool = req.app.locals.pool;
   try {
-    const { coverage_days, service_level } = req.body;
-    const { rows } = await pool.query(
-      `UPDATE safety_stock_config
-       SET coverage_days = $1,
-           service_level = $2
-       WHERE id = $3`,
-      [coverage_days, service_level, req.params.id]
+    const vendor = req.params.vendor;
+
+    console.log(`[Config Route] Resetting vendor settings for ${vendor}`);
+
+    // Reset by setting everything to null
+    await pool.query(
+      `UPDATE settings_vendor
+       SET default_lead_time_days = NULL,
+           default_days_of_stock = NULL,
+           updated_at = CURRENT_TIMESTAMP
+       WHERE vendor = $1`,
+      [vendor]
    );
 
     res.json({ success: true });
   } catch (error) {
-    console.error('[Config Route] Error updating safety stock config:', error);
-    res.status(500).json({ error: 'Failed to update safety stock config' });
-  }
-});
-
-// Update turnover config
-router.put('/turnover/:id', async (req, res) => {
-  const pool = req.app.locals.pool;
-  try {
-    const { calculation_period_days, target_rate } = req.body;
-    const { rows } = await pool.query(
-      `UPDATE turnover_config
-       SET calculation_period_days = $1,
-           target_rate = $2
-       WHERE id = $3`,
-      [calculation_period_days, target_rate, req.params.id]
-    );
-    res.json({ success: true });
-  } catch (error) {
-    console.error('[Config Route] Error updating turnover config:', error);
-    res.status(500).json({ error: 'Failed to update turnover config' });
+    console.error(`[Config Route] Error resetting vendor settings for ${req.params.vendor}:`, error);
+    res.status(500).json({ error: 'Failed to reset vendor settings', details: error.message });
   }
 });

@@ -1,881 +0,0 @@
|
||||
const express = require('express');
|
||||
const router = express.Router();
|
||||
const { spawn } = require('child_process');
|
||||
const path = require('path');
|
||||
const db = require('../utils/db');
|
||||
|
||||
// Debug middleware MUST be first
|
||||
router.use((req, res, next) => {
|
||||
console.log(`[CSV Route Debug] ${req.method} ${req.path}`);
|
||||
next();
|
||||
});
|
||||
|
||||
// Store active processes and their progress
|
||||
let activeImport = null;
|
||||
let importProgress = null;
|
||||
let activeFullUpdate = null;
|
||||
let activeFullReset = null;
|
||||
|
||||
// SSE clients for progress updates
|
||||
const updateClients = new Set();
|
||||
const importClients = new Set();
|
||||
const resetClients = new Set();
|
||||
const resetMetricsClients = new Set();
|
||||
const calculateMetricsClients = new Set();
|
||||
const fullUpdateClients = new Set();
|
||||
const fullResetClients = new Set();
|
||||
|
||||
// Helper to send progress to specific clients
|
||||
function sendProgressToClients(clients, data) {
|
||||
// If data is a string, send it directly
|
||||
// If it's an object, convert it to JSON
|
||||
const message = typeof data === 'string'
|
||||
? `data: ${data}\n\n`
|
||||
: `data: ${JSON.stringify(data)}\n\n`;
|
||||
|
||||
clients.forEach(client => {
|
||||
try {
|
||||
client.write(message);
|
||||
// Immediately flush the response
|
||||
if (typeof client.flush === 'function') {
|
||||
client.flush();
|
||||
}
|
||||
} catch (error) {
|
||||
// Silently remove failed client
|
||||
clients.delete(client);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Helper to run a script and stream progress
|
||||
function runScript(scriptPath, type, clients) {
|
||||
return new Promise((resolve, reject) => {
|
||||
// Kill any existing process of this type
|
||||
let activeProcess;
|
||||
switch (type) {
|
||||
case 'update':
|
||||
if (activeFullUpdate) {
|
||||
try { activeFullUpdate.kill(); } catch (e) { }
|
||||
}
|
||||
activeProcess = activeFullUpdate;
|
||||
break;
|
||||
case 'reset':
|
||||
if (activeFullReset) {
|
||||
try { activeFullReset.kill(); } catch (e) { }
|
||||
}
|
||||
activeProcess = activeFullReset;
|
||||
break;
|
||||
}
|
||||
|
||||
const child = spawn('node', [scriptPath], {
|
||||
stdio: ['inherit', 'pipe', 'pipe']
|
||||
});
|
||||
|
||||
switch (type) {
|
||||
case 'update':
|
||||
activeFullUpdate = child;
|
||||
break;
|
||||
case 'reset':
|
||||
activeFullReset = child;
|
||||
break;
|
||||
}
|
||||
|
||||
let output = '';
|
||||
|
||||
child.stdout.on('data', (data) => {
|
||||
const text = data.toString();
|
||||
output += text;
|
||||
|
||||
// Split by lines to handle multiple JSON outputs
|
||||
const lines = text.split('\n');
|
||||
lines.filter(line => line.trim()).forEach(line => {
|
||||
try {
|
||||
// Try to parse as JSON but don't let it affect the display
|
||||
const jsonData = JSON.parse(line);
|
||||
// Only end the process if we get a final status
|
||||
if (jsonData.status === 'complete' || jsonData.status === 'error' || jsonData.status === 'cancelled') {
|
||||
if (jsonData.status === 'complete' && !jsonData.operation?.includes('complete')) {
|
||||
// Don't close for intermediate completion messages
|
||||
sendProgressToClients(clients, line);
|
||||
return;
|
||||
}
|
||||
// Close only on final completion/error/cancellation
|
||||
switch (type) {
|
||||
case 'update':
|
||||
activeFullUpdate = null;
|
||||
break;
|
||||
case 'reset':
|
||||
activeFullReset = null;
|
||||
break;
|
||||
}
|
||||
if (jsonData.status === 'error') {
|
||||
reject(new Error(jsonData.error || 'Unknown error'));
|
||||
} else {
|
||||
resolve({ output });
|
||||
}
|
||||
}
|
||||
} catch (e) {
|
||||
// Not JSON, just display as is
|
||||
}
|
||||
// Always send the raw line
|
||||
sendProgressToClients(clients, line);
|
||||
});
|
||||
});
|
||||
|
||||
child.stderr.on('data', (data) => {
|
||||
const text = data.toString();
|
||||
console.error(text);
|
||||
// Send stderr output directly too
|
||||
sendProgressToClients(clients, text);
|
||||
});
|
||||
|
||||
child.on('close', (code) => {
|
||||
switch (type) {
|
||||
case 'update':
|
||||
activeFullUpdate = null;
|
||||
break;
|
||||
case 'reset':
|
||||
activeFullReset = null;
|
||||
break;
|
||||
}
|
||||
|
||||
if (code !== 0) {
|
||||
const error = `Script ${scriptPath} exited with code ${code}`;
|
||||
sendProgressToClients(clients, error);
|
||||
reject(new Error(error));
|
||||
}
|
||||
// Don't resolve here - let the completion message from the script trigger the resolve
|
||||
});
|
||||
|
||||
child.on('error', (err) => {
|
||||
switch (type) {
|
||||
case 'update':
|
||||
activeFullUpdate = null;
|
||||
break;
|
||||
case 'reset':
|
||||
activeFullReset = null;
|
||||
break;
|
||||
}
|
||||
sendProgressToClients(clients, err.message);
|
||||
reject(err);
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
// Progress endpoints
|
||||
router.get('/:type/progress', (req, res) => {
|
||||
const { type } = req.params;
|
||||
if (!['update', 'reset'].includes(type)) {
|
||||
return res.status(400).json({ error: 'Invalid operation type' });
|
||||
}
|
||||
|
||||
res.writeHead(200, {
|
||||
'Content-Type': 'text/event-stream',
|
||||
'Cache-Control': 'no-cache',
|
||||
'Connection': 'keep-alive',
|
||||
'Access-Control-Allow-Origin': req.headers.origin || '*',
|
||||
'Access-Control-Allow-Credentials': 'true'
|
||||
});
|
||||
|
||||
// Add this client to the correct set
|
||||
const clients = type === 'update' ? fullUpdateClients : fullResetClients;
|
||||
clients.add(res);
|
||||
|
||||
// Send initial connection message
|
||||
sendProgressToClients(new Set([res]), JSON.stringify({
|
||||
status: 'running',
|
||||
operation: 'Initializing connection...'
|
||||
}));
|
||||
|
||||
// Handle client disconnect
|
||||
req.on('close', () => {
|
||||
clients.delete(res);
|
||||
});
|
||||
});
|
||||
|
||||
// Debug endpoint to verify route registration
|
||||
router.get('/test', (req, res) => {
|
||||
console.log('CSV test endpoint hit');
|
||||
res.json({ message: 'CSV routes are working' });
|
||||
});
|
||||
|
||||
// Route to check import status
|
||||
router.get('/status', (req, res) => {
|
||||
console.log('CSV status endpoint hit');
|
||||
res.json({
|
||||
active: !!activeImport,
|
||||
progress: importProgress
|
||||
});
|
||||
});
|
||||
|
||||
// Add calculate-metrics status endpoint
|
||||
router.get('/calculate-metrics/status', (req, res) => {
|
||||
const calculateMetrics = require('../../scripts/calculate-metrics');
|
||||
const progress = calculateMetrics.getProgress();
|
||||
|
||||
// Only consider it active if both the process is running and we have progress
|
||||
const isActive = !!activeImport && !!progress;
|
||||
|
||||
res.json({
|
||||
active: isActive,
|
||||
progress: isActive ? progress : null
|
||||
});
|
||||
});
|
||||
|
||||
// Route to update CSV files
|
||||
router.post('/update', async (req, res, next) => {
|
||||
if (activeImport) {
|
||||
return res.status(409).json({ error: 'Import already in progress' });
|
||||
}
|
||||
|
||||
try {
|
||||
const scriptPath = path.join(__dirname, '..', '..', 'scripts', 'update-csv.js');
|
||||
|
||||
if (!require('fs').existsSync(scriptPath)) {
|
||||
return res.status(500).json({ error: 'Update script not found' });
|
||||
}
|
||||
|
||||
activeImport = spawn('node', [scriptPath]);
|
||||
|
||||
activeImport.stdout.on('data', (data) => {
|
||||
const output = data.toString().trim();
|
||||
|
||||
try {
|
||||
// Try to parse as JSON
|
||||
const jsonData = JSON.parse(output);
|
||||
sendProgressToClients(updateClients, {
|
||||
status: 'running',
|
||||
...jsonData
|
||||
});
|
||||
} catch (e) {
|
||||
// If not JSON, send as plain progress
|
||||
sendProgressToClients(updateClients, {
|
||||
status: 'running',
|
||||
progress: output
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
activeImport.stderr.on('data', (data) => {
|
||||
const error = data.toString().trim();
|
||||
try {
|
||||
// Try to parse as JSON
|
||||
const jsonData = JSON.parse(error);
|
||||
sendProgressToClients(updateClients, {
|
||||
status: 'error',
|
||||
...jsonData
|
||||
});
|
||||
} catch {
|
||||
sendProgressToClients(updateClients, {
|
||||
status: 'error',
|
||||
error
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
await new Promise((resolve, reject) => {
|
||||
activeImport.on('close', (code) => {
|
||||
// Don't treat cancellation (code 143/SIGTERM) as an error
|
||||
if (code === 0 || code === 143) {
|
||||
sendProgressToClients(updateClients, {
|
||||
status: 'complete',
|
||||
operation: code === 143 ? 'Operation cancelled' : 'Update complete'
|
||||
});
|
||||
resolve();
|
||||
} else {
|
||||
const errorMsg = `Update process exited with code ${code}`;
|
||||
sendProgressToClients(updateClients, {
|
||||
status: 'error',
|
||||
error: errorMsg
|
||||
});
|
||||
reject(new Error(errorMsg));
|
||||
}
|
||||
activeImport = null;
|
||||
importProgress = null;
|
||||
});
|
||||
});
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
console.error('Error updating CSV files:', error);
|
||||
activeImport = null;
|
||||
importProgress = null;
|
||||
sendProgressToClients(updateClients, {
|
||||
status: 'error',
|
||||
error: error.message
|
||||
});
|
||||
next(error);
|
||||
}
|
||||
});
|
||||
|
||||
// Route to import CSV files
|
||||
router.post('/import', async (req, res) => {
  if (activeImport) {
    return res.status(409).json({ error: 'Import already in progress' });
  }

  try {
    const scriptPath = path.join(__dirname, '..', '..', 'scripts', 'import-csv.js');

    if (!require('fs').existsSync(scriptPath)) {
      return res.status(500).json({ error: 'Import script not found' });
    }

    // Get test limits from request body
    const { products = 0, orders = 10000, purchaseOrders = 10000 } = req.body;

    // Create environment variables for the script
    const env = {
      ...process.env,
      PRODUCTS_TEST_LIMIT: products.toString(),
      ORDERS_TEST_LIMIT: orders.toString(),
      PURCHASE_ORDERS_TEST_LIMIT: purchaseOrders.toString()
    };

    activeImport = spawn('node', [scriptPath], { env });

    activeImport.stdout.on('data', (data) => {
      const output = data.toString().trim();

      try {
        // Try to parse as JSON
        const jsonData = JSON.parse(output);
        sendProgressToClients(importClients, {
          status: 'running',
          ...jsonData
        });
      } catch {
        // If not JSON, send as plain progress
        sendProgressToClients(importClients, {
          status: 'running',
          progress: output
        });
      }
    });

    activeImport.stderr.on('data', (data) => {
      const error = data.toString().trim();
      try {
        // Try to parse as JSON
        const jsonData = JSON.parse(error);
        sendProgressToClients(importClients, {
          status: 'error',
          ...jsonData
        });
      } catch {
        sendProgressToClients(importClients, {
          status: 'error',
          error
        });
      }
    });

    await new Promise((resolve, reject) => {
      activeImport.on('close', (code) => {
        // Don't treat cancellation (code 143/SIGTERM) as an error
        if (code === 0 || code === 143) {
          sendProgressToClients(importClients, {
            status: 'complete',
            operation: code === 143 ? 'Operation cancelled' : 'Import complete'
          });
          resolve();
        } else {
          sendProgressToClients(importClients, {
            status: 'error',
            error: `Process exited with code ${code}`
          });
          reject(new Error(`Import process exited with code ${code}`));
        }
        activeImport = null;
        importProgress = null;
      });
    });

    res.json({ success: true });
  } catch (error) {
    console.error('Error importing CSV files:', error);
    activeImport = null;
    importProgress = null;
    sendProgressToClients(importClients, {
      status: 'error',
      error: error.message
    });
    res.status(500).json({ error: 'Failed to import CSV files', details: error.message });
  }
});
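The stdout and stderr handlers above repeat the same "parse as JSON if possible, otherwise forward plain text" logic four times across the routes in this file. As a sketch only (the helper name `parseProgressLine` is an assumption, not part of the router), the branch can be factored into one pure function:

```javascript
// Sketch: shared progress-line parsing for the spawn() handlers above.
// `raw` is a Buffer or string chunk from stdout/stderr; `status` is the
// event status to attach ('running' or 'error').
function parseProgressLine(raw, status) {
  const output = raw.toString().trim();
  try {
    // Structured progress: merge the parsed fields into the event
    return { status, ...JSON.parse(output) };
  } catch {
    // Plain text: forward under a generic key, matching the handlers above
    return status === 'error'
      ? { status, error: output }
      : { status, progress: output };
  }
}
```

A handler would then reduce to `sendProgressToClients(importClients, parseProgressLine(data, 'running'))`.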

// Route to cancel active process
router.post('/cancel', (req, res) => {
  let killed = false;

  // Get the operation type from the request
  const { type } = req.query;
  const clients = type === 'update' ? fullUpdateClients : fullResetClients;
  const activeProcess = type === 'update' ? activeFullUpdate : activeFullReset;

  if (activeProcess) {
    try {
      activeProcess.kill('SIGTERM');
      if (type === 'update') {
        activeFullUpdate = null;
      } else {
        activeFullReset = null;
      }
      killed = true;
      // Send a plain object, matching every other sendProgressToClients call
      sendProgressToClients(clients, {
        status: 'cancelled',
        operation: 'Operation cancelled'
      });
    } catch (err) {
      console.error(`Error killing ${type} process:`, err);
    }
  }

  if (killed) {
    res.json({ success: true });
  } else {
    res.status(404).json({ error: 'No active process to cancel' });
  }
});
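The cancel route selects the client list and the process handle with a pair of ternaries keyed on the same `type` value. As a sketch (the names `operations` and `resolveOperation` are illustrative, not part of the router), a lookup table gives one place to register each cancellable operation and scales past two types:

```javascript
// Sketch: table-driven dispatch for cancellable operations. Real entries
// would reference fullUpdateClients/activeFullUpdate etc.; empty stand-ins
// are used here so the sketch is self-contained.
const operations = {
  update: { clients: [], process: null },
  reset: { clients: [], process: null }
};

function resolveOperation(type) {
  // Unknown types fall back to 'reset', mirroring the route's ternary default
  return operations[type] ?? operations.reset;
}
```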

// Route to reset database
router.post('/reset', async (req, res) => {
  if (activeImport) {
    return res.status(409).json({ error: 'Import already in progress' });
  }

  try {
    const scriptPath = path.join(__dirname, '..', '..', 'scripts', 'reset-db.js');

    if (!require('fs').existsSync(scriptPath)) {
      return res.status(500).json({ error: 'Reset script not found' });
    }

    activeImport = spawn('node', [scriptPath]);

    activeImport.stdout.on('data', (data) => {
      const output = data.toString().trim();

      try {
        // Try to parse as JSON
        const jsonData = JSON.parse(output);
        sendProgressToClients(resetClients, {
          status: 'running',
          ...jsonData
        });
      } catch (e) {
        // If not JSON, send as plain progress
        sendProgressToClients(resetClients, {
          status: 'running',
          progress: output
        });
      }
    });

    activeImport.stderr.on('data', (data) => {
      const error = data.toString().trim();
      try {
        // Try to parse as JSON
        const jsonData = JSON.parse(error);
        sendProgressToClients(resetClients, {
          status: 'error',
          ...jsonData
        });
      } catch {
        sendProgressToClients(resetClients, {
          status: 'error',
          error
        });
      }
    });

    await new Promise((resolve, reject) => {
      activeImport.on('close', (code) => {
        // Don't treat cancellation (code 143/SIGTERM) as an error
        if (code === 0 || code === 143) {
          sendProgressToClients(resetClients, {
            status: 'complete',
            operation: code === 143 ? 'Operation cancelled' : 'Reset complete'
          });
          resolve();
        } else {
          const errorMsg = `Reset process exited with code ${code}`;
          sendProgressToClients(resetClients, {
            status: 'error',
            error: errorMsg
          });
          reject(new Error(errorMsg));
        }
        activeImport = null;
        importProgress = null;
      });
    });

    res.json({ success: true });
  } catch (error) {
    console.error('Error resetting database:', error);
    activeImport = null;
    importProgress = null;
    sendProgressToClients(resetClients, {
      status: 'error',
      error: error.message
    });
    res.status(500).json({ error: 'Failed to reset database', details: error.message });
  }
});

// Add reset-metrics endpoint
router.post('/reset-metrics', async (req, res) => {
  if (activeImport) {
    res.status(400).json({ error: 'Operation already in progress' });
    return;
  }

  try {
    // Set active import to prevent concurrent operations
    activeImport = {
      type: 'reset-metrics',
      status: 'running',
      operation: 'Starting metrics reset'
    };

    // Send initial response
    res.status(200).json({ message: 'Reset metrics started' });

    // Send initial progress through SSE
    sendProgressToClients(resetMetricsClients, {
      status: 'running',
      operation: 'Starting metrics reset'
    });

    // Run the reset metrics script
    const resetMetrics = require('../../scripts/reset-metrics');
    await resetMetrics();

    // Send completion through SSE
    sendProgressToClients(resetMetricsClients, {
      status: 'complete',
      operation: 'Metrics reset completed'
    });

    activeImport = null;
  } catch (error) {
    console.error('Error during metrics reset:', error);

    // Send error through SSE
    sendProgressToClients(resetMetricsClients, {
      status: 'error',
      error: error.message || 'Failed to reset metrics'
    });

    activeImport = null;
    // The 200 response is usually sent before the script runs, so only
    // reply here if no response has gone out yet
    if (!res.headersSent) {
      res.status(500).json({ error: error.message || 'Failed to reset metrics' });
    }
  }
});

// Add calculate-metrics endpoint
router.post('/calculate-metrics', async (req, res) => {
  if (activeImport) {
    return res.status(409).json({ error: 'Import already in progress' });
  }

  try {
    const scriptPath = path.join(__dirname, '..', '..', 'scripts', 'calculate-metrics.js');

    if (!require('fs').existsSync(scriptPath)) {
      return res.status(500).json({ error: 'Calculate metrics script not found' });
    }

    activeImport = spawn('node', [scriptPath]);
    let wasCancelled = false;

    activeImport.stdout.on('data', (data) => {
      const output = data.toString().trim();

      try {
        // Try to parse as JSON
        const jsonData = JSON.parse(output);
        importProgress = {
          status: 'running',
          ...jsonData.progress
        };
        sendProgressToClients(calculateMetricsClients, importProgress);
      } catch (e) {
        // If not JSON, send as plain progress
        importProgress = {
          status: 'running',
          progress: output
        };
        sendProgressToClients(calculateMetricsClients, importProgress);
      }
    });

    activeImport.stderr.on('data', (data) => {
      if (wasCancelled) return; // Don't send errors if cancelled

      const error = data.toString().trim();
      try {
        // Try to parse as JSON
        const jsonData = JSON.parse(error);
        importProgress = {
          status: 'error',
          ...jsonData.progress
        };
        sendProgressToClients(calculateMetricsClients, importProgress);
      } catch {
        importProgress = {
          status: 'error',
          error
        };
        sendProgressToClients(calculateMetricsClients, importProgress);
      }
    });

    await new Promise((resolve, reject) => {
      activeImport.on('close', (code, signal) => {
        wasCancelled = signal === 'SIGTERM' || code === 143;
        activeImport = null;

        if (code === 0 || wasCancelled) {
          if (wasCancelled) {
            importProgress = {
              status: 'cancelled',
              operation: 'Operation cancelled'
            };
            sendProgressToClients(calculateMetricsClients, importProgress);
          } else {
            importProgress = {
              status: 'complete',
              operation: 'Metrics calculation complete'
            };
            sendProgressToClients(calculateMetricsClients, importProgress);
          }
          resolve();
        } else {
          importProgress = null;
          reject(new Error(`Metrics calculation process exited with code ${code}`));
        }
      });
    });

    res.json({ success: true });
  } catch (error) {
    console.error('Error calculating metrics:', error);
    activeImport = null;
    importProgress = null;

    // Only send error if it wasn't a cancellation
    if (!error.message?.includes('code 143') && !error.message?.includes('SIGTERM')) {
      sendProgressToClients(calculateMetricsClients, {
        status: 'error',
        error: error.message
      });
      res.status(500).json({ error: 'Failed to calculate metrics', details: error.message });
    } else {
      res.json({ success: true });
    }
  }
});
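The `code === 143 || signal === 'SIGTERM'` cancellation check recurs in every `close` handler in this file. As a sketch (the helper name `processWasCancelled` is an assumption), it can live in one place; 143 is the conventional exit code for a SIGTERM-terminated process (128 + signal number 15):

```javascript
// Sketch: shared cancellation test for child-process close handlers.
// `code` is the numeric exit code (or null), `signal` the terminating
// signal name (or null), as passed to the 'close' event.
function processWasCancelled(code, signal) {
  return signal === 'SIGTERM' || code === 143;
}
```

Note that depending on the platform, Node may report a SIGTERM exit as `(null, 'SIGTERM')` rather than `(143, null)`, which is why the handlers above check both forms.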

// Route to import from production database
router.post('/import-from-prod', async (req, res) => {
  if (activeImport) {
    return res.status(409).json({ error: 'Import already in progress' });
  }

  try {
    const importFromProd = require('../../scripts/import-from-prod');

    // Set up progress handler
    const progressHandler = (data) => {
      importProgress = data;
      sendProgressToClients(importClients, data);
    };

    // Start the import process
    importFromProd.outputProgress = progressHandler;
    activeImport = importFromProd; // Store the module for cancellation

    // Run the import in the background
    importFromProd.main().catch(error => {
      console.error('Error in import process:', error);
      activeImport = null;
      importProgress = {
        status: error.message === 'Import cancelled' ? 'cancelled' : 'error',
        operation: 'Import process',
        error: error.message
      };
      sendProgressToClients(importClients, importProgress);
    }).finally(() => {
      activeImport = null;
    });

    res.json({ message: 'Import from production started' });
  } catch (error) {
    console.error('Error starting production import:', error);
    activeImport = null;
    res.status(500).json({ error: error.message || 'Failed to start production import' });
  }
});

// POST /csv/full-update - Run full update script
router.post('/full-update', async (req, res) => {
  try {
    const scriptPath = path.join(__dirname, '../../scripts/full-update.js');
    runScript(scriptPath, 'update', fullUpdateClients)
      .catch(error => {
        console.error('Update failed:', error);
      });
    res.status(202).json({ message: 'Update started' });
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

// POST /csv/full-reset - Run full reset script
router.post('/full-reset', async (req, res) => {
  try {
    const scriptPath = path.join(__dirname, '../../scripts/full-reset.js');
    runScript(scriptPath, 'reset', fullResetClients)
      .catch(error => {
        console.error('Reset failed:', error);
      });
    res.status(202).json({ message: 'Reset started' });
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

// GET /history/import - Get recent import history
router.get('/history/import', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const { rows } = await pool.query(`
      SELECT
        id,
        start_time,
        end_time,
        status,
        error_message,
        records_added::integer,
        records_updated::integer
      FROM import_history
      ORDER BY start_time DESC
      LIMIT 20
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching import history:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /history/calculate - Get recent calculation history
router.get('/history/calculate', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const { rows } = await pool.query(`
      SELECT
        id,
        start_time,
        end_time,
        duration_minutes,
        status,
        error_message,
        total_products,
        total_orders,
        total_purchase_orders,
        processed_products,
        processed_orders,
        processed_purchase_orders,
        additional_info
      FROM calculate_history
      ORDER BY start_time DESC
      LIMIT 20
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching calculate history:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /status/modules - Get module calculation status
router.get('/status/modules', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const { rows } = await pool.query(`
      SELECT
        module_name,
        last_calculation_timestamp::timestamp
      FROM calculate_status
      ORDER BY module_name
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching module status:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /status/tables - Get table sync status
router.get('/status/tables', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const { rows } = await pool.query(`
      SELECT
        table_name,
        last_sync_timestamp::timestamp
      FROM sync_status
      ORDER BY table_name
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching table status:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /status/table-counts - Get record counts for all tables
router.get('/status/table-counts', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const tables = [
      // Core tables
      'products', 'categories', 'product_categories', 'orders', 'purchase_orders',
      // New metrics tables
      'product_metrics', 'daily_product_snapshots',
      // Config tables
      'settings_global', 'settings_vendor', 'settings_product'
    ];

    const counts = await Promise.all(
      tables.map(table =>
        pool.query(`SELECT COUNT(*) as count FROM ${table}`)
          .then(result => ({
            table_name: table,
            count: parseInt(result.rows[0].count)
          }))
          .catch(err => ({
            table_name: table,
            count: null,
            error: err.message
          }))
      )
    );

    // Group tables by type
    const groupedCounts = {
      core: counts.filter(c => ['products', 'categories', 'product_categories', 'orders', 'purchase_orders'].includes(c.table_name)),
      metrics: counts.filter(c => ['product_metrics', 'daily_product_snapshots'].includes(c.table_name)),
      config: counts.filter(c => ['settings_global', 'settings_vendor', 'settings_product'].includes(c.table_name))
    };

    res.json(groupedCounts);
  } catch (error) {
    console.error('Error fetching table counts:', error);
    res.status(500).json({ error: error.message });
  }
});
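The table-counts route lists each table name twice: once in the query list and again in the hard-coded `filter` arrays. As a sketch (the names `tableGroups` and `groupCounts` are assumptions), one group definition can drive both the query list and the grouping:

```javascript
// Sketch: single source of truth for the table groups used by the
// /status/table-counts route above.
const tableGroups = {
  core: ['products', 'categories', 'product_categories', 'orders', 'purchase_orders'],
  metrics: ['product_metrics', 'daily_product_snapshots'],
  config: ['settings_global', 'settings_vendor', 'settings_product']
};

// `counts` is the array of { table_name, count } rows produced above;
// returns { core: [...], metrics: [...], config: [...] }.
function groupCounts(counts, groups) {
  const grouped = {};
  for (const [groupName, tables] of Object.entries(groups)) {
    grouped[groupName] = counts.filter(c => tables.includes(c.table_name));
  }
  return grouped;
}
```

The flat query list is then `Object.values(tableGroups).flat()`, and the response body is `groupCounts(counts, tableGroups)`.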

module.exports = router;
@@ -22,11 +22,11 @@ router.get('/stock/metrics', async (req, res) => {
     const { rows: [stockMetrics] } = await executeQuery(`
       SELECT
         COALESCE(COUNT(*), 0)::integer as total_products,
-        COALESCE(COUNT(CASE WHEN stock_quantity > 0 THEN 1 END), 0)::integer as products_in_stock,
-        COALESCE(SUM(CASE WHEN stock_quantity > 0 THEN stock_quantity END), 0)::integer as total_units,
-        ROUND(COALESCE(SUM(CASE WHEN stock_quantity > 0 THEN stock_quantity * cost_price END), 0)::numeric, 3) as total_cost,
-        ROUND(COALESCE(SUM(CASE WHEN stock_quantity > 0 THEN stock_quantity * price END), 0)::numeric, 3) as total_retail
-      FROM products
+        COALESCE(COUNT(CASE WHEN current_stock > 0 THEN 1 END), 0)::integer as products_in_stock,
+        COALESCE(SUM(CASE WHEN current_stock > 0 THEN current_stock END), 0)::integer as total_units,
+        ROUND(COALESCE(SUM(CASE WHEN current_stock > 0 THEN current_stock_cost END), 0)::numeric, 3) as total_cost,
+        ROUND(COALESCE(SUM(CASE WHEN current_stock > 0 THEN current_stock_retail END), 0)::numeric, 3) as total_retail
+      FROM product_metrics
     `);

     console.log('Raw stockMetrics from database:', stockMetrics);
@@ -42,13 +42,13 @@ router.get('/stock/metrics', async (req, res) => {
       SELECT
         COALESCE(brand, 'Unbranded') as brand,
         COUNT(DISTINCT pid)::integer as variant_count,
-        COALESCE(SUM(stock_quantity), 0)::integer as stock_units,
-        ROUND(COALESCE(SUM(stock_quantity * cost_price), 0)::numeric, 3) as stock_cost,
-        ROUND(COALESCE(SUM(stock_quantity * price), 0)::numeric, 3) as stock_retail
-      FROM products
-      WHERE stock_quantity > 0
+        COALESCE(SUM(current_stock), 0)::integer as stock_units,
+        ROUND(COALESCE(SUM(current_stock_cost), 0)::numeric, 3) as stock_cost,
+        ROUND(COALESCE(SUM(current_stock_retail), 0)::numeric, 3) as stock_retail
+      FROM product_metrics
+      WHERE current_stock > 0
       GROUP BY COALESCE(brand, 'Unbranded')
-      HAVING ROUND(COALESCE(SUM(stock_quantity * cost_price), 0)::numeric, 3) > 0
+      HAVING ROUND(COALESCE(SUM(current_stock_cost), 0)::numeric, 3) > 0
     ),
     other_brands AS (
       SELECT
@@ -130,11 +130,11 @@ router.get('/purchase/metrics', async (req, res) => {
         END), 0)::numeric, 3) as total_cost,
         ROUND(COALESCE(SUM(CASE
           WHEN po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')
-          THEN po.ordered * p.price
+          THEN po.ordered * pm.current_price
           ELSE 0
         END), 0)::numeric, 3) as total_retail
       FROM purchase_orders po
-      JOIN products p ON po.pid = p.pid
+      JOIN product_metrics pm ON po.pid = pm.pid
     `);

     const { rows: vendorOrders } = await executeQuery(`
@@ -143,9 +143,9 @@ router.get('/purchase/metrics', async (req, res) => {
       COUNT(DISTINCT po.po_id)::integer as orders,
       COALESCE(SUM(po.ordered), 0)::integer as units,
       ROUND(COALESCE(SUM(po.ordered * po.cost_price), 0)::numeric, 3) as cost,
-      ROUND(COALESCE(SUM(po.ordered * p.price), 0)::numeric, 3) as retail
+      ROUND(COALESCE(SUM(po.ordered * pm.current_price), 0)::numeric, 3) as retail
     FROM purchase_orders po
-    JOIN products p ON po.pid = p.pid
+    JOIN product_metrics pm ON po.pid = pm.pid
     WHERE po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')
     GROUP BY po.vendor
     HAVING ROUND(COALESCE(SUM(po.ordered * po.cost_price), 0)::numeric, 3) > 0
@@ -223,54 +223,35 @@ router.get('/replenishment/metrics', async (req, res) => {
     // Get summary metrics
     const { rows: [metrics] } = await executeQuery(`
       SELECT
-        COUNT(DISTINCT p.pid)::integer as products_to_replenish,
-        COALESCE(SUM(CASE
-          WHEN p.stock_quantity < 0 THEN ABS(p.stock_quantity) + pm.reorder_qty
-          ELSE pm.reorder_qty
-        END), 0)::integer as total_units_needed,
-        ROUND(COALESCE(SUM(CASE
-          WHEN p.stock_quantity < 0 THEN (ABS(p.stock_quantity) + pm.reorder_qty) * p.cost_price
-          ELSE pm.reorder_qty * p.cost_price
-        END), 0)::numeric, 3) as total_cost,
-        ROUND(COALESCE(SUM(CASE
-          WHEN p.stock_quantity < 0 THEN (ABS(p.stock_quantity) + pm.reorder_qty) * p.price
-          ELSE pm.reorder_qty * p.price
-        END), 0)::numeric, 3) as total_retail
-      FROM products p
-      JOIN product_metrics pm ON p.pid = pm.pid
-      WHERE p.replenishable = true
-        AND (pm.stock_status IN ('Critical', 'Reorder')
-          OR p.stock_quantity < 0)
-        AND pm.reorder_qty > 0
+        COUNT(DISTINCT pm.pid)::integer as products_to_replenish,
+        COALESCE(SUM(pm.replenishment_units), 0)::integer as total_units_needed,
+        ROUND(COALESCE(SUM(pm.replenishment_cost), 0)::numeric, 3) as total_cost,
+        ROUND(COALESCE(SUM(pm.replenishment_retail), 0)::numeric, 3) as total_retail
+      FROM product_metrics pm
+      WHERE pm.is_replenishable = true
+        AND (pm.status IN ('Critical', 'Reorder')
+          OR pm.current_stock < 0)
+        AND pm.replenishment_units > 0
     `);

     // Get top variants to replenish
     const { rows: variants } = await executeQuery(`
       SELECT
-        p.pid,
-        p.title,
-        p.stock_quantity::integer as current_stock,
-        CASE
-          WHEN p.stock_quantity < 0 THEN ABS(p.stock_quantity) + pm.reorder_qty
-          ELSE pm.reorder_qty
-        END::integer as replenish_qty,
-        ROUND(CASE
-          WHEN p.stock_quantity < 0 THEN (ABS(p.stock_quantity) + pm.reorder_qty) * p.cost_price
-          ELSE pm.reorder_qty * p.cost_price
-        END::numeric, 3) as replenish_cost,
-        ROUND(CASE
-          WHEN p.stock_quantity < 0 THEN (ABS(p.stock_quantity) + pm.reorder_qty) * p.price
-          ELSE pm.reorder_qty * p.price
-        END::numeric, 3) as replenish_retail,
-        pm.stock_status
-      FROM products p
-      JOIN product_metrics pm ON p.pid = pm.pid
-      WHERE p.replenishable = true
-        AND (pm.stock_status IN ('Critical', 'Reorder')
-          OR p.stock_quantity < 0)
-        AND pm.reorder_qty > 0
+        pm.pid,
+        pm.title,
+        pm.current_stock::integer as current_stock,
+        pm.replenishment_units::integer as replenish_qty,
+        ROUND(pm.replenishment_cost::numeric, 3) as replenish_cost,
+        ROUND(pm.replenishment_retail::numeric, 3) as replenish_retail,
+        pm.status,
+        pm.planning_period_days::text as planning_period
+      FROM product_metrics pm
+      WHERE pm.is_replenishable = true
+        AND (pm.status IN ('Critical', 'Reorder')
+          OR pm.current_stock < 0)
+        AND pm.replenishment_units > 0
       ORDER BY
-        CASE pm.stock_status
+        CASE pm.status
         WHEN 'Critical' THEN 1
         WHEN 'Reorder' THEN 2
       END,
@@ -280,7 +261,7 @@ router.get('/replenishment/metrics', async (req, res) => {

     // If no data, provide dummy data
     if (!metrics || variants.length === 0) {
-      console.log('No replenishment metrics found, returning dummy data');
+      console.log('No replenishment metrics found in new schema, returning dummy data');

       return res.json({
         productsToReplenish: 15,
@@ -288,11 +269,11 @@ router.get('/replenishment/metrics', async (req, res) => {
         replenishmentCost: 15000.00,
         replenishmentRetail: 30000.00,
         topVariants: [
-          { id: 1, title: "Test Product 1", currentStock: 5, replenishQty: 20, replenishCost: 500, replenishRetail: 1000, status: "Critical" },
-          { id: 2, title: "Test Product 2", currentStock: 10, replenishQty: 15, replenishCost: 450, replenishRetail: 900, status: "Critical" },
-          { id: 3, title: "Test Product 3", currentStock: 15, replenishQty: 10, replenishCost: 300, replenishRetail: 600, status: "Reorder" },
-          { id: 4, title: "Test Product 4", currentStock: 20, replenishQty: 20, replenishCost: 200, replenishRetail: 400, status: "Reorder" },
-          { id: 5, title: "Test Product 5", currentStock: 25, replenishQty: 10, replenishCost: 150, replenishRetail: 300, status: "Reorder" }
+          { id: 1, title: "Test Product 1", currentStock: 5, replenishQty: 20, replenishCost: 500, replenishRetail: 1000, status: "Critical", planningPeriod: "30" },
+          { id: 2, title: "Test Product 2", currentStock: 10, replenishQty: 15, replenishCost: 450, replenishRetail: 900, status: "Critical", planningPeriod: "30" },
+          { id: 3, title: "Test Product 3", currentStock: 15, replenishQty: 10, replenishCost: 300, replenishRetail: 600, status: "Reorder", planningPeriod: "30" },
+          { id: 4, title: "Test Product 4", currentStock: 20, replenishQty: 20, replenishCost: 200, replenishRetail: 400, status: "Reorder", planningPeriod: "30" },
+          { id: 5, title: "Test Product 5", currentStock: 25, replenishQty: 10, replenishCost: 150, replenishRetail: 300, status: "Reorder", planningPeriod: "30" }
         ]
       });
     }
@@ -310,7 +291,8 @@ router.get('/replenishment/metrics', async (req, res) => {
         replenishQty: parseInt(v.replenish_qty) || 0,
         replenishCost: parseFloat(v.replenish_cost) || 0,
         replenishRetail: parseFloat(v.replenish_retail) || 0,
-        status: v.stock_status
+        status: v.status,
+        planningPeriod: v.planning_period
       }))
     };

@@ -325,11 +307,11 @@ router.get('/replenishment/metrics', async (req, res) => {
         replenishmentCost: 15000.00,
         replenishmentRetail: 30000.00,
         topVariants: [
-          { id: 1, title: "Test Product 1", currentStock: 5, replenishQty: 20, replenishCost: 500, replenishRetail: 1000, status: "Critical" },
-          { id: 2, title: "Test Product 2", currentStock: 10, replenishQty: 15, replenishCost: 450, replenishRetail: 900, status: "Critical" },
-          { id: 3, title: "Test Product 3", currentStock: 15, replenishQty: 10, replenishCost: 300, replenishRetail: 600, status: "Reorder" },
-          { id: 4, title: "Test Product 4", currentStock: 20, replenishQty: 20, replenishCost: 200, replenishRetail: 400, status: "Reorder" },
-          { id: 5, title: "Test Product 5", currentStock: 25, replenishQty: 10, replenishCost: 150, replenishRetail: 300, status: "Reorder" }
+          { id: 1, title: "Test Product 1", currentStock: 5, replenishQty: 20, replenishCost: 500, replenishRetail: 1000, status: "Critical", planningPeriod: "30" },
+          { id: 2, title: "Test Product 2", currentStock: 10, replenishQty: 15, replenishCost: 450, replenishRetail: 900, status: "Critical", planningPeriod: "30" },
+          { id: 3, title: "Test Product 3", currentStock: 15, replenishQty: 10, replenishCost: 300, replenishRetail: 600, status: "Reorder", planningPeriod: "30" },
+          { id: 4, title: "Test Product 4", currentStock: 20, replenishQty: 20, replenishCost: 200, replenishRetail: 400, status: "Reorder", planningPeriod: "30" },
+          { id: 5, title: "Test Product 5", currentStock: 25, replenishQty: 10, replenishCost: 150, replenishRetail: 300, status: "Reorder", planningPeriod: "30" }
         ]
       });
     }
@@ -499,74 +481,15 @@ router.get('/forecast/metrics', async (req, res) => {
 // Returns overstock metrics by category
 router.get('/overstock/metrics', async (req, res) => {
   try {
-    const { rows } = await executeQuery(`
-      WITH category_overstock AS (
-        SELECT
-          c.cat_id,
-          c.name as category_name,
-          COUNT(DISTINCT CASE
-            WHEN pm.stock_status = 'Overstocked'
-            THEN p.pid
-          END) as overstocked_products,
-          SUM(CASE
-            WHEN pm.stock_status = 'Overstocked'
-            THEN pm.overstocked_amt
-            ELSE 0
-          END) as total_excess_units,
-          SUM(CASE
-            WHEN pm.stock_status = 'Overstocked'
-            THEN pm.overstocked_amt * p.cost_price
-            ELSE 0
-          END) as total_excess_cost,
-          SUM(CASE
-            WHEN pm.stock_status = 'Overstocked'
-            THEN pm.overstocked_amt * p.price
-            ELSE 0
-          END) as total_excess_retail
-        FROM categories c
-        JOIN product_categories pc ON c.cat_id = pc.cat_id
-        JOIN products p ON pc.pid = p.pid
-        JOIN product_metrics pm ON p.pid = pm.pid
-        GROUP BY c.cat_id, c.name
-      ),
-      filtered_categories AS (
-        SELECT *
-        FROM category_overstock
-        WHERE overstocked_products > 0
-        ORDER BY total_excess_cost DESC
-        LIMIT 8
-      ),
-      summary AS (
-        SELECT
-          SUM(overstocked_products) as total_overstocked,
-          SUM(total_excess_units) as total_excess_units,
-          SUM(total_excess_cost) as total_excess_cost,
-          SUM(total_excess_retail) as total_excess_retail
-        FROM filtered_categories
-      )
-      SELECT
-        s.total_overstocked,
-        s.total_excess_units,
-        s.total_excess_cost,
-        s.total_excess_retail,
-        json_agg(
-          json_build_object(
-            'category', fc.category_name,
-            'products', fc.overstocked_products,
-            'units', fc.total_excess_units,
-            'cost', fc.total_excess_cost,
-            'retail', fc.total_excess_retail
-          )
-        ) as category_data
-      FROM summary s, filtered_categories fc
-      GROUP BY
-        s.total_overstocked,
-        s.total_excess_units,
-        s.total_excess_cost,
-        s.total_excess_retail
+    // Check if we have any products with Overstock status
+    const { rows: [countCheck] } = await executeQuery(`
+      SELECT COUNT(*) as overstock_count FROM product_metrics WHERE status = 'Overstock'
     `);

-    if (rows.length === 0) {
+    console.log('Overstock count:', countCheck.overstock_count);
+
+    // If no overstock products, return empty metrics
+    if (parseInt(countCheck.overstock_count) === 0) {
       return res.json({
         overstockedProducts: 0,
         total_excess_units: 0,
@@ -575,31 +498,51 @@ router.get('/overstock/metrics', async (req, res) => {
         category_data: []
       });
     }

+    // Get summary metrics in a simpler, more direct query
+    const { rows: [summaryMetrics] } = await executeQuery(`
+      SELECT
+        COUNT(DISTINCT pid)::integer as total_overstocked,
+        SUM(overstocked_units)::integer as total_excess_units,
+        ROUND(SUM(overstocked_cost)::numeric, 3) as total_excess_cost,
+        ROUND(SUM(overstocked_retail)::numeric, 3) as total_excess_retail
+      FROM product_metrics
+      WHERE status = 'Overstock'
+    `);
+
+    // Get category breakdowns separately
+    const { rows: categoryData } = await executeQuery(`
+      SELECT
+        c.name as category_name,
+        COUNT(DISTINCT pm.pid)::integer as overstocked_products,
+        SUM(pm.overstocked_units)::integer as total_excess_units,
+        ROUND(SUM(pm.overstocked_cost)::numeric, 3) as total_excess_cost,
+        ROUND(SUM(pm.overstocked_retail)::numeric, 3) as total_excess_retail
+      FROM categories c
+      JOIN product_categories pc ON c.cat_id = pc.cat_id
+      JOIN product_metrics pm ON pc.pid = pm.pid
+      WHERE pm.status = 'Overstock'
+      GROUP BY c.name
+      ORDER BY total_excess_cost DESC
+      LIMIT 8
+    `);
+
-    // Generate dummy data if the query returned empty results
-    if (rows[0].total_overstocked === null || rows[0].total_excess_units === null) {
-      console.log('Empty overstock metrics results, returning dummy data');
-      return res.json({
-        overstockedProducts: 10,
-        total_excess_units: 500,
-        total_excess_cost: 5000,
-        total_excess_retail: 10000,
-        category_data: [
-          { category: "Electronics", products: 3, units: 150, cost: 1500, retail: 3000 },
-          { category: "Clothing", products: 4, units: 200, cost: 2000, retail: 4000 },
-          { category: "Home Goods", products: 2, units: 100, cost: 1000, retail: 2000 },
-          { category: "Office Supplies", products: 1, units: 50, cost: 500, retail: 1000 }
-        ]
-      });
-    }
+    console.log('Summary metrics:', summaryMetrics);
+    console.log('Category data count:', categoryData.length);

     // Format response with explicit type conversion
     const response = {
-      overstockedProducts: parseInt(rows[0].total_overstocked) || 0,
-      total_excess_units: parseInt(rows[0].total_excess_units) || 0,
-      total_excess_cost: parseFloat(rows[0].total_excess_cost) || 0,
-      total_excess_retail: parseFloat(rows[0].total_excess_retail) || 0,
-      category_data: rows[0].category_data || []
+      overstockedProducts: parseInt(summaryMetrics.total_overstocked) || 0,
+      total_excess_units: parseInt(summaryMetrics.total_excess_units) || 0,
+      total_excess_cost: parseFloat(summaryMetrics.total_excess_cost) || 0,
+      total_excess_retail: parseFloat(summaryMetrics.total_excess_retail) || 0,
+      category_data: categoryData.map(cat => ({
+        category: cat.category_name,
+        products: parseInt(cat.overstocked_products) || 0,
+        units: parseInt(cat.total_excess_units) || 0,
+        cost: parseFloat(cat.total_excess_cost) || 0,
+        retail: parseFloat(cat.total_excess_retail) || 0
+      }))
     };

     res.json(response);
@@ -629,27 +572,26 @@ router.get('/overstock/products', async (req, res) => {
  try {
    const { rows } = await executeQuery(`
      SELECT
        pm.pid,
        pm.sku AS SKU,
        pm.title,
        pm.brand,
        pm.vendor,
        pm.current_stock as stock_quantity,
        pm.current_cost_price as cost_price,
        pm.current_price as price,
        pm.sales_velocity_daily as daily_sales_avg,
        pm.stock_cover_in_days as days_of_inventory,
        pm.overstocked_units,
        pm.overstocked_cost as excess_cost,
        pm.overstocked_retail as excess_retail,
        STRING_AGG(c.name, ', ') as categories
      FROM product_metrics pm
      LEFT JOIN product_categories pc ON pm.pid = pc.pid
      LEFT JOIN categories c ON pc.cat_id = c.cat_id
      WHERE pm.status = 'Overstock'
      GROUP BY pm.pid, pm.sku, pm.title, pm.brand, pm.vendor, pm.current_stock, pm.current_cost_price, pm.current_price,
               pm.sales_velocity_daily, pm.stock_cover_in_days, pm.overstocked_units, pm.overstocked_cost, pm.overstocked_retail
      ORDER BY excess_cost DESC
      LIMIT $1
    `, [limit]);
@@ -827,42 +769,38 @@ router.get('/sales/metrics', async (req, res) => {
  const endDate = req.query.endDate || today.toISOString();

  try {
    // Get daily orders and totals for the specified period
    const { rows: dailyRows } = await executeQuery(`
      SELECT
        DATE(date) as sale_date,
        COUNT(DISTINCT order_number) as total_orders,
        SUM(quantity) as total_units,
        SUM(price * quantity) as total_revenue,
        SUM(costeach * quantity) as total_cogs
      FROM orders
      WHERE date BETWEEN $1 AND $2
        AND canceled = false
      GROUP BY DATE(date)
      ORDER BY sale_date
    `, [startDate, endDate]);

    // Get overall metrics for the period
    const { rows: [metrics] } = await executeQuery(`
      SELECT
        COUNT(DISTINCT order_number) as total_orders,
        SUM(quantity) as total_units,
        SUM(price * quantity) as total_revenue,
        SUM(costeach * quantity) as total_cogs
      FROM orders
      WHERE date BETWEEN $1 AND $2
        AND canceled = false
    `, [startDate, endDate]);

    const response = {
      totalOrders: parseInt(metrics?.total_orders) || 0,
      totalUnitsSold: parseInt(metrics?.total_units) || 0,
      totalCogs: parseFloat(metrics?.total_cogs) || 0,
      totalRevenue: parseFloat(metrics?.total_revenue) || 0,
      dailySales: dailyRows.map(day => ({
        date: day.sale_date,
        units: parseInt(day.total_units) || 0,
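The reworked `/sales/metrics` queries return revenue and COGS from `orders` alone and no longer compute `total_profit` in SQL. A consumer can derive profit (and margin) from the two returned fields; a minimal sketch, assuming the row shape shown above (`total_revenue`, `total_cogs` as strings from the driver):

```javascript
// Derive profit and margin client-side from a /sales/metrics row.
// `withProfit` is a hypothetical helper, not part of the route code.
function withProfit(row) {
  const revenue = parseFloat(row.total_revenue) || 0;
  const cogs = parseFloat(row.total_cogs) || 0;
  const profit = revenue - cogs;
  return {
    ...row,
    profit,
    // Guard against division by zero on days with no revenue
    margin: revenue > 0 ? profit / revenue : 0
  };
}

console.log(withProfit({ total_revenue: '100', total_cogs: '60' }));
```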
@@ -1304,39 +1242,33 @@ router.get('/inventory-health', async (req, res) => {
});

// GET /dashboard/replenish/products
// Returns list of products to replenish
router.get('/replenish/products', async (req, res) => {
  const limit = parseInt(req.query.limit) || 50;
  try {
    const { rows } = await executeQuery(`
      SELECT
        pm.pid,
        pm.sku,
        pm.title,
        pm.current_stock AS stock_quantity,
        pm.sales_velocity_daily AS daily_sales_avg,
        pm.replenishment_units AS reorder_qty,
        pm.date_last_received AS last_purchase_date
      FROM product_metrics pm
      WHERE pm.is_replenishable = true
        AND (pm.status IN ('Critical', 'Reorder')
          OR pm.current_stock < 0)
        AND pm.replenishment_units > 0
      ORDER BY
        CASE pm.status
          WHEN 'Critical' THEN 1
          WHEN 'Reorder' THEN 2
        END,
        pm.replenishment_cost DESC
      LIMIT $1
    `, [limit]);

    res.json(rows);
  } catch (err) {
    console.error('Error fetching products to replenish:', err);
    res.status(500).json({ error: 'Failed to fetch products to replenish' });
inventory-server/src/routes/data-management.js (new file, 390 lines)
@@ -0,0 +1,390 @@
const express = require('express');
const router = express.Router();
const { spawn } = require('child_process');
const path = require('path');
const db = require('../utils/db');

// Debug middleware MUST be first
router.use((req, res, next) => {
  console.log(`[CSV Route Debug] ${req.method} ${req.path}`);
  next();
});

// Store active processes and their progress
let activeImport = null;
let importProgress = null;
let activeFullUpdate = null;
let activeFullReset = null;

// SSE clients for progress updates
const updateClients = new Set();
const importClients = new Set();
const resetClients = new Set();
const resetMetricsClients = new Set();
const calculateMetricsClients = new Set();
const fullUpdateClients = new Set();
const fullResetClients = new Set();

// Helper to send progress to specific clients
function sendProgressToClients(clients, data) {
  // If data is a string, send it directly;
  // if it's an object, convert it to JSON
  const message = typeof data === 'string'
    ? `data: ${data}\n\n`
    : `data: ${JSON.stringify(data)}\n\n`;

  clients.forEach(client => {
    try {
      client.write(message);
      // Immediately flush the response
      if (typeof client.flush === 'function') {
        client.flush();
      }
    } catch (error) {
      // Silently remove failed client
      clients.delete(client);
    }
  });
}
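`sendProgressToClients` frames every message in the Server-Sent Events wire format: a `data: <payload>` line terminated by a blank line. A minimal, standalone sketch of that framing and of how a client-side parser (roughly what the browser's `EventSource` does for single-line `data:` events) recovers the payloads:

```javascript
// Build one SSE event: "data: <payload>\n\n", matching sendProgressToClients.
function frame(payload) {
  const body = typeof payload === 'string' ? payload : JSON.stringify(payload);
  return `data: ${body}\n\n`;
}

// Split a buffered stream back into payload strings.
function parseFrames(buffer) {
  return buffer
    .split('\n\n')
    .filter(chunk => chunk.startsWith('data: '))
    .map(chunk => chunk.slice('data: '.length));
}

const stream = frame({ status: 'running' }) + frame('plain text line');
console.log(parseFrames(stream)); // → ['{"status":"running"}', 'plain text line']
```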
// Helper to run a script and stream progress
function runScript(scriptPath, type, clients) {
  return new Promise((resolve, reject) => {
    // Kill any existing process of this type
    switch (type) {
      case 'update':
        if (activeFullUpdate) {
          try { activeFullUpdate.kill(); } catch (e) { }
        }
        break;
      case 'reset':
        if (activeFullReset) {
          try { activeFullReset.kill(); } catch (e) { }
        }
        break;
    }

    const child = spawn('node', [scriptPath], {
      stdio: ['inherit', 'pipe', 'pipe']
    });

    switch (type) {
      case 'update':
        activeFullUpdate = child;
        break;
      case 'reset':
        activeFullReset = child;
        break;
    }

    let output = '';

    child.stdout.on('data', (data) => {
      const text = data.toString();
      output += text;

      // Split by lines to handle multiple JSON outputs
      const lines = text.split('\n');
      lines.filter(line => line.trim()).forEach(line => {
        try {
          // Try to parse as JSON but don't let it affect the display
          const jsonData = JSON.parse(line);
          // Only end the process if we get a final status
          if (jsonData.status === 'complete' || jsonData.status === 'error' || jsonData.status === 'cancelled') {
            if (jsonData.status === 'complete' && !jsonData.operation?.includes('complete')) {
              // Don't close for intermediate completion messages
              sendProgressToClients(clients, line);
              return;
            }
            // Close only on final completion/error/cancellation
            switch (type) {
              case 'update':
                activeFullUpdate = null;
                break;
              case 'reset':
                activeFullReset = null;
                break;
            }
            if (jsonData.status === 'error') {
              reject(new Error(jsonData.error || 'Unknown error'));
            } else {
              resolve({ output });
            }
          }
        } catch (e) {
          // Not JSON, just display as is
        }
        // Always send the raw line
        sendProgressToClients(clients, line);
      });
    });

    child.stderr.on('data', (data) => {
      const text = data.toString();
      console.error(text);
      // Send stderr output directly too
      sendProgressToClients(clients, text);
    });

    child.on('close', (code) => {
      switch (type) {
        case 'update':
          activeFullUpdate = null;
          break;
        case 'reset':
          activeFullReset = null;
          break;
      }

      if (code !== 0) {
        const error = `Script ${scriptPath} exited with code ${code}`;
        sendProgressToClients(clients, error);
        reject(new Error(error));
      }
      // Don't resolve here - let the completion message from the script trigger the resolve
    });

    child.on('error', (err) => {
      switch (type) {
        case 'update':
          activeFullUpdate = null;
          break;
        case 'reset':
          activeFullReset = null;
          break;
      }
      sendProgressToClients(clients, err.message);
      reject(err);
    });
  });
}
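`runScript` implies a protocol for the child scripts (`full-update.js`, `full-reset.js`): newline-delimited JSON on stdout, where the run ends only when a `complete`/`error`/`cancelled` status arrives and, for `complete`, the `operation` text itself contains "complete". A hypothetical child-script sketch honoring that contract (`progressLine`/`emitProgress` are illustrative names, not part of the repo):

```javascript
// Build one newline-delimited JSON progress record for runScript to parse.
function progressLine(status, operation, extra = {}) {
  return JSON.stringify({ status, operation, ...extra });
}

// Emit a record to stdout, one JSON object per line.
function emitProgress(status, operation, extra = {}) {
  process.stdout.write(progressLine(status, operation, extra) + '\n');
}

emitProgress('running', 'Importing products', { processed: 0 });
emitProgress('running', 'Importing products', { processed: 500 });
// Final line: status 'complete' AND operation containing 'complete',
// so runScript resolves instead of treating it as intermediate.
emitProgress('complete', 'Full update complete');
```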
// Progress endpoints
router.get('/:type/progress', (req, res) => {
  const { type } = req.params;
  if (!['update', 'reset'].includes(type)) {
    return res.status(400).json({ error: 'Invalid operation type' });
  }

  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'Access-Control-Allow-Origin': req.headers.origin || '*',
    'Access-Control-Allow-Credentials': 'true'
  });

  // Add this client to the correct set
  const clients = type === 'update' ? fullUpdateClients : fullResetClients;
  clients.add(res);

  // Send initial connection message
  sendProgressToClients(new Set([res]), JSON.stringify({
    status: 'running',
    operation: 'Initializing connection...'
  }));

  // Handle client disconnect
  req.on('close', () => {
    clients.delete(res);
  });
});

// Route to cancel active process
router.post('/cancel', (req, res) => {
  let killed = false;

  // Get the operation type from the request
  const { type } = req.query;
  const clients = type === 'update' ? fullUpdateClients : fullResetClients;
  const activeProcess = type === 'update' ? activeFullUpdate : activeFullReset;

  if (activeProcess) {
    try {
      activeProcess.kill('SIGTERM');
      if (type === 'update') {
        activeFullUpdate = null;
      } else {
        activeFullReset = null;
      }
      killed = true;
      sendProgressToClients(clients, JSON.stringify({
        status: 'cancelled',
        operation: 'Operation cancelled'
      }));
    } catch (err) {
      console.error(`Error killing ${type} process:`, err);
    }
  }

  if (killed) {
    res.json({ success: true });
  } else {
    res.status(404).json({ error: 'No active process to cancel' });
  }
});

// POST /csv/full-update - Run full update script
router.post('/full-update', async (req, res) => {
  try {
    const scriptPath = path.join(__dirname, '../../scripts/full-update.js');
    runScript(scriptPath, 'update', fullUpdateClients)
      .catch(error => {
        console.error('Update failed:', error);
      });
    res.status(202).json({ message: 'Update started' });
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

// POST /csv/full-reset - Run full reset script
router.post('/full-reset', async (req, res) => {
  try {
    const scriptPath = path.join(__dirname, '../../scripts/full-reset.js');
    runScript(scriptPath, 'reset', fullResetClients)
      .catch(error => {
        console.error('Reset failed:', error);
      });
    res.status(202).json({ message: 'Reset started' });
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

// GET /history/import - Get recent import history
router.get('/history/import', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const { rows } = await pool.query(`
      SELECT
        id,
        start_time,
        end_time,
        status,
        error_message,
        records_added::integer,
        records_updated::integer
      FROM import_history
      ORDER BY start_time DESC
      LIMIT 20
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching import history:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /history/calculate - Get recent calculation history
router.get('/history/calculate', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const { rows } = await pool.query(`
      SELECT
        id,
        start_time,
        end_time,
        duration_minutes,
        status,
        error_message,
        total_products,
        total_orders,
        total_purchase_orders,
        processed_products,
        processed_orders,
        processed_purchase_orders,
        additional_info
      FROM calculate_history
      ORDER BY start_time DESC
      LIMIT 20
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching calculate history:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /status/modules - Get module calculation status
router.get('/status/modules', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const { rows } = await pool.query(`
      SELECT
        module_name,
        last_calculation_timestamp::timestamp
      FROM calculate_status
      ORDER BY module_name
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching module status:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /status/tables - Get table sync status
router.get('/status/tables', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const { rows } = await pool.query(`
      SELECT
        table_name,
        last_sync_timestamp::timestamp
      FROM sync_status
      ORDER BY table_name
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching table status:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /status/table-counts - Get record counts for all tables
router.get('/status/table-counts', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const tables = [
      // Core tables
      'products', 'categories', 'product_categories', 'orders', 'purchase_orders',
      // New metrics tables
      'product_metrics', 'daily_product_snapshots', 'brand_metrics', 'category_metrics', 'vendor_metrics',
      // Config tables
      'settings_global', 'settings_vendor', 'settings_product'
    ];

    const counts = await Promise.all(
      tables.map(table =>
        pool.query(`SELECT COUNT(*) as count FROM ${table}`)
          .then(result => ({
            table_name: table,
            count: parseInt(result.rows[0].count)
          }))
          .catch(err => ({
            table_name: table,
            count: null,
            error: err.message
          }))
      )
    );

    // Group tables by type
    const groupedCounts = {
      core: counts.filter(c => ['products', 'categories', 'product_categories', 'orders', 'purchase_orders'].includes(c.table_name)),
      metrics: counts.filter(c => ['product_metrics', 'daily_product_snapshots', 'brand_metrics', 'category_metrics', 'vendor_metrics'].includes(c.table_name)),
      config: counts.filter(c => ['settings_global', 'settings_vendor', 'settings_product'].includes(c.table_name))
    };

    res.json(groupedCounts);
  } catch (error) {
    console.error('Error fetching table counts:', error);
    res.status(500).json({ error: error.message });
  }
});

module.exports = router;
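The `/status/table-counts` handler interpolates `${table}` directly into SQL, which is safe only because `table` comes from the hardcoded `tables` array. A minimal sketch of that identifier-whitelist pattern in isolation (`ALLOWED_TABLES` and `countQueryFor` are illustrative names, not part of the route):

```javascript
// Only names from a hardcoded whitelist are ever interpolated into SQL;
// parameter placeholders ($1) cannot be used for identifiers in Postgres.
const ALLOWED_TABLES = new Set(['products', 'orders', 'product_metrics']);

function countQueryFor(table) {
  if (!ALLOWED_TABLES.has(table)) {
    throw new Error(`Unknown table: ${table}`);
  }
  // Safe to interpolate: `table` is one of the literals above.
  return `SELECT COUNT(*) as count FROM ${table}`;
}

console.log(countQueryFor('orders')); // → SELECT COUNT(*) as count FROM orders
```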
@@ -7,90 +7,230 @@ const { Pool } = require('pg'); // Assuming pg driver
const DEFAULT_PAGE_LIMIT = 50;
const MAX_PAGE_LIMIT = 200; // Prevent excessive data requests

// Define direct mapping from frontend column names to database columns.
// This simplifies the code by eliminating conversion logic.
const COLUMN_MAP = {
  // Product Info
  pid: 'pm.pid',
  sku: 'pm.sku',
  title: 'pm.title',
  brand: 'pm.brand',
  vendor: 'pm.vendor',
  imageUrl: 'pm.image_url',
  isVisible: 'pm.is_visible',
  isReplenishable: 'pm.is_replenishable',
  // Additional Product Fields
  barcode: 'pm.barcode',
  harmonizedTariffCode: 'pm.harmonized_tariff_code',
  vendorReference: 'pm.vendor_reference',
  notionsReference: 'pm.notions_reference',
  line: 'pm.line',
  subline: 'pm.subline',
  artist: 'pm.artist',
  moq: 'pm.moq',
  rating: 'pm.rating',
  reviews: 'pm.reviews',
  weight: 'pm.weight',
  length: 'pm.length',
  width: 'pm.width',
  height: 'pm.height',
  countryOfOrigin: 'pm.country_of_origin',
  location: 'pm.location',
  baskets: 'pm.baskets',
  notifies: 'pm.notifies',
  preorderCount: 'pm.preorder_count',
  notionsInvCount: 'pm.notions_inv_count',
  // Current Status
  currentPrice: 'pm.current_price',
  currentRegularPrice: 'pm.current_regular_price',
  currentCostPrice: 'pm.current_cost_price',
  currentLandingCostPrice: 'pm.current_landing_cost_price',
  currentStock: 'pm.current_stock',
  currentStockCost: 'pm.current_stock_cost',
  currentStockRetail: 'pm.current_stock_retail',
  currentStockGross: 'pm.current_stock_gross',
  onOrderQty: 'pm.on_order_qty',
  onOrderCost: 'pm.on_order_cost',
  onOrderRetail: 'pm.on_order_retail',
  earliestExpectedDate: 'pm.earliest_expected_date',
  // Historical Dates
  dateCreated: 'pm.date_created',
  dateFirstReceived: 'pm.date_first_received',
  dateLastReceived: 'pm.date_last_received',
  dateFirstSold: 'pm.date_first_sold',
  dateLastSold: 'pm.date_last_sold',
  ageDays: 'pm.age_days',
  // Rolling Period Metrics
  sales7d: 'pm.sales_7d',
  revenue7d: 'pm.revenue_7d',
  sales14d: 'pm.sales_14d',
  revenue14d: 'pm.revenue_14d',
  sales30d: 'pm.sales_30d',
  revenue30d: 'pm.revenue_30d',
  cogs30d: 'pm.cogs_30d',
  profit30d: 'pm.profit_30d',
  returnsUnits30d: 'pm.returns_units_30d',
  returnsRevenue30d: 'pm.returns_revenue_30d',
  discounts30d: 'pm.discounts_30d',
  grossRevenue30d: 'pm.gross_revenue_30d',
  grossRegularRevenue30d: 'pm.gross_regular_revenue_30d',
  stockoutDays30d: 'pm.stockout_days_30d',
  sales365d: 'pm.sales_365d',
  revenue365d: 'pm.revenue_365d',
  avgStockUnits30d: 'pm.avg_stock_units_30d',
  avgStockCost30d: 'pm.avg_stock_cost_30d',
  avgStockRetail30d: 'pm.avg_stock_retail_30d',
  avgStockGross30d: 'pm.avg_stock_gross_30d',
  receivedQty30d: 'pm.received_qty_30d',
  receivedCost30d: 'pm.received_cost_30d',
  // Lifetime Metrics
  lifetimeSales: 'pm.lifetime_sales',
  lifetimeRevenue: 'pm.lifetime_revenue',
  // First Period Metrics
  first7DaysSales: 'pm.first_7_days_sales',
  first7DaysRevenue: 'pm.first_7_days_revenue',
  first30DaysSales: 'pm.first_30_days_sales',
  first30DaysRevenue: 'pm.first_30_days_revenue',
  first60DaysSales: 'pm.first_60_days_sales',
  first60DaysRevenue: 'pm.first_60_days_revenue',
  first90DaysSales: 'pm.first_90_days_sales',
  first90DaysRevenue: 'pm.first_90_days_revenue',
  // Calculated KPIs
  asp30d: 'pm.asp_30d',
  acp30d: 'pm.acp_30d',
  avgRos30d: 'pm.avg_ros_30d',
  avgSalesPerDay30d: 'pm.avg_sales_per_day_30d',
  avgSalesPerMonth30d: 'pm.avg_sales_per_month_30d',
  margin30d: 'pm.margin_30d',
  markup30d: 'pm.markup_30d',
  gmroi30d: 'pm.gmroi_30d',
  stockturn30d: 'pm.stockturn_30d',
  returnRate30d: 'pm.return_rate_30d',
  discountRate30d: 'pm.discount_rate_30d',
  stockoutRate30d: 'pm.stockout_rate_30d',
  markdown30d: 'pm.markdown_30d',
  markdownRate30d: 'pm.markdown_rate_30d',
  sellThrough30d: 'pm.sell_through_30d',
  avgLeadTimeDays: 'pm.avg_lead_time_days',
  // Forecasting & Replenishment
  abcClass: 'pm.abc_class',
  salesVelocityDaily: 'pm.sales_velocity_daily',
  configLeadTime: 'pm.config_lead_time',
  configDaysOfStock: 'pm.config_days_of_stock',
  configSafetyStock: 'pm.config_safety_stock',
  planningPeriodDays: 'pm.planning_period_days',
  leadTimeForecastUnits: 'pm.lead_time_forecast_units',
  daysOfStockForecastUnits: 'pm.days_of_stock_forecast_units',
  planningPeriodForecastUnits: 'pm.planning_period_forecast_units',
  leadTimeClosingStock: 'pm.lead_time_closing_stock',
  daysOfStockClosingStock: 'pm.days_of_stock_closing_stock',
  replenishmentNeededRaw: 'pm.replenishment_needed_raw',
  replenishmentUnits: 'pm.replenishment_units',
  replenishmentCost: 'pm.replenishment_cost',
  replenishmentRetail: 'pm.replenishment_retail',
  replenishmentProfit: 'pm.replenishment_profit',
  toOrderUnits: 'pm.to_order_units',
  forecastLostSalesUnits: 'pm.forecast_lost_sales_units',
  forecastLostRevenue: 'pm.forecast_lost_revenue',
  stockCoverInDays: 'pm.stock_cover_in_days',
  poCoverInDays: 'pm.po_cover_in_days',
  sellsOutInDays: 'pm.sells_out_in_days',
  replenishDate: 'pm.replenish_date',
  overstockedUnits: 'pm.overstocked_units',
  overstockedCost: 'pm.overstocked_cost',
  overstockedRetail: 'pm.overstocked_retail',
  isOldStock: 'pm.is_old_stock',
  // Yesterday
  yesterdaySales: 'pm.yesterday_sales',
  // Map status column - directly mapped now instead of calculated on frontend
  status: 'pm.status'
};

function getSafeColumnInfo(queryParamKey) {
  return COLUMN_MAP[queryParamKey] || null;
}

// Define column types for use in sorting/filtering
// This helps apply correct comparison operators and sorting logic
const COLUMN_TYPES = {
  // Numeric columns (use numeric operators and sorting)
  numeric: [
    'pid', 'currentPrice', 'currentRegularPrice', 'currentCostPrice', 'currentLandingCostPrice',
    'currentStock', 'currentStockCost', 'currentStockRetail', 'currentStockGross',
    'onOrderQty', 'onOrderCost', 'onOrderRetail', 'ageDays',
    'sales7d', 'revenue7d', 'sales14d', 'revenue14d', 'sales30d', 'revenue30d',
    'cogs30d', 'profit30d', 'returnsUnits30d', 'returnsRevenue30d', 'discounts30d',
    'grossRevenue30d', 'grossRegularRevenue30d', 'stockoutDays30d', 'sales365d', 'revenue365d',
    'avgStockUnits30d', 'avgStockCost30d', 'avgStockRetail30d', 'avgStockGross30d',
    'receivedQty30d', 'receivedCost30d', 'lifetimeSales', 'lifetimeRevenue',
    'first7DaysSales', 'first7DaysRevenue', 'first30DaysSales', 'first30DaysRevenue',
    'first60DaysSales', 'first60DaysRevenue', 'first90DaysSales', 'first90DaysRevenue',
    'asp30d', 'acp30d', 'avgRos30d', 'avgSalesPerDay30d', 'avgSalesPerMonth30d',
    'margin30d', 'markup30d', 'gmroi30d', 'stockturn30d', 'returnRate30d', 'discountRate30d',
    'stockoutRate30d', 'markdown30d', 'markdownRate30d', 'sellThrough30d', 'avgLeadTimeDays',
    'salesVelocityDaily', 'configLeadTime', 'configDaysOfStock', 'configSafetyStock',
    'planningPeriodDays', 'leadTimeForecastUnits', 'daysOfStockForecastUnits',
    'planningPeriodForecastUnits', 'leadTimeClosingStock', 'daysOfStockClosingStock',
    'replenishmentNeededRaw', 'replenishmentUnits', 'replenishmentCost', 'replenishmentRetail',
    'replenishmentProfit', 'toOrderUnits', 'forecastLostSalesUnits', 'forecastLostRevenue',
    'stockCoverInDays', 'poCoverInDays', 'sellsOutInDays', 'overstockedUnits',
    'overstockedCost', 'overstockedRetail', 'yesterdaySales',
    // New numeric columns
    'moq', 'rating', 'reviews', 'weight', 'length', 'width', 'height',
    'baskets', 'notifies', 'preorderCount', 'notionsInvCount'
  ],
  // Date columns (use date operators and sorting)
  date: [
    'dateCreated', 'dateFirstReceived', 'dateLastReceived', 'dateFirstSold', 'dateLastSold',
    'earliestExpectedDate', 'replenishDate', 'forecastedOutOfStockDate'
  ],
  // String columns (use string operators and sorting)
  string: [
    'sku', 'title', 'brand', 'vendor', 'imageUrl', 'abcClass', 'status',
    // New string columns
    'barcode', 'harmonizedTariffCode', 'vendorReference', 'notionsReference',
    'line', 'subline', 'artist', 'countryOfOrigin', 'location'
  ],
  // Boolean columns (use boolean operators and sorting)
  boolean: ['isVisible', 'isReplenishable', 'isOldStock']
};

// Special sort handling for certain columns
const SPECIAL_SORT_COLUMNS = {
  // Percentage columns where we want to sort by the numeric value
  margin30d: true,
  markup30d: true,
  sellThrough30d: true,
  discountRate30d: true,
  stockoutRate30d: true,
  returnRate30d: true,
  markdownRate30d: true,

  // Columns where we may want to sort by absolute value
  profit30d: 'abs',

  // Velocity columns
  salesVelocityDaily: true,

  // Status column needs special ordering
  status: 'priority'
};

// Status priority for sorting (lower number = higher priority)
const STATUS_PRIORITY = {
  'Critical': 1,
  'At Risk': 2,
  'Reorder': 3,
  'Overstocked': 4,
  'Healthy': 5,
  'New': 6
  // Any other status will be sorted alphabetically after these
};

// Get database column name from frontend column name
function getDbColumn(frontendColumn) {
  const info = COLUMN_MAP[frontendColumn];
  // COLUMN_MAP values are { dbCol, type } objects, so unwrap to the column name
  return info ? info.dbCol : 'pm.title'; // Default to title if not found
}

// Get column type for proper sorting
function getColumnType(frontendColumn) {
  // COLUMN_TYPES is keyed by type name, so search its arrays for the column
  if (COLUMN_TYPES.numeric.includes(frontendColumn)) return 'number';
  if (COLUMN_TYPES.date.includes(frontendColumn)) return 'date';
  if (COLUMN_TYPES.boolean.includes(frontendColumn)) return 'boolean';
  return 'string';
}
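The lookup-with-fallback pattern used by these two helpers can be exercised in isolation. A self-contained sketch with a trimmed map (the real `COLUMN_MAP` in the route file has many more entries):

```javascript
// Trimmed stand-in for the route's COLUMN_MAP of { dbCol, type } entries.
const COLUMN_MAP = {
  title: { dbCol: 'pm.title', type: 'string' },
  sales30d: { dbCol: 'pm.sales_30d', type: 'number' },
  replenishDate: { dbCol: 'pm.replenish_date', type: 'date' }
};

// Unknown keys fall back to a safe default column rather than erroring.
function getDbColumn(frontendColumn) {
  const info = COLUMN_MAP[frontendColumn];
  return info ? info.dbCol : 'pm.title';
}

// Type drives which operators and casts the sort/filter builders use.
function getColumnType(frontendColumn) {
  return COLUMN_MAP[frontendColumn]?.type || 'string';
}
```

Because every unknown key resolves to `pm.title` / `'string'`, arbitrary user-supplied sort keys can never inject column names into the SQL.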

// --- Route Handlers ---
@@ -121,7 +261,7 @@ router.get('/filter-options', async (req, res) => {

// GET /metrics/ - List all product metrics with filtering, sorting, pagination
router.get('/', async (req, res) => {
  const pool = req.app.locals.pool; // Get pool from app instance
  console.log('GET /metrics received query:', req.query);

  try {
@@ -135,10 +275,45 @@ router.get('/', async (req, res) => {

    // --- Sorting ---
    const sortQueryKey = req.query.sort || 'title'; // Default sort field key
    const dbColumn = getDbColumn(sortQueryKey);
    const columnType = getColumnType(sortQueryKey);

    console.log(`Sorting request: ${sortQueryKey} -> ${dbColumn} (${columnType})`);

    const sortDirection = req.query.order?.toLowerCase() === 'desc' ? 'DESC' : 'ASC';

    // Always put nulls last regardless of sort direction or column type
    const nullsOrder = 'NULLS LAST';

    // Build the ORDER BY clause based on column type and special handling
    let orderByClause;

    if (SPECIAL_SORT_COLUMNS[sortQueryKey] === 'abs') {
      // Sort by absolute value for columns where negative values matter
      orderByClause = `ABS(${dbColumn}::numeric) ${sortDirection} ${nullsOrder}`;
    } else if (columnType === 'number' || SPECIAL_SORT_COLUMNS[sortQueryKey] === true) {
      // For numeric columns, cast to numeric to ensure proper sorting
      orderByClause = `${dbColumn}::numeric ${sortDirection} ${nullsOrder}`;
    } else if (columnType === 'date') {
      // For date columns, cast to timestamp to ensure proper sorting
      orderByClause = `CASE WHEN ${dbColumn} IS NULL THEN 1 ELSE 0 END, ${dbColumn}::timestamp ${sortDirection}`;
    } else if (SPECIAL_SORT_COLUMNS[sortQueryKey] === 'priority') {
      // Special handling for status column, using priority for known statuses
      orderByClause = `
        CASE WHEN ${dbColumn} IS NULL THEN 999
             WHEN ${dbColumn} = 'Critical' THEN 1
             WHEN ${dbColumn} = 'At Risk' THEN 2
             WHEN ${dbColumn} = 'Reorder' THEN 3
             WHEN ${dbColumn} = 'Overstocked' THEN 4
             WHEN ${dbColumn} = 'Healthy' THEN 5
             WHEN ${dbColumn} = 'New' THEN 6
             ELSE 100
        END ${sortDirection} ${nullsOrder},
        ${dbColumn} ${sortDirection}`;
    } else {
      // For string and boolean columns, no special casting needed
      orderByClause = `CASE WHEN ${dbColumn} IS NULL THEN 1 ELSE 0 END, ${dbColumn} ${sortDirection}`;
    }
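The numeric branch of this builder is easy to sanity-check in isolation. A self-contained sketch (the `buildNumericOrderBy` helper is hypothetical; the route inlines this logic):

```javascript
// Minimal reproduction of the numeric-column branch of the ORDER BY builder.
// Casting to ::numeric makes Postgres sort by value, not text representation.
function buildNumericOrderBy(dbColumn, sortDirection, nullsOrder = 'NULLS LAST') {
  return `${dbColumn}::numeric ${sortDirection} ${nullsOrder}`;
}

const clause = buildNumericOrderBy('pm.sales_30d', 'DESC');
// clause is then interpolated as: ORDER BY ${clause}
```

Note that only `sortDirection` and `nullsOrder` are trusted literals here; `dbColumn` must come from the whitelist lookup above, never from raw user input.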

    // --- Filtering ---
    const conditions = [];
@@ -149,9 +324,24 @@ router.get('/', async (req, res) => {
    if (req.query.showInvisible !== 'true') conditions.push(`pm.is_visible = true`);
    if (req.query.showNonReplenishable !== 'true') conditions.push(`pm.is_replenishable = true`);

    // Special handling for stock_status
    if (req.query.stock_status) {
      const status = req.query.stock_status;
      // Handle special case for "at-risk" which is stored as "At Risk" in the database
      if (status.toLowerCase() === 'at-risk') {
        conditions.push(`pm.status = $${paramCounter++}`);
        params.push('At Risk');
      } else {
        // Capitalize first letter to match database values
        conditions.push(`pm.status = $${paramCounter++}`);
        params.push(status.charAt(0).toUpperCase() + status.slice(1));
      }
    }

    // Process other filters from query parameters
    for (const key in req.query) {
      // Skip control params
      if (['page', 'limit', 'sort', 'order', 'showInvisible', 'showNonReplenishable', 'stock_status'].includes(key)) continue;

      let filterKey = key;
      let operator = '='; // Default operator
@@ -164,15 +354,15 @@ router.get('/', async (req, res) => {
        operator = operatorMatch[2]; // e.g., "gt"
      }
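The `_gt`-style suffix convention being parsed here (the full regex appears in the vendors-aggregate route further down) can be exercised on its own. A standalone sketch; `parseFilterKey` is a hypothetical wrapper around the same match:

```javascript
// Parse a filter key like "sales30d_gte" into { filterKey, operator }.
// Same suffix regex as the vendors-aggregate route uses.
function parseFilterKey(key) {
  let filterKey = key;
  let operator = 'eq'; // default operator when no suffix is present
  const m = key.match(/^(.*)_(eq|ne|gt|gte|lt|lte|like|ilike|between|in)$/);
  if (m) {
    filterKey = m[1]; // e.g. "sales30d"
    operator = m[2];  // e.g. "gte"
  }
  return { filterKey, operator };
}
```

Because the suffix list is a closed alternation, a column that legitimately ends in `_in` or `_like` would be misparsed; the whitelist lookup that follows catches such cases by rejecting the truncated key.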

      // Get the database column for this filter key
      const dbColumn = getDbColumn(filterKey);
      const valueType = getColumnType(filterKey);

      if (!dbColumn) {
        console.warn(`Invalid filter key ignored: ${key}`);
        continue; // Skip if the key doesn't map to a known column
      }

      // --- Build WHERE clause fragment ---
      try {
        let conditionFragment = '';
@@ -234,6 +424,10 @@ router.get('/', async (req, res) => {
    // --- Construct and Execute Queries ---
    const whereClause = conditions.length > 0 ? `WHERE ${conditions.join(' AND ')}` : '';

    // Debug log of conditions and parameters
    console.log('Constructed WHERE conditions:', conditions);
    console.log('Parameters:', params);

    // Count Query
    const countSql = `SELECT COUNT(*) AS total FROM public.product_metrics pm ${whereClause}`;
    console.log('Executing Count Query:', countSql, params);
@@ -244,11 +438,20 @@ router.get('/', async (req, res) => {
      SELECT pm.*
      FROM public.product_metrics pm
      ${whereClause}
      ORDER BY ${orderByClause}
      LIMIT $${paramCounter} OFFSET $${paramCounter + 1}
    `;
    const dataParams = [...params, limit, offset];

    // Log detailed query information for debugging
    console.log('Executing Data Query:');
    console.log('  - Sort Column:', dbColumn);
    console.log('  - Column Type:', columnType);
    console.log('  - Sort Direction:', sortDirection);
    console.log('  - Order By Clause:', orderByClause);
    console.log('  - Full SQL:', dataSql);
    console.log('  - Parameters:', dataParams);

    const dataPromise = pool.query(dataSql, dataParams);

    // Execute queries in parallel
@@ -23,10 +23,7 @@ router.get('/brands', async (req, res) => {
|
||||
const { rows } = await pool.query(`
|
||||
SELECT DISTINCT COALESCE(p.brand, 'Unbranded') as brand
|
||||
FROM products p
|
||||
JOIN purchase_orders po ON p.pid = po.pid
|
||||
WHERE p.visible = true
|
||||
GROUP BY COALESCE(p.brand, 'Unbranded')
|
||||
HAVING SUM(po.cost_price * po.received) >= 500
|
||||
ORDER BY COALESCE(p.brand, 'Unbranded')
|
||||
`);
|
||||
|
||||
@@ -629,163 +626,6 @@ router.get('/:id', async (req, res) => {
|
||||
}
|
||||
});
|
||||
|
||||
// Import products from CSV
|
||||
router.post('/import', upload.single('file'), async (req, res) => {
|
||||
if (!req.file) {
|
||||
return res.status(400).json({ error: 'No file uploaded' });
|
||||
}
|
||||
|
||||
try {
|
||||
const result = await importProductsFromCSV(req.file.path, req.app.locals.pool);
|
||||
// Clean up the uploaded file
|
||||
require('fs').unlinkSync(req.file.path);
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
console.error('Error importing products:', error);
|
||||
res.status(500).json({ error: 'Failed to import products' });
|
||||
}
|
||||
});
|
||||
|
||||
// Update a product
router.put('/:id', async (req, res) => {
  const pool = req.app.locals.pool;
  try {
    const {
      title,
      sku,
      stock_quantity,
      price,
      regular_price,
      cost_price,
      vendor,
      brand,
      categories,
      visible,
      managing_stock
    } = req.body;

    const { rowCount } = await pool.query(
      `UPDATE products
       SET title = $1,
           sku = $2,
           stock_quantity = $3,
           price = $4,
           regular_price = $5,
           cost_price = $6,
           vendor = $7,
           brand = $8,
           categories = $9,
           visible = $10,
           managing_stock = $11
       WHERE pid = $12`,
      [
        title,
        sku,
        stock_quantity,
        price,
        regular_price,
        cost_price,
        vendor,
        brand,
        categories,
        visible,
        managing_stock,
        req.params.id
      ]
    );

    if (rowCount === 0) {
      return res.status(404).json({ error: 'Product not found' });
    }

    res.json({ message: 'Product updated successfully' });
  } catch (error) {
    console.error('Error updating product:', error);
    res.status(500).json({ error: 'Failed to update product' });
  }
});

// Get product metrics
router.get('/:id/metrics', async (req, res) => {
  const pool = req.app.locals.pool;
  try {
    const { id } = req.params;

    // Get metrics from product_metrics table with inventory health data
    const { rows: metrics } = await pool.query(`
      WITH inventory_status AS (
        SELECT
          p.pid,
          CASE
            WHEN pm.daily_sales_avg = 0 THEN 'New'
            WHEN p.stock_quantity <= CEIL(pm.daily_sales_avg * 7) THEN 'Critical'
            WHEN p.stock_quantity <= CEIL(pm.daily_sales_avg * 14) THEN 'Reorder'
            WHEN p.stock_quantity > (pm.daily_sales_avg * 90) THEN 'Overstocked'
            ELSE 'Healthy'
          END as calculated_status
        FROM products p
        LEFT JOIN product_metrics pm ON p.pid = pm.pid
        WHERE p.pid = $1
      )
      SELECT
        COALESCE(pm.daily_sales_avg, 0) as daily_sales_avg,
        COALESCE(pm.weekly_sales_avg, 0) as weekly_sales_avg,
        COALESCE(pm.monthly_sales_avg, 0) as monthly_sales_avg,
        COALESCE(pm.days_of_inventory, 0) as days_of_inventory,
        COALESCE(pm.reorder_point, CEIL(COALESCE(pm.daily_sales_avg, 0) * 14)) as reorder_point,
        COALESCE(pm.safety_stock, CEIL(COALESCE(pm.daily_sales_avg, 0) * 7)) as safety_stock,
        COALESCE(pm.avg_margin_percent,
          ((p.price - COALESCE(p.cost_price, 0)) / NULLIF(p.price, 0)) * 100
        ) as avg_margin_percent,
        COALESCE(pm.total_revenue, 0) as total_revenue,
        COALESCE(pm.inventory_value, p.stock_quantity * COALESCE(p.cost_price, 0)) as inventory_value,
        COALESCE(pm.turnover_rate, 0) as turnover_rate,
        COALESCE(pm.abc_class, 'C') as abc_class,
        COALESCE(pm.stock_status, is.calculated_status) as stock_status,
        COALESCE(pm.avg_lead_time_days, 0) as avg_lead_time_days,
        COALESCE(pm.current_lead_time, 0) as current_lead_time,
        COALESCE(pm.target_lead_time, 14) as target_lead_time,
        COALESCE(pm.lead_time_status, 'Unknown') as lead_time_status,
        COALESCE(pm.reorder_qty, 0) as reorder_qty,
        COALESCE(pm.overstocked_amt, 0) as overstocked_amt
      FROM products p
      LEFT JOIN product_metrics pm ON p.pid = pm.pid
      LEFT JOIN inventory_status is ON p.pid = is.pid
      WHERE p.pid = $2
    `, [id, id]);

    if (!metrics.length) {
      // Return default metrics structure if no data found
      res.json({
        daily_sales_avg: 0,
        weekly_sales_avg: 0,
        monthly_sales_avg: 0,
        days_of_inventory: 0,
        reorder_point: 0,
        safety_stock: 0,
        avg_margin_percent: 0,
        total_revenue: 0,
        inventory_value: 0,
        turnover_rate: 0,
        abc_class: 'C',
        stock_status: 'New',
        avg_lead_time_days: 0,
        current_lead_time: 0,
        target_lead_time: 14,
        lead_time_status: 'Unknown',
        reorder_qty: 0,
        overstocked_amt: 0
      });
      return;
    }

    res.json(metrics[0]);
  } catch (error) {
    console.error('Error fetching product metrics:', error);
    res.status(500).json({ error: 'Failed to fetch product metrics' });
  }
});

// Get product time series data
router.get('/:id/time-series', async (req, res) => {
  const { id } = req.params;

@@ -1,108 +0,0 @@

const express = require('express');
const router = express.Router();

// Get vendors with pagination, filtering, and sorting
router.get('/', async (req, res) => {
  const pool = req.app.locals.pool;
  try {
    // Get all vendors with metrics
    const { rows: vendors } = await pool.query(`
      SELECT DISTINCT
        p.vendor as name,
        COALESCE(vm.active_products, 0) as active_products,
        COALESCE(vm.total_orders, 0) as total_orders,
        COALESCE(vm.avg_lead_time_days, 0) as avg_lead_time_days,
        COALESCE(vm.on_time_delivery_rate, 0) as on_time_delivery_rate,
        COALESCE(vm.order_fill_rate, 0) as order_fill_rate,
        CASE
          WHEN COALESCE(vm.total_orders, 0) > 0 AND COALESCE(vm.order_fill_rate, 0) >= 75 THEN 'active'
          WHEN COALESCE(vm.total_orders, 0) > 0 THEN 'inactive'
          ELSE 'pending'
        END as status
      FROM products p
      LEFT JOIN vendor_metrics vm ON p.vendor = vm.vendor
      WHERE p.vendor IS NOT NULL AND p.vendor != ''
    `);

    // Get cost metrics for all vendors
    const vendorNames = vendors.map(v => v.name);
    const { rows: costMetrics } = await pool.query(`
      SELECT
        vendor,
        ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
        ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
      FROM purchase_orders
      WHERE status = 2
        AND cost_price IS NOT NULL
        AND ordered > 0
        AND vendor = ANY($1)
      GROUP BY vendor
    `, [vendorNames]);

    // Create a map of cost metrics by vendor
    const costMetricsMap = costMetrics.reduce((acc, curr) => {
      acc[curr.vendor] = {
        avg_unit_cost: curr.avg_unit_cost,
        total_spend: curr.total_spend
      };
      return acc;
    }, {});

    // Get overall stats
    const { rows: [stats] } = await pool.query(`
      SELECT
        COUNT(DISTINCT p.vendor) as totalVendors,
        COUNT(DISTINCT CASE
          WHEN COALESCE(vm.total_orders, 0) > 0 AND COALESCE(vm.order_fill_rate, 0) >= 75
          THEN p.vendor
        END) as activeVendors,
        COALESCE(ROUND(AVG(NULLIF(vm.avg_lead_time_days, 0))::numeric, 1), 0) as avgLeadTime,
        COALESCE(ROUND(AVG(NULLIF(vm.order_fill_rate, 0))::numeric, 1), 0) as avgFillRate,
        COALESCE(ROUND(AVG(NULLIF(vm.on_time_delivery_rate, 0))::numeric, 1), 0) as avgOnTimeDelivery
      FROM products p
      LEFT JOIN vendor_metrics vm ON p.vendor = vm.vendor
      WHERE p.vendor IS NOT NULL AND p.vendor != ''
    `);

    // Get overall cost metrics
    const { rows: [overallCostMetrics] } = await pool.query(`
      SELECT
        ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
        ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
      FROM purchase_orders
      WHERE status = 2
        AND cost_price IS NOT NULL
        AND ordered > 0
        AND vendor IS NOT NULL AND vendor != ''
    `);

    res.json({
      vendors: vendors.map(vendor => ({
        vendor_id: vendor.name,
        name: vendor.name,
        status: vendor.status,
        avg_lead_time_days: parseFloat(vendor.avg_lead_time_days),
        on_time_delivery_rate: parseFloat(vendor.on_time_delivery_rate),
        order_fill_rate: parseFloat(vendor.order_fill_rate),
        total_orders: parseInt(vendor.total_orders),
        active_products: parseInt(vendor.active_products),
        avg_unit_cost: parseFloat(costMetricsMap[vendor.name]?.avg_unit_cost || 0),
        total_spend: parseFloat(costMetricsMap[vendor.name]?.total_spend || 0)
      })),
      stats: {
        totalVendors: parseInt(stats.totalvendors),
        activeVendors: parseInt(stats.activevendors),
        avgLeadTime: parseFloat(stats.avgleadtime),
        avgFillRate: parseFloat(stats.avgfillrate),
        avgOnTimeDelivery: parseFloat(stats.avgontimedelivery),
        avgUnitCost: parseFloat(overallCostMetrics.avg_unit_cost),
        totalSpend: parseFloat(overallCostMetrics.total_spend)
      }
    });
  } catch (error) {
    console.error('Error fetching vendors:', error);
    res.status(500).json({ error: 'Failed to fetch vendors' });
  }
});

module.exports = router;
320
inventory-server/src/routes/vendorsAggregate.js
Normal file
@@ -0,0 +1,320 @@

const express = require('express');
const router = express.Router();
const { parseValue } = require('../utils/apiHelpers'); // Adjust path if needed

// --- Configuration & Helpers ---
const DEFAULT_PAGE_LIMIT = 50;
const MAX_PAGE_LIMIT = 200;

// Maps query keys to DB columns in vendor_metrics
const COLUMN_MAP = {
  vendorName: { dbCol: 'vm.vendor_name', type: 'string' },
  productCount: { dbCol: 'vm.product_count', type: 'number' },
  activeProductCount: { dbCol: 'vm.active_product_count', type: 'number' },
  replenishableProductCount: { dbCol: 'vm.replenishable_product_count', type: 'number' },
  currentStockUnits: { dbCol: 'vm.current_stock_units', type: 'number' },
  currentStockCost: { dbCol: 'vm.current_stock_cost', type: 'number' },
  currentStockRetail: { dbCol: 'vm.current_stock_retail', type: 'number' },
  onOrderUnits: { dbCol: 'vm.on_order_units', type: 'number' },
  onOrderCost: { dbCol: 'vm.on_order_cost', type: 'number' },
  poCount365d: { dbCol: 'vm.po_count_365d', type: 'number' },
  avgLeadTimeDays: { dbCol: 'vm.avg_lead_time_days', type: 'number' },
  sales7d: { dbCol: 'vm.sales_7d', type: 'number' },
  revenue7d: { dbCol: 'vm.revenue_7d', type: 'number' },
  sales30d: { dbCol: 'vm.sales_30d', type: 'number' },
  revenue30d: { dbCol: 'vm.revenue_30d', type: 'number' },
  profit30d: { dbCol: 'vm.profit_30d', type: 'number' },
  cogs30d: { dbCol: 'vm.cogs_30d', type: 'number' },
  sales365d: { dbCol: 'vm.sales_365d', type: 'number' },
  revenue365d: { dbCol: 'vm.revenue_365d', type: 'number' },
  lifetimeSales: { dbCol: 'vm.lifetime_sales', type: 'number' },
  lifetimeRevenue: { dbCol: 'vm.lifetime_revenue', type: 'number' },
  avgMargin30d: { dbCol: 'vm.avg_margin_30d', type: 'number' },
  // Add aliases if needed for frontend compatibility
  name: { dbCol: 'vm.vendor_name', type: 'string' },
  leadTime: { dbCol: 'vm.avg_lead_time_days', type: 'number' },
  // Add status for filtering
  status: { dbCol: 'vendor_status', type: 'string' },
};

function getSafeColumnInfo(queryParamKey) {
  return COLUMN_MAP[queryParamKey] || null;
}

// --- Route Handlers ---

// GET /vendors-aggregate/filter-options (Just vendors list for now)
router.get('/filter-options', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /vendors-aggregate/filter-options');
  try {
    // Get vendor names
    const { rows: vendorRows } = await pool.query(`
      SELECT DISTINCT vendor_name FROM public.vendor_metrics ORDER BY vendor_name
    `);

    // Get status values - calculate them since they're derived
    const { rows: statusRows } = await pool.query(`
      SELECT DISTINCT
        CASE
          WHEN po_count_365d > 0 AND sales_30d > 0 THEN 'active'
          WHEN po_count_365d > 0 THEN 'inactive'
          ELSE 'pending'
        END as status
      FROM public.vendor_metrics
      ORDER BY status
    `);

    res.json({
      vendors: vendorRows.map(r => r.vendor_name),
      statuses: statusRows.map(r => r.status)
    });
  } catch (error) {
    console.error('Error fetching vendor filter options:', error);
    res.status(500).json({ error: 'Failed to fetch filter options' });
  }
});

// GET /vendors-aggregate/stats (Overall vendor stats)
router.get('/stats', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /vendors-aggregate/stats');
  try {
    // Get basic vendor stats from aggregate table
    const { rows: [stats] } = await pool.query(`
      SELECT
        COUNT(*) AS total_vendors,
        SUM(active_product_count) AS total_active_products,
        SUM(current_stock_cost) AS total_stock_value,
        SUM(on_order_cost) AS total_on_order_value,
        AVG(NULLIF(avg_lead_time_days, 0)) AS overall_avg_lead_time
      FROM public.vendor_metrics vm
    `);

    // Count active vendors based on criteria (from old vendors.js)
    const { rows: [activeStats] } = await pool.query(`
      SELECT
        COUNT(DISTINCT CASE
          WHEN po_count_365d > 0
          THEN vendor_name
        END) as active_vendors
      FROM public.vendor_metrics
    `);

    // Get overall cost metrics from purchase orders
    const { rows: [overallCostMetrics] } = await pool.query(`
      SELECT
        ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
        ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
      FROM purchase_orders
      WHERE cost_price IS NOT NULL
        AND ordered > 0
        AND vendor IS NOT NULL AND vendor != ''
    `);

    res.json({
      totalVendors: parseInt(stats?.total_vendors || 0),
      activeVendors: parseInt(activeStats?.active_vendors || 0),
      totalActiveProducts: parseInt(stats?.total_active_products || 0),
      totalValue: parseFloat(stats?.total_stock_value || 0),
      totalOnOrderValue: parseFloat(stats?.total_on_order_value || 0),
      avgLeadTime: parseFloat(stats?.overall_avg_lead_time || 0),
      avgUnitCost: parseFloat(overallCostMetrics?.avg_unit_cost || 0),
      totalSpend: parseFloat(overallCostMetrics?.total_spend || 0)
    });
  } catch (error) {
    console.error('Error fetching vendor stats:', error);
    res.status(500).json({ error: 'Failed to fetch vendor stats.' });
  }
});

// GET /vendors-aggregate/ (List vendors)
router.get('/', async (req, res) => {
  const pool = req.app.locals.pool;
  console.log('GET /vendors-aggregate received query:', req.query);
  try {
    // --- Pagination ---
    let page = parseInt(req.query.page, 10) || 1;
    let limit = parseInt(req.query.limit, 10) || DEFAULT_PAGE_LIMIT;
    limit = Math.min(limit, MAX_PAGE_LIMIT);
    const offset = (page - 1) * limit;
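The clamping above is easy to check in isolation. A standalone sketch of the same arithmetic (the `paginate` wrapper is hypothetical; the route inlines it):

```javascript
const DEFAULT_PAGE_LIMIT = 50;
const MAX_PAGE_LIMIT = 200;

// Mirror of the route's pagination parsing: bad or missing input falls
// back to defaults, and limit is capped at MAX_PAGE_LIMIT.
function paginate(query) {
  const page = parseInt(query.page, 10) || 1;
  let limit = parseInt(query.limit, 10) || DEFAULT_PAGE_LIMIT;
  limit = Math.min(limit, MAX_PAGE_LIMIT);
  return { page, limit, offset: (page - 1) * limit };
}
```

For example, `?page=3&limit=500` yields `limit: 200, offset: 400`, and an empty query yields the first page of 50.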

    // --- Sorting ---
    const sortQueryKey = req.query.sort || 'vendorName'; // Default sort
    const sortColumnInfo = getSafeColumnInfo(sortQueryKey);
    const sortColumn = sortColumnInfo ? sortColumnInfo.dbCol : 'vm.vendor_name';
    const sortDirection = req.query.order?.toLowerCase() === 'desc' ? 'DESC' : 'ASC';
    const nullsOrder = (sortDirection === 'ASC' ? 'NULLS FIRST' : 'NULLS LAST');
    const sortClause = `ORDER BY ${sortColumn} ${sortDirection} ${nullsOrder}`;

    // --- Filtering ---
    const conditions = [];
    const params = [];
    let paramCounter = 1;
    // Build conditions based on req.query, using COLUMN_MAP and parseValue
    for (const key in req.query) {
      if (['page', 'limit', 'sort', 'order'].includes(key)) continue;

      let filterKey = key;
      let operator = '='; // Default operator
      const value = req.query[key];

      const operatorMatch = key.match(/^(.*)_(eq|ne|gt|gte|lt|lte|like|ilike|between|in)$/);
      if (operatorMatch) {
        filterKey = operatorMatch[1];
        operator = operatorMatch[2];
      }

      const columnInfo = getSafeColumnInfo(filterKey);
      if (columnInfo) {
        const dbColumn = columnInfo.dbCol;
        const valueType = columnInfo.type;
        // Declared outside the try block so the catch handler can see them
        let conditionFragment = '';
        let needsParam = true;
        try {
          switch (operator.toLowerCase()) { // Normalize operator
            case 'eq': operator = '='; break;
            case 'ne': operator = '<>'; break;
            case 'gt': operator = '>'; break;
            case 'gte': operator = '>='; break;
            case 'lt': operator = '<'; break;
            case 'lte': operator = '<='; break;
            case 'like': operator = 'LIKE'; needsParam = false; params.push(`%${parseValue(value, valueType)}%`); break;
            case 'ilike': operator = 'ILIKE'; needsParam = false; params.push(`%${parseValue(value, valueType)}%`); break;
            case 'between': {
              const [val1, val2] = String(value).split(',');
              if (val1 !== undefined && val2 !== undefined) {
                conditionFragment = `${dbColumn} BETWEEN $${paramCounter++} AND $${paramCounter++}`;
                params.push(parseValue(val1, valueType), parseValue(val2, valueType));
                needsParam = false;
              } else continue;
              break;
            }
            case 'in': {
              const inValues = String(value).split(',');
              if (inValues.length > 0) {
                const placeholders = inValues.map(() => `$${paramCounter++}`).join(', ');
                conditionFragment = `${dbColumn} IN (${placeholders})`;
                params.push(...inValues.map(v => parseValue(v, valueType)));
                needsParam = false;
              } else continue;
              break;
            }
            default: operator = '='; break;
          }

          if (needsParam) {
            conditionFragment = `${dbColumn} ${operator} $${paramCounter++}`;
            params.push(parseValue(value, valueType));
          } else if (!conditionFragment) { // For LIKE/ILIKE
            conditionFragment = `${dbColumn} ${operator} $${paramCounter++}`;
          }

          if (conditionFragment) {
            conditions.push(`(${conditionFragment})`);
          }
        } catch (parseError) {
          console.warn(`Skipping filter for key "${key}" due to parsing error: ${parseError.message}`);
          if (needsParam) paramCounter--;
        }
      } else {
        console.warn(`Invalid filter key ignored: ${key}`);
      }
    }

    // --- Execute Queries ---
    const whereClause = conditions.length > 0 ? `WHERE ${conditions.join(' AND ')}` : '';

    // Status calculation from vendors.js
    const statusCase = `
      CASE
        WHEN po_count_365d > 0 AND sales_30d > 0 THEN 'active'
        WHEN po_count_365d > 0 THEN 'inactive'
        ELSE 'pending'
      END as vendor_status
    `;

    const baseSql = `
      FROM (
        SELECT
          vm.*,
          ${statusCase}
        FROM public.vendor_metrics vm
      ) vm
      ${whereClause}
    `;

    const countSql = `SELECT COUNT(*) AS total ${baseSql}`;
    const dataSql = `
      WITH vendor_data AS (
        SELECT
          vm.*,
          ${statusCase}
        FROM public.vendor_metrics vm
      )
      SELECT
        vm.*,
        COALESCE(po.avg_unit_cost, 0) as avg_unit_cost,
        COALESCE(po.total_spend, 0) as total_spend
      FROM vendor_data vm
      LEFT JOIN (
        SELECT
          vendor,
          ROUND((SUM(ordered * cost_price)::numeric / NULLIF(SUM(ordered), 0)), 2) as avg_unit_cost,
          ROUND(SUM(ordered * cost_price)::numeric, 3) as total_spend
        FROM purchase_orders
        WHERE cost_price IS NOT NULL AND ordered > 0
        GROUP BY vendor
      ) po ON vm.vendor_name = po.vendor
      ${whereClause}
      ${sortClause}
      LIMIT $${paramCounter} OFFSET $${paramCounter + 1}
    `;
    const dataParams = [...params, limit, offset];

    console.log("Count SQL:", countSql, params);
    console.log("Data SQL:", dataSql, dataParams);

    const [countResult, dataResult] = await Promise.all([
      pool.query(countSql, params),
      pool.query(dataSql, dataParams)
    ]);

    const total = parseInt(countResult.rows[0].total, 10);
    const vendors = dataResult.rows.map(row => {
      // Create a new object with both snake_case and camelCase keys
      const transformedRow = { ...row }; // Start with original data

      for (const key in row) {
        // Skip null/undefined values
        if (row[key] === null || row[key] === undefined) {
||||
continue; // Original already has the null value
|
||||
}
|
||||
|
||||
// Transform keys to match frontend expectations (add camelCase versions)
|
||||
// First handle cases like sales_7d -> sales7d
|
||||
let camelKey = key.replace(/_(\d+[a-z])/g, '$1');
|
||||
|
||||
// Then handle regular snake_case -> camelCase
|
||||
camelKey = camelKey.replace(/_([a-z])/g, (_, letter) => letter.toUpperCase());
|
||||
if (camelKey !== key) { // Only add if different from original
|
||||
transformedRow[camelKey] = row[key];
|
||||
}
|
||||
}
|
||||
return transformedRow;
|
||||
});
|
||||
|
||||
// --- Respond ---
|
||||
res.json({
|
||||
vendors,
|
||||
pagination: { total, pages: Math.ceil(total / limit), currentPage: page, limit },
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error fetching vendor metrics list:', error);
|
||||
res.status(500).json({ error: 'Failed to fetch vendor metrics.' });
|
||||
}
|
||||
});
|
||||
|
||||
// GET /vendors-aggregate/:name (Get single vendor metric)
|
||||
// Implement if needed, remember to URL-decode the name parameter
|
||||
|
||||
module.exports = router;
|
||||
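The two-step key transformation in the route above can be exercised in isolation. The helper name `toCamelKey` below is ours for illustration; the route inlines this logic in its row-mapping loop:

```javascript
// Standalone sketch of the route's snake_case -> camelCase key mapping.
// The helper name `toCamelKey` is illustrative; the route inlines this logic.
function toCamelKey(key) {
  // First collapse numeric suffixes: sales_7d -> sales7d
  let camelKey = key.replace(/_(\d+[a-z])/g, '$1');
  // Then regular snake_case -> camelCase: vendor_name -> vendorName
  return camelKey.replace(/_([a-z])/g, (_, letter) => letter.toUpperCase());
}

console.log(toCamelKey('sales_7d'));      // sales7d
console.log(toCamelKey('vendor_name'));   // vendorName
console.log(toCamelKey('po_count_365d')); // poCount365d
```

Note the numeric-suffix pass runs first, so `sales_7d` becomes `sales7d` rather than `sales_7D`.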
@@ -8,18 +8,19 @@ const { initPool } = require('./utils/db');
 const productsRouter = require('./routes/products');
 const dashboardRouter = require('./routes/dashboard');
 const ordersRouter = require('./routes/orders');
-const csvRouter = require('./routes/csv');
+const csvRouter = require('./routes/data-management');
 const analyticsRouter = require('./routes/analytics');
 const purchaseOrdersRouter = require('./routes/purchase-orders');
 const configRouter = require('./routes/config');
 const metricsRouter = require('./routes/metrics');
 const vendorsRouter = require('./routes/vendors');
 const categoriesRouter = require('./routes/categories');
 const importRouter = require('./routes/import');
 const aiValidationRouter = require('./routes/ai-validation');
 const templatesRouter = require('./routes/templates');
 const aiPromptsRouter = require('./routes/ai-prompts');
 const reusableImagesRouter = require('./routes/reusable-images');
 const categoriesAggregateRouter = require('./routes/categoriesAggregate');
 const vendorsAggregateRouter = require('./routes/vendorsAggregate');
 const brandsAggregateRouter = require('./routes/brandsAggregate');

 // Get the absolute path to the .env file
 const envPath = '/var/www/html/inventory/.env';
@@ -100,8 +101,13 @@ async function startServer() {
   app.use('/api/purchase-orders', purchaseOrdersRouter);
   app.use('/api/config', configRouter);
   app.use('/api/metrics', metricsRouter);
-  app.use('/api/vendors', vendorsRouter);
-  app.use('/api/categories', categoriesRouter);
+  // Use only the aggregate routes for vendors and categories
+  app.use('/api/vendors', vendorsAggregateRouter);
+  app.use('/api/categories', categoriesAggregateRouter);
+  // Keep the aggregate-specific endpoints for backward compatibility
+  app.use('/api/categories-aggregate', categoriesAggregateRouter);
+  app.use('/api/vendors-aggregate', vendorsAggregateRouter);
+  app.use('/api/brands-aggregate', brandsAggregateRouter);
   app.use('/api/import', importRouter);
   app.use('/api/ai-validation', aiValidationRouter);
   app.use('/api/templates', templatesRouter);
inventory-server/src/utils/apiHelpers.js (new file, 45 lines)
@@ -0,0 +1,45 @@
/**
 * Parses a query parameter value based on its expected type.
 * Throws error for invalid formats. Adjust date handling as needed.
 */
function parseValue(value, type) {
  if (value === null || value === undefined || value === '') return null;

  console.log(`Parsing value: "${value}" as type: "${type}"`);

  switch (type) {
    case 'number':
      const num = parseFloat(value);
      if (isNaN(num)) {
        console.error(`Invalid number format: "${value}"`);
        throw new Error(`Invalid number format: "${value}"`);
      }
      return num;
    case 'integer': // Specific type for integer IDs etc.
      const int = parseInt(value, 10);
      if (isNaN(int)) {
        console.error(`Invalid integer format: "${value}"`);
        throw new Error(`Invalid integer format: "${value}"`);
      }
      console.log(`Successfully parsed integer: ${int}`);
      return int;
    case 'boolean':
      if (String(value).toLowerCase() === 'true') return true;
      if (String(value).toLowerCase() === 'false') return false;
      console.error(`Invalid boolean format: "${value}"`);
      throw new Error(`Invalid boolean format: "${value}"`);
    case 'date':
      // Basic ISO date format validation (YYYY-MM-DD)
      if (!String(value).match(/^\d{4}-\d{2}-\d{2}$/)) {
        console.warn(`Potentially invalid date format passed: "${value}"`);
        // Optionally throw an error or return null depending on strictness
        // throw new Error(`Invalid date format (YYYY-MM-DD expected): "${value}"`);
      }
      return String(value); // Send as string, let DB handle casting/comparison
    case 'string':
    default:
      return String(value);
  }
}

module.exports = { parseValue };
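The `boolean` branch of `parseValue` accepts any casing because it lowercases before comparing. A standalone copy of just that branch (re-declared here for illustration only, not the module's API) makes the behavior easy to verify:

```javascript
// Standalone copy of parseValue's boolean branch, for illustration only.
function parseBoolean(value) {
  if (String(value).toLowerCase() === 'true') return true;
  if (String(value).toLowerCase() === 'false') return false;
  throw new Error(`Invalid boolean format: "${value}"`);
}

console.log(parseBoolean('TRUE'));  // true
console.log(parseBoolean('false')); // false
// parseBoolean('yes') would throw: Invalid boolean format: "yes"
```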
inventory-server/src/utils/dbConnection.js (new file, 239 lines)
@@ -0,0 +1,239 @@
const { Client } = require('ssh2');
const mysql = require('mysql2/promise');
const fs = require('fs');

// Connection pooling and cache configuration
const connectionCache = {
  ssh: null,
  dbConnection: null,
  lastUsed: 0,
  isConnecting: false,
  connectionPromise: null,
  // Cache expiration time in milliseconds (5 minutes)
  expirationTime: 5 * 60 * 1000,
  // Cache for query results (key: query string, value: {data, timestamp})
  queryCache: new Map(),
  // Cache duration for different query types in milliseconds
  cacheDuration: {
    'field-options': 30 * 60 * 1000, // 30 minutes for field options
    'product-lines': 10 * 60 * 1000, // 10 minutes for product lines
    'sublines': 10 * 60 * 1000,      // 10 minutes for sublines
    'taxonomy': 30 * 60 * 1000,      // 30 minutes for taxonomy data
    'default': 60 * 1000             // 1 minute default
  }
};

/**
 * Get a database connection with connection pooling
 * @returns {Promise<{ssh: object, connection: object}>} The SSH and database connection
 */
async function getDbConnection() {
  const now = Date.now();

  // Check if we need to refresh the connection due to inactivity
  const needsRefresh = !connectionCache.ssh ||
    !connectionCache.dbConnection ||
    (now - connectionCache.lastUsed > connectionCache.expirationTime);

  // If connection is still valid, update last used time and return existing connection
  if (!needsRefresh) {
    connectionCache.lastUsed = now;
    return {
      ssh: connectionCache.ssh,
      connection: connectionCache.dbConnection
    };
  }

  // If another request is already establishing a connection, wait for that promise
  if (connectionCache.isConnecting && connectionCache.connectionPromise) {
    try {
      await connectionCache.connectionPromise;
      return {
        ssh: connectionCache.ssh,
        connection: connectionCache.dbConnection
      };
    } catch (error) {
      // If that connection attempt failed, we'll try again below
      console.error('Error waiting for existing connection:', error);
    }
  }

  // Close existing connections if they exist
  if (connectionCache.dbConnection) {
    try {
      await connectionCache.dbConnection.end();
    } catch (error) {
      console.error('Error closing existing database connection:', error);
    }
  }

  if (connectionCache.ssh) {
    try {
      connectionCache.ssh.end();
    } catch (error) {
      console.error('Error closing existing SSH connection:', error);
    }
  }

  // Mark that we're establishing a new connection
  connectionCache.isConnecting = true;

  // Create a new promise for this connection attempt
  connectionCache.connectionPromise = setupSshTunnel().then(tunnel => {
    const { ssh, stream, dbConfig } = tunnel;

    return mysql.createConnection({
      ...dbConfig,
      stream
    }).then(connection => {
      // Store the new connections
      connectionCache.ssh = ssh;
      connectionCache.dbConnection = connection;
      connectionCache.lastUsed = Date.now();
      connectionCache.isConnecting = false;

      return {
        ssh,
        connection
      };
    });
  }).catch(error => {
    connectionCache.isConnecting = false;
    throw error;
  });

  // Wait for the connection to be established
  return connectionCache.connectionPromise;
}

/**
 * Get cached query results or execute query if not cached
 * @param {string} cacheKey - Unique key to identify the query
 * @param {string} queryType - Type of query (field-options, product-lines, etc.)
 * @param {Function} queryFn - Function to execute if cache miss
 * @returns {Promise<any>} The query result
 */
async function getCachedQuery(cacheKey, queryType, queryFn) {
  // Get cache duration based on query type
  const cacheDuration = connectionCache.cacheDuration[queryType] || connectionCache.cacheDuration.default;

  // Check if we have a valid cached result
  const cachedResult = connectionCache.queryCache.get(cacheKey);
  const now = Date.now();

  if (cachedResult && (now - cachedResult.timestamp < cacheDuration)) {
    console.log(`Cache hit for ${queryType} query: ${cacheKey}`);
    return cachedResult.data;
  }

  // No valid cache found, execute the query
  console.log(`Cache miss for ${queryType} query: ${cacheKey}`);
  const result = await queryFn();

  // Cache the result
  connectionCache.queryCache.set(cacheKey, {
    data: result,
    timestamp: now
  });

  return result;
}

/**
 * Setup SSH tunnel to production database
 * @private - Should only be used by getDbConnection
 * @returns {Promise<{ssh: object, stream: object, dbConfig: object}>}
 */
async function setupSshTunnel() {
  const sshConfig = {
    host: process.env.PROD_SSH_HOST,
    port: process.env.PROD_SSH_PORT || 22,
    username: process.env.PROD_SSH_USER,
    privateKey: process.env.PROD_SSH_KEY_PATH
      ? fs.readFileSync(process.env.PROD_SSH_KEY_PATH)
      : undefined,
    compress: true
  };

  const dbConfig = {
    host: process.env.PROD_DB_HOST || 'localhost',
    user: process.env.PROD_DB_USER,
    password: process.env.PROD_DB_PASSWORD,
    database: process.env.PROD_DB_NAME,
    port: process.env.PROD_DB_PORT || 3306,
    timezone: 'Z'
  };

  return new Promise((resolve, reject) => {
    const ssh = new Client();

    ssh.on('error', (err) => {
      console.error('SSH connection error:', err);
      reject(err);
    });

    ssh.on('ready', () => {
      ssh.forwardOut(
        '127.0.0.1',
        0,
        dbConfig.host,
        dbConfig.port,
        (err, stream) => {
          if (err) return reject(err); // return so we don't also resolve after a failed forward
          resolve({ ssh, stream, dbConfig });
        }
      );
    }).connect(sshConfig);
  });
}

/**
 * Clear cached query results
 * @param {string} [cacheKey] - Specific cache key to clear (clears all if not provided)
 */
function clearQueryCache(cacheKey) {
  if (cacheKey) {
    connectionCache.queryCache.delete(cacheKey);
    console.log(`Cleared cache for key: ${cacheKey}`);
  } else {
    connectionCache.queryCache.clear();
    console.log('Cleared all query cache');
  }
}

/**
 * Force close all active connections
 * Useful for server shutdown or manual connection reset
 */
async function closeAllConnections() {
  if (connectionCache.dbConnection) {
    try {
      await connectionCache.dbConnection.end();
      console.log('Closed database connection');
    } catch (error) {
      console.error('Error closing database connection:', error);
    }
    connectionCache.dbConnection = null;
  }

  if (connectionCache.ssh) {
    try {
      connectionCache.ssh.end();
      console.log('Closed SSH connection');
    } catch (error) {
      console.error('Error closing SSH connection:', error);
    }
    connectionCache.ssh = null;
  }

  connectionCache.lastUsed = 0;
  connectionCache.isConnecting = false;
  connectionCache.connectionPromise = null;
}

module.exports = {
  getDbConnection,
  getCachedQuery,
  clearQueryCache,
  closeAllConnections
};
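The TTL pattern in `getCachedQuery` (result keyed by query, expiry keyed by query type) reduces to a small dependency-free sketch. The names `cache` and `cached` below are illustrative, not part of the module:

```javascript
// Minimal TTL cache mirroring getCachedQuery's structure (illustrative names).
const cache = new Map();

async function cached(key, ttlMs, fn) {
  const hit = cache.get(key);
  const now = Date.now();
  if (hit && now - hit.timestamp < ttlMs) return hit.data; // cache hit
  const data = await fn();                                 // cache miss: run the query
  cache.set(key, { data, timestamp: now });
  return data;
}

// Usage: a second call inside the TTL window returns the cached value without rerunning fn.
(async () => {
  let calls = 0;
  const fn = async () => { calls++; return 'result'; };
  await cached('k', 60000, fn);
  await cached('k', 60000, fn);
  console.log(calls); // 1
})();
```

One caveat of this shape (shared by the original): entries are only evicted lazily on lookup, so a long-running process accumulates stale entries until `clearQueryCache`-style cleanup runs.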
inventory/package-lock.json (generated, 20 changed lines)
@@ -61,6 +61,7 @@
"react-chartjs-2": "^5.3.0",
|
||||
"react-data-grid": "^7.0.0-beta.13",
|
||||
"react-day-picker": "^8.10.1",
|
||||
"react-debounce-input": "^3.3.0",
|
||||
"react-dom": "^18.3.1",
|
||||
"react-dropzone": "^14.3.5",
|
||||
"react-hook-form": "^7.54.2",
|
||||
@@ -6043,6 +6044,12 @@
|
||||
"integrity": "sha512-v2kDEe57lecTulaDIuNTPy3Ry4gLGJ6Z1O3vE1krgXZNrsQ+LFTGHVxVjcXPs17LhbZVGedAJv8XZ1tvj5FvSg==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/lodash.debounce": {
|
||||
"version": "4.0.8",
|
||||
"resolved": "https://registry.npmjs.org/lodash.debounce/-/lodash.debounce-4.0.8.tgz",
|
||||
"integrity": "sha512-FT1yDzDYEoYWhnSGnpE/4Kj1fLZkDFyqRb7fNt6FdYOSxlUWAtp42Eh6Wb0rGIv/m9Bgo7x4GhQbm5Ys4SG5ow==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/lodash.merge": {
|
||||
"version": "4.6.2",
|
||||
"resolved": "https://registry.npmjs.org/lodash.merge/-/lodash.merge-4.6.2.tgz",
|
||||
@@ -6919,6 +6926,19 @@
|
||||
"react": "^16.8.0 || ^17.0.0 || ^18.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/react-debounce-input": {
|
||||
"version": "3.3.0",
|
||||
"resolved": "https://registry.npmjs.org/react-debounce-input/-/react-debounce-input-3.3.0.tgz",
|
||||
"integrity": "sha512-VEqkvs8JvY/IIZvh71Z0TC+mdbxERvYF33RcebnodlsUZ8RSgyKe2VWaHXv4+/8aoOgXLxWrdsYs2hDhcwbUgA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"lodash.debounce": "^4",
|
||||
"prop-types": "^15.8.1"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"react": "^15.3.0 || 16 || 17 || 18"
|
||||
}
|
||||
},
|
||||
"node_modules/react-dom": {
|
||||
"version": "18.3.1",
|
||||
"resolved": "https://registry.npmjs.org/react-dom/-/react-dom-18.3.1.tgz",
|
||||
|
||||
@@ -63,6 +63,7 @@
 "react-chartjs-2": "^5.3.0",
 "react-data-grid": "^7.0.0-beta.13",
 "react-day-picker": "^8.10.1",
+"react-debounce-input": "^3.3.0",
 "react-dom": "^18.3.1",
 "react-dropzone": "^14.3.5",
 "react-hook-form": "^7.54.2",
@@ -18,7 +18,7 @@ import { Import } from '@/pages/Import';
 import { AuthProvider } from './contexts/AuthContext';
 import { Protected } from './components/auth/Protected';
 import { FirstAccessiblePage } from './components/auth/FirstAccessiblePage';

+import { Brands } from '@/pages/Brands';
 const queryClient = new QueryClient();

 function App() {
@@ -108,6 +108,11 @@ function App() {
   <Vendors />
   </Protected>
 } />
+<Route path="/brands" element={
+  <Protected page="brands">
+    <Brands />
+  </Protected>
+} />
 <Route path="/purchase-orders" element={
   <Protected page="purchase_orders">
     <PurchaseOrders />
@@ -38,21 +38,22 @@ export function CategoryPerformance() {
 const rawData = await response.json();
 return {
   performance: rawData.performance.map((item: any) => ({
     ...item,
-    categoryPath: item.categoryPath || item.category,
+    category: item.category || '',
+    categoryPath: item.categoryPath || item.categorypath || item.category || '',
     revenue: Number(item.revenue) || 0,
     profit: Number(item.profit) || 0,
     growth: Number(item.growth) || 0,
-    productCount: Number(item.productCount) || 0
+    productCount: Number(item.productCount) || Number(item.productcount) || 0
   })),
   distribution: rawData.distribution.map((item: any) => ({
     ...item,
-    categoryPath: item.categoryPath || item.category,
+    category: item.category || '',
+    categoryPath: item.categoryPath || item.categorypath || item.category || '',
     value: Number(item.value) || 0
   })),
   trends: rawData.trends.map((item: any) => ({
     ...item,
-    categoryPath: item.categoryPath || item.category,
+    category: item.category || '',
+    categoryPath: item.categoryPath || item.categorypath || item.category || '',
     month: item.month || '',
     sales: Number(item.sales) || 0
   }))
 };
@@ -25,41 +25,91 @@ interface PriceData {
 }

 export function PriceAnalysis() {
-  const { data, isLoading } = useQuery<PriceData>({
+  const { data, isLoading, error } = useQuery<PriceData>({
     queryKey: ['price-analysis'],
     queryFn: async () => {
-      const response = await fetch(`${config.apiUrl}/analytics/pricing`);
-      if (!response.ok) {
-        throw new Error('Failed to fetch price analysis');
+      try {
+        const response = await fetch(`${config.apiUrl}/analytics/pricing`);
+        if (!response.ok) {
+          throw new Error(`Failed to fetch: ${response.status}`);
+        }
+        const rawData = await response.json();
+
+        if (!rawData || !rawData.pricePoints) {
+          return {
+            pricePoints: [],
+            elasticity: [],
+            recommendations: []
+          };
+        }
+
+        return {
+          pricePoints: (rawData.pricePoints || []).map((item: any) => ({
+            price: Number(item.price) || 0,
+            salesVolume: Number(item.salesVolume || item.salesvolume) || 0,
+            revenue: Number(item.revenue) || 0,
+            category: item.category || ''
+          })),
+          elasticity: (rawData.elasticity || []).map((item: any) => ({
+            date: item.date || '',
+            price: Number(item.price) || 0,
+            demand: Number(item.demand) || 0
+          })),
+          recommendations: (rawData.recommendations || []).map((item: any) => ({
+            product: item.product || '',
+            currentPrice: Number(item.currentPrice || item.currentprice) || 0,
+            recommendedPrice: Number(item.recommendedPrice || item.recommendedprice) || 0,
+            potentialRevenue: Number(item.potentialRevenue || item.potentialrevenue) || 0,
+            confidence: Number(item.confidence) || 0
+          }))
+        };
+      } catch (err) {
+        console.error('Error fetching price data:', err);
+        throw err;
       }
-      const rawData = await response.json();
-      return {
-        pricePoints: rawData.pricePoints.map((item: any) => ({
-          ...item,
-          price: Number(item.price) || 0,
-          salesVolume: Number(item.salesVolume) || 0,
-          revenue: Number(item.revenue) || 0
-        })),
-        elasticity: rawData.elasticity.map((item: any) => ({
-          ...item,
-          price: Number(item.price) || 0,
-          demand: Number(item.demand) || 0
-        })),
-        recommendations: rawData.recommendations.map((item: any) => ({
-          ...item,
-          currentPrice: Number(item.currentPrice) || 0,
-          recommendedPrice: Number(item.recommendedPrice) || 0,
-          potentialRevenue: Number(item.potentialRevenue) || 0,
-          confidence: Number(item.confidence) || 0
-        }))
-      };
     },
+    retry: 1
   });

-  if (isLoading || !data) {
+  if (isLoading) {
     return <div>Loading price analysis...</div>;
   }

+  if (error || !data) {
+    return (
+      <Card className="mb-4">
+        <CardHeader>
+          <CardTitle>Price Analysis</CardTitle>
+        </CardHeader>
+        <CardContent>
+          <p className="text-red-500">
+            Unable to load price analysis. The price metrics may need to be set up in the database.
+          </p>
+        </CardContent>
+      </Card>
+    );
+  }
+
+  // Early return if no data to display
+  if (
+    data.pricePoints.length === 0 &&
+    data.elasticity.length === 0 &&
+    data.recommendations.length === 0
+  ) {
+    return (
+      <Card className="mb-4">
+        <CardHeader>
+          <CardTitle>Price Analysis</CardTitle>
+        </CardHeader>
+        <CardContent>
+          <p className="text-muted-foreground">
+            No price data available. This may be because the price metrics haven't been calculated yet.
+          </p>
+        </CardContent>
+      </Card>
+    );
+  }
+
   return (
     <div className="grid gap-4">
       <div className="grid gap-4 md:grid-cols-2">
@@ -38,22 +38,23 @@ export function ProfitAnalysis() {
 const rawData = await response.json();
 return {
   byCategory: rawData.byCategory.map((item: any) => ({
     ...item,
-    categoryPath: item.categoryPath || item.category,
-    profitMargin: Number(item.profitMargin) || 0,
+    category: item.category || '',
+    categoryPath: item.categorypath || item.category || '',
+    profitMargin: item.profitmargin !== null ? Number(item.profitmargin) : 0,
     revenue: Number(item.revenue) || 0,
     cost: Number(item.cost) || 0
   })),
   overTime: rawData.overTime.map((item: any) => ({
     ...item,
-    profitMargin: Number(item.profitMargin) || 0,
+    date: item.date || '',
+    profitMargin: item.profitmargin !== null ? Number(item.profitmargin) : 0,
     revenue: Number(item.revenue) || 0,
     cost: Number(item.cost) || 0
   })),
   topProducts: rawData.topProducts.map((item: any) => ({
     ...item,
-    categoryPath: item.categoryPath || item.category,
-    profitMargin: Number(item.profitMargin) || 0,
+    product: item.product || '',
+    category: item.category || '',
+    categoryPath: item.categorypath || item.category || '',
+    profitMargin: item.profitmargin !== null ? Number(item.profitmargin) : 0,
     revenue: Number(item.revenue) || 0,
     cost: Number(item.cost) || 0
   }))
@@ -28,42 +28,93 @@ interface StockData {
 }

 export function StockAnalysis() {
-  const { data, isLoading } = useQuery<StockData>({
+  const { data, isLoading, error } = useQuery<StockData>({
     queryKey: ['stock-analysis'],
     queryFn: async () => {
-      const response = await fetch(`${config.apiUrl}/analytics/stock`);
-      if (!response.ok) {
-        throw new Error('Failed to fetch stock analysis');
+      try {
+        const response = await fetch(`${config.apiUrl}/analytics/stock`);
+        if (!response.ok) {
+          throw new Error(`Failed to fetch: ${response.status}`);
+        }
+        const rawData = await response.json();
+
+        if (!rawData || !rawData.turnoverByCategory) {
+          return {
+            turnoverByCategory: [],
+            stockLevels: [],
+            criticalItems: []
+          };
+        }
+
+        return {
+          turnoverByCategory: (rawData.turnoverByCategory || []).map((item: any) => ({
+            category: item.category || '',
+            turnoverRate: Number(item.turnoverRate || item.turnoverrate) || 0,
+            averageStock: Number(item.averageStock || item.averagestock) || 0,
+            totalSales: Number(item.totalSales || item.totalsales) || 0
+          })),
+          stockLevels: (rawData.stockLevels || []).map((item: any) => ({
+            date: item.date || '',
+            inStock: Number(item.inStock || item.instock) || 0,
+            lowStock: Number(item.lowStock || item.lowstock) || 0,
+            outOfStock: Number(item.outOfStock || item.outofstock) || 0
+          })),
+          criticalItems: (rawData.criticalItems || []).map((item: any) => ({
+            product: item.product || '',
+            sku: item.sku || '',
+            stockQuantity: Number(item.stockQuantity || item.stockquantity) || 0,
+            reorderPoint: Number(item.reorderPoint || item.reorderpoint) || 0,
+            turnoverRate: Number(item.turnoverRate || item.turnoverrate) || 0,
+            daysUntilStockout: Number(item.daysUntilStockout || item.daysuntilstockout) || 0
+          }))
+        };
+      } catch (err) {
+        console.error('Error fetching stock data:', err);
+        throw err;
       }
-      const rawData = await response.json();
-      return {
-        turnoverByCategory: rawData.turnoverByCategory.map((item: any) => ({
-          ...item,
-          turnoverRate: Number(item.turnoverRate) || 0,
-          averageStock: Number(item.averageStock) || 0,
-          totalSales: Number(item.totalSales) || 0
-        })),
-        stockLevels: rawData.stockLevels.map((item: any) => ({
-          ...item,
-          inStock: Number(item.inStock) || 0,
-          lowStock: Number(item.lowStock) || 0,
-          outOfStock: Number(item.outOfStock) || 0
-        })),
-        criticalItems: rawData.criticalItems.map((item: any) => ({
-          ...item,
-          stockQuantity: Number(item.stockQuantity) || 0,
-          reorderPoint: Number(item.reorderPoint) || 0,
-          turnoverRate: Number(item.turnoverRate) || 0,
-          daysUntilStockout: Number(item.daysUntilStockout) || 0
-        }))
-      };
     },
+    retry: 1
   });

-  if (isLoading || !data) {
+  if (isLoading) {
     return <div>Loading stock analysis...</div>;
   }

+  if (error || !data) {
+    return (
+      <Card className="mb-4">
+        <CardHeader>
+          <CardTitle>Stock Analysis</CardTitle>
+        </CardHeader>
+        <CardContent>
+          <p className="text-red-500">
+            Unable to load stock analysis. The stock metrics may need to be set up in the database.
+          </p>
+        </CardContent>
+      </Card>
+    );
+  }
+
+  // Early return if no data to display
+  if (
+    data.turnoverByCategory.length === 0 &&
+    data.stockLevels.length === 0 &&
+    data.criticalItems.length === 0
+  ) {
+    return (
+      <Card className="mb-4">
+        <CardHeader>
+          <CardTitle>Stock Analysis</CardTitle>
+        </CardHeader>
+        <CardContent>
+          <p className="text-muted-foreground">
+            No stock data available. This may be because the stock metrics haven't been calculated yet.
+          </p>
+        </CardContent>
+      </Card>
+    );
+  }
+
   const getStockStatus = (daysUntilStockout: number) => {
     if (daysUntilStockout <= 7) {
       return <Badge variant="destructive">Critical</Badge>;
@@ -58,22 +58,22 @@ export function VendorPerformance() {
 // Create a complete structure even if some parts are missing
 const data: VendorData = {
   performance: rawData.performance.map((vendor: any) => ({
-    vendor: vendor.vendor,
-    salesVolume: Number(vendor.salesVolume) || 0,
-    profitMargin: Number(vendor.profitMargin) || 0,
-    stockTurnover: Number(vendor.stockTurnover) || 0,
+    vendor: vendor.vendor || '',
+    salesVolume: vendor.salesVolume !== null ? Number(vendor.salesVolume) : 0,
+    profitMargin: vendor.profitMargin !== null ? Number(vendor.profitMargin) : 0,
+    stockTurnover: vendor.stockTurnover !== null ? Number(vendor.stockTurnover) : 0,
     productCount: Number(vendor.productCount) || 0,
-    growth: Number(vendor.growth) || 0
+    growth: vendor.growth !== null ? Number(vendor.growth) : 0
   })),
   comparison: rawData.comparison?.map((vendor: any) => ({
-    vendor: vendor.vendor,
-    salesPerProduct: Number(vendor.salesPerProduct) || 0,
-    averageMargin: Number(vendor.averageMargin) || 0,
+    vendor: vendor.vendor || '',
+    salesPerProduct: vendor.salesPerProduct !== null ? Number(vendor.salesPerProduct) : 0,
+    averageMargin: vendor.averageMargin !== null ? Number(vendor.averageMargin) : 0,
     size: Number(vendor.size) || 0
   })) || [],
   trends: rawData.trends?.map((vendor: any) => ({
-    vendor: vendor.vendor,
-    month: vendor.month,
+    vendor: vendor.vendor || '',
+    month: vendor.month || '',
     sales: Number(vendor.sales) || 0
   })) || []
 };
inventory/src/components/config.ts (new file, 7 lines)
@@ -0,0 +1,7 @@
const config = {
  // API base URL - update based on your actual API endpoint
  apiUrl: '/api',
  // Add other config values as needed
};

export default config;
@@ -5,9 +5,10 @@ import {
 Settings,
 ClipboardList,
 LogOut,
 Users,
 Tags,
 FileSpreadsheet,
 Plus,
 ShoppingBag,
+Truck,
 } from "lucide-react";
 import { IconCrystalBall } from "@tabler/icons-react";
 import {
@@ -39,27 +40,21 @@ const items = [
   url: "/products",
   permission: "access:products"
 },
-{
-  title: "Import",
-  icon: FileSpreadsheet,
-  url: "/import",
-  permission: "access:import"
-},
-{
-  title: "Forecasting",
-  icon: IconCrystalBall,
-  url: "/forecasting",
-  permission: "access:forecasting"
-},
 {
   title: "Categories",
   icon: Tags,
   url: "/categories",
   permission: "access:categories"
 },
+{
+  title: "Brands",
+  icon: ShoppingBag,
+  url: "/brands",
+  permission: "access:brands"
+},
 {
   title: "Vendors",
-  icon: Users,
+  icon: Truck,
   url: "/vendors",
   permission: "access:vendors"
 },
@@ -75,6 +70,18 @@
   url: "/analytics",
   permission: "access:analytics"
 },
+{
+  title: "Forecasting",
+  icon: IconCrystalBall,
+  url: "/forecasting",
+  permission: "access:forecasting"
+},
+{
+  title: "Create Products",
+  icon: Plus,
+  url: "/import",
+  permission: "access:import"
+}
 ];

 export function AppSidebar() {
@@ -100,7 +107,7 @@ export function AppSidebar() {
   className="w-6 h-6 object-contain -rotate-12 transform hover:rotate-0 transition-transform ease-in-out duration-300"
   />
   </div>
-  <div className="ml-2 transition-all duration-200 whitespace-nowrap group-[.group[data-state=collapsed]]:hidden">
+  <div className="ml-1 transition-all duration-200 whitespace-nowrap group-[.group[data-state=collapsed]]:hidden">
   <span className="font-bold text-lg">A Cherry On Bottom</span>
   </div>
   </div>
File diff suppressed because it is too large
@@ -1,6 +1,5 @@
import * as React from "react";
import { SortAsc, SortDesc } from "lucide-react";
import { Badge } from "@/components/ui/badge";
import {
  Table,
  TableBody,
@@ -14,10 +13,11 @@ import {
  DndContext,
  DragEndEvent,
  DragStartEvent,
- MouseSensor,
+ PointerSensor,
  TouchSensor,
  useSensor,
  useSensors,
  closestCenter,
} from "@dnd-kit/core";
import {
  SortableContext,
@@ -26,36 +26,38 @@ import {
  useSortable,
} from "@dnd-kit/sortable";
import { CSS } from "@dnd-kit/utilities";
-import { Product } from "@/types/products";
-
-export type ColumnKey = keyof Product | 'image';
+import { ProductMetric, ProductMetricColumnKey } from "@/types/products";
+import { Skeleton } from "@/components/ui/skeleton";
+import { getStatusBadge } from "@/utils/productUtils";

// Column definition
interface ColumnDef {
- key: ColumnKey;
+ key: ProductMetricColumnKey;
  label: string;
  group: string;
- format?: (value: any) => string | number;
- width?: string;
  noLabel?: boolean;
+ width?: string;
+ format?: (value: any, product?: ProductMetric) => React.ReactNode;
}

interface ProductTableProps {
- products: Product[];
- onSort: (column: ColumnKey) => void;
- sortColumn: ColumnKey;
+ products: ProductMetric[];
+ onSort: (column: ProductMetricColumnKey) => void;
+ sortColumn: ProductMetricColumnKey;
  sortDirection: 'asc' | 'desc';
- visibleColumns: Set<ColumnKey>;
+ visibleColumns: Set<ProductMetricColumnKey>;
  columnDefs: ColumnDef[];
- columnOrder: ColumnKey[];
- onColumnOrderChange?: (columns: ColumnKey[]) => void;
- onRowClick?: (product: Product) => void;
+ columnOrder: ProductMetricColumnKey[];
+ onColumnOrderChange?: (columns: ProductMetricColumnKey[]) => void;
+ onRowClick?: (product: ProductMetric) => void;
  isLoading?: boolean;
}

interface SortableHeaderProps {
- column: ColumnKey;
+ column: ProductMetricColumnKey;
  columnDef?: ColumnDef;
- onSort: (column: ColumnKey) => void;
- sortColumn: ColumnKey;
+ onSort: (column: ProductMetricColumnKey) => void;
+ sortColumn: ProductMetricColumnKey;
  sortDirection: 'asc' | 'desc';
}

@@ -73,18 +75,32 @@ function SortableHeader({ column, columnDef, onSort, sortColumn, sortDirection }
    transform: CSS.Transform.toString(transform),
    transition,
    opacity: isDragging ? 0.5 : 1,
    zIndex: isDragging ? 10 : 1,
    position: 'relative' as const,
    touchAction: 'none' as const,
    width: columnDef?.width ? undefined : 'auto',
    minWidth: columnDef?.key === 'imageUrl' ? '60px' : '100px',
  };

  // Skip rendering content for 'noLabel' columns (like image)
  if (columnDef?.noLabel) {
-   return <TableHead ref={setNodeRef} style={style} />;
+   return (
+     <TableHead
+       ref={setNodeRef}
+       style={style}
+       className={cn(columnDef?.width, "select-none", "whitespace-nowrap")}
+       {...attributes}
+       {...listeners}
+     />
+   );
  }

  return (
    <TableHead
      ref={setNodeRef}
      style={style}
      className={cn(
-       "cursor-pointer select-none",
+       "cursor-pointer select-none group whitespace-nowrap",
        columnDef?.width,
        sortColumn === column && "bg-accent/50"
      )}
@@ -95,7 +111,7 @@ function SortableHeader({ column, columnDef, onSort, sortColumn, sortDirection }
      <div className="flex items-center gap-1">
        {columnDef?.label ?? column}
        {sortColumn === column && (
          sortDirection === 'desc'
            ? <SortDesc className="h-4 w-4 min-w-4" />
            : <SortAsc className="h-4 w-4 min-w-4" />
        )}
@@ -104,240 +120,191 @@ function SortableHeader({ column, columnDef, onSort, sortColumn, sortDirection }
  );
}

export function ProductTable({
  products,
  onSort,
  sortColumn,
  sortDirection,
  visibleColumns,
  columnDefs,
  columnOrder = columnDefs.map(col => col.key),
  onColumnOrderChange,
  onRowClick,
  isLoading = false,
}: ProductTableProps) {
- const [, setActiveId] = React.useState<ColumnKey | null>(null);
+ const [activeId, setActiveId] = React.useState<ProductMetricColumnKey | null>(null);
  const sensors = useSensors(
-   useSensor(MouseSensor, {
-     activationConstraint: {
-       distance: 8,
-     },
+   useSensor(PointerSensor, {
+     activationConstraint: { distance: 5 },
    }),
    useSensor(TouchSensor, {
-     activationConstraint: {
-       delay: 200,
-       tolerance: 8,
-     },
+     activationConstraint: { delay: 250, tolerance: 5 },
    })
  );

- // Get ordered visible columns
- const orderedColumns = React.useMemo(() => {
+ // Filter columnOrder to only include visible columns for SortableContext
+ const orderedVisibleColumns = React.useMemo(() => {
    return columnOrder.filter(col => visibleColumns.has(col));
  }, [columnOrder, visibleColumns]);

  const handleDragStart = (event: DragStartEvent) => {
-   setActiveId(event.active.id as ColumnKey);
+   setActiveId(event.active.id as ProductMetricColumnKey);
  };

  const handleDragEnd = (event: DragEndEvent) => {
    const { active, over } = event;
    setActiveId(null);

-   if (over && active.id !== over.id) {
-     const oldIndex = orderedColumns.indexOf(active.id as ColumnKey);
-     const newIndex = orderedColumns.indexOf(over.id as ColumnKey);
-
-     const newOrder = arrayMove(orderedColumns, oldIndex, newIndex);
-     onColumnOrderChange?.(newOrder);
+   if (over && active.id !== over.id && onColumnOrderChange) {
+     const oldIndex = orderedVisibleColumns.indexOf(active.id as ProductMetricColumnKey);
+     const newIndex = orderedVisibleColumns.indexOf(over.id as ProductMetricColumnKey);
+
+     if (oldIndex !== -1 && newIndex !== -1) {
+       const newVisibleOrder = arrayMove(orderedVisibleColumns, oldIndex, newIndex);
+       onColumnOrderChange(newVisibleOrder);
+     }
    }
  };
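The new drag-end handler reorders only the currently visible columns and bails out when either index cannot be resolved. The core of that logic can be sketched as a pure function; `arrayMove` here is a local re-implementation mirroring the helper of the same name from `@dnd-kit/sortable`:

```typescript
// Local stand-in for @dnd-kit/sortable's arrayMove: returns a copy with the
// element at `from` relocated to `to`.
function arrayMove<T>(items: T[], from: number, to: number): T[] {
  const next = items.slice();
  const [moved] = next.splice(from, 1);
  next.splice(to, 0, moved);
  return next;
}

function reorderColumns(visible: string[], activeId: string, overId: string): string[] {
  const oldIndex = visible.indexOf(activeId);
  const newIndex = visible.indexOf(overId);
  // Ignore no-op drags and drops on unknown targets, as the guarded handler does.
  if (oldIndex === -1 || newIndex === -1 || activeId === overId) return visible;
  return arrayMove(visible, oldIndex, newIndex);
}

console.log(reorderColumns(["title", "sku", "price"], "price", "title"));
// ["price", "title", "sku"]
```

Keeping the reorder confined to the visible subset means hidden columns retain their stored positions when they are shown again.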
+ const formatColumnValue = (product: ProductMetric, columnKey: ProductMetricColumnKey) => {
+   const columnDef = columnDefs.find(def => def.key === columnKey);
+   const value = product[columnKey as keyof ProductMetric];

-   const getStockStatus = (status: string | undefined) => {
-     if (!status) return null;
-     const normalizedStatus = status.toLowerCase().replace(/-/g, ' ');
-     switch (normalizedStatus) {
-       case 'critical':
-         return <Badge variant="destructive">Critical</Badge>;
-       case 'reorder':
-         return <Badge variant="secondary">Reorder</Badge>;
-       case 'healthy':
-         return <Badge variant="default">Healthy</Badge>;
-       case 'overstocked':
-         return <Badge variant="secondary">Overstocked</Badge>;
-       case 'new':
-         return <Badge variant="default">New</Badge>;
-       case 'out of stock':
-         return <Badge variant="destructive">Out of Stock</Badge>;
-       case 'at risk':
-         return <Badge variant="secondary">At Risk</Badge>;
-       default:
-         return <Badge variant="outline">{status}</Badge>;
+   if (columnKey === 'status') {
+     return <div dangerouslySetInnerHTML={{ __html: getStatusBadge(product.status || 'Unknown') }} />;
    }
-   };

-   const getABCClass = (abcClass: string | undefined) => {
-     if (!abcClass) return null;
-     switch (abcClass.toUpperCase()) {
-       case 'A':
-         return <Badge variant="default">A</Badge>;
-       case 'B':
-         return <Badge variant="secondary">B</Badge>;
-       case 'C':
-         return <Badge variant="outline">C</Badge>;
-       default:
-         return null;
+   if (columnDef?.format) {
+     return columnDef.format(value, product);
    }
-   };

-   const getLeadTimeStatus = (status: string | undefined) => {
-     if (!status) return null;
-     switch (status.toLowerCase()) {
-       case 'critical':
-         return <Badge variant="destructive">Critical</Badge>;
-       case 'warning':
-         return <Badge variant="secondary">Warning</Badge>;
-       case 'good':
-         return <Badge variant="default">Good</Badge>;
-       default:
-         return null;
+   // Default formatting for common types if no formatter provided
+   if (typeof value === 'boolean') {
+     return value ? 'Yes' : 'No';
    }
-   };

-   const formatColumnValue = (product: Product, column: ColumnKey) => {
-     const columnDef = columnDefs.find(def => def.key === column);
-     let value: any = product[column as keyof Product];

-     switch (column) {
-       case 'image':
-         return product.image ? (
-           <div className="flex items-center justify-center w-[60px]">
-             <img
-               src={product.image}
-               alt={product.title}
-               className="h-12 w-12 object-contain bg-white"
-             />
-           </div>
-         ) : null;
-       case 'title':
-         return (
-           <div className="min-w-[200px]">
-             <div className="font-medium">{product.title}</div>
-             <div className="text-sm text-muted-foreground">{product.SKU}</div>
-           </div>
-         );
-       case 'categories':
-         return (
-           <div className="flex flex-wrap gap-1">
-             {Array.from(new Set(value as string[])).map((category) => (
-               <Badge key={`${product.pid}-${category}`} variant="outline">{category}</Badge>
-             )) || '-'}
-           </div>
-         );
-       case 'dimensions':
-         if (value) {
-           return `${value.length}×${value.width}×${value.height}`;
-         }
-         return '-';
-       case 'stock_status':
-         return getStockStatus(product.stock_status);
-       case 'abc_class':
-         return getABCClass(product.abc_class);
-       case 'lead_time_status':
-         return getLeadTimeStatus(product.lead_time_status);
-       case 'visible':
-         return value ? (
-           <Badge variant="secondary">Active</Badge>
-         ) : (
-           <Badge variant="outline">Hidden</Badge>
-         );
-       case 'replenishable':
-         return value ? (
-           <Badge variant="secondary">Replenishable</Badge>
-         ) : (
-           <Badge variant="outline">Non-Replenishable</Badge>
-         );
-       case 'rating':
-         if (value === undefined || value === null) return '-';
-         return (
-           <div className="flex items-center">
-             {value.toFixed(1)}
-             <span className="ml-1 text-yellow-500">★</span>
-           </div>
-         );
-       default:
-         if (columnDef?.format && value !== undefined && value !== null) {
-           // For numeric formats (those using toFixed), ensure the value is a number
-           if (typeof value === 'string') {
-             const num = parseFloat(value);
-             if (!isNaN(num)) {
-               return columnDef.format(num);
-             }
-           }
-           // If the value is already a number, format it directly
-           if (typeof value === 'number') {
-             return columnDef.format(value);
-           }
-           // For other formats (e.g., date formatting), pass the value as is
-           return columnDef.format(value);
-         }
-         return value ?? '-';

+   // Handle date strings consistently
+   if (value && typeof value === 'string' &&
+       (columnKey.toLowerCase().includes('date') || columnKey === 'replenishDate')) {
+     try {
+       return new Date(value).toLocaleDateString();
+     } catch (e) {
+       return String(value);
+     }
+   }

+   if (value === null || value === undefined || value === '') {
+     return '-';
+   }

+   // Fallback to string conversion
+   return String(value);
  };
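The rewritten `formatColumnValue` replaces the old per-column switch with a fallback chain: custom formatter first, then booleans, then date-like keys, then empty values, and finally plain string conversion. A minimal sketch of that chain with the Badge/JSX rendering omitted so the logic is testable on its own (note `new Date(...)` never throws in JavaScript, so this sketch checks for an invalid date instead of using try/catch):

```typescript
type Formatter = (value: unknown) => string;

// Mirrors the fallback order of the new formatColumnValue, minus JSX.
function formatCell(key: string, value: unknown, format?: Formatter): string {
  if (format) return format(value);                       // 1. column-specific formatter
  if (typeof value === "boolean") return value ? "Yes" : "No"; // 2. booleans
  if (value && typeof value === "string" && key.toLowerCase().includes("date")) {
    const parsed = new Date(value);                       // 3. date-like columns
    return Number.isNaN(parsed.getTime()) ? String(value) : parsed.toLocaleDateString();
  }
  if (value === null || value === undefined || value === "") return "-"; // 4. empty
  return String(value);                                   // 5. fallback
}

console.log(formatCell("replenishable", true)); // "Yes"
console.log(formatCell("price", null));         // "-"
console.log(formatCell("stock", 42));           // "42"
```

Centralizing the defaults this way means new columns get sensible rendering for free and only need a `format` entry when they deviate.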
  return (
    <DndContext
      sensors={sensors}
      collisionDetection={closestCenter}
      onDragStart={handleDragStart}
      onDragEnd={handleDragEnd}
      onDragCancel={() => setActiveId(null)}
    >
-     <div className="rounded-md border">
-       <Table>
-         <TableHeader>
-           <TableRow>
-             <SortableContext
-               items={orderedColumns}
-               strategy={horizontalListSortingStrategy}
-             >
-               {orderedColumns.map((column) => (
-                 <SortableHeader
-                   key={column}
-                   column={column}
-                   columnDef={columnDefs.find(def => def.key === column)}
-                   onSort={onSort}
-                   sortColumn={sortColumn}
-                   sortDirection={sortDirection}
-                 />
-               ))}
-             </SortableContext>
-           </TableRow>
-         </TableHeader>
-         <TableBody>
-           {products.map((product) => (
-             <TableRow
-               key={product.pid}
-               onClick={() => onRowClick?.(product)}
-               className="cursor-pointer"
-             >
-               {orderedColumns.map((column) => (
-                 <TableCell key={`${product.pid}-${column}`}>
-                   {formatColumnValue(product, column)}
-                 </TableCell>
-               ))}
-             </TableRow>
-           ))}
-           {!products.length && (
+     <div className="border rounded-md relative">
+       {isLoading && (
+         <div className="absolute inset-0 bg-background/70 flex items-center justify-center z-20">
+           <Skeleton className="h-8 w-32" />
+         </div>
+       )}
+       <div className="overflow-x-auto">
+         <Table className={cn(isLoading ? 'opacity-50' : '', "w-max min-w-full")}>
+           <TableHeader className="sticky top-0 bg-background z-10">
+             <TableRow>
-               <TableCell
-                 colSpan={orderedColumns.length}
-                 className="text-center py-8 text-muted-foreground"
+               <SortableContext
+                 items={orderedVisibleColumns}
+                 strategy={horizontalListSortingStrategy}
+               >
-                 No products found
-               </TableCell>
+                 {orderedVisibleColumns.map((columnKey) => (
+                   <SortableHeader
+                     key={columnKey}
+                     column={columnKey}
+                     columnDef={columnDefs.find(def => def.key === columnKey)}
+                     onSort={onSort}
+                     sortColumn={sortColumn}
+                     sortDirection={sortDirection}
+                   />
+                 ))}
+               </SortableContext>
+             </TableRow>
-           )}
-         </TableBody>
-       </Table>
+           </TableHeader>
+           <TableBody>
+             {products.length === 0 && !isLoading ? (
+               <TableRow>
+                 <TableCell
+                   colSpan={orderedVisibleColumns.length}
+                   className="text-center py-8 text-muted-foreground"
+                 >
+                   No products found matching your criteria.
+                 </TableCell>
+               </TableRow>
+             ) : (
+               products.map((product) => (
+                 <TableRow
+                   key={product.pid}
+                   onClick={() => onRowClick?.(product)}
+                   className="cursor-pointer hover:bg-muted/50"
+                   data-state={isLoading ? 'loading' : undefined}
+                 >
+                   {orderedVisibleColumns.map((columnKey) => {
+                     const colDef = columnDefs.find(c => c.key === columnKey);
+                     return (
+                       <TableCell
+                         key={`${product.pid}-${columnKey}`}
+                         className={cn(
+                           colDef?.width,
+                           "whitespace-nowrap",
+                           columnKey === 'title' && "max-w-[300px] truncate"
+                         )}
+                       >
+                         {columnKey === 'imageUrl' ? (
+                           <div className="flex items-center justify-center h-12 w-[60px]">
+                             {product.imageUrl ? (
+                               <img
+                                 src={product.imageUrl}
+                                 alt={product.title || 'Product image'}
+                                 className="max-h-full max-w-full object-contain bg-white p-0.5 border rounded"
+                                 loading="lazy"
+                               />
+                             ) : (
+                               <div className="h-10 w-10 bg-muted rounded flex items-center justify-center text-muted-foreground text-xs">No Image</div>
+                             )}
+                           </div>
+                         ) : (
+                           formatColumnValue(product, columnKey)
+                         )}
+                       </TableCell>
+                     );
+                   })}
+                 </TableRow>
+               ))
+             )}
+             {isLoading && products.length === 0 && Array.from({length: 10}).map((_, i) => (
+               <TableRow key={`skel-${i}`}>
+                 {orderedVisibleColumns.map(key => {
+                   const colDef = columnDefs.find(c => c.key === key);
+                   return (
+                     <TableCell
+                       key={`skel-${i}-${key}`}
+                       className={cn(colDef?.width, "whitespace-nowrap")}
+                     >
+                       <Skeleton className={`h-5 ${key==='imageUrl' ? 'w-10 h-10' : 'w-full'}`} />
+                     </TableCell>
+                   );
+                 })}
+               </TableRow>
+             ))}
+           </TableBody>
+         </Table>
+       </div>
      </div>
    </DndContext>
  );
@@ -1,5 +1,4 @@
import { Tabs, TabsList, TabsTrigger } from "@/components/ui/tabs"
-import { Product } from "@/types/products"
import {
  AlertTriangle,
  CheckCircle2,
@@ -15,7 +14,6 @@ export type ProductView = {
  label: string
  icon: any
  iconClassName: string
- columns: (keyof Product)[]
}

export const PRODUCT_VIEWS: ProductView[] = [
@@ -23,50 +21,43 @@ export const PRODUCT_VIEWS: ProductView[] = [
    id: "all",
    label: "All Products",
    icon: PackageSearch,
-   iconClassName: "",
-   columns: ["image", "title", "SKU", "stock_quantity", "price", "stock_status"]
+   iconClassName: ""
  },
  {
    id: "critical",
    label: "Critical Stock",
    icon: AlertTriangle,
-   iconClassName: "",
-   columns: ["image", "title", "SKU", "stock_quantity", "daily_sales_avg", "reorder_qty", "last_purchase_date", "lead_time_status"]
+   iconClassName: ""
  },
  {
    id: "reorder",
    label: "Reorder Soon",
    icon: PackagePlus,
-   iconClassName: "",
-   columns: ["image", "title", "SKU", "stock_quantity", "daily_sales_avg", "reorder_qty", "last_purchase_date", "lead_time_status"]
+   iconClassName: ""
  },
  {
    id: "healthy",
    label: "Healthy Stock",
    icon: CheckCircle2,
-   iconClassName: "",
-   columns: ["image", "title", "stock_quantity", "daily_sales_avg", "stock_status", "abc_class"]
+   iconClassName: ""
  },
  {
    id: "at-risk",
    label: "At Risk",
    icon: Timer,
-   iconClassName: "",
-   columns: ["image", "title", "stock_quantity", "daily_sales_avg", "weekly_sales_avg", "days_of_inventory", "last_sale_date"]
+   iconClassName: ""
  },
  {
    id: "overstocked",
    label: "Overstock",
    icon: PackageX,
-   iconClassName: "",
-   columns: ["image", "title", "stock_quantity", "daily_sales_avg", "overstocked_amt", "days_of_inventory", "last_sale_date"]
+   iconClassName: ""
  },
  {
    id: "new",
    label: "New Products",
    icon: Sparkles,
-   iconClassName: "",
-   columns: ["image", "title", "stock_quantity", "daily_sales_avg", "stock_status", "abc_class"]
+   iconClassName: ""
  }
]
inventory/src/components/products/Products.tsx (new file, 239 lines)
@@ -0,0 +1,239 @@
import * as React from "react";
import { useQuery } from "@tanstack/react-query";
import { useSearchParams } from "react-router-dom";
import { Spinner } from "@/components/ui/spinner";
import { ProductFilterOptions, ProductMetric } from "@/types/products";
import { ProductTable } from "./ProductTable";
import { ProductFilters } from "./ProductFilters";
import { ProductDetail } from "./ProductDetail";
import config from "@/config";
import { getProductStatus } from "@/utils/productUtils";

export function Products() {
  const [searchParams, setSearchParams] = useSearchParams();
  const [selectedProductId, setSelectedProductId] = React.useState<number | null>(null);

  // Get current filter values from URL params
  const currentPage = Number(searchParams.get("page") || "1");
  const pageSize = Number(searchParams.get("pageSize") || "25");
  const sortBy = searchParams.get("sortBy") || "title";
  const sortDirection = searchParams.get("sortDirection") || "asc";
  const filterType = searchParams.get("filterType") || "";
  const filterValue = searchParams.get("filterValue") || "";
  const searchQuery = searchParams.get("search") || "";
  const statusFilter = searchParams.get("status") || "";

  // Fetch filter options
  const {
    data: filterOptions,
    isLoading: isLoadingOptions
  } = useQuery<ProductFilterOptions>({
    queryKey: ["productFilterOptions"],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/metrics/filter-options`, {
        credentials: 'include',
      });

      if (!response.ok) {
        return { vendors: [], brands: [], abcClasses: [] };
      }

      return await response.json();
    },
    initialData: { vendors: [], brands: [], abcClasses: [] }, // Provide initial data to prevent undefined
  });

  // Fetch products with metrics data
  const {
    data,
    isLoading,
    error
  } = useQuery<{ products: ProductMetric[], total: number }>({
    queryKey: ["products", currentPage, pageSize, sortBy, sortDirection, filterType, filterValue, searchQuery, statusFilter],
    queryFn: async () => {
      // Build query parameters
      const params = new URLSearchParams();
      params.append("page", currentPage.toString());
      params.append("limit", pageSize.toString());

      if (sortBy) params.append("sortBy", sortBy);
      if (sortDirection) params.append("sortDirection", sortDirection);
      if (filterType && filterValue) {
        params.append("filterType", filterType);
        params.append("filterValue", filterValue);
      }
      if (searchQuery) params.append("search", searchQuery);
      if (statusFilter) params.append("status", statusFilter);

      const response = await fetch(`${config.apiUrl}/metrics?${params.toString()}`, {
        credentials: 'include',
      });

      if (!response.ok) {
        const errorData = await response.json().catch(() => ({}));
        throw new Error(errorData.error || `Failed to fetch products (${response.status})`);
      }

      const data = await response.json();

      // Calculate status for each product
      const productsWithStatus = data.products.map((product: ProductMetric) => ({
        ...product,
        status: getProductStatus(product)
      }));

      return {
        products: productsWithStatus,
        total: data.total
      };
    },
  });

  const handlePageChange = (page: number) => {
    searchParams.set("page", page.toString());
    setSearchParams(searchParams);
  };

  const handlePageSizeChange = (size: number) => {
    searchParams.set("pageSize", size.toString());
    searchParams.set("page", "1"); // Reset to first page when changing page size
    setSearchParams(searchParams);
  };

  const handleSortChange = (field: string, direction: "asc" | "desc") => {
    searchParams.set("sortBy", field);
    searchParams.set("sortDirection", direction);
    setSearchParams(searchParams);
  };

  const handleFilterChange = (type: string, value: string) => {
    if (type && value) {
      searchParams.set("filterType", type);
      searchParams.set("filterValue", value);
    } else {
      searchParams.delete("filterType");
      searchParams.delete("filterValue");
    }
    searchParams.set("page", "1"); // Reset to first page when applying filters
    setSearchParams(searchParams);
  };

  const handleStatusFilterChange = (status: string) => {
    if (status) {
      searchParams.set("status", status);
    } else {
      searchParams.delete("status");
    }
    searchParams.set("page", "1"); // Reset to first page when changing status filter
    setSearchParams(searchParams);
  };

  const handleSearchChange = (query: string) => {
    if (query) {
      searchParams.set("search", query);
    } else {
      searchParams.delete("search");
    }
    searchParams.set("page", "1"); // Reset to first page when searching
    setSearchParams(searchParams);
  };

  const handleViewProduct = (id: number) => {
    setSelectedProductId(id);
  };

  const handleCloseProductDetail = () => {
    setSelectedProductId(null);
  };

  // Create a wrapper function to handle all filter changes
  const handleFiltersChange = (filters: Record<string, any>) => {
    // Reset to first page when applying filters
    searchParams.set("page", "1");

    // Update searchParams with all filters
    Object.entries(filters).forEach(([key, value]) => {
      if (value) {
        searchParams.set(key, String(value));
      } else {
        searchParams.delete(key);
      }
    });

    setSearchParams(searchParams);
  };

  // Clear all filters
  const handleClearFilters = () => {
    // Keep only pagination and sorting params
    const newParams = new URLSearchParams();
    newParams.set("page", "1");
    newParams.set("pageSize", pageSize.toString());
    newParams.set("sortBy", sortBy);
    newParams.set("sortDirection", sortDirection);
    setSearchParams(newParams);
  };
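All of the handlers above follow one pattern: the URL query string is the single source of truth for filter state, every filter change writes through `searchParams`, and pagination is reset to page 1 so the user never lands on an empty page after narrowing results. The pattern can be sketched as a pure function over `URLSearchParams`:

```typescript
// Pure sketch of the URL-driven filter updates above: set non-empty values,
// delete empty ones, and always reset pagination when filters change.
function applyFilters(params: URLSearchParams, filters: Record<string, string>): URLSearchParams {
  const next = new URLSearchParams(params); // copy, as React Router state should not be mutated in place
  next.set("page", "1");
  for (const [key, value] of Object.entries(filters)) {
    if (value) {
      next.set(key, value);
    } else {
      next.delete(key); // an empty value clears that filter
    }
  }
  return next;
}

const initial = new URLSearchParams("page=3&pageSize=25&status=critical");
const updated = applyFilters(initial, { status: "", search: "widget" });
console.log(updated.toString()); // "page=1&pageSize=25&search=widget"
```

Because the state lives in the URL, filtered views are shareable and survive reloads, and the query key in `useQuery` refetches automatically whenever any of these params change.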
  // Current active filters
  const activeFilters = React.useMemo(() => {
    const filters: Record<string, any> = {};

    if (filterType && filterValue) {
      filters[filterType] = filterValue;
    }

    if (searchQuery) {
      filters.search = searchQuery;
    }

    if (statusFilter) {
      filters.status = statusFilter;
    }

    return filters;
  }, [filterType, filterValue, searchQuery, statusFilter]);

  return (
    <div className="flex-1 space-y-4 p-4 md:p-8 pt-6">
      <div className="flex items-center justify-between">
        <h2 className="text-3xl font-bold tracking-tight">Products</h2>
      </div>

      <ProductFilters
        filterOptions={filterOptions || { vendors: [], brands: [], abcClasses: [] }}
        isLoadingOptions={isLoadingOptions}
        onFilterChange={handleFiltersChange}
        onClearFilters={handleClearFilters}
        activeFilters={activeFilters}
      />

      {isLoading ? (
        <div className="flex justify-center items-center min-h-[300px]">
          <Spinner size="lg" />
        </div>
      ) : error ? (
        <div className="bg-destructive/10 p-4 rounded-lg text-center text-destructive border border-destructive">
          Error loading products: {(error as Error).message}
        </div>
      ) : (
        <ProductTable
          products={data?.products || []}
          total={data?.total || 0}
          currentPage={currentPage}
          pageSize={pageSize}
          sortBy={sortBy}
          sortDirection={sortDirection as "asc" | "desc"}
          onPageChange={handlePageChange}
          onPageSizeChange={handlePageSizeChange}
          onSortChange={handleSortChange}
          onViewProduct={handleViewProduct}
        />
      )}

      <ProductDetail
        productId={selectedProductId}
        onClose={handleCloseProductDetail}
      />
    </div>
  );
}
@@ -1,130 +0,0 @@
import { useState, useEffect } from 'react';
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { toast } from "sonner";
import config from '../../config';

interface SalesVelocityConfig {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  daily_window_days: number;
  weekly_window_days: number;
  monthly_window_days: number;
}

export function CalculationSettings() {
  const [salesVelocityConfig, setSalesVelocityConfig] = useState<SalesVelocityConfig>({
    id: 1,
    cat_id: null,
    vendor: null,
    daily_window_days: 30,
    weekly_window_days: 7,
    monthly_window_days: 90
  });

  useEffect(() => {
    const loadConfig = async () => {
      try {
        const response = await fetch(`${config.apiUrl}/config`, {
          credentials: 'include'
        });
        if (!response.ok) {
          throw new Error('Failed to load configuration');
        }
        const data = await response.json();
        setSalesVelocityConfig(data.salesVelocityConfig);
      } catch (error) {
        toast.error(`Failed to load configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
      }
    };
    loadConfig();
  }, []);

  const handleUpdateSalesVelocityConfig = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/sales-velocity/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(salesVelocityConfig)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update sales velocity configuration');
      }

      toast.success('Sales velocity configuration updated successfully');
    } catch (error) {
      toast.error(`Failed to update configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  return (
    <div className="max-w-[700px] space-y-4">
      {/* Sales Velocity Configuration Card */}
      <Card>
        <CardHeader>
          <CardTitle>Sales Velocity Windows</CardTitle>
          <CardDescription>Configure time windows for sales velocity calculations</CardDescription>
        </CardHeader>
        <CardContent>
          <div className="space-y-4">
            <div className="grid grid-cols-3 gap-4">
              <div>
                <Label htmlFor="daily-window">Daily Window (days)</Label>
                <Input
                  id="daily-window"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={salesVelocityConfig.daily_window_days}
                  onChange={(e) => setSalesVelocityConfig(prev => ({
                    ...prev,
                    daily_window_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="weekly-window">Weekly Window (days)</Label>
                <Input
                  id="weekly-window"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={salesVelocityConfig.weekly_window_days}
                  onChange={(e) => setSalesVelocityConfig(prev => ({
                    ...prev,
                    weekly_window_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="monthly-window">Monthly Window (days)</Label>
                <Input
                  id="monthly-window"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={salesVelocityConfig.monthly_window_days}
                  onChange={(e) => setSalesVelocityConfig(prev => ({
                    ...prev,
                    monthly_window_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
            </div>
            <Button onClick={handleUpdateSalesVelocityConfig}>
              Update Sales Velocity Windows
            </Button>
          </div>
        </CardContent>
      </Card>
    </div>
  );
}
|
||||
@@ -1,626 +0,0 @@
import { useState, useEffect } from "react";
import { Button } from "@/components/ui/button";
import { Input } from "@/components/ui/input";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Label } from "@/components/ui/label";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import axios from "axios";
import config from "@/config";
import { toast } from "sonner";

interface StockThreshold {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  critical_days: number;
  reorder_days: number;
  overstock_days: number;
  low_stock_threshold: number;
  min_reorder_quantity: number;
  category_name?: string;
  threshold_scope?: string;
}

interface LeadTimeThreshold {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  target_days: number;
  warning_days: number;
  critical_days: number;
}

interface SalesVelocityConfig {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  daily_window_days: number;
  weekly_window_days: number;
  monthly_window_days: number;
}

interface ABCClassificationConfig {
  id: number;
  a_threshold: number;
  b_threshold: number;
  classification_period_days: number;
}

interface SafetyStockConfig {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  coverage_days: number;
  service_level: number;
}

interface TurnoverConfig {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  calculation_period_days: number;
  target_rate: number;
}

export function Configuration() {
  const [stockThresholds, setStockThresholds] = useState<StockThreshold>({
    id: 1,
    cat_id: null,
    vendor: null,
    critical_days: 7,
    reorder_days: 14,
    overstock_days: 90,
    low_stock_threshold: 5,
    min_reorder_quantity: 1
  });

  const [leadTimeThresholds, setLeadTimeThresholds] = useState<LeadTimeThreshold>({
    id: 1,
    cat_id: null,
    vendor: null,
    target_days: 14,
    warning_days: 21,
    critical_days: 30
  });

  const [salesVelocityConfig, setSalesVelocityConfig] = useState<SalesVelocityConfig>({
    id: 1,
    cat_id: null,
    vendor: null,
    daily_window_days: 30,
    weekly_window_days: 7,
    monthly_window_days: 90
  });

  const [abcConfig, setAbcConfig] = useState<ABCClassificationConfig>({
    id: 1,
    a_threshold: 20.0,
    b_threshold: 50.0,
    classification_period_days: 90
  });

  const [safetyStockConfig, setSafetyStockConfig] = useState<SafetyStockConfig>({
    id: 1,
    cat_id: null,
    vendor: null,
    coverage_days: 14,
    service_level: 95.0
  });

  const [turnoverConfig, setTurnoverConfig] = useState<TurnoverConfig>({
    id: 1,
    cat_id: null,
    vendor: null,
    calculation_period_days: 30,
    target_rate: 1.0
  });

  useEffect(() => {
    const loadConfig = async () => {
      try {
        const response = await fetch(`${config.apiUrl}/config`, {
          credentials: 'include'
        });
        if (!response.ok) {
          throw new Error('Failed to load configuration');
        }
        const data = await response.json();
        setStockThresholds(data.stockThresholds);
        setLeadTimeThresholds(data.leadTimeThresholds);
        setSalesVelocityConfig(data.salesVelocityConfig);
        setAbcConfig(data.abcConfig);
        setSafetyStockConfig(data.safetyStockConfig);
        setTurnoverConfig(data.turnoverConfig);
      } catch (error) {
        toast.error('Failed to load configuration');
      }
    };
    loadConfig();
  }, []);

  const handleUpdateStockThresholds = async () => {
    try {
      const response = await axios.post(`${config.apiUrl}/settings/stock-thresholds`, stockThresholds);
      if (response.status === 200) {
        toast.success('Stock thresholds updated successfully');
      }
    } catch (error) {
      console.error("Error updating stock thresholds:", error);
      toast.error('Failed to update stock thresholds');
    }
  };

  const handleUpdateLeadTimeThresholds = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/lead-time`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(leadTimeThresholds)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update lead time thresholds');
      }

      toast.success('Lead time thresholds updated successfully');
    } catch (error) {
      console.error("Error updating lead time thresholds:", error);
      toast.error(`Failed to update thresholds: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  const handleUpdateSalesVelocityConfig = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/sales-velocity/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(salesVelocityConfig)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update sales velocity configuration');
      }

      toast.success('Sales velocity configuration updated successfully');
    } catch (error) {
      console.error("Error updating sales velocity configuration:", error);
      toast.error(`Failed to update configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  const handleUpdateABCConfig = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/abc-classification/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(abcConfig)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update ABC classification configuration');
      }

      toast.success('ABC classification configuration updated successfully');
    } catch (error) {
      console.error("Error updating ABC classification configuration:", error);
      toast.error(`Failed to update configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  const handleUpdateSafetyStockConfig = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/safety-stock/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(safetyStockConfig)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update safety stock configuration');
      }

      toast.success('Safety stock configuration updated successfully');
    } catch (error) {
      console.error("Error updating safety stock configuration:", error);
      toast.error(`Failed to update configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  const handleUpdateTurnoverConfig = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/turnover/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(turnoverConfig)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update turnover configuration');
      }

      toast.success('Turnover configuration updated successfully');
    } catch (error) {
      console.error("Error updating turnover configuration:", error);
      toast.error(`Failed to update configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  return (
    <Tabs defaultValue="stock" className="w-full">
      <TabsList>
        <TabsTrigger value="stock">Stock Management</TabsTrigger>
        <TabsTrigger value="performance">Performance Metrics</TabsTrigger>
        <TabsTrigger value="calculation">Calculation Settings</TabsTrigger>
      </TabsList>

      <TabsContent value="stock" className="space-y-4">
        {/* Stock Thresholds Card */}
        <Card>
          <CardHeader>
            <CardTitle>Stock Thresholds</CardTitle>
            <CardDescription>Configure stock level thresholds for inventory management</CardDescription>
          </CardHeader>
          <CardContent>
            <div className="space-y-4">
              <div className="grid grid-cols-2 gap-4">
                <div>
                  <Label htmlFor="critical-days">Critical Days</Label>
                  <Input
                    id="critical-days"
                    type="number"
                    min="1"
                    value={stockThresholds.critical_days}
                    onChange={(e) => setStockThresholds(prev => ({
                      ...prev,
                      critical_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="reorder-days">Reorder Days</Label>
                  <Input
                    id="reorder-days"
                    type="number"
                    min="1"
                    value={stockThresholds.reorder_days}
                    onChange={(e) => setStockThresholds(prev => ({
                      ...prev,
                      reorder_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="overstock-days">Overstock Days</Label>
                  <Input
                    id="overstock-days"
                    type="number"
                    min="1"
                    value={stockThresholds.overstock_days}
                    onChange={(e) => setStockThresholds(prev => ({
                      ...prev,
                      overstock_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="low-stock-threshold">Low Stock Threshold</Label>
                  <Input
                    id="low-stock-threshold"
                    type="number"
                    min="0"
                    value={stockThresholds.low_stock_threshold}
                    onChange={(e) => setStockThresholds(prev => ({
                      ...prev,
                      low_stock_threshold: parseInt(e.target.value) || 0
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="min-reorder-quantity">Minimum Reorder Quantity</Label>
                  <Input
                    id="min-reorder-quantity"
                    type="number"
                    min="1"
                    value={stockThresholds.min_reorder_quantity}
                    onChange={(e) => setStockThresholds(prev => ({
                      ...prev,
                      min_reorder_quantity: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
              </div>
              <Button onClick={handleUpdateStockThresholds}>
                Update Stock Thresholds
              </Button>
            </div>
          </CardContent>
        </Card>

        {/* Safety Stock Configuration Card */}
        <Card>
          <CardHeader>
            <CardTitle>Safety Stock</CardTitle>
            <CardDescription>Configure safety stock parameters</CardDescription>
          </CardHeader>
          <CardContent>
            <div className="space-y-4">
              <div className="grid grid-cols-2 gap-4">
                <div>
                  <Label htmlFor="coverage-days">Coverage Days</Label>
                  <Input
                    id="coverage-days"
                    type="number"
                    min="1"
                    value={safetyStockConfig.coverage_days}
                    onChange={(e) => setSafetyStockConfig(prev => ({
                      ...prev,
                      coverage_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="service-level">Service Level (%)</Label>
                  <Input
                    id="service-level"
                    type="number"
                    min="0"
                    max="100"
                    step="0.1"
                    value={safetyStockConfig.service_level}
                    onChange={(e) => setSafetyStockConfig(prev => ({
                      ...prev,
                      service_level: parseFloat(e.target.value) || 0
                    }))}
                  />
                </div>
              </div>
              <Button onClick={handleUpdateSafetyStockConfig}>
                Update Safety Stock Configuration
              </Button>
            </div>
          </CardContent>
        </Card>
      </TabsContent>

      <TabsContent value="performance" className="space-y-4">
        {/* Lead Time Thresholds Card */}
        <Card>
          <CardHeader>
            <CardTitle>Lead Time Thresholds</CardTitle>
            <CardDescription>Configure lead time thresholds for vendor performance</CardDescription>
          </CardHeader>
          <CardContent>
            <div className="space-y-4">
              <div className="grid grid-cols-3 gap-4">
                <div>
                  <Label htmlFor="target-days">Target Days</Label>
                  <Input
                    id="target-days"
                    type="number"
                    min="1"
                    value={leadTimeThresholds.target_days}
                    onChange={(e) => setLeadTimeThresholds(prev => ({
                      ...prev,
                      target_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="warning-days">Warning Days</Label>
                  <Input
                    id="warning-days"
                    type="number"
                    min="1"
                    value={leadTimeThresholds.warning_days}
                    onChange={(e) => setLeadTimeThresholds(prev => ({
                      ...prev,
                      warning_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="critical-days-lead">Critical Days</Label>
                  <Input
                    id="critical-days-lead"
                    type="number"
                    min="1"
                    value={leadTimeThresholds.critical_days}
                    onChange={(e) => setLeadTimeThresholds(prev => ({
                      ...prev,
                      critical_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
              </div>
              <Button onClick={handleUpdateLeadTimeThresholds}>
                Update Lead Time Thresholds
              </Button>
            </div>
          </CardContent>
        </Card>

        {/* ABC Classification Card */}
        <Card>
          <CardHeader>
            <CardTitle>ABC Classification</CardTitle>
            <CardDescription>Configure ABC classification parameters</CardDescription>
          </CardHeader>
          <CardContent>
            <div className="space-y-4">
              <div className="grid grid-cols-3 gap-4">
                <div>
                  <Label htmlFor="a-threshold">A Threshold (%)</Label>
                  <Input
                    id="a-threshold"
                    type="number"
                    min="0"
                    max="100"
                    step="0.1"
                    value={abcConfig.a_threshold}
                    onChange={(e) => setAbcConfig(prev => ({
                      ...prev,
                      a_threshold: parseFloat(e.target.value) || 0
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="b-threshold">B Threshold (%)</Label>
                  <Input
                    id="b-threshold"
                    type="number"
                    min="0"
                    max="100"
                    step="0.1"
                    value={abcConfig.b_threshold}
                    onChange={(e) => setAbcConfig(prev => ({
                      ...prev,
                      b_threshold: parseFloat(e.target.value) || 0
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="classification-period">Classification Period (days)</Label>
                  <Input
                    id="classification-period"
                    type="number"
                    min="1"
                    value={abcConfig.classification_period_days}
                    onChange={(e) => setAbcConfig(prev => ({
                      ...prev,
                      classification_period_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
              </div>
              <Button onClick={handleUpdateABCConfig}>
                Update ABC Classification
              </Button>
            </div>
          </CardContent>
        </Card>

        {/* Turnover Configuration Card */}
        <Card>
          <CardHeader>
            <CardTitle>Turnover Rate</CardTitle>
            <CardDescription>Configure turnover rate calculations</CardDescription>
          </CardHeader>
          <CardContent>
            <div className="space-y-4">
              <div className="grid grid-cols-2 gap-4">
                <div>
                  <Label htmlFor="calculation-period">Calculation Period (days)</Label>
                  <Input
                    id="calculation-period"
                    type="number"
                    min="1"
                    value={turnoverConfig.calculation_period_days}
                    onChange={(e) => setTurnoverConfig(prev => ({
                      ...prev,
                      calculation_period_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="target-rate">Target Rate</Label>
                  <Input
                    id="target-rate"
                    type="number"
                    min="0"
                    step="0.1"
                    value={turnoverConfig.target_rate}
                    onChange={(e) => setTurnoverConfig(prev => ({
                      ...prev,
                      target_rate: parseFloat(e.target.value) || 0
                    }))}
                  />
                </div>
              </div>
              <Button onClick={handleUpdateTurnoverConfig}>
                Update Turnover Configuration
              </Button>
            </div>
          </CardContent>
        </Card>
      </TabsContent>

      <TabsContent value="calculation" className="space-y-4">
        {/* Sales Velocity Configuration Card */}
        <Card>
          <CardHeader>
            <CardTitle>Sales Velocity Windows</CardTitle>
            <CardDescription>Configure time windows for sales velocity calculations</CardDescription>
          </CardHeader>
          <CardContent>
            <div className="space-y-4">
              <div className="grid grid-cols-3 gap-4">
                <div>
                  <Label htmlFor="daily-window">Daily Window (days)</Label>
                  <Input
                    id="daily-window"
                    type="number"
                    min="1"
                    value={salesVelocityConfig.daily_window_days}
                    onChange={(e) => setSalesVelocityConfig(prev => ({
                      ...prev,
                      daily_window_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="weekly-window">Weekly Window (days)</Label>
                  <Input
                    id="weekly-window"
                    type="number"
                    min="1"
                    value={salesVelocityConfig.weekly_window_days}
                    onChange={(e) => setSalesVelocityConfig(prev => ({
                      ...prev,
                      weekly_window_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
                <div>
                  <Label htmlFor="monthly-window">Monthly Window (days)</Label>
                  <Input
                    id="monthly-window"
                    type="number"
                    min="1"
                    value={salesVelocityConfig.monthly_window_days}
                    onChange={(e) => setSalesVelocityConfig(prev => ({
                      ...prev,
                      monthly_window_days: parseInt(e.target.value) || 1
                    }))}
                  />
                </div>
              </div>
              <Button onClick={handleUpdateSalesVelocityConfig}>
                Update Sales Velocity Windows
              </Button>
            </div>
          </CardContent>
        </Card>
      </TabsContent>
    </Tabs>
  );
}
188 inventory/src/components/settings/GlobalSettings.tsx Normal file
@@ -0,0 +1,188 @@
import { useState, useEffect } from 'react';
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { toast } from "sonner";
import config from '../../config';
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";

interface GlobalSetting {
  setting_key: string;
  setting_value: string;
  description: string;
  updated_at: string;
}

export function GlobalSettings() {
  const [settings, setSettings] = useState<GlobalSetting[]>([]);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    loadSettings();
  }, []);

  const loadSettings = async () => {
    try {
      setLoading(true);
      const response = await fetch(`${config.apiUrl}/config/global`, {
        credentials: 'include'
      });
      if (!response.ok) {
        throw new Error('Failed to load global settings');
      }
      const data = await response.json();
      setSettings(data);
    } catch (error) {
      toast.error(`Failed to load settings: ${error instanceof Error ? error.message : 'Unknown error'}`);
    } finally {
      setLoading(false);
    }
  };

  const updateSetting = async (key: string, value: string) => {
    const updatedSettings = settings.map(s =>
      s.setting_key === key ? { ...s, setting_value: value } : s
    );
    setSettings(updatedSettings);
  };

  const handleSaveSettings = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/global`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(settings)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update global settings');
      }

      toast.success('Global settings updated successfully');
      await loadSettings(); // Reload to get fresh data
    } catch (error) {
      toast.error(`Failed to update settings: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  const renderSettingInput = (setting: GlobalSetting) => {
    // Handle different input types based on setting key or value
    if (setting.setting_key === 'abc_calculation_basis') {
      return (
        <Select
          value={setting.setting_value}
          onValueChange={(value) => updateSetting(setting.setting_key, value)}
        >
          <SelectTrigger>
            <SelectValue placeholder="Select calculation basis" />
          </SelectTrigger>
          <SelectContent>
            <SelectItem value="revenue_30d">Revenue (30 days)</SelectItem>
            <SelectItem value="sales_30d">Sales Quantity (30 days)</SelectItem>
            <SelectItem value="lifetime_revenue">Lifetime Revenue</SelectItem>
          </SelectContent>
        </Select>
      );
    } else if (setting.setting_key === 'default_forecast_method') {
      return (
        <Select
          value={setting.setting_value}
          onValueChange={(value) => updateSetting(setting.setting_key, value)}
        >
          <SelectTrigger>
            <SelectValue placeholder="Select forecast method" />
          </SelectTrigger>
          <SelectContent>
            <SelectItem value="standard">Standard</SelectItem>
            <SelectItem value="seasonal">Seasonal</SelectItem>
          </SelectContent>
        </Select>
      );
    } else if (setting.setting_key.includes('threshold')) {
      // Percentage inputs
      return (
        <Input
          type="number"
          min="0"
          max="1"
          step="0.01"
          className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
          value={setting.setting_value}
          onChange={(e) => updateSetting(setting.setting_key, e.target.value)}
        />
      );
    } else {
      // Default to number input for other settings
      return (
        <Input
          type="number"
          min="0"
          className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
          value={setting.setting_value}
          onChange={(e) => updateSetting(setting.setting_key, e.target.value)}
        />
      );
    }
  };

  // Group settings by their purpose
  const abcSettings = settings.filter(s => s.setting_key.startsWith('abc_'));
  const defaultSettings = settings.filter(s => s.setting_key.startsWith('default_'));

  if (loading) {
    return <div className="py-4">Loading settings...</div>;
  }

  return (
    <div className="max-w-[700px] space-y-6">
      <Card>
        <CardHeader>
          <CardTitle>ABC Classification Settings</CardTitle>
          <CardDescription>Configure how products are classified into A, B, and C categories</CardDescription>
        </CardHeader>
        <CardContent>
          <div className="space-y-4">
            {abcSettings.map(setting => (
              <div key={setting.setting_key} className="grid gap-2">
                <Label htmlFor={setting.setting_key}>
                  {setting.description}
                </Label>
                {renderSettingInput(setting)}
              </div>
            ))}
          </div>
        </CardContent>
      </Card>

      <Card>
        <CardHeader>
          <CardTitle>Default Settings</CardTitle>
          <CardDescription>Configure system-wide default values</CardDescription>
        </CardHeader>
        <CardContent>
          <div className="space-y-4">
            {defaultSettings.map(setting => (
              <div key={setting.setting_key} className="grid gap-2">
                <Label htmlFor={setting.setting_key}>
                  {setting.description}
                </Label>
                {renderSettingInput(setting)}
              </div>
            ))}
          </div>
        </CardContent>
      </Card>

      <div className="flex justify-end">
        <Button onClick={handleSaveSettings}>
          Save All Settings
        </Button>
      </div>
    </div>
  );
}
@@ -1,291 +0,0 @@
import { useState, useEffect } from 'react';
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { toast } from "sonner";
import config from '../../config';
import { Table, TableBody, TableCell, TableHeader, TableRow } from "@/components/ui/table";

interface LeadTimeThreshold {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  target_days: number;
  warning_days: number;
  critical_days: number;
}

interface ABCClassificationConfig {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  a_threshold: number;
  b_threshold: number;
  classification_period_days: number;
}

interface TurnoverConfig {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  calculation_period_days: number;
  target_rate: number;
}

export function PerformanceMetrics() {
  const [leadTimeThresholds, setLeadTimeThresholds] = useState<LeadTimeThreshold>({
    id: 1,
    cat_id: null,
    vendor: null,
    target_days: 14,
    warning_days: 21,
    critical_days: 30
  });

  const [abcConfigs, setAbcConfigs] = useState<ABCClassificationConfig[]>([]);

  const [turnoverConfigs, setTurnoverConfigs] = useState<TurnoverConfig[]>([]);

  useEffect(() => {
    const loadConfig = async () => {
      try {
        const response = await fetch(`${config.apiUrl}/config`, {
          credentials: 'include'
        });
        if (!response.ok) {
          throw new Error('Failed to load configuration');
        }
        const data = await response.json();
        setLeadTimeThresholds(data.leadTimeThresholds);
        setAbcConfigs(data.abcConfigs);
        setTurnoverConfigs(data.turnoverConfigs);
      } catch (error) {
        toast.error(`Failed to load configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
      }
    };
    loadConfig();
  }, []);

  const handleUpdateLeadTimeThresholds = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/lead-time-thresholds/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(leadTimeThresholds)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update lead time thresholds');
      }

      toast.success('Lead time thresholds updated successfully');
    } catch (error) {
      toast.error(`Failed to update thresholds: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  const handleUpdateABCConfig = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/abc-classification/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(abcConfigs)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update ABC classification configuration');
      }

      toast.success('ABC classification configuration updated successfully');
    } catch (error) {
      toast.error(`Failed to update configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  const handleUpdateTurnoverConfig = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/turnover/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(turnoverConfigs)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update turnover configuration');
      }

      toast.success('Turnover configuration updated successfully');
    } catch (error) {
      toast.error(`Failed to update configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  function getCategoryName(cat_id: number): import("react").ReactNode {
    // Simple implementation that just returns the ID as a string
    return `Category ${cat_id}`;
  }

  return (
    <div className="max-w-[700px] space-y-4">
      {/* Lead Time Thresholds Card */}
      <Card>
        <CardHeader>
          <CardTitle>Lead Time Thresholds</CardTitle>
          <CardDescription>Configure lead time thresholds for vendor performance</CardDescription>
        </CardHeader>
        <CardContent>
          <div className="space-y-4">
            <div className="grid grid-cols-3 gap-4">
              <div>
                <Label htmlFor="target-days">Target Days</Label>
                <Input
                  id="target-days"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={leadTimeThresholds.target_days}
                  onChange={(e) => setLeadTimeThresholds(prev => ({
                    ...prev,
                    target_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="warning-days">Warning Days</Label>
                <Input
                  id="warning-days"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={leadTimeThresholds.warning_days}
                  onChange={(e) => setLeadTimeThresholds(prev => ({
                    ...prev,
                    warning_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="critical-days-lead">Critical Days</Label>
                <Input
                  id="critical-days-lead"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={leadTimeThresholds.critical_days}
                  onChange={(e) => setLeadTimeThresholds(prev => ({
                    ...prev,
                    critical_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
            </div>
            <Button onClick={handleUpdateLeadTimeThresholds}>
              Update Lead Time Thresholds
            </Button>
          </div>
        </CardContent>
      </Card>

      {/* ABC Classification Card */}
      <Card>
        <CardHeader>
          <CardTitle>ABC Classification</CardTitle>
          <CardDescription>Configure ABC classification parameters</CardDescription>
        </CardHeader>
|
||||
<CardContent>
|
||||
<div className="space-y-4">
|
||||
<Table>
|
||||
<TableHeader>
|
||||
<TableRow>
|
||||
<TableCell>Category</TableCell>
|
||||
<TableCell>Vendor</TableCell>
|
||||
<TableCell className="text-right">A Threshold</TableCell>
|
||||
<TableCell className="text-right">B Threshold</TableCell>
|
||||
<TableCell className="text-right">Period Days</TableCell>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{abcConfigs && abcConfigs.length > 0 ? abcConfigs.map((config) => (
|
||||
<TableRow key={`${config.cat_id}-${config.vendor}`}>
|
||||
<TableCell>{config.cat_id ? getCategoryName(config.cat_id) : 'Global'}</TableCell>
|
||||
<TableCell>{config.vendor || 'All Vendors'}</TableCell>
|
||||
<TableCell className="text-right">{config.a_threshold !== undefined ? `${config.a_threshold}%` : '0%'}</TableCell>
|
||||
<TableCell className="text-right">{config.b_threshold !== undefined ? `${config.b_threshold}%` : '0%'}</TableCell>
|
||||
<TableCell className="text-right">{config.classification_period_days || 0}</TableCell>
|
||||
</TableRow>
|
||||
)) : (
|
||||
<TableRow>
|
||||
<TableCell colSpan={5} className="text-center py-4">No ABC configurations available</TableCell>
|
||||
</TableRow>
|
||||
)}
|
||||
</TableBody>
|
||||
</Table>
|
||||
<Button onClick={handleUpdateABCConfig}>
|
||||
Update ABC Classification
|
||||
</Button>
|
||||
</div>
|
||||
</CardContent>
|
||||
</Card>
|
||||
|
||||
{/* Turnover Configuration Card */}
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<CardTitle>Turnover Rate</CardTitle>
|
||||
<CardDescription>Configure turnover rate calculations</CardDescription>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="space-y-4">
|
||||
<Table>
|
||||
<TableHeader>
|
||||
<TableRow>
|
||||
<TableCell>Category</TableCell>
|
||||
<TableCell>Vendor</TableCell>
|
||||
<TableCell className="text-right">Period Days</TableCell>
|
||||
<TableCell className="text-right">Target Rate</TableCell>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{turnoverConfigs && turnoverConfigs.length > 0 ? turnoverConfigs.map((config) => (
|
||||
<TableRow key={`${config.cat_id}-${config.vendor}`}>
|
||||
<TableCell>{config.cat_id ? getCategoryName(config.cat_id) : 'Global'}</TableCell>
|
||||
<TableCell>{config.vendor || 'All Vendors'}</TableCell>
|
||||
<TableCell className="text-right">{config.calculation_period_days}</TableCell>
|
||||
<TableCell className="text-right">
|
||||
{config.target_rate !== undefined && config.target_rate !== null
|
||||
? (typeof config.target_rate === 'number'
|
||||
? config.target_rate.toFixed(2)
|
||||
: (isNaN(parseFloat(String(config.target_rate)))
|
||||
? '0.00'
|
||||
: parseFloat(String(config.target_rate)).toFixed(2)))
|
||||
: '0.00'}
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
)) : (
|
||||
<TableRow>
|
||||
<TableCell colSpan={4} className="text-center py-4">No turnover configurations available</TableCell>
|
||||
</TableRow>
|
||||
)}
|
||||
</TableBody>
|
||||
</Table>
|
||||
<Button onClick={handleUpdateTurnoverConfig}>
|
||||
Update Turnover Configuration
|
||||
</Button>
|
||||
</div>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
322  inventory/src/components/settings/ProductSettings.tsx  Normal file
@@ -0,0 +1,322 @@
import { useState, useEffect, useCallback, useMemo } from 'react';
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";
import { toast } from "sonner";
import config from '../../config';
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { Search } from 'lucide-react';
import { Switch } from "@/components/ui/switch";
import { ScrollArea } from "@/components/ui/scroll-area";
import {
  Pagination,
  PaginationContent,
  PaginationEllipsis,
  PaginationItem,
  PaginationLink,
  PaginationNext,
  PaginationPrevious,
} from "@/components/ui/pagination";

interface ProductSetting {
  pid: string;
  lead_time_days: number | null;
  days_of_stock: number | null;
  safety_stock: number;
  forecast_method: string | null;
  exclude_from_forecast: boolean;
  updated_at: string;
  product_name?: string; // Added for display purposes
}

export function ProductSettings() {
  const [settings, setSettings] = useState<ProductSetting[]>([]);
  const [loading, setLoading] = useState(true);
  const [page, setPage] = useState(1);
  const [pageSize] = useState(50);
  const [totalCount, setTotalCount] = useState(0);
  const [searchQuery, setSearchQuery] = useState('');
  const [pendingChanges, setPendingChanges] = useState<Record<string, boolean>>({});

  // Use useCallback to avoid unnecessary re-renders
  const loadSettings = useCallback(async () => {
    try {
      setLoading(true);
      const response = await fetch(`${config.apiUrl}/config/products?page=${page}&pageSize=${pageSize}&search=${encodeURIComponent(searchQuery)}`, {
        credentials: 'include'
      });
      if (!response.ok) {
        throw new Error('Failed to load product settings');
      }
      const data = await response.json();
      setSettings(data.items);
      setTotalCount(data.total);
    } catch (error) {
      toast.error(`Failed to load settings: ${error instanceof Error ? error.message : 'Unknown error'}`);
    } finally {
      setLoading(false);
    }
  }, [page, searchQuery, pageSize]);

  useEffect(() => {
    loadSettings();
  }, [loadSettings]);

  const updateSetting = useCallback((pid: string, field: keyof ProductSetting, value: any) => {
    setSettings(prev => prev.map(setting =>
      setting.pid === pid ? { ...setting, [field]: value } : setting
    ));
    setPendingChanges(prev => ({ ...prev, [pid]: true }));
  }, []);

  const handleSaveSetting = useCallback(async (pid: string) => {
    try {
      const setting = settings.find(s => s.pid === pid);
      if (!setting) return;

      const response = await fetch(`${config.apiUrl}/config/products/${pid}`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify({
          lead_time_days: setting.lead_time_days,
          days_of_stock: setting.days_of_stock,
          safety_stock: setting.safety_stock,
          forecast_method: setting.forecast_method,
          exclude_from_forecast: setting.exclude_from_forecast
        })
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update product setting');
      }

      toast.success(`Settings updated for product ${pid}`);
      setPendingChanges(prev => ({ ...prev, [pid]: false }));
    } catch (error) {
      toast.error(`Failed to update setting: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  }, [settings]);

  const handleResetToDefault = useCallback(async (pid: string) => {
    try {
      const response = await fetch(`${config.apiUrl}/config/products/${pid}/reset`, {
        method: 'POST',
        credentials: 'include'
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to reset product setting');
      }

      toast.success(`Settings reset for product ${pid}`);
      loadSettings(); // Reload settings to get defaults
    } catch (error) {
      toast.error(`Failed to reset setting: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  }, [loadSettings]);

  const totalPages = useMemo(() => Math.ceil(totalCount / pageSize), [totalCount, pageSize]);

  // Generate page numbers for pagination
  const paginationItems = useMemo(() => {
    const pages = [];
    const maxVisiblePages = 5;

    // Always include first page
    pages.push(1);

    // Calculate range of visible pages
    let startPage = Math.max(2, page - Math.floor(maxVisiblePages / 2));
    let endPage = Math.min(totalPages - 1, startPage + maxVisiblePages - 3);

    // Adjust if we're near the end
    if (endPage <= startPage) {
      endPage = Math.min(totalPages - 1, startPage + 1);
    }

    // Add ellipsis after first page if needed
    if (startPage > 2) {
      pages.push('ellipsis1');
    }

    // Add visible pages
    for (let i = startPage; i <= endPage; i++) {
      pages.push(i);
    }

    // Add ellipsis before last page if needed
    if (endPage < totalPages - 1) {
      pages.push('ellipsis2');
    }

    // Always include last page if it exists and is not already included
    if (totalPages > 1) {
      pages.push(totalPages);
    }

    return pages;
  }, [page, totalPages]);

  if (loading && settings.length === 0) {
    return <div className="py-4">Loading settings...</div>;
  }

  return (
    <div className="max-w-[900px] space-y-6">
      <Card>
        <CardHeader>
          <CardTitle>Product-Specific Settings</CardTitle>
          <CardDescription>Configure settings for individual products that override global defaults</CardDescription>

          <div className="relative">
            <Search className="absolute left-2.5 top-2.5 h-4 w-4 text-muted-foreground" />
            <Input
              type="search"
              placeholder="Search products by ID or name..."
              className="pl-8"
              value={searchQuery}
              onChange={(e) => setSearchQuery(e.target.value)}
            />
          </div>
        </CardHeader>
        <CardContent>
          <ScrollArea className="h-[500px] rounded-md border">
            <Table>
              <TableHeader>
                <TableRow>
                  <TableHead>Product ID</TableHead>
                  <TableHead>Lead Time (days)</TableHead>
                  <TableHead>Days of Stock</TableHead>
                  <TableHead>Safety Stock</TableHead>
                  <TableHead>Forecast Method</TableHead>
                  <TableHead>Exclude</TableHead>
                  <TableHead>Actions</TableHead>
                </TableRow>
              </TableHeader>
              <TableBody>
                {settings.map(setting => (
                  <TableRow key={setting.pid}>
                    <TableCell>{setting.pid} {setting.product_name && <span className="text-muted-foreground text-xs block">{setting.product_name}</span>}</TableCell>
                    <TableCell>
                      <Input
                        type="number"
                        min="1"
                        className="w-20 [appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                        value={setting.lead_time_days ?? ''}
                        onChange={(e) => updateSetting(setting.pid, 'lead_time_days', e.target.value ? parseInt(e.target.value) : null)}
                      />
                    </TableCell>
                    <TableCell>
                      <Input
                        type="number"
                        min="1"
                        className="w-20 [appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                        value={setting.days_of_stock ?? ''}
                        onChange={(e) => updateSetting(setting.pid, 'days_of_stock', e.target.value ? parseInt(e.target.value) : null)}
                      />
                    </TableCell>
                    <TableCell>
                      <Input
                        type="number"
                        min="0"
                        className="w-20 [appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                        value={setting.safety_stock}
                        onChange={(e) => updateSetting(setting.pid, 'safety_stock', parseInt(e.target.value) || 0)}
                      />
                    </TableCell>
                    <TableCell>
                      <Select
                        value={setting.forecast_method || 'default'}
                        onValueChange={(value) => updateSetting(setting.pid, 'forecast_method', value === 'default' ? null : value)}
                      >
                        <SelectTrigger className="w-28">
                          <SelectValue placeholder="Default" />
                        </SelectTrigger>
                        <SelectContent>
                          <SelectItem value="default">Default</SelectItem>
                          <SelectItem value="standard">Standard</SelectItem>
                          <SelectItem value="seasonal">Seasonal</SelectItem>
                        </SelectContent>
                      </Select>
                    </TableCell>
                    <TableCell>
                      <Switch
                        checked={setting.exclude_from_forecast}
                        onCheckedChange={(checked) => updateSetting(setting.pid, 'exclude_from_forecast', checked)}
                      />
                    </TableCell>
                    <TableCell>
                      <div className="flex space-x-2">
                        <Button
                          variant="outline"
                          size="sm"
                          onClick={() => handleSaveSetting(setting.pid)}
                          disabled={!pendingChanges[setting.pid]}
                        >
                          Save
                        </Button>
                        <Button
                          variant="outline"
                          size="sm"
                          onClick={() => handleResetToDefault(setting.pid)}
                        >
                          Reset
                        </Button>
                      </div>
                    </TableCell>
                  </TableRow>
                ))}
              </TableBody>
            </Table>
          </ScrollArea>

          {/* shadcn/ui Pagination */}
          {totalPages > 1 && (
            <div className="mt-4">
              <Pagination>
                <PaginationContent>
                  <PaginationItem>
                    <PaginationPrevious
                      onClick={() => setPage(p => Math.max(1, p - 1))}
                      className={page === 1 ? "pointer-events-none opacity-50" : "cursor-pointer"}
                    />
                  </PaginationItem>

                  {paginationItems.map((item, i) => (
                    typeof item === 'number' ? (
                      <PaginationItem key={i}>
                        <PaginationLink
                          onClick={() => setPage(item)}
                          isActive={page === item}
                        >
                          {item}
                        </PaginationLink>
                      </PaginationItem>
                    ) : (
                      <PaginationItem key={i}>
                        <PaginationEllipsis />
                      </PaginationItem>
                    )
                  ))}

                  <PaginationItem>
                    <PaginationNext
                      onClick={() => setPage(p => Math.min(totalPages, p + 1))}
                      className={page === totalPages ? "pointer-events-none opacity-50" : "cursor-pointer"}
                    />
                  </PaginationItem>
                </PaginationContent>
              </Pagination>
            </div>
          )}
        </CardContent>
      </Card>
    </div>
  );
}
@@ -1,248 +0,0 @@
import { useState, useEffect } from 'react';
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { toast } from "sonner";
import config from '../../config';

interface StockThreshold {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  critical_days: number;
  reorder_days: number;
  overstock_days: number;
  low_stock_threshold: number;
  min_reorder_quantity: number;
}

interface SafetyStockConfig {
  id: number;
  cat_id: number | null;
  vendor: string | null;
  coverage_days: number;
  service_level: number;
}

export function StockManagement() {
  const [stockThresholds, setStockThresholds] = useState<StockThreshold>({
    id: 1,
    cat_id: null,
    vendor: null,
    critical_days: 7,
    reorder_days: 14,
    overstock_days: 90,
    low_stock_threshold: 5,
    min_reorder_quantity: 1
  });

  const [safetyStockConfig, setSafetyStockConfig] = useState<SafetyStockConfig>({
    id: 1,
    cat_id: null,
    vendor: null,
    coverage_days: 14,
    service_level: 95.0
  });

  useEffect(() => {
    const loadConfig = async () => {
      try {
        const response = await fetch(`${config.apiUrl}/config`, {
          credentials: 'include'
        });
        if (!response.ok) {
          throw new Error('Failed to load configuration');
        }
        const data = await response.json();
        setStockThresholds(data.stockThresholds);
        setSafetyStockConfig(data.safetyStockConfig);
      } catch (error) {
        toast.error(`Failed to load configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
      }
    };
    loadConfig();
  }, []);

  const handleUpdateStockThresholds = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/stock-thresholds/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(stockThresholds)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update stock thresholds');
      }

      toast.success('Stock thresholds updated successfully');
    } catch (error) {
      toast.error(`Failed to update thresholds: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  const handleUpdateSafetyStockConfig = async () => {
    try {
      const response = await fetch(`${config.apiUrl}/config/safety-stock/1`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify(safetyStockConfig)
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update safety stock configuration');
      }

      toast.success('Safety stock configuration updated successfully');
    } catch (error) {
      toast.error(`Failed to update configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  };

  return (
    <div className="max-w-[700px] space-y-4">
      {/* Stock Thresholds Card */}
      <Card>
        <CardHeader>
          <CardTitle>Stock Thresholds</CardTitle>
          <CardDescription>Configure stock level thresholds for inventory management</CardDescription>
        </CardHeader>
        <CardContent>
          <div className="space-y-4">
            <div className="grid grid-cols-2 gap-4">
              <div>
                <Label htmlFor="critical-days">Critical Days</Label>
                <Input
                  id="critical-days"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={stockThresholds.critical_days}
                  onChange={(e) => setStockThresholds(prev => ({
                    ...prev,
                    critical_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="reorder-days">Reorder Days</Label>
                <Input
                  id="reorder-days"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={stockThresholds.reorder_days}
                  onChange={(e) => setStockThresholds(prev => ({
                    ...prev,
                    reorder_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="overstock-days">Overstock Days</Label>
                <Input
                  id="overstock-days"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={stockThresholds.overstock_days}
                  onChange={(e) => setStockThresholds(prev => ({
                    ...prev,
                    overstock_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="low-stock-threshold">Low Stock Threshold</Label>
                <Input
                  id="low-stock-threshold"
                  type="number"
                  min="0"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={stockThresholds.low_stock_threshold}
                  onChange={(e) => setStockThresholds(prev => ({
                    ...prev,
                    low_stock_threshold: parseInt(e.target.value) || 0
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="min-reorder-quantity">Minimum Reorder Quantity</Label>
                <Input
                  id="min-reorder-quantity"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={stockThresholds.min_reorder_quantity}
                  onChange={(e) => setStockThresholds(prev => ({
                    ...prev,
                    min_reorder_quantity: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
            </div>
            <Button onClick={handleUpdateStockThresholds}>
              Update Stock Thresholds
            </Button>
          </div>
        </CardContent>
      </Card>

      {/* Safety Stock Configuration Card */}
      <Card>
        <CardHeader>
          <CardTitle>Safety Stock</CardTitle>
          <CardDescription>Configure safety stock parameters</CardDescription>
        </CardHeader>
        <CardContent>
          <div className="space-y-4">
            <div className="grid grid-cols-2 gap-4">
              <div>
                <Label htmlFor="coverage-days">Coverage Days</Label>
                <Input
                  id="coverage-days"
                  type="number"
                  min="1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={safetyStockConfig.coverage_days}
                  onChange={(e) => setSafetyStockConfig(prev => ({
                    ...prev,
                    coverage_days: parseInt(e.target.value) || 1
                  }))}
                />
              </div>
              <div>
                <Label htmlFor="service-level">Service Level (%)</Label>
                <Input
                  id="service-level"
                  type="number"
                  min="0"
                  max="100"
                  step="0.1"
                  className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                  value={safetyStockConfig.service_level}
                  onChange={(e) => setSafetyStockConfig(prev => ({
                    ...prev,
                    service_level: parseFloat(e.target.value) || 0
                  }))}
                />
              </div>
            </div>
            <Button onClick={handleUpdateSafetyStockConfig}>
              Update Safety Stock Configuration
            </Button>
          </div>
        </CardContent>
      </Card>
    </div>
  );
}
283  inventory/src/components/settings/VendorSettings.tsx  Normal file
@@ -0,0 +1,283 @@
import { useState, useEffect, useCallback, useMemo } from 'react';
import { Button } from "@/components/ui/button";
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { toast } from "sonner";
import config from '../../config';
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";
import { Search } from 'lucide-react';
import { ScrollArea } from "@/components/ui/scroll-area";
import {
  Pagination,
  PaginationContent,
  PaginationEllipsis,
  PaginationItem,
  PaginationLink,
  PaginationNext,
  PaginationPrevious,
} from "@/components/ui/pagination";
import { useDebounce } from '@/hooks/useDebounce';

interface VendorSetting {
  vendor: string;
  default_lead_time_days: number | null;
  default_days_of_stock: number | null;
  updated_at: string;
}

export function VendorSettings() {
  const [settings, setSettings] = useState<VendorSetting[]>([]);
  const [loading, setLoading] = useState(true);
  const [page, setPage] = useState(1);
  const [pageSize] = useState(50);
  const [totalCount, setTotalCount] = useState(0);
  const [searchInputValue, setSearchInputValue] = useState('');
  const searchQuery = useDebounce(searchInputValue, 300); // 300ms debounce
  const [pendingChanges, setPendingChanges] = useState<Record<string, boolean>>({});

  // Use useCallback to avoid unnecessary re-renders
  const loadSettings = useCallback(async () => {
    try {
      setLoading(true);
      const response = await fetch(`${config.apiUrl}/config/vendors?page=${page}&pageSize=${pageSize}&search=${encodeURIComponent(searchQuery)}`, {
        credentials: 'include'
      });
      if (!response.ok) {
        throw new Error('Failed to load vendor settings');
      }
      const data = await response.json();
      setSettings(data.items);
      setTotalCount(data.total);
    } catch (error) {
      toast.error(`Failed to load settings: ${error instanceof Error ? error.message : 'Unknown error'}`);
    } finally {
      setLoading(false);
    }
  }, [page, searchQuery, pageSize]);

  useEffect(() => {
    loadSettings();
  }, [loadSettings]);

  const updateSetting = useCallback((vendor: string, field: keyof VendorSetting, value: any) => {
    setSettings(prev => prev.map(setting =>
      setting.vendor === vendor ? { ...setting, [field]: value } : setting
    ));
    setPendingChanges(prev => ({ ...prev, [vendor]: true }));
  }, []);

  const handleSaveSetting = useCallback(async (vendor: string) => {
    try {
      const setting = settings.find(s => s.vendor === vendor);
      if (!setting) return;

      const response = await fetch(`${config.apiUrl}/config/vendors/${encodeURIComponent(vendor)}`, {
        method: 'PUT',
        headers: {
          'Content-Type': 'application/json'
        },
        credentials: 'include',
        body: JSON.stringify({
          default_lead_time_days: setting.default_lead_time_days,
          default_days_of_stock: setting.default_days_of_stock
        })
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to update vendor setting');
      }

      toast.success(`Settings updated for vendor ${vendor}`);
      setPendingChanges(prev => ({ ...prev, [vendor]: false }));
    } catch (error) {
      toast.error(`Failed to update setting: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  }, [settings]);

  const handleResetToDefault = useCallback(async (vendor: string) => {
    try {
      const response = await fetch(`${config.apiUrl}/config/vendors/${encodeURIComponent(vendor)}/reset`, {
        method: 'POST',
        credentials: 'include'
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error || 'Failed to reset vendor setting');
      }

      toast.success(`Settings reset for vendor ${vendor}`);
      loadSettings(); // Reload settings to get defaults
    } catch (error) {
      toast.error(`Failed to reset setting: ${error instanceof Error ? error.message : 'Unknown error'}`);
    }
  }, [loadSettings]);

  const totalPages = useMemo(() => Math.ceil(totalCount / pageSize), [totalCount, pageSize]);

  // Generate page numbers for pagination
  const paginationItems = useMemo(() => {
    const pages = [];
    const maxVisiblePages = 5;

    // Always include first page
    pages.push(1);

    // Calculate range of visible pages
    let startPage = Math.max(2, page - Math.floor(maxVisiblePages / 2));
    let endPage = Math.min(totalPages - 1, startPage + maxVisiblePages - 3);

    // Adjust if we're near the end
    if (endPage <= startPage) {
      endPage = Math.min(totalPages - 1, startPage + 1);
    }

    // Add ellipsis after first page if needed
    if (startPage > 2) {
      pages.push('ellipsis1');
    }

    // Add visible pages
    for (let i = startPage; i <= endPage; i++) {
      pages.push(i);
    }

    // Add ellipsis before last page if needed
    if (endPage < totalPages - 1) {
      pages.push('ellipsis2');
    }

    // Always include last page if it exists and is not already included
    if (totalPages > 1) {
      pages.push(totalPages);
    }

    return pages;
  }, [page, totalPages]);

  if (loading && settings.length === 0) {
    return <div className="py-4">Loading settings...</div>;
  }

  return (
    <div className="max-w-[900px] space-y-6">
      <Card>
        <CardHeader>
          <CardTitle>Vendor-Specific Settings</CardTitle>
          <CardDescription>Configure default settings for products from specific vendors</CardDescription>

          <div className="relative">
            <Search className="absolute left-2.5 top-2.5 h-4 w-4 text-muted-foreground" />
            <Input
              type="search"
              placeholder="Search vendors..."
              className="pl-8"
              value={searchInputValue}
              onChange={(e) => setSearchInputValue(e.target.value)}
            />
          </div>
        </CardHeader>
        <CardContent>
          <ScrollArea className="h-[500px] rounded-md border">
            <Table>
              <TableHeader>
                <TableRow>
                  <TableHead>Vendor</TableHead>
                  <TableHead>Default Lead Time (days)</TableHead>
                  <TableHead>Default Days of Stock</TableHead>
                  <TableHead>Actions</TableHead>
                </TableRow>
              </TableHeader>
              <TableBody>
                {settings.map(setting => (
                  <TableRow key={setting.vendor}>
                    <TableCell>{setting.vendor}</TableCell>
                    <TableCell>
                      <Input
                        type="number"
                        min="1"
                        className="w-20 [appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                        value={setting.default_lead_time_days ?? ''}
                        onChange={(e) => updateSetting(setting.vendor, 'default_lead_time_days', e.target.value ? parseInt(e.target.value) : null)}
                      />
                    </TableCell>
                    <TableCell>
                      <Input
                        type="number"
                        min="1"
                        className="w-20 [appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
                        value={setting.default_days_of_stock ?? ''}
                        onChange={(e) => updateSetting(setting.vendor, 'default_days_of_stock', e.target.value ? parseInt(e.target.value) : null)}
                      />
                    </TableCell>
                    <TableCell>
                      <div className="flex space-x-2">
                        <Button
                          variant="outline"
                          size="sm"
                          onClick={() => handleSaveSetting(setting.vendor)}
                          disabled={!pendingChanges[setting.vendor]}
                        >
                          Save
                        </Button>
                        <Button
                          variant="outline"
                          size="sm"
                          onClick={() => handleResetToDefault(setting.vendor)}
                        >
                          Reset
                        </Button>
                      </div>
                    </TableCell>
                  </TableRow>
                ))}
              </TableBody>
            </Table>
          </ScrollArea>

          {/* shadcn/ui Pagination */}
          {totalPages > 1 && (
            <div className="mt-4">
              <Pagination>
                <PaginationContent>
                  <PaginationItem>
                    <PaginationPrevious
                      onClick={() => setPage(p => Math.max(1, p - 1))}
                      className={page === 1 ? "pointer-events-none opacity-50" : "cursor-pointer"}
                    />
</PaginationItem>
|
||||
|
||||
{paginationItems.map((item, i) => (
|
||||
typeof item === 'number' ? (
|
||||
<PaginationItem key={i}>
|
||||
<PaginationLink
|
||||
onClick={() => setPage(item)}
|
||||
isActive={page === item}
|
||||
>
|
||||
{item}
|
||||
</PaginationLink>
|
||||
</PaginationItem>
|
||||
) : (
|
||||
<PaginationItem key={i}>
|
||||
<PaginationEllipsis />
|
||||
</PaginationItem>
|
||||
)
|
||||
))}
|
||||
|
||||
<PaginationItem>
|
||||
<PaginationNext
|
||||
onClick={() => setPage(p => Math.min(totalPages, p + 1))}
|
||||
className={page === totalPages ? "pointer-events-none opacity-50" : "cursor-pointer"}
|
||||
/>
|
||||
</PaginationItem>
|
||||
</PaginationContent>
|
||||
</Pagination>
|
||||
</div>
|
||||
)}
|
||||
</CardContent>
|
||||
</Card>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
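The page-window calculation in the `paginationItems` memo above is easy to get wrong at the edges (few pages, current page near either end), so it helps to exercise it in isolation. A minimal sketch as a pure function — `getPaginationItems` is a hypothetical helper name, but the constants and logic match the memo:

```typescript
// Pure re-statement of the component's page-window calculation, for
// illustration only; the function name is not part of the original code.
type PageItem = number | 'ellipsis1' | 'ellipsis2';

function getPaginationItems(page: number, totalPages: number): PageItem[] {
  const maxVisiblePages = 5;
  const pages: PageItem[] = [1]; // always include the first page

  // Window of pages centered on the current page, clamped to [2, totalPages - 1]
  let startPage = Math.max(2, page - Math.floor(maxVisiblePages / 2));
  let endPage = Math.min(totalPages - 1, startPage + maxVisiblePages - 3);
  if (endPage <= startPage) {
    endPage = Math.min(totalPages - 1, startPage + 1);
  }

  if (startPage > 2) pages.push('ellipsis1');        // gap after page 1
  for (let i = startPage; i <= endPage; i++) pages.push(i);
  if (endPage < totalPages - 1) pages.push('ellipsis2'); // gap before last page
  if (totalPages > 1) pages.push(totalPages);        // always include the last page
  return pages;
}

console.log(getPaginationItems(5, 10)); // mid-range: ellipses on both sides
console.log(getPaginationItems(1, 3));  // small page counts: no ellipses
```

The `'ellipsis1'`/`'ellipsis2'` sentinels are what the render branch above keys on (`typeof item === 'number'`) to decide between a `PaginationLink` and a `PaginationEllipsis`.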
25
inventory/src/hooks/useDebounce.ts
Normal file
@@ -0,0 +1,25 @@
import { useState, useEffect } from 'react';

/**
 * A hook that returns a debounced value after the specified delay
 * @param value The value to debounce
 * @param delay The delay in milliseconds
 * @returns The debounced value
 */
export function useDebounce<T>(value: T, delay: number): T {
  const [debouncedValue, setDebouncedValue] = useState<T>(value);

  useEffect(() => {
    // Update the debounced value after the specified delay
    const timer = setTimeout(() => {
      setDebouncedValue(value);
    }, delay);

    // Clean up the timeout on unmount or when value/delay changes
    return () => {
      clearTimeout(timer);
    };
  }, [value, delay]);

  return debouncedValue;
}
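The key behavior of this hook is that every change to `value` cancels the pending timer (via the effect cleanup) and starts a new one, so only the last value in a burst ever lands. That reset-on-change semantics can be modeled deterministically without React, using a manual clock — all names here (`FakeClock`, `makeDebouncer`) are illustrative, not part of the codebase:

```typescript
// Deterministic model of the hook's reset-on-change behavior, using a manual
// clock instead of React state/effects.
type Timer = { fireAt: number; fn: () => void; cancelled: boolean };

class FakeClock {
  now = 0;
  private timers: Timer[] = [];
  setTimeout(fn: () => void, delay: number): Timer {
    const t: Timer = { fireAt: this.now + delay, fn, cancelled: false };
    this.timers.push(t);
    return t;
  }
  clearTimeout(t: Timer): void {
    t.cancelled = true;
  }
  advance(ms: number): void {
    this.now += ms;
    for (const t of this.timers) {
      if (!t.cancelled && t.fireAt <= this.now) {
        t.cancelled = true;
        t.fn();
      }
    }
  }
}

// Mirrors useDebounce: each new value cancels the pending timer (the effect
// cleanup) and schedules a fresh one for `delay` ms later.
function makeDebouncer<T>(clock: FakeClock, delay: number, initial: T) {
  let debounced = initial;
  let pending: Timer | null = null;
  return {
    set(value: T): void {
      if (pending) clock.clearTimeout(pending);
      pending = clock.setTimeout(() => { debounced = value; }, delay);
    },
    get value(): T {
      return debounced;
    },
  };
}

const clock = new FakeClock();
const search = makeDebouncer(clock, 300, '');
search.set('a');
clock.advance(100);
search.set('ab');            // resets the timer, so 'a' never lands
clock.advance(299);
const before = search.value; // still '' — only 299ms since the last keystroke
clock.advance(1);
const after = search.value;  // 'ab' once the full 300ms has elapsed
```

This is why pairing the hook with `searchInputValue` in a page component keeps the input responsive while API queries only fire once typing pauses.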
459
inventory/src/pages/Brands.tsx
Normal file
@@ -0,0 +1,459 @@
import { useState, useMemo, useCallback } from "react";
import { useQuery } from "@tanstack/react-query";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";
import { Pagination, PaginationContent, PaginationItem, PaginationLink, PaginationNext, PaginationPrevious } from "@/components/ui/pagination";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { motion } from "framer-motion";
import { Input } from "@/components/ui/input";
import config from "../config";
import { Skeleton } from "@/components/ui/skeleton";
import { Switch } from "@/components/ui/switch";
import { Label } from "@/components/ui/label";
import { Badge } from "@/components/ui/badge";

// Matches backend COLUMN_MAP keys for sorting
type BrandSortableColumns =
  | 'brandName' | 'productCount' | 'activeProductCount' | 'currentStockUnits'
  | 'currentStockCost' | 'currentStockRetail' | 'revenue_7d' | 'revenue_30d'
  | 'profit_30d' | 'sales_30d' | 'avg_margin_30d' | 'stock_turn_30d' | 'status'; // Add more as needed

interface BrandMetric {
  brand_id: string | number;
  brand_name: string;
  last_calculated: string;
  product_count: number;
  active_product_count: number;
  replenishable_product_count: number;
  current_stock_units: number;
  current_stock_cost: string | number;
  current_stock_retail: string | number;
  sales_7d: number;
  revenue_7d: string | number;
  sales_30d: number;
  revenue_30d: string | number;
  profit_30d: string | number;
  cogs_30d: string | number;
  sales_365d: number;
  revenue_365d: string | number;
  lifetime_sales: number;
  lifetime_revenue: string | number;
  avg_margin_30d: string | number | null;
  stock_turn_30d: string | number | null;
  status: string;
  brand_status: string;
  description: string;
  // Camel case versions
  brandId: string | number;
  brandName: string;
  lastCalculated: string;
  productCount: number;
  activeProductCount: number;
  replenishableProductCount: number;
  currentStockUnits: number;
  currentStockCost: string | number;
  currentStockRetail: string | number;
  lifetimeSales: number;
  lifetimeRevenue: string | number;
  avgMargin_30d: string | number | null;
  stockTurn_30d: string | number | null;
}

// Define response type to avoid type errors
interface BrandResponse {
  brands: BrandMetric[];
  pagination: {
    total: number;
    pages: number;
    currentPage: number;
    limit: number;
  };
}

interface BrandFilterOptions {
  statuses: string[];
}

interface BrandStats {
  totalBrands: number;
  activeBrands: number;
  totalActiveProducts: number; // SUM(active_product_count)
  totalValue: number; // SUM(current_stock_cost)
  avgMargin: number; // Weighted avg margin 30d
}

interface BrandFilters {
  search: string;
  status: string;
  showInactive: boolean; // Show brands with 0 active products
}

const ITEMS_PER_PAGE = 50;

// Re-use formatting helpers or define here
const formatCurrency = (value: number | string | null | undefined, digits = 0): string => {
  if (value == null) return 'N/A';
  if (typeof value === 'string') {
    const parsed = parseFloat(value);
    if (isNaN(parsed)) return 'N/A';
    return new Intl.NumberFormat('en-US', {
      style: 'currency',
      currency: 'USD',
      minimumFractionDigits: digits,
      maximumFractionDigits: digits
    }).format(parsed);
  }
  if (typeof value !== 'number' || isNaN(value)) return 'N/A';
  return new Intl.NumberFormat('en-US', {
    style: 'currency',
    currency: 'USD',
    minimumFractionDigits: digits,
    maximumFractionDigits: digits
  }).format(value);
};

const formatNumber = (value: number | string | null | undefined, digits = 0): string => {
  if (value == null) return 'N/A';
  if (typeof value === 'string') {
    const parsed = parseFloat(value);
    if (isNaN(parsed)) return 'N/A';
    return parsed.toLocaleString(undefined, {
      minimumFractionDigits: digits,
      maximumFractionDigits: digits,
    });
  }
  if (typeof value !== 'number' || isNaN(value)) return 'N/A';
  return value.toLocaleString(undefined, {
    minimumFractionDigits: digits,
    maximumFractionDigits: digits,
  });
};

const formatPercentage = (value: number | string | null | undefined, digits = 1): string => {
  if (value == null) return 'N/A';
  if (typeof value === 'string') {
    const parsed = parseFloat(value);
    if (isNaN(parsed)) return 'N/A';
    return `${parsed.toFixed(digits)}%`;
  }
  if (typeof value !== 'number' || isNaN(value)) return 'N/A';
  return `${value.toFixed(digits)}%`;
};

const getStatusVariant = (status: string): "default" | "secondary" | "outline" | "destructive" => {
  switch (status) {
    case 'active':
      return 'default';
    case 'inactive':
      return 'secondary';
    case 'discontinued':
      return 'destructive';
    default:
      return 'outline';
  }
};

export function Brands() {
  const [page, setPage] = useState(1);
  const [limit] = useState(ITEMS_PER_PAGE);
  const [sortColumn, setSortColumn] = useState<BrandSortableColumns>("brandName");
  const [sortDirection, setSortDirection] = useState<"asc" | "desc">("asc");
  const [filters, setFilters] = useState<BrandFilters>({
    search: "",
    status: "all",
    showInactive: false, // Default to hiding brands with 0 active products
  });

  // --- Data Fetching ---

  const queryParams = useMemo(() => {
    const params = new URLSearchParams();
    params.set('page', page.toString());
    params.set('limit', limit.toString());
    params.set('sort', sortColumn);
    params.set('order', sortDirection);

    if (filters.search) {
      params.set('brandName_ilike', filters.search); // Filter by name
    }
    if (filters.status !== 'all') {
      params.set('status', filters.status); // Filter by status
    }
    if (!filters.showInactive) {
      params.set('activeProductCount_gt', '0'); // Only show brands with active products
    }
    // Add more filters here if needed (e.g., revenue30d_gt=5000)

    return params;
  }, [page, limit, sortColumn, sortDirection, filters]);

  const { data: listData, isLoading: isLoadingList, error: listError } = useQuery<BrandResponse, Error>({
    queryKey: ['brands', queryParams.toString()],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/brands-aggregate?${queryParams.toString()}`, {
        credentials: 'include'
      });
      if (!response.ok) throw new Error(`Network response was not ok (${response.status})`);
      const data = await response.json();
      console.log('Brands data:', JSON.stringify(data, null, 2));
      return data;
    },
    placeholderData: (prev) => prev, // Modern replacement for keepPreviousData
  });

  const { data: statsData, isLoading: isLoadingStats } = useQuery<BrandStats, Error>({
    queryKey: ['brandsStats'],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/brands-aggregate/stats`, {
        credentials: 'include'
      });
      if (!response.ok) throw new Error("Failed to fetch brand stats");
      return response.json();
    },
  });

  // Fetch filter options
  const { data: filterOptions } = useQuery<BrandFilterOptions, Error>({
    queryKey: ['brandsFilterOptions'],
    queryFn: async () => {
      const response = await fetch(`${config.apiUrl}/brands-aggregate/filter-options`, {
        credentials: 'include'
      });
      if (!response.ok) throw new Error("Failed to fetch filter options");
      return response.json();
    },
  });

  // --- Event Handlers ---

  const handleSort = useCallback((column: BrandSortableColumns) => {
    setSortDirection(prev => (sortColumn === column && prev === "asc" ? "desc" : "asc"));
    setSortColumn(column);
    setPage(1);
  }, [sortColumn]);

  const handleFilterChange = useCallback((filterName: keyof BrandFilters, value: string | boolean) => {
    setFilters(prev => ({ ...prev, [filterName]: value }));
    setPage(1);
  }, []);

  const handlePageChange = (newPage: number) => {
    if (newPage >= 1 && newPage <= (listData?.pagination.pages ?? 1)) {
      setPage(newPage);
    }
  };

  // --- Derived Data ---
  const brands = listData?.brands ?? [];
  const pagination = listData?.pagination;
  const totalPages = pagination?.pages ?? 0;

  // --- Rendering ---

  return (
    <motion.div
      layout
      transition={{ layout: { duration: 0.15, ease: [0.4, 0, 0.2, 1] } }}
      className="container mx-auto py-6 space-y-4"
    >
      {/* Header */}
      <motion.div layout="position" transition={{ duration: 0.15 }} className="flex items-center justify-between">
        <h1 className="text-3xl font-bold tracking-tight">Brands</h1>
        <div className="text-sm text-muted-foreground">
          {isLoadingList && !pagination ? 'Loading...' : `${formatNumber(pagination?.total)} brands`}
        </div>
      </motion.div>

      {/* Stats Cards */}
      <motion.div layout="preserve-aspect" transition={{ duration: 0.15 }} className="grid gap-4 md:grid-cols-4">
        <Card>
          <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
            <CardTitle className="text-sm font-medium">Total Brands</CardTitle>
          </CardHeader>
          <CardContent>
            {isLoadingStats ? <Skeleton className="h-8 w-24" /> : <div className="text-2xl font-bold">{formatNumber(statsData?.totalBrands)}</div>}
            <p className="text-xs text-muted-foreground">
              {isLoadingStats ? <Skeleton className="h-4 w-28" /> :
                `${formatNumber(statsData?.activeBrands)} active`}
            </p>
          </CardContent>
        </Card>
        <Card>
          <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
            <CardTitle className="text-sm font-medium">Total Stock Value</CardTitle>
          </CardHeader>
          <CardContent>
            {isLoadingStats ? <Skeleton className="h-8 w-28" /> : <div className="text-2xl font-bold">{formatCurrency(statsData?.totalValue)}</div>}
            <p className="text-xs text-muted-foreground">
              Current cost value
            </p>
          </CardContent>
        </Card>
        <Card>
          <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
            <CardTitle className="text-sm font-medium">Avg Margin (30d)</CardTitle>
          </CardHeader>
          <CardContent>
            {isLoadingStats ? <Skeleton className="h-8 w-20" /> : <div className="text-2xl font-bold">{formatPercentage(statsData?.avgMargin)}</div>}
            <p className="text-xs text-muted-foreground">
              Weighted by revenue
            </p>
          </CardContent>
        </Card>
        <Card>
          <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
            <CardTitle className="text-sm font-medium">Total Active Products</CardTitle>
          </CardHeader>
          <CardContent>
            {isLoadingStats ? <Skeleton className="h-8 w-24" /> : <div className="text-2xl font-bold">{formatNumber(statsData?.totalActiveProducts)}</div>}
            <p className="text-xs text-muted-foreground">
              Across all brands
            </p>
          </CardContent>
        </Card>
      </motion.div>

      {/* Filter Controls */}
      <div className="flex flex-wrap items-center space-y-2 sm:space-y-0 sm:space-x-2">
        <Input
          placeholder="Search brands..."
          value={filters.search}
          onChange={(e) => handleFilterChange('search', e.target.value)}
          className="w-full sm:w-[250px]"
        />
        <Select
          value={filters.status}
          onValueChange={(value) => handleFilterChange('status', value)}
        >
          <SelectTrigger className="w-full sm:w-[180px]">
            <SelectValue placeholder="Status" />
          </SelectTrigger>
          <SelectContent>
            <SelectItem value="all">All Statuses</SelectItem>
            {filterOptions?.statuses?.map((status) => (
              <SelectItem key={status} value={status}>
                {status.charAt(0).toUpperCase() + status.slice(1)}
              </SelectItem>
            ))}
          </SelectContent>
        </Select>
        <div className="flex items-center space-x-2 ml-auto">
          <Switch
            id="show-inactive-brands"
            checked={filters.showInactive}
            onCheckedChange={(checked) => handleFilterChange('showInactive', checked)}
          />
          <Label htmlFor="show-inactive-brands">Show brands with no active products</Label>
        </div>
      </div>

      {/* Data Table */}
      <div className="rounded-md border">
        <Table>
          <TableHeader>
            <TableRow>
              <TableHead onClick={() => handleSort("brandName")} className="cursor-pointer">Brand</TableHead>
              <TableHead onClick={() => handleSort("activeProductCount")} className="cursor-pointer text-right">Active Prod.</TableHead>
              <TableHead onClick={() => handleSort("currentStockUnits")} className="cursor-pointer text-right">Stock Units</TableHead>
              <TableHead onClick={() => handleSort("currentStockCost")} className="cursor-pointer text-right">Stock Cost</TableHead>
              <TableHead onClick={() => handleSort("currentStockRetail")} className="cursor-pointer text-right">Stock Retail</TableHead>
              <TableHead onClick={() => handleSort("revenue_30d")} className="cursor-pointer text-right">Revenue (30d)</TableHead>
              <TableHead onClick={() => handleSort("profit_30d")} className="cursor-pointer text-right">Profit (30d)</TableHead>
              <TableHead onClick={() => handleSort("avg_margin_30d")} className="cursor-pointer text-right">Margin (30d)</TableHead>
              <TableHead onClick={() => handleSort("stock_turn_30d")} className="cursor-pointer text-right">Stock Turn (30d)</TableHead>
              <TableHead onClick={() => handleSort("status")} className="cursor-pointer text-right">Status</TableHead>
            </TableRow>
          </TableHeader>
          <TableBody>
            {isLoadingList && !listData ? (
              Array.from({ length: 5 }).map((_, i) => ( // Skeleton rows
                <TableRow key={`skel-${i}`}>
                  <TableCell><Skeleton className="h-5 w-40" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
                  <TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
                </TableRow>
              ))
            ) : listError ? (
              <TableRow>
                <TableCell colSpan={10} className="text-center py-8 text-destructive">
                  Error loading brands: {listError.message}
                </TableCell>
              </TableRow>
            ) : brands.length === 0 ? (
              <TableRow>
                <TableCell colSpan={10} className="text-center py-8 text-muted-foreground">
                  No brands found matching your criteria.
                </TableCell>
              </TableRow>
            ) : (
              brands.map((brand: BrandMetric) => (
                <TableRow key={brand.brand_id} className={brand.active_product_count === 0 ? "opacity-60" : ""}>
                  <TableCell className="font-medium">{brand.brand_name}</TableCell>
                  <TableCell className="text-right">{formatNumber(brand.active_product_count || brand.activeProductCount)}</TableCell>
                  <TableCell className="text-right">{formatNumber(brand.current_stock_units || brand.currentStockUnits)}</TableCell>
                  <TableCell className="text-right">{formatCurrency(brand.current_stock_cost as number)}</TableCell>
                  <TableCell className="text-right">{formatCurrency(brand.current_stock_retail as number)}</TableCell>
                  <TableCell className="text-right">{formatCurrency(brand.revenue_30d as number)}</TableCell>
                  <TableCell className="text-right">{formatCurrency(brand.profit_30d as number)}</TableCell>
                  <TableCell className="text-right">{formatPercentage(brand.avg_margin_30d as number)}</TableCell>
                  <TableCell className="text-right">{formatNumber(brand.stock_turn_30d, 2)}</TableCell>
                  <TableCell className="text-right">
                    <Badge variant={getStatusVariant(brand.status)}>
                      {brand.status || 'Unknown'}
                    </Badge>
                  </TableCell>
                </TableRow>
              ))
            )}
          </TableBody>
        </Table>
      </div>

      {/* Pagination Controls */}
      {totalPages > 1 && pagination && (
        <motion.div layout="position" transition={{ duration: 0.15 }} className="flex justify-center">
          <Pagination>
            <PaginationContent>
              <PaginationItem>
                <PaginationPrevious
                  href="#"
                  onClick={(e) => { e.preventDefault(); handlePageChange(pagination.currentPage - 1); }}
                  aria-disabled={pagination.currentPage === 1}
                  className={pagination.currentPage === 1 ? "pointer-events-none opacity-50" : ""}
                />
              </PaginationItem>
              {[...Array(totalPages)].map((_, i) => (
                <PaginationItem key={i + 1}>
                  <PaginationLink
                    href="#"
                    onClick={(e) => { e.preventDefault(); handlePageChange(i + 1); }}
                    isActive={pagination.currentPage === i + 1}
                  >
                    {i + 1}
                  </PaginationLink>
                </PaginationItem>
              ))}
              <PaginationItem>
                <PaginationNext
                  href="#"
                  onClick={(e) => { e.preventDefault(); handlePageChange(pagination.currentPage + 1); }}
                  aria-disabled={pagination.currentPage >= totalPages}
                  className={pagination.currentPage >= totalPages ? "pointer-events-none opacity-50" : ""}
                />
              </PaginationItem>
            </PaginationContent>
          </Pagination>
        </motion.div>
      )}
    </motion.div>
  );
}

export default Brands;
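The `queryParams` memo in the Brands page maps UI filter state onto the backend's query-string conventions (`brandName_ilike`, `activeProductCount_gt`). A minimal sketch of that mapping as a pure function, so it can be checked without rendering the component — `buildBrandQuery` is a hypothetical helper name, but the keys and conditionals match the memo:

```typescript
// Pure re-statement of the queryParams memo; the function and interface names
// are illustrative, while the parameter keys mirror the component's state.
interface BrandQueryFilters {
  search: string;
  status: string;
  showInactive: boolean;
}

function buildBrandQuery(
  page: number,
  limit: number,
  sort: string,
  order: 'asc' | 'desc',
  filters: BrandQueryFilters,
): string {
  const params = new URLSearchParams();
  params.set('page', page.toString());
  params.set('limit', limit.toString());
  params.set('sort', sort);
  params.set('order', order);
  if (filters.search) params.set('brandName_ilike', filters.search);     // name filter
  if (filters.status !== 'all') params.set('status', filters.status);    // status filter
  if (!filters.showInactive) params.set('activeProductCount_gt', '0');   // hide empty brands
  return params.toString();
}

const query = buildBrandQuery(1, 50, 'brandName', 'asc', {
  search: 'acme',
  status: 'all',
  showInactive: false,
});
// query: "page=1&limit=50&sort=brandName&order=asc&brandName_ilike=acme&activeProductCount_gt=0"
```

Because the memo's `queryParams.toString()` is also used as part of the React Query key, any change to these filters produces a distinct cache entry, which is what makes `placeholderData: (prev) => prev` keep the previous page visible while the next one loads.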
File diff suppressed because it is too large
@@ -8,8 +8,7 @@ import { ProductTableSkeleton } from '@/components/products/ProductTableSkeleton
import { ProductDetail } from '@/components/products/ProductDetail';
import { ProductViews } from '@/components/products/ProductViews';
import { Button } from '@/components/ui/button';
import { Product } from '@/types/products';
import type { ColumnKey } from '@/components/products/ProductTable';
import { ProductMetric, ProductMetricColumnKey } from '@/types/products';
import {
  DropdownMenu,
  DropdownMenuCheckboxItem,
@@ -35,7 +34,7 @@ import { toast } from "sonner";

// Column definition type
interface ColumnDef {
  key: ColumnKey;
  key: ProductMetricColumnKey;
  label: string;
  group: string;
  noLabel?: boolean;
@@ -45,172 +44,263 @@ interface ColumnDef {

// Define available columns with their groups
const AVAILABLE_COLUMNS: ColumnDef[] = [
  { key: 'image', label: 'Image', group: 'Basic Info', noLabel: true, width: 'w-[60px]' },
  { key: 'title', label: 'Name', group: 'Basic Info' },
  { key: 'SKU', label: 'SKU', group: 'Basic Info' },
  { key: 'brand', label: 'Company', group: 'Basic Info' },
  { key: 'vendor', label: 'Supplier', group: 'Basic Info' },
  { key: 'vendor_reference', label: 'Supplier #', group: 'Basic Info' },
  { key: 'barcode', label: 'UPC', group: 'Basic Info' },
  { key: 'description', label: 'Description', group: 'Basic Info' },
  { key: 'created_at', label: 'Created', group: 'Basic Info' },
  { key: 'harmonized_tariff_code', label: 'HTS Code', group: 'Basic Info' },
  { key: 'notions_reference', label: 'Notions Ref', group: 'Basic Info' },
  { key: 'line', label: 'Line', group: 'Basic Info' },
  { key: 'subline', label: 'Subline', group: 'Basic Info' },
  { key: 'artist', label: 'Artist', group: 'Basic Info' },
  { key: 'country_of_origin', label: 'Origin', group: 'Basic Info' },
  { key: 'location', label: 'Location', group: 'Basic Info' },
  // Identity & Basic Info
  { key: 'imageUrl', label: 'Image', group: 'Product Identity', noLabel: true, width: 'w-[60px]' },
  { key: 'title', label: 'Name', group: 'Product Identity' },
  { key: 'sku', label: 'Item Number', group: 'Product Identity' },
  { key: 'barcode', label: 'UPC', group: 'Product Identity' },
  { key: 'brand', label: 'Company', group: 'Product Identity' },
  { key: 'line', label: 'Line', group: 'Product Identity' },
  { key: 'subline', label: 'Subline', group: 'Product Identity' },
  { key: 'artist', label: 'Artist', group: 'Product Identity' },
  { key: 'isVisible', label: 'Visible', group: 'Product Identity' },
  { key: 'isReplenishable', label: 'Replenishable', group: 'Product Identity' },
  { key: 'abcClass', label: 'ABC Class', group: 'Product Identity' },
  { key: 'status', label: 'Status', group: 'Product Identity' },
  { key: 'dateCreated', label: 'Created', group: 'Dates' },

  // Physical properties
  { key: 'weight', label: 'Weight', group: 'Physical', format: (v) => v?.toString() ?? '-' },
  { key: 'dimensions', label: 'Dimensions', group: 'Physical', format: (v) => v ? `${v.length}x${v.width}x${v.height}` : '-' },
  // Supply Chain
  { key: 'vendor', label: 'Supplier', group: 'Supply Chain' },
  { key: 'vendorReference', label: 'Supplier #', group: 'Supply Chain' },
  { key: 'notionsReference', label: 'Notions #', group: 'Supply Chain' },
  { key: 'harmonizedTariffCode', label: 'Tariff Code', group: 'Supply Chain' },
  { key: 'countryOfOrigin', label: 'Country', group: 'Supply Chain' },
  { key: 'location', label: 'Location', group: 'Supply Chain' },
  { key: 'moq', label: 'MOQ', group: 'Supply Chain', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },

  // Physical Properties
  { key: 'weight', label: 'Weight', group: 'Physical', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
  { key: 'dimensions', label: 'Dimensions', group: 'Physical', format: (v) => v ? `${v.length}×${v.width}×${v.height}` : '-' },

  // Stock columns
  { key: 'stock_quantity', label: 'Shelf Count', group: 'Stock', format: (v) => v?.toString() ?? '-' },
  { key: 'stock_status', label: 'Stock Status', group: 'Stock' },
  { key: 'preorder_count', label: 'Preorders', group: 'Stock', format: (v) => v?.toString() ?? '-' },
  { key: 'notions_inv_count', label: 'Notions Inv', group: 'Stock', format: (v) => v?.toString() ?? '-' },
  { key: 'days_of_inventory', label: 'Days of Stock', group: 'Stock', format: (v) => v?.toFixed(1) ?? '-' },
  { key: 'weeks_of_inventory', label: 'Weeks of Stock', group: 'Stock', format: (v) => v?.toFixed(1) ?? '-' },
  { key: 'abc_class', label: 'ABC Class', group: 'Stock' },
  { key: 'replenishable', label: 'Replenishable', group: 'Stock' },
  { key: 'moq', label: 'MOQ', group: 'Stock', format: (v) => v?.toString() ?? '-' },
  { key: 'reorder_qty', label: 'Reorder Qty', group: 'Stock', format: (v) => v?.toString() ?? '-' },
  { key: 'reorder_point', label: 'Reorder Point', group: 'Stock', format: (v) => v?.toString() ?? '-' },
  { key: 'safety_stock', label: 'Safety Stock', group: 'Stock', format: (v) => v?.toString() ?? '-' },
  { key: 'overstocked_amt', label: 'Overstock Amt', group: 'Stock', format: (v) => v?.toString() ?? '-' },
  // Customer Engagement
  { key: 'rating', label: 'Rating', group: 'Customer', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
  { key: 'reviews', label: 'Reviews', group: 'Customer', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'baskets', label: 'Basket Adds', group: 'Customer', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'notifies', label: 'Stock Alerts', group: 'Customer', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },

  // Pricing columns
  { key: 'price', label: 'Price', group: 'Pricing', format: (v) => v?.toFixed(2) ?? '-' },
  { key: 'regular_price', label: 'Default Price', group: 'Pricing', format: (v) => v?.toFixed(2) ?? '-' },
  { key: 'cost_price', label: 'Cost', group: 'Pricing', format: (v) => v?.toFixed(2) ?? '-' },
  { key: 'landing_cost_price', label: 'Landing Cost', group: 'Pricing', format: (v) => v?.toFixed(2) ?? '-' },
  // Inventory & Stock
  { key: 'currentStock', label: 'Current Stock', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'preorderCount', label: 'Preorders', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'notionsInvCount', label: 'Notions Inv.', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'configSafetyStock', label: 'Safety Stock', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'replenishmentUnits', label: 'Replenish Units', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'stockCoverInDays', label: 'Stock Cover (Days)', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
  { key: 'sellsOutInDays', label: 'Sells Out In (Days)', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
  { key: 'onOrderQty', label: 'On Order', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'earliestExpectedDate', label: 'Expected Date', group: 'Inventory' },
  { key: 'isOldStock', label: 'Old Stock', group: 'Inventory' },
  { key: 'overstockedUnits', label: 'Overstock Qty', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'stockoutDays30d', label: 'Stockout Days (30d)', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'stockoutRate30d', label: 'Stockout Rate %', group: 'Inventory', format: (v) => v === 0 ? '0%' : v ? `${v.toFixed(1)}%` : '-' },
  { key: 'avgStockUnits30d', label: 'Avg Stock Units (30d)', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
  { key: 'receivedQty30d', label: 'Received Qty (30d)', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
  { key: 'poCoverInDays', label: 'PO Cover (Days)', group: 'Inventory', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },

  // Sales columns
  { key: 'daily_sales_avg', label: 'Daily Sales', group: 'Sales', format: (v) => v?.toFixed(1) ?? '-' },
  { key: 'weekly_sales_avg', label: 'Weekly Sales', group: 'Sales', format: (v) => v?.toFixed(1) ?? '-' },
  { key: 'monthly_sales_avg', label: 'Monthly Sales', group: 'Sales', format: (v) => v?.toFixed(1) ?? '-' },
  { key: 'avg_quantity_per_order', label: 'Avg Qty/Order', group: 'Sales', format: (v) => v?.toFixed(1) ?? '-' },
  { key: 'number_of_orders', label: 'Order Count', group: 'Sales', format: (v) => v?.toString() ?? '-' },
  { key: 'first_sale_date', label: 'First Sale', group: 'Sales' },
  { key: 'last_sale_date', label: 'Last Sale', group: 'Sales' },
  { key: 'date_last_sold', label: 'Date Last Sold', group: 'Sales' },
  { key: 'total_sold', label: 'Total Sold', group: 'Sales', format: (v) => v?.toString() ?? '-' },
  { key: 'baskets', label: 'In Baskets', group: 'Sales', format: (v) => v?.toString() ?? '-' },
  { key: 'notifies', label: 'Notifies', group: 'Sales', format: (v) => v?.toString() ?? '-' },
  { key: 'rating', label: 'Rating', group: 'Sales', format: (v) => v ? v.toFixed(1) : '-' },
  { key: 'reviews', label: 'Reviews', group: 'Sales', format: (v) => v?.toString() ?? '-' },
  // Pricing & Costs
  { key: 'currentPrice', label: 'Price', group: 'Pricing', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
  { key: 'currentRegularPrice', label: 'Regular Price', group: 'Pricing', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
  { key: 'currentCostPrice', label: 'Cost', group: 'Pricing', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
  { key: 'currentLandingCostPrice', label: 'Landing Cost', group: 'Pricing', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
  { key: 'currentStockCost', label: 'Stock Cost', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
  { key: 'currentStockRetail', label: 'Stock Retail', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
  { key: 'currentStockGross', label: 'Stock Gross', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'onOrderCost', label: 'On Order Cost', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'onOrderRetail', label: 'On Order Retail', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'overstockedCost', label: 'Overstock Cost', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'overstockedRetail', label: 'Overstock Retail', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'avgStockCost30d', label: 'Avg Stock Cost (30d)', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'avgStockRetail30d', label: 'Avg Stock Retail (30d)', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'avgStockGross30d', label: 'Avg Stock Gross (30d)', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'receivedCost30d', label: 'Received Cost (30d)', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'replenishmentCost', label: 'Replenishment Cost', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'replenishmentRetail', label: 'Replenishment Retail', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'replenishmentProfit', label: 'Replenishment Profit', group: 'Valuation', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
|
||||
// Financial columns
|
||||
{ key: 'gmroi', label: 'GMROI', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'turnover_rate', label: 'Turnover Rate', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'avg_margin_percent', label: 'Margin %', group: 'Financial', format: (v) => v ? `${v.toFixed(1)}%` : '-' },
|
||||
{ key: 'inventory_value', label: 'Inventory Value', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'cost_of_goods_sold', label: 'COGS', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
{ key: 'gross_profit', label: 'Gross Profit', group: 'Financial', format: (v) => v?.toFixed(2) ?? '-' },
|
||||
// Dates & Timing
|
||||
{ key: 'dateFirstReceived', label: 'First Received', group: 'Dates' },
|
||||
{ key: 'dateLastReceived', label: 'Last Received', group: 'Dates' },
|
||||
{ key: 'dateFirstSold', label: 'First Sold', group: 'Dates' },
|
||||
{ key: 'dateLastSold', label: 'Last Sold', group: 'Dates' },
|
||||
{ key: 'ageDays', label: 'Age (Days)', group: 'Dates', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'avgLeadTimeDays', label: 'Avg Lead Time', group: 'Dates', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'replenishDate', label: 'Replenish Date', group: 'Dates' },
|
||||
{ key: 'planningPeriodDays', label: 'Planning Period (Days)', group: 'Dates', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
|
||||
// Lead Time columns
|
||||
{ key: 'current_lead_time', label: 'Current Lead Time', group: 'Lead Time', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'target_lead_time', label: 'Target Lead Time', group: 'Lead Time', format: (v) => v?.toFixed(1) ?? '-' },
|
||||
{ key: 'lead_time_status', label: 'Lead Time Status', group: 'Lead Time' },
|
||||
{ key: 'last_purchase_date', label: 'Last Purchase', group: 'Lead Time' },
|
||||
{ key: 'first_received_date', label: 'First Received', group: 'Lead Time' },
|
||||
{ key: 'last_received_date', label: 'Last Received', group: 'Lead Time' },
|
||||
// Sales & Revenue
|
||||
{ key: 'salesVelocityDaily', label: 'Daily Velocity', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'yesterdaySales', label: 'Yesterday Sales', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'sales7d', label: 'Sales (7d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'revenue7d', label: 'Revenue (7d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'sales14d', label: 'Sales (14d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'revenue14d', label: 'Revenue (14d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'sales30d', label: 'Sales (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'revenue30d', label: 'Revenue (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'sales365d', label: 'Sales (365d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'revenue365d', label: 'Revenue (365d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'avgSalesPerDay30d', label: 'Avg Sales/Day (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'avgSalesPerMonth30d', label: 'Avg Sales/Month (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'asp30d', label: 'ASP (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'acp30d', label: 'ACP (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'avgRos30d', label: 'Avg ROS (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'returnsUnits30d', label: 'Returns Units (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'returnsRevenue30d', label: 'Returns Revenue (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'discounts30d', label: 'Discounts (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'grossRevenue30d', label: 'Gross Revenue (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'grossRegularRevenue30d', label: 'Gross Regular Revenue (30d)', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'lifetimeSales', label: 'Lifetime Sales', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'lifetimeRevenue', label: 'Lifetime Revenue', group: 'Sales', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
|
||||
// Financial Performance
|
||||
{ key: 'cogs30d', label: 'COGS (30d)', group: 'Financial', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'profit30d', label: 'Profit (30d)', group: 'Financial', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'margin30d', label: 'Margin %', group: 'Financial', format: (v) => v === 0 ? '0%' : v ? `${v.toFixed(1)}%` : '-' },
|
||||
{ key: 'markup30d', label: 'Markup %', group: 'Financial', format: (v) => v === 0 ? '0%' : v ? `${v.toFixed(1)}%` : '-' },
|
||||
{ key: 'gmroi30d', label: 'GMROI', group: 'Financial', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'stockturn30d', label: 'Stock Turn', group: 'Financial', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'sellThrough30d', label: 'Sell Through %', group: 'Financial', format: (v) => v === 0 ? '0%' : v ? `${v.toFixed(1)}%` : '-' },
|
||||
{ key: 'returnRate30d', label: 'Return Rate %', group: 'Financial', format: (v) => v === 0 ? '0%' : v ? `${v.toFixed(1)}%` : '-' },
|
||||
{ key: 'discountRate30d', label: 'Discount Rate %', group: 'Financial', format: (v) => v === 0 ? '0%' : v ? `${v.toFixed(1)}%` : '-' },
|
||||
{ key: 'markdown30d', label: 'Markdown (30d)', group: 'Financial', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'markdownRate30d', label: 'Markdown Rate %', group: 'Financial', format: (v) => v === 0 ? '0%' : v ? `${v.toFixed(1)}%` : '-' },
|
||||
|
||||
// Forecasting
|
||||
{ key: 'leadTimeForecastUnits', label: 'Lead Time Forecast Units', group: 'Forecasting', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'daysOfStockForecastUnits', label: 'Days of Stock Forecast Units', group: 'Forecasting', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'planningPeriodForecastUnits', label: 'Planning Period Forecast Units', group: 'Forecasting', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'leadTimeClosingStock', label: 'Lead Time Closing Stock', group: 'Forecasting', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'daysOfStockClosingStock', label: 'Days of Stock Closing Stock', group: 'Forecasting', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'replenishmentNeededRaw', label: 'Replenishment Needed Raw', group: 'Forecasting', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'forecastLostSalesUnits', label: 'Forecast Lost Sales Units', group: 'Forecasting', format: (v) => v === 0 ? '0' : v ? v.toFixed(1) : '-' },
|
||||
{ key: 'forecastLostRevenue', label: 'Forecast Lost Revenue', group: 'Forecasting', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
|
||||
// First Period Performance
|
||||
{ key: 'first7DaysSales', label: 'First 7 Days Sales', group: 'First Period', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'first7DaysRevenue', label: 'First 7 Days Revenue', group: 'First Period', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'first30DaysSales', label: 'First 30 Days Sales', group: 'First Period', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'first30DaysRevenue', label: 'First 30 Days Revenue', group: 'First Period', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'first60DaysSales', label: 'First 60 Days Sales', group: 'First Period', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'first60DaysRevenue', label: 'First 60 Days Revenue', group: 'First Period', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
{ key: 'first90DaysSales', label: 'First 90 Days Sales', group: 'First Period', format: (v) => v === 0 ? '0' : v ? v.toString() : '-' },
|
||||
{ key: 'first90DaysRevenue', label: 'First 90 Days Revenue', group: 'First Period', format: (v) => v === 0 ? '0' : v ? v.toFixed(2) : '-' },
|
||||
];
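Every numeric column above repeats the same inline guard, `v === 0 ? '0' : v ? v.toFixed(…) : '-'`, so that an explicit zero renders as `0` while null/undefined renders as `-`. A small factory could factor this out; `numFmt` is a hypothetical helper, not part of the diff:

```typescript
// Hypothetical helper factoring out the repeated zero/null-safe formatter.
// `digits` is the number of decimal places; omit it to render via toString().
const numFmt = (digits?: number) => (v: number | null | undefined): string => {
  if (v === 0) return '0';   // keep explicit zeros visible
  if (!v) return '-';        // null, undefined, NaN -> placeholder
  return digits === undefined ? v.toString() : v.toFixed(digits);
};

// Usage mirroring the column definitions above:
const fmtUnits = numFmt();   // e.g. 'replenishmentUnits'
const fmtDays = numFmt(1);   // e.g. 'stockCoverInDays'
const fmtMoney = numFmt(2);  // e.g. 'currentStockCost'
```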

// Define default columns for each view
const VIEW_COLUMNS: Record<string, ColumnKey[]> = {
const VIEW_COLUMNS: Record<string, ProductMetricColumnKey[]> = {
all: [
'image',
'imageUrl',
'title',
'brand',
'vendor',
'stock_quantity',
'stock_status',
'reorder_qty',
'price',
'regular_price',
'daily_sales_avg',
'weekly_sales_avg',
'monthly_sales_avg',
'inventory_value',
'status',
'currentStock',
'currentPrice',
'salesVelocityDaily',
'sales30d',
'revenue30d',
'profit30d',
'stockCoverInDays',
'currentStockCost'
],
critical: [
'image',
'status',
'imageUrl',
'title',
'stock_quantity',
'safety_stock',
'daily_sales_avg',
'weekly_sales_avg',
'reorder_qty',
'reorder_point',
'currentStock',
'configSafetyStock',
'replenishmentUnits',
'salesVelocityDaily',
'sales7d',
'sales30d',
'onOrderQty',
'earliestExpectedDate',
'vendor',
'last_purchase_date',
'current_lead_time',
'dateLastReceived',
'avgLeadTimeDays'
],
reorder: [
'image',
'status',
'imageUrl',
'title',
'stock_quantity',
'reorder_point',
'daily_sales_avg',
'weekly_sales_avg',
'reorder_qty',
'currentStock',
'configSafetyStock',
'replenishmentUnits',
'salesVelocityDaily',
'sellsOutInDays',
'currentCostPrice',
'sales30d',
'vendor',
'last_purchase_date',
'avg_lead_time_days',
'avgLeadTimeDays',
'dateLastReceived'
],
overstocked: [
'image',
'status',
'imageUrl',
'title',
'stock_quantity',
'daily_sales_avg',
'weekly_sales_avg',
'overstocked_amt',
'days_of_inventory',
'inventory_value',
'turnover_rate',
'currentStock',
'overstockedUnits',
'sales7d',
'sales30d',
'salesVelocityDaily',
'stockCoverInDays',
'stockturn30d',
'currentStockCost',
'overstockedCost',
'dateLastSold'
],
'at-risk': [
'image',
'status',
'imageUrl',
'title',
'stock_quantity',
'safety_stock',
'daily_sales_avg',
'weekly_sales_avg',
'days_of_inventory',
'last_sale_date',
'current_lead_time',
'currentStock',
'configSafetyStock',
'salesVelocityDaily',
'sales7d',
'sales30d',
'stockCoverInDays',
'sellsOutInDays',
'dateLastSold',
'avgLeadTimeDays',
'profit30d'
],
new: [
'image',
'status',
'imageUrl',
'title',
'stock_quantity',
'currentStock',
'salesVelocityDaily',
'sales7d',
'vendor',
'brand',
'price',
'regular_price',
'first_received_date',
'currentPrice',
'currentCostPrice',
'dateFirstReceived',
'ageDays',
'abcClass'
],
healthy: [
'image',
'status',
'imageUrl',
'title',
'stock_quantity',
'daily_sales_avg',
'weekly_sales_avg',
'monthly_sales_avg',
'days_of_inventory',
'gross_profit',
'gmroi',
'currentStock',
'stockCoverInDays',
'salesVelocityDaily',
'sales30d',
'revenue30d',
'profit30d',
'margin30d',
'gmroi30d',
'stockturn30d'
],
};

export function Products() {
const [searchParams, setSearchParams] = useSearchParams();
const [filters, setFilters] = useState<Record<string, ActiveFilterValue>>({});
const [sortColumn, setSortColumn] = useState<ColumnKey>('title');
const [sortColumn, setSortColumn] = useState<ProductMetricColumnKey>('title');
const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('asc');
// Track last sort direction for each column
const [columnSortDirections, setColumnSortDirections] = useState<Record<string, 'asc' | 'desc'>>({
'title': 'asc' // Initialize with default sort column and direction
});
const [currentPage, setCurrentPage] = useState(1);
const [activeView, setActiveView] = useState(searchParams.get('view') || "all");
const [pageSize] = useState(50);
@@ -219,16 +309,16 @@ export function Products() {
const [, setIsLoading] = useState(false);

// Store visible columns and order for each view
const [viewColumns, setViewColumns] = useState<Record<string, Set<ColumnKey>>>(() => {
const initialColumns: Record<string, Set<ColumnKey>> = {};
const [viewColumns, setViewColumns] = useState<Record<string, Set<ProductMetricColumnKey>>>(() => {
const initialColumns: Record<string, Set<ProductMetricColumnKey>> = {};
Object.entries(VIEW_COLUMNS).forEach(([view, columns]) => {
initialColumns[view] = new Set(columns);
});
return initialColumns;
});

const [viewColumnOrder, setViewColumnOrder] = useState<Record<string, ColumnKey[]>>(() => {
const initialOrder: Record<string, ColumnKey[]> = {};
const [viewColumnOrder, setViewColumnOrder] = useState<Record<string, ProductMetricColumnKey[]>>(() => {
const initialOrder: Record<string, ProductMetricColumnKey[]> = {};
Object.entries(VIEW_COLUMNS).forEach(([view, defaultColumns]) => {
initialOrder[view] = [
...defaultColumns,
@@ -241,16 +331,19 @@ export function Products() {
// Get current view's columns
const visibleColumns = useMemo(() => {
const columns = new Set(viewColumns[activeView] || VIEW_COLUMNS.all);

// Add isReplenishable column when showing non-replenishable products for better visibility
if (showNonReplenishable) {
columns.add('replenishable');
columns.add('isReplenishable');
}

return columns;
}, [viewColumns, activeView, showNonReplenishable]);

const columnOrder = viewColumnOrder[activeView] || viewColumnOrder.all;

// Handle column visibility changes
const handleColumnVisibilityChange = (column: ColumnKey, isVisible: boolean) => {
const handleColumnVisibilityChange = (column: ProductMetricColumnKey, isVisible: boolean) => {
setViewColumns(prev => ({
...prev,
[activeView]: isVisible
@@ -260,7 +353,7 @@ export function Products() {
};

// Handle column order changes
const handleColumnOrderChange = (newOrder: ColumnKey[]) => {
const handleColumnOrderChange = (newOrder: ProductMetricColumnKey[]) => {
setViewColumnOrder(prev => ({
...prev,
[activeView]: newOrder
@@ -289,9 +382,19 @@ export function Products() {

Object.entries(filters).forEach(([key, value]) => {
if (typeof value === 'object' && 'operator' in value) {
transformedFilters[key] = value.value;
transformedFilters[`${key}_operator`] = value.operator;
// Convert the operator format to match what the backend expects
// Backend expects keys like "sales30d_gt" instead of separate operator parameters
const operatorSuffix = value.operator === '=' ? 'eq' :
value.operator === '>' ? 'gt' :
value.operator === '>=' ? 'gte' :
value.operator === '<' ? 'lt' :
value.operator === '<=' ? 'lte' :
value.operator === 'between' ? 'between' : 'eq';

// Create a key with the correct suffix format: key_operator
transformedFilters[`${key}_${operatorSuffix}`] = value.value;
} else {
// Simple values are passed as-is
transformedFilters[key] = value;
}
});
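The ternary chain above collapses a `{ operator, value }` filter into a single query key with an operator suffix (e.g. `sales30d` plus `>` becomes `sales30d_gt`). A standalone sketch of that mapping, with an illustrative helper name:

```typescript
// Illustrative standalone version of the operator-to-suffix mapping above.
type Op = '=' | '>' | '>=' | '<' | '<=' | 'between';

const operatorSuffix = (op: Op): string =>
  op === '=' ? 'eq' :
  op === '>' ? 'gt' :
  op === '>=' ? 'gte' :
  op === '<' ? 'lt' :
  op === '<=' ? 'lte' :
  op === 'between' ? 'between' : 'eq'; // fall back to equality

// A filter { sales30d: { operator: '>', value: 10 } } therefore becomes
// the query parameter "sales30d_gt" with value 10.
const filterKey = (key: string, op: Op): string => `${key}_${operatorSuffix(op)}`;
```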
@@ -307,18 +410,29 @@ export function Products() {
params.append('limit', pageSize.toString());

if (sortColumn) {
// Don't convert camelCase to snake_case - use the column name directly
// as defined in the backend's COLUMN_MAP
console.log(`Sorting: ${sortColumn} (${sortDirection})`);
params.append('sort', sortColumn);
params.append('order', sortDirection);
}

if (activeView && activeView !== 'all') {
params.append('stockStatus', activeView === 'at-risk' ? 'At Risk' : activeView);
const stockStatus = activeView === 'at-risk' ? 'At Risk' :
activeView === 'reorder' ? 'Reorder Soon' :
activeView === 'overstocked' ? 'Overstock' :
activeView === 'new' ? 'New' :
activeView.charAt(0).toUpperCase() + activeView.slice(1);

console.log(`View: ${activeView} → Stock Status: ${stockStatus}`);
params.append('stock_status', stockStatus);
}
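The view-name translation above can be read in isolation as a small pure function; note that the `'all'` view sends no `stock_status` parameter at all, and unknown views fall through to simple capitalization:

```typescript
// Sketch of the view-name to stock_status mapping above (helper name is
// illustrative; in the component this is inlined inside the fetch logic).
const viewToStockStatus = (view: string): string =>
  view === 'at-risk' ? 'At Risk' :
  view === 'reorder' ? 'Reorder Soon' :
  view === 'overstocked' ? 'Overstock' :
  view === 'new' ? 'New' :
  view.charAt(0).toUpperCase() + view.slice(1); // e.g. 'healthy' -> 'Healthy'
```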

// Transform filters to match API expectations
const transformedFilters = transformFilters(filters);
Object.entries(transformedFilters).forEach(([key, value]) => {
if (value !== undefined && value !== null && value !== '') {
// Don't convert camelCase to snake_case - use the filter name directly
if (Array.isArray(value)) {
params.append(key, JSON.stringify(value));
} else {
@@ -331,11 +445,67 @@ export function Products() {
params.append('showNonReplenishable', 'false');
}

const response = await fetch(`/api/products?${params.toString()}`);
// Log the final query parameters for debugging
console.log('API Query:', params.toString());

const response = await fetch(`/api/metrics?${params.toString()}`);
if (!response.ok) throw new Error('Failed to fetch products');

const data = await response.json();
return data;

// Transform snake_case keys to camelCase and convert string numbers to actual numbers
const transformedProducts = data.metrics?.map((product: any) => {
const transformed: any = {};

// Process all keys to convert from snake_case to camelCase
Object.entries(product).forEach(([key, value]) => {
// Better handling of snake_case to camelCase conversion
let camelKey = key;

// First handle cases like sales_7d -> sales7d
camelKey = camelKey.replace(/_(\d+[a-z])/g, '$1');

// Then handle regular snake_case -> camelCase
camelKey = camelKey.replace(/_([a-z])/g, (_, p1) => p1.toUpperCase());

// Convert numeric strings to actual numbers, but handle empty strings properly
if (typeof value === 'string' && value !== '' && !isNaN(Number(value)) &&
!key.toLowerCase().includes('date') && key !== 'sku' && key !== 'title' &&
key !== 'brand' && key !== 'vendor') {
transformed[camelKey] = Number(value);
} else {
transformed[camelKey] = value;
}
});

// Ensure pid is a number
transformed.pid = typeof transformed.pid === 'string' ?
parseInt(transformed.pid, 10) : transformed.pid;

return transformed;
}) || [];
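The two-step key conversion used in the map above is worth isolating: the first regex drops the underscore before numeric segments so `sales_7d` becomes `sales7d` (not `sales_7D`), and the second uppercases letters after the remaining underscores. A sketch under the same regexes (`toCamel` is an illustrative name):

```typescript
// Extracted version of the snake_case -> camelCase conversion above.
// Step 1: drop the underscore before numeric segments (sales_7d -> sales7d).
// Step 2: uppercase letters after remaining underscores (stock_quantity -> stockQuantity).
const toCamel = (key: string): string =>
  key
    .replace(/_(\d+[a-z])/g, '$1')
    .replace(/_([a-z])/g, (_, p1: string) => p1.toUpperCase());
```

Running step 2 first would produce `sales7D` for `sales_7d`, which is why the numeric-segment pass has to come first.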

// Debug: Log the first item to check field mapping
if (transformedProducts.length > 0) {
console.log('Sample product after transformation:');
console.log('sales7d:', transformedProducts[0].sales7d);
console.log('sales30d:', transformedProducts[0].sales30d);
console.log('revenue30d:', transformedProducts[0].revenue30d);
console.log('margin30d:', transformedProducts[0].margin30d);
console.log('markup30d:', transformedProducts[0].markup30d);
}

// Transform the metrics response to match our expected format
return {
products: transformedProducts,
pagination: data.pagination || {
total: 0,
pages: 0,
currentPage: 1,
limit: pageSize
},
filters: data.appliedQuery?.filters || {}
};
} catch (error) {
console.error('Error fetching products:', error);
toast("Failed to fetch products. Please try again.");
@@ -345,6 +515,29 @@ export function Products() {
}
};

// Query for filter options
const { data: filterOptionsData, isLoading: isLoadingFilterOptions } = useQuery({
queryKey: ['filterOptions'],
queryFn: async () => {
try {
const response = await fetch('/api/metrics/filter-options');
if (!response.ok) throw new Error('Failed to fetch filter options');
const data = await response.json();

// Ensure we have the expected structure with correct casing
return {
vendors: data.vendors || [],
brands: data.brands || [],
abcClasses: data.abc_classes || data.abcClasses || []
};
} catch (error) {
console.error('Error fetching filter options:', error);
return { vendors: [], brands: [], abcClasses: [] };
}
},
staleTime: 5 * 60 * 1000, // Cache for 5 minutes
});

// Query for products data
const { data, isFetching } = useQuery({
queryKey: ['products', currentPage, pageSize, sortColumn, sortDirection, activeView, filters, showNonReplenishable],
@@ -359,13 +552,45 @@ export function Products() {
}
}, [currentPage, data?.pagination.pages]);

// Handle sort column change
const handleSort = (column: keyof Product) => {
setSortDirection(prev => {
if (sortColumn !== column) return 'asc';
return prev === 'asc' ? 'desc' : 'asc';
});
// Handle sort column change with improved column-specific direction memory
const handleSort = (column: ProductMetricColumnKey) => {
let nextDirection: 'asc' | 'desc';

if (sortColumn === column) {
// If clicking the same column, toggle direction
nextDirection = sortDirection === 'asc' ? 'desc' : 'asc';
} else {
// If clicking a different column:
// 1. If this column has been sorted before, use the stored direction
// 2. Otherwise use a sensible default (asc for text, desc for numeric columns)
const prevDirection = columnSortDirections[column];

if (prevDirection) {
// Use the stored direction
nextDirection = prevDirection;
} else {
// Determine sensible default based on column type
const columnDef = AVAILABLE_COLUMNS.find(c => c.key === column);
const isNumeric = columnDef?.group === 'Sales' ||
columnDef?.group === 'Financial' ||
columnDef?.group === 'Stock' ||
['currentPrice', 'currentRegularPrice', 'currentCostPrice', 'currentStock'].includes(column);

// Start with descending for numeric columns (to see highest values first)
// Start with ascending for text columns (alphabetical order)
nextDirection = isNumeric ? 'desc' : 'asc';
}
}

// Update the current sort state
setSortDirection(nextDirection);
setSortColumn(column);

// Remember this column's sort direction for next time
setColumnSortDirections(prev => ({
...prev,
[column]: nextDirection
}));
};
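Stripped of the React state plumbing, the direction logic above reduces to a pure function: toggle when re-clicking the current column, otherwise prefer the remembered direction and fall back to a type-based default. A sketch with illustrative names (`memory` plays the role of `columnSortDirections`, `isNumeric` stands in for the `AVAILABLE_COLUMNS` group check):

```typescript
// Pure-function sketch of the sort-direction memory above.
type Dir = 'asc' | 'desc';

function nextSortDirection(
  current: { column: string; direction: Dir },
  clicked: string,
  memory: Record<string, Dir>,
  isNumeric: boolean,
): Dir {
  if (current.column === clicked) {
    // Re-clicking the active column toggles the direction.
    return current.direction === 'asc' ? 'desc' : 'asc';
  }
  // New column: remembered direction wins, else desc for numeric, asc for text.
  return memory[clicked] ?? (isNumeric ? 'desc' : 'asc');
}
```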
|
||||
|
||||
// Handle filter changes
|
||||
@@ -407,7 +632,7 @@ export function Products() {
|
||||
</DropdownMenuTrigger>
|
||||
<DropdownMenuContent
|
||||
align="end"
|
||||
className="w-[500px] max-h-[calc(100vh-4rem)] overflow-y-auto"
|
||||
className="w-[600px] max-h-[calc(100vh-16rem)] overflow-y-auto"
|
||||
onCloseAutoFocus={(e) => e.preventDefault()}
|
||||
onPointerDownOutside={(e) => {
|
||||
// Only close if clicking outside the dropdown
|
||||
@@ -422,45 +647,47 @@ export function Products() {
|
||||
}
|
||||
}}
|
||||
>
|
||||
<DropdownMenuLabel className="sticky top-0 bg-background z-10">Toggle columns</DropdownMenuLabel>
|
||||
<div className="sticky top-0 bg-background z-10 flex items-center justify-between">
|
||||
<DropdownMenuLabel>Toggle columns</DropdownMenuLabel>
|
||||
<Button
|
||||
variant="secondary"
|
||||
size="sm"
|
||||
onClick={(e) => {
|
||||
resetColumnsToDefault();
|
||||
// Prevent closing by stopping propagation
|
||||
e.stopPropagation();
|
||||
}}
|
||||
>
|
||||
Reset to Default
|
||||
</Button>
|
||||
</div>
|
||||
<DropdownMenuSeparator className="sticky top-[29px] bg-background z-10" />
|
||||
<div className="grid grid-cols-2 gap-4">
|
||||
<div style={{ columnCount: 3, columnGap: '2rem' }} className="p-2">
|
||||
{Object.entries(columnsByGroup).map(([group, columns]) => (
|
||||
<div key={group}>
|
||||
<DropdownMenuLabel className="text-xs font-normal text-muted-foreground">
|
||||
<div key={group} style={{ breakInside: 'avoid' }} className="mb-4">
|
||||
<DropdownMenuLabel className="text-xs font-semibold text-muted-foreground mb-2">
|
||||
{group}
|
||||
</DropdownMenuLabel>
|
||||
{columns.map((column) => (
|
||||
<DropdownMenuCheckboxItem
|
||||
key={column.key}
|
||||
className="capitalize"
|
||||
checked={visibleColumns.has(column.key)}
|
||||
onCheckedChange={(checked) => {
|
||||
handleColumnVisibilityChange(column.key, checked);
|
||||
}}
|
||||
onSelect={(e) => {
|
||||
// Prevent closing by stopping propagation
|
||||
e.preventDefault();
|
||||
}}
|
||||
>
|
||||
{column.label}
|
||||
</DropdownMenuCheckboxItem>
))}
<div className="flex flex-col gap-1">
{columns.map((column) => (
<DropdownMenuCheckboxItem
key={column.key}
className="capitalize"
checked={visibleColumns.has(column.key)}
onCheckedChange={(checked) => {
handleColumnVisibilityChange(column.key, checked);
}}
onSelect={(e) => {
e.preventDefault();
}}
>
{column.label}
</DropdownMenuCheckboxItem>
))}
</div>
</div>
))}
</div>
<DropdownMenuSeparator />
<Button
variant="ghost"
className="w-full justify-start"
onClick={(e) => {
resetColumnsToDefault();
// Prevent closing by stopping propagation
e.stopPropagation();
}}
>
Reset to Default
</Button>
</DropdownMenuContent>
</DropdownMenu>
);
@@ -515,9 +742,12 @@ export function Products() {
<div>
<div className="flex items-center justify-between mb-4">
<ProductFilters
categories={data?.filters?.categories ?? []}
vendors={data?.filters?.vendors ?? []}
brands={data?.filters?.brands ?? []}
filterOptions={{
vendors: filterOptionsData?.vendors ?? [],
brands: filterOptionsData?.brands ?? [],
abcClasses: filterOptionsData?.abcClasses ?? []
}}
isLoadingOptions={isLoadingFilterOptions}
onFilterChange={handleFilterChange}
onClearFilters={handleClearFilters}
activeFilters={filters}
@@ -534,7 +764,7 @@ export function Products() {
/>
<Label htmlFor="show-non-replenishable">Show Non-Replenishable</Label>
</div>
{data?.pagination.total > 0 && (
{data?.pagination?.total !== undefined && (
<div className="text-sm text-muted-foreground">
{data.pagination.total.toLocaleString()} products
</div>
@@ -548,7 +778,11 @@ export function Products() {
) : (
<div className="space-y-4">
<ProductTable
products={data?.products || []}
products={data?.products?.map((product: ProductMetric) => ({
...product,
// No need to calculate status anymore since it comes from the backend
status: product.status || 'Healthy' // Fallback only if status is null
})) || []}
onSort={handleSort}
sortColumn={sortColumn}
sortDirection={sortDirection}

@@ -1,8 +1,8 @@
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { DataManagement } from "@/components/settings/DataManagement";
import { StockManagement } from "@/components/settings/StockManagement";
import { PerformanceMetrics } from "@/components/settings/PerformanceMetrics";
import { CalculationSettings } from "@/components/settings/CalculationSettings";
import { GlobalSettings } from "@/components/settings/GlobalSettings";
import { ProductSettings } from "@/components/settings/ProductSettings";
import { VendorSettings } from "@/components/settings/VendorSettings";
import { TemplateManagement } from "@/components/settings/TemplateManagement";
import { UserManagement } from "@/components/settings/UserManagement";
import { PromptManagement } from "@/components/settings/PromptManagement";
@@ -33,9 +33,9 @@ const SETTINGS_GROUPS: SettingsGroup[] = [
id: "inventory",
label: "Inventory Settings",
tabs: [
{ id: "stock-management", permission: "settings:stock_management", label: "Stock Management" },
{ id: "performance-metrics", permission: "settings:performance_metrics", label: "Performance Metrics" },
{ id: "calculation-settings", permission: "settings:calculation_settings", label: "Calculation Settings" },
{ id: "global-settings", permission: "settings:global", label: "Global Settings" },
{ id: "product-settings", permission: "settings:products", label: "Product Settings" },
{ id: "vendor-settings", permission: "settings:vendors", label: "Vendor Settings" },
]
},
{
@@ -160,48 +160,48 @@ export function Settings() {
</Protected>
</TabsContent>

<TabsContent value="stock-management" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<TabsContent value="global-settings" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:stock_management"
permission="settings:global"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Stock Management.
You don't have permission to access Global Settings.
</AlertDescription>
</Alert>
}
>
<StockManagement />
<GlobalSettings />
</Protected>
</TabsContent>

<TabsContent value="performance-metrics" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<TabsContent value="product-settings" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:performance_metrics"
permission="settings:products"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Performance Metrics.
You don't have permission to access Product Settings.
</AlertDescription>
</Alert>
}
>
<PerformanceMetrics />
<ProductSettings />
</Protected>
</TabsContent>

<TabsContent value="calculation-settings" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<TabsContent value="vendor-settings" className="mt-0 focus-visible:outline-none focus-visible:ring-0">
<Protected
permission="settings:calculation_settings"
permission="settings:vendors"
fallback={
<Alert>
<AlertDescription>
You don't have permission to access Calculation Settings.
You don't have permission to access Vendor Settings.
</AlertDescription>
</Alert>
}
>
<CalculationSettings />
<VendorSettings />
</Protected>
</TabsContent>

@@ -1,350 +1,481 @@
import { useState, useMemo } from "react";
import { useState, useMemo, useCallback } from "react";
import { useQuery } from "@tanstack/react-query";
import { Card, CardContent, CardHeader, CardTitle } from "@/components/ui/card";
import { Input } from "@/components/ui/input";
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";
import { Badge } from "@/components/ui/badge";
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
import { Pagination, PaginationContent, PaginationItem, PaginationLink, PaginationNext, PaginationPrevious } from "@/components/ui/pagination";
import { motion } from "framer-motion";
import config from "../config";
import { Skeleton } from "@/components/ui/skeleton";
import { Input } from "@/components/ui/input";
import { Switch } from "@/components/ui/switch";
import { Label } from "@/components/ui/label";

interface Vendor {
vendor_id: number;
name: string;
status: string;
avg_lead_time_days: number;
on_time_delivery_rate: number;
order_fill_rate: number;
total_orders: number;
active_products: number;
avg_unit_cost: number;
total_spend: number;
// Matches backend COLUMN_MAP keys for sorting
type VendorSortableColumns =
| 'vendorName' | 'productCount' | 'activeProductCount' | 'currentStockUnits'
| 'currentStockCost' | 'onOrderUnits' | 'onOrderCost' | 'avgLeadTimeDays'
| 'revenue_30d' | 'profit_30d' | 'avg_margin_30d' | 'po_count_365d' | 'status';

interface VendorMetric {
vendor_id: string | number;
vendor_name: string;
last_calculated: string;
product_count: number;
active_product_count: number;
replenishable_product_count: number;
current_stock_units: number;
current_stock_cost: string | number;
current_stock_retail: string | number;
on_order_units: number;
on_order_cost: string | number;
po_count_365d: number;
avg_lead_time_days: number | null;
sales_7d: number;
revenue_7d: string | number;
sales_30d: number;
revenue_30d: string | number;
profit_30d: string | number;
cogs_30d: string | number;
sales_365d: number;
revenue_365d: string | number;
lifetime_sales: number;
lifetime_revenue: string | number;
avg_margin_30d: string | number | null;
// New fields added by vendorsAggregate
status: string;
vendor_status: string;
cost_metrics_30d: {
avg_unit_cost: number;
total_spend: number;
order_count: number;
};
// Camel case versions
vendorId: string | number;
vendorName: string;
lastCalculated: string;
productCount: number;
activeProductCount: number;
replenishableProductCount: number;
currentStockUnits: number;
currentStockCost: string | number;
currentStockRetail: string | number;
onOrderUnits: number;
onOrderCost: string | number;
poCount_365d: number;
avgLeadTimeDays: number | null;
lifetimeSales: number;
lifetimeRevenue: string | number;
avgMargin_30d: string | number | null;
}

// Define response type to avoid type errors
interface VendorResponse {
vendors: VendorMetric[];
pagination: {
total: number;
pages: number;
currentPage: number;
limit: number;
};
}

interface VendorFilterOptions {
statuses: string[];
}

interface VendorStats {
totalVendors: number;
activeVendors: number;
totalActiveProducts: number;
totalValue: number;
totalOnOrderValue: number;
avgLeadTime: number;
}

interface VendorFilters {
search: string;
status: string;
performance: string;
search: string;
status: string;
showInactive: boolean;
}

const ITEMS_PER_PAGE = 50;

const formatCurrency = (value: number | string | null | undefined, digits = 0): string => {
if (value == null) return 'N/A';
if (typeof value === 'string') {
const parsed = parseFloat(value);
if (isNaN(parsed)) return 'N/A';
return new Intl.NumberFormat('en-US', {
style: 'currency',
currency: 'USD',
minimumFractionDigits: digits,
maximumFractionDigits: digits
}).format(parsed);
}
if (typeof value !== 'number' || isNaN(value)) return 'N/A';
return new Intl.NumberFormat('en-US', {
style: 'currency',
currency: 'USD',
minimumFractionDigits: digits,
maximumFractionDigits: digits
}).format(value);
};

const formatNumber = (value: number | string | null | undefined, digits = 0): string => {
if (value == null) return 'N/A';
if (typeof value === 'string') {
const parsed = parseFloat(value);
if (isNaN(parsed)) return 'N/A';
return parsed.toLocaleString(undefined, {
minimumFractionDigits: digits,
maximumFractionDigits: digits,
});
}
if (typeof value !== 'number' || isNaN(value)) return 'N/A';
return value.toLocaleString(undefined, {
minimumFractionDigits: digits,
maximumFractionDigits: digits,
});
};

const formatPercentage = (value: number | string | null | undefined, digits = 1): string => {
if (value == null) return 'N/A';
if (typeof value === 'string') {
const parsed = parseFloat(value);
if (isNaN(parsed)) return 'N/A';
return `${parsed.toFixed(digits)}%`;
}
if (typeof value !== 'number' || isNaN(value)) return 'N/A';
return `${value.toFixed(digits)}%`;
};

const formatDays = (value: number | string | null | undefined, digits = 1): string => {
if (value == null) return 'N/A';
if (typeof value === 'string') {
const parsed = parseFloat(value);
if (isNaN(parsed)) return 'N/A';
return `${parsed.toFixed(digits)} days`;
}
if (typeof value !== 'number' || isNaN(value)) return 'N/A';
return `${value.toFixed(digits)} days`;
};

const getStatusVariant = (status: string): "default" | "secondary" | "outline" | "destructive" => {
switch (status) {
case 'active':
return 'default';
case 'inactive':
return 'secondary';
case 'discontinued':
return 'destructive';
default:
return 'outline';
}
};

export function Vendors() {
const [page, setPage] = useState(1);
const [sortColumn, setSortColumn] = useState<keyof Vendor>("name");
const [sortDirection, setSortDirection] = useState<"asc" | "desc">("asc");
const [filters, setFilters] = useState<VendorFilters>({
search: "",
status: "all",
performance: "all",
});

const { data, isLoading } = useQuery({
queryKey: ["vendors"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/vendors`, {
credentials: 'include'
});
if (!response.ok) throw new Error("Failed to fetch vendors");
return response.json();
},
});

// Filter and sort the data client-side
const filteredData = useMemo(() => {
if (!data?.vendors) return [];

let filtered = [...data.vendors];

// Apply search filter
if (filters.search) {
const searchLower = filters.search.toLowerCase();
filtered = filtered.filter(vendor =>
vendor.name.toLowerCase().includes(searchLower)
);
}

// Apply status filter
if (filters.status !== 'all') {
filtered = filtered.filter(vendor => vendor.status === filters.status);
}

// Apply performance filter
if (filters.performance !== 'all') {
filtered = filtered.filter(vendor => {
const fillRate = vendor.order_fill_rate ?? 0;
switch (filters.performance) {
case 'excellent': return fillRate >= 95;
case 'good': return fillRate >= 85 && fillRate < 95;
case 'fair': return fillRate >= 75 && fillRate < 85;
case 'poor': return fillRate < 75;
default: return true;
}
});
}

// Apply sorting
filtered.sort((a, b) => {
const aVal = a[sortColumn];
const bVal = b[sortColumn];

if (typeof aVal === 'number' && typeof bVal === 'number') {
return sortDirection === 'asc' ? aVal - bVal : bVal - aVal;
}

const aStr = String(aVal || '');
const bStr = String(bVal || '');
return sortDirection === 'asc' ?
aStr.localeCompare(bStr) :
bStr.localeCompare(aStr);
const [page, setPage] = useState(1);
const [limit] = useState(ITEMS_PER_PAGE);
const [sortColumn, setSortColumn] = useState<VendorSortableColumns>("vendorName");
const [sortDirection, setSortDirection] = useState<"asc" | "desc">("asc");
const [filters, setFilters] = useState<VendorFilters>({
search: "",
status: "all",
showInactive: false, // Default to hiding vendors with 0 active products
});

return filtered;
}, [data?.vendors, filters, sortColumn, sortDirection]);
// --- Data Fetching ---

// Calculate pagination
const totalPages = Math.ceil(filteredData.length / ITEMS_PER_PAGE);
const paginatedData = useMemo(() => {
const start = (page - 1) * ITEMS_PER_PAGE;
const end = start + ITEMS_PER_PAGE;
return filteredData.slice(start, end);
}, [filteredData, page]);
const queryParams = useMemo(() => {
const params = new URLSearchParams();
params.set('page', page.toString());
params.set('limit', limit.toString());
params.set('sort', sortColumn);
params.set('order', sortDirection);

const handleSort = (column: keyof Vendor) => {
setSortDirection(prev => {
if (sortColumn !== column) return "asc";
return prev === "asc" ? "desc" : "asc";
});
setSortColumn(column);
};

const getPerformanceBadge = (fillRate: number) => {
if (fillRate >= 95) return <Badge variant="default">Excellent</Badge>;
if (fillRate >= 85) return <Badge variant="secondary">Good</Badge>;
if (fillRate >= 75) return <Badge variant="outline">Fair</Badge>;
return <Badge variant="destructive">Poor</Badge>;
};

return (
<motion.div
layout
transition={{
layout: {
duration: 0.15,
ease: [0.4, 0, 0.2, 1]
if (filters.search) {
params.set('vendorName_ilike', filters.search); // Filter by name
}
if (filters.status !== 'all') {
params.set('status', filters.status); // Filter by status
}
if (!filters.showInactive) {
params.set('activeProductCount_gt', '0'); // Only show vendors with active products
}
}}
className="container mx-auto py-6 space-y-4"
>
<motion.div
layout="position"
transition={{ duration: 0.15 }}
className="flex items-center justify-between"
>
<h1 className="text-3xl font-bold tracking-tight">Vendors</h1>
<div className="text-sm text-muted-foreground">
{filteredData.length.toLocaleString()} vendors
</div>
</motion.div>

<motion.div
layout="preserve-aspect"
transition={{ duration: 0.15 }}
className="grid gap-4 md:grid-cols-4"
>
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Total Vendors</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">{data?.stats?.totalVendors ?? "..."}</div>
<p className="text-xs text-muted-foreground">
{data?.stats?.activeVendors ?? "..."} active
</p>
</CardContent>
</Card>
return params;
}, [page, limit, sortColumn, sortDirection, filters]);

<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Total Spend</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">
${typeof data?.stats?.totalSpend === 'number' ? data.stats.totalSpend.toLocaleString(undefined, { minimumFractionDigits: 0, maximumFractionDigits: 0 }) : "..."}
</div>
<p className="text-xs text-muted-foreground">
Avg unit cost: ${typeof data?.stats?.avgUnitCost === 'number' ? data.stats.avgUnitCost.toFixed(2) : "..."}
</p>
</CardContent>
</Card>
const { data: listData, isLoading: isLoadingList, error: listError } = useQuery<VendorResponse, Error>({
queryKey: ['vendors', queryParams.toString()],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/vendors-aggregate?${queryParams.toString()}`, {
credentials: 'include'
});
if (!response.ok) throw new Error(`Network response was not ok (${response.status})`);
return response.json();
},
placeholderData: (prev) => prev, // Modern replacement for keepPreviousData
});

<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Performance</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">{typeof data?.stats?.avgFillRate === 'number' ? data.stats.avgFillRate.toFixed(1) : "..."}%</div>
<p className="text-xs text-muted-foreground">
Fill rate / {typeof data?.stats?.avgOnTimeDelivery === 'number' ? data.stats.avgOnTimeDelivery.toFixed(1) : "..."}% on-time
</p>
</CardContent>
</Card>
const { data: statsData, isLoading: isLoadingStats } = useQuery<VendorStats, Error>({
queryKey: ['vendorsStats'],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/vendors-aggregate/stats`, {
credentials: 'include'
});
if (!response.ok) throw new Error("Failed to fetch vendor stats");
return response.json();
},
});

<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Lead Time</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">{typeof data?.stats?.avgLeadTime === 'number' ? data.stats.avgLeadTime.toFixed(1) : "..."} days</div>
<p className="text-xs text-muted-foreground">
Average delivery time
</p>
</CardContent>
</Card>
</motion.div>
// Fetch filter options
const { data: filterOptions, isLoading: isLoadingFilterOptions } = useQuery<VendorFilterOptions, Error>({
queryKey: ['vendorsFilterOptions'],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/vendors-aggregate/filter-options`, {
credentials: 'include'
});
if (!response.ok) throw new Error("Failed to fetch filter options");
return response.json();
},
});

<div className="flex items-center justify-between">
<div className="flex flex-1 items-center space-x-2">
<Input
placeholder="Search vendors..."
value={filters.search}
onChange={(e) => setFilters(prev => ({ ...prev, search: e.target.value }))}
className="h-8 w-[150px] lg:w-[250px]"
/>
<Select
value={filters.status}
onValueChange={(value) => setFilters(prev => ({ ...prev, status: value }))}
>
<SelectTrigger className="h-8 w-[150px]">
<SelectValue placeholder="Status" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Status</SelectItem>
<SelectItem value="active">Active</SelectItem>
<SelectItem value="inactive">Inactive</SelectItem>
<SelectItem value="pending">Pending</SelectItem>
</SelectContent>
</Select>
<Select
value={filters.performance}
onValueChange={(value) => setFilters(prev => ({ ...prev, performance: value }))}
>
<SelectTrigger className="h-8 w-[150px]">
<SelectValue placeholder="Performance" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Performance</SelectItem>
<SelectItem value="excellent">Excellent</SelectItem>
<SelectItem value="good">Good</SelectItem>
<SelectItem value="fair">Fair</SelectItem>
<SelectItem value="poor">Poor</SelectItem>
</SelectContent>
</Select>
</div>
</div>
// --- Event Handlers ---

<div className="rounded-md border">
<Table>
<TableHeader>
<TableRow>
<TableHead onClick={() => handleSort("name")} className="cursor-pointer">Vendor</TableHead>
<TableHead onClick={() => handleSort("status")} className="cursor-pointer">Status</TableHead>
<TableHead onClick={() => handleSort("avg_lead_time_days")} className="cursor-pointer">Lead Time</TableHead>
<TableHead onClick={() => handleSort("on_time_delivery_rate")} className="cursor-pointer">On-Time %</TableHead>
<TableHead onClick={() => handleSort("order_fill_rate")} className="cursor-pointer">Fill Rate</TableHead>
<TableHead onClick={() => handleSort("avg_unit_cost")} className="cursor-pointer">Avg Unit Cost</TableHead>
<TableHead onClick={() => handleSort("total_spend")} className="cursor-pointer">Total Spend</TableHead>
<TableHead onClick={() => handleSort("total_orders")} className="cursor-pointer">Orders</TableHead>
<TableHead onClick={() => handleSort("active_products")} className="cursor-pointer">Products</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{isLoading ? (
<TableRow>
<TableCell colSpan={9} className="text-center py-8">
Loading vendors...
</TableCell>
</TableRow>
) : paginatedData.map((vendor: Vendor) => (
<TableRow key={vendor.vendor_id}>
<TableCell className="font-medium">{vendor.name}</TableCell>
<TableCell>{vendor.status}</TableCell>
<TableCell>{typeof vendor.avg_lead_time_days === 'number' ? vendor.avg_lead_time_days.toFixed(1) : "0.0"} days</TableCell>
<TableCell>{typeof vendor.on_time_delivery_rate === 'number' ? vendor.on_time_delivery_rate.toFixed(1) : "0.0"}%</TableCell>
<TableCell>
<div className="flex items-center gap-2" style={{ minWidth: '120px' }}>
<div style={{ width: '50px', textAlign: 'right' }}>
{typeof vendor.order_fill_rate === 'number' ? vendor.order_fill_rate.toFixed(1) : "0.0"}%
</div>
{getPerformanceBadge(vendor.order_fill_rate ?? 0)}
</div>
</TableCell>
<TableCell>${typeof vendor.avg_unit_cost === 'number' ? vendor.avg_unit_cost.toFixed(2) : "0.00"}</TableCell>
<TableCell>${typeof vendor.total_spend === 'number' ? vendor.total_spend.toLocaleString(undefined, { minimumFractionDigits: 0, maximumFractionDigits: 0 }) : "0"}</TableCell>
<TableCell>{vendor.total_orders?.toLocaleString() ?? 0}</TableCell>
<TableCell>{vendor.active_products?.toLocaleString() ?? 0}</TableCell>
</TableRow>
))}
{!isLoading && !paginatedData.length && (
<TableRow>
<TableCell colSpan={9} className="text-center py-8 text-muted-foreground">
No vendors found
</TableCell>
</TableRow>
)}
</TableBody>
</Table>
</div>
const handleSort = useCallback((column: VendorSortableColumns) => {
setSortDirection(prev => (sortColumn === column && prev === "asc" ? "desc" : "asc"));
setSortColumn(column);
setPage(1);
}, [sortColumn]);

{totalPages > 1 && (
<motion.div
layout="position"
transition={{ duration: 0.15 }}
className="flex justify-center"
const handleFilterChange = useCallback((filterName: keyof VendorFilters, value: string | boolean) => {
setFilters(prev => ({ ...prev, [filterName]: value }));
setPage(1);
}, []);

const handlePageChange = (newPage: number) => {
if (newPage >= 1 && newPage <= (listData?.pagination.pages ?? 1)) {
setPage(newPage);
}
};

// --- Derived Data ---
const vendors = listData?.vendors ?? [];
const pagination = listData?.pagination;
const totalPages = pagination?.pages ?? 0;

// --- Rendering ---

return (
<motion.div
layout
transition={{ layout: { duration: 0.15, ease: [0.4, 0, 0.2, 1] } }}
className="container mx-auto py-6 space-y-4"
>
<Pagination>
<PaginationContent>
<PaginationItem>
<PaginationPrevious
href="#"
onClick={(e) => {
e.preventDefault();
if (page > 1) setPage(p => p - 1);
}}
aria-disabled={page === 1}
{/* Header */}
<motion.div layout="position" transition={{ duration: 0.15 }} className="flex items-center justify-between">
<h1 className="text-3xl font-bold tracking-tight">Vendors</h1>
<div className="text-sm text-muted-foreground">
{isLoadingList && !pagination ? 'Loading...' : `${formatNumber(pagination?.total)} vendors`}
</div>
</motion.div>

{/* Stats Cards */}
<motion.div layout="preserve-aspect" transition={{ duration: 0.15 }} className="grid gap-4 md:grid-cols-4">
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Total Vendors</CardTitle>
</CardHeader>
<CardContent>
{isLoadingStats ? <Skeleton className="h-8 w-24" /> : <div className="text-2xl font-bold">{formatNumber(statsData?.totalVendors)}</div>}
<p className="text-xs text-muted-foreground">
{isLoadingStats ? <Skeleton className="h-4 w-28" /> :
`${formatNumber(statsData?.activeVendors)} active`}
</p>
</CardContent>
</Card>
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Total Stock Value</CardTitle>
</CardHeader>
<CardContent>
{isLoadingStats ? <Skeleton className="h-8 w-28" /> : <div className="text-2xl font-bold">{formatCurrency(statsData?.totalValue)}</div>}
<p className="text-xs text-muted-foreground">
Current cost value
</p>
</CardContent>
</Card>
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Value On Order</CardTitle>
</CardHeader>
<CardContent>
{isLoadingStats ? <Skeleton className="h-8 w-28" /> : <div className="text-2xl font-bold">{formatCurrency(statsData?.totalOnOrderValue)}</div>}
<p className="text-xs text-muted-foreground">
Total cost on open POs
</p>
</CardContent>
</Card>
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Avg Lead Time</CardTitle>
</CardHeader>
<CardContent>
{isLoadingStats ? <Skeleton className="h-8 w-20" /> : <div className="text-2xl font-bold">{formatDays(statsData?.avgLeadTime)}</div>}
<p className="text-xs text-muted-foreground">
Average across vendors
</p>
</CardContent>
</Card>
</motion.div>

{/* Filter Controls */}
<div className="flex flex-wrap items-center space-y-2 sm:space-y-0 sm:space-x-2">
<Input
placeholder="Search vendors..."
value={filters.search}
onChange={(e) => handleFilterChange('search', e.target.value)}
className="w-full sm:w-[250px]"
/>
</PaginationItem>
{Array.from({ length: totalPages }, (_, i) => (
<PaginationItem key={i + 1}>
<PaginationLink
href="#"
onClick={(e) => {
e.preventDefault();
setPage(i + 1);
}}
isActive={page === i + 1}
>
{i + 1}
</PaginationLink>
</PaginationItem>
))}
<PaginationItem>
<PaginationNext
href="#"
onClick={(e) => {
e.preventDefault();
if (page < totalPages) setPage(p => p + 1);
}}
aria-disabled={page >= totalPages}
/>
</PaginationItem>
</PaginationContent>
</Pagination>
<Select
value={filters.status}
onValueChange={(value) => handleFilterChange('status', value)}
>
<SelectTrigger className="w-full sm:w-[180px]">
<SelectValue placeholder="Status" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Statuses</SelectItem>
{filterOptions?.statuses?.map((status) => (
<SelectItem key={status} value={status}>
{status.charAt(0).toUpperCase() + status.slice(1)}
</SelectItem>
))}
</SelectContent>
</Select>
<div className="flex items-center space-x-2 ml-auto">
<Switch
id="show-inactive-vendors"
checked={filters.showInactive}
onCheckedChange={(checked) => handleFilterChange('showInactive', checked)}
/>
<Label htmlFor="show-inactive-vendors">Show vendors with no active products</Label>
</div>
</div>

{/* Data Table */}
<div className="rounded-md border">
<Table>
<TableHeader>
<TableRow>
<TableHead onClick={() => handleSort("vendorName")} className="cursor-pointer">Vendor</TableHead>
<TableHead onClick={() => handleSort("activeProductCount")} className="cursor-pointer text-right">Active Prod.</TableHead>
<TableHead onClick={() => handleSort("currentStockCost")} className="cursor-pointer text-right">Stock Value</TableHead>
<TableHead onClick={() => handleSort("onOrderUnits")} className="cursor-pointer text-right">On Order (Units)</TableHead>
<TableHead onClick={() => handleSort("onOrderCost")} className="cursor-pointer text-right">On Order (Cost)</TableHead>
<TableHead onClick={() => handleSort("avgLeadTimeDays")} className="cursor-pointer text-right">Avg Lead Time</TableHead>
<TableHead onClick={() => handleSort("revenue_30d")} className="cursor-pointer text-right">Revenue (30d)</TableHead>
<TableHead onClick={() => handleSort("profit_30d")} className="cursor-pointer text-right">Profit (30d)</TableHead>
<TableHead onClick={() => handleSort("avg_margin_30d")} className="cursor-pointer text-right">Margin (30d)</TableHead>
<TableHead onClick={() => handleSort("po_count_365d")} className="cursor-pointer text-right">POs (365d)</TableHead>
<TableHead onClick={() => handleSort("status")} className="cursor-pointer text-right">Status</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{isLoadingList && !listData ? (
Array.from({ length: 5 }).map((_, i) => ( // Skeleton rows
<TableRow key={`skel-${i}`}>
<TableCell><Skeleton className="h-5 w-40" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-20 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
<TableCell className="text-right"><Skeleton className="h-5 w-16 ml-auto" /></TableCell>
</TableRow>
))
) : listError ? (
<TableRow>
<TableCell colSpan={11} className="text-center py-8 text-destructive">
Error loading vendors: {listError.message}
</TableCell>
</TableRow>
) : vendors.length === 0 ? (
<TableRow>
<TableCell colSpan={11} className="text-center py-8 text-muted-foreground">
No vendors found matching your criteria.
</TableCell>
</TableRow>
) : (
vendors.map((vendor: VendorMetric) => (
<TableRow key={vendor.vendor_id} className={vendor.active_product_count === 0 ? "opacity-60" : ""}>
<TableCell className="font-medium">{vendor.vendor_name}</TableCell>
<TableCell className="text-right">{formatNumber(vendor.active_product_count || vendor.activeProductCount)}</TableCell>
|
||||
<TableCell className="text-right">{formatCurrency(vendor.current_stock_cost as number)}</TableCell>
|
||||
<TableCell className="text-right">{formatNumber(vendor.on_order_units || vendor.onOrderUnits)}</TableCell>
|
||||
<TableCell className="text-right">{formatCurrency(vendor.on_order_cost as number)}</TableCell>
|
||||
<TableCell className="text-right">{formatDays(vendor.avg_lead_time_days || vendor.avgLeadTimeDays)}</TableCell>
|
||||
<TableCell className="text-right">{formatCurrency(vendor.revenue_30d as number)}</TableCell>
|
||||
<TableCell className="text-right">{formatCurrency(vendor.profit_30d as number)}</TableCell>
|
||||
<TableCell className="text-right">{formatPercentage(vendor.avg_margin_30d as number)}</TableCell>
|
||||
<TableCell className="text-right">{formatNumber(vendor.po_count_365d || vendor.poCount_365d)}</TableCell>
|
||||
<TableCell className="text-right">
|
||||
<Badge variant={getStatusVariant(vendor.status)}>
|
||||
{vendor.status || 'Unknown'}
|
||||
</Badge>
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
))
|
||||
)}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
|
||||
{/* Pagination Controls */}
|
||||
{totalPages > 1 && pagination && (
|
||||
<div className="flex justify-center">
|
||||
<Pagination>
|
||||
<PaginationContent>
|
||||
<PaginationItem>
|
||||
<PaginationPrevious
|
||||
href="#"
|
||||
onClick={(e) => { e.preventDefault(); handlePageChange(pagination.currentPage - 1); }}
|
||||
aria-disabled={pagination.currentPage === 1}
|
||||
className={pagination.currentPage === 1 ? "pointer-events-none opacity-50" : ""}
|
||||
/>
|
||||
</PaginationItem>
|
||||
{[...Array(totalPages)].map((_, i) => (
|
||||
<PaginationItem key={i + 1}>
|
||||
<PaginationLink
|
||||
href="#"
|
||||
onClick={(e) => { e.preventDefault(); handlePageChange(i + 1); }}
|
||||
isActive={pagination.currentPage === i + 1}
|
||||
>
|
||||
{i + 1}
|
||||
</PaginationLink>
|
||||
</PaginationItem>
|
||||
))}
|
||||
<PaginationItem>
|
||||
<PaginationNext
|
||||
href="#"
|
||||
onClick={(e) => { e.preventDefault(); handlePageChange(pagination.currentPage + 1); }}
|
||||
aria-disabled={pagination.currentPage >= totalPages}
|
||||
className={pagination.currentPage >= totalPages ? "pointer-events-none opacity-50" : ""}
|
||||
/>
|
||||
</PaginationItem>
|
||||
</PaginationContent>
|
||||
</Pagination>
|
||||
</div>
|
||||
)}
|
||||
</motion.div>
|
||||
)}
|
||||
</motion.div>
|
||||
);
}

export default Vendors;

@@ -78,3 +78,359 @@ export interface Product {
  reorder_qty?: number;
  overstocked_amt?: string; // numeric(15,3)
}

// Type for product status (used for calculated statuses)
export type ProductStatus = "Critical" | "Reorder Soon" | "Healthy" | "Overstock" | "At Risk" | "New" | "Unknown";

// Represents data returned by the /metrics endpoint (from product_metrics table)
export interface ProductMetric {
  pid: number;
  sku: string;
  title: string;
  brand: string | null;
  vendor: string | null;
  imageUrl: string | null;
  isVisible: boolean;
  isReplenishable: boolean;

  // Additional Product Fields
  barcode: string | null;
  vendorReference: string | null; // Supplier #
  notionsReference: string | null; // Notions #
  preorderCount: number | null;
  notionsInvCount: number | null;
  harmonizedTariffCode: string | null;
  line: string | null;
  subline: string | null;
  artist: string | null;
  moq: number | null;
  rating: number | null;
  reviews: number | null;
  weight: number | null;
  dimensions: {
    length: number | null;
    width: number | null;
    height: number | null;
  } | null;
  countryOfOrigin: string | null;
  location: string | null;
  baskets: number | null; // Number of times added to basket
  notifies: number | null; // Number of stock notifications

  // Current Status
  currentPrice: number | null;
  currentRegularPrice: number | null;
  currentCostPrice: number | null;
  currentLandingCostPrice: number | null;
  currentStock: number;
  currentStockCost: number | null;
  currentStockRetail: number | null;
  currentStockGross: number | null;
  onOrderQty: number | null;
  onOrderCost: number | null;
  onOrderRetail: number | null;
  earliestExpectedDate: string | null; // Date as string

  // Historical Dates
  dateCreated: string | null;
  dateFirstReceived: string | null;
  dateLastReceived: string | null;
  dateFirstSold: string | null;
  dateLastSold: string | null;
  ageDays: number | null;

  // Rolling Period Metrics
  sales7d: number | null;
  revenue7d: number | null;
  sales14d: number | null;
  revenue14d: number | null;
  sales30d: number | null;
  revenue30d: number | null;
  cogs30d: number | null;
  profit30d: number | null;
  returnsUnits30d: number | null;
  returnsRevenue30d: number | null;
  discounts30d: number | null;
  grossRevenue30d: number | null;
  grossRegularRevenue30d: number | null;
  stockoutDays30d: number | null;
  sales365d: number | null;
  revenue365d: number | null;
  avgStockUnits30d: number | null;
  avgStockCost30d: number | null;
  avgStockRetail30d: number | null;
  avgStockGross30d: number | null;
  receivedQty30d: number | null;
  receivedCost30d: number | null;

  // Calculated KPIs
  asp30d: number | null;
  acp30d: number | null;
  avgRos30d: number | null;
  avgSalesPerDay30d: number | null;
  avgSalesPerMonth30d: number | null;
  margin30d: number | null;
  markup30d: number | null;
  gmroi30d: number | null;
  stockturn30d: number | null;
  returnRate30d: number | null;
  discountRate30d: number | null;
  stockoutRate30d: number | null;
  markdown30d: number | null;
  markdownRate30d: number | null;
  sellThrough30d: number | null;
  avgLeadTimeDays: number | null;

  // Forecasting & Replenishment
  abcClass: string | null;
  salesVelocityDaily: number | null;
  configLeadTime: number | null;
  configDaysOfStock: number | null;
  configSafetyStock: number | null;
  planningPeriodDays: number | null;
  leadTimeForecastUnits: number | null;
  daysOfStockForecastUnits: number | null;
  planningPeriodForecastUnits: number | null;
  leadTimeClosingStock: number | null;
  daysOfStockClosingStock: number | null;
  replenishmentNeededRaw: number | null;
  replenishmentUnits: number | null;
  replenishmentCost: number | null;
  replenishmentRetail: number | null;
  replenishmentProfit: number | null;
  toOrderUnits: number | null;
  forecastLostSalesUnits: number | null;
  forecastLostRevenue: number | null;
  stockCoverInDays: number | null;
  poCoverInDays: number | null;
  sellsOutInDays: number | null;
  replenishDate: string | null;
  overstockedUnits: number | null;
  overstockedCost: number | null;
  overstockedRetail: number | null;
  isOldStock: boolean | null;

  // Yesterday
  yesterdaySales: number | null;

  // Calculated status (added by frontend)
  status?: ProductStatus;
}

// Type for filter options returned by /metrics/filter-options
export interface ProductFilterOptions {
  vendors: string[];
  brands: string[];
  abcClasses: string[];
}

// Type for keys used in sorting/filtering (matching frontend state/UI)
export type ProductMetricColumnKey =
  | 'pid' | 'title' | 'sku' | 'barcode' | 'brand' | 'line' | 'subline' | 'artist'
  | 'vendor' | 'vendorReference' | 'notionsReference' | 'harmonizedTariffCode'
  | 'countryOfOrigin' | 'location' | 'moq' | 'weight' | 'dimensions' | 'rating'
  | 'reviews' | 'baskets' | 'notifies' | 'preorderCount' | 'notionsInvCount'
  | 'isVisible' | 'isReplenishable' | 'abcClass' | 'status' | 'dateCreated'
  | 'currentStock' | 'currentStockCost' | 'currentStockRetail' | 'currentStockGross'
  | 'ageDays' | 'replenishDate' | 'planningPeriodDays' | 'currentPrice'
  | 'currentRegularPrice' | 'currentCostPrice' | 'currentLandingCostPrice'
  | 'configSafetyStock' | 'replenishmentUnits' | 'stockCoverInDays' | 'sellsOutInDays'
  | 'onOrderQty' | 'earliestExpectedDate' | 'isOldStock' | 'overstockedUnits'
  | 'stockoutDays30d' | 'stockoutRate30d' | 'avgStockUnits30d' | 'avgStockCost30d'
  | 'avgStockRetail30d' | 'avgStockGross30d' | 'receivedQty30d' | 'receivedCost30d'
  | 'configLeadTime' | 'configDaysOfStock' | 'poCoverInDays' | 'toOrderUnits'
  | 'costPrice' | 'valueAtCost' | 'profit' | 'margin' | 'targetPrice'
  | 'replenishmentCost' | 'replenishmentRetail' | 'replenishmentProfit'
  | 'onOrderCost' | 'onOrderRetail' | 'overstockedCost' | 'overstockedRetail'
  | 'sales7d' | 'revenue7d' | 'sales14d' | 'revenue14d' | 'sales30d' | 'units30d'
  | 'revenue30d' | 'sales365d' | 'revenue365d' | 'avgSalePrice30d'
  | 'avgDailySales30d' | 'avgDailyRevenue30d' | 'stockturnRate30d' | 'margin30d'
  | 'cogs30d' | 'profit30d' | 'roas30d' | 'adSpend30d' | 'gmroi30d'
  | 'first7DaysSales' | 'first7DaysRevenue' | 'first30DaysSales' | 'first30DaysRevenue'
  | 'first60DaysSales' | 'first60DaysRevenue' | 'first90DaysSales' | 'first90DaysRevenue'
  | 'lifetimeSales' | 'lifetimeRevenue' | 'lifetimeAvgPrice'
  | 'forecastSalesUnits' | 'forecastSalesValue' | 'forecastStockCover'
  | 'forecastedOutOfStockDate' | 'salesVelocity' | 'salesVelocityDaily'
  | 'dateLastSold' | 'yesterdaySales' | 'avgSalesPerDay30d' | 'avgSalesPerMonth30d'
  | 'returnsUnits30d' | 'returnsRevenue30d' | 'discounts30d' | 'grossRevenue30d'
  | 'grossRegularRevenue30d' | 'asp30d' | 'acp30d' | 'avgRos30d' | 'markup30d'
  | 'stockturn30d' | 'sellThrough30d' | 'returnRate30d' | 'discountRate30d'
  | 'markdown30d' | 'markdownRate30d' | 'leadTimeForecastUnits'
  | 'daysOfStockForecastUnits' | 'planningPeriodForecastUnits'
  | 'leadTimeClosingStock' | 'daysOfStockClosingStock' | 'replenishmentNeededRaw'
  | 'forecastLostSalesUnits' | 'forecastLostRevenue' | 'avgLeadTimeDays'
  | 'dateLastReceived' | 'dateFirstReceived' | 'dateFirstSold' | 'imageUrl';

// Mapping frontend keys to backend query param keys
export const FRONTEND_TO_BACKEND_KEY_MAP: Record<string, string> = {
  pid: 'pid', sku: 'sku', title: 'title', brand: 'brand', vendor: 'vendor',
  imageUrl: 'imageUrl', isVisible: 'isVisible', isReplenishable: 'isReplenishable',
  currentPrice: 'currentPrice', currentRegularPrice: 'currentRegularPrice',
  currentCostPrice: 'currentCostPrice', currentLandingCostPrice: 'currentLandingCostPrice',
  currentStock: 'currentStock', currentStockCost: 'currentStockCost',
  currentStockRetail: 'currentStockRetail', currentStockGross: 'currentStockGross',
  onOrderQty: 'onOrderQty', onOrderCost: 'onOrderCost', onOrderRetail: 'onOrderRetail',
  earliestExpectedDate: 'earliestExpectedDate', dateCreated: 'dateCreated',
  dateFirstReceived: 'dateFirstReceived', dateLastReceived: 'dateLastReceived',
  dateFirstSold: 'dateFirstSold', dateLastSold: 'dateLastSold', ageDays: 'ageDays',
  sales7d: 'sales7d', revenue7d: 'revenue7d', sales14d: 'sales14d', revenue14d: 'revenue14d',
  sales30d: 'sales30d', revenue30d: 'revenue30d', cogs30d: 'cogs30d', profit30d: 'profit30d',
  stockoutDays30d: 'stockoutDays30d', sales365d: 'sales365d', revenue365d: 'revenue365d',
  avgStockUnits30d: 'avgStockUnits30d', avgStockCost30d: 'avgStockCost30d',
  receivedQty30d: 'receivedQty30d', receivedCost30d: 'receivedCost30d',
  asp30d: 'asp30d', acp30d: 'acp30d', margin30d: 'margin30d', gmroi30d: 'gmroi30d',
  stockturn30d: 'stockturn30d', sellThrough30d: 'sellThrough30d',
  avgLeadTimeDays: 'avgLeadTimeDays', abcClass: 'abcClass',
  salesVelocityDaily: 'salesVelocityDaily', configLeadTime: 'configLeadTime',
  configDaysOfStock: 'configDaysOfStock', stockCoverInDays: 'stockCoverInDays',
  sellsOutInDays: 'sellsOutInDays', replenishDate: 'replenishDate',
  overstockedUnits: 'overstockedUnits', overstockedCost: 'overstockedCost',
  isOldStock: 'isOldStock', yesterdaySales: 'yesterdaySales',
  status: 'status' // Frontend-only field
};

// Function to get backend key safely
export function getBackendKey(frontendKey: string): string | null {
  return FRONTEND_TO_BACKEND_KEY_MAP[frontendKey] || null;
}
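Since the key map above is consumed when building list queries, here is a minimal standalone sketch of that lookup in action. The helper name `buildSortParam` and the `sortBy`/`sortDir` query-parameter names are illustrative assumptions, not the app's actual API, and `KEY_MAP` is a two-entry stand-in for `FRONTEND_TO_BACKEND_KEY_MAP`.

```typescript
// Hypothetical stand-in for FRONTEND_TO_BACKEND_KEY_MAP, trimmed to two entries.
const KEY_MAP: Record<string, string> = { currentStock: 'currentStock', abcClass: 'abcClass' };

// Resolve a frontend column key to a backend sort parameter, or null if unmapped.
function buildSortParam(frontendKey: string, direction: 'asc' | 'desc'): string | null {
  const backendKey = KEY_MAP[frontendKey] ?? null;
  if (backendKey === null) return null; // unknown keys are skipped rather than sent through
  return `sortBy=${backendKey}&sortDir=${direction}`;
}

console.log(buildSortParam('currentStock', 'desc')); // sortBy=currentStock&sortDir=desc
console.log(buildSortParam('notARealKey', 'asc'));   // null
```

Returning `null` for unmapped keys (mirroring `getBackendKey`) lets the caller silently drop sort requests for frontend-only columns such as `status`.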
136 inventory/src/utils/productUtils.ts (new file)
@@ -0,0 +1,136 @@
import { ProductMetric, ProductStatus } from "@/types/products";

// Calculates the product status based on various metrics
export function getProductStatus(product: ProductMetric): ProductStatus {
  if (!product.isReplenishable) {
    return "Healthy"; // Non-replenishable items default to Healthy
  }

  const {
    currentStock,
    stockCoverInDays,
    sellsOutInDays,
    overstockedUnits,
    configLeadTime,
    avgLeadTimeDays,
    dateLastSold,
    ageDays,
    isOldStock
  } = product;

  const leadTime = configLeadTime ?? avgLeadTimeDays ?? 30; // Default lead time if none configured
  const safetyThresholdDays = leadTime * 0.5; // Safety threshold is 50% of lead time

  // Check for overstock first
  if (overstockedUnits != null && overstockedUnits > 0) {
    return "Overstock";
  }

  // Check for critical stock
  if (stockCoverInDays != null) {
    // Stock is <= 0 or very low compared to lead time
    if (currentStock <= 0 || stockCoverInDays <= 0) {
      return "Critical";
    }
    if (stockCoverInDays < safetyThresholdDays) {
      return "Critical";
    }
  }

  // Check for products that will need reordering soon
  if (sellsOutInDays != null && sellsOutInDays < (leadTime + 7)) { // Within lead time + 1 week
    // If also critically low, keep Critical status
    if (stockCoverInDays != null && stockCoverInDays < safetyThresholdDays) {
      return "Critical";
    }
    return "Reorder Soon";
  }

  // Check for 'At Risk' - e.g., old stock or hasn't sold in a long time
  const ninetyDaysAgo = new Date();
  ninetyDaysAgo.setDate(ninetyDaysAgo.getDate() - 90);

  if (isOldStock) {
    return "At Risk";
  }

  if (dateLastSold && new Date(dateLastSold) < ninetyDaysAgo && (ageDays ?? 0) > 180) {
    return "At Risk";
  }

  // Very high stock cover (more than a year) is at risk too
  if (stockCoverInDays != null && stockCoverInDays > 365) {
    return "At Risk";
  }

  // If none of the above, assume Healthy
  return "Healthy";
}
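The precedence encoded in `getProductStatus` can be condensed into a standalone sketch. The types below are pared down to just the fields the checks use, `leadTime` is assumed to be pre-resolved (`configLeadTime ?? avgLeadTimeDays ?? 30`), and the names `StatusInputs`/`statusOf` are illustrative, not part of the app.

```typescript
type Status = 'Critical' | 'Reorder Soon' | 'Overstock' | 'At Risk' | 'Healthy';

// Only the fields the status checks actually read (illustrative subset of ProductMetric).
interface StatusInputs {
  currentStock: number;
  stockCoverInDays: number | null;
  sellsOutInDays: number | null;
  overstockedUnits: number | null;
  leadTime: number; // assumed pre-resolved: configLeadTime ?? avgLeadTimeDays ?? 30
  isOldStock: boolean;
}

// Precedence mirrors getProductStatus: Overstock > Critical > Reorder Soon > At Risk > Healthy.
function statusOf(m: StatusInputs): Status {
  const safetyThresholdDays = m.leadTime * 0.5;
  if ((m.overstockedUnits ?? 0) > 0) return 'Overstock';
  if (m.stockCoverInDays != null &&
      (m.currentStock <= 0 || m.stockCoverInDays <= 0 || m.stockCoverInDays < safetyThresholdDays)) {
    return 'Critical';
  }
  if (m.sellsOutInDays != null && m.sellsOutInDays < m.leadTime + 7) return 'Reorder Soon';
  if (m.isOldStock || (m.stockCoverInDays != null && m.stockCoverInDays > 365)) return 'At Risk';
  return 'Healthy';
}

const base: StatusInputs = {
  currentStock: 50, stockCoverInDays: 100, sellsOutInDays: 200,
  overstockedUnits: 0, leadTime: 30, isOldStock: false,
};
console.log(statusOf(base));                              // Healthy
console.log(statusOf({ ...base, overstockedUnits: 25 })); // Overstock
console.log(statusOf({ ...base, stockCoverInDays: 10 })); // Critical (below 15-day safety threshold)
console.log(statusOf({ ...base, sellsOutInDays: 20 }));   // Reorder Soon
```

Note that because the Critical branch runs before the Reorder Soon branch, the nested "keep Critical" re-check inside the reorder branch of the original function can never fire; the sketch omits it.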

// Returns a Badge component HTML string for a given product status
export function getStatusBadge(status: ProductStatus): string {
  switch (status) {
    case 'Critical':
      return '<div class="inline-flex items-center rounded-full border px-2.5 py-0.5 text-xs font-semibold transition-colors focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2 border-transparent bg-red-600 text-white">Critical</div>';
    case 'Reorder Soon':
      return '<div class="inline-flex items-center rounded-full border px-2.5 py-0.5 text-xs font-semibold transition-colors focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2 border-secondary bg-yellow-500 text-black">Reorder Soon</div>';
    case 'Healthy':
      return '<div class="inline-flex items-center rounded-full border px-2.5 py-0.5 text-xs font-semibold transition-colors focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2 border-transparent bg-green-600 text-white">Healthy</div>';
    case 'Overstock':
      return '<div class="inline-flex items-center rounded-full border px-2.5 py-0.5 text-xs font-semibold transition-colors focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2 border-secondary bg-blue-600 text-white">Overstock</div>';
    case 'At Risk':
      return '<div class="inline-flex items-center rounded-full border px-2.5 py-0.5 text-xs font-semibold transition-colors focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2 border-orange-500 text-orange-600">At Risk</div>';
    case 'New':
      return '<div class="inline-flex items-center rounded-full border px-2.5 py-0.5 text-xs font-semibold transition-colors focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2 border-transparent bg-purple-600 text-white">New</div>';
    default:
      return '<div class="inline-flex items-center rounded-full border px-2.5 py-0.5 text-xs font-semibold transition-colors focus:outline-none focus:ring-2 focus:ring-ring focus:ring-offset-2">Unknown</div>';
  }
}

// Formatting utilities for displaying metrics
export const formatCurrency = (value: number | null | undefined, digits = 2): string => {
  if (value == null) return 'N/A';
  return new Intl.NumberFormat('en-US', {
    style: 'currency',
    currency: 'USD',
    minimumFractionDigits: digits,
    maximumFractionDigits: digits
  }).format(value);
};

export const formatNumber = (value: number | null | undefined, digits = 0): string => {
  if (value == null) return 'N/A';
  return value.toLocaleString(undefined, {
    minimumFractionDigits: digits,
    maximumFractionDigits: digits
  });
};

export const formatPercentage = (value: number | null | undefined, digits = 1): string => {
  if (value == null) return 'N/A';
  return `${value.toFixed(digits)}%`;
};

export const formatDays = (value: number | null | undefined, digits = 0): string => {
  if (value == null) return 'N/A';
  return `${value.toFixed(digits)} days`;
};

export const formatDate = (dateString: string | null | undefined): string => {
  if (!dateString) return 'N/A';
  try {
    return new Date(dateString).toLocaleDateString('en-US', {
      year: 'numeric',
      month: 'short',
      day: 'numeric'
    });
  } catch (e) {
    return 'Invalid Date';
  }
};

export const formatBoolean = (value: boolean | null | undefined): string => {
  if (value == null) return 'N/A';
  return value ? 'Yes' : 'No';
};
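A quick standalone sketch exercising two of the formatters above (copied verbatim from `productUtils.ts`), mainly to show the shared null-safe `'N/A'` behavior:

```typescript
// Standalone copies of formatPercentage and formatDays from productUtils.ts.
const formatPercentage = (value: number | null | undefined, digits = 1): string =>
  value == null ? 'N/A' : `${value.toFixed(digits)}%`;

const formatDays = (value: number | null | undefined, digits = 0): string =>
  value == null ? 'N/A' : `${value.toFixed(digits)} days`;

console.log(formatPercentage(42.567)); // 42.6%
console.log(formatDays(12.4));         // 12 days
console.log(formatDays(null));         // N/A
```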