Compare commits: 44d9ae2aad...Improve-ca (65 commits)

Commits: 169407a729, 302172c537, 4fdaab9e87, 4dcc1f9e90, 67d57c8872, d7bf79dec9, d90e9b51dc, 98e2e4073a, 23c2085f1c, 2a6a0d0a87, ebffb8f912, 5676e9094d, b926aba9ff, e62c6ac8ee, 18f4970059, 12cab7473a, 06b0f1251e, 8a43da502a, bd5bcdd548, 0a51328da2, b2d7744cc5, 8124fc9add, 9e1989ac66, 5bfd6f6d04, 1003ff3cf2, 2d0089dc52, 50b86d6d8a, 07f14c0017, e77b488cd4, d57239c40c, 1c932e0df5, a867117c3c, 996d3d36af, d0abe9d9a2, 5e4d1c3bd8, 1be97d6610, b506f89dd7, c433f1aae8, 31d4011902, 6c5f119ee5, 3c5fb9e435, 2b329a55a4, 0d377466aa, fb5bf4a144, 4d8a677c5b, 655c071960, d2a2dbc812, d60b2d4fae, 81a724db9d, 84baa7e7d3, 814d5d1a84, b578549763, d56f1e1437, ebebd37f11, 9c34e24909, a1e3803ca3, a661b6a11e, 1410dc5571, b1f252bea8, 7e341a152c, 25a0bc8d4c, 57b0e9a120, 64d9ab2f83, 8323ae7703, 5781b45f37
.gitignore (vendored, 2 changes)
```
@@ -26,6 +26,7 @@ dist-ssr
dashboard/build/**
dashboard-server/frontend/build/**
**/build/**
.fuse_hidden**
._*

# Build directories
@@ -57,3 +58,4 @@ csv/**/*
**/csv/**/*
!csv/.gitkeep
inventory/tsconfig.tsbuildinfo
inventory-server/scripts/.fuse_hidden00000fa20000000a
```
docs/calculate-issues.md (new file, 185 lines)
@@ -0,0 +1,185 @@
1. **Missing Updates for Reorder Point and Safety Stock** [RESOLVED - product-metrics.js]

    - **Problem:** In the **product_metrics** table (used by the inventory health view), the fields **reorder_point** and **safety_stock** are never updated in the product metrics calculations. Although a helper function (`calculateReorderQuantities`) exists and computes these values, the update query in the `calculateProductMetrics` function does not assign any values to these columns.
    - **Effect:** The inventory health view relies on these fields (using COALESCE to default them to 0), which means that stock might never be classified as "Reorder" or "Healthy" based on the proper reorder point or safety stock calculations.
    - **Example:** Even if a product's base metrics would require a reorder (for example, if its days of inventory are low), the view always shows a value of 0 for reorder_point and safety_stock.
    - **Fix:** Update the product metrics query (or add a subsequent update) so that **pm.reorder_point** and **pm.safety_stock** are calculated (for instance, by integrating the logic from `calculateReorderQuantities`) and stored in the table.
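The reorder logic the fix refers to can be sketched as a small pure function. The names `avgDailyDemand`, `demandStdDev`, `leadTimeDays`, and the service-level factor are illustrative assumptions, not fields taken from the actual `calculateReorderQuantities` implementation.

```javascript
// Hypothetical sketch of the reorder-point/safety-stock calculation that
// should feed pm.reorder_point and pm.safety_stock (names assumed).
function reorderQuantities({ avgDailyDemand, demandStdDev, leadTimeDays, serviceZ = 1.65 }) {
  // Safety stock buffers against demand variability over the lead time.
  const safetyStock = Math.ceil(serviceZ * demandStdDev * Math.sqrt(leadTimeDays));
  // Reorder when on-hand stock falls to expected lead-time demand plus the buffer.
  const reorderPoint = Math.ceil(avgDailyDemand * leadTimeDays) + safetyStock;
  return { reorderPoint, safetyStock };
}
```

Storing these two values in the same UPDATE that writes the other product metrics would close the gap the issue describes.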
2. **Overwritten Module Exports When Combining Scripts** [RESOLVED - calculate-metrics.js]

    - **Problem:** The code provided shows two distinct exports. The main metrics calculation module exports `calculateMetrics` (along with cancel and getProgress helpers), but later in the same concatenated file the module exports are overwritten.
    - **Effect:** If these two code sections end up in a single module file, the export for the main calculation will be lost. This would break any code that calls the overall metrics calculation.
    - **Example:** An external caller expecting to run `calculateMetrics` would instead receive the `calculateProductMetrics` function.
    - **Fix:** Make sure each script resides in its own module file. Verify that the module boundaries and exports are not accidentally merged or overwritten when deployed.
3. **Potential Formula Issue in EOQ Calculation (Reorder Qty)** [RESOLVED - product-metrics.js]

    - **Problem:** The helper function `calculateReorderQuantities` uses an EOQ formula with a holding cost expressed as a percentage (0.25) rather than a per-unit cost.
    - **Effect:** If the intent was to use the traditional EOQ formula (which expects a holding cost per unit rather than a percentage), this could lead to an incorrect reorder quantity.
    - **Example:** For a given annual demand and fixed order cost, the computed reorder quantity might be higher or lower than expected.
    - **Fix:** Double-check the EOQ formula. If the intention is to compute based on a percentage, then document that clearly; otherwise, adjust the formula to use the proper holding cost value.
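To make the two interpretations concrete, here is a minimal sketch: the 0.25 holding-cost rate comes from the text above, while the demand, order-cost, and unit-cost numbers are made up.

```javascript
// Classic EOQ: sqrt(2 * annualDemand * orderCost / holdingCostPerUnit).
// If 0.25 is a holding-cost *rate*, the per-unit holding cost must be
// derived from the unit cost before applying the formula.
function eoq(annualDemand, orderCost, holdingCostPerUnit) {
  return Math.sqrt((2 * annualDemand * orderCost) / holdingCostPerUnit);
}

const HOLDING_RATE = 0.25;                       // percentage-of-cost interpretation
const unitCost = 20;                             // illustrative unit cost
const perUnitHolding = unitCost * HOLDING_RATE;  // 5 per unit per year
const qty = Math.round(eoq(1000, 50, perUnitHolding)); // sqrt(20000), about 141
```

Documenting which interpretation is in use (rate versus per-unit cost) is the cheap part of the fix; the two give very different quantities when unit costs vary widely.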
4. **Potential Overlap or Redundancy in GMROI Calculation** [RESOLVED - time-aggregates.js]

    - **Problem:** In the time aggregates function, GMROI is calculated in two steps. The initial INSERT query computes GMROI as `CASE WHEN s.inventory_value > 0 THEN (s.total_revenue - s.total_cost) / s.inventory_value ELSE 0 END`, and a subsequent UPDATE query then recalculates it as an annualized value using gross profit and active days.
    - **Effect:** Overwriting a computed value may be intentional to refine the metric, but if not coordinated it can cause confusion or unexpected output in the `product_time_aggregates` table.
    - **Example:** A product's GMROI might first appear as a simple ratio but then be updated to a scaled value based on the number of active days, which could lead to inconsistent reporting if not documented.
    - **Fix:** Consolidated the GMROI calculation into a single step in the initial INSERT query, properly handling annualization and NULL values.
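A one-step annualized GMROI along the lines of the consolidated fix might look like this. This is a sketch; the field names are assumed from the surrounding description, not taken from the actual query.

```javascript
// Single-step annualized GMROI: the simple gross-profit-to-inventory ratio
// scaled to a full year by the product's active days.
function annualizedGmroi({ totalRevenue, totalCost, inventoryValue, activeDays }) {
  // Guard the two divisions, mirroring the CASE WHEN / NULL handling in SQL.
  if (!inventoryValue || inventoryValue <= 0 || !activeDays) return 0;
  const grossProfit = totalRevenue - totalCost;
  return (grossProfit / inventoryValue) * (365 / activeDays);
}
```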
5. **Handling of Products Without Orders or Purchase Data** [RESOLVED - time-aggregates.js]

    - **Problem:** In the INSERT query of the time aggregates function, the UNION covers two cases: one for products with order data (from `monthly_sales`) and one for products that have entries in `monthly_stock` but no matching order data.
    - **Effect:** If a product has neither orders nor purchase orders, it won't get an entry in `product_time_aggregates`. Depending on business rules, this might be acceptable or might mean missing data.
    - **Example:** A product that's new or rarely ordered might not appear in the time aggregates view, potentially affecting downstream calculations.
    - **Fix:** Added an `all_products` CTE and modified the JOIN structure to ensure every product gets an entry with appropriate default values, even if it has no orders or purchase orders.
6. **Redundant Recalculation of Vendor Metrics**

    - **Problem:** As in earlier scripts, cumulative metrics (like **total_revenue** and **total_cost**) are calculated in multiple query steps without necessary validation or optimization. In the vendor metrics script, total revenue and margin are computed within a `WITH` clause that is then reused in other parts of the process, making it more complex than needed.
    - **Effect:** There's unnecessary duplication in querying the same data multiple times across subqueries. It could result in decreased performance and may even lead to excess computation if the subqueries are not optimized or correctly indexed.
    - **Example:** Vendor sales and vendor purchase orders (PO) metrics are calculated in separate `WITH` clauses, leading to repeated calculations.
    - **Fix:** Synthesize the required metrics into fewer queries or reuse the results within the `WITH` clause itself. Avoid redundant calculations of **revenue** and **cost** unless truly necessary.
7. **Handling Products Without Orders or Purchase Orders**

    - **Problem:** In your `calculateVendorMetrics` script, the initial insert for vendor sales doesn't fully address the products that might not have matching orders or purchase orders. If a vendor has products without any sales within the last 12 months, the results may not be fully accurate unless handled explicitly.
    - **Effect:** If no orders exist for a product associated with a particular vendor, that product will not contribute to the vendor's metrics, potentially omitting important data when calculating **total_orders** or **total_revenue**.
    - **Example:** The scripted statistics fill gaps, but products with no recent purchase or sales orders might not be counted accurately.
    - **Fix:** Include logic to handle scenarios where these products still need to be part of the vendor calculation. Use a `LEFT JOIN` wherever possible to account for cases without sales or purchase orders.
8. **Redundant `ON DUPLICATE KEY UPDATE`**

    - **Problem:** Multiple queries in the `calculateVendorMetrics` script use `ON DUPLICATE KEY UPDATE` clauses to handle repeated metrics updates. This is useful for ensuring the most up-to-date calculations but can cause inconsistencies if multiple calculations happen for the same product or vendor simultaneously.
    - **Effect:** This approach can lead to an inaccurate update of brand-specific data when insertion and update overlap. Each time you add a new batch, an existing entry could be overwritten if not handled correctly.
    - **Example:** Vendor country, category, or sales-related metrics could unintentionally update during processing.
    - **Fix:** Match on current status more robustly in case of existing rows to avoid unnecessary updates. Ensure that the key used for `ON DUPLICATE KEY` aligns with any foreign key relationships that might indicate an already processed entry.
9. **SQL Query Performance with Multiple Nested `WITH` Clauses**

    - **Problem:** Heavily nested queries (especially **WITH** clauses) may lead to slow performance depending on the size of the dataset.
    - **Effect:** Computational burden could be high when the database is large, e.g., querying **purchase orders**, **vendor sales**, and **product info** simultaneously. Even with proper indexes, the deployment might struggle in production environments.
    - **Example:** Multiple `WITH` clauses in the vendor and brand metrics calculation scripts might work fine in small datasets but degrade performance in production.
    - **Fix:** Combine some subqueries and reduce the layers of computation needed for calculating final metrics. Test performance on a production-sized dataset to see how nested queries are handled.
10. **Missing Updates for Reorder Metrics (Vendor/Brand)**

    - **Previously Identified Issue:** Inconsistent updates for **reorder_point** and **safety_stock** across earlier scripts.
    - **Current Impact on This Script:** The vendor and brand metrics do not have explicit updates for reorder point or safety stock, which are essential for inventory evaluation.
    - **Effect:** The correct thresholds and reorder logic for vendor product inventory aren't fully accounted for in these scripts.
    - **Fix:** Integrate relevant logic to update **reorder_point** or **safety_stock** within the vendor and brand metrics calculations. Ensure that it's consistently computed and stored.
11. **Data Integrity and Consistency When Tracking Sales Growth or Performance**

    - **Problem:** Brand metrics include a sales growth clause where results can sometimes be skewed severely if period data varies considerably.
    - **Effect:** If period boundaries are incorrect or records are missing, this can create drastic growth rate calculations.
    - **Example:** If the "previous" period has no sales but "current" has a substantial increase, the growth rate will show as **100%**.
    - **Fix:** Implement checks that ensure both periods are valid and that the system calculates growth accurately, avoiding growth rates based solely on potential outliers. Replace consistent gaps with a no-growth rate or a meaningful zero.
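A guarded growth-rate helper in the spirit of the fix could look like this. It is a sketch; returning `null` for "not comparable" periods is an assumption, not the project's actual convention.

```javascript
// Guarded growth rate: when the previous period has no sales, report a
// neutral value instead of a misleading spike.
function growthRate(current, previous) {
  if (!previous || previous <= 0) {
    // No baseline: flag as not comparable when current has sales, else flat.
    return current > 0 ? null : 0;
  }
  return ((current - previous) / previous) * 100;
}
```

Callers can then render `null` as "n/a" rather than showing a fabricated percentage.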
12. **Exclusion of Vendors With No Sales**

    The vendor metrics query is driven by the `vendor_sales` CTE, which aggregates data only for vendors that have orders in the past 12 months.

    - **Impact:** Vendors that have purchase activity (or simply exist in vendor_details) but no recent sales won't show up in vendor_metrics. This could cause the frontend to miss metrics for vendors that might still be important.
    - **Fix:** Consider adding a UNION or changing the driving set so that all vendors (for example, from vendor_details) are included, even if they have zero sales.
13. **Identical Formulas for On-Time Delivery and Order Fill Rates**

    Both metrics are calculated as `(received_orders / total_orders) * 100`.

    - **Impact:** If the business expects these to be distinct (for example, one might factor in on-time receipt versus mere receipt), then showing identical values on the frontend could be misleading.
    - **Fix:** Verify and adjust the formulas if on-time delivery and order fill rates should be computed differently.
14. **Handling Nulls and Defaults in Aggregations**

    The query uses COALESCE in most places, but be sure that every aggregated value (like average lead time) correctly defaults when no data is present.

    - **Impact:** Incorrect defaults might cause odd or missing numbers on the production interface.
    - **Fix:** Double-check that all numeric aggregates reliably default to 0 where needed.
15. **Inconsistent Stock Filtering Conditions**

    In the main brand metrics query the CTE filters products with the condition `p.stock_quantity <= 5000 AND p.stock_quantity >= 0`, whereas the brand time-based metrics query uses only `p.stock_quantity <= 5000`.

    - **Impact:** This discrepancy may lead to inconsistent numbers (for example, if any products have negative stock, which might be due to data issues) between overall brand metrics and time-based metrics on the frontend.
    - **Fix:** Standardize the filtering criteria so that both queries treat out-of-range stock values in the same way.
16. **Growth Rate Calculation Periods**

    The growth rate is computed by comparing revenue from the last 3 months ("current") against a period from 15 to 12 months ago ("previous").

    - **Impact:** This narrow window may not reflect typical year-over-year performance and could lead to volatile or unexpected growth percentages on the frontend.
    - **Fix:** Revisit the business logic for growth; if a longer or different comparison period is preferred, adjust the date intervals accordingly.
17. **Potential NULLs in Aggregated Time-Based Metrics**

    In the brand time-based metrics query, aggregate expressions such as `SUM(o.quantity * o.price)` aren't wrapped with COALESCE.

    - **Impact:** If there are no orders for a given brand/month, these sums might return NULL rather than 0, which could propagate into the frontend display.
    - **Fix:** Wrap such aggregates in COALESCE (e.g. `COALESCE(SUM(o.quantity * o.price), 0)`) to ensure a default numeric value.
18. **Grouping by Category Status in Base Metrics Insert**

    - **Problem:** The INSERT for base category metrics groups by both `c.cat_id` and `c.status` even though the table's primary key is just `category_id`.
    - **Effect:** If a category's status changes over time, the grouping may produce unexpected updates (or even multiple groups before the duplicate key update kicks in), possibly causing the wrong status or aggregated figures to be stored.
    - **Example:** A category that toggles between "active" and "inactive" might have its metrics calculated differently on different runs.
    - **Fix:** Ensure that the grouping keys match the primary key (or that the status update logic is exactly as intended) so that a single row per category is maintained.
19. **Potential Null Handling in Margin Calculations**

    - **Problem:** In the query for category time metrics, the calculation of average margin uses expressions such as `SUM(o.quantity * (o.price - GREATEST(p.cost_price, 0)))` without using `COALESCE` on `p.cost_price`.
    - **Effect:** If any product's `cost_price` is `NULL`, then `GREATEST(p.cost_price, 0)` returns `NULL` and the resulting sum (and thus the margin) could become `NULL` rather than defaulting to 0. This might lead to missing or misleading margin figures on the frontend.
    - **Example:** A product with a missing cost price would make the entire margin expression evaluate to `NULL` even when sales exist.
    - **Fix:** Replace `GREATEST(p.cost_price, 0)` with `GREATEST(COALESCE(p.cost_price, 0), 0)` (or simply use `COALESCE(p.cost_price, 0)`) to ensure that missing values are handled.
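The same NULL handling can be mirrored in JavaScript to see why a single missing cost price matters. The rows here are made up for illustration.

```javascript
// One NULL cost_price nulls out the whole SQL SUM; clamping it to 0 first
// (COALESCE then GREATEST) keeps the aggregate numeric.
const rows = [
  { qty: 2, price: 10, cost: 4 },
  { qty: 1, price: 8, cost: null }, // missing cost_price
  { qty: 3, price: 5, cost: -2 },   // bad negative cost, clamped like GREATEST(..., 0)
];

function unitMargin(price, costPrice) {
  // Equivalent of GREATEST(COALESCE(cost_price, 0), 0)
  return price - Math.max(costPrice ?? 0, 0);
}

const totalMargin = rows.reduce((sum, r) => sum + r.qty * unitMargin(r.price, r.cost), 0);
```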
20. **Data Coverage in Growth Rate Calculation**

    - **Problem:** The growth rate update depends on multiple CTEs (current period, previous period, and trend analysis) that require a minimum amount of data (for instance, `HAVING COUNT(*) >= 6` in the trend_stats CTE).
    - **Effect:** Categories with insufficient historical data will fall into the "ELSE" branch (or may even be skipped if no revenue is present), which might result in a growth rate of 0.0 or an unexpected value.
    - **Example:** A newly created category that has only two months of data won't have trend analysis, so its growth rate will be calculated solely by the simple difference, which might not reflect true performance.
    - **Fix:** Confirm that this fallback behavior is acceptable for production; if not, adjust the logic so that every category receives a consistent growth rate even with sparse data.
21. **Omission of Forecasts for Zero-Sales Categories**

    - **Observation:** The category-sales metrics query uses a `HAVING AVG(cs.daily_quantity) > 0` clause.
    - **Effect:** Categories without any average daily sales will not receive a forecast record in `category_sales_metrics`. If the frontend expects a row (even with zeros) for every category, this will lead to missing data.
    - **Fix:** Verify that it's acceptable for categories with no sales to have no forecast entry. If not, adjust the query so that a default forecast (with zeros) is inserted.
22. **Randomness in Category-Level Forecast Revenue Calculation**

    - **Problem:** In the category-level forecasts query, the forecast revenue is multiplied by a factor of `(0.95 + (RAND() * 0.1))`.
    - **Effect:** This introduces randomness into the forecast figures so that repeated runs could yield slightly different values. If deterministic forecasts are expected on the production frontend, this could lead to inconsistent displays.
    - **Example:** The same category might show a 5% higher forecast on one run and 3% on another because of the random multiplier.
    - **Fix:** Confirm that this randomness is intentional for your forecasting model; if forecasts are meant to be reproducible, remove or replace the `RAND()` factor with a fixed multiplier.
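If some per-category variation is wanted but reproducibility matters, the random factor can be replaced by a deterministic one derived from the category id. This hashing scheme is purely illustrative, not part of the codebase.

```javascript
// Deterministic stand-in for (0.95 + RAND() * 0.1): hash the category id
// into the same [0.95, 1.05) range so repeated runs agree.
function stableMultiplier(catId) {
  const h = [...String(catId)].reduce((a, c) => (a * 31 + c.charCodeAt(0)) % 1000, 7);
  return 0.95 + (h / 1000) * 0.1;
}
```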
23. **Multi-Statement Cleanup of Temporary Tables**

    - **Problem:** The cleanup query drops multiple temporary tables in one call (separated by semicolons).
    - **Effect:** If your Node.js MySQL driver isn't configured to allow multi-statement execution, this query may fail, leaving temporary tables behind. Leftover temporary tables might eventually cause conflicts or resource issues.
    - **Example:** Running the cleanup query could produce an error like "multi-statement queries not enabled," preventing proper cleanup.
    - **Fix:** Either configure your database connection to allow multi-statements or issue separate queries for each temporary table drop to ensure that the cleanup runs successfully.
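Issuing the drops one at a time sidesteps the `multipleStatements` requirement entirely. A sketch, assuming a mysql2/promise-style connection and illustrative table names:

```javascript
// Drop each temporary table with its own statement so cleanup works even
// when multi-statement execution is disabled on the connection.
async function dropTempTables(connection, tables) {
  for (const t of tables) {
    await connection.query(`DROP TEMPORARY TABLE IF EXISTS ${t}`);
  }
}
```

`IF EXISTS` keeps the loop safe when an earlier failure meant some tables were never created.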
24. **Handling Products with No Sales Data**

    - **Problem:** In the product-level forecast calculation, the CTE `daily_stats` includes a `HAVING AVG(ds.daily_quantity) > 0` clause.
    - **Effect:** Products that have no sales (or a zero average daily quantity) will be excluded from the forecasts. This means the frontend won't show forecasts for non-selling products, which might be acceptable but could also be a completeness issue.
    - **Example:** A product that has never sold will not appear in the `sales_forecasts` table.
    - **Fix:** Confirm that it is intended for forecasts to be generated only for products with some sales activity. If forecasts are required for all products, adjust the query to insert default forecast records for products with zero sales.
25. **Complexity of the Forecast Formula Involving the Seasonality Factor**

    - **Issue:** The sales forecast calculations incorporate an adjustment factor using `COALESCE(sf.seasonality_factor, 0)` to modify forecast units and revenue. If the seasonality data is missing (or not populated), the factor therefore defaults to 0.
    - **Potential Problem:** A default of 0 drastically alters the forecast calculations, often producing a forecast of 0 or an overly dampened one, when the intended behavior is likely a neutral multiplier (typically 1.0). This could skew the figures that reach the frontend so that they no longer reflect the actual seasonal impact.
    - **Fix:** Review your data source for seasonality (the `sales_seasonality` table) and ensure it's consistently populated. Alternatively, if missing seasonality data is possible, consider using a more neutral default (such as 1.0) in your COALESCE. This prevents the forecast formulas from over-simplifying (or even nullifying) the forecast output due to missing seasonality factors.
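The neutral-default suggestion amounts to a one-line change; sketched here as a helper (the function and argument names are assumptions):

```javascript
// Default a missing seasonality factor to a neutral 1.0 rather than 0,
// so absent data leaves the base forecast unchanged instead of zeroing it.
function applySeasonality(baseForecast, seasonalityFactor) {
  const factor = seasonalityFactor ?? 1.0; // neutral when no seasonality row exists
  return baseForecast * factor;
}
```

The SQL equivalent is simply `COALESCE(sf.seasonality_factor, 1.0)` in place of the current `COALESCE(sf.seasonality_factor, 0)`.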
26. **Group By with Seasonality Factor Variability**

    - **Observation:** In the forecast insertion query, the GROUP BY clause includes `sf.seasonality_factor` along with other fields.
    - **Effect:** If the seasonality factor differs (or is `NULL` versus a value) for different forecast dates, this might result in multiple rows for the same product and forecast date. The `ON DUPLICATE KEY UPDATE` clause will merge them, but only if the primary key (pid, forecast_date) is truly unique.
    - **Fix:** Verify that the grouping produces exactly one row per product per forecast date. If there's potential for multiple rows due to seasonality variability, consider applying a COALESCE or an aggregation on the seasonality factor so that it does not affect grouping.
27. **Memory Management for Temporary Tables** [RESOLVED - calculate-metrics.js]

    - **Problem:** In metrics calculations, temporary tables aren't always properly cleaned up if the process fails between creation and the DROP statement.
    - **Effect:** If a process fails after creating temporary tables but before dropping them, these tables remain in memory until the connection is closed. In a production environment with multiple calculation runs, this could lead to memory leaks or table name conflicts.
    - **Example:** The `temp_revenue_ranks` table creation in ABC classification could remain if the process fails before reaching the DROP statement.
    - **Fix:** Implement proper cleanup in a finally block or use transaction management that ensures temporary tables are always cleaned up, even in failure scenarios.
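The finally-block approach can be sketched as a small wrapper. The connection API is assumed to be mysql2/promise-style, and the table definition is illustrative (only the `temp_revenue_ranks` name comes from the text).

```javascript
// Run `work` against a temporary table, guaranteeing the DROP runs whether
// the work succeeds or throws.
async function withTempTable(connection, work) {
  await connection.query("CREATE TEMPORARY TABLE temp_revenue_ranks (pid INT, rank_no INT)");
  try {
    return await work(connection);
  } finally {
    // Executes on success *and* failure, so the table never leaks.
    await connection.query("DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks");
  }
}
```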
docs/metrics-changes.md (new file, 181 lines)
@@ -0,0 +1,181 @@
# Metrics System Changes

## Schema Changes

### Product Identifiers
- Changed `product_id` to `pid` throughout all metrics tables and queries
- Changed `category_id` to `cat_id` in category-related queries

### Purchase Orders
- Changed status check from `status = 'closed'` to `receiving_status >= 30`
- Added comment `-- Partial or fully received` for clarity
- Now using `received_date` instead of relying on status changes

### New Product Fields
- Added support for `notions_inv_count`
- Added support for `date_last_sold`
- Added support for `total_sold`
- Using `visible` flag for active product counts

### Field Size Updates
- Increased size of financial fields to handle larger numbers:
  - Changed category metrics `total_value` from `DECIMAL(10,3)` to `DECIMAL(15,3)`
  - Changed brand metrics financial fields from `DECIMAL(10,2)` to `DECIMAL(15,2)`
  - Affects `total_stock_cost`, `total_stock_retail`, `total_revenue`

## Metrics File Changes

### Product Metrics (`product-metrics.js`)
- Updated SQL queries to use new field names
- Enhanced stock status calculations
- Added financial metrics:
  - `gross_profit`
  - `gmroi`
  - `avg_margin_percent`
  - `inventory_value`
- Improved reorder quantity calculations with:
  - Enhanced safety stock calculation
  - Lead time consideration
  - Service level factors
- Added NaN/NULL value handling:
  - Added `sanitizeValue` helper function
  - Properly converts JavaScript NaN to SQL NULL
  - Ensures all numeric fields have valid values
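The diff only names `sanitizeValue`; a plausible shape of such a helper (an assumption, not the actual implementation) is:

```javascript
// Map values that MySQL cannot store (NaN, Infinity, undefined) to SQL NULL
// before they reach a parameterized query.
function sanitizeValue(value) {
  if (value === undefined || value === null) return null;
  if (typeof value === "number" && !Number.isFinite(value)) return null; // NaN, ±Infinity
  return value;
}
```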
### Vendor Metrics (`vendor-metrics.js`)
- Updated field references to use `pid`
- Modified purchase order status checks
- Enhanced vendor performance metrics:
  - Order fill rate calculation
  - On-time delivery rate
  - Lead time tracking

### Category Metrics (`category-metrics.js`)
- Updated to use `cat_id` instead of `category_id`
- Enhanced category performance tracking:
  - Active vs total products
  - Category growth rate
  - Turnover rate
- Added time-based metrics for:
  - Product counts
  - Revenue tracking
  - Margin analysis
- Added NULL brand handling:
  - Uses 'Unbranded' for NULL brand values
  - Maintains data integrity in category sales metrics

### Brand Metrics (`brand-metrics.js`)
- Updated product references to use `pid`
- Enhanced brand performance metrics:
  - Stock value calculations
  - Revenue tracking
  - Growth rate analysis
- Added time-based aggregates for:
  - Stock levels
  - Sales performance
  - Margin analysis
- Increased field sizes to handle large retailers

### Sales Forecasts (`sales-forecasts.js`)
- Updated to use new product identifiers
- Enhanced forecast calculations:
  - Day-of-week patterns
  - Seasonality factors
  - Confidence levels
- Added category-level forecasts with:
  - Units and revenue predictions
  - Confidence scoring
  - Seasonal adjustments

### Time Aggregates (`time-aggregates.js`)
- Updated field references to use `pid`
- Enhanced financial metrics:
  - GMROI calculations
  - Profit margin tracking
- Added inventory value tracking
- Improved purchase order integration
## Database Impact

### Tables Modified
- `product_metrics`
- `vendor_metrics`
- `vendor_time_metrics`
- `category_metrics`
- `category_time_metrics`
- `brand_metrics`
- `brand_time_metrics`
- `sales_forecasts`
- `category_forecasts`
- `product_time_aggregates`

### New Fields Added
Several tables have new fields for:
- Enhanced financial tracking
- Improved inventory metrics
- Better performance monitoring
- More accurate forecasting
## Frontend Considerations

### Data Access Changes
- All product lookups need to use `pid` instead of `product_id`
- Category references should use `cat_id`
- Purchase order status checks need updating
- Handle 'Unbranded' as a valid brand value

### New Features Available
- Enhanced stock status indicators
- More detailed financial metrics
- Improved forecasting data
- Better category and brand performance tracking

### UI Updates Needed
- Update all API calls to use new field names
- Modify data displays for new metrics
- Add new financial performance indicators
- Update stock status logic
- Enhance forecast displays
## API Route Updates Needed

### Product Routes
- Update ID field references
- Modify stock status calculations
- Add new financial metrics endpoints

### Category Routes
- Update to use `cat_id`
- Add new performance metrics
- Include time-based aggregates

### Vendor Routes
- Update product reference handling
- Add enhanced performance metrics
- Include new time-based data

### Reporting Routes
- Update all ID references
- Add new metrics support
- Include enhanced forecasting data
## Migration Considerations

### Data Migration
- Update existing records to use new IDs
- Backfill new metrics where possible
- Verify data integrity after changes
- Handle NULL to 'Unbranded' brand conversion

### Code Updates
- Update all API endpoints
- Modify database queries
- Update frontend components
- Revise reporting logic

### Testing Requirements
- Verify ID changes throughout system
- Test new metrics calculations
- Validate forecasting accuracy
- Check performance impact
- Verify NULL value handling
docs/schema-update-changes.md (new file, 270 lines)
@@ -0,0 +1,270 @@
# Schema Update Changes Required

## Core Field Name Changes

### Global Changes
- Update all references from `product_id` to `pid` in all tables and queries
  - This includes foreign key references in related tables
  - Update TypeScript interfaces and types (e.g., `interface Product { pid: number; ... }`)
  - Update API request/response types
- Update all references from `category_id` to `cat_id` in category-related queries
  - This affects the `categories` table and all tables with category foreign keys
- Update purchase order status to use numeric codes instead of strings
  - Status codes: 0=canceled, 1=created, 10=electronically_ready_send, 11=ordered, 12=preordered, 13=electronically_sent, 15=receiving_started, 50=done
  - Receiving status codes: 0=canceled, 1=created, 30=partial_received, 40=full_received, 50=paid
- Handle NULL brand values as 'Unbranded'
  - Add COALESCE(brand, 'Unbranded') in all brand-related queries
  - Update frontend brand filters to handle 'Unbranded' as a valid value
## Backend Route Changes

### Product Routes

1. Update ID field references in all product routes:
   - `/api/products/:id` -> `/api/products/:pid`
   - All query parameters using `product_id` should be changed to `pid`
   - Update all SQL queries to use `pid` instead of `product_id`
   - Update `/api/products/:id/metrics` -> `/api/products/:pid/metrics`
   - Update `/api/products/:id/time-series` -> `/api/products/:pid/time-series`
   - Update request parameter validation in routes
   - Example query change:
   ```sql
   -- Old
   SELECT * FROM products WHERE product_id = ?
   -- New
   SELECT * FROM products WHERE pid = ?
   ```
2. Update purchase order status checks:
|
||||
- Change `status = 'closed'` to `receiving_status >= 30` in all relevant queries
|
||||
- Update any route logic that checks PO status to use the new numeric status codes
|
||||
- Example status check change:
|
||||
```sql
|
||||
-- Old
|
||||
WHERE po.status = 'closed'
|
||||
-- New
|
||||
WHERE po.receiving_status >= 30 -- Partial or fully received
|
||||
```
|
||||
|
||||
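For the "request parameter validation" item above, a sketch of a numeric `pid` guard; `parsePid` is an illustrative helper, not part of the existing routes:

```typescript
// Accept only positive integers; anything else (NaN, floats, negatives)
// is rejected so it never reaches a SQL query.
function parsePid(raw: string): number | null {
  const pid = Number(raw);
  return Number.isInteger(pid) && pid > 0 ? pid : null;
}

console.log(parsePid("42"));  // 42
console.log(parsePid("abc")); // null
console.log(parsePid("1.5")); // null
```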
### Category Routes
1. Update ID references:
   - `/api/categories/:id` -> `/api/categories/:cat_id`
   - Update all SQL queries to use `cat_id` instead of `category_id`
   - Update join conditions in category-related queries
   - Example join change:
   ```sql
   -- Old
   JOIN categories c ON p.category_id = c.category_id
   -- New
   JOIN categories c ON p.cat_id = c.cat_id
   ```

2. Update category metrics queries:
   - Modify field size handling for financial fields (DECIMAL(15,3) instead of DECIMAL(10,3))
   - Update category performance calculations to use new field sizes
   - Example field size change:
   ```sql
   -- Old
   total_value DECIMAL(10,3)
   -- New
   total_value DECIMAL(15,3)
   ```

### Vendor Routes
1. Update product references:
   - Change all queries to use `pid` instead of `product_id`
   - Update purchase order status checks to use new numeric codes
   - Example vendor query change:
   ```sql
   -- Old
   SELECT v.*, p.product_id FROM vendors v
   JOIN products p ON p.vendor = v.name
   WHERE p.product_id = ?
   -- New
   SELECT v.*, p.pid FROM vendors v
   JOIN products p ON p.vendor = v.name
   WHERE p.pid = ?
   ```

2. Update vendor metrics queries:
   - Add COALESCE for NULL brand handling:
   ```sql
   -- Old
   GROUP BY brand
   -- New
   GROUP BY COALESCE(brand, 'Unbranded')
   ```
   - Update field references in performance metrics calculations

### Dashboard Routes
1. Update all dashboard endpoints:
   - `/dashboard/best-sellers`:
   ```typescript
   interface BestSellerProduct {
     pid: number; // Changed from product_id
     sku: string;
     title: string;
     units_sold: number;
     revenue: number; // Now handles larger decimals
     profit: number; // Now handles larger decimals
   }
   ```
   - `/dashboard/overstock/products`:
   ```typescript
   interface OverstockedProduct {
     pid: number; // Changed from product_id
     sku: string;
     title: string;
     stock_quantity: number;
     overstocked_amt: number;
     excess_cost: number; // Now DECIMAL(15,3)
     excess_retail: number; // Now DECIMAL(15,3)
   }
   ```

### Analytics Routes
1. Update analytics endpoints:
   - `/analytics/stats` - Update all ID references and decimal handling
   - `/analytics/profit` - Update decimal precision in calculations
   - `/analytics/vendors` - Add brand NULL handling
   - Example analytics query change:
   ```sql
   -- Old
   SELECT product_id, SUM(price * quantity) as revenue
   FROM orders
   GROUP BY product_id
   -- New
   SELECT pid, CAST(SUM(price * quantity) AS DECIMAL(15,3)) as revenue
   FROM orders
   GROUP BY pid
   ```

## Frontend Component Changes

### Product Components
1. Update API calls:
   ```typescript
   // Old
   fetch(`/api/products/${product_id}`)
   // New
   fetch(`/api/products/${pid}`)
   ```
   - Update route parameters in React Router:
   ```typescript
   // Old
   <Route path="/products/:product_id" />
   // New
   <Route path="/products/:pid" />
   ```
   - Update useParams usage:
   ```typescript
   // Old
   const { product_id } = useParams();
   // New
   const { pid } = useParams();
   ```

2. Update data display:
   ```typescript
   // Old
   <td>{formatCurrency(product.price)}</td>
   // New
   <td>{formatCurrency(Number(product.price))}</td>
   ```

### Dashboard Components
1. Update metrics displays:
   ```typescript
   // Old
   interface ProductMetrics {
     product_id: number;
     total_value: number;
   }
   // New
   interface ProductMetrics {
     pid: number;
     total_value: string; // Handle as string due to DECIMAL(15,3)
   }
   ```

2. Update stock status displays:
   ```typescript
   // Old
   const isReceived = po.status === 'closed';
   // New
   const isReceived = po.receiving_status >= 30;
   ```

### Product Filters Component
1. Update filter options:
   ```typescript
   // Old
   const productFilter = { id: 'product_id', value: id };
   // New
   const productFilter = { id: 'pid', value: id };
   ```

2. Update status filters:
   ```typescript
   // Old
   const poStatusOptions = [
     { label: 'Closed', value: 'closed' }
   ];
   // New
   const poStatusOptions = [
     { label: 'Received', value: '30' } // Using numeric codes
   ];
   ```

## Data Type Considerations

### Financial Fields
- Update TypeScript types:
  ```typescript
  // Old
  price: number;
  // New
  price: string; // Handle as string due to DECIMAL(15,3)
  ```
- Update formatting:
  ```typescript
  // Old
  formatCurrency(value)
  // New
  formatCurrency(Number(value))
  ```

### Status Codes
- Add status code mapping:
  ```typescript
  const PO_STATUS_MAP = {
    0: 'Canceled',
    1: 'Created',
    10: 'Ready to Send',
    11: 'Ordered',
    12: 'Preordered',
    13: 'Sent',
    15: 'Receiving Started',
    50: 'Done'
  };
  ```

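A sketch of how the map might be consumed; `formatPoStatus` and its fallback string are assumptions, chosen so unknown future codes still render rather than crash:

```typescript
const PO_STATUS_MAP: Record<number, string> = {
  0: 'Canceled',
  1: 'Created',
  10: 'Ready to Send',
  11: 'Ordered',
  12: 'Preordered',
  13: 'Sent',
  15: 'Receiving Started',
  50: 'Done',
};

// Fall back to the raw code so a future status still displays something.
function formatPoStatus(code: number): string {
  return PO_STATUS_MAP[code] ?? `Status ${code}`;
}

console.log(formatPoStatus(11)); // Ordered
console.log(formatPoStatus(99)); // Status 99
```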
## Testing Requirements

1. API Route Testing:
   ```typescript
   // Test decimal handling
   expect(typeof response.total_value).toBe('string');
   expect(response.total_value).toMatch(/^\d+\.\d{3}$/);

   // Test status codes
   expect(response.receiving_status).toBeGreaterThanOrEqual(30);

   // Test brand handling
   expect(response.brand || 'Unbranded').toBeDefined();
   ```

## Notes
- All numeric status code comparisons should use >= for status checks to handle future status codes
- All financial values should be handled as strings in TypeScript/JavaScript to preserve precision
- Brand grouping should always use COALESCE(brand, 'Unbranded') in SQL queries
- All ID parameters in routes should be validated as numbers

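The string-handling rule above can be sketched as a pair of helpers; `isDecimalString` and `toDisplayNumber` are illustrative names, and the regex mirrors the one used in the testing section:

```typescript
// DECIMAL(15,3) values arrive from the API as strings like "1234.500".
const DECIMAL_15_3 = /^\d+\.\d{3}$/;

function isDecimalString(value: string): boolean {
  return DECIMAL_15_3.test(value);
}

// Convert only at the display boundary, where float rounding is acceptable.
function toDisplayNumber(value: string): number {
  if (!isDecimalString(value)) {
    throw new Error(`Unexpected decimal format: ${value}`);
  }
  return Number(value);
}

console.log(isDecimalString("1234.500")); // true
console.log(toDisplayNumber("1234.500")); // 1234.5
```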
@@ -14,7 +14,8 @@ CREATE TABLE IF NOT EXISTS stock_thresholds (
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    PRIMARY KEY (id),
    FOREIGN KEY (category_id) REFERENCES categories(cat_id) ON DELETE CASCADE,
    UNIQUE KEY unique_category_vendor (category_id, vendor)
    UNIQUE KEY unique_category_vendor (category_id, vendor),
    INDEX idx_st_metrics (category_id, vendor)
);

-- Lead time threshold configurations
@@ -44,7 +45,8 @@ CREATE TABLE IF NOT EXISTS sales_velocity_config (
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    PRIMARY KEY (id),
    FOREIGN KEY (category_id) REFERENCES categories(cat_id) ON DELETE CASCADE,
    UNIQUE KEY unique_category_vendor (category_id, vendor)
    UNIQUE KEY unique_category_vendor (category_id, vendor),
    INDEX idx_sv_metrics (category_id, vendor)
);

-- ABC Classification configurations
@@ -68,7 +70,8 @@ CREATE TABLE IF NOT EXISTS safety_stock_config (
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    PRIMARY KEY (id),
    FOREIGN KEY (category_id) REFERENCES categories(cat_id) ON DELETE CASCADE,
    UNIQUE KEY unique_category_vendor (category_id, vendor)
    UNIQUE KEY unique_category_vendor (category_id, vendor),
    INDEX idx_ss_metrics (category_id, vendor)
);

-- Turnover rate configurations
@@ -85,6 +88,16 @@ CREATE TABLE IF NOT EXISTS turnover_config (
    UNIQUE KEY unique_category_vendor (category_id, vendor)
);

-- Create table for sales seasonality factors
CREATE TABLE IF NOT EXISTS sales_seasonality (
    month INT NOT NULL,
    seasonality_factor DECIMAL(5,3) DEFAULT 0,
    last_updated TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (month),
    CHECK (month BETWEEN 1 AND 12),
    CHECK (seasonality_factor BETWEEN -1.0 AND 1.0)
);

-- Insert default global thresholds if not exists
INSERT INTO stock_thresholds (id, category_id, vendor, critical_days, reorder_days, overstock_days)
VALUES (1, NULL, NULL, 7, 14, 90)
@@ -126,6 +139,13 @@ ON DUPLICATE KEY UPDATE
    calculation_period_days = VALUES(calculation_period_days),
    target_rate = VALUES(target_rate);

-- Insert default seasonality factors (neutral)
INSERT INTO sales_seasonality (month, seasonality_factor)
VALUES
    (1, 0), (2, 0), (3, 0), (4, 0), (5, 0), (6, 0),
    (7, 0), (8, 0), (9, 0), (10, 0), (11, 0), (12, 0)
ON DUPLICATE KEY UPDATE last_updated = CURRENT_TIMESTAMP;

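One plausible way to consume these seasonality factors, assuming a multiplicative adjustment; the schema only constrains the factor to [-1.0, 1.0] and does not prescribe this formula:

```typescript
// Assumption: seasonality_factor is a relative adjustment, so a monthly
// forecast is scaled as base * (1 + factor). A factor of 0 (the inserted
// default) leaves the forecast unchanged.
function applySeasonality(baseForecast: number, factor: number): number {
  if (factor < -1 || factor > 1) {
    throw new Error("seasonality_factor out of range [-1, 1]");
  }
  return baseForecast * (1 + factor);
}

console.log(applySeasonality(100, 0));    // 100  (neutral default)
console.log(applySeasonality(100, 0.25)); // 125
console.log(applySeasonality(100, -0.5)); // 50
```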
-- View to show thresholds with category names
CREATE OR REPLACE VIEW stock_thresholds_view AS
SELECT
@@ -150,3 +170,60 @@ ORDER BY
    END,
    c.name,
    st.vendor;

CREATE TABLE IF NOT EXISTS calculate_history (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    start_time TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    end_time TIMESTAMP NULL,
    duration_seconds INT,
    duration_minutes DECIMAL(10,2) GENERATED ALWAYS AS (duration_seconds / 60.0) STORED,
    total_products INT DEFAULT 0,
    total_orders INT DEFAULT 0,
    total_purchase_orders INT DEFAULT 0,
    processed_products INT DEFAULT 0,
    processed_orders INT DEFAULT 0,
    processed_purchase_orders INT DEFAULT 0,
    status ENUM('running', 'completed', 'failed', 'cancelled') DEFAULT 'running',
    error_message TEXT,
    additional_info JSON,
    INDEX idx_status_time (status, start_time)
);

CREATE TABLE IF NOT EXISTS calculate_status (
    module_name ENUM(
        'product_metrics',
        'time_aggregates',
        'financial_metrics',
        'vendor_metrics',
        'category_metrics',
        'brand_metrics',
        'sales_forecasts',
        'abc_classification'
    ) PRIMARY KEY,
    last_calculation_timestamp TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    INDEX idx_last_calc (last_calculation_timestamp)
);

CREATE TABLE IF NOT EXISTS sync_status (
    table_name VARCHAR(50) PRIMARY KEY,
    last_sync_timestamp TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    last_sync_id BIGINT,
    INDEX idx_last_sync (last_sync_timestamp)
);

CREATE TABLE IF NOT EXISTS import_history (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    table_name VARCHAR(50) NOT NULL,
    start_time TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    end_time TIMESTAMP NULL,
    duration_seconds INT,
    duration_minutes DECIMAL(10,2) GENERATED ALWAYS AS (duration_seconds / 60.0) STORED,
    records_added INT DEFAULT 0,
    records_updated INT DEFAULT 0,
    is_incremental BOOLEAN DEFAULT FALSE,
    status ENUM('running', 'completed', 'failed', 'cancelled') DEFAULT 'running',
    error_message TEXT,
    additional_info JSON,
    INDEX idx_table_time (table_name, start_time),
    INDEX idx_status (status)
);

@@ -102,19 +102,17 @@ CREATE TABLE IF NOT EXISTS product_time_aggregates (
    INDEX idx_date (year, month)
);

-- Create vendor details table
CREATE TABLE IF NOT EXISTS vendor_details (
    vendor VARCHAR(100) NOT NULL,
-- Create vendor_details table
CREATE TABLE vendor_details (
    vendor VARCHAR(100) PRIMARY KEY,
    contact_name VARCHAR(100),
    email VARCHAR(100),
    phone VARCHAR(20),
    email VARCHAR(255),
    phone VARCHAR(50),
    status VARCHAR(20) DEFAULT 'active',
    notes TEXT,
    created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    PRIMARY KEY (vendor),
    INDEX idx_vendor_status (status)
);
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    INDEX idx_status (status)
) ENGINE=InnoDB;

-- New table for vendor metrics
CREATE TABLE IF NOT EXISTS vendor_metrics (
@@ -152,7 +150,7 @@ CREATE TABLE IF NOT EXISTS category_metrics (
    product_count INT DEFAULT 0,
    active_products INT DEFAULT 0,
    -- Financial metrics
    total_value DECIMAL(10,3) DEFAULT 0,
    total_value DECIMAL(15,3) DEFAULT 0,
    avg_margin DECIMAL(5,2),
    turnover_rate DECIMAL(12,3),
    growth_rate DECIMAL(5,2),
@@ -193,8 +191,8 @@ CREATE TABLE IF NOT EXISTS category_time_metrics (
    product_count INT DEFAULT 0,
    active_products INT DEFAULT 0,
    -- Financial metrics
    total_value DECIMAL(10,3) DEFAULT 0,
    total_revenue DECIMAL(10,3) DEFAULT 0,
    total_value DECIMAL(15,3) DEFAULT 0,
    total_revenue DECIMAL(15,3) DEFAULT 0,
    avg_margin DECIMAL(5,2),
    turnover_rate DECIMAL(12,3),
    PRIMARY KEY (category_id, year, month),
@@ -228,10 +226,10 @@ CREATE TABLE IF NOT EXISTS brand_metrics (
    active_products INT DEFAULT 0,
    -- Stock metrics
    total_stock_units INT DEFAULT 0,
    total_stock_cost DECIMAL(10,2) DEFAULT 0,
    total_stock_retail DECIMAL(10,2) DEFAULT 0,
    total_stock_cost DECIMAL(15,2) DEFAULT 0,
    total_stock_retail DECIMAL(15,2) DEFAULT 0,
    -- Sales metrics
    total_revenue DECIMAL(10,2) DEFAULT 0,
    total_revenue DECIMAL(15,2) DEFAULT 0,
    avg_margin DECIMAL(5,2) DEFAULT 0,
    growth_rate DECIMAL(5,2) DEFAULT 0,
    PRIMARY KEY (brand),
@@ -250,10 +248,10 @@ CREATE TABLE IF NOT EXISTS brand_time_metrics (
    active_products INT DEFAULT 0,
    -- Stock metrics
    total_stock_units INT DEFAULT 0,
    total_stock_cost DECIMAL(10,2) DEFAULT 0,
    total_stock_retail DECIMAL(10,2) DEFAULT 0,
    total_stock_cost DECIMAL(15,2) DEFAULT 0,
    total_stock_retail DECIMAL(15,2) DEFAULT 0,
    -- Sales metrics
    total_revenue DECIMAL(10,2) DEFAULT 0,
    total_revenue DECIMAL(15,2) DEFAULT 0,
    avg_margin DECIMAL(5,2) DEFAULT 0,
    PRIMARY KEY (brand, year, month),
    INDEX idx_brand_date (year, month)
@@ -287,26 +285,6 @@ CREATE TABLE IF NOT EXISTS category_forecasts (
    INDEX idx_category_forecast_last_calculated (last_calculated_at)
);

-- Create table for sales seasonality factors
CREATE TABLE IF NOT EXISTS sales_seasonality (
    month INT NOT NULL,
    seasonality_factor DECIMAL(5,3) DEFAULT 0,
    last_updated TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (month),
    CHECK (month BETWEEN 1 AND 12),
    CHECK (seasonality_factor BETWEEN -1.0 AND 1.0)
);

-- Insert default seasonality factors (neutral)
INSERT INTO sales_seasonality (month, seasonality_factor)
VALUES
    (1, 0), (2, 0), (3, 0), (4, 0), (5, 0), (6, 0),
    (7, 0), (8, 0), (9, 0), (10, 0), (11, 0), (12, 0)
ON DUPLICATE KEY UPDATE last_updated = CURRENT_TIMESTAMP;

-- Re-enable foreign key checks
SET FOREIGN_KEY_CHECKS = 1;

-- Create view for inventory health
CREATE OR REPLACE VIEW inventory_health AS
WITH product_thresholds AS (
@@ -428,3 +406,6 @@ LEFT JOIN
    categories p ON c.parent_id = p.cat_id
LEFT JOIN
    category_metrics cm ON c.cat_id = cm.category_id;

-- Re-enable foreign key checks
SET FOREIGN_KEY_CHECKS = 1;
@@ -39,7 +39,7 @@ CREATE TABLE products (
    tags TEXT,
    moq INT DEFAULT 1,
    uom INT DEFAULT 1,
    rating TINYINT UNSIGNED DEFAULT 0,
    rating DECIMAL(10,2) DEFAULT 0.00,
    reviews INT UNSIGNED DEFAULT 0,
    weight DECIMAL(10,3),
    length DECIMAL(10,3),
@@ -51,13 +51,15 @@ CREATE TABLE products (
    baskets INT UNSIGNED DEFAULT 0,
    notifies INT UNSIGNED DEFAULT 0,
    date_last_sold DATE,
    updated TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    PRIMARY KEY (pid),
    UNIQUE KEY unique_sku (SKU),
    INDEX idx_sku (SKU),
    INDEX idx_vendor (vendor),
    INDEX idx_brand (brand),
    INDEX idx_location (location),
    INDEX idx_total_sold (total_sold),
    INDEX idx_date_last_sold (date_last_sold)
    INDEX idx_date_last_sold (date_last_sold),
    INDEX idx_updated (updated)
) ENGINE=InnoDB;

-- Create categories table with hierarchy support
@@ -77,18 +79,6 @@ CREATE TABLE categories (
    INDEX idx_name_type (name, type)
) ENGINE=InnoDB;

-- Create vendor_details table
CREATE TABLE vendor_details (
    vendor VARCHAR(100) PRIMARY KEY,
    contact_name VARCHAR(100),
    email VARCHAR(255),
    phone VARCHAR(50),
    status VARCHAR(20) DEFAULT 'active',
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    INDEX idx_status (status)
) ENGINE=InnoDB;

-- Create product_categories junction table
CREATE TABLE product_categories (
    cat_id BIGINT NOT NULL,
@@ -113,16 +103,21 @@ CREATE TABLE IF NOT EXISTS orders (
    tax DECIMAL(10,3) DEFAULT 0.000,
    tax_included TINYINT(1) DEFAULT 0,
    shipping DECIMAL(10,3) DEFAULT 0.000,
    costeach DECIMAL(10,3) DEFAULT 0.000,
    customer VARCHAR(50) NOT NULL,
    customer_name VARCHAR(100),
    status VARCHAR(20) DEFAULT 'pending',
    canceled TINYINT(1) DEFAULT 0,
    updated TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    PRIMARY KEY (id),
    UNIQUE KEY unique_order_line (order_number, pid),
    KEY order_number (order_number),
    KEY pid (pid),
    KEY customer (customer),
    KEY date (date),
    KEY status (status)
    KEY status (status),
    INDEX idx_orders_metrics (pid, date, canceled),
    INDEX idx_updated (updated)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;

|
||||
@@ -134,7 +129,9 @@ CREATE TABLE purchase_orders (
|
||||
expected_date DATE,
|
||||
pid BIGINT NOT NULL,
|
||||
sku VARCHAR(50) NOT NULL,
|
||||
name VARCHAR(100) NOT NULL COMMENT 'Product name from products.description',
|
||||
cost_price DECIMAL(10, 3) NOT NULL,
|
||||
po_cost_price DECIMAL(10, 3) NOT NULL COMMENT 'Original cost from PO, before receiving adjustments',
|
||||
status TINYINT UNSIGNED DEFAULT 1 COMMENT '0=canceled,1=created,10=electronically_ready_send,11=ordered,12=preordered,13=electronically_sent,15=receiving_started,50=done',
|
||||
receiving_status TINYINT UNSIGNED DEFAULT 1 COMMENT '0=canceled,1=created,30=partial_received,40=full_received,50=paid',
|
||||
notes TEXT,
|
||||
@@ -143,17 +140,19 @@ CREATE TABLE purchase_orders (
|
||||
received INT DEFAULT 0,
|
||||
received_date DATE COMMENT 'Date of first receiving',
|
||||
last_received_date DATE COMMENT 'Date of most recent receiving',
|
||||
received_by INT,
|
||||
received_by VARCHAR(100) COMMENT 'Name of person who first received this PO line',
|
||||
receiving_history JSON COMMENT 'Array of receiving records with qty, date, cost, receiving_id, and alt_po flag',
|
||||
updated TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
FOREIGN KEY (pid) REFERENCES products(pid),
|
||||
FOREIGN KEY (sku) REFERENCES products(SKU),
|
||||
INDEX idx_po_id (po_id),
|
||||
INDEX idx_vendor (vendor),
|
||||
INDEX idx_status (status),
|
||||
INDEX idx_receiving_status (receiving_status),
|
||||
INDEX idx_purchase_orders_metrics (pid, date, status, ordered, received),
|
||||
INDEX idx_po_metrics (pid, date, receiving_status, received_date),
|
||||
INDEX idx_po_product_date (pid, date),
|
||||
INDEX idx_po_product_status (pid, status),
|
||||
INDEX idx_updated (updated),
|
||||
UNIQUE KEY unique_po_product (po_id, pid)
|
||||
) ENGINE=InnoDB;
|
||||
|
||||
|
||||
File diff suppressed because it is too large
Load Diff
@@ -5,6 +5,16 @@ process.chdir(path.dirname(__filename));

require('dotenv').config({ path: path.resolve(__dirname, '..', '.env') });

// Configuration flags for controlling which metrics to calculate
// Set to 1 to skip the corresponding calculation, 0 to run it
const SKIP_PRODUCT_METRICS = 0;
const SKIP_TIME_AGGREGATES = 0;
const SKIP_FINANCIAL_METRICS = 0;
const SKIP_VENDOR_METRICS = 0;
const SKIP_CATEGORY_METRICS = 0;
const SKIP_BRAND_METRICS = 0;
const SKIP_SALES_FORECASTS = 0;

// Add error handler for uncaught exceptions
process.on('uncaughtException', (error) => {
    console.error('Uncaught Exception:', error);
@@ -34,6 +44,34 @@ global.clearProgress = progress.clearProgress;
global.getProgress = progress.getProgress;
global.logError = progress.logError;

// List of temporary tables used in the calculation process
const TEMP_TABLES = [
    'temp_revenue_ranks',
    'temp_sales_metrics',
    'temp_purchase_metrics',
    'temp_product_metrics',
    'temp_vendor_metrics',
    'temp_category_metrics',
    'temp_brand_metrics',
    'temp_forecast_dates',
    'temp_daily_sales',
    'temp_product_stats',
    'temp_category_sales',
    'temp_category_stats'
];

// Add cleanup function for temporary tables
async function cleanupTemporaryTables(connection) {
    try {
        for (const table of TEMP_TABLES) {
            await connection.query(`DROP TEMPORARY TABLE IF EXISTS ${table}`);
        }
    } catch (error) {
        logError(error, 'Error cleaning up temporary tables');
        throw error; // Re-throw to be handled by the caller
    }
}

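The cleanup loop above can be exercised without a database by passing a stub connection; the shortened table list and the stub are illustrative only, not the script's real pool:

```typescript
// Shortened table list for the sketch; the real script defines twelve.
const TEMP_TABLES = ["temp_sales_metrics", "temp_product_metrics"];

async function cleanupTemporaryTables(
  connection: { query(sql: string): Promise<void> }
): Promise<void> {
  for (const table of TEMP_TABLES) {
    await connection.query(`DROP TEMPORARY TABLE IF EXISTS ${table}`);
  }
}

// Stub connection that records the SQL it receives instead of executing it.
const issued: string[] = [];
const stub = { query: async (sql: string) => { issued.push(sql); } };

cleanupTemporaryTables(stub).then(() => {
  console.log(issued); // one DROP statement per temp table
});
```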
const { getConnection, closePool } = require('./metrics/utils/db');
const calculateProductMetrics = require('./metrics/product-metrics');
const calculateTimeAggregates = require('./metrics/time-aggregates');
@@ -43,9 +81,6 @@ const calculateCategoryMetrics = require('./metrics/category-metrics');
const calculateBrandMetrics = require('./metrics/brand-metrics');
const calculateSalesForecasts = require('./metrics/sales-forecasts');

// Set to 1 to skip product metrics and only calculate the remaining metrics
const SKIP_PRODUCT_METRICS = 1;

// Add cancel handler
let isCancelled = false;

@@ -76,10 +111,78 @@ process.on('SIGTERM', cancelCalculation);
async function calculateMetrics() {
    let connection;
    const startTime = Date.now();
    let processedCount = 0;
    let processedProducts = 0;
    let processedOrders = 0;
    let processedPurchaseOrders = 0;
    let totalProducts = 0;
    let totalOrders = 0;
    let totalPurchaseOrders = 0;
    let calculateHistoryId;

    try {
        // Clean up any previously running calculations
        connection = await getConnection();
        await connection.query(`
            UPDATE calculate_history
            SET
                status = 'cancelled',
                end_time = NOW(),
                duration_seconds = TIMESTAMPDIFF(SECOND, start_time, NOW()),
                error_message = 'Previous calculation was not completed properly'
            WHERE status = 'running'
        `);

        // Get counts from all relevant tables
        const [[productCount], [orderCount], [poCount]] = await Promise.all([
            connection.query('SELECT COUNT(*) as total FROM products'),
            connection.query('SELECT COUNT(*) as total FROM orders'),
            connection.query('SELECT COUNT(*) as total FROM purchase_orders')
        ]);

        totalProducts = productCount.total;
        totalOrders = orderCount.total;
        totalPurchaseOrders = poCount.total;

        // Create history record for this calculation
        const [historyResult] = await connection.query(`
            INSERT INTO calculate_history (
                start_time,
                status,
                total_products,
                total_orders,
                total_purchase_orders,
                additional_info
            ) VALUES (
                NOW(),
                'running',
                ?,
                ?,
                ?,
                JSON_OBJECT(
                    'skip_product_metrics', ?,
                    'skip_time_aggregates', ?,
                    'skip_financial_metrics', ?,
                    'skip_vendor_metrics', ?,
                    'skip_category_metrics', ?,
                    'skip_brand_metrics', ?,
                    'skip_sales_forecasts', ?
                )
            )
        `, [
            totalProducts,
            totalOrders,
            totalPurchaseOrders,
            SKIP_PRODUCT_METRICS,
            SKIP_TIME_AGGREGATES,
            SKIP_FINANCIAL_METRICS,
            SKIP_VENDOR_METRICS,
            SKIP_CATEGORY_METRICS,
            SKIP_BRAND_METRICS,
            SKIP_SALES_FORECASTS
        ]);
        calculateHistoryId = historyResult.insertId;
        connection.release();

        // Add debug logging for the progress functions
        console.log('Debug - Progress functions:', {
            formatElapsedTime: typeof global.formatElapsedTime,
@@ -108,135 +211,493 @@ async function calculateMetrics() {
|
||||
elapsed: '0s',
|
||||
remaining: 'Calculating...',
|
||||
rate: 0,
|
||||
percentage: '0'
|
||||
percentage: '0',
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
// Get total number of products
|
||||
const [countResult] = await connection.query('SELECT COUNT(*) as total FROM products')
|
||||
.catch(err => {
|
||||
global.logError(err, 'Failed to count products');
|
||||
throw err;
|
||||
});
|
||||
totalProducts = countResult[0].total;
|
||||
// Update progress periodically
|
||||
const updateProgress = async (products = null, orders = null, purchaseOrders = null) => {
|
||||
// Ensure all values are valid numbers or default to previous value
|
||||
if (products !== null) processedProducts = Number(products) || processedProducts || 0;
|
||||
if (orders !== null) processedOrders = Number(orders) || processedOrders || 0;
|
||||
if (purchaseOrders !== null) processedPurchaseOrders = Number(purchaseOrders) || processedPurchaseOrders || 0;
|
||||
|
||||
// Ensure we never send NaN to the database
|
||||
const safeProducts = Number(processedProducts) || 0;
|
||||
const safeOrders = Number(processedOrders) || 0;
|
||||
const safePurchaseOrders = Number(processedPurchaseOrders) || 0;
|
||||
|
||||
await connection.query(`
|
||||
UPDATE calculate_history
|
||||
SET
|
||||
processed_products = ?,
|
||||
processed_orders = ?,
|
||||
processed_purchase_orders = ?
|
||||
WHERE id = ?
|
||||
`, [safeProducts, safeOrders, safePurchaseOrders, calculateHistoryId]);
|
||||
};
|
||||
|
||||
// Helper function to ensure valid progress numbers
|
||||
const ensureValidProgress = (current, total) => ({
|
||||
current: Number(current) || 0,
|
||||
total: Number(total) || 1, // Default to 1 to avoid division by zero
|
||||
percentage: (((Number(current) || 0) / (Number(total) || 1)) * 100).toFixed(1)
|
||||
});
|
||||
|
||||
// Initial progress
|
||||
const initialProgress = ensureValidProgress(0, totalProducts);
|
||||
global.outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Starting metrics calculation',
|
||||
current: initialProgress.current,
|
||||
total: initialProgress.total,
|
||||
elapsed: '0s',
|
||||
remaining: 'Calculating...',
|
||||
rate: 0,
|
||||
percentage: initialProgress.percentage,
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
|
||||
if (!SKIP_PRODUCT_METRICS) {
|
||||
processedCount = await calculateProductMetrics(startTime, totalProducts);
|
||||
const result = await calculateProductMetrics(startTime, totalProducts);
|
||||
await updateProgress(result.processedProducts, result.processedOrders, result.processedPurchaseOrders);
|
||||
if (!result.success) {
|
||||
throw new Error('Product metrics calculation failed');
|
||||
}
|
||||
} else {
|
||||
console.log('Skipping product metrics calculation...');
|
||||
processedCount = Math.floor(totalProducts * 0.6);
|
||||
processedProducts = Math.floor(totalProducts * 0.6);
|
||||
await updateProgress(processedProducts);
|
||||
global.outputProgress({
|
||||
status: 'running',
|
||||
operation: 'Skipping product metrics calculation',
|
||||
current: processedCount,
|
||||
current: processedProducts,
|
||||
total: totalProducts,
|
||||
elapsed: global.formatElapsedTime(startTime),
|
||||
remaining: global.estimateRemaining(startTime, processedCount, totalProducts),
|
||||
rate: global.calculateRate(startTime, processedCount),
|
||||
percentage: '60'
|
||||
remaining: global.estimateRemaining(startTime, processedProducts, totalProducts),
|
||||
rate: global.calculateRate(startTime, processedProducts),
|
||||
percentage: '60',
|
||||
timing: {
|
||||
start_time: new Date(startTime).toISOString(),
|
||||
end_time: new Date().toISOString(),
|
||||
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// Calculate time-based aggregates
|
||||
processedCount = await calculateTimeAggregates(startTime, totalProducts, processedCount);
|
||||
if (!SKIP_TIME_AGGREGATES) {
|
||||
const result = await calculateTimeAggregates(startTime, totalProducts, processedProducts);
|
||||
await updateProgress(result.processedProducts, result.processedOrders, result.processedPurchaseOrders);
|
||||
if (!result.success) {
|
||||
throw new Error('Time aggregates calculation failed');
|
||||
}
|
||||
} else {
|
||||
console.log('Skipping time aggregates calculation');
|
||||
}
|
||||
|
||||
// Calculate financial metrics
|
||||
processedCount = await calculateFinancialMetrics(startTime, totalProducts, processedCount);
|
||||
if (!SKIP_FINANCIAL_METRICS) {
|
||||
const result = await calculateFinancialMetrics(startTime, totalProducts, processedProducts);
|
||||
await updateProgress(result.processedProducts, result.processedOrders, result.processedPurchaseOrders);
|
||||
if (!result.success) {
|
||||
throw new Error('Financial metrics calculation failed');
|
||||
}
|
||||
} else {
|
||||
console.log('Skipping financial metrics calculation');
|
||||
}
|
||||
|
||||
    // Calculate vendor metrics
    if (!SKIP_VENDOR_METRICS) {
      const result = await calculateVendorMetrics(startTime, totalProducts, processedProducts);
      await updateProgress(result.processedProducts, result.processedOrders, result.processedPurchaseOrders);
      if (!result.success) {
        throw new Error('Vendor metrics calculation failed');
      }
    } else {
      console.log('Skipping vendor metrics calculation');
    }

    // Calculate category metrics
    if (!SKIP_CATEGORY_METRICS) {
      const result = await calculateCategoryMetrics(startTime, totalProducts, processedProducts);
      await updateProgress(result.processedProducts, result.processedOrders, result.processedPurchaseOrders);
      if (!result.success) {
        throw new Error('Category metrics calculation failed');
      }
    } else {
      console.log('Skipping category metrics calculation');
    }

    // Calculate brand metrics
    if (!SKIP_BRAND_METRICS) {
      const result = await calculateBrandMetrics(startTime, totalProducts, processedProducts);
      await updateProgress(result.processedProducts, result.processedOrders, result.processedPurchaseOrders);
      if (!result.success) {
        throw new Error('Brand metrics calculation failed');
      }
    } else {
      console.log('Skipping brand metrics calculation');
    }

    // Calculate sales forecasts
    if (!SKIP_SALES_FORECASTS) {
      const result = await calculateSalesForecasts(startTime, totalProducts, processedProducts);
      await updateProgress(result.processedProducts, result.processedOrders, result.processedPurchaseOrders);
      if (!result.success) {
        throw new Error('Sales forecasts calculation failed');
      }
    } else {
      console.log('Skipping sales forecasts calculation');
    }

    // Calculate ABC classification
    outputProgress({
      status: 'running',
      operation: 'Starting ABC classification',
      current: processedProducts || 0,
      total: totalProducts || 0,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
      rate: calculateRate(startTime, processedProducts || 0),
      percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    if (isCancelled) return {
      processedProducts: processedProducts || 0,
      processedOrders: processedOrders || 0,
      processedPurchaseOrders: 0,
      success: false
    };

    const [abcConfig] = await connection.query('SELECT a_threshold, b_threshold FROM abc_classification_config WHERE id = 1');
    const abcThresholds = abcConfig[0] || { a_threshold: 20, b_threshold: 50 };

    // First, create and populate the rankings table with an index
    await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');
    await connection.query(`
      CREATE TEMPORARY TABLE temp_revenue_ranks (
        pid BIGINT NOT NULL,
        total_revenue DECIMAL(10,3),
        rank_num INT,
        total_count INT,
        PRIMARY KEY (pid),
        INDEX (rank_num)
      ) ENGINE=MEMORY
    `);

    outputProgress({
      status: 'running',
      operation: 'Creating revenue rankings',
      current: processedProducts || 0,
      total: totalProducts || 0,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
      rate: calculateRate(startTime, processedProducts || 0),
      percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    if (isCancelled) return {
      processedProducts: processedProducts || 0,
      processedOrders: processedOrders || 0,
      processedPurchaseOrders: 0,
      success: false
    };

    await connection.query(`
      INSERT INTO temp_revenue_ranks
      SELECT
        pid,
        total_revenue,
        @rank := @rank + 1 as rank_num,
        @total_count := @rank as total_count
      FROM (
        SELECT pid, total_revenue
        FROM product_metrics
        WHERE total_revenue > 0
        ORDER BY total_revenue DESC
      ) ranked,
      (SELECT @rank := 0) r
    `);

    // Get total count for percentage calculation
    const [rankingCount] = await connection.query('SELECT MAX(rank_num) as total_count FROM temp_revenue_ranks');
    const totalCount = rankingCount[0].total_count || 1;
    const max_rank = totalCount; // Store max_rank for use in classification

    outputProgress({
      status: 'running',
      operation: 'Updating ABC classifications',
      current: processedProducts || 0,
      total: totalProducts || 0,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, processedProducts || 0, totalProducts || 0),
      rate: calculateRate(startTime, processedProducts || 0),
      percentage: (((processedProducts || 0) / (totalProducts || 1)) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    if (isCancelled) return {
      processedProducts: processedProducts || 0,
      processedOrders: processedOrders || 0,
      processedPurchaseOrders: 0,
      success: false
    };

    // ABC classification progress tracking
    let abcProcessedCount = 0;
    const batchSize = 5000;
    let lastProgressUpdate = Date.now();
    const progressUpdateInterval = 1000; // Update every second

    while (true) {
      if (isCancelled) return {
        processedProducts: Number(processedProducts) || 0,
        processedOrders: Number(processedOrders) || 0,
        processedPurchaseOrders: 0,
        success: false
      };

      // First get a batch of PIDs that need updating
      const [pids] = await connection.query(`
        SELECT pm.pid
        FROM product_metrics pm
        LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
        WHERE pm.abc_class IS NULL
          OR pm.abc_class !=
            CASE
              WHEN tr.rank_num IS NULL THEN 'C'
              WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
              WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
              ELSE 'C'
            END
        LIMIT ?
      `, [max_rank, abcThresholds.a_threshold,
          max_rank, abcThresholds.b_threshold,
          batchSize]);

      if (pids.length === 0) {
        break;
      }

      // Then update just those PIDs
      const [result] = await connection.query(`
        UPDATE product_metrics pm
        LEFT JOIN temp_revenue_ranks tr ON pm.pid = tr.pid
        SET pm.abc_class =
          CASE
            WHEN tr.rank_num IS NULL THEN 'C'
            WHEN (tr.rank_num / ?) * 100 <= ? THEN 'A'
            WHEN (tr.rank_num / ?) * 100 <= ? THEN 'B'
            ELSE 'C'
          END,
          pm.last_calculated_at = NOW()
        WHERE pm.pid IN (?)
      `, [max_rank, abcThresholds.a_threshold,
          max_rank, abcThresholds.b_threshold,
          pids.map(row => row.pid)]);

      abcProcessedCount += result.affectedRows;

      // Calculate progress ensuring valid numbers
      const currentProgress = Math.floor(totalProducts * (0.99 + (abcProcessedCount / (totalCount || 1)) * 0.01));
      processedProducts = Number(currentProgress) || processedProducts || 0;

      // Only update progress at most once per second
      const now = Date.now();
      if (now - lastProgressUpdate >= progressUpdateInterval) {
        const progress = ensureValidProgress(processedProducts, totalProducts);

        outputProgress({
          status: 'running',
          operation: 'ABC classification progress',
          current: progress.current,
          total: progress.total,
          elapsed: formatElapsedTime(startTime),
          remaining: estimateRemaining(startTime, progress.current, progress.total),
          rate: calculateRate(startTime, progress.current),
          percentage: progress.percentage,
          timing: {
            start_time: new Date(startTime).toISOString(),
            end_time: new Date().toISOString(),
            elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
          }
        });

        lastProgressUpdate = now;
      }

      // Update database progress
      await updateProgress(processedProducts, processedOrders, processedPurchaseOrders);

      // Small delay between batches to allow other transactions
      await new Promise(resolve => setTimeout(resolve, 100));
    }

    // Clean up
    await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_revenue_ranks');

    const endTime = Date.now();
    const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);

    // Update calculate_status for ABC classification
    await connection.query(`
      INSERT INTO calculate_status (module_name, last_calculation_timestamp)
      VALUES ('abc_classification', NOW())
      ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
    `);

    // Final progress update with guaranteed valid numbers
    const finalProgress = ensureValidProgress(totalProducts, totalProducts);

    // Final success message
    outputProgress({
      status: 'complete',
      operation: 'Metrics calculation complete',
      current: finalProgress.current,
      total: finalProgress.total,
      elapsed: formatElapsedTime(startTime),
      remaining: '0s',
      rate: calculateRate(startTime, finalProgress.current),
      percentage: '100',
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: totalElapsedSeconds
      }
    });

    // Ensure all values are valid numbers before final update
    const finalStats = {
      processedProducts: Number(processedProducts) || 0,
      processedOrders: Number(processedOrders) || 0,
      processedPurchaseOrders: Number(processedPurchaseOrders) || 0
    };

    // Update history with completion
    await connection.query(`
      UPDATE calculate_history
      SET
        end_time = NOW(),
        duration_seconds = ?,
        processed_products = ?,
        processed_orders = ?,
        processed_purchase_orders = ?,
        status = 'completed'
      WHERE id = ?
    `, [totalElapsedSeconds,
        finalStats.processedProducts,
        finalStats.processedOrders,
        finalStats.processedPurchaseOrders,
        calculateHistoryId]);

    // Clear progress file on successful completion
    global.clearProgress();

  } catch (error) {
    const endTime = Date.now();
    const totalElapsedSeconds = Math.round((endTime - startTime) / 1000);

    // Update history with error
    await connection.query(`
      UPDATE calculate_history
      SET
        end_time = NOW(),
        duration_seconds = ?,
        processed_products = ?,
        processed_orders = ?,
        processed_purchase_orders = ?,
        status = ?,
        error_message = ?
      WHERE id = ?
    `, [
      totalElapsedSeconds,
      processedProducts || 0, // Ensure we have a valid number
      processedOrders || 0, // Ensure we have a valid number
      processedPurchaseOrders || 0, // Ensure we have a valid number
      isCancelled ? 'cancelled' : 'failed',
      error.message,
      calculateHistoryId
    ]);

    if (isCancelled) {
      global.outputProgress({
        status: 'cancelled',
        operation: 'Calculation cancelled',
        current: processedProducts,
        total: totalProducts || 0,
        elapsed: global.formatElapsedTime(startTime),
        remaining: null,
        rate: global.calculateRate(startTime, processedProducts),
        percentage: ((processedProducts / (totalProducts || 1)) * 100).toFixed(1),
        timing: {
          start_time: new Date(startTime).toISOString(),
          end_time: new Date().toISOString(),
          elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
        }
      });
    } else {
      global.outputProgress({
        status: 'error',
        operation: 'Error: ' + error.message,
        current: processedProducts,
        total: totalProducts || 0,
        elapsed: global.formatElapsedTime(startTime),
        remaining: null,
        rate: global.calculateRate(startTime, processedProducts),
        percentage: ((processedProducts / (totalProducts || 1)) * 100).toFixed(1),
        timing: {
          start_time: new Date(startTime).toISOString(),
          end_time: new Date().toISOString(),
          elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
        }
      });
    }
    throw error;
  } finally {
    if (connection) {
      // Ensure temporary tables are cleaned up
      await cleanupTemporaryTables(connection);
      connection.release();
    }
    // Close the connection pool when we're done
    await closePool();
  }
  } catch (error) {
    success = false;
    logError(error, 'Error in metrics calculation');
    throw error;
  }
}
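The rank-based CASE used for the ABC classification above can be restated in plain JavaScript. This is a hedged sketch, not code from the diff: `abcClass` is a hypothetical helper, and the defaults mirror the fallback thresholds (`a_threshold: 20`, `b_threshold: 50`) used when `abc_classification_config` has no row.

```javascript
// A product's rank divided by the total ranked count gives its revenue
// percentile; the thresholds then cut the list into A/B/C classes.
function abcClass(rankNum, totalCount, aThreshold = 20, bThreshold = 50) {
  if (rankNum == null) return 'C'; // unranked (zero-revenue) products fall to C
  const percentile = (rankNum / totalCount) * 100;
  if (percentile <= aThreshold) return 'A';
  if (percentile <= bThreshold) return 'B';
  return 'C';
}

console.log(abcClass(10, 100));   // top 10% → 'A'
console.log(abcClass(40, 100));   // 40th percentile → 'B'
console.log(abcClass(null, 100)); // not in temp_revenue_ranks → 'C'
```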
// Export as a module with all necessary functions
module.exports = {
  calculateMetrics,
  cancelCalculation,
  getProgress: global.getProgress
};

// Run directly if called from command line
if (require.main === module) {

inventory-server/scripts/full-reset.js (new file, 107 lines)
@@ -0,0 +1,107 @@
const path = require('path');
const { spawn } = require('child_process');

function outputProgress(data) {
  if (!data.status) {
    data = {
      status: 'running',
      ...data
    };
  }
  console.log(JSON.stringify(data));
}

function runScript(scriptPath) {
  return new Promise((resolve, reject) => {
    const child = spawn('node', [scriptPath], {
      stdio: ['inherit', 'pipe', 'pipe']
    });

    let output = '';

    child.stdout.on('data', (data) => {
      const lines = data.toString().split('\n');
      lines.filter(line => line.trim()).forEach(line => {
        console.log(line); // Pass through the JSON (or plain) output
        output += line + '\n';
      });
    });

    child.stderr.on('data', (data) => {
      console.error(data.toString());
    });

    child.on('close', (code) => {
      if (code !== 0) {
        reject(new Error(`Script ${scriptPath} exited with code ${code}`));
      } else {
        resolve(output);
      }
    });

    child.on('error', (err) => {
      reject(err);
    });
  });
}

async function fullReset() {
  try {
    // Step 1: Reset Database
    outputProgress({
      operation: 'Starting full reset',
      message: 'Step 1/3: Resetting database...'
    });
    await runScript(path.join(__dirname, 'reset-db.js'));
    outputProgress({
      status: 'complete',
      operation: 'Database reset step complete',
      message: 'Database reset finished, moving to import...'
    });

    // Step 2: Import from Production
    outputProgress({
      operation: 'Starting import',
      message: 'Step 2/3: Importing from production...'
    });
    await runScript(path.join(__dirname, 'import-from-prod.js'));
    outputProgress({
      status: 'complete',
      operation: 'Import step complete',
      message: 'Import finished, moving to metrics calculation...'
    });

    // Step 3: Calculate Metrics
    outputProgress({
      operation: 'Starting metrics calculation',
      message: 'Step 3/3: Calculating metrics...'
    });
    await runScript(path.join(__dirname, 'calculate-metrics.js'));

    // Final completion message
    outputProgress({
      status: 'complete',
      operation: 'Full reset complete',
      message: 'Successfully completed all steps: database reset, import, and metrics calculation'
    });
  } catch (error) {
    outputProgress({
      status: 'error',
      operation: 'Full reset failed',
      error: error.message,
      stack: error.stack
    });
    process.exit(1);
  }
}

// Run if called directly
if (require.main === module) {
  fullReset();
}

module.exports = fullReset;

inventory-server/scripts/full-update.js (new file, 100 lines)
@@ -0,0 +1,100 @@
const path = require('path');
const { spawn } = require('child_process');

function outputProgress(data) {
  if (!data.status) {
    data = {
      status: 'running',
      ...data
    };
  }
  console.log(JSON.stringify(data));
}

function runScript(scriptPath) {
  return new Promise((resolve, reject) => {
    const child = spawn('node', [scriptPath], {
      stdio: ['inherit', 'pipe', 'pipe']
    });

    let output = '';

    child.stdout.on('data', (data) => {
      const lines = data.toString().split('\n');
      lines.filter(line => line.trim()).forEach(line => {
        console.log(line); // Pass through the JSON (or plain) output
        output += line + '\n';
      });
    });

    child.stderr.on('data', (data) => {
      console.error(data.toString());
    });

    child.on('close', (code) => {
      if (code !== 0) {
        reject(new Error(`Script ${scriptPath} exited with code ${code}`));
      } else {
        resolve(output);
      }
    });

    child.on('error', (err) => {
      reject(err);
    });
  });
}

async function fullUpdate() {
  try {
    // Step 1: Import from Production
    outputProgress({
      operation: 'Starting full update',
      message: 'Step 1/2: Importing from production...'
    });
    await runScript(path.join(__dirname, 'import-from-prod.js'));
    outputProgress({
      status: 'complete',
      operation: 'Import step complete',
      message: 'Import finished, moving to metrics calculation...'
    });

    // Step 2: Calculate Metrics
    outputProgress({
      operation: 'Starting metrics calculation',
      message: 'Step 2/2: Calculating metrics...'
    });
    await runScript(path.join(__dirname, 'calculate-metrics.js'));
    outputProgress({
      status: 'complete',
      operation: 'Metrics step complete',
      message: 'Metrics calculation finished'
    });

    // Final completion message
    outputProgress({
      status: 'complete',
      operation: 'Full update complete',
      message: 'Successfully completed all steps: import and metrics calculation'
    });
  } catch (error) {
    outputProgress({
      status: 'error',
      operation: 'Full update failed',
      error: error.message,
      stack: error.stack
    });
    process.exit(1);
  }
}

// Run if called directly
if (require.main === module) {
  fullUpdate();
}

module.exports = fullUpdate;

(File diff suppressed because it is too large)

inventory-server/scripts/import/categories.js (new file, 182 lines)
@@ -0,0 +1,182 @@
const { outputProgress, formatElapsedTime } = require('../metrics/utils/progress');

async function importCategories(prodConnection, localConnection) {
  outputProgress({
    operation: "Starting categories import",
    status: "running",
  });

  const startTime = Date.now();
  const typeOrder = [10, 20, 11, 21, 12, 13];
  let totalInserted = 0;
  let skippedCategories = [];

  try {
    // Process each type in order with its own query
    for (const type of typeOrder) {
      const [categories] = await prodConnection.query(
        `
        SELECT
          pc.cat_id,
          pc.name,
          pc.type,
          CASE
            WHEN pc.type IN (10, 20) THEN NULL -- Top-level categories should have no parent
            WHEN pc.master_cat_id IS NULL THEN NULL
            ELSE pc.master_cat_id
          END as parent_id,
          pc.combined_name as description
        FROM product_categories pc
        WHERE pc.type = ?
        ORDER BY pc.cat_id
        `,
        [type]
      );

      if (categories.length === 0) continue;

      console.log(`\nProcessing ${categories.length} type ${type} categories`);
      if (type === 10) {
        console.log("Type 10 categories:", JSON.stringify(categories, null, 2));
      }

      // For types that can have parents (11, 21, 12, 13), verify parent existence
      let categoriesToInsert = categories;
      if (![10, 20].includes(type)) {
        // Get all parent IDs
        const parentIds = [
          ...new Set(
            categories.map((c) => c.parent_id).filter((id) => id !== null)
          ),
        ];

        // Check which parents exist
        const [existingParents] = await localConnection.query(
          "SELECT cat_id FROM categories WHERE cat_id IN (?)",
          [parentIds]
        );
        const existingParentIds = new Set(existingParents.map((p) => p.cat_id));

        // Filter categories and track skipped ones
        categoriesToInsert = categories.filter(
          (cat) =>
            cat.parent_id === null || existingParentIds.has(cat.parent_id)
        );
        const invalidCategories = categories.filter(
          (cat) =>
            cat.parent_id !== null && !existingParentIds.has(cat.parent_id)
        );

        if (invalidCategories.length > 0) {
          const skippedInfo = invalidCategories.map((c) => ({
            id: c.cat_id,
            name: c.name,
            type: c.type,
            missing_parent: c.parent_id,
          }));
          skippedCategories.push(...skippedInfo);

          console.log(
            "\nSkipping categories with missing parents:",
            invalidCategories
              .map(
                (c) =>
                  `${c.cat_id} - ${c.name} (missing parent: ${c.parent_id})`
              )
              .join("\n")
          );
        }

        if (categoriesToInsert.length === 0) {
          console.log(
            `No valid categories of type ${type} to insert - all had missing parents`
          );
          continue;
        }
      }

      console.log(
        `Inserting ${categoriesToInsert.length} type ${type} categories`
      );

      const placeholders = categoriesToInsert
        .map(() => "(?, ?, ?, ?, ?, ?, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP)")
        .join(",");

      const values = categoriesToInsert.flatMap((cat) => [
        cat.cat_id,
        cat.name,
        cat.type,
        cat.parent_id,
        cat.description,
        "active",
      ]);

      // Insert categories and create relationships in one query to avoid race conditions
      await localConnection.query(
        `
        INSERT INTO categories (cat_id, name, type, parent_id, description, status, created_at, updated_at)
        VALUES ${placeholders}
        ON DUPLICATE KEY UPDATE
          name = VALUES(name),
          type = VALUES(type),
          parent_id = VALUES(parent_id),
          description = VALUES(description),
          status = VALUES(status),
          updated_at = CURRENT_TIMESTAMP
        `,
        values
      );

      totalInserted += categoriesToInsert.length;
      outputProgress({
        status: "running",
        operation: "Categories import",
        current: totalInserted,
        total: totalInserted,
        elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
      });
    }

    // After all imports, if we skipped any categories, throw an error
    if (skippedCategories.length > 0) {
      const error = new Error(
        "Categories import completed with errors - some categories were skipped due to missing parents"
      );
      error.skippedCategories = skippedCategories;
      throw error;
    }

    outputProgress({
      status: "complete",
      operation: "Categories import completed",
      current: totalInserted,
      total: totalInserted,
      duration: formatElapsedTime((Date.now() - startTime) / 1000),
    });

    return {
      status: "complete",
      totalImported: totalInserted
    };
  } catch (error) {
    console.error("Error importing categories:", error);
    if (error.skippedCategories) {
      console.error(
        "Skipped categories:",
        JSON.stringify(error.skippedCategories, null, 2)
      );
    }

    outputProgress({
      status: "error",
      operation: "Categories import failed",
      error: error.message,
      skippedCategories: error.skippedCategories
    });

    throw error;
  }
}

module.exports = importCategories;
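The bulk upsert above builds one placeholder group per row and a single flat parameter array for mysql2's `query(sql, values)`. A small sketch of that row-flattening step with illustrative data (the two category rows are made up, not from the import):

```javascript
// Hypothetical sample rows shaped like the categories being imported.
const rows = [
  { cat_id: 1, name: 'Paint',   type: 10, parent_id: null, description: 'Paint',           status: 'active' },
  { cat_id: 2, name: 'Brushes', type: 11, parent_id: 1,    description: 'Paint > Brushes', status: 'active' },
];

// One "(?, ..., CURRENT_TIMESTAMP, CURRENT_TIMESTAMP)" group per row...
const placeholders = rows
  .map(() => '(?, ?, ?, ?, ?, ?, CURRENT_TIMESTAMP, CURRENT_TIMESTAMP)')
  .join(',');

// ...and a flat values array: 6 bound parameters per row, in column order.
const values = rows.flatMap(r => [r.cat_id, r.name, r.type, r.parent_id, r.description, r.status]);

const sql = `INSERT INTO categories (cat_id, name, type, parent_id, description, status, created_at, updated_at)
VALUES ${placeholders}
ON DUPLICATE KEY UPDATE name = VALUES(name)`;

console.log(values.length);                    // 6 params per row → 12
console.log(placeholders.match(/\(/g).length); // one group per row → 2
```

Batching every row into one statement keeps the insert and the `ON DUPLICATE KEY UPDATE` refresh atomic per type, which is why the comment above notes it avoids race conditions.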

inventory-server/scripts/import/orders.js (new file, 628 lines)
@@ -0,0 +1,628 @@
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');
const { importMissingProducts, setupTemporaryTables, cleanupTemporaryTables, materializeCalculations } = require('./products');

/**
 * Imports orders from a production MySQL database to a local MySQL database.
 * It can run in two modes:
 * 1. Incremental update mode (default): Only fetch orders that have changed since the last sync time.
 * 2. Full update mode: Fetch all eligible orders within the last 5 years regardless of timestamp.
 *
 * @param {object} prodConnection - A MySQL connection to the production DB (MySQL 5.7).
 * @param {object} localConnection - A MySQL connection to the local DB (MySQL 8.0).
 * @param {boolean} incrementalUpdate - Set to false for a full sync; true for incremental.
 *
 * @returns {object} Information about the sync operation.
 */
async function importOrders(prodConnection, localConnection, incrementalUpdate = true) {
  const startTime = Date.now();
  const skippedOrders = new Set();
  const missingProducts = new Set();
  let recordsAdded = 0;
  let recordsUpdated = 0;
  let processedCount = 0;
  let importedCount = 0;
  let totalOrderItems = 0;
  let totalUniqueOrders = 0;

  // Add a cumulative counter for processed orders before the loop
  let cumulativeProcessedOrders = 0;

  try {
    // Clean up any existing temp tables first
    await localConnection.query(`
      DROP TEMPORARY TABLE IF EXISTS temp_order_items;
      DROP TEMPORARY TABLE IF EXISTS temp_order_meta;
      DROP TEMPORARY TABLE IF EXISTS temp_order_discounts;
      DROP TEMPORARY TABLE IF EXISTS temp_order_taxes;
      DROP TEMPORARY TABLE IF EXISTS temp_order_costs;
    `);

    // Create all temp tables with the correct schema
    await localConnection.query(`
      CREATE TEMPORARY TABLE temp_order_items (
        order_id INT UNSIGNED NOT NULL,
        pid INT UNSIGNED NOT NULL,
        SKU VARCHAR(50) NOT NULL,
        price DECIMAL(10,2) NOT NULL,
        quantity INT NOT NULL,
        base_discount DECIMAL(10,2) DEFAULT 0,
        PRIMARY KEY (order_id, pid)
      ) ENGINE=InnoDB DEFAULT CHARSET=utf8;
    `);

    await localConnection.query(`
      CREATE TEMPORARY TABLE temp_order_meta (
        order_id INT UNSIGNED NOT NULL,
        date DATE NOT NULL,
        customer VARCHAR(100) NOT NULL,
        customer_name VARCHAR(150) NOT NULL,
        status INT,
        canceled TINYINT(1),
        summary_discount DECIMAL(10,2) DEFAULT 0.00,
        summary_subtotal DECIMAL(10,2) DEFAULT 0.00,
        PRIMARY KEY (order_id)
      ) ENGINE=InnoDB DEFAULT CHARSET=utf8;
    `);

    await localConnection.query(`
      CREATE TEMPORARY TABLE temp_order_discounts (
        order_id INT UNSIGNED NOT NULL,
        pid INT UNSIGNED NOT NULL,
        discount DECIMAL(10,2) NOT NULL,
        PRIMARY KEY (order_id, pid)
      ) ENGINE=InnoDB DEFAULT CHARSET=utf8;
    `);

    await localConnection.query(`
      CREATE TEMPORARY TABLE temp_order_taxes (
        order_id INT UNSIGNED NOT NULL,
        pid INT UNSIGNED NOT NULL,
        tax DECIMAL(10,2) NOT NULL,
        PRIMARY KEY (order_id, pid)
      ) ENGINE=InnoDB DEFAULT CHARSET=utf8;
    `);

    await localConnection.query(`
      CREATE TEMPORARY TABLE temp_order_costs (
        order_id INT UNSIGNED NOT NULL,
        pid INT UNSIGNED NOT NULL,
        costeach DECIMAL(10,3) DEFAULT 0.000,
        PRIMARY KEY (order_id, pid)
      ) ENGINE=InnoDB DEFAULT CHARSET=utf8;
    `);

    // Get column names from the local table
    const [columns] = await localConnection.query(`
      SELECT COLUMN_NAME
      FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_NAME = 'orders'
        AND COLUMN_NAME != 'updated' -- Exclude the updated column
      ORDER BY ORDINAL_POSITION
    `);
    const columnNames = columns.map(col => col.COLUMN_NAME);

    // Get last sync info
    const [syncInfo] = await localConnection.query(
      "SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'orders'"
    );
    const lastSyncTime = syncInfo?.[0]?.last_sync_timestamp || '1970-01-01';

    console.log('Orders: Using last sync time:', lastSyncTime);

    // First get count of order items
    const [[{ total }]] = await prodConnection.query(`
      SELECT COUNT(*) as total
      FROM order_items oi
      USE INDEX (PRIMARY)
      JOIN _order o ON oi.order_id = o.order_id
      WHERE o.order_status >= 15
        AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
        AND o.date_placed_onlydate IS NOT NULL
        ${incrementalUpdate ? `
        AND (
          o.stamp > ?
          OR oi.stamp > ?
          OR EXISTS (
            SELECT 1 FROM order_discount_items odi
            WHERE odi.order_id = o.order_id
              AND odi.pid = oi.prod_pid
          )
          OR EXISTS (
            SELECT 1 FROM order_tax_info oti
            JOIN order_tax_info_products otip ON oti.taxinfo_id = otip.taxinfo_id
            WHERE oti.order_id = o.order_id
              AND otip.pid = oi.prod_pid
              AND oti.stamp > ?
          )
        )
        ` : ''}
    `, incrementalUpdate ? [lastSyncTime, lastSyncTime, lastSyncTime] : []);

    totalOrderItems = total;
    console.log('Orders: Found changes:', totalOrderItems);

    // Get order items in batches
    const [orderItems] = await prodConnection.query(`
      SELECT
        oi.order_id,
        oi.prod_pid as pid,
        oi.prod_itemnumber as SKU,
        oi.prod_price as price,
        oi.qty_ordered as quantity,
        COALESCE(oi.prod_price_reg - oi.prod_price, 0) as base_discount,
        oi.stamp as last_modified
      FROM order_items oi
      USE INDEX (PRIMARY)
      JOIN _order o ON oi.order_id = o.order_id
      WHERE o.order_status >= 15
        AND o.date_placed_onlydate >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
        AND o.date_placed_onlydate IS NOT NULL
        ${incrementalUpdate ? `
        AND (
          o.stamp > ?
          OR oi.stamp > ?
          OR EXISTS (
            SELECT 1 FROM order_discount_items odi
            WHERE odi.order_id = o.order_id
              AND odi.pid = oi.prod_pid
          )
          OR EXISTS (
            SELECT 1 FROM order_tax_info oti
            JOIN order_tax_info_products otip ON oti.taxinfo_id = otip.taxinfo_id
|
||||
WHERE oti.order_id = o.order_id
|
||||
AND otip.pid = oi.prod_pid
|
||||
AND oti.stamp > ?
|
||||
)
|
||||
)
|
||||
` : ''}
|
||||
`, incrementalUpdate ? [lastSyncTime, lastSyncTime, lastSyncTime] : []);
|
||||
|
||||
console.log('Orders: Processing', orderItems.length, 'order items');
|
||||
|
||||
// Insert order items in batches
|
||||
for (let i = 0; i < orderItems.length; i += 5000) {
|
||||
const batch = orderItems.slice(i, Math.min(i + 5000, orderItems.length));
|
||||
const placeholders = batch.map(() => "(?, ?, ?, ?, ?, ?)").join(",");
|
||||
const values = batch.flatMap(item => [
|
||||
item.order_id, item.pid, item.SKU, item.price, item.quantity, item.base_discount
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO temp_order_items (order_id, pid, SKU, price, quantity, base_discount)
|
||||
VALUES ${placeholders}
|
||||
ON DUPLICATE KEY UPDATE
|
||||
SKU = VALUES(SKU),
|
||||
price = VALUES(price),
|
||||
quantity = VALUES(quantity),
|
||||
base_discount = VALUES(base_discount)
|
||||
`, values);
|
||||
|
||||
processedCount = i + batch.length;
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Orders import",
|
||||
message: `Loading order items: ${processedCount} of ${totalOrderItems}`,
|
||||
current: processedCount,
|
||||
total: totalOrderItems
|
||||
});
|
||||
}
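The batching pattern in the loop above can be sketched in isolation: rows are sliced into fixed-size chunks, and for each chunk one `(?, ?, …)` placeholder group per row is built alongside a flattened value array, so a single multi-row INSERT can be issued per chunk. This is a minimal standalone sketch with illustrative names (`buildBatches`, `columnsPerRow` are not from the source).

```javascript
// Sketch of the chunk-and-placeholder pattern used by the import loop above.
// Assumes each row is already an array of values in column order.
function buildBatches(rows, batchSize, columnsPerRow) {
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    const batch = rows.slice(i, i + batchSize);
    // One "(?, ?, ...)" group per row, joined for a multi-row VALUES clause.
    const placeholders = batch
      .map(() => `(${Array(columnsPerRow).fill("?").join(", ")})`)
      .join(",");
    // Flatten row values so they line up with the placeholders.
    const values = batch.flatMap(row => row);
    batches.push({ placeholders, values });
  }
  return batches;
}
```

Each `{ placeholders, values }` pair would then feed one parameterized `INSERT ... VALUES ${placeholders}` call, as in the loop above.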
|
||||
|
||||
// Get unique order IDs
|
||||
const orderIds = [...new Set(orderItems.map(item => item.order_id))];
|
||||
totalUniqueOrders = orderIds.length;
|
||||
console.log('Total unique order IDs:', totalUniqueOrders);
|
||||
|
||||
// Reset processed count for order processing phase
|
||||
processedCount = 0;
|
||||
|
||||
// Get order metadata in batches
|
||||
for (let i = 0; i < orderIds.length; i += 5000) {
|
||||
const batchIds = orderIds.slice(i, i + 5000);
|
||||
console.log(`Processing batch ${i/5000 + 1}, size: ${batchIds.length}`);
|
||||
console.log('Sample of batch IDs:', batchIds.slice(0, 5));
|
||||
|
||||
const [orders] = await prodConnection.query(`
|
||||
SELECT
|
||||
o.order_id,
|
||||
o.date_placed_onlydate as date,
|
||||
o.order_cid as customer,
|
||||
CONCAT(COALESCE(u.firstname, ''), ' ', COALESCE(u.lastname, '')) as customer_name,
|
||||
o.order_status as status,
|
||||
CASE WHEN o.date_cancelled != '0000-00-00 00:00:00' THEN 1 ELSE 0 END as canceled,
|
||||
o.summary_discount,
|
||||
o.summary_subtotal
|
||||
FROM _order o
|
||||
LEFT JOIN users u ON o.order_cid = u.cid
|
||||
WHERE o.order_id IN (?)
|
||||
`, [batchIds]);
|
||||
|
||||
console.log(`Retrieved ${orders.length} orders for ${batchIds.length} IDs`);
|
||||
const duplicates = orders.filter((order, index, self) =>
|
||||
self.findIndex(o => o.order_id === order.order_id) !== index
|
||||
);
|
||||
if (duplicates.length > 0) {
|
||||
console.log('Found duplicates:', duplicates);
|
||||
}
|
||||
|
||||
const placeholders = orders.map(() => "(?, ?, ?, ?, ?, ?, ?, ?)").join(",");
|
||||
const values = orders.flatMap(order => [
|
||||
order.order_id,
|
||||
order.date,
|
||||
order.customer,
|
||||
order.customer_name,
|
||||
order.status,
|
||||
order.canceled,
|
||||
order.summary_discount,
|
||||
order.summary_subtotal
|
||||
]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO temp_order_meta (
|
||||
order_id,
|
||||
date,
|
||||
customer,
|
||||
customer_name,
|
||||
status,
|
||||
canceled,
|
||||
summary_discount,
|
||||
summary_subtotal
|
||||
) VALUES ${placeholders}
|
||||
ON DUPLICATE KEY UPDATE
|
||||
date = VALUES(date),
|
||||
customer = VALUES(customer),
|
||||
customer_name = VALUES(customer_name),
|
||||
status = VALUES(status),
|
||||
canceled = VALUES(canceled),
|
||||
summary_discount = VALUES(summary_discount),
|
||||
summary_subtotal = VALUES(summary_subtotal)
|
||||
`, values);
|
||||
|
||||
processedCount = i + orders.length;
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Orders import",
|
||||
message: `Loading order metadata: ${processedCount} of ${totalUniqueOrders}`,
|
||||
current: processedCount,
|
||||
total: totalUniqueOrders
|
||||
});
|
||||
}
|
||||
|
||||
// Reset processed count for final phase
|
||||
processedCount = 0;
|
||||
|
||||
// Get promotional discounts in batches
|
||||
for (let i = 0; i < orderIds.length; i += 5000) {
|
||||
const batchIds = orderIds.slice(i, i + 5000);
|
||||
const [discounts] = await prodConnection.query(`
|
||||
SELECT order_id, pid, SUM(amount) as discount
|
||||
FROM order_discount_items
|
||||
WHERE order_id IN (?)
|
||||
GROUP BY order_id, pid
|
||||
`, [batchIds]);
|
||||
|
||||
if (discounts.length > 0) {
|
||||
const placeholders = discounts.map(() => "(?, ?, ?)").join(",");
|
||||
const values = discounts.flatMap(d => [d.order_id, d.pid, d.discount]);
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO temp_order_discounts VALUES ${placeholders}
|
||||
ON DUPLICATE KEY UPDATE
|
||||
discount = VALUES(discount)
|
||||
`, values);
|
||||
}
|
||||
}
|
||||
|
||||
// Get tax information in batches
|
||||
for (let i = 0; i < orderIds.length; i += 5000) {
|
||||
const batchIds = orderIds.slice(i, i + 5000);
|
||||
const [taxes] = await prodConnection.query(`
|
||||
SELECT DISTINCT
|
||||
oti.order_id,
|
||||
otip.pid,
|
||||
otip.item_taxes_to_collect as tax
|
||||
FROM order_tax_info oti
|
||||
JOIN (
|
||||
SELECT order_id, MAX(stamp) as max_stamp
|
||||
FROM order_tax_info
|
||||
WHERE order_id IN (?)
|
||||
GROUP BY order_id
|
||||
) latest ON oti.order_id = latest.order_id AND oti.stamp = latest.max_stamp
|
||||
JOIN order_tax_info_products otip ON oti.taxinfo_id = otip.taxinfo_id
|
||||
`, [batchIds]);
|
||||
|
||||
if (taxes.length > 0) {
|
||||
// Remove any duplicates before inserting
|
||||
const uniqueTaxes = new Map();
|
||||
taxes.forEach(t => {
|
||||
const key = `${t.order_id}-${t.pid}`;
|
||||
uniqueTaxes.set(key, t);
|
||||
});
|
||||
|
||||
const values = Array.from(uniqueTaxes.values()).flatMap(t => [t.order_id, t.pid, t.tax]);
|
||||
if (values.length > 0) {
|
||||
const placeholders = Array(uniqueTaxes.size).fill("(?, ?, ?)").join(",");
|
||||
await localConnection.query(`
|
||||
INSERT INTO temp_order_taxes VALUES ${placeholders}
|
||||
ON DUPLICATE KEY UPDATE tax = VALUES(tax)
|
||||
`, values);
|
||||
}
|
||||
}
|
||||
}
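The tax dedupe step above relies on Map insertion semantics: keying on the composite `order_id-pid` string means a later duplicate overwrites an earlier one, so the last row wins. A minimal standalone sketch of that rule (the function name `dedupeByOrderAndPid` is illustrative, not from the source):

```javascript
// Last-occurrence-wins dedupe keyed on the composite "order_id-pid" string,
// matching the Map-based dedupe used before inserting temp_order_taxes rows.
function dedupeByOrderAndPid(rows) {
  const unique = new Map();
  for (const row of rows) {
    unique.set(`${row.order_id}-${row.pid}`, row); // later duplicates overwrite
  }
  return Array.from(unique.values());
}
```

Note the string key also avoids the pitfall of using an object or array as a Map key, which would compare by reference and never collide.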
    // Get costeach values in batches
    for (let i = 0; i < orderIds.length; i += 5000) {
      const batchIds = orderIds.slice(i, i + 5000);
      const [costs] = await prodConnection.query(`
        SELECT
          oc.orderid as order_id,
          oc.pid,
          COALESCE(
            oc.costeach,
            (SELECT pi.costeach
             FROM product_inventory pi
             WHERE pi.pid = oc.pid
               AND pi.daterec <= o.date_placed
             ORDER BY pi.daterec DESC LIMIT 1)
          ) as costeach
        FROM order_costs oc
        JOIN _order o ON oc.orderid = o.order_id
        WHERE oc.orderid IN (?)
      `, [batchIds]);

      if (costs.length > 0) {
        const placeholders = costs.map(() => '(?, ?, ?)').join(",");
        const values = costs.flatMap(c => [c.order_id, c.pid, c.costeach || 0]);
        await localConnection.query(`
          INSERT INTO temp_order_costs (order_id, pid, costeach)
          VALUES ${placeholders}
          ON DUPLICATE KEY UPDATE costeach = VALUES(costeach)
        `, values);
      }
    }

    // Now combine all the data and insert into orders table
    // Pre-check all products at once instead of per batch
    const allOrderPids = [...new Set(orderItems.map(item => item.pid))];
    const [existingProducts] = allOrderPids.length > 0 ? await localConnection.query(
      "SELECT pid FROM products WHERE pid IN (?)",
      [allOrderPids]
    ) : [[]];
    const existingPids = new Set(existingProducts.map(p => p.pid));

    // Process in larger batches
    for (let i = 0; i < orderIds.length; i += 5000) {
      const batchIds = orderIds.slice(i, i + 5000);

      // Get combined data for this batch
      const [orders] = await localConnection.query(`
        SELECT
          oi.order_id as order_number,
          oi.pid,
          oi.SKU,
          om.date,
          oi.price,
          oi.quantity,
          oi.base_discount + COALESCE(od.discount, 0) +
          CASE
            WHEN om.summary_discount > 0 THEN
              ROUND((om.summary_discount * (oi.price * oi.quantity)) /
                NULLIF(om.summary_subtotal, 0), 2)
            ELSE 0
          END as discount,
          COALESCE(ot.tax, 0) as tax,
          0 as tax_included,
          0 as shipping,
          om.customer,
          om.customer_name,
          om.status,
          om.canceled,
          COALESCE(tc.costeach, 0) as costeach
        FROM temp_order_items oi
        JOIN temp_order_meta om ON oi.order_id = om.order_id
        LEFT JOIN temp_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
        LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
        LEFT JOIN temp_order_costs tc ON oi.order_id = tc.order_id AND oi.pid = tc.pid
        WHERE oi.order_id IN (?)
      `, [batchIds]);

      // Filter orders and track missing products - do this in a single pass
      const validOrders = [];
      const values = [];
      const processedOrderItems = new Set(); // Track unique order items
      const processedOrders = new Set(); // Track unique orders

      for (const order of orders) {
        if (!existingPids.has(order.pid)) {
          missingProducts.add(order.pid);
          skippedOrders.add(order.order_number);
          continue;
        }
        validOrders.push(order);
        values.push(...columnNames.map(col => order[col] ?? null));
        processedOrderItems.add(`${order.order_number}-${order.pid}`);
        processedOrders.add(order.order_number);
      }

      if (validOrders.length > 0) {
        // Pre-compute the placeholders string once
        const singlePlaceholder = `(${columnNames.map(() => "?").join(",")})`;
        const placeholders = Array(validOrders.length).fill(singlePlaceholder).join(",");

        const result = await localConnection.query(`
          INSERT INTO orders (${columnNames.join(",")})
          VALUES ${placeholders}
          ON DUPLICATE KEY UPDATE
            SKU = VALUES(SKU),
            date = VALUES(date),
            price = VALUES(price),
            quantity = VALUES(quantity),
            discount = VALUES(discount),
            tax = VALUES(tax),
            tax_included = VALUES(tax_included),
            shipping = VALUES(shipping),
            customer = VALUES(customer),
            customer_name = VALUES(customer_name),
            status = VALUES(status),
            canceled = VALUES(canceled),
            costeach = VALUES(costeach)
        `, validOrders.map(o => columnNames.map(col => o[col] ?? null)).flat());

        const affectedRows = result[0].affectedRows;
        const updates = Math.floor(affectedRows / 2);
        const inserts = affectedRows - (updates * 2);

        recordsAdded += inserts;
        recordsUpdated += updates;
        importedCount += processedOrderItems.size; // Count unique order items processed
      }

      // Update progress based on unique orders processed
      cumulativeProcessedOrders += processedOrders.size;
      outputProgress({
        status: "running",
        operation: "Orders import",
        message: `Imported ${importedCount} order items (${cumulativeProcessedOrders} of ${totalUniqueOrders} orders processed)`,
        current: cumulativeProcessedOrders,
        total: totalUniqueOrders,
        elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
        remaining: estimateRemaining(startTime, cumulativeProcessedOrders, totalUniqueOrders),
        rate: calculateRate(startTime, cumulativeProcessedOrders)
      });
    }

    // Now try to import any orders that were skipped due to missing products
    if (skippedOrders.size > 0) {
      try {
        outputProgress({
          status: "running",
          operation: "Orders import",
          message: `Retrying import of ${skippedOrders.size} orders with previously missing products`,
        });

        // Get the orders that were skipped
        const [skippedProdOrders] = await localConnection.query(`
          SELECT DISTINCT
            oi.order_id as order_number,
            oi.pid,
            oi.SKU,
            om.date,
            oi.price,
            oi.quantity,
            oi.base_discount + COALESCE(od.discount, 0) +
            CASE
              WHEN o.summary_discount > 0 THEN
                ROUND((o.summary_discount * (oi.price * oi.quantity)) /
                  NULLIF(o.summary_subtotal, 0), 2)
              ELSE 0
            END as discount,
            COALESCE(ot.tax, 0) as tax,
            0 as tax_included,
            0 as shipping,
            om.customer,
            om.customer_name,
            om.status,
            om.canceled,
            COALESCE(tc.costeach, 0) as costeach
          FROM temp_order_items oi
          JOIN temp_order_meta om ON oi.order_id = om.order_id
          LEFT JOIN _order o ON oi.order_id = o.order_id
          LEFT JOIN temp_order_discounts od ON oi.order_id = od.order_id AND oi.pid = od.pid
          LEFT JOIN temp_order_taxes ot ON oi.order_id = ot.order_id AND oi.pid = ot.pid
          LEFT JOIN temp_order_costs tc ON oi.order_id = tc.order_id AND oi.pid = tc.pid
          WHERE oi.order_id IN (?)
        `, [Array.from(skippedOrders)]);

        // Check which products exist now
        const skippedPids = [...new Set(skippedProdOrders.map(o => o.pid))];
        const [existingProducts] = skippedPids.length > 0 ? await localConnection.query(
          "SELECT pid FROM products WHERE pid IN (?)",
          [skippedPids]
        ) : [[]];
        const existingPids = new Set(existingProducts.map(p => p.pid));

        // Filter orders that can now be imported
        const validOrders = skippedProdOrders.filter(order => existingPids.has(order.pid));
        const retryOrderItems = new Set(); // Track unique order items in retry

        if (validOrders.length > 0) {
          const placeholders = validOrders.map(() => `(${columnNames.map(() => "?").join(", ")})`).join(",");
          const values = validOrders.map(o => columnNames.map(col => o[col] ?? null)).flat();

          const result = await localConnection.query(`
            INSERT INTO orders (${columnNames.join(", ")})
            VALUES ${placeholders}
            ON DUPLICATE KEY UPDATE
              SKU = VALUES(SKU),
              date = VALUES(date),
              price = VALUES(price),
              quantity = VALUES(quantity),
              discount = VALUES(discount),
              tax = VALUES(tax),
              tax_included = VALUES(tax_included),
              shipping = VALUES(shipping),
              customer = VALUES(customer),
              customer_name = VALUES(customer_name),
              status = VALUES(status),
              canceled = VALUES(canceled),
              costeach = VALUES(costeach)
          `, values);

          const affectedRows = result[0].affectedRows;
          const updates = Math.floor(affectedRows / 2);
          const inserts = affectedRows - (updates * 2);

          // Track unique order items
          validOrders.forEach(order => {
            retryOrderItems.add(`${order.order_number}-${order.pid}`);
          });

          outputProgress({
            status: "running",
            operation: "Orders import",
            message: `Successfully imported ${retryOrderItems.size} previously skipped order items`,
          });

          // Update the main counters
          recordsAdded += inserts;
          recordsUpdated += updates;
          importedCount += retryOrderItems.size;
        }
      } catch (error) {
        console.warn('Warning: Failed to retry skipped orders:', error.message);
        console.warn(`Skipped ${skippedOrders.size} orders due to ${missingProducts.size} missing products`);
      }
    }

    // Clean up temporary tables after ALL processing is complete
    await localConnection.query(`
      DROP TEMPORARY TABLE IF EXISTS temp_order_items;
      DROP TEMPORARY TABLE IF EXISTS temp_order_meta;
      DROP TEMPORARY TABLE IF EXISTS temp_order_discounts;
      DROP TEMPORARY TABLE IF EXISTS temp_order_taxes;
      DROP TEMPORARY TABLE IF EXISTS temp_order_costs;
    `);

    // Only update sync status if we get here (no errors thrown)
    await localConnection.query(`
      INSERT INTO sync_status (table_name, last_sync_timestamp)
      VALUES ('orders', NOW())
      ON DUPLICATE KEY UPDATE last_sync_timestamp = NOW()
    `);

    return {
      status: "complete",
      totalImported: Math.floor(importedCount),
      recordsAdded: recordsAdded || 0,
      recordsUpdated: Math.floor(recordsUpdated),
      totalSkipped: skippedOrders.size,
      missingProducts: missingProducts.size,
      incrementalUpdate,
      lastSyncTime
    };
  } catch (error) {
    console.error("Error during orders import:", error);
    throw error;
  }
}

module.exports = importOrders;
inventory-server/scripts/import/products.js (new file, 802 lines)
@@ -0,0 +1,802 @@
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');

// Utility functions
const imageUrlBase = 'https://sbing.com/i/products/0000/';
const getImageUrls = (pid, iid = 1) => {
  const paddedPid = pid.toString().padStart(6, '0');
  // Use the padded PID only for the first 3 digits
  const prefix = paddedPid.slice(0, 3);
  // Use the actual pid for the rest of the URL
  const basePath = `${imageUrlBase}${prefix}/${pid}`;
  return {
    image: `${basePath}-t-${iid}.jpg`,
    image_175: `${basePath}-175x175-${iid}.jpg`,
    image_full: `${basePath}-o-${iid}.jpg`
  };
};
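A worked example of the URL scheme above, restated here so the snippet is self-contained: for pid 1234, `padStart(6, '0')` yields `"001234"`, the first three characters `"001"` become the shard directory, and the raw pid is used in the filename.

```javascript
// Self-contained restatement of the getImageUrls helper for illustration.
const imageUrlBase = 'https://sbing.com/i/products/0000/';
const getImageUrls = (pid, iid = 1) => {
  const paddedPid = pid.toString().padStart(6, '0');
  const prefix = paddedPid.slice(0, 3); // zero-padded shard directory
  const basePath = `${imageUrlBase}${prefix}/${pid}`; // raw pid in filename
  return {
    image: `${basePath}-t-${iid}.jpg`,
    image_175: `${basePath}-175x175-${iid}.jpg`,
    image_full: `${basePath}-o-${iid}.jpg`
  };
};

// getImageUrls(1234).image
// → 'https://sbing.com/i/products/0000/001/1234-t-1.jpg'
```

Note the sharding keeps directory sizes bounded: all pids sharing the same first three padded digits land in one directory.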
async function setupAndCleanupTempTables(connection, operation = 'setup') {
  if (operation === 'setup') {
    await connection.query(`
      CREATE TEMPORARY TABLE IF NOT EXISTS temp_products (
        pid BIGINT NOT NULL,
        title VARCHAR(255),
        description TEXT,
        SKU VARCHAR(50),
        stock_quantity INT DEFAULT 0,
        pending_qty INT DEFAULT 0,
        preorder_count INT DEFAULT 0,
        notions_inv_count INT DEFAULT 0,
        price DECIMAL(10,3) NOT NULL DEFAULT 0,
        regular_price DECIMAL(10,3) NOT NULL DEFAULT 0,
        cost_price DECIMAL(10,3),
        vendor VARCHAR(100),
        vendor_reference VARCHAR(100),
        notions_reference VARCHAR(100),
        brand VARCHAR(100),
        line VARCHAR(100),
        subline VARCHAR(100),
        artist VARCHAR(100),
        category_ids TEXT,
        created_at DATETIME,
        first_received DATETIME,
        landing_cost_price DECIMAL(10,3),
        barcode VARCHAR(50),
        harmonized_tariff_code VARCHAR(50),
        updated_at DATETIME,
        visible BOOLEAN,
        replenishable BOOLEAN,
        permalink VARCHAR(255),
        moq DECIMAL(10,3),
        rating DECIMAL(10,2),
        reviews INT,
        weight DECIMAL(10,3),
        length DECIMAL(10,3),
        width DECIMAL(10,3),
        height DECIMAL(10,3),
        country_of_origin VARCHAR(100),
        location VARCHAR(100),
        total_sold INT,
        baskets INT,
        notifies INT,
        date_last_sold DATETIME,
        needs_update BOOLEAN DEFAULT TRUE,
        PRIMARY KEY (pid),
        INDEX idx_needs_update (needs_update)
      ) ENGINE=InnoDB;
    `);
  } else {
    await connection.query('DROP TEMPORARY TABLE IF EXISTS temp_products;');
  }
}

async function materializeCalculations(prodConnection, localConnection, incrementalUpdate = true, lastSyncTime = '1970-01-01') {
  outputProgress({
    status: "running",
    operation: "Products import",
    message: "Fetching product data from production"
  });

  // Get all product data in a single optimized query
  const [prodData] = await prodConnection.query(`
    SELECT
      p.pid,
      p.description AS title,
      p.notes AS description,
      p.itemnumber AS SKU,
      p.date_created,
      p.datein AS first_received,
      p.location,
      p.upc AS barcode,
      p.harmonized_tariff_code,
      p.stamp AS updated_at,
      CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
      CASE
        WHEN p.reorder < 0 THEN 0
        WHEN (
          (IFNULL(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURDATE(), INTERVAL 5 YEAR))
          OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(NOW(), INTERVAL 5 YEAR))
          OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(NOW(), INTERVAL 5 YEAR))
        ) THEN 0
        ELSE 1
      END AS replenishable,
      COALESCE(si.available_local, 0) - COALESCE(
        (SELECT SUM(oi.qty_ordered - oi.qty_placed)
         FROM order_items oi
         JOIN _order o ON oi.order_id = o.order_id
         WHERE oi.prod_pid = p.pid
           AND o.date_placed != '0000-00-00 00:00:00'
           AND o.date_shipped = '0000-00-00 00:00:00'
           AND oi.pick_finished = 0
           AND oi.qty_back = 0
           AND o.order_status != 15
           AND o.order_status < 90
           AND oi.qty_ordered >= oi.qty_placed
           AND oi.qty_ordered > 0
        ), 0
      ) as stock_quantity,
      COALESCE(
        (SELECT SUM(oi.qty_ordered - oi.qty_placed)
         FROM order_items oi
         JOIN _order o ON oi.order_id = o.order_id
         WHERE oi.prod_pid = p.pid
           AND o.date_placed != '0000-00-00 00:00:00'
           AND o.date_shipped = '0000-00-00 00:00:00'
           AND oi.pick_finished = 0
           AND oi.qty_back = 0
           AND o.order_status != 15
           AND o.order_status < 90
           AND oi.qty_ordered >= oi.qty_placed
           AND oi.qty_ordered > 0
        ), 0
      ) as pending_qty,
      COALESCE(ci.onpreorder, 0) as preorder_count,
      COALESCE(pnb.inventory, 0) as notions_inv_count,
      COALESCE(pcp.price_each, 0) as price,
      COALESCE(p.sellingprice, 0) AS regular_price,
      CASE
        WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
        THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
        ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
      END AS cost_price,
      NULL as landing_cost_price,
      s.companyname AS vendor,
      CASE
        WHEN s.companyname = 'Notions' THEN sid.notions_itemnumber
        ELSE sid.supplier_itemnumber
      END AS vendor_reference,
      sid.notions_itemnumber AS notions_reference,
      CONCAT('https://www.acherryontop.com/shop/product/', p.pid) AS permalink,
      pc1.name AS brand,
      pc2.name AS line,
      pc3.name AS subline,
      pc4.name AS artist,
      COALESCE(CASE
        WHEN sid.supplier_id = 92 THEN sid.notions_qty_per_unit
        ELSE sid.supplier_qty_per_unit
      END, sid.notions_qty_per_unit) AS moq,
      p.rating,
      p.rating_votes AS reviews,
      p.weight,
      p.length,
      p.width,
      p.height,
      p.country_of_origin,
      (SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
      (SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
      p.totalsold AS total_sold,
      pls.date_sold as date_last_sold,
      GROUP_CONCAT(DISTINCT CASE
        WHEN pc.cat_id IS NOT NULL
          AND pc.type IN (10, 20, 11, 21, 12, 13)
          AND pci.cat_id NOT IN (16, 17)
        THEN pci.cat_id
      END) as category_ids
    FROM products p
    LEFT JOIN shop_inventory si ON p.pid = si.pid AND si.store = 0
    LEFT JOIN current_inventory ci ON p.pid = ci.pid
    LEFT JOIN product_notions_b2b pnb ON p.pid = pnb.pid
    LEFT JOIN product_current_prices pcp ON p.pid = pcp.pid AND pcp.active = 1
    LEFT JOIN supplier_item_data sid ON p.pid = sid.pid
    LEFT JOIN suppliers s ON sid.supplier_id = s.supplierid
    LEFT JOIN product_category_index pci ON p.pid = pci.pid
    LEFT JOIN product_categories pc ON pci.cat_id = pc.cat_id
    LEFT JOIN product_categories pc1 ON p.company = pc1.cat_id
    LEFT JOIN product_categories pc2 ON p.line = pc2.cat_id
    LEFT JOIN product_categories pc3 ON p.subline = pc3.cat_id
    LEFT JOIN product_categories pc4 ON p.artist = pc4.cat_id
    LEFT JOIN product_last_sold pls ON p.pid = pls.pid
    WHERE ${incrementalUpdate ? `
      p.stamp > ? OR
      ci.stamp > ? OR
      pcp.date_deactive > ? OR
      pcp.date_active > ? OR
      pnb.date_updated > ?
    ` : 'TRUE'}
    GROUP BY p.pid
  `, incrementalUpdate ? [lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime] : []);

  outputProgress({
    status: "running",
    operation: "Products import",
    message: `Processing ${prodData.length} product records`
  });

  // Insert all product data into temp table in batches
  for (let i = 0; i < prodData.length; i += 1000) {
    const batch = prodData.slice(i, i + 1000);
    const values = batch.map(row => [
      row.pid,
      row.title,
      row.description,
      row.SKU,
      // Set stock quantity to 0 if it's over 5000
      row.stock_quantity > 5000 ? 0 : Math.max(0, row.stock_quantity),
      row.pending_qty,
      row.preorder_count,
      row.notions_inv_count,
      row.price,
      row.regular_price,
      row.cost_price,
      row.vendor,
      row.vendor_reference,
      row.notions_reference,
      row.brand,
      row.line,
      row.subline,
      row.artist,
      row.category_ids,
      row.date_created, // map to created_at
      row.first_received,
      row.landing_cost_price,
      row.barcode,
      row.harmonized_tariff_code,
      row.updated_at,
      row.visible,
      row.replenishable,
      row.permalink,
      row.moq,
      row.rating ? Number(row.rating).toFixed(2) : null,
      row.reviews,
      row.weight,
      row.length,
      row.width,
      row.height,
      row.country_of_origin,
      row.location,
      row.total_sold,
      row.baskets,
      row.notifies,
      row.date_last_sold,
      true // Mark as needing update
    ]);

    if (values.length > 0) {
      await localConnection.query(`
        INSERT INTO temp_products (
          pid, title, description, SKU,
          stock_quantity, pending_qty, preorder_count, notions_inv_count,
          price, regular_price, cost_price,
          vendor, vendor_reference, notions_reference,
          brand, line, subline, artist,
          category_ids, created_at, first_received,
          landing_cost_price, barcode, harmonized_tariff_code,
          updated_at, visible, replenishable, permalink,
          moq, rating, reviews, weight, length, width,
          height, country_of_origin, location, total_sold,
          baskets, notifies, date_last_sold, needs_update
        )
        VALUES ?
        ON DUPLICATE KEY UPDATE
          title = VALUES(title),
          description = VALUES(description),
          SKU = VALUES(SKU),
          stock_quantity = VALUES(stock_quantity),
          pending_qty = VALUES(pending_qty),
          preorder_count = VALUES(preorder_count),
          notions_inv_count = VALUES(notions_inv_count),
          price = VALUES(price),
          regular_price = VALUES(regular_price),
          cost_price = VALUES(cost_price),
          vendor = VALUES(vendor),
          vendor_reference = VALUES(vendor_reference),
          notions_reference = VALUES(notions_reference),
          brand = VALUES(brand),
          line = VALUES(line),
          subline = VALUES(subline),
          artist = VALUES(artist),
          category_ids = VALUES(category_ids),
          created_at = VALUES(created_at),
          first_received = VALUES(first_received),
          landing_cost_price = VALUES(landing_cost_price),
          barcode = VALUES(barcode),
          harmonized_tariff_code = VALUES(harmonized_tariff_code),
          updated_at = VALUES(updated_at),
          visible = VALUES(visible),
          replenishable = VALUES(replenishable),
          permalink = VALUES(permalink),
          moq = VALUES(moq),
          rating = VALUES(rating),
          reviews = VALUES(reviews),
          weight = VALUES(weight),
          length = VALUES(length),
          width = VALUES(width),
          height = VALUES(height),
          country_of_origin = VALUES(country_of_origin),
          location = VALUES(location),
          total_sold = VALUES(total_sold),
          baskets = VALUES(baskets),
          notifies = VALUES(notifies),
          date_last_sold = VALUES(date_last_sold),
          needs_update = TRUE
      `, [values]);
    }

    outputProgress({
      status: "running",
      operation: "Products import",
      message: `Processed ${Math.min(i + 1000, prodData.length)} of ${prodData.length} product records`,
      current: i + batch.length,
      total: prodData.length
    });
  }

  outputProgress({
    status: "running",
    operation: "Products import",
    message: "Finished materializing calculations"
  });
}

async function importProducts(prodConnection, localConnection, incrementalUpdate = true) {
  const startTime = Date.now();
  let recordsAdded = 0;
  let recordsUpdated = 0;

  try {
    // Get column names first
    const [columns] = await localConnection.query(`
      SELECT COLUMN_NAME
      FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_NAME = 'products'
        AND COLUMN_NAME != 'updated' -- Exclude the updated column
      ORDER BY ORDINAL_POSITION
    `);
    const columnNames = columns.map(col => col.COLUMN_NAME);

    // Get last sync info
    const [syncInfo] = await localConnection.query(
      "SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'products'"
    );
    const lastSyncTime = syncInfo?.[0]?.last_sync_timestamp || '1970-01-01';

    console.log('Products: Using last sync time:', lastSyncTime);

    // Setup temporary tables
    await setupAndCleanupTempTables(localConnection, 'setup');

    // Materialize calculations - this will populate temp_products
    await materializeCalculations(prodConnection, localConnection, incrementalUpdate, lastSyncTime);

    // Get actual count from temp table - only count products that need updates
    const [[{ actualTotal }]] = await localConnection.query(`
      SELECT COUNT(DISTINCT pid) as actualTotal
      FROM temp_products
      WHERE needs_update = 1
    `);

    console.log('Products: Found changes:', actualTotal);

    // Process in batches
    const BATCH_SIZE = 5000;
    let processed = 0;

    while (processed < actualTotal) {
      const [batch] = await localConnection.query(`
        SELECT * FROM temp_products
        WHERE needs_update = 1
        LIMIT ? OFFSET ?
      `, [BATCH_SIZE, processed]);

      if (!batch || batch.length === 0) break;

      // Add image URLs
      batch.forEach(row => {
        const urls = getImageUrls(row.pid);
        row.image = urls.image;
        row.image_175 = urls.image_175;
        row.image_full = urls.image_full;
      });

      if (batch.length > 0) {
        // Get existing products in one query
        const [existingProducts] = await localConnection.query(
          `SELECT ${columnNames.join(',')} FROM products WHERE pid IN (?)`,
          [batch.map(p => p.pid)]
        );
        const existingPidsMap = new Map(existingProducts.map(p => [p.pid, p]));

        // Split into inserts and updates
        const insertsAndUpdates = batch.reduce((acc, product) => {
          if (existingPidsMap.has(product.pid)) {
            const existing = existingPidsMap.get(product.pid);
            // Check if any values are different
            const hasChanges = columnNames.some(col => {
              const newVal = product[col] ?? null;
              const oldVal = existing[col] ?? null;
              if (col === "managing_stock") return false; // Skip this as it's always 1
              if (typeof newVal === 'number' && typeof oldVal === 'number') {
|
||||
return Math.abs(newVal - oldVal) > 0.00001;
|
||||
}
|
||||
return newVal !== oldVal;
|
||||
});
|
||||
|
||||
if (hasChanges) {
|
||||
acc.updates.push(product);
|
||||
}
|
||||
} else {
|
||||
acc.inserts.push(product);
|
||||
}
|
||||
return acc;
|
||||
}, { inserts: [], updates: [] });
|
||||
|
||||
// Process inserts
|
||||
if (insertsAndUpdates.inserts.length > 0) {
|
||||
const insertValues = insertsAndUpdates.inserts.map(product =>
|
||||
columnNames.map(col => {
|
||||
const val = product[col] ?? null;
|
||||
if (col === "managing_stock") return 1;
|
||||
return val;
|
||||
})
|
||||
);
|
||||
|
||||
const insertPlaceholders = insertsAndUpdates.inserts
|
||||
.map(() => `(${Array(columnNames.length).fill('?').join(',')})`)
|
||||
.join(',');
|
||||
|
||||
const insertResult = await localConnection.query(`
|
||||
INSERT INTO products (${columnNames.join(',')})
|
||||
VALUES ${insertPlaceholders}
|
||||
`, insertValues.flat());
|
||||
|
||||
recordsAdded += insertResult[0].affectedRows;
|
||||
}
|
||||
|
||||
// Process updates
|
||||
if (insertsAndUpdates.updates.length > 0) {
|
||||
const updateValues = insertsAndUpdates.updates.map(product =>
|
||||
columnNames.map(col => {
|
||||
const val = product[col] ?? null;
|
||||
if (col === "managing_stock") return 1;
|
||||
return val;
|
||||
})
|
||||
);
|
||||
|
||||
const updatePlaceholders = insertsAndUpdates.updates
|
||||
.map(() => `(${Array(columnNames.length).fill('?').join(',')})`)
|
||||
.join(',');
|
||||
|
||||
const updateResult = await localConnection.query(`
|
||||
INSERT INTO products (${columnNames.join(',')})
|
||||
VALUES ${updatePlaceholders}
|
||||
ON DUPLICATE KEY UPDATE
|
||||
${columnNames
|
||||
.filter(col => col !== 'pid')
|
||||
.map(col => `${col} = VALUES(${col})`)
|
||||
.join(',')};
|
||||
`, updateValues.flat());
|
||||
|
||||
recordsUpdated += insertsAndUpdates.updates.length;
|
||||
}
|
||||
|
||||
// Process category relationships
|
||||
if (batch.some(p => p.category_ids)) {
|
||||
// First get all valid categories
|
||||
const allCategoryIds = [...new Set(
|
||||
batch
|
||||
.filter(p => p.category_ids)
|
||||
.flatMap(product =>
|
||||
product.category_ids
|
||||
.split(',')
|
||||
.map(id => id.trim())
|
||||
.filter(id => id)
|
||||
.map(Number)
|
||||
.filter(id => !isNaN(id))
|
||||
)
|
||||
)];
|
||||
|
||||
// Verify categories exist and get their hierarchy
|
||||
const [categories] = await localConnection.query(`
|
||||
WITH RECURSIVE category_hierarchy AS (
|
||||
SELECT
|
||||
cat_id,
|
||||
parent_id,
|
||||
type,
|
||||
1 as level,
|
||||
CAST(cat_id AS CHAR(200)) as path
|
||||
FROM categories
|
||||
WHERE cat_id IN (?)
|
||||
UNION ALL
|
||||
SELECT
|
||||
c.cat_id,
|
||||
c.parent_id,
|
||||
c.type,
|
||||
ch.level + 1,
|
||||
CONCAT(ch.path, ',', c.cat_id)
|
||||
FROM categories c
|
||||
JOIN category_hierarchy ch ON c.parent_id = ch.cat_id
|
||||
WHERE ch.level < 10 -- Prevent infinite recursion
|
||||
)
|
||||
SELECT
|
||||
h.cat_id,
|
||||
h.parent_id,
|
||||
h.type,
|
||||
h.path,
|
||||
h.level
|
||||
FROM (
|
||||
SELECT DISTINCT cat_id, parent_id, type, path, level
|
||||
FROM category_hierarchy
|
||||
WHERE cat_id IN (?)
|
||||
) h
|
||||
ORDER BY h.level DESC
|
||||
`, [allCategoryIds, allCategoryIds]);
|
||||
|
||||
const validCategories = new Map(categories.map(c => [c.cat_id, c]));
|
||||
const validCategoryIds = new Set(categories.map(c => c.cat_id));
|
||||
|
||||
// Build category relationships ensuring proper hierarchy
|
||||
const categoryRelationships = [];
|
||||
batch
|
||||
.filter(p => p.category_ids)
|
||||
.forEach(product => {
|
||||
const productCategories = product.category_ids
|
||||
.split(',')
|
||||
.map(id => id.trim())
|
||||
.filter(id => id)
|
||||
.map(Number)
|
||||
.filter(id => !isNaN(id))
|
||||
.filter(id => validCategoryIds.has(id))
|
||||
.map(id => validCategories.get(id))
|
||||
.sort((a, b) => a.type - b.type); // Sort by type to ensure proper hierarchy
|
||||
|
||||
// Only add relationships that maintain proper hierarchy
|
||||
productCategories.forEach(category => {
|
||||
if (category.path.split(',').every(parentId =>
|
||||
validCategoryIds.has(Number(parentId))
|
||||
)) {
|
||||
categoryRelationships.push([category.cat_id, product.pid]);
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
if (categoryRelationships.length > 0) {
|
||||
// First remove any existing relationships that will be replaced
|
||||
await localConnection.query(`
|
||||
DELETE FROM product_categories
|
||||
WHERE pid IN (?) AND cat_id IN (?)
|
||||
`, [
|
||||
[...new Set(categoryRelationships.map(([_, pid]) => pid))],
|
||||
[...new Set(categoryRelationships.map(([catId, _]) => catId))]
|
||||
]);
|
||||
|
||||
// Then insert the new relationships
|
||||
const placeholders = categoryRelationships
|
||||
.map(() => "(?, ?)")
|
||||
.join(",");
|
||||
|
||||
await localConnection.query(`
|
||||
INSERT INTO product_categories (cat_id, pid)
|
||||
VALUES ${placeholders}
|
||||
`, categoryRelationships.flat());
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
processed += batch.length;
|
||||
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Products import",
|
||||
message: `Processed ${processed} of ${actualTotal} products`,
|
||||
current: processed,
|
||||
total: actualTotal,
|
||||
elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
|
||||
remaining: estimateRemaining(startTime, processed, actualTotal),
|
||||
rate: calculateRate(startTime, processed)
|
||||
});
|
||||
}
|
||||
|
||||
// Drop temporary tables
|
||||
await setupAndCleanupTempTables(localConnection, 'cleanup');
|
||||
|
||||
// Only update sync status if we get here (no errors thrown)
|
||||
await localConnection.query(`
|
||||
INSERT INTO sync_status (table_name, last_sync_timestamp)
|
||||
VALUES ('products', NOW())
|
||||
ON DUPLICATE KEY UPDATE last_sync_timestamp = NOW()
|
||||
`);
|
||||
|
||||
return {
|
||||
status: "complete",
|
||||
totalImported: actualTotal,
|
||||
recordsAdded: recordsAdded || 0,
|
||||
recordsUpdated: recordsUpdated || 0,
|
||||
incrementalUpdate,
|
||||
lastSyncTime
|
||||
};
|
||||
} catch (error) {
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async function importMissingProducts(prodConnection, localConnection, missingPids) {
|
||||
try {
|
||||
// Get column names first
|
||||
const [columns] = await localConnection.query(`
|
||||
SELECT COLUMN_NAME
|
||||
FROM INFORMATION_SCHEMA.COLUMNS
|
||||
WHERE TABLE_NAME = 'products'
|
||||
AND COLUMN_NAME != 'updated' -- Exclude the updated column
|
||||
ORDER BY ORDINAL_POSITION
|
||||
`);
|
||||
const columnNames = columns.map((col) => col.COLUMN_NAME);
|
||||
|
||||
// Get the missing products with all their data in one optimized query
|
||||
const [products] = await prodConnection.query(`
|
||||
SELECT
|
||||
p.pid,
|
||||
p.description AS title,
|
||||
p.notes AS description,
|
||||
p.itemnumber AS SKU,
|
||||
p.date_created,
|
||||
p.datein AS first_received,
|
||||
p.location,
|
||||
p.upc AS barcode,
|
||||
p.harmonized_tariff_code,
|
||||
p.stamp AS updated_at,
|
||||
CASE WHEN si.show + si.buyable > 0 THEN 1 ELSE 0 END AS visible,
|
||||
CASE
|
||||
WHEN p.reorder < 0 THEN 0
|
||||
WHEN (
|
||||
(IFNULL(pls.date_sold, '0000-00-00') = '0000-00-00' OR pls.date_sold <= DATE_SUB(CURDATE(), INTERVAL 5 YEAR))
|
||||
OR (p.datein = '0000-00-00 00:00:00' OR p.datein <= DATE_SUB(NOW(), INTERVAL 5 YEAR))
|
||||
OR (p.date_refill = '0000-00-00 00:00:00' OR p.date_refill <= DATE_SUB(NOW(), INTERVAL 5 YEAR))
|
||||
) THEN 0
|
||||
ELSE 1
|
||||
END AS replenishable,
|
||||
COALESCE(si.available_local, 0) as stock_quantity,
|
||||
COALESCE(
|
||||
(SELECT SUM(oi.qty_ordered - oi.qty_placed)
|
||||
FROM order_items oi
|
||||
JOIN _order o ON oi.order_id = o.order_id
|
||||
WHERE oi.prod_pid = p.pid
|
||||
AND o.date_placed != '0000-00-00 00:00:00'
|
||||
AND o.date_shipped = '0000-00-00 00:00:00'
|
||||
AND oi.pick_finished = 0
|
||||
AND oi.qty_back = 0
|
||||
AND o.order_status != 15
|
||||
AND o.order_status < 90
|
||||
AND oi.qty_ordered >= oi.qty_placed
|
||||
AND oi.qty_ordered > 0
|
||||
), 0
|
||||
) as pending_qty,
|
||||
COALESCE(ci.onpreorder, 0) as preorder_count,
|
||||
COALESCE(pnb.inventory, 0) as notions_inv_count,
|
||||
COALESCE(pcp.price_each, 0) as price,
|
||||
COALESCE(p.sellingprice, 0) AS regular_price,
|
||||
CASE
|
||||
WHEN EXISTS (SELECT 1 FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||
THEN (SELECT ROUND(AVG(costeach), 5) FROM product_inventory WHERE pid = p.pid AND count > 0)
|
||||
ELSE (SELECT costeach FROM product_inventory WHERE pid = p.pid ORDER BY daterec DESC LIMIT 1)
|
||||
END AS cost_price,
|
||||
NULL AS landing_cost_price,
|
||||
p.rating,
|
||||
p.rating_votes AS reviews,
|
||||
p.weight,
|
||||
p.length,
|
||||
p.width,
|
||||
p.height,
|
||||
(SELECT COUNT(*) FROM mybasket mb WHERE mb.item = p.pid AND mb.qty > 0) AS baskets,
|
||||
(SELECT COUNT(*) FROM product_notify pn WHERE pn.pid = p.pid) AS notifies,
|
||||
p.totalsold AS total_sold,
|
||||
p.country_of_origin,
|
||||
pls.date_sold as date_last_sold,
|
||||
GROUP_CONCAT(DISTINCT CASE WHEN pc.cat_id IS NOT NULL THEN pci.cat_id END) as category_ids
|
||||
FROM products p
|
||||
LEFT JOIN shop_inventory si ON p.pid = si.pid AND si.store = 0
|
||||
LEFT JOIN supplier_item_data sid ON p.pid = sid.pid
|
||||
LEFT JOIN suppliers s ON sid.supplier_id = s.supplierid
|
||||
LEFT JOIN product_category_index pci ON p.pid = pci.pid
|
||||
LEFT JOIN product_categories pc ON pci.cat_id = pc.cat_id
|
||||
AND pc.type IN (10, 20, 11, 21, 12, 13)
|
||||
AND pci.cat_id NOT IN (16, 17)
|
||||
LEFT JOIN product_categories pc1 ON p.company = pc1.cat_id
|
||||
LEFT JOIN product_categories pc2 ON p.line = pc2.cat_id
|
||||
LEFT JOIN product_categories pc3 ON p.subline = pc3.cat_id
|
||||
LEFT JOIN product_categories pc4 ON p.artist = pc4.cat_id
|
||||
LEFT JOIN product_last_sold pls ON p.pid = pls.pid
|
||||
LEFT JOIN current_inventory ci ON p.pid = ci.pid
|
||||
LEFT JOIN product_current_prices pcp ON p.pid = pcp.pid AND pcp.active = 1
|
||||
LEFT JOIN product_notions_b2b pnb ON p.pid = pnb.pid
|
||||
WHERE p.pid IN (?)
|
||||
GROUP BY p.pid
|
||||
`, [missingPids]);
|
||||
|
||||
// Add image URLs
|
||||
products.forEach(product => {
|
||||
const urls = getImageUrls(product.pid);
|
||||
product.image = urls.image;
|
||||
product.image_175 = urls.image_175;
|
||||
product.image_full = urls.image_full;
|
||||
});
|
||||
|
||||
let recordsAdded = 0;
|
||||
let recordsUpdated = 0;
|
||||
|
||||
if (products.length > 0) {
|
||||
// Map values in the same order as columns
|
||||
const productValues = products.flatMap(product =>
|
||||
columnNames.map(col => {
|
||||
const val = product[col] ?? null;
|
||||
if (col === "managing_stock") return 1;
|
||||
if (typeof val === "number") return val || 0;
|
||||
return val;
|
||||
})
|
||||
);
|
||||
|
||||
// Generate placeholders for all products
|
||||
const placeholders = products
|
||||
.map(() => `(${Array(columnNames.length).fill("?").join(",")})`)
|
||||
.join(",");
|
||||
|
||||
// Build and execute the query
|
||||
const query = `
|
||||
INSERT INTO products (${columnNames.join(",")})
|
||||
VALUES ${placeholders}
|
||||
ON DUPLICATE KEY UPDATE ${columnNames
|
||||
.filter((col) => col !== "pid")
|
||||
.map((col) => `${col} = VALUES(${col})`)
|
||||
.join(",")};
|
||||
`;
|
||||
|
||||
const result = await localConnection.query(query, productValues);
|
||||
recordsAdded = result.affectedRows - result.changedRows;
|
||||
recordsUpdated = result.changedRows;
|
||||
|
||||
// Handle category relationships if any
|
||||
const categoryRelationships = [];
|
||||
products.forEach(product => {
|
||||
if (product.category_ids) {
|
||||
const catIds = product.category_ids
|
||||
.split(",")
|
||||
.map(id => id.trim())
|
||||
.filter(id => id)
|
||||
.map(Number);
|
||||
catIds.forEach(catId => {
|
||||
if (catId) categoryRelationships.push([catId, product.pid]);
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
if (categoryRelationships.length > 0) {
|
||||
// Verify categories exist before inserting relationships
|
||||
const uniqueCatIds = [...new Set(categoryRelationships.map(([catId]) => catId))];
|
||||
const [existingCats] = await localConnection.query(
|
||||
"SELECT cat_id FROM categories WHERE cat_id IN (?)",
|
||||
[uniqueCatIds]
|
||||
);
|
||||
const existingCatIds = new Set(existingCats.map(c => c.cat_id));
|
||||
|
||||
// Filter relationships to only include existing categories
|
||||
const validRelationships = categoryRelationships.filter(([catId]) =>
|
||||
existingCatIds.has(catId)
|
||||
);
|
||||
|
||||
if (validRelationships.length > 0) {
|
||||
const catPlaceholders = validRelationships
|
||||
.map(() => "(?, ?)")
|
||||
.join(",");
|
||||
await localConnection.query(
|
||||
`INSERT IGNORE INTO product_categories (cat_id, pid)
|
||||
VALUES ${catPlaceholders}`,
|
||||
validRelationships.flat()
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return {
|
||||
status: "complete",
|
||||
totalImported: products.length,
|
||||
recordsAdded,
|
||||
recordsUpdated
|
||||
};
|
||||
} catch (error) {
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
importProducts,
|
||||
importMissingProducts
|
||||
};
|
||||
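The insert/update split above treats an existing row as changed only when some column really differs, comparing numeric columns within a small epsilon so DECIMAL/float round-trips don't trigger needless rewrites. A minimal standalone sketch of that rule (the `hasChanges` helper name and sample columns are illustrative, not part of the module):

```javascript
// Sketch of the change-detection rule used when splitting batch rows
// into inserts vs. updates: numbers compare equal within a tolerance,
// everything else uses strict inequality with null-coalescing.
function hasChanges(columns, existing, incoming, epsilon = 0.00001) {
  return columns.some(col => {
    const newVal = incoming[col] ?? null;
    const oldVal = existing[col] ?? null;
    if (typeof newVal === 'number' && typeof oldVal === 'number') {
      return Math.abs(newVal - oldVal) > epsilon;
    }
    return newVal !== oldVal;
  });
}

const cols = ['price', 'title'];
// A sub-epsilon float difference is not a change:
console.log(hasChanges(cols, { price: 9.99, title: 'Thread' },
                             { price: 9.990000001, title: 'Thread' })); // false
// A real price change is:
console.log(hasChanges(cols, { price: 9.99, title: 'Thread' },
                             { price: 10.49, title: 'Thread' })); // true
```

Rows for which this returns false are skipped entirely, which keeps the upsert batches (and the `updated` timestamp churn) small on incremental runs.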
548 inventory-server/scripts/import/purchase-orders.js (Normal file)
@@ -0,0 +1,548 @@
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');

async function importPurchaseOrders(prodConnection, localConnection, incrementalUpdate = true) {
  const startTime = Date.now();
  let recordsAdded = 0;
  let recordsUpdated = 0;

  try {
    // Get last sync info
    const [syncInfo] = await localConnection.query(
      "SELECT last_sync_timestamp FROM sync_status WHERE table_name = 'purchase_orders'"
    );
    const lastSyncTime = syncInfo?.[0]?.last_sync_timestamp || '1970-01-01';

    console.log('Purchase Orders: Using last sync time:', lastSyncTime);

    // Insert temporary table creation query for purchase orders
    await localConnection.query(`
      CREATE TABLE IF NOT EXISTS temp_purchase_orders (
        po_id INT UNSIGNED NOT NULL,
        pid INT UNSIGNED NOT NULL,
        vendor VARCHAR(255),
        date DATE,
        expected_date DATE,
        status INT,
        notes TEXT,
        PRIMARY KEY (po_id, pid)
      ) ENGINE=InnoDB DEFAULT CHARSET=utf8;
    `);

    outputProgress({
      operation: `Starting ${incrementalUpdate ? 'incremental' : 'full'} purchase orders import`,
      status: "running",
    });

    // Get column names first
    const [columns] = await localConnection.query(`
      SELECT COLUMN_NAME
      FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_NAME = 'purchase_orders'
      AND COLUMN_NAME != 'updated' -- Exclude the updated column
      ORDER BY ORDINAL_POSITION
    `);
    const columnNames = columns.map(col => col.COLUMN_NAME);

    // Build incremental conditions
    const incrementalWhereClause = incrementalUpdate
      ? `AND (
          p.date_updated > ?
          OR p.date_ordered > ?
          OR p.date_estin > ?
          OR r.date_updated > ?
          OR r.date_created > ?
          OR r.date_checked > ?
          OR rp.stamp > ?
          OR rp.received_date > ?
        )`
      : "";
    const incrementalParams = incrementalUpdate
      ? [lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime]
      : [];

    // First get all relevant PO IDs with basic info
    const [[{ total }]] = await prodConnection.query(`
      SELECT COUNT(*) as total
      FROM (
        SELECT DISTINCT pop.po_id, pop.pid
        FROM po p
        USE INDEX (idx_date_created)
        JOIN po_products pop ON p.po_id = pop.po_id
        JOIN suppliers s ON p.supplier_id = s.supplierid
        WHERE p.date_ordered >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
        ${incrementalUpdate ? `
        AND (
          p.date_updated > ?
          OR p.date_ordered > ?
          OR p.date_estin > ?
        )
        ` : ''}
        UNION
        SELECT DISTINCT r.receiving_id as po_id, rp.pid
        FROM receivings_products rp
        USE INDEX (received_date)
        LEFT JOIN receivings r ON r.receiving_id = rp.receiving_id
        WHERE rp.received_date >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
        ${incrementalUpdate ? `
        AND (
          r.date_created > ?
          OR r.date_checked > ?
          OR rp.stamp > ?
          OR rp.received_date > ?
        )
        ` : ''}
      ) all_items
    `, incrementalUpdate ? [
      lastSyncTime, lastSyncTime, lastSyncTime, // PO conditions
      lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime // Receiving conditions
    ] : []);

    console.log('Purchase Orders: Found changes:', total);

    const [poList] = await prodConnection.query(`
      SELECT DISTINCT
        COALESCE(p.po_id, r.receiving_id) as po_id,
        COALESCE(
          NULLIF(s1.companyname, ''),
          NULLIF(s2.companyname, ''),
          'Unknown Vendor'
        ) as vendor,
        CASE
          WHEN p.po_id IS NOT NULL THEN
            DATE(COALESCE(
              NULLIF(p.date_ordered, '0000-00-00 00:00:00'),
              p.date_created
            ))
          WHEN r.receiving_id IS NOT NULL THEN
            DATE(r.date_created)
        END as date,
        CASE
          WHEN p.date_estin = '0000-00-00' THEN NULL
          WHEN p.date_estin IS NULL THEN NULL
          WHEN p.date_estin NOT REGEXP '^[0-9]{4}-[0-9]{2}-[0-9]{2}$' THEN NULL
          ELSE p.date_estin
        END as expected_date,
        COALESCE(p.status, 50) as status,
        p.short_note as notes,
        p.notes as long_note
      FROM (
        SELECT po_id FROM po
        USE INDEX (idx_date_created)
        WHERE date_ordered >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
        ${incrementalUpdate ? `
        AND (
          date_ordered > ?
          OR date_updated > ?
          OR date_estin > ?
        )
        ` : ''}
        UNION
        SELECT DISTINCT r.receiving_id as po_id
        FROM receivings r
        JOIN receivings_products rp USE INDEX (received_date) ON r.receiving_id = rp.receiving_id
        WHERE rp.received_date >= DATE_SUB(CURRENT_DATE, INTERVAL ${incrementalUpdate ? '1' : '5'} YEAR)
        ${incrementalUpdate ? `
        AND (
          r.date_created > ?
          OR r.date_checked > ?
          OR rp.stamp > ?
          OR rp.received_date > ?
        )
        ` : ''}
      ) ids
      LEFT JOIN po p ON ids.po_id = p.po_id
      LEFT JOIN suppliers s1 ON p.supplier_id = s1.supplierid
      LEFT JOIN receivings r ON ids.po_id = r.receiving_id
      LEFT JOIN suppliers s2 ON r.supplier_id = s2.supplierid
      ORDER BY po_id
    `, incrementalUpdate ? [
      lastSyncTime, lastSyncTime, lastSyncTime, // PO conditions
      lastSyncTime, lastSyncTime, lastSyncTime, lastSyncTime // Receiving conditions
    ] : []);

    console.log('Sample PO dates:', poList.slice(0, 5).map(po => ({
      po_id: po.po_id,
      raw_date_ordered: po.raw_date_ordered,
      raw_date_created: po.raw_date_created,
      raw_date_estin: po.raw_date_estin,
      computed_date: po.date,
      expected_date: po.expected_date
    })));

    const totalItems = total;
    let processed = 0;

    const BATCH_SIZE = 5000;
    const PROGRESS_INTERVAL = 500;
    let lastProgressUpdate = Date.now();

    outputProgress({
      operation: `Starting purchase orders import - Processing ${totalItems} purchase order items`,
      status: "running",
    });

    for (let i = 0; i < poList.length; i += BATCH_SIZE) {
      const batch = poList.slice(i, Math.min(i + BATCH_SIZE, poList.length));
      const poIds = batch.map(po => po.po_id);

      // Get all products for these POs in one query
      const [poProducts] = await prodConnection.query(`
        SELECT
          pop.po_id,
          pop.pid,
          pr.itemnumber as sku,
          pr.description as name,
          pop.cost_each,
          pop.qty_each as ordered
        FROM po_products pop
        USE INDEX (PRIMARY)
        JOIN products pr ON pop.pid = pr.pid
        WHERE pop.po_id IN (?)
      `, [poIds]);

      // Process PO products in smaller sub-batches to avoid packet size issues
      const SUB_BATCH_SIZE = 5000;
      for (let j = 0; j < poProducts.length; j += SUB_BATCH_SIZE) {
        const productBatch = poProducts.slice(j, j + SUB_BATCH_SIZE);
        const productPids = [...new Set(productBatch.map(p => p.pid))];
        const batchPoIds = [...new Set(productBatch.map(p => p.po_id))];

        // Get receivings for this batch with employee names
        const [receivings] = await prodConnection.query(`
          SELECT
            r.po_id,
            rp.pid,
            rp.receiving_id,
            rp.qty_each,
            rp.cost_each,
            COALESCE(rp.received_date, r.date_created) as received_date,
            rp.received_by,
            CONCAT(e.firstname, ' ', e.lastname) as received_by_name,
            CASE
              WHEN r.po_id IS NULL THEN 2 -- No PO
              WHEN r.po_id IN (?) THEN 0 -- Original PO
              ELSE 1 -- Different PO
            END as is_alt_po
          FROM receivings_products rp
          USE INDEX (received_date)
          LEFT JOIN receivings r ON r.receiving_id = rp.receiving_id
          LEFT JOIN employees e ON rp.received_by = e.employeeid
          WHERE rp.pid IN (?)
          AND rp.received_date >= DATE_SUB(CURRENT_DATE, INTERVAL 5 YEAR)
          ORDER BY r.po_id, rp.pid, rp.received_date
        `, [batchPoIds, productPids]);

        // Create maps for this sub-batch
        const poProductMap = new Map();
        productBatch.forEach(product => {
          const key = `${product.po_id}-${product.pid}`;
          poProductMap.set(key, product);
        });

        const receivingMap = new Map();
        const altReceivingMap = new Map();
        const noPOReceivingMap = new Map();

        receivings.forEach(receiving => {
          const key = `${receiving.po_id}-${receiving.pid}`;
          if (receiving.is_alt_po === 2) {
            // No PO
            if (!noPOReceivingMap.has(receiving.pid)) {
              noPOReceivingMap.set(receiving.pid, []);
            }
            noPOReceivingMap.get(receiving.pid).push(receiving);
          } else if (receiving.is_alt_po === 1) {
            // Different PO
            if (!altReceivingMap.has(receiving.pid)) {
              altReceivingMap.set(receiving.pid, []);
            }
            altReceivingMap.get(receiving.pid).push(receiving);
          } else {
            // Original PO
            if (!receivingMap.has(key)) {
              receivingMap.set(key, []);
            }
            receivingMap.get(key).push(receiving);
          }
        });

        // Verify PIDs exist
        const [existingPids] = await localConnection.query(
          'SELECT pid FROM products WHERE pid IN (?)',
          [productPids]
        );
        const validPids = new Set(existingPids.map(p => p.pid));

        // First check which PO lines already exist and get their current values
        const poLines = Array.from(poProductMap.values())
          .filter(p => validPids.has(p.pid))
          .map(p => [p.po_id, p.pid]);

        const [existingPOs] = await localConnection.query(
          `SELECT ${columnNames.join(',')} FROM purchase_orders WHERE (po_id, pid) IN (${poLines.map(() => "(?,?)").join(",")})`,
          poLines.flat()
        );
        const existingPOMap = new Map(
          existingPOs.map(po => [`${po.po_id}-${po.pid}`, po])
        );

        // Split into inserts and updates
        const insertsAndUpdates = { inserts: [], updates: [] };
        let batchProcessed = 0;

        for (const po of batch) {
          const poProducts = Array.from(poProductMap.values())
            .filter(p => p.po_id === po.po_id && validPids.has(p.pid));

          for (const product of poProducts) {
            const key = `${po.po_id}-${product.pid}`;
            const receivingHistory = receivingMap.get(key) || [];
            const altReceivingHistory = altReceivingMap.get(product.pid) || [];
            const noPOReceivingHistory = noPOReceivingMap.get(product.pid) || [];

            // Combine all receivings and sort by date
            const allReceivings = [
              ...receivingHistory.map(r => ({ ...r, type: 'original' })),
              ...altReceivingHistory.map(r => ({ ...r, type: 'alternate' })),
              ...noPOReceivingHistory.map(r => ({ ...r, type: 'no_po' }))
            ].sort((a, b) => new Date(a.received_date || '9999-12-31') - new Date(b.received_date || '9999-12-31'));

            // Split receivings into original PO and others
            const originalPOReceivings = allReceivings.filter(r => r.type === 'original');
            const otherReceivings = allReceivings.filter(r => r.type !== 'original');

            // Track FIFO fulfillment
            let remainingToFulfill = product.ordered;
            const fulfillmentTracking = [];
            let totalReceived = 0;
            let actualCost = null; // Will store the cost of the first receiving that fulfills this PO
            let firstFulfillmentReceiving = null;
            let lastFulfillmentReceiving = null;

            for (const receiving of allReceivings) {
              // Convert quantities to base units using supplier data
              const baseQtyReceived = receiving.qty_each * (
                receiving.type === 'original' ? 1 :
                Math.max(1, product.supplier_qty_per_unit || 1)
              );
              const qtyToApply = Math.min(remainingToFulfill, baseQtyReceived);

              if (qtyToApply > 0) {
                // If this is the first receiving being applied, use its cost
                if (actualCost === null && receiving.cost_each > 0) {
                  actualCost = receiving.cost_each;
                  firstFulfillmentReceiving = receiving;
                }
                lastFulfillmentReceiving = receiving;
                fulfillmentTracking.push({
                  receiving_id: receiving.receiving_id,
                  qty_applied: qtyToApply,
                  qty_total: baseQtyReceived,
                  cost: receiving.cost_each || actualCost || product.cost_each,
                  date: receiving.received_date,
                  received_by: receiving.received_by,
                  received_by_name: receiving.received_by_name || 'Unknown',
                  type: receiving.type,
                  remaining_qty: baseQtyReceived - qtyToApply
                });
                remainingToFulfill -= qtyToApply;
              } else {
                // Track excess receivings
                fulfillmentTracking.push({
                  receiving_id: receiving.receiving_id,
                  qty_applied: 0,
                  qty_total: baseQtyReceived,
                  cost: receiving.cost_each || actualCost || product.cost_each,
                  date: receiving.received_date,
                  received_by: receiving.received_by,
                  received_by_name: receiving.received_by_name || 'Unknown',
                  type: receiving.type,
                  is_excess: true
                });
              }
              totalReceived += baseQtyReceived;
            }

            const receiving_status = !totalReceived ? 1 : // created
              remainingToFulfill > 0 ? 30 : // partial
              40; // full

            function formatDate(dateStr) {
              if (!dateStr) return null;
              if (dateStr === '0000-00-00' || dateStr === '0000-00-00 00:00:00') return null;
              if (typeof dateStr === 'string' && !dateStr.match(/^\d{4}-\d{2}-\d{2}/)) return null;
              try {
                const date = new Date(dateStr);
                if (isNaN(date.getTime())) return null;
                if (date.getFullYear() < 1900 || date.getFullYear() > 2100) return null;
                return date.toISOString().split('T')[0];
              } catch (e) {
                return null;
              }
            }

            const rowValues = columnNames.map(col => {
              switch (col) {
                case 'po_id': return po.po_id;
                case 'vendor': return po.vendor;
                case 'date': return formatDate(po.date);
                case 'expected_date': return formatDate(po.expected_date);
                case 'pid': return product.pid;
                case 'sku': return product.sku;
                case 'name': return product.name;
                case 'cost_price': return actualCost || product.cost_each;
                case 'po_cost_price': return product.cost_each;
                case 'status': return po.status;
                case 'notes': return po.notes;
                case 'long_note': return po.long_note;
                case 'ordered': return product.ordered;
                case 'received': return totalReceived;
                case 'unfulfilled': return remainingToFulfill;
                case 'excess_received': return Math.max(0, totalReceived - product.ordered);
                case 'received_date': return formatDate(firstFulfillmentReceiving?.received_date);
                case 'last_received_date': return formatDate(lastFulfillmentReceiving?.received_date);
                case 'received_by': return firstFulfillmentReceiving?.received_by_name || null;
                case 'receiving_status': return receiving_status;
                case 'receiving_history': return JSON.stringify({
                  fulfillment: fulfillmentTracking,
                  ordered_qty: product.ordered,
                  total_received: totalReceived,
                  remaining_unfulfilled: remainingToFulfill,
                  excess_received: Math.max(0, totalReceived - product.ordered),
                  po_cost: product.cost_each,
                  actual_cost: actualCost || product.cost_each
                });
                default: return null;
              }
            });

            if (existingPOMap.has(key)) {
              const existing = existingPOMap.get(key);
              // Check if any values are different
              const hasChanges = columnNames.some(col => {
                const newVal = rowValues[columnNames.indexOf(col)];
                const oldVal = existing[col] ?? null;
                // Special handling for numbers to avoid type coercion issues
                if (typeof newVal === 'number' && typeof oldVal === 'number') {
                  return Math.abs(newVal - oldVal) > 0.00001; // Allow for tiny floating point differences
                }
                // Special handling for receiving_history - parse and compare
                if (col === 'receiving_history') {
                  const newHistory = JSON.parse(newVal || '{}');
                  const oldHistory = JSON.parse(oldVal || '{}');
                  return JSON.stringify(newHistory) !== JSON.stringify(oldHistory);
                }
                return newVal !== oldVal;
              });

              if (hasChanges) {
                insertsAndUpdates.updates.push({
                  po_id: po.po_id,
                  pid: product.pid,
                  values: rowValues
                });
              }
            } else {
              insertsAndUpdates.inserts.push({
                po_id: po.po_id,
                pid: product.pid,
                values: rowValues
              });
            }
            batchProcessed++;
          }
        }

        // Handle inserts
        if (insertsAndUpdates.inserts.length > 0) {
          const insertPlaceholders = insertsAndUpdates.inserts
            .map(() => `(${Array(columnNames.length).fill("?").join(",")})`)
            .join(",");

          const insertResult = await localConnection.query(`
            INSERT INTO purchase_orders (${columnNames.join(",")})
            VALUES ${insertPlaceholders}
          `, insertsAndUpdates.inserts.map(i => i.values).flat());

          const affectedRows = insertResult[0].affectedRows;
          // For an upsert, MySQL counts rows twice for updates
          // So if affectedRows is odd, we have (updates * 2 + inserts)
          const updates = Math.floor(affectedRows / 2);
          const inserts = affectedRows - (updates * 2);

          recordsAdded += inserts;
          recordsUpdated += Math.floor(updates); // Ensure we never have fractional updates
          processed += batchProcessed;
        }

        // Handle updates - now we know these actually have changes
        if (insertsAndUpdates.updates.length > 0) {
          const updatePlaceholders = insertsAndUpdates.updates
            .map(() => `(${Array(columnNames.length).fill("?").join(",")})`)
            .join(",");

          const updateResult = await localConnection.query(`
            INSERT INTO purchase_orders (${columnNames.join(",")})
            VALUES ${updatePlaceholders}
            ON DUPLICATE KEY UPDATE ${columnNames
|
||||
.filter((col) => col !== "po_id" && col !== "pid")
|
||||
.map((col) => `${col} = VALUES(${col})`)
|
||||
.join(",")};
|
||||
`, insertsAndUpdates.updates.map(u => u.values).flat());
|
||||
|
||||
const affectedRows = updateResult[0].affectedRows;
|
||||
// For an upsert, MySQL counts rows twice for updates
|
||||
// So if affectedRows is odd, we have (updates * 2 + inserts)
|
||||
const updates = Math.floor(affectedRows / 2);
|
||||
const inserts = affectedRows - (updates * 2);
|
||||
|
||||
recordsUpdated += Math.floor(updates); // Ensure we never have fractional updates
|
||||
processed += batchProcessed;
|
||||
}
|
||||
|
||||
// Update progress based on time interval
|
||||
const now = Date.now();
|
||||
if (now - lastProgressUpdate >= PROGRESS_INTERVAL || processed === totalItems) {
|
||||
outputProgress({
|
||||
status: "running",
|
||||
operation: "Purchase orders import",
|
||||
current: processed,
|
||||
total: totalItems,
|
||||
elapsed: formatElapsedTime((Date.now() - startTime) / 1000),
|
||||
remaining: estimateRemaining(startTime, processed, totalItems),
|
||||
rate: calculateRate(startTime, processed)
|
||||
});
|
||||
lastProgressUpdate = now;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Only update sync status if we get here (no errors thrown)
|
||||
await localConnection.query(`
|
||||
INSERT INTO sync_status (table_name, last_sync_timestamp)
|
||||
VALUES ('purchase_orders', NOW())
|
||||
ON DUPLICATE KEY UPDATE
|
||||
last_sync_timestamp = NOW(),
|
||||
last_sync_id = LAST_INSERT_ID(last_sync_id)
|
||||
`);
|
||||
|
||||
return {
|
||||
status: "complete",
|
||||
totalImported: totalItems,
|
||||
recordsAdded: recordsAdded || 0,
|
||||
recordsUpdated: recordsUpdated || 0,
|
||||
incrementalUpdate,
|
||||
lastSyncTime
|
||||
};
|
||||
|
||||
} catch (error) {
|
||||
outputProgress({
|
||||
operation: `${incrementalUpdate ? 'Incremental' : 'Full'} purchase orders import failed`,
|
||||
status: "error",
|
||||
error: error.message,
|
||||
});
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = importPurchaseOrders;
|
||||
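The `affectedRows` bookkeeping above relies on MySQL reporting 1 affected row per inserted row and 2 per row changed by `ON DUPLICATE KEY UPDATE` (0 for rows left unchanged). A minimal sketch of that decomposition follows; `splitAffectedRows` is a hypothetical helper, not part of this repo, and since `inserts + 2 * updates` is not uniquely decomposable, the result is an estimate rather than an exact count:

```javascript
// Estimate insert/update counts from a MySQL upsert's affectedRows.
// MySQL reports 1 per inserted row and 2 per row changed by
// ON DUPLICATE KEY UPDATE, so affectedRows = inserts + 2 * updates.
// That sum is ambiguous, so this mirrors the heuristic used above:
// assume as many updates as possible, remainder counts as inserts.
function splitAffectedRows(affectedRows) {
  const updates = Math.floor(affectedRows / 2);
  const inserts = affectedRows - updates * 2; // 0 or 1 under this heuristic
  return { inserts, updates };
}

splitAffectedRows(7); // { inserts: 1, updates: 3 } — but 3 inserts + 2 updates also sums to 7
```

Because of this ambiguity, the split is only reliable when a batch is known to be all-inserts or all-updates, which is why the code pre-sorts rows into `insertsAndUpdates.inserts` and `insertsAndUpdates.updates` before querying.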
88	inventory-server/scripts/import/utils.js	Normal file
@@ -0,0 +1,88 @@
const mysql = require("mysql2/promise");
const { Client } = require("ssh2");
const dotenv = require("dotenv");
const path = require("path");

// Helper function to setup SSH tunnel
async function setupSshTunnel(sshConfig) {
return new Promise((resolve, reject) => {
const ssh = new Client();

ssh.on('error', (err) => {
console.error('SSH connection error:', err);
});

ssh.on('end', () => {
console.log('SSH connection ended normally');
});

ssh.on('close', () => {
console.log('SSH connection closed');
});

ssh
.on("ready", () => {
ssh.forwardOut(
"127.0.0.1",
0,
sshConfig.prodDbConfig.host,
sshConfig.prodDbConfig.port,
async (err, stream) => {
if (err) return reject(err); // return so resolve() is not also called on error
resolve({ ssh, stream });
}
);
})
.connect(sshConfig.ssh);
});
}

// Helper function to setup database connections
async function setupConnections(sshConfig) {
const tunnel = await setupSshTunnel(sshConfig);

const prodConnection = await mysql.createConnection({
...sshConfig.prodDbConfig,
stream: tunnel.stream,
});

const localConnection = await mysql.createPool({
...sshConfig.localDbConfig,
waitForConnections: true,
connectionLimit: 10,
queueLimit: 0
});

return {
ssh: tunnel.ssh,
prodConnection,
localConnection
};
}

// Helper function to close connections
async function closeConnections(connections) {
const { ssh, prodConnection, localConnection } = connections;

try {
if (prodConnection) await prodConnection.end();
if (localConnection) await localConnection.end();

// Wait a bit for any pending data to be written before closing SSH
await new Promise(resolve => setTimeout(resolve, 100));

if (ssh) {
ssh.on('close', () => {
console.log('SSH connection closed cleanly');
});
ssh.end();
}
} catch (err) {
console.error('Error during cleanup:', err);
}
}

module.exports = {
setupConnections,
closeConnections
};
@@ -1,21 +1,61 @@
const { outputProgress } = require('./utils/progress');
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate, logError } = require('./utils/progress');
const { getConnection } = require('./utils/db');

async function calculateBrandMetrics(startTime, totalProducts, processedCount) {
async function calculateBrandMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
const connection = await getConnection();
let success = false;
let processedOrders = 0;

try {
if (isCancelled) {
outputProgress({
status: 'cancelled',
operation: 'Brand metrics calculation cancelled',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: null,
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
return {
processedProducts: processedCount,
processedOrders: 0,
processedPurchaseOrders: 0,
success
};
}

// Get order count that will be processed
const [orderCount] = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;

outputProgress({
status: 'running',
operation: 'Calculating brand metrics',
current: Math.floor(totalProducts * 0.95),
operation: 'Starting brand metrics calculation',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, Math.floor(totalProducts * 0.95), totalProducts),
rate: calculateRate(startTime, Math.floor(totalProducts * 0.95)),
percentage: '95'
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

// Calculate brand metrics
// Calculate brand metrics with optimized queries
await connection.query(`
INSERT INTO brand_metrics (
brand,
@@ -28,59 +68,103 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount) {
avg_margin,
growth_rate
)
WITH brand_data AS (
WITH filtered_products AS (
SELECT
p.*,
CASE
WHEN p.stock_quantity <= 5000 AND p.stock_quantity >= 0
THEN p.pid
END as valid_pid,
CASE
WHEN p.visible = true
AND p.stock_quantity <= 5000
AND p.stock_quantity >= 0
THEN p.pid
END as active_pid,
CASE
WHEN p.stock_quantity IS NULL
OR p.stock_quantity < 0
OR p.stock_quantity > 5000
THEN 0
ELSE p.stock_quantity
END as valid_stock
FROM products p
WHERE p.brand IS NOT NULL
),
sales_periods AS (
SELECT
p.brand,
COUNT(DISTINCT p.product_id) as product_count,
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.product_id END) as active_products,
SUM(p.stock_quantity) as total_stock_units,
SUM(p.stock_quantity * p.cost_price) as total_stock_cost,
SUM(p.stock_quantity * p.price) as total_stock_retail,
SUM(o.price * o.quantity) as total_revenue,
SUM(o.quantity * (o.price - COALESCE(o.discount, 0))) as period_revenue,
SUM(o.quantity * (o.price - COALESCE(o.discount, 0) - p.cost_price)) as period_margin,
COUNT(DISTINCT DATE(o.date)) as period_days,
CASE
WHEN SUM(o.price * o.quantity) > 0 THEN
(SUM((o.price - p.cost_price) * o.quantity) * 100.0) / SUM(o.price * o.quantity)
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH) THEN 'current'
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH) THEN 'previous'
END as period_type
FROM filtered_products p
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
GROUP BY p.brand, period_type
),
brand_data AS (
SELECT
p.brand,
COUNT(DISTINCT p.valid_pid) as product_count,
COUNT(DISTINCT p.active_pid) as active_products,
SUM(p.valid_stock) as total_stock_units,
SUM(p.valid_stock * p.cost_price) as total_stock_cost,
SUM(p.valid_stock * p.price) as total_stock_retail,
COALESCE(SUM(o.quantity * (o.price - COALESCE(o.discount, 0))), 0) as total_revenue,
CASE
WHEN SUM(o.quantity * o.price) > 0
THEN GREATEST(
-100.0,
LEAST(
100.0,
(
SUM(o.quantity * o.price) - -- Use gross revenue (before discounts)
SUM(o.quantity * COALESCE(p.cost_price, 0)) -- Total costs
) * 100.0 /
NULLIF(SUM(o.quantity * o.price), 0) -- Divide by gross revenue
)
)
ELSE 0
END as avg_margin,
-- Current period (last 3 months)
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
THEN COALESCE(o.quantity * o.price, 0)
ELSE 0
END) as current_period_sales,
-- Previous year same period
SUM(CASE
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH) AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
THEN COALESCE(o.quantity * o.price, 0)
ELSE 0
END) as previous_year_period_sales
FROM products p
LEFT JOIN orders o ON p.product_id = o.product_id AND o.canceled = false
WHERE p.brand IS NOT NULL
END as avg_margin
FROM filtered_products p
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
GROUP BY p.brand
)
SELECT
brand,
product_count,
active_products,
total_stock_units,
total_stock_cost,
total_stock_retail,
total_revenue,
avg_margin,
bd.brand,
bd.product_count,
bd.active_products,
bd.total_stock_units,
bd.total_stock_cost,
bd.total_stock_retail,
bd.total_revenue,
bd.avg_margin,
CASE
WHEN previous_year_period_sales = 0 AND current_period_sales > 0 THEN 100.0
WHEN previous_year_period_sales = 0 THEN 0.0
ELSE LEAST(
GREATEST(
((current_period_sales - previous_year_period_sales) /
NULLIF(previous_year_period_sales, 0)) * 100.0,
-100.0
),
999.99
WHEN MAX(CASE WHEN sp.period_type = 'previous' THEN sp.period_revenue END) = 0
AND MAX(CASE WHEN sp.period_type = 'current' THEN sp.period_revenue END) > 0
THEN 100.0
WHEN MAX(CASE WHEN sp.period_type = 'previous' THEN sp.period_revenue END) = 0
THEN 0.0
ELSE GREATEST(
-100.0,
LEAST(
((MAX(CASE WHEN sp.period_type = 'current' THEN sp.period_revenue END) -
MAX(CASE WHEN sp.period_type = 'previous' THEN sp.period_revenue END)) /
NULLIF(ABS(MAX(CASE WHEN sp.period_type = 'previous' THEN sp.period_revenue END)), 0)) * 100.0,
999.99
)
)
END as growth_rate
FROM brand_data
FROM brand_data bd
LEFT JOIN sales_periods sp ON bd.brand = sp.brand
GROUP BY bd.brand, bd.product_count, bd.active_products, bd.total_stock_units,
bd.total_stock_cost, bd.total_stock_retail, bd.total_revenue, bd.avg_margin
ON DUPLICATE KEY UPDATE
product_count = VALUES(product_count),
active_products = VALUES(active_products),
@@ -93,7 +177,31 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount) {
last_calculated_at = CURRENT_TIMESTAMP
`);

// Calculate brand time-based metrics
processedCount = Math.floor(totalProducts * 0.97);
outputProgress({
status: 'running',
operation: 'Brand metrics calculated, starting time-based metrics',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

// Calculate brand time-based metrics with optimized query
await connection.query(`
INSERT INTO brand_time_metrics (
brand,
@@ -107,26 +215,51 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount) {
total_revenue,
avg_margin
)
SELECT
p.brand,
YEAR(o.date) as year,
MONTH(o.date) as month,
COUNT(DISTINCT p.product_id) as product_count,
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.product_id END) as active_products,
SUM(p.stock_quantity) as total_stock_units,
SUM(p.stock_quantity * p.cost_price) as total_stock_cost,
SUM(p.stock_quantity * p.price) as total_stock_retail,
SUM(o.price * o.quantity) as total_revenue,
CASE
WHEN SUM(o.price * o.quantity) > 0 THEN
(SUM((o.price - p.cost_price) * o.quantity) * 100.0) / SUM(o.price * o.quantity)
ELSE 0
END as avg_margin
FROM products p
LEFT JOIN orders o ON p.product_id = o.product_id AND o.canceled = false
WHERE p.brand IS NOT NULL
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY p.brand, YEAR(o.date), MONTH(o.date)
WITH filtered_products AS (
SELECT
p.*,
CASE WHEN p.stock_quantity <= 5000 THEN p.pid END as valid_pid,
CASE WHEN p.visible = true AND p.stock_quantity <= 5000 THEN p.pid END as active_pid,
CASE
WHEN p.stock_quantity IS NULL OR p.stock_quantity < 0 OR p.stock_quantity > 5000 THEN 0
ELSE p.stock_quantity
END as valid_stock
FROM products p
WHERE p.brand IS NOT NULL
),
monthly_metrics AS (
SELECT
p.brand,
YEAR(o.date) as year,
MONTH(o.date) as month,
COUNT(DISTINCT p.valid_pid) as product_count,
COUNT(DISTINCT p.active_pid) as active_products,
SUM(p.valid_stock) as total_stock_units,
SUM(p.valid_stock * p.cost_price) as total_stock_cost,
SUM(p.valid_stock * p.price) as total_stock_retail,
SUM(o.quantity * o.price) as total_revenue,
CASE
WHEN SUM(o.quantity * o.price) > 0
THEN GREATEST(
-100.0,
LEAST(
100.0,
(
SUM(o.quantity * o.price) - -- Use gross revenue (before discounts)
SUM(o.quantity * COALESCE(p.cost_price, 0)) -- Total costs
) * 100.0 /
NULLIF(SUM(o.quantity * o.price), 0) -- Divide by gross revenue
)
)
ELSE 0
END as avg_margin
FROM filtered_products p
LEFT JOIN orders o ON p.pid = o.pid AND o.canceled = false
WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY p.brand, YEAR(o.date), MONTH(o.date)
)
SELECT *
FROM monthly_metrics
ON DUPLICATE KEY UPDATE
product_count = VALUES(product_count),
active_products = VALUES(active_products),
@@ -137,9 +270,48 @@ async function calculateBrandMetrics(startTime, totalProducts, processedCount) {
avg_margin = VALUES(avg_margin)
`);

return Math.floor(totalProducts * 0.98);
processedCount = Math.floor(totalProducts * 0.99);
outputProgress({
status: 'running',
operation: 'Brand time-based metrics calculated',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

// If we get here, everything completed successfully
success = true;

// Update calculate_status
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('brand_metrics', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
`);

return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

} catch (error) {
success = false;
logError(error, 'Error calculating brand metrics');
throw error;
} finally {
connection.release();
if (connection) {
connection.release();
}
}
}

@@ -1,112 +1,326 @@
const { outputProgress } = require('./utils/progress');
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate, logError } = require('./utils/progress');
const { getConnection } = require('./utils/db');

async function calculateCategoryMetrics(startTime, totalProducts, processedCount) {
async function calculateCategoryMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
const connection = await getConnection();
let success = false;
let processedOrders = 0;

try {
if (isCancelled) {
outputProgress({
status: 'cancelled',
operation: 'Category metrics calculation cancelled',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: null,
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
return {
processedProducts: processedCount,
processedOrders: 0,
processedPurchaseOrders: 0,
success
};
}

// Get order count that will be processed
const [orderCount] = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;

outputProgress({
status: 'running',
operation: 'Calculating category metrics',
current: Math.floor(totalProducts * 0.85),
operation: 'Starting category metrics calculation',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, Math.floor(totalProducts * 0.85), totalProducts),
rate: calculateRate(startTime, Math.floor(totalProducts * 0.85)),
percentage: '85'
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

// Calculate category performance metrics
// First, calculate base category metrics
await connection.query(`
INSERT INTO category_metrics (
category_id,
product_count,
active_products,
total_value,
avg_margin,
turnover_rate,
growth_rate,
status
)
WITH category_sales AS (
SELECT
c.id as category_id,
COUNT(DISTINCT p.product_id) as product_count,
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.product_id END) as active_products,
SUM(p.stock_quantity * p.cost_price) as total_value,
CASE
WHEN SUM(o.price * o.quantity) > 0
THEN (SUM((o.price - p.cost_price) * o.quantity) * 100.0) / SUM(o.price * o.quantity)
ELSE 0
END as avg_margin,
CASE
WHEN AVG(GREATEST(p.stock_quantity, 0)) >= 0.01
THEN LEAST(
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 1 YEAR)
THEN COALESCE(o.quantity, 0)
ELSE 0
END) /
GREATEST(
AVG(GREATEST(p.stock_quantity, 0)),
1.0
),
999.99
)
ELSE 0
END as turnover_rate,
-- Current period (last 3 months)
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
THEN COALESCE(o.quantity * o.price, 0)
ELSE 0
END) as current_period_sales,
-- Previous year same period
SUM(CASE
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH) AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
THEN COALESCE(o.quantity * o.price, 0)
ELSE 0
END) as previous_year_period_sales,
c.status
FROM categories c
LEFT JOIN product_categories pc ON c.id = pc.category_id
LEFT JOIN products p ON pc.product_id = p.product_id
LEFT JOIN orders o ON p.product_id = o.product_id AND o.canceled = false
GROUP BY c.id, c.status
status,
last_calculated_at
)
SELECT
category_id,
product_count,
active_products,
total_value,
COALESCE(avg_margin, 0) as avg_margin,
COALESCE(turnover_rate, 0) as turnover_rate,
-- Enhanced YoY growth rate calculation
CASE
WHEN previous_year_period_sales = 0 AND current_period_sales > 0 THEN 100.0
WHEN previous_year_period_sales = 0 THEN 0.0
ELSE LEAST(
GREATEST(
((current_period_sales - previous_year_period_sales) /
NULLIF(previous_year_period_sales, 0)) * 100.0,
-100.0
),
999.99
)
END as growth_rate,
status
FROM category_sales
c.cat_id,
COUNT(DISTINCT p.pid) as product_count,
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.pid END) as active_products,
COALESCE(SUM(p.stock_quantity * p.cost_price), 0) as total_value,
c.status,
NOW() as last_calculated_at
FROM categories c
LEFT JOIN product_categories pc ON c.cat_id = pc.cat_id
LEFT JOIN products p ON pc.pid = p.pid
GROUP BY c.cat_id, c.status
ON DUPLICATE KEY UPDATE
product_count = VALUES(product_count),
active_products = VALUES(active_products),
total_value = VALUES(total_value),
avg_margin = VALUES(avg_margin),
turnover_rate = VALUES(turnover_rate),
growth_rate = VALUES(growth_rate),
status = VALUES(status),
last_calculated_at = CURRENT_TIMESTAMP
last_calculated_at = VALUES(last_calculated_at)
`);

// Calculate category time-based metrics
processedCount = Math.floor(totalProducts * 0.90);
outputProgress({
status: 'running',
operation: 'Base category metrics calculated, updating with margin data',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

// Then update with margin and turnover data
await connection.query(`
WITH category_sales AS (
SELECT
pc.cat_id,
SUM(o.quantity * o.price) as total_sales,
SUM(o.quantity * (o.price - p.cost_price)) as total_margin,
SUM(o.quantity) as units_sold,
AVG(GREATEST(p.stock_quantity, 0)) as avg_stock,
COUNT(DISTINCT DATE(o.date)) as active_days
FROM product_categories pc
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
LEFT JOIN turnover_config tc ON
(tc.category_id = pc.cat_id AND tc.vendor = p.vendor) OR
(tc.category_id = pc.cat_id AND tc.vendor IS NULL) OR
(tc.category_id IS NULL AND tc.vendor = p.vendor) OR
(tc.category_id IS NULL AND tc.vendor IS NULL)
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL COALESCE(tc.calculation_period_days, 30) DAY)
GROUP BY pc.cat_id
)
UPDATE category_metrics cm
JOIN category_sales cs ON cm.category_id = cs.cat_id
LEFT JOIN turnover_config tc ON
(tc.category_id = cm.category_id AND tc.vendor IS NULL) OR
(tc.category_id IS NULL AND tc.vendor IS NULL)
SET
cm.avg_margin = COALESCE(cs.total_margin * 100.0 / NULLIF(cs.total_sales, 0), 0),
cm.turnover_rate = CASE
WHEN cs.avg_stock > 0 AND cs.active_days > 0
THEN LEAST(
(cs.units_sold / cs.avg_stock) * (365.0 / cs.active_days),
999.99
)
ELSE 0
END,
cm.last_calculated_at = NOW()
`);

processedCount = Math.floor(totalProducts * 0.95);
outputProgress({
status: 'running',
operation: 'Margin data updated, calculating growth rates',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

// Finally update growth rates
await connection.query(`
WITH current_period AS (
SELECT
pc.cat_id,
SUM(o.quantity * (o.price - COALESCE(o.discount, 0)) /
(1 + COALESCE(ss.seasonality_factor, 0))) as revenue,
SUM(o.quantity * (o.price - COALESCE(o.discount, 0) - p.cost_price)) as gross_profit,
COUNT(DISTINCT DATE(o.date)) as days
FROM product_categories pc
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
GROUP BY pc.cat_id
),
previous_period AS (
SELECT
pc.cat_id,
SUM(o.quantity * (o.price - COALESCE(o.discount, 0)) /
(1 + COALESCE(ss.seasonality_factor, 0))) as revenue,
COUNT(DISTINCT DATE(o.date)) as days
FROM product_categories pc
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
WHERE o.canceled = false
AND o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
AND DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY pc.cat_id
),
trend_data AS (
SELECT
pc.cat_id,
MONTH(o.date) as month,
SUM(o.quantity * (o.price - COALESCE(o.discount, 0)) /
(1 + COALESCE(ss.seasonality_factor, 0))) as revenue,
COUNT(DISTINCT DATE(o.date)) as days_in_month
FROM product_categories pc
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
LEFT JOIN sales_seasonality ss ON MONTH(o.date) = ss.month
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 15 MONTH)
GROUP BY pc.cat_id, MONTH(o.date)
),
trend_stats AS (
SELECT
cat_id,
COUNT(*) as n,
AVG(month) as avg_x,
AVG(revenue / NULLIF(days_in_month, 0)) as avg_y,
SUM(month * (revenue / NULLIF(days_in_month, 0))) as sum_xy,
SUM(month * month) as sum_xx
FROM trend_data
GROUP BY cat_id
HAVING COUNT(*) >= 6
),
trend_analysis AS (
SELECT
cat_id,
((n * sum_xy) - (avg_x * n * avg_y)) /
NULLIF((n * sum_xx) - (n * avg_x * avg_x), 0) as trend_slope,
avg_y as avg_daily_revenue
FROM trend_stats
),
margin_calc AS (
SELECT
pc.cat_id,
CASE
WHEN SUM(o.quantity * o.price) > 0 THEN
GREATEST(
-100.0,
LEAST(
100.0,
(
SUM(o.quantity * o.price) - -- Use gross revenue (before discounts)
SUM(o.quantity * COALESCE(p.cost_price, 0)) -- Total costs
) * 100.0 /
NULLIF(SUM(o.quantity * o.price), 0) -- Divide by gross revenue
)
)
ELSE NULL
END as avg_margin
FROM product_categories pc
JOIN products p ON pc.pid = p.pid
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 3 MONTH)
GROUP BY pc.cat_id
)
UPDATE category_metrics cm
LEFT JOIN current_period cp ON cm.category_id = cp.cat_id
LEFT JOIN previous_period pp ON cm.category_id = pp.cat_id
LEFT JOIN trend_analysis ta ON cm.category_id = ta.cat_id
LEFT JOIN margin_calc mc ON cm.category_id = mc.cat_id
SET
cm.growth_rate = CASE
WHEN pp.revenue = 0 AND COALESCE(cp.revenue, 0) > 0 THEN 100.0
WHEN pp.revenue = 0 OR cp.revenue IS NULL THEN 0.0
WHEN ta.trend_slope IS NOT NULL THEN
GREATEST(
-100.0,
LEAST(
(ta.trend_slope / NULLIF(ta.avg_daily_revenue, 0)) * 365 * 100,
999.99
)
)
ELSE
GREATEST(
-100.0,
LEAST(
((COALESCE(cp.revenue, 0) - pp.revenue) /
NULLIF(ABS(pp.revenue), 0)) * 100.0,
999.99
)
)
END,
cm.avg_margin = COALESCE(mc.avg_margin, cm.avg_margin),
|
||||
cm.last_calculated_at = NOW()
|
||||
WHERE cp.cat_id IS NOT NULL OR pp.cat_id IS NOT NULL
|
||||
`);
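The trend branch of the growth-rate query above divides a least-squares slope by average daily revenue and annualises it, clamped to [-100, 999.99]. A minimal JavaScript sketch of that calculation, using the textbook slope formula slope = (nΣxy − ΣxΣy) / (nΣxx − (Σx)²); the function name and input shape are hypothetical, and note that the query's own expansion of these terms appears to drop one factor of n in both numerator and denominator relative to this formula:

```javascript
// points: [{ month, dailyRevenue }], one entry per month with sales.
// Returns an annualised growth-rate percentage, or null when there is
// too little data (mirrors the HAVING COUNT(*) >= 6 guard above).
function trendGrowthRate(points) {
  const n = points.length;
  if (n < 6) return null;
  const sumX = points.reduce((s, p) => s + p.month, 0);
  const sumY = points.reduce((s, p) => s + p.dailyRevenue, 0);
  const sumXY = points.reduce((s, p) => s + p.month * p.dailyRevenue, 0);
  const sumXX = points.reduce((s, p) => s + p.month * p.month, 0);
  const denom = n * sumXX - sumX * sumX;
  if (denom === 0) return null;
  const slope = (n * sumXY - sumX * sumY) / denom; // OLS slope per month
  const avgY = sumY / n;
  if (avgY === 0) return null;
  // Same scaling and clamping as the query: slope relative to the mean,
  // annualised, capped to [-100, 999.99].
  const rate = (slope / avgY) * 365 * 100;
  return Math.max(-100.0, Math.min(rate, 999.99));
}
```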

processedCount = Math.floor(totalProducts * 0.97);
outputProgress({
  status: 'running',
  operation: 'Growth rates calculated, updating time-based metrics',
  current: processedCount,
  total: totalProducts,
  elapsed: formatElapsedTime(startTime),
  remaining: estimateRemaining(startTime, processedCount, totalProducts),
  rate: calculateRate(startTime, processedCount),
  percentage: ((processedCount / totalProducts) * 100).toFixed(1),
  timing: {
    start_time: new Date(startTime).toISOString(),
    end_time: new Date().toISOString(),
    elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
  }
});

if (isCancelled) return {
  processedProducts: processedCount,
  processedOrders,
  processedPurchaseOrders: 0,
  success
};

// Calculate time-based metrics
await connection.query(`
  INSERT INTO category_time_metrics (
    category_id,
@@ -120,29 +334,38 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
    turnover_rate
  )
  SELECT
    c.id as category_id,
    pc.cat_id,
    YEAR(o.date) as year,
    MONTH(o.date) as month,
    COUNT(DISTINCT p.product_id) as product_count,
    COUNT(DISTINCT CASE WHEN p.visible = true THEN p.product_id END) as active_products,
    COUNT(DISTINCT p.pid) as product_count,
    COUNT(DISTINCT CASE WHEN p.visible = true THEN p.pid END) as active_products,
    SUM(p.stock_quantity * p.cost_price) as total_value,
    SUM(o.price * o.quantity) as total_revenue,
    SUM(o.quantity * o.price) as total_revenue,
    CASE
      WHEN SUM(o.price * o.quantity) > 0
      THEN (SUM((o.price - p.cost_price) * o.quantity) * 100.0) / SUM(o.price * o.quantity)
      WHEN SUM(o.quantity * o.price) > 0 THEN
        LEAST(
          GREATEST(
            SUM(o.quantity * (o.price - GREATEST(p.cost_price, 0))) * 100.0 /
              SUM(o.quantity * o.price),
            -100
          ),
          100
        )
      ELSE 0
    END as avg_margin,
    CASE
      WHEN AVG(p.stock_quantity) > 0
      THEN SUM(o.quantity) / AVG(p.stock_quantity)
      ELSE 0
    END as turnover_rate
  FROM categories c
  LEFT JOIN product_categories pc ON c.id = pc.category_id
  LEFT JOIN products p ON pc.product_id = p.product_id
  LEFT JOIN orders o ON p.product_id = o.product_id AND o.canceled = false
  WHERE o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
  GROUP BY c.id, YEAR(o.date), MONTH(o.date)
    COALESCE(
      LEAST(
        SUM(o.quantity) / NULLIF(AVG(GREATEST(p.stock_quantity, 0)), 0),
        999.99
      ),
      0
    ) as turnover_rate
  FROM product_categories pc
  JOIN products p ON pc.pid = p.pid
  JOIN orders o ON p.pid = o.pid
  WHERE o.canceled = false
    AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
  GROUP BY pc.cat_id, YEAR(o.date), MONTH(o.date)
  ON DUPLICATE KEY UPDATE
    product_count = VALUES(product_count),
    active_products = VALUES(active_products),
@@ -152,7 +375,31 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
    turnover_rate = VALUES(turnover_rate)
`);

// Calculate category sales metrics
processedCount = Math.floor(totalProducts * 0.99);
outputProgress({
  status: 'running',
  operation: 'Time-based metrics calculated, updating category-sales metrics',
  current: processedCount,
  total: totalProducts,
  elapsed: formatElapsedTime(startTime),
  remaining: estimateRemaining(startTime, processedCount, totalProducts),
  rate: calculateRate(startTime, processedCount),
  percentage: ((processedCount / totalProducts) * 100).toFixed(1),
  timing: {
    start_time: new Date(startTime).toISOString(),
    end_time: new Date().toISOString(),
    elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
  }
});

if (isCancelled) return {
  processedProducts: processedCount,
  processedOrders,
  processedPurchaseOrders: 0,
  success
};

// Calculate category-sales metrics
await connection.query(`
  INSERT INTO category_sales_metrics (
    category_id,
@@ -167,62 +414,108 @@ async function calculateCategoryMetrics(startTime, totalProducts, processedCount
  )
  WITH date_ranges AS (
    SELECT
      DATE_SUB(CURDATE(), INTERVAL 30 DAY) as period_start,
      CURDATE() as period_end
      DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY) as period_start,
      CURRENT_DATE as period_end
    UNION ALL
    SELECT
      DATE_SUB(CURDATE(), INTERVAL 90 DAY),
      CURDATE()
      DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY),
      DATE_SUB(CURRENT_DATE, INTERVAL 31 DAY)
    UNION ALL
    SELECT
      DATE_SUB(CURDATE(), INTERVAL 180 DAY),
      CURDATE()
      DATE_SUB(CURRENT_DATE, INTERVAL 180 DAY),
      DATE_SUB(CURRENT_DATE, INTERVAL 91 DAY)
    UNION ALL
    SELECT
      DATE_SUB(CURDATE(), INTERVAL 365 DAY),
      CURDATE()
      DATE_SUB(CURRENT_DATE, INTERVAL 365 DAY),
      DATE_SUB(CURRENT_DATE, INTERVAL 181 DAY)
  ),
  category_metrics AS (
  sales_data AS (
    SELECT
      c.id as category_id,
      p.brand,
      pc.cat_id,
      COALESCE(p.brand, 'Unknown') as brand,
      dr.period_start,
      dr.period_end,
      COUNT(DISTINCT p.product_id) as num_products,
      COALESCE(SUM(o.quantity), 0) / DATEDIFF(dr.period_end, dr.period_start) as avg_daily_sales,
      COALESCE(SUM(o.quantity), 0) as total_sold,
      COALESCE(AVG(o.price), 0) as avg_price
    FROM categories c
    JOIN product_categories pc ON c.id = pc.category_id
    JOIN products p ON pc.product_id = p.product_id
      COUNT(DISTINCT p.pid) as num_products,
      SUM(o.quantity) as total_sold,
      SUM(o.quantity * o.price) as total_revenue,
      COUNT(DISTINCT DATE(o.date)) as num_days
    FROM products p
    JOIN product_categories pc ON p.pid = pc.pid
    JOIN orders o ON p.pid = o.pid
    CROSS JOIN date_ranges dr
    LEFT JOIN orders o ON p.product_id = o.product_id
    WHERE o.canceled = false
      AND o.date BETWEEN dr.period_start AND dr.period_end
      AND o.canceled = false
    GROUP BY c.id, p.brand, dr.period_start, dr.period_end
    GROUP BY pc.cat_id, p.brand, dr.period_start, dr.period_end
  )
  SELECT
    category_id,
    cat_id as category_id,
    brand,
    period_start,
    period_end,
    avg_daily_sales,
    CASE
      WHEN num_days > 0
      THEN total_sold / num_days
      ELSE 0
    END as avg_daily_sales,
    total_sold,
    num_products,
    avg_price,
    CASE
      WHEN total_sold > 0
      THEN total_revenue / total_sold
      ELSE 0
    END as avg_price,
    NOW() as last_calculated_at
  FROM category_metrics
  FROM sales_data
  ON DUPLICATE KEY UPDATE
    avg_daily_sales = VALUES(avg_daily_sales),
    total_sold = VALUES(total_sold),
    num_products = VALUES(num_products),
    avg_price = VALUES(avg_price),
    last_calculated_at = NOW()
    last_calculated_at = VALUES(last_calculated_at)
`);
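The reworked date_ranges CTE above makes the 30/90/180/365-day windows non-overlapping: each older window now ends the day before the newer one starts, so a sale is counted in exactly one period. A small sketch of the resulting boundaries (hypothetical helper, plain millisecond Date arithmetic):

```javascript
// Mirrors the four trailing windows of the date_ranges CTE:
// 0-30, 31-90, 91-180, and 181-365 days back from "today".
function periodWindows(today = new Date()) {
  const DAY = 24 * 60 * 60 * 1000;
  const daysAgo = n => new Date(today.getTime() - n * DAY);
  return [
    { start: daysAgo(30),  end: daysAgo(0) },
    { start: daysAgo(90),  end: daysAgo(31) },
    { start: daysAgo(180), end: daysAgo(91) },
    { start: daysAgo(365), end: daysAgo(181) },
  ];
}
```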

return Math.floor(totalProducts * 0.9);
processedCount = Math.floor(totalProducts * 1.0);
outputProgress({
  status: 'running',
  operation: 'Category-sales metrics calculated',
  current: processedCount,
  total: totalProducts,
  elapsed: formatElapsedTime(startTime),
  remaining: estimateRemaining(startTime, processedCount, totalProducts),
  rate: calculateRate(startTime, processedCount),
  percentage: ((processedCount / totalProducts) * 100).toFixed(1),
  timing: {
    start_time: new Date(startTime).toISOString(),
    end_time: new Date().toISOString(),
    elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
  }
});

// If we get here, everything completed successfully
success = true;

// Update calculate_status
await connection.query(`
  INSERT INTO calculate_status (module_name, last_calculation_timestamp)
  VALUES ('category_metrics', NOW())
  ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
`);

return {
  processedProducts: processedCount,
  processedOrders,
  processedPurchaseOrders: 0,
  success
};

} catch (error) {
  success = false;
  logError(error, 'Error calculating category metrics');
  throw error;
} finally {
  connection.release();
  if (connection) {
    connection.release();
  }
}
}

||||
@@ -1,80 +1,191 @@
const { outputProgress } = require('./utils/progress');
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate, logError } = require('./utils/progress');
const { getConnection } = require('./utils/db');

async function calculateFinancialMetrics(startTime, totalProducts, processedCount) {
async function calculateFinancialMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
  const connection = await getConnection();
  let success = false;
  let processedOrders = 0;

  try {
    if (isCancelled) {
      outputProgress({
        status: 'cancelled',
        operation: 'Financial metrics calculation cancelled',
        current: processedCount,
        total: totalProducts,
        elapsed: formatElapsedTime(startTime),
        remaining: null,
        rate: calculateRate(startTime, processedCount),
        percentage: ((processedCount / totalProducts) * 100).toFixed(1),
        timing: {
          start_time: new Date(startTime).toISOString(),
          end_time: new Date().toISOString(),
          elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
        }
      });
      return {
        processedProducts: processedCount,
        processedOrders: 0,
        processedPurchaseOrders: 0,
        success
      };
    }

    // Get order count that will be processed
    const [orderCount] = await connection.query(`
      SELECT COUNT(*) as count
      FROM orders o
      WHERE o.canceled = false
        AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
    `);
    processedOrders = orderCount[0].count;

    outputProgress({
      status: 'running',
      operation: 'Calculating financial metrics',
      current: Math.floor(totalProducts * 0.6),
      operation: 'Starting financial metrics calculation',
      current: processedCount,
      total: totalProducts,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, Math.floor(totalProducts * 0.6), totalProducts),
      rate: calculateRate(startTime, Math.floor(totalProducts * 0.6)),
      percentage: '60'
      remaining: estimateRemaining(startTime, processedCount, totalProducts),
      rate: calculateRate(startTime, processedCount),
      percentage: ((processedCount / totalProducts) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    // Calculate financial metrics with optimized query
    await connection.query(`
      UPDATE product_metrics pm
      JOIN (
      WITH product_financials AS (
        SELECT
          p.product_id,
          p.pid,
          p.cost_price * p.stock_quantity as inventory_value,
          SUM(o.quantity * o.price) as total_revenue,
          SUM(o.quantity * p.cost_price) as cost_of_goods_sold,
          SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
          MIN(o.date) as first_sale_date,
          MAX(o.date) as last_sale_date,
          DATEDIFF(MAX(o.date), MIN(o.date)) + 1 as calculation_period_days
          DATEDIFF(MAX(o.date), MIN(o.date)) + 1 as calculation_period_days,
          COUNT(DISTINCT DATE(o.date)) as active_days
        FROM products p
        LEFT JOIN orders o ON p.product_id = o.product_id
        LEFT JOIN orders o ON p.pid = o.pid
        WHERE o.canceled = false
          AND DATE(o.date) >= DATE_SUB(CURDATE(), INTERVAL 12 MONTH)
        GROUP BY p.product_id
      ) fin ON pm.product_id = fin.product_id
        GROUP BY p.pid
      )
      UPDATE product_metrics pm
      JOIN product_financials pf ON pm.pid = pf.pid
      SET
        pm.inventory_value = COALESCE(fin.inventory_value, 0),
        pm.total_revenue = COALESCE(fin.total_revenue, 0),
        pm.cost_of_goods_sold = COALESCE(fin.cost_of_goods_sold, 0),
        pm.gross_profit = COALESCE(fin.gross_profit, 0),
        pm.inventory_value = COALESCE(pf.inventory_value, 0),
        pm.total_revenue = COALESCE(pf.total_revenue, 0),
        pm.cost_of_goods_sold = COALESCE(pf.cost_of_goods_sold, 0),
        pm.gross_profit = COALESCE(pf.gross_profit, 0),
        pm.gmroi = CASE
          WHEN COALESCE(fin.inventory_value, 0) > 0 AND fin.calculation_period_days > 0 THEN
            (COALESCE(fin.gross_profit, 0) * (365.0 / fin.calculation_period_days)) / COALESCE(fin.inventory_value, 0)
          WHEN COALESCE(pf.inventory_value, 0) > 0 AND pf.active_days > 0 THEN
            (COALESCE(pf.gross_profit, 0) * (365.0 / pf.active_days)) / COALESCE(pf.inventory_value, 0)
          ELSE 0
        END
        END,
        pm.last_calculated_at = CURRENT_TIMESTAMP
    `);
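The GMROI expression above annualises gross profit over the number of days that actually had sales (the new active_days column) and divides by the current inventory value, falling back to 0 when either input is missing. A one-function sketch (names hypothetical):

```javascript
// GMROI = annualised gross profit / inventory value.
// grossProfit and inventoryValue in currency units, activeDays the number
// of distinct days with sales in the measurement window.
function gmroi(grossProfit, inventoryValue, activeDays) {
  // Mirrors the ELSE 0 branch of the CASE expression above.
  if (inventoryValue <= 0 || activeDays <= 0) return 0;
  return (grossProfit * (365.0 / activeDays)) / inventoryValue;
}
```

A product that earned 1000 in gross profit over a full year on 5000 of stock has a GMROI of 0.2; the same profit earned in half the active days would score twice as high.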

    // Update time-based aggregates with financial metrics
    processedCount = Math.floor(totalProducts * 0.65);
    outputProgress({
      status: 'running',
      operation: 'Base financial metrics calculated, updating time aggregates',
      current: processedCount,
      total: totalProducts,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, processedCount, totalProducts),
      rate: calculateRate(startTime, processedCount),
      percentage: ((processedCount / totalProducts) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    if (isCancelled) return {
      processedProducts: processedCount,
      processedOrders,
      processedPurchaseOrders: 0,
      success
    };

    // Update time-based aggregates with optimized query
    await connection.query(`
      UPDATE product_time_aggregates pta
      JOIN (
      WITH monthly_financials AS (
        SELECT
          p.product_id,
          p.pid,
          YEAR(o.date) as year,
          MONTH(o.date) as month,
          p.cost_price * p.stock_quantity as inventory_value,
          SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
          COUNT(DISTINCT DATE(o.date)) as days_in_period
          COUNT(DISTINCT DATE(o.date)) as active_days,
          MIN(o.date) as period_start,
          MAX(o.date) as period_end
        FROM products p
        LEFT JOIN orders o ON p.product_id = o.product_id
        LEFT JOIN orders o ON p.pid = o.pid
        WHERE o.canceled = false
        GROUP BY p.product_id, YEAR(o.date), MONTH(o.date)
      ) fin ON pta.product_id = fin.product_id
        AND pta.year = fin.year
        AND pta.month = fin.month
        GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
      )
      UPDATE product_time_aggregates pta
      JOIN monthly_financials mf ON pta.pid = mf.pid
        AND pta.year = mf.year
        AND pta.month = mf.month
      SET
        pta.inventory_value = COALESCE(fin.inventory_value, 0),
        pta.inventory_value = COALESCE(mf.inventory_value, 0),
        pta.gmroi = CASE
          WHEN COALESCE(fin.inventory_value, 0) > 0 AND fin.days_in_period > 0 THEN
            (COALESCE(fin.gross_profit, 0) * (365.0 / fin.days_in_period)) / COALESCE(fin.inventory_value, 0)
          WHEN COALESCE(mf.inventory_value, 0) > 0 AND mf.active_days > 0 THEN
            (COALESCE(mf.gross_profit, 0) * (365.0 / mf.active_days)) / COALESCE(mf.inventory_value, 0)
          ELSE 0
        END
    `);

    return Math.floor(totalProducts * 0.7);
    processedCount = Math.floor(totalProducts * 0.70);
    outputProgress({
      status: 'running',
      operation: 'Time-based aggregates updated',
      current: processedCount,
      total: totalProducts,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, processedCount, totalProducts),
      rate: calculateRate(startTime, processedCount),
      percentage: ((processedCount / totalProducts) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    // If we get here, everything completed successfully
    success = true;

    // Update calculate_status
    await connection.query(`
      INSERT INTO calculate_status (module_name, last_calculation_timestamp)
      VALUES ('financial_metrics', NOW())
      ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
    `);

    return {
      processedProducts: processedCount,
      processedOrders,
      processedPurchaseOrders: 0,
      success
    };

  } catch (error) {
    success = false;
    logError(error, 'Error calculating financial metrics');
    throw error;
  } finally {
    connection.release();
    if (connection) {
      connection.release();
    }
  }
}

File diff suppressed because it is too large
@@ -1,116 +1,266 @@
const { outputProgress } = require('./utils/progress');
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate, logError } = require('./utils/progress');
const { getConnection } = require('./utils/db');

async function calculateSalesForecasts(startTime, totalProducts, processedCount) {
async function calculateSalesForecasts(startTime, totalProducts, processedCount = 0, isCancelled = false) {
  const connection = await getConnection();
  let success = false;
  let processedOrders = 0;

  try {
    if (isCancelled) {
      outputProgress({
        status: 'cancelled',
        operation: 'Sales forecasts calculation cancelled',
        current: processedCount,
        total: totalProducts,
        elapsed: formatElapsedTime(startTime),
        remaining: null,
        rate: calculateRate(startTime, processedCount),
        percentage: ((processedCount / totalProducts) * 100).toFixed(1),
        timing: {
          start_time: new Date(startTime).toISOString(),
          end_time: new Date().toISOString(),
          elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
        }
      });
      return {
        processedProducts: processedCount,
        processedOrders: 0,
        processedPurchaseOrders: 0,
        success
      };
    }

    // Get order count that will be processed
    const [orderCount] = await connection.query(`
      SELECT COUNT(*) as count
      FROM orders o
      WHERE o.canceled = false
        AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
    `);
    processedOrders = orderCount[0].count;

    outputProgress({
      status: 'running',
      operation: 'Calculating sales forecasts',
      current: Math.floor(totalProducts * 0.98),
      operation: 'Starting sales forecasts calculation',
      current: processedCount,
      total: totalProducts,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, Math.floor(totalProducts * 0.98), totalProducts),
      rate: calculateRate(startTime, Math.floor(totalProducts * 0.98)),
      percentage: '98'
      remaining: estimateRemaining(startTime, processedCount, totalProducts),
      rate: calculateRate(startTime, processedCount),
      percentage: ((processedCount / totalProducts) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    // First, create a temporary table for forecast dates
    await connection.query(`
      CREATE TEMPORARY TABLE IF NOT EXISTS temp_forecast_dates (
        forecast_date DATE,
        day_of_week INT,
        month INT,
        PRIMARY KEY (forecast_date)
      )
    `);

    await connection.query(`
      INSERT INTO temp_forecast_dates
      SELECT
        DATE_ADD(CURRENT_DATE, INTERVAL n DAY) as forecast_date,
        DAYOFWEEK(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as day_of_week,
        MONTH(DATE_ADD(CURRENT_DATE, INTERVAL n DAY)) as month
      FROM (
        SELECT a.N + b.N * 10 as n
        FROM
          (SELECT 0 as N UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
           SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9) a,
          (SELECT 0 as N UNION SELECT 1 UNION SELECT 2) b
        ORDER BY n
        LIMIT 31
      ) numbers
    `);
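The numbers derived table above is the classic digit-table trick: cross-join a units table (0-9) with a tens table and compute a.N + b.N * 10. A hypothetical JavaScript equivalent; note that with tens limited to 0-2 this yields only the 30 values 0-29, so the LIMIT 31 never takes effect and the temp table receives 30 forecast dates:

```javascript
// Equivalent of: SELECT a.N + b.N * 10 AS n FROM ones a, tens b
// ORDER BY n LIMIT 31
function numberRange() {
  const ones = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9];
  const tens = [0, 1, 2];
  const n = [];
  for (const b of tens) for (const a of ones) n.push(a + b * 10);
  return n.sort((x, y) => x - y).slice(0, 31); // LIMIT 31 (never binds here)
}
```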

    processedCount = Math.floor(totalProducts * 0.92);
    outputProgress({
      status: 'running',
      operation: 'Forecast dates prepared, calculating daily sales stats',
      current: processedCount,
      total: totalProducts,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, processedCount, totalProducts),
      rate: calculateRate(startTime, processedCount),
      percentage: ((processedCount / totalProducts) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    if (isCancelled) return {
      processedProducts: processedCount,
      processedOrders,
      processedPurchaseOrders: 0,
      success
    };

    // Create temporary table for daily sales stats
    await connection.query(`
      CREATE TEMPORARY TABLE IF NOT EXISTS temp_daily_sales AS
      SELECT
        o.pid,
        DAYOFWEEK(o.date) as day_of_week,
        SUM(o.quantity) as daily_quantity,
        SUM(o.price * o.quantity) as daily_revenue,
        COUNT(DISTINCT DATE(o.date)) as day_count
      FROM orders o
      WHERE o.canceled = false
        AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
      GROUP BY o.pid, DAYOFWEEK(o.date)
    `);

    processedCount = Math.floor(totalProducts * 0.94);
    outputProgress({
      status: 'running',
      operation: 'Daily sales stats calculated, preparing product stats',
      current: processedCount,
      total: totalProducts,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, processedCount, totalProducts),
      rate: calculateRate(startTime, processedCount),
      percentage: ((processedCount / totalProducts) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    if (isCancelled) return {
      processedProducts: processedCount,
      processedOrders,
      processedPurchaseOrders: 0,
      success
    };

    // Create temporary table for product stats
    await connection.query(`
      CREATE TEMPORARY TABLE IF NOT EXISTS temp_product_stats AS
      SELECT
        pid,
        AVG(daily_revenue) as overall_avg_revenue,
        SUM(day_count) as total_days
      FROM temp_daily_sales
      GROUP BY pid
    `);

    processedCount = Math.floor(totalProducts * 0.96);
    outputProgress({
      status: 'running',
      operation: 'Product stats prepared, calculating product-level forecasts',
      current: processedCount,
      total: totalProducts,
      elapsed: formatElapsedTime(startTime),
      remaining: estimateRemaining(startTime, processedCount, totalProducts),
      rate: calculateRate(startTime, processedCount),
      percentage: ((processedCount / totalProducts) * 100).toFixed(1),
      timing: {
        start_time: new Date(startTime).toISOString(),
        end_time: new Date().toISOString(),
        elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
      }
    });

    if (isCancelled) return {
      processedProducts: processedCount,
      processedOrders,
      processedPurchaseOrders: 0,
      success
    };

    // Calculate product-level forecasts
    await connection.query(`
      INSERT INTO sales_forecasts (
        product_id,
        pid,
        forecast_date,
        forecast_units,
        forecast_revenue,
        confidence_level,
        last_calculated_at
      )
      WITH daily_sales AS (
      WITH daily_stats AS (
        SELECT
          o.product_id,
          DATE(o.date) as sale_date,
          SUM(o.quantity) as daily_quantity,
          SUM(o.price * o.quantity) as daily_revenue
        FROM orders o
        WHERE o.canceled = false
          AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
        GROUP BY o.product_id, DATE(o.date)
      ),
      forecast_dates AS (
        SELECT
          DATE_ADD(CURRENT_DATE, INTERVAL n DAY) as forecast_date
        FROM (
          SELECT 0 as n UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
          SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9 UNION
          SELECT 10 UNION SELECT 11 UNION SELECT 12 UNION SELECT 13 UNION SELECT 14 UNION
          SELECT 15 UNION SELECT 16 UNION SELECT 17 UNION SELECT 18 UNION SELECT 19 UNION
          SELECT 20 UNION SELECT 21 UNION SELECT 22 UNION SELECT 23 UNION SELECT 24 UNION
          SELECT 25 UNION SELECT 26 UNION SELECT 27 UNION SELECT 28 UNION SELECT 29 UNION
          SELECT 30
        ) numbers
      ),
      product_stats AS (
        SELECT
          ds.product_id,
          AVG(ds.daily_quantity) as avg_daily_quantity,
          STDDEV_SAMP(ds.daily_quantity) as std_daily_quantity,
          ds.pid,
          AVG(ds.daily_quantity) as avg_daily_qty,
          STDDEV(ds.daily_quantity) as std_daily_qty,
          COUNT(DISTINCT ds.day_count) as data_points,
          SUM(ds.day_count) as total_days,
          AVG(ds.daily_revenue) as avg_daily_revenue,
          STDDEV_SAMP(ds.daily_revenue) as std_daily_revenue,
          COUNT(*) as data_points,
          -- Calculate day-of-week averages
          AVG(CASE WHEN DAYOFWEEK(ds.sale_date) = 1 THEN ds.daily_revenue END) as sunday_avg,
          AVG(CASE WHEN DAYOFWEEK(ds.sale_date) = 2 THEN ds.daily_revenue END) as monday_avg,
          AVG(CASE WHEN DAYOFWEEK(ds.sale_date) = 3 THEN ds.daily_revenue END) as tuesday_avg,
          AVG(CASE WHEN DAYOFWEEK(ds.sale_date) = 4 THEN ds.daily_revenue END) as wednesday_avg,
          AVG(CASE WHEN DAYOFWEEK(ds.sale_date) = 5 THEN ds.daily_revenue END) as thursday_avg,
          AVG(CASE WHEN DAYOFWEEK(ds.sale_date) = 6 THEN ds.daily_revenue END) as friday_avg,
          AVG(CASE WHEN DAYOFWEEK(ds.sale_date) = 7 THEN ds.daily_revenue END) as saturday_avg
        FROM daily_sales ds
        GROUP BY ds.product_id
          STDDEV(ds.daily_revenue) as std_daily_revenue,
          MIN(ds.daily_quantity) as min_daily_qty,
          MAX(ds.daily_quantity) as max_daily_qty,
          -- Calculate variance without using LAG
          COALESCE(
            STDDEV(ds.daily_quantity) / NULLIF(AVG(ds.daily_quantity), 0),
            0
          ) as daily_variance_ratio
        FROM temp_daily_sales ds
        GROUP BY ds.pid
        HAVING AVG(ds.daily_quantity) > 0
      )
      SELECT
        ps.product_id,
        ds.pid,
        fd.forecast_date,
        GREATEST(0,
          ps.avg_daily_quantity *
          (1 + COALESCE(
            (SELECT seasonality_factor
             FROM sales_seasonality
             WHERE MONTH(fd.forecast_date) = month
             LIMIT 1),
            0
          ))
          ROUND(
            ds.avg_daily_qty *
            (1 + COALESCE(sf.seasonality_factor, 0)) *
            CASE
              WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 1.5 THEN 0.85
              WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 1.0 THEN 0.9
              WHEN ds.std_daily_qty / NULLIF(ds.avg_daily_qty, 0) > 0.5 THEN 0.95
              ELSE 1.0
            END,
            2
          )
        ) as forecast_units,
        GREATEST(0,
          CASE DAYOFWEEK(fd.forecast_date)
            WHEN 1 THEN COALESCE(ps.sunday_avg, ps.avg_daily_revenue)
            WHEN 2 THEN COALESCE(ps.monday_avg, ps.avg_daily_revenue)
            WHEN 3 THEN COALESCE(ps.tuesday_avg, ps.avg_daily_revenue)
            WHEN 4 THEN COALESCE(ps.wednesday_avg, ps.avg_daily_revenue)
            WHEN 5 THEN COALESCE(ps.thursday_avg, ps.avg_daily_revenue)
            WHEN 6 THEN COALESCE(ps.friday_avg, ps.avg_daily_revenue)
            WHEN 7 THEN COALESCE(ps.saturday_avg, ps.avg_daily_revenue)
          END *
          (1 + COALESCE(
            (SELECT seasonality_factor
             FROM sales_seasonality
             WHERE MONTH(fd.forecast_date) = month
             LIMIT 1),
            0
          )) *
          -- Add some randomness within a small range (±5%)
          (0.95 + (RAND() * 0.1))
          ROUND(
            COALESCE(
              CASE
                WHEN ds.data_points >= 4 THEN ds.avg_daily_revenue
                ELSE ps.overall_avg_revenue
              END *
              (1 + COALESCE(sf.seasonality_factor, 0)) *
              CASE
                WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.5 THEN 0.85
                WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 1.0 THEN 0.9
                WHEN ds.std_daily_revenue / NULLIF(ds.avg_daily_revenue, 0) > 0.5 THEN 0.95
                ELSE 1.0
              END,
              0
            ),
            2
          )
        ) as forecast_revenue,
        CASE
          WHEN ps.data_points >= 60 THEN 90
          WHEN ps.data_points >= 30 THEN 80
          WHEN ps.data_points >= 14 THEN 70
          WHEN ds.total_days >= 60 AND ds.daily_variance_ratio < 0.5 THEN 90
          WHEN ds.total_days >= 60 THEN 85
          WHEN ds.total_days >= 30 AND ds.daily_variance_ratio < 0.5 THEN 80
          WHEN ds.total_days >= 30 THEN 75
          WHEN ds.total_days >= 14 AND ds.daily_variance_ratio < 0.5 THEN 70
          WHEN ds.total_days >= 14 THEN 65
          ELSE 60
        END as confidence_level,
        NOW() as last_calculated_at
      FROM product_stats ps
      CROSS JOIN forecast_dates fd
      WHERE ps.avg_daily_quantity > 0
      FROM daily_stats ds
      JOIN temp_product_stats ps ON ds.pid = ps.pid
      CROSS JOIN temp_forecast_dates fd
      LEFT JOIN sales_seasonality sf ON fd.month = sf.month
      GROUP BY ds.pid, fd.forecast_date, ps.overall_avg_revenue, sf.seasonality_factor
      ON DUPLICATE KEY UPDATE
        forecast_units = VALUES(forecast_units),
        forecast_revenue = VALUES(forecast_revenue),
@@ -118,6 +268,80 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount)
        last_calculated_at = NOW()
    `);
|
||||
|
||||
processedCount = Math.floor(totalProducts * 0.98);
outputProgress({
status: 'running',
operation: 'Product forecasts calculated, preparing category stats',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

// Create temporary table for category stats
await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_sales AS
SELECT
pc.cat_id,
DAYOFWEEK(o.date) as day_of_week,
SUM(o.quantity) as daily_quantity,
SUM(o.price * o.quantity) as daily_revenue,
COUNT(DISTINCT DATE(o.date)) as day_count
FROM orders o
JOIN product_categories pc ON o.pid = pc.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
GROUP BY pc.cat_id, DAYOFWEEK(o.date)
`);

await connection.query(`
CREATE TEMPORARY TABLE IF NOT EXISTS temp_category_stats AS
SELECT
cat_id,
AVG(daily_revenue) as overall_avg_revenue,
SUM(day_count) as total_days
FROM temp_category_sales
GROUP BY cat_id
`);

processedCount = Math.floor(totalProducts * 0.99);
outputProgress({
status: 'running',
operation: 'Category stats prepared, calculating category-level forecasts',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

// Calculate category-level forecasts
await connection.query(`
INSERT INTO category_forecasts (
@@ -128,93 +352,37 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount)
confidence_level,
last_calculated_at
)
WITH category_daily_sales AS (
SELECT
pc.category_id,
DATE(o.date) as sale_date,
SUM(o.quantity) as daily_quantity,
SUM(o.price * o.quantity) as daily_revenue
FROM orders o
JOIN product_categories pc ON o.product_id = pc.product_id
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 90 DAY)
GROUP BY pc.category_id, DATE(o.date)
),
forecast_dates AS (
SELECT
DATE_ADD(CURRENT_DATE, INTERVAL n DAY) as forecast_date
FROM (
SELECT 0 as n UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION
SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9 UNION
SELECT 10 UNION SELECT 11 UNION SELECT 12 UNION SELECT 13 UNION SELECT 14 UNION
SELECT 15 UNION SELECT 16 UNION SELECT 17 UNION SELECT 18 UNION SELECT 19 UNION
SELECT 20 UNION SELECT 21 UNION SELECT 22 UNION SELECT 23 UNION SELECT 24 UNION
SELECT 25 UNION SELECT 26 UNION SELECT 27 UNION SELECT 28 UNION SELECT 29 UNION
SELECT 30
) numbers
),
category_stats AS (
SELECT
cds.category_id,
AVG(cds.daily_quantity) as avg_daily_quantity,
STDDEV_SAMP(cds.daily_quantity) as std_daily_quantity,
AVG(cds.daily_revenue) as avg_daily_revenue,
STDDEV_SAMP(cds.daily_revenue) as std_daily_revenue,
COUNT(*) as data_points,
-- Calculate day-of-week averages
AVG(CASE WHEN DAYOFWEEK(cds.sale_date) = 1 THEN cds.daily_revenue END) as sunday_avg,
AVG(CASE WHEN DAYOFWEEK(cds.sale_date) = 2 THEN cds.daily_revenue END) as monday_avg,
AVG(CASE WHEN DAYOFWEEK(cds.sale_date) = 3 THEN cds.daily_revenue END) as tuesday_avg,
AVG(CASE WHEN DAYOFWEEK(cds.sale_date) = 4 THEN cds.daily_revenue END) as wednesday_avg,
AVG(CASE WHEN DAYOFWEEK(cds.sale_date) = 5 THEN cds.daily_revenue END) as thursday_avg,
AVG(CASE WHEN DAYOFWEEK(cds.sale_date) = 6 THEN cds.daily_revenue END) as friday_avg,
AVG(CASE WHEN DAYOFWEEK(cds.sale_date) = 7 THEN cds.daily_revenue END) as saturday_avg
FROM category_daily_sales cds
GROUP BY cds.category_id
)
SELECT
cs.category_id,
cs.cat_id as category_id,
fd.forecast_date,
GREATEST(0,
cs.avg_daily_quantity *
(1 + COALESCE(
(SELECT seasonality_factor
FROM sales_seasonality
WHERE MONTH(fd.forecast_date) = month
LIMIT 1),
0
))
AVG(cs.daily_quantity) *
(1 + COALESCE(sf.seasonality_factor, 0))
) as forecast_units,
GREATEST(0,
CASE DAYOFWEEK(fd.forecast_date)
WHEN 1 THEN COALESCE(cs.sunday_avg, cs.avg_daily_revenue)
WHEN 2 THEN COALESCE(cs.monday_avg, cs.avg_daily_revenue)
WHEN 3 THEN COALESCE(cs.tuesday_avg, cs.avg_daily_revenue)
WHEN 4 THEN COALESCE(cs.wednesday_avg, cs.avg_daily_revenue)
WHEN 5 THEN COALESCE(cs.thursday_avg, cs.avg_daily_revenue)
WHEN 6 THEN COALESCE(cs.friday_avg, cs.avg_daily_revenue)
WHEN 7 THEN COALESCE(cs.saturday_avg, cs.avg_daily_revenue)
END *
(1 + COALESCE(
(SELECT seasonality_factor
FROM sales_seasonality
WHERE MONTH(fd.forecast_date) = month
LIMIT 1),
COALESCE(
CASE
WHEN SUM(cs.day_count) >= 4 THEN AVG(cs.daily_revenue)
ELSE ct.overall_avg_revenue
END *
(1 + COALESCE(sf.seasonality_factor, 0)) *
(0.95 + (RAND() * 0.1)),
0
)) *
-- Add some randomness within a small range (±5%)
(0.95 + (RAND() * 0.1))
)
) as forecast_revenue,
CASE
WHEN cs.data_points >= 60 THEN 90
WHEN cs.data_points >= 30 THEN 80
WHEN cs.data_points >= 14 THEN 70
WHEN ct.total_days >= 60 THEN 90
WHEN ct.total_days >= 30 THEN 80
WHEN ct.total_days >= 14 THEN 70
ELSE 60
END as confidence_level,
NOW() as last_calculated_at
FROM category_stats cs
CROSS JOIN forecast_dates fd
WHERE cs.avg_daily_quantity > 0
FROM temp_category_sales cs
JOIN temp_category_stats ct ON cs.cat_id = ct.cat_id
CROSS JOIN temp_forecast_dates fd
LEFT JOIN sales_seasonality sf ON fd.month = sf.month
GROUP BY cs.cat_id, fd.forecast_date, ct.overall_avg_revenue, ct.total_days, sf.seasonality_factor
HAVING AVG(cs.daily_quantity) > 0
ON DUPLICATE KEY UPDATE
forecast_units = VALUES(forecast_units),
forecast_revenue = VALUES(forecast_revenue),
@@ -222,9 +390,57 @@ async function calculateSalesForecasts(startTime, totalProducts, processedCount)
last_calculated_at = NOW()
`);

return Math.floor(totalProducts * 1.0);
// Clean up temporary tables
await connection.query(`
DROP TEMPORARY TABLE IF EXISTS temp_forecast_dates;
DROP TEMPORARY TABLE IF EXISTS temp_daily_sales;
DROP TEMPORARY TABLE IF EXISTS temp_product_stats;
DROP TEMPORARY TABLE IF EXISTS temp_category_sales;
DROP TEMPORARY TABLE IF EXISTS temp_category_stats;
`);

processedCount = Math.floor(totalProducts * 1.0);
outputProgress({
status: 'running',
operation: 'Category forecasts calculated and temporary tables cleaned up',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

// If we get here, everything completed successfully
success = true;

// Update calculate_status
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('sales_forecasts', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
`);

return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

} catch (error) {
success = false;
logError(error, 'Error calculating sales forecasts');
throw error;
} finally {
connection.release();
if (connection) {
connection.release();
}
}
}


@@ -1,12 +1,64 @@
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate, logError } = require('./utils/progress');
const { getConnection } = require('./utils/db');

async function calculateTimeAggregates(startTime, totalProducts, processedCount) {
async function calculateTimeAggregates(startTime, totalProducts, processedCount = 0, isCancelled = false) {
const connection = await getConnection();
let success = false;
let processedOrders = 0;

try {
if (isCancelled) {
outputProgress({
status: 'cancelled',
operation: 'Time aggregates calculation cancelled',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: null,
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
return {
processedProducts: processedCount,
processedOrders: 0,
processedPurchaseOrders: 0,
success
};
}

// Get order count that will be processed
const [orderCount] = await connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`);
processedOrders = orderCount[0].count;

outputProgress({
status: 'running',
operation: 'Starting time aggregates calculation',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

// Initial insert of time-based aggregates
await connection.query(`
INSERT INTO product_time_aggregates (
product_id,
pid,
year,
month,
total_quantity_sold,
@@ -16,11 +68,13 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount)
stock_received,
stock_ordered,
avg_price,
profit_margin
profit_margin,
inventory_value,
gmroi
)
WITH sales_data AS (
WITH monthly_sales AS (
SELECT
o.product_id,
o.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
SUM(o.quantity) as total_quantity_sold,
@@ -29,63 +83,120 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount)
COUNT(DISTINCT o.order_number) as order_count,
AVG(o.price - COALESCE(o.discount, 0)) as avg_price,
CASE
WHEN SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) = 0 THEN 0
ELSE ((SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) -
SUM(COALESCE(p.cost_price, 0) * o.quantity)) /
SUM((o.price - COALESCE(o.discount, 0)) * o.quantity)) * 100
END as profit_margin
WHEN SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) > 0
THEN ((SUM((o.price - COALESCE(o.discount, 0)) * o.quantity) - SUM(COALESCE(p.cost_price, 0) * o.quantity))
/ SUM((o.price - COALESCE(o.discount, 0)) * o.quantity)) * 100
ELSE 0
END as profit_margin,
p.cost_price * p.stock_quantity as inventory_value,
COUNT(DISTINCT DATE(o.date)) as active_days
FROM orders o
JOIN products p ON o.product_id = p.product_id
WHERE o.canceled = 0
GROUP BY o.product_id, YEAR(o.date), MONTH(o.date)
JOIN products p ON o.pid = p.pid
WHERE o.canceled = false
GROUP BY o.pid, YEAR(o.date), MONTH(o.date)
),
purchase_data AS (
monthly_stock AS (
SELECT
product_id,
pid,
YEAR(date) as year,
MONTH(date) as month,
SUM(received) as stock_received,
SUM(ordered) as stock_ordered
FROM purchase_orders
WHERE status = 'closed'
GROUP BY product_id, YEAR(date), MONTH(date)
GROUP BY pid, YEAR(date), MONTH(date)
),
base_products AS (
SELECT
p.pid,
p.cost_price * p.stock_quantity as inventory_value
FROM products p
)
SELECT
s.product_id,
s.year,
s.month,
s.total_quantity_sold,
s.total_revenue,
s.total_cost,
s.order_count,
COALESCE(p.stock_received, 0) as stock_received,
COALESCE(p.stock_ordered, 0) as stock_ordered,
s.avg_price,
s.profit_margin
FROM sales_data s
LEFT JOIN purchase_data p
ON s.product_id = p.product_id
AND s.year = p.year
AND s.month = p.month
COALESCE(s.pid, ms.pid) as pid,
COALESCE(s.year, ms.year) as year,
COALESCE(s.month, ms.month) as month,
COALESCE(s.total_quantity_sold, 0) as total_quantity_sold,
COALESCE(s.total_revenue, 0) as total_revenue,
COALESCE(s.total_cost, 0) as total_cost,
COALESCE(s.order_count, 0) as order_count,
COALESCE(ms.stock_received, 0) as stock_received,
COALESCE(ms.stock_ordered, 0) as stock_ordered,
COALESCE(s.avg_price, 0) as avg_price,
COALESCE(s.profit_margin, 0) as profit_margin,
COALESCE(s.inventory_value, bp.inventory_value, 0) as inventory_value,
CASE
WHEN COALESCE(s.inventory_value, bp.inventory_value, 0) > 0
AND COALESCE(s.active_days, 0) > 0
THEN (COALESCE(s.total_revenue - s.total_cost, 0) * (365.0 / s.active_days))
/ COALESCE(s.inventory_value, bp.inventory_value)
ELSE 0
END as gmroi
FROM (
SELECT * FROM monthly_sales s
UNION ALL
SELECT
ms.pid,
ms.year,
ms.month,
0 as total_quantity_sold,
0 as total_revenue,
0 as total_cost,
0 as order_count,
NULL as avg_price,
0 as profit_margin,
NULL as inventory_value,
0 as active_days
FROM monthly_stock ms
WHERE NOT EXISTS (
SELECT 1 FROM monthly_sales s2
WHERE s2.pid = ms.pid
AND s2.year = ms.year
AND s2.month = ms.month
)
) s
LEFT JOIN monthly_stock ms
ON s.pid = ms.pid
AND s.year = ms.year
AND s.month = ms.month
JOIN base_products bp ON COALESCE(s.pid, ms.pid) = bp.pid
UNION
SELECT
p.product_id,
p.year,
p.month,
ms.pid,
ms.year,
ms.month,
0 as total_quantity_sold,
0 as total_revenue,
0 as total_cost,
0 as order_count,
p.stock_received,
p.stock_ordered,
ms.stock_received,
ms.stock_ordered,
0 as avg_price,
0 as profit_margin
FROM purchase_data p
LEFT JOIN sales_data s
ON p.product_id = s.product_id
AND p.year = s.year
AND p.month = s.month
WHERE s.product_id IS NULL
0 as profit_margin,
bp.inventory_value,
0 as gmroi
FROM monthly_stock ms
JOIN base_products bp ON ms.pid = bp.pid
WHERE NOT EXISTS (
SELECT 1 FROM (
SELECT * FROM monthly_sales
UNION ALL
SELECT
ms2.pid,
ms2.year,
ms2.month,
0, 0, 0, 0, NULL, 0, NULL, 0
FROM monthly_stock ms2
WHERE NOT EXISTS (
SELECT 1 FROM monthly_sales s2
WHERE s2.pid = ms2.pid
AND s2.year = ms2.year
AND s2.month = ms2.month
)
) s
WHERE s.pid = ms.pid
AND s.year = ms.year
AND s.month = ms.month
)
ON DUPLICATE KEY UPDATE
total_quantity_sold = VALUES(total_quantity_sold),
total_revenue = VALUES(total_revenue),
@@ -94,39 +205,99 @@ async function calculateTimeAggregates(startTime, totalProducts, processedCount)
stock_received = VALUES(stock_received),
stock_ordered = VALUES(stock_ordered),
avg_price = VALUES(avg_price),
profit_margin = VALUES(profit_margin)
profit_margin = VALUES(profit_margin),
inventory_value = VALUES(inventory_value),
gmroi = VALUES(gmroi)
`);

processedCount = Math.floor(totalProducts * 0.60);
outputProgress({
status: 'running',
operation: 'Base time aggregates calculated, updating financial metrics',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

// Update with financial metrics
await connection.query(`
UPDATE product_time_aggregates pta
JOIN (
SELECT
p.product_id,
p.pid,
YEAR(o.date) as year,
MONTH(o.date) as month,
p.cost_price * p.stock_quantity as inventory_value,
SUM(o.quantity * (o.price - p.cost_price)) as gross_profit,
COUNT(DISTINCT DATE(o.date)) as days_in_period
COUNT(DISTINCT DATE(o.date)) as active_days
FROM products p
LEFT JOIN orders o ON p.product_id = o.product_id
LEFT JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
GROUP BY p.product_id, YEAR(o.date), MONTH(o.date)
) fin ON pta.product_id = fin.product_id
GROUP BY p.pid, YEAR(o.date), MONTH(o.date)
) fin ON pta.pid = fin.pid
AND pta.year = fin.year
AND pta.month = fin.month
SET
pta.inventory_value = COALESCE(fin.inventory_value, 0),
pta.gmroi = CASE
WHEN COALESCE(fin.inventory_value, 0) > 0 AND fin.days_in_period > 0 THEN
(COALESCE(fin.gross_profit, 0) * (365.0 / fin.days_in_period)) / COALESCE(fin.inventory_value, 0)
ELSE 0
END
pta.inventory_value = COALESCE(fin.inventory_value, 0)
`);

return Math.floor(totalProducts * 0.65);
processedCount = Math.floor(totalProducts * 0.65);
outputProgress({
status: 'running',
operation: 'Financial metrics updated',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

// If we get here, everything completed successfully
success = true;

// Update calculate_status
await connection.query(`
INSERT INTO calculate_status (module_name, last_calculation_timestamp)
VALUES ('time_aggregates', NOW())
ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
`);

return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders: 0,
success
};

} catch (error) {
success = false;
logError(error, 'Error calculating time aggregates');
throw error;
} finally {
connection.release();
if (connection) {
connection.release();
}
}
}


@@ -2,8 +2,15 @@ const fs = require('fs');
const path = require('path');

// Helper function to format elapsed time
function formatElapsedTime(startTime) {
const elapsed = Date.now() - startTime;
function formatElapsedTime(elapsed) {
// If elapsed is a timestamp, convert to elapsed milliseconds
if (elapsed instanceof Date || elapsed > 1000000000000) {
elapsed = Date.now() - elapsed;
} else {
// If elapsed is in seconds, convert to milliseconds
elapsed = elapsed * 1000;
}

const seconds = Math.floor(elapsed / 1000);
const minutes = Math.floor(seconds / 60);
const hours = Math.floor(minutes / 60);

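The hunk above changes `formatElapsedTime` to accept either a start timestamp (a `Date` or epoch milliseconds) or an already-elapsed duration in seconds, telling them apart by magnitude. A minimal standalone sketch of that normalization and the hours/minutes/seconds breakdown that follows it (function names here are illustrative, not the repo's):

```javascript
// Normalize the flexible input: a Date or epoch-ms timestamp becomes
// "milliseconds since then"; any other number is treated as seconds.
function normalizeElapsedMs(elapsed) {
  if (elapsed instanceof Date || elapsed > 1000000000000) {
    const startMs = elapsed instanceof Date ? elapsed.getTime() : elapsed;
    return Date.now() - startMs;
  }
  return elapsed * 1000; // seconds -> milliseconds
}

// Same breakdown the helper performs before formatting.
function elapsedBreakdown(ms) {
  const seconds = Math.floor(ms / 1000);
  const minutes = Math.floor(seconds / 60);
  const hours = Math.floor(minutes / 60);
  return { hours, minutes, seconds };
}
```

Note the magnitude check (`> 1000000000000`, roughly the year 2001 in epoch milliseconds) is a heuristic: a genuine elapsed duration of that many seconds would be misread as a timestamp, which is acceptable here because progress runs last minutes, not millennia.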
@@ -1,116 +1,234 @@
const { outputProgress } = require('./utils/progress');
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate, logError } = require('./utils/progress');
const { getConnection } = require('./utils/db');

async function calculateVendorMetrics(startTime, totalProducts, processedCount) {
async function calculateVendorMetrics(startTime, totalProducts, processedCount = 0, isCancelled = false) {
const connection = await getConnection();
let success = false;
let processedOrders = 0;
let processedPurchaseOrders = 0;

try {
if (isCancelled) {
outputProgress({
status: 'cancelled',
operation: 'Vendor metrics calculation cancelled',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: null,
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});
return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders,
success
};
}

// Get counts of records that will be processed
const [[orderCount], [poCount]] = await Promise.all([
connection.query(`
SELECT COUNT(*) as count
FROM orders o
WHERE o.canceled = false
`),
connection.query(`
SELECT COUNT(*) as count
FROM purchase_orders po
WHERE po.status != 0
`)
]);
processedOrders = orderCount.count;
processedPurchaseOrders = poCount.count;

outputProgress({
status: 'running',
operation: 'Calculating vendor metrics',
current: Math.floor(totalProducts * 0.7),
operation: 'Starting vendor metrics calculation',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, Math.floor(totalProducts * 0.7), totalProducts),
rate: calculateRate(startTime, Math.floor(totalProducts * 0.7)),
percentage: '70'
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

// First, ensure all vendors exist in vendor_details
// First ensure all vendors exist in vendor_details
await connection.query(`
INSERT IGNORE INTO vendor_details (vendor, status)
SELECT DISTINCT vendor, 'active' as status
INSERT IGNORE INTO vendor_details (vendor, status, created_at, updated_at)
SELECT DISTINCT
vendor,
'active' as status,
NOW() as created_at,
NOW() as updated_at
FROM products
WHERE vendor IS NOT NULL
AND vendor NOT IN (SELECT vendor FROM vendor_details)
`);

// Calculate vendor performance metrics
processedCount = Math.floor(totalProducts * 0.8);
outputProgress({
status: 'running',
operation: 'Vendor details updated, calculating metrics',
current: processedCount,
total: totalProducts,
elapsed: formatElapsedTime(startTime),
remaining: estimateRemaining(startTime, processedCount, totalProducts),
rate: calculateRate(startTime, processedCount),
percentage: ((processedCount / totalProducts) * 100).toFixed(1),
timing: {
start_time: new Date(startTime).toISOString(),
end_time: new Date().toISOString(),
elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
}
});

if (isCancelled) return {
processedProducts: processedCount,
processedOrders,
processedPurchaseOrders,
success
};

// Now calculate vendor metrics
await connection.query(`
INSERT INTO vendor_metrics (
vendor,
total_revenue,
total_orders,
total_late_orders,
avg_lead_time_days,
on_time_delivery_rate,
order_fill_rate,
total_orders,
total_late_orders,
total_purchase_value,
avg_order_value,
active_products,
total_products,
total_revenue,
total_purchase_value,
avg_margin_percent,
status
status,
last_calculated_at
)
WITH vendor_orders AS (
WITH vendor_sales AS (
SELECT
po.vendor,
AVG(DATEDIFF(po.received_date, po.date)) as avg_lead_time_days,
COUNT(*) as total_orders,
COUNT(CASE WHEN po.received_date > po.expected_date THEN 1 END) as total_late_orders,
SUM(po.cost_price * po.ordered) as total_purchase_value,
AVG(po.cost_price * po.ordered) as avg_order_value,
CASE
WHEN COUNT(*) > 0 THEN
(COUNT(CASE WHEN po.received = po.ordered THEN 1 END) * 100.0) / COUNT(*)
ELSE 0
END as order_fill_rate
FROM purchase_orders po
WHERE po.status = 'closed'
GROUP BY po.vendor
p.vendor,
SUM(o.quantity * o.price) as total_revenue,
COUNT(DISTINCT o.id) as total_orders,
COUNT(DISTINCT p.pid) as active_products,
SUM(o.quantity * (o.price - p.cost_price)) as total_margin
FROM products p
JOIN orders o ON p.pid = o.pid
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY p.vendor
),
vendor_po AS (
SELECT
p.vendor,
COUNT(DISTINCT CASE WHEN po.receiving_status = 40 THEN po.id END) as received_orders,
COUNT(DISTINCT po.id) as total_orders,
AVG(CASE
WHEN po.receiving_status = 40
THEN DATEDIFF(po.received_date, po.date)
END) as avg_lead_time_days,
SUM(po.ordered * po.po_cost_price) as total_purchase_value
FROM products p
JOIN purchase_orders po ON p.pid = po.pid
WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
GROUP BY p.vendor
),
vendor_products AS (
SELECT
p.vendor,
COUNT(DISTINCT p.product_id) as total_products,
COUNT(DISTINCT CASE WHEN p.visible = true THEN p.product_id END) as active_products,
SUM(o.price * o.quantity) as total_revenue,
CASE
WHEN SUM(o.price * o.quantity) > 0 THEN
(SUM((o.price - p.cost_price) * o.quantity) * 100.0) / SUM(o.price * o.quantity)
ELSE 0
END as avg_margin_percent
FROM products p
LEFT JOIN orders o ON p.product_id = o.product_id AND o.canceled = false
GROUP BY p.vendor
vendor,
COUNT(DISTINCT pid) as total_products
FROM products
GROUP BY vendor
)
SELECT
vd.vendor,
COALESCE(vo.avg_lead_time_days, 0) as avg_lead_time_days,
vs.vendor,
COALESCE(vs.total_revenue, 0) as total_revenue,
COALESCE(vp.total_orders, 0) as total_orders,
COALESCE(vp.total_orders - vp.received_orders, 0) as total_late_orders,
COALESCE(vp.avg_lead_time_days, 0) as avg_lead_time_days,
CASE
WHEN COALESCE(vo.total_orders, 0) > 0 THEN
((COALESCE(vo.total_orders, 0) - COALESCE(vo.total_late_orders, 0)) * 100.0) / COALESCE(vo.total_orders, 1)
WHEN vp.total_orders > 0
THEN (vp.received_orders / vp.total_orders) * 100
ELSE 0
END as on_time_delivery_rate,
COALESCE(vo.order_fill_rate, 0) as order_fill_rate,
COALESCE(vo.total_orders, 0) as total_orders,
COALESCE(vo.total_late_orders, 0) as total_late_orders,
COALESCE(vo.total_purchase_value, 0) as total_purchase_value,
COALESCE(vo.avg_order_value, 0) as avg_order_value,
COALESCE(vp.active_products, 0) as active_products,
COALESCE(vp.total_products, 0) as total_products,
COALESCE(vp.total_revenue, 0) as total_revenue,
COALESCE(vp.avg_margin_percent, 0) as avg_margin_percent,
vd.status
FROM vendor_details vd
LEFT JOIN vendor_orders vo ON vd.vendor = vo.vendor
LEFT JOIN vendor_products vp ON vd.vendor = vp.vendor
CASE
WHEN vp.total_orders > 0
THEN (vp.received_orders / vp.total_orders) * 100
ELSE 0
END as order_fill_rate,
CASE
WHEN vs.total_orders > 0
THEN vs.total_revenue / vs.total_orders
ELSE 0
END as avg_order_value,
COALESCE(vs.active_products, 0) as active_products,
COALESCE(vpr.total_products, 0) as total_products,
COALESCE(vp.total_purchase_value, 0) as total_purchase_value,
CASE
WHEN vs.total_revenue > 0
THEN (vs.total_margin / vs.total_revenue) * 100
ELSE 0
END as avg_margin_percent,
'active' as status,
NOW() as last_calculated_at
FROM vendor_sales vs
LEFT JOIN vendor_po vp ON vs.vendor = vp.vendor
LEFT JOIN vendor_products vpr ON vs.vendor = vpr.vendor
WHERE vs.vendor IS NOT NULL
ON DUPLICATE KEY UPDATE
total_revenue = VALUES(total_revenue),
total_orders = VALUES(total_orders),
total_late_orders = VALUES(total_late_orders),
avg_lead_time_days = VALUES(avg_lead_time_days),
on_time_delivery_rate = VALUES(on_time_delivery_rate),
order_fill_rate = VALUES(order_fill_rate),
total_orders = VALUES(total_orders),
total_late_orders = VALUES(total_late_orders),
total_purchase_value = VALUES(total_purchase_value),
avg_order_value = VALUES(avg_order_value),
active_products = VALUES(active_products),
total_products = VALUES(total_products),
total_revenue = VALUES(total_revenue),
|
||||
total_purchase_value = VALUES(total_purchase_value),
|
||||
avg_margin_percent = VALUES(avg_margin_percent),
|
||||
status = VALUES(status),
|
||||
last_calculated_at = CURRENT_TIMESTAMP
|
||||
last_calculated_at = VALUES(last_calculated_at)
|
||||
`);
|
||||
|
||||
  // Calculate vendor time-based metrics
  processedCount = Math.floor(totalProducts * 0.9);
  outputProgress({
    status: 'running',
    operation: 'Vendor metrics calculated, updating time-based metrics',
    current: processedCount,
    total: totalProducts,
    elapsed: formatElapsedTime(startTime),
    remaining: estimateRemaining(startTime, processedCount, totalProducts),
    rate: calculateRate(startTime, processedCount),
    percentage: ((processedCount / totalProducts) * 100).toFixed(1),
    timing: {
      start_time: new Date(startTime).toISOString(),
      end_time: new Date().toISOString(),
      elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
    }
  });

  if (isCancelled) return {
    processedProducts: processedCount,
    processedOrders,
    processedPurchaseOrders,
    success
  };

  // Calculate time-based metrics
  await connection.query(`
    INSERT INTO vendor_time_metrics (
      vendor,
@@ -123,39 +241,76 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount)
      total_revenue,
      avg_margin_percent
    )
    WITH vendor_time_data AS (
    WITH monthly_orders AS (
      SELECT
        vd.vendor,
        p.vendor,
        YEAR(o.date) as year,
        MONTH(o.date) as month,
        COUNT(DISTINCT o.id) as total_orders,
        SUM(o.quantity * o.price) as total_revenue,
        SUM(o.quantity * (o.price - p.cost_price)) as total_margin
      FROM products p
      JOIN orders o ON p.pid = o.pid
      WHERE o.canceled = false
        AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
        AND p.vendor IS NOT NULL
      GROUP BY p.vendor, YEAR(o.date), MONTH(o.date)
    ),
    monthly_po AS (
      SELECT
        p.vendor,
        YEAR(po.date) as year,
        MONTH(po.date) as month,
        COUNT(DISTINCT po.po_id) as total_orders,
        COUNT(DISTINCT CASE WHEN po.received_date > po.expected_date THEN po.po_id END) as late_orders,
        AVG(DATEDIFF(po.received_date, po.date)) as avg_lead_time_days,
        SUM(po.cost_price * po.ordered) as total_purchase_value,
        SUM(o.price * o.quantity) as total_revenue,
        CASE
          WHEN SUM(o.price * o.quantity) > 0 THEN
            (SUM((o.price - p.cost_price) * o.quantity) * 100.0) / SUM(o.price * o.quantity)
          ELSE 0
        END as avg_margin_percent
      FROM vendor_details vd
      LEFT JOIN products p ON vd.vendor = p.vendor
      LEFT JOIN purchase_orders po ON p.product_id = po.product_id
      LEFT JOIN orders o ON p.product_id = o.product_id AND o.canceled = false
        COUNT(DISTINCT po.id) as total_po,
        COUNT(DISTINCT CASE
          WHEN po.receiving_status = 40 AND po.received_date > po.expected_date
          THEN po.id
        END) as late_orders,
        AVG(CASE
          WHEN po.receiving_status = 40
          THEN DATEDIFF(po.received_date, po.date)
        END) as avg_lead_time_days,
        SUM(po.ordered * po.po_cost_price) as total_purchase_value
      FROM products p
      JOIN purchase_orders po ON p.pid = po.pid
      WHERE po.date >= DATE_SUB(CURRENT_DATE, INTERVAL 12 MONTH)
      GROUP BY vd.vendor, YEAR(po.date), MONTH(po.date)
        AND p.vendor IS NOT NULL
      GROUP BY p.vendor, YEAR(po.date), MONTH(po.date)
    )
    SELECT
      vendor,
      year,
      month,
      COALESCE(total_orders, 0) as total_orders,
      COALESCE(late_orders, 0) as late_orders,
      COALESCE(avg_lead_time_days, 0) as avg_lead_time_days,
      COALESCE(total_purchase_value, 0) as total_purchase_value,
      COALESCE(total_revenue, 0) as total_revenue,
      COALESCE(avg_margin_percent, 0) as avg_margin_percent
    FROM vendor_time_data
      mo.vendor,
      mo.year,
      mo.month,
      COALESCE(mp.total_po, 0) as total_orders,
      COALESCE(mp.late_orders, 0) as late_orders,
      COALESCE(mp.avg_lead_time_days, 0) as avg_lead_time_days,
      COALESCE(mp.total_purchase_value, 0) as total_purchase_value,
      mo.total_revenue,
      CASE
        WHEN mo.total_revenue > 0
        THEN (mo.total_margin / mo.total_revenue) * 100
        ELSE 0
      END as avg_margin_percent
    FROM monthly_orders mo
    LEFT JOIN monthly_po mp ON mo.vendor = mp.vendor
      AND mo.year = mp.year
      AND mo.month = mp.month
    UNION
    SELECT
      mp.vendor,
      mp.year,
      mp.month,
      mp.total_po as total_orders,
      mp.late_orders,
      mp.avg_lead_time_days,
      mp.total_purchase_value,
      0 as total_revenue,
      0 as avg_margin_percent
    FROM monthly_po mp
    LEFT JOIN monthly_orders mo ON mp.vendor = mo.vendor
      AND mp.year = mo.year
      AND mp.month = mo.month
    WHERE mo.vendor IS NULL
    ON DUPLICATE KEY UPDATE
      total_orders = VALUES(total_orders),
      late_orders = VALUES(late_orders),
@@ -165,9 +320,48 @@ async function calculateVendorMetrics(startTime, totalProducts, processedCount)
      avg_margin_percent = VALUES(avg_margin_percent)
  `);

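The new time-based query combines order months and purchase-order months with `LEFT JOIN … UNION … WHERE mo.vendor IS NULL`, which is the standard workaround for MySQL's missing FULL OUTER JOIN. A minimal JavaScript sketch of the same merge, with illustrative row shapes (not the actual table schemas):

```javascript
// Emulate a FULL OUTER JOIN on (vendor, year, month) between two row sets,
// mirroring the monthly_orders UNION monthly_po pattern in the query above.
function fullOuterMerge(orders, pos) {
  const key = (r) => `${r.vendor}|${r.year}|${r.month}`;
  const poByKey = new Map(pos.map((r) => [key(r), r]));
  // Pass 1: every order month, with PO metrics when present (the LEFT JOIN side).
  const merged = orders.map((o) => ({
    vendor: o.vendor, year: o.year, month: o.month,
    revenue: o.revenue,
    purchaseValue: poByKey.get(key(o))?.purchaseValue ?? 0,
  }));
  // Pass 2: PO months with no matching order month (the anti-join side of the UNION).
  const orderKeys = new Set(orders.map(key));
  for (const p of pos) {
    if (!orderKeys.has(key(p))) {
      merged.push({ vendor: p.vendor, year: p.year, month: p.month, revenue: 0, purchaseValue: p.purchaseValue });
    }
  }
  return merged;
}
```

Each (vendor, year, month) appears exactly once, whether it had only sales, only purchase orders, or both.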
  return Math.floor(totalProducts * 0.75);
  processedCount = Math.floor(totalProducts * 0.95);
  outputProgress({
    status: 'running',
    operation: 'Time-based vendor metrics calculated',
    current: processedCount,
    total: totalProducts,
    elapsed: formatElapsedTime(startTime),
    remaining: estimateRemaining(startTime, processedCount, totalProducts),
    rate: calculateRate(startTime, processedCount),
    percentage: ((processedCount / totalProducts) * 100).toFixed(1),
    timing: {
      start_time: new Date(startTime).toISOString(),
      end_time: new Date().toISOString(),
      elapsed_seconds: Math.round((Date.now() - startTime) / 1000)
    }
  });

  // If we get here, everything completed successfully
  success = true;

  // Update calculate_status
  await connection.query(`
    INSERT INTO calculate_status (module_name, last_calculation_timestamp)
    VALUES ('vendor_metrics', NOW())
    ON DUPLICATE KEY UPDATE last_calculation_timestamp = NOW()
  `);

  return {
    processedProducts: processedCount,
    processedOrders,
    processedPurchaseOrders,
    success
  };

} catch (error) {
  success = false;
  logError(error, 'Error calculating vendor metrics');
  throw error;
} finally {
  connection.release();
  if (connection) {
    connection.release();
  }
}
}

@@ -3,6 +3,7 @@ const path = require('path');
const csv = require('csv-parse');
const mysql = require('mysql2/promise');
const dotenv = require('dotenv');
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');

// Get test limits from environment variables
const PRODUCTS_TEST_LIMIT = parseInt(process.env.PRODUCTS_TEST_LIMIT || '0');
@@ -106,20 +107,19 @@ async function countRows(filePath) {
}

// Helper function to update progress with time estimate
function updateProgress(current, total, operation, startTime) {
  const elapsed = (Date.now() - startTime) / 1000;
  const rate = current / elapsed; // rows per second
  const remaining = (total - current) / rate;

function updateProgress(current, total, operation, startTime, added = 0, updated = 0, skipped = 0) {
  outputProgress({
    status: 'running',
    operation,
    current,
    total,
    rate,
    elapsed: formatDuration(elapsed),
    remaining: formatDuration(remaining),
    percentage: ((current / total) * 100).toFixed(1)
    rate: calculateRate(startTime, current),
    elapsed: formatElapsedTime(startTime),
    remaining: estimateRemaining(startTime, current, total),
    percentage: ((current / total) * 100).toFixed(1),
    added,
    updated,
    skipped
  });
}

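The hunk above replaces inline rate/ETA arithmetic with the shared `calculateRate` and `estimateRemaining` helpers. A re-derivation of what those helpers presumably compute, with illustrative names and an injectable clock for testing (the real implementations live in `../metrics/utils/progress`):

```javascript
// Rows processed per second since startTime; 0 before any time has elapsed.
function calculateRateSketch(startTime, current, now = Date.now()) {
  const elapsedSec = (now - startTime) / 1000;
  return elapsedSec > 0 ? current / elapsedSec : 0;
}

// Estimated seconds remaining at the current rate; null when the rate is unknown.
function estimateRemainingSketch(startTime, current, total, now = Date.now()) {
  const rate = calculateRateSketch(startTime, current, now);
  return rate > 0 ? (total - current) / rate : null;
}
```

Centralizing these avoids the divide-by-zero the old inline version risked when `current / elapsed` ran before any rows were counted.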
@@ -474,7 +474,7 @@ async function importProducts(pool, filePath) {
  // Update progress every 100ms to avoid console flooding
  const now = Date.now();
  if (now - lastUpdate > 100) {
    updateProgress(rowCount, totalRows, 'Products import', startTime);
    updateProgress(rowCount, totalRows, 'Products import', startTime, added, updated, 0);
    lastUpdate = now;
  }

@@ -678,7 +678,7 @@ async function importOrders(pool, filePath) {
  // Update progress every 100ms
  const now = Date.now();
  if (now - lastUpdate > 100) {
    updateProgress(rowCount, totalRows, 'Orders import', startTime);
    updateProgress(rowCount, totalRows, 'Orders import', startTime, added, updated, skipped);
    lastUpdate = now;
  }

@@ -845,7 +845,7 @@ async function importPurchaseOrders(pool, filePath) {
  // Update progress every 100ms
  const now = Date.now();
  if (now - lastUpdate > 100) {
    updateProgress(rowCount, totalRows, 'Purchase orders import', startTime);
    updateProgress(rowCount, totalRows, 'Purchase orders import', startTime, added, updated, skipped);
    lastUpdate = now;
  }

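All three import loops repeat the same `lastUpdate` / 100 ms throttle around `updateProgress`. The pattern generalizes to a small factory (a sketch; the imports call `updateProgress` inline rather than using a helper like this):

```javascript
// Wrap any reporting callback so it fires at most once per intervalMs.
// The clock parameter exists only to make the throttle testable.
function makeThrottledReporter(onProgress, intervalMs = 100, clock = Date.now) {
  let lastUpdate = 0;
  return (current, total) => {
    const now = clock();
    if (now - lastUpdate > intervalMs) {
      onProgress(current, total);
      lastUpdate = now;
      return true;  // reported
    }
    return false;   // skipped this tick
  };
}
```

Extracting the throttle would let the three import functions share one tested implementation instead of three copies.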
180	inventory-server/scripts/old_csv/update-csv.js	Normal file
@@ -0,0 +1,180 @@
const path = require('path');
const fs = require('fs');
const axios = require('axios');
const { outputProgress, formatElapsedTime, estimateRemaining, calculateRate } = require('../metrics/utils/progress');

// Change working directory to script directory
process.chdir(path.dirname(__filename));

require('dotenv').config({ path: path.resolve(__dirname, '..', '.env') });

const FILES = [
  {
    name: '39f2x83-products.csv',
    url: process.env.PRODUCTS_CSV_URL
  },
  {
    name: '39f2x83-orders.csv',
    url: process.env.ORDERS_CSV_URL
  },
  {
    name: '39f2x83-purchase_orders.csv',
    url: process.env.PURCHASE_ORDERS_CSV_URL
  }
];

let isCancelled = false;

function cancelUpdate() {
  isCancelled = true;
  outputProgress({
    status: 'cancelled',
    operation: 'CSV update cancelled',
    current: 0,
    total: FILES.length,
    elapsed: null,
    remaining: null,
    rate: 0
  });
}

async function downloadFile(file, index, startTime) {
  if (isCancelled) return;

  const csvDir = path.join(__dirname, '../csv');
  if (!fs.existsSync(csvDir)) {
    fs.mkdirSync(csvDir, { recursive: true });
  }

  const writer = fs.createWriteStream(path.join(csvDir, file.name));

  try {
    const response = await axios({
      url: file.url,
      method: 'GET',
      responseType: 'stream'
    });

    const totalLength = response.headers['content-length'];
    let downloadedLength = 0;
    let lastProgressUpdate = Date.now();
    const PROGRESS_INTERVAL = 1000; // Update progress every second

    response.data.on('data', (chunk) => {
      if (isCancelled) {
        writer.end();
        return;
      }

      downloadedLength += chunk.length;

      // Update progress based on time interval
      const now = Date.now();
      if (now - lastProgressUpdate >= PROGRESS_INTERVAL) {
        const progress = (downloadedLength / totalLength) * 100;
        outputProgress({
          status: 'running',
          operation: `Downloading ${file.name}`,
          current: index + (downloadedLength / totalLength),
          total: FILES.length,
          elapsed: formatElapsedTime(startTime),
          remaining: estimateRemaining(startTime, index + (downloadedLength / totalLength), FILES.length),
          rate: calculateRate(startTime, index + (downloadedLength / totalLength)),
          percentage: progress.toFixed(1),
          file_progress: {
            name: file.name,
            downloaded: downloadedLength,
            total: totalLength,
            percentage: progress.toFixed(1)
          }
        });
        lastProgressUpdate = now;
      }
    });

    response.data.pipe(writer);

    return new Promise((resolve, reject) => {
      writer.on('finish', resolve);
      writer.on('error', reject);
    });
  } catch (error) {
    fs.unlinkSync(path.join(csvDir, file.name));
    throw error;
  }
}

// Main function to update all files
async function updateFiles() {
  const startTime = Date.now();

  outputProgress({
    status: 'running',
    operation: 'Starting CSV update',
    current: 0,
    total: FILES.length,
    elapsed: '0s',
    remaining: null,
    rate: 0,
    percentage: '0'
  });

  try {
    for (let i = 0; i < FILES.length; i++) {
      if (isCancelled) {
        return;
      }

      const file = FILES[i];
      await downloadFile(file, i, startTime);

      outputProgress({
        status: 'running',
        operation: 'CSV update in progress',
        current: i + 1,
        total: FILES.length,
        elapsed: formatElapsedTime(startTime),
        remaining: estimateRemaining(startTime, i + 1, FILES.length),
        rate: calculateRate(startTime, i + 1),
        percentage: (((i + 1) / FILES.length) * 100).toFixed(1)
      });
    }

    outputProgress({
      status: 'complete',
      operation: 'CSV update complete',
      current: FILES.length,
      total: FILES.length,
      elapsed: formatElapsedTime(startTime),
      remaining: '0s',
      rate: calculateRate(startTime, FILES.length),
      percentage: '100'
    });
  } catch (error) {
    outputProgress({
      status: 'error',
      operation: 'CSV update failed',
      error: error.message,
      current: 0,
      total: FILES.length,
      elapsed: formatElapsedTime(startTime),
      remaining: null,
      rate: 0
    });
    throw error;
  }
}

// Run the update only if this is the main module
if (require.main === module) {
  updateFiles().catch((error) => {
    console.error('Error updating CSV files:', error);
    process.exit(1);
  });
}

// Export the functions needed by the route
module.exports = {
  updateFiles,
  cancelUpdate
};

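`downloadFile` reports overall progress as `index + (downloadedLength / totalLength)` over `FILES.length`, i.e. completed files plus the fraction of the current file. That arithmetic, isolated as a pure function (the helper name is illustrative):

```javascript
// Overall percentage across a multi-file download:
// fileIndex completed files plus the fraction of the current one.
function overallPercentage(fileIndex, downloaded, fileTotal, fileCount) {
  const current = fileIndex + downloaded / fileTotal; // e.g. 1.5 = second file half done
  return ((current / fileCount) * 100).toFixed(1);
}
```

Note the real code relies on the `content-length` header; when a server omits it, `totalLength` is undefined and the fraction becomes `NaN`, so a guard would be needed in practice.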
@@ -40,6 +40,7 @@ const CONFIG_TABLES = [
  'sales_velocity_config',
  'abc_classification_config',
  'safety_stock_config',
  'sales_seasonality',
  'turnover_config'
];

@@ -155,7 +156,7 @@ async function resetDatabase() {
      SELECT GROUP_CONCAT(table_name) as tables
      FROM information_schema.tables
      WHERE table_schema = DATABASE()
      AND table_name != 'users'
      AND table_name NOT IN ('users', 'import_history', 'calculate_history')
    `);

    if (!tables[0].tables) {
@@ -174,7 +175,7 @@ async function resetDatabase() {
      DROP TABLE IF EXISTS
      ${tables[0].tables
        .split(',')
        .filter(table => table !== 'users')
        .filter(table => !['users', 'calculate_history'].includes(table))
        .map(table => '`' + table + '`')
        .join(', ')}
    `;
@@ -542,5 +543,15 @@ async function resetDatabase() {
  }
}

// Run the reset
resetDatabase();
// Export if required as a module
if (typeof module !== 'undefined' && module.exports) {
  module.exports = resetDatabase;
}

// Run if called directly
if (require.main === module) {
  resetDatabase().catch(error => {
    console.error('Error:', error);
    process.exit(1);
  });
}

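The reset-db hunks build a single `DROP TABLE` statement from MySQL's `GROUP_CONCAT(table_name)` string while filtering out protected tables. The same string handling as a standalone sketch (the function name and default keep-list are illustrative):

```javascript
// Turn a GROUP_CONCAT table list into one DROP TABLE statement,
// excluding protected tables and backtick-quoting the rest.
function buildDropStatement(groupConcat, keep = ['users', 'calculate_history']) {
  const tables = groupConcat
    .split(',')
    .filter((table) => !keep.includes(table))
    .map((table) => '`' + table + '`');
  return tables.length ? `DROP TABLE IF EXISTS ${tables.join(', ')}` : null;
}
```

One caveat worth noting: `GROUP_CONCAT` truncates at `group_concat_max_len` (1024 bytes by default), so a schema with many tables can silently lose names from the list.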
@@ -12,10 +12,16 @@ const dbConfig = {
};

function outputProgress(data) {
  if (!data.status) {
    data = {
      status: 'running',
      ...data
    };
  }
  console.log(JSON.stringify(data));
}

// Explicitly define all metrics-related tables
// Explicitly define all metrics-related tables in dependency order
const METRICS_TABLES = [
  'brand_metrics',
  'brand_time_metrics',
@@ -26,7 +32,6 @@ const METRICS_TABLES = [
  'product_metrics',
  'product_time_aggregates',
  'sales_forecasts',
  'sales_seasonality',
  'temp_purchase_metrics',
  'temp_sales_metrics',
  'vendor_metrics', //before vendor_details for foreign key
@@ -34,56 +39,279 @@ const METRICS_TABLES = [
  'vendor_details'
];

// Config tables that must exist
const CONFIG_TABLES = [
  'stock_thresholds',
  'lead_time_thresholds',
  'sales_velocity_config',
  'abc_classification_config',
  'safety_stock_config',
  'turnover_config'
];
// Split SQL into individual statements
function splitSQLStatements(sql) {
  sql = sql.replace(/\r\n/g, '\n');
  let statements = [];
  let currentStatement = '';
  let inString = false;
  let stringChar = '';

// Core tables that must exist
const REQUIRED_CORE_TABLES = [
  'products',
  'orders',
  'purchase_orders'
];
  for (let i = 0; i < sql.length; i++) {
    const char = sql[i];
    const nextChar = sql[i + 1] || '';

    if ((char === "'" || char === '"') && sql[i - 1] !== '\\') {
      if (!inString) {
        inString = true;
        stringChar = char;
      } else if (char === stringChar) {
        inString = false;
      }
    }

    if (!inString && char === '-' && nextChar === '-') {
      while (i < sql.length && sql[i] !== '\n') i++;
      continue;
    }

    if (!inString && char === '/' && nextChar === '*') {
      i += 2;
      while (i < sql.length && (sql[i] !== '*' || sql[i + 1] !== '/')) i++;
      i++;
      continue;
    }

    if (!inString && char === ';') {
      if (currentStatement.trim()) {
        statements.push(currentStatement.trim());
      }
      currentStatement = '';
    } else {
      currentStatement += char;
    }
  }

  if (currentStatement.trim()) {
    statements.push(currentStatement.trim());
  }

  return statements;
}

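The splitter's key property is that `;` only ends a statement outside quoted strings and comments. A condensed version of the same scanning logic (string-aware split, `--` line comments only; the full version above also strips `/* */` block comments), useful for seeing the intended behavior:

```javascript
// Split SQL on ';' only outside quoted strings, dropping '--' line comments.
function splitSQL(sql) {
  const statements = [];
  let current = '';
  let inString = false;
  let quote = '';
  for (let i = 0; i < sql.length; i++) {
    const ch = sql[i];
    if ((ch === "'" || ch === '"') && sql[i - 1] !== '\\') {
      if (!inString) { inString = true; quote = ch; }
      else if (ch === quote) { inString = false; }
    }
    if (!inString && ch === '-' && sql[i + 1] === '-') {
      while (i < sql.length && sql[i] !== '\n') i++; // skip to end of line
      continue;
    }
    if (!inString && ch === ';') {
      if (current.trim()) statements.push(current.trim());
      current = '';
    } else {
      current += ch;
    }
  }
  if (current.trim()) statements.push(current.trim());
  return statements;
}
```

A semicolon inside a string literal or a comment does not split, which is exactly what naive `sql.split(';')` gets wrong on schema files.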
async function resetMetrics() {
  let connection;
  try {
    outputProgress({
      operation: 'Starting metrics reset',
      message: 'Connecting to database...'
    });

    connection = await mysql.createConnection(dbConfig);
    await connection.beginTransaction();

    // Drop all metrics tables
    for (const table of METRICS_TABLES) {
      console.log(`Dropping table: ${table}`);
    // First verify current state
    const [initialTables] = await connection.query(`
      SELECT TABLE_NAME as name
      FROM information_schema.tables
      WHERE TABLE_SCHEMA = DATABASE()
      AND TABLE_NAME IN (?)
    `, [METRICS_TABLES]);

    outputProgress({
      operation: 'Initial state',
      message: `Found ${initialTables.length} existing metrics tables: ${initialTables.map(t => t.name).join(', ')}`
    });

    // Disable foreign key checks at the start
    await connection.query('SET FOREIGN_KEY_CHECKS = 0');

    // Drop all metrics tables in reverse order to handle dependencies
    outputProgress({
      operation: 'Dropping metrics tables',
      message: 'Removing existing metrics tables...'
    });

    for (const table of [...METRICS_TABLES].reverse()) {
      try {
        await connection.query(`DROP TABLE IF EXISTS ${table}`);
        console.log(`Successfully dropped: ${table}`);

        // Verify the table was actually dropped
        const [checkDrop] = await connection.query(`
          SELECT COUNT(*) as count
          FROM information_schema.tables
          WHERE TABLE_SCHEMA = DATABASE()
          AND TABLE_NAME = ?
        `, [table]);

        if (checkDrop[0].count > 0) {
          throw new Error(`Failed to drop table ${table} - table still exists`);
        }

        outputProgress({
          operation: 'Table dropped',
          message: `Successfully dropped table: ${table}`
        });
      } catch (err) {
        console.error(`Error dropping ${table}:`, err.message);
        outputProgress({
          status: 'error',
          operation: 'Drop table error',
          message: `Error dropping table ${table}: ${err.message}`
        });
        throw err;
      }
    }

    // Recreate all metrics tables from schema
    const schemaSQL = fs.readFileSync(path.resolve(__dirname, '../db/metrics-schema.sql'), 'utf8');
    await connection.query(schemaSQL);
    console.log('All metrics tables recreated successfully');
    // Verify all tables were dropped
    const [afterDrop] = await connection.query(`
      SELECT TABLE_NAME as name
      FROM information_schema.tables
      WHERE TABLE_SCHEMA = DATABASE()
      AND TABLE_NAME IN (?)
    `, [METRICS_TABLES]);

    if (afterDrop.length > 0) {
      throw new Error(`Failed to drop all tables. Remaining tables: ${afterDrop.map(t => t.name).join(', ')}`);
    }

    // Read metrics schema
    outputProgress({
      operation: 'Reading schema',
      message: 'Loading metrics schema file...'
    });

    const schemaPath = path.resolve(__dirname, '../db/metrics-schema.sql');
    if (!fs.existsSync(schemaPath)) {
      throw new Error(`Schema file not found at: ${schemaPath}`);
    }

    const schemaSQL = fs.readFileSync(schemaPath, 'utf8');
    const statements = splitSQLStatements(schemaSQL);

    outputProgress({
      operation: 'Schema loaded',
      message: `Found ${statements.length} SQL statements to execute`
    });

    // Execute schema statements
    for (let i = 0; i < statements.length; i++) {
      const stmt = statements[i];
      try {
        await connection.query(stmt);

        // Check for warnings
        const [warnings] = await connection.query('SHOW WARNINGS');
        if (warnings && warnings.length > 0) {
          outputProgress({
            status: 'warning',
            operation: 'SQL Warning',
            message: {
              statement: i + 1,
              warnings: warnings
            }
          });
        }

        // If this is a CREATE TABLE statement, verify the table was created
        if (stmt.trim().toLowerCase().startsWith('create table')) {
          const tableName = stmt.match(/create\s+table\s+(?:if\s+not\s+exists\s+)?`?(\w+)`?/i)?.[1];
          if (tableName) {
            const [checkCreate] = await connection.query(`
              SELECT TABLE_NAME as name, CREATE_TIME as created
              FROM information_schema.tables
              WHERE TABLE_SCHEMA = DATABASE()
              AND TABLE_NAME = ?
            `, [tableName]);

            if (checkCreate.length === 0) {
              throw new Error(`Failed to create table ${tableName} - table does not exist after CREATE statement`);
            }

            outputProgress({
              operation: 'Table created',
              message: `Successfully created table: ${tableName} at ${checkCreate[0].created}`
            });
          }
        }

        outputProgress({
          operation: 'SQL Progress',
          message: {
            statement: i + 1,
            total: statements.length,
            preview: stmt.substring(0, 100) + (stmt.length > 100 ? '...' : '')
          }
        });
      } catch (sqlError) {
        outputProgress({
          status: 'error',
          operation: 'SQL Error',
          message: {
            error: sqlError.message,
            sqlState: sqlError.sqlState,
            errno: sqlError.errno,
            statement: stmt,
            statementNumber: i + 1
          }
        });
        throw sqlError;
      }
    }

    // Re-enable foreign key checks after all tables are created
    await connection.query('SET FOREIGN_KEY_CHECKS = 1');

    // Verify metrics tables were created
    outputProgress({
      operation: 'Verifying metrics tables',
      message: 'Checking all metrics tables were created...'
    });

    const [metricsTablesResult] = await connection.query(`
      SELECT
        TABLE_NAME as name,
        TABLE_ROWS as \`rows\`,
        CREATE_TIME as created
      FROM information_schema.tables
      WHERE TABLE_SCHEMA = DATABASE()
      AND TABLE_NAME IN (?)
    `, [METRICS_TABLES]);

    outputProgress({
      operation: 'Tables found',
      message: `Found ${metricsTablesResult.length} tables: ${metricsTablesResult.map(t =>
        `${t.name} (created: ${t.created})`
      ).join(', ')}`
    });

    const existingMetricsTables = metricsTablesResult.map(t => t.name);
    const missingMetricsTables = METRICS_TABLES.filter(t => !existingMetricsTables.includes(t));

    if (missingMetricsTables.length > 0) {
      // Do one final check of the actual tables
      const [finalCheck] = await connection.query('SHOW TABLES');
      outputProgress({
        operation: 'Final table check',
        message: `All database tables: ${finalCheck.map(t => Object.values(t)[0]).join(', ')}`
      });
      throw new Error(`Failed to create metrics tables: ${missingMetricsTables.join(', ')}`);
    }

    await connection.commit();
    console.log('All metrics tables reset successfully');

    outputProgress({
      status: 'complete',
      operation: 'Reset complete',
      message: 'All metrics tables have been reset successfully'
    });
  } catch (error) {
    outputProgress({
      status: 'error',
      operation: 'Reset failed',
      message: error.message,
      stack: error.stack
    });

    if (connection) {
      await connection.rollback();
      // Make sure to re-enable foreign key checks even if there's an error
      await connection.query('SET FOREIGN_KEY_CHECKS = 1').catch(() => {});
    }
    console.error('Error resetting metrics:', error);
    throw error;
  } finally {
    if (connection) {
      // One final attempt to ensure foreign key checks are enabled
      await connection.query('SET FOREIGN_KEY_CHECKS = 1').catch(() => {});
      await connection.end();
    }
  }
}

180	inventory-server/scripts/scripts.js	Normal file
@@ -0,0 +1,180 @@
const readline = require('readline');

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout
});

const question = (query) => new Promise((resolve) => rl.question(query, resolve));

async function loadScript(name) {
  try {
    return await require(name);
  } catch (error) {
    console.error(`Failed to load script ${name}:`, error);
    return null;
  }
}

async function runWithTimeout(fn) {
  return new Promise((resolve, reject) => {
    // Create a child process for the script
    const child = require('child_process').fork(fn.toString(), [], {
      stdio: 'inherit'
    });

    child.on('exit', (code) => {
      if (code === 0) {
        resolve();
      } else {
        reject(new Error(`Script exited with code ${code}`));
      }
    });

    child.on('error', (err) => {
      reject(err);
    });
  });
}

function clearScreen() {
  process.stdout.write('\x1Bc');
}

const scripts = {
  'Import Scripts': {
    '1': { name: 'Full Import From Production', path: './import-from-prod' },
    '2': { name: 'Individual Import Scripts ▸', submenu: {
      '1': { name: 'Import Orders', path: './import/orders', key: 'importOrders' },
      '2': { name: 'Import Products', path: './import/products', key: 'importProducts' },
      '3': { name: 'Import Purchase Orders', path: './import/purchase-orders' },
      '4': { name: 'Import Categories', path: './import/categories' },
      'b': { name: 'Back to Main Menu' }
    }}
  },
  'Metrics': {
    '3': { name: 'Calculate All Metrics', path: './calculate-metrics' },
    '4': { name: 'Individual Metric Scripts ▸', submenu: {
      '1': { name: 'Brand Metrics', path: './metrics/brand-metrics' },
      '2': { name: 'Category Metrics', path: './metrics/category-metrics' },
      '3': { name: 'Financial Metrics', path: './metrics/financial-metrics' },
      '4': { name: 'Product Metrics', path: './metrics/product-metrics' },
      '5': { name: 'Sales Forecasts', path: './metrics/sales-forecasts' },
      '6': { name: 'Time Aggregates', path: './metrics/time-aggregates' },
      '7': { name: 'Vendor Metrics', path: './metrics/vendor-metrics' },
      'b': { name: 'Back to Main Menu' }
    }}
  },
  'Database Management': {
    '5': { name: 'Test Production Connection', path: './test-prod-connection' }
  },
  'Reset Scripts': {
    '6': { name: 'Reset Database', path: './reset-db' },
    '7': { name: 'Reset Metrics', path: './reset-metrics' }
  }
};

|
||||
|
||||
async function displayMenu(menuItems, title = 'Inventory Management Script Runner') {
|
||||
clearScreen();
|
||||
console.log(`\n${title}\n`);
|
||||
|
||||
for (const [category, items] of Object.entries(menuItems)) {
|
||||
console.log(`\n${category}:`);
|
||||
Object.entries(items).forEach(([key, script]) => {
|
||||
console.log(`${key}. ${script.name}`);
|
||||
});
|
||||
}
|
||||
|
||||
if (lastRun) {
|
||||
console.log('\nQuick Access:');
|
||||
console.log(`r. Repeat Last Script (${lastRun.name})`);
|
||||
}
|
||||
|
||||
console.log('\nq. Quit\n');
|
||||
}
|
||||
|
||||
async function handleSubmenu(submenu, title) {
|
||||
while (true) {
|
||||
await displayMenu({"Individual Scripts": submenu}, title);
|
||||
const choice = await question('Select an option (or b to go back): ');
|
||||
|
||||
if (choice.toLowerCase() === 'b') {
|
||||
return null;
|
||||
}
|
||||
|
||||
if (submenu[choice]) {
|
||||
return submenu[choice];
|
||||
}
|
||||
|
||||
console.log('Invalid selection. Please try again.');
|
||||
await new Promise(resolve => setTimeout(resolve, 1000));
|
||||
}
|
||||
}
|
||||
|
||||
async function runScript(script) {
|
||||
console.log(`\nRunning: ${script.name}`);
|
||||
try {
|
||||
const scriptPath = require.resolve(script.path);
|
||||
await runWithTimeout(scriptPath);
|
||||
console.log('\nScript completed successfully');
|
||||
lastRun = script;
|
||||
} catch (error) {
|
||||
console.error('\nError running script:', error);
|
||||
}
|
||||
await question('\nPress Enter to continue...');
|
||||
}
|
||||
|
||||
async function main() {
|
||||
while (true) {
|
||||
await displayMenu(scripts);
|
||||
|
||||
const choice = await question('Select an option: ');
|
||||
|
||||
if (choice.toLowerCase() === 'q') {
|
||||
break;
|
||||
}
|
||||
|
||||
if (choice.toLowerCase() === 'r' && lastRun) {
|
||||
await runScript(lastRun);
|
||||
continue;
|
||||
}
|
||||
|
||||
let selectedScript = null;
|
||||
for (const category of Object.values(scripts)) {
|
||||
if (category[choice]) {
|
||||
selectedScript = category[choice];
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (!selectedScript) {
|
||||
console.log('Invalid selection. Please try again.');
|
||||
await new Promise(resolve => setTimeout(resolve, 1000));
|
||||
continue;
|
||||
}
|
||||
|
||||
if (selectedScript.submenu) {
|
||||
const submenuChoice = await handleSubmenu(
|
||||
selectedScript.submenu,
|
||||
selectedScript.name
|
||||
);
|
||||
if (submenuChoice && submenuChoice.path) {
|
||||
await runScript(submenuChoice);
|
||||
}
|
||||
} else if (selectedScript.path) {
|
||||
await runScript(selectedScript);
|
||||
}
|
||||
}
|
||||
|
||||
rl.close();
|
||||
process.exit(0);
|
||||
}
|
||||
|
||||
if (require.main === module) {
|
||||
main().catch(error => {
|
||||
console.error('Fatal error:', error);
|
||||
process.exit(1);
|
||||
});
|
||||
}
|
||||
@@ -1,167 +0,0 @@
const fs = require('fs');
const path = require('path');
const https = require('https');

// Configuration
const FILES = [
  {
    name: '39f2x83-products.csv',
    url: 'https://feeds.acherryontop.com/39f2x83-products.csv'
  },
  {
    name: '39f2x83-orders.csv',
    url: 'https://feeds.acherryontop.com/39f2x83-orders.csv'
  },
  {
    name: '39f2x83-purchase_orders.csv',
    url: 'https://feeds.acherryontop.com/39f2x83-purchase_orders.csv'
  }
];

const CSV_DIR = path.join(__dirname, '..', 'csv');

// Ensure CSV directory exists
if (!fs.existsSync(CSV_DIR)) {
  fs.mkdirSync(CSV_DIR, { recursive: true });
}

// Function to download a file
function downloadFile(url, filePath) {
  return new Promise((resolve, reject) => {
    const file = fs.createWriteStream(filePath);

    https.get(url, response => {
      if (response.statusCode !== 200) {
        reject(new Error(`Failed to download: ${response.statusCode} ${response.statusMessage}`));
        return;
      }

      const totalSize = parseInt(response.headers['content-length'], 10);
      let downloadedSize = 0;
      let lastProgressUpdate = Date.now();
      const startTime = Date.now();

      response.on('data', chunk => {
        downloadedSize += chunk.length;
        const now = Date.now();
        // Update progress at most every 100ms to avoid console flooding
        if (now - lastProgressUpdate > 100) {
          const elapsed = (now - startTime) / 1000;
          const rate = downloadedSize / elapsed;
          const remaining = (totalSize - downloadedSize) / rate;

          console.log(JSON.stringify({
            status: 'running',
            operation: `Downloading ${path.basename(filePath)}`,
            current: downloadedSize,
            total: totalSize,
            rate: (rate / 1024 / 1024).toFixed(2), // MB/s
            elapsed: formatDuration(elapsed),
            remaining: formatDuration(remaining),
            percentage: ((downloadedSize / totalSize) * 100).toFixed(1)
          }));
          lastProgressUpdate = now;
        }
      });

      response.pipe(file);

      file.on('finish', () => {
        console.log(JSON.stringify({
          status: 'running',
          operation: `Completed ${path.basename(filePath)}`,
          current: totalSize,
          total: totalSize,
          percentage: '100'
        }));
        file.close();
        resolve();
      });
    }).on('error', error => {
      fs.unlink(filePath, () => {}); // Delete the file if download failed
      reject(error);
    });

    file.on('error', error => {
      fs.unlink(filePath, () => {}); // Delete the file if there was an error
      reject(error);
    });
  });
}

// Helper function to format duration
function formatDuration(seconds) {
  if (seconds < 60) return `${Math.round(seconds)}s`;
  const minutes = Math.floor(seconds / 60);
  seconds = Math.round(seconds % 60);
  return `${minutes}m ${seconds}s`;
}

// Main function to update all files
async function updateFiles() {
  console.log(JSON.stringify({
    status: 'running',
    operation: 'Starting CSV file updates',
    total: FILES.length,
    current: 0
  }));

  for (let i = 0; i < FILES.length; i++) {
    const file = FILES[i];
    const filePath = path.join(CSV_DIR, file.name);

    try {
      // Delete existing file if it exists
      if (fs.existsSync(filePath)) {
        console.log(JSON.stringify({
          status: 'running',
          operation: `Removing existing file: ${file.name}`,
          current: i,
          total: FILES.length,
          percentage: ((i / FILES.length) * 100).toFixed(1)
        }));
        fs.unlinkSync(filePath);
      }

      // Download new file
      console.log(JSON.stringify({
        status: 'running',
        operation: `Starting download: ${file.name}`,
        current: i,
        total: FILES.length,
        percentage: ((i / FILES.length) * 100).toFixed(1)
      }));
      await downloadFile(file.url, filePath);
      console.log(JSON.stringify({
        status: 'running',
        operation: `Successfully updated ${file.name}`,
        current: i + 1,
        total: FILES.length,
        percentage: (((i + 1) / FILES.length) * 100).toFixed(1)
      }));
    } catch (error) {
      console.error(JSON.stringify({
        status: 'error',
        operation: `Error updating ${file.name}`,
        error: error.message
      }));
      throw error;
    }
  }

  console.log(JSON.stringify({
    status: 'complete',
    operation: 'CSV file update complete',
    current: FILES.length,
    total: FILES.length,
    percentage: '100'
  }));
}

// Run the update
updateFiles().catch(error => {
  console.error(JSON.stringify({
    error: `Update failed: ${error.message}`
  }));
  process.exit(1);
});
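The removed updater above emitted machine-readable progress as one JSON object per stdout line. A hypothetical consumer of that protocol (not part of this repo) could render each line like this:

```javascript
// Parses one JSON status line from the updater's stdout and formats it
// for display; returns a failure string when the line carries an error.
function formatStatusLine(line) {
  const msg = JSON.parse(line);
  if (msg.error) return `FAILED: ${msg.error}`;
  const pct = msg.percentage !== undefined ? ` ${msg.percentage}%` : '';
  return `[${msg.status}] ${msg.operation}${pct}`;
}
```

Because every progress record is a single line of JSON, the output can be split on newlines and piped safely into other tools.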
@@ -36,7 +36,7 @@ router.get('/stats', async (req, res) => {
           0
         ) as averageOrderValue
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
       WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
     `);
@@ -62,22 +62,43 @@ router.get('/profit', async (req, res) => {
   try {
     const pool = req.app.locals.pool;

-    // Get profit margins by category
+    // Get profit margins by category with full path
     const [byCategory] = await pool.query(`
+      WITH RECURSIVE category_path AS (
+        SELECT
+          c.cat_id,
+          c.name,
+          c.parent_id,
+          CAST(c.name AS CHAR(1000)) as path
+        FROM categories c
+        WHERE c.parent_id IS NULL
+
+        UNION ALL
+
+        SELECT
+          c.cat_id,
+          c.name,
+          c.parent_id,
+          CONCAT(cp.path, ' > ', c.name)
+        FROM categories c
+        JOIN category_path cp ON c.parent_id = cp.cat_id
+      )
       SELECT
         c.name as category,
+        cp.path as categoryPath,
         ROUND(
           (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
           NULLIF(SUM(o.price * o.quantity), 0)) * 100, 1
         ) as profitMargin,
-        SUM(o.price * o.quantity) as revenue,
-        SUM(p.cost_price * o.quantity) as cost
+        CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) as revenue,
+        CAST(SUM(p.cost_price * o.quantity) AS DECIMAL(15,3)) as cost
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
-      JOIN product_categories pc ON p.product_id = pc.product_id
-      JOIN categories c ON pc.category_id = c.id
+      LEFT JOIN orders o ON p.pid = o.pid
+      JOIN product_categories pc ON p.pid = pc.pid
+      JOIN categories c ON pc.cat_id = c.cat_id
+      JOIN category_path cp ON c.cat_id = cp.cat_id
       WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
-      GROUP BY c.name
+      GROUP BY c.name, cp.path
       ORDER BY profitMargin DESC
       LIMIT 10
     `);
@@ -90,10 +111,10 @@ router.get('/profit', async (req, res) => {
           (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
           NULLIF(SUM(o.price * o.quantity), 0)) * 100, 1
         ) as profitMargin,
-        SUM(o.price * o.quantity) as revenue,
-        SUM(p.cost_price * o.quantity) as cost
+        CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) as revenue,
+        CAST(SUM(p.cost_price * o.quantity) AS DECIMAL(15,3)) as cost
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
       CROSS JOIN (
         SELECT DATE_FORMAT(o.date, '%Y-%m-%d') as formatted_date
         FROM orders o
@@ -106,20 +127,44 @@ router.get('/profit', async (req, res) => {
       ORDER BY formatted_date
     `);

-    // Get top performing products
+    // Get top performing products with category paths
     const [topProducts] = await pool.query(`
+      WITH RECURSIVE category_path AS (
+        SELECT
+          c.cat_id,
+          c.name,
+          c.parent_id,
+          CAST(c.name AS CHAR(1000)) as path
+        FROM categories c
+        WHERE c.parent_id IS NULL
+
+        UNION ALL
+
+        SELECT
+          c.cat_id,
+          c.name,
+          c.parent_id,
+          CONCAT(cp.path, ' > ', c.name)
+        FROM categories c
+        JOIN category_path cp ON c.parent_id = cp.cat_id
+      )
       SELECT
         p.title as product,
+        c.name as category,
+        cp.path as categoryPath,
         ROUND(
           (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
           NULLIF(SUM(o.price * o.quantity), 0)) * 100, 1
         ) as profitMargin,
-        SUM(o.price * o.quantity) as revenue,
-        SUM(p.cost_price * o.quantity) as cost
+        CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) as revenue,
+        CAST(SUM(p.cost_price * o.quantity) AS DECIMAL(15,3)) as cost
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
+      JOIN product_categories pc ON p.pid = pc.pid
+      JOIN categories c ON pc.cat_id = c.cat_id
+      JOIN category_path cp ON c.cat_id = cp.cat_id
       WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
-      GROUP BY p.product_id, p.title
+      GROUP BY p.pid, p.title, c.name, cp.path
       HAVING revenue > 0
       ORDER BY profitMargin DESC
       LIMIT 10
@@ -144,7 +189,7 @@ router.get('/vendors', async (req, res) => {
       SELECT COUNT(DISTINCT p.vendor) as vendor_count,
         COUNT(DISTINCT o.order_number) as order_count
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
       WHERE p.vendor IS NOT NULL
     `);
@@ -155,26 +200,26 @@ router.get('/vendors', async (req, res) => {
       WITH monthly_sales AS (
         SELECT
           p.vendor,
-          SUM(CASE
+          CAST(SUM(CASE
             WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
             THEN o.price * o.quantity
             ELSE 0
-          END) as current_month,
-          SUM(CASE
+          END) AS DECIMAL(15,3)) as current_month,
+          CAST(SUM(CASE
             WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
             AND o.date < DATE_SUB(CURDATE(), INTERVAL 30 DAY)
             THEN o.price * o.quantity
             ELSE 0
-          END) as previous_month
+          END) AS DECIMAL(15,3)) as previous_month
         FROM products p
-        LEFT JOIN orders o ON p.product_id = o.product_id
+        LEFT JOIN orders o ON p.pid = o.pid
         WHERE p.vendor IS NOT NULL
         AND o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
         GROUP BY p.vendor
       )
       SELECT
         p.vendor,
-        SUM(o.price * o.quantity) as salesVolume,
+        CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) as salesVolume,
         COALESCE(ROUND(
           (SUM(o.price * o.quantity - p.cost_price * o.quantity) /
           NULLIF(SUM(o.price * o.quantity), 0)) * 100, 1
@@ -182,13 +227,13 @@ router.get('/vendors', async (req, res) => {
         COALESCE(ROUND(
           SUM(o.quantity) / NULLIF(AVG(p.stock_quantity), 0), 1
         ), 0) as stockTurnover,
-        COUNT(DISTINCT p.product_id) as productCount,
+        COUNT(DISTINCT p.pid) as productCount,
         ROUND(
           ((ms.current_month / NULLIF(ms.previous_month, 0)) - 1) * 100,
           1
         ) as growth
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
       LEFT JOIN monthly_sales ms ON p.vendor = ms.vendor
       WHERE p.vendor IS NOT NULL
       AND o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
@@ -203,11 +248,11 @@ router.get('/vendors', async (req, res) => {
     const [comparison] = await pool.query(`
       SELECT
         p.vendor,
-        COALESCE(ROUND(SUM(o.price * o.quantity) / NULLIF(COUNT(DISTINCT p.product_id), 0), 2), 0) as salesPerProduct,
+        CAST(COALESCE(ROUND(SUM(o.price * o.quantity) / NULLIF(COUNT(DISTINCT p.pid), 0), 2), 0) AS DECIMAL(15,3)) as salesPerProduct,
         COALESCE(ROUND(AVG((o.price - p.cost_price) / NULLIF(o.price, 0) * 100), 1), 0) as averageMargin,
-        COUNT(DISTINCT p.product_id) as size
+        COUNT(DISTINCT p.pid) as size
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id AND o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
+      LEFT JOIN orders o ON p.pid = o.pid AND o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
       WHERE p.vendor IS NOT NULL
       GROUP BY p.vendor
       ORDER BY salesPerProduct DESC
@@ -221,9 +266,9 @@ router.get('/vendors', async (req, res) => {
       SELECT
         p.vendor,
         DATE_FORMAT(o.date, '%b %Y') as month,
-        COALESCE(SUM(o.price * o.quantity), 0) as sales
+        CAST(COALESCE(SUM(o.price * o.quantity), 0) AS DECIMAL(15,3)) as sales
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
       WHERE p.vendor IS NOT NULL
       AND o.date >= DATE_SUB(CURDATE(), INTERVAL 6 MONTH)
       GROUP BY
@@ -272,9 +317,9 @@ router.get('/stock', async (req, res) => {
         ROUND(AVG(p.stock_quantity), 0) as averageStock,
         SUM(o.quantity) as totalSales
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
-      JOIN product_categories pc ON p.product_id = pc.product_id
-      JOIN categories c ON pc.category_id = c.id
+      LEFT JOIN orders o ON p.pid = o.pid
+      JOIN product_categories pc ON p.pid = pc.pid
+      JOIN categories c ON pc.cat_id = c.cat_id
       WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
       GROUP BY c.name
       HAVING turnoverRate > 0
@@ -290,7 +335,7 @@ router.get('/stock', async (req, res) => {
         SUM(CASE WHEN p.stock_quantity <= ? AND p.stock_quantity > 0 THEN 1 ELSE 0 END) as lowStock,
         SUM(CASE WHEN p.stock_quantity = 0 THEN 1 ELSE 0 END) as outOfStock
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
       WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
       GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
       ORDER BY date
@@ -304,26 +349,14 @@ router.get('/stock', async (req, res) => {
     const [criticalItems] = await pool.query(`
       WITH product_thresholds AS (
         SELECT
-          p.product_id,
+          p.pid,
           COALESCE(
             (SELECT reorder_days
              FROM stock_thresholds st
-             JOIN product_categories pc ON st.category_id = pc.category_id
-             WHERE pc.product_id = p.product_id
-             AND st.vendor = p.vendor LIMIT 1),
+             WHERE st.vendor = p.vendor LIMIT 1),
             (SELECT reorder_days
              FROM stock_thresholds st
-             JOIN product_categories pc ON st.category_id = pc.category_id
-             WHERE pc.product_id = p.product_id
-             AND st.vendor IS NULL LIMIT 1),
-            (SELECT reorder_days
-             FROM stock_thresholds st
-             WHERE st.category_id IS NULL
-             AND st.vendor = p.vendor LIMIT 1),
-            (SELECT reorder_days
-             FROM stock_thresholds st
-             WHERE st.category_id IS NULL
-             AND st.vendor IS NULL LIMIT 1),
+             WHERE st.vendor IS NULL LIMIT 1),
             14
           ) as reorder_days
         FROM products p
@@ -339,11 +372,11 @@ router.get('/stock', async (req, res) => {
           ELSE ROUND(p.stock_quantity / NULLIF((SUM(o.quantity) / ?), 0))
         END as daysUntilStockout
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
-      JOIN product_thresholds pt ON p.product_id = pt.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
+      JOIN product_thresholds pt ON p.pid = pt.pid
       WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
       AND p.managing_stock = true
-      GROUP BY p.product_id
+      GROUP BY p.pid
       HAVING daysUntilStockout < ? AND daysUntilStockout >= 0
       ORDER BY daysUntilStockout
       LIMIT 10
@@ -369,14 +402,16 @@ router.get('/pricing', async (req, res) => {
     // Get price points analysis
     const [pricePoints] = await pool.query(`
       SELECT
-        p.price,
-        SUM(o.quantity) as salesVolume,
-        SUM(o.price * o.quantity) as revenue,
-        p.categories as category
+        CAST(p.price AS DECIMAL(15,3)) as price,
+        CAST(SUM(o.quantity) AS DECIMAL(15,3)) as salesVolume,
+        CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) as revenue,
+        c.name as category
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
+      JOIN product_categories pc ON p.pid = pc.pid
+      JOIN categories c ON pc.cat_id = c.cat_id
       WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
-      GROUP BY p.price, p.categories
+      GROUP BY p.price, c.name
       HAVING salesVolume > 0
       ORDER BY revenue DESC
       LIMIT 50
@@ -386,8 +421,8 @@ router.get('/pricing', async (req, res) => {
     const [elasticity] = await pool.query(`
       SELECT
         DATE_FORMAT(o.date, '%Y-%m-%d') as date,
-        AVG(o.price) as price,
-        SUM(o.quantity) as demand
+        CAST(AVG(o.price) AS DECIMAL(15,3)) as price,
+        CAST(SUM(o.quantity) AS DECIMAL(15,3)) as demand
       FROM orders o
       WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
       GROUP BY DATE_FORMAT(o.date, '%Y-%m-%d')
@@ -398,21 +433,25 @@ router.get('/pricing', async (req, res) => {
     const [recommendations] = await pool.query(`
       SELECT
         p.title as product,
-        p.price as currentPrice,
-        ROUND(
-          CASE
-            WHEN AVG(o.quantity) > 10 THEN p.price * 1.1
-            WHEN AVG(o.quantity) < 2 THEN p.price * 0.9
-            ELSE p.price
-          END, 2
+        CAST(p.price AS DECIMAL(15,3)) as currentPrice,
+        CAST(
+          ROUND(
+            CASE
+              WHEN AVG(o.quantity) > 10 THEN p.price * 1.1
+              WHEN AVG(o.quantity) < 2 THEN p.price * 0.9
+              ELSE p.price
+            END, 2
+          ) AS DECIMAL(15,3)
         ) as recommendedPrice,
-        ROUND(
-          SUM(o.price * o.quantity) *
-          CASE
-            WHEN AVG(o.quantity) > 10 THEN 1.15
-            WHEN AVG(o.quantity) < 2 THEN 0.95
-            ELSE 1
-          END, 2
+        CAST(
+          ROUND(
+            SUM(o.price * o.quantity) *
+            CASE
+              WHEN AVG(o.quantity) > 10 THEN 1.15
+              WHEN AVG(o.quantity) < 2 THEN 0.95
+              ELSE 1
+            END, 2
+          ) AS DECIMAL(15,3)
         ) as potentialRevenue,
         CASE
           WHEN AVG(o.quantity) > 10 THEN 85
@@ -420,11 +459,11 @@ router.get('/pricing', async (req, res) => {
           ELSE 65
         END as confidence
       FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
+      LEFT JOIN orders o ON p.pid = o.pid
      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
-      GROUP BY p.product_id
+      GROUP BY p.pid, p.price
      HAVING ABS(recommendedPrice - currentPrice) > 0
-      ORDER BY potentialRevenue - SUM(o.price * o.quantity) DESC
+      ORDER BY potentialRevenue - CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) DESC
      LIMIT 10
    `);
@@ -440,11 +479,36 @@ router.get('/categories', async (req, res) => {
   try {
     const pool = req.app.locals.pool;

-    // Get category performance metrics
+    // Common CTE for category paths
+    const categoryPathCTE = `
+      WITH RECURSIVE category_path AS (
+        SELECT
+          c.cat_id,
+          c.name,
+          c.parent_id,
+          CAST(c.name AS CHAR(1000)) as path
+        FROM categories c
+        WHERE c.parent_id IS NULL
+
+        UNION ALL
+
+        SELECT
+          c.cat_id,
+          c.name,
+          c.parent_id,
+          CONCAT(cp.path, ' > ', c.name)
+        FROM categories c
+        JOIN category_path cp ON c.parent_id = cp.cat_id
+      )
+    `;
+
+    // Get category performance metrics with full path
     const [performance] = await pool.query(`
-      WITH monthly_sales AS (
+      ${categoryPathCTE},
+      monthly_sales AS (
        SELECT
          c.name,
+          cp.path,
          SUM(CASE
            WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
            THEN o.price * o.quantity
@@ -457,62 +521,72 @@ router.get('/categories', async (req, res) => {
            ELSE 0
          END) as previous_month
        FROM products p
-        LEFT JOIN orders o ON p.product_id = o.product_id
-        JOIN product_categories pc ON p.product_id = pc.product_id
-        JOIN categories c ON pc.category_id = c.id
+        LEFT JOIN orders o ON p.pid = o.pid
+        JOIN product_categories pc ON p.pid = pc.pid
+        JOIN categories c ON pc.cat_id = c.cat_id
+        JOIN category_path cp ON c.cat_id = cp.cat_id
        WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
-        GROUP BY c.name
+        GROUP BY c.name, cp.path
      )
      SELECT
        c.name as category,
+        cp.path as categoryPath,
        SUM(o.price * o.quantity) as revenue,
        SUM(o.price * o.quantity - p.cost_price * o.quantity) as profit,
        ROUND(
          ((ms.current_month / NULLIF(ms.previous_month, 0)) - 1) * 100,
          1
        ) as growth,
-        COUNT(DISTINCT p.product_id) as productCount
+        COUNT(DISTINCT p.pid) as productCount
      FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
-      JOIN product_categories pc ON p.product_id = pc.product_id
-      JOIN categories c ON pc.category_id = c.id
-      LEFT JOIN monthly_sales ms ON c.name = ms.name
+      LEFT JOIN orders o ON p.pid = o.pid
+      JOIN product_categories pc ON p.pid = pc.pid
+      JOIN categories c ON pc.cat_id = c.cat_id
+      JOIN category_path cp ON c.cat_id = cp.cat_id
+      LEFT JOIN monthly_sales ms ON c.name = ms.name AND cp.path = ms.path
      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
-      GROUP BY c.name, ms.current_month, ms.previous_month
+      GROUP BY c.name, cp.path, ms.current_month, ms.previous_month
      HAVING revenue > 0
      ORDER BY revenue DESC
      LIMIT 10
    `);

-    // Get category revenue distribution
+    // Get category revenue distribution with full path
     const [distribution] = await pool.query(`
+      ${categoryPathCTE}
      SELECT
        c.name as category,
+        cp.path as categoryPath,
        SUM(o.price * o.quantity) as value
      FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
-      JOIN product_categories pc ON p.product_id = pc.product_id
-      JOIN categories c ON pc.category_id = c.id
+      LEFT JOIN orders o ON p.pid = o.pid
+      JOIN product_categories pc ON p.pid = pc.pid
+      JOIN categories c ON pc.cat_id = c.cat_id
+      JOIN category_path cp ON c.cat_id = cp.cat_id
      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
-      GROUP BY c.name
+      GROUP BY c.name, cp.path
      HAVING value > 0
      ORDER BY value DESC
      LIMIT 6
    `);

-    // Get category sales trends
+    // Get category sales trends with full path
     const [trends] = await pool.query(`
+      ${categoryPathCTE}
      SELECT
        c.name as category,
+        cp.path as categoryPath,
        DATE_FORMAT(o.date, '%b %Y') as month,
        SUM(o.price * o.quantity) as sales
      FROM products p
-      LEFT JOIN orders o ON p.product_id = o.product_id
-      JOIN product_categories pc ON p.product_id = pc.product_id
-      JOIN categories c ON pc.category_id = c.id
+      LEFT JOIN orders o ON p.pid = o.pid
+      JOIN product_categories pc ON p.pid = pc.pid
+      JOIN categories c ON pc.cat_id = c.cat_id
+      JOIN category_path cp ON c.cat_id = cp.cat_id
      WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 6 MONTH)
      GROUP BY
        c.name,
+        cp.path,
        DATE_FORMAT(o.date, '%b %Y'),
        DATE_FORMAT(o.date, '%Y-%m')
      ORDER BY
@@ -529,76 +603,97 @@ router.get('/categories', async (req, res) => {
|
||||
|
||||
// Forecast endpoint
|
||||
router.get('/forecast', async (req, res) => {
|
||||
try {
|
||||
const { brand, startDate, endDate } = req.query;
|
||||
const pool = req.app.locals.pool;
|
||||
try {
|
||||
const { brand, startDate, endDate } = req.query;
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
const [results] = await pool.query(`
|
||||
WITH category_metrics AS (
|
||||
SELECT
|
||||
c.id as category_id,
|
||||
c.name as category_name,
|
||||
p.brand,
|
||||
COUNT(DISTINCT p.product_id) as num_products,
|
||||
COALESCE(ROUND(SUM(o.quantity) / DATEDIFF(?, ?), 2), 0) as avg_daily_sales,
|
||||
COALESCE(SUM(o.quantity), 0) as total_sold,
|
||||
COALESCE(ROUND(SUM(o.quantity) / COUNT(DISTINCT p.product_id), 2), 0) as avgTotalSold,
|
||||
COALESCE(ROUND(AVG(o.price), 2), 0) as avg_price
|
||||
FROM categories c
|
||||
JOIN product_categories pc ON c.id = pc.category_id
|
||||
JOIN products p ON pc.product_id = p.product_id
|
||||
LEFT JOIN product_metrics pm ON p.product_id = pm.product_id
|
||||
LEFT JOIN orders o ON p.product_id = o.product_id
|
||||
AND o.date BETWEEN ? AND ?
|
||||
AND o.canceled = false
|
||||
WHERE p.brand = ?
|
||||
AND pm.first_received_date BETWEEN ? AND ?
|
||||
GROUP BY c.id, c.name, p.brand
|
||||
),
|
||||
product_metrics AS (
|
||||
SELECT
|
||||
p.product_id,
|
||||
p.title,
|
||||
p.sku,
|
||||
p.stock_quantity,
|
||||
pc.category_id,
|
||||
pm.first_received_date,
|
||||
COALESCE(SUM(o.quantity), 0) as total_sold,
|
||||
COALESCE(ROUND(AVG(o.price), 2), 0) as avg_price
|
||||
FROM products p
|
||||
JOIN product_categories pc ON p.product_id = pc.product_id
|
||||
JOIN product_metrics pm ON p.product_id = pm.product_id
|
||||
LEFT JOIN orders o ON p.product_id = o.product_id
|
||||
AND o.date BETWEEN ? AND ?
|
||||
AND o.canceled = false
|
||||
WHERE p.brand = ?
|
||||
AND pm.first_received_date BETWEEN ? AND ?
|
||||
GROUP BY p.product_id, p.title, p.sku, p.stock_quantity, pc.category_id, pm.first_received_date
|
||||
)
|
||||
SELECT
|
||||
cm.*,
|
||||
JSON_ARRAYAGG(
|
||||
JSON_OBJECT(
|
||||
'product_id', pm.product_id,
|
||||
'title', pm.title,
|
||||
'sku', pm.sku,
|
||||
'stock_quantity', pm.stock_quantity,
|
||||
'total_sold', pm.total_sold,
|
||||
'avg_price', pm.avg_price,
|
||||
'first_received_date', DATE_FORMAT(pm.first_received_date, '%Y-%m-%d')
|
||||
)
|
||||
) as products
|
||||
FROM category_metrics cm
|
||||
JOIN product_metrics pm ON cm.category_id = pm.category_id
|
||||
GROUP BY cm.category_id, cm.category_name, cm.brand, cm.num_products, cm.avg_daily_sales, cm.total_sold, cm.avgTotalSold, cm.avg_price
|
||||
ORDER BY cm.total_sold DESC
|
||||
`, [startDate, endDate, startDate, endDate, brand, startDate, endDate, startDate, endDate, brand, startDate, endDate]);
|
||||
const [results] = await pool.query(`
|
||||
WITH RECURSIVE category_path AS (
|
||||
SELECT
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CAST(c.name AS CHAR(1000)) as path
|
||||
FROM categories c
|
||||
WHERE c.parent_id IS NULL
|
||||
|
||||
res.json(results);
|
||||
} catch (error) {
|
||||
console.error('Error fetching forecast data:', error);
|
||||
res.status(500).json({ error: 'Failed to fetch forecast data' });
|
||||
}
|
||||
UNION ALL
|
||||
|
||||
SELECT
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CONCAT(cp.path, ' > ', c.name)
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
),
|
||||
category_metrics AS (
|
        SELECT
          c.cat_id,
          c.name as category_name,
          cp.path,
          p.brand,
          COUNT(DISTINCT p.pid) as num_products,
          CAST(COALESCE(ROUND(SUM(o.quantity) / DATEDIFF(?, ?), 2), 0) AS DECIMAL(15,3)) as avg_daily_sales,
          COALESCE(SUM(o.quantity), 0) as total_sold,
          CAST(COALESCE(ROUND(SUM(o.quantity) / COUNT(DISTINCT p.pid), 2), 0) AS DECIMAL(15,3)) as avgTotalSold,
          CAST(COALESCE(ROUND(AVG(o.price), 2), 0) AS DECIMAL(15,3)) as avg_price
        FROM categories c
        JOIN product_categories pc ON c.cat_id = pc.cat_id
        JOIN products p ON pc.pid = p.pid
        JOIN category_path cp ON c.cat_id = cp.cat_id
        LEFT JOIN product_metrics pmet ON p.pid = pmet.pid
        LEFT JOIN orders o ON p.pid = o.pid
          AND o.date BETWEEN ? AND ?
          AND o.canceled = false
        WHERE p.brand = ?
          AND pmet.first_received_date BETWEEN ? AND ?
        GROUP BY c.cat_id, c.name, cp.path, p.brand
      ),
      product_details AS (
        SELECT
          p.pid,
          p.title,
          p.SKU,
          p.stock_quantity,
          pc.cat_id,
          pmet.first_received_date,
          COALESCE(SUM(o.quantity), 0) as total_sold,
          CAST(COALESCE(ROUND(AVG(o.price), 2), 0) AS DECIMAL(15,3)) as avg_price
        FROM products p
        JOIN product_categories pc ON p.pid = pc.pid
        JOIN product_metrics pmet ON p.pid = pmet.pid
        LEFT JOIN orders o ON p.pid = o.pid
          AND o.date BETWEEN ? AND ?
          AND o.canceled = false
        WHERE p.brand = ?
          AND pmet.first_received_date BETWEEN ? AND ?
        GROUP BY p.pid, p.title, p.SKU, p.stock_quantity, pc.cat_id, pmet.first_received_date
      )
      SELECT
        cm.*,
        JSON_ARRAYAGG(
          JSON_OBJECT(
            'pid', pd.pid,
            'title', pd.title,
            'SKU', pd.SKU,
            'stock_quantity', pd.stock_quantity,
            'total_sold', pd.total_sold,
            'avg_price', pd.avg_price,
            'first_received_date', DATE_FORMAT(pd.first_received_date, '%Y-%m-%d')
          )
        ) as products
      FROM category_metrics cm
      JOIN product_details pd ON cm.cat_id = pd.cat_id
      GROUP BY cm.cat_id, cm.category_name, cm.path, cm.brand, cm.num_products, cm.avg_daily_sales, cm.total_sold, cm.avgTotalSold, cm.avg_price
      ORDER BY cm.total_sold DESC
    `, [endDate, startDate, startDate, endDate, brand, startDate, endDate, startDate, endDate, brand, startDate, endDate]);

    res.json(results);
  } catch (error) {
    console.error('Error fetching forecast data:', error);
    res.status(500).json({ error: 'Failed to fetch forecast data' });
  }
});

module.exports = router;
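A note on the `JSON_ARRAYAGG(...) as products` column in the final SELECT above: depending on the MySQL driver and its configuration, a JSON aggregate may arrive in Node as an already-parsed array or as a raw JSON string. The sketch below (assumptions: `normalizeProducts` is a hypothetical helper, not part of this diff) normalizes either form before the rows are sent to the client:

```javascript
// Sketch only: JSON_ARRAYAGG output may arrive as a string or as a parsed
// array depending on the driver; normalize it defensively.
function normalizeProducts(row) {
  const p = row.products;
  if (p == null) return [];               // no products aggregated
  return typeof p === 'string' ? JSON.parse(p) : p;
}

console.log(normalizeProducts({ products: '[{"pid":1}]' }));
```

If the driver is known to parse JSON columns, the string branch is simply never taken, so the helper is harmless either way.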

@@ -5,62 +5,90 @@ const router = express.Router();
router.get('/', async (req, res) => {
  const pool = req.app.locals.pool;
  try {
    // Get parent categories for filter dropdown
    const [parentCategories] = await pool.query(`
      SELECT DISTINCT c2.name as parent_name
      FROM categories c1
      JOIN categories c2 ON c1.parent_id = c2.id
      WHERE c1.parent_id IS NOT NULL
      ORDER BY c2.name
    `);

    // Get all categories with metrics
    // Get all categories with metrics and hierarchy info
    const [categories] = await pool.query(`
      SELECT
        c.id as category_id,
        c.cat_id,
        c.name,
        c.type,
        c.parent_id,
        c.description,
        COALESCE(p.name, '') as parent_name,
        cm.product_count,
        cm.total_value,
        cm.avg_margin,
        cm.turnover_rate,
        cm.growth_rate,
        cm.status
        c.status,
        p.name as parent_name,
        p.type as parent_type,
        COALESCE(cm.product_count, 0) as product_count,
        COALESCE(cm.active_products, 0) as active_products,
        CAST(COALESCE(cm.total_value, 0) AS DECIMAL(15,3)) as total_value,
        COALESCE(cm.avg_margin, 0) as avg_margin,
        COALESCE(cm.turnover_rate, 0) as turnover_rate,
        COALESCE(cm.growth_rate, 0) as growth_rate
      FROM categories c
      LEFT JOIN categories p ON c.parent_id = p.id
      LEFT JOIN category_metrics cm ON c.id = cm.category_id
      ORDER BY c.name ASC
      LEFT JOIN categories p ON c.parent_id = p.cat_id
      LEFT JOIN category_metrics cm ON c.cat_id = cm.category_id
      ORDER BY
        CASE
          WHEN c.type = 10 THEN 1 -- sections first
          WHEN c.type = 11 THEN 2 -- categories second
          WHEN c.type = 12 THEN 3 -- subcategories third
          WHEN c.type = 13 THEN 4 -- subsubcategories fourth
          WHEN c.type = 20 THEN 5 -- themes fifth
          WHEN c.type = 21 THEN 6 -- subthemes last
          ELSE 7
        END,
        c.name ASC
    `);

    // Get overall stats
    const [stats] = await pool.query(`
      SELECT
        COUNT(DISTINCT c.id) as totalCategories,
        COUNT(DISTINCT CASE WHEN cm.status = 'active' THEN c.id END) as activeCategories,
        COALESCE(SUM(cm.total_value), 0) as totalValue,
        COUNT(DISTINCT c.cat_id) as totalCategories,
        COUNT(DISTINCT CASE WHEN c.status = 'active' THEN c.cat_id END) as activeCategories,
        CAST(COALESCE(SUM(cm.total_value), 0) AS DECIMAL(15,3)) as totalValue,
        COALESCE(ROUND(AVG(NULLIF(cm.avg_margin, 0)), 1), 0) as avgMargin,
        COALESCE(ROUND(AVG(NULLIF(cm.growth_rate, 0)), 1), 0) as avgGrowth
      FROM categories c
      LEFT JOIN category_metrics cm ON c.id = cm.category_id
      LEFT JOIN category_metrics cm ON c.cat_id = cm.category_id
    `);

    // Get type counts for filtering
    const [typeCounts] = await pool.query(`
      SELECT
        type,
        COUNT(*) as count
      FROM categories
      GROUP BY type
      ORDER BY type
    `);

    res.json({
      categories: categories.map(cat => ({
        ...cat,
        parent_category: cat.parent_name, // Map parent_name to parent_category for frontend compatibility
        product_count: parseInt(cat.product_count || 0),
        total_value: parseFloat(cat.total_value || 0),
        avg_margin: parseFloat(cat.avg_margin || 0),
        turnover_rate: parseFloat(cat.turnover_rate || 0),
        growth_rate: parseFloat(cat.growth_rate || 0)
        cat_id: cat.cat_id,
        name: cat.name,
        type: cat.type,
        parent_id: cat.parent_id,
        parent_name: cat.parent_name,
        parent_type: cat.parent_type,
        description: cat.description,
        status: cat.status,
        metrics: {
          product_count: parseInt(cat.product_count),
          active_products: parseInt(cat.active_products),
          total_value: parseFloat(cat.total_value),
          avg_margin: parseFloat(cat.avg_margin),
          turnover_rate: parseFloat(cat.turnover_rate),
          growth_rate: parseFloat(cat.growth_rate)
        }
      })),
      typeCounts: typeCounts.map(tc => ({
        type: tc.type,
        count: parseInt(tc.count)
      })),
      parentCategories: parentCategories.map(p => p.parent_name),
      stats: {
        ...stats[0],
        totalValue: parseFloat(stats[0].totalValue || 0),
        avgMargin: parseFloat(stats[0].avgMargin || 0),
        avgGrowth: parseFloat(stats[0].avgGrowth || 0)
        totalCategories: parseInt(stats[0].totalCategories),
        activeCategories: parseInt(stats[0].activeCategories),
        totalValue: parseFloat(stats[0].totalValue),
        avgMargin: parseFloat(stats[0].avgMargin),
        avgGrowth: parseFloat(stats[0].avgGrowth)
      }
    });
  } catch (error) {

@@ -2,6 +2,7 @@ const express = require('express');
const router = express.Router();
const { spawn } = require('child_process');
const path = require('path');
const db = require('../utils/db');

// Debug middleware MUST be first
router.use((req, res, next) => {
@@ -9,9 +10,11 @@ router.use((req, res, next) => {
  next();
});

// Store active import process and its progress
// Store active processes and their progress
let activeImport = null;
let importProgress = null;
let activeFullUpdate = null;
let activeFullReset = null;

// SSE clients for progress updates
const updateClients = new Set();
@@ -19,17 +22,16 @@ const importClients = new Set();
const resetClients = new Set();
const resetMetricsClients = new Set();
const calculateMetricsClients = new Set();
const fullUpdateClients = new Set();
const fullResetClients = new Set();

// Helper to send progress to specific clients
function sendProgressToClients(clients, progress) {
  const data = typeof progress === 'string' ? { progress } : progress;

  // Ensure we have a status field
  if (!data.status) {
    data.status = 'running';
  }

  const message = `data: ${JSON.stringify(data)}\n\n`;
function sendProgressToClients(clients, data) {
  // If data is a string, send it directly
  // If it's an object, convert it to JSON
  const message = typeof data === 'string'
    ? `data: ${data}\n\n`
    : `data: ${JSON.stringify(data)}\n\n`;

  clients.forEach(client => {
    try {
@@ -45,115 +47,149 @@ function sendProgressToClients(clients, progress) {
  });
}
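The rewritten helper above builds one Server-Sent-Events frame per message: strings pass through untouched, objects are JSON-encoded, and every frame ends with the blank line the SSE protocol requires. A minimal standalone sketch of that framing logic (the function name `toFrame` is mine, chosen for illustration):

```javascript
// Sketch of the SSE framing used by sendProgressToClients above:
// a frame is "data: <payload>" terminated by a blank line.
function toFrame(data) {
  return typeof data === 'string'
    ? `data: ${data}\n\n`
    : `data: ${JSON.stringify(data)}\n\n`;
}

console.log(toFrame({ status: 'running' }));
```

Because raw script output lines are forwarded as strings, this framing is what lets the route relay both JSON progress objects and plain log lines over the same stream.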

// Helper to run a script and stream progress
function runScript(scriptPath, type, clients) {
  return new Promise((resolve, reject) => {
    // Kill any existing process of this type
    let activeProcess;
    switch (type) {
      case 'update':
        if (activeFullUpdate) {
          try { activeFullUpdate.kill(); } catch (e) { }
        }
        activeProcess = activeFullUpdate;
        break;
      case 'reset':
        if (activeFullReset) {
          try { activeFullReset.kill(); } catch (e) { }
        }
        activeProcess = activeFullReset;
        break;
    }

    const child = spawn('node', [scriptPath], {
      stdio: ['inherit', 'pipe', 'pipe']
    });

    switch (type) {
      case 'update':
        activeFullUpdate = child;
        break;
      case 'reset':
        activeFullReset = child;
        break;
    }

    let output = '';

    child.stdout.on('data', (data) => {
      const text = data.toString();
      output += text;

      // Split by lines to handle multiple JSON outputs
      const lines = text.split('\n');
      lines.filter(line => line.trim()).forEach(line => {
        try {
          // Try to parse as JSON but don't let it affect the display
          const jsonData = JSON.parse(line);
          // Only end the process if we get a final status
          if (jsonData.status === 'complete' || jsonData.status === 'error' || jsonData.status === 'cancelled') {
            if (jsonData.status === 'complete' && !jsonData.operation?.includes('complete')) {
              // Don't close for intermediate completion messages
              sendProgressToClients(clients, line);
              return;
            }
            // Close only on final completion/error/cancellation
            switch (type) {
              case 'update':
                activeFullUpdate = null;
                break;
              case 'reset':
                activeFullReset = null;
                break;
            }
            if (jsonData.status === 'error') {
              reject(new Error(jsonData.error || 'Unknown error'));
            } else {
              resolve({ output });
            }
          }
        } catch (e) {
          // Not JSON, just display as is
        }
        // Always send the raw line
        sendProgressToClients(clients, line);
      });
    });

    child.stderr.on('data', (data) => {
      const text = data.toString();
      console.error(text);
      // Send stderr output directly too
      sendProgressToClients(clients, text);
    });

    child.on('close', (code) => {
      switch (type) {
        case 'update':
          activeFullUpdate = null;
          break;
        case 'reset':
          activeFullReset = null;
          break;
      }

      if (code !== 0) {
        const error = `Script ${scriptPath} exited with code ${code}`;
        sendProgressToClients(clients, error);
        reject(new Error(error));
      }
      // Don't resolve here - let the completion message from the script trigger the resolve
    });

    child.on('error', (err) => {
      switch (type) {
        case 'update':
          activeFullUpdate = null;
          break;
        case 'reset':
          activeFullReset = null;
          break;
      }
      sendProgressToClients(clients, err.message);
      reject(err);
    });
  });
}

// Progress endpoints
router.get('/update/progress', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'Access-Control-Allow-Origin': req.headers.origin || '*',
    'Access-Control-Allow-Credentials': 'true'
  });

  // Send an initial message to test the connection
  res.write('data: {"status":"running","operation":"Initializing connection..."}\n\n');

  // Add this client to the update set
  updateClients.add(res);

  // Remove client when connection closes
  req.on('close', () => {
    updateClients.delete(res);
  });
});

router.get('/import/progress', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'Access-Control-Allow-Origin': req.headers.origin || '*',
    'Access-Control-Allow-Credentials': 'true'
  });

  // Send an initial message to test the connection
  res.write('data: {"status":"running","operation":"Initializing connection..."}\n\n');

  // Add this client to the import set
  importClients.add(res);

  // Remove client when connection closes
  req.on('close', () => {
    importClients.delete(res);
  });
});

router.get('/reset/progress', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'Access-Control-Allow-Origin': req.headers.origin || '*',
    'Access-Control-Allow-Credentials': 'true'
  });

  // Send an initial message to test the connection
  res.write('data: {"status":"running","operation":"Initializing connection..."}\n\n');

  // Add this client to the reset set
  resetClients.add(res);

  // Remove client when connection closes
  req.on('close', () => {
    resetClients.delete(res);
  });
});

// Add reset-metrics progress endpoint
router.get('/reset-metrics/progress', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'Access-Control-Allow-Origin': req.headers.origin || '*',
    'Access-Control-Allow-Credentials': 'true'
  });

  // Send an initial message to test the connection
  res.write('data: {"status":"running","operation":"Initializing connection..."}\n\n');

  // Add this client to the reset-metrics set
  resetMetricsClients.add(res);

  // Remove client when connection closes
  req.on('close', () => {
    resetMetricsClients.delete(res);
  });
});

// Add calculate-metrics progress endpoint
router.get('/calculate-metrics/progress', (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'Access-Control-Allow-Origin': req.headers.origin || '*',
    'Access-Control-Allow-Credentials': 'true'
  });

  // Send current progress if it exists
  if (importProgress) {
    res.write(`data: ${JSON.stringify(importProgress)}\n\n`);
  } else {
    res.write('data: {"status":"running","operation":"Initializing connection..."}\n\n');
router.get('/:type/progress', (req, res) => {
  const { type } = req.params;
  if (!['update', 'reset'].includes(type)) {
    return res.status(400).json({ error: 'Invalid operation type' });
  }

  // Add this client to the calculate-metrics set
  calculateMetricsClients.add(res);
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'Access-Control-Allow-Origin': req.headers.origin || '*',
    'Access-Control-Allow-Credentials': 'true'
  });

  // Remove client when connection closes
  // Add this client to the correct set
  const clients = type === 'update' ? fullUpdateClients : fullResetClients;
  clients.add(res);

  // Send initial connection message
  sendProgressToClients(new Set([res]), JSON.stringify({
    status: 'running',
    operation: 'Initializing connection...'
  }));

  // Handle client disconnect
  req.on('close', () => {
    calculateMetricsClients.delete(res);
    clients.delete(res);
  });
});

@@ -174,7 +210,6 @@ router.get('/status', (req, res) => {

// Add calculate-metrics status endpoint
router.get('/calculate-metrics/status', (req, res) => {
  console.log('Calculate metrics status endpoint hit');
  const calculateMetrics = require('../../scripts/calculate-metrics');
  const progress = calculateMetrics.getProgress();

@@ -371,49 +406,35 @@ router.post('/import', async (req, res) => {

// Route to cancel active process
router.post('/cancel', (req, res) => {
  if (!activeImport) {
    return res.status(404).json({ error: 'No active process to cancel' });
  let killed = false;

  // Get the operation type from the request
  const { type } = req.query;
  const clients = type === 'update' ? fullUpdateClients : fullResetClients;
  const activeProcess = type === 'update' ? activeFullUpdate : activeFullReset;

  if (activeProcess) {
    try {
      activeProcess.kill('SIGTERM');
      if (type === 'update') {
        activeFullUpdate = null;
      } else {
        activeFullReset = null;
      }
      killed = true;
      sendProgressToClients(clients, JSON.stringify({
        status: 'cancelled',
        operation: 'Operation cancelled'
      }));
    } catch (err) {
      console.error(`Error killing ${type} process:`, err);
    }
  }

  try {
    // If it's the prod import module, call its cancel function
    if (typeof activeImport.cancelImport === 'function') {
      activeImport.cancelImport();
    } else {
      // Otherwise it's a child process
      activeImport.kill('SIGTERM');
    }

    // Get the operation type from the request
    const { operation } = req.query;

    // Send cancel message only to the appropriate client set
    const cancelMessage = {
      status: 'cancelled',
      operation: 'Operation cancelled'
    };

    switch (operation) {
      case 'update':
        sendProgressToClients(updateClients, cancelMessage);
        break;
      case 'import':
        sendProgressToClients(importClients, cancelMessage);
        break;
      case 'reset':
        sendProgressToClients(resetClients, cancelMessage);
        break;
      case 'calculate-metrics':
        sendProgressToClients(calculateMetricsClients, cancelMessage);
        break;
    }

  if (killed) {
    res.json({ success: true });
  } catch (error) {
    // Even if there's an error, try to clean up
    activeImport = null;
    importProgress = null;
    res.status(500).json({ error: 'Failed to cancel process' });
  } else {
    res.status(404).json({ error: 'No active process to cancel' });
  }
});

@@ -552,20 +573,6 @@ router.post('/reset-metrics', async (req, res) => {
  }
});

// Add calculate-metrics status endpoint
router.get('/calculate-metrics/status', (req, res) => {
  const calculateMetrics = require('../../scripts/calculate-metrics');
  const progress = calculateMetrics.getProgress();

  // Only consider it active if both the process is running and we have progress
  const isActive = !!activeImport && !!progress;

  res.json({
    active: isActive,
    progress: isActive ? progress : null
  });
});

// Add calculate-metrics endpoint
router.post('/calculate-metrics', async (req, res) => {
  if (activeImport) {
@@ -711,4 +718,96 @@ router.post('/import-from-prod', async (req, res) => {
  }
});

// POST /csv/full-update - Run full update script
router.post('/full-update', async (req, res) => {
  try {
    const scriptPath = path.join(__dirname, '../../scripts/full-update.js');
    runScript(scriptPath, 'update', fullUpdateClients)
      .catch(error => {
        console.error('Update failed:', error);
      });
    res.status(202).json({ message: 'Update started' });
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

// POST /csv/full-reset - Run full reset script
router.post('/full-reset', async (req, res) => {
  try {
    const scriptPath = path.join(__dirname, '../../scripts/full-reset.js');
    runScript(scriptPath, 'reset', fullResetClients)
      .catch(error => {
        console.error('Reset failed:', error);
      });
    res.status(202).json({ message: 'Reset started' });
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

// GET /history/import - Get recent import history
router.get('/history/import', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const [rows] = await pool.query(`
      SELECT * FROM import_history
      ORDER BY start_time DESC
      LIMIT 20
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching import history:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /history/calculate - Get recent calculation history
router.get('/history/calculate', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const [rows] = await pool.query(`
      SELECT * FROM calculate_history
      ORDER BY start_time DESC
      LIMIT 20
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching calculate history:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /status/modules - Get module calculation status
router.get('/status/modules', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const [rows] = await pool.query(`
      SELECT module_name, last_calculation_timestamp
      FROM calculate_status
      ORDER BY module_name
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching module status:', error);
    res.status(500).json({ error: error.message });
  }
});

// GET /status/tables - Get table sync status
router.get('/status/tables', async (req, res) => {
  try {
    const pool = req.app.locals.pool;
    const [rows] = await pool.query(`
      SELECT table_name, last_sync_timestamp
      FROM sync_status
      ORDER BY table_name
    `);
    res.json(rows || []);
  } catch (error) {
    console.error('Error fetching table status:', error);
    res.status(500).json({ error: error.message });
  }
});

module.exports = router;
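Both the `/:type/progress` and `/cancel` handlers in this file dispatch on an operation type to pick the matching SSE client set and reject anything else with a 400. That shared dispatch can be factored as below (a sketch only: `clientsFor` is a hypothetical helper I introduce for illustration, and the `Set`s are stand-ins for the module-level ones in the diff):

```javascript
// Stand-ins for the module-level client sets in the route file.
const fullUpdateClients = new Set();
const fullResetClients = new Set();

// Sketch of the type → client-set dispatch used by /:type/progress
// and /cancel: only 'update' and 'reset' are valid operation types.
function clientsFor(type) {
  if (!['update', 'reset'].includes(type)) {
    throw new Error('Invalid operation type');
  }
  return type === 'update' ? fullUpdateClients : fullResetClients;
}
```

Centralizing the dispatch would keep the valid-type list in one place if more scripted operations are added later.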

@@ -2,6 +2,9 @@ const express = require('express');
const router = express.Router();
const db = require('../utils/db');

// Import status codes
const { ReceivingStatus } = require('../types/status-codes');

// Helper function to execute queries using the connection pool
async function executeQuery(sql, params = []) {
  const pool = db.getPool();
@@ -38,15 +41,14 @@ router.get('/stock/metrics', async (req, res) => {
    const [brandValues] = await executeQuery(`
      WITH brand_totals AS (
        SELECT
          brand,
          COUNT(DISTINCT product_id) as variant_count,
          COALESCE(brand, 'Unbranded') as brand,
          COUNT(DISTINCT pid) as variant_count,
          COALESCE(SUM(stock_quantity), 0) as stock_units,
          COALESCE(SUM(stock_quantity * cost_price), 0) as stock_cost,
          COALESCE(SUM(stock_quantity * price), 0) as stock_retail
          CAST(COALESCE(SUM(stock_quantity * cost_price), 0) AS DECIMAL(15,3)) as stock_cost,
          CAST(COALESCE(SUM(stock_quantity * price), 0) AS DECIMAL(15,3)) as stock_retail
        FROM products
        WHERE brand IS NOT NULL
          AND stock_quantity > 0
        GROUP BY brand
        WHERE stock_quantity > 0
        GROUP BY COALESCE(brand, 'Unbranded')
        HAVING stock_cost > 0
      ),
      other_brands AS (
@@ -54,8 +56,8 @@ router.get('/stock/metrics', async (req, res) => {
          'Other' as brand,
          SUM(variant_count) as variant_count,
          SUM(stock_units) as stock_units,
          SUM(stock_cost) as stock_cost,
          SUM(stock_retail) as stock_retail
          CAST(SUM(stock_cost) AS DECIMAL(15,3)) as stock_cost,
          CAST(SUM(stock_retail) AS DECIMAL(15,3)) as stock_retail
        FROM brand_totals
        WHERE stock_cost <= 5000
      ),
@@ -101,49 +103,51 @@ router.get('/purchase/metrics', async (req, res) => {
  try {
    const [rows] = await executeQuery(`
      SELECT
        COALESCE(COUNT(DISTINCT CASE WHEN po.status = 'open' THEN po.po_id END), 0) as active_pos,
        COALESCE(COUNT(DISTINCT CASE
          WHEN po.status = 'open' AND po.expected_date < CURDATE()
          WHEN po.receiving_status < ${ReceivingStatus.PartialReceived}
          THEN po.po_id
        END), 0) as active_pos,
        COALESCE(COUNT(DISTINCT CASE
          WHEN po.receiving_status < ${ReceivingStatus.PartialReceived}
            AND po.expected_date < CURDATE()
          THEN po.po_id
        END), 0) as overdue_pos,
        COALESCE(SUM(CASE WHEN po.status = 'open' THEN po.ordered ELSE 0 END), 0) as total_units,
        COALESCE(SUM(CASE
          WHEN po.status = 'open'
          WHEN po.receiving_status < ${ReceivingStatus.PartialReceived}
          THEN po.ordered
          ELSE 0
        END), 0) as total_units,
        CAST(COALESCE(SUM(CASE
          WHEN po.receiving_status < ${ReceivingStatus.PartialReceived}
          THEN po.ordered * po.cost_price
          ELSE 0
        END), 0) as total_cost,
        COALESCE(SUM(CASE
          WHEN po.status = 'open'
        END), 0) AS DECIMAL(15,3)) as total_cost,
        CAST(COALESCE(SUM(CASE
          WHEN po.receiving_status < ${ReceivingStatus.PartialReceived}
          THEN po.ordered * p.price
          ELSE 0
        END), 0) as total_retail
        END), 0) AS DECIMAL(15,3)) as total_retail
      FROM purchase_orders po
      JOIN products p ON po.product_id = p.product_id
      JOIN products p ON po.pid = p.pid
    `);
    const poMetrics = rows[0];

    console.log('Raw poMetrics from database:', poMetrics);
    console.log('poMetrics.active_pos:', poMetrics.active_pos);
    console.log('poMetrics.overdue_pos:', poMetrics.overdue_pos);
    console.log('poMetrics.total_units:', poMetrics.total_units);
    console.log('poMetrics.total_cost:', poMetrics.total_cost);
    console.log('poMetrics.total_retail:', poMetrics.total_retail);

    const [vendorOrders] = await executeQuery(`
      SELECT
        po.vendor,
        COUNT(DISTINCT po.po_id) as order_count,
        COALESCE(SUM(po.ordered), 0) as ordered_units,
        COALESCE(SUM(po.ordered * po.cost_price), 0) as order_cost,
        COALESCE(SUM(po.ordered * p.price), 0) as order_retail
        COUNT(DISTINCT po.po_id) as orders,
        COALESCE(SUM(po.ordered), 0) as units,
        CAST(COALESCE(SUM(po.ordered * po.cost_price), 0) AS DECIMAL(15,3)) as cost,
        CAST(COALESCE(SUM(po.ordered * p.price), 0) AS DECIMAL(15,3)) as retail
      FROM purchase_orders po
      JOIN products p ON po.product_id = p.product_id
      WHERE po.status = 'open'
      JOIN products p ON po.pid = p.pid
      WHERE po.receiving_status < ${ReceivingStatus.PartialReceived}
      GROUP BY po.vendor
      HAVING order_cost > 0
      ORDER BY order_cost DESC
      HAVING cost > 0
      ORDER BY cost DESC
    `);

    // Format response to match PurchaseMetricsData interface
    const response = {
      activePurchaseOrders: parseInt(poMetrics.active_pos) || 0,
      overduePurchaseOrders: parseInt(poMetrics.overdue_pos) || 0,
@@ -152,10 +156,10 @@ router.get('/purchase/metrics', async (req, res) => {
      onOrderRetail: parseFloat(poMetrics.total_retail) || 0,
      vendorOrders: vendorOrders.map(v => ({
        vendor: v.vendor,
        orders: parseInt(v.order_count) || 0,
        units: parseInt(v.ordered_units) || 0,
        cost: parseFloat(v.order_cost) || 0,
        retail: parseFloat(v.order_retail) || 0
        orders: parseInt(v.orders) || 0,
        units: parseInt(v.units) || 0,
        cost: parseFloat(v.cost) || 0,
        retail: parseFloat(v.retail) || 0
      }))
    };
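A note on why this diff pairs `CAST(... AS DECIMAL(15,3))` in SQL with `parseFloat(...) || 0` in JavaScript: DECIMAL columns typically reach Node as strings (exact driver behavior depends on configuration), and the `|| 0` fallback also absorbs NULLs. A minimal sketch of that normalization, using made-up row values:

```javascript
// Sketch: DECIMAL values commonly arrive as strings; NULL parses to NaN.
// `parseFloat(x) || 0` yields a plain number in both cases.
const row = { total_cost: '1234.500', total_retail: null }; // illustrative values
const onOrderCost = parseFloat(row.total_cost) || 0;
const onOrderRetail = parseFloat(row.total_retail) || 0;
console.log(onOrderCost, onOrderRetail); // → 1234.5 0
```

The same pattern is applied per vendor row above, so the JSON response always carries numbers rather than decimal strings.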

@@ -173,21 +177,21 @@ router.get('/replenishment/metrics', async (req, res) => {
    // Get summary metrics
    const [metrics] = await executeQuery(`
      SELECT
        COUNT(DISTINCT p.product_id) as products_to_replenish,
        COUNT(DISTINCT p.pid) as products_to_replenish,
        COALESCE(SUM(CASE
          WHEN p.stock_quantity < 0 THEN ABS(p.stock_quantity) + pm.reorder_qty
          ELSE pm.reorder_qty
        END), 0) as total_units_needed,
        COALESCE(SUM(CASE
        CAST(COALESCE(SUM(CASE
          WHEN p.stock_quantity < 0 THEN (ABS(p.stock_quantity) + pm.reorder_qty) * p.cost_price
          ELSE pm.reorder_qty * p.cost_price
        END), 0) as total_cost,
        COALESCE(SUM(CASE
        END), 0) AS DECIMAL(15,3)) as total_cost,
        CAST(COALESCE(SUM(CASE
          WHEN p.stock_quantity < 0 THEN (ABS(p.stock_quantity) + pm.reorder_qty) * p.price
          ELSE pm.reorder_qty * p.price
        END), 0) as total_retail
        END), 0) AS DECIMAL(15,3)) as total_retail
      FROM products p
      JOIN product_metrics pm ON p.product_id = pm.product_id
      JOIN product_metrics pm ON p.pid = pm.pid
      WHERE p.replenishable = true
        AND (pm.stock_status IN ('Critical', 'Reorder')
          OR p.stock_quantity < 0)
@@ -197,24 +201,24 @@ router.get('/replenishment/metrics', async (req, res) => {
    // Get top variants to replenish
    const [variants] = await executeQuery(`
      SELECT
        p.product_id,
        p.pid,
        p.title,
        p.stock_quantity as current_stock,
        CASE
          WHEN p.stock_quantity < 0 THEN ABS(p.stock_quantity) + pm.reorder_qty
          ELSE pm.reorder_qty
        END as replenish_qty,
        CASE
        CAST(CASE
          WHEN p.stock_quantity < 0 THEN (ABS(p.stock_quantity) + pm.reorder_qty) * p.cost_price
          ELSE pm.reorder_qty * p.cost_price
        END as replenish_cost,
        CASE
        END AS DECIMAL(15,3)) as replenish_cost,
        CAST(CASE
          WHEN p.stock_quantity < 0 THEN (ABS(p.stock_quantity) + pm.reorder_qty) * p.price
          ELSE pm.reorder_qty * p.price
        END as replenish_retail,
        END AS DECIMAL(15,3)) as replenish_retail,
        pm.stock_status
      FROM products p
      JOIN product_metrics pm ON p.product_id = pm.product_id
      JOIN product_metrics pm ON p.pid = pm.pid
      WHERE p.replenishable = true
        AND (pm.stock_status IN ('Critical', 'Reorder')
          OR p.stock_quantity < 0)
@@ -235,7 +239,7 @@ router.get('/replenishment/metrics', async (req, res) => {
      replenishmentCost: parseFloat(metrics[0].total_cost) || 0,
      replenishmentRetail: parseFloat(metrics[0].total_retail) || 0,
      topVariants: variants.map(v => ({
        id: v.product_id,
        id: v.pid,
        title: v.title,
        currentStock: parseInt(v.current_stock) || 0,
        replenishQty: parseInt(v.replenish_qty) || 0,
|
||||
COALESCE(SUM(cf.forecast_revenue), 0) as revenue,
|
||||
COALESCE(AVG(cf.confidence_level), 0) as confidence
|
||||
FROM category_forecasts cf
|
||||
JOIN categories c ON cf.category_id = c.id
|
||||
JOIN categories c ON cf.category_id = c.cat_id
|
||||
WHERE cf.forecast_date BETWEEN ? AND ?
|
||||
GROUP BY c.id, c.name
|
||||
GROUP BY c.cat_id, c.name
|
||||
ORDER BY revenue DESC
|
||||
`, [startDate, endDate]);
|
||||
|
||||
@@ -325,11 +329,11 @@ router.get('/overstock/metrics', async (req, res) => {
const [rows] = await executeQuery(`
WITH category_overstock AS (
SELECT
c.id as category_id,
c.cat_id,
c.name as category_name,
COUNT(DISTINCT CASE
WHEN pm.stock_status = 'Overstocked'
THEN p.product_id
THEN p.pid
END) as overstocked_products,
SUM(CASE
WHEN pm.stock_status = 'Overstocked'
@@ -347,10 +351,10 @@ router.get('/overstock/metrics', async (req, res) => {
ELSE 0
END) as total_excess_retail
FROM categories c
JOIN product_categories pc ON c.id = pc.category_id
JOIN products p ON pc.product_id = p.product_id
JOIN product_metrics pm ON p.product_id = pm.product_id
GROUP BY c.id, c.name
JOIN product_categories pc ON c.cat_id = pc.cat_id
JOIN products p ON pc.pid = p.pid
JOIN product_metrics pm ON p.pid = pm.pid
GROUP BY c.cat_id, c.name
)
SELECT
SUM(overstocked_products) as total_overstocked,
@@ -405,7 +409,7 @@ router.get('/overstock/products', async (req, res) => {
try {
const [rows] = await executeQuery(`
SELECT
p.product_id,
p.pid,
p.SKU,
p.title,
p.brand,
@@ -420,11 +424,11 @@ router.get('/overstock/products', async (req, res) => {
(pm.overstocked_amt * p.price) as excess_retail,
GROUP_CONCAT(c.name) as categories
FROM products p
JOIN product_metrics pm ON p.product_id = pm.product_id
LEFT JOIN product_categories pc ON p.product_id = pc.product_id
LEFT JOIN categories c ON pc.category_id = c.id
JOIN product_metrics pm ON p.pid = pm.pid
LEFT JOIN product_categories pc ON p.pid = pc.pid
LEFT JOIN categories c ON pc.cat_id = c.cat_id
WHERE pm.stock_status = 'Overstocked'
GROUP BY p.product_id
GROUP BY p.pid
ORDER BY excess_cost DESC
LIMIT ?
`, [limit]);
@@ -439,196 +443,116 @@ router.get('/overstock/products', async (req, res) => {
// Returns best-selling products, vendors, and categories
router.get('/best-sellers', async (req, res) => {
try {
const [products] = await executeQuery(`
WITH product_sales AS (
SELECT
p.product_id,
p.SKU as sku,
p.title,
-- Current period (last 30 days)
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.quantity
ELSE 0
END) as units_sold,
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END) as revenue,
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN (o.price - p.cost_price) * o.quantity
ELSE 0
END) as profit,
-- Previous period (30-60 days ago)
SUM(CASE
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 60 DAY) AND DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END) as previous_revenue
FROM products p
JOIN orders o ON p.product_id = o.product_id
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 60 DAY)
GROUP BY p.product_id, p.SKU, p.title
)
SELECT
product_id,
sku,
title,
units_sold,
revenue,
profit,
CASE
WHEN previous_revenue > 0
THEN ((revenue - previous_revenue) / previous_revenue * 100)
WHEN revenue > 0
THEN 100
ELSE 0
END as growth_rate
FROM product_sales
WHERE units_sold > 0
ORDER BY revenue DESC
LIMIT 50
`);
const pool = req.app.locals.pool;

const [brands] = await executeQuery(`
WITH brand_sales AS (
// Common CTE for category paths
const categoryPathCTE = `
WITH RECURSIVE category_path AS (
SELECT
p.brand,
-- Current period (last 30 days)
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.quantity
ELSE 0
END) as units_sold,
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END) as revenue,
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN (o.price - p.cost_price) * o.quantity
ELSE 0
END) as profit,
-- Previous period (30-60 days ago)
SUM(CASE
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 60 DAY) AND DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END) as previous_revenue
FROM products p
JOIN orders o ON p.product_id = o.product_id
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 60 DAY)
AND p.brand IS NOT NULL
GROUP BY p.brand
)
SELECT
brand,
units_sold,
revenue,
profit,
CASE
WHEN previous_revenue > 0
THEN ((revenue - previous_revenue) / previous_revenue * 100)
WHEN revenue > 0
THEN 100
ELSE 0
END as growth_rate
FROM brand_sales
WHERE units_sold > 0
ORDER BY revenue DESC
LIMIT 50
`);

const [categories] = await executeQuery(`
WITH category_sales AS (
SELECT
c.id as category_id,
c.cat_id,
c.name,
-- Current period (last 30 days)
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.quantity
ELSE 0
END) as units_sold,
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END) as revenue,
SUM(CASE
WHEN o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN (o.price - p.cost_price) * o.quantity
ELSE 0
END) as profit,
-- Previous period (30-60 days ago)
SUM(CASE
WHEN o.date BETWEEN DATE_SUB(CURRENT_DATE, INTERVAL 60 DAY) AND DATE_SUB(CURRENT_DATE, INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END) as previous_revenue
c.parent_id,
CAST(c.name AS CHAR(1000)) as path
FROM categories c
JOIN product_categories pc ON c.id = pc.category_id
JOIN products p ON pc.product_id = p.product_id
JOIN orders o ON p.product_id = o.product_id
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURRENT_DATE, INTERVAL 60 DAY)
GROUP BY c.id, c.name
WHERE c.parent_id IS NULL

UNION ALL

SELECT
c.cat_id,
c.name,
c.parent_id,
CONCAT(cp.path, ' > ', c.name)
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
)
`;

// Get best selling products
const [products] = await pool.query(`
SELECT
category_id,
name,
units_sold,
revenue,
profit,
CASE
WHEN previous_revenue > 0
THEN ((revenue - previous_revenue) / previous_revenue * 100)
WHEN revenue > 0
THEN 100
ELSE 0
END as growth_rate
FROM category_sales
WHERE units_sold > 0
ORDER BY revenue DESC
LIMIT 50
p.pid,
p.SKU as sku,
p.title,
SUM(o.quantity) as units_sold,
CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) as revenue,
CAST(SUM(o.price * o.quantity - p.cost_price * o.quantity) AS DECIMAL(15,3)) as profit
FROM products p
JOIN orders o ON p.pid = o.pid
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
AND o.canceled = false
GROUP BY p.pid
ORDER BY units_sold DESC
LIMIT 10
`);

// Format response with explicit type conversion
const formattedProducts = products.map(p => ({
...p,
units_sold: parseInt(p.units_sold) || 0,
revenue: parseFloat(p.revenue) || 0,
profit: parseFloat(p.profit) || 0,
growth_rate: parseFloat(p.growth_rate) || 0
}));
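The removed `growth_rate` CASE above divides current-period revenue by previous-period revenue only when the previous period is non-zero, reporting 100% for products that only just started selling. A minimal sketch of that same branching in JavaScript (the helper name is ours, not in the repo):

```javascript
// Percent revenue growth between the previous 30-day window and the
// current one, mirroring the SQL CASE expression in the diff above.
function growthRate(revenue, previousRevenue) {
  if (previousRevenue > 0) {
    return ((revenue - previousRevenue) / previousRevenue) * 100;
  }
  // A product with revenue now but none in the prior window is reported
  // as 100% growth instead of triggering a divide-by-zero.
  return revenue > 0 ? 100 : 0;
}

console.log(growthRate(150, 100)); // 50
console.log(growthRate(80, 0));    // 100
```

The same three-way branch appears in the product, brand, and category queries of the old best-sellers endpoint.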
// Get best selling brands
const [brands] = await pool.query(`
SELECT
p.brand,
SUM(o.quantity) as units_sold,
CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) as revenue,
CAST(SUM(o.price * o.quantity - p.cost_price * o.quantity) AS DECIMAL(15,3)) as profit,
ROUND(
((SUM(CASE
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END) /
NULLIF(SUM(CASE
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
AND o.date < DATE_SUB(CURDATE(), INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END), 0)) - 1) * 100,
1
) as growth_rate
FROM products p
JOIN orders o ON p.pid = o.pid
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
AND o.canceled = false
GROUP BY p.brand
ORDER BY units_sold DESC
LIMIT 10
`);

const formattedBrands = brands.map(b => ({
brand: b.brand,
units_sold: parseInt(b.units_sold) || 0,
revenue: parseFloat(b.revenue) || 0,
profit: parseFloat(b.profit) || 0,
growth_rate: parseFloat(b.growth_rate) || 0
}));
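The `CAST(... AS DECIMAL(15,3))` plus `parseFloat` pairing above exists because the MySQL driver returns `DECIMAL` columns as strings to preserve precision, and `NULLIF(..., 0)` can make `growth_rate` come back `NULL`. A sketch of the normalization over a hand-built row (the helper name and sample values are ours):

```javascript
// Normalize a raw result row: DECIMAL columns arrive as strings, and a
// NULL growth_rate (no prior-period sales) collapses to 0 via "|| 0".
function normalizeBrandRow(b) {
  return {
    brand: b.brand,
    units_sold: parseInt(b.units_sold) || 0,
    revenue: parseFloat(b.revenue) || 0,
    profit: parseFloat(b.profit) || 0,
    growth_rate: parseFloat(b.growth_rate) || 0,
  };
}

const row = { brand: 'Acme', units_sold: '12', revenue: '199.990', profit: '55.000', growth_rate: null };
console.log(normalizeBrandRow(row));
```

The `|| 0` guard also silently maps a legitimate `0` string to `0`, which is harmless here but worth knowing about.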
// Get best selling categories with full path
const [categories] = await pool.query(`
${categoryPathCTE}
SELECT
c.cat_id,
c.name,
cp.path as categoryPath,
SUM(o.quantity) as units_sold,
CAST(SUM(o.price * o.quantity) AS DECIMAL(15,3)) as revenue,
CAST(SUM(o.price * o.quantity - p.cost_price * o.quantity) AS DECIMAL(15,3)) as profit,
ROUND(
((SUM(CASE
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END) /
NULLIF(SUM(CASE
WHEN o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
AND o.date < DATE_SUB(CURDATE(), INTERVAL 30 DAY)
THEN o.price * o.quantity
ELSE 0
END), 0)) - 1) * 100,
1
) as growth_rate
FROM products p
JOIN orders o ON p.pid = o.pid
JOIN product_categories pc ON p.pid = pc.pid
JOIN categories c ON pc.cat_id = c.cat_id
JOIN category_path cp ON c.cat_id = cp.cat_id
WHERE o.date >= DATE_SUB(CURDATE(), INTERVAL 60 DAY)
AND o.canceled = false
GROUP BY c.cat_id, c.name, cp.path
ORDER BY units_sold DESC
LIMIT 10
`);

const formattedCategories = categories.map(c => ({
category_id: c.category_id,
name: c.name,
units_sold: parseInt(c.units_sold) || 0,
revenue: parseFloat(c.revenue) || 0,
profit: parseFloat(c.profit) || 0,
growth_rate: parseFloat(c.growth_rate) || 0
}));

res.json({
products: formattedProducts,
brands: formattedBrands,
categories: formattedCategories
});
res.json({ products, brands, categories });
} catch (err) {
console.error('Error fetching best sellers:', err);
res.status(500).json({ error: 'Failed to fetch best sellers' });
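The `category_path` recursive CTE that feeds `cp.path` above anchors on root categories (`parent_id IS NULL`) and appends `' > '` plus the child name at each level. The same walk can be sketched in JavaScript over rows shaped like the `categories` table (the function name is ours):

```javascript
// Build "Root > Child > Leaf" paths from (cat_id, name, parent_id) rows,
// the way the recursive category_path CTE does in SQL.
function buildPaths(categories) {
  const byId = new Map(categories.map(c => [c.cat_id, c]));
  const paths = new Map();
  function pathFor(cat) {
    if (paths.has(cat.cat_id)) return paths.get(cat.cat_id);
    const parent = cat.parent_id == null ? null : byId.get(cat.parent_id);
    const path = parent ? pathFor(parent) + ' > ' + cat.name : cat.name;
    paths.set(cat.cat_id, path);
    return path;
  }
  categories.forEach(pathFor);
  return paths;
}

const paths = buildPaths([
  { cat_id: 1, name: 'Apparel', parent_id: null },
  { cat_id: 2, name: 'Shirts', parent_id: 1 },
  { cat_id: 3, name: 'Tees', parent_id: 2 },
]);
console.log(paths.get(3)); // 'Apparel > Shirts > Tees'
```

Doing this in the database keeps the path computation next to the aggregation, so one query returns ranked categories with their full breadcrumb.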
@@ -650,7 +574,7 @@ router.get('/sales/metrics', async (req, res) => {
SUM(p.cost_price * o.quantity) as total_cogs,
SUM((o.price - p.cost_price) * o.quantity) as total_profit
FROM orders o
JOIN products p ON o.product_id = p.product_id
JOIN products p ON o.pid = p.pid
WHERE o.canceled = false
AND o.date BETWEEN ? AND ?
GROUP BY DATE(o.date)
@@ -666,7 +590,7 @@ router.get('/sales/metrics', async (req, res) => {
SUM(p.cost_price * o.quantity) as total_cogs,
SUM((o.price - p.cost_price) * o.quantity) as total_profit
FROM orders o
JOIN products p ON o.product_id = p.product_id
JOIN products p ON o.pid = p.pid
WHERE o.canceled = false
AND o.date BETWEEN ? AND ?
`, [startDate, endDate]);
@@ -698,7 +622,7 @@ router.get('/low-stock/products', async (req, res) => {
try {
const [rows] = await executeQuery(`
SELECT
p.product_id,
p.pid,
p.SKU,
p.title,
p.brand,
@@ -712,12 +636,12 @@ router.get('/low-stock/products', async (req, res) => {
(pm.reorder_qty * p.cost_price) as reorder_cost,
GROUP_CONCAT(c.name) as categories
FROM products p
JOIN product_metrics pm ON p.product_id = pm.product_id
LEFT JOIN product_categories pc ON p.product_id = pc.product_id
LEFT JOIN categories c ON pc.category_id = c.id
JOIN product_metrics pm ON p.pid = pm.pid
LEFT JOIN product_categories pc ON p.pid = pc.pid
LEFT JOIN categories c ON pc.cat_id = c.cat_id
WHERE pm.stock_status IN ('Critical', 'Reorder')
AND p.replenishable = true
GROUP BY p.product_id
GROUP BY p.pid
ORDER BY
CASE pm.stock_status
WHEN 'Critical' THEN 1
@@ -742,17 +666,17 @@ router.get('/trending/products', async (req, res) => {
const [rows] = await executeQuery(`
WITH recent_sales AS (
SELECT
o.product_id,
o.pid,
COUNT(DISTINCT o.order_number) as recent_orders,
SUM(o.quantity) as recent_units,
SUM(o.price * o.quantity) as recent_revenue
FROM orders o
WHERE o.canceled = false
AND o.date >= DATE_SUB(CURDATE(), INTERVAL ? DAY)
GROUP BY o.product_id
GROUP BY o.pid
)
SELECT
p.product_id,
p.pid,
p.SKU,
p.title,
p.brand,
@@ -767,15 +691,15 @@ router.get('/trending/products', async (req, res) => {
((rs.recent_units / ?) - pm.daily_sales_avg) / pm.daily_sales_avg * 100 as velocity_change,
GROUP_CONCAT(c.name) as categories
FROM recent_sales rs
JOIN products p ON rs.product_id = p.product_id
JOIN product_metrics pm ON p.product_id = pm.product_id
LEFT JOIN product_categories pc ON p.product_id = pc.product_id
LEFT JOIN categories c ON pc.category_id = c.id
GROUP BY p.product_id
JOIN products p ON rs.pid = p.pid
JOIN product_metrics pm ON p.pid = pm.pid
LEFT JOIN product_categories pc ON p.pid = pc.pid
LEFT JOIN categories c ON pc.cat_id = c.cat_id
GROUP BY p.pid
HAVING velocity_change > 0
ORDER BY velocity_change DESC
LIMIT ?
`, [days, days, days, limit]);
`, [days, days, limit]);
res.json(rows);
} catch (err) {
console.error('Error fetching trending products:', err);
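The `velocity_change` expression in the trending query divides recent units by the window length to get a recent daily rate, then compares it against the long-run `daily_sales_avg`. A sketch of that arithmetic (function and argument names are ours):

```javascript
// Percent change of recent daily sales velocity vs. the stored baseline,
// mirroring ((rs.recent_units / ?) - pm.daily_sales_avg) / pm.daily_sales_avg * 100.
function velocityChange(recentUnits, days, dailySalesAvg) {
  const recentDaily = recentUnits / days;
  return ((recentDaily - dailySalesAvg) / dailySalesAvg) * 100;
}

console.log(velocityChange(30, 10, 2)); // 50 — 3 units/day vs a 2/day baseline
```

The query's `HAVING velocity_change > 0` then keeps only products selling faster than their baseline, which is why the `days` placeholder appears twice in the parameter list.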
@@ -859,7 +783,7 @@ router.get('/key-metrics', async (req, res) => {
COUNT(CASE WHEN pm.stock_status = 'Critical' THEN 1 END) as critical_stock_count,
COUNT(CASE WHEN pm.stock_status = 'Overstocked' THEN 1 END) as overstock_count
FROM products p
JOIN product_metrics pm ON p.product_id = pm.product_id
JOIN product_metrics pm ON p.pid = pm.pid
),
sales_summary AS (
SELECT
@@ -909,7 +833,7 @@ router.get('/inventory-health', async (req, res) => {
AVG(pm.turnover_rate) as avg_turnover_rate,
AVG(pm.days_of_inventory) as avg_days_inventory
FROM products p
JOIN product_metrics pm ON p.product_id = pm.product_id
JOIN product_metrics pm ON p.pid = pm.pid
WHERE p.replenishable = true
),
value_distribution AS (
@@ -931,7 +855,7 @@ router.get('/inventory-health', async (req, res) => {
ELSE 0
END) * 100.0 / SUM(p.stock_quantity * p.cost_price) as overstock_value_percent
FROM products p
JOIN product_metrics pm ON p.product_id = pm.product_id
JOIN product_metrics pm ON p.pid = pm.pid
),
category_health AS (
SELECT
@@ -940,11 +864,11 @@ router.get('/inventory-health', async (req, res) => {
SUM(CASE WHEN pm.stock_status = 'Healthy' THEN 1 ELSE 0 END) * 100.0 / COUNT(*) as category_healthy_percent,
AVG(pm.turnover_rate) as category_turnover_rate
FROM categories c
JOIN product_categories pc ON c.id = pc.category_id
JOIN products p ON pc.product_id = p.product_id
JOIN product_metrics pm ON p.product_id = pm.product_id
JOIN product_categories pc ON c.cat_id = pc.cat_id
JOIN products p ON pc.pid = p.pid
JOIN product_metrics pm ON p.pid = pm.pid
WHERE p.replenishable = true
GROUP BY c.id, c.name
GROUP BY c.cat_id, c.name
)
SELECT
sd.*,
@@ -975,20 +899,15 @@ router.get('/replenish/products', async (req, res) => {
try {
const [products] = await executeQuery(`
SELECT
p.product_id,
p.SKU,
p.pid,
p.SKU as sku,
p.title,
p.stock_quantity as current_stock,
pm.reorder_qty as replenish_qty,
(pm.reorder_qty * p.cost_price) as replenish_cost,
(pm.reorder_qty * p.price) as replenish_retail,
CASE
WHEN pm.daily_sales_avg > 0
THEN FLOOR(p.stock_quantity / pm.daily_sales_avg)
ELSE NULL
END as days_until_stockout
p.stock_quantity,
pm.daily_sales_avg,
pm.reorder_qty,
pm.last_purchase_date
FROM products p
JOIN product_metrics pm ON p.product_id = pm.product_id
JOIN product_metrics pm ON p.pid = pm.pid
WHERE p.replenishable = true
AND pm.stock_status IN ('Critical', 'Reorder')
AND pm.reorder_qty > 0
@@ -997,23 +916,16 @@ router.get('/replenish/products', async (req, res) => {
WHEN 'Critical' THEN 1
WHEN 'Reorder' THEN 2
END,
replenish_cost DESC
pm.reorder_qty * p.cost_price DESC
LIMIT ?
`, [limit]);

// Format response
const response = products.map(p => ({
product_id: p.product_id,
SKU: p.SKU,
title: p.title,
current_stock: parseInt(p.current_stock) || 0,
replenish_qty: parseInt(p.replenish_qty) || 0,
replenish_cost: parseFloat(p.replenish_cost) || 0,
replenish_retail: parseFloat(p.replenish_retail) || 0,
days_until_stockout: p.days_until_stockout
}));

res.json(response);
res.json(products.map(p => ({
...p,
stock_quantity: parseInt(p.stock_quantity) || 0,
daily_sales_avg: parseFloat(p.daily_sales_avg) || 0,
reorder_qty: parseInt(p.reorder_qty) || 0
})));
} catch (err) {
console.error('Error fetching products to replenish:', err);
res.status(500).json({ error: 'Failed to fetch products to replenish' });

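The `days_until_stockout` CASE that the new version drops from this query estimated runway as stock divided by average daily sales, with `NULL` when there is no sales history. A sketch of that estimate (the helper name is ours):

```javascript
// Mirrors the removed SQL CASE: FLOOR(stock / daily_sales_avg), or null
// when daily_sales_avg is not positive (no estimate possible).
function daysUntilStockout(stockQuantity, dailySalesAvg) {
  if (!(dailySalesAvg > 0)) return null;
  return Math.floor(stockQuantity / dailySalesAvg);
}

console.log(daysUntilStockout(45, 2.5)); // 18
console.log(daysUntilStockout(10, 0));   // null
```

After the change, the route returns the raw `stock_quantity` and `daily_sales_avg` instead, so a client could compute the same estimate itself.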
@@ -9,25 +9,25 @@ router.get('/trends', async (req, res) => {
WITH MonthlyMetrics AS (
SELECT
DATE(CONCAT(pta.year, '-', LPAD(pta.month, 2, '0'), '-01')) as date,
SUM(pta.total_revenue) as revenue,
SUM(pta.total_cost) as cost,
SUM(pm.inventory_value) as inventory_value,
CAST(COALESCE(SUM(pta.total_revenue), 0) AS DECIMAL(15,3)) as revenue,
CAST(COALESCE(SUM(pta.total_cost), 0) AS DECIMAL(15,3)) as cost,
CAST(COALESCE(SUM(pm.inventory_value), 0) AS DECIMAL(15,3)) as inventory_value,
CASE
WHEN SUM(pm.inventory_value) > 0
THEN (SUM(pta.total_revenue - pta.total_cost) / SUM(pm.inventory_value)) * 100
THEN CAST((SUM(pta.total_revenue - pta.total_cost) / SUM(pm.inventory_value)) * 100 AS DECIMAL(15,3))
ELSE 0
END as gmroi
FROM product_time_aggregates pta
JOIN product_metrics pm ON pta.product_id = pm.product_id
JOIN product_metrics pm ON pta.pid = pm.pid
WHERE (pta.year * 100 + pta.month) >= DATE_FORMAT(DATE_SUB(CURDATE(), INTERVAL 12 MONTH), '%Y%m')
GROUP BY pta.year, pta.month
ORDER BY date ASC
)
SELECT
DATE_FORMAT(date, '%b %y') as date,
ROUND(revenue, 2) as revenue,
ROUND(inventory_value, 2) as inventory_value,
ROUND(gmroi, 2) as gmroi
revenue,
inventory_value,
gmroi
FROM MonthlyMetrics
`);

@@ -37,15 +37,15 @@ router.get('/trends', async (req, res) => {
const transformedData = {
revenue: rows.map(row => ({
date: row.date,
value: parseFloat(row.revenue || 0)
value: parseFloat(row.revenue)
})),
inventory_value: rows.map(row => ({
date: row.date,
value: parseFloat(row.inventory_value || 0)
value: parseFloat(row.inventory_value)
})),
gmroi: rows.map(row => ({
date: row.date,
value: parseFloat(row.gmroi || 0)
value: parseFloat(row.gmroi)
}))
};

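The `gmroi` CASE in the MonthlyMetrics CTE is gross margin return on investment: gross margin (revenue minus cost) per unit of inventory value, as a percentage, guarded against a zero-value inventory. A sketch of that formula (the function name is ours):

```javascript
// GMROI as computed in the CTE above: (revenue - cost) / inventory_value * 100,
// with 0 returned when there is no inventory value to divide by.
function gmroi(totalRevenue, totalCost, inventoryValue) {
  if (inventoryValue > 0) {
    return ((totalRevenue - totalCost) / inventoryValue) * 100;
  }
  return 0;
}

console.log(gmroi(1200, 700, 1000)); // 50 — each inventory dollar returned $0.50 of margin
```

A GMROI of 100 means a month's gross margin equaled the inventory investment; values well below that flag capital tied up in slow stock.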
@@ -74,8 +74,8 @@ router.get('/', async (req, res) => {
o1.status,
o1.payment_method,
o1.shipping_method,
COUNT(o2.product_id) as items_count,
SUM(o2.price * o2.quantity) as total_amount
COUNT(o2.pid) as items_count,
CAST(SUM(o2.price * o2.quantity) AS DECIMAL(15,3)) as total_amount
FROM orders o1
JOIN orders o2 ON o1.order_number = o2.order_number
WHERE ${conditions.join(' AND ')}
@@ -101,7 +101,7 @@ router.get('/', async (req, res) => {
WITH CurrentStats AS (
SELECT
COUNT(DISTINCT order_number) as total_orders,
SUM(price * quantity) as total_revenue
CAST(SUM(price * quantity) AS DECIMAL(15,3)) as total_revenue
FROM orders
WHERE canceled = false
AND DATE(date) >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
@@ -109,7 +109,7 @@ router.get('/', async (req, res) => {
PreviousStats AS (
SELECT
COUNT(DISTINCT order_number) as prev_orders,
SUM(price * quantity) as prev_revenue
CAST(SUM(price * quantity) AS DECIMAL(15,3)) as prev_revenue
FROM orders
WHERE canceled = false
AND DATE(date) BETWEEN DATE_SUB(CURDATE(), INTERVAL 60 DAY) AND DATE_SUB(CURDATE(), INTERVAL 30 DAY)
@@ -117,7 +117,7 @@ router.get('/', async (req, res) => {
OrderValues AS (
SELECT
order_number,
SUM(price * quantity) as order_value
CAST(SUM(price * quantity) AS DECIMAL(15,3)) as order_value
FROM orders
WHERE canceled = false
AND DATE(date) >= DATE_SUB(CURDATE(), INTERVAL 30 DAY)
@@ -138,12 +138,12 @@ router.get('/', async (req, res) => {
END as revenue_growth,
CASE
WHEN cs.total_orders > 0
THEN (cs.total_revenue / cs.total_orders)
THEN CAST((cs.total_revenue / cs.total_orders) AS DECIMAL(15,3))
ELSE 0
END as average_order_value,
CASE
WHEN ps.prev_orders > 0
THEN (ps.prev_revenue / ps.prev_orders)
THEN CAST((ps.prev_revenue / ps.prev_orders) AS DECIMAL(15,3))
ELSE 0
END as prev_average_order_value
FROM CurrentStats cs
@@ -199,8 +199,8 @@ router.get('/:orderNumber', async (req, res) => {
o1.shipping_method,
o1.shipping_address,
o1.billing_address,
COUNT(o2.product_id) as items_count,
SUM(o2.price * o2.quantity) as total_amount
COUNT(o2.pid) as items_count,
CAST(SUM(o2.price * o2.quantity) AS DECIMAL(15,3)) as total_amount
FROM orders o1
JOIN orders o2 ON o1.order_number = o2.order_number
WHERE o1.order_number = ? AND o1.canceled = false
@@ -222,14 +222,14 @@ router.get('/:orderNumber', async (req, res) => {
// Get order items
const [itemRows] = await pool.query(`
SELECT
o.product_id,
o.pid,
p.title,
p.sku,
p.SKU,
o.quantity,
o.price,
(o.price * o.quantity) as total
CAST((o.price * o.quantity) AS DECIMAL(15,3)) as total
FROM orders o
JOIN products p ON o.product_id = p.product_id
JOIN products p ON o.pid = p.pid
WHERE o.order_number = ? AND o.canceled = false
`, [req.params.orderNumber]);

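The CurrentStats/PreviousStats CTEs feed the order-summary metrics: average order value divides revenue by order count with a zero-order guard, and revenue growth compares the two 30-day windows. The hunk shows only the tail of the `revenue_growth` CASE, so the growth formula below is an assumption modeled on the other growth expressions in this diff; the helper name is ours:

```javascript
// Sketch of the summary math over rows shaped like CurrentStats (cs) and
// PreviousStats (ps). revenue_growth is assumed to be the usual
// period-over-period percentage; the diff does not show its CASE head.
function orderSummary(cs, ps) {
  return {
    revenue_growth: ps.prev_revenue > 0
      ? ((cs.total_revenue - ps.prev_revenue) / ps.prev_revenue) * 100
      : 0,
    average_order_value: cs.total_orders > 0 ? cs.total_revenue / cs.total_orders : 0,
    prev_average_order_value: ps.prev_orders > 0 ? ps.prev_revenue / ps.prev_orders : 0,
  };
}

const s = orderSummary(
  { total_orders: 10, total_revenue: 500 },
  { prev_orders: 8, prev_revenue: 400 }
);
console.log(s); // { revenue_growth: 25, average_order_value: 50, prev_average_order_value: 50 }
```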
@@ -2,6 +2,7 @@ const express = require('express');
const router = express.Router();
const multer = require('multer');
const { importProductsFromCSV } = require('../utils/csvImporter');
const { PurchaseOrderStatus, ReceivingStatus } = require('../types/status-codes');

// Configure multer for file uploads
const upload = multer({ dest: 'uploads/' });
@@ -20,15 +21,13 @@ router.get('/brands', async (req, res) => {
console.log('Fetching brands from database...');

const [results] = await pool.query(`
SELECT DISTINCT p.brand
SELECT DISTINCT COALESCE(p.brand, 'Unbranded') as brand
FROM products p
JOIN purchase_orders po ON p.product_id = po.product_id
WHERE p.brand IS NOT NULL
AND p.brand != ''
AND p.visible = true
GROUP BY p.brand
JOIN purchase_orders po ON p.pid = po.pid
WHERE p.visible = true
GROUP BY COALESCE(p.brand, 'Unbranded')
HAVING SUM(po.cost_price * po.received) >= 500
ORDER BY p.brand
ORDER BY COALESCE(p.brand, 'Unbranded')
`);

console.log(`Found ${results.length} brands:`, results.slice(0, 3));
@@ -147,9 +146,9 @@ router.get('/', async (req, res) => {

// Get total count for pagination
const countQuery = `
SELECT COUNT(DISTINCT p.product_id) as total
SELECT COUNT(DISTINCT p.pid) as total
FROM products p
LEFT JOIN product_metrics pm ON p.product_id = pm.product_id
LEFT JOIN product_metrics pm ON p.pid = pm.pid
${whereClause}
`;
const [countResult] = await pool.query(countQuery, params);
@@ -163,36 +162,69 @@ router.get('/', async (req, res) => {
'SELECT DISTINCT vendor FROM products WHERE visible = true AND vendor IS NOT NULL AND vendor != "" ORDER BY vendor'
);
const [brands] = await pool.query(
'SELECT DISTINCT brand FROM products WHERE visible = true AND brand IS NOT NULL AND brand != "" ORDER BY brand'
'SELECT DISTINCT COALESCE(brand, \'Unbranded\') as brand FROM products WHERE visible = true ORDER BY brand'
);

// Main query with all fields
const query = `
WITH product_thresholds AS (
SELECT
p.product_id,
COALESCE(
(SELECT overstock_days FROM stock_thresholds st
WHERE st.category_id IN (
SELECT pc.category_id
FROM product_categories pc
WHERE pc.product_id = p.product_id
)
AND (st.vendor = p.vendor OR st.vendor IS NULL)
ORDER BY st.vendor IS NULL
LIMIT 1),
(SELECT overstock_days FROM stock_thresholds st
WHERE st.category_id IS NULL
AND (st.vendor = p.vendor OR st.vendor IS NULL)
ORDER BY st.vendor IS NULL
LIMIT 1),
90
) as target_days
FROM products p
)
WITH RECURSIVE
category_path AS (
SELECT
c.cat_id,
c.name,
c.parent_id,
CAST(c.name AS CHAR(1000)) as path
FROM categories c
WHERE c.parent_id IS NULL

UNION ALL

SELECT
c.cat_id,
c.name,
c.parent_id,
CONCAT(cp.path, ' > ', c.name)
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
),
product_thresholds AS (
SELECT
p.pid,
COALESCE(
(SELECT overstock_days FROM stock_thresholds st
WHERE st.category_id IN (
SELECT pc.cat_id
FROM product_categories pc
WHERE pc.pid = p.pid
)
AND (st.vendor = p.vendor OR st.vendor IS NULL)
ORDER BY st.vendor IS NULL
LIMIT 1),
(SELECT overstock_days FROM stock_thresholds st
WHERE st.category_id IS NULL
AND (st.vendor = p.vendor OR st.vendor IS NULL)
ORDER BY st.vendor IS NULL
LIMIT 1),
90
) as target_days
FROM products p
),
product_leaf_categories AS (
-- Find categories that aren't parents to other categories for this product
SELECT DISTINCT pc.cat_id
FROM product_categories pc
WHERE NOT EXISTS (
SELECT 1
FROM categories child
JOIN product_categories child_pc ON child.cat_id = child_pc.cat_id
WHERE child.parent_id = pc.cat_id
AND child_pc.pid = pc.pid
)
)
SELECT
p.*,
GROUP_CONCAT(DISTINCT c.name) as categories,
COALESCE(p.brand, 'Unbranded') as brand,
GROUP_CONCAT(DISTINCT CONCAT(c.cat_id, ':', c.name)) as categories,
pm.daily_sales_avg,
pm.weekly_sales_avg,
pm.monthly_sales_avg,
@@ -205,10 +237,10 @@ router.get('/', async (req, res) => {
pm.reorder_point,
pm.safety_stock,
pm.avg_margin_percent,
pm.total_revenue,
pm.inventory_value,
pm.cost_of_goods_sold,
pm.gross_profit,
CAST(pm.total_revenue AS DECIMAL(15,3)) as total_revenue,
CAST(pm.inventory_value AS DECIMAL(15,3)) as inventory_value,
CAST(pm.cost_of_goods_sold AS DECIMAL(15,3)) as cost_of_goods_sold,
CAST(pm.gross_profit AS DECIMAL(15,3)) as gross_profit,
pm.gmroi,
pm.avg_lead_time_days,
pm.last_purchase_date,
@@ -223,12 +255,13 @@ router.get('/', async (req, res) => {
pm.overstocked_amt,
COALESCE(pm.days_of_inventory / NULLIF(pt.target_days, 0), 0) as stock_coverage_ratio
FROM products p
LEFT JOIN product_metrics pm ON p.product_id = pm.product_id
LEFT JOIN product_categories pc ON p.product_id = pc.product_id
LEFT JOIN categories c ON pc.category_id = c.id
LEFT JOIN product_thresholds pt ON p.product_id = pt.product_id
${whereClause}
GROUP BY p.product_id
LEFT JOIN product_metrics pm ON p.pid = pm.pid
LEFT JOIN product_categories pc ON p.pid = pc.pid
LEFT JOIN categories c ON pc.cat_id = c.cat_id
LEFT JOIN product_thresholds pt ON p.pid = pt.pid
JOIN product_leaf_categories plc ON c.cat_id = plc.cat_id
${whereClause ? 'WHERE ' + whereClause.substring(6) : ''}
GROUP BY p.pid
ORDER BY ${sortColumn} ${sortDirection}
LIMIT ? OFFSET ?
`;
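The `product_thresholds` CTE resolves each product's overstock target through a COALESCE chain: a category-scoped threshold first (vendor-specific rows beating vendor-agnostic ones via `ORDER BY st.vendor IS NULL`), then a global threshold, then a hard default of 90 days. The same precedence can be sketched over rows shaped like `stock_thresholds` (the function name and sample data are ours):

```javascript
// Resolve target overstock days with the CTE's precedence:
// category match > global match, vendor-specific > vendor-agnostic, else 90.
function targetDays(thresholds, productCatIds, vendor) {
  const pick = rows => {
    // Vendor-specific rows sort first, like ORDER BY st.vendor IS NULL.
    const sorted = [...rows].sort((a, b) => (a.vendor === null) - (b.vendor === null));
    return sorted.length ? sorted[0].overstock_days : null;
  };
  const categoryMatch = pick(thresholds.filter(t =>
    productCatIds.includes(t.category_id) && (t.vendor === vendor || t.vendor === null)));
  if (categoryMatch != null) return categoryMatch;
  const globalMatch = pick(thresholds.filter(t =>
    t.category_id === null && (t.vendor === vendor || t.vendor === null)));
  if (globalMatch != null) return globalMatch;
  return 90; // hard default from the COALESCE chain
}

const thresholds = [
  { category_id: null, vendor: null, overstock_days: 120 },
  { category_id: 5, vendor: 'Acme', overstock_days: 60 },
];
console.log(targetDays(thresholds, [5], 'Acme')); // 60
console.log(targetDays(thresholds, [7], 'Other')); // 120
```

`target_days` then feeds `stock_coverage_ratio`, where `NULLIF(pt.target_days, 0)` prevents a zero threshold from causing a divide-by-zero.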
@@ -308,7 +341,7 @@ router.get('/trending', async (req, res) => {
SELECT COUNT(*) as count,
MAX(total_revenue) as max_revenue,
MAX(daily_sales_avg) as max_daily_sales,
COUNT(DISTINCT product_id) as products_with_metrics
COUNT(DISTINCT pid) as products_with_metrics
FROM product_metrics
WHERE total_revenue > 0 OR daily_sales_avg > 0
`);
@@ -322,7 +355,7 @@ router.get('/trending', async (req, res) => {
// Get trending products
const [rows] = await pool.query(`
SELECT
p.product_id,
p.pid,
p.sku,
p.title,
COALESCE(pm.daily_sales_avg, 0) as daily_sales_avg,
@@ -334,7 +367,7 @@ router.get('/trending', async (req, res) => {
END as growth_rate,
COALESCE(pm.total_revenue, 0) as total_revenue
FROM products p
INNER JOIN product_metrics pm ON p.product_id = pm.product_id
INNER JOIN product_metrics pm ON p.pid = pm.pid
WHERE (pm.total_revenue > 0 OR pm.daily_sales_avg > 0)
AND p.visible = true
ORDER BY growth_rate DESC
@@ -351,130 +384,160 @@ router.get('/trending', async (req, res) => {

// Get a single product
|
||||
router.get('/:id', async (req, res) => {
|
||||
const pool = req.app.locals.pool;
|
||||
try {
|
||||
// Get basic product data with metrics
|
||||
const [rows] = await pool.query(
|
||||
`SELECT
|
||||
const pool = req.app.locals.pool;
|
||||
const id = parseInt(req.params.id);
|
||||
|
||||
// Common CTE for category paths
|
||||
const categoryPathCTE = `
|
||||
WITH RECURSIVE category_path AS (
|
||||
SELECT
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CAST(c.name AS CHAR(1000)) as path
|
||||
FROM categories c
|
||||
WHERE c.parent_id IS NULL
|
||||
|
||||
UNION ALL
|
||||
|
||||
SELECT
|
||||
c.cat_id,
|
||||
c.name,
|
||||
c.parent_id,
|
||||
CONCAT(cp.path, ' > ', c.name)
|
||||
FROM categories c
|
||||
JOIN category_path cp ON c.parent_id = cp.cat_id
|
||||
)
|
||||
`;
|
||||

// Get product details with category paths
const [productRows] = await pool.query(`
SELECT
p.*,
GROUP_CONCAT(DISTINCT c.name) as categories,
pm.daily_sales_avg,
pm.weekly_sales_avg,
pm.monthly_sales_avg,
pm.days_of_inventory,
pm.reorder_point,
pm.safety_stock,
pm.stock_status,
pm.abc_class,
pm.avg_margin_percent,
pm.total_revenue,
pm.inventory_value,
pm.turnover_rate,
pm.abc_class,
pm.stock_status,
pm.gmroi,
pm.cost_of_goods_sold,
pm.gross_profit,
pm.avg_lead_time_days,
pm.current_lead_time,
pm.target_lead_time,
pm.lead_time_status,
pm.gmroi,
pm.cost_of_goods_sold,
pm.gross_profit
pm.reorder_qty,
pm.overstocked_amt
FROM products p
LEFT JOIN product_metrics pm ON p.product_id = pm.product_id
LEFT JOIN product_categories pc ON p.product_id = pc.product_id
LEFT JOIN categories c ON pc.category_id = c.id
WHERE p.product_id = ? AND p.visible = true
GROUP BY p.product_id`,
[req.params.id]
);
LEFT JOIN product_metrics pm ON p.pid = pm.pid
WHERE p.pid = ?
`, [id]);

if (rows.length === 0) {
if (!productRows.length) {
return res.status(404).json({ error: 'Product not found' });
}

// Get vendor performance metrics
const [vendorMetrics] = await pool.query(
`SELECT * FROM vendor_metrics WHERE vendor = ?`,
[rows[0].vendor]
);
// Get categories and their paths separately to avoid GROUP BY issues
const [categoryRows] = await pool.query(`
WITH RECURSIVE
category_path AS (
SELECT
c.cat_id,
c.name,
c.parent_id,
CAST(c.name AS CHAR(1000)) as path
FROM categories c
WHERE c.parent_id IS NULL

UNION ALL

SELECT
c.cat_id,
c.name,
c.parent_id,
CONCAT(cp.path, ' > ', c.name)
FROM categories c
JOIN category_path cp ON c.parent_id = cp.cat_id
),
product_leaf_categories AS (
-- Find categories assigned to this product that aren't parents
-- of other categories assigned to this product
SELECT pc.cat_id
FROM product_categories pc
WHERE pc.pid = ?
AND NOT EXISTS (
-- Check if there are any child categories also assigned to this product
SELECT 1
FROM categories child
JOIN product_categories child_pc ON child.cat_id = child_pc.cat_id
WHERE child.parent_id = pc.cat_id
AND child_pc.pid = pc.pid
)
)
SELECT
c.cat_id,
c.name as category_name,
cp.path as full_path
FROM product_categories pc
JOIN categories c ON pc.cat_id = c.cat_id
JOIN category_path cp ON c.cat_id = cp.cat_id
JOIN product_leaf_categories plc ON c.cat_id = plc.cat_id
WHERE pc.pid = ?
ORDER BY cp.path
`, [id, id]);
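The `product_leaf_categories` CTE above keeps only "leaf" assignments: categories with no child category that is also assigned to the same product. A minimal in-memory sketch of the same rule, with hypothetical `categories` and `assignedIds` shapes (not the actual schema):

```javascript
// Each category: { cat_id, parent_id }; assignedIds: cat_ids linked to one product.
function leafCategories(categories, assignedIds) {
  const assigned = new Set(assignedIds);
  // A category is a leaf for this product when no assigned category names it as parent.
  return [...assigned].filter(catId =>
    !categories.some(c => c.parent_id === catId && assigned.has(c.cat_id))
  );
}

const cats = [
  { cat_id: 1, parent_id: null }, // e.g. "Apparel"
  { cat_id: 2, parent_id: 1 },    // e.g. "Apparel > Shirts"
  { cat_id: 3, parent_id: null }  // e.g. "Clearance"
];
console.log(leafCategories(cats, [1, 2, 3])); // → [ 2, 3 ] — 1 is dropped because its child 2 is also assigned
```

This mirrors the SQL `NOT EXISTS` subquery: a parent category is suppressed only when one of its own children appears in the same product's assignments.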

// Transform the results
const categoryPathMap = categoryRows.reduce((acc, row) => {
// Use cat_id in the key to differentiate categories with the same name
acc[`${row.cat_id}:${row.category_name}`] = row.full_path;
return acc;
}, {});

// Transform the data to match frontend expectations
const product = {
// Basic product info
product_id: rows[0].product_id,
title: rows[0].title,
SKU: rows[0].SKU,
barcode: rows[0].barcode,
created_at: rows[0].created_at,
updated_at: rows[0].updated_at,

// Inventory fields
stock_quantity: parseInt(rows[0].stock_quantity),
moq: parseInt(rows[0].moq),
uom: parseInt(rows[0].uom),
managing_stock: Boolean(rows[0].managing_stock),
replenishable: Boolean(rows[0].replenishable),

// Pricing fields
price: parseFloat(rows[0].price),
regular_price: parseFloat(rows[0].regular_price),
cost_price: parseFloat(rows[0].cost_price),
landing_cost_price: parseFloat(rows[0].landing_cost_price),

// Categorization
categories: rows[0].categories ? rows[0].categories.split(',') : [],
tags: rows[0].tags ? rows[0].tags.split(',') : [],
options: rows[0].options ? JSON.parse(rows[0].options) : {},

// Vendor info
vendor: rows[0].vendor,
vendor_reference: rows[0].vendor_reference,
brand: rows[0].brand,

// URLs
permalink: rows[0].permalink,
image: rows[0].image,

// Metrics
metrics: {
// Sales metrics
daily_sales_avg: parseFloat(rows[0].daily_sales_avg) || 0,
weekly_sales_avg: parseFloat(rows[0].weekly_sales_avg) || 0,
monthly_sales_avg: parseFloat(rows[0].monthly_sales_avg) || 0,

// Inventory metrics
days_of_inventory: parseInt(rows[0].days_of_inventory) || 0,
reorder_point: parseInt(rows[0].reorder_point) || 0,
safety_stock: parseInt(rows[0].safety_stock) || 0,
stock_status: rows[0].stock_status || 'Unknown',
abc_class: rows[0].abc_class || 'C',

// Financial metrics
avg_margin_percent: parseFloat(rows[0].avg_margin_percent) || 0,
total_revenue: parseFloat(rows[0].total_revenue) || 0,
inventory_value: parseFloat(rows[0].inventory_value) || 0,
turnover_rate: parseFloat(rows[0].turnover_rate) || 0,
gmroi: parseFloat(rows[0].gmroi) || 0,
cost_of_goods_sold: parseFloat(rows[0].cost_of_goods_sold) || 0,
gross_profit: parseFloat(rows[0].gross_profit) || 0,

// Lead time metrics
avg_lead_time_days: parseInt(rows[0].avg_lead_time_days) || 0,
current_lead_time: parseInt(rows[0].current_lead_time) || 0,
target_lead_time: parseInt(rows[0].target_lead_time) || 14,
lead_time_status: rows[0].lead_time_status || 'Unknown',
reorder_qty: parseInt(rows[0].reorder_qty) || 0,
overstocked_amt: parseInt(rows[0].overstocked_amt) || 0
},
// Vendor performance (if available)
vendor_performance: vendorMetrics.length ? {
avg_lead_time_days: parseFloat(vendorMetrics[0].avg_lead_time_days) || 0,
on_time_delivery_rate: parseFloat(vendorMetrics[0].on_time_delivery_rate) || 0,
order_fill_rate: parseFloat(vendorMetrics[0].order_fill_rate) || 0,
total_orders: parseInt(vendorMetrics[0].total_orders) || 0,
total_late_orders: parseInt(vendorMetrics[0].total_late_orders) || 0,
total_purchase_value: parseFloat(vendorMetrics[0].total_purchase_value) || 0,
avg_order_value: parseFloat(vendorMetrics[0].avg_order_value) || 0
} : null
...productRows[0],
// Include cat_id in categories array to match the keys in categoryPathMap
categories: categoryRows.map(row => `${row.cat_id}:${row.category_name}`),
category_paths: categoryPathMap,
price: parseFloat(productRows[0].price),
regular_price: parseFloat(productRows[0].regular_price),
cost_price: parseFloat(productRows[0].cost_price),
landing_cost_price: parseFloat(productRows[0].landing_cost_price),
stock_quantity: parseInt(productRows[0].stock_quantity),
moq: parseInt(productRows[0].moq),
uom: parseInt(productRows[0].uom),
managing_stock: Boolean(productRows[0].managing_stock),
replenishable: Boolean(productRows[0].replenishable),
daily_sales_avg: parseFloat(productRows[0].daily_sales_avg) || 0,
weekly_sales_avg: parseFloat(productRows[0].weekly_sales_avg) || 0,
monthly_sales_avg: parseFloat(productRows[0].monthly_sales_avg) || 0,
avg_quantity_per_order: parseFloat(productRows[0].avg_quantity_per_order) || 0,
number_of_orders: parseInt(productRows[0].number_of_orders) || 0,
first_sale_date: productRows[0].first_sale_date || null,
last_sale_date: productRows[0].last_sale_date || null,
days_of_inventory: parseFloat(productRows[0].days_of_inventory) || 0,
weeks_of_inventory: parseFloat(productRows[0].weeks_of_inventory) || 0,
reorder_point: parseFloat(productRows[0].reorder_point) || 0,
safety_stock: parseFloat(productRows[0].safety_stock) || 0,
avg_margin_percent: parseFloat(productRows[0].avg_margin_percent) || 0,
total_revenue: parseFloat(productRows[0].total_revenue) || 0,
inventory_value: parseFloat(productRows[0].inventory_value) || 0,
cost_of_goods_sold: parseFloat(productRows[0].cost_of_goods_sold) || 0,
gross_profit: parseFloat(productRows[0].gross_profit) || 0,
gmroi: parseFloat(productRows[0].gmroi) || 0,
avg_lead_time_days: parseFloat(productRows[0].avg_lead_time_days) || 0,
current_lead_time: parseFloat(productRows[0].current_lead_time) || 0,
target_lead_time: parseFloat(productRows[0].target_lead_time) || 0,
lead_time_status: productRows[0].lead_time_status || null,
reorder_qty: parseInt(productRows[0].reorder_qty) || 0,
overstocked_amt: parseInt(productRows[0].overstocked_amt) || 0
};

res.json(product);
@@ -532,7 +595,7 @@ router.put('/:id', async (req, res) => {
categories = ?,
visible = ?,
managing_stock = ?
WHERE product_id = ?`,
WHERE pid = ?`,
[
title,
sku,
@@ -570,7 +633,7 @@ router.get('/:id/metrics', async (req, res) => {
const [metrics] = await pool.query(`
WITH inventory_status AS (
SELECT
p.product_id,
p.pid,
CASE
WHEN pm.daily_sales_avg = 0 THEN 'New'
WHEN p.stock_quantity <= CEIL(pm.daily_sales_avg * 7) THEN 'Critical'
@@ -579,8 +642,8 @@ router.get('/:id/metrics', async (req, res) => {
ELSE 'Healthy'
END as calculated_status
FROM products p
LEFT JOIN product_metrics pm ON p.product_id = pm.product_id
WHERE p.product_id = ?
LEFT JOIN product_metrics pm ON p.pid = pm.pid
WHERE p.pid = ?
)
SELECT
COALESCE(pm.daily_sales_avg, 0) as daily_sales_avg,
@@ -604,9 +667,9 @@ router.get('/:id/metrics', async (req, res) => {
COALESCE(pm.reorder_qty, 0) as reorder_qty,
COALESCE(pm.overstocked_amt, 0) as overstocked_amt
FROM products p
LEFT JOIN product_metrics pm ON p.product_id = pm.product_id
LEFT JOIN inventory_status is ON p.product_id = is.product_id
WHERE p.product_id = ?
LEFT JOIN product_metrics pm ON p.pid = pm.pid
LEFT JOIN inventory_status is ON p.pid = is.pid
WHERE p.pid = ?
`, [id]);

if (!metrics.length) {
@@ -643,57 +706,35 @@ router.get('/:id/metrics', async (req, res) => {

// Get product time series data
router.get('/:id/time-series', async (req, res) => {
const pool = req.app.locals.pool;
const { id } = req.params;
try {
const { id } = req.params;
const months = parseInt(req.query.months) || 12;
const pool = req.app.locals.pool;

// Get monthly sales data with running totals and growth rates
// Get monthly sales data
const [monthlySales] = await pool.query(`
WITH monthly_data AS (
SELECT
CONCAT(year, '-', LPAD(month, 2, '0')) as month,
total_quantity_sold as quantity,
total_revenue as revenue,
total_cost as cost,
avg_price,
profit_margin,
inventory_value
FROM product_time_aggregates
WHERE product_id = ?
ORDER BY year DESC, month DESC
LIMIT ?
)
SELECT
month,
quantity,
revenue,
cost,
avg_price,
profit_margin,
inventory_value,
LAG(quantity) OVER (ORDER BY month) as prev_month_quantity,
LAG(revenue) OVER (ORDER BY month) as prev_month_revenue
FROM monthly_data
ORDER BY month ASC
`, [id, months]);
DATE_FORMAT(date, '%Y-%m') as month,
COUNT(DISTINCT order_number) as order_count,
SUM(quantity) as units_sold,
CAST(SUM(price * quantity) AS DECIMAL(15,3)) as revenue
FROM orders
WHERE pid = ?
AND canceled = false
GROUP BY DATE_FORMAT(date, '%Y-%m')
ORDER BY month DESC
LIMIT 12
`, [id]);

// Calculate growth rates and format data
const formattedMonthlySales = monthlySales.map(row => ({
month: row.month,
quantity: parseInt(row.quantity) || 0,
revenue: parseFloat(row.revenue) || 0,
cost: parseFloat(row.cost) || 0,
avg_price: parseFloat(row.avg_price) || 0,
profit_margin: parseFloat(row.profit_margin) || 0,
inventory_value: parseFloat(row.inventory_value) || 0,
quantity_growth: row.prev_month_quantity ?
((row.quantity - row.prev_month_quantity) / row.prev_month_quantity) * 100 : 0,
revenue_growth: row.prev_month_revenue ?
((row.revenue - row.prev_month_revenue) / row.prev_month_revenue) * 100 : 0
// Format monthly sales data
const formattedMonthlySales = monthlySales.map(month => ({
month: month.month,
order_count: parseInt(month.order_count),
units_sold: parseInt(month.units_sold),
revenue: parseFloat(month.revenue),
profit: 0 // Set to 0 since we don't have cost data in orders table
}));

// Get recent orders with customer info and status
// Get recent orders
const [recentOrders] = await pool.query(`
SELECT
DATE_FORMAT(date, '%Y-%m-%d') as date,
@@ -703,11 +744,10 @@ router.get('/:id/time-series', async (req, res) => {
discount,
tax,
shipping,
customer,
status,
payment_method
customer_name as customer,
status
FROM orders
WHERE product_id = ?
WHERE pid = ?
AND canceled = false
ORDER BY date DESC
LIMIT 10
@@ -723,17 +763,19 @@ router.get('/:id/time-series', async (req, res) => {
ordered,
received,
status,
receiving_status,
cost_price,
notes,
CASE
WHEN received_date IS NOT NULL THEN
DATEDIFF(received_date, date)
WHEN expected_date < CURDATE() AND status != 'received' THEN
WHEN expected_date < CURDATE() AND status < ${PurchaseOrderStatus.ReceivingStarted} THEN
DATEDIFF(CURDATE(), expected_date)
ELSE NULL
END as lead_time_days
FROM purchase_orders
WHERE product_id = ?
WHERE pid = ?
AND status != ${PurchaseOrderStatus.Canceled}
ORDER BY date DESC
LIMIT 10
`, [id]);
@@ -752,6 +794,8 @@ router.get('/:id/time-series', async (req, res) => {
...po,
ordered: parseInt(po.ordered),
received: parseInt(po.received),
status: parseInt(po.status),
receiving_status: parseInt(po.receiving_status),
cost_price: parseFloat(po.cost_price),
lead_time_days: po.lead_time_days ? parseInt(po.lead_time_days) : null
}))

@@ -1,6 +1,26 @@
const express = require('express');
const router = express.Router();

// Status code constants
const STATUS = {
CANCELED: 0,
CREATED: 1,
ELECTRONICALLY_READY_SEND: 10,
ORDERED: 11,
PREORDERED: 12,
ELECTRONICALLY_SENT: 13,
RECEIVING_STARTED: 15,
DONE: 50
};

const RECEIVING_STATUS = {
CANCELED: 0,
CREATED: 1,
PARTIAL_RECEIVED: 30,
FULL_RECEIVED: 40,
PAID: 50
};

// Get all purchase orders with summary metrics
router.get('/', async (req, res) => {
try {
@@ -11,13 +31,13 @@ router.get('/', async (req, res) => {
const params = [];

if (search) {
whereClause += ' AND (po.po_id LIKE ? OR po.vendor LIKE ? OR po.status LIKE ?)';
params.push(`%${search}%`, `%${search}%`, `%${search}%`);
whereClause += ' AND (po.po_id LIKE ? OR po.vendor LIKE ?)';
params.push(`%${search}%`, `%${search}%`);
}

if (status && status !== 'all') {
whereClause += ' AND po.status = ?';
params.push(status);
params.push(Number(status));
}

if (vendor && vendor !== 'all') {
@@ -42,7 +62,7 @@ router.get('/', async (req, res) => {
po_id,
SUM(ordered) as total_ordered,
SUM(received) as total_received,
SUM(ordered * cost_price) as total_cost
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_cost
FROM purchase_orders po
WHERE ${whereClause}
GROUP BY po_id
@@ -54,8 +74,8 @@ router.get('/', async (req, res) => {
ROUND(
SUM(total_received) / NULLIF(SUM(total_ordered), 0), 3
) as fulfillment_rate,
SUM(total_cost) as total_value,
ROUND(AVG(total_cost), 2) as avg_cost
CAST(SUM(total_cost) AS DECIMAL(15,3)) as total_value,
CAST(AVG(total_cost) AS DECIMAL(15,3)) as avg_cost
FROM po_totals
`, params);

@@ -78,22 +98,24 @@ router.get('/', async (req, res) => {
vendor,
date,
status,
COUNT(DISTINCT product_id) as total_items,
receiving_status,
COUNT(DISTINCT pid) as total_items,
SUM(ordered) as total_quantity,
SUM(ordered * cost_price) as total_cost,
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_cost,
SUM(received) as total_received,
ROUND(
SUM(received) / NULLIF(SUM(ordered), 0), 3
) as fulfillment_rate
FROM purchase_orders po
WHERE ${whereClause}
GROUP BY po_id, vendor, date, status
GROUP BY po_id, vendor, date, status, receiving_status
)
SELECT
po_id as id,
vendor as vendor_name,
DATE_FORMAT(date, '%Y-%m-%d') as order_date,
status,
receiving_status,
total_items,
total_quantity,
total_cost,
@@ -104,8 +126,8 @@ router.get('/', async (req, res) => {
CASE
WHEN ? = 'order_date' THEN date
WHEN ? = 'vendor_name' THEN vendor
WHEN ? = 'total_cost' THEN CAST(total_cost AS DECIMAL(15,2))
WHEN ? = 'total_received' THEN CAST(total_received AS DECIMAL(15,2))
WHEN ? = 'total_cost' THEN CAST(total_cost AS DECIMAL(15,3))
WHEN ? = 'total_received' THEN CAST(total_received AS DECIMAL(15,3))
WHEN ? = 'total_items' THEN CAST(total_items AS SIGNED)
WHEN ? = 'total_quantity' THEN CAST(total_quantity AS SIGNED)
WHEN ? = 'fulfillment_rate' THEN CAST(fulfillment_rate AS DECIMAL(5,3))
@@ -127,7 +149,7 @@ router.get('/', async (req, res) => {
const [statuses] = await pool.query(`
SELECT DISTINCT status
FROM purchase_orders
WHERE status IS NOT NULL AND status != ''
WHERE status IS NOT NULL
ORDER BY status
`);

@@ -136,7 +158,8 @@ router.get('/', async (req, res) => {
id: order.id,
vendor_name: order.vendor_name,
order_date: order.order_date,
status: order.status,
status: Number(order.status),
receiving_status: Number(order.receiving_status),
total_items: Number(order.total_items) || 0,
total_quantity: Number(order.total_quantity) || 0,
total_cost: Number(order.total_cost) || 0,
@@ -165,7 +188,7 @@ router.get('/', async (req, res) => {
},
filters: {
vendors: vendors.map(v => v.vendor),
statuses: statuses.map(s => s.status)
statuses: statuses.map(s => Number(s.status))
}
});
} catch (error) {
@@ -188,12 +211,14 @@ router.get('/vendor-metrics', async (req, res) => {
received,
cost_price,
CASE
WHEN status = 'received' AND received_date IS NOT NULL AND date IS NOT NULL
WHEN status >= ${STATUS.RECEIVING_STARTED} AND receiving_status >= ${RECEIVING_STATUS.PARTIAL_RECEIVED}
AND received_date IS NOT NULL AND date IS NOT NULL
THEN DATEDIFF(received_date, date)
ELSE NULL
END as delivery_days
FROM purchase_orders
WHERE vendor IS NOT NULL AND vendor != ''
AND status != ${STATUS.CANCELED} -- Exclude canceled orders
)
SELECT
vendor as vendor_name,
@@ -203,10 +228,10 @@ router.get('/vendor-metrics', async (req, res) => {
ROUND(
SUM(received) / NULLIF(SUM(ordered), 0), 3
) as fulfillment_rate,
ROUND(
CAST(ROUND(
SUM(ordered * cost_price) / NULLIF(SUM(ordered), 0), 2
) as avg_unit_cost,
SUM(ordered * cost_price) as total_spend,
) AS DECIMAL(15,3)) as avg_unit_cost,
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_spend,
ROUND(
AVG(NULLIF(delivery_days, 0)), 1
) as avg_delivery_days
@@ -242,47 +267,47 @@ router.get('/cost-analysis', async (req, res) => {
const pool = req.app.locals.pool;

const [analysis] = await pool.query(`
WITH category_costs AS (
SELECT
c.name as category,
po.pid,
po.cost_price,
po.ordered,
po.received,
po.status,
po.receiving_status
FROM purchase_orders po
JOIN product_categories pc ON po.pid = pc.pid
JOIN categories c ON pc.cat_id = c.cat_id
WHERE po.status != ${STATUS.CANCELED} -- Exclude canceled orders
)
SELECT
c.name as categories,
COUNT(DISTINCT po.product_id) as unique_products,
ROUND(AVG(po.cost_price), 2) as avg_cost,
MIN(po.cost_price) as min_cost,
MAX(po.cost_price) as max_cost,
ROUND(
STDDEV(po.cost_price), 2
) as cost_variance,
SUM(po.ordered * po.cost_price) as total_spend
FROM purchase_orders po
JOIN products p ON po.product_id = p.product_id
JOIN product_categories pc ON p.product_id = pc.product_id
JOIN categories c ON pc.category_id = c.id
GROUP BY c.name
category,
COUNT(DISTINCT pid) as unique_products,
CAST(AVG(cost_price) AS DECIMAL(15,3)) as avg_cost,
CAST(MIN(cost_price) AS DECIMAL(15,3)) as min_cost,
CAST(MAX(cost_price) AS DECIMAL(15,3)) as max_cost,
CAST(STDDEV(cost_price) AS DECIMAL(15,3)) as cost_variance,
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_spend
FROM category_costs
GROUP BY category
ORDER BY total_spend DESC
`);

// Parse numeric values and add ids for React keys
const parsedAnalysis = analysis.map(item => ({
id: item.categories || 'Uncategorized',
categories: item.categories || 'Uncategorized',
unique_products: Number(item.unique_products) || 0,
avg_cost: Number(item.avg_cost) || 0,
min_cost: Number(item.min_cost) || 0,
max_cost: Number(item.max_cost) || 0,
cost_variance: Number(item.cost_variance) || 0,
total_spend: Number(item.total_spend) || 0
}));

// Transform the data with parsed values
const transformedAnalysis = {
...parsedAnalysis[0],
total_spend_by_category: parsedAnalysis.map(item => ({
id: item.categories,
category: item.categories,
total_spend: Number(item.total_spend)
// Parse numeric values
const parsedAnalysis = {
categories: analysis.map(cat => ({
category: cat.category,
unique_products: Number(cat.unique_products) || 0,
avg_cost: Number(cat.avg_cost) || 0,
min_cost: Number(cat.min_cost) || 0,
max_cost: Number(cat.max_cost) || 0,
cost_variance: Number(cat.cost_variance) || 0,
total_spend: Number(cat.total_spend) || 0
}))
};

res.json(transformedAnalysis);
res.json(parsedAnalysis);
} catch (error) {
console.error('Error fetching cost analysis:', error);
res.status(500).json({ error: 'Failed to fetch cost analysis' });
@@ -298,11 +323,14 @@ router.get('/receiving-status', async (req, res) => {
WITH po_totals AS (
SELECT
po_id,
status,
receiving_status,
SUM(ordered) as total_ordered,
SUM(received) as total_received,
SUM(ordered * cost_price) as total_cost
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_cost
FROM purchase_orders
GROUP BY po_id
WHERE status != ${STATUS.CANCELED}
GROUP BY po_id, status, receiving_status
)
SELECT
COUNT(DISTINCT po_id) as order_count,
@@ -311,8 +339,20 @@ router.get('/receiving-status', async (req, res) => {
ROUND(
SUM(total_received) / NULLIF(SUM(total_ordered), 0), 3
) as fulfillment_rate,
SUM(total_cost) as total_value,
ROUND(AVG(total_cost), 2) as avg_cost
CAST(SUM(total_cost) AS DECIMAL(15,3)) as total_value,
CAST(AVG(total_cost) AS DECIMAL(15,3)) as avg_cost,
COUNT(DISTINCT CASE
WHEN receiving_status = ${RECEIVING_STATUS.CREATED} THEN po_id
END) as pending_count,
COUNT(DISTINCT CASE
WHEN receiving_status = ${RECEIVING_STATUS.PARTIAL_RECEIVED} THEN po_id
END) as partial_count,
COUNT(DISTINCT CASE
WHEN receiving_status >= ${RECEIVING_STATUS.FULL_RECEIVED} THEN po_id
END) as completed_count,
COUNT(DISTINCT CASE
WHEN receiving_status = ${RECEIVING_STATUS.CANCELED} THEN po_id
END) as canceled_count
FROM po_totals
`);

@@ -323,7 +363,13 @@ router.get('/receiving-status', async (req, res) => {
total_received: Number(status[0].total_received) || 0,
fulfillment_rate: Number(status[0].fulfillment_rate) || 0,
total_value: Number(status[0].total_value) || 0,
avg_cost: Number(status[0].avg_cost) || 0
avg_cost: Number(status[0].avg_cost) || 0,
status_breakdown: {
pending: Number(status[0].pending_count) || 0,
partial: Number(status[0].partial_count) || 0,
completed: Number(status[0].completed_count) || 0,
canceled: Number(status[0].canceled_count) || 0
}
};

res.json(parsedStatus);

@@ -29,8 +29,8 @@ router.get('/', async (req, res) => {
const [costMetrics] = await pool.query(`
SELECT
vendor,
ROUND(SUM(ordered * cost_price) / NULLIF(SUM(ordered), 0), 2) as avg_unit_cost,
SUM(ordered * cost_price) as total_spend
CAST(ROUND(SUM(ordered * cost_price) / NULLIF(SUM(ordered), 0), 2) AS DECIMAL(15,3)) as avg_unit_cost,
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_spend
FROM purchase_orders
WHERE status = 'closed'
AND cost_price IS NOT NULL
@@ -56,9 +56,9 @@ router.get('/', async (req, res) => {
WHEN COALESCE(vm.total_orders, 0) > 0 AND COALESCE(vm.order_fill_rate, 0) >= 75
THEN p.vendor
END) as activeVendors,
ROUND(AVG(NULLIF(vm.avg_lead_time_days, 0)), 1) as avgLeadTime,
ROUND(AVG(NULLIF(vm.order_fill_rate, 0)), 1) as avgFillRate,
ROUND(AVG(NULLIF(vm.on_time_delivery_rate, 0)), 1) as avgOnTimeDelivery
COALESCE(ROUND(AVG(NULLIF(vm.avg_lead_time_days, 0)), 1), 0) as avgLeadTime,
COALESCE(ROUND(AVG(NULLIF(vm.order_fill_rate, 0)), 1), 0) as avgFillRate,
COALESCE(ROUND(AVG(NULLIF(vm.on_time_delivery_rate, 0)), 1), 0) as avgOnTimeDelivery
FROM products p
LEFT JOIN vendor_metrics vm ON p.vendor = vm.vendor
WHERE p.vendor IS NOT NULL AND p.vendor != ''
@@ -67,8 +67,8 @@ router.get('/', async (req, res) => {
// Get overall cost metrics
const [overallCostMetrics] = await pool.query(`
SELECT
ROUND(SUM(ordered * cost_price) / NULLIF(SUM(ordered), 0), 2) as avg_unit_cost,
SUM(ordered * cost_price) as total_spend
CAST(ROUND(SUM(ordered * cost_price) / NULLIF(SUM(ordered), 0), 2) AS DECIMAL(15,3)) as avg_unit_cost,
CAST(SUM(ordered * cost_price) AS DECIMAL(15,3)) as total_spend
FROM purchase_orders
WHERE status = 'closed'
AND cost_price IS NOT NULL
@@ -78,25 +78,25 @@ router.get('/', async (req, res) => {

res.json({
vendors: vendors.map(vendor => ({
vendor_id: vendor.vendor_id || vendor.name,
vendor_id: vendor.name,
name: vendor.name,
status: vendor.status,
avg_lead_time_days: parseFloat(vendor.avg_lead_time_days || 0),
on_time_delivery_rate: parseFloat(vendor.on_time_delivery_rate || 0),
order_fill_rate: parseFloat(vendor.order_fill_rate || 0),
total_orders: parseInt(vendor.total_orders || 0),
active_products: parseInt(vendor.active_products || 0),
avg_lead_time_days: parseFloat(vendor.avg_lead_time_days),
on_time_delivery_rate: parseFloat(vendor.on_time_delivery_rate),
order_fill_rate: parseFloat(vendor.order_fill_rate),
total_orders: parseInt(vendor.total_orders),
active_products: parseInt(vendor.active_products),
avg_unit_cost: parseFloat(costMetricsMap[vendor.name]?.avg_unit_cost || 0),
total_spend: parseFloat(costMetricsMap[vendor.name]?.total_spend || 0)
})),
stats: {
totalVendors: parseInt(stats[0].totalVendors || 0),
activeVendors: parseInt(stats[0].activeVendors || 0),
avgLeadTime: parseFloat(stats[0].avgLeadTime || 0),
avgFillRate: parseFloat(stats[0].avgFillRate || 0),
avgOnTimeDelivery: parseFloat(stats[0].avgOnTimeDelivery || 0),
avgUnitCost: parseFloat(overallCostMetrics[0].avg_unit_cost || 0),
totalSpend: parseFloat(overallCostMetrics[0].total_spend || 0)
totalVendors: parseInt(stats[0].totalVendors),
activeVendors: parseInt(stats[0].activeVendors),
avgLeadTime: parseFloat(stats[0].avgLeadTime),
avgFillRate: parseFloat(stats[0].avgFillRate),
avgOnTimeDelivery: parseFloat(stats[0].avgOnTimeDelivery),
avgUnitCost: parseFloat(overallCostMetrics[0].avg_unit_cost),
totalSpend: parseFloat(overallCostMetrics[0].total_spend)
}
});
} catch (error) {

79
inventory-server/src/types/status-codes.js
Normal file
@@ -0,0 +1,79 @@
|
||||
// Purchase Order Status Codes
const PurchaseOrderStatus = {
  Canceled: 0,
  Created: 1,
  ElectronicallyReadySend: 10,
  Ordered: 11,
  Preordered: 12,
  ElectronicallySent: 13,
  ReceivingStarted: 15,
  Done: 50
};

// Receiving Status Codes
const ReceivingStatus = {
  Canceled: 0,
  Created: 1,
  PartialReceived: 30,
  FullReceived: 40,
  Paid: 50
};

// Status Code Display Names
const PurchaseOrderStatusLabels = {
  [PurchaseOrderStatus.Canceled]: 'Canceled',
  [PurchaseOrderStatus.Created]: 'Created',
  [PurchaseOrderStatus.ElectronicallyReadySend]: 'Ready to Send',
  [PurchaseOrderStatus.Ordered]: 'Ordered',
  [PurchaseOrderStatus.Preordered]: 'Preordered',
  [PurchaseOrderStatus.ElectronicallySent]: 'Sent',
  [PurchaseOrderStatus.ReceivingStarted]: 'Receiving Started',
  [PurchaseOrderStatus.Done]: 'Done'
};

const ReceivingStatusLabels = {
  [ReceivingStatus.Canceled]: 'Canceled',
  [ReceivingStatus.Created]: 'Created',
  [ReceivingStatus.PartialReceived]: 'Partially Received',
  [ReceivingStatus.FullReceived]: 'Fully Received',
  [ReceivingStatus.Paid]: 'Paid'
};

// Helper functions
function getPurchaseOrderStatusLabel(status) {
  return PurchaseOrderStatusLabels[status] || 'Unknown';
}

function getReceivingStatusLabel(status) {
  return ReceivingStatusLabels[status] || 'Unknown';
}

// Status checks
function isReceivingComplete(status) {
  return status >= ReceivingStatus.PartialReceived;
}

function isPurchaseOrderComplete(status) {
  return status === PurchaseOrderStatus.Done;
}

function isPurchaseOrderCanceled(status) {
  return status === PurchaseOrderStatus.Canceled;
}

function isReceivingCanceled(status) {
  return status === ReceivingStatus.Canceled;
}

module.exports = {
  PurchaseOrderStatus,
  ReceivingStatus,
  PurchaseOrderStatusLabels,
  ReceivingStatusLabels,
  getPurchaseOrderStatusLabel,
  getReceivingStatusLabel,
  isReceivingComplete,
  isPurchaseOrderComplete,
  isPurchaseOrderCanceled,
  isReceivingCanceled
};
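The helpers in status-codes.js are plain predicates over the numeric codes. A minimal sketch of how they might be combined downstream; the sample orders and the definition of "active" here are illustrative assumptions, not taken from the dashboard code:

```javascript
// Mirrors a subset of the constants defined in status-codes.js above.
const PurchaseOrderStatus = { Canceled: 0, Created: 1, Ordered: 11, Done: 50 };
const ReceivingStatus = { Canceled: 0, Created: 1, PartialReceived: 30, FullReceived: 40, Paid: 50 };

const isReceivingComplete = (s) => s >= ReceivingStatus.PartialReceived;
const isPurchaseOrderComplete = (s) => s === PurchaseOrderStatus.Done;

// Hypothetical sample orders -- for illustration only.
const orders = [
  { id: 1, status: PurchaseOrderStatus.Ordered, receiving: ReceivingStatus.Created },
  { id: 2, status: PurchaseOrderStatus.Done, receiving: ReceivingStatus.Paid },
];

// One plausible reading of "active": neither canceled nor done.
const active = orders.filter(
  (o) => o.status !== PurchaseOrderStatus.Canceled && !isPurchaseOrderComplete(o.status)
);

console.log(active.map((o) => o.id)); // [ 1 ]
console.log(isReceivingComplete(ReceivingStatus.FullReceived)); // true
```

Comparing codes numerically (as `isReceivingComplete` does) works because the codes are ordered by progress, which is presumably why partial and full receipt share one threshold check.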
@@ -6,6 +6,7 @@ import config from '../../config';
interface CategoryData {
performance: {
category: string;
categoryPath: string; // Full hierarchy path
revenue: number;
profit: number;
growth: number;
@@ -13,10 +14,12 @@ interface CategoryData {
}[];
distribution: {
category: string;
categoryPath: string; // Full hierarchy path
value: number;
}[];
trends: {
category: string;
categoryPath: string; // Full hierarchy path
month: string;
sales: number;
}[];
@@ -36,6 +39,7 @@ export function CategoryPerformance() {
return {
performance: rawData.performance.map((item: any) => ({
...item,
categoryPath: item.categoryPath || item.category,
revenue: Number(item.revenue) || 0,
profit: Number(item.profit) || 0,
growth: Number(item.growth) || 0,
@@ -43,10 +47,12 @@ export function CategoryPerformance() {
})),
distribution: rawData.distribution.map((item: any) => ({
...item,
categoryPath: item.categoryPath || item.category,
value: Number(item.value) || 0
})),
trends: rawData.trends.map((item: any) => ({
...item,
categoryPath: item.categoryPath || item.category,
sales: Number(item.sales) || 0
}))
};
@@ -63,6 +69,8 @@ export function CategoryPerformance() {
return <span className={color}>{value}</span>;
};

const getShortCategoryName = (path: string) => path.split(' > ').pop() || path;

return (
<div className="grid gap-4">
<div className="grid gap-4 md:grid-cols-2">
@@ -76,24 +84,34 @@ export function CategoryPerformance() {
<Pie
data={data.distribution}
dataKey="value"
nameKey="category"
nameKey="categoryPath"
cx="50%"
cy="50%"
outerRadius={100}
fill="#8884d8"
label={(entry) => entry.category}
label={({ categoryPath }) => getShortCategoryName(categoryPath)}
>
{data.distribution.map((entry, index) => (
<Cell
key={entry.category}
key={`${entry.category}-${entry.value}-${index}`}
fill={COLORS[index % COLORS.length]}
/>
))}
</Pie>
<Tooltip
formatter={(value: number) => [`$${value.toLocaleString()}`, 'Revenue']}
formatter={(value: number, name: string, props: any) => [
`$${value.toLocaleString()}`,
<div key="tooltip">
<div className="font-medium">Category Path:</div>
<div className="text-sm text-muted-foreground">{props.payload.categoryPath}</div>
<div className="mt-1">Revenue</div>
</div>
]}
/>
<Legend
formatter={(value) => getShortCategoryName(value)}
wrapperStyle={{ fontSize: '12px' }}
/>
<Legend />
</PieChart>
</ResponsiveContainer>
</CardContent>
@@ -106,10 +124,33 @@ export function CategoryPerformance() {
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<BarChart data={data.performance}>
<XAxis dataKey="category" />
<XAxis
dataKey="categoryPath"
tick={({ x, y, payload }) => (
<g transform={`translate(${x},${y})`}>
<text
x={0}
y={0}
dy={16}
textAnchor="end"
fill="#888888"
transform="rotate(-35)"
>
{getShortCategoryName(payload.value)}
</text>
</g>
)}
/>
<YAxis tickFormatter={(value) => `${value}%`} />
<Tooltip
formatter={(value: number) => [`${value.toFixed(1)}%`, 'Growth Rate']}
formatter={(value: number, name: string, props: any) => [
`${value.toFixed(1)}%`,
<div key="tooltip">
<div className="font-medium">Category Path:</div>
<div className="text-sm text-muted-foreground">{props.payload.categoryPath}</div>
<div className="mt-1">Growth Rate</div>
</div>
]}
/>
<Bar
dataKey="growth"
@@ -129,10 +170,13 @@ export function CategoryPerformance() {
<CardContent>
<div className="space-y-4">
{data.performance.map((category) => (
<div key={category.category} className="flex items-center">
<div key={`${category.category}-${category.revenue}`} className="flex items-center">
<div className="flex-1">
<p className="text-sm font-medium">{category.category}</p>
<p className="text-sm text-muted-foreground">
<div className="space-y-1">
<p className="text-sm font-medium">{getShortCategoryName(category.categoryPath)}</p>
<p className="text-xs text-muted-foreground">{category.categoryPath}</p>
</div>
<p className="text-sm text-muted-foreground mt-1">
{category.productCount} products
</p>
</div>

@@ -154,7 +154,7 @@ export function PriceAnalysis() {
<CardContent>
<div className="space-y-4">
{data.recommendations.map((item) => (
<div key={item.product} className="flex items-center">
<div key={`${item.product}-${item.currentPrice}`} className="flex items-center">
<div className="flex-1">
<p className="text-sm font-medium">{item.product}</p>
<p className="text-sm text-muted-foreground">

@@ -6,6 +6,7 @@ import config from '../../config';
interface ProfitData {
byCategory: {
category: string;
categoryPath: string; // Full hierarchy path
profitMargin: number;
revenue: number;
cost: number;
@@ -18,6 +19,8 @@ interface ProfitData {
}[];
topProducts: {
product: string;
category: string;
categoryPath: string; // Full hierarchy path
profitMargin: number;
revenue: number;
cost: number;
@@ -36,6 +39,7 @@ export function ProfitAnalysis() {
return {
byCategory: rawData.byCategory.map((item: any) => ({
...item,
categoryPath: item.categoryPath || item.category,
profitMargin: Number(item.profitMargin) || 0,
revenue: Number(item.revenue) || 0,
cost: Number(item.cost) || 0
@@ -48,6 +52,7 @@ export function ProfitAnalysis() {
})),
topProducts: rawData.topProducts.map((item: any) => ({
...item,
categoryPath: item.categoryPath || item.category,
profitMargin: Number(item.profitMargin) || 0,
revenue: Number(item.revenue) || 0,
cost: Number(item.cost) || 0
@@ -60,6 +65,8 @@ export function ProfitAnalysis() {
return <div>Loading profit analysis...</div>;
}

const getShortCategoryName = (path: string) => path.split(' > ').pop() || path;

return (
<div className="grid gap-4">
<div className="grid gap-4 md:grid-cols-2">
@@ -70,10 +77,33 @@ export function ProfitAnalysis() {
<CardContent>
<ResponsiveContainer width="100%" height={300}>
<BarChart data={data.byCategory}>
<XAxis dataKey="category" />
<XAxis
dataKey="categoryPath"
tick={({ x, y, payload }) => (
<g transform={`translate(${x},${y})`}>
<text
x={0}
y={0}
dy={16}
textAnchor="end"
fill="#888888"
transform="rotate(-35)"
>
{getShortCategoryName(payload.value)}
</text>
</g>
)}
/>
<YAxis tickFormatter={(value) => `${value}%`} />
<Tooltip
formatter={(value: number) => [`${value.toFixed(1)}%`, 'Profit Margin']}
formatter={(value: number, name: string, props: any) => [
`${value.toFixed(1)}%`,
<div key="tooltip">
<div className="font-medium">Category Path:</div>
<div className="text-sm text-muted-foreground">{props.payload.categoryPath}</div>
<div className="mt-1">Profit Margin</div>
</div>
]}
/>
<Bar
dataKey="profitMargin"
@@ -120,10 +150,14 @@ export function ProfitAnalysis() {
<CardContent>
<div className="space-y-4">
{data.topProducts.map((product) => (
<div key={product.product} className="flex items-center">
<div key={`${product.product}-${product.category}`} className="flex items-center">
<div className="flex-1">
<p className="text-sm font-medium">{product.product}</p>
<p className="text-sm text-muted-foreground">
<div className="text-xs text-muted-foreground space-y-1">
<p className="font-medium">Category:</p>
<p>{product.categoryPath}</p>
</div>
<p className="text-sm text-muted-foreground mt-1">
Revenue: ${product.revenue.toLocaleString()}
</p>
</div>

@@ -145,7 +145,7 @@ export function StockAnalysis() {
<CardContent>
<div className="space-y-4">
{data.criticalItems.map((item) => (
<div key={item.sku} className="flex items-center">
<div key={`${item.sku}-${item.product}`} className="flex items-center">
<div className="flex-1">
<div className="flex items-center gap-2">
<p className="text-sm font-medium">{item.product}</p>

@@ -131,7 +131,7 @@ export function VendorPerformance() {
<CardContent>
<div className="space-y-4">
{data.performance.map((vendor) => (
<div key={vendor.vendor} className="flex items-center">
<div key={`${vendor.vendor}-${vendor.salesVolume}`} className="flex items-center">
<div className="flex-1">
<p className="text-sm font-medium">{vendor.vendor}</p>
<p className="text-sm text-muted-foreground">

@@ -6,37 +6,46 @@ import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs"
import config from "@/config"
import { formatCurrency } from "@/lib/utils"

interface BestSellerProduct {
product_id: number
sku: string
title: string
units_sold: number
revenue: number
profit: number
growth_rate: number
interface Product {
pid: number;
sku: string;
title: string;
units_sold: number;
revenue: string;
profit: string;
}

interface Category {
cat_id: number;
name: string;
categoryPath: string;
units_sold: number;
revenue: string;
profit: string;
growth_rate: string;
}

interface BestSellerBrand {
brand: string
units_sold: number
revenue: number
profit: number
growth_rate: number
revenue: string
profit: string
growth_rate: string
}

interface BestSellerCategory {
category_id: number
name: string
units_sold: number
revenue: number
profit: number
growth_rate: number
cat_id: number;
name: string;
units_sold: number;
revenue: string;
profit: string;
growth_rate: string;
}

interface BestSellersData {
products: BestSellerProduct[]
products: Product[]
brands: BestSellerBrand[]
categories: BestSellerCategory[]
categories: Category[]
}

export function BestSellers() {
@@ -70,41 +79,29 @@ export function BestSellers() {
<Table>
<TableHeader>
<TableRow>
<TableHead className="w-[40%]">Product</TableHead>
<TableHead className="w-[15%] text-right">Sales</TableHead>
<TableHead className="w-[15%] text-right">Revenue</TableHead>
<TableHead className="w-[15%] text-right">Profit</TableHead>
<TableHead className="w-[15%] text-right">Growth</TableHead>
<TableHead>Product</TableHead>
<TableHead className="text-right">Units Sold</TableHead>
<TableHead className="text-right">Revenue</TableHead>
<TableHead className="text-right">Profit</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{data?.products.map((product) => (
<TableRow key={product.product_id}>
<TableCell className="w-[40%]">
<div>
<a
href={`https://backend.acherryontop.com/product/${product.product_id}`}
target="_blank"
rel="noopener noreferrer"
className="font-medium hover:underline"
>
{product.title}
</a>
<p className="text-sm text-muted-foreground">{product.sku}</p>
</div>
</TableCell>
<TableCell className="w-[15%] text-right">
{product.units_sold.toLocaleString()}
</TableCell>
<TableCell className="w-[15%] text-right">
{formatCurrency(product.revenue)}
</TableCell>
<TableCell className="w-[15%] text-right">
{formatCurrency(product.profit)}
</TableCell>
<TableCell className="w-[15%] text-right">
{product.growth_rate > 0 ? '+' : ''}{product.growth_rate.toFixed(1)}%
<TableRow key={product.pid}>
<TableCell>
<a
href={`https://backend.acherryontop.com/product/${product.pid}`}
target="_blank"
rel="noopener noreferrer"
className="hover:underline"
>
{product.title}
</a>
<div className="text-sm text-muted-foreground">{product.sku}</div>
</TableCell>
<TableCell className="text-right">{product.units_sold}</TableCell>
<TableCell className="text-right">{formatCurrency(Number(product.revenue))}</TableCell>
<TableCell className="text-right">{formatCurrency(Number(product.profit))}</TableCell>
</TableRow>
))}
</TableBody>
@@ -134,13 +131,13 @@ export function BestSellers() {
{brand.units_sold.toLocaleString()}
</TableCell>
<TableCell className="w-[15%] text-right">
{formatCurrency(brand.revenue)}
{formatCurrency(Number(brand.revenue))}
</TableCell>
<TableCell className="w-[15%] text-right">
{formatCurrency(brand.profit)}
{formatCurrency(Number(brand.profit))}
</TableCell>
<TableCell className="w-[15%] text-right">
{brand.growth_rate > 0 ? '+' : ''}{brand.growth_rate.toFixed(1)}%
{Number(brand.growth_rate) > 0 ? '+' : ''}{Number(brand.growth_rate).toFixed(1)}%
</TableCell>
</TableRow>
))}
@@ -154,31 +151,26 @@ export function BestSellers() {
<Table>
<TableHeader>
<TableRow>
<TableHead className="w-[40%]">Category</TableHead>
<TableHead className="w-[15%] text-right">Sales</TableHead>
<TableHead className="w-[15%] text-right">Revenue</TableHead>
<TableHead className="w-[15%] text-right">Profit</TableHead>
<TableHead className="w-[15%] text-right">Growth</TableHead>
<TableHead>Category</TableHead>
<TableHead className="text-right">Units Sold</TableHead>
<TableHead className="text-right">Revenue</TableHead>
<TableHead className="text-right">Profit</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{data?.categories.map((category) => (
<TableRow key={category.category_id}>
<TableCell className="w-[40%]">
<p className="font-medium">{category.name}</p>
</TableCell>
<TableCell className="w-[15%] text-right">
{category.units_sold.toLocaleString()}
</TableCell>
<TableCell className="w-[15%] text-right">
{formatCurrency(category.revenue)}
</TableCell>
<TableCell className="w-[15%] text-right">
{formatCurrency(category.profit)}
</TableCell>
<TableCell className="w-[15%] text-right">
{category.growth_rate > 0 ? '+' : ''}{category.growth_rate.toFixed(1)}%
<TableRow key={category.cat_id}>
<TableCell>
<div className="font-medium">{category.name}</div>
{category.categoryPath && (
<div className="text-sm text-muted-foreground">
{category.categoryPath}
</div>
)}
</TableCell>
<TableCell className="text-right">{category.units_sold}</TableCell>
<TableCell className="text-right">{formatCurrency(Number(category.revenue))}</TableCell>
<TableCell className="text-right">{formatCurrency(Number(category.profit))}</TableCell>
</TableRow>
))}
</TableBody>

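A recurring change across these hunks is retyping revenue/profit/growth fields from `number` to `string` and wrapping reads in `Number(...)`. A sketch of why, under the assumption that the API serializes SQL DECIMAL/NUMERIC aggregates as strings (the default behavior of drivers like node-postgres and mysql2); the `formatCurrency` stand-in here is an assumed shape, not the actual `@/lib/utils` helper:

```javascript
// Hypothetical API payload: DECIMAL aggregates arrive as strings, counts as numbers.
const product = { pid: 101, units_sold: 42, revenue: "1234.50", profit: "310.25" };

// Calling number methods on the raw field would throw:
// product.revenue.toFixed(2) -> TypeError, strings have no toFixed.

// Coerce once at the read site, as the diff does with Number(...):
const revenue = Number(product.revenue);
const profit = Number(product.profit);

// Stand-in for the dashboard's formatCurrency helper (assumed implementation).
const formatCurrency = (n) =>
  new Intl.NumberFormat("en-US", { style: "currency", currency: "USD" }).format(n);

console.log(formatCurrency(revenue)); // "$1,234.50"

// Number("garbage") is NaN, which is why several hunks guard with `|| 0`.
console.log(Number.isNaN(Number("not-a-number"))); // true
```

Coercing at the display boundary keeps the interfaces honest about what the wire actually carries, instead of pretending the driver returns numbers.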
@@ -11,18 +11,18 @@ import { DateRangePicker } from "@/components/ui/date-range-picker-narrow"

interface ForecastData {
forecastSales: number
forecastRevenue: number
forecastRevenue: string
confidenceLevel: number
dailyForecasts: {
date: string
units: number
revenue: number
revenue: string
confidence: number
}[]
categoryForecasts: {
category: string
units: number
revenue: number
revenue: string
confidence: number
}[]
}
@@ -86,7 +86,7 @@ export function ForecastMetrics() {
<DollarSign className="h-4 w-4 text-muted-foreground" />
<p className="text-sm font-medium text-muted-foreground">Forecast Revenue</p>
</div>
<p className="text-lg font-bold">{formatCurrency(data?.forecastRevenue || 0)}</p>
<p className="text-lg font-bold">{formatCurrency(Number(data?.forecastRevenue) || 0)}</p>
</div>
</div>

@@ -108,7 +108,7 @@ export function ForecastMetrics() {
tick={false}
/>
<Tooltip
formatter={(value: number) => [formatCurrency(value), "Revenue"]}
formatter={(value: string) => [formatCurrency(Number(value)), "Revenue"]}
labelFormatter={(date) => format(new Date(date), 'MMM d, yyyy')}
/>
<Area

@@ -13,11 +13,11 @@ interface InventoryMetrics {
topVendors: {
vendor: string;
productCount: number;
averageStockLevel: number;
averageStockLevel: string;
}[];
stockTurnover: {
category: string;
rate: number;
rate: string;
}[];
}

@@ -70,7 +70,7 @@ export function InventoryStats() {
<BarChart data={data?.stockTurnover}>
<XAxis dataKey="category" />
<YAxis />
<Tooltip />
<Tooltip formatter={(value: string) => [Number(value).toFixed(2), "Rate"]} />
<Bar dataKey="rate" name="Turnover Rate" fill="#60a5fa" />
</BarChart>
</ResponsiveContainer>
@@ -93,7 +93,7 @@ export function InventoryStats() {
</div>
<div className="ml-4 text-right">
<p className="text-sm font-medium">
Avg. Stock: {vendor.averageStockLevel.toFixed(0)}
Avg. Stock: {Number(vendor.averageStockLevel).toFixed(0)}
</p>
</div>
</div>

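The turnover chart plots a per-category `rate` that arrives as a string and is rendered with `Number(value).toFixed(2)`. For context, a sketch of the standard inventory-turnover formula that a field like this conventionally reflects; the server's actual query is not shown in this diff, so both the formula and the sample figures are assumptions:

```javascript
// Illustrative only: conventional inventory turnover for one category.
const category = { unitsSold: 900, avgStockLevel: 150 };

// Turnover = units sold in the period / average units on hand.
const rate = category.unitsSold / category.avgStockLevel;

console.log(rate); // 6
// The component renders the string field the same way:
console.log(rate.toFixed(2)); // "6.00"
```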
@@ -12,19 +12,20 @@ import { Badge } from "@/components/ui/badge"
import { AlertCircle, AlertTriangle } from "lucide-react"
import config from "@/config"

interface LowStockProduct {
product_id: number
SKU: string
title: string
stock_quantity: number
reorder_qty: number
days_of_inventory: number
stock_status: "Critical" | "Reorder"
daily_sales_avg: number
interface Product {
pid: number;
sku: string;
title: string;
stock_quantity: number;
daily_sales_avg: string;
days_of_inventory: string;
reorder_qty: number;
last_purchase_date: string | null;
lead_time_status: string;
}

export function LowStockAlerts() {
const { data: products } = useQuery<LowStockProduct[]>({
const { data: products } = useQuery<Product[]>({
queryKey: ["low-stock"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/low-stock/products`)
@@ -45,35 +46,37 @@ export function LowStockAlerts() {
<Table>
<TableHeader>
<TableRow>
<TableHead>SKU</TableHead>
<TableHead>Product</TableHead>
<TableHead className="text-right">Stock</TableHead>
<TableHead className="text-right">Status</TableHead>
<TableHead className="text-right">Daily Sales</TableHead>
<TableHead className="text-right">Days Left</TableHead>
<TableHead className="text-right">Reorder Qty</TableHead>
<TableHead>Last Purchase</TableHead>
<TableHead>Lead Time</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{products?.map((product) => (
<TableRow key={product.product_id}>
<TableCell className="font-medium">{product.SKU}</TableCell>
<TableCell>{product.title}</TableCell>
<TableCell className="text-right">
{product.stock_quantity} / {product.reorder_qty}
</TableCell>
<TableCell className="text-right">
<Badge
variant="outline"
className={
product.stock_status === "Critical"
? "border-destructive text-destructive"
: "border-warning text-warning"
}
<TableRow key={product.pid}>
<TableCell>
<a
href={`https://backend.acherryontop.com/product/${product.pid}`}
target="_blank"
rel="noopener noreferrer"
className="hover:underline"
>
{product.stock_status === "Critical" ? (
<AlertCircle className="mr-1 h-3 w-3" />
) : (
<AlertTriangle className="mr-1 h-3 w-3" />
)}
{product.stock_status}
{product.title}
</a>
<div className="text-sm text-muted-foreground">{product.sku}</div>
</TableCell>
<TableCell className="text-right">{product.stock_quantity}</TableCell>
<TableCell className="text-right">{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
<TableCell className="text-right">{Number(product.days_of_inventory).toFixed(1)}</TableCell>
<TableCell className="text-right">{product.reorder_qty}</TableCell>
<TableCell>{product.last_purchase_date ? formatDate(product.last_purchase_date) : '-'}</TableCell>
<TableCell>
<Badge variant={getLeadTimeVariant(product.lead_time_status)}>
{product.lead_time_status}
</Badge>
</TableCell>
</TableRow>

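The reworked low-stock table surfaces `daily_sales_avg`, `days_of_inventory`, and `reorder_qty` side by side. A sketch of one plausible way those fields relate; the dashboard's server-side query is not shown in this diff, so the formulas, the 7-day buffer, and the sample figures are all illustrative assumptions:

```javascript
// Illustrative only: stock_quantity and daily_sales_avg would come from the API.
const product = { stock_quantity: 24, daily_sales_avg: 3, lead_time_days: 14 };

// Days of inventory left at the current sell-through rate.
const daysOfInventory = product.stock_quantity / product.daily_sales_avg;

// A simple reorder quantity: cover the lead time plus a 7-day buffer,
// net of what is already on the shelf (floored at zero).
const reorderQty = Math.max(
  0,
  Math.ceil(product.daily_sales_avg * (product.lead_time_days + 7) - product.stock_quantity)
);

console.log(daysOfInventory.toFixed(1)); // "8.0" -- matches the table's toFixed(1) rendering
console.log(reorderQty); // 39
```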
@@ -5,13 +5,14 @@ import config from "@/config"
import { formatCurrency } from "@/lib/utils"
import { ClipboardList, AlertCircle, Layers, DollarSign, ShoppingCart } from "lucide-react" // Importing icons
import { useState } from "react"
import { PurchaseOrderStatus, ReceivingStatus } from "@/types/status-codes"

interface PurchaseMetricsData {
activePurchaseOrders: number
overduePurchaseOrders: number
onOrderUnits: number
onOrderCost: number
onOrderRetail: number
activePurchaseOrders: number // Orders that are not canceled, done, or fully received
overduePurchaseOrders: number // Orders past their expected delivery date
onOrderUnits: number // Total units across all active orders
onOrderCost: number // Total cost across all active orders
onOrderRetail: number // Total retail value across all active orders
vendorOrders: {
vendor: string
orders: number

@@ -12,13 +12,13 @@ import { DateRangePicker } from "@/components/ui/date-range-picker-narrow"
interface SalesData {
totalOrders: number
totalUnitsSold: number
totalCogs: number
totalRevenue: number
totalCogs: string
totalRevenue: string
dailySales: {
date: string
units: number
revenue: number
cogs: number
revenue: string
cogs: string
}[]
}

@@ -78,14 +78,14 @@ export function SalesMetrics() {
<DollarSign className="h-4 w-4 text-muted-foreground" />
<p className="text-sm font-medium text-muted-foreground">Cost of Goods</p>
</div>
<p className="text-lg font-bold">{formatCurrency(data?.totalCogs || 0)}</p>
<p className="text-lg font-bold">{formatCurrency(Number(data?.totalCogs) || 0)}</p>
</div>
<div className="flex items-baseline justify-between">
<div className="flex items-center gap-2">
<ShoppingCart className="h-4 w-4 text-muted-foreground" />
<p className="text-sm font-medium text-muted-foreground">Revenue</p>
</div>
<p className="text-lg font-bold">{formatCurrency(data?.totalRevenue || 0)}</p>
<p className="text-lg font-bold">{formatCurrency(Number(data?.totalRevenue) || 0)}</p>
</div>
</div>

@@ -107,7 +107,7 @@ export function SalesMetrics() {
tick={false}
/>
<Tooltip
formatter={(value: number) => [formatCurrency(value), "Revenue"]}
formatter={(value: string) => [formatCurrency(Number(value)), "Revenue"]}
labelFormatter={(date) => format(new Date(date), 'MMM d, yyyy')}
/>
<Area

@@ -10,14 +10,14 @@ interface StockMetricsData {
totalProducts: number
productsInStock: number
totalStockUnits: number
totalStockCost: number
totalStockRetail: number
totalStockCost: string
totalStockRetail: string
brandStock: {
brand: string
variants: number
units: number
cost: number
retail: number
cost: string
retail: string
}[]
}

@@ -91,7 +91,7 @@ const renderActiveShape = (props: any) => {
fill="#000000"
className="text-base font-medium"
>
{formatCurrency(retail)}
{formatCurrency(Number(retail))}
</text>
</g>
);
@@ -154,14 +154,14 @@ export function StockMetrics() {
<DollarSign className="h-4 w-4 text-muted-foreground" />
<p className="text-sm font-medium text-muted-foreground">Stock Cost</p>
</div>
<p className="text-lg font-bold">{formatCurrency(data?.totalStockCost || 0)}</p>
<p className="text-lg font-bold">{formatCurrency(Number(data?.totalStockCost) || 0)}</p>
</div>
<div className="flex items-baseline justify-between">
<div className="flex items-center gap-2">
<ShoppingCart className="h-4 w-4 text-muted-foreground" />
<p className="text-sm font-medium text-muted-foreground">Stock Retail</p>
</div>
<p className="text-lg font-bold">{formatCurrency(Number(data?.totalStockRetail) || 0)}</p>
</div>
</div>
</div>

@@ -5,18 +5,18 @@ import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@
import config from "@/config"
import { formatCurrency } from "@/lib/utils"

interface OverstockedProduct {
product_id: number
SKU: string
title: string
stock_quantity: number
overstocked_amt: number
excess_cost: number
excess_retail: number
interface Product {
pid: number;
sku: string;
title: string;
stock_quantity: number;
overstocked_amt: number;
excess_cost: number;
excess_retail: number;
}

export function TopOverstockedProducts() {
const { data } = useQuery<OverstockedProduct[]>({
const { data } = useQuery<Product[]>({
queryKey: ["top-overstocked-products"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/dashboard/overstock/products?limit=50`)
@@ -38,40 +38,30 @@ export function TopOverstockedProducts() {
<TableHeader>
<TableRow>
<TableHead>Product</TableHead>
<TableHead className="text-right">Current Stock</TableHead>
<TableHead className="text-right">Overstock Amt</TableHead>
<TableHead className="text-right">Overstock Cost</TableHead>
<TableHead className="text-right">Overstock Retail</TableHead>
<TableHead className="text-right">Stock</TableHead>
<TableHead className="text-right">Excess</TableHead>
<TableHead className="text-right">Cost</TableHead>
<TableHead className="text-right">Retail</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{data?.map((product) => (
<TableRow key={product.product_id}>
<TableRow key={product.pid}>
<TableCell>
<div>
<a
href={`https://backend.acherryontop.com/product/${product.product_id}`}
target="_blank"
rel="noopener noreferrer"
className="font-medium hover:underline"
>
{product.title}
</a>
<p className="text-sm text-muted-foreground">{product.SKU}</p>
</div>
</TableCell>
<TableCell className="text-right">
{product.stock_quantity.toLocaleString()}
</TableCell>
<TableCell className="text-right">
{product.overstocked_amt.toLocaleString()}
</TableCell>
<TableCell className="text-right">
{formatCurrency(product.excess_cost)}
</TableCell>
<TableCell className="text-right">
{formatCurrency(product.excess_retail)}
<a
href={`https://backend.acherryontop.com/product/${product.pid}`}
target="_blank"
rel="noopener noreferrer"
className="hover:underline"
>
{product.title}
</a>
<div className="text-sm text-muted-foreground">{product.sku}</div>
</TableCell>
<TableCell className="text-right">{product.stock_quantity}</TableCell>
<TableCell className="text-right">{product.overstocked_amt}</TableCell>
<TableCell className="text-right">{formatCurrency(product.excess_cost)}</TableCell>
<TableCell className="text-right">{formatCurrency(product.excess_retail)}</TableCell>
</TableRow>
))}
</TableBody>

@@ -3,20 +3,19 @@ import { CardHeader, CardTitle, CardContent } from "@/components/ui/card"
|
||||
import { ScrollArea } from "@/components/ui/scroll-area"
|
||||
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table"
|
||||
import config from "@/config"
|
||||
import { formatCurrency } from "@/lib/utils"
|
||||
|
||||
interface ReplenishProduct {
|
||||
product_id: number
|
||||
SKU: string
|
||||
title: string
|
||||
current_stock: number
|
||||
replenish_qty: number
|
||||
replenish_cost: number
|
||||
replenish_retail: number
|
||||
interface Product {
|
||||
pid: number;
|
||||
sku: string;
|
||||
title: string;
|
||||
stock_quantity: number;
|
||||
daily_sales_avg: string;
|
||||
reorder_qty: number;
|
||||
last_purchase_date: string | null;
|
||||
}
|
||||
|
||||
export function TopReplenishProducts() {
|
||||
const { data } = useQuery<ReplenishProduct[]>({
|
||||
const { data } = useQuery<Product[]>({
|
||||
queryKey: ["top-replenish-products"],
|
||||
queryFn: async () => {
|
||||
const response = await fetch(`${config.apiUrl}/dashboard/replenish/products?limit=50`)
|
||||
@@ -39,39 +38,29 @@ export function TopReplenishProducts() {
|
||||
<TableRow>
|
||||
<TableHead>Product</TableHead>
|
||||
<TableHead className="text-right">Stock</TableHead>
|
||||
<TableHead className="text-right">Replenish</TableHead>
|
||||
<TableHead className="text-right">Cost</TableHead>
|
||||
<TableHead className="text-right">Retail</TableHead>
|
||||
<TableHead className="text-right">Daily Sales</TableHead>
|
||||
<TableHead className="text-right">Reorder Qty</TableHead>
|
||||
<TableHead>Last Purchase</TableHead>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{data?.map((product) => (
|
||||
<TableRow key={product.product_id}>
|
||||
<TableRow key={product.pid}>
|
||||
<TableCell>
|
||||
<div>
|
||||
<a
|
||||
href={`https://backend.acherryontop.com/product/${product.product_id}`}
|
||||
target="_blank"
|
||||
rel="noopener noreferrer"
|
||||
className="font-medium hover:underline"
|
||||
>
|
||||
{product.title}
|
||||
</a>
|
||||
<p className="text-sm text-muted-foreground">{product.SKU}</p>
|
||||
</div>
|
||||
</TableCell>
|
||||
<TableCell className="text-right">
|
||||
{product.current_stock.toLocaleString()}
|
||||
</TableCell>
|
||||
<TableCell className="text-right">
|
||||
{product.replenish_qty.toLocaleString()}
|
||||
</TableCell>
|
||||
<TableCell className="text-right">
|
||||
{formatCurrency(product.replenish_cost)}
|
||||
</TableCell>
|
||||
<TableCell className="text-right">
|
||||
{formatCurrency(product.replenish_retail)}
|
||||
<a
|
||||
href={`https://backend.acherryontop.com/product/${product.pid}`}
|
||||
target="_blank"
|
||||
rel="noopener noreferrer"
|
||||
className="hover:underline"
|
||||
>
|
||||
{product.title}
|
||||
</a>
|
||||
<div className="text-sm text-muted-foreground">{product.sku}</div>
|
||||
</TableCell>
|
||||
<TableCell className="text-right">{product.stock_quantity}</TableCell>
|
||||
<TableCell className="text-right">{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
|
||||
<TableCell className="text-right">{product.reorder_qty}</TableCell>
|
||||
<TableCell>{product.last_purchase_date ? product.last_purchase_date : '-'}</TableCell>
|
||||
</TableRow>
|
||||
))}
|
||||
</TableBody>
|
||||
|
||||
@@ -11,18 +11,18 @@ import {
import { TrendingUp, TrendingDown } from "lucide-react"
import config from "@/config"

interface TrendingProduct {
product_id: number
sku: string
title: string
daily_sales_avg: string
weekly_sales_avg: string
growth_rate: string
total_revenue: string
interface Product {
pid: number;
sku: string;
title: string;
daily_sales_avg: string;
weekly_sales_avg: string;
growth_rate: string;
total_revenue: string;
}

export function TrendingProducts() {
const { data: products } = useQuery<TrendingProduct[]>({
const { data: products } = useQuery<Product[]>({
queryKey: ["trending-products"],
queryFn: async () => {
const response = await fetch(`${config.apiUrl}/products/trending`)
@@ -33,7 +33,6 @@ export function TrendingProducts() {
},
})

const formatPercent = (value: number) =>
new Intl.NumberFormat("en-US", {
style: "percent",
@@ -42,6 +41,14 @@ export function TrendingProducts() {
signDisplay: "exceptZero",
}).format(value / 100)

const formatCurrency = (value: number) =>
new Intl.NumberFormat("en-US", {
style: "currency",
currency: "USD",
minimumFractionDigits: 2,
maximumFractionDigits: 2,
}).format(value)

return (
<>
<CardHeader>
@@ -59,7 +66,7 @@ export function TrendingProducts() {
</TableHeader>
<TableBody>
{products?.map((product) => (
<TableRow key={product.product_id}>
<TableRow key={product.pid}>
<TableCell className="font-medium">
<div className="flex flex-col">
<span className="font-medium">{product.title}</span>
@@ -68,20 +75,20 @@ export function TrendingProducts() {
</span>
</div>
</TableCell>
<TableCell>{parseFloat(product.daily_sales_avg).toFixed(1)}</TableCell>
<TableCell>{Number(product.daily_sales_avg).toFixed(1)}</TableCell>
<TableCell className="text-right">
<div className="flex items-center justify-end gap-1">
{parseFloat(product.growth_rate) > 0 ? (
{Number(product.growth_rate) > 0 ? (
<TrendingUp className="h-4 w-4 text-success" />
) : (
<TrendingDown className="h-4 w-4 text-destructive" />
)}
<span
className={
parseFloat(product.growth_rate) > 0 ? "text-success" : "text-destructive"
Number(product.growth_rate) > 0 ? "text-success" : "text-destructive"
}
>
{formatPercent(parseFloat(product.growth_rate))}
{formatPercent(Number(product.growth_rate))}
</span>
</div>
</TableCell>

@@ -3,24 +3,27 @@ import { ArrowUpDown, ChevronDown, ChevronRight } from "lucide-react";
import { Button } from "@/components/ui/button";
import { ScrollArea } from "@/components/ui/scroll-area";
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";
interface ProductDetail {
product_id: string;
name: string;

interface Product {
pid: string;
sku: string;
title: string;
stock_quantity: number;
total_sold: number;
avg_price: number;
first_received_date: string;
daily_sales_avg: number;
forecast_units: number;
forecast_revenue: number;
confidence_level: number;
}

export interface ForecastItem {
category: string;
categoryPath: string;
avgDailySales: number;
totalSold: number;
numProducts: number;
avgPrice: number;
avgTotalSold: number;
products?: ProductDetail[];
products?: Product[];
}

export const columns: ColumnDef<ForecastItem>[] = [
@@ -42,6 +45,16 @@ export const columns: ColumnDef<ForecastItem>[] = [
{
accessorKey: "category",
header: "Category",
cell: ({ row }) => (
<div>
<div className="font-medium">{row.original.category}</div>
{row.original.categoryPath && (
<div className="text-sm text-muted-foreground">
{row.original.categoryPath}
</div>
)}
</div>
),
},
{
accessorKey: "avgDailySales",
@@ -147,23 +160,33 @@ export const renderSubComponent = ({ row }: { row: any }) => {
<Table>
<TableHeader>
<TableRow>
<TableHead>Product Name</TableHead>
<TableHead>SKU</TableHead>
<TableHead>First Received</TableHead>
<TableHead>Stock Quantity</TableHead>
<TableHead>Total Sold</TableHead>
<TableHead>Average Price</TableHead>
<TableHead>Product</TableHead>
<TableHead className="text-right">Stock</TableHead>
<TableHead className="text-right">Daily Sales</TableHead>
<TableHead className="text-right">Forecast Units</TableHead>
<TableHead className="text-right">Forecast Revenue</TableHead>
<TableHead className="text-right">Confidence</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{products.map((product: ProductDetail) => (
<TableRow key={product.product_id}>
<TableCell>{product.name}</TableCell>
<TableCell>{product.sku}</TableCell>
<TableCell>{product.first_received_date}</TableCell>
<TableCell>{product.stock_quantity.toLocaleString()}</TableCell>
<TableCell>{product.total_sold.toLocaleString()}</TableCell>
<TableCell>${product.avg_price.toFixed(2)}</TableCell>
{products.map((product) => (
<TableRow key={product.pid}>
<TableCell>
<a
href={`https://backend.acherryontop.com/product/${product.pid}`}
target="_blank"
rel="noopener noreferrer"
className="hover:underline"
>
{product.title}
</a>
<div className="text-sm text-muted-foreground">{product.sku}</div>
</TableCell>
<TableCell className="text-right">{product.stock_quantity}</TableCell>
<TableCell className="text-right">{product.daily_sales_avg.toFixed(1)}</TableCell>
<TableCell className="text-right">{product.forecast_units.toFixed(1)}</TableCell>
<TableCell className="text-right">{product.forecast_revenue.toFixed(2)}</TableCell>
<TableCell className="text-right">{product.confidence_level.toFixed(1)}%</TableCell>
</TableRow>
))}
</TableBody>

@@ -10,7 +10,7 @@ import { LineChart, Line, XAxis, YAxis, CartesianGrid, Tooltip, ResponsiveContai
import config from "@/config";

interface Product {
product_id: number;
pid: number;
title: string;
SKU: string;
barcode: string;
@@ -38,7 +38,7 @@ interface Product {
// Vendor info
vendor: string;
vendor_reference: string;
brand: string;
brand: string | 'Unbranded';

// URLs
permalink: string;
@@ -123,6 +123,8 @@ interface Product {
notes: string;
lead_time_days: number | null;
}>;

category_paths?: Record<string, string>;
}

interface ProductDetailProps {
@@ -205,8 +207,8 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
</div>
)}
<div>
<h2 className="text-xl font-semibold">{product?.title || 'Loading...'}</h2>
<p className="text-sm text-muted-foreground">{product?.SKU || ''}</p>
<VaulDrawer.Title className="text-xl font-semibold">{product?.title || 'Loading...'}</VaulDrawer.Title>
<VaulDrawer.Description className="text-sm text-muted-foreground">{product?.SKU || ''}</VaulDrawer.Description>
</div>
</div>
<Button variant="ghost" size="icon" onClick={onClose}>
@@ -255,22 +257,28 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
</div>
<div>
<dt className="text-sm text-muted-foreground">Categories</dt>
<dd className="flex flex-wrap gap-2">
{product?.categories?.map(category => (
<span key={category} className="inline-flex items-center rounded-md bg-muted px-2 py-1 text-xs font-medium ring-1 ring-inset ring-muted">
{category}
</span>
)) || "N/A"}
<dd className="flex flex-col gap-2">
{product?.category_paths ?
Object.entries(product.category_paths).map(([key, fullPath], index) => {
const [, leafCategory] = key.split(':');
return (
<div key={key} className="flex flex-col">
<span className="inline-flex items-center rounded-md bg-muted px-2 py-1 text-xs font-medium ring-1 ring-inset ring-muted">
{leafCategory}
</span>
<span className="text-xs text-muted-foreground ml-2 mt-1">
{fullPath}
</span>
</div>
);
})
: "N/A"}
</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Tags</dt>
<dd className="flex flex-wrap gap-2">
{product?.tags?.map(tag => (
<span key={tag} className="inline-flex items-center rounded-md bg-muted px-2 py-1 text-xs font-medium ring-1 ring-inset ring-muted">
{tag}
</span>
)) || "N/A"}
N/A
</dd>
</div>
</dl>
@@ -307,11 +315,11 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
</div>
<div>
<dt className="text-sm text-muted-foreground">Status</dt>
<dd>{product?.metrics?.stock_status}</dd>
<dd>{product?.stock_status || "N/A"}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Days of Stock</dt>
<dd>{product?.metrics?.days_of_inventory} days</dd>
<dd>{product?.days_of_inventory || 0} days</dd>
</div>
</dl>
</Card>
@@ -321,15 +329,15 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
<dl className="space-y-2">
<div>
<dt className="text-sm text-muted-foreground">Daily Sales</dt>
<dd>{product?.metrics?.daily_sales_avg?.toFixed(1)} units</dd>
<dd>{product?.daily_sales_avg?.toFixed(1) || "0.0"} units</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Weekly Sales</dt>
<dd>{product?.metrics?.weekly_sales_avg?.toFixed(1)} units</dd>
<dd>{product?.weekly_sales_avg?.toFixed(1) || "0.0"} units</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Monthly Sales</dt>
<dd>{product?.metrics?.monthly_sales_avg?.toFixed(1)} units</dd>
<dd>{product?.monthly_sales_avg?.toFixed(1) || "0.0"} units</dd>
</div>
</dl>
</Card>
@@ -356,19 +364,19 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
<dl className="space-y-2">
<div>
<dt className="text-sm text-muted-foreground">Total Revenue</dt>
<dd>${formatPrice(product?.metrics.total_revenue)}</dd>
<dd>${formatPrice(product?.total_revenue)}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Gross Profit</dt>
<dd>${formatPrice(product?.metrics.gross_profit)}</dd>
<dd>${formatPrice(product?.gross_profit)}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Margin</dt>
<dd>{product?.metrics.avg_margin_percent.toFixed(2)}%</dd>
<dd>{product?.avg_margin_percent?.toFixed(2) || "0.00"}%</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">GMROI</dt>
<dd>{product?.metrics.gmroi.toFixed(2)}</dd>
<dd>{product?.gmroi?.toFixed(2) || "0.00"}</dd>
</div>
</dl>
</Card>
@@ -378,15 +386,15 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
<dl className="space-y-2">
<div>
<dt className="text-sm text-muted-foreground">Current Lead Time</dt>
<dd>{product?.metrics.current_lead_time}</dd>
<dd>{product?.current_lead_time || "N/A"}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Target Lead Time</dt>
<dd>{product?.metrics.target_lead_time}</dd>
<dd>{product?.target_lead_time || "N/A"}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Lead Time Status</dt>
<dd>{product?.metrics.lead_time_status}</dd>
<dd>{product?.lead_time_status || "N/A"}</dd>
</div>
</dl>
</Card>
@@ -408,11 +416,11 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
</div>
<div>
<dt className="text-sm text-muted-foreground">Days of Inventory</dt>
<dd className="text-2xl font-semibold">{product?.metrics?.days_of_inventory || 0}</dd>
<dd className="text-2xl font-semibold">{product?.days_of_inventory || 0}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Status</dt>
<dd className="text-2xl font-semibold">{product?.metrics?.stock_status || "N/A"}</dd>
<dd className="text-2xl font-semibold">{product?.stock_status || "N/A"}</dd>
</div>
</dl>
</Card>
@@ -422,15 +430,15 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
<dl className="grid grid-cols-3 gap-4">
<div>
<dt className="text-sm text-muted-foreground">Reorder Point</dt>
<dd>{product?.metrics?.reorder_point || 0}</dd>
<dd>{product?.reorder_point || 0}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Safety Stock</dt>
<dd>{product?.metrics?.safety_stock || 0}</dd>
<dd>{product?.safety_stock || 0}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">ABC Class</dt>
<dd>{product?.metrics?.abc_class || "N/A"}</dd>
<dd>{product?.abc_class || "N/A"}</dd>
</div>
</dl>
</Card>
@@ -551,15 +559,15 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
<dl className="grid grid-cols-3 gap-4">
<div>
<dt className="text-sm text-muted-foreground">Gross Profit</dt>
<dd className="text-2xl font-semibold">${formatPrice(product?.metrics.gross_profit)}</dd>
<dd className="text-2xl font-semibold">${formatPrice(product?.gross_profit)}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">GMROI</dt>
<dd className="text-2xl font-semibold">{product?.metrics.gmroi.toFixed(2)}</dd>
<dd className="text-2xl font-semibold">{product?.gmroi?.toFixed(2) || "0.00"}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Margin %</dt>
<dd className="text-2xl font-semibold">{product?.metrics.avg_margin_percent.toFixed(2)}%</dd>
<dd className="text-2xl font-semibold">{product?.avg_margin_percent?.toFixed(2) || "0.00"}%</dd>
</div>
</dl>
</Card>
@@ -569,7 +577,7 @@ export function ProductDetail({ productId, onClose }: ProductDetailProps) {
<dl className="grid grid-cols-2 gap-4">
<div>
<dt className="text-sm text-muted-foreground">Cost of Goods Sold</dt>
<dd>${formatPrice(product?.metrics.cost_of_goods_sold)}</dd>
<dd>${formatPrice(product?.cost_of_goods_sold)}</dd>
</div>
<div>
<dt className="text-sm text-muted-foreground">Landing Cost</dt>

@@ -24,7 +24,7 @@ type FilterValue = string | number | boolean;
type ComparisonOperator = "=" | ">" | ">=" | "<" | "<=" | "between";

interface FilterValueWithOperator {
value: FilterValue | [number, number];
value: FilterValue | [string, string];
operator: ComparisonOperator;
}

@@ -317,18 +317,32 @@ export function ProductFilters({
});
}, []);

const handleApplyFilter = (value: FilterValue | [number, number]) => {
const handleApplyFilter = (value: FilterValue | [string, string]) => {
if (!selectedFilter) return;

const newFilters = {
...activeFilters,
[selectedFilter.id]: {
value,
operator: selectedOperator,
},
};
let filterValue: ActiveFilterValue;

if (selectedFilter.type === "number") {
if (selectedOperator === "between" && Array.isArray(value)) {
filterValue = {
value: [value[0].toString(), value[1].toString()],
operator: selectedOperator,
};
} else {
filterValue = {
value: value.toString(),
operator: selectedOperator,
};
}
} else {
filterValue = value;
}

onFilterChange({
...activeFilters,
[selectedFilter.id]: filterValue,
});

onFilterChange(newFilters as Record<string, ActiveFilterValue>);
handlePopoverClose();
};

@@ -394,38 +408,14 @@ export function ProductFilters({

const getFilterDisplayValue = (filter: ActiveFilter) => {
const filterValue = activeFilters[filter.id];
const filterOption = filterOptions.find((opt) => opt.id === filter.id);

// For between ranges
if (Array.isArray(filterValue)) {
return `${filter.label} between ${filterValue[0]} and ${filterValue[1]}`;
if (typeof filter.value === "object" && "operator" in filter.value) {
const { operator, value } = filter.value;
if (Array.isArray(value)) {
return `${operator} ${value[0]} and ${value[1]}`;
}
return `${operator} ${value}`;
}

// For direct selections (select type) or text search
if (
filterOption?.type === "select" ||
filterOption?.type === "text" ||
typeof filterValue !== "object"
) {
const value =
typeof filterValue === "object" ? filterValue.value : filterValue;
return `${filter.label}: ${value}`;
}

// For numeric filters with operators
const operator = filterValue.operator;
const value = filterValue.value;
const operatorDisplay = {
"=": "=",
">": ">",
">=": "≥",
"<": "<",
"<=": "≤",
between: "between",
}[operator];

return `${filter.label} ${operatorDisplay} ${value}`;
return filter.value.toString();
};

return (

@@ -230,7 +230,7 @@ export function ProductTable({
return (
<div className="flex flex-wrap gap-1">
{Array.from(new Set(value as string[])).map((category) => (
<Badge key={`${product.product_id}-${category}`} variant="outline">{category}</Badge>
<Badge key={`${product.pid}-${category}`} variant="outline">{category}</Badge>
)) || '-'}
</div>
);
@@ -261,6 +261,11 @@ export function ProductTable({
return columnDef.format(num);
}
}
// If the value is already a number, format it directly
if (typeof value === 'number') {
return columnDef.format(value);
}
// For other formats (e.g., date formatting), pass the value as is
return columnDef.format(value);
}
return value ?? '-';
@@ -297,12 +302,12 @@ export function ProductTable({
<TableBody>
{products.map((product) => (
<TableRow
key={product.product_id}
key={product.pid}
onClick={() => onRowClick?.(product)}
className="cursor-pointer"
>
{orderedColumns.map((column) => (
<TableCell key={`${product.product_id}-${column}`}>
<TableCell key={`${product.pid}-${column}`}>
{formatColumnValue(product, column)}
</TableCell>
))}

@@ -8,7 +8,7 @@ import config from '../../config';

interface SalesVelocityConfig {
id: number;
category_id: number | null;
cat_id: number | null;
vendor: string | null;
daily_window_days: number;
weekly_window_days: number;
@@ -18,7 +18,7 @@ interface SalesVelocityConfig {
export function CalculationSettings() {
const [salesVelocityConfig, setSalesVelocityConfig] = useState<SalesVelocityConfig>({
id: 1,
category_id: null,
cat_id: null,
vendor: null,
daily_window_days: 30,
weekly_window_days: 7,

@@ -6,10 +6,11 @@ import { Label } from "@/components/ui/label";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import { toast } from "sonner";
import config from '../../config';
import { Table, TableBody, TableCell, TableHeader, TableRow } from "@/components/ui/table";

interface StockThreshold {
id: number;
category_id: number | null;
cat_id: number | null;
vendor: string | null;
critical_days: number;
reorder_days: number;
@@ -22,7 +23,7 @@ interface StockThreshold {

interface LeadTimeThreshold {
id: number;
category_id: number | null;
cat_id: number | null;
vendor: string | null;
target_days: number;
warning_days: number;
@@ -31,7 +32,7 @@ interface LeadTimeThreshold {

interface SalesVelocityConfig {
id: number;
category_id: number | null;
cat_id: number | null;
vendor: string | null;
daily_window_days: number;
weekly_window_days: number;
@@ -47,7 +48,7 @@ interface ABCClassificationConfig {

interface SafetyStockConfig {
id: number;
category_id: number | null;
cat_id: number | null;
vendor: string | null;
coverage_days: number;
service_level: number;
@@ -55,7 +56,7 @@ interface SafetyStockConfig {

interface TurnoverConfig {
id: number;
category_id: number | null;
cat_id: number | null;
vendor: string | null;
calculation_period_days: number;
target_rate: number;
@@ -64,7 +65,7 @@ interface TurnoverConfig {
export function Configuration() {
const [stockThresholds, setStockThresholds] = useState<StockThreshold>({
id: 1,
category_id: null,
cat_id: null,
vendor: null,
critical_days: 7,
reorder_days: 14,
@@ -75,7 +76,7 @@ export function Configuration() {

const [leadTimeThresholds, setLeadTimeThresholds] = useState<LeadTimeThreshold>({
id: 1,
category_id: null,
cat_id: null,
vendor: null,
target_days: 14,
warning_days: 21,
@@ -84,7 +85,7 @@ export function Configuration() {

const [salesVelocityConfig, setSalesVelocityConfig] = useState<SalesVelocityConfig>({
id: 1,
category_id: null,
cat_id: null,
vendor: null,
daily_window_days: 30,
weekly_window_days: 7,
@@ -100,7 +101,7 @@ export function Configuration() {

const [safetyStockConfig, setSafetyStockConfig] = useState<SafetyStockConfig>({
id: 1,
category_id: null,
cat_id: null,
vendor: null,
coverage_days: 14,
service_level: 95.0
@@ -108,7 +109,7 @@ export function Configuration() {

const [turnoverConfig, setTurnoverConfig] = useState<TurnoverConfig>({
id: 1,
category_id: null,
cat_id: null,
vendor: null,
calculation_period_days: 30,
target_rate: 1.0

File diff suppressed because it is too large
@@ -5,10 +5,11 @@ import { Input } from "@/components/ui/input";
|
||||
import { Label } from "@/components/ui/label";
|
||||
import { toast } from "sonner";
|
||||
import config from '../../config';
|
||||
import { Table, TableBody, TableCell, TableHeader, TableRow } from "@/components/ui/table";
|
||||
|
||||
interface LeadTimeThreshold {
|
||||
id: number;
|
||||
category_id: number | null;
|
||||
cat_id: number | null;
|
||||
vendor: string | null;
|
||||
target_days: number;
|
||||
warning_days: number;
|
||||
@@ -17,6 +18,8 @@ interface LeadTimeThreshold {
|
||||
|
||||
interface ABCClassificationConfig {
|
||||
id: number;
|
||||
cat_id: number | null;
|
||||
vendor: string | null;
|
||||
a_threshold: number;
|
||||
b_threshold: number;
|
||||
classification_period_days: number;
|
||||
@@ -24,7 +27,7 @@ interface ABCClassificationConfig {
|
||||
|
||||
interface TurnoverConfig {
|
||||
id: number;
|
||||
category_id: number | null;
|
||||
cat_id: number | null;
|
||||
vendor: string | null;
|
||||
calculation_period_days: number;
|
||||
target_rate: number;
|
||||
@@ -33,27 +36,16 @@ interface TurnoverConfig {
|
||||
export function PerformanceMetrics() {
|
||||
const [leadTimeThresholds, setLeadTimeThresholds] = useState<LeadTimeThreshold>({
|
||||
id: 1,
|
||||
category_id: null,
|
||||
cat_id: null,
|
||||
vendor: null,
|
||||
target_days: 14,
|
||||
warning_days: 21,
|
||||
critical_days: 30
|
||||
});
|
||||
|
||||
const [abcConfig, setAbcConfig] = useState<ABCClassificationConfig>({
|
||||
id: 1,
|
||||
a_threshold: 20.0,
|
||||
b_threshold: 50.0,
|
||||
classification_period_days: 90
|
||||
});
|
||||
const [abcConfigs, setAbcConfigs] = useState<ABCClassificationConfig[]>([]);
|
||||
|
||||
const [turnoverConfig, setTurnoverConfig] = useState<TurnoverConfig>({
|
||||
id: 1,
|
||||
category_id: null,
|
||||
vendor: null,
|
||||
calculation_period_days: 30,
|
||||
target_rate: 1.0
|
||||
});
|
||||
const [turnoverConfigs, setTurnoverConfigs] = useState<TurnoverConfig[]>([]);
|
||||
|
||||
useEffect(() => {
|
||||
const loadConfig = async () => {
|
||||
@@ -66,8 +58,8 @@ export function PerformanceMetrics() {
|
||||
}
|
||||
const data = await response.json();
|
||||
setLeadTimeThresholds(data.leadTimeThresholds);
|
||||
setAbcConfig(data.abcConfig);
|
||||
setTurnoverConfig(data.turnoverConfig);
|
||||
setAbcConfigs(data.abcConfigs);
|
||||
setTurnoverConfigs(data.turnoverConfigs);
|
||||
} catch (error) {
|
||||
toast.error(`Failed to load configuration: ${error instanceof Error ? error.message : 'Unknown error'}`);
|
||||
}
|
||||
@@ -105,7 +97,7 @@ export function PerformanceMetrics() {
|
||||
'Content-Type': 'application/json'
|
||||
},
|
||||
credentials: 'include',
|
||||
body: JSON.stringify(abcConfig)
|
||||
body: JSON.stringify(abcConfigs)
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
@@ -127,7 +119,7 @@ export function PerformanceMetrics() {
|
||||
'Content-Type': 'application/json'
|
||||
},
|
||||
credentials: 'include',
|
||||
body: JSON.stringify(turnoverConfig)
|
||||
body: JSON.stringify(turnoverConfigs)
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
@@ -141,6 +133,10 @@ export function PerformanceMetrics() {
|
||||
}
|
||||
};
|
||||
|
||||
function getCategoryName(_cat_id: number): import("react").ReactNode {
|
||||
throw new Error('Function not implemented.');
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="max-w-[700px] space-y-4">
|
||||
{/* Lead Time Thresholds Card */}
|
||||
@@ -210,54 +206,28 @@ export function PerformanceMetrics() {
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<div className="space-y-4">
|
||||
<div className="grid grid-cols-3 gap-4">
|
||||
<div>
|
||||
<Label htmlFor="a-threshold">A Threshold (%)</Label>
|
||||
<Input
|
||||
id="a-threshold"
|
||||
type="number"
|
||||
min="0"
|
||||
max="100"
|
||||
step="0.1"
|
||||
className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
|
||||
value={abcConfig.a_threshold}
|
||||
onChange={(e) => setAbcConfig(prev => ({
|
||||
...prev,
|
||||
a_threshold: parseFloat(e.target.value) || 0
|
||||
}))}
|
||||
/>
</div>
<div>
<Label htmlFor="b-threshold">B Threshold (%)</Label>
<Input
id="b-threshold"
type="number"
min="0"
max="100"
step="0.1"
className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
value={abcConfig.b_threshold}
onChange={(e) => setAbcConfig(prev => ({
...prev,
b_threshold: parseFloat(e.target.value) || 0
}))}
/>
</div>
<div>
<Label htmlFor="classification-period">Classification Period (days)</Label>
<Input
id="classification-period"
type="number"
min="1"
className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
value={abcConfig.classification_period_days}
onChange={(e) => setAbcConfig(prev => ({
...prev,
classification_period_days: parseInt(e.target.value) || 1
}))}
/>
</div>
</div>
<Table>
<TableHeader>
<TableRow>
<TableCell>Category</TableCell>
<TableCell>Vendor</TableCell>
<TableCell className="text-right">A Threshold</TableCell>
<TableCell className="text-right">B Threshold</TableCell>
<TableCell className="text-right">Period Days</TableCell>
</TableRow>
</TableHeader>
<TableBody>
{abcConfigs.map((config) => (
<TableRow key={`${config.cat_id}-${config.vendor}`}>
<TableCell>{config.cat_id ? getCategoryName(config.cat_id) : 'Global'}</TableCell>
<TableCell>{config.vendor || 'All Vendors'}</TableCell>
<TableCell className="text-right">{config.a_threshold}%</TableCell>
<TableCell className="text-right">{config.b_threshold}%</TableCell>
<TableCell className="text-right">{config.classification_period_days}</TableCell>
</TableRow>
))}
</TableBody>
</Table>
<Button onClick={handleUpdateABCConfig}>
Update ABC Classification
</Button>
@@ -273,37 +243,26 @@ export function PerformanceMetrics() {
</CardHeader>
<CardContent>
<div className="space-y-4">
<div className="grid grid-cols-2 gap-4">
<div>
<Label htmlFor="calculation-period">Calculation Period (days)</Label>
<Input
id="calculation-period"
type="number"
min="1"
className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
value={turnoverConfig.calculation_period_days}
onChange={(e) => setTurnoverConfig(prev => ({
...prev,
calculation_period_days: parseInt(e.target.value) || 1
}))}
/>
</div>
<div>
<Label htmlFor="target-rate">Target Rate</Label>
<Input
id="target-rate"
type="number"
min="0"
step="0.1"
className="[appearance:textfield] [&::-webkit-outer-spin-button]:appearance-none [&::-webkit-inner-spin-button]:appearance-none"
value={turnoverConfig.target_rate}
onChange={(e) => setTurnoverConfig(prev => ({
...prev,
target_rate: parseFloat(e.target.value) || 0
}))}
/>
</div>
</div>
<Table>
<TableHeader>
<TableRow>
<TableCell>Category</TableCell>
<TableCell>Vendor</TableCell>
<TableCell className="text-right">Period Days</TableCell>
<TableCell className="text-right">Target Rate</TableCell>
</TableRow>
</TableHeader>
<TableBody>
{turnoverConfigs.map((config) => (
<TableRow key={`${config.cat_id}-${config.vendor}`}>
<TableCell>{config.cat_id ? getCategoryName(config.cat_id) : 'Global'}</TableCell>
<TableCell>{config.vendor || 'All Vendors'}</TableCell>
<TableCell className="text-right">{config.calculation_period_days}</TableCell>
<TableCell className="text-right">{config.target_rate.toFixed(2)}</TableCell>
</TableRow>
))}
</TableBody>
</Table>
<Button onClick={handleUpdateTurnoverConfig}>
Update Turnover Configuration
</Button>
@@ -8,7 +8,7 @@ import config from '../../config';

interface StockThreshold {
id: number;
category_id: number | null;
cat_id: number | null;
vendor: string | null;
critical_days: number;
reorder_days: number;

@@ -19,7 +19,7 @@ interface StockThreshold {

interface SafetyStockConfig {
id: number;
category_id: number | null;
cat_id: number | null;
vendor: string | null;
coverage_days: number;
service_level: number;

@@ -28,7 +28,7 @@ interface SafetyStockConfig {
export function StockManagement() {
const [stockThresholds, setStockThresholds] = useState<StockThreshold>({
id: 1,
category_id: null,
cat_id: null,
vendor: null,
critical_days: 7,
reorder_days: 14,

@@ -39,7 +39,7 @@ export function StockManagement() {

const [safetyStockConfig, setSafetyStockConfig] = useState<SafetyStockConfig>({
id: 1,
category_id: null,
cat_id: null,
vendor: null,
coverage_days: 14,
service_level: 95.0
@@ -10,37 +10,66 @@ import { motion } from "motion/react";
import config from "../config";

interface Category {
category_id: number;
cat_id: number;
name: string;
description: string;
parent_category?: string;
product_count: number;
total_value: number;
avg_margin: number;
turnover_rate: number;
growth_rate: number;
type: number;
parent_id: number | null;
parent_name: string | null;
parent_type: number | null;
description: string | null;
status: string;
metrics?: {
product_count: number;
active_products: number;
total_value: number;
avg_margin: number;
turnover_rate: number;
growth_rate: number;
};
}

interface TypeCount {
type: number;
count: number;
}

interface CategoryFilters {
search: string;
parent: string;
type: string;
performance: string;
}

const TYPE_LABELS: Record<number, string> = {
10: 'Section',
11: 'Category',
12: 'Subcategory',
13: 'Sub-subcategory',
20: 'Theme',
21: 'Subtheme'
};

function getCategoryStatusVariant(status: string): "default" | "secondary" | "destructive" | "outline" {
switch (status.toLowerCase()) {
case 'active':
return 'default';
case 'inactive':
return 'secondary';
case 'archived':
return 'destructive';
default:
return 'outline';
}
}

export function Categories() {
const [page, setPage] = useState(1);
const [sortColumn, setSortColumn] = useState<keyof Category>("name");
const [sortDirection, setSortDirection] = useState<"asc" | "desc">("asc");
const [sortColumn] = useState<keyof Category>("name");
const [sortDirection] = useState<"asc" | "desc">("asc");
const [filters, setFilters] = useState<CategoryFilters>({
search: "",
parent: "all",
type: "all",
performance: "all",
});
const [] = useState({
column: 'name',
direction: 'asc'
});

const { data, isLoading } = useQuery({
queryKey: ["categories"],
@@ -68,19 +97,15 @@ export function Categories() {
);
}

// Apply parent filter
if (filters.parent !== 'all') {
if (filters.parent === 'none') {
filtered = filtered.filter(category => !category.parent_category);
} else {
filtered = filtered.filter(category => category.parent_category === filters.parent);
}
// Apply type filter
if (filters.type !== 'all') {
filtered = filtered.filter(category => category.type === parseInt(filters.type));
}

// Apply performance filter
if (filters.performance !== 'all') {
filtered = filtered.filter(category => {
const growth = category.growth_rate ?? 0;
const growth = category.metrics?.growth_rate ?? 0;
switch (filters.performance) {
case 'high_growth': return growth >= 20;
case 'growing': return growth >= 5 && growth < 20;
@@ -93,6 +118,19 @@ export function Categories() {

// Apply sorting
filtered.sort((a, b) => {
// First sort by type if not explicitly sorting by another column
if (sortColumn === "name") {
if (a.type !== b.type) {
return a.type - b.type;
}
// Then by parent hierarchy
if (a.parent_id !== b.parent_id) {
if (!a.parent_id) return -1;
if (!b.parent_id) return 1;
return a.parent_id - b.parent_id;
}
}

const aVal = a[sortColumn];
const bVal = b[sortColumn];
@@ -123,9 +161,9 @@ export function Categories() {
if (!filteredData.length) return data?.stats;

const activeCategories = filteredData.filter(c => c.status === 'active').length;
const totalValue = filteredData.reduce((sum, c) => sum + (c.total_value || 0), 0);
const margins = filteredData.map(c => c.avg_margin || 0).filter(m => m !== 0);
const growthRates = filteredData.map(c => c.growth_rate || 0).filter(g => g !== 0);
const totalValue = filteredData.reduce((sum, c) => sum + (c.metrics?.total_value || 0), 0);
const margins = filteredData.map(c => c.metrics?.avg_margin || 0).filter(m => m !== 0);
const growthRates = filteredData.map(c => c.metrics?.growth_rate || 0).filter(g => g !== 0);

return {
totalCategories: filteredData.length,
@@ -136,20 +174,7 @@ export function Categories() {
};
}, [filteredData, data?.stats]);

const handleSort = (column: keyof Category) => {
setSortDirection(prev => {
if (sortColumn !== column) return "asc";
return prev === "asc" ? "desc" : "asc";
});
setSortColumn(column);
};

const getPerformanceBadge = (growth: number) => {
if (growth >= 20) return <Badge variant="default">High Growth</Badge>;
if (growth >= 5) return <Badge variant="secondary">Growing</Badge>;
if (growth >= -5) return <Badge variant="outline">Stable</Badge>;
return <Badge variant="destructive">Declining</Badge>;
};

const formatCurrency = (value: number) => {
return new Intl.NumberFormat('en-US', {
@@ -245,17 +270,18 @@ export function Categories() {
className="h-8 w-[150px] lg:w-[250px]"
/>
<Select
value={filters.parent}
onValueChange={(value) => setFilters(prev => ({ ...prev, parent: value }))}
value={filters.type}
onValueChange={(value) => setFilters(prev => ({ ...prev, type: value }))}
>
<SelectTrigger className="h-8 w-[180px]">
<SelectValue placeholder="Parent Category" />
<SelectValue placeholder="Category Type" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Categories</SelectItem>
<SelectItem value="none">Top Level Only</SelectItem>
{data?.parentCategories?.map((parent: string) => (
<SelectItem key={parent} value={parent}>{parent}</SelectItem>
<SelectItem value="all">All Types</SelectItem>
{data?.typeCounts?.map((tc: TypeCount) => (
<SelectItem key={tc.type} value={tc.type.toString()}>
{TYPE_LABELS[tc.type]} ({tc.count})
</SelectItem>
))}
</SelectContent>
</Select>
@@ -281,48 +307,66 @@ export function Categories() {
<Table>
<TableHeader>
<TableRow>
<TableHead onClick={() => handleSort("name")} className="cursor-pointer">Name</TableHead>
<TableHead onClick={() => handleSort("parent_category")} className="cursor-pointer">Parent</TableHead>
<TableHead onClick={() => handleSort("product_count")} className="cursor-pointer">Products</TableHead>
<TableHead onClick={() => handleSort("total_value")} className="cursor-pointer">Value</TableHead>
<TableHead onClick={() => handleSort("avg_margin")} className="cursor-pointer">Margin</TableHead>
<TableHead onClick={() => handleSort("turnover_rate")} className="cursor-pointer">Turnover</TableHead>
<TableHead onClick={() => handleSort("growth_rate")} className="cursor-pointer">Growth</TableHead>
<TableHead onClick={() => handleSort("status")} className="cursor-pointer">Status</TableHead>
<TableHead>Type</TableHead>
<TableHead>Name</TableHead>
<TableHead>Parent</TableHead>
<TableHead className="text-right">Products</TableHead>
<TableHead className="text-right">Active</TableHead>
<TableHead className="text-right">Value</TableHead>
<TableHead className="text-right">Margin</TableHead>
<TableHead className="text-right">Turnover</TableHead>
<TableHead className="text-right">Growth</TableHead>
<TableHead>Status</TableHead>
</TableRow>
</TableHeader>
<TableBody>
{isLoading ? (
<TableRow>
<TableCell colSpan={8} className="text-center py-8">
<TableCell colSpan={10} className="text-center py-8">
Loading categories...
</TableCell>
</TableRow>
) : paginatedData.map((category: Category) => (
<TableRow key={category.category_id}>
<TableRow key={category.cat_id}>
<TableCell>
<div className="font-medium">{category.name}</div>
<div className="text-sm text-muted-foreground">{category.description}</div>
<Badge variant="outline">
{TYPE_LABELS[category.type]}
</Badge>
</TableCell>
<TableCell>{category.parent_category || "—"}</TableCell>
<TableCell>{category.product_count?.toLocaleString() ?? 0}</TableCell>
<TableCell>{formatCurrency(category.total_value ?? 0)}</TableCell>
<TableCell>{typeof category.avg_margin === 'number' ? category.avg_margin.toFixed(1) : "0.0"}%</TableCell>
<TableCell>{typeof category.turnover_rate === 'number' ? category.turnover_rate.toFixed(1) : "0.0"}x</TableCell>
<TableCell>
<div className="flex items-center gap-2" style={{ minWidth: '120px' }}>
<div style={{ width: '50px', textAlign: 'right' }}>
{typeof category.growth_rate === 'number' ? category.growth_rate.toFixed(1) : "0.0"}%
<div className="flex flex-col gap-1">
<div className="flex items-center gap-2">
<span className="font-medium">{category.name}</span>

</div>
{getPerformanceBadge(category.growth_rate ?? 0)}
{category.description && (
<div className="text-xs text-muted-foreground">{category.description}</div>
)}
</div>
</TableCell>
<TableCell>{category.status}</TableCell>
<TableCell className="text-sm text-muted-foreground">
{category.type === 10 ? category.name : // Section
category.type === 11 ? `${category.parent_name}` : // Category
category.type === 12 ? `${category.parent_name} > ${category.name}` : // Subcategory
category.type === 13 ? `${category.parent_name} > ${category.name}` : // Sub-subcategory
category.parent_name ? `${category.parent_name} > ${category.name}` : category.name}
</TableCell>
<TableCell className="text-right">{category.metrics?.product_count || 0}</TableCell>
<TableCell className="text-right">{category.metrics?.active_products || 0}</TableCell>
<TableCell className="text-right">{formatCurrency(category.metrics?.total_value || 0)}</TableCell>
<TableCell className="text-right">{category.metrics?.avg_margin?.toFixed(1)}%</TableCell>
<TableCell className="text-right">{category.metrics?.turnover_rate?.toFixed(2)}</TableCell>
<TableCell className="text-right">{category.metrics?.growth_rate?.toFixed(1)}%</TableCell>
<TableCell>
<Badge variant={getCategoryStatusVariant(category.status)}>
{category.status}
</Badge>
</TableCell>
</TableRow>
))}
{!isLoading && !paginatedData.length && (
<TableRow>
<TableCell colSpan={8} className="text-center py-8 text-muted-foreground">
<TableCell colSpan={10} className="text-center py-8 text-muted-foreground">
No categories found
</TableCell>
</TableRow>
@@ -60,19 +60,23 @@ export default function Forecasting() {
const data = await response.json();
return data.map((item: any) => ({
category: item.category_name,
categoryPath: item.path,
avgDailySales: Number(item.avg_daily_sales) || 0,
totalSold: Number(item.total_sold) || 0,
numProducts: Number(item.num_products) || 0,
avgPrice: Number(item.avg_price) || 0,
avgTotalSold: Number(item.avgTotalSold) || 0,
products: item.products?.map((p: any) => ({
product_id: p.product_id,
name: p.title,
pid: p.pid,
title: p.title,
sku: p.sku,
stock_quantity: Number(p.stock_quantity) || 0,
total_sold: Number(p.total_sold) || 0,
avg_price: Number(p.avg_price) || 0,
first_received_date: p.first_received_date,
daily_sales_avg: Number(p.daily_sales_avg) || 0,
forecast_units: Number(p.forecast_units) || 0,
forecast_revenue: Number(p.forecast_revenue) || 0,
confidence_level: Number(p.confidence_level) || 0,
categoryPath: item.path
}))
}));
},
@@ -503,7 +503,7 @@ export function Products() {
columnDefs={AVAILABLE_COLUMNS}
columnOrder={columnOrder}
onColumnOrderChange={handleColumnOrderChange}
onRowClick={(product) => setSelectedProductId(product.product_id)}
onRowClick={(product) => setSelectedProductId(product.pid)}
/>

{totalPages > 1 && (
@@ -20,12 +20,21 @@ import {
PaginationPrevious,
} from '../components/ui/pagination';
import { motion } from 'motion/react';
import {
PurchaseOrderStatus,
ReceivingStatus as ReceivingStatusCode,
getPurchaseOrderStatusLabel,
getReceivingStatusLabel,
getPurchaseOrderStatusVariant,
getReceivingStatusVariant
} from '../types/status-codes';

interface PurchaseOrder {
id: number;
vendor_name: string;
order_date: string;
status: string;
status: number;
receiving_status: number;
total_items: number;
total_quantity: number;
total_cost: number;
@@ -113,6 +122,16 @@ export default function PurchaseOrders() {
limit: 100,
});

const STATUS_FILTER_OPTIONS = [
{ value: 'all', label: 'All Statuses' },
{ value: String(PurchaseOrderStatus.Created), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Created) },
{ value: String(PurchaseOrderStatus.ElectronicallyReadySend), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.ElectronicallyReadySend) },
{ value: String(PurchaseOrderStatus.Ordered), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Ordered) },
{ value: String(PurchaseOrderStatus.ReceivingStarted), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.ReceivingStarted) },
{ value: String(PurchaseOrderStatus.Done), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Done) },
{ value: String(PurchaseOrderStatus.Canceled), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Canceled) },
];

const fetchData = async () => {
try {
const searchParams = new URLSearchParams({
@@ -171,16 +190,25 @@ export default function PurchaseOrders() {
}
};

const getStatusBadge = (status: string) => {
const variants: Record<string, { variant: "default" | "secondary" | "destructive" | "outline"; label: string }> = {
pending: { variant: "outline", label: "Pending" },
received: { variant: "default", label: "Received" },
partial: { variant: "secondary", label: "Partial" },
cancelled: { variant: "destructive", label: "Cancelled" },
};
const getStatusBadge = (status: number, receivingStatus: number) => {
// If the PO is canceled, show that status
if (status === PurchaseOrderStatus.Canceled) {
return <Badge variant={getPurchaseOrderStatusVariant(status)}>
{getPurchaseOrderStatusLabel(status)}
</Badge>;
}

const statusConfig = variants[status.toLowerCase()] || variants.pending;
return <Badge variant={statusConfig.variant}>{statusConfig.label}</Badge>;
// If receiving has started, show receiving status
if (status >= PurchaseOrderStatus.ReceivingStarted) {
return <Badge variant={getReceivingStatusVariant(receivingStatus)}>
{getReceivingStatusLabel(receivingStatus)}
</Badge>;
}

// Otherwise show PO status
return <Badge variant={getPurchaseOrderStatusVariant(status)}>
{getPurchaseOrderStatusLabel(status)}
</Badge>;
};

const formatNumber = (value: number) => {
@@ -252,45 +280,44 @@ export default function PurchaseOrders() {
</div>

{/* Filters */}
<div className="mb-6 flex flex-col gap-4 md:flex-row md:items-center">
<div className="flex items-center gap-2 flex-1">
<Input
placeholder="Search orders..."
value={filters.search}
onChange={(e) => setFilters(prev => ({ ...prev, search: e.target.value }))}
className="h-8 w-[300px]"
/>
</div>
<div className="flex flex-wrap items-center gap-2">
<Select
value={filters.status}
onValueChange={(value) => setFilters(prev => ({ ...prev, status: value }))}
>
<SelectTrigger className="h-8 w-[180px]">
<SelectValue placeholder="Status" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Statuses</SelectItem>
{filterOptions.statuses.map(status => (
<SelectItem key={status} value={status}>{status}</SelectItem>
))}
</SelectContent>
</Select>
<Select
value={filters.vendor}
onValueChange={(value) => setFilters(prev => ({ ...prev, vendor: value }))}
>
<SelectTrigger className="h-8 w-[180px]">
<SelectValue placeholder="Vendor" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Vendors</SelectItem>
{filterOptions.vendors.map(vendor => (
<SelectItem key={vendor} value={vendor}>{vendor}</SelectItem>
))}
</SelectContent>
</Select>
</div>
<div className="mb-4 flex items-center gap-4">
<Input
placeholder="Search orders..."
value={filters.search}
onChange={(e) => setFilters(prev => ({ ...prev, search: e.target.value }))}
className="max-w-xs"
/>
<Select
value={filters.status}
onValueChange={(value) => setFilters(prev => ({ ...prev, status: value }))}
>
<SelectTrigger className="w-[180px]">
<SelectValue placeholder="Select status" />
</SelectTrigger>
<SelectContent>
{STATUS_FILTER_OPTIONS.map(option => (
<SelectItem key={option.value} value={option.value}>
{option.label}
</SelectItem>
))}
</SelectContent>
</Select>
<Select
value={filters.vendor}
onValueChange={(value) => setFilters(prev => ({ ...prev, vendor: value }))}
>
<SelectTrigger className="w-[180px]">
<SelectValue placeholder="Select vendor" />
</SelectTrigger>
<SelectContent>
<SelectItem value="all">All Vendors</SelectItem>
{filterOptions.vendors.map(vendor => (
<SelectItem key={vendor} value={vendor}>
{vendor}
</SelectItem>
))}
</SelectContent>
</Select>
</div>

{/* Purchase Orders Table */}
@@ -343,7 +370,7 @@ export default function PurchaseOrders() {
<TableCell>{po.id}</TableCell>
<TableCell>{po.vendor_name}</TableCell>
<TableCell>{new Date(po.order_date).toLocaleDateString()}</TableCell>
<TableCell>{getStatusBadge(po.status)}</TableCell>
<TableCell>{getStatusBadge(po.status, po.receiving_status)}</TableCell>
<TableCell>{po.total_items.toLocaleString()}</TableCell>
<TableCell>{po.total_quantity.toLocaleString()}</TableCell>
<TableCell>${formatNumber(po.total_cost)}</TableCell>
@@ -1,16 +1,16 @@
export interface Product {
product_id: number;
pid: number;
title: string;
SKU: string;
stock_quantity: number;
price: number;
regular_price: number;
cost_price: number;
landing_cost_price: number | null;
price: string; // DECIMAL(15,3)
regular_price: string; // DECIMAL(15,3)
cost_price: string; // DECIMAL(15,3)
landing_cost_price: string | null; // DECIMAL(15,3)
barcode: string;
vendor: string;
vendor_reference: string;
brand: string;
brand: string | 'Unbranded';
categories: string[];
tags: string[];
options: Record<string, any>;
@@ -24,32 +24,32 @@ export interface Product {
updated_at: string;

// Metrics
daily_sales_avg?: number;
weekly_sales_avg?: number;
monthly_sales_avg?: number;
avg_quantity_per_order?: number;
daily_sales_avg?: string; // DECIMAL(15,3)
weekly_sales_avg?: string; // DECIMAL(15,3)
monthly_sales_avg?: string; // DECIMAL(15,3)
avg_quantity_per_order?: string; // DECIMAL(15,3)
number_of_orders?: number;
first_sale_date?: string;
last_sale_date?: string;
last_purchase_date?: string;
days_of_inventory?: number;
weeks_of_inventory?: number;
reorder_point?: number;
safety_stock?: number;
avg_margin_percent?: number;
total_revenue?: number;
inventory_value?: number;
cost_of_goods_sold?: number;
gross_profit?: number;
gmroi?: number;
avg_lead_time_days?: number;
days_of_inventory?: string; // DECIMAL(15,3)
weeks_of_inventory?: string; // DECIMAL(15,3)
reorder_point?: string; // DECIMAL(15,3)
safety_stock?: string; // DECIMAL(15,3)
avg_margin_percent?: string; // DECIMAL(15,3)
total_revenue?: string; // DECIMAL(15,3)
inventory_value?: string; // DECIMAL(15,3)
cost_of_goods_sold?: string; // DECIMAL(15,3)
gross_profit?: string; // DECIMAL(15,3)
gmroi?: string; // DECIMAL(15,3)
avg_lead_time_days?: string; // DECIMAL(15,3)
last_received_date?: string;
abc_class?: string;
stock_status?: string;
turnover_rate?: number;
current_lead_time?: number;
target_lead_time?: number;
turnover_rate?: string; // DECIMAL(15,3)
current_lead_time?: string; // DECIMAL(15,3)
target_lead_time?: string; // DECIMAL(15,3)
lead_time_status?: string;
reorder_qty?: number;
overstocked_amt?: number;
overstocked_amt?: string; // DECIMAL(15,3)
}
81	inventory/src/types/status-codes.ts	Normal file
@@ -0,0 +1,81 @@
// Purchase Order Status Codes
export enum PurchaseOrderStatus {
Canceled = 0,
Created = 1,
ElectronicallyReadySend = 10,
Ordered = 11,
Preordered = 12,
ElectronicallySent = 13,
ReceivingStarted = 15,
Done = 50
}

// Receiving Status Codes
export enum ReceivingStatus {
Canceled = 0,
Created = 1,
PartialReceived = 30,
FullReceived = 40,
Paid = 50
}

// Status Code Display Names
export const PurchaseOrderStatusLabels: Record<PurchaseOrderStatus, string> = {
[PurchaseOrderStatus.Canceled]: 'Canceled',
[PurchaseOrderStatus.Created]: 'Created',
[PurchaseOrderStatus.ElectronicallyReadySend]: 'Ready to Send',
[PurchaseOrderStatus.Ordered]: 'Ordered',
[PurchaseOrderStatus.Preordered]: 'Preordered',
[PurchaseOrderStatus.ElectronicallySent]: 'Sent',
[PurchaseOrderStatus.ReceivingStarted]: 'Receiving Started',
[PurchaseOrderStatus.Done]: 'Done'
};

export const ReceivingStatusLabels: Record<ReceivingStatus, string> = {
[ReceivingStatus.Canceled]: 'Canceled',
[ReceivingStatus.Created]: 'Created',
[ReceivingStatus.PartialReceived]: 'Partially Received',
[ReceivingStatus.FullReceived]: 'Fully Received',
[ReceivingStatus.Paid]: 'Paid'
};

// Helper functions
export function getPurchaseOrderStatusLabel(status: number): string {
return PurchaseOrderStatusLabels[status as PurchaseOrderStatus] || 'Unknown';
}

export function getReceivingStatusLabel(status: number): string {
return ReceivingStatusLabels[status as ReceivingStatus] || 'Unknown';
}

// Status checks
export function isReceivingComplete(status: number): boolean {
return status >= ReceivingStatus.PartialReceived;
}

export function isPurchaseOrderComplete(status: number): boolean {
return status === PurchaseOrderStatus.Done;
}

export function isPurchaseOrderCanceled(status: number): boolean {
return status === PurchaseOrderStatus.Canceled;
}

export function isReceivingCanceled(status: number): boolean {
return status === ReceivingStatus.Canceled;
}

// Badge variants for different statuses
export function getPurchaseOrderStatusVariant(status: number): 'default' | 'secondary' | 'destructive' | 'outline' {
if (isPurchaseOrderCanceled(status)) return 'destructive';
if (isPurchaseOrderComplete(status)) return 'default';
if (status >= PurchaseOrderStatus.ElectronicallyReadySend) return 'secondary';
return 'outline';
}

export function getReceivingStatusVariant(status: number): 'default' | 'secondary' | 'destructive' | 'outline' {
if (isReceivingCanceled(status)) return 'destructive';
if (status === ReceivingStatus.Paid) return 'default';
if (status >= ReceivingStatus.PartialReceived) return 'secondary';
return 'outline';
}
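The precedence the `getStatusBadge` refactor in PurchaseOrders.tsx applies to these codes (canceled first, then receiving status once receiving has started, otherwise the plain PO status) can be exercised standalone. This sketch inlines the enum values and labels from the new file above; `pickBadgeLabel` is a hypothetical helper mirroring only the label-selection logic, not the real component:

```typescript
// Enum values copied from status-codes.ts above; label maps reduced to plain records.
enum PurchaseOrderStatus { Canceled = 0, Created = 1, ElectronicallyReadySend = 10, Ordered = 11, Preordered = 12, ElectronicallySent = 13, ReceivingStarted = 15, Done = 50 }

const poLabels: Record<number, string> = { 0: 'Canceled', 1: 'Created', 10: 'Ready to Send', 11: 'Ordered', 12: 'Preordered', 13: 'Sent', 15: 'Receiving Started', 50: 'Done' };
const recvLabels: Record<number, string> = { 0: 'Canceled', 1: 'Created', 30: 'Partially Received', 40: 'Fully Received', 50: 'Paid' };

// Hypothetical helper mirroring getStatusBadge's branching:
// canceled wins, then receiving status once status >= ReceivingStarted, else PO status.
function pickBadgeLabel(status: number, receivingStatus: number): string {
  if (status === PurchaseOrderStatus.Canceled) return poLabels[status];
  if (status >= PurchaseOrderStatus.ReceivingStarted) return recvLabels[receivingStatus] ?? 'Unknown';
  return poLabels[status] ?? 'Unknown';
}

console.log(pickBadgeLabel(PurchaseOrderStatus.Ordered, 1));         // → "Ordered"
console.log(pickBadgeLabel(PurchaseOrderStatus.ReceivingStarted, 30)); // → "Partially Received"
```

Note one consequence of this ordering: a PO at `Done` (50) still renders its receiving status, since 50 ≥ `ReceivingStarted` (15).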
@@ -1,7 +1,7 @@
#!/bin/zsh

# Clear previous mount in case it's still there
umount ~/Dev/inventory/inventory-server
umount /Users/matt/Library/Mobile Documents/com~apple~CloudDocs/Dev/inventory/inventory-server

# Mount
sshfs matt@dashboard.kent.pw:/var/www/html/inventory -p 22122 ~/Dev/inventory/inventory-server/
sshfs matt@dashboard.kent.pw:/var/www/html/inventory -p 22122 /Users/matt/Library/Mobile Documents/com~apple~CloudDocs/Dev/inventory/inventory-server/