Compare commits

16 commits: `check-numb` ... `dd82c624d8`

| SHA1 |
|---|
| dd82c624d8 |
| 7999e1e64a |
| 12a0f540b3 |
| e793cb0cc5 |
| b2330dee22 |
| 00501704df |
| 4cb41a7e4c |
| d05d27494d |
| 4ed734e5c0 |
| 1e3be5d4cb |
| 8dd852dd6a |
| eeff5817ea |
| 1b19feb172 |
| 80ff8124ec |
| 8508bfac93 |
| ac14179bd2 |
.gitignore (vendored, +6)

```diff
@@ -68,3 +68,9 @@ inventory-server/scripts/.fuse_hidden00000fa20000000a
 .VSCodeCounter/
 .VSCodeCounter/*
 .VSCodeCounter/**/*
+
+*/chat/db-convert/db/*
+*/chat/db-convert/mongo_converter_env/*
+
+# Ignore compiled Vite config to avoid duplication
+vite.config.js
```
docs/setup-chat.md (new file, +23)

This portion of the application will be a read-only chat archive. It pulls its data from a Rocket.Chat export converted to PostgreSQL. This is a separate database from the one the rest of the inventory application uses, but it will still use users and permissions from the inventory database. Both databases are on the same Postgres instance.

For now, let's add a select to the top of the page that allows me to "view as" any of the users in the Rocket.Chat database. We'll connect this to the authorization in the main application later.

The DB connection info is stored in the .env file in the inventory-server root. It contains these variables:

```
DB_HOST=localhost
DB_USER=rocketchat_user
DB_PASSWORD=password
DB_NAME=rocketchat_converted
DB_PORT=5432
```
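These variables map directly onto a database connection config. As a hedged sketch (the inline `env_text` stands in for the actual .env file, and the key names follow psycopg2's `connect()` parameters, which the BSON converter in this changeset also uses):

```python
# Parse a .env-style block into a psycopg2-ready config dict.
# A real server would read the file from the inventory-server root;
# the values below are the illustrative ones from the doc.
env_text = """\
DB_HOST=localhost
DB_USER=rocketchat_user
DB_PASSWORD=password
DB_NAME=rocketchat_converted
DB_PORT=5432
"""

env = dict(
    line.split("=", 1)
    for line in env_text.splitlines()
    if line and not line.startswith("#")
)

pg_config = {
    "host": env["DB_HOST"],
    "user": env["DB_USER"],
    "password": env["DB_PASSWORD"],
    "dbname": env["DB_NAME"],
    "port": int(env["DB_PORT"]),
}
print(pg_config["dbname"], pg_config["port"])  # rocketchat_converted 5432
```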
Not all of the information in this database is relevant, as it's a direct export from another app with more features. You can use the query tool to examine the structure and data available.

Server-side files should use similar conventions and the same technologies as the inventory-server (inventory-server root) and auth-server (inventory-server/auth). I will provide my current pm2 ecosystem file upon request for you to add the configuration for the new "chat-server". I use Caddy on the server and can provide my Caddyfile to assist with configuring the API routes. All configuration and routes for the chat-server should go in the inventory-server/chat folder or subfolders you create.

The folder you see as inventory-server is actually a direct mount of the /var/www/html/inventory folder on the server. You can read and write files there as usual, but any terminal commands for the server I will have to run myself.

The "Chat" page should be added to the main application sidebar, and a page similar to the others should be created in inventory/src/pages. All other frontend components should go in inventory/src/components/chat.

The application uses shadcn components, and those should be used for all UI elements where possible (located in inventory/src/components/ui). The UI should match existing pages and components.
docs/split-up-pos.md (new file, +112)

Okay, I understand completely now. The core issue is that the previous approaches tried too hard to reconcile every receipt back to a specific PO line within the `purchase_orders` table structure, which doesn't reflect the reality that receipts can be independent events. Your downstream scripts, especially `daily_snapshots` and `product_metrics`, rely on having a complete picture of *all* receivings.

Let's pivot to a model that respects both distinct data streams: **Orders (Intent)** and **Receivings (Actuals)**.

**Proposed Solution: Separate `purchase_orders` and `receivings` Tables**

This is the cleanest way to model the reality you've described.

1. **`purchase_orders` Table:**
   * **Purpose:** Tracks the status and details of purchase *orders* placed. Represents the *intent* to receive goods.
   * **Key Columns:** `po_id`, `pid`, `ordered` (quantity ordered), `po_cost_price`, `date` (order/created date), `expected_date`, `status` (PO lifecycle: 'ordered', 'canceled', 'done'), `vendor`, `notes`, etc.
   * **Crucially:** This table *does not* need a `received` column or a `receiving_history` column derived from complex allocations. It focuses solely on the PO itself.

2. **`receivings` Table (New or Refined):**
   * **Purpose:** Tracks every line item received, whether or not it was linked to a PO during the receiving process. Represents the *actual* goods that arrived.
   * **Key Columns:**
     * `receiving_id` (identifier for the overall receiving document/batch)
     * `pid` (product ID received)
     * `received_qty` (quantity received for this specific line)
     * `cost_each` (actual cost paid for this item on this receiving)
     * `received_date` (actual date the item was received)
     * `received_by` (employee ID/name)
     * `source_po_id` (the `po_id` entered on the receiving screen, *nullable*; stores the original link attempt, even if it was wrong or missing)
     * `source_receiving_status` (the status from the source `receivings` table: 'partial_received', 'full_received', 'paid', 'canceled')
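The two tables above can be sketched as follows. This is a minimal, runnable illustration (SQLite in memory; the column types and primary keys are assumptions, and production would use PostgreSQL types):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE purchase_orders (
    po_id         TEXT NOT NULL,
    pid           INTEGER NOT NULL,
    ordered       INTEGER NOT NULL,
    po_cost_price REAL,
    date          TEXT,
    expected_date TEXT,
    status        TEXT NOT NULL,   -- 'ordered', 'canceled', 'done'
    vendor        TEXT,
    notes         TEXT,
    PRIMARY KEY (po_id, pid)
);

CREATE TABLE receivings (
    receiving_id            TEXT NOT NULL,
    pid                     INTEGER NOT NULL,
    received_qty            INTEGER NOT NULL,
    cost_each               REAL,
    received_date           TEXT,
    received_by             TEXT,
    source_po_id            TEXT,   -- nullable: the PO link attempt, possibly wrong or missing
    source_receiving_status TEXT,
    PRIMARY KEY (receiving_id, pid)
);
""")

# One PO line, plus a linked and an unlinked receiving for the same product.
conn.execute("INSERT INTO purchase_orders VALUES "
             "('PO-1', 42, 10, 2.50, '2024-01-02', '2024-01-10', 'ordered', 'Acme', NULL)")
conn.execute("INSERT INTO receivings VALUES "
             "('R-1', 42, 6, 2.50, '2024-01-09', 'alice', 'PO-1', 'partial_received')")
conn.execute("INSERT INTO receivings VALUES "
             "('R-2', 42, 4, 2.40, '2024-01-11', 'bob', NULL, 'full_received')")

# The receivings table is the definitive record of what actually arrived.
total_received = conn.execute(
    "SELECT SUM(received_qty) FROM receivings WHERE pid = 42").fetchone()[0]
print(total_received)  # 10
```

Note that the unlinked receiving (`source_po_id` NULL) still counts toward received inventory, which is exactly the property the single-table approaches lost.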
**How the Import Script Changes:**

1. **Fetch POs:** Fetch data from `po` and `po_products`.
2. **Populate `purchase_orders`:**
   * Insert/update rows in `purchase_orders` based directly on the fetched PO data.
   * Set `po_id`, `pid`, `ordered`, `po_cost_price`, `date` (`COALESCE(date_ordered, date_created)`), `expected_date`.
   * Set `status` by mapping the source `po.status` code directly ('ordered', 'canceled', 'done', etc.).
   * **No complex allocation needed here.**
3. **Fetch Receivings:** Fetch data from `receivings` and `receivings_products`.
4. **Populate `receivings`:**
   * For *every* line item fetched from `receivings_products`:
     * Perform the necessary data validation (dates, numbers).
     * Insert a new row into `receivings` with all the relevant details (`receiving_id`, `pid`, `received_qty`, `cost_each`, `received_date`, `received_by`, `source_po_id`, `source_receiving_status`).
     * Use `ON CONFLICT (receiving_id, pid)` (or a similar unique key based on your source data) `DO UPDATE SET ...` for incremental updates, or simply delete and re-insert by `receiving_id` if performance allows.
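The upsert in step 4 can be sketched like this (SQLite shown for a runnable example; the same `ON CONFLICT ... DO UPDATE` syntax works in PostgreSQL, and the column list is the assumed one from above, trimmed for brevity):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE receivings (
    receiving_id TEXT,
    pid          INTEGER,
    received_qty INTEGER,
    cost_each    REAL,
    PRIMARY KEY (receiving_id, pid)   -- the conflict target needs a unique key
)""")

upsert = """
INSERT INTO receivings (receiving_id, pid, received_qty, cost_each)
VALUES (?, ?, ?, ?)
ON CONFLICT (receiving_id, pid) DO UPDATE SET
    received_qty = excluded.received_qty,
    cost_each    = excluded.cost_each
"""

conn.execute(upsert, ("R-1", 42, 5, 2.50))  # first import run
conn.execute(upsert, ("R-1", 42, 8, 2.45))  # re-import: same key, corrected values

qty, cost = conn.execute(
    "SELECT received_qty, cost_each FROM receivings").fetchone()
print(qty, cost)  # 8 2.45
```

Re-running the import is then idempotent per `(receiving_id, pid)` line, which is what makes incremental updates safe.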
**Impact on Downstream Scripts (and how to adapt):**

* **Initial Query (Active POs):**
  * `SELECT ... FROM purchase_orders po WHERE po.status NOT IN ('canceled', 'done', 'paid_equivalent_status?') AND po.date >= ...`
  * `active_pos`: `COUNT(DISTINCT po.po_id)` over the filtered POs.
  * `overdue_pos`: Add `AND po.expected_date < CURRENT_DATE`.
  * `total_units`: `SUM(po.ordered)`. Total units *ordered* on active POs.
  * `total_cost`: `SUM(po.ordered * po.po_cost_price)`. Cost of units *ordered*.
  * `total_retail`: `SUM(po.ordered * pm.current_price)`. Retail value of units *ordered*.
  * **Result:** This query now cleanly reports on the status of *orders* placed, which is closer to its original intent. The filter `po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')` is replaced by `po.status NOT IN ('canceled', 'done', 'paid_equivalent?')`. The 90% received check is removed because `received` is no longer reliably tracked *on the PO*.
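The aggregates above, run against sample data, behave like this sketch (SQLite; the stand-in `product_metrics` table supplying `pm.current_price` and the two-status filter are assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE purchase_orders (po_id TEXT, pid INTEGER, ordered INTEGER,
                              po_cost_price REAL, expected_date TEXT, status TEXT);
CREATE TABLE product_metrics (pid INTEGER, current_price REAL);
INSERT INTO purchase_orders VALUES
    ('PO-1', 1, 10, 2.0, '2099-01-01', 'ordered'),
    ('PO-2', 2,  5, 4.0, '2000-01-01', 'ordered'),  -- overdue
    ('PO-3', 1,  8, 2.0, '2099-01-01', 'done');     -- excluded by status filter
INSERT INTO product_metrics VALUES (1, 5.0), (2, 9.0);
""")

row = conn.execute("""
    SELECT COUNT(DISTINCT po.po_id)           AS active_pos,
           SUM(po.ordered)                    AS total_units,
           SUM(po.ordered * po.po_cost_price) AS total_cost,
           SUM(po.ordered * pm.current_price) AS total_retail
    FROM purchase_orders po
    JOIN product_metrics pm USING (pid)
    WHERE po.status NOT IN ('canceled', 'done')
""").fetchone()
print(row)  # (2, 15, 40.0, 95.0)
```

Only order-side columns are touched, so the query stays valid even when no receiving has been recorded yet.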
* **`daily_product_snapshots`:**
  * **`SalesData` CTE:** No change needed.
  * **`ReceivingData` CTE:** **Must be changed.** Query the **`receivings`** table instead of `purchase_orders`.

    ```sql
    ReceivingData AS (
        SELECT
            rl.pid,
            COUNT(DISTINCT rl.receiving_id) AS receiving_doc_count,
            SUM(rl.received_qty) AS units_received,
            SUM(rl.received_qty * rl.cost_each) AS cost_received
        FROM public.receivings rl
        WHERE rl.received_date::date = _date
        -- Optional: filter out canceled receivings if needed
        -- AND rl.source_receiving_status <> 'canceled'
        GROUP BY rl.pid
    ),
    ```

  * **Result:** This now accurately reflects *all* units received on a given day, from the definitive source.
* **`update_product_metrics`:**
  * **`CurrentInfo` CTE:** No change needed (pulls from `products`).
  * **`OnOrderInfo` CTE:** Needs re-evaluation. How do you want to define "On Order"?
    * **Option A (Strict PO View):** `SUM(po.ordered)` from `purchase_orders po WHERE po.status NOT IN ('canceled', 'done', 'paid_equivalent?')`. This is the quantity on *open orders*, ignoring fulfillment state. Simple, but might overestimate if items arrived unlinked.
    * **Option B (Approximate Fulfillment):** `SUM(po.ordered)` from open POs minus `SUM(rl.received_qty)` from `receivings rl` where `rl.source_po_id = po.po_id` (summing only directly linked receivings). Better, but still misses fulfillment via unlinked receivings.
    * **Option C (Heuristic):** `SUM(po.ordered)` from open POs minus `SUM(rl.received_qty)` from `receivings rl` where `rl.pid = po.pid` and `rl.received_date >= po.date`. This *tries* to account for unlinked receivings but is imprecise.
    * **Recommendation:** Start with **Option A** for simplicity, clearly labeling it "Quantity on Open POs". You might need a separate process or metric for a more nuanced view of expected vs. actual pipeline.

    ```sql
    -- Example for Option A
    OnOrderInfo AS (
        SELECT
            pid,
            SUM(ordered) AS on_order_qty,                 -- Total qty on open POs
            SUM(ordered * po_cost_price) AS on_order_cost -- Cost of qty on open POs
        FROM public.purchase_orders
        WHERE status NOT IN ('canceled', 'done', 'paid_equivalent?') -- Define your open statuses
        GROUP BY pid
    ),
    ```
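If Option A proves too coarse, Option B might be sketched as below (SQLite for a runnable example; the pre-aggregated LEFT JOIN on `(source_po_id, pid)` is an assumption to avoid row multiplication, and the status list is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE purchase_orders (po_id TEXT, pid INTEGER, ordered INTEGER, status TEXT);
CREATE TABLE receivings (receiving_id TEXT, pid INTEGER,
                         received_qty INTEGER, source_po_id TEXT);
INSERT INTO purchase_orders VALUES ('PO-1', 42, 10, 'ordered');
INSERT INTO receivings VALUES ('R-1', 42, 6, 'PO-1');  -- linked: subtracted
INSERT INTO receivings VALUES ('R-2', 42, 4, NULL);    -- unlinked: missed by Option B
""")

row = conn.execute("""
    SELECT po.pid,
           SUM(po.ordered) - COALESCE(SUM(r.linked_qty), 0) AS on_order_qty
    FROM purchase_orders po
    LEFT JOIN (
        -- aggregate first so each PO line joins at most one receiving total
        SELECT source_po_id, pid, SUM(received_qty) AS linked_qty
        FROM receivings
        WHERE source_po_id IS NOT NULL
        GROUP BY source_po_id, pid
    ) r ON r.source_po_id = po.po_id AND r.pid = po.pid
    WHERE po.status NOT IN ('canceled', 'done')
    GROUP BY po.pid
""").fetchone()
print(row)  # (42, 4): 10 ordered minus 6 linked; the 4 unlinked units are not subtracted
```

The sample data makes Option B's blind spot concrete: the unlinked receiving fulfills part of the order but never reduces `on_order_qty`.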
* **`HistoricalDates` CTE:**
  * `date_first_sold`, `max_order_date`: No change (queries `orders`).
  * `date_first_received_calc`, `date_last_received_calc`: **Must be changed.** Query `MIN(rl.received_date)` and `MAX(rl.received_date)` from the **`receivings`** table, grouped by `pid`.
* **`SnapshotAggregates` CTE:**
  * `received_qty_30d`, `received_cost_30d`: These are calculated from `daily_product_snapshots`, which is now correctly sourced from `receivings`, so this part is fine.
* **Forecasting Calculations:** Will use the chosen definition of `on_order_qty`. Be aware of the implications of Option A (potentially inflated if unlinked receivings fulfill orders).
* **Result:** Metrics are calculated from distinct order data and complete receiving data. The definition of "on order" needs careful consideration.
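The `HistoricalDates` change is a plain MIN/MAX per product, which can be sketched as (SQLite; table trimmed to the relevant columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE receivings (receiving_id TEXT, pid INTEGER,
                         received_qty INTEGER, received_date TEXT);
INSERT INTO receivings VALUES ('R-1', 42, 5, '2023-11-02');
INSERT INTO receivings VALUES ('R-2', 42, 3, '2024-03-15');
INSERT INTO receivings VALUES ('R-3', 42, 2, '2024-01-20');
""")

row = conn.execute("""
    SELECT pid,
           MIN(received_date) AS date_first_received_calc,
           MAX(received_date) AS date_last_received_calc
    FROM receivings
    GROUP BY pid
""").fetchone()
print(row)  # (42, '2023-11-02', '2024-03-15')
```

Because every receiving line lands in this table, linked or not, the first/last received dates are complete rather than limited to PO-matched receipts.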
**Summary of this Approach:**

* **Pros:**
  * Accurately models distinct order and receiving events.
  * Provides a definitive source (`receivings`) for all received inventory.
  * Simplifies the `purchase_orders` table and its import logic.
  * Avoids complex, potentially inaccurate allocation logic for unlinked receivings within the main tables.
  * Avoids synthetic records.
  * Fixes downstream reporting (`daily_snapshots` receiving data).
* **Cons:**
  * Requires creating/managing the `receivings` table.
  * Requires modifying downstream queries (`ReceivingData`, `OnOrderInfo`, `HistoricalDates`).
  * Calculating a precise "net quantity still expected to arrive" (true on-order minus all relevant fulfillment) becomes more complex and may require specific business rules or heuristics outside the basic table structure if Option A for `OnOrderInfo` isn't sufficient.

This two-table approach (`purchase_orders` + `receivings`) seems the most robust and accurate way to handle your requirement for complete receiving records independent of potentially flawed PO linking. It directly addresses the shortcomings of the previous attempts.
```diff
@@ -35,7 +35,7 @@ global.pool = pool;
 app.use(express.json());
 app.use(morgan('combined'));
 app.use(cors({
-  origin: ['http://localhost:5173', 'http://localhost:5174', 'https://inventory.kent.pw'],
+  origin: ['http://localhost:5175', 'http://localhost:5174', 'https://inventory.kent.pw'],
   credentials: true
 }));
```
881
inventory-server/chat/db-convert/mongo_to_postgres_converter.py
Normal file
881
inventory-server/chat/db-convert/mongo_to_postgres_converter.py
Normal file
@@ -0,0 +1,881 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
"""
|
||||||
|
MongoDB to PostgreSQL Converter for Rocket.Chat
|
||||||
|
Converts MongoDB BSON export files to PostgreSQL database
|
||||||
|
|
||||||
|
Usage:
|
||||||
|
python3 mongo_to_postgres_converter.py \
|
||||||
|
--mongo-path db/database/62df06d44234d20001289144 \
|
||||||
|
--pg-database rocketchat_converted \
|
||||||
|
--pg-user rocketchat_user \
|
||||||
|
--pg-password your_password \
|
||||||
|
--debug
|
||||||
|
"""
|
||||||
|
|
||||||
|
import json
|
||||||
|
import os
|
||||||
|
import re
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
import struct
|
||||||
|
from datetime import datetime
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import Dict, Any, List, Optional
|
||||||
|
import argparse
|
||||||
|
import traceback
|
||||||
|
|
||||||
|
# Auto-install dependencies if needed
|
||||||
|
try:
|
||||||
|
import bson
|
||||||
|
import psycopg2
|
||||||
|
except ImportError:
|
||||||
|
print("Installing required packages...")
|
||||||
|
subprocess.check_call([sys.executable, "-m", "pip", "install", "pymongo", "psycopg2-binary"])
|
||||||
|
import bson
|
||||||
|
import psycopg2
|
||||||
|
|
||||||
|
class MongoToPostgresConverter:
|
||||||
|
def __init__(self, mongo_db_path: str, postgres_config: Dict[str, str], debug_mode: bool = False, debug_collections: List[str] = None):
|
||||||
|
self.mongo_db_path = Path(mongo_db_path)
|
||||||
|
self.postgres_config = postgres_config
|
||||||
|
self.debug_mode = debug_mode
|
||||||
|
self.debug_collections = debug_collections or []
|
||||||
|
self.collections = {}
|
||||||
|
self.schema_info = {}
|
||||||
|
self.error_log = {}
|
||||||
|
|
||||||
|
def log_debug(self, message: str, collection: str = None):
|
||||||
|
"""Log debug messages if debug mode is enabled and collection is in debug list"""
|
||||||
|
if self.debug_mode and (not self.debug_collections or collection in self.debug_collections):
|
||||||
|
print(f"DEBUG: {message}")
|
||||||
|
|
||||||
|
def log_error(self, collection: str, error_type: str, details: str):
|
||||||
|
"""Log detailed error information"""
|
||||||
|
if collection not in self.error_log:
|
||||||
|
self.error_log[collection] = []
|
||||||
|
self.error_log[collection].append({
|
||||||
|
'type': error_type,
|
||||||
|
'details': details,
|
||||||
|
'timestamp': datetime.now().isoformat()
|
||||||
|
})
|
||||||
|
|
||||||
|
def sample_documents(self, collection_name: str, max_samples: int = 3) -> List[Dict]:
|
||||||
|
"""Sample documents from a collection for debugging"""
|
||||||
|
if not self.debug_mode or (self.debug_collections and collection_name not in self.debug_collections):
|
||||||
|
return []
|
||||||
|
|
||||||
|
print(f"\n🔍 Sampling documents from {collection_name}:")
|
||||||
|
|
||||||
|
bson_file = self.collections[collection_name]['bson_file']
|
||||||
|
if bson_file.stat().st_size == 0:
|
||||||
|
print(" Collection is empty")
|
||||||
|
return []
|
||||||
|
|
||||||
|
samples = []
|
||||||
|
|
||||||
|
try:
|
||||||
|
with open(bson_file, 'rb') as f:
|
||||||
|
sample_count = 0
|
||||||
|
while sample_count < max_samples:
|
||||||
|
try:
|
||||||
|
doc_size = int.from_bytes(f.read(4), byteorder='little')
|
||||||
|
if doc_size <= 0:
|
||||||
|
break
|
||||||
|
f.seek(-4, 1)
|
||||||
|
doc_bytes = f.read(doc_size)
|
||||||
|
if len(doc_bytes) != doc_size:
|
||||||
|
break
|
||||||
|
|
||||||
|
doc = bson.decode(doc_bytes)
|
||||||
|
samples.append(doc)
|
||||||
|
sample_count += 1
|
||||||
|
|
||||||
|
print(f" Sample {sample_count} - Keys: {list(doc.keys())}")
|
||||||
|
# Show a few key fields with their types and truncated values
|
||||||
|
for key, value in list(doc.items())[:3]:
|
||||||
|
value_preview = str(value)[:50] + "..." if len(str(value)) > 50 else str(value)
|
||||||
|
print(f" {key}: {type(value).__name__} = {value_preview}")
|
||||||
|
if len(doc) > 3:
|
||||||
|
print(f" ... and {len(doc) - 3} more fields")
|
||||||
|
print()
|
||||||
|
|
||||||
|
except (bson.InvalidBSON, struct.error, OSError) as e:
|
||||||
|
self.log_error(collection_name, 'document_parsing', str(e))
|
||||||
|
break
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
self.log_error(collection_name, 'file_reading', str(e))
|
||||||
|
print(f" Error reading collection: {e}")
|
||||||
|
|
||||||
|
return samples
|
||||||
|
|
||||||
|
def discover_collections(self):
|
||||||
|
"""Discover all BSON files and their metadata"""
|
||||||
|
print("Discovering MongoDB collections...")
|
||||||
|
|
||||||
|
for bson_file in self.mongo_db_path.glob("*.bson"):
|
||||||
|
collection_name = bson_file.stem
|
||||||
|
metadata_file = bson_file.with_suffix(".metadata.json")
|
||||||
|
|
||||||
|
# Read metadata if available
|
||||||
|
metadata = {}
|
||||||
|
if metadata_file.exists():
|
||||||
|
try:
|
||||||
|
with open(metadata_file, 'r', encoding='utf-8') as f:
|
||||||
|
metadata = json.load(f)
|
||||||
|
except (UnicodeDecodeError, json.JSONDecodeError) as e:
|
||||||
|
print(f"Warning: Could not read metadata for {collection_name}: {e}")
|
||||||
|
metadata = {}
|
||||||
|
|
||||||
|
# Get file size and document count estimate
|
||||||
|
file_size = bson_file.stat().st_size
|
||||||
|
doc_count = self._estimate_document_count(bson_file)
|
||||||
|
|
||||||
|
self.collections[collection_name] = {
|
||||||
|
'bson_file': bson_file,
|
||||||
|
'metadata': metadata,
|
||||||
|
'file_size': file_size,
|
||||||
|
'estimated_docs': doc_count
|
||||||
|
}
|
||||||
|
|
||||||
|
print(f"Found {len(self.collections)} collections")
|
||||||
|
for name, info in self.collections.items():
|
||||||
|
print(f" - {name}: {info['file_size']/1024/1024:.1f}MB (~{info['estimated_docs']} docs)")
|
||||||
|
|
||||||
|
def _estimate_document_count(self, bson_file: Path) -> int:
|
||||||
|
"""Estimate document count by reading first few documents"""
|
||||||
|
if bson_file.stat().st_size == 0:
|
||||||
|
return 0
|
||||||
|
|
||||||
|
try:
|
||||||
|
with open(bson_file, 'rb') as f:
|
||||||
|
docs_sampled = 0
|
||||||
|
bytes_sampled = 0
|
||||||
|
max_sample_size = min(1024 * 1024, bson_file.stat().st_size) # 1MB or file size
|
||||||
|
|
||||||
|
while bytes_sampled < max_sample_size:
|
||||||
|
try:
|
||||||
|
doc_size = int.from_bytes(f.read(4), byteorder='little')
|
||||||
|
if doc_size <= 0 or doc_size > 16 * 1024 * 1024: # MongoDB doc size limit
|
||||||
|
break
|
||||||
|
f.seek(-4, 1) # Go back
|
||||||
|
doc_bytes = f.read(doc_size)
|
||||||
|
if len(doc_bytes) != doc_size:
|
||||||
|
break
|
||||||
|
bson.decode(doc_bytes) # Validate it's a valid BSON document
|
||||||
|
docs_sampled += 1
|
||||||
|
bytes_sampled += doc_size
|
||||||
|
except (bson.InvalidBSON, struct.error, OSError):
|
||||||
|
break
|
||||||
|
|
||||||
|
if docs_sampled > 0 and bytes_sampled > 0:
|
||||||
|
avg_doc_size = bytes_sampled / docs_sampled
|
||||||
|
return int(bson_file.stat().st_size / avg_doc_size)
|
||||||
|
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
|
return 0
|
||||||
|
|
||||||
|
def analyze_schema(self, collection_name: str, sample_size: int = 100) -> Dict[str, Any]:
|
||||||
|
"""Analyze collection schema by sampling documents"""
|
||||||
|
print(f"Analyzing schema for {collection_name}...")
|
||||||
|
|
||||||
|
bson_file = self.collections[collection_name]['bson_file']
|
||||||
|
if bson_file.stat().st_size == 0:
|
||||||
|
return {}
|
||||||
|
|
||||||
|
schema = {}
|
||||||
|
docs_analyzed = 0
|
||||||
|
|
||||||
|
try:
|
||||||
|
with open(bson_file, 'rb') as f:
|
||||||
|
while docs_analyzed < sample_size:
|
||||||
|
try:
|
||||||
|
doc_size = int.from_bytes(f.read(4), byteorder='little')
|
||||||
|
if doc_size <= 0:
|
||||||
|
break
|
||||||
|
f.seek(-4, 1)
|
||||||
|
doc_bytes = f.read(doc_size)
|
||||||
|
if len(doc_bytes) != doc_size:
|
||||||
|
break
|
||||||
|
|
||||||
|
doc = bson.decode(doc_bytes)
|
||||||
|
self._analyze_document_schema(doc, schema)
|
||||||
|
docs_analyzed += 1
|
||||||
|
|
||||||
|
except (bson.InvalidBSON, struct.error, OSError):
|
||||||
|
break
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error analyzing {collection_name}: {e}")
|
||||||
|
|
||||||
|
self.schema_info[collection_name] = schema
|
||||||
|
return schema
|
||||||
|
|
||||||
|
def _analyze_document_schema(self, doc: Dict[str, Any], schema: Dict[str, Any], prefix: str = ""):
|
||||||
|
"""Recursively analyze document structure"""
|
||||||
|
for key, value in doc.items():
|
||||||
|
full_key = f"{prefix}.{key}" if prefix else key
|
||||||
|
|
||||||
|
if full_key not in schema:
|
||||||
|
schema[full_key] = {
|
||||||
|
'types': set(),
|
||||||
|
'null_count': 0,
|
||||||
|
'total_count': 0,
|
||||||
|
'is_array': False,
|
||||||
|
'nested_schema': {}
|
||||||
|
}
|
||||||
|
|
||||||
|
schema[full_key]['total_count'] += 1
|
||||||
|
|
||||||
|
if value is None:
|
||||||
|
schema[full_key]['null_count'] += 1
|
||||||
|
schema[full_key]['types'].add('null')
|
||||||
|
elif isinstance(value, dict):
|
||||||
|
schema[full_key]['types'].add('object')
|
||||||
|
if 'nested_schema' not in schema[full_key]:
|
||||||
|
schema[full_key]['nested_schema'] = {}
|
||||||
|
self._analyze_document_schema(value, schema[full_key]['nested_schema'])
|
||||||
|
elif isinstance(value, list):
|
||||||
|
schema[full_key]['types'].add('array')
|
||||||
|
schema[full_key]['is_array'] = True
|
||||||
|
if value and isinstance(value[0], dict):
|
||||||
|
if 'array_item_schema' not in schema[full_key]:
|
||||||
|
schema[full_key]['array_item_schema'] = {}
|
||||||
|
for item in value[:5]: # Sample first 5 items
|
||||||
|
if isinstance(item, dict):
|
||||||
|
self._analyze_document_schema(item, schema[full_key]['array_item_schema'])
|
||||||
|
else:
|
||||||
|
schema[full_key]['types'].add(type(value).__name__)
|
||||||
|
|
||||||
|
def generate_postgres_schema(self) -> Dict[str, str]:
|
||||||
|
"""Generate PostgreSQL CREATE TABLE statements"""
|
||||||
|
print("Generating PostgreSQL schema...")
|
||||||
|
|
||||||
|
table_definitions = {}
|
||||||
|
|
||||||
|
for collection_name, schema in self.schema_info.items():
|
||||||
|
if not schema: # Empty collection
|
||||||
|
continue
|
||||||
|
|
||||||
|
table_name = self._sanitize_table_name(collection_name)
|
||||||
|
columns = []
|
||||||
|
|
||||||
|
# Always add an id column (PostgreSQL doesn't use _id like MongoDB)
|
||||||
|
columns.append("id SERIAL PRIMARY KEY")
|
||||||
|
|
||||||
|
for field_name, field_info in schema.items():
|
||||||
|
if field_name == '_id':
|
||||||
|
columns.append("mongo_id TEXT") # Always allow NULL for mongo_id
|
||||||
|
continue
|
||||||
|
|
||||||
|
col_name = self._sanitize_column_name(field_name)
|
||||||
|
|
||||||
|
# Handle conflicts with PostgreSQL auto-generated columns
|
||||||
|
if col_name in ['id', 'mongo_id', 'created_at', 'updated_at']:
|
||||||
|
col_name = f"field_{col_name}"
|
||||||
|
|
||||||
|
col_type = self._determine_postgres_type(field_info)
|
||||||
|
|
||||||
|
# Make all fields nullable by default to avoid constraint violations
|
||||||
|
columns.append(f"{col_name} {col_type}")
|
||||||
|
|
||||||
|
# Add metadata columns
|
||||||
|
columns.extend([
|
||||||
|
"created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP",
|
||||||
|
"updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP"
|
||||||
|
])
|
||||||
|
|
||||||
|
column_definitions = ',\n '.join(columns)
|
||||||
|
table_sql = f"""
|
||||||
|
CREATE TABLE IF NOT EXISTS {table_name} (
|
||||||
|
{column_definitions}
|
||||||
|
);
|
||||||
|
|
||||||
|
-- Create indexes based on MongoDB indexes
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Get list of actual columns that will exist in the table
|
||||||
|
existing_columns = set(['id', 'mongo_id', 'created_at', 'updated_at'])
|
||||||
|
for field_name in schema.keys():
|
||||||
|
if field_name != '_id':
|
||||||
|
col_name = self._sanitize_column_name(field_name)
|
||||||
|
# Handle conflicts with PostgreSQL auto-generated columns
|
||||||
|
if col_name in ['id', 'mongo_id', 'created_at', 'updated_at']:
|
||||||
|
col_name = f"field_{col_name}"
|
||||||
|
existing_columns.add(col_name)
|
||||||
|
|
||||||
|
# Add indexes from MongoDB metadata
|
||||||
|
metadata = self.collections[collection_name].get('metadata', {})
|
||||||
|
indexes = metadata.get('indexes', [])
|
||||||
|
|
||||||
|
for index in indexes:
|
||||||
|
if index['name'] != '_id_': # Skip the default _id index
|
||||||
|
# Sanitize index name - remove special characters
|
||||||
|
sanitized_index_name = re.sub(r'[^a-zA-Z0-9_]', '_', index['name'])
|
||||||
|
index_name = f"idx_{table_name}_{sanitized_index_name}"
|
||||||
|
index_keys = list(index['key'].keys())
|
||||||
|
if index_keys:
|
||||||
|
sanitized_keys = []
|
||||||
|
for key in index_keys:
|
||||||
|
if key != '_id':
|
||||||
|
sanitized_key = self._sanitize_column_name(key)
|
||||||
|
# Handle conflicts with PostgreSQL auto-generated columns
|
||||||
|
if sanitized_key in ['id', 'mongo_id', 'created_at', 'updated_at']:
|
||||||
|
sanitized_key = f"field_{sanitized_key}"
|
||||||
|
# Only add if the column actually exists in our table
|
||||||
|
if sanitized_key in existing_columns:
|
||||||
|
sanitized_keys.append(sanitized_key)
|
||||||
|
|
||||||
|
if sanitized_keys:
|
||||||
|
table_sql += f"CREATE INDEX IF NOT EXISTS {index_name} ON {table_name} ({', '.join(sanitized_keys)});\n"
|
||||||
|
|
||||||
|
table_definitions[collection_name] = table_sql
|
||||||
|
|
||||||
|
return table_definitions
|
||||||
|
|
||||||
|
def _sanitize_table_name(self, name: str) -> str:
|
||||||
|
"""Convert MongoDB collection name to PostgreSQL table name"""
|
||||||
|
# Remove rocketchat_ prefix if present
|
||||||
|
if name.startswith('rocketchat_'):
|
||||||
|
name = name[11:]
|
||||||
|
|
||||||
|
# Replace special characters with underscores
|
||||||
|
name = re.sub(r'[^a-zA-Z0-9_]', '_', name)
|
||||||
|
|
||||||
|
# Ensure it starts with a letter
|
||||||
|
if name and name[0].isdigit():
|
||||||
|
name = 'table_' + name
|
||||||
|
|
||||||
|
return name.lower()
|
||||||
|
|
||||||
|
def _sanitize_column_name(self, name: str) -> str:
|
||||||
|
"""Convert MongoDB field name to PostgreSQL column name"""
|
||||||
|
# Handle nested field names (convert dots to underscores)
|
||||||
|
name = name.replace('.', '_')
|
||||||
|
|
||||||
|
# Replace special characters with underscores
|
||||||
|
name = re.sub(r'[^a-zA-Z0-9_]', '_', name)
|
||||||
|
|
||||||
|
# Ensure it starts with a letter or underscore
|
||||||
|
if name and name[0].isdigit():
|
||||||
|
name = 'col_' + name
|
||||||
|
|
||||||
|
# Handle PostgreSQL reserved words
|
||||||
|
reserved = {
|
||||||
|
'user', 'order', 'group', 'table', 'index', 'key', 'value', 'date', 'time', 'timestamp',
|
||||||
|
'default', 'select', 'from', 'where', 'insert', 'update', 'delete', 'create', 'drop',
|
||||||
|
'alter', 'grant', 'revoke', 'commit', 'rollback', 'begin', 'end', 'case', 'when',
|
||||||
|
'then', 'else', 'if', 'null', 'not', 'and', 'or', 'in', 'exists', 'between',
|
||||||
|
'like', 'limit', 'offset', 'union', 'join', 'inner', 'outer', 'left', 'right',
|
||||||
|
'full', 'cross', 'natural', 'on', 'using', 'distinct', 'all', 'any', 'some',
|
||||||
|
'desc', 'asc', 'primary', 'foreign', 'references', 'constraint', 'unique',
|
||||||
|
'check', 'cascade', 'restrict', 'action', 'match', 'partial', 'full'
|
||||||
|
}
|
||||||
|
if name.lower() in reserved:
|
||||||
|
name = name + '_col'
|
||||||
|
|
||||||
|
return name.lower()
|
||||||
|
|
||||||
|
def _determine_postgres_type(self, field_info: Dict[str, Any]) -> str:
|
||||||
|
"""Determine PostgreSQL column type from MongoDB field analysis with improved logic"""
|
||||||
|
types = field_info['types']
|
||||||
|
|
||||||
|
# Convert set to list for easier checking
|
||||||
|
type_list = list(types)
|
||||||
|
|
||||||
|
# If there's only one type (excluding null), use specific typing
|
||||||
|
non_null_types = [t for t in type_list if t != 'null']
|
||||||
|
|
||||||
|
if len(non_null_types) == 1:
|
||||||
|
single_type = non_null_types[0]
|
||||||
|
if single_type == 'bool':
|
||||||
|
return 'BOOLEAN'
|
||||||
|
elif single_type == 'int':
|
||||||
|
return 'INTEGER'
|
||||||
|
elif single_type == 'float':
|
||||||
|
return 'NUMERIC'
|
||||||
|
elif single_type == 'str':
|
||||||
|
return 'TEXT'
|
||||||
|
elif single_type == 'datetime':
|
||||||
|
return 'TIMESTAMP'
|
||||||
|
elif single_type == 'ObjectId':
|
||||||
|
return 'TEXT'
|
||||||
|
|
||||||
|
# Handle mixed types more conservatively
|
||||||
|
if 'array' in types or field_info.get('is_array', False):
|
||||||
|
return 'JSONB' # Arrays always go to JSONB
|
||||||
|
elif 'object' in types:
|
||||||
|
return 'JSONB' # Objects always go to JSONB
|
||||||
|
elif len(non_null_types) > 1:
|
||||||
|
# Multiple non-null types - check for common combinations
|
||||||
|
if set(non_null_types) <= {'int', 'float'}:
|
||||||
|
return 'NUMERIC' # Can handle both int and float
|
||||||
|
elif set(non_null_types) <= {'bool', 'str'}:
|
||||||
|
return 'TEXT' # Convert everything to text
|
||||||
|
elif set(non_null_types) <= {'str', 'ObjectId'}:
|
||||||
|
return 'TEXT' # Both are string-like
|
||||||
|
else:
|
||||||
|
return 'JSONB' # Complex mixed types go to JSONB
|
||||||
|
elif 'ObjectId' in types:
|
||||||
|
return 'TEXT'
|
||||||
|
elif 'datetime' in types:
|
||||||
|
return 'TIMESTAMP'
|
||||||
|
elif 'bool' in types:
|
||||||
|
return 'BOOLEAN'
|
||||||
|
elif 'int' in types:
|
||||||
|
return 'INTEGER'
|
||||||
|
elif 'float' in types:
|
||||||
|
return 'NUMERIC'
|
||||||
|
elif 'str' in types:
|
||||||
|
return 'TEXT'
|
||||||
|
else:
|
||||||
|
return 'TEXT' # Default fallback
|
||||||
|
|
||||||
|
    def create_postgres_database(self, table_definitions: Dict[str, str]):
        """Create PostgreSQL database and tables"""
        print("Creating PostgreSQL database schema...")

        try:
            # Connect to PostgreSQL
            conn = psycopg2.connect(**self.postgres_config)
            conn.autocommit = True
            cursor = conn.cursor()

            # Create tables
            for collection_name, table_sql in table_definitions.items():
                print(f"Creating table for {collection_name}...")
                cursor.execute(table_sql)

            cursor.close()
            conn.close()
            print("Database schema created successfully!")

        except Exception as e:
            print(f"Error creating database schema: {e}")
            raise

    def convert_and_insert_data(self, batch_size: int = 1000):
        """Convert BSON data and insert into PostgreSQL"""
        print("Converting and inserting data...")

        try:
            conn = psycopg2.connect(**self.postgres_config)
            conn.autocommit = False

            for collection_name in self.collections:
                print(f"Processing {collection_name}...")
                self._convert_collection(conn, collection_name, batch_size)

            conn.close()
            print("Data conversion completed successfully!")

        except Exception as e:
            print(f"Error converting data: {e}")
            raise

    def _convert_collection(self, conn, collection_name: str, batch_size: int):
        """Convert a single collection"""
        bson_file = self.collections[collection_name]['bson_file']

        if bson_file.stat().st_size == 0:
            print(f"  Skipping empty collection {collection_name}")
            return

        table_name = self._sanitize_table_name(collection_name)
        cursor = conn.cursor()

        batch = []
        total_inserted = 0
        errors = 0

        try:
            with open(bson_file, 'rb') as f:
                while True:
                    try:
                        doc_size = int.from_bytes(f.read(4), byteorder='little')
                        if doc_size <= 0:
                            break
                        f.seek(-4, 1)
                        doc_bytes = f.read(doc_size)
                        if len(doc_bytes) != doc_size:
                            break

                        doc = bson.decode(doc_bytes)
                        batch.append(doc)

                        if len(batch) >= batch_size:
                            inserted, batch_errors = self._insert_batch(cursor, table_name, batch, collection_name)
                            total_inserted += inserted
                            errors += batch_errors
                            batch = []
                            conn.commit()
                            if total_inserted % 5000 == 0:  # Less frequent progress updates
                                print(f"  Inserted {total_inserted} documents...")

                    except (bson.InvalidBSON, struct.error, OSError):
                        break

            # Insert remaining documents
            if batch:
                inserted, batch_errors = self._insert_batch(cursor, table_name, batch, collection_name)
                total_inserted += inserted
                errors += batch_errors
                conn.commit()

            if errors > 0:
                print(f"  Completed {collection_name}: {total_inserted} documents inserted ({errors} errors)")
            else:
                print(f"  Completed {collection_name}: {total_inserted} documents inserted")

        except Exception as e:
            print(f"  Error processing {collection_name}: {e}")
            conn.rollback()
        finally:
            cursor.close()

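The read loop above relies on BSON's on-disk framing: each document starts with its own total length as a 32-bit little-endian integer (the length includes the four length bytes themselves), so the loop peeks four bytes, seeks back, and reads the whole document in one go. A minimal sketch of that framing with plain `struct`, no `bson` dependency (`split_framed_docs` is a hypothetical helper for illustration):

```python
import io
import struct

def split_framed_docs(raw: bytes):
    """Yield each length-prefixed document from a concatenated BSON-style dump."""
    f = io.BytesIO(raw)
    while True:
        header = f.read(4)
        if len(header) < 4:
            break
        (doc_size,) = struct.unpack('<i', header)  # little-endian int32, counts itself
        if doc_size <= 0:
            break
        f.seek(-4, 1)               # rewind so the document is read in one piece
        doc = f.read(doc_size)
        if len(doc) != doc_size:    # truncated tail
            break
        yield doc

# Two fake "documents": 8 bytes each (4-byte length prefix + 4 payload bytes)
blob = struct.pack('<i', 8) + b'aaaa' + struct.pack('<i', 8) + b'bbbb'
print([d[4:] for d in split_framed_docs(blob)])  # [b'aaaa', b'bbbb']
```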
    def _insert_batch(self, cursor, table_name: str, documents: List[Dict], collection_name: str):
        """Insert a batch of documents with proper transaction handling"""
        if not documents:
            return 0, 0

        # Get schema info for this collection
        schema = self.schema_info.get(collection_name, {})

        # Build column list
        columns = ['mongo_id']
        for field_name in schema.keys():
            if field_name != '_id':
                col_name = self._sanitize_column_name(field_name)
                # Handle conflicts with PostgreSQL auto-generated columns
                if col_name in ['id', 'mongo_id', 'created_at', 'updated_at']:
                    col_name = f"field_{col_name}"
                columns.append(col_name)

        # Build INSERT statement
        placeholders = ', '.join(['%s'] * len(columns))
        sql = f"INSERT INTO {table_name} ({', '.join(columns)}) VALUES ({placeholders})"

        self.log_debug(f"SQL: {sql}", collection_name)

        # Convert documents to tuples
        rows = []
        errors = 0

        for doc_idx, doc in enumerate(documents):
            try:
                row = []

                # Add mongo_id
                row.append(str(doc.get('_id', '')))

                # Add other fields
                for field_name in schema.keys():
                    if field_name != '_id':
                        try:
                            value = self._get_nested_value(doc, field_name)
                            converted_value = self._convert_value_for_postgres(value, field_name, schema)
                            row.append(converted_value)
                        except Exception as e:
                            self.log_error(collection_name, 'field_conversion',
                                           f"Field '{field_name}' in doc {doc_idx}: {str(e)}")
                            # Only show debug for collections we're focusing on
                            if collection_name in self.debug_collections:
                                print(f"  ⚠️  Error converting field '{field_name}': {e}")
                            row.append(None)  # Use NULL for problematic fields

                rows.append(tuple(row))

            except Exception as e:
                self.log_error(collection_name, 'document_conversion', f"Document {doc_idx}: {str(e)}")
                errors += 1
                continue

        # Execute batch insert
        if rows:
            try:
                cursor.executemany(sql, rows)
                return len(rows), errors
            except Exception as batch_error:
                self.log_error(collection_name, 'batch_insert', str(batch_error))

                # Only show detailed debugging for targeted collections
                if collection_name in self.debug_collections:
                    print(f"  🔴 Batch insert failed for {collection_name}: {batch_error}")
                    print("  Trying individual inserts with rollback handling...")

                # Rollback the failed transaction
                cursor.connection.rollback()

                # Try inserting one by one in individual transactions
                success_count = 0
                for row_idx, row in enumerate(rows):
                    try:
                        cursor.execute(sql, row)
                        cursor.connection.commit()  # Commit each successful insert
                        success_count += 1
                    except Exception as row_error:
                        cursor.connection.rollback()  # Rollback failed insert
                        self.log_error(collection_name, 'row_insert', f"Row {row_idx}: {str(row_error)}")

                        # Show detailed error only for the first few failures and only for targeted collections
                        if collection_name in self.debug_collections and errors < 3:
                            print(f"  Row {row_idx} failed: {row_error}")
                            print(f"  Row data: {len(row)} values, expected {len(columns)} columns")

                        errors += 1
                        continue
                return success_count, errors

        return 0, errors

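The batch-then-fallback pattern above (try `executemany` for the fast path, then on failure retry row by row in individual transactions so one bad row only costs one row) can be sketched against a stubbed cursor; `StubCursor` and `insert_with_fallback` are illustrative stand-ins, not part of the converter:

```python
class StubCursor:
    """Toy cursor: raises on a designated bad row; records successful inserts."""
    def __init__(self, bad_row):
        self.bad_row = bad_row
        self.inserted = []
    def executemany(self, sql, rows):
        if any(r == self.bad_row for r in rows):
            raise ValueError('batch contains a bad row')
        self.inserted.extend(rows)
    def execute(self, sql, row):
        if row == self.bad_row:
            raise ValueError('bad row')
        self.inserted.append(row)

def insert_with_fallback(cursor, sql, rows):
    """Fast path first; fall back to per-row inserts on any batch failure."""
    try:
        cursor.executemany(sql, rows)
        return len(rows), 0
    except Exception:
        ok = err = 0
        for row in rows:
            try:
                cursor.execute(sql, row)
                ok += 1
            except Exception:
                err += 1
        return ok, err

cur = StubCursor(bad_row=('x',))
print(insert_with_fallback(cur, 'INSERT ...', [('a',), ('x',), ('b',)]))  # (2, 1)
```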
    def _get_nested_value(self, doc: Dict, field_path: str):
        """Get value from nested document using dot notation"""
        keys = field_path.split('.')
        value = doc

        for key in keys:
            if isinstance(value, dict) and key in value:
                value = value[key]
            else:
                return None

        return value

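The dot-notation walk above can be exercised standalone (a hypothetical free function mirroring `_get_nested_value`):

```python
def get_nested(doc, field_path):
    """Walk 'a.b.c' through nested dicts; return None if any hop is missing."""
    value = doc
    for key in field_path.split('.'):
        if isinstance(value, dict) and key in value:
            value = value[key]
        else:
            return None
    return value

doc = {'u': {'name': 'alice', 'prefs': {'tz': 'UTC'}}}
print(get_nested(doc, 'u.prefs.tz'))  # UTC
print(get_nested(doc, 'u.missing'))   # None
```

Note that the walk bails out with None rather than raising, which is what lets `_insert_batch` treat absent nested fields as SQL NULLs.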
    def _convert_value_for_postgres(self, value, field_name: str = None, schema: Dict = None):
        """Convert MongoDB value to PostgreSQL compatible value with schema-aware conversion"""
        if value is None:
            return None

        # Get the expected PostgreSQL type for this field if available
        expected_type = None
        if schema and field_name and field_name in schema:
            field_info = schema[field_name]
            expected_type = self._determine_postgres_type(field_info)

        # Handle conversion based on expected type
        if expected_type == 'BOOLEAN':
            if isinstance(value, bool):
                return value
            elif isinstance(value, str):
                return value.lower() in ('true', '1', 'yes', 'on')
            elif isinstance(value, (int, float)):
                return bool(value)
            else:
                return None
        elif expected_type == 'INTEGER':
            if isinstance(value, int):
                return value
            elif isinstance(value, float):
                return int(value)
            elif isinstance(value, str) and value.isdigit():
                return int(value)
            elif isinstance(value, bool):
                return int(value)
            else:
                return None
        elif expected_type == 'NUMERIC':
            if isinstance(value, (int, float)):
                return value
            elif isinstance(value, str):
                try:
                    return float(value)
                except ValueError:
                    return None
            elif isinstance(value, bool):
                return float(value)
            else:
                return None
        elif expected_type == 'TEXT':
            if isinstance(value, str):
                return value
            elif value is not None:
                str_value = str(value)
                # Handle very long strings
                if len(str_value) > 65535:
                    return str_value[:65535]
                return str_value
            else:
                return None
        elif expected_type == 'TIMESTAMP':
            if hasattr(value, 'isoformat'):
                return value.isoformat()
            elif isinstance(value, str):
                return value
            else:
                return str(value) if value is not None else None
        elif expected_type == 'JSONB':
            if isinstance(value, (dict, list)):
                return json.dumps(value, default=self._json_serializer)
            elif isinstance(value, str):
                # Check if it's already valid JSON
                try:
                    json.loads(value)
                    return value
                except (json.JSONDecodeError, TypeError):
                    # Not valid JSON, wrap it
                    return json.dumps(value)
            else:
                return json.dumps(value, default=self._json_serializer)

        # Fallback to original logic if no expected type or type not recognized
        if isinstance(value, bool):
            return value
        elif isinstance(value, (int, float)):
            return value
        elif isinstance(value, str):
            return value
        elif isinstance(value, (dict, list)):
            return json.dumps(value, default=self._json_serializer)
        elif hasattr(value, 'isoformat'):  # datetime
            return value.isoformat()
        elif hasattr(value, '__str__'):
            str_value = str(value)
            if len(str_value) > 65535:
                return str_value[:65535]
            return str_value
        else:
            return str(value)

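As a runnable illustration of one branch above, the BOOLEAN coercion accepts booleans, truthy strings, and numbers, and returns None for anything it cannot interpret (`to_pg_bool` is a hypothetical standalone version):

```python
def to_pg_bool(value):
    """Coerce a loosely typed value to a PostgreSQL boolean (None if hopeless)."""
    if isinstance(value, bool):
        return value
    if isinstance(value, str):
        return value.lower() in ('true', '1', 'yes', 'on')
    if isinstance(value, (int, float)):
        return bool(value)
    return None

print(to_pg_bool('Yes'), to_pg_bool(0), to_pg_bool({'a': 1}))  # True False None
```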
    def _json_serializer(self, obj):
        """Custom JSON serializer for complex objects with better error handling"""
        try:
            if hasattr(obj, 'isoformat'):  # datetime
                return obj.isoformat()
            elif hasattr(obj, '__str__'):
                return str(obj)
            else:
                return None
        except Exception as e:
            self.log_debug(f"JSON serialization error: {e}")
            return str(obj)

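`json.dumps(..., default=...)` only invokes the serializer for objects the encoder cannot handle natively, which is why the hook above only needs to cover datetimes and "stringable" leftovers like ObjectIds. A minimal standalone version (`json_fallback` is illustrative, not the converter's method):

```python
import json
from datetime import datetime

def json_fallback(obj):
    """Serialize datetimes as ISO strings, everything else via str()."""
    if hasattr(obj, 'isoformat'):  # datetime/date
        return obj.isoformat()
    return str(obj)

doc = {'ts': datetime(2022, 7, 25, 12, 0, 0), 'oid': object()}
out = json.loads(json.dumps(doc, default=json_fallback))
print(out['ts'])  # 2022-07-25T12:00:00
```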
    def run_conversion(self, sample_size: int = 100, batch_size: int = 1000):
        """Run the full conversion process with focused debugging"""
        print("Starting MongoDB to PostgreSQL conversion...")
        print("This will convert your Rocket.Chat database from MongoDB to PostgreSQL")
        if self.debug_mode:
            if self.debug_collections:
                print(f"🐛 DEBUG MODE: Focusing on collections: {', '.join(self.debug_collections)}")
            else:
                print("🐛 DEBUG MODE: All collections")
        print("=" * 70)

        # Step 1: Discover collections
        self.discover_collections()

        # Step 2: Analyze schemas
        print("\nAnalyzing collection schemas...")
        for collection_name in self.collections:
            self.analyze_schema(collection_name, sample_size)

        # Sample problematic collections if debugging
        if self.debug_mode and self.debug_collections:
            for coll in self.debug_collections:
                if coll in self.collections:
                    self.sample_documents(coll, 2)

        # Step 3: Generate PostgreSQL schema
        table_definitions = self.generate_postgres_schema()

        # Step 4: Create database schema
        self.create_postgres_database(table_definitions)

        # Step 5: Convert and insert data
        self.convert_and_insert_data(batch_size)

        # Step 6: Show error summary
        self._print_error_summary()

        print("=" * 70)
        print("✅ Conversion completed!")
        print(f"  Database: {self.postgres_config['database']}")
        print(f"  Tables created: {len(table_definitions)}")

    def _print_error_summary(self):
        """Print a focused summary of errors"""
        if not self.error_log:
            print("\n✅ No errors encountered during conversion!")
            return

        print("\n⚠️  ERROR SUMMARY:")
        print("=" * 50)

        # Sort by error count descending
        sorted_collections = sorted(self.error_log.items(),
                                    key=lambda x: len(x[1]), reverse=True)

        for collection, errors in sorted_collections:
            error_types = {}
            for error in errors:
                error_type = error['type']
                if error_type not in error_types:
                    error_types[error_type] = []
                error_types[error_type].append(error['details'])

            print(f"\n🔴 {collection} ({len(errors)} total errors):")
            for error_type, details_list in error_types.items():
                print(f"  {error_type}: {len(details_list)} errors")

                # Show sample errors for critical collections
                if collection in ['rocketchat_settings', 'rocketchat_room'] and len(details_list) > 0:
                    print(f"  Sample: {details_list[0][:100]}...")

def main():
    parser = argparse.ArgumentParser(
        description='Convert MongoDB BSON export to PostgreSQL',
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  # Basic usage
  python3 mongo_to_postgres_converter.py \\
    --mongo-path db/database/62df06d44234d20001289144 \\
    --pg-database rocketchat_converted \\
    --pg-user rocketchat_user \\
    --pg-password mypassword

  # Debug specific failing collections
  python3 mongo_to_postgres_converter.py \\
    --mongo-path db/database/62df06d44234d20001289144 \\
    --pg-database rocketchat_converted \\
    --pg-user rocketchat_user \\
    --pg-password mypassword \\
    --debug-collections rocketchat_settings rocketchat_room

Before running this script:
  1. Run: sudo -u postgres psql -f reset_database.sql
  2. Update the password in reset_database.sql
"""
    )

    parser.add_argument('--mongo-path', required=True, help='Path to MongoDB export directory')
    parser.add_argument('--pg-host', default='localhost', help='PostgreSQL host (default: localhost)')
    parser.add_argument('--pg-port', default='5432', help='PostgreSQL port (default: 5432)')
    parser.add_argument('--pg-database', required=True, help='PostgreSQL database name')
    parser.add_argument('--pg-user', required=True, help='PostgreSQL username')
    parser.add_argument('--pg-password', required=True, help='PostgreSQL password')
    parser.add_argument('--sample-size', type=int, default=100, help='Number of documents to sample for schema analysis (default: 100)')
    parser.add_argument('--batch-size', type=int, default=1000, help='Batch size for data insertion (default: 1000)')
    parser.add_argument('--debug', action='store_true', help='Enable debug mode with detailed error logging')
    parser.add_argument('--debug-collections', nargs='*', help='Specific collections to debug (e.g., rocketchat_settings rocketchat_room)')

    args = parser.parse_args()

    postgres_config = {
        'host': args.pg_host,
        'port': args.pg_port,
        'database': args.pg_database,
        'user': args.pg_user,
        'password': args.pg_password
    }

    # Enable debug mode if debug collections are specified
    debug_mode = args.debug or (args.debug_collections is not None)

    converter = MongoToPostgresConverter(args.mongo_path, postgres_config, debug_mode, args.debug_collections)
    converter.run_conversion(args.sample_size, args.batch_size)


if __name__ == '__main__':
    main()
41  inventory-server/chat/db-convert/reset_database.sql  Normal file
@@ -0,0 +1,41 @@
-- PostgreSQL Database Reset Script for Rocket.Chat Import
-- Run as: sudo -u postgres psql -f reset_database.sql

-- Terminate all connections to the database (force disconnect users)
SELECT pg_terminate_backend(pid)
FROM pg_stat_activity
WHERE datname = 'rocketchat_converted' AND pid <> pg_backend_pid();

-- Drop the database if it exists
DROP DATABASE IF EXISTS rocketchat_converted;

-- Create fresh database
CREATE DATABASE rocketchat_converted;

-- Create user (if not exists)
DO $$
BEGIN
    IF NOT EXISTS (SELECT FROM pg_user WHERE usename = 'rocketchat_user') THEN
        CREATE USER rocketchat_user WITH PASSWORD 'HKjLgt23gWuPXzEAn3rW';
    END IF;
END $$;

-- Grant database privileges
GRANT CONNECT ON DATABASE rocketchat_converted TO rocketchat_user;
GRANT CREATE ON DATABASE rocketchat_converted TO rocketchat_user;

-- Connect to the new database (psql meta-command, no trailing semicolon)
\c rocketchat_converted

-- Grant schema privileges
GRANT CREATE ON SCHEMA public TO rocketchat_user;
GRANT USAGE ON SCHEMA public TO rocketchat_user;

-- Grant privileges on all future tables and sequences
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT, INSERT, UPDATE, DELETE ON TABLES TO rocketchat_user;
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT USAGE, SELECT ON SEQUENCES TO rocketchat_user;

-- Display success message
\echo 'Database reset completed successfully!'
\echo 'You can now run the converter with:'
\echo 'python3 mongo_to_postgres_converter.py --mongo-path db/database/62df06d44234d20001289144 --pg-database rocketchat_converted --pg-user rocketchat_user --pg-password your_password'
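The chat server reads the same connection settings from the `.env` file in the inventory-server root (DB_HOST, DB_USER, DB_PASSWORD, DB_NAME, DB_PORT). A minimal hand-rolled parser for that format, assuming no dotenv library is on hand (`load_env` is a hypothetical helper, not part of this repo):

```python
def load_env(text):
    """Parse KEY=VALUE lines into a dict, skipping blanks and # comments."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#') or '=' not in line:
            continue
        key, _, value = line.partition('=')
        config[key.strip()] = value.strip()
    return config

env = load_env("""
DB_HOST=localhost
DB_USER=rocketchat_user
DB_PASSWORD=password
DB_NAME=rocketchat_converted
DB_PORT=5432
""")
print(env['DB_NAME'], env['DB_PORT'])  # rocketchat_converted 5432
```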
54  inventory-server/chat/db-convert/test_converter.py  Normal file
@@ -0,0 +1,54 @@
#!/usr/bin/env python3
"""
Quick test script to verify the converter fixes work for problematic collections
"""

from mongo_to_postgres_converter import MongoToPostgresConverter

def test_problematic_collections():
    print("🧪 Testing converter fixes for problematic collections...")

    postgres_config = {
        'host': 'localhost',
        'port': '5432',
        'database': 'rocketchat_test',
        'user': 'rocketchat_user',
        'password': 'password123'
    }

    converter = MongoToPostgresConverter(
        'db/database/62df06d44234d20001289144',
        postgres_config,
        debug_mode=True,
        debug_collections=['rocketchat_settings', 'rocketchat_room']
    )

    # Test just discovery and schema analysis
    print("\n1. Testing collection discovery...")
    converter.discover_collections()

    print("\n2. Testing schema analysis...")
    if 'rocketchat_settings' in converter.collections:
        settings_schema = converter.analyze_schema('rocketchat_settings', 10)
        print(f"Settings schema fields: {len(settings_schema)}")

        # Check specific problematic fields
        if 'packageValue' in settings_schema:
            packagevalue_info = settings_schema['packageValue']
            pg_type = converter._determine_postgres_type(packagevalue_info)
            print(f"packageValue types: {packagevalue_info['types']} -> PostgreSQL: {pg_type}")

    if 'rocketchat_room' in converter.collections:
        room_schema = converter.analyze_schema('rocketchat_room', 10)
        print(f"Room schema fields: {len(room_schema)}")

        # Check specific problematic fields
        if 'sysMes' in room_schema:
            sysmes_info = room_schema['sysMes']
            pg_type = converter._determine_postgres_type(sysmes_info)
            print(f"sysMes types: {sysmes_info['types']} -> PostgreSQL: {pg_type}")

    print("\n✅ Test completed - check the type mappings above!")


if __name__ == '__main__':
    test_problematic_collections()
1447  inventory-server/chat/package-lock.json  generated  Normal file
File diff suppressed because it is too large. Load Diff
20  inventory-server/chat/package.json  Normal file
@@ -0,0 +1,20 @@
{
  "name": "chat-server",
  "version": "1.0.0",
  "description": "Chat archive server for Rocket.Chat data",
  "main": "server.js",
  "scripts": {
    "start": "node server.js",
    "dev": "nodemon server.js"
  },
  "dependencies": {
    "express": "^4.18.2",
    "cors": "^2.8.5",
    "pg": "^8.11.0",
    "dotenv": "^16.0.3",
    "morgan": "^1.10.0"
  },
  "devDependencies": {
    "nodemon": "^2.0.22"
  }
}
649  inventory-server/chat/routes.js  Normal file
@@ -0,0 +1,649 @@
const express = require('express');
const path = require('path');
const router = express.Router();

// Serve uploaded files with proper mapping from database paths to actual file locations
router.get('/files/uploads/*', async (req, res) => {
  try {
    // Extract the path from the URL (everything after /files/uploads/)
    const requestPath = req.params[0];

    // The URL path will be like: ufs/AmazonS3:Uploads/274Mf9CyHNG72oF86/filename.jpg
    // We need to extract the mongo_id (274Mf9CyHNG72oF86) from this path
    const pathParts = requestPath.split('/');
    let mongoId = null;

    // Find the mongo_id in the path structure: it is the segment right after
    // the "AmazonS3:Uploads" marker
    for (let i = 0; i < pathParts.length; i++) {
      if (pathParts[i].includes('AmazonS3:Uploads') && i + 1 < pathParts.length) {
        mongoId = pathParts[i + 1];
        break;
      }
    }

    if (!mongoId) {
      // Try to get mongo_id from database by matching the full path
      const result = await global.pool.query(`
        SELECT mongo_id, name, type
        FROM uploads
        WHERE path = $1 OR url = $2
        LIMIT 1
      `, [`/ufs/AmazonS3:Uploads/${requestPath}`, `/ufs/AmazonS3:Uploads/${requestPath}`]);

      if (result.rows.length > 0) {
        mongoId = result.rows[0].mongo_id;
      }
    }

    if (!mongoId) {
      return res.status(404).json({ error: 'File not found' });
    }

    // The actual file is stored with just the mongo_id as filename
    const filePath = path.join(__dirname, 'db-convert/db/files/uploads', mongoId);

    // Get file info from database for proper content-type
    const fileInfo = await global.pool.query(`
      SELECT name, type
      FROM uploads
      WHERE mongo_id = $1
      LIMIT 1
    `, [mongoId]);

    if (fileInfo.rows.length === 0) {
      return res.status(404).json({ error: 'File metadata not found' });
    }

    const { name, type } = fileInfo.rows[0];

    // Set proper content type
    if (type) {
      res.set('Content-Type', type);
    }

    // Set content disposition with original filename
    if (name) {
      res.set('Content-Disposition', `inline; filename="${name}"`);
    }

    // Send the file
    res.sendFile(filePath, (err) => {
      if (err) {
        console.error('Error serving file:', err);
        if (!res.headersSent) {
          res.status(404).json({ error: 'File not found on disk' });
        }
      }
    });

  } catch (error) {
    console.error('Error serving upload:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

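The marker-based extraction in the route above (take the path segment immediately after `AmazonS3:Uploads`) is language-agnostic; a Python sketch of the same logic for clarity (`extract_mongo_id` is hypothetical, the route implements this inline in JavaScript):

```python
def extract_mongo_id(request_path, marker='AmazonS3:Uploads'):
    """Return the path segment right after the storage marker, else None."""
    parts = request_path.split('/')
    for i, part in enumerate(parts):
        if marker in part and i + 1 < len(parts):
            return parts[i + 1]
    return None

print(extract_mongo_id('ufs/AmazonS3:Uploads/274Mf9CyHNG72oF86/photo.jpg'))  # 274Mf9CyHNG72oF86
print(extract_mongo_id('ufs/other/photo.jpg'))  # None
```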
// Also serve files directly by mongo_id for simpler access
router.get('/files/by-id/:mongoId', async (req, res) => {
  try {
    const { mongoId } = req.params;

    // Get file info from database
    const fileInfo = await global.pool.query(`
      SELECT name, type
      FROM uploads
      WHERE mongo_id = $1
      LIMIT 1
    `, [mongoId]);

    if (fileInfo.rows.length === 0) {
      return res.status(404).json({ error: 'File not found' });
    }

    const { name, type } = fileInfo.rows[0];
    const filePath = path.join(__dirname, 'db-convert/db/files/uploads', mongoId);

    // Set proper content type and filename
    if (type) {
      res.set('Content-Type', type);
    }

    if (name) {
      res.set('Content-Disposition', `inline; filename="${name}"`);
    }

    // Send the file
    res.sendFile(filePath, (err) => {
      if (err) {
        console.error('Error serving file:', err);
        if (!res.headersSent) {
          res.status(404).json({ error: 'File not found on disk' });
        }
      }
    });

  } catch (error) {
    console.error('Error serving upload by ID:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

// Serve user avatars by mongo_id
router.get('/avatar/:mongoId', async (req, res) => {
  try {
    const { mongoId } = req.params;

    console.log(`[Avatar Debug] Looking up avatar for user mongo_id: ${mongoId}`);

    // First try to find avatar by user's avataretag
    const userResult = await global.pool.query(`
      SELECT avataretag, username FROM users WHERE mongo_id = $1
    `, [mongoId]);

    let avatarPath = null;

    if (userResult.rows.length > 0) {
      const username = userResult.rows[0].username;
      const avataretag = userResult.rows[0].avataretag;

      // Try method 1: Look up by avataretag -> etag (for users with avataretag set)
      if (avataretag) {
        console.log(`[Avatar Debug] Found user ${username} with avataretag: ${avataretag}`);

        const avatarResult = await global.pool.query(`
          SELECT url, path FROM avatars WHERE etag = $1
        `, [avataretag]);

        if (avatarResult.rows.length > 0) {
          const dbPath = avatarResult.rows[0].path || avatarResult.rows[0].url;
          console.log(`[Avatar Debug] Found avatar record with path: ${dbPath}`);

          if (dbPath) {
            const pathParts = dbPath.split('/');
            for (let i = 0; i < pathParts.length; i++) {
              if (pathParts[i].includes('AmazonS3:Avatars') && i + 1 < pathParts.length) {
                const avatarMongoId = pathParts[i + 1];
                avatarPath = path.join(__dirname, 'db-convert/db/files/avatars', avatarMongoId);
                console.log(`[Avatar Debug] Extracted avatar mongo_id: ${avatarMongoId}, full path: ${avatarPath}`);
                break;
              }
            }
          }
        } else {
          console.log(`[Avatar Debug] No avatar record found for etag: ${avataretag}`);
        }
      }

      // Try method 2: Look up by userid directly (for users without avataretag)
      if (!avatarPath) {
        console.log(`[Avatar Debug] Trying direct userid lookup for user ${username} (${mongoId})`);

        const avatarResult = await global.pool.query(`
          SELECT url, path FROM avatars WHERE userid = $1
        `, [mongoId]);

        if (avatarResult.rows.length > 0) {
          const dbPath = avatarResult.rows[0].path || avatarResult.rows[0].url;
          console.log(`[Avatar Debug] Found avatar record by userid with path: ${dbPath}`);

          if (dbPath) {
            const pathParts = dbPath.split('/');
            for (let i = 0; i < pathParts.length; i++) {
              if (pathParts[i].includes('AmazonS3:Avatars') && i + 1 < pathParts.length) {
                const avatarMongoId = pathParts[i + 1];
                avatarPath = path.join(__dirname, 'db-convert/db/files/avatars', avatarMongoId);
                console.log(`[Avatar Debug] Extracted avatar mongo_id: ${avatarMongoId}, full path: ${avatarPath}`);
                break;
              }
            }
          }
        } else {
          console.log(`[Avatar Debug] No avatar record found for userid: ${mongoId}`);
        }
      }
    } else {
      console.log(`[Avatar Debug] No user found for mongo_id: ${mongoId}`);
    }

    // Fallback: try direct lookup by user mongo_id
    if (!avatarPath) {
      avatarPath = path.join(__dirname, 'db-convert/db/files/avatars', mongoId);
      console.log(`[Avatar Debug] Using fallback path: ${avatarPath}`);
    }

    // Set proper content type for images
    res.set('Content-Type', 'image/jpeg'); // Most avatars are likely JPEG

    // Send the file
    res.sendFile(avatarPath, (err) => {
      if (err) {
        // If avatar doesn't exist, send a default 404 or generate initials
        console.log(`[Avatar Debug] Avatar file not found at path: ${avatarPath}, error:`, err.message);
        if (!res.headersSent) {
          res.status(404).json({ error: 'Avatar not found' });
        }
      } else {
        console.log(`[Avatar Debug] Successfully served avatar from: ${avatarPath}`);
      }
    });

  } catch (error) {
    console.error('Error serving avatar:', error);
    res.status(500).json({ error: 'Server error' });
  }
});

// Serve avatars statically as fallback
|
||||||
|
router.use('/files/avatars', express.static(path.join(__dirname, 'db-convert/db/files/avatars')));
|
||||||
|
|
||||||
|
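The path-parsing loop in the avatar route above can be factored into a small pure function, which makes the extraction logic easy to test in isolation. This is only a sketch: the `AmazonS3:Avatars` segment name comes from the route above, while the function name and the sample path format are illustrative.

```javascript
// Extract the avatar file id from a stored upload path such as
// "ufs/AmazonS3:Avatars/abc123/jdoe" (illustrative format).
// The id is assumed to be the path segment that follows the
// segment containing "AmazonS3:Avatars". Returns null when absent.
function extractAvatarId(dbPath) {
  if (!dbPath) return null;
  const parts = dbPath.split('/');
  for (let i = 0; i < parts.length; i++) {
    if (parts[i].includes('AmazonS3:Avatars') && i + 1 < parts.length) {
      return parts[i + 1];
    }
  }
  return null;
}

console.log(extractAvatarId('ufs/AmazonS3:Avatars/abc123/jdoe')); // abc123
console.log(extractAvatarId('no/store/here'));                    // null
```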
// Get all users for the "view as" dropdown (active and inactive)
router.get('/users', async (req, res) => {
  try {
    const result = await global.pool.query(`
      SELECT id, username, name, type, active, status, lastlogin,
             statustext, utcoffset, statusconnection, mongo_id, avataretag
      FROM users
      WHERE type = 'user'
      ORDER BY
        active DESC, -- Active users first
        CASE
          WHEN status = 'online' THEN 1
          WHEN status = 'away' THEN 2
          WHEN status = 'busy' THEN 3
          ELSE 4
        END,
        name ASC
    `);

    res.json({
      status: 'success',
      users: result.rows
    });
  } catch (error) {
    console.error('Error fetching users:', error);
    res.status(500).json({
      status: 'error',
      error: 'Failed to fetch users',
      details: error.message
    });
  }
});
// Get rooms for a specific user, with enhanced room names for direct messages
router.get('/users/:userId/rooms', async (req, res) => {
  const { userId } = req.params;

  try {
    // Get the current user's mongo_id for filtering
    const userResult = await global.pool.query(`
      SELECT mongo_id, username FROM users WHERE id = $1
    `, [userId]);

    if (userResult.rows.length === 0) {
      return res.status(404).json({
        status: 'error',
        error: 'User not found'
      });
    }

    const currentUserMongoId = userResult.rows[0].mongo_id;
    const currentUsername = userResult.rows[0].username;

    // Get rooms where the user is a member, with proper naming from the subscription table.
    // Archived and closed rooms are included but sorted to the bottom.
    const result = await global.pool.query(`
      SELECT DISTINCT
        r.id,
        r.mongo_id as room_mongo_id,
        r.name,
        r.fname,
        r.t as type,
        r.msgs,
        r.lm as last_message_date,
        r.usernames,
        r.uids,
        r.userscount,
        r.description,
        r.teamid,
        r.archived,
        s.open,
        -- Use the subscription's name for direct messages (it excludes the current user)
        -- For channels/groups, use the room's fname or name
        CASE
          WHEN r.t = 'd' THEN COALESCE(s.fname, s.name, 'Unknown User')
          ELSE COALESCE(r.fname, r.name, 'Unnamed Room')
        END as display_name
      FROM room r
      JOIN subscription s ON s.rid = r.mongo_id
      WHERE s.u->>'_id' = $1
      ORDER BY
        s.open DESC NULLS LAST,  -- Open rooms first
        r.archived NULLS FIRST,  -- Non-archived first (nulls treated as false)
        r.lm DESC NULLS LAST
      LIMIT 50
    `, [currentUserMongoId]);

    // Enhance direct-message rooms with participant information
    const enhancedRooms = await Promise.all(result.rows.map(async (room) => {
      if (room.type === 'd' && room.uids) {
        // Get participant info (excluding the current user) for direct messages
        const participantResult = await global.pool.query(`
          SELECT u.username, u.name, u.mongo_id, u.avataretag
          FROM users u
          WHERE u.mongo_id = ANY($1::text[])
            AND u.mongo_id != $2
        `, [room.uids, currentUserMongoId]);

        room.participants = participantResult.rows;
      }
      return room;
    }));

    res.json({
      status: 'success',
      rooms: enhancedRooms
    });
  } catch (error) {
    console.error('Error fetching user rooms:', error);
    res.status(500).json({
      status: 'error',
      error: 'Failed to fetch user rooms',
      details: error.message
    });
  }
});
// Get room details, including participants
router.get('/rooms/:roomId', async (req, res) => {
  const { roomId } = req.params;
  const { userId } = req.query; // Accept the current user ID as a query parameter

  try {
    const result = await global.pool.query(`
      SELECT r.id, r.name, r.fname, r.t as type, r.msgs, r.description,
             r.lm as last_message_date, r.usernames, r.uids, r.userscount, r.teamid
      FROM room r
      WHERE r.id = $1
    `, [roomId]);

    if (result.rows.length === 0) {
      return res.status(404).json({
        status: 'error',
        error: 'Room not found'
      });
    }

    const room = result.rows[0];

    // For direct messages, derive the display name from the current user's subscription
    if (room.type === 'd' && room.uids && userId) {
      // Get the current user's mongo_id
      const userResult = await global.pool.query(`
        SELECT mongo_id FROM users WHERE id = $1
      `, [userId]);

      if (userResult.rows.length > 0) {
        const currentUserMongoId = userResult.rows[0].mongo_id;

        // Get the display name from the subscription table for this user.
        // Use the room's mongo_id to match against subscription.rid.
        const roomMongoResult = await global.pool.query(`
          SELECT mongo_id FROM room WHERE id = $1
        `, [roomId]);

        if (roomMongoResult.rows.length > 0) {
          const roomMongoId = roomMongoResult.rows[0].mongo_id;

          const subscriptionResult = await global.pool.query(`
            SELECT fname, name FROM subscription
            WHERE rid = $1 AND u->>'_id' = $2
          `, [roomMongoId, currentUserMongoId]);

          if (subscriptionResult.rows.length > 0) {
            const sub = subscriptionResult.rows[0];
            room.display_name = sub.fname || sub.name || 'Unknown User';
          }
        }
      }

      // Get all participants for additional info
      const participantResult = await global.pool.query(`
        SELECT username, name
        FROM users
        WHERE mongo_id = ANY($1::text[])
      `, [room.uids]);

      room.participants = participantResult.rows;
    } else {
      // For channels/groups, use the room's fname or name
      room.display_name = room.fname || room.name || 'Unnamed Room';
    }

    res.json({
      status: 'success',
      room: room
    });
  } catch (error) {
    console.error('Error fetching room details:', error);
    res.status(500).json({
      status: 'error',
      error: 'Failed to fetch room details',
      details: error.message
    });
  }
});
// Get messages for a specific room (fast path, without attachments)
router.get('/rooms/:roomId/messages', async (req, res) => {
  const { roomId } = req.params;
  const { limit = 50, offset = 0, before } = req.query;

  try {
    // Fast query: fetch messages without the expensive attachment joins
    let query = `
      SELECT m.id, m.msg, m.ts, m.u, m._updatedat, m.urls, m.mentions, m.md
      FROM message m
      JOIN room r ON m.rid = r.mongo_id
      WHERE r.id = $1
    `;

    const params = [roomId];

    if (before) {
      query += ` AND m.ts < $${params.length + 1}`;
      params.push(before);
    }

    query += ` ORDER BY m.ts DESC LIMIT $${params.length + 1} OFFSET $${params.length + 2}`;
    params.push(limit, offset);

    const result = await global.pool.query(query, params);

    // Add an empty attachments array for now; attachments are loaded separately if needed
    const messages = result.rows.map(msg => ({
      ...msg,
      attachments: []
    }));

    res.json({
      status: 'success',
      messages: messages.reverse() // Reverse to show oldest first
    });
  } catch (error) {
    console.error('Error fetching messages:', error);
    res.status(500).json({
      status: 'error',
      error: 'Failed to fetch messages',
      details: error.message
    });
  }
});
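The messages handler above builds its SQL incrementally, numbering each new placeholder from the current length of `params` so the optional `before` clause and the LIMIT/OFFSET always get consecutive `$n` positions. A standalone sketch of that pattern (the function name and trimmed column list are illustrative):

```javascript
// Build a paginated message query the way the route above does:
// each appended clause takes the next $n placeholder based on params.length.
function buildMessagesQuery({ roomId, limit = 50, offset = 0, before }) {
  let query = 'SELECT m.id, m.msg, m.ts FROM message m JOIN room r ON m.rid = r.mongo_id WHERE r.id = $1';
  const params = [roomId];

  if (before) {
    query += ` AND m.ts < $${params.length + 1}`; // becomes $2
    params.push(before);
  }

  // With `before` present these become $3/$4; without it, $2/$3
  query += ` ORDER BY m.ts DESC LIMIT $${params.length + 1} OFFSET $${params.length + 2}`;
  params.push(limit, offset);

  return { query, params };
}

const { query, params } = buildMessagesQuery({ roomId: 7, before: '2021-01-01' });
console.log(params); // [ 7, '2021-01-01', 50, 0 ]
```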
// Get attachments for specific messages (called separately for performance)
router.post('/messages/attachments', async (req, res) => {
  const { messageIds } = req.body;

  if (!messageIds || !Array.isArray(messageIds) || messageIds.length === 0) {
    return res.json({ status: 'success', attachments: {} });
  }

  try {
    // Get the room mongo_id from the first message to limit the search scope
    const roomQuery = await global.pool.query(`
      SELECT r.mongo_id as room_mongo_id
      FROM message m
      JOIN room r ON m.rid = r.mongo_id
      WHERE m.id = $1
      LIMIT 1
    `, [messageIds[0]]);

    if (roomQuery.rows.length === 0) {
      return res.json({ status: 'success', attachments: {} });
    }

    const roomMongoId = roomQuery.rows[0].room_mongo_id;

    // Get the messages and their timestamps
    const messagesQuery = await global.pool.query(`
      SELECT m.id, m.ts, m.u->>'_id' as user_id
      FROM message m
      WHERE m.id = ANY($1::int[])
    `, [messageIds]);

    if (messagesQuery.rows.length === 0) {
      return res.json({ status: 'success', attachments: {} });
    }

    // Build maps of user_id -> message timestamps and message id -> metadata for efficient lookup
    const userTimeMap = {};
    const messageMap = {};
    messagesQuery.rows.forEach(msg => {
      if (!userTimeMap[msg.user_id]) {
        userTimeMap[msg.user_id] = [];
      }
      userTimeMap[msg.user_id].push(msg.ts);
      messageMap[msg.id] = { ts: msg.ts, user_id: msg.user_id };
    });

    // Get uploads for this room and these users
    // (id is selected explicitly; it is referenced when building the response below)
    const uploadsQuery = await global.pool.query(`
      SELECT id, mongo_id, name, size, type, url, path, typegroup, identify,
             userid, uploadedat
      FROM uploads
      WHERE rid = $1
        AND userid = ANY($2::text[])
      ORDER BY uploadedat
    `, [roomMongoId, Object.keys(userTimeMap)]);

    // Match each upload to the closest message from the same user,
    // based on timestamp proximity (within 5 minutes)
    const attachmentsByMessage = {};

    uploadsQuery.rows.forEach(upload => {
      const uploadTime = new Date(upload.uploadedat).getTime();

      let closestMessageId = null;
      let closestTimeDiff = Infinity;

      Object.entries(messageMap).forEach(([msgId, msgData]) => {
        if (msgData.user_id === upload.userid) {
          const msgTime = new Date(msgData.ts).getTime();
          const timeDiff = Math.abs(uploadTime - msgTime);

          if (timeDiff < 300000 && timeDiff < closestTimeDiff) { // 5 minutes = 300000 ms
            closestMessageId = msgId;
            closestTimeDiff = timeDiff;
          }
        }
      });

      if (closestMessageId) {
        if (!attachmentsByMessage[closestMessageId]) {
          attachmentsByMessage[closestMessageId] = [];
        }

        attachmentsByMessage[closestMessageId].push({
          id: upload.id,
          mongo_id: upload.mongo_id,
          name: upload.name,
          size: upload.size,
          type: upload.type,
          url: upload.url,
          path: upload.path,
          typegroup: upload.typegroup,
          identify: upload.identify
        });
      }
    });

    res.json({
      status: 'success',
      attachments: attachmentsByMessage
    });

  } catch (error) {
    console.error('Error fetching message attachments:', error);
    res.status(500).json({
      status: 'error',
      error: 'Failed to fetch attachments',
      details: error.message
    });
  }
});
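The matching step above pairs each upload with the nearest message from the same user inside a five-minute window. The core comparison can be isolated as a pure function for testing; the function and parameter names here are illustrative, not part of the route:

```javascript
// Given messages ({ id, ts, user_id }) and one upload ({ userid, uploadedat }),
// return the id of the closest message from the same user within maxMs, or null.
function closestMessageId(messages, upload, maxMs = 300000) {
  const uploadTime = new Date(upload.uploadedat).getTime();
  let bestId = null;
  let bestDiff = Infinity;

  for (const msg of messages) {
    if (msg.user_id !== upload.userid) continue; // only this uploader's messages
    const diff = Math.abs(uploadTime - new Date(msg.ts).getTime());
    if (diff < maxMs && diff < bestDiff) {
      bestId = msg.id;
      bestDiff = diff;
    }
  }
  return bestId;
}

const msgs = [
  { id: 1, ts: '2021-06-01T10:00:00Z', user_id: 'u1' },
  { id: 2, ts: '2021-06-01T10:04:00Z', user_id: 'u1' },
  { id: 3, ts: '2021-06-01T10:04:30Z', user_id: 'u2' }
];
// Upload at 10:03 by u1 -> message 2 is 1 minute away, message 1 is 3 minutes away
console.log(closestMessageId(msgs, { userid: 'u1', uploadedat: '2021-06-01T10:03:00Z' })); // 2
// Upload 20+ minutes later -> nothing within the 5-minute window
console.log(closestMessageId(msgs, { userid: 'u1', uploadedat: '2021-06-01T10:24:00Z' })); // null
```

A note on the design: matching by timestamp proximity is a heuristic for converted data where the original message-to-upload links were lost, so misattribution is possible when a user posts several messages within the window.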
// Search messages in rooms accessible to a user
router.get('/users/:userId/search', async (req, res) => {
  const { userId } = req.params;
  const { q, limit = 20 } = req.query;

  if (!q || q.length < 2) {
    return res.status(400).json({
      status: 'error',
      error: 'Search query must be at least 2 characters'
    });
  }

  try {
    const userResult = await global.pool.query(`
      SELECT mongo_id FROM users WHERE id = $1
    `, [userId]);

    if (userResult.rows.length === 0) {
      return res.status(404).json({
        status: 'error',
        error: 'User not found'
      });
    }

    const currentUserMongoId = userResult.rows[0].mongo_id;

    const result = await global.pool.query(`
      SELECT m.id, m.msg, m.ts, m.u, r.id as room_id, r.name as room_name, r.fname as room_fname, r.t as room_type
      FROM message m
      JOIN room r ON m.rid = r.mongo_id
      JOIN subscription s ON s.rid = r.mongo_id AND s.u->>'_id' = $1
      WHERE m.msg ILIKE $2
        AND r.archived IS NOT TRUE
      ORDER BY m.ts DESC
      LIMIT $3
    `, [currentUserMongoId, `%${q}%`, limit]);

    res.json({
      status: 'success',
      results: result.rows
    });
  } catch (error) {
    console.error('Error searching messages:', error);
    res.status(500).json({
      status: 'error',
      error: 'Failed to search messages',
      details: error.message
    });
  }
});

module.exports = router;
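One caveat on the search route above: the raw query string is dropped into the ILIKE pattern, so `%` and `_` typed by a user act as wildcards rather than literal characters. If literal matching is wanted, the input can be escaped first. A sketch, assuming PostgreSQL's default behavior where backslash is the escape character for LIKE/ILIKE (the helper name is illustrative):

```javascript
// Escape LIKE/ILIKE metacharacters so user input matches literally.
// Backslash itself must be escaped too, since it is the escape character.
function escapeLikePattern(input) {
  return input.replace(/[\\%_]/g, ch => '\\' + ch);
}

// Would be used as:
//   pool.query('... WHERE m.msg ILIKE $2', [id, `%${escapeLikePattern(q)}%`])
console.log(escapeLikePattern('50%_off')); // 50\%\_off
```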
83  inventory-server/chat/server.js  (new file)
@@ -0,0 +1,83 @@
require('dotenv').config({ path: '../.env' });
const express = require('express');
const cors = require('cors');
const { Pool } = require('pg');
const morgan = require('morgan');
const chatRoutes = require('./routes');

// Log the startup configuration (password omitted)
console.log('Starting chat server with config:', {
  host: process.env.CHAT_DB_HOST,
  user: process.env.CHAT_DB_USER,
  database: process.env.CHAT_DB_NAME || 'rocketchat_converted',
  port: process.env.CHAT_DB_PORT,
  chat_port: process.env.CHAT_PORT || 3014
});

const app = express();
const port = process.env.CHAT_PORT || 3014;

// Database configuration for the rocketchat_converted database
const pool = new Pool({
  host: process.env.CHAT_DB_HOST,
  user: process.env.CHAT_DB_USER,
  password: process.env.CHAT_DB_PASSWORD,
  database: process.env.CHAT_DB_NAME || 'rocketchat_converted',
  port: process.env.CHAT_DB_PORT,
});

// Make the pool available globally
global.pool = pool;

// Middleware
app.use(express.json());
app.use(morgan('combined'));
app.use(cors({
  origin: ['http://localhost:5175', 'http://localhost:5174', 'https://inventory.kent.pw'],
  credentials: true
}));

// Test database connection endpoint
app.get('/test-db', async (req, res) => {
  try {
    const result = await pool.query('SELECT COUNT(*) as user_count FROM users WHERE active = true');
    const messageResult = await pool.query('SELECT COUNT(*) as message_count FROM message');
    const roomResult = await pool.query('SELECT COUNT(*) as room_count FROM room');

    res.json({
      status: 'success',
      database: 'rocketchat_converted',
      stats: {
        active_users: parseInt(result.rows[0].user_count),
        total_messages: parseInt(messageResult.rows[0].message_count),
        total_rooms: parseInt(roomResult.rows[0].room_count)
      }
    });
  } catch (error) {
    console.error('Database test error:', error);
    res.status(500).json({
      status: 'error',
      error: 'Database connection failed',
      details: error.message
    });
  }
});

// Mount all routes from routes.js
app.use('/', chatRoutes);

// Health check endpoint
app.get('/health', (req, res) => {
  res.json({ status: 'healthy' });
});

// Error handling middleware
app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).json({ error: 'Something broke!' });
});

// Start the server
app.listen(port, () => {
  console.log(`Chat server running on port ${port}`);
});
@@ -7,7 +7,7 @@ BEGIN
     -- Check which table is being updated and use the appropriate column
     IF TG_TABLE_NAME = 'categories' THEN
         NEW.updated_at = CURRENT_TIMESTAMP;
-    ELSIF TG_TABLE_NAME IN ('products', 'orders', 'purchase_orders') THEN
+    ELSIF TG_TABLE_NAME IN ('products', 'orders', 'purchase_orders', 'receivings') THEN
         NEW.updated = CURRENT_TIMESTAMP;
     END IF;
     RETURN NEW;
@@ -159,27 +159,24 @@ CREATE INDEX idx_orders_pid_date ON orders(pid, date);
 CREATE INDEX idx_orders_updated ON orders(updated);

 -- Create purchase_orders table with its indexes
+-- This table now focuses solely on purchase order intent, not receivings
 CREATE TABLE purchase_orders (
     id BIGSERIAL PRIMARY KEY,
     po_id TEXT NOT NULL,
     vendor TEXT NOT NULL,
-    date DATE NOT NULL,
+    date TIMESTAMP WITH TIME ZONE NOT NULL,
     expected_date DATE,
     pid BIGINT NOT NULL,
     sku TEXT NOT NULL,
     name TEXT NOT NULL,
-    cost_price NUMERIC(14, 4) NOT NULL,
     po_cost_price NUMERIC(14, 4) NOT NULL,
     status TEXT DEFAULT 'created',
-    receiving_status TEXT DEFAULT 'created',
     notes TEXT,
     long_note TEXT,
     ordered INTEGER NOT NULL,
-    received INTEGER DEFAULT 0,
-    received_date DATE,
-    last_received_date DATE,
-    received_by TEXT,
-    receiving_history JSONB,
+    supplier_id INTEGER,
+    date_created TIMESTAMP WITH TIME ZONE,
+    date_ordered TIMESTAMP WITH TIME ZONE,
     updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
     FOREIGN KEY (pid) REFERENCES products(pid) ON DELETE CASCADE,
     UNIQUE (po_id, pid)
@@ -192,21 +189,61 @@ CREATE TRIGGER update_purchase_orders_updated
     EXECUTE FUNCTION update_updated_column();

 COMMENT ON COLUMN purchase_orders.name IS 'Product name from products.description';
-COMMENT ON COLUMN purchase_orders.po_cost_price IS 'Original cost from PO, before receiving adjustments';
+COMMENT ON COLUMN purchase_orders.po_cost_price IS 'Original cost from PO';
 COMMENT ON COLUMN purchase_orders.status IS 'canceled, created, electronically_ready_send, ordered, preordered, electronically_sent, receiving_started, done';
-COMMENT ON COLUMN purchase_orders.receiving_status IS 'canceled, created, partial_received, full_received, paid';
-COMMENT ON COLUMN purchase_orders.receiving_history IS 'Array of receiving records with qty, date, cost, receiving_id, and alt_po flag';

 CREATE INDEX idx_po_id ON purchase_orders(po_id);
 CREATE INDEX idx_po_sku ON purchase_orders(sku);
 CREATE INDEX idx_po_vendor ON purchase_orders(vendor);
 CREATE INDEX idx_po_status ON purchase_orders(status);
-CREATE INDEX idx_po_receiving_status ON purchase_orders(receiving_status);
 CREATE INDEX idx_po_expected_date ON purchase_orders(expected_date);
-CREATE INDEX idx_po_last_received_date ON purchase_orders(last_received_date);
 CREATE INDEX idx_po_pid_status ON purchase_orders(pid, status);
 CREATE INDEX idx_po_pid_date ON purchase_orders(pid, date);
 CREATE INDEX idx_po_updated ON purchase_orders(updated);
+CREATE INDEX idx_po_supplier_id ON purchase_orders(supplier_id);
+
+-- Create receivings table to track actual receipt of goods
+CREATE TABLE receivings (
+    id BIGSERIAL PRIMARY KEY,
+    receiving_id TEXT NOT NULL,
+    pid BIGINT NOT NULL,
+    sku TEXT NOT NULL,
+    name TEXT NOT NULL,
+    vendor TEXT,
+    qty_each INTEGER NOT NULL,
+    qty_each_orig INTEGER,
+    cost_each NUMERIC(14, 5) NOT NULL,
+    cost_each_orig NUMERIC(14, 5),
+    received_by INTEGER,
+    received_by_name TEXT,
+    received_date TIMESTAMP WITH TIME ZONE NOT NULL,
+    receiving_created_date TIMESTAMP WITH TIME ZONE,
+    supplier_id INTEGER,
+    status TEXT DEFAULT 'created',
+    updated TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    FOREIGN KEY (pid) REFERENCES products(pid) ON DELETE CASCADE,
+    UNIQUE (receiving_id, pid)
+);
+
+-- Create trigger for receivings
+CREATE TRIGGER update_receivings_updated
+    BEFORE UPDATE ON receivings
+    FOR EACH ROW
+    EXECUTE FUNCTION update_updated_column();
+
+COMMENT ON COLUMN receivings.status IS 'canceled, created, partial_received, full_received, paid';
+COMMENT ON COLUMN receivings.qty_each_orig IS 'Original quantity from the source system';
+COMMENT ON COLUMN receivings.cost_each_orig IS 'Original cost from the source system';
+COMMENT ON COLUMN receivings.vendor IS 'Vendor name, same as in purchase_orders';
+
+CREATE INDEX idx_receivings_id ON receivings(receiving_id);
+CREATE INDEX idx_receivings_pid ON receivings(pid);
+CREATE INDEX idx_receivings_sku ON receivings(sku);
+CREATE INDEX idx_receivings_status ON receivings(status);
+CREATE INDEX idx_receivings_received_date ON receivings(received_date);
+CREATE INDEX idx_receivings_supplier_id ON receivings(supplier_id);
+CREATE INDEX idx_receivings_vendor ON receivings(vendor);
+CREATE INDEX idx_receivings_updated ON receivings(updated);

 SET session_replication_role = 'origin'; -- Re-enable foreign key checks
@@ -35,7 +35,7 @@ function validateDate(mysqlDate) {

 /**
  * Imports purchase orders and receivings from a production MySQL database to a local PostgreSQL database.
- * Implements FIFO allocation of receivings to purchase orders.
+ * Handles these as separate data streams without complex FIFO allocation.
  *
  * @param {object} prodConnection - A MySQL connection to production DB
  * @param {object} localConnection - A PostgreSQL connection to local DB
@@ -44,8 +44,12 @@ function validateDate(mysqlDate) {
  */
 async function importPurchaseOrders(prodConnection, localConnection, incrementalUpdate = true) {
   const startTime = Date.now();
-  let recordsAdded = 0;
-  let recordsUpdated = 0;
+  let poRecordsAdded = 0;
+  let poRecordsUpdated = 0;
+  let poRecordsDeleted = 0;
+  let receivingRecordsAdded = 0;
+  let receivingRecordsUpdated = 0;
+  let receivingRecordsDeleted = 0;
   let totalProcessed = 0;

   // Batch size constants
@@ -68,8 +72,8 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
     await localConnection.query(`
       DROP TABLE IF EXISTS temp_purchase_orders;
       DROP TABLE IF EXISTS temp_receivings;
-      DROP TABLE IF EXISTS temp_receiving_allocations;
       DROP TABLE IF EXISTS employee_names;
+      DROP TABLE IF EXISTS temp_supplier_names;

       -- Temporary table for purchase orders
       CREATE TEMP TABLE temp_purchase_orders (
@@ -94,11 +98,16 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
       -- Temporary table for receivings
       CREATE TEMP TABLE temp_receivings (
         receiving_id TEXT NOT NULL,
-        po_id TEXT,
         pid BIGINT NOT NULL,
+        sku TEXT,
+        name TEXT,
+        vendor TEXT,
         qty_each INTEGER,
-        cost_each NUMERIC(14, 4),
+        qty_each_orig INTEGER,
+        cost_each NUMERIC(14, 5),
+        cost_each_orig NUMERIC(14, 5),
         received_by INTEGER,
+        received_by_name TEXT,
         received_date TIMESTAMP WITH TIME ZONE,
         receiving_created_date TIMESTAMP WITH TIME ZONE,
         supplier_id INTEGER,
@@ -106,18 +115,6 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
         PRIMARY KEY (receiving_id, pid)
       );

-      -- Temporary table for tracking FIFO allocations
-      CREATE TEMP TABLE temp_receiving_allocations (
-        po_id TEXT NOT NULL,
-        pid BIGINT NOT NULL,
-        receiving_id TEXT NOT NULL,
-        allocated_qty INTEGER NOT NULL,
-        cost_each NUMERIC(14, 4) NOT NULL,
-        received_date TIMESTAMP WITH TIME ZONE NOT NULL,
-        received_by INTEGER,
-        PRIMARY KEY (po_id, pid, receiving_id)
-      );
-
       -- Temporary table for employee names
       CREATE TEMP TABLE employee_names (
         employeeid INTEGER PRIMARY KEY,
@@ -128,7 +125,6 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
       -- Create indexes for efficient joins
       CREATE INDEX idx_temp_po_pid ON temp_purchase_orders(pid);
       CREATE INDEX idx_temp_receiving_pid ON temp_receivings(pid);
-      CREATE INDEX idx_temp_receiving_po_id ON temp_receivings(po_id);
     `);

     // Map status codes to text values
@@ -191,7 +187,56 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
     `, employeeValues);
     }

-    // 1. First, fetch all relevant POs
+    // Create a supplier-names mapping before the PO import
+    outputProgress({
+      status: "running",
+      operation: "Purchase orders import",
+      message: "Fetching supplier data for vendor mapping"
+    });
+
+    // Fetch supplier data from production and store it in a temp table
+    const [suppliers] = await prodConnection.query(`
+      SELECT
+        supplierid,
+        companyname
+      FROM suppliers
+      WHERE companyname IS NOT NULL AND companyname != ''
+    `);
+
+    if (suppliers.length > 0) {
+      // Create temp table for supplier names
+      await localConnection.query(`
+        DROP TABLE IF EXISTS temp_supplier_names;
+        CREATE TEMP TABLE temp_supplier_names (
+          supplier_id INTEGER PRIMARY KEY,
+          company_name TEXT NOT NULL
+        );
+      `);
+
+      // Insert supplier data in batches
+      for (let i = 0; i < suppliers.length; i += INSERT_BATCH_SIZE) {
+        const batch = suppliers.slice(i, i + INSERT_BATCH_SIZE);
+
+        const placeholders = batch.map((_, idx) => {
+          const base = idx * 2;
+          return `($${base + 1}, $${base + 2})`;
+        }).join(',');
+
+        const values = batch.flatMap(s => [
+          s.supplierid,
+          s.companyname || 'Unnamed Supplier'
+        ]);
+
+        await localConnection.query(`
+          INSERT INTO temp_supplier_names (supplier_id, company_name)
+          VALUES ${placeholders}
+          ON CONFLICT (supplier_id) DO UPDATE SET
+            company_name = EXCLUDED.company_name
+        `, values);
+      }
+    }
+
+    // 1. Fetch and process purchase orders
     outputProgress({
       status: "running",
       operation: "Purchase orders import",
@@ -214,6 +259,10 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
const totalPOs = poCount[0].total;
|
const totalPOs = poCount[0].total;
|
||||||
console.log(`Found ${totalPOs} relevant purchase orders`);
|
console.log(`Found ${totalPOs} relevant purchase orders`);
|
||||||
|
|
||||||
|
// Skip processing if no POs to process
|
||||||
|
if (totalPOs === 0) {
|
||||||
|
console.log('No purchase orders to process, skipping PO import step');
|
||||||
|
} else {
|
||||||
// Fetch and process POs in batches
|
// Fetch and process POs in batches
|
||||||
let offset = 0;
|
let offset = 0;
|
||||||
let allPOsProcessed = false;
|
let allPOsProcessed = false;
|
||||||
@@ -358,6 +407,7 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
allPOsProcessed = true;
|
allPOsProcessed = true;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
}
|
||||||
|
|
||||||
// 2. Next, fetch all relevant receivings
|
// 2. Next, fetch all relevant receivings
|
||||||
outputProgress({
|
outputProgress({
|
||||||
@@ -381,6 +431,10 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
const totalReceivings = receivingCount[0].total;
|
const totalReceivings = receivingCount[0].total;
|
||||||
console.log(`Found ${totalReceivings} relevant receivings`);
|
console.log(`Found ${totalReceivings} relevant receivings`);
|
||||||
|
|
||||||
|
// Skip processing if no receivings to process
|
||||||
|
if (totalReceivings === 0) {
|
||||||
|
console.log('No receivings to process, skipping receivings import step');
|
||||||
|
} else {
|
||||||
// Fetch and process receivings in batches
|
// Fetch and process receivings in batches
|
||||||
offset = 0; // Reset offset for receivings
|
offset = 0; // Reset offset for receivings
|
||||||
let allReceivingsProcessed = false;
|
let allReceivingsProcessed = false;
|
||||||
@@ -389,10 +443,16 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
const [receivingList] = await prodConnection.query(`
|
const [receivingList] = await prodConnection.query(`
|
||||||
SELECT
|
SELECT
|
||||||
r.receiving_id,
|
r.receiving_id,
|
||||||
r.po_id,
|
|
||||||
r.supplier_id,
|
r.supplier_id,
|
||||||
r.status,
|
r.status,
|
||||||
r.date_created
|
r.notes,
|
||||||
|
r.shipping,
|
||||||
|
r.total_amount,
|
||||||
|
r.hold,
|
||||||
|
r.for_storefront,
|
||||||
|
r.date_created,
|
||||||
|
r.date_paid,
|
||||||
|
r.date_checked
|
||||||
FROM receivings r
|
FROM receivings r
|
||||||
WHERE r.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL ${yearInterval} YEAR)
|
WHERE r.date_created >= DATE_SUB(CURRENT_DATE, INTERVAL ${yearInterval} YEAR)
|
||||||
${incrementalUpdate ? `
|
${incrementalUpdate ? `
|
||||||
@@ -418,12 +478,17 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
rp.receiving_id,
|
rp.receiving_id,
|
||||||
rp.pid,
|
rp.pid,
|
||||||
rp.qty_each,
|
rp.qty_each,
|
||||||
|
rp.qty_each_orig,
|
||||||
rp.cost_each,
|
rp.cost_each,
|
||||||
|
rp.cost_each_orig,
|
||||||
rp.received_by,
|
rp.received_by,
|
||||||
rp.received_date,
|
rp.received_date,
|
||||||
r.date_created as receiving_created_date
|
r.date_created as receiving_created_date,
|
||||||
|
COALESCE(p.itemnumber, 'NO-SKU') AS sku,
|
||||||
|
COALESCE(p.description, 'Unknown Product') AS name
|
||||||
FROM receivings_products rp
|
FROM receivings_products rp
|
||||||
JOIN receivings r ON rp.receiving_id = r.receiving_id
|
JOIN receivings r ON rp.receiving_id = r.receiving_id
|
||||||
|
LEFT JOIN products p ON rp.pid = p.pid
|
||||||
WHERE rp.receiving_id IN (?)
|
WHERE rp.receiving_id IN (?)
|
||||||
`, [receivingIds]);
|
`, [receivingIds]);
|
||||||
|
|
||||||
@@ -433,13 +498,46 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
const receiving = receivingList.find(r => r.receiving_id == product.receiving_id);
|
const receiving = receivingList.find(r => r.receiving_id == product.receiving_id);
|
||||||
if (!receiving) continue;
|
if (!receiving) continue;
|
||||||
|
|
||||||
|
// Get employee name if available
|
||||||
|
let receivedByName = null;
|
||||||
|
if (product.received_by) {
|
||||||
|
const [employeeResult] = await localConnection.query(`
|
||||||
|
SELECT CONCAT(firstname, ' ', lastname) as full_name
|
||||||
|
FROM employee_names
|
||||||
|
WHERE employeeid = $1
|
||||||
|
`, [product.received_by]);
|
||||||
|
|
||||||
|
if (employeeResult.rows.length > 0) {
|
||||||
|
receivedByName = employeeResult.rows[0].full_name;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get vendor name if available
|
||||||
|
let vendorName = 'Unknown Vendor';
|
||||||
|
if (receiving.supplier_id) {
|
||||||
|
const [vendorResult] = await localConnection.query(`
|
||||||
|
SELECT company_name
|
||||||
|
FROM temp_supplier_names
|
||||||
|
WHERE supplier_id = $1
|
||||||
|
`, [receiving.supplier_id]);
|
||||||
|
|
||||||
|
if (vendorResult.rows.length > 0) {
|
||||||
|
vendorName = vendorResult.rows[0].company_name;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
completeReceivings.push({
|
completeReceivings.push({
|
||||||
receiving_id: receiving.receiving_id.toString(),
|
receiving_id: receiving.receiving_id.toString(),
|
||||||
po_id: receiving.po_id ? receiving.po_id.toString() : null,
|
|
||||||
pid: product.pid,
|
pid: product.pid,
|
||||||
|
sku: product.sku,
|
||||||
|
name: product.name,
|
||||||
|
vendor: vendorName,
|
||||||
qty_each: product.qty_each,
|
qty_each: product.qty_each,
|
||||||
|
qty_each_orig: product.qty_each_orig,
|
||||||
cost_each: product.cost_each,
|
cost_each: product.cost_each,
|
||||||
|
cost_each_orig: product.cost_each_orig,
|
||||||
received_by: product.received_by,
|
received_by: product.received_by,
|
||||||
|
received_by_name: receivedByName,
|
||||||
received_date: validateDate(product.received_date) || validateDate(product.receiving_created_date),
|
received_date: validateDate(product.received_date) || validateDate(product.receiving_created_date),
|
||||||
receiving_created_date: validateDate(product.receiving_created_date),
|
receiving_created_date: validateDate(product.receiving_created_date),
|
||||||
supplier_id: receiving.supplier_id,
|
supplier_id: receiving.supplier_id,
|
||||||
@@ -452,17 +550,22 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
const batch = completeReceivings.slice(i, i + INSERT_BATCH_SIZE);
|
const batch = completeReceivings.slice(i, i + INSERT_BATCH_SIZE);
|
||||||
|
|
||||||
const placeholders = batch.map((_, idx) => {
|
const placeholders = batch.map((_, idx) => {
|
||||||
const base = idx * 10;
|
const base = idx * 15;
|
||||||
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10})`;
|
return `($${base + 1}, $${base + 2}, $${base + 3}, $${base + 4}, $${base + 5}, $${base + 6}, $${base + 7}, $${base + 8}, $${base + 9}, $${base + 10}, $${base + 11}, $${base + 12}, $${base + 13}, $${base + 14}, $${base + 15})`;
|
||||||
}).join(',');
|
}).join(',');
|
||||||
|
|
||||||
const values = batch.flatMap(r => [
|
const values = batch.flatMap(r => [
|
||||||
r.receiving_id,
|
r.receiving_id,
|
||||||
r.po_id,
|
|
||||||
r.pid,
|
r.pid,
|
||||||
|
r.sku,
|
||||||
|
r.name,
|
||||||
|
r.vendor,
|
||||||
r.qty_each,
|
r.qty_each,
|
||||||
|
r.qty_each_orig,
|
||||||
r.cost_each,
|
r.cost_each,
|
||||||
|
r.cost_each_orig,
|
||||||
r.received_by,
|
r.received_by,
|
||||||
|
r.received_by_name,
|
||||||
r.received_date,
|
r.received_date,
|
||||||
r.receiving_created_date,
|
r.receiving_created_date,
|
||||||
r.supplier_id,
|
r.supplier_id,
|
||||||
@@ -471,15 +574,21 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
|
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
INSERT INTO temp_receivings (
|
INSERT INTO temp_receivings (
|
||||||
receiving_id, po_id, pid, qty_each, cost_each, received_by,
|
receiving_id, pid, sku, name, vendor, qty_each, qty_each_orig,
|
||||||
|
cost_each, cost_each_orig, received_by, received_by_name,
|
||||||
received_date, receiving_created_date, supplier_id, status
|
received_date, receiving_created_date, supplier_id, status
|
||||||
)
|
)
|
||||||
VALUES ${placeholders}
|
VALUES ${placeholders}
|
||||||
ON CONFLICT (receiving_id, pid) DO UPDATE SET
|
ON CONFLICT (receiving_id, pid) DO UPDATE SET
|
||||||
po_id = EXCLUDED.po_id,
|
sku = EXCLUDED.sku,
|
||||||
|
name = EXCLUDED.name,
|
||||||
|
vendor = EXCLUDED.vendor,
|
||||||
qty_each = EXCLUDED.qty_each,
|
qty_each = EXCLUDED.qty_each,
|
||||||
|
qty_each_orig = EXCLUDED.qty_each_orig,
|
||||||
cost_each = EXCLUDED.cost_each,
|
cost_each = EXCLUDED.cost_each,
|
||||||
|
cost_each_orig = EXCLUDED.cost_each_orig,
|
||||||
received_by = EXCLUDED.received_by,
|
received_by = EXCLUDED.received_by,
|
||||||
|
received_by_name = EXCLUDED.received_by_name,
|
||||||
received_date = EXCLUDED.received_date,
|
received_date = EXCLUDED.received_date,
|
||||||
receiving_created_date = EXCLUDED.receiving_created_date,
|
receiving_created_date = EXCLUDED.receiving_created_date,
|
||||||
supplier_id = EXCLUDED.supplier_id,
|
supplier_id = EXCLUDED.supplier_id,
|
||||||
@@ -505,16 +614,15 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
allReceivingsProcessed = true;
|
allReceivingsProcessed = true;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
}
|
||||||
|
|
||||||
// 3. Implement FIFO allocation of receivings to purchase orders
|
// Add this section to filter out invalid PIDs before final import
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: "running",
|
status: "running",
|
||||||
operation: "Purchase orders import",
|
operation: "Purchase orders import",
|
||||||
message: "Validating product IDs before allocation"
|
message: "Validating product IDs before final import"
|
||||||
});
|
});
|
||||||
|
|
||||||
// Add this section to filter out invalid PIDs before allocation
|
|
||||||
// This will check all PIDs in our temp tables against the products table
|
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
-- Create temp table to store invalid PIDs
|
-- Create temp table to store invalid PIDs
|
||||||
DROP TABLE IF EXISTS temp_invalid_pids;
|
DROP TABLE IF EXISTS temp_invalid_pids;
|
||||||
@@ -552,362 +660,157 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
console.log(`Filtered out ${filteredCount} items with invalid product IDs`);
|
console.log(`Filtered out ${filteredCount} items with invalid product IDs`);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Break FIFO allocation into steps with progress tracking
|
// 3. Insert final purchase order records to the actual table
|
||||||
const fifoSteps = [
|
|
||||||
{
|
|
||||||
name: "Direct allocations",
|
|
||||||
query: `
|
|
||||||
INSERT INTO temp_receiving_allocations (
|
|
||||||
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
|
|
||||||
)
|
|
||||||
SELECT
|
|
||||||
r.po_id,
|
|
||||||
r.pid,
|
|
||||||
r.receiving_id,
|
|
||||||
LEAST(r.qty_each, po.ordered) as allocated_qty,
|
|
||||||
r.cost_each,
|
|
||||||
COALESCE(r.received_date, NOW()) as received_date,
|
|
||||||
r.received_by
|
|
||||||
FROM temp_receivings r
|
|
||||||
JOIN temp_purchase_orders po ON r.po_id = po.po_id AND r.pid = po.pid
|
|
||||||
WHERE r.po_id IS NOT NULL
|
|
||||||
`
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: "Handling standalone receivings",
|
|
||||||
query: `
|
|
||||||
INSERT INTO temp_purchase_orders (
|
|
||||||
po_id, pid, sku, name, vendor, date, status,
|
|
||||||
ordered, po_cost_price, supplier_id, date_created, date_ordered
|
|
||||||
)
|
|
||||||
SELECT
|
|
||||||
r.receiving_id::text as po_id,
|
|
||||||
r.pid,
|
|
||||||
COALESCE(p.sku, 'NO-SKU') as sku,
|
|
||||||
COALESCE(p.name, 'Unknown Product') as name,
|
|
||||||
COALESCE(
|
|
||||||
(SELECT vendor FROM temp_purchase_orders
|
|
||||||
WHERE supplier_id = r.supplier_id LIMIT 1),
|
|
||||||
'Unknown Vendor'
|
|
||||||
) as vendor,
|
|
||||||
COALESCE(r.received_date, r.receiving_created_date) as date,
|
|
||||||
'created' as status,
|
|
||||||
NULL as ordered,
|
|
||||||
r.cost_each as po_cost_price,
|
|
||||||
r.supplier_id,
|
|
||||||
COALESCE(r.receiving_created_date, r.received_date) as date_created,
|
|
||||||
NULL as date_ordered
|
|
||||||
FROM temp_receivings r
|
|
||||||
LEFT JOIN (
|
|
||||||
SELECT DISTINCT pid, sku, name FROM temp_purchase_orders
|
|
||||||
) p ON r.pid = p.pid
|
|
||||||
WHERE r.po_id IS NULL
|
|
||||||
OR NOT EXISTS (
|
|
||||||
SELECT 1 FROM temp_purchase_orders po
|
|
||||||
WHERE po.po_id = r.po_id AND po.pid = r.pid
|
|
||||||
)
|
|
||||||
ON CONFLICT (po_id, pid) DO NOTHING
|
|
||||||
`
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: "Allocating standalone receivings",
|
|
||||||
query: `
|
|
||||||
INSERT INTO temp_receiving_allocations (
|
|
||||||
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
|
|
||||||
)
|
|
||||||
SELECT
|
|
||||||
r.receiving_id::text as po_id,
|
|
||||||
r.pid,
|
|
||||||
r.receiving_id,
|
|
||||||
r.qty_each as allocated_qty,
|
|
||||||
r.cost_each,
|
|
||||||
COALESCE(r.received_date, NOW()) as received_date,
|
|
||||||
r.received_by
|
|
||||||
FROM temp_receivings r
|
|
||||||
WHERE r.po_id IS NULL
|
|
||||||
OR NOT EXISTS (
|
|
||||||
SELECT 1 FROM temp_purchase_orders po
|
|
||||||
WHERE po.po_id = r.po_id AND po.pid = r.pid
|
|
||||||
)
|
|
||||||
`
|
|
||||||
},
|
|
||||||
{
|
|
||||||
name: "FIFO allocation logic",
|
|
||||||
query: `
|
|
||||||
WITH
|
|
||||||
-- Calculate remaining quantities after direct allocations
|
|
||||||
remaining_po_quantities AS (
|
|
||||||
SELECT
|
|
||||||
po.po_id,
|
|
||||||
po.pid,
|
|
||||||
po.ordered,
|
|
||||||
COALESCE(SUM(ra.allocated_qty), 0) as already_allocated,
|
|
||||||
po.ordered - COALESCE(SUM(ra.allocated_qty), 0) as remaining_qty,
|
|
||||||
po.date_ordered,
|
|
||||||
po.date_created
|
|
||||||
FROM temp_purchase_orders po
|
|
||||||
LEFT JOIN temp_receiving_allocations ra ON po.po_id = ra.po_id AND po.pid = ra.pid
|
|
||||||
WHERE po.ordered IS NOT NULL
|
|
||||||
GROUP BY po.po_id, po.pid, po.ordered, po.date_ordered, po.date_created
|
|
||||||
HAVING po.ordered > COALESCE(SUM(ra.allocated_qty), 0)
|
|
||||||
),
|
|
||||||
remaining_receiving_quantities AS (
|
|
||||||
SELECT
|
|
||||||
r.receiving_id,
|
|
||||||
r.pid,
|
|
||||||
r.qty_each,
|
|
||||||
COALESCE(SUM(ra.allocated_qty), 0) as already_allocated,
|
|
||||||
r.qty_each - COALESCE(SUM(ra.allocated_qty), 0) as remaining_qty,
|
|
||||||
r.received_date,
|
|
||||||
r.cost_each,
|
|
||||||
r.received_by
|
|
||||||
FROM temp_receivings r
|
|
||||||
LEFT JOIN temp_receiving_allocations ra ON r.receiving_id = ra.receiving_id AND r.pid = ra.pid
|
|
||||||
GROUP BY r.receiving_id, r.pid, r.qty_each, r.received_date, r.cost_each, r.received_by
|
|
||||||
HAVING r.qty_each > COALESCE(SUM(ra.allocated_qty), 0)
|
|
||||||
),
|
|
||||||
-- Rank POs by age, with a cutoff for very old POs (1 year)
|
|
||||||
ranked_pos AS (
|
|
||||||
SELECT
|
|
||||||
po.po_id,
|
|
||||||
po.pid,
|
|
||||||
po.remaining_qty,
|
|
||||||
CASE
|
|
||||||
WHEN po.date_ordered IS NULL OR po.date_ordered < NOW() - INTERVAL '1 year' THEN 2
|
|
||||||
ELSE 1
|
|
||||||
END as age_group,
|
|
||||||
ROW_NUMBER() OVER (
|
|
||||||
PARTITION BY po.pid, (CASE WHEN po.date_ordered IS NULL OR po.date_ordered < NOW() - INTERVAL '1 year' THEN 2 ELSE 1 END)
|
|
||||||
ORDER BY COALESCE(po.date_ordered, po.date_created, NOW())
|
|
||||||
) as rank_in_group
|
|
||||||
FROM remaining_po_quantities po
|
|
||||||
),
|
|
||||||
-- Rank receivings by date
|
|
||||||
ranked_receivings AS (
|
|
||||||
SELECT
|
|
||||||
r.receiving_id,
|
|
||||||
r.pid,
|
|
||||||
r.remaining_qty,
|
|
||||||
r.received_date,
|
|
||||||
r.cost_each,
|
|
||||||
r.received_by,
|
|
||||||
ROW_NUMBER() OVER (PARTITION BY r.pid ORDER BY COALESCE(r.received_date, NOW())) as rank
|
|
||||||
FROM remaining_receiving_quantities r
|
|
||||||
),
|
|
||||||
-- First allocate to recent POs
|
|
||||||
allocations_recent AS (
|
|
||||||
SELECT
|
|
||||||
po.po_id,
|
|
||||||
po.pid,
|
|
||||||
r.receiving_id,
|
|
||||||
LEAST(po.remaining_qty, r.remaining_qty) as allocated_qty,
|
|
||||||
r.cost_each,
|
|
||||||
COALESCE(r.received_date, NOW()) as received_date,
|
|
||||||
r.received_by,
|
|
||||||
po.age_group,
|
|
||||||
po.rank_in_group,
|
|
||||||
r.rank,
|
|
||||||
'recent' as allocation_type
|
|
||||||
FROM ranked_pos po
|
|
||||||
JOIN ranked_receivings r ON po.pid = r.pid
|
|
||||||
WHERE po.age_group = 1
|
|
||||||
ORDER BY po.pid, po.rank_in_group, r.rank
|
|
||||||
),
|
|
||||||
-- Then allocate to older POs
|
|
||||||
remaining_after_recent AS (
|
|
||||||
SELECT
|
|
||||||
r.receiving_id,
|
|
||||||
r.pid,
|
|
||||||
r.remaining_qty - COALESCE(SUM(a.allocated_qty), 0) as remaining_qty,
|
|
||||||
r.received_date,
|
|
||||||
r.cost_each,
|
|
||||||
r.received_by,
|
|
||||||
r.rank
|
|
||||||
FROM ranked_receivings r
|
|
||||||
LEFT JOIN allocations_recent a ON r.receiving_id = a.receiving_id AND r.pid = a.pid
|
|
||||||
GROUP BY r.receiving_id, r.pid, r.remaining_qty, r.received_date, r.cost_each, r.received_by, r.rank
|
|
||||||
HAVING r.remaining_qty > COALESCE(SUM(a.allocated_qty), 0)
|
|
||||||
),
|
|
||||||
allocations_old AS (
|
|
||||||
SELECT
|
|
||||||
po.po_id,
|
|
||||||
po.pid,
|
|
||||||
r.receiving_id,
|
|
||||||
LEAST(po.remaining_qty, r.remaining_qty) as allocated_qty,
|
|
||||||
r.cost_each,
|
|
||||||
COALESCE(r.received_date, NOW()) as received_date,
|
|
||||||
r.received_by,
|
|
||||||
po.age_group,
|
|
||||||
po.rank_in_group,
|
|
||||||
r.rank,
|
|
||||||
'old' as allocation_type
|
|
||||||
FROM ranked_pos po
|
|
||||||
JOIN remaining_after_recent r ON po.pid = r.pid
|
|
||||||
WHERE po.age_group = 2
|
|
||||||
ORDER BY po.pid, po.rank_in_group, r.rank
|
|
||||||
),
|
|
||||||
-- Combine allocations
|
|
||||||
combined_allocations AS (
|
|
||||||
SELECT * FROM allocations_recent
|
|
||||||
UNION ALL
|
|
||||||
SELECT * FROM allocations_old
|
|
||||||
)
|
|
||||||
-- Insert into allocations table
|
|
||||||
INSERT INTO temp_receiving_allocations (
|
|
||||||
po_id, pid, receiving_id, allocated_qty, cost_each, received_date, received_by
|
|
||||||
)
|
|
||||||
SELECT
|
|
||||||
po_id, pid, receiving_id, allocated_qty, cost_each,
|
|
||||||
COALESCE(received_date, NOW()) as received_date,
|
|
||||||
received_by
|
|
||||||
FROM combined_allocations
|
|
||||||
WHERE allocated_qty > 0
|
|
||||||
`
|
|
||||||
}
|
|
||||||
];
|
|
||||||
|
|
||||||
// Execute FIFO steps with progress tracking
|
|
||||||
for (let i = 0; i < fifoSteps.length; i++) {
|
|
||||||
const step = fifoSteps[i];
|
|
||||||
outputProgress({
|
outputProgress({
|
||||||
status: "running",
|
status: "running",
|
||||||
operation: "Purchase orders import",
|
operation: "Purchase orders import",
|
||||||
message: `FIFO allocation step ${i+1}/${fifoSteps.length}: ${step.name}`,
|
message: "Inserting final purchase order records"
|
||||||
current: i,
|
|
||||||
total: fifoSteps.length
|
|
||||||
});
|
});
|
||||||
|
|
||||||
await localConnection.query(step.query);
|
// Create a temp table to track PO IDs being processed
|
||||||
}
|
await localConnection.query(`
|
||||||
|
DROP TABLE IF EXISTS processed_po_ids;
|
||||||
|
CREATE TEMP TABLE processed_po_ids AS (
|
||||||
|
SELECT DISTINCT po_id FROM temp_purchase_orders
|
||||||
|
);
|
||||||
|
`);
|
||||||
|
|
||||||
// 4. Generate final purchase order records with receiving data
|
// Delete products that were removed from POs and count them
|
||||||
outputProgress({
|
const [poDeletedResult] = await localConnection.query(`
|
||||||
status: "running",
|
WITH deleted AS (
|
||||||
operation: "Purchase orders import",
|
DELETE FROM purchase_orders
|
||||||
message: "Generating final purchase order records"
|
WHERE po_id IN (SELECT po_id FROM processed_po_ids)
|
||||||
});
|
AND NOT EXISTS (
|
||||||
|
SELECT 1 FROM temp_purchase_orders tp
|
||||||
const [finalResult] = await localConnection.query(`
|
WHERE purchase_orders.po_id = tp.po_id AND purchase_orders.pid = tp.pid
|
||||||
WITH
|
|
||||||
receiving_summaries AS (
|
|
||||||
SELECT
|
|
||||||
po_id,
|
|
||||||
pid,
|
|
||||||
SUM(allocated_qty) as total_received,
|
|
||||||
JSONB_AGG(
|
|
||||||
JSONB_BUILD_OBJECT(
|
|
||||||
'receiving_id', receiving_id,
|
|
||||||
'qty', allocated_qty,
|
|
||||||
'date', COALESCE(received_date, NOW()),
|
|
||||||
'cost', cost_each,
|
|
||||||
'received_by', received_by,
|
|
||||||
'received_by_name', CASE
|
|
||||||
WHEN received_by IS NOT NULL AND received_by > 0 THEN
|
|
||||||
(SELECT CONCAT(firstname, ' ', lastname)
|
|
||||||
FROM employee_names
|
|
||||||
WHERE employeeid = received_by)
|
|
||||||
ELSE NULL
|
|
||||||
END
|
|
||||||
) ORDER BY COALESCE(received_date, NOW())
|
|
||||||
) as receiving_history,
|
|
||||||
MIN(COALESCE(received_date, NOW())) as first_received_date,
|
|
||||||
MAX(COALESCE(received_date, NOW())) as last_received_date,
|
|
||||||
STRING_AGG(
|
|
||||||
DISTINCT CASE WHEN received_by IS NOT NULL AND received_by > 0
|
|
||||||
THEN CAST(received_by AS TEXT)
|
|
||||||
ELSE NULL
|
|
||||||
END,
|
|
||||||
','
|
|
||||||
) as received_by_list,
|
|
||||||
STRING_AGG(
|
|
||||||
DISTINCT CASE
|
|
||||||
WHEN ra.received_by IS NOT NULL AND ra.received_by > 0 THEN
|
|
||||||
(SELECT CONCAT(firstname, ' ', lastname)
|
|
||||||
FROM employee_names
|
|
||||||
WHERE employeeid = ra.received_by)
|
|
||||||
ELSE NULL
|
|
||||||
END,
|
|
||||||
', '
|
|
||||||
) as received_by_names
|
|
||||||
FROM temp_receiving_allocations ra
|
|
||||||
GROUP BY po_id, pid
|
|
||||||
),
|
|
||||||
cost_averaging AS (
|
|
||||||
SELECT
|
|
||||||
ra.po_id,
|
|
||||||
ra.pid,
|
|
||||||
SUM(ra.allocated_qty * ra.cost_each) / NULLIF(SUM(ra.allocated_qty), 0) as avg_cost
|
|
||||||
FROM temp_receiving_allocations ra
|
|
||||||
GROUP BY ra.po_id, ra.pid
|
|
||||||
)
|
)
|
||||||
|
RETURNING po_id, pid
|
||||||
|
)
|
||||||
|
SELECT COUNT(*) as count FROM deleted
|
||||||
|
`);
|
||||||
|
|
||||||
|
poRecordsDeleted = poDeletedResult.rows[0]?.count || 0;
|
||||||
|
console.log(`Deleted ${poRecordsDeleted} products that were removed from purchase orders`);
|
||||||
|
|
||||||
|
const [poResult] = await localConnection.query(`
|
||||||
INSERT INTO purchase_orders (
|
INSERT INTO purchase_orders (
|
||||||
po_id, vendor, date, expected_date, pid, sku, name,
|
po_id, vendor, date, expected_date, pid, sku, name,
|
||||||
cost_price, po_cost_price, status, receiving_status, notes, long_note,
|
po_cost_price, status, notes, long_note,
|
||||||
ordered, received, received_date, last_received_date, received_by,
|
ordered, supplier_id, date_created, date_ordered
|
||||||
receiving_history
|
|
||||||
)
|
)
|
||||||
SELECT
|
SELECT
|
||||||
po.po_id,
|
po_id,
|
||||||
po.vendor,
|
vendor,
|
||||||
CASE
|
COALESCE(date, date_created, now()) as date,
|
||||||
WHEN po.date IS NOT NULL THEN po.date
|
expected_date,
|
||||||
-- For standalone receivings, try to use the receiving date from history
|
pid,
|
||||||
WHEN po.po_id LIKE 'R%' AND rs.first_received_date IS NOT NULL THEN rs.first_received_date
|
sku,
|
||||||
-- As a last resort for data integrity, use Unix epoch (Jan 1, 1970)
|
name,
|
||||||
ELSE to_timestamp(0)
|
po_cost_price,
|
||||||
END as date,
|
status,
|
||||||
NULLIF(po.expected_date::text, '0000-00-00')::date as expected_date,
|
notes,
|
||||||
po.pid,
|
long_note,
|
||||||
po.sku,
|
ordered,
|
||||||
po.name,
|
supplier_id,
|
||||||
COALESCE(ca.avg_cost, po.po_cost_price) as cost_price,
|
date_created,
|
||||||
po.po_cost_price,
|
date_ordered
|
||||||
COALESCE(po.status, 'created'),
|
FROM temp_purchase_orders
|
||||||
CASE
|
|
||||||
WHEN rs.total_received IS NULL THEN 'created'
|
|
||||||
WHEN rs.total_received = 0 THEN 'created'
|
|
||||||
WHEN rs.total_received < po.ordered THEN 'partial_received'
|
|
||||||
WHEN rs.total_received >= po.ordered THEN 'full_received'
|
|
||||||
ELSE 'created'
|
|
||||||
END as receiving_status,
|
|
||||||
po.notes,
|
|
||||||
po.long_note,
|
|
||||||
COALESCE(po.ordered, 0),
|
|
||||||
COALESCE(rs.total_received, 0),
|
|
||||||
NULLIF(rs.first_received_date::text, '0000-00-00 00:00:00')::timestamp with time zone as received_date,
|
|
||||||
NULLIF(rs.last_received_date::text, '0000-00-00 00:00:00')::timestamp with time zone as last_received_date,
|
|
||||||
CASE
|
|
||||||
WHEN rs.received_by_list IS NULL THEN NULL
|
|
||||||
ELSE rs.received_by_names
|
|
||||||
END as received_by,
|
|
||||||
rs.receiving_history
|
|
||||||
FROM temp_purchase_orders po
|
|
||||||
LEFT JOIN receiving_summaries rs ON po.po_id = rs.po_id AND po.pid = rs.pid
|
|
||||||
LEFT JOIN cost_averaging ca ON po.po_id = ca.po_id AND po.pid = ca.pid
|
|
||||||
ON CONFLICT (po_id, pid) DO UPDATE SET
|
ON CONFLICT (po_id, pid) DO UPDATE SET
|
||||||
vendor = EXCLUDED.vendor,
|
vendor = EXCLUDED.vendor,
|
||||||
date = EXCLUDED.date,
|
date = EXCLUDED.date,
|
||||||
expected_date = EXCLUDED.expected_date,
|
expected_date = EXCLUDED.expected_date,
|
||||||
sku = EXCLUDED.sku,
|
sku = EXCLUDED.sku,
|
||||||
name = EXCLUDED.name,
|
name = EXCLUDED.name,
|
||||||
cost_price = EXCLUDED.cost_price,
|
|
||||||
po_cost_price = EXCLUDED.po_cost_price,
|
po_cost_price = EXCLUDED.po_cost_price,
|
||||||
status = EXCLUDED.status,
|
status = EXCLUDED.status,
|
||||||
receiving_status = EXCLUDED.receiving_status,
|
|
||||||
notes = EXCLUDED.notes,
|
notes = EXCLUDED.notes,
|
||||||
long_note = EXCLUDED.long_note,
|
long_note = EXCLUDED.long_note,
|
||||||
ordered = EXCLUDED.ordered,
|
ordered = EXCLUDED.ordered,
|
||||||
received = EXCLUDED.received,
|
supplier_id = EXCLUDED.supplier_id,
|
||||||
received_date = EXCLUDED.received_date,
|
date_created = EXCLUDED.date_created,
|
||||||
last_received_date = EXCLUDED.last_received_date,
|
date_ordered = EXCLUDED.date_ordered,
|
||||||
received_by = EXCLUDED.received_by,
|
|
||||||
receiving_history = EXCLUDED.receiving_history,
|
|
||||||
updated = CURRENT_TIMESTAMP
|
updated = CURRENT_TIMESTAMP
|
||||||
RETURNING (xmax = 0) as inserted
|
RETURNING (xmax = 0) as inserted
|
||||||
`);
|
`);
|
||||||
|
|
||||||
recordsAdded = finalResult.rows.filter(r => r.inserted).length;
|
poRecordsAdded = poResult.rows.filter(r => r.inserted).length;
|
||||||
recordsUpdated = finalResult.rows.filter(r => !r.inserted).length;
|
poRecordsUpdated = poResult.rows.filter(r => !r.inserted).length;
|
||||||
|
|
||||||
|
// 4. Insert final receiving records to the actual table
|
||||||
|
outputProgress({
|
||||||
|
status: "running",
|
||||||
|
operation: "Purchase orders import",
|
||||||
|
message: "Inserting final receiving records"
|
||||||
|
});
|
||||||
|
|
||||||
|
// Create a temp table to track receiving IDs being processed
|
||||||
|
await localConnection.query(`
|
||||||
|
DROP TABLE IF EXISTS processed_receiving_ids;
|
||||||
|
CREATE TEMP TABLE processed_receiving_ids AS (
|
||||||
|
SELECT DISTINCT receiving_id FROM temp_receivings
|
||||||
|
);
|
||||||
|
`);
|
||||||
|
|
||||||
|
// Delete products that were removed from receivings and count them
|
||||||
|
const [receivingDeletedResult] = await localConnection.query(`
|
||||||
|
WITH deleted AS (
|
||||||
|
DELETE FROM receivings
|
||||||
|
WHERE receiving_id IN (SELECT receiving_id FROM processed_receiving_ids)
|
||||||
|
AND NOT EXISTS (
|
||||||
|
SELECT 1 FROM temp_receivings tr
|
||||||
|
WHERE receivings.receiving_id = tr.receiving_id AND receivings.pid = tr.pid
|
||||||
|
)
|
||||||
|
RETURNING receiving_id, pid
|
||||||
|
)
|
||||||
|
SELECT COUNT(*) as count FROM deleted
|
||||||
|
`);
|
||||||
|
|
||||||
|
receivingRecordsDeleted = receivingDeletedResult.rows[0]?.count || 0;
|
||||||
|
console.log(`Deleted ${receivingRecordsDeleted} products that were removed from receivings`);
|
||||||
|
|
||||||
|
const [receivingsResult] = await localConnection.query(`
|
||||||
|
INSERT INTO receivings (
|
||||||
|
receiving_id, pid, sku, name, vendor, qty_each, qty_each_orig,
|
||||||
|
cost_each, cost_each_orig, received_by, received_by_name,
|
||||||
|
received_date, receiving_created_date, supplier_id, status
|
||||||
|
)
|
||||||
|
SELECT
|
||||||
|
receiving_id,
|
||||||
|
pid,
|
||||||
|
sku,
|
||||||
|
name,
|
||||||
|
vendor,
|
||||||
|
qty_each,
|
||||||
|
qty_each_orig,
|
||||||
|
cost_each,
|
||||||
|
cost_each_orig,
|
||||||
|
received_by,
|
||||||
|
received_by_name,
|
||||||
|
COALESCE(received_date, receiving_created_date, now()) as received_date,
|
||||||
|
receiving_created_date,
|
||||||
|
supplier_id,
|
||||||
|
status
|
||||||
|
FROM temp_receivings
|
||||||
|
ON CONFLICT (receiving_id, pid) DO UPDATE SET
|
||||||
|
sku = EXCLUDED.sku,
|
||||||
|
name = EXCLUDED.name,
|
||||||
|
vendor = EXCLUDED.vendor,
|
||||||
|
qty_each = EXCLUDED.qty_each,
|
||||||
|
qty_each_orig = EXCLUDED.qty_each_orig,
|
||||||
|
cost_each = EXCLUDED.cost_each,
|
||||||
|
cost_each_orig = EXCLUDED.cost_each_orig,
|
||||||
|
received_by = EXCLUDED.received_by,
|
||||||
|
received_by_name = EXCLUDED.received_by_name,
|
||||||
|
received_date = EXCLUDED.received_date,
|
||||||
|
receiving_created_date = EXCLUDED.receiving_created_date,
|
||||||
|
supplier_id = EXCLUDED.supplier_id,
|
||||||
|
status = EXCLUDED.status,
|
||||||
|
updated = CURRENT_TIMESTAMP
|
||||||
|
RETURNING (xmax = 0) as inserted
|
||||||
|
`);
|
||||||
|
|
||||||
|
receivingRecordsAdded = receivingsResult.rows.filter(r => r.inserted).length;
|
||||||
|
receivingRecordsUpdated = receivingsResult.rows.filter(r => !r.inserted).length;
|
||||||
|
|
||||||
// Update sync status
|
// Update sync status
|
||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
@@ -921,8 +824,11 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
|
|||||||
await localConnection.query(`
|
await localConnection.query(`
|
||||||
DROP TABLE IF EXISTS temp_purchase_orders;
|
DROP TABLE IF EXISTS temp_purchase_orders;
|
||||||
DROP TABLE IF EXISTS temp_receivings;
|
DROP TABLE IF EXISTS temp_receivings;
|
||||||
DROP TABLE IF EXISTS temp_receiving_allocations;
|
|
||||||
DROP TABLE IF EXISTS employee_names;
|
DROP TABLE IF EXISTS employee_names;
|
||||||
|
DROP TABLE IF EXISTS temp_supplier_names;
|
||||||
|
DROP TABLE IF EXISTS temp_invalid_pids;
|
||||||
|
DROP TABLE IF EXISTS processed_po_ids;
|
||||||
|
DROP TABLE IF EXISTS processed_receiving_ids;
|
||||||
`);
|
`);
|
||||||
|
|
||||||
// Commit transaction
|
// Commit transaction
|
||||||
@@ -930,8 +836,15 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental

     return {
       status: "complete",
-      recordsAdded: recordsAdded || 0,
-      recordsUpdated: recordsUpdated || 0,
+      recordsAdded: poRecordsAdded + receivingRecordsAdded,
+      recordsUpdated: poRecordsUpdated + receivingRecordsUpdated,
+      recordsDeleted: poRecordsDeleted + receivingRecordsDeleted,
+      poRecordsAdded,
+      poRecordsUpdated,
+      poRecordsDeleted,
+      receivingRecordsAdded,
+      receivingRecordsUpdated,
+      receivingRecordsDeleted,
       totalRecords: totalProcessed
     };
   } catch (error) {
@@ -949,6 +862,7 @@ async function importPurchaseOrders(prodConnection, localConnection, incremental
       error: error.message,
       recordsAdded: 0,
       recordsUpdated: 0,
+      recordsDeleted: 0,
       totalRecords: 0
     };
   }
@@ -91,6 +91,287 @@ function cancelCalculation() {
 process.on('SIGTERM', cancelCalculation);
 process.on('SIGINT', cancelCalculation);
 
+const calculateInitialMetrics = (client, onProgress) => {
+  return client.query(`
+    -- Truncate the existing metrics tables to ensure clean data
+    TRUNCATE TABLE public.daily_product_snapshots;
+    TRUNCATE TABLE public.product_metrics;
+
+    -- First let's create daily snapshots for all products with order activity
+    WITH SalesData AS (
+      SELECT
+        p.pid,
+        p.sku,
+        o.date::date AS order_date,
+        -- Count orders to ensure we only include products with real activity
+        COUNT(o.id) as order_count,
+        -- Aggregate Sales (Quantity > 0, Status not Canceled/Returned)
+        COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.quantity ELSE 0 END), 0) AS units_sold,
+        COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.price * o.quantity ELSE 0 END), 0.00) AS gross_revenue_unadjusted,
+        COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN o.discount ELSE 0 END), 0.00) AS discounts,
+        COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN COALESCE(o.costeach, p.landing_cost_price, p.cost_price) * o.quantity ELSE 0 END), 0.00) AS cogs,
+        COALESCE(SUM(CASE WHEN o.quantity > 0 AND COALESCE(o.status, 'pending') NOT IN ('canceled', 'returned') THEN p.regular_price * o.quantity ELSE 0 END), 0.00) AS gross_regular_revenue,
+
+        -- Aggregate Returns (Quantity < 0 or Status = Returned)
+        COALESCE(SUM(CASE WHEN o.quantity < 0 OR COALESCE(o.status, 'pending') = 'returned' THEN ABS(o.quantity) ELSE 0 END), 0) AS units_returned,
+        COALESCE(SUM(CASE WHEN o.quantity < 0 OR COALESCE(o.status, 'pending') = 'returned' THEN o.price * ABS(o.quantity) ELSE 0 END), 0.00) AS returns_revenue
+      FROM public.products p
+      LEFT JOIN public.orders o ON p.pid = o.pid
+      GROUP BY p.pid, p.sku, o.date::date
+      HAVING COUNT(o.id) > 0 -- Only include products with actual orders
+    ),
+    ReceivingData AS (
+      SELECT
+        r.pid,
+        r.received_date::date AS receiving_date,
+        -- Count receiving documents to ensure we only include products with real activity
+        COUNT(DISTINCT r.receiving_id) as receiving_count,
+        -- Calculate received quantity for this day
+        SUM(r.received_quantity) AS units_received,
+        -- Calculate received cost for this day
+        SUM(r.received_quantity * r.unit_cost) AS cost_received
+      FROM public.receivings r
+      GROUP BY r.pid, r.received_date::date
+      HAVING COUNT(DISTINCT r.receiving_id) > 0 OR SUM(r.received_quantity) > 0
+    ),
+    -- Get current stock quantities
+    StockData AS (
+      SELECT
+        p.pid,
+        p.stock_quantity,
+        COALESCE(p.landing_cost_price, p.cost_price, 0.00) as effective_cost_price,
+        COALESCE(p.price, 0.00) as current_price,
+        COALESCE(p.regular_price, 0.00) as current_regular_price
+      FROM public.products p
+    ),
+    -- Combine sales and receiving dates to get all activity dates
+    DatePidCombos AS (
+      SELECT DISTINCT pid, order_date AS activity_date FROM SalesData
+      UNION
+      SELECT DISTINCT pid, receiving_date FROM ReceivingData
+    ),
+    -- Insert daily snapshots for all product-date combinations
+    SnapshotInsert AS (
+      INSERT INTO public.daily_product_snapshots (
+        snapshot_date,
+        pid,
+        sku,
+        eod_stock_quantity,
+        eod_stock_cost,
+        eod_stock_retail,
+        eod_stock_gross,
+        stockout_flag,
+        units_sold,
+        units_returned,
+        gross_revenue,
+        discounts,
+        returns_revenue,
+        net_revenue,
+        cogs,
+        gross_regular_revenue,
+        profit,
+        units_received,
+        cost_received,
+        calculation_timestamp
+      )
+      SELECT
+        d.activity_date AS snapshot_date,
+        d.pid,
+        p.sku,
+        -- Use current stock as approximation, since historical stock data is not available
+        s.stock_quantity AS eod_stock_quantity,
+        s.stock_quantity * s.effective_cost_price AS eod_stock_cost,
+        s.stock_quantity * s.current_price AS eod_stock_retail,
+        s.stock_quantity * s.current_regular_price AS eod_stock_gross,
+        (s.stock_quantity <= 0) AS stockout_flag,
+        -- Sales metrics
+        COALESCE(sd.units_sold, 0),
+        COALESCE(sd.units_returned, 0),
+        COALESCE(sd.gross_revenue_unadjusted, 0.00),
+        COALESCE(sd.discounts, 0.00),
+        COALESCE(sd.returns_revenue, 0.00),
+        COALESCE(sd.gross_revenue_unadjusted, 0.00) - COALESCE(sd.discounts, 0.00) AS net_revenue,
+        COALESCE(sd.cogs, 0.00),
+        COALESCE(sd.gross_regular_revenue, 0.00),
+        (COALESCE(sd.gross_revenue_unadjusted, 0.00) - COALESCE(sd.discounts, 0.00)) - COALESCE(sd.cogs, 0.00) AS profit,
+        -- Receiving metrics
+        COALESCE(rd.units_received, 0),
+        COALESCE(rd.cost_received, 0.00),
+        now() -- calculation timestamp
+      FROM DatePidCombos d
+      JOIN public.products p ON d.pid = p.pid
+      LEFT JOIN SalesData sd ON d.pid = sd.pid AND d.activity_date = sd.order_date
+      LEFT JOIN ReceivingData rd ON d.pid = rd.pid AND d.activity_date = rd.receiving_date
+      LEFT JOIN StockData s ON d.pid = s.pid
+      RETURNING pid, snapshot_date
+    ),
+    -- Now build the aggregated product metrics from the daily snapshots
+    MetricsInsert AS (
+      INSERT INTO public.product_metrics (
+        pid,
+        sku,
+        current_stock_quantity,
+        current_stock_cost,
+        current_stock_retail,
+        current_stock_msrp,
+        is_out_of_stock,
+        total_units_sold,
+        total_units_returned,
+        return_rate,
+        gross_revenue,
+        total_discounts,
+        total_returns,
+        net_revenue,
+        total_cogs,
+        total_gross_revenue,
+        total_profit,
+        profit_margin,
+        avg_daily_units,
+        reorder_point,
+        reorder_alert,
+        days_of_supply,
+        sales_velocity,
+        sales_velocity_score,
+        rank_by_revenue,
+        rank_by_quantity,
+        rank_by_profit,
+        total_received_quantity,
+        total_received_cost,
+        last_sold_date,
+        last_received_date,
+        days_since_last_sale,
+        days_since_last_received,
+        calculation_timestamp
+      )
+      SELECT
+        p.pid,
+        p.sku,
+        p.stock_quantity AS current_stock_quantity,
+        p.stock_quantity * COALESCE(p.landing_cost_price, p.cost_price, 0) AS current_stock_cost,
+        p.stock_quantity * COALESCE(p.price, 0) AS current_stock_retail,
+        p.stock_quantity * COALESCE(p.regular_price, 0) AS current_stock_msrp,
+        (p.stock_quantity <= 0) AS is_out_of_stock,
+        -- Aggregate metrics
+        COALESCE(SUM(ds.units_sold), 0) AS total_units_sold,
+        COALESCE(SUM(ds.units_returned), 0) AS total_units_returned,
+        CASE
+          WHEN COALESCE(SUM(ds.units_sold), 0) > 0
+          THEN COALESCE(SUM(ds.units_returned), 0)::float / NULLIF(COALESCE(SUM(ds.units_sold), 0), 0)
+          ELSE 0
+        END AS return_rate,
+        COALESCE(SUM(ds.gross_revenue), 0) AS gross_revenue,
+        COALESCE(SUM(ds.discounts), 0) AS total_discounts,
+        COALESCE(SUM(ds.returns_revenue), 0) AS total_returns,
+        COALESCE(SUM(ds.net_revenue), 0) AS net_revenue,
+        COALESCE(SUM(ds.cogs), 0) AS total_cogs,
+        COALESCE(SUM(ds.gross_regular_revenue), 0) AS total_gross_revenue,
+        COALESCE(SUM(ds.profit), 0) AS total_profit,
+        CASE
+          WHEN COALESCE(SUM(ds.net_revenue), 0) > 0
+          THEN COALESCE(SUM(ds.profit), 0) / NULLIF(COALESCE(SUM(ds.net_revenue), 0), 0)
+          ELSE 0
+        END AS profit_margin,
+        -- Calculate average daily units
+        COALESCE(AVG(ds.units_sold), 0) AS avg_daily_units,
+        -- Calculate reorder point (simplified, can be enhanced with lead time and safety stock)
+        CEILING(COALESCE(AVG(ds.units_sold) * 14, 0)) AS reorder_point,
+        (p.stock_quantity <= CEILING(COALESCE(AVG(ds.units_sold) * 14, 0))) AS reorder_alert,
+        -- Days of supply based on average daily sales
+        CASE
+          WHEN COALESCE(AVG(ds.units_sold), 0) > 0
+          THEN p.stock_quantity / NULLIF(COALESCE(AVG(ds.units_sold), 0), 0)
+          ELSE NULL
+        END AS days_of_supply,
+        -- Sales velocity (average units sold per day over last 30 days)
+        (SELECT COALESCE(AVG(recent.units_sold), 0)
+         FROM public.daily_product_snapshots recent
+         WHERE recent.pid = p.pid
+           AND recent.snapshot_date >= CURRENT_DATE - INTERVAL '30 days'
+        ) AS sales_velocity,
+        -- Placeholder for sales velocity score (can be calculated based on velocity)
+        0 AS sales_velocity_score,
+        -- Will be updated later by ranking procedure
+        0 AS rank_by_revenue,
+        0 AS rank_by_quantity,
+        0 AS rank_by_profit,
+        -- Receiving data
+        COALESCE(SUM(ds.units_received), 0) AS total_received_quantity,
+        COALESCE(SUM(ds.cost_received), 0) AS total_received_cost,
+        -- Date metrics
+        (SELECT MAX(sd.snapshot_date)
+         FROM public.daily_product_snapshots sd
+         WHERE sd.pid = p.pid AND sd.units_sold > 0
+        ) AS last_sold_date,
+        (SELECT MAX(rd.snapshot_date)
+         FROM public.daily_product_snapshots rd
+         WHERE rd.pid = p.pid AND rd.units_received > 0
+        ) AS last_received_date,
+        -- Calculate days since last sale/received
+        CASE
+          WHEN (SELECT MAX(sd.snapshot_date)
+                FROM public.daily_product_snapshots sd
+                WHERE sd.pid = p.pid AND sd.units_sold > 0) IS NOT NULL
+          THEN (CURRENT_DATE - (SELECT MAX(sd.snapshot_date)
+                                FROM public.daily_product_snapshots sd
+                                WHERE sd.pid = p.pid AND sd.units_sold > 0))::integer
+          ELSE NULL
+        END AS days_since_last_sale,
+        CASE
+          WHEN (SELECT MAX(rd.snapshot_date)
+                FROM public.daily_product_snapshots rd
+                WHERE rd.pid = p.pid AND rd.units_received > 0) IS NOT NULL
+          THEN (CURRENT_DATE - (SELECT MAX(rd.snapshot_date)
+                                FROM public.daily_product_snapshots rd
+                                WHERE rd.pid = p.pid AND rd.units_received > 0))::integer
+          ELSE NULL
+        END AS days_since_last_received,
+        now() -- calculation timestamp
+      FROM public.products p
+      LEFT JOIN public.daily_product_snapshots ds ON p.pid = ds.pid
+      GROUP BY p.pid, p.sku, p.stock_quantity, p.landing_cost_price, p.cost_price, p.price, p.regular_price
+    )
+
+    -- Update the calculate_status table
+    INSERT INTO public.calculate_status (module_name, last_calculation_timestamp)
+    VALUES
+      ('daily_snapshots', now()),
+      ('product_metrics', now())
+    ON CONFLICT (module_name) DO UPDATE
+      SET last_calculation_timestamp = now();
+
+    -- Finally, update the ranks for products
+    UPDATE public.product_metrics pm SET
+      rank_by_revenue = rev_ranks.rank
+    FROM (
+      SELECT pid, RANK() OVER (ORDER BY net_revenue DESC) AS rank
+      FROM public.product_metrics
+      WHERE net_revenue > 0
+    ) rev_ranks
+    WHERE pm.pid = rev_ranks.pid;
+
+    UPDATE public.product_metrics pm SET
+      rank_by_quantity = qty_ranks.rank
+    FROM (
+      SELECT pid, RANK() OVER (ORDER BY total_units_sold DESC) AS rank
+      FROM public.product_metrics
+      WHERE total_units_sold > 0
+    ) qty_ranks
+    WHERE pm.pid = qty_ranks.pid;
+
+    UPDATE public.product_metrics pm SET
+      rank_by_profit = profit_ranks.rank
+    FROM (
+      SELECT pid, RANK() OVER (ORDER BY total_profit DESC) AS rank
+      FROM public.product_metrics
+      WHERE total_profit > 0
+    ) profit_ranks
+    WHERE pm.pid = profit_ranks.pid;
+
+    -- Return count of products with metrics
+    SELECT COUNT(*) AS product_count FROM public.product_metrics
+  `);
+};
+
 async function populateInitialMetrics() {
   let connection;
   const startTime = Date.now();
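The `return_rate` and `profit_margin` expressions above guard against division by zero twice: a `CASE WHEN denom > 0` check plus `NULLIF(denom, 0)`. The same guard, sketched in plain JavaScript with hand-made numbers:

```javascript
// Mirrors the SQL pattern:
//   CASE WHEN denom > 0 THEN num / NULLIF(denom, 0) ELSE 0 END
// Returns 0 instead of NaN/Infinity when the denominator is not positive.
function safeRatio(numerator, denominator) {
  return denominator > 0 ? numerator / denominator : 0;
}

const returnRate = safeRatio(5, 100);      // 5 units returned / 100 units sold
const profitMargin = safeRatio(250, 1000); // total profit / net revenue
console.log(returnRate, profitMargin);     // 0.05 0.25
console.log(safeRatio(7, 0));              // 0 (no sales, no blow-up)
```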
@@ -2,7 +2,7 @@
 -- historically backfilled daily_product_snapshots and current product/PO data.
 -- Calculates all metrics considering the full available history up to 'yesterday'.
 -- Run ONCE after backfill_historical_snapshots_final.sql completes successfully.
--- Dependencies: Core import tables (products, purchase_orders), daily_product_snapshots (historically populated),
+-- Dependencies: Core import tables (products, purchase_orders, receivings), daily_product_snapshots (historically populated),
 -- configuration tables (settings_*), product_metrics table must exist.
 -- Frequency: Run ONCE.
 DO $$
@@ -39,35 +39,26 @@ BEGIN
         -- Calculates current on-order quantities and costs
         SELECT
             pid,
-            COALESCE(SUM(ordered - received), 0) AS on_order_qty,
-            COALESCE(SUM((ordered - received) * cost_price), 0.00) AS on_order_cost,
+            SUM(ordered) AS on_order_qty,
+            SUM(ordered * po_cost_price) AS on_order_cost,
             MIN(expected_date) AS earliest_expected_date
         FROM public.purchase_orders
         -- Use the most common statuses representing active, unfulfilled POs
-        WHERE status IN ('open', 'partially_received', 'ordered', 'preordered', 'receiving_started', 'electronically_sent', 'electronically_ready_send')
-          AND (ordered - received) > 0
+        WHERE status IN ('created', 'ordered', 'preordered', 'electronically_sent', 'electronically_ready_send', 'receiving_started')
+          AND status NOT IN ('canceled', 'done')
         GROUP BY pid
     ),
     HistoricalDates AS (
-        -- Determines key historical dates from orders and PO history (receiving_history)
+        -- Determines key historical dates from orders and receivings
         SELECT
             p.pid,
             MIN(o.date)::date AS date_first_sold,
             MAX(o.date)::date AS max_order_date, -- Used as fallback for date_last_sold
-            MIN(rh.first_receipt_date) AS date_first_received_calc,
-            MAX(rh.last_receipt_date) AS date_last_received_calc
+            MIN(r.received_date)::date AS date_first_received_calc,
+            MAX(r.received_date)::date AS date_last_received_calc
         FROM public.products p
         LEFT JOIN public.orders o ON p.pid = o.pid AND o.quantity > 0 AND o.status NOT IN ('canceled', 'returned')
-        LEFT JOIN (
-            SELECT
-                po.pid,
-                MIN((rh.item->>'received_at')::date) as first_receipt_date,
-                MAX((rh.item->>'received_at')::date) as last_receipt_date
-            FROM public.purchase_orders po
-            CROSS JOIN LATERAL jsonb_array_elements(po.receiving_history) AS rh(item)
-            WHERE jsonb_typeof(po.receiving_history) = 'array' AND jsonb_array_length(po.receiving_history) > 0
-            GROUP BY po.pid
-        ) rh ON p.pid = rh.pid
+        LEFT JOIN public.receivings r ON p.pid = r.pid
         GROUP BY p.pid
     ),
     SnapshotAggregates AS (
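The rewritten OnOrder CTE keeps only active, unfulfilled POs (by status) and sums the full ordered quantity and extended cost per product. A small JavaScript sketch of the same grouping, with hand-made PO rows whose field names follow the new query:

```javascript
// Statuses the new query treats as active/unfulfilled.
const ACTIVE_STATUSES = new Set([
  'created', 'ordered', 'preordered',
  'electronically_sent', 'electronically_ready_send', 'receiving_started',
]);

// Groups active POs by pid: total on-order qty, extended cost,
// and earliest expected date (MIN(expected_date) in the SQL).
function onOrderByPid(purchaseOrders) {
  const byPid = new Map();
  for (const po of purchaseOrders) {
    if (!ACTIVE_STATUSES.has(po.status)) continue; // skips 'canceled' / 'done'
    const agg = byPid.get(po.pid) || { qty: 0, cost: 0, earliest: null };
    agg.qty += po.ordered;
    agg.cost += po.ordered * po.po_cost_price;
    if (agg.earliest === null || po.expected_date < agg.earliest) {
      agg.earliest = po.expected_date;
    }
    byPid.set(po.pid, agg);
  }
  return byPid;
}

const pos = [
  { pid: 1, status: 'ordered', ordered: 10, po_cost_price: 2.5, expected_date: '2024-05-01' },
  { pid: 1, status: 'done',    ordered: 4,  po_cost_price: 2.5, expected_date: '2024-01-01' },
  { pid: 2, status: 'created', ordered: 3,  po_cost_price: 1.0, expected_date: '2024-06-15' },
];
console.log(onOrderByPid(pos).get(1)); // { qty: 10, cost: 25, earliest: '2024-05-01' }
```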
@@ -165,22 +156,23 @@ BEGIN
     LEFT JOIN public.settings_vendor sv ON p.vendor = sv.vendor
     ),
     AvgLeadTime AS (
-        -- Calculate Average Lead Time from historical POs
+        -- Calculate Average Lead Time by joining purchase_orders with receivings
         SELECT
-            pid,
+            po.pid,
             AVG(GREATEST(1,
                 CASE
-                    WHEN last_received_date IS NOT NULL AND date IS NOT NULL
-                    THEN (last_received_date::date - date::date)
+                    WHEN r.received_date IS NOT NULL AND po.date IS NOT NULL
+                    THEN (r.received_date::date - po.date::date)
                     ELSE 1
                 END
             ))::int AS avg_lead_time_days_calc
-        FROM public.purchase_orders
-        WHERE status = 'received' -- Assumes 'received' marks full receipt
-          AND last_received_date IS NOT NULL
-          AND date IS NOT NULL
-          AND last_received_date >= date
-        GROUP BY pid
+        FROM public.purchase_orders po
+        JOIN public.receivings r ON r.pid = po.pid
+        WHERE po.status = 'done' -- Completed POs
+          AND r.received_date IS NOT NULL
+          AND po.date IS NOT NULL
+          AND r.received_date >= po.date
+        GROUP BY po.pid
     ),
     RankedForABC AS (
         -- Ranks products based on the configured ABC metric (using historical data)
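The AvgLeadTime CTE averages the days between the PO date and the receiving date, clamped to at least 1 day via `GREATEST(1, ...)`. The same calculation sketched in JavaScript, with hand-made date pairs:

```javascript
// Averages lead time in days over (PO date, received date) pairs,
// clamping each pair to a minimum of 1 day like AVG(GREATEST(1, ...)).
// Returns null when there are no completed PO/receipt pairs.
function avgLeadTimeDays(pairs) {
  if (pairs.length === 0) return null;
  const days = pairs.map(({ poDate, receivedDate }) => {
    const diff = (new Date(receivedDate) - new Date(poDate)) / 86400000;
    return Math.max(1, diff); // same-day or out-of-order receipts count as 1
  });
  return Math.round(days.reduce((a, b) => a + b, 0) / days.length);
}

console.log(avgLeadTimeDays([
  { poDate: '2024-01-01', receivedDate: '2024-01-08' }, // 7 days
  { poDate: '2024-02-01', receivedDate: '2024-02-01' }, // clamped to 1
])); // 4
```

Note that the SQL joins receivings to POs by `pid` only, so a product received on several POs averages across all pairings; a `source_po_id`-style link would tighten this.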
@@ -198,7 +190,7 @@ BEGIN
             WHEN 'sales_30d' THEN COALESCE(sa.sales_30d, 0)
             WHEN 'lifetime_revenue' THEN COALESCE(sa.lifetime_revenue, 0)::numeric
             ELSE COALESCE(sa.revenue_30d, 0)
-        END) > 0 -- Exclude zero-value products from ranking
+        END) > 0 -- Only include products with non-zero contribution
     ),
     CumulativeABC AS (
         -- Calculates cumulative metric values for ABC ranking
@@ -1,6 +1,6 @@
 -- Description: Rebuilds daily product snapshots from scratch using real orders data.
 -- Fixes issues with duplicated/inflated metrics.
--- Dependencies: Core import tables (products, orders, purchase_orders).
+-- Dependencies: Core import tables (products, orders, receivings).
 -- Frequency: One-time run to clear out problematic data.
 
 DO $$
@@ -51,65 +51,17 @@ BEGIN
         ),
         ReceivingData AS (
             SELECT
-                po.pid,
-                -- Count POs to ensure we only include products with real activity
-                COUNT(po.po_id) as po_count,
+                r.pid,
+                -- Count receiving documents to ensure we only include products with real activity
+                COUNT(DISTINCT r.receiving_id) as receiving_count,
                 -- Calculate received quantity for this day
-                COALESCE(
-                    -- First try the received field from purchase_orders table (if received on this date)
-                    SUM(CASE WHEN po.date::date = _date THEN po.received ELSE 0 END),
-                    -- Otherwise try receiving_history JSON
-                    SUM(
-                        CASE
-                            WHEN (rh.item->>'date')::date = _date THEN (rh.item->>'qty')::numeric
-                            WHEN (rh.item->>'received_at')::date = _date THEN (rh.item->>'qty')::numeric
-                            WHEN (rh.item->>'receipt_date')::date = _date THEN (rh.item->>'qty')::numeric
-                            ELSE 0
-                        END
-                    ),
-                    0
-                ) AS units_received,
-
-                COALESCE(
-                    -- First try the actual cost_price from purchase_orders
-                    SUM(CASE WHEN po.date::date = _date THEN po.received * po.cost_price ELSE 0 END),
-                    -- Otherwise try receiving_history JSON
-                    SUM(
-                        CASE
-                            WHEN (rh.item->>'date')::date = _date THEN (rh.item->>'qty')::numeric
-                            WHEN (rh.item->>'received_at')::date = _date THEN (rh.item->>'qty')::numeric
-                            WHEN (rh.item->>'receipt_date')::date = _date THEN (rh.item->>'qty')::numeric
-                            ELSE 0
-                        END
-                        * COALESCE((rh.item->>'cost')::numeric, po.cost_price)
-                    ),
-                    0.00
-                ) AS cost_received
-            FROM public.purchase_orders po
-            LEFT JOIN LATERAL jsonb_array_elements(po.receiving_history) AS rh(item) ON
-                jsonb_typeof(po.receiving_history) = 'array' AND
-                jsonb_array_length(po.receiving_history) > 0 AND
-                (
-                    (rh.item->>'date')::date = _date OR
-                    (rh.item->>'received_at')::date = _date OR
-                    (rh.item->>'receipt_date')::date = _date
-                )
-            -- Include POs with the current date or relevant receiving_history
-            WHERE
-                po.date::date = _date OR
-                jsonb_typeof(po.receiving_history) = 'array' AND
-                jsonb_array_length(po.receiving_history) > 0
-            GROUP BY po.pid
-            HAVING COUNT(po.po_id) > 0 OR SUM(
-                CASE
-                    WHEN (rh.item->>'date')::date = _date THEN (rh.item->>'qty')::numeric
-                    WHEN (rh.item->>'received_at')::date = _date THEN (rh.item->>'qty')::numeric
-                    WHEN (rh.item->>'receipt_date')::date = _date THEN (rh.item->>'qty')::numeric
-                    ELSE 0
-                END
-            ) > 0
+                SUM(r.qty_each) AS units_received,
+                -- Calculate received cost for this day
+                SUM(r.qty_each * r.cost_each) AS cost_received
+            FROM public.receivings r
+            WHERE r.received_date::date = _date
+            GROUP BY r.pid
+            HAVING COUNT(DISTINCT r.receiving_id) > 0 OR SUM(r.qty_each) > 0
         ),
         -- Get stock quantities for the day - note this is approximate since we're using current products data
         StockData AS (
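The rewritten ReceivingData CTE replaces the old JSON fallbacks with a single pass over the `receivings` table: for one target date, group lines by product and sum quantity and extended cost. A JavaScript sketch of the same aggregation over hand-made rows (field names follow the new query):

```javascript
// Per-product receiving totals for a single date, mirroring:
//   SUM(r.qty_each) AS units_received,
//   SUM(r.qty_each * r.cost_each) AS cost_received
//   ... WHERE r.received_date::date = _date GROUP BY r.pid
function receivingTotalsForDate(receivings, targetDate) {
  const byPid = new Map();
  for (const r of receivings) {
    if (r.received_date !== targetDate) continue; // date filter
    const agg = byPid.get(r.pid) || { unitsReceived: 0, costReceived: 0 };
    agg.unitsReceived += r.qty_each;
    agg.costReceived += r.qty_each * r.cost_each;
    byPid.set(r.pid, agg);
  }
  return byPid;
}

const rows = [
  { pid: 7, qty_each: 5, cost_each: 2.0, received_date: '2024-03-10' },
  { pid: 7, qty_each: 3, cost_each: 2.5, received_date: '2024-03-10' },
  { pid: 9, qty_each: 1, cost_each: 4.0, received_date: '2024-03-11' },
];
console.log(receivingTotalsForDate(rows, '2024-03-10').get(7));
// { unitsReceived: 8, costReceived: 17.5 }
```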
@@ -170,7 +122,7 @@ BEGIN
         FROM SalesData sd
         FULL OUTER JOIN ReceivingData rd ON sd.pid = rd.pid
         LEFT JOIN StockData s ON COALESCE(sd.pid, rd.pid) = s.pid
-        WHERE (COALESCE(sd.order_count, 0) > 0 OR COALESCE(rd.po_count, 0) > 0);
+        WHERE (COALESCE(sd.order_count, 0) > 0 OR COALESCE(rd.receiving_count, 0) > 0);

         -- Get record count for this day
         GET DIAGNOSTICS _count = ROW_COUNT;
@@ -45,19 +45,26 @@ BEGIN
         GROUP BY p.vendor
     ),
     VendorPOAggregates AS (
-        -- Aggregate PO related stats
+        -- Aggregate PO related stats including lead time calculated from POs to receivings
         SELECT
-            vendor,
-            COUNT(DISTINCT po_id) AS po_count_365d,
-            AVG(GREATEST(1, CASE WHEN last_received_date IS NOT NULL AND date IS NOT NULL THEN (last_received_date::date - date::date) ELSE NULL END))::int AS avg_lead_time_days_hist -- Avg lead time from HISTORICAL received POs
-        FROM public.purchase_orders
-        WHERE vendor IS NOT NULL AND vendor <> ''
-          AND date >= CURRENT_DATE - INTERVAL '1 year' -- Look at POs created in the last year
-          AND status = 'received' -- Only calculate lead time on fully received POs
-          AND last_received_date IS NOT NULL
-          AND date IS NOT NULL
-          AND last_received_date >= date
-        GROUP BY vendor
+            po.vendor,
+            COUNT(DISTINCT po.po_id) AS po_count_365d,
+            -- Calculate lead time by averaging the days between PO date and receiving date
+            AVG(GREATEST(1, CASE
+                WHEN r.received_date IS NOT NULL AND po.date IS NOT NULL
+                THEN (r.received_date::date - po.date::date)
+                ELSE NULL
+            END))::int AS avg_lead_time_days_hist -- Avg lead time from HISTORICAL received POs
+        FROM public.purchase_orders po
+        -- Join to receivings table to find when items were received
+        LEFT JOIN public.receivings r ON r.pid = po.pid
+        WHERE po.vendor IS NOT NULL AND po.vendor <> ''
+          AND po.date >= CURRENT_DATE - INTERVAL '1 year' -- Look at POs created in the last year
+          AND po.status = 'done' -- Only calculate lead time on completed POs
+          AND r.received_date IS NOT NULL
+          AND po.date IS NOT NULL
+          AND r.received_date >= po.date
+        GROUP BY po.vendor
     ),
     AllVendors AS (
         -- Ensure all vendors from products table are included
@@ -101,66 +101,20 @@ BEGIN
         ),
         ReceivingData AS (
             SELECT
-                po.pid,
-                -- Track number of POs to ensure we have real data
-                COUNT(po.po_id) as po_count,
-                -- Prioritize the actual table fields over the JSON data
-                COALESCE(
-                    -- First try the received field from purchase_orders table
-                    SUM(CASE WHEN po.date::date = _target_date THEN po.received ELSE 0 END),
-                    -- Otherwise fall back to the receiving_history JSON as secondary source
-                    SUM(
-                        CASE
-                            WHEN (rh.item->>'date')::date = _target_date THEN (rh.item->>'qty')::numeric
-                            WHEN (rh.item->>'received_at')::date = _target_date THEN (rh.item->>'qty')::numeric
-                            WHEN (rh.item->>'receipt_date')::date = _target_date THEN (rh.item->>'qty')::numeric
-                            ELSE 0
-                        END
-                    ),
-                    0
-                ) AS units_received,
-
-                COALESCE(
-                    -- First try the actual cost_price from purchase_orders
-                    SUM(CASE WHEN po.date::date = _target_date THEN po.received * po.cost_price ELSE 0 END),
-                    -- Otherwise fall back to receiving_history JSON
-                    SUM(
-                        CASE
-                            WHEN (rh.item->>'date')::date = _target_date THEN (rh.item->>'qty')::numeric
-                            WHEN (rh.item->>'received_at')::date = _target_date THEN (rh.item->>'qty')::numeric
-                            WHEN (rh.item->>'receipt_date')::date = _target_date THEN (rh.item->>'qty')::numeric
-                            ELSE 0
-                        END
-                        * COALESCE((rh.item->>'cost')::numeric, po.cost_price)
-                    ),
-                    0.00
-                ) AS cost_received
-            FROM public.purchase_orders po
-            LEFT JOIN LATERAL jsonb_array_elements(po.receiving_history) AS rh(item) ON
-                jsonb_typeof(po.receiving_history) = 'array' AND
-                jsonb_array_length(po.receiving_history) > 0 AND
-                (
-                    (rh.item->>'date')::date = _target_date OR
-                    (rh.item->>'received_at')::date = _target_date OR
-                    (rh.item->>'receipt_date')::date = _target_date
-                )
-            -- Include POs with the current date or relevant receiving_history
-            WHERE
-                po.date::date = _target_date OR
-                jsonb_typeof(po.receiving_history) = 'array' AND
-                jsonb_array_length(po.receiving_history) > 0
-            GROUP BY po.pid
-            -- CRITICAL: Only include products with actual receiving activity
-            HAVING COUNT(po.po_id) > 0 OR SUM(
-                CASE
-                    WHEN (rh.item->>'date')::date = _target_date THEN (rh.item->>'qty')::numeric
-                    WHEN (rh.item->>'received_at')::date = _target_date THEN (rh.item->>'qty')::numeric
-                    WHEN (rh.item->>'receipt_date')::date = _target_date THEN (rh.item->>'qty')::numeric
-                    ELSE 0
-                END
-            ) > 0
+                r.pid,
+                -- Track number of receiving docs to ensure we have real data
+                COUNT(DISTINCT r.receiving_id) as receiving_doc_count,
+                -- Sum the quantities received on this date
+                SUM(r.qty_each) AS units_received,
+                -- Calculate the cost received (qty * cost)
+                SUM(r.qty_each * r.cost_each) AS cost_received
+            FROM public.receivings r
+            WHERE r.received_date::date = _target_date
+            -- Optional: Filter out canceled receivings if needed
+            -- AND r.status <> 'canceled'
+            GROUP BY r.pid
+            -- Only include products with actual receiving activity
+            HAVING COUNT(DISTINCT r.receiving_id) > 0 OR SUM(r.qty_each) > 0
         ),
         CurrentStock AS (
             -- Select current stock values directly from products table
@@ -24,14 +24,17 @@ BEGIN
     RAISE NOTICE 'Calculating Average Lead Time...';
     WITH LeadTimes AS (
         SELECT
-            pid,
-            AVG(GREATEST(1, (last_received_date::date - date::date))) AS avg_days -- Use GREATEST(1,...) to avoid 0 or negative days
-        FROM public.purchase_orders
-        WHERE status = 'received' -- Or potentially 'full_received' if using that status
-        AND last_received_date IS NOT NULL
-        AND date IS NOT NULL
-        AND last_received_date >= date -- Ensure received date is not before order date
-        GROUP BY pid
+            po.pid,
+            -- Calculate lead time by looking at when items ordered on POs were received
+            AVG(GREATEST(1, (r.received_date::date - po.date::date))) AS avg_days -- Use GREATEST(1,...) to avoid 0 or negative days
+        FROM public.purchase_orders po
+        -- Join to receivings table to find actual receipts
+        JOIN public.receivings r ON r.pid = po.pid
+        WHERE po.status = 'done' -- Only include completed POs
+        AND r.received_date >= po.date -- Ensure received date is not before order date
+        -- Optional: add check to make sure receiving is related to PO if you have source_po_id
+        -- AND (r.source_po_id = po.po_id OR r.source_po_id IS NULL)
+        GROUP BY po.pid
     )
     UPDATE public.product_metrics pm
     SET avg_lead_time_days = lt.avg_days::int
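The lead-time rule in the hunk above (clamp each PO's received-minus-ordered interval to at least one day, then average per product) can be sketched in plain JavaScript. This is an illustrative sketch only; `avgLeadTimeDays` and its field names are hypothetical, not from the repo:

```javascript
const MS_PER_DAY = 24 * 60 * 60 * 1000;

// receipts: [{ pid, orderedAt, receivedAt }] as epoch-ms, with receivedAt >= orderedAt.
// Mirrors AVG(GREATEST(1, received_date - order_date)) grouped by pid.
function avgLeadTimeDays(receipts) {
  const byPid = new Map();
  for (const r of receipts) {
    // Clamp to >= 1 so same-day receipts never produce 0 or negative lead times
    const days = Math.max(1, Math.round((r.receivedAt - r.orderedAt) / MS_PER_DAY));
    const acc = byPid.get(r.pid) || { sum: 0, n: 0 };
    acc.sum += days;
    acc.n += 1;
    byPid.set(r.pid, acc);
  }
  return new Map([...byPid].map(([pid, { sum, n }]) => [pid, sum / n]));
}
```

The clamp is the important detail: without it, same-day receipts drag the average toward zero and break reorder-point math downstream.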
@@ -64,12 +64,12 @@ BEGIN
     OnOrderInfo AS (
         SELECT
             pid,
-            COALESCE(SUM(ordered - received), 0) AS on_order_qty,
-            COALESCE(SUM((ordered - received) * cost_price), 0.00) AS on_order_cost,
+            SUM(ordered) AS on_order_qty,
+            SUM(ordered * po_cost_price) AS on_order_cost,
             MIN(expected_date) AS earliest_expected_date
         FROM public.purchase_orders
-        WHERE status IN ('open', 'partially_received', 'ordered', 'preordered', 'receiving_started', 'electronically_sent', 'electronically_ready_send') -- Adjust based on your status workflow representing active POs not fully received
-        AND (ordered - received) > 0
+        WHERE status IN ('created', 'ordered', 'preordered', 'electronically_sent', 'electronically_ready_send', 'receiving_started')
+        AND status NOT IN ('canceled', 'done')
         GROUP BY pid
     ),
     HistoricalDates AS (
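The new `OnOrderInfo` CTE aggregates per product: total ordered units, ordered quantity times PO cost, and the earliest expected date, while skipping inactive POs. A minimal JavaScript sketch of the same aggregation (function and field names are illustrative, not from the repo):

```javascript
// pos: [{ pid, status, ordered, costPrice, expectedDate }] where expectedDate is an
// ISO 'YYYY-MM-DD' string so lexicographic comparison matches date order.
function onOrderInfo(pos) {
  const byPid = new Map();
  for (const po of pos) {
    if (['canceled', 'done'].includes(po.status)) continue; // mirror the status filter
    const acc = byPid.get(po.pid) || { qty: 0, cost: 0, earliest: null };
    acc.qty += po.ordered;                    // SUM(ordered)
    acc.cost += po.ordered * po.costPrice;    // SUM(ordered * po_cost_price)
    if (acc.earliest === null || po.expectedDate < acc.earliest) {
      acc.earliest = po.expectedDate;         // MIN(expected_date)
    }
    byPid.set(po.pid, acc);
  }
  return byPid;
}
```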
@@ -80,45 +80,14 @@ BEGIN
             MIN(o.date)::date AS date_first_sold,
             MAX(o.date)::date AS max_order_date, -- Use MAX for potential recalc of date_last_sold

-            -- For first received date, try table data first then fall back to JSON
-            COALESCE(
-                MIN(po.date)::date, -- Try purchase_order date first
-                MIN(rh.first_receipt_date) -- Fall back to JSON data if needed
-            ) AS date_first_received_calc,
-
-            -- If we only have one receipt date (first = last), use that for last_received too
-            COALESCE(
-                MAX(po.date)::date, -- Try purchase_order date first
-                NULLIF(MAX(rh.last_receipt_date), NULL),
-                MIN(rh.first_receipt_date)
-            ) AS date_last_received_calc
+            -- For first received, use the new receivings table
+            MIN(r.received_date)::date AS date_first_received_calc,
+
+            -- For last received, use the new receivings table
+            MAX(r.received_date)::date AS date_last_received_calc
         FROM public.products p
         LEFT JOIN public.orders o ON p.pid = o.pid AND o.quantity > 0 AND o.status NOT IN ('canceled', 'returned')
-        LEFT JOIN public.purchase_orders po ON p.pid = po.pid AND po.received > 0
-        LEFT JOIN (
-            SELECT
-                po.pid,
-                MIN(
-                    CASE
-                        WHEN rh.item->>'date' IS NOT NULL THEN (rh.item->>'date')::date
-                        WHEN rh.item->>'received_at' IS NOT NULL THEN (rh.item->>'received_at')::date
-                        WHEN rh.item->>'receipt_date' IS NOT NULL THEN (rh.item->>'receipt_date')::date
-                        ELSE NULL
-                    END
-                ) as first_receipt_date,
-                MAX(
-                    CASE
-                        WHEN rh.item->>'date' IS NOT NULL THEN (rh.item->>'date')::date
-                        WHEN rh.item->>'received_at' IS NOT NULL THEN (rh.item->>'received_at')::date
-                        WHEN rh.item->>'receipt_date' IS NOT NULL THEN (rh.item->>'receipt_date')::date
-                        ELSE NULL
-                    END
-                ) as last_receipt_date
-            FROM public.purchase_orders po
-            CROSS JOIN LATERAL jsonb_array_elements(po.receiving_history) AS rh(item)
-            WHERE jsonb_typeof(po.receiving_history) = 'array' AND jsonb_array_length(po.receiving_history) > 0
-            GROUP BY po.pid
-        ) rh ON p.pid = rh.pid
+        LEFT JOIN public.receivings r ON p.pid = r.pid
         GROUP BY p.pid
     ),
     SnapshotAggregates AS (
@@ -4,7 +4,7 @@ const cors = require('cors');
 const corsMiddleware = cors({
     origin: [
         'https://inventory.kent.pw',
-        'http://localhost:5173',
+        'http://localhost:5175',
         /^http:\/\/192\.168\.\d+\.\d+(:\d+)?$/,
         /^http:\/\/10\.\d+\.\d+\.\d+(:\d+)?$/
     ],
@@ -26,7 +26,7 @@ const corsErrorHandler = (err, req, res, next) => {
         res.status(403).json({
             error: 'CORS not allowed',
             origin: req.get('Origin'),
-            message: 'Origin not in allowed list: https://inventory.kent.pw, localhost:5173, 192.168.x.x, or 10.x.x.x'
+            message: 'Origin not in allowed list: https://inventory.kent.pw, localhost:5175, 192.168.x.x, or 10.x.x.x'
         });
     } else {
         next(err);
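The `origin` array above mixes literal strings with RegExps for LAN address ranges, which the `cors` package matches against the request's Origin header. A standalone sketch of that matching rule (the helper name is illustrative, not from the repo):

```javascript
// Returns true if origin matches any entry: RegExp entries are tested,
// string entries must match exactly.
function originAllowed(origin, allowed) {
  return allowed.some(entry =>
    entry instanceof RegExp ? entry.test(origin) : entry === origin
  );
}
```

This is why the port bump to 5175 had to be made in two places: the string entry must match exactly, while the 192.168.x.x and 10.x.x.x regexes already accept any port.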
@@ -108,47 +108,52 @@ router.get('/purchase/metrics', async (req, res) => {
     `);

     const { rows: [poMetrics] } = await executeQuery(`
+      WITH po_metrics AS (
         SELECT
-          COALESCE(COUNT(DISTINCT CASE
-            WHEN po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')
-            THEN po.po_id
-          END), 0)::integer as active_pos,
-          COALESCE(COUNT(DISTINCT CASE
-            WHEN po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')
-            AND po.expected_date < CURRENT_DATE
-            THEN po.po_id
-          END), 0)::integer as overdue_pos,
-          COALESCE(SUM(CASE
-            WHEN po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')
-            THEN po.ordered
-            ELSE 0
-          END), 0)::integer as total_units,
-          ROUND(COALESCE(SUM(CASE
-            WHEN po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')
-            THEN po.ordered * po.cost_price
-            ELSE 0
-          END), 0)::numeric, 3) as total_cost,
-          ROUND(COALESCE(SUM(CASE
-            WHEN po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')
-            THEN po.ordered * pm.current_price
-            ELSE 0
-          END), 0)::numeric, 3) as total_retail
+          po_id,
+          status,
+          date,
+          expected_date,
+          pid,
+          ordered,
+          po_cost_price
         FROM purchase_orders po
+        WHERE po.status NOT IN ('canceled', 'done')
+        AND po.date >= CURRENT_DATE - INTERVAL '6 months'
+      )
+      SELECT
+        COUNT(DISTINCT po_id)::integer as active_pos,
+        COUNT(DISTINCT CASE WHEN expected_date < CURRENT_DATE THEN po_id END)::integer as overdue_pos,
+        SUM(ordered)::integer as total_units,
+        ROUND(SUM(ordered * po_cost_price)::numeric, 3) as total_cost,
+        ROUND(SUM(ordered * pm.current_price)::numeric, 3) as total_retail
+      FROM po_metrics po
       JOIN product_metrics pm ON po.pid = pm.pid
     `);

     const { rows: vendorOrders } = await executeQuery(`
+      WITH po_by_vendor AS (
         SELECT
-          po.vendor,
-          COUNT(DISTINCT po.po_id)::integer as orders,
-          COALESCE(SUM(po.ordered), 0)::integer as units,
-          ROUND(COALESCE(SUM(po.ordered * po.cost_price), 0)::numeric, 3) as cost,
-          ROUND(COALESCE(SUM(po.ordered * pm.current_price), 0)::numeric, 3) as retail
-        FROM purchase_orders po
+          vendor,
+          po_id,
+          SUM(ordered) as total_ordered,
+          SUM(ordered * po_cost_price) as total_cost
+        FROM purchase_orders
+        WHERE status NOT IN ('canceled', 'done')
+        AND date >= CURRENT_DATE - INTERVAL '6 months'
+        GROUP BY vendor, po_id
+      )
+      SELECT
+        pv.vendor,
+        COUNT(DISTINCT pv.po_id)::integer as orders,
+        SUM(pv.total_ordered)::integer as units,
+        ROUND(SUM(pv.total_cost)::numeric, 3) as cost,
+        ROUND(SUM(pv.total_ordered * pm.current_price)::numeric, 3) as retail
+      FROM po_by_vendor pv
+      JOIN purchase_orders po ON pv.po_id = po.po_id
       JOIN product_metrics pm ON po.pid = pm.pid
-      WHERE po.receiving_status NOT IN ('partial_received', 'full_received', 'paid')
-      GROUP BY po.vendor
-      HAVING ROUND(COALESCE(SUM(po.ordered * po.cost_price), 0)::numeric, 3) > 0
+      GROUP BY pv.vendor
+      HAVING ROUND(SUM(pv.total_cost)::numeric, 3) > 0
       ORDER BY cost DESC
     `);

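The rewritten vendor query aggregates in two steps: first per `(vendor, po_id)` in the `po_by_vendor` CTE, then rolled up per vendor with a `HAVING` filter dropping zero-cost vendors. A JavaScript sketch of that two-step shape (illustrative names; this assumes, like the CTE, that line totals are summed per PO before counting POs per vendor):

```javascript
// pos: [{ vendor, poId, status, ordered, costPrice }]
function vendorTotals(pos) {
  // Step 1: per (vendor, po_id), like the po_by_vendor CTE
  const perPo = new Map();
  for (const po of pos) {
    if (['canceled', 'done'].includes(po.status)) continue;
    const key = `${po.vendor}|${po.poId}`;
    const acc = perPo.get(key) || { vendor: po.vendor, ordered: 0, cost: 0 };
    acc.ordered += po.ordered;
    acc.cost += po.ordered * po.costPrice;
    perPo.set(key, acc);
  }
  // Step 2: roll up per vendor; HAVING cost > 0 drops zero-cost vendors
  const perVendor = new Map();
  for (const row of perPo.values()) {
    const acc = perVendor.get(row.vendor) || { orders: 0, units: 0, cost: 0 };
    acc.orders += 1;
    acc.units += row.ordered;
    acc.cost += row.cost;
    perVendor.set(row.vendor, acc);
  }
  return [...perVendor.entries()].filter(([, v]) => v.cost > 0);
}
```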
@@ -193,6 +193,33 @@ router.get('/:type/progress', (req, res) => {
     });
 });

+// GET /status - Check for active processes
+router.get('/status', (req, res) => {
+    try {
+        const hasActiveUpdate = activeFullUpdate !== null;
+        const hasActiveReset = activeFullReset !== null;
+
+        if (hasActiveUpdate || hasActiveReset) {
+            res.json({
+                active: true,
+                progress: {
+                    status: 'running',
+                    operation: hasActiveUpdate ? 'Full update in progress' : 'Full reset in progress',
+                    type: hasActiveUpdate ? 'update' : 'reset'
+                }
+            });
+        } else {
+            res.json({
+                active: false,
+                progress: null
+            });
+        }
+    } catch (error) {
+        console.error('Error checking status:', error);
+        res.status(500).json({ error: error.message });
+    }
+});
+
 // Route to cancel active process
 router.post('/cancel', (req, res) => {
     let killed = false;
@@ -351,7 +378,7 @@ router.get('/status/table-counts', async (req, res) => {
     const pool = req.app.locals.pool;
     const tables = [
         // Core tables
-        'products', 'categories', 'product_categories', 'orders', 'purchase_orders',
+        'products', 'categories', 'product_categories', 'orders', 'purchase_orders', 'receivings',
         // New metrics tables
         'product_metrics', 'daily_product_snapshots','brand_metrics','category_metrics','vendor_metrics',
         // Config tables
@@ -375,7 +402,7 @@ router.get('/status/table-counts', async (req, res) => {

     // Group tables by type
     const groupedCounts = {
-        core: counts.filter(c => ['products', 'categories', 'product_categories', 'orders', 'purchase_orders'].includes(c.table_name)),
+        core: counts.filter(c => ['products', 'categories', 'product_categories', 'orders', 'purchase_orders', 'receivings'].includes(c.table_name)),
         metrics: counts.filter(c => ['product_metrics', 'daily_product_snapshots','brand_metrics','category_metrics','vendor_metrics'].includes(c.table_name)),
         config: counts.filter(c => ['settings_global', 'settings_vendor', 'settings_product'].includes(c.table_name))
     };
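The `groupedCounts` change adds `receivings` to the core group; the grouping itself is a plain `filter`/`includes` over the count rows. A self-contained sketch of that grouping (standalone for illustration; the route builds the same object inline):

```javascript
// counts: [{ table_name, row_count }] as returned by the table-counts query
function groupCounts(counts) {
  const core = ['products', 'categories', 'product_categories', 'orders', 'purchase_orders', 'receivings'];
  const metrics = ['product_metrics', 'daily_product_snapshots', 'brand_metrics', 'category_metrics', 'vendor_metrics'];
  const config = ['settings_global', 'settings_vendor', 'settings_product'];
  return {
    core: counts.filter(c => core.includes(c.table_name)),
    metrics: counts.filter(c => metrics.includes(c.table_name)),
    config: counts.filter(c => config.includes(c.table_name)),
  };
}
```

Note the two lists (the `tables` array and the `core` filter) must be kept in sync by hand, which is why this diff touches both.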
File diff suppressed because it is too large
@@ -19,6 +19,7 @@ import { AuthProvider } from './contexts/AuthContext';
 import { Protected } from './components/auth/Protected';
 import { FirstAccessiblePage } from './components/auth/FirstAccessiblePage';
 import { Brands } from '@/pages/Brands';
+import { Chat } from '@/pages/Chat';
 const queryClient = new QueryClient();

 function App() {
@@ -133,6 +134,11 @@ function App() {
             <Forecasting />
           </Protected>
         } />
+        <Route path="/chat" element={
+          <Protected page="chat">
+            <Chat />
+          </Protected>
+        } />
         <Route path="*" element={<Navigate to="/" replace />} />
       </Route>
     </Routes>
559 inventory/src/components/chat/ChatRoom.tsx Normal file
@@ -0,0 +1,559 @@
+import React, { useState, useEffect, useRef } from 'react';
+import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
+import { Badge } from '@/components/ui/badge';
+import { Button } from '@/components/ui/button';
+import { Avatar, AvatarFallback, AvatarImage } from '@/components/ui/avatar';
+import { Loader2, Hash, Lock, MessageSquare, ChevronUp, Search, ExternalLink, FileText, Image, Download, MessageCircle, Users2 } from 'lucide-react';
+import { Input } from '@/components/ui/input';
+import config from '@/config';
+import { convertEmojiShortcodes } from '@/utils/emojiUtils';
+
+interface Message {
+  id: number;
+  msg: string;
+  ts: string;
+  u: {
+    _id: string;
+    username: string;
+    name?: string;
+  };
+  _updatedat: string;
+  urls?: any[];
+  mentions?: any[];
+  md?: any[];
+  attachments?: {
+    id: number;
+    mongo_id: string;
+    name: string;
+    size: number;
+    type: string;
+    url: string;
+    path: string;
+    typegroup: string;
+    identify?: {
+      size?: { width: number; height: number };
+      format?: string;
+    };
+  }[];
+}
+
+interface Room {
+  id: number;
+  name: string;
+  fname: string;
+  type: string;
+  msgs: number;
+  last_message_date: string;
+  display_name: string;
+  description?: string;
+  teamid?: string;
+  participants?: { username: string; name: string }[];
+}
+
+interface ChatRoomProps {
+  roomId: string;
+  selectedUserId: string;
+}
+
+export function ChatRoom({ roomId, selectedUserId }: ChatRoomProps) {
+  const [room, setRoom] = useState<Room | null>(null);
+  const [messages, setMessages] = useState<Message[]>([]);
+  const [loading, setLoading] = useState(false);
+  const [loadingMore, setLoadingMore] = useState(false);
+  const [error, setError] = useState<string | null>(null);
+  const [hasMore, setHasMore] = useState(true);
+  const [searchQuery, setSearchQuery] = useState('');
+  const [searchResults, setSearchResults] = useState<any[]>([]);
+  const [showSearch, setShowSearch] = useState(false);
+
+  const messagesEndRef = useRef<HTMLDivElement>(null);
+  const messagesContainerRef = useRef<HTMLDivElement>(null);
+
+  const scrollToBottom = () => {
+    messagesEndRef.current?.scrollIntoView({ behavior: 'smooth' });
+  };
+
+  const fetchRoom = async () => {
+    try {
+      const response = await fetch(`${config.chatUrl}/rooms/${roomId}?userId=${selectedUserId}`);
+      const data = await response.json();
+
+      if (data.status === 'success') {
+        setRoom(data.room);
+      } else {
+        throw new Error(data.error || 'Failed to fetch room');
+      }
+    } catch (err) {
+      console.error('Error fetching room:', err);
+      setError(err instanceof Error ? err.message : 'An error occurred');
+    }
+  };
+
+  const fetchMessages = async (before?: string, append = false) => {
+    if (!append) setLoading(true);
+    else setLoadingMore(true);
+
+    try {
+      const params = new URLSearchParams({
+        limit: '50',
+        offset: append ? messages.length.toString() : '0'
+      });
+
+      if (before) {
+        params.set('before', before);
+      }
+
+      const response = await fetch(`${config.chatUrl}/rooms/${roomId}/messages?${params}`);
+      const data = await response.json();
+
+      if (data.status === 'success') {
+        const newMessages = data.messages;
+
+        if (append) {
+          // Prepend older messages
+          setMessages(prev => [...newMessages, ...prev]);
+          setHasMore(newMessages.length === 50);
+        } else {
+          setMessages(newMessages);
+          setHasMore(newMessages.length === 50);
+          // Scroll to bottom on initial load
+          setTimeout(scrollToBottom, 100);
+        }
+
+        // Load attachments for these messages in the background (non-blocking)
+        if (newMessages.length > 0) {
+          loadAttachments(newMessages.map((m: Message) => m.id));
+        }
+      } else {
+        throw new Error(data.error || 'Failed to fetch messages');
+      }
+    } catch (err) {
+      console.error('Error fetching messages:', err);
+      setError(err instanceof Error ? err.message : 'An error occurred');
+    } finally {
+      setLoading(false);
+      setLoadingMore(false);
+    }
+  };
+
+  const loadAttachments = async (messageIds: number[]) => {
+    try {
+      const response = await fetch(`${config.chatUrl}/messages/attachments`, {
+        method: 'POST',
+        headers: {
+          'Content-Type': 'application/json',
+        },
+        body: JSON.stringify({ messageIds }),
+      });
+
+      const data = await response.json();
+
+      if (data.status === 'success' && data.attachments) {
+        // Update messages with their attachments
+        setMessages(prevMessages =>
+          prevMessages.map(msg => ({
+            ...msg,
+            attachments: data.attachments[msg.id] || []
+          }))
+        );
+      }
+    } catch (err) {
+      console.error('Error loading attachments:', err);
+      // Don't show error to user for attachments - messages are already displayed
+    }
+  };
+
+  const loadMoreMessages = () => {
+    if (messages.length > 0 && hasMore && !loadingMore) {
+      const oldestMessage = messages[0];
+      fetchMessages(oldestMessage.ts, true);
+    }
+  };
+
+  const searchMessages = async () => {
+    if (!searchQuery || searchQuery.length < 2) return;
+
+    try {
+      const response = await fetch(
+        `${config.chatUrl}/users/${selectedUserId}/search?q=${encodeURIComponent(searchQuery)}&limit=20`
+      );
+      const data = await response.json();
+
+      if (data.status === 'success') {
+        setSearchResults(data.results);
+      }
+    } catch (err) {
+      console.error('Error searching messages:', err);
+    }
+  };
+
+  useEffect(() => {
+    if (roomId && selectedUserId) {
+      setMessages([]);
+      setError(null);
+      setHasMore(true);
+      fetchRoom();
+      fetchMessages();
+    }
+  }, [roomId, selectedUserId]);
+
+  const getRoomIcon = (room: Room) => {
+    switch (room.type) {
+      case 'c':
+        return <Hash className="h-4 w-4 text-blue-500" />;
+      case 'p':
+        // Distinguish between teams and discussions based on teamid
+        if (room.teamid) {
+          return <Users2 className="h-4 w-4 text-purple-500" />; // Teams
+        } else {
+          return <MessageCircle className="h-4 w-4 text-orange-500" />; // Discussions
+        }
+      case 'd':
+        return <MessageSquare className="h-4 w-4 text-green-500" />;
+      default:
+        return <Hash className="h-4 w-4 text-gray-500" />;
+    }
+  };
+
+  const formatTime = (timestamp: string) => {
+    // The stored timestamps are actually in Eastern Time but labeled as UTC
+    // We need to compensate by treating them as UTC and then converting back to Eastern
+    const originalDate = new Date(timestamp);
+
+    // Subtract 4 hours to compensate for the EDT offset (adjust to 5 hours for EST months)
+    // This assumes the original data was incorrectly stored as UTC when it was actually EDT/EST
+    const isDST = (date: Date) => {
+      const jan = new Date(date.getFullYear(), 0, 1);
+      const jul = new Date(date.getFullYear(), 6, 1);
+      return date.getTimezoneOffset() < Math.max(jan.getTimezoneOffset(), jul.getTimezoneOffset());
+    };
+
+    // Determine if the timestamp falls in DST period for Eastern Time
+    const offsetHours = isDST(originalDate) ? 4 : 5; // EDT = UTC-4, EST = UTC-5
+    const correctedDate = new Date(originalDate.getTime() - (offsetHours * 60 * 60 * 1000));
+
+    const now = new Date();
+    const timeZone = 'America/New_York';
+
+    // Compare dates in Eastern Time to ensure correct "today" detection
+    const dateInET = correctedDate.toLocaleDateString([], { timeZone });
+    const nowInET = now.toLocaleDateString([], { timeZone });
+    const isToday = dateInET === nowInET;
+
+    if (isToday) {
+      return correctedDate.toLocaleTimeString([], {
+        hour: '2-digit',
+        minute: '2-digit',
+        timeZone
+      });
+    } else {
+      return correctedDate.toLocaleDateString([], { timeZone }) + ' ' + correctedDate.toLocaleTimeString([], {
+        hour: '2-digit',
+        minute: '2-digit',
+        timeZone
+      });
+    }
+  };
+
+  const renderMessageText = (text: string, urls?: any[]) => {
+    if (!text) return '';
+
+    // First, convert emoji shortcodes to actual emoji
+    let processedText = convertEmojiShortcodes(text);
+
+    // Then, handle markdown links [text](url) and convert them to HTML
+    processedText = processedText.replace(
+      /\[([^\]]+)\]\((https?:\/\/[^\s\)]+)\)/g,
+      '<a href="$2" target="_blank" rel="noopener noreferrer" class="text-blue-600 hover:underline">$1</a>'
+    );
+
+    // If we have URL previews, replace standalone URLs (that aren't already in markdown) with just the preview
+    if (urls && urls.length > 0) {
+      urls.forEach((urlData) => {
+        // Only replace standalone URLs that aren't part of markdown links
+        const standaloneUrlRegex = new RegExp(`(?<!\\]\\()${urlData.url.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}(?!\\))`, 'g');
+        processedText = processedText.replace(standaloneUrlRegex, '');
+      });
+    }
+
+    return <span dangerouslySetInnerHTML={{ __html: processedText }} />;
+  };
+
const renderURLPreviews = (urls: any[]) => {
|
||||||
|
if (!urls || urls.length === 0) return null;
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="mt-2 space-y-2">
|
||||||
|
{urls.map((urlData, index) => (
|
||||||
|
<div key={index} className="border rounded-lg p-3 bg-gray-50">
|
||||||
|
<div className="flex items-start gap-2">
|
||||||
|
<ExternalLink className="h-4 w-4 mt-1 text-blue-500 flex-shrink-0" />
|
||||||
|
<div className="min-w-0 flex-1">
|
||||||
|
<a
|
||||||
|
href={urlData.url}
|
||||||
|
target="_blank"
|
||||||
|
rel="noopener noreferrer"
|
||||||
|
className="text-blue-600 hover:underline text-sm break-all"
|
||||||
|
>
|
||||||
|
{urlData.meta?.pageTitle || urlData.url}
|
||||||
|
</a>
|
||||||
|
{urlData.meta?.ogDescription && (
|
||||||
|
<div className="text-xs text-muted-foreground mt-1">{urlData.meta.ogDescription}</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
const renderAttachments = (attachments: any[]) => {
|
||||||
|
if (!attachments || attachments.length === 0) return null;
|
||||||
|
|
||||||
|
// Filter out thumbnail attachments (they're usually lower quality versions)
|
||||||
|
const filteredAttachments = attachments.filter(attachment =>
|
||||||
|
!attachment.name?.toLowerCase().startsWith('thumb-')
|
||||||
|
);
|
||||||
|
|
||||||
|
if (filteredAttachments.length === 0) return null;
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="mt-2 space-y-2">
|
||||||
|
{filteredAttachments.map((attachment, index) => {
|
||||||
|
const isImage = attachment.typegroup === 'image';
|
||||||
|
const filePath = `${config.chatUrl}/files/by-id/${attachment.mongo_id}`;
|
||||||
|
|
||||||
|
const handleDownload = () => {
|
||||||
|
// Create a temporary anchor element to trigger download
|
||||||
|
const link = document.createElement('a');
|
||||||
|
link.href = filePath;
|
||||||
|
link.download = attachment.name || 'download';
|
||||||
|
link.target = '_blank';
|
||||||
|
document.body.appendChild(link);
|
||||||
|
link.click();
|
||||||
|
document.body.removeChild(link);
|
||||||
|
};
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div key={index} className="border rounded-lg p-3 bg-gray-50">
|
||||||
|
<div className="flex items-center gap-2">
|
||||||
|
{isImage ? (
|
||||||
|
<Image className="h-4 w-4 text-green-500" />
|
||||||
|
) : (
|
||||||
|
<FileText className="h-4 w-4 text-blue-500" />
|
||||||
|
)}
|
||||||
|
<div className="flex-1 min-w-0">
|
||||||
|
<div className="font-medium text-sm truncate">{attachment.name}</div>
|
||||||
|
<div className="text-xs text-muted-foreground">
|
||||||
|
{(attachment.size / 1024).toFixed(1)} KB
|
||||||
|
{attachment.identify?.size && (
|
||||||
|
<span> • {attachment.identify.size.width}×{attachment.identify.size.height}</span>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<Button variant="ghost" size="sm" onClick={handleDownload}>
|
||||||
|
<Download className="h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
{isImage && attachment.identify?.size && (
|
||||||
|
<div className="mt-2">
|
||||||
|
<img
|
||||||
|
src={filePath}
|
||||||
|
alt={attachment.name}
|
||||||
|
className="max-w-xs max-h-48 rounded border cursor-pointer"
|
||||||
|
loading="lazy"
|
||||||
|
onClick={() => window.open(filePath, '_blank')}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
})}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
const renderMentions = (text: string, mentions: any[]) => {
|
||||||
|
if (!mentions || mentions.length === 0) return text;
|
||||||
|
|
||||||
|
// First, convert emoji shortcodes to actual emoji
|
||||||
|
let renderedText = convertEmojiShortcodes(text);
|
||||||
|
|
||||||
|
// Then process mentions
|
||||||
|
mentions.forEach((mention) => {
|
||||||
|
if (mention.username) {
|
||||||
|
const mentionPattern = new RegExp(`@${mention.username}`, 'g');
|
||||||
|
renderedText = renderedText.replace(
|
||||||
|
mentionPattern,
|
||||||
|
`<span class="bg-blue-100 text-blue-800 px-1 rounded">@${mention.username}</span>`
|
||||||
|
);
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
return <span dangerouslySetInnerHTML={{ __html: renderedText }} />;
|
||||||
|
};
|
||||||
|
|
||||||
|
const renderMessage = (message: Message, index: number) => {
|
||||||
|
const prevMessage = index > 0 ? messages[index - 1] : null;
|
||||||
|
const isConsecutive = prevMessage &&
|
||||||
|
prevMessage.u.username === message.u.username &&
|
||||||
|
new Date(message.ts).getTime() - new Date(prevMessage.ts).getTime() < 300000; // 5 minutes
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div key={message.id} className={`${isConsecutive ? 'mt-1' : 'mt-4'}`}>
|
||||||
|
{!isConsecutive && (
|
||||||
|
<div className="flex items-center gap-2 mb-1">
|
||||||
|
<Avatar className="h-8 w-8">
|
||||||
|
<AvatarImage
|
||||||
|
src={`${config.chatUrl}/avatar/${message.u._id}`}
|
||||||
|
alt={message.u.name || message.u.username}
|
||||||
|
/>
|
||||||
|
<AvatarFallback className="text-sm font-medium">
|
||||||
|
{(message.u.name || message.u.username).charAt(0).toUpperCase()}
|
||||||
|
</AvatarFallback>
|
||||||
|
</Avatar>
|
||||||
|
<span className="font-medium text-sm">
|
||||||
|
{message.u.name || message.u.username}
|
||||||
|
</span>
|
||||||
|
<span className="text-xs text-muted-foreground">
|
||||||
|
{formatTime(message.ts)}
|
||||||
|
</span>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
<div className={`${isConsecutive ? 'ml-10' : 'ml-10'} text-sm`}>
|
||||||
|
<div className="break-words">
|
||||||
|
{message.mentions && message.mentions.length > 0
|
||||||
|
? renderMentions(message.msg, message.mentions)
|
||||||
|
: renderMessageText(message.msg, message.urls)
|
||||||
|
}
|
||||||
|
</div>
|
||||||
|
{message.urls && renderURLPreviews(message.urls)}
|
||||||
|
{message.attachments && renderAttachments(message.attachments)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
if (!roomId) {
|
||||||
|
return (
|
||||||
|
<Card className="h-full">
|
||||||
|
<CardContent className="flex items-center justify-center h-full">
|
||||||
|
<p className="text-muted-foreground">Select a room to view messages</p>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (loading && messages.length === 0) {
|
||||||
|
return (
|
||||||
|
<Card className="h-full">
|
||||||
|
<CardContent className="flex items-center justify-center h-full">
|
||||||
|
<div className="flex items-center gap-2">
|
||||||
|
<Loader2 className="h-4 w-4 animate-spin" />
|
||||||
|
<span>Loading messages...</span>
|
||||||
|
</div>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (error) {
|
||||||
|
return (
|
||||||
|
<Card className="h-full border-red-200 bg-red-50">
|
||||||
|
<CardContent className="flex items-center justify-center h-full">
|
||||||
|
<p className="text-red-700">{error}</p>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Card className="h-full flex flex-col">
|
||||||
|
<CardHeader className="border-b p-4">
|
||||||
|
<div className="flex items-center justify-between">
|
||||||
|
<div className="flex items-center gap-3">
|
||||||
|
{room && getRoomIcon(room)}
|
||||||
|
<div>
|
||||||
|
<CardTitle className="text-lg">
|
||||||
|
{room?.type === 'd'
|
||||||
|
? `Direct message with ${room?.display_name || 'Unknown User'}`
|
||||||
|
: room?.display_name || room?.fname || room?.name || 'Unnamed Room'
|
||||||
|
}
|
||||||
|
</CardTitle>
|
||||||
|
{room?.description && room?.type !== 'd' && (
|
||||||
|
<p className="text-sm text-muted-foreground">{room.description}</p>
|
||||||
|
)}
|
||||||
|
{/* Only show participants for non-direct messages since DM names are already in the title */}
|
||||||
|
{room?.participants && room.participants.length > 0 && room?.type !== 'd' && (
|
||||||
|
<p className="text-xs text-muted-foreground">
|
||||||
|
{room.participants.map(p => p.name || p.username).join(', ')}
|
||||||
|
</p>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div className="flex items-center gap-2">
|
||||||
|
<Button
|
||||||
|
variant="outline"
|
||||||
|
size="sm"
|
||||||
|
onClick={() => setShowSearch(!showSearch)}
|
||||||
|
>
|
||||||
|
<Search className="h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
{showSearch && (
|
||||||
|
<div className="flex gap-2 mt-2">
|
||||||
|
<Input
|
||||||
|
placeholder="Search messages..."
|
||||||
|
value={searchQuery}
|
||||||
|
onChange={(e) => setSearchQuery(e.target.value)}
|
||||||
|
onKeyPress={(e) => e.key === 'Enter' && searchMessages()}
|
||||||
|
/>
|
||||||
|
<Button onClick={searchMessages} size="sm">Search</Button>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</CardHeader>
|
||||||
|
|
||||||
|
<CardContent className="flex-1 p-0 overflow-hidden">
|
||||||
|
<div
|
||||||
|
ref={messagesContainerRef}
|
||||||
|
className="h-full overflow-y-auto p-4"
|
||||||
|
>
|
||||||
|
{hasMore && messages.length > 0 && (
|
||||||
|
<div className="text-center mb-4">
|
||||||
|
<Button
|
||||||
|
variant="outline"
|
||||||
|
size="sm"
|
||||||
|
onClick={loadMoreMessages}
|
||||||
|
disabled={loadingMore}
|
||||||
|
>
|
||||||
|
{loadingMore ? (
|
||||||
|
<Loader2 className="h-4 w-4 animate-spin mr-2" />
|
||||||
|
) : (
|
||||||
|
<ChevronUp className="h-4 w-4 mr-2" />
|
||||||
|
)}
|
||||||
|
Load older messages
|
||||||
|
</Button>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{messages.length === 0 ? (
|
||||||
|
<div className="text-center text-muted-foreground">
|
||||||
|
No messages in this room
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<div className="space-y-1">
|
||||||
|
{messages.map((message, index) => renderMessage(message, index))}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
<div ref={messagesEndRef} />
|
||||||
|
</div>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
);
|
||||||
|
}
|
||||||
177
inventory/src/components/chat/ChatTest.tsx
Normal file
@@ -0,0 +1,177 @@
import React, { useState, useEffect } from 'react';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import { Badge } from '@/components/ui/badge';
import { Loader2, Hash, Lock, Users, MessageSquare } from 'lucide-react';
import config from '@/config';

interface Room {
  id: number;
  name: string;
  fname: string;
  type: string;
  msgs: number;
  last_message_date: string;
}

interface ChatTestProps {
  selectedUserId: string;
}

export function ChatTest({ selectedUserId }: ChatTestProps) {
  const [rooms, setRooms] = useState<Room[]>([]);
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);

  useEffect(() => {
    if (!selectedUserId) {
      setRooms([]);
      return;
    }

    const fetchUserRooms = async () => {
      setLoading(true);
      setError(null);

      try {
        const response = await fetch(`${config.chatUrl}/users/${selectedUserId}/rooms`);
        const data = await response.json();

        if (data.status === 'success') {
          setRooms(data.rooms);
        } else {
          throw new Error(data.error || 'Failed to fetch rooms');
        }
      } catch (err) {
        console.error('Error fetching user rooms:', err);
        setError(err instanceof Error ? err.message : 'An error occurred');
      } finally {
        setLoading(false);
      }
    };

    fetchUserRooms();
  }, [selectedUserId]);

  const getRoomIcon = (roomType: string) => {
    switch (roomType) {
      case 'c':
        return <Hash className="h-4 w-4 text-blue-500" />;
      case 'p':
        return <Lock className="h-4 w-4 text-orange-500" />;
      case 'd':
        return <MessageSquare className="h-4 w-4 text-green-500" />;
      default:
        return <Users className="h-4 w-4 text-gray-500" />;
    }
  };

  const getRoomTypeLabel = (roomType: string) => {
    switch (roomType) {
      case 'c':
        return 'Channel';
      case 'p':
        return 'Private';
      case 'd':
        return 'Direct';
      default:
        return 'Unknown';
    }
  };

  if (!selectedUserId) {
    return (
      <Card>
        <CardHeader>
          <CardTitle>Database Connection Test</CardTitle>
        </CardHeader>
        <CardContent>
          <p className="text-muted-foreground">
            Select a user from the dropdown above to view their rooms and test the database connection.
          </p>
        </CardContent>
      </Card>
    );
  }

  if (loading) {
    return (
      <Card>
        <CardHeader>
          <CardTitle>Loading User Rooms...</CardTitle>
        </CardHeader>
        <CardContent>
          <div className="flex items-center gap-2">
            <Loader2 className="h-4 w-4 animate-spin" />
            <span>Fetching rooms for selected user...</span>
          </div>
        </CardContent>
      </Card>
    );
  }

  if (error) {
    return (
      <Card className="border-red-200 bg-red-50">
        <CardHeader>
          <CardTitle className="text-red-800">Error Loading Rooms</CardTitle>
        </CardHeader>
        <CardContent>
          <p className="text-red-700">{error}</p>
        </CardContent>
      </Card>
    );
  }

  return (
    <Card>
      <CardHeader>
        <CardTitle>User Rooms ({rooms.length})</CardTitle>
        <p className="text-sm text-muted-foreground">
          Rooms accessible to the selected user
        </p>
      </CardHeader>
      <CardContent>
        {rooms.length === 0 ? (
          <p className="text-muted-foreground">No rooms found for this user.</p>
        ) : (
          <div className="space-y-2">
            {rooms.map((room) => (
              <div
                key={room.id}
                className="flex items-center justify-between p-3 border rounded-lg hover:bg-gray-50"
              >
                <div className="flex items-center gap-3">
                  {getRoomIcon(room.type)}
                  <div>
                    <div className="font-medium">
                      {room.fname || room.name || 'Unnamed Room'}
                    </div>
                    <div className="text-sm text-muted-foreground">
                      {room.name && room.fname !== room.name && (
                        <span className="font-mono">#{room.name}</span>
                      )}
                    </div>
                  </div>
                </div>

                <div className="flex items-center gap-2">
                  <Badge variant="secondary">
                    {getRoomTypeLabel(room.type)}
                  </Badge>
                  <Badge variant="outline">
                    {room.msgs} messages
                  </Badge>
                  {room.last_message_date && (
                    <Badge variant="outline" className="text-xs">
                      {new Date(room.last_message_date).toLocaleDateString()}
                    </Badge>
                  )}
                </div>
              </div>
            ))}
          </div>
        )}
      </CardContent>
    </Card>
  );
}
332
inventory/src/components/chat/RoomList.tsx
Normal file
@@ -0,0 +1,332 @@
import React, { useState, useEffect } from 'react';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import { Badge } from '@/components/ui/badge';
import { Avatar, AvatarFallback, AvatarImage } from '@/components/ui/avatar';
import { Loader2, Hash, Lock, Users, MessageSquare, Search, MessageCircle, Users2 } from 'lucide-react';
import { Input } from '@/components/ui/input';
import config from '@/config';

interface Room {
  id: number;
  name: string;
  fname: string;
  type: string;
  msgs: number;
  last_message_date: string;
  display_name: string;
  userscount?: number;
  description?: string;
  teamid?: string;
  archived?: boolean;
  open?: boolean;
  participants?: {
    username: string;
    name: string;
    mongo_id: string;
    avataretag?: string;
  }[];
}

interface RoomListProps {
  selectedUserId: string;
  selectedRoomId: string | null;
  onRoomSelect: (roomId: string) => void;
}

export function RoomList({ selectedUserId, selectedRoomId, onRoomSelect }: RoomListProps) {
  const [rooms, setRooms] = useState<Room[]>([]);
  const [filteredRooms, setFilteredRooms] = useState<Room[]>([]);
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);
  const [searchFilter, setSearchFilter] = useState('');

  useEffect(() => {
    if (!selectedUserId) {
      setRooms([]);
      setFilteredRooms([]);
      return;
    }

    const fetchUserRooms = async () => {
      setLoading(true);
      setError(null);

      try {
        const response = await fetch(`${config.chatUrl}/users/${selectedUserId}/rooms`);
        const data = await response.json();

        if (data.status === 'success') {
          setRooms(data.rooms);
          setFilteredRooms(data.rooms);
        } else {
          throw new Error(data.error || 'Failed to fetch rooms');
        }
      } catch (err) {
        console.error('Error fetching user rooms:', err);
        setError(err instanceof Error ? err.message : 'An error occurred');
      } finally {
        setLoading(false);
      }
    };

    fetchUserRooms();
  }, [selectedUserId]);

  useEffect(() => {
    if (!searchFilter) {
      setFilteredRooms(rooms);
    } else {
      const filtered = rooms.filter(room =>
        (room.display_name?.toLowerCase() || '').includes(searchFilter.toLowerCase()) ||
        (room.name?.toLowerCase() || '').includes(searchFilter.toLowerCase()) ||
        (room.fname?.toLowerCase() || '').includes(searchFilter.toLowerCase())
      );
      setFilteredRooms(filtered);
    }
  }, [searchFilter, rooms]);

  const groupRoomsByType = (rooms: Room[]) => {
    const teams: Room[] = [];
    const discussions: Room[] = [];
    const channels: Room[] = [];
    const directMessages: Room[] = [];

    rooms.forEach(room => {
      switch (room.type) {
        case 'p':
          if (room.teamid) {
            teams.push(room);
          } else {
            discussions.push(room);
          }
          break;
        case 'c':
          channels.push(room);
          break;
        case 'd':
          directMessages.push(room);
          break;
        default:
          channels.push(room); // fallback for unknown types
      }
    });

    // Sort each group by message count descending
    const sortByMessages = (a: Room, b: Room) => (b.msgs || 0) - (a.msgs || 0);

    teams.sort(sortByMessages);
    discussions.sort(sortByMessages);
    channels.sort(sortByMessages);
    directMessages.sort(sortByMessages);

    return { teams, discussions, channels, directMessages };
  };

  const renderRoomIcon = (room: Room) => {
    // For direct messages, show participant avatars
    if (room.type === 'd' && room.participants && room.participants.length > 0) {
      if (room.participants.length === 1) {
        // Single participant - show their avatar
        const participant = room.participants[0];
        return (
          <Avatar className="h-10 w-10">
            <AvatarImage
              src={`${config.chatUrl}/avatar/${participant.mongo_id}`}
              alt={participant.name || participant.username}
            />
            <AvatarFallback className="text-lg">
              {(participant.name || participant.username).charAt(0).toUpperCase()}
            </AvatarFallback>
          </Avatar>
        );
      } else {
        // Multiple participants - show overlapping avatars
        return (
          <div className="relative flex items-center h-10 w-10">
            {room.participants.slice(0, 3).map((participant, index) => (
              <Avatar
                key={participant.mongo_id}
                className={`h-8 w-8 border-2 border-white ${index > 0 ? '-ml-4' : ''}`}
                style={{ zIndex: 30 - index }}
              >
                <AvatarImage
                  src={`${config.chatUrl}/avatar/${participant.mongo_id}`}
                  alt={participant.name || participant.username}
                />
                <AvatarFallback className="text-lg">
                  {(participant.name || participant.username).charAt(0).toUpperCase()}
                </AvatarFallback>
              </Avatar>
            ))}
          </div>
        );
      }
    }

    // For other room types, use icons
    switch (room.type) {
      case 'c':
        return (
          <div className="h-10 w-10 bg-blue-50 rounded-full flex items-center justify-center">
            <Hash className="h-6 w-6 text-blue-500" />
          </div>
        );
      case 'p':
        // Distinguish between teams and discussions based on teamid
        if (room.teamid) {
          return (
            <div className="h-10 w-10 bg-purple-50 rounded-full flex items-center justify-center">
              <Users2 className="h-6 w-6 text-purple-500" />
            </div>
          ); // Teams
        } else {
          return (
            <div className="h-10 w-10 bg-orange-50 rounded-full flex items-center justify-center">
              <MessageCircle className="h-6 w-6 text-orange-500" />
            </div>
          ); // Discussions
        }
      case 'd':
        return (
          <div className="h-12 w-12 bg-green-50 rounded-full flex items-center justify-center">
            <MessageSquare className="h-6 w-6 text-green-500" />
          </div>
        );
      default:
        return (
          <div className="h-12 w-12 bg-gray-50 rounded-full flex items-center justify-center">
            <Users className="h-6 w-6 text-gray-500" />
          </div>
        );
    }
  };

  if (!selectedUserId) {
    return (
      <Card className="h-full">
        <CardHeader>
          <CardTitle>Rooms</CardTitle>
        </CardHeader>
        <CardContent>
          <p className="text-muted-foreground text-sm">
            Select a user from the dropdown above to view their rooms.
          </p>
        </CardContent>
      </Card>
    );
  }

  if (loading) {
    return (
      <Card className="h-full">
        <CardHeader>
          <CardTitle>Loading Rooms...</CardTitle>
        </CardHeader>
        <CardContent>
          <div className="flex items-center gap-2">
            <Loader2 className="h-4 w-4 animate-spin" />
            <span className="text-sm">Fetching rooms for selected user...</span>
          </div>
        </CardContent>
      </Card>
    );
  }

  if (error) {
    return (
      <Card className="h-full border-red-200 bg-red-50">
        <CardHeader>
          <CardTitle className="text-red-800">Error Loading Rooms</CardTitle>
        </CardHeader>
        <CardContent>
          <p className="text-red-700 text-sm">{error}</p>
        </CardContent>
      </Card>
    );
  }

  return (
    <Card className="h-full flex flex-col">
      <CardHeader className="pb-3">
        <CardTitle className="text-lg">Rooms</CardTitle>
      </CardHeader>

      <CardContent className="flex-1 p-0 overflow-hidden">
        <div className="h-full overflow-y-auto">
          {filteredRooms.length === 0 ? (
            <div className="p-4">
              <p className="text-muted-foreground text-sm">
                {searchFilter ? 'No rooms match your search.' : 'No rooms found for this user.'}
              </p>
            </div>
          ) : (
            <div className="space-y-3 px-2 pb-4">
              {(() => {
                const { teams, discussions, channels, directMessages } = groupRoomsByType(filteredRooms);

                const renderRoomGroup = (title: string, rooms: Room[], icon: React.ReactNode) => {
                  if (rooms.length === 0) return null;

                  return (
                    <div key={title} className="space-y-0.5">
                      <div className="flex items-center gap-2 px-2 py-1 border-b border-gray-100">
                        {icon}
                        <h3 className="text-xs font-semibold text-muted-foreground uppercase tracking-wide">
                          {title} ({rooms.length})
                        </h3>
                      </div>
                      <div className="space-y-0.5">
                        {rooms.map((room) => (
                          <div
                            key={room.id}
                            onClick={() => onRoomSelect(room.id.toString())}
                            className={`
                              flex items-center justify-between py-0.5 px-3 rounded-lg cursor-pointer transition-colors
                              hover:bg-gray-100
                              ${selectedRoomId === room.id.toString() ? 'bg-blue-50 border-l-4 border-blue-500' : ''}
                              ${(room.open === false || room.archived === true) ? 'opacity-60' : ''}
                            `}
                          >
                            <div className="grid grid-cols-4 items-center gap-2 min-w-0 flex-1">
                              {renderRoomIcon(room)}
                              <div className="min-w-0 flex-1 col-span-3">
                                <div className="flex items-center justify-between">
                                  <div className={`font-medium text-sm truncate ${(room.open === false || room.archived === true) ? 'text-muted-foreground' : ''}`}>
                                    {room.display_name || room.fname || room.name || 'Unnamed Room'}
                                  </div>
                                  {room.msgs > 0 && (
                                    <span className="text-xs text-muted-foreground ml-2 flex-shrink-0">
                                      {room.msgs} msgs
                                    </span>
                                  )}
                                </div>
                                {room.description && (
                                  <div className="text-xs text-muted-foreground truncate">
                                    {room.description}
                                  </div>
                                )}
                              </div>
                            </div>
                          </div>
                        ))}
                      </div>
                    </div>
                  );
                };

                return (
                  <>
                    {renderRoomGroup('Teams', teams, <Users2 className="h-3 w-3 text-purple-500" />)}
                    {renderRoomGroup('Discussions', discussions, <MessageCircle className="h-3 w-3 text-orange-500" />)}
                    {renderRoomGroup('Channels', channels, <Hash className="h-3 w-3 text-blue-500" />)}
                    {renderRoomGroup('Direct Messages', directMessages, <MessageSquare className="h-3 w-3 text-green-500" />)}
                  </>
                );
              })()}
            </div>
          )}
        </div>
      </CardContent>
    </Card>
  );
}
117
inventory/src/components/chat/SearchResults.tsx
Normal file
@@ -0,0 +1,117 @@
import React from 'react';
import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
import { Badge } from '@/components/ui/badge';
import { Button } from '@/components/ui/button';
import { Hash, Lock, MessageSquare, X } from 'lucide-react';
import { convertEmojiShortcodes } from '@/utils/emojiUtils';

interface SearchResult {
  id: number;
  msg: string;
  ts: string;
  u: {
    username: string;
    name?: string;
  };
  room_id: number;
  room_name: string;
  room_fname: string;
  room_type: string;
}

interface SearchResultsProps {
  results: SearchResult[];
  query: string;
  onClose: () => void;
  onRoomSelect: (roomId: string) => void;
}

export function SearchResults({ results, query, onClose, onRoomSelect }: SearchResultsProps) {
  const getRoomIcon = (roomType: string) => {
    switch (roomType) {
      case 'c':
        return <Hash className="h-3 w-3 text-blue-500" />;
      case 'p':
        return <Lock className="h-3 w-3 text-orange-500" />;
      case 'd':
        return <MessageSquare className="h-3 w-3 text-green-500" />;
      default:
        return <Hash className="h-3 w-3 text-gray-500" />;
    }
  };

  const highlightText = (text: string, query: string) => {
    if (!query) return convertEmojiShortcodes(text);

    // First convert emoji shortcodes
    const textWithEmoji = convertEmojiShortcodes(text);

    const regex = new RegExp(`(${query})`, 'gi');
    const parts = textWithEmoji.split(regex);

    return parts.map((part, index) =>
      regex.test(part) ? (
        <span key={index} className="bg-yellow-200 font-medium">
          {part}
        </span>
      ) : (
        part
      )
    );
  };

  const formatTime = (timestamp: string) => {
    const date = new Date(timestamp);
    return date.toLocaleDateString() + ' ' + date.toLocaleTimeString([], {
      hour: '2-digit',
      minute: '2-digit'
    });
  };

  return (
    <Card className="absolute top-full left-0 right-0 z-10 mt-2 max-h-96 overflow-y-auto">
      <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
        <CardTitle className="text-sm">
          Search Results for "{query}" ({results.length})
        </CardTitle>
        <Button variant="ghost" size="sm" onClick={onClose}>
          <X className="h-4 w-4" />
        </Button>
      </CardHeader>
      <CardContent className="pt-0">
        {results.length === 0 ? (
          <p className="text-sm text-muted-foreground">No messages found matching your search.</p>
        ) : (
          <div className="space-y-3">
            {results.map((result) => (
              <div
                key={result.id}
                className="border rounded-lg p-3 hover:bg-gray-50 cursor-pointer"
                onClick={() => {
                  onRoomSelect(result.room_id.toString());
                  onClose();
                }}
              >
                <div className="flex items-center gap-2 mb-2">
                  {getRoomIcon(result.room_type)}
                  <span className="text-sm font-medium">
                    {result.room_fname || result.room_name || 'Unnamed Room'}
                  </span>
                  <Badge variant="outline" className="text-xs">
                    {result.u.name || result.u.username}
                  </Badge>
                  <span className="text-xs text-muted-foreground ml-auto">
                    {formatTime(result.ts)}
                  </span>
                </div>
                <div className="text-sm">
                  {highlightText(result.msg, query)}
                </div>
              </div>
            ))}
          </div>
        )}
      </CardContent>
    </Card>
  );
}
@@ -9,6 +9,7 @@ import {
   Plus,
   ShoppingBag,
   Truck,
+  MessageCircle,
 } from "lucide-react";
 import { IconCrystalBall } from "@tabler/icons-react";
 import {
@@ -81,6 +82,12 @@ const items = [
     icon: Plus,
     url: "/import",
     permission: "access:import"
+  },
+  {
+    title: "Chat",
+    icon: MessageCircle,
+    url: "/chat",
+    permission: "access:chat"
   }
 ];
383
inventory/src/components/purchase-orders/CategoryMetricsCard.tsx
Normal file
@@ -0,0 +1,383 @@
import { useState, useEffect } from "react";
import {
  Card,
  CardContent,
  CardHeader,
  CardTitle,
} from "../../components/ui/card";
import { Skeleton } from "../../components/ui/skeleton";
import { BarChart3, Loader2 } from "lucide-react";
import { Button } from "../../components/ui/button";
import {
  Dialog,
  DialogContent,
  DialogHeader,
  DialogTitle,
  DialogTrigger,
} from "../../components/ui/dialog";
import { PieChart, Pie, ResponsiveContainer, Cell, Sector } from "recharts";
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "../../components/ui/table";

// Add this constant for pie chart colors
const COLORS = [
  "#0088FE",
  "#00C49F",
  "#FFBB28",
  "#FF8042",
  "#8884D8",
  "#82CA9D",
  "#FFC658",
  "#FF7C43",
];

// The renderActiveShape function for pie charts
const renderActiveShape = (props: any) => {
  const {
    cx,
    cy,
    innerRadius,
    outerRadius,
    startAngle,
    endAngle,
    fill,
    category,
    total_spend,
  } = props;

  // Split category name into words and create lines of max 12 chars
  const words = category.split(" ");
  const lines: string[] = [];
  let currentLine = "";

  words.forEach((word: string) => {
    if ((currentLine + " " + word).length <= 12) {
      currentLine = currentLine ? `${currentLine} ${word}` : word;
    } else {
      if (currentLine) lines.push(currentLine);
      currentLine = word;
    }
  });
  if (currentLine) lines.push(currentLine);

  return (
    <g>
      <Sector
        cx={cx}
        cy={cy}
        innerRadius={innerRadius}
        outerRadius={outerRadius}
        startAngle={startAngle}
        endAngle={endAngle}
        fill={fill}
      />
      <Sector
        cx={cx}
        cy={cy}
        startAngle={startAngle}
        endAngle={endAngle}
        innerRadius={outerRadius - 1}
        outerRadius={outerRadius + 4}
        fill={fill}
      />
      {lines.map((line, i) => (
        <text
          key={i}
          x={cx}
          y={cy}
          dy={-20 + i * 16}
          textAnchor="middle"
          fill="#888888"
          className="text-xs"
        >
          {line}
        </text>
      ))}
      <text
        x={cx}
        y={cy}
        dy={lines.length * 16 - 10}
        textAnchor="middle"
        fill="#000000"
        className="text-base font-medium"
      >
        {`$${Number(total_spend).toLocaleString("en-US", {
          minimumFractionDigits: 2,
          maximumFractionDigits: 2,
        })}`}
      </text>
    </g>
  );
};

interface CategoryMetricsCardProps {
  loading: boolean;
  yearlyCategoryData: {
    category: string;
    unique_products?: number;
    total_spend: number;
    percentage?: number;
    avg_cost?: number;
    cost_variance?: number;
  }[];
  yearlyDataLoading: boolean;
}

export default function CategoryMetricsCard({
  loading,
  yearlyCategoryData,
  yearlyDataLoading,
}: CategoryMetricsCardProps) {
  const [costAnalysisOpen, setCostAnalysisOpen] = useState(false);
  const [activeSpendingIndex, setActiveSpendingIndex] = useState<
    number | undefined
  >();
  const [initialLoading, setInitialLoading] = useState(true);

  // Only show loading state on initial load, not during table refreshes
  useEffect(() => {
    if (yearlyCategoryData.length > 0 && !yearlyDataLoading) {
      setInitialLoading(false);
    }
  }, [yearlyCategoryData, yearlyDataLoading]);

  const formatNumber = (value: number) => {
    return value.toLocaleString("en-US", {
      minimumFractionDigits: 2,
      maximumFractionDigits: 2,
    });
  };

  const formatCurrency = (value: number) => {
    return `$${formatNumber(value)}`;
  };

  const formatPercent = (value: number) => {
    return (
      (value * 100).toLocaleString("en-US", {
        minimumFractionDigits: 1,
        maximumFractionDigits: 1,
      }) + "%"
    );
  };

  // Prepare spending chart data
  const prepareSpendingChartData = () => {
    if (!yearlyCategoryData.length) return [];

    // Make a copy to avoid modifying state directly
    const categoryArray = [...yearlyCategoryData];
    const totalSpend = categoryArray.reduce(
      (sum, cat) => sum + cat.total_spend,
      0
    );

    // Split into significant categories (>=1%) and others
    const significantCategories = categoryArray.filter(
      (cat) => cat.total_spend / totalSpend >= 0.01
    );

    const otherCategories = categoryArray.filter(
      (cat) => cat.total_spend / totalSpend < 0.01
    );

    let result = [...significantCategories];

    // Add "Other" category if needed
    if (otherCategories.length > 0) {
      const otherTotalSpend = otherCategories.reduce(
        (sum, cat) => sum + cat.total_spend,
        0
      );

      result.push({
        category: "Other",
        total_spend: otherTotalSpend,
        percentage: otherTotalSpend / totalSpend,
        unique_products: otherCategories.reduce(
          (sum, cat) => sum + (cat.unique_products || 0),
          0
        ),
        avg_cost:
          otherTotalSpend /
          otherCategories.reduce(
            (sum, cat) => sum + (cat.unique_products || 0),
            1
          ),
        cost_variance: 0,
      });
    }

    // Sort by spend amount descending
    return result.sort((a, b) => b.total_spend - a.total_spend);
  };

  // Cost analysis table component
  const CostAnalysisTable = () => {
    if (!yearlyCategoryData.length) {
      return yearlyDataLoading ? (
        <div className="flex justify-center p-4">
          <Loader2 className="h-8 w-8 animate-spin" />
        </div>
      ) : (
        <div className="text-center p-4 text-muted-foreground">
          No category data available for the past 12 months
        </div>
      );
    }

    return (
      <div>
        {yearlyDataLoading ? (
          <div className="flex justify-center p-4">
            <Loader2 className="h-8 w-8 animate-spin" />
          </div>
        ) : (
          <>
            <div className="text-sm font-medium mb-2 px-4 flex justify-between">
|
||||||
|
<span>
|
||||||
|
Showing received inventory by category for the past 12 months
|
||||||
|
</span>
|
||||||
|
<span>{yearlyCategoryData.length} categories found</span>
|
||||||
|
</div>
|
||||||
|
<div className="text-xs text-muted-foreground px-4 mb-2">
|
||||||
|
Note: items can be in multiple categories, so the sum of the
|
||||||
|
categories will not equal the total spend.
|
||||||
|
</div>
|
||||||
|
<Table>
|
||||||
|
<TableHeader>
|
||||||
|
<TableRow>
|
||||||
|
<TableHead>Category</TableHead>
|
||||||
|
<TableHead>Products</TableHead>
|
||||||
|
<TableHead>Avg. Cost</TableHead>
|
||||||
|
<TableHead>Price Variance</TableHead>
|
||||||
|
<TableHead>Total Spend</TableHead>
|
||||||
|
<TableHead>% of Total</TableHead>
|
||||||
|
</TableRow>
|
||||||
|
</TableHeader>
|
||||||
|
<TableBody>
|
||||||
|
{yearlyCategoryData.map((category) => {
|
||||||
|
// Calculate percentage of total spend
|
||||||
|
const totalSpendPercentage =
|
||||||
|
"percentage" in category &&
|
||||||
|
typeof category.percentage === "number"
|
||||||
|
? category.percentage
|
||||||
|
: yearlyCategoryData.reduce(
|
||||||
|
(sum, cat) => sum + cat.total_spend,
|
||||||
|
0
|
||||||
|
) > 0
|
||||||
|
? category.total_spend /
|
||||||
|
yearlyCategoryData.reduce(
|
||||||
|
(sum, cat) => sum + cat.total_spend,
|
||||||
|
0
|
||||||
|
)
|
||||||
|
: 0;
|
||||||
|
|
||||||
|
return (
|
||||||
|
<TableRow key={category.category}>
|
||||||
|
<TableCell className="font-medium">
|
||||||
|
{category.category || "Uncategorized"}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell>
|
||||||
|
{category.unique_products?.toLocaleString() || "N/A"}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell>
|
||||||
|
{category.avg_cost !== undefined
|
||||||
|
? formatCurrency(category.avg_cost)
|
||||||
|
: "N/A"}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell>
|
||||||
|
{category.cost_variance !== undefined
|
||||||
|
? parseFloat(
|
||||||
|
category.cost_variance.toFixed(2)
|
||||||
|
).toLocaleString()
|
||||||
|
: "N/A"}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell>
|
||||||
|
{formatCurrency(category.total_spend)}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell>
|
||||||
|
{formatPercent(totalSpendPercentage)}
|
||||||
|
</TableCell>
|
||||||
|
</TableRow>
|
||||||
|
);
|
||||||
|
})}
|
||||||
|
</TableBody>
|
||||||
|
</Table>
|
||||||
|
</>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Card>
|
||||||
|
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
|
||||||
|
<CardTitle className="text-sm font-medium">
|
||||||
|
Received by Category
|
||||||
|
</CardTitle>
|
||||||
|
<Dialog open={costAnalysisOpen} onOpenChange={setCostAnalysisOpen}>
|
||||||
|
<DialogTrigger asChild>
|
||||||
|
<Button variant="outline" disabled={initialLoading || loading}>
|
||||||
|
<BarChart3 className="h-4 w-4" />
|
||||||
|
</Button>
|
||||||
|
</DialogTrigger>
|
||||||
|
<DialogContent className="max-w-[90%] w-fit">
|
||||||
|
<DialogHeader>
|
||||||
|
<DialogTitle className="flex items-center gap-2">
|
||||||
|
<BarChart3 className="h-5 w-5" />
|
||||||
|
<span>Received Inventory by Category</span>
|
||||||
|
</DialogTitle>
|
||||||
|
</DialogHeader>
|
||||||
|
<div className="overflow-auto max-h-[70vh]">
|
||||||
|
<CostAnalysisTable />
|
||||||
|
</div>
|
||||||
|
</DialogContent>
|
||||||
|
</Dialog>
|
||||||
|
</CardHeader>
|
||||||
|
<CardContent>
|
||||||
|
{initialLoading || loading ? (
|
||||||
|
<div className="flex flex-col items-center justify-center h-[170px]">
|
||||||
|
<Skeleton className="h-[170px] w-[170px] rounded-full" />
|
||||||
|
</div>
|
||||||
|
) : (
|
||||||
|
<>
|
||||||
|
<div className="h-[170px] relative">
|
||||||
|
<ResponsiveContainer width="100%" height="100%">
|
||||||
|
<PieChart margin={{ top: 30, right: 0, left: 0, bottom: 30 }}>
|
||||||
|
<Pie
|
||||||
|
data={prepareSpendingChartData()}
|
||||||
|
dataKey="total_spend"
|
||||||
|
nameKey="category"
|
||||||
|
cx="50%"
|
||||||
|
cy="50%"
|
||||||
|
innerRadius={60}
|
||||||
|
outerRadius={80}
|
||||||
|
paddingAngle={1}
|
||||||
|
activeIndex={activeSpendingIndex}
|
||||||
|
activeShape={renderActiveShape}
|
||||||
|
onMouseEnter={(_, index) => setActiveSpendingIndex(index)}
|
||||||
|
onMouseLeave={() => setActiveSpendingIndex(undefined)}
|
||||||
|
>
|
||||||
|
{prepareSpendingChartData().map((entry, index) => (
|
||||||
|
<Cell
|
||||||
|
key={entry.category}
|
||||||
|
fill={COLORS[index % COLORS.length]}
|
||||||
|
/>
|
||||||
|
))}
|
||||||
|
</Pie>
|
||||||
|
</PieChart>
|
||||||
|
</ResponsiveContainer>
|
||||||
|
</div>
|
||||||
|
</>
|
||||||
|
)}
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
);
|
||||||
|
}
|
||||||
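The under-1% grouping inside `prepareSpendingChartData` can be checked in isolation. A minimal sketch, assuming a hypothetical standalone helper `groupSmallCategories` that mirrors the component's threshold-and-"Other"-bucket logic (the helper name and pared-down record shape are illustrative, not part of the commit):

```typescript
// Sketch of the <1% grouping used by prepareSpendingChartData.
// `groupSmallCategories` is a hypothetical extraction, not part of the component.
interface CategorySpend {
  category: string;
  total_spend: number;
}

function groupSmallCategories(
  categories: CategorySpend[],
  threshold = 0.01
): CategorySpend[] {
  const total = categories.reduce((sum, c) => sum + c.total_spend, 0);
  if (total === 0) return [];
  // Split into significant categories (>= threshold) and the rest
  const significant = categories.filter((c) => c.total_spend / total >= threshold);
  const others = categories.filter((c) => c.total_spend / total < threshold);
  const result = [...significant];
  if (others.length > 0) {
    // Fold the small slices into a single "Other" bucket so the pie stays legible
    result.push({
      category: "Other",
      total_spend: others.reduce((sum, c) => sum + c.total_spend, 0),
    });
  }
  // Sort descending by spend, matching the chart ordering
  return result.sort((a, b) => b.total_spend - a.total_spend);
}
```

Because the "Other" bucket carries the sum of the slices it replaces, total spend across the returned array still equals the input total.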
155
inventory/src/components/purchase-orders/FilterControls.tsx
Normal file
@@ -0,0 +1,155 @@
import { Input } from "../../components/ui/input";
import { Button } from "../../components/ui/button";
import {
  Select,
  SelectContent,
  SelectItem,
  SelectTrigger,
  SelectValue,
} from "../../components/ui/select";
import {
  PurchaseOrderStatus,
  getPurchaseOrderStatusLabel,
} from "../../types/status-codes";

interface FilterControlsProps {
  searchInput: string;
  setSearchInput: (value: string) => void;
  filterValues: {
    search: string;
    status: string;
    vendor: string;
    recordType: string;
  };
  handleStatusChange: (value: string) => void;
  handleVendorChange: (value: string) => void;
  handleRecordTypeChange: (value: string) => void;
  clearFilters: () => void;
  filterOptions: {
    vendors: string[];
    statuses: number[];
  };
  loading: boolean;
}

const STATUS_FILTER_OPTIONS = [
  { value: "all", label: "All Statuses" },
  {
    value: String(PurchaseOrderStatus.Created),
    label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Created),
  },
  {
    value: String(PurchaseOrderStatus.ElectronicallyReadySend),
    label: getPurchaseOrderStatusLabel(
      PurchaseOrderStatus.ElectronicallyReadySend
    ),
  },
  {
    value: String(PurchaseOrderStatus.Ordered),
    label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Ordered),
  },
  {
    value: String(PurchaseOrderStatus.ReceivingStarted),
    label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.ReceivingStarted),
  },
  {
    value: String(PurchaseOrderStatus.Done),
    label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Done),
  },
  {
    value: String(PurchaseOrderStatus.Canceled),
    label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Canceled),
  },
];

const RECORD_TYPE_FILTER_OPTIONS = [
  { value: "all", label: "All Records" },
  { value: "po_only", label: "PO Only" },
  { value: "po_with_receiving", label: "PO with Receiving" },
  { value: "receiving_only", label: "Receiving Only" },
];

export default function FilterControls({
  searchInput,
  setSearchInput,
  filterValues,
  handleStatusChange,
  handleVendorChange,
  handleRecordTypeChange,
  clearFilters,
  filterOptions,
  loading,
}: FilterControlsProps) {
  return (
    <div className="mb-4 flex flex-wrap items-center gap-4">
      <Input
        placeholder="Search orders..."
        value={searchInput}
        onChange={(e) => setSearchInput(e.target.value)}
        className="max-w-xs"
        disabled={loading}
      />
      <Select
        value={filterValues.status}
        onValueChange={handleStatusChange}
        disabled={loading}
      >
        <SelectTrigger className="w-[180px]">
          <SelectValue placeholder="Select status" />
        </SelectTrigger>
        <SelectContent>
          {STATUS_FILTER_OPTIONS.map((option) => (
            <SelectItem key={option.value} value={option.value}>
              {option.label}
            </SelectItem>
          ))}
        </SelectContent>
      </Select>
      <Select
        value={filterValues.vendor}
        onValueChange={handleVendorChange}
        disabled={loading}
      >
        <SelectTrigger className="w-[180px]">
          <SelectValue placeholder="Select supplier" />
        </SelectTrigger>
        <SelectContent>
          <SelectItem value="all">All Suppliers</SelectItem>
          {filterOptions?.vendors?.map((vendor) => (
            <SelectItem key={vendor} value={vendor}>
              {vendor}
            </SelectItem>
          ))}
        </SelectContent>
      </Select>
      <Select
        value={filterValues.recordType}
        onValueChange={handleRecordTypeChange}
        disabled={loading}
      >
        <SelectTrigger className="w-[180px]">
          <SelectValue placeholder="Record Type" />
        </SelectTrigger>
        <SelectContent>
          {RECORD_TYPE_FILTER_OPTIONS.map((option) => (
            <SelectItem key={option.value} value={option.value}>
              {option.label}
            </SelectItem>
          ))}
        </SelectContent>
      </Select>
      {(filterValues.search ||
        filterValues.status !== "all" ||
        filterValues.vendor !== "all" ||
        filterValues.recordType !== "all") && (
        <Button
          variant="outline"
          size="sm"
          onClick={clearFilters}
          disabled={loading}
          title="Clear filters"
          className="gap-1"
        >
          <span>Clear</span> ✕
        </Button>
      )}
    </div>
  );
}
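The condition gating the Clear button (any filter differing from its default) can be sketched as a pure predicate. `hasActiveFilters` below is a hypothetical extraction of that JSX expression, not a function in the commit:

```typescript
// Mirrors the inline JSX condition that decides whether to render the Clear button.
interface FilterValues {
  search: string;
  status: string;
  vendor: string;
  recordType: string;
}

function hasActiveFilters(f: FilterValues): boolean {
  // search is active when non-empty; the selects are active when not "all"
  return (
    f.search !== "" ||
    f.status !== "all" ||
    f.vendor !== "all" ||
    f.recordType !== "all"
  );
}
```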
122
inventory/src/components/purchase-orders/OrderMetricsCard.tsx
Normal file
@@ -0,0 +1,122 @@
import { useState, useEffect } from "react";
import {
  Card,
  CardContent,
  CardHeader,
  CardTitle,
} from "../../components/ui/card";
import { Skeleton } from "../../components/ui/skeleton";

type ReceivingStatus = {
  order_count: number;
  total_ordered: number;
  total_received: number;
  fulfillment_rate: number;
  total_value: number;
  avg_cost: number;
  avg_delivery_days?: number;
  max_delivery_days?: number;
};

interface OrderMetricsCardProps {
  summary: ReceivingStatus | null;
  loading: boolean;
}

export default function OrderMetricsCard({
  summary,
  loading,
}: OrderMetricsCardProps) {
  const [initialLoading, setInitialLoading] = useState(true);

  // Only show loading state on initial load, not during table refreshes
  useEffect(() => {
    if (summary) {
      setInitialLoading(false);
    }
  }, [summary]);

  const formatNumber = (value: number) => {
    return value.toLocaleString("en-US", {
      minimumFractionDigits: 2,
      maximumFractionDigits: 2,
    });
  };

  const formatCurrency = (value: number) => {
    return `$${formatNumber(value)}`;
  };

  const formatPercent = (value: number) => {
    return (
      (value * 100).toLocaleString("en-US", {
        minimumFractionDigits: 1,
        maximumFractionDigits: 1,
      }) + "%"
    );
  };

  return (
    <Card>
      <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
        <CardTitle className="text-sm font-medium">Order Metrics</CardTitle>
      </CardHeader>
      <CardContent>
        {initialLoading || loading ? (
          <div className="flex flex-col gap-2">
            {/* 5 rows of skeleton metrics */}
            {[...Array(5)].map((_, i) => (
              <div key={i} className="flex items-baseline justify-between">
                <Skeleton className="h-4 w-32" />
                <Skeleton className="h-6 w-16" />
              </div>
            ))}
          </div>
        ) : (
          <div className="flex flex-col gap-2">
            <div className="flex items-baseline justify-between">
              <p className="text-sm font-medium text-muted-foreground">
                Avg. Cost per PO
              </p>
              <p className="text-lg font-bold">
                {formatCurrency(summary?.avg_cost || 0)}
              </p>
            </div>
            <div className="flex items-baseline justify-between">
              <p className="text-sm font-medium text-muted-foreground">
                Overall Fulfillment Rate
              </p>
              <p className="text-lg font-bold">
                {formatPercent(summary?.fulfillment_rate || 0)}
              </p>
            </div>
            <div className="flex items-baseline justify-between">
              <p className="text-sm font-medium text-muted-foreground">
                Total Orders
              </p>
              <p className="text-lg font-bold">
                {summary?.order_count.toLocaleString() || 0}
              </p>
            </div>
            <div className="flex items-baseline justify-between">
              <p className="text-sm font-medium text-muted-foreground">
                Avg. Delivery Days
              </p>
              <p className="text-lg font-bold">
                {summary?.avg_delivery_days
                  ? summary.avg_delivery_days.toFixed(1)
                  : "N/A"}
              </p>
            </div>
            <div className="flex items-baseline justify-between">
              <p className="text-sm font-medium text-muted-foreground">
                Longest Delivery Days
              </p>
              <p className="text-lg font-bold">
                {summary?.max_delivery_days
                  ? summary.max_delivery_days.toFixed(0)
                  : "N/A"}
              </p>
            </div>
          </div>
        )}
      </CardContent>
    </Card>
  );
}
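The formatting helpers repeated across these cards are thin `toLocaleString` wrappers. Lifted out of the component, the same logic is:

```typescript
// Standalone versions of the cards' formatNumber/formatCurrency/formatPercent helpers.
function formatNumber(value: number): string {
  return value.toLocaleString("en-US", {
    minimumFractionDigits: 2,
    maximumFractionDigits: 2,
  });
}

function formatCurrency(value: number): string {
  return `$${formatNumber(value)}`;
}

function formatPercent(value: number): string {
  // Input is a ratio (0.125), output a one-decimal percentage string ("12.5%")
  return (
    (value * 100).toLocaleString("en-US", {
      minimumFractionDigits: 1,
      maximumFractionDigits: 1,
    }) + "%"
  );
}
```

Note that `fulfillment_rate` is treated as a ratio here, so a backend that already returns whole percentages would display 100x too large.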
140
inventory/src/components/purchase-orders/PaginationControls.tsx
Normal file
@@ -0,0 +1,140 @@
import {
  Pagination,
  PaginationContent,
  PaginationEllipsis,
  PaginationItem,
  PaginationLink,
  PaginationNext,
  PaginationPrevious,
} from "../../components/ui/pagination";

interface PaginationControlsProps {
  pagination: {
    total: number;
    pages: number;
    page: number;
    limit: number;
  };
  currentPage: number;
  onPageChange: (page: number) => void;
}

export default function PaginationControls({
  pagination,
  currentPage,
  onPageChange,
}: PaginationControlsProps) {
  // Generate pagination items
  const getPaginationItems = () => {
    const items = [];
    const totalPages = pagination.pages;

    // Always show first page
    if (totalPages > 0) {
      items.push(
        <PaginationItem key="first">
          <PaginationLink
            isActive={currentPage === 1}
            onClick={() => currentPage !== 1 && onPageChange(1)}
          >
            1
          </PaginationLink>
        </PaginationItem>
      );
    }

    // Add ellipsis if needed
    if (currentPage > 3) {
      items.push(
        <PaginationItem key="ellipsis-1">
          <PaginationEllipsis />
        </PaginationItem>
      );
    }

    // Add pages around current page
    const startPage = Math.max(2, currentPage - 1);
    const endPage = Math.min(totalPages - 1, currentPage + 1);

    for (let i = startPage; i <= endPage; i++) {
      if (i <= 1 || i >= totalPages) continue; // Skip first and last page as they're handled separately
      items.push(
        <PaginationItem key={i}>
          <PaginationLink
            isActive={currentPage === i}
            onClick={() => currentPage !== i && onPageChange(i)}
          >
            {i}
          </PaginationLink>
        </PaginationItem>
      );
    }

    // Add ellipsis if needed
    if (currentPage < totalPages - 2) {
      items.push(
        <PaginationItem key="ellipsis-2">
          <PaginationEllipsis />
        </PaginationItem>
      );
    }

    // Always show last page if there are multiple pages
    if (totalPages > 1) {
      items.push(
        <PaginationItem key="last">
          <PaginationLink
            isActive={currentPage === totalPages}
            onClick={() => currentPage !== totalPages && onPageChange(totalPages)}
          >
            {totalPages}
          </PaginationLink>
        </PaginationItem>
      );
    }

    return items;
  };

  if (pagination.pages <= 1) {
    return null;
  }

  return (
    <div className="flex justify-center mb-6">
      <Pagination>
        <PaginationContent>
          <PaginationItem>
            <PaginationPrevious
              href="#"
              onClick={(e) => {
                e.preventDefault();
                if (currentPage > 1) onPageChange(currentPage - 1);
              }}
              aria-disabled={currentPage === 1}
              className={currentPage === 1 ? "pointer-events-none opacity-50" : ""}
            />
          </PaginationItem>

          {getPaginationItems()}

          <PaginationItem>
            <PaginationNext
              href="#"
              onClick={(e) => {
                e.preventDefault();
                if (currentPage < pagination.pages) onPageChange(currentPage + 1);
              }}
              aria-disabled={currentPage === pagination.pages}
              className={
                currentPage === pagination.pages
                  ? "pointer-events-none opacity-50"
                  : ""
              }
            />
          </PaginationItem>
        </PaginationContent>
      </Pagination>
    </div>
  );
}
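The windowing in `getPaginationItems` (first page, optional ellipses, pages within one of the current page, last page) can be expressed as a pure function. `pageLabels` below is a hypothetical sketch that reproduces the same sequence as plain strings, with "..." standing in for the ellipsis item:

```typescript
// Pure sketch of getPaginationItems' windowing logic.
function pageLabels(currentPage: number, totalPages: number): string[] {
  const items: string[] = [];
  if (totalPages > 0) items.push("1"); // always show first page
  if (currentPage > 3) items.push("..."); // leading ellipsis
  // Window of pages around the current one
  const startPage = Math.max(2, currentPage - 1);
  const endPage = Math.min(totalPages - 1, currentPage + 1);
  for (let i = startPage; i <= endPage; i++) {
    if (i <= 1 || i >= totalPages) continue; // first/last handled separately
    items.push(String(i));
  }
  if (currentPage < totalPages - 2) items.push("..."); // trailing ellipsis
  if (totalPages > 1) items.push(String(totalPages)); // always show last page
  return items;
}
```

Keeping the windowing pure like this makes the edge cases (first page, last page, small page counts) easy to test without rendering.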
@@ -0,0 +1,241 @@
import React, { useState, useEffect } from "react";
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "../ui/table";
import { Skeleton } from "../ui/skeleton";

// Define the structure of purchase order items
interface PurchaseOrderItem {
  id: string | number;
  pid: string | number;
  product_name: string;
  sku: string;
  upc: string;
  ordered: number;
  received: number;
  po_cost_price: number;
  cost_each?: number; // For receiving items
  qty_each?: number; // For receiving items
  total_cost: number;
  receiving_status?: string;
}

interface PurchaseOrder {
  id: number | string;
  vendor_name: string;
  order_date: string | null;
  receiving_date: string | null;
  status: number;
  total_items: number;
  total_quantity: number;
  total_cost: number;
  total_received: number;
  fulfillment_rate: number;
  short_note: string | null;
  record_type: "po_only" | "po_with_receiving" | "receiving_only";
}

interface PurchaseOrderAccordionProps {
  purchaseOrder: PurchaseOrder;
  children: React.ReactNode;
  rowClassName?: string;
}

export default function PurchaseOrderAccordion({
  purchaseOrder,
  children,
  rowClassName,
}: PurchaseOrderAccordionProps) {
  const [isOpen, setIsOpen] = useState(false);
  const [items, setItems] = useState<PurchaseOrderItem[]>([]);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState<string | null>(null);

  // Clone the TableRow (children) and add the onClick handler and className
  const enhancedRow = React.cloneElement(children as React.ReactElement, {
    onClick: () => setIsOpen(!isOpen),
    className: `${
      (children as React.ReactElement).props.className || ""
    } cursor-pointer ${isOpen ? "bg-gray-100" : ""} ${rowClassName || ""}`.trim(),
    "data-state": isOpen ? "open" : "closed",
  });

  // Format currency
  const formatCurrency = (value: number) => {
    return `$${value.toLocaleString("en-US", {
      minimumFractionDigits: 2,
      maximumFractionDigits: 2,
    })}`;
  };

  useEffect(() => {
    // Only fetch items when the accordion is open
    if (!isOpen) return;

    const fetchItems = async () => {
      setLoading(true);
      setError(null);

      try {
        // Endpoint path will depend on the type of record
        const endpoint =
          purchaseOrder.record_type === "receiving_only"
            ? `/api/purchase-orders/receiving/${purchaseOrder.id}/items`
            : `/api/purchase-orders/${purchaseOrder.id}/items`;

        const response = await fetch(endpoint);

        if (!response.ok) {
          throw new Error(`Failed to fetch items: ${response.statusText}`);
        }

        const data = await response.json();
        setItems(data);
      } catch (err) {
        console.error("Error fetching purchase order items:", err);
        setError(err instanceof Error ? err.message : "Unknown error occurred");
      } finally {
        setLoading(false);
      }
    };

    fetchItems();
  }, [purchaseOrder.id, purchaseOrder.record_type, isOpen]);

  // Render purchase order items list
  const renderItemsList = () => {
    if (error) {
      return (
        <div className="p-4 text-red-500">
          Error loading items: {error}
        </div>
      );
    }

    return (
      <div className="max-h-[600px] overflow-y-auto bg-gray-50 rounded-md p-2">
        <Table className="w-full">
          <TableHeader className="bg-white sticky top-0 z-10">
            <TableRow>
              <TableHead className="w-[100px]">Item Number</TableHead>
              <TableHead className="w-auto">Product</TableHead>
              <TableHead className="w-[100px]">UPC</TableHead>
              <TableHead className="w-[80px] text-right">Ordered</TableHead>
              <TableHead className="w-[80px] text-right">Received</TableHead>
              <TableHead className="w-[100px] text-right">Unit Cost</TableHead>
              <TableHead className="w-[100px] text-right">Total Cost</TableHead>
              {purchaseOrder.record_type !== "po_only" && (
                <TableHead className="w-[120px] text-right">Status</TableHead>
              )}
            </TableRow>
          </TableHeader>
          <TableBody>
            {loading ? (
              // Loading skeleton
              Array(5)
                .fill(0)
                .map((_, i) => (
                  <TableRow key={`skeleton-${i}`}>
                    <TableCell><Skeleton className="h-5 w-full" /></TableCell>
                    <TableCell><Skeleton className="h-5 w-full" /></TableCell>
                    <TableCell><Skeleton className="h-5 w-full" /></TableCell>
                    <TableCell><Skeleton className="h-5 w-full" /></TableCell>
                    <TableCell><Skeleton className="h-5 w-full" /></TableCell>
                    <TableCell><Skeleton className="h-5 w-full" /></TableCell>
                    <TableCell><Skeleton className="h-5 w-full" /></TableCell>
                    {purchaseOrder.record_type !== "po_only" && (
                      <TableCell><Skeleton className="h-5 w-full" /></TableCell>
                    )}
                  </TableRow>
                ))
            ) : (
              items.map((item) => (
                <TableRow key={item.id} className="hover:bg-gray-100">
                  <TableCell>
                    <a
                      href={`https://backend.acherryontop.com/product/${item.pid}`}
                      target="_blank"
                      rel="noopener noreferrer"
                      className="hover:underline"
                      onClick={(e) => e.stopPropagation()}
                    >
                      {item.sku}
                    </a>
                  </TableCell>
                  <TableCell className="font-medium">
                    <a
                      href={`https://backend.acherryontop.com/product/${item.pid}`}
                      target="_blank"
                      rel="noopener noreferrer"
                      className="hover:underline"
                      onClick={(e) => e.stopPropagation()}
                    >
                      {item.product_name}
                    </a>
                  </TableCell>
                  <TableCell>
                    <a
                      href={`https://backend.acherryontop.com/product/${item.pid}`}
                      target="_blank"
                      rel="noopener noreferrer"
                      className="hover:underline"
                      onClick={(e) => e.stopPropagation()}
                    >
                      {item.upc}
                    </a>
                  </TableCell>
                  <TableCell className="text-right">{item.ordered}</TableCell>
                  <TableCell className="text-right">
                    {item.received || 0}
                  </TableCell>
                  <TableCell className="text-right">
                    {formatCurrency(item.po_cost_price || item.cost_each || 0)}
                  </TableCell>
                  <TableCell className="text-right">
                    {formatCurrency(item.total_cost)}
                  </TableCell>
                  {purchaseOrder.record_type !== "po_only" && (
                    <TableCell className="text-right">
                      {item.receiving_status || "Unknown"}
                    </TableCell>
                  )}
                </TableRow>
              ))
            )}

            {!loading && items.length === 0 && (
              <TableRow>
                <TableCell
                  colSpan={purchaseOrder.record_type === "po_only" ? 7 : 8}
                  className="text-center py-4 text-muted-foreground"
                >
                  No items found for this order
                </TableCell>
              </TableRow>
            )}
          </TableBody>
        </Table>
      </div>
    );
  };

  return (
    <>
      {/* First render the row which will serve as the trigger */}
      {enhancedRow}

      {/* Then render the accordion content conditionally if open */}
      {isOpen && (
        <TableRow className="p-0 border-0">
          <TableCell colSpan={12} className="p-0 border-0">
            <div className="pt-2 pb-4 px-4 bg-gray-50 border-t border-b">
              <div className="mb-2 text-sm text-muted-foreground">
                {purchaseOrder.total_items} product
                {purchaseOrder.total_items !== 1 ? "s" : ""} in this{" "}
                {purchaseOrder.record_type === "receiving_only"
                  ? "receiving"
                  : "purchase order"}
              </div>
              {renderItemsList()}
            </div>
          </TableCell>
        </TableRow>
      )}
    </>
  );
}
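The accordion's fetch chooses its endpoint from `record_type` alone. A one-line sketch of that routing, as a hypothetical standalone helper:

```typescript
// Mirrors the accordion's endpoint selection for fetching line items.
function itemsEndpoint(id: number | string, recordType: string): string {
  return recordType === "receiving_only"
    ? `/api/purchase-orders/receiving/${id}/items`
    : `/api/purchase-orders/${id}/items`;
}
```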
436
inventory/src/components/purchase-orders/PurchaseOrdersTable.tsx
Normal file
@@ -0,0 +1,436 @@
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "../ui/table";
import { Badge } from "../ui/badge";
import { Button } from "../ui/button";
import { Skeleton } from "../ui/skeleton";
import { FileText } from "lucide-react";
import {
  Tooltip,
  TooltipContent,
  TooltipProvider,
  TooltipTrigger,
} from "../ui/tooltip";
import {
  getPurchaseOrderStatusLabel,
  getReceivingStatusLabel,
  getPurchaseOrderStatusVariant,
  getReceivingStatusVariant,
} from "../../types/status-codes";
import {
  Card,
  CardContent,
  CardHeader,
  CardTitle,
} from "../ui/card";
import PurchaseOrderAccordion from "./PurchaseOrderAccordion";

interface PurchaseOrder {
  id: number | string;
  vendor_name: string;
  order_date: string | null;
  receiving_date: string | null;
  status: number;
  total_items: number;
  total_quantity: number;
  total_cost: number;
  total_received: number;
  fulfillment_rate: number;
  short_note: string | null;
  record_type: "po_only" | "po_with_receiving" | "receiving_only";
}

interface PurchaseOrdersTableProps {
  purchaseOrders: PurchaseOrder[];
  loading: boolean;
  summary: { order_count: number } | null;
  sortColumn: string;
  sortDirection: "asc" | "desc";
  handleSort: (column: string) => void;
}

export default function PurchaseOrdersTable({
  purchaseOrders,
  loading,
  summary,
  sortColumn,
  sortDirection,
  handleSort,
}: PurchaseOrdersTableProps) {
  // Helper functions
  const getRecordTypeIndicator = (recordType: string) => {
    switch (recordType) {
      case "po_with_receiving":
        return (
          <Badge
            variant="outline"
            className="flex items-center justify-center border-green-500 text-green-700 bg-green-50 px-0 text-xs w-[85px]"
          >
            Received PO
          </Badge>
        );
      case "po_only":
        return (
          <Badge
            variant="outline"
            className="flex items-center justify-center border-blue-500 text-blue-700 bg-blue-50 px-0 text-xs w-[85px]"
          >
            PO
          </Badge>
        );
      case "receiving_only":
        return (
          <Badge
            variant="outline"
            className="flex items-center justify-center border-amber-500 text-amber-700 bg-amber-50 px-0 text-xs w-[85px]"
          >
            Receiving
          </Badge>
        );
      default:
        return (
          <Badge
            variant="outline"
            className="flex items-center justify-center border-gray-500 text-gray-700 bg-gray-50 px-0 text-xs w-[85px]"
          >
            {recordType || "Unknown"}
          </Badge>
        );
    }
  };

  const getStatusBadge = (status: number, recordType: string) => {
    if (recordType === "receiving_only") {
      return (
        <Badge
          className="w-[115px] flex items-center text-xs justify-center px-1"
|
||||||
|
variant={getReceivingStatusVariant(status)}
|
||||||
|
>
|
||||||
|
{getReceivingStatusLabel(status)}
|
||||||
|
</Badge>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Badge
|
||||||
|
className="w-[115px] flex items-center text-xs justify-center px-1"
|
||||||
|
variant={getPurchaseOrderStatusVariant(status)}
|
||||||
|
>
|
||||||
|
{getPurchaseOrderStatusLabel(status)}
|
||||||
|
</Badge>
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
const formatNumber = (value: number) => {
|
||||||
|
return value.toLocaleString("en-US", {
|
||||||
|
minimumFractionDigits: 2,
|
||||||
|
maximumFractionDigits: 2,
|
||||||
|
});
|
||||||
|
};
|
||||||
|
|
||||||
|
const formatCurrency = (value: number) => {
|
||||||
|
return `$${formatNumber(value)}`;
|
||||||
|
};
|
||||||
|
|
||||||
|
const formatPercent = (value: number) => {
|
||||||
|
return (
|
||||||
|
(value * 100).toLocaleString("en-US", {
|
||||||
|
minimumFractionDigits: 1,
|
||||||
|
maximumFractionDigits: 1,
|
||||||
|
}) + "%"
|
||||||
|
);
|
||||||
|
};
|
||||||
|
|
||||||
|
// Update sort indicators in table headers
|
||||||
|
const getSortIndicator = (column: string) => {
|
||||||
|
if (sortColumn !== column) return null;
|
||||||
|
return sortDirection === "asc" ? " ↑" : " ↓";
|
||||||
|
};
|
||||||
|
|
||||||
|
return (
|
||||||
|
<Card className="mb-6">
|
||||||
|
<CardHeader className="flex flex-row items-center justify-between">
|
||||||
|
<CardTitle>Purchase Orders & Receivings</CardTitle>
|
||||||
|
<div className="text-sm text-muted-foreground">
|
||||||
|
{loading ? (
|
||||||
|
<Skeleton className="h-4 w-24" />
|
||||||
|
) : (
|
||||||
|
`${summary?.order_count.toLocaleString()} orders`
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
</CardHeader>
|
||||||
|
<CardContent>
|
||||||
|
<Table
|
||||||
|
className="table-fixed"
|
||||||
|
style={{ tableLayout: "fixed", width: "100%"}}
|
||||||
|
>
|
||||||
|
<TableHeader>
|
||||||
|
<TableRow>
|
||||||
|
<TableHead className="w-[100px] text-center">Type</TableHead>
|
||||||
|
<TableHead className="w-[60px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("id")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
ID{getSortIndicator("id")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[140px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("vendor_name")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
Supplier{getSortIndicator("vendor_name")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[115px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("status")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
Status{getSortIndicator("status")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[150px] text-center">Note</TableHead>
|
||||||
|
<TableHead className="w-[90px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("total_cost")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
Total Cost{getSortIndicator("total_cost")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[70px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("total_items")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
Products{getSortIndicator("total_items")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[90px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("order_date")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
Order Date{getSortIndicator("order_date")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[90px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("receiving_date")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
Rec'd Date{getSortIndicator("receiving_date")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[70px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("total_quantity")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
Ordered{getSortIndicator("total_quantity")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[80px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("total_received")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
Received{getSortIndicator("total_received")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
<TableHead className="w-[80px] text-center">
|
||||||
|
<Button
|
||||||
|
className="w-full"
|
||||||
|
variant="ghost"
|
||||||
|
onClick={() => !loading && handleSort("fulfillment_rate")}
|
||||||
|
disabled={loading}
|
||||||
|
>
|
||||||
|
% Fulfilled{getSortIndicator("fulfillment_rate")}
|
||||||
|
</Button>
|
||||||
|
</TableHead>
|
||||||
|
</TableRow>
|
||||||
|
</TableHeader>
|
||||||
|
<TableBody>
|
||||||
|
{loading ? (
|
||||||
|
// Skeleton rows for loading state
|
||||||
|
Array(50)
|
||||||
|
.fill(0)
|
||||||
|
.map((_, index) => (
|
||||||
|
<TableRow key={`skeleton-${index}`}>
|
||||||
|
<TableCell className="w-[100px]">
|
||||||
|
<Skeleton className="h-6 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[60px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[140px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[115px]">
|
||||||
|
<Skeleton className="h-6 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[150px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[90px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[70px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[90px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[90px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[70px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[80px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="w-[80px]">
|
||||||
|
<Skeleton className="h-5 w-full" />
|
||||||
|
</TableCell>
|
||||||
|
</TableRow>
|
||||||
|
))
|
||||||
|
) : purchaseOrders.length > 0 ? (
|
||||||
|
purchaseOrders.map((po) => {
|
||||||
|
// Determine row styling based on record type
|
||||||
|
let rowClassName = "border-l-4 border-l-gray-300"; // Default
|
||||||
|
|
||||||
|
if (po.record_type === "po_with_receiving") {
|
||||||
|
rowClassName = "border-l-4 border-l-green-500";
|
||||||
|
} else if (po.record_type === "po_only") {
|
||||||
|
rowClassName = "border-l-4 border-l-blue-500";
|
||||||
|
} else if (po.record_type === "receiving_only") {
|
||||||
|
rowClassName = "border-l-4 border-l-amber-500";
|
||||||
|
}
|
||||||
|
return (
|
||||||
|
<PurchaseOrderAccordion
|
||||||
|
key={`${po.id}-${po.record_type}`}
|
||||||
|
purchaseOrder={po}
|
||||||
|
rowClassName={rowClassName}
|
||||||
|
>
|
||||||
|
<TableRow>
|
||||||
|
<TableCell className="text-center">
|
||||||
|
{getRecordTypeIndicator(po.record_type)}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="font-semibold text-center">
|
||||||
|
<a
|
||||||
|
href={po.record_type === "po_only"
|
||||||
|
? `https://backend.acherryontop.com/po/edit/${po.id}`
|
||||||
|
: `https://backend.acherryontop.com/receiving/edit/${po.id}`}
|
||||||
|
target="_blank"
|
||||||
|
rel="noopener noreferrer"
|
||||||
|
className="hover:underline"
|
||||||
|
>
|
||||||
|
{po.id}
|
||||||
|
</a>
|
||||||
|
</TableCell>
|
||||||
|
<TableCell>{po.vendor_name}</TableCell>
|
||||||
|
<TableCell>
|
||||||
|
{getStatusBadge(po.status, po.record_type)}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="truncate text-center">
|
||||||
|
{po.short_note ? (
|
||||||
|
<TooltipProvider>
|
||||||
|
<Tooltip>
|
||||||
|
<TooltipTrigger className="text-left flex items-center gap-1">
|
||||||
|
<FileText className="h-3 w-3" />
|
||||||
|
<span className="truncate">
|
||||||
|
{po.short_note}
|
||||||
|
</span>
|
||||||
|
</TooltipTrigger>
|
||||||
|
<TooltipContent>
|
||||||
|
<p>{po.short_note}</p>
|
||||||
|
</TooltipContent>
|
||||||
|
</Tooltip>
|
||||||
|
</TooltipProvider>
|
||||||
|
) : (
|
||||||
|
""
|
||||||
|
)}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell>{formatCurrency(po.total_cost)}</TableCell>
|
||||||
|
<TableCell className="text-center">{po.total_items.toLocaleString()}</TableCell>
|
||||||
|
<TableCell className="text-center">
|
||||||
|
{po.order_date
|
||||||
|
? new Date(po.order_date).toLocaleDateString(
|
||||||
|
"en-US",
|
||||||
|
{
|
||||||
|
month: "numeric",
|
||||||
|
day: "numeric",
|
||||||
|
year: "numeric",
|
||||||
|
}
|
||||||
|
)
|
||||||
|
: "-"}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="text-center">
|
||||||
|
{po.receiving_date
|
||||||
|
? new Date(po.receiving_date).toLocaleDateString(
|
||||||
|
"en-US",
|
||||||
|
{
|
||||||
|
month: "numeric",
|
||||||
|
day: "numeric",
|
||||||
|
year: "numeric",
|
||||||
|
}
|
||||||
|
)
|
||||||
|
: "-"}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="text-center">
|
||||||
|
{po.record_type === "receiving_only" ? "-" : po.total_quantity.toLocaleString()}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="text-center">
|
||||||
|
{po.record_type === "po_only" ? "-" : po.total_received.toLocaleString()}
|
||||||
|
</TableCell>
|
||||||
|
<TableCell className="text-center" >
|
||||||
|
{po.record_type === "po_with_receiving"
|
||||||
|
? (po.fulfillment_rate === null ? "N/A" : formatPercent(po.fulfillment_rate))
|
||||||
|
: "-"}
|
||||||
|
</TableCell>
|
||||||
|
</TableRow>
|
||||||
|
</PurchaseOrderAccordion>
|
||||||
|
);
|
||||||
|
})
|
||||||
|
) : (
|
||||||
|
<TableRow>
|
||||||
|
<TableCell
|
||||||
|
colSpan={12}
|
||||||
|
className="text-center text-muted-foreground"
|
||||||
|
>
|
||||||
|
No purchase orders found
|
||||||
|
</TableCell>
|
||||||
|
</TableRow>
|
||||||
|
)}
|
||||||
|
</TableBody>
|
||||||
|
</Table>
|
||||||
|
</CardContent>
|
||||||
|
</Card>
|
||||||
|
);
|
||||||
|
}
|
||||||
354  inventory/src/components/purchase-orders/VendorMetricsCard.tsx  Normal file
@@ -0,0 +1,354 @@
import { useState, useEffect } from "react";
import {
  Card,
  CardContent,
  CardHeader,
  CardTitle,
} from "../../components/ui/card";
import { Skeleton } from "../../components/ui/skeleton";
import { BarChart3, Loader2 } from "lucide-react";
import { Button } from "../../components/ui/button";
import {
  Dialog,
  DialogContent,
  DialogHeader,
  DialogTitle,
  DialogTrigger,
} from "../../components/ui/dialog";
import { PieChart, Pie, ResponsiveContainer, Cell, Sector } from "recharts";
import {
  Table,
  TableBody,
  TableCell,
  TableHead,
  TableHeader,
  TableRow,
} from "../../components/ui/table";

// Add this constant for pie chart colors
const COLORS = [
  "#0088FE",
  "#00C49F",
  "#FFBB28",
  "#FF8042",
  "#8884D8",
  "#82CA9D",
  "#FFC658",
  "#FF7C43",
];

// The renderActiveShape function for pie charts
const renderActiveShape = (props: any) => {
  const {
    cx,
    cy,
    innerRadius,
    outerRadius,
    startAngle,
    endAngle,
    fill,
    category,
    total_spend,
  } = props;

  // Split category name into words and create lines of max 12 chars
  const words = category.split(" ");
  const lines: string[] = [];
  let currentLine = "";

  words.forEach((word: string) => {
    if ((currentLine + " " + word).length <= 12) {
      currentLine = currentLine ? `${currentLine} ${word}` : word;
    } else {
      if (currentLine) lines.push(currentLine);
      currentLine = word;
    }
  });
  if (currentLine) lines.push(currentLine);

  return (
    <g>
      <Sector
        cx={cx}
        cy={cy}
        innerRadius={innerRadius}
        outerRadius={outerRadius}
        startAngle={startAngle}
        endAngle={endAngle}
        fill={fill}
      />
      <Sector
        cx={cx}
        cy={cy}
        startAngle={startAngle}
        endAngle={endAngle}
        innerRadius={outerRadius - 1}
        outerRadius={outerRadius + 4}
        fill={fill}
      />
      {lines.map((line, i) => (
        <text
          key={i}
          x={cx}
          y={cy}
          dy={-20 + i * 16}
          textAnchor="middle"
          fill="#888888"
          className="text-xs"
        >
          {line}
        </text>
      ))}
      <text
        x={cx}
        y={cy}
        dy={lines.length * 16 - 10}
        textAnchor="middle"
        fill="#000000"
        className="text-base font-medium"
      >
        {`$${Number(total_spend).toLocaleString("en-US", {
          minimumFractionDigits: 2,
          maximumFractionDigits: 2,
        })}`}
      </text>
    </g>
  );
};

interface VendorMetricsCardProps {
  loading: boolean;
  yearlyVendorData: {
    vendor: string;
    orders: number;
    total_spend: number;
    percentage?: number;
  }[];
  yearlyDataLoading: boolean;
}

export default function VendorMetricsCard({
  loading,
  yearlyVendorData,
  yearlyDataLoading,
}: VendorMetricsCardProps) {
  const [vendorAnalysisOpen, setVendorAnalysisOpen] = useState(false);
  const [activeVendorIndex, setActiveVendorIndex] = useState<
    number | undefined
  >();
  const [initialLoading, setInitialLoading] = useState(true);

  // Only show loading state on initial load, not during table refreshes
  useEffect(() => {
    if (yearlyVendorData.length > 0 && !yearlyDataLoading) {
      setInitialLoading(false);
    }
  }, [yearlyVendorData, yearlyDataLoading]);

  const formatNumber = (value: number) => {
    return value.toLocaleString("en-US", {
      minimumFractionDigits: 2,
      maximumFractionDigits: 2,
    });
  };

  const formatCurrency = (value: number) => {
    return `$${formatNumber(value)}`;
  };

  const formatPercent = (value: number) => {
    return (
      (value * 100).toLocaleString("en-US", {
        minimumFractionDigits: 1,
        maximumFractionDigits: 1,
      }) + "%"
    );
  };

  // Prepare vendor chart data
  const prepareVendorChartData = () => {
    if (!yearlyVendorData.length) return [];

    // Make a copy to avoid modifying state directly
    const vendorArray = [...yearlyVendorData];
    const totalSpend = vendorArray.reduce(
      (sum, vendor) => sum + vendor.total_spend,
      0
    );

    // Split into significant vendors (>=1%) and others
    const significantVendors = vendorArray.filter(
      (vendor) => vendor.total_spend / totalSpend >= 0.01
    );

    const otherVendors = vendorArray.filter(
      (vendor) => vendor.total_spend / totalSpend < 0.01
    );

    let result = [...significantVendors];

    // Add "Other" category if needed
    if (otherVendors.length > 0) {
      const otherTotalSpend = otherVendors.reduce(
        (sum, vendor) => sum + vendor.total_spend,
        0
      );

      result.push({
        vendor: "Other Vendors",
        total_spend: otherTotalSpend,
        percentage: otherTotalSpend / totalSpend,
        orders: otherVendors.reduce((sum, vendor) => sum + vendor.orders, 0),
      });
    }

    // Sort by spend amount descending
    return result.sort((a, b) => b.total_spend - a.total_spend);
  };

  // Get all vendors for table
  const getAllVendorsForTable = () => {
    if (!yearlyVendorData.length) return [];
    return [...yearlyVendorData].sort((a, b) => b.total_spend - a.total_spend);
  };

  // Vendor analysis table component
  const VendorAnalysisTable = () => {
    const vendorData = getAllVendorsForTable();

    if (!vendorData.length) {
      return yearlyDataLoading ? (
        <div className="flex justify-center p-4">
          <Loader2 className="h-8 w-8 animate-spin" />
        </div>
      ) : (
        <div className="text-center p-4 text-muted-foreground">
          No supplier data available for the past 12 months
        </div>
      );
    }

    return (
      <div>
        {yearlyDataLoading ? (
          <div className="flex justify-center p-4">
            <Loader2 className="h-8 w-8 animate-spin" />
          </div>
        ) : (
          <>
            <div className="text-sm font-medium mb-2 flex justify-between items-center px-4">
              <span>
                Showing received inventory by supplier for the past 12 months
              </span>
              <span>{vendorData.length} suppliers found</span>
            </div>
            <Table>
              <TableHeader>
                <TableRow>
                  <TableHead>Supplier</TableHead>
                  <TableHead>Orders</TableHead>
                  <TableHead>Total Spend</TableHead>
                  <TableHead>% of Total</TableHead>
                  <TableHead>Avg. Order Value</TableHead>
                </TableRow>
              </TableHeader>
              <TableBody>
                {vendorData.map((vendor) => {
                  return (
                    <TableRow key={vendor.vendor}>
                      <TableCell className="font-medium">
                        {vendor.vendor}
                      </TableCell>
                      <TableCell>{vendor.orders.toLocaleString()}</TableCell>
                      <TableCell>
                        {formatCurrency(vendor.total_spend)}
                      </TableCell>
                      <TableCell>
                        {formatPercent(vendor.percentage || 0)}
                      </TableCell>
                      <TableCell>
                        {formatCurrency(
                          vendor.orders ? vendor.total_spend / vendor.orders : 0
                        )}
                      </TableCell>
                    </TableRow>
                  );
                })}
              </TableBody>
            </Table>
          </>
        )}
      </div>
    );
  };

  return (
    <Card>
      <CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
        <CardTitle className="text-sm font-medium">
          Received by Supplier
        </CardTitle>
        <Dialog
          open={vendorAnalysisOpen}
          onOpenChange={setVendorAnalysisOpen}
        >
          <DialogTrigger asChild>
            <Button variant="outline" disabled={initialLoading || loading}>
              <BarChart3 className="h-4 w-4" />
            </Button>
          </DialogTrigger>
          <DialogContent className="max-w-[90%] w-fit">
            <DialogHeader>
              <DialogTitle className="flex items-center gap-2">
                <BarChart3 className="h-5 w-5" />
                <span>Received Inventory by Supplier</span>
              </DialogTitle>
            </DialogHeader>
            <div className="overflow-auto max-h-[70vh]">
              <VendorAnalysisTable />
            </div>
          </DialogContent>
        </Dialog>
      </CardHeader>
      <CardContent>
        {initialLoading || loading ? (
          <div className="flex flex-col items-center justify-center h-[170px]">
            <Skeleton className="h-[170px] w-[170px] rounded-full" />
          </div>
        ) : (
          <>
            <div className="h-[170px] relative">
              <ResponsiveContainer width="100%" height="100%">
                <PieChart>
                  <Pie
                    data={prepareVendorChartData()}
                    dataKey="total_spend"
                    nameKey="vendor"
                    cx="50%"
                    cy="50%"
                    innerRadius={60}
                    outerRadius={80}
                    paddingAngle={1}
                    activeIndex={activeVendorIndex}
                    activeShape={(props: any) =>
                      renderActiveShape({ ...props, category: props.vendor })
                    }
                    onMouseEnter={(_, index) => setActiveVendorIndex(index)}
                    onMouseLeave={() => setActiveVendorIndex(undefined)}
                  >
                    {prepareVendorChartData().map((entry, index) => (
                      <Cell
                        key={entry.vendor}
                        fill={COLORS[index % COLORS.length]}
                      />
                    ))}
                  </Pie>
                </PieChart>
              </ResponsiveContainer>
            </div>
          </>
        )}
      </CardContent>
    </Card>
  );
}
@@ -88,15 +88,12 @@ interface TableSkeletonProps {
|
|||||||
|
|
||||||
const TableSkeleton = ({ rows = 5, useAccordion = false }: TableSkeletonProps) => {
|
const TableSkeleton = ({ rows = 5, useAccordion = false }: TableSkeletonProps) => {
|
||||||
return (
|
return (
|
||||||
<Table>
|
<>
|
||||||
<TableBody>
|
|
||||||
{Array.from({ length: rows }).map((_, rowIndex) => (
|
{Array.from({ length: rows }).map((_, rowIndex) => (
|
||||||
<TableRow key={rowIndex} className="hover:bg-transparent">
|
<TableRow key={rowIndex} className="hover:bg-transparent">
|
||||||
<TableCell className="w-full p-0">
|
<TableCell className="w-full p-0">
|
||||||
{useAccordion ? (
|
{useAccordion ? (
|
||||||
<Accordion type="single" collapsible>
|
<div className="px-4 py-2 cursor-default">
|
||||||
<AccordionItem value={`skeleton-${rowIndex}`} className="border-0">
|
|
||||||
<AccordionTrigger className="px-4 py-2 cursor-default">
|
|
||||||
<div className="flex justify-between items-center w-full pr-4">
|
<div className="flex justify-between items-center w-full pr-4">
|
||||||
<div className="w-[50px]">
|
<div className="w-[50px]">
|
||||||
<Skeleton className="h-4 w-full" />
|
<Skeleton className="h-4 w-full" />
|
||||||
@@ -111,9 +108,7 @@ const TableSkeleton = ({ rows = 5, useAccordion = false }: TableSkeletonProps) =
|
|||||||
<Skeleton className="h-4 w-full" />
|
<Skeleton className="h-4 w-full" />
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</AccordionTrigger>
|
</div>
|
||||||
</AccordionItem>
|
|
||||||
</Accordion>
|
|
||||||
) : (
|
) : (
|
||||||
<div className="flex justify-between px-4 py-2">
|
<div className="flex justify-between px-4 py-2">
|
||||||
<Skeleton className="h-4 w-[180px]" />
|
<Skeleton className="h-4 w-[180px]" />
|
||||||
@@ -123,8 +118,7 @@ const TableSkeleton = ({ rows = 5, useAccordion = false }: TableSkeletonProps) =
|
|||||||
</TableCell>
|
</TableCell>
|
||||||
</TableRow>
|
</TableRow>
|
||||||
))}
|
))}
|
||||||
</TableBody>
|
</>
|
||||||
</Table>
|
|
||||||
);
|
);
|
||||||
};
|
};
|
||||||
|
|
||||||
@@ -172,8 +166,10 @@ export function DataManagement() {
|
|||||||
credentials: "include",
|
credentials: "include",
|
||||||
});
|
});
|
||||||
|
|
||||||
|
console.log("Status check response:", response.status, response.statusText);
|
||||||
|
|
||||||
if (!response.ok) {
|
if (!response.ok) {
|
||||||
throw new Error("Failed to check active process status");
|
throw new Error(`Failed to check active process status: ${response.status} ${response.statusText}`);
|
||||||
}
|
}
|
||||||
|
|
||||||
const data = await response.json();
|
const data = await response.json();
|
||||||
@@ -185,7 +181,8 @@ export function DataManagement() {
|
|||||||
// Determine if it's a reset or update based on the progress data
|
// Determine if it's a reset or update based on the progress data
|
||||||
const isReset =
|
const isReset =
|
||||||
data.progress.operation?.includes("reset") ||
|
data.progress.operation?.includes("reset") ||
|
||||||
data.progress.operation?.includes("Reset");
|
data.progress.operation?.includes("Reset") ||
|
||||||
|
data.progress.type === "reset";
|
||||||
|
|
||||||
// Set the appropriate state
|
// Set the appropriate state
|
||||||
if (isReset) {
|
if (isReset) {
|
||||||
@@ -209,6 +206,8 @@ export function DataManagement() {
|
|||||||
}
|
}
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
console.error("Error checking for active processes:", error);
|
console.error("Error checking for active processes:", error);
|
||||||
|
// Don't toast error for status check since this happens on every load
|
||||||
|
// The main data fetch will handle showing errors to the user
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
@@ -508,6 +507,8 @@ export function DataManagement() {
|
|||||||
if (shouldSetLoading) setIsLoading(true);
|
if (shouldSetLoading) setIsLoading(true);
|
||||||
setHasError(false);
|
setHasError(false);
|
||||||
|
|
||||||
|
console.log("Fetching history data...");
|
||||||
|
|
||||||
const [importRes, calcRes, moduleRes, tableRes, tableCountsRes] = await Promise.all([
|
const [importRes, calcRes, moduleRes, tableRes, tableCountsRes] = await Promise.all([
|
||||||
fetch(`${config.apiUrl}/csv/history/import`, { credentials: 'include' }),
|
fetch(`${config.apiUrl}/csv/history/import`, { credentials: 'include' }),
|
||||||
fetch(`${config.apiUrl}/csv/history/calculate`, { credentials: 'include' }),
|
fetch(`${config.apiUrl}/csv/history/calculate`, { credentials: 'include' }),
|
||||||
@@ -516,8 +517,22 @@ export function DataManagement() {
|
|||||||
fetch(`${config.apiUrl}/csv/status/table-counts`, { credentials: 'include' }),
|
fetch(`${config.apiUrl}/csv/status/table-counts`, { credentials: 'include' }),
|
||||||
]);
|
]);
|
||||||
|
|
||||||
|
console.log("Fetch responses:", {
|
||||||
|
import: importRes.status,
|
||||||
|
calc: calcRes.status,
|
||||||
|
modules: moduleRes.status,
|
||||||
|
tables: tableRes.status,
|
||||||
|
tableCounts: tableCountsRes.status
|
||||||
|
});
|
||||||
|
|
||||||
if (!importRes.ok || !calcRes.ok || !moduleRes.ok || !tableRes.ok || !tableCountsRes.ok) {
|
if (!importRes.ok || !calcRes.ok || !moduleRes.ok || !tableRes.ok || !tableCountsRes.ok) {
|
||||||
throw new Error('One or more requests failed');
|
const failed = [];
|
||||||
|
if (!importRes.ok) failed.push(`import (${importRes.status})`);
|
||||||
|
if (!calcRes.ok) failed.push(`calculate (${calcRes.status})`);
|
||||||
|
if (!moduleRes.ok) failed.push(`modules (${moduleRes.status})`);
|
||||||
|
if (!tableRes.ok) failed.push(`tables (${tableRes.status})`);
|
||||||
|
if (!tableCountsRes.ok) failed.push(`table-counts (${tableCountsRes.status})`);
|
||||||
|
throw new Error(`Failed requests: ${failed.join(', ')}`);
|
||||||
}
|
}
|
||||||
|
|
||||||
const [importData, calcData, moduleData, tableData, tableCountsData] = await Promise.all([
|
const [importData, calcData, moduleData, tableData, tableCountsData] = await Promise.all([
|
||||||
@@ -528,6 +543,14 @@ export function DataManagement() {
         tableCountsRes.json(),
       ]);

+      console.log("Successfully fetched data:", {
+        importCount: importData?.length || 0,
+        calcCount: calcData?.length || 0,
+        moduleCount: moduleData?.length || 0,
+        tableCount: tableData?.length || 0,
+        tableCountsAvailable: !!tableCountsData
+      });
+
       // Process import history to add duration_minutes if it doesn't exist
       const processedImportData = (importData || []).map((record: ImportHistoryRecord) => {
         if (!record.duration_minutes && record.start_time && record.end_time) {
@@ -557,7 +580,8 @@
     } catch (error) {
       console.error("Error fetching data:", error);
       setHasError(true);
-      toast.error("Failed to load data. Please try again.");
+      toast.error(`Failed to load data: ${error instanceof Error ? error.message : "Unknown error"}`);
+      // Set empty arrays instead of leaving them unchanged to trigger the UI to show empty states
       setImportHistory([]);
       setCalculateHistory([]);
       setModuleStatus([]);
@@ -801,7 +825,7 @@
       );

     return (
-      <Card className="md:col-start-2 md:row-span-2 h-[550px]">
+      <Card className="md:col-start-2 md:row-span-2 h-[580px]">
         <CardHeader className="pb-3">
           <CardTitle>Table Record Counts</CardTitle>
         </CardHeader>
@@ -953,7 +977,7 @@

       <div className="grid gap-4 md:grid-cols-2">
         {/* Table Status */}
-        <div className="space-y-4 flex flex-col h-[550px]">
+        <div className="space-y-4 flex flex-col h-[580px]">
           <Card className="flex-1">
             <CardHeader className="pb-3">
               <CardTitle>Last Import Times</CardTitle>
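The error handling added in the DataManagement hunks above collects the names and HTTP status codes of every failed response before throwing a single aggregate error. A minimal framework-free sketch of that aggregation step — the `collectFailures` helper name and the stubbed response type are assumptions for illustration; in the component the checks are inlined:

```typescript
// Sketch of the failed-request aggregation pattern from the diff above.
// `Resp` stands in for the Fetch API Response; only ok/status are needed.
type Resp = { ok: boolean; status: number };

// Hypothetical helper: returns "name (status)" for every failed response.
function collectFailures(named: [string, Resp][]): string[] {
  const failed: string[] = [];
  for (const [name, res] of named) {
    if (!res.ok) failed.push(`${name} (${res.status})`);
  }
  return failed;
}

const failures = collectFailures([
  ['modules', { ok: false, status: 500 }],
  ['tables', { ok: true, status: 200 }],
  ['table-counts', { ok: false, status: 404 }],
]);
console.log(failures.join(', '));
```

Throwing once with the joined list (as the diff does) gives the toast a single message naming every endpoint that failed, instead of surfacing only the first failure.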
@@ -3,7 +3,8 @@ const isDev = import.meta.env.DEV;
 const config = {
   apiUrl: isDev ? '/api' : 'https://inventory.kent.pw/api',
   baseUrl: isDev ? '' : 'https://inventory.kent.pw',
-  authUrl: isDev ? '/auth-inv' : 'https://inventory.kent.pw/auth-inv'
+  authUrl: isDev ? '/auth-inv' : 'https://inventory.kent.pw/auth-inv',
+  chatUrl: isDev ? '/chat-api' : 'https://inventory.kent.pw/chat-api'
 };

 export default config;
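The config change above adds a `chatUrl` entry alongside the existing endpoints, so the chat page can target either the Vite dev proxy (`/chat-api`) or the production host. A small sketch of deriving a chat endpoint URL from it — the `chatEndpoint` helper is an assumption for illustration, and `isDev` is hard-coded here because `import.meta.env` only exists inside a Vite build:

```typescript
// Mirror of the config shape from the diff above (production branch).
const isDev = false;

const config = {
  apiUrl: isDev ? '/api' : 'https://inventory.kent.pw/api',
  baseUrl: isDev ? '' : 'https://inventory.kent.pw',
  authUrl: isDev ? '/auth-inv' : 'https://inventory.kent.pw/auth-inv',
  chatUrl: isDev ? '/chat-api' : 'https://inventory.kent.pw/chat-api',
};

// Hypothetical helper that joins the chat base URL with an endpoint path,
// tolerating a leading slash on the path.
function chatEndpoint(path: string): string {
  return `${config.chatUrl}/${path.replace(/^\/+/, '')}`;
}
```

In the component code below the fetches simply interpolate `config.chatUrl` directly; the helper just makes the joining rule explicit.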
inventory/src/pages/Chat.tsx (new file, 244 lines)
@@ -0,0 +1,244 @@
+import React, { useState, useEffect } from 'react';
+import { Card, CardContent, CardHeader, CardTitle } from '@/components/ui/card';
+import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from '@/components/ui/select';
+import { Input } from '@/components/ui/input';
+import { Button } from '@/components/ui/button';
+import { Avatar, AvatarFallback, AvatarImage } from '@/components/ui/avatar';
+import { Loader2, Search } from 'lucide-react';
+import { RoomList } from '@/components/chat/RoomList';
+import { ChatRoom } from '@/components/chat/ChatRoom';
+import { SearchResults } from '@/components/chat/SearchResults';
+import config from '@/config';
+
+interface User {
+  id: number;
+  username: string;
+  name: string;
+  type: string;
+  active: boolean;
+  status?: string;
+  lastlogin?: string;
+  statustext?: string;
+  statusconnection?: string;
+  mongo_id?: string;
+  avataretag?: string;
+}
+
+interface SearchResult {
+  id: number;
+  msg: string;
+  ts: string;
+  u: {
+    username: string;
+    name?: string;
+  };
+  room_id: number;
+  room_name: string;
+  room_fname: string;
+  room_type: string;
+}
+
+export function Chat() {
+  const [users, setUsers] = useState<User[]>([]);
+  const [selectedUserId, setSelectedUserId] = useState<string>('');
+  const [selectedRoomId, setSelectedRoomId] = useState<string | null>(null);
+  const [loading, setLoading] = useState(true);
+  const [error, setError] = useState<string | null>(null);
+
+  // Global search state
+  const [globalSearchQuery, setGlobalSearchQuery] = useState('');
+  const [searchResults, setSearchResults] = useState<SearchResult[]>([]);
+  const [showSearchResults, setShowSearchResults] = useState(false);
+  const [searching, setSearching] = useState(false);
+
+  useEffect(() => {
+    const fetchUsers = async () => {
+      try {
+        const response = await fetch(`${config.chatUrl}/users`);
+        const data = await response.json();
+
+        if (data.status === 'success') {
+          setUsers(data.users);
+        } else {
+          throw new Error(data.error || 'Failed to fetch users');
+        }
+      } catch (err) {
+        console.error('Error fetching users:', err);
+        setError(err instanceof Error ? err.message : 'An error occurred');
+      } finally {
+        setLoading(false);
+      }
+    };
+
+    fetchUsers();
+  }, []);
+
+  const handleUserChange = (userId: string) => {
+    setSelectedUserId(userId);
+    setSelectedRoomId(null); // Reset room selection when user changes
+    setGlobalSearchQuery(''); // Clear search when user changes
+    setShowSearchResults(false);
+  };
+
+  const handleRoomSelect = (roomId: string) => {
+    setSelectedRoomId(roomId);
+    setShowSearchResults(false); // Close search results when room is selected
+  };
+
+  const handleGlobalSearch = async () => {
+    if (!globalSearchQuery || globalSearchQuery.length < 2 || !selectedUserId) return;
+
+    setSearching(true);
+    try {
+      const response = await fetch(
+        `${config.chatUrl}/users/${selectedUserId}/search?q=${encodeURIComponent(globalSearchQuery)}&limit=20`
+      );
+      const data = await response.json();
+
+      if (data.status === 'success') {
+        setSearchResults(data.results);
+        setShowSearchResults(true);
+      }
+    } catch (err) {
+      console.error('Error searching messages:', err);
+    } finally {
+      setSearching(false);
+    }
+  };
+
+  const handleSearchKeyPress = (e: React.KeyboardEvent) => {
+    if (e.key === 'Enter') {
+      handleGlobalSearch();
+    }
+  };
+
+  if (loading) {
+    return (
+      <div className="flex items-center justify-center min-h-[400px]">
+        <Loader2 className="h-8 w-8 animate-spin" />
+      </div>
+    );
+  }
+
+  if (error) {
+    return (
+      <div className="p-6">
+        <Card className="border-red-200 bg-red-50">
+          <CardHeader>
+            <CardTitle className="text-red-800">Connection Error</CardTitle>
+          </CardHeader>
+          <CardContent>
+            <p className="text-red-700">{error}</p>
+            <p className="text-sm text-red-600 mt-2">
+              Make sure the chat server is running and the database is accessible.
+            </p>
+          </CardContent>
+        </Card>
+      </div>
+    );
+  }
+
+  return (
+    <div className="p-6 space-y-6">
+      {/* Header */}
+      <div className="flex items-center justify-between">
+        <h1 className="text-3xl font-bold">Chat</h1>
+
+        <div className="flex items-center gap-4">
+          {/* Global Search */}
+          {selectedUserId && (
+            <div className="relative">
+              <div className="flex gap-2">
+                <Input
+                  placeholder="Search all messages..."
+                  value={globalSearchQuery}
+                  onChange={(e) => setGlobalSearchQuery(e.target.value)}
+                  onKeyPress={handleSearchKeyPress}
+                  className="w-64"
+                />
+                <Button
+                  onClick={handleGlobalSearch}
+                  disabled={searching || globalSearchQuery.length < 2}
+                  size="icon"
+                  variant="outline"
+                >
+                  {searching ? (
+                    <Loader2 className="h-4 w-4 animate-spin" />
+                  ) : (
+                    <Search className="h-4 w-4" />
+                  )}
+                </Button>
+              </div>
+
+              {showSearchResults && (
+                <SearchResults
+                  results={searchResults}
+                  query={globalSearchQuery}
+                  onClose={() => setShowSearchResults(false)}
+                  onRoomSelect={handleRoomSelect}
+                />
+              )}
+            </div>
+          )}
+
+          <Select value={selectedUserId} onValueChange={handleUserChange}>
+            <SelectTrigger className="w-64">
+              <SelectValue placeholder="View as user..." />
+            </SelectTrigger>
+            <SelectContent>
+              {users.map((user) => (
+                <SelectItem key={user.id} value={user.id.toString()}>
+                  <div className="flex items-center gap-2">
+                    <Avatar className="h-6 w-6">
+                      <AvatarImage
+                        src={user.mongo_id ? `${config.chatUrl}/avatar/${user.mongo_id}` : undefined}
+                        alt={user.name || user.username}
+                      />
+                      <AvatarFallback className="text-xs">
+                        {(user.name || user.username).charAt(0).toUpperCase()}
+                      </AvatarFallback>
+                    </Avatar>
+                    <span className={user.active ? '' : 'text-muted-foreground'}>
+                      {user.name || user.username}
+                      {!user.active && <span className="text-xs ml-1">(inactive)</span>}
+                    </span>
+                  </div>
+                </SelectItem>
+              ))}
+            </SelectContent>
+          </Select>
+        </div>
+      </div>
+
+      {/* Chat Interface */}
+      {selectedUserId ? (
+        <div className="grid grid-cols-12 gap-6 h-[700px]">
+          {/* Room List Sidebar */}
+          <div className="col-span-4 h-[85vh] overflow-y-auto">
+            <RoomList
+              selectedUserId={selectedUserId}
+              selectedRoomId={selectedRoomId}
+              onRoomSelect={handleRoomSelect}
+            />
+          </div>
+
+          {/* Chat Messages Area */}
+          <div className="col-span-8 h-[85vh] overflow-y-auto">
+            <ChatRoom
+              roomId={selectedRoomId || ''}
+              selectedUserId={selectedUserId}
+            />
+          </div>
+        </div>
+      ) : (
+        <Card>
+          <CardContent className="flex items-center justify-center h-64">
+            <p className="text-muted-foreground">
+              Select a user to view their chat rooms and messages.
+            </p>
+          </CardContent>
+        </Card>
+      )}
+    </div>
+  );
+}
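Two small pieces of logic in the new Chat.tsx above are easy to isolate: the guard in `handleGlobalSearch` that blocks global search until both a "view as" user and a query of at least two characters are set, and the avatar fallback initial. Extracted here as standalone functions — the function names are assumptions, since the component inlines this logic:

```typescript
// Guard from handleGlobalSearch: require a selected user and >= 2 query chars.
function canSearch(query: string, selectedUserId: string): boolean {
  return Boolean(selectedUserId) && query.length >= 2;
}

// Fallback initial from the AvatarFallback JSX: prefer name, else username.
function avatarInitial(name: string | undefined, username: string): string {
  return (name || username).charAt(0).toUpperCase();
}
```

The same two-character threshold also drives the search button's `disabled` prop, so the UI and the handler agree on when a search is allowed.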
@@ -1,50 +1,31 @@
-import { useEffect, useState } from 'react';
-import { Card, CardContent, CardHeader, CardTitle } from '../components/ui/card';
-import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from '../components/ui/table';
-import { Loader2, ArrowUpDown } from 'lucide-react';
-import { Button } from '../components/ui/button';
-import { Input } from '../components/ui/input';
-import { Badge } from '../components/ui/badge';
-import {
-  Select,
-  SelectContent,
-  SelectItem,
-  SelectTrigger,
-  SelectValue,
-} from '../components/ui/select';
-import {
-  Pagination,
-  PaginationContent,
-  PaginationItem,
-  PaginationNext,
-  PaginationPrevious,
-} from '../components/ui/pagination';
-import { motion } from 'motion/react';
-import {
-  PurchaseOrderStatus,
-  getPurchaseOrderStatusLabel,
-  getReceivingStatusLabel,
-  getPurchaseOrderStatusVariant,
-  getReceivingStatusVariant
-} from '../types/status-codes';
+import { useEffect, useState, useRef, useMemo } from "react";
+import OrderMetricsCard from "../components/purchase-orders/OrderMetricsCard";
+import VendorMetricsCard from "../components/purchase-orders/VendorMetricsCard";
+import CategoryMetricsCard from "../components/purchase-orders/CategoryMetricsCard";
+import PaginationControls from "../components/purchase-orders/PaginationControls";
+import PurchaseOrdersTable from "../components/purchase-orders/PurchaseOrdersTable";
+import FilterControls from "../components/purchase-orders/FilterControls";

 interface PurchaseOrder {
-  id: number;
+  id: number | string;
   vendor_name: string;
-  order_date: string;
+  order_date: string | null;
+  receiving_date: string | null;
   status: number;
-  receiving_status: number;
   total_items: number;
   total_quantity: number;
   total_cost: number;
   total_received: number;
   fulfillment_rate: number;
+  short_note: string | null;
+  record_type: "po_only" | "po_with_receiving" | "receiving_only";
 }

 interface VendorMetrics {
   vendor_name: string;
   total_orders: number;
   avg_delivery_days: number;
+  max_delivery_days: number;
   fulfillment_rate: number;
   avg_unit_cost: number;
   total_spend: number;
@@ -59,6 +40,9 @@ interface CostAnalysis {
   total_spend_by_category: {
     category: string;
     total_spend: number;
+    unique_products?: number;
+    avg_cost?: number;
+    cost_variance?: number;
   }[];
 }

@@ -69,50 +53,33 @@ interface ReceivingStatus {
   fulfillment_rate: number;
   total_value: number;
   avg_cost: number;
+  avg_delivery_days?: number;
+  max_delivery_days?: number;
 }

-interface PurchaseOrdersResponse {
-  orders: PurchaseOrder[];
-  summary: {
-    order_count: number;
-    total_ordered: number;
-    total_received: number;
-    fulfillment_rate: number;
-    total_value: number;
-    avg_cost: number;
-  };
-  pagination: {
-    total: number;
-    pages: number;
-    page: number;
-    limit: number;
-  };
-  filters: {
-    vendors: string[];
-    statuses: string[];
-  };
-}
-
 export default function PurchaseOrders() {
   const [purchaseOrders, setPurchaseOrders] = useState<PurchaseOrder[]>([]);
   const [, setVendorMetrics] = useState<VendorMetrics[]>([]);
-  const [costAnalysis, setCostAnalysis] = useState<CostAnalysis | null>(null);
+  const [, setCostAnalysis] = useState<CostAnalysis | null>(null);
   const [summary, setSummary] = useState<ReceivingStatus | null>(null);
   const [loading, setLoading] = useState(true);
   const [page, setPage] = useState(1);
-  const [sortColumn, setSortColumn] = useState<string>('order_date');
-  const [sortDirection, setSortDirection] = useState<'asc' | 'desc'>('desc');
-  const [filters, setFilters] = useState({
-    search: '',
-    status: 'all',
-    vendor: 'all',
+  const [sortColumn, setSortColumn] = useState<string>("order_date");
+  const [sortDirection, setSortDirection] = useState<"asc" | "desc">("desc");
+  const [searchInput, setSearchInput] = useState("");
+  const [filterValues, setFilterValues] = useState({
+    search: "",
+    status: "all",
+    vendor: "all",
+    recordType: "all",
   });
   const [filterOptions, setFilterOptions] = useState<{
     vendors: string[];
-    statuses: string[];
+    statuses: number[];
   }>({
     vendors: [],
-    statuses: []
+    statuses: [],
   });
   const [pagination, setPagination] = useState({
     total: 0,
@@ -120,99 +87,173 @@ export default function PurchaseOrders() {
     page: 1,
     limit: 100,
   });
+  const [] = useState(false);
+  const [] = useState<
+    number | undefined
+  >();
+  const [] = useState<
+    number | undefined
+  >();
+  const [] = useState(false);
+  const [yearlyVendorData, setYearlyVendorData] = useState<
+    {
+      vendor: string;
+      orders: number;
+      total_spend: number;
+      percentage?: number;
+    }[]
+  >([]);
+  const [yearlyCategoryData, setYearlyCategoryData] = useState<
+    {
+      category: string;
+      unique_products?: number;
+      total_spend: number;
+      percentage?: number;
+      avg_cost?: number;
+      cost_variance?: number;
+    }[]
+  >([]);
+  const [yearlyDataLoading, setYearlyDataLoading] = useState(false);
+  const hasInitialFetchRef = useRef(false);
+  const hasInitialYearlyFetchRef = useRef(false);

-  const STATUS_FILTER_OPTIONS = [
-    { value: 'all', label: 'All Statuses' },
-    { value: String(PurchaseOrderStatus.Created), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Created) },
-    { value: String(PurchaseOrderStatus.ElectronicallyReadySend), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.ElectronicallyReadySend) },
-    { value: String(PurchaseOrderStatus.Ordered), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Ordered) },
-    { value: String(PurchaseOrderStatus.ReceivingStarted), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.ReceivingStarted) },
-    { value: String(PurchaseOrderStatus.Done), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Done) },
-    { value: String(PurchaseOrderStatus.Canceled), label: getPurchaseOrderStatusLabel(PurchaseOrderStatus.Canceled) },
-  ];
+  // Use useMemo to compute filters only when filterValues change
+  const filters = useMemo(() => filterValues, [filterValues]);

   const fetchData = async () => {
     try {
-      const searchParams = new URLSearchParams({
-        page: page.toString(),
-        limit: '100',
-        sortColumn,
-        sortDirection,
-        ...filters.search && { search: filters.search },
-        ...filters.status && { status: filters.status },
-        ...filters.vendor && { vendor: filters.vendor },
-      });
-
-      const [
-        purchaseOrdersRes,
-        vendorMetricsRes,
-        costAnalysisRes
-      ] = await Promise.all([
-        fetch(`/api/purchase-orders?${searchParams}`),
-        fetch('/api/purchase-orders/vendor-metrics'),
-        fetch('/api/purchase-orders/cost-analysis')
+      setLoading(true);
+
+      // Build search params with proper encoding
+      const searchParams = new URLSearchParams();
+      searchParams.append('page', page.toString());
+      searchParams.append('limit', '100');
+      searchParams.append('sortColumn', sortColumn);
+      searchParams.append('sortDirection', sortDirection);
+
+      if (filters.search) {
+        searchParams.append('search', filters.search);
+      }
+
+      if (filters.status !== 'all') {
+        searchParams.append('status', filters.status);
+      }
+
+      if (filters.vendor !== 'all') {
+        searchParams.append('vendor', filters.vendor);
+      }
+
+      if (filters.recordType !== 'all') {
+        searchParams.append('recordType', filters.recordType);
+      }
+
+      console.log("Fetching data with params:", searchParams.toString());
+
+      // Fetch orders first separately to handle errors better
+      const purchaseOrdersRes = await fetch(`/api/purchase-orders?${searchParams.toString()}`);
+
+      if (!purchaseOrdersRes.ok) {
+        const errorText = await purchaseOrdersRes.text();
+        console.error("Failed to fetch purchase orders:", errorText);
+        throw new Error(`Failed to fetch purchase orders: ${errorText}`);
+      }
+
+      const purchaseOrdersData = await purchaseOrdersRes.json();
+
+      // Process orders data immediately
+      const processedOrders = purchaseOrdersData.orders.map((order: any) => ({
+        ...order,
+        status: Number(order.status),
+        total_items: Number(order.total_items) || 0,
+        total_quantity: Number(order.total_quantity) || 0,
+        total_cost: Number(order.total_cost) || 0,
+        total_received: Number(order.total_received) || 0,
+        fulfillment_rate: Number(order.fulfillment_rate) || 0,
+      }));
+
+      // Update the main data state
+      setPurchaseOrders(processedOrders);
+      setPagination(purchaseOrdersData.pagination);
+      setFilterOptions(purchaseOrdersData.filters);
+
+      // Now fetch the additional data in parallel
+      const [vendorMetricsRes, costAnalysisRes, deliveryMetricsRes] = await Promise.all([
+        fetch("/api/purchase-orders/vendor-metrics"),
+        fetch("/api/purchase-orders/cost-analysis"),
+        fetch("/api/purchase-orders/delivery-metrics"),
       ]);

-      // Initialize default data
-      let purchaseOrdersData: PurchaseOrdersResponse = {
-        orders: [],
-        summary: {
-          order_count: 0,
-          total_ordered: 0,
-          total_received: 0,
-          fulfillment_rate: 0,
-          total_value: 0,
-          avg_cost: 0
-        },
-        pagination: {
-          total: 0,
-          pages: 0,
-          page: 1,
-          limit: 100
-        },
-        filters: {
-          vendors: [],
-          statuses: []
-        }
-      };
-
-      let vendorMetricsData: VendorMetrics[] = [];
-      let costAnalysisData: CostAnalysis = {
+      let vendorMetricsData = [];
+      let costAnalysisData = {
         unique_products: 0,
         avg_cost: 0,
         min_cost: 0,
         max_cost: 0,
         cost_variance: 0,
-        total_spend_by_category: []
+        total_spend_by_category: [],
       };

-      // Only try to parse responses if they were successful
-      if (purchaseOrdersRes.ok) {
-        purchaseOrdersData = await purchaseOrdersRes.json();
-      } else {
-        console.error('Failed to fetch purchase orders:', await purchaseOrdersRes.text());
-      }
+      let deliveryMetricsData = {
+        avg_delivery_days: 0,
+        max_delivery_days: 0
+      };

       if (vendorMetricsRes.ok) {
         vendorMetricsData = await vendorMetricsRes.json();
+        setVendorMetrics(vendorMetricsData);
       } else {
-        console.error('Failed to fetch vendor metrics:', await vendorMetricsRes.text());
+        console.error(
+          "Failed to fetch vendor metrics:",
+          await vendorMetricsRes.text()
+        );
+        setVendorMetrics([]);
       }

       if (costAnalysisRes.ok) {
         costAnalysisData = await costAnalysisRes.json();
+        setCostAnalysis(costAnalysisData);
       } else {
-        console.error('Failed to fetch cost analysis:', await costAnalysisRes.text());
+        console.error(
+          "Failed to fetch cost analysis:",
+          await costAnalysisRes.text()
+        );
+        setCostAnalysis({
+          unique_products: 0,
+          avg_cost: 0,
+          min_cost: 0,
+          max_cost: 0,
+          cost_variance: 0,
+          total_spend_by_category: [],
+        });
       }

-      setPurchaseOrders(purchaseOrdersData.orders);
-      setPagination(purchaseOrdersData.pagination);
-      setFilterOptions(purchaseOrdersData.filters);
-      setSummary(purchaseOrdersData.summary);
-      setVendorMetrics(vendorMetricsData);
-      setCostAnalysis(costAnalysisData);
+      if (deliveryMetricsRes.ok) {
+        deliveryMetricsData = await deliveryMetricsRes.json();
+
+        // Merge delivery metrics into summary
+        const summaryWithDelivery = {
+          ...purchaseOrdersData.summary,
+          avg_delivery_days: deliveryMetricsData.avg_delivery_days,
+          max_delivery_days: deliveryMetricsData.max_delivery_days
+        };
+
+        setSummary(summaryWithDelivery);
+      } else {
+        console.error(
+          "Failed to fetch delivery metrics:",
+          await deliveryMetricsRes.text()
+        );
+        setSummary({
+          ...purchaseOrdersData.summary,
+          avg_delivery_days: 0,
+          max_delivery_days: 0
+        });
+      }
+
+      // Mark that we've completed an initial fetch
+      hasInitialFetchRef.current = true;
     } catch (error) {
-      console.error('Error fetching data:', error);
+      console.error("Error fetching data:", error);
       // Set default values in case of error
       setPurchaseOrders([]);
       setPagination({ total: 0, pages: 0, page: 1, limit: 100 });
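The rewritten fetchData in the hunk above builds its query string imperatively so that the `'all'` sentinel filters are omitted entirely and user input is URL-encoded by URLSearchParams. The same construction as a pure function — the `buildPoQuery` name and its options object are assumptions for illustration:

```typescript
// Sketch of the query-string construction used by fetchData above:
// 'all' sentinel values are dropped, everything else is URL-encoded.
function buildPoQuery(opts: {
  page: number;
  sortColumn: string;
  sortDirection: 'asc' | 'desc';
  search: string;
  status: string;
  vendor: string;
  recordType: string;
}): string {
  const p = new URLSearchParams();
  p.append('page', String(opts.page));
  p.append('limit', '100');
  p.append('sortColumn', opts.sortColumn);
  p.append('sortDirection', opts.sortDirection);
  if (opts.search) p.append('search', opts.search);
  if (opts.status !== 'all') p.append('status', opts.status);
  if (opts.vendor !== 'all') p.append('vendor', opts.vendor);
  if (opts.recordType !== 'all') p.append('recordType', opts.recordType);
  return p.toString();
}
```

Compared with the old object-spread form (`...filters.status && { status: filters.status }`), this version never sends `status=all` to the API, since the sentinel is checked explicitly rather than relying on truthiness.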
@@ -223,7 +264,7 @@
       total_received: 0,
       fulfillment_rate: 0,
       total_value: 0,
-      avg_cost: 0
+      avg_cost: 0,
     });
     setVendorMetrics([]);
     setCostAnalysis({
@@ -232,284 +273,209 @@
       min_cost: 0,
       max_cost: 0,
       cost_variance: 0,
-      total_spend_by_category: []
+      total_spend_by_category: [],
     });
   } finally {
     setLoading(false);
   }
 };

+  // Setup debounced search
   useEffect(() => {
-    fetchData();
-  }, [page, sortColumn, sortDirection, filters]);
+    const timer = setTimeout(() => {
+      if (searchInput !== filterValues.search) {
+        setFilterValues(prev => ({ ...prev, search: searchInput }));
+      }
+    }, 300); // Use 300ms for better response time
+
+    return () => clearTimeout(timer);
+  }, [searchInput, filterValues.search]);
+
+  // Reset page to 1 when filters change
+  useEffect(() => {
+    // Reset to page 1 when filters change to ensure proper pagination
+    setPage(1);
+  }, [filterValues]); // Use filterValues directly to avoid unnecessary renders
+
+  // Fetch data when page, sort or filters change
+  useEffect(() => {
+    // Log the current filter state for debugging
+    console.log("Fetching with filters:", filterValues);
+    console.log("Page:", page, "Sort:", sortColumn, sortDirection);
+
+    // Always fetch data - don't use conditional checks that might prevent it
+    fetchData();
+  }, [page, sortColumn, sortDirection, filterValues]);

+  // Handle column sorting more consistently
   const handleSort = (column: string) => {
+    // Reset to page 1 when changing sort to ensure we see the first page of results
+    setPage(1);
+
     if (sortColumn === column) {
-      setSortDirection(prev => prev === 'asc' ? 'desc' : 'asc');
+      setSortDirection((prev) => (prev === "asc" ? "desc" : "asc"));
     } else {
       setSortColumn(column);
-      setSortDirection('asc');
+      // For most columns, start with descending to show highest values first
+      if (column === 'id' || column === 'vendor_name') {
+        setSortDirection("asc");
+      } else {
+        setSortDirection("desc");
+      }
     }
   };

-  const getStatusBadge = (status: number, receivingStatus: number) => {
-    // If the PO is canceled, show that status
-    if (status === PurchaseOrderStatus.Canceled) {
-      return <Badge variant={getPurchaseOrderStatusVariant(status)}>
-        {getPurchaseOrderStatusLabel(status)}
-      </Badge>;
-    }
-
-    // If receiving has started, show receiving status
-    if (status >= PurchaseOrderStatus.ReceivingStarted) {
-      return <Badge variant={getReceivingStatusVariant(receivingStatus)}>
-        {getReceivingStatusLabel(receivingStatus)}
-      </Badge>;
-    }
-
-    // Otherwise show PO status
-    return <Badge variant={getPurchaseOrderStatusVariant(status)}>
-      {getPurchaseOrderStatusLabel(status)}
-    </Badge>;
-  };
+  // Update filter handlers
+  const handleStatusChange = (value: string) => {
+    setFilterValues(prev => ({ ...prev, status: value }));
+  };

-  const formatNumber = (value: number) => {
-    return value.toLocaleString('en-US', {
-      minimumFractionDigits: 2,
-      maximumFractionDigits: 2
-    });
-  };
+  const handleVendorChange = (value: string) => {
+    setFilterValues(prev => ({ ...prev, vendor: value }));
+  };
+
+  const handleRecordTypeChange = (value: string) => {
+    setFilterValues(prev => ({ ...prev, recordType: value }));
+  };
+
+  // Clear all filters handler
+  const clearFilters = () => {
+    setSearchInput("");
+    setFilterValues({
+      search: "",
+      status: "all",
+      vendor: "all",
+      recordType: "all",
+    });
+  };

-  const formatPercent = (value: number) => {
-    return (value * 100).toLocaleString('en-US', {
-      minimumFractionDigits: 1,
-      maximumFractionDigits: 1
-    }) + '%';
-  };
+  // Update this function to fetch yearly data
+  const fetchYearlyData = async () => {
+    if (
+      hasInitialYearlyFetchRef.current &&
+      import.meta.hot &&
+      (yearlyVendorData.length > 0 || yearlyCategoryData.length > 0)
+    ) {
+      return;
+    }

-  if (loading) {
-    return (
-      <div className="flex h-full items-center justify-center">
-        <Loader2 className="h-8 w-8 animate-spin" />
-      </div>
+    try {
+      setYearlyDataLoading(true);
+
+      // Create a date for 1 year ago
+      const oneYearAgo = new Date();
+      oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);
+      const dateParam = oneYearAgo.toISOString().split("T")[0]; // Format as YYYY-MM-DD
+
+      const [vendorResponse, categoryResponse] = await Promise.all([
+        fetch(`/api/purchase-orders/vendor-analysis?since=${dateParam}`),
+        fetch(`/api/purchase-orders/category-analysis?since=${dateParam}`),
+      ]);
+
+      if (vendorResponse.ok) {
+        const vendorData = await vendorResponse.json();
+        // Calculate percentages before setting state
+        const totalSpend = vendorData.reduce(
+          (sum: number, v: any) => sum + v.total_spend,
+          0
+        );
+
+        setYearlyVendorData(
+          vendorData.map((v: any) => ({
+            ...v,
+            percentage: totalSpend > 0 ? v.total_spend / totalSpend : 0,
+          }))
+        );
+      } else {
+        console.error(
+          "Failed to fetch yearly vendor data:",
+          await vendorResponse.text()
       );
     }

+      if (categoryResponse.ok) {
+        const categoryData = await categoryResponse.json();
+        // Calculate percentages before setting state
+        const totalSpend = categoryData.reduce(
+          (sum: number, c: any) => sum + c.total_spend,
+          0
+        );
+
+        setYearlyCategoryData(
+          categoryData.map((c: any) => ({
+            ...c,
+            percentage: totalSpend > 0 ? c.total_spend / totalSpend : 0,
+          }))
+        );
+      } else {
+        console.error(
+          "Failed to fetch yearly category data:",
+          await categoryResponse.text()
+        );
+      }
+
+      // Mark that we've completed an initial fetch
+      hasInitialYearlyFetchRef.current = true;
+    } catch (error) {
+      console.error("Error fetching yearly data:", error);
+    } finally {
|
setYearlyDataLoading(false);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
// Fetch yearly data when component mounts, not just when dialogs open
|
||||||
|
useEffect(() => {
|
||||||
|
fetchYearlyData();
|
||||||
|
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||||
|
}, []);
|
||||||
|
|
||||||
return (
|
return (
|
||||||
<motion.div layout className="container mx-auto py-6">
|
<div className="container mx-auto py-6">
|
||||||
<h1 className="mb-6 text-3xl font-bold">Purchase Orders</h1>
|
<h1 className="mb-6 text-3xl font-bold">Purchase Orders</h1>
|
||||||
|
|
||||||
{/* Metrics Overview */}
|
<div className="mb-4 grid gap-4 md:grid-cols-3">
|
||||||
<div className="mb-6 grid gap-4 md:grid-cols-4">
|
<OrderMetricsCard
|
||||||
<Card>
|
summary={summary}
|
||||||
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
|
loading={loading}
|
||||||
<CardTitle className="text-sm font-medium">Total Orders</CardTitle>
|
/>
|
||||||
</CardHeader>
|
<VendorMetricsCard
|
||||||
<CardContent>
|
loading={loading}
|
||||||
<div className="text-2xl font-bold">{summary?.order_count.toLocaleString() || 0}</div>
|
yearlyVendorData={yearlyVendorData}
|
||||||
</CardContent>
|
yearlyDataLoading={yearlyDataLoading}
|
||||||
</Card>
|
/>
|
||||||
<Card>
|
<CategoryMetricsCard
|
||||||
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
|
loading={loading}
|
||||||
<CardTitle className="text-sm font-medium">Total Value</CardTitle>
|
yearlyCategoryData={yearlyCategoryData}
|
||||||
</CardHeader>
|
yearlyDataLoading={yearlyDataLoading}
|
||||||
<CardContent>
|
|
||||||
<div className="text-2xl font-bold">
|
|
||||||
${formatNumber(summary?.total_value || 0)}
|
|
||||||
</div>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
<Card>
|
|
||||||
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
|
|
||||||
<CardTitle className="text-sm font-medium">Fulfillment Rate</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<div className="text-2xl font-bold">
|
|
||||||
{formatPercent(summary?.fulfillment_rate || 0)}
|
|
||||||
</div>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
<Card>
|
|
||||||
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
|
|
||||||
<CardTitle className="text-sm font-medium">Avg Cost per PO</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<div className="text-2xl font-bold">
|
|
||||||
${formatNumber(summary?.avg_cost || 0)}
|
|
||||||
</div>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
</div>
|
|
||||||
|
|
||||||
{/* Filters */}
|
|
||||||
<div className="mb-4 flex items-center gap-4">
|
|
||||||
<Input
|
|
||||||
placeholder="Search orders..."
|
|
||||||
value={filters.search}
|
|
||||||
onChange={(e) => setFilters(prev => ({ ...prev, search: e.target.value }))}
|
|
||||||
className="max-w-xs"
|
|
||||||
/>
|
/>
|
||||||
<Select
|
|
||||||
value={filters.status}
|
|
||||||
onValueChange={(value) => setFilters(prev => ({ ...prev, status: value }))}
|
|
||||||
>
|
|
||||||
<SelectTrigger className="w-[180px]">
|
|
||||||
<SelectValue placeholder="Select status" />
|
|
||||||
</SelectTrigger>
|
|
||||||
<SelectContent>
|
|
||||||
{STATUS_FILTER_OPTIONS.map(option => (
|
|
||||||
<SelectItem key={option.value} value={option.value}>
|
|
||||||
{option.label}
|
|
||||||
</SelectItem>
|
|
||||||
))}
|
|
||||||
</SelectContent>
|
|
||||||
</Select>
|
|
||||||
<Select
|
|
||||||
value={filters.vendor}
|
|
||||||
onValueChange={(value) => setFilters(prev => ({ ...prev, vendor: value }))}
|
|
||||||
>
|
|
||||||
<SelectTrigger className="w-[180px]">
|
|
||||||
<SelectValue placeholder="Select vendor" />
|
|
||||||
</SelectTrigger>
|
|
||||||
<SelectContent>
|
|
||||||
<SelectItem value="all">All Vendors</SelectItem>
|
|
||||||
{filterOptions?.vendors?.map(vendor => (
|
|
||||||
<SelectItem key={vendor} value={vendor}>
|
|
||||||
{vendor}
|
|
||||||
</SelectItem>
|
|
||||||
))}
|
|
||||||
</SelectContent>
|
|
||||||
</Select>
|
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{/* Purchase Orders Table */}
|
<FilterControls
|
||||||
<Card className="mb-6">
|
searchInput={searchInput}
|
||||||
<CardHeader>
|
setSearchInput={setSearchInput}
|
||||||
<CardTitle>Recent Purchase Orders</CardTitle>
|
filterValues={filterValues}
|
||||||
</CardHeader>
|
handleStatusChange={handleStatusChange}
|
||||||
<CardContent>
|
handleVendorChange={handleVendorChange}
|
||||||
<Table>
|
handleRecordTypeChange={handleRecordTypeChange}
|
||||||
<TableHeader>
|
clearFilters={clearFilters}
|
||||||
<TableRow>
|
filterOptions={filterOptions}
|
||||||
<TableHead>
|
loading={loading}
|
||||||
<Button variant="ghost" onClick={() => handleSort('id')}>
|
/>
|
||||||
ID <ArrowUpDown className="ml-2 h-4 w-4" />
|
|
||||||
</Button>
|
|
||||||
</TableHead>
|
|
||||||
<TableHead>
|
|
||||||
<Button variant="ghost" onClick={() => handleSort('vendor_name')}>
|
|
||||||
Vendor <ArrowUpDown className="ml-2 h-4 w-4" />
|
|
||||||
</Button>
|
|
||||||
</TableHead>
|
|
||||||
<TableHead>
|
|
||||||
<Button variant="ghost" onClick={() => handleSort('order_date')}>
|
|
||||||
Order Date <ArrowUpDown className="ml-2 h-4 w-4" />
|
|
||||||
</Button>
|
|
||||||
</TableHead>
|
|
||||||
<TableHead>
|
|
||||||
<Button variant="ghost" onClick={() => handleSort('status')}>
|
|
||||||
Status <ArrowUpDown className="ml-2 h-4 w-4" />
|
|
||||||
</Button>
|
|
||||||
</TableHead>
|
|
||||||
<TableHead>Total Items</TableHead>
|
|
||||||
<TableHead>Total Quantity</TableHead>
|
|
||||||
<TableHead>
|
|
||||||
<Button variant="ghost" onClick={() => handleSort('total_cost')}>
|
|
||||||
Total Cost <ArrowUpDown className="ml-2 h-4 w-4" />
|
|
||||||
</Button>
|
|
||||||
</TableHead>
|
|
||||||
<TableHead>Received</TableHead>
|
|
||||||
<TableHead>
|
|
||||||
<Button variant="ghost" onClick={() => handleSort('fulfillment_rate')}>
|
|
||||||
Fulfillment <ArrowUpDown className="ml-2 h-4 w-4" />
|
|
||||||
</Button>
|
|
||||||
</TableHead>
|
|
||||||
</TableRow>
|
|
||||||
</TableHeader>
|
|
||||||
<TableBody>
|
|
||||||
{purchaseOrders.map((po) => (
|
|
||||||
<TableRow key={po.id}>
|
|
||||||
<TableCell>{po.id}</TableCell>
|
|
||||||
<TableCell>{po.vendor_name}</TableCell>
|
|
||||||
<TableCell>{new Date(po.order_date).toLocaleDateString()}</TableCell>
|
|
||||||
<TableCell>{getStatusBadge(po.status, po.receiving_status)}</TableCell>
|
|
||||||
<TableCell>{po.total_items.toLocaleString()}</TableCell>
|
|
||||||
<TableCell>{po.total_quantity.toLocaleString()}</TableCell>
|
|
||||||
<TableCell>${formatNumber(po.total_cost)}</TableCell>
|
|
||||||
<TableCell>{po.total_received.toLocaleString()}</TableCell>
|
|
||||||
<TableCell>{formatPercent(po.fulfillment_rate)}</TableCell>
|
|
||||||
</TableRow>
|
|
||||||
))}
|
|
||||||
{!purchaseOrders.length && (
|
|
||||||
<TableRow>
|
|
||||||
<TableCell colSpan={9} className="text-center text-muted-foreground">
|
|
||||||
No purchase orders found
|
|
||||||
</TableCell>
|
|
||||||
</TableRow>
|
|
||||||
)}
|
|
||||||
</TableBody>
|
|
||||||
</Table>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
|
|
||||||
{/* Pagination */}
|
<PurchaseOrdersTable
|
||||||
{pagination.pages > 1 && (
|
purchaseOrders={purchaseOrders}
|
||||||
<div className="flex justify-center">
|
loading={loading}
|
||||||
<Pagination>
|
summary={summary}
|
||||||
<PaginationContent>
|
sortColumn={sortColumn}
|
||||||
<PaginationItem>
|
sortDirection={sortDirection}
|
||||||
<Button
|
handleSort={handleSort}
|
||||||
onClick={() => setPage(page - 1)}
|
/>
|
||||||
disabled={page === 1}
|
|
||||||
className="h-9 px-4"
|
<PaginationControls
|
||||||
>
|
pagination={pagination}
|
||||||
<PaginationPrevious className="h-4 w-4" />
|
currentPage={page}
|
||||||
</Button>
|
onPageChange={setPage}
|
||||||
</PaginationItem>
|
/>
|
||||||
<PaginationItem>
|
|
||||||
<Button
|
|
||||||
onClick={() => setPage(page + 1)}
|
|
||||||
disabled={page === pagination.pages}
|
|
||||||
className="h-9 px-4"
|
|
||||||
>
|
|
||||||
<PaginationNext className="h-4 w-4" />
|
|
||||||
</Button>
|
|
||||||
</PaginationItem>
|
|
||||||
</PaginationContent>
|
|
||||||
</Pagination>
|
|
||||||
</div>
|
</div>
|
||||||
)}
|
|
||||||
|
|
||||||
{/* Cost Analysis */}
|
|
||||||
<Card>
|
|
||||||
<CardHeader>
|
|
||||||
<CardTitle>Cost Analysis by Category</CardTitle>
|
|
||||||
</CardHeader>
|
|
||||||
<CardContent>
|
|
||||||
<Table>
|
|
||||||
<TableHeader>
|
|
||||||
<TableRow>
|
|
||||||
<TableHead>Category</TableHead>
|
|
||||||
<TableHead>Total Spend</TableHead>
|
|
||||||
</TableRow>
|
|
||||||
</TableHeader>
|
|
||||||
<TableBody>
|
|
||||||
{costAnalysis?.total_spend_by_category?.map((category) => (
|
|
||||||
<TableRow key={category.category}>
|
|
||||||
<TableCell>{category.category}</TableCell>
|
|
||||||
<TableCell>${formatNumber(category.total_spend)}</TableCell>
|
|
||||||
</TableRow>
|
|
||||||
)) || (
|
|
||||||
<TableRow>
|
|
||||||
<TableCell colSpan={2} className="text-center text-muted-foreground">
|
|
||||||
No cost analysis data available
|
|
||||||
</TableCell>
|
|
||||||
</TableRow>
|
|
||||||
)}
|
|
||||||
</TableBody>
|
|
||||||
</Table>
|
|
||||||
</CardContent>
|
|
||||||
</Card>
|
|
||||||
</motion.div>
|
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
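The spend-share math added in `fetchYearlyData` can be sketched in isolation. This is a minimal, self-contained version: only the `total_spend` field name and the reduce/map normalization (with the division-by-zero guard) come from the diff; the `VendorSpend` shape and the sample rows are illustrative.

```typescript
interface VendorSpend {
  vendor: string;
  total_spend: number;
  percentage?: number;
}

// Same normalization as fetchYearlyData: each row's share of the summed
// spend, falling back to 0 when there is no spend at all.
function withPercentages(rows: VendorSpend[]): VendorSpend[] {
  const totalSpend = rows.reduce((sum, v) => sum + v.total_spend, 0);
  return rows.map(v => ({
    ...v,
    percentage: totalSpend > 0 ? v.total_spend / totalSpend : 0,
  }));
}

const result = withPercentages([
  { vendor: 'Acme', total_spend: 750 },
  { vendor: 'Globex', total_spend: 250 },
]);
console.log(result.map(v => v.percentage)); // [ 0.75, 0.25 ]
```

Computing the shares before calling the state setter keeps the derived `percentage` field alongside the raw rows, so the metric cards never have to re-derive totals.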
@@ -75,7 +75,7 @@ export function getPurchaseOrderStatusVariant(status: number): 'default' | 'seco
 
 export function getReceivingStatusVariant(status: number): 'default' | 'secondary' | 'destructive' | 'outline' {
   if (isReceivingCanceled(status)) return 'destructive';
-  if (status === ReceivingStatus.Paid) return 'default';
+  if (status === ReceivingStatus.Paid || status === ReceivingStatus.FullReceived) return 'default';
   if (status >= ReceivingStatus.PartialReceived) return 'secondary';
   return 'outline';
 }
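The one-line change above widens which receiving states get the `'default'` badge variant. A minimal sketch of the effect — the numeric values for `ReceivingStatus` are hypothetical (the real enum is not shown in this diff) and `isReceivingCanceled` is stubbed:

```typescript
// Hypothetical ordering; only the relative order (Partial < Full < Paid) matters here.
enum ReceivingStatus { None = 0, PartialReceived = 10, FullReceived = 20, Paid = 30 }

const isReceivingCanceled = (_status: number): boolean => false; // stub for the sketch

// Mirrors the patched getReceivingStatusVariant
function getReceivingStatusVariant(status: number): 'default' | 'secondary' | 'destructive' | 'outline' {
  if (isReceivingCanceled(status)) return 'destructive';
  if (status === ReceivingStatus.Paid || status === ReceivingStatus.FullReceived) return 'default';
  if (status >= ReceivingStatus.PartialReceived) return 'secondary';
  return 'outline';
}

console.log(getReceivingStatusVariant(ReceivingStatus.FullReceived));    // 'default'
console.log(getReceivingStatusVariant(ReceivingStatus.PartialReceived)); // 'secondary'
```

Before the patch, `FullReceived` fell through to the `>= PartialReceived` branch and rendered as `'secondary'` even though the order was fully received.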
inventory/src/utils/emojiUtils.ts (new file, 213 lines)
@@ -0,0 +1,213 @@
+// Emoji shortcode to Unicode mapping
+// Based on common emoji shortcodes used in chat platforms
+const emojiMap: Record<string, string> = {
+  // Smileys & Emotion
+  'joy': '😂',
+  'heart_eyes': '😍',
+  'sob': '😭',
+  'blush': '😊',
+  'kissing_heart': '😘',
+  'smiling': '☺️',
+  'weary': '😩',
+  'pensive': '😔',
+  'smirk': '😏',
+  'grin': '😁',
+  'wink': '😉',
+  'relieved': '😌',
+  'flushed': '😳',
+  'cry': '😢',
+  'sunglasses': '😎',
+  'sweat_smile': '😅',
+  'sleeping': '😴',
+  'smile': '😄',
+  'purple_heart': '💜',
+  'broken_heart': '💔',
+  'expressionless': '😑',
+  'sparkling_heart': '💖',
+  'blue_heart': '💙',
+  'confused': '😕',
+  'stuck_out_tongue_winking_eye': '😜',
+  'disappointed': '😞',
+  'yum': '😋',
+  'neutral_face': '😐',
+  'sleepy': '😪',
+  'cupid': '💘',
+  'heartpulse': '💗',
+  'revolving_hearts': '💞',
+  'speak_no_evil': '🙊',
+  'see_no_evil': '🙈',
+  'rage': '😡',
+  'smiley': '😃',
+  'tired_face': '😫',
+  'stuck_out_tongue_closed_eyes': '😝',
+  'muscle': '💪',
+  'skull': '💀',
+  'sunny': '☀️',
+  'yellow_heart': '💛',
+  'triumph': '😤',
+  'new_moon_with_face': '🌚',
+  'laughing': '😆',
+  'sweat': '😓',
+  'heavy_check_mark': '✔️',
+  'heart_eyes_cat': '😻',
+  'grinning': '😀',
+  'mask': '😷',
+  'green_heart': '💚',
+  'persevere': '😣',
+  'heartbeat': '💓',
+  'angry': '😠',
+  'grimacing': '😬',
+  'gun': '🔫',
+  'thumbsdown': '👎',
+  'dancer': '💃',
+  'musical_note': '🎵',
+  'no_mouth': '😶',
+  'dizzy': '💫',
+  'fist': '✊',
+  'unamused': '😒',
+  'cold_sweat': '😰',
+  'gem': '💎',
+  'pizza': '🍕',
+  'joy_cat': '😹',
+  'sun_with_face': '🌞',
+
+  // Hearts
+  'heart': '❤️',
+  'two_hearts': '💕',
+  'kiss': '💋',
+
+  // Hand gestures
+  'thumbsup': '👍',
+  'thumbs_up': '👍',
+  'thumbs_down': '👎',
+  'ok_hand': '👌',
+  'pray': '🙏',
+  'raised_hands': '🙌',
+  'clap': '👏',
+  'point_right': '👉',
+  'point_left': '👈',
+  'point_up': '☝️',
+  'point_down': '👇',
+  'raised_hand': '✋',
+  'wave': '👋',
+  'v': '✌️',
+  'oncoming_fist': '👊',
+  'facepunch': '👊',
+  'punch': '👊',
+
+  // Objects & symbols
+  'fire': '🔥',
+  'tada': '🎉',
+  'camera': '📷',
+  'notes': '🎶',
+  'sparkles': '✨',
+  'star2': '🌟',
+  'crown': '👑',
+  'headphones': '🎧',
+  'white_check_mark': '✅',
+  'arrow_right': '➡️',
+  'arrow_left': '⬅️',
+  'arrow_forward': '▶️',
+  'arrow_backward': '◀️',
+  'arrow_right_hook': '↪️',
+  'leftwards_arrow_with_hook': '↩️',
+  'red_circle': '🔴',
+  'boom': '💥',
+  'collision': '💥',
+  'copyright': '©️',
+  'thought_balloon': '💭',
+  'recycle': '♻️',
+
+  // Nature
+  'cherry_blossom': '🌸',
+  'rose': '🌹',
+  'scream': '😱',
+
+  // Body parts
+  'eyes': '👀',
+  'tongue': '👅',
+
+  // Misc
+  'poop': '💩',
+  'poo': '💩',
+  'shit': '💩',
+  'hankey': '💩',
+  'innocent': '😇',
+  'kissing_closed_eyes': '😚',
+  'stuck_out_tongue': '😛',
+  'disappointed_relieved': '😥',
+  'confounded': '😖',
+  'raising_hand': '🙋',
+  'no_good': '🙅',
+  'ok_woman': '🙆',
+  'information_desk_person': '💁',
+  'man_tipping_hand': '💁‍♂️',
+  'woman_tipping_hand': '💁‍♀️',
+  'man_gesturing_no': '🙅‍♂️',
+  'woman_gesturing_no': '🙅‍♀️',
+  'man_gesturing_ok': '🙆‍♂️',
+  'woman_gesturing_ok': '🙆‍♀️',
+  'man_raising_hand': '🙋‍♂️',
+  'woman_raising_hand': '🙋‍♀️',
+
+  // Common variations and aliases
+  'slightly_smiling_face': '🙂',
+  'upside_down_face': '🙃',
+  'thinking_face': '🤔',
+  'shrug': '🤷',
+  'facepalm': '🤦',
+  'man_shrugging': '🤷‍♂️',
+  'woman_shrugging': '🤷‍♀️',
+  'man_facepalming': '🤦‍♂️',
+  'woman_facepalming': '🤦‍♀️',
+  'hugging_face': '🤗',
+  'money_mouth_face': '🤑',
+  'nerd_face': '🤓',
+  'face_with_rolling_eyes': '🙄',
+  'zipper_mouth_face': '🤐',
+  'nauseated_face': '🤢',
+  'vomiting_face': '🤮',
+  'sneezing_face': '🤧',
+  'lying_face': '🤥',
+  'drooling_face': '🤤',
+  'sleeping_face': '😴',
+};
+
+/**
+ * Convert emoji shortcodes (like :thumbsup: or :joy:) to Unicode emoji characters
+ */
+export function convertEmojiShortcodes(text: string): string {
+  if (!text || typeof text !== 'string') {
+    return text;
+  }
+
+  return text.replace(/:([a-zA-Z0-9_+-]+):/g, (match, shortcode) => {
+    const emoji = emojiMap[shortcode.toLowerCase()];
+    return emoji || match; // Return the emoji if found, otherwise return the original text
+  });
+}
+
+/**
+ * Check if a string contains emoji shortcodes
+ */
+export function hasEmojiShortcodes(text: string): boolean {
+  if (!text || typeof text !== 'string') {
+    return false;
+  }
+
+  return /:([a-zA-Z0-9_+-]+):/.test(text);
+}
+
+/**
+ * Get available emoji shortcodes
+ */
+export function getAvailableEmojis(): string[] {
+  return Object.keys(emojiMap).sort();
+}
+
+/**
+ * Get emoji for a specific shortcode
+ */
+export function getEmoji(shortcode: string): string | null {
+  return emojiMap[shortcode.toLowerCase()] || null;
+}
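The conversion helper in the new file boils down to a single regex replace against the map. A trimmed, self-contained sketch — same regex and lookup logic as `convertEmojiShortcodes`, with a three-entry subset standing in for the full 150+ entry map:

```typescript
// Minimal subset of the shortcode map from emojiUtils.ts
const emojiMap: Record<string, string> = {
  'thumbsup': '👍',
  'joy': '😂',
  'fire': '🔥',
};

// Match :shortcode:, substitute when known, leave unknown shortcodes untouched.
function convertEmojiShortcodes(text: string): string {
  if (!text || typeof text !== 'string') return text;
  return text.replace(/:([a-zA-Z0-9_+-]+):/g, (match, shortcode) =>
    emojiMap[shortcode.toLowerCase()] ?? match
  );
}

console.log(convertEmojiShortcodes('nice work :thumbsup: :joy:')); // nice work 👍 😂
console.log(convertEmojiShortcodes('unknown :doesnotexist: stays')); // unchanged
```

Returning the original `match` for unknown shortcodes keeps the archive lossless: Rocket.Chat custom emoji this map doesn't know about still render as readable `:name:` text.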
@@ -20,6 +20,5 @@
       "@/*": ["./src/*"]
     }
   },
-  "include": ["src"],
-  "references": [{ "path": "./tsconfig.node.json" }]
+  "include": ["src"]
 }
@@ -4,7 +4,8 @@
     "skipLibCheck": true,
     "module": "ESNext",
     "moduleResolution": "bundler",
-    "allowSyntheticDefaultImports": true
+    "allowSyntheticDefaultImports": true,
+    "noEmit": true
   },
   "include": ["vite.config.ts"]
 }
vite.config.js (compiled Vite config, deleted)
@@ -1,189 +0,0 @@
-var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
-    function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
-    return new (P || (P = Promise))(function (resolve, reject) {
-        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
-        function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
-        function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
-        step((generator = generator.apply(thisArg, _arguments || [])).next());
-    });
-};
-var __generator = (this && this.__generator) || function (thisArg, body) {
-    var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g = Object.create((typeof Iterator === "function" ? Iterator : Object).prototype);
-    return g.next = verb(0), g["throw"] = verb(1), g["return"] = verb(2), typeof Symbol === "function" && (g[Symbol.iterator] = function() { return this; }), g;
-    function verb(n) { return function (v) { return step([n, v]); }; }
-    function step(op) {
-        if (f) throw new TypeError("Generator is already executing.");
-        while (g && (g = 0, op[0] && (_ = 0)), _) try {
-            if (f = 1, y && (t = op[0] & 2 ? y["return"] : op[0] ? y["throw"] || ((t = y["return"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t;
-            if (y = 0, t) op = [op[0] & 2, t.value];
-            switch (op[0]) {
-                case 0: case 1: t = op; break;
-                case 4: _.label++; return { value: op[1], done: false };
-                case 5: _.label++; y = op[1]; op = [0]; continue;
-                case 7: op = _.ops.pop(); _.trys.pop(); continue;
-                default:
-                    if (!(t = _.trys, t = t.length > 0 && t[t.length - 1]) && (op[0] === 6 || op[0] === 2)) { _ = 0; continue; }
-                    if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) { _.label = op[1]; break; }
-                    if (op[0] === 6 && _.label < t[1]) { _.label = t[1]; t = op; break; }
-                    if (t && _.label < t[2]) { _.label = t[2]; _.ops.push(op); break; }
-                    if (t[2]) _.ops.pop();
-                    _.trys.pop(); continue;
-            }
-            op = body.call(thisArg, _);
-        } catch (e) { op = [6, e]; y = 0; } finally { f = t = 0; }
-        if (op[0] & 5) throw op[1]; return { value: op[0] ? op[1] : void 0, done: true };
-    }
-};
-import path from "path";
-import { defineConfig } from 'vite';
-import react from '@vitejs/plugin-react';
-import { loadEnv } from "vite";
-import fs from 'fs-extra';
-// https://vitejs.dev/config/
-export default defineConfig(function (_a) {
-    var mode = _a.mode;
-    var env = loadEnv(mode, process.cwd(), "");
-    var isDev = mode === 'development';
-    return {
-        plugins: [
-            react(),
-            {
-                name: 'copy-build',
-                closeBundle: function () { return __awaiter(void 0, void 0, void 0, function () {
-                    var sourcePath, targetPath, error_1;
-                    return __generator(this, function (_a) {
-                        switch (_a.label) {
-                            case 0:
-                                if (!!isDev) return [3 /*break*/, 6];
-                                sourcePath = path.resolve(__dirname, 'build');
-                                targetPath = path.resolve(__dirname, '../inventory-server/frontend/build');
-                                _a.label = 1;
-                            case 1:
-                                _a.trys.push([1, 5, , 6]);
-                                return [4 /*yield*/, fs.ensureDir(path.dirname(targetPath))];
-                            case 2:
-                                _a.sent();
-                                return [4 /*yield*/, fs.remove(targetPath)];
-                            case 3:
-                                _a.sent();
-                                return [4 /*yield*/, fs.copy(sourcePath, targetPath)];
-                            case 4:
-                                _a.sent();
-                                console.log('Build files copied successfully to server directory!');
-                                return [3 /*break*/, 6];
-                            case 5:
-                                error_1 = _a.sent();
-                                console.error('Error copying build files:', error_1);
-                                process.exit(1);
-                                return [3 /*break*/, 6];
-                            case 6: return [2 /*return*/];
-                        }
-                    });
-                }); }
-            }
-        ],
-        define: {
-            'process.env.NODE_ENV': JSON.stringify(mode)
-        },
-        resolve: {
-            alias: {
-                "@": path.resolve(__dirname, "./src"),
-            },
-        },
-        server: {
-            host: "0.0.0.0",
-            port: 5173,
-            proxy: {
-                "/api": {
-                    target: "https://inventory.kent.pw",
-                    changeOrigin: true,
-                    secure: false,
-                    ws: true,
-                    xfwd: true,
-                    cookieDomainRewrite: "",
-                    withCredentials: true,
-                    rewrite: function (path) { return path.replace(/^\/api/, "/api"); },
-                    configure: function (proxy, _options) {
-                        proxy.on("error", function (err, req, res) {
-                            console.log("API proxy error:", err);
-                            res.writeHead(500, {
-                                "Content-Type": "application/json",
-                            });
-                            res.end(JSON.stringify({ error: "Proxy Error", message: err.message }));
-                        });
-                        proxy.on("proxyReq", function (proxyReq, req, _res) {
-                            console.log("Outgoing request to API:", {
-                                method: req.method,
-                                url: req.url,
-                                headers: proxyReq.getHeaders(),
-                            });
-                        });
-                        proxy.on("proxyRes", function (proxyRes, req, _res) {
-                            console.log("API Proxy response:", {
-                                statusCode: proxyRes.statusCode,
-                                url: req.url,
-                                headers: proxyRes.headers,
-                            });
-                        });
-                    },
-                },
-                "/auth-inv": {
-                    target: "https://inventory.kent.pw",
-                    changeOrigin: true,
-                    secure: false,
-                    ws: true,
-                    xfwd: true,
-                    cookieDomainRewrite: {
-                        "inventory.kent.pw": "localhost"
-                    },
-                    withCredentials: true,
-                    onProxyReq: function (proxyReq, req) {
-                        // Add origin header to match CORS policy
-                        proxyReq.setHeader('Origin', 'http://localhost:5173');
-                    },
-                    rewrite: function (path) { return path.replace(/^\/auth-inv/, "/auth-inv"); },
-                    configure: function (proxy, _options) {
-                        proxy.on("error", function (err, req, res) {
-                            console.log("Auth proxy error:", err);
-                            res.writeHead(500, {
-                                "Content-Type": "application/json",
-                            });
-                            res.end(JSON.stringify({ error: "Proxy Error", message: err.message }));
-                        });
-                        proxy.on("proxyReq", function (proxyReq, req, _res) {
-                            console.log("Outgoing request to Auth:", {
-                                method: req.method,
-                                url: req.url,
-                                headers: proxyReq.getHeaders(),
-                            });
-                        });
-                        proxy.on("proxyRes", function (proxyRes, req, _res) {
-                            console.log("Auth Proxy response:", {
-                                statusCode: proxyRes.statusCode,
-                                url: req.url,
-                                headers: proxyRes.headers,
-                            });
-                        });
-                    },
-                },
-                "/uploads": {
-                    target: "https://inventory.kent.pw",
-                    changeOrigin: true,
-                    secure: false,
-                    rewrite: function (path) { return path; },
-                },
-            },
-        },
-        build: {
-            outDir: "build",
-            sourcemap: true,
-            rollupOptions: {
-                output: {
-                    manualChunks: {
-                        vendor: ["react", "react-dom", "react-router-dom"],
-                    },
-                },
-            },
-        },
-    };
-});
vite.config.ts
@@ -42,7 +42,7 @@ export default defineConfig(({ mode }) => {
   },
   server: {
     host: "0.0.0.0",
-    port: 5173,
+    port: 5175,
     proxy: {
       "/api": {
         target: "https://inventory.kent.pw",
@@ -91,7 +91,7 @@ export default defineConfig(({ mode }) => {
         withCredentials: true,
         onProxyReq: (proxyReq, req) => {
           // Add origin header to match CORS policy
-          proxyReq.setHeader('Origin', 'http://localhost:5173');
+          proxyReq.setHeader('Origin', 'http://localhost:5175');
         },
         rewrite: (path) => path.replace(/^\/auth-inv/, "/auth-inv"),
         configure: (proxy, _options) => {
@@ -120,6 +120,41 @@ export default defineConfig(({ mode }) => {
           })
         },
       },
+      "/chat-api": {
+        target: "https://inventory.kent.pw",
+        changeOrigin: true,
+        secure: false,
+        ws: true,
+        xfwd: true,
+        cookieDomainRewrite: "",
+        withCredentials: true,
+        rewrite: (path) => path,
+        configure: (proxy, _options) => {
+          proxy.on("error", (err, req, res) => {
+            console.log("Chat API proxy error:", err)
+            res.writeHead(500, {
+              "Content-Type": "application/json",
+            })
+            res.end(
+              JSON.stringify({ error: "Proxy Error", message: err.message })
+            )
+          })
+          proxy.on("proxyReq", (proxyReq, req, _res) => {
+            console.log("Outgoing request to Chat API:", {
+              method: req.method,
+              url: req.url,
+              headers: proxyReq.getHeaders(),
+            })
+          })
+          proxy.on("proxyRes", (proxyRes, req, _res) => {
+            console.log("Chat API Proxy response:", {
+              statusCode: proxyRes.statusCode,
+              url: req.url,
+              headers: proxyRes.headers,
+            })
+          })
+        },
+      },
       "/uploads": {
         target: "https://inventory.kent.pw",
         changeOrigin: true,