Compare commits

10 Commits: 9ce84fe5b9...master

| SHA1 |
|---|
| ec8ab17d3f |
| 100e398aae |
| aec02e490a |
| 3831cef234 |
| 1866cbae7e |
| 3d1e8862f9 |
| 1dcb47cfc5 |
| 167c13c572 |
| 7218e7cc3f |
| 43d76e011d |

PRODUCT_IMPORT_ENHANCEMENTS.md (new file, 375 lines)

@@ -0,0 +1,375 @@

# Product Import Module - Enhancement & Issues Outline

This document outlines the investigation and implementation requirements for each requested enhancement to the product import module.

---

## 1. UPC Import - Strip Quotes and Spaces ✅ IMPLEMENTED

**Issue:** When importing UPCs, strip `'` and `"` characters and any spaces, leaving only numbers.

**Implementation (Completed):**

- Modified `normalizeUpcValue()` in [Import.tsx:661-667](inventory/src/pages/Import.tsx#L661-L667)
- Strips single quotes, double quotes, smart quotes (`'"`), and whitespace before processing
- Then handles scientific notation and extracts only digits (see the sketch after this list)
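
For reference, a minimal sketch of the normalization described above, assuming the raw cell value arrives as a string; the shipped `normalizeUpcValue()` in Import.tsx may differ in its details:

```typescript
// Minimal sketch of the normalization described above.
const normalizeUpcValue = (raw: string): string => {
  // Drop straight quotes, smart quotes, and all whitespace.
  const cleaned = raw.replace(/['"\u2018\u2019\u201C\u201D\s]/g, '');

  // Expand scientific notation (e.g. "8.12345E+11") before extracting digits.
  const asNumber = Number(cleaned);
  const expanded =
    /e\+?\d+$/i.test(cleaned) && Number.isFinite(asNumber)
      ? asNumber.toFixed(0)
      : cleaned;

  // Keep digits only.
  return expanded.replace(/\D/g, '');
};
```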

**Files Modified:**

- `inventory/src/pages/Import.tsx` - `normalizeUpcValue()` function

---

## 2. AI Context Columns in Validation Payloads ✅ IMPLEMENTED

**Issue:** The match columns step has a setting to use a field only for AI context (`isAiSupplemental`). Update the AI description validation to include any columns selected with this option in the payload. Also include these columns in the sanity check payload. They are not needed for name validation.

**Current Implementation:**

- AI Supplemental toggle: [MatchColumnsStep.tsx:102-118](inventory/src/components/product-import/steps/MatchColumnsStep/MatchColumnsStep.tsx#L102-L118)
- AI supplemental data stored in the `__aiSupplemental` field on each row
- Description payload builder: [inlineAiPayload.ts:183-195](inventory/src/components/product-import/steps/ValidationStep/utils/inlineAiPayload.ts#L183-L195)

**Implementation:**

1. **Update `buildDescriptionValidationPayload()` in `inlineAiPayload.ts`** to include AI supplemental data:

```typescript
export const buildDescriptionValidationPayload = (
  row: Data<string>,
  fieldOptions: FieldOptionsMap,
  productLinesCache: Map<string, SelectOption[]>,
  sublinesCache: Map<string, SelectOption[]>
) => {
  const payload: Record<string, unknown> = {
    name: row.name,
    description: row.description,
    company_name: getFieldOptionLabel(row.company, fieldOptions, 'company'),
    company_id: row.company,
    categories: getFieldOptionLabel(row.category, fieldOptions, 'category'),
  };

  // Add AI supplemental context if present
  if (row.__aiSupplemental && typeof row.__aiSupplemental === 'object') {
    payload.additional_context = row.__aiSupplemental;
  }

  return payload;
};
```

2. **Update the sanity check payload** - Locate the sanity check submission logic and include the `__aiSupplemental` data (see the sketch after this list)

3. **Verify `__aiSupplemental` is properly populated** from `MatchColumnsStep` when columns are marked as AI-context-only
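
Step 2 can likely reuse the same pattern as the description payload. A minimal sketch, assuming the sanity check builds a plain payload object and that rows carry the same `__aiSupplemental` field; `withAiContext` is a hypothetical helper name:

```typescript
// Hypothetical helper: merge AI-context columns into any validation payload.
const withAiContext = (
  basePayload: Record<string, unknown>,
  row: Data<string>
): Record<string, unknown> =>
  row.__aiSupplemental && typeof row.__aiSupplemental === 'object'
    ? { ...basePayload, additional_context: row.__aiSupplemental }
    : basePayload;
```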

**Files to Modify:**

- `inventory/src/components/product-import/steps/ValidationStep/utils/inlineAiPayload.ts`
- Backend sanity check endpoint (if separate from description validation)
- Verify data flow in `MatchColumnsStep.tsx` → `ValidationStep`

---

## 3. Fresh Taxonomy Data Per Session ✅ IMPLEMENTED

**Issue:** Ensure taxonomy data is fetched fresh for each session - the cache should be invalidated when the import flow is exited and started again.

**Current Implementation:**

- Field options cached for 5 minutes: [ValidationStep/index.tsx:128-133](inventory/src/components/product-import/steps/ValidationStep/index.tsx#L128-L133)
- Product lines cache: `productLinesCache` in the Zustand store
- Sublines cache: `sublinesCache` in the Zustand store
- Caches set to a 10-minute stale time

**Implementation:**

1. **Add cache invalidation on import flow mount/unmount** in `UploadFlow.tsx`:

```typescript
useEffect(() => {
  // On mount - invalidate import-related query cache
  queryClient.invalidateQueries({ queryKey: ['import-field-options'] });

  return () => {
    // On unmount - clear caches
    queryClient.removeQueries({ queryKey: ['import-field-options'] });
    queryClient.removeQueries({ queryKey: ['product-lines'] });
    queryClient.removeQueries({ queryKey: ['sublines'] });
  };
}, []);
```

2. **Clear Zustand store caches** when exiting the import flow (see the sketch after this list):
   - Add an action to `validationStore.ts` to clear `productLinesCache` and `sublinesCache`
   - Call this action on unmount of `UploadFlow` or when navigating away
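
A minimal sketch of such an action, assuming the store is built with Zustand's `create` and that the caches are `Map`s of `SelectOption[]` as described above; the real `validationStore.ts` shape may differ:

```typescript
import { create } from 'zustand';

interface ValidationCacheState {
  productLinesCache: Map<string, SelectOption[]>;
  sublinesCache: Map<string, SelectOption[]>;
  clearTaxonomyCaches: () => void;
}

export const useValidationStore = create<ValidationCacheState>((set) => ({
  productLinesCache: new Map(),
  sublinesCache: new Map(),
  // Reset both taxonomy caches so the next import session refetches them.
  clearTaxonomyCaches: () =>
    set({ productLinesCache: new Map(), sublinesCache: new Map() }),
}));
```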

3. **Consider adding a `sessionId`** that changes on each import flow start and is used as part of the cache keys (sketched below)
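
One way to do this, assuming React Query is used for the taxonomy queries (as in step 1); `fetchFieldOptions` and the exact key layout are illustrative assumptions:

```typescript
import { useState } from 'react';
import { useQuery } from '@tanstack/react-query';

// Inside the import flow component: a per-session id folded into the query
// keys so a restarted flow never reuses a previous session's cache entries.
const [importSessionId] = useState(() => crypto.randomUUID());

useQuery({
  queryKey: ['import-field-options', importSessionId],
  queryFn: fetchFieldOptions, // assumed fetcher for the field options
  staleTime: 5 * 60 * 1000,
});
```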

**Files to Modify:**

- `inventory/src/components/product-import/steps/UploadFlow.tsx` - Add cleanup effect
- `inventory/src/components/product-import/steps/ValidationStep/store/validationStore.ts` - Add cache clear action
- Potentially `inventory/src/components/product-import/steps/ValidationStep/index.tsx` - Query key updates

---

## 4. Save Template from Confirmation Page ✅ IMPLEMENTED

**Issue:** Add an option to save rows of submitted data as a new template on the confirmation page after completing the import flow. Verify this works with the new validation step changes.

**Current Implementation:**

- **Import Results section already exists** inline in [Import.tsx:968-1150](inventory/src/pages/Import.tsx#L968-L1150)
  - Shows created products (lines 1021-1097) with image, name, UPC, and item number
  - Shows errored products (lines 1100-1138) with error details
  - "Fix products with errors" button resumes the validation flow for failed items
- Template saving logic in ValidationStep: [useTemplateManagement.ts:204-266](inventory/src/components/product-import/steps/ValidationStep/hooks/useTemplateManagement.ts#L204-L266)
  - Saves via `POST /api/templates`
- `importOutcome.submittedProducts` contains the full product data for each row

**Implementation:**

1. **Add a "Save as Template" button** to each created product row in the results section (around lines 1087-1092 in Import.tsx):

```tsx
// Add button after the item number display
<Button
  variant="ghost"
  size="sm"
  onClick={() => handleSaveAsTemplate(index)}
>
  <BookmarkPlus className="h-4 w-4" />
</Button>
```

2. **Add state and dialog** for template saving in Import.tsx:

```typescript
const [templateSaveDialogOpen, setTemplateSaveDialogOpen] = useState(false);
const [selectedProductForTemplate, setSelectedProductForTemplate] = useState<NormalizedProduct | null>(null);
```

3. **Extract/reuse template save logic** from `useTemplateManagement.ts`:
   - The `saveNewTemplate()` function (lines 204-266) can be extracted into a shared utility
   - Or create a `SaveTemplateDialog` component that can be used in both places
   - Key fields needed: `company` (for the template name), `product_type`, and all product field values

4. **Data mapping consideration** (see the sketch after this list):
   - `importOutcome.submittedProducts` uses the `NormalizedProduct` type
   - Templates expect raw field values - may need to map back from the normalized format
   - Exclude metadata fields: `['id', '__index', '__meta', '__template', '__original', '__corrected', '__changes', '__aiSupplemental']`
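
A minimal sketch of that mapping, assuming `NormalizedProduct` rows are flat records whose keys include the metadata fields listed above; the real conversion may need more than key filtering:

```typescript
const TEMPLATE_EXCLUDED_FIELDS: readonly string[] = [
  'id', '__index', '__meta', '__template', '__original',
  '__corrected', '__changes', '__aiSupplemental',
];

// Strip import-only metadata so only raw field values reach the template.
const toTemplateFields = (
  product: Record<string, unknown>
): Record<string, unknown> =>
  Object.fromEntries(
    Object.entries(product).filter(
      ([key]) => !TEMPLATE_EXCLUDED_FIELDS.includes(key)
    )
  );
```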

**Files to Modify:**

- `inventory/src/pages/Import.tsx` - Add save template button, state, and dialog
- Consider creating `inventory/src/components/product-import/SaveTemplateDialog.tsx` for reusability
- Potentially extract the core save logic from `useTemplateManagement.ts` into a shared utility

---

## 5. Sheet Preview on Select Sheet Step ✅ IMPLEMENTED

**Issue:** On the select sheet step, show a preview of the first 10 lines or so of each sheet underneath the options.

**Implementation (Completed):**

- Added `workbook` prop to the `SelectSheetStep` component
- Added `sheetPreviews` memoized computation using `XLSXLib.utils.sheet_to_json()` (see the sketch after this list)
- Shows the first 10 rows and at most 8 columns per sheet
- Added `truncateCell()` helper to limit cell content to 30 characters with an ellipsis
- Each sheet option is now a clickable card with:
  - Radio button and sheet name
  - Row count indicator
  - Scrollable preview table with horizontal scroll
  - Selected state highlighted with a primary border
- Updated `UploadFlow.tsx` to pass the workbook prop
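
A minimal sketch of that preview computation inside the component, assuming `XLSXLib` is the SheetJS module already used by the import flow and `workbook` is a SheetJS `WorkBook`; the shipped code may differ:

```typescript
const truncateCell = (value: unknown): string => {
  const text = String(value ?? '');
  return text.length > 30 ? `${text.slice(0, 30)}…` : text;
};

const sheetPreviews = useMemo(() => {
  if (!workbook) return {};
  return Object.fromEntries(
    workbook.SheetNames.map((name) => {
      // header: 1 returns an array of row arrays rather than keyed objects.
      const rows = XLSXLib.utils.sheet_to_json<unknown[]>(
        workbook.Sheets[name],
        { header: 1 }
      );
      const preview = rows
        .slice(0, 10)
        .map((row) => row.slice(0, 8).map(truncateCell));
      return [name, { preview, rowCount: rows.length }];
    })
  );
}, [workbook]);
```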

**Files Modified:**

- `inventory/src/components/product-import/steps/SelectSheetStep/SelectSheetStep.tsx`
- `inventory/src/components/product-import/steps/UploadFlow.tsx`

---

## 6. Empty Row Removal ✅ IMPLEMENTED

**Issue:** When importing a sheet, automatically remove completely empty rows.

**Current Implementation:**

- Empty columns are filtered: [MatchColumnsStep.tsx:616-634](inventory/src/components/product-import/steps/MatchColumnsStep/MatchColumnsStep.tsx#L616-L634)
- A "Remove empty/duplicates" button exists that removes empty rows, single-value rows, AND duplicates
- The automatic removal should ONLY remove completely empty rows, not duplicates or single-value rows

**Implementation (Completed):**

- Added `isRowCompletelyEmpty()` helper function to [SelectHeaderStep.tsx](inventory/src/components/product-import/steps/SelectHeaderStep/SelectHeaderStep.tsx) (see the sketch after this list)
- Added `useMemo` to filter empty rows on initial data load
- Uses `Object.values(row)` to check all cell values (matches the existing button logic)
- Only removes rows where ALL values are undefined, null, or whitespace-only strings
- Manual "Remove Empty/Duplicates" button still available for additional cleanup (duplicates, single-value rows)
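
A minimal sketch of the helper and filter matching the behaviour described above; the actual code in `SelectHeaderStep.tsx` may differ slightly:

```typescript
const isRowCompletelyEmpty = (row: Record<string, unknown>): boolean =>
  Object.values(row).every(
    (value) =>
      value === undefined ||
      value === null ||
      (typeof value === 'string' && value.trim() === '')
  );

// Drop fully empty rows once, when the raw sheet data first arrives.
const nonEmptyRows = useMemo(
  () => data.filter((row) => !isRowCompletelyEmpty(row)),
  [data]
);
```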

**Files Modified:**

- `inventory/src/components/product-import/steps/SelectHeaderStep/SelectHeaderStep.tsx`

---

## 7. Unit Conversion for Weight/Dimensions ✅ IMPLEMENTED

**Issue:** Add a unit conversion feature for the weight and dimension columns - similar to the calculator button on cost/MSRP, add a button that opens a popover with options to convert grams → oz and lbs → oz for the whole column at once.

**Current Implementation:**

- Calculator button on price columns: [ValidationTable.tsx:1491-1627](inventory/src/components/product-import/steps/ValidationStep/components/ValidationTable.tsx#L1491-L1627)
- `PriceColumnHeader` component shows a calculator icon on hover
- Weight field defined in config with validation

**Implementation:**

1. **Create a `UnitConversionColumnHeader` component** (similar to `PriceColumnHeader`):

```typescript
const UnitConversionColumnHeader = ({ field, table }) => {
  const [showPopover, setShowPopover] = useState(false);

  // Assumes the field key distinguishes weight vs. dimension columns (see step 2).
  const fieldType: 'weight' | 'dimensions' = field.key.startsWith('weight')
    ? 'weight'
    : 'dimensions';

  const conversions = {
    weight: [
      { label: 'Grams → Ounces', factor: 0.035274 },
      { label: 'Pounds → Ounces', factor: 16 },
      { label: 'Kilograms → Ounces', factor: 35.274 },
    ],
    dimensions: [
      { label: 'Centimeters → Inches', factor: 0.393701 },
      { label: 'Millimeters → Inches', factor: 0.0393701 },
    ],
  };

  const applyConversion = (factor: number) => {
    // Batch update all cells in the column. updateCell is assumed to be the
    // same cell update helper that PriceColumnHeader uses.
    table.rows.forEach((row, index) => {
      const currentValue = parseFloat(row[field.key]);
      if (!isNaN(currentValue)) {
        updateCell(index, field.key, (currentValue * factor).toFixed(2));
      }
    });
  };

  return (
    <Popover open={showPopover} onOpenChange={setShowPopover}>
      <PopoverTrigger>
        <Scale className="h-4 w-4" /> {/* or a similar icon */}
      </PopoverTrigger>
      <PopoverContent>
        {conversions[fieldType].map(conv => (
          <Button key={conv.label} onClick={() => applyConversion(conv.factor)}>
            {conv.label}
          </Button>
        ))}
      </PopoverContent>
    </Popover>
  );
};
```

2. **Identify weight/dimension fields** in config:
   - `weight_oz`, `length_in`, `width_in`, `height_in` (check actual field keys)

3. **Add the new header to the column header render logic** in `ValidationTable`

**Files to Modify:**

- `inventory/src/components/product-import/steps/ValidationStep/components/ValidationTable.tsx`
- Potentially create a new component file for `UnitConversionColumnHeader`
- Update column header rendering to use the new component for weight/dimension fields

---

## 8. Expanded MSRP Auto-Fill from Cost ✅ IMPLEMENTED

**Issue:** Expand the auto-fill functionality for MSRP from cost - open a small popover with options for 2x, 2.1x, 2.2x, 2.3x, 2.4x, and 2.5x multipliers, plus a checkbox to round up to the nearest price ending in 9 (e.g. 1.41 → 1.49).

**Current Implementation:**

- Calculator on MSRP column: [ValidationTable.tsx:1540-1584](inventory/src/components/product-import/steps/ValidationStep/components/ValidationTable.tsx#L1540-L1584)
- Currently it only does `Cost × 2`, then subtracts 0.01 if the result is a whole number

**Implementation:**

1. **Replace the simple click handler with a popover** in `PriceColumnHeader`:

```typescript
const [selectedMultiplier, setSelectedMultiplier] = useState(2.0);
const [roundToNine, setRoundToNine] = useState(false);
const multipliers = [2.0, 2.1, 2.2, 2.3, 2.4, 2.5];

const roundUpToNine = (value: number): number => {
  // 1.41 → 1.49, 2.78 → 2.79, 12.32 → 12.39
  // Work in cents to avoid floating-point drift, then force the cents to end in 9.
  const cents = Math.round(value * 100);
  const tensOfCents = Math.floor(cents / 10);
  return (tensOfCents * 10 + 9) / 100;
};

const calculateMsrp = (cost: number): number => {
  let result = cost * selectedMultiplier;
  if (roundToNine) {
    result = roundUpToNine(result);
  }
  return result;
};
```

2. **Create the popover UI** (the `applyCalculation` handler is sketched after this block):

```tsx
<Popover>
  <PopoverTrigger><Calculator className="h-4 w-4" /></PopoverTrigger>
  <PopoverContent className="w-48">
    <div className="space-y-2">
      <Label>Multiplier</Label>
      <div className="grid grid-cols-3 gap-1">
        {multipliers.map(m => (
          <Button
            key={m}
            variant={selectedMultiplier === m ? 'default' : 'outline'}
            size="sm"
            onClick={() => setSelectedMultiplier(m)}
          >
            {m}x
          </Button>
        ))}
      </div>
      <div className="flex items-center gap-2">
        <Checkbox
          checked={roundToNine}
          onCheckedChange={(checked) => setRoundToNine(checked === true)}
        />
        <Label>Round to .X9</Label>
      </div>
      <Button onClick={applyCalculation} className="w-full">
        Apply
      </Button>
    </div>
  </PopoverContent>
</Popover>
```
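
The Apply button above references an `applyCalculation` handler that is not shown. A minimal sketch, assuming the same `table.rows`/`updateCell` helpers used elsewhere in `ValidationTable` and that `cost` and `msrp` are the relevant field keys (all of these names are assumptions):

```typescript
// Illustrative handler: apply the selected multiplier (and optional .X9
// rounding) to every row that has a numeric cost.
const applyCalculation = () => {
  table.rows.forEach((row, index) => {
    const cost = parseFloat(row.cost);
    if (!isNaN(cost)) {
      updateCell(index, 'msrp', calculateMsrp(cost).toFixed(2));
    }
  });
};
```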

**Files to Modify:**

- `inventory/src/components/product-import/steps/ValidationStep/components/ValidationTable.tsx` - `PriceColumnHeader` component

---

## 9. Debug Mode - Skip API Submission ✅ IMPLEMENTED

**Issue:** Add a third switch to the footer of the image upload step (visible only to users with the `admin:debug` permission) that skips submitting data to any API and instead completes the process and shows the results page as if the submission had succeeded.

**Implementation (Completed):**

- Added `skipApiSubmission` state to `ImageUploadStep.tsx`
- Added an amber-colored "Skip API (Debug)" switch (visible only with the `admin:debug` permission)
- When skip is active, the "Use Test API" and "Use Test Database" switches are hidden
- Added `skipApiSubmission?: boolean` to the `SubmitOptions` type in `types.ts`
- In `Import.tsx`, when `skipApiSubmission` is true (see the sketch after this list):
  - Skips the actual API call entirely
  - Generates a mock success response with mock PIDs
  - Shows a `[DEBUG]` prefix in the toast and result message
  - Displays the results page as if the submission succeeded
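
A minimal sketch of that branch inside `handleData()`, assuming an `options.skipApiSubmission` flag from `SubmitOptions`; `rows`, `toast`, `showResults`, and the mock response shape are illustrative assumptions, not the actual code in Import.tsx:

```typescript
if (options.skipApiSubmission) {
  // Debug mode: fabricate a success response instead of calling the API.
  const mockResponse = {
    success: true,
    products: rows.map((row, index) => ({
      ...row,
      pid: `DEBUG-${index + 1}`, // placeholder PID, never persisted
    })),
  };
  toast.success('[DEBUG] Import simulated - no data was submitted');
  showResults(mockResponse);
  return;
}
```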

**Files Modified:**

- `inventory/src/components/product-import/types.ts` - Added `skipApiSubmission` to `SubmitOptions`
- `inventory/src/components/product-import/steps/ImageUploadStep/ImageUploadStep.tsx` - Added switch UI
- `inventory/src/pages/Import.tsx` - Added skip logic in `handleData()`

---

## Summary

| # | Enhancement | Complexity | Status |
|---|-------------|------------|--------|
| 1 | Strip UPC quotes/spaces | Low | ✅ Implemented |
| 2 | AI context in validation | Medium | ✅ Implemented |
| 3 | Fresh taxonomy per session | Medium | ✅ Implemented |
| 4 | Save template from confirmation | Medium-High | ✅ Implemented |
| 5 | Sheet preview | Low-Medium | ✅ Implemented |
| 6 | Remove empty rows | Low | ✅ Implemented |
| 7 | Unit conversion | Medium | ✅ Implemented |
| 8 | MSRP multiplier options | Medium | ✅ Implemented |
| 9 | Debug skip API | Low-Medium | ✅ Implemented |

**Implemented:** 9 of 9 items - All enhancements complete!

---

*Document generated: 2026-01-25*

docs/ai-validation-redesign.md (new file, 2846 lines)

File diff suppressed because it is too large.
@@ -35,7 +35,7 @@ global.pool = pool;
 app.use(express.json());
 app.use(morgan('combined'));
 app.use(cors({
-  origin: ['http://localhost:5175', 'http://localhost:5174', 'https://inventory.kent.pw', 'https://acot.site'],
+  origin: ['http://localhost:5175', 'http://localhost:5174', 'https://inventory.kent.pw', 'https://acot.site', 'https://tools.acherryontop.com'],
   credentials: true
 }));

@@ -33,7 +33,7 @@ global.pool = pool;
 app.use(express.json());
 app.use(morgan('combined'));
 app.use(cors({
-  origin: ['http://localhost:5175', 'http://localhost:5174', 'https://inventory.kent.pw', 'https://acot.site'],
+  origin: ['http://localhost:5175', 'http://localhost:5174', 'https://inventory.kent.pw', 'https://acot.site', 'https://tools.acherryontop.com'],
   credentials: true
 }));
@@ -163,6 +163,7 @@ router.post('/simulate', async (req, res) => {
     productPromo = {},
     shippingPromo = {},
     shippingTiers = [],
+    surcharges = [],
     merchantFeePercent,
     fixedCostPerOrder,
     cogsCalculationMode = 'actual',
@@ -219,6 +220,17 @@ router.post('/simulate', async (req, res) => {
           .filter(tier => tier.threshold >= 0 && tier.value >= 0)
           .sort((a, b) => a.threshold - b.threshold)
         : [],
+      surcharges: Array.isArray(surcharges)
+        ? surcharges
+            .map(s => ({
+              threshold: Number(s.threshold || 0),
+              maxThreshold: typeof s.maxThreshold === 'number' && s.maxThreshold > 0 ? s.maxThreshold : null,
+              target: s.target === 'shipping' || s.target === 'order' ? s.target : 'shipping',
+              amount: Number(s.amount || 0)
+            }))
+            .filter(s => s.threshold >= 0 && s.amount >= 0)
+            .sort((a, b) => a.threshold - b.threshold)
+        : [],
       points: {
         pointsPerDollar: typeof pointsConfig.pointsPerDollar === 'number' ? pointsConfig.pointsPerDollar : null,
         redemptionRate: typeof pointsConfig.redemptionRate === 'number' ? pointsConfig.redemptionRate : null,
@@ -407,7 +419,7 @@ router.post('/simulate', async (req, res) => {
       };

       const orderValue = data.avgSubtotal > 0 ? data.avgSubtotal : getMidpoint(range);
-      const shippingChargeBase = data.avgShipRate > 0 ? data.avgShipRate : 0;
+      const shippingChargeBase = data.avgShipCost > 0 ? data.avgShipCost : 0;
       const actualShippingCost = data.avgShipCost > 0 ? data.avgShipCost : 0;

       // Calculate COGS based on the selected mode
@@ -459,8 +471,23 @@ router.post('/simulate', async (req, res) => {
         shipPromoDiscount = Math.min(shipPromoDiscount, shippingAfterAuto);
       }

-      const customerShipCost = Math.max(0, shippingAfterAuto - shipPromoDiscount);
-      const customerItemCost = Math.max(0, orderValue - promoProductDiscount);
+      // Calculate surcharges
+      let shippingSurcharge = 0;
+      let orderSurcharge = 0;
+      for (const surcharge of config.surcharges) {
+        const meetsMin = orderValue >= surcharge.threshold;
+        const meetsMax = surcharge.maxThreshold == null || orderValue < surcharge.maxThreshold;
+        if (meetsMin && meetsMax) {
+          if (surcharge.target === 'shipping') {
+            shippingSurcharge += surcharge.amount;
+          } else if (surcharge.target === 'order') {
+            orderSurcharge += surcharge.amount;
+          }
+        }
+      }
+
+      const customerShipCost = Math.max(0, shippingAfterAuto - shipPromoDiscount + shippingSurcharge);
+      const customerItemCost = Math.max(0, orderValue - promoProductDiscount + orderSurcharge);
       const totalRevenue = customerItemCost + customerShipCost;

       const merchantFees = totalRevenue * (config.merchantFeePercent / 100);
@@ -488,6 +515,8 @@ router.post('/simulate', async (req, res) => {
         shippingChargeBase,
         shippingAfterAuto,
         shipPromoDiscount,
+        shippingSurcharge,
+        orderSurcharge,
         customerShipCost,
         actualShippingCost,
         totalRevenue,
@@ -0,0 +1,683 @@
|
||||
const express = require('express');
|
||||
const { DateTime } = require('luxon');
|
||||
|
||||
const router = express.Router();
|
||||
const { getDbConnection, getPoolStatus } = require('../db/connection');
|
||||
const {
|
||||
getTimeRangeConditions,
|
||||
_internal: timeHelpers
|
||||
} = require('../utils/timeUtils');
|
||||
|
||||
const TIMEZONE = 'America/New_York';
|
||||
|
||||
// Punch types from the database
|
||||
const PUNCH_TYPES = {
|
||||
OUT: 0,
|
||||
IN: 1,
|
||||
BREAK_START: 2,
|
||||
BREAK_END: 3,
|
||||
};
|
||||
|
||||
// Standard hours for FTE calculation (40 hours per week)
|
||||
const STANDARD_WEEKLY_HOURS = 40;
|
||||
|
||||
/**
|
||||
* Calculate working hours from timeclock entries
|
||||
* Groups punches by employee and date, pairs in/out punches
|
||||
* Returns both total hours (with breaks, for FTE) and productive hours (without breaks, for productivity)
|
||||
*/
|
||||
function calculateHoursFromPunches(punches) {
|
||||
// Group by employee
|
||||
const byEmployee = new Map();
|
||||
|
||||
punches.forEach(punch => {
|
||||
if (!byEmployee.has(punch.EmployeeID)) {
|
||||
byEmployee.set(punch.EmployeeID, []);
|
||||
}
|
||||
byEmployee.get(punch.EmployeeID).push(punch);
|
||||
});
|
||||
|
||||
const employeeHours = [];
|
||||
let totalHours = 0;
|
||||
let totalBreakHours = 0;
|
||||
|
||||
byEmployee.forEach((employeePunches, employeeId) => {
|
||||
// Sort by timestamp
|
||||
employeePunches.sort((a, b) => new Date(a.TimeStamp) - new Date(b.TimeStamp));
|
||||
|
||||
let hours = 0;
|
||||
let breakHours = 0;
|
||||
let currentIn = null;
|
||||
let breakStart = null;
|
||||
|
||||
employeePunches.forEach(punch => {
|
||||
const punchTime = new Date(punch.TimeStamp);
|
||||
|
||||
switch (punch.PunchType) {
|
||||
case PUNCH_TYPES.IN:
|
||||
currentIn = punchTime;
|
||||
break;
|
||||
case PUNCH_TYPES.OUT:
|
||||
if (currentIn) {
|
||||
hours += (punchTime - currentIn) / (1000 * 60 * 60); // Convert ms to hours
|
||||
currentIn = null;
|
||||
}
|
||||
break;
|
||||
case PUNCH_TYPES.BREAK_START:
|
||||
breakStart = punchTime;
|
||||
break;
|
||||
case PUNCH_TYPES.BREAK_END:
|
||||
if (breakStart) {
|
||||
breakHours += (punchTime - breakStart) / (1000 * 60 * 60);
|
||||
breakStart = null;
|
||||
}
|
||||
break;
|
||||
}
|
||||
});
|
||||
|
||||
totalHours += hours;
|
||||
totalBreakHours += breakHours;
|
||||
|
||||
employeeHours.push({
|
||||
employeeId,
|
||||
hours,
|
||||
breakHours,
|
||||
productiveHours: hours - breakHours,
|
||||
});
|
||||
});
|
||||
|
||||
return {
|
||||
employeeHours,
|
||||
totalHours,
|
||||
totalBreakHours,
|
||||
totalProductiveHours: totalHours - totalBreakHours
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Calculate FTE (Full Time Equivalents) for a period
|
||||
* @param {number} totalHours - Total hours worked
|
||||
* @param {Date} startDate - Period start
|
||||
* @param {Date} endDate - Period end
|
||||
*/
|
||||
function calculateFTE(totalHours, startDate, endDate) {
|
||||
const start = new Date(startDate);
|
||||
const end = new Date(endDate);
|
||||
const days = Math.max(1, (end - start) / (1000 * 60 * 60 * 24));
|
||||
const weeks = days / 7;
|
||||
const expectedHours = weeks * STANDARD_WEEKLY_HOURS;
|
||||
|
||||
return expectedHours > 0 ? totalHours / expectedHours : 0;
|
||||
}
|
||||
|
||||
// Main employee metrics endpoint
|
||||
router.get('/', async (req, res) => {
|
||||
const startTime = Date.now();
|
||||
console.log(`[EMPLOYEE-METRICS] Starting request for timeRange: ${req.query.timeRange}`);
|
||||
|
||||
const timeoutPromise = new Promise((_, reject) => {
|
||||
setTimeout(() => reject(new Error('Request timeout after 30 seconds')), 30000);
|
||||
});
|
||||
|
||||
try {
|
||||
const mainOperation = async () => {
|
||||
const { timeRange, startDate, endDate } = req.query;
|
||||
console.log(`[EMPLOYEE-METRICS] Getting DB connection...`);
|
||||
const { connection, release } = await getDbConnection();
|
||||
console.log(`[EMPLOYEE-METRICS] DB connection obtained in ${Date.now() - startTime}ms`);
|
||||
|
||||
const { whereClause, params, dateRange } = getTimeRangeConditions(timeRange, startDate, endDate);
|
||||
|
||||
// Adapt where clause for timeclock table (uses TimeStamp instead of date_placed)
|
||||
const timeclockWhere = whereClause.replace(/date_placed/g, 'tc.TimeStamp');
|
||||
|
||||
// Query for timeclock data with employee names
|
||||
const timeclockQuery = `
|
||||
SELECT
|
||||
tc.EmployeeID,
|
||||
tc.TimeStamp,
|
||||
tc.PunchType,
|
||||
e.firstname,
|
||||
e.lastname
|
||||
FROM timeclock tc
|
||||
LEFT JOIN employees e ON tc.EmployeeID = e.employeeid
|
||||
WHERE ${timeclockWhere}
|
||||
AND e.hidden = 0
|
||||
AND e.disabled = 0
|
||||
ORDER BY tc.EmployeeID, tc.TimeStamp
|
||||
`;
|
||||
|
||||
const [timeclockRows] = await connection.execute(timeclockQuery, params);
|
||||
|
||||
// Calculate hours (includes both total hours for FTE and productive hours for productivity)
|
||||
const { employeeHours, totalHours, totalBreakHours, totalProductiveHours } = calculateHoursFromPunches(timeclockRows);
|
||||
|
||||
// Get employee names for the results
|
||||
const employeeNames = new Map();
|
||||
timeclockRows.forEach(row => {
|
||||
if (!employeeNames.has(row.EmployeeID)) {
|
||||
employeeNames.set(row.EmployeeID, {
|
||||
firstname: row.firstname || '',
|
||||
lastname: row.lastname || '',
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Enrich employee hours with names
|
||||
const enrichedEmployeeHours = employeeHours.map(eh => ({
|
||||
...eh,
|
||||
name: employeeNames.has(eh.employeeId)
|
||||
? `${employeeNames.get(eh.employeeId).firstname} ${employeeNames.get(eh.employeeId).lastname}`.trim()
|
||||
: `Employee ${eh.employeeId}`,
|
||||
})).sort((a, b) => b.hours - a.hours);
|
||||
|
||||
// Query for picking tickets - using subquery to avoid duplication from bucket join
|
||||
// Ship-together orders: only count main orders (is_sub = 0 or NULL), not sub-orders
|
||||
const pickingWhere = whereClause.replace(/date_placed/g, 'pt.createddate');
|
||||
|
||||
// First get picking ticket stats without the bucket join (to avoid duplication)
|
||||
const pickingStatsQuery = `
|
||||
SELECT
|
||||
pt.createdby as employeeId,
|
||||
e.firstname,
|
||||
e.lastname,
|
||||
COUNT(DISTINCT pt.pickingid) as ticketCount,
|
||||
SUM(pt.totalpieces_picked) as piecesPicked,
|
||||
SUM(TIMESTAMPDIFF(SECOND, pt.createddate, pt.closeddate)) as pickingTimeSeconds,
|
||||
AVG(NULLIF(pt.picking_speed, 0)) as avgPickingSpeed
|
||||
FROM picking_ticket pt
|
||||
LEFT JOIN employees e ON pt.createdby = e.employeeid
|
||||
WHERE ${pickingWhere}
|
||||
AND pt.closeddate IS NOT NULL
|
||||
GROUP BY pt.createdby, e.firstname, e.lastname
|
||||
`;
|
||||
|
||||
// Separate query for order counts (needs bucket join for ship-together handling)
|
||||
const orderCountQuery = `
|
||||
SELECT
|
||||
pt.createdby as employeeId,
|
||||
COUNT(DISTINCT CASE WHEN ptb.is_sub = 0 OR ptb.is_sub IS NULL THEN ptb.orderid END) as ordersPicked
|
||||
FROM picking_ticket pt
|
||||
LEFT JOIN picking_ticket_buckets ptb ON pt.pickingid = ptb.pickingid
|
||||
WHERE ${pickingWhere}
|
||||
AND pt.closeddate IS NOT NULL
|
||||
GROUP BY pt.createdby
|
||||
`;
|
||||
|
||||
const [[pickingStatsRows], [orderCountRows]] = await Promise.all([
|
||||
connection.execute(pickingStatsQuery, params),
|
||||
connection.execute(orderCountQuery, params)
|
||||
]);
|
||||
|
||||
// Merge the results
|
||||
const orderCountMap = new Map();
|
||||
orderCountRows.forEach(row => {
|
||||
orderCountMap.set(row.employeeId, parseInt(row.ordersPicked || 0));
|
||||
});
|
||||
|
||||
// Aggregate picking totals
|
||||
let totalOrdersPicked = 0;
|
||||
let totalPiecesPicked = 0;
|
||||
let totalTickets = 0;
|
||||
let totalPickingTimeSeconds = 0;
|
||||
let pickingSpeedSum = 0;
|
||||
let pickingSpeedCount = 0;
|
||||
|
||||
const pickingByEmployee = pickingStatsRows.map(row => {
|
||||
const ordersPicked = orderCountMap.get(row.employeeId) || 0;
|
||||
totalOrdersPicked += ordersPicked;
|
||||
totalPiecesPicked += parseInt(row.piecesPicked || 0);
|
||||
totalTickets += parseInt(row.ticketCount || 0);
|
||||
totalPickingTimeSeconds += parseInt(row.pickingTimeSeconds || 0);
|
||||
if (row.avgPickingSpeed && row.avgPickingSpeed > 0) {
|
||||
pickingSpeedSum += parseFloat(row.avgPickingSpeed);
|
||||
pickingSpeedCount++;
|
||||
}
|
||||
|
||||
const empPickingHours = parseInt(row.pickingTimeSeconds || 0) / 3600;
|
||||
|
||||
return {
|
||||
employeeId: row.employeeId,
|
||||
name: `${row.firstname || ''} ${row.lastname || ''}`.trim() || `Employee ${row.employeeId}`,
|
||||
ticketCount: parseInt(row.ticketCount || 0),
|
||||
ordersPicked,
|
||||
piecesPicked: parseInt(row.piecesPicked || 0),
|
||||
pickingHours: empPickingHours,
|
||||
avgPickingSpeed: row.avgPickingSpeed ? parseFloat(row.avgPickingSpeed) : null,
|
||||
};
|
||||
});
|
||||
|
||||
const totalPickingHours = totalPickingTimeSeconds / 3600;
|
||||
const avgPickingSpeed = pickingSpeedCount > 0 ? pickingSpeedSum / pickingSpeedCount : 0;
|
||||
|
||||
// Query for shipped orders - totals
|
||||
// Ship-together orders: only count main orders (order_type != 8 for sub-orders, or use parent tracking)
|
||||
const shippingWhere = whereClause.replace(/date_placed/g, 'o.date_shipped');
|
||||
|
||||
const shippingQuery = `
|
||||
SELECT
|
||||
COUNT(DISTINCT CASE WHEN o.order_type != 8 OR o.order_type IS NULL THEN o.order_id END) as ordersShipped,
|
||||
COALESCE(SUM(o.stats_prod_pieces), 0) as piecesShipped
|
||||
FROM _order o
|
||||
WHERE ${shippingWhere}
|
||||
AND o.order_status IN (100, 92)
|
||||
`;
|
||||
|
||||
const [shippingRows] = await connection.execute(shippingQuery, params);
|
||||
const shipping = shippingRows[0] || { ordersShipped: 0, piecesShipped: 0 };
|
||||
|
||||
// Query for shipped orders by employee
|
||||
const shippingByEmployeeQuery = `
|
||||
SELECT
|
||||
e.employeeid,
|
||||
e.firstname,
|
||||
e.lastname,
|
||||
COUNT(DISTINCT CASE WHEN o.order_type != 8 OR o.order_type IS NULL THEN o.order_id END) as ordersShipped,
|
||||
COALESCE(SUM(o.stats_prod_pieces), 0) as piecesShipped
|
||||
FROM _order o
|
||||
JOIN employees e ON o.stats_cid_shipped = e.cid
|
||||
WHERE ${shippingWhere}
|
||||
AND o.order_status IN (100, 92)
|
||||
AND e.hidden = 0
|
||||
AND e.disabled = 0
|
||||
GROUP BY e.employeeid, e.firstname, e.lastname
|
||||
ORDER BY ordersShipped DESC
|
||||
`;
|
||||
|
||||
const [shippingByEmployeeRows] = await connection.execute(shippingByEmployeeQuery, params);
|
||||
const shippingByEmployee = shippingByEmployeeRows.map(row => ({
|
||||
employeeId: row.employeeid,
|
||||
name: `${row.firstname || ''} ${row.lastname || ''}`.trim() || `Employee ${row.employeeid}`,
|
||||
ordersShipped: parseInt(row.ordersShipped || 0),
|
||||
piecesShipped: parseInt(row.piecesShipped || 0),
|
||||
}));
|
||||
|
||||
// Calculate period dates for FTE calculation
|
||||
let periodStart, periodEnd;
|
||||
if (dateRange?.start) {
|
||||
periodStart = new Date(dateRange.start);
|
||||
} else if (params[0]) {
|
||||
periodStart = new Date(params[0]);
|
||||
} else {
|
||||
periodStart = new Date();
|
||||
periodStart.setDate(periodStart.getDate() - 30);
|
||||
}
|
||||
|
||||
if (dateRange?.end) {
|
||||
periodEnd = new Date(dateRange.end);
|
||||
} else if (params[1]) {
|
||||
periodEnd = new Date(params[1]);
|
||||
} else {
|
||||
periodEnd = new Date();
|
||||
}
|
||||
|
||||
const fte = calculateFTE(totalHours, periodStart, periodEnd);
|
||||
const activeEmployees = enrichedEmployeeHours.filter(e => e.hours > 0).length;
|
||||
|
||||
// Calculate weeks in period for weekly averages
|
||||
const periodDays = Math.max(1, (periodEnd - periodStart) / (1000 * 60 * 60 * 24));
|
||||
const weeksInPeriod = periodDays / 7;
|
||||
|
||||
// Get daily trend data for hours
|
||||
// Use DATE_FORMAT to get date string in Eastern timezone, avoiding JS timezone conversion issues
|
||||
// Business day starts at 1 AM, so subtract 1 hour before taking the date
|
||||
const trendWhere = whereClause.replace(/date_placed/g, 'tc.TimeStamp');
|
||||
const trendQuery = `
|
||||
SELECT
|
||||
DATE_FORMAT(DATE_SUB(tc.TimeStamp, INTERVAL 1 HOUR), '%Y-%m-%d') as date,
|
||||
tc.EmployeeID,
|
||||
tc.TimeStamp,
|
||||
tc.PunchType
|
||||
FROM timeclock tc
|
||||
LEFT JOIN employees e ON tc.EmployeeID = e.employeeid
|
||||
WHERE ${trendWhere}
|
||||
AND e.hidden = 0
|
||||
AND e.disabled = 0
|
||||
ORDER BY date, tc.EmployeeID, tc.TimeStamp
|
||||
`;
|
||||
|
||||
const [trendRows] = await connection.execute(trendQuery, params);
|
||||
|
||||
// Get daily picking data for trend
|
||||
// Ship-together orders: only count main orders (is_sub = 0 or NULL)
|
||||
// Use DATE_FORMAT for consistent date string format
|
||||
const pickingTrendWhere = whereClause.replace(/date_placed/g, 'pt.createddate');
|
||||
const pickingTrendQuery = `
|
||||
SELECT
|
||||
DATE_FORMAT(DATE_SUB(pt.createddate, INTERVAL 1 HOUR), '%Y-%m-%d') as date,
|
||||
COUNT(DISTINCT CASE WHEN ptb.is_sub = 0 OR ptb.is_sub IS NULL THEN ptb.orderid END) as ordersPicked,
|
||||
COALESCE(SUM(pt.totalpieces_picked), 0) as piecesPicked
|
||||
FROM picking_ticket pt
|
||||
LEFT JOIN picking_ticket_buckets ptb ON pt.pickingid = ptb.pickingid
|
||||
WHERE ${pickingTrendWhere}
|
||||
AND pt.closeddate IS NOT NULL
|
||||
GROUP BY DATE_FORMAT(DATE_SUB(pt.createddate, INTERVAL 1 HOUR), '%Y-%m-%d')
|
||||
ORDER BY date
|
||||
`;
|
||||
|
||||
const [pickingTrendRows] = await connection.execute(pickingTrendQuery, params);
|
||||
|
||||
// Create a map of picking data by date
|
||||
const pickingByDate = new Map();
|
||||
pickingTrendRows.forEach(row => {
|
||||
// Date is already a string in YYYY-MM-DD format from DATE_FORMAT
|
||||
const date = String(row.date);
|
||||
pickingByDate.set(date, {
|
||||
ordersPicked: parseInt(row.ordersPicked || 0),
|
||||
piecesPicked: parseInt(row.piecesPicked || 0),
|
||||
});
|
||||
});
|
||||
|
||||
// Group timeclock by date for trend
|
||||
const byDate = new Map();
|
||||
trendRows.forEach(row => {
|
||||
// Date is already a string in YYYY-MM-DD format from DATE_FORMAT
|
||||
const date = String(row.date);
|
||||
if (!byDate.has(date)) {
|
||||
byDate.set(date, []);
|
||||
}
|
||||
byDate.get(date).push(row);
|
||||
});
|
||||
|
||||
// Generate all dates in the period range for complete trend data
|
||||
const allDatesInRange = [];
|
||||
const startDt = DateTime.fromJSDate(periodStart).setZone(TIMEZONE).startOf('day');
|
||||
const endDt = DateTime.fromJSDate(periodEnd).setZone(TIMEZONE).startOf('day');
|
||||
|
||||
let currentDt = startDt;
|
||||
while (currentDt <= endDt) {
|
||||
allDatesInRange.push(currentDt.toFormat('yyyy-MM-dd'));
|
||||
currentDt = currentDt.plus({ days: 1 });
|
||||
}
|
||||
|
||||
// Build trend data for all dates in range, filling zeros for missing days
|
||||
const trend = allDatesInRange.map(date => {
|
||||
const punches = byDate.get(date) || [];
|
||||
const { totalHours: dayHours, employeeHours: dayEmployeeHours } = calculateHoursFromPunches(punches);
|
||||
const picking = pickingByDate.get(date) || { ordersPicked: 0, piecesPicked: 0 };
|
||||
|
||||
// Parse date string in Eastern timezone to get proper ISO timestamp
|
||||
const dateDt = DateTime.fromFormat(date, 'yyyy-MM-dd', { zone: TIMEZONE });
|
||||
|
||||
return {
|
||||
date,
|
||||
timestamp: dateDt.toISO(),
|
||||
hours: dayHours,
|
||||
activeEmployees: dayEmployeeHours.filter(e => e.hours > 0).length,
|
||||
ordersPicked: picking.ordersPicked,
|
||||
piecesPicked: picking.piecesPicked,
|
||||
};
|
||||
});
|
||||
|
||||
// Get previous period data for comparison
|
||||
const previousRange = getPreviousPeriodRange(timeRange, startDate, endDate);
|
||||
let comparison = null;
|
||||
let previousTotals = null;
|
||||
|
||||
if (previousRange) {
|
||||
const prevTimeclockWhere = previousRange.whereClause.replace(/date_placed/g, 'tc.TimeStamp');
|
||||
|
||||
const [prevTimeclockRows] = await connection.execute(
|
||||
`SELECT tc.EmployeeID, tc.TimeStamp, tc.PunchType
|
||||
FROM timeclock tc
|
||||
LEFT JOIN employees e ON tc.EmployeeID = e.employeeid
|
||||
WHERE ${prevTimeclockWhere}
|
||||
AND e.hidden = 0
|
||||
AND e.disabled = 0
|
||||
ORDER BY tc.EmployeeID, tc.TimeStamp`,
|
||||
previousRange.params
|
||||
);
|
||||
|
||||
const {
|
||||
totalHours: prevTotalHours,
|
||||
totalProductiveHours: prevProductiveHours,
|
||||
employeeHours: prevEmployeeHours
|
||||
} = calculateHoursFromPunches(prevTimeclockRows);
|
||||
const prevActiveEmployees = prevEmployeeHours.filter(e => e.hours > 0).length;
|
||||
|
||||
// Previous picking data (ship-together fix applied)
|
||||
// Use separate queries to avoid duplication from bucket join
|
||||
const prevPickingWhere = previousRange.whereClause.replace(/date_placed/g, 'pt.createddate');
|
||||
|
||||
const [[prevPickingStatsRows], [prevOrderCountRows]] = await Promise.all([
|
||||
connection.execute(
|
||||
`SELECT
|
||||
SUM(pt.totalpieces_picked) as piecesPicked,
|
||||
SUM(TIMESTAMPDIFF(SECOND, pt.createddate, pt.closeddate)) as pickingTimeSeconds
|
||||
FROM picking_ticket pt
|
||||
WHERE ${prevPickingWhere}
|
||||
AND pt.closeddate IS NOT NULL`,
|
||||
previousRange.params
|
||||
),
|
||||
connection.execute(
|
||||
`SELECT
|
||||
COUNT(DISTINCT CASE WHEN ptb.is_sub = 0 OR ptb.is_sub IS NULL THEN ptb.orderid END) as ordersPicked
|
||||
FROM picking_ticket pt
|
||||
LEFT JOIN picking_ticket_buckets ptb ON pt.pickingid = ptb.pickingid
|
||||
WHERE ${prevPickingWhere}
|
||||
AND pt.closeddate IS NOT NULL`,
|
||||
previousRange.params
|
||||
)
|
||||
]);
|
||||
|
||||
const prevPickingStats = prevPickingStatsRows[0] || { piecesPicked: 0, pickingTimeSeconds: 0 };
|
||||
const prevOrderCount = prevOrderCountRows[0] || { ordersPicked: 0 };
|
||||
const prevPicking = {
|
||||
ordersPicked: parseInt(prevOrderCount.ordersPicked || 0),
|
||||
piecesPicked: parseInt(prevPickingStats.piecesPicked || 0),
|
||||
pickingTimeSeconds: parseInt(prevPickingStats.pickingTimeSeconds || 0)
|
||||
};
|
||||
const prevPickingHours = prevPicking.pickingTimeSeconds / 3600;
|
||||
|
||||
// Previous shipping data
|
||||
const prevShippingWhere = previousRange.whereClause.replace(/date_placed/g, 'o.date_shipped');
|
||||
const [prevShippingRows] = await connection.execute(
|
||||
`SELECT
|
||||
COUNT(DISTINCT CASE WHEN o.order_type != 8 OR o.order_type IS NULL THEN o.order_id END) as ordersShipped,
|
||||
COALESCE(SUM(o.stats_prod_pieces), 0) as piecesShipped
|
||||
FROM _order o
|
||||
WHERE ${prevShippingWhere}
|
||||
AND o.order_status IN (100, 92)`,
|
||||
previousRange.params
|
||||
);
|
||||
const prevShipping = prevShippingRows[0] || { ordersShipped: 0, piecesShipped: 0 };
|
||||
|
||||
// Calculate previous period FTE and productivity
|
||||
const prevFte = calculateFTE(prevTotalHours, previousRange.start || periodStart, previousRange.end || periodEnd);
|
||||
const prevOrdersPerHour = prevProductiveHours > 0 ? parseInt(prevPicking.ordersPicked || 0) / prevProductiveHours : 0;
|
||||
const prevPiecesPerHour = prevProductiveHours > 0 ? parseInt(prevPicking.piecesPicked || 0) / prevProductiveHours : 0;
|
||||
|
||||
previousTotals = {
|
||||
hours: prevTotalHours,
|
||||
productiveHours: prevProductiveHours,
|
||||
activeEmployees: prevActiveEmployees,
|
||||
fte: prevFte,
|
||||
ordersPicked: parseInt(prevPicking.ordersPicked || 0),
|
||||
piecesPicked: parseInt(prevPicking.piecesPicked || 0),
|
||||
pickingHours: prevPickingHours,
|
||||
ordersShipped: parseInt(prevShipping.ordersShipped || 0),
|
||||
piecesShipped: parseInt(prevShipping.piecesShipped || 0),
|
||||
ordersPerHour: prevOrdersPerHour,
|
||||
piecesPerHour: prevPiecesPerHour,
|
||||
};
|
||||
|
||||
// Calculate productivity metrics for comparison
|
||||
const currentOrdersPerHour = totalProductiveHours > 0 ? totalOrdersPicked / totalProductiveHours : 0;
|
||||
const currentPiecesPerHour = totalProductiveHours > 0 ? totalPiecesPicked / totalProductiveHours : 0;
|
||||
|
||||
comparison = {
|
||||
hours: calculateComparison(totalHours, prevTotalHours),
|
||||
productiveHours: calculateComparison(totalProductiveHours, prevProductiveHours),
|
||||
activeEmployees: calculateComparison(activeEmployees, prevActiveEmployees),
|
||||
fte: calculateComparison(fte, prevFte),
|
||||
ordersPicked: calculateComparison(totalOrdersPicked, parseInt(prevPicking.ordersPicked || 0)),
|
||||
piecesPicked: calculateComparison(totalPiecesPicked, parseInt(prevPicking.piecesPicked || 0)),
|
||||
ordersShipped: calculateComparison(parseInt(shipping.ordersShipped || 0), parseInt(prevShipping.ordersShipped || 0)),
|
||||
piecesShipped: calculateComparison(parseInt(shipping.piecesShipped || 0), parseInt(prevShipping.piecesShipped || 0)),
|
||||
ordersPerHour: calculateComparison(currentOrdersPerHour, prevOrdersPerHour),
|
||||
piecesPerHour: calculateComparison(currentPiecesPerHour, prevPiecesPerHour),
|
||||
};
|
||||
}
|
||||
|
||||
// Calculate efficiency (picking time vs productive hours)
|
||||
const pickingEfficiency = totalProductiveHours > 0 ? (totalPickingHours / totalProductiveHours) * 100 : 0;
|
||||
|
||||
const response = {
|
||||
dateRange,
|
||||
totals: {
|
||||
// Time metrics
|
||||
hours: totalHours,
|
||||
breakHours: totalBreakHours,
|
||||
productiveHours: totalProductiveHours,
|
||||
pickingHours: totalPickingHours,
|
||||
|
||||
// Employee metrics
|
||||
activeEmployees,
|
||||
fte,
|
||||
weeksInPeriod,
|
||||
|
||||
// Picking metrics
|
||||
ordersPicked: totalOrdersPicked,
|
||||
piecesPicked: totalPiecesPicked,
|
||||
ticketCount: totalTickets,
|
||||
|
||||
// Shipping metrics
|
||||
ordersShipped: parseInt(shipping.ordersShipped || 0),
|
||||
piecesShipped: parseInt(shipping.piecesShipped || 0),
|
||||
|
||||
// Calculated metrics - standardized to weekly
|
||||
hoursPerWeek: weeksInPeriod > 0 ? totalHours / weeksInPeriod : 0,
|
||||
hoursPerEmployeePerWeek: activeEmployees > 0 && weeksInPeriod > 0
|
||||
? (totalHours / activeEmployees) / weeksInPeriod
|
||||
: 0,
|
||||
|
||||
// Productivity metrics (uses productive hours - excludes breaks)
|
||||
ordersPerHour: totalProductiveHours > 0 ? totalOrdersPicked / totalProductiveHours : 0,
|
||||
piecesPerHour: totalProductiveHours > 0 ? totalPiecesPicked / totalProductiveHours : 0,
|
||||
|
||||
// Picking speed from database (more accurate, only counts picking time)
|
||||
avgPickingSpeed,
|
||||
|
||||
// Efficiency metrics
|
||||
pickingEfficiency,
|
||||
},
|
||||
previousTotals,
|
||||
comparison,
|
||||
byEmployee: {
|
||||
hours: enrichedEmployeeHours,
|
||||
picking: pickingByEmployee,
|
||||
shipping: shippingByEmployee,
|
||||
},
|
||||
trend,
|
||||
};
|
||||
|
||||
return { response, release };
|
||||
};
|
||||
|
||||
let result;
|
||||
try {
|
||||
result = await Promise.race([mainOperation(), timeoutPromise]);
|
||||
} catch (error) {
|
||||
if (error.message.includes('timeout')) {
|
||||
console.log(`[EMPLOYEE-METRICS] Request timed out in ${Date.now() - startTime}ms`);
|
||||
throw error;
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
|
||||
const { response, release } = result;
|
||||
|
||||
if (release) release();
|
||||
|
||||
console.log(`[EMPLOYEE-METRICS] Request completed in ${Date.now() - startTime}ms`);
|
||||
res.json(response);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error in /employee-metrics:', error);
|
||||
console.log(`[EMPLOYEE-METRICS] Request failed in ${Date.now() - startTime}ms`);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
// Health check
|
||||
router.get('/health', async (req, res) => {
|
||||
try {
|
||||
const { connection, release } = await getDbConnection();
|
||||
await connection.execute('SELECT 1 as test');
|
||||
release();
|
||||
|
||||
res.json({
|
||||
status: 'healthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
pool: getPoolStatus(),
|
||||
});
|
||||
} catch (error) {
|
||||
res.status(500).json({
|
||||
status: 'unhealthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
error: error.message,
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Helper functions
|
||||
function calculateComparison(currentValue, previousValue) {
|
||||
if (typeof previousValue !== 'number') {
|
||||
return { absolute: null, percentage: null };
|
||||
}
|
||||
|
||||
const absolute = typeof currentValue === 'number' ? currentValue - previousValue : null;
|
||||
const percentage =
|
||||
absolute !== null && previousValue !== 0
|
||||
? (absolute / Math.abs(previousValue)) * 100
|
||||
: null;
|
||||
|
||||
return { absolute, percentage };
|
||||
}
|
||||
|
||||
function getPreviousPeriodRange(timeRange, startDate, endDate) {
|
||||
if (timeRange && timeRange !== 'custom') {
|
||||
const prevTimeRange = getPreviousTimeRange(timeRange);
|
||||
if (!prevTimeRange || prevTimeRange === timeRange) {
|
||||
return null;
|
||||
}
|
||||
return getTimeRangeConditions(prevTimeRange);
|
||||
}
|
||||
|
||||
const hasCustomDates = (timeRange === 'custom' || !timeRange) && startDate && endDate;
|
||||
if (!hasCustomDates) {
|
||||
return null;
|
||||
}
|
||||
|
||||
const start = new Date(startDate);
|
||||
const end = new Date(endDate);
|
||||
|
||||
if (Number.isNaN(start.getTime()) || Number.isNaN(end.getTime())) {
|
||||
return null;
|
||||
}
|
||||
|
||||
const duration = end.getTime() - start.getTime();
|
||||
if (!Number.isFinite(duration) || duration <= 0) {
|
||||
return null;
|
||||
}
|
||||
|
||||
const prevEnd = new Date(start.getTime() - 1);
|
||||
const prevStart = new Date(prevEnd.getTime() - duration);
|
||||
|
||||
return getTimeRangeConditions('custom', prevStart.toISOString(), prevEnd.toISOString());
|
||||
}
|
||||
|
||||
function getPreviousTimeRange(timeRange) {
|
||||
const map = {
|
||||
today: 'yesterday',
|
||||
thisWeek: 'lastWeek',
|
||||
thisMonth: 'lastMonth',
|
||||
last7days: 'previous7days',
|
||||
last30days: 'previous30days',
|
||||
last90days: 'previous90days',
|
||||
yesterday: 'twoDaysAgo'
|
||||
};
|
||||
return map[timeRange] || timeRange;
|
||||
}
|
||||
|
||||
module.exports = router;
|
||||
@@ -0,0 +1,470 @@
|
||||
const express = require('express');
|
||||
const { DateTime } = require('luxon');
|
||||
|
||||
const router = express.Router();
|
||||
const { getDbConnection, getPoolStatus } = require('../db/connection');
|
||||
const {
|
||||
getTimeRangeConditions,
|
||||
} = require('../utils/timeUtils');
|
||||
|
||||
const TIMEZONE = 'America/New_York';
|
||||
|
||||
// Main operations metrics endpoint - focused on picking and shipping
|
||||
router.get('/', async (req, res) => {
|
||||
const startTime = Date.now();
|
||||
console.log(`[OPERATIONS-METRICS] Starting request for timeRange: ${req.query.timeRange}`);
|
||||
|
||||
const timeoutPromise = new Promise((_, reject) => {
|
||||
setTimeout(() => reject(new Error('Request timeout after 30 seconds')), 30000);
|
||||
});
|
||||
|
||||
try {
|
||||
const mainOperation = async () => {
|
||||
const { timeRange, startDate, endDate } = req.query;
|
||||
console.log(`[OPERATIONS-METRICS] Getting DB connection...`);
|
||||
const { connection, release } = await getDbConnection();
|
||||
console.log(`[OPERATIONS-METRICS] DB connection obtained in ${Date.now() - startTime}ms`);
|
||||
|
||||
const { whereClause, params, dateRange } = getTimeRangeConditions(timeRange, startDate, endDate);
|
||||
|
||||
// Query for picking tickets - using subquery to avoid duplication from bucket join
|
||||
// Ship-together orders: only count main orders (is_sub = 0 or NULL), not sub-orders
|
||||
const pickingWhere = whereClause.replace(/date_placed/g, 'pt.createddate');
|
||||
|
||||
// First get picking ticket stats without the bucket join (to avoid duplication)
|
||||
const pickingStatsQuery = `
|
||||
SELECT
|
||||
pt.createdby as employeeId,
|
||||
e.firstname,
|
||||
e.lastname,
|
||||
COUNT(DISTINCT pt.pickingid) as ticketCount,
|
||||
SUM(pt.totalpieces_picked) as piecesPicked,
|
||||
SUM(TIMESTAMPDIFF(SECOND, pt.createddate, pt.closeddate)) as pickingTimeSeconds,
|
||||
AVG(NULLIF(pt.picking_speed, 0)) as avgPickingSpeed
|
||||
FROM picking_ticket pt
|
||||
LEFT JOIN employees e ON pt.createdby = e.employeeid
|
||||
WHERE ${pickingWhere}
|
||||
AND pt.closeddate IS NOT NULL
|
||||
GROUP BY pt.createdby, e.firstname, e.lastname
|
||||
`;
|
||||
|
||||
// Separate query for order counts (needs bucket join for ship-together handling)
|
||||
const orderCountQuery = `
|
||||
SELECT
|
||||
pt.createdby as employeeId,
|
||||
COUNT(DISTINCT CASE WHEN ptb.is_sub = 0 OR ptb.is_sub IS NULL THEN ptb.orderid END) as ordersPicked
|
||||
FROM picking_ticket pt
|
||||
LEFT JOIN picking_ticket_buckets ptb ON pt.pickingid = ptb.pickingid
|
||||
WHERE ${pickingWhere}
|
||||
AND pt.closeddate IS NOT NULL
|
||||
GROUP BY pt.createdby
|
||||
`;
|
||||
|
||||
const [[pickingStatsRows], [orderCountRows]] = await Promise.all([
|
||||
connection.execute(pickingStatsQuery, params),
|
||||
connection.execute(orderCountQuery, params)
|
||||
]);
|
||||
|
||||
// Merge the results
|
||||
const orderCountMap = new Map();
|
||||
orderCountRows.forEach(row => {
|
||||
orderCountMap.set(row.employeeId, parseInt(row.ordersPicked || 0));
|
||||
});
|
||||
|
||||
// Aggregate picking totals
|
||||
let totalOrdersPicked = 0;
|
||||
let totalPiecesPicked = 0;
|
||||
let totalTickets = 0;
|
||||
let totalPickingTimeSeconds = 0;
|
||||
let pickingSpeedSum = 0;
|
||||
let pickingSpeedCount = 0;
|
||||
|
||||
const pickingByEmployee = pickingStatsRows.map(row => {
|
||||
const ordersPicked = orderCountMap.get(row.employeeId) || 0;
|
||||
totalOrdersPicked += ordersPicked;
|
||||
totalPiecesPicked += parseInt(row.piecesPicked || 0);
|
||||
totalTickets += parseInt(row.ticketCount || 0);
|
||||
totalPickingTimeSeconds += parseInt(row.pickingTimeSeconds || 0);
|
||||
if (row.avgPickingSpeed && row.avgPickingSpeed > 0) {
|
||||
pickingSpeedSum += parseFloat(row.avgPickingSpeed);
|
||||
pickingSpeedCount++;
|
||||
}
|
||||
|
||||
const empPickingHours = parseInt(row.pickingTimeSeconds || 0) / 3600;
|
||||
|
||||
return {
|
||||
employeeId: row.employeeId,
|
||||
name: `${row.firstname || ''} ${row.lastname || ''}`.trim() || `Employee ${row.employeeId}`,
|
||||
ticketCount: parseInt(row.ticketCount || 0),
|
||||
ordersPicked,
|
||||
piecesPicked: parseInt(row.piecesPicked || 0),
|
||||
pickingHours: empPickingHours,
|
||||
avgPickingSpeed: row.avgPickingSpeed ? parseFloat(row.avgPickingSpeed) : null,
|
||||
};
|
||||
});
|
||||
|
||||
const totalPickingHours = totalPickingTimeSeconds / 3600;
|
||||
const avgPickingSpeed = pickingSpeedCount > 0 ? pickingSpeedSum / pickingSpeedCount : 0;
|
||||
|
||||
// Query for shipped orders - totals
|
||||
// Ship-together orders: only count main orders (order_type != 8 for sub-orders)
|
||||
const shippingWhere = whereClause.replace(/date_placed/g, 'o.date_shipped');
|
||||
|
||||
const shippingQuery = `
|
||||
SELECT
|
||||
COUNT(DISTINCT CASE WHEN o.order_type != 8 OR o.order_type IS NULL THEN o.order_id END) as ordersShipped,
|
||||
COALESCE(SUM(o.stats_prod_pieces), 0) as piecesShipped
|
||||
FROM _order o
|
||||
WHERE ${shippingWhere}
|
||||
AND o.order_status IN (100, 92)
|
||||
`;
|
||||
|
||||
const [shippingRows] = await connection.execute(shippingQuery, params);
|
||||
const shipping = shippingRows[0] || { ordersShipped: 0, piecesShipped: 0 };
|
||||
|
||||
// Query for shipped orders by employee
|
||||
const shippingByEmployeeQuery = `
|
||||
SELECT
|
||||
e.employeeid,
|
||||
e.firstname,
|
||||
e.lastname,
|
||||
COUNT(DISTINCT CASE WHEN o.order_type != 8 OR o.order_type IS NULL THEN o.order_id END) as ordersShipped,
|
||||
COALESCE(SUM(o.stats_prod_pieces), 0) as piecesShipped
|
||||
FROM _order o
|
||||
JOIN employees e ON o.stats_cid_shipped = e.cid
|
||||
WHERE ${shippingWhere}
|
||||
AND o.order_status IN (100, 92)
|
||||
AND e.hidden = 0
|
||||
AND e.disabled = 0
|
||||
GROUP BY e.employeeid, e.firstname, e.lastname
|
||||
ORDER BY ordersShipped DESC
|
||||
`;
|
||||
|
||||
const [shippingByEmployeeRows] = await connection.execute(shippingByEmployeeQuery, params);
|
||||
const shippingByEmployee = shippingByEmployeeRows.map(row => ({
|
||||
employeeId: row.employeeid,
|
||||
name: `${row.firstname || ''} ${row.lastname || ''}`.trim() || `Employee ${row.employeeid}`,
|
||||
ordersShipped: parseInt(row.ordersShipped || 0),
|
||||
piecesShipped: parseInt(row.piecesShipped || 0),
|
||||
}));
|
||||
|
||||
// Calculate period dates
|
||||
let periodStart, periodEnd;
|
||||
if (dateRange?.start) {
|
||||
periodStart = new Date(dateRange.start);
|
||||
} else if (params[0]) {
|
||||
periodStart = new Date(params[0]);
|
||||
} else {
|
||||
periodStart = new Date();
|
||||
periodStart.setDate(periodStart.getDate() - 30);
|
||||
}
|
||||
|
||||
if (dateRange?.end) {
|
||||
periodEnd = new Date(dateRange.end);
|
||||
} else if (params[1]) {
|
||||
periodEnd = new Date(params[1]);
|
||||
} else {
|
||||
periodEnd = new Date();
|
||||
}
|
||||
|
||||
// Calculate productivity (orders/pieces per picking hour)
|
||||
const ordersPerHour = totalPickingHours > 0 ? totalOrdersPicked / totalPickingHours : 0;
|
||||
const piecesPerHour = totalPickingHours > 0 ? totalPiecesPicked / totalPickingHours : 0;
|
||||
|
||||
// Get daily trend data for picking
|
||||
// Use DATE_FORMAT to get date string in Eastern timezone
|
||||
// Business day starts at 1 AM, so subtract 1 hour before taking the date
|
||||
const pickingTrendWhere = whereClause.replace(/date_placed/g, 'pt.createddate');
|
||||
const pickingTrendQuery = `
|
||||
SELECT
|
||||
DATE_FORMAT(DATE_SUB(pt.createddate, INTERVAL 1 HOUR), '%Y-%m-%d') as date,
|
||||
COUNT(DISTINCT CASE WHEN ptb.is_sub = 0 OR ptb.is_sub IS NULL THEN ptb.orderid END) as ordersPicked,
|
||||
COALESCE(SUM(pt.totalpieces_picked), 0) as piecesPicked
|
||||
FROM picking_ticket pt
|
||||
LEFT JOIN picking_ticket_buckets ptb ON pt.pickingid = ptb.pickingid
|
||||
WHERE ${pickingTrendWhere}
|
||||
AND pt.closeddate IS NOT NULL
|
||||
GROUP BY DATE_FORMAT(DATE_SUB(pt.createddate, INTERVAL 1 HOUR), '%Y-%m-%d')
|
||||
ORDER BY date
|
||||
`;
|
||||
|
||||
// Get shipping trend data
|
||||
const shippingTrendWhere = whereClause.replace(/date_placed/g, 'o.date_shipped');
|
||||
const shippingTrendQuery = `
|
||||
SELECT
|
||||
DATE_FORMAT(DATE_SUB(o.date_shipped, INTERVAL 1 HOUR), '%Y-%m-%d') as date,
|
||||
COUNT(DISTINCT CASE WHEN o.order_type != 8 OR o.order_type IS NULL THEN o.order_id END) as ordersShipped,
|
||||
COALESCE(SUM(o.stats_prod_pieces), 0) as piecesShipped
|
||||
FROM _order o
|
||||
WHERE ${shippingTrendWhere}
|
||||
AND o.order_status IN (100, 92)
|
||||
GROUP BY DATE_FORMAT(DATE_SUB(o.date_shipped, INTERVAL 1 HOUR), '%Y-%m-%d')
|
||||
ORDER BY date
|
||||
`;
|
||||
|
||||
const [[pickingTrendRows], [shippingTrendRows]] = await Promise.all([
|
||||
connection.execute(pickingTrendQuery, params),
|
||||
connection.execute(shippingTrendQuery, params),
|
||||
]);
|
||||
|
||||
// Create maps for trend data
|
||||
const pickingByDate = new Map();
|
||||
pickingTrendRows.forEach(row => {
|
||||
const date = String(row.date);
|
||||
pickingByDate.set(date, {
|
||||
ordersPicked: parseInt(row.ordersPicked || 0),
|
||||
piecesPicked: parseInt(row.piecesPicked || 0),
|
||||
});
|
||||
});
|
||||
|
||||
const shippingByDate = new Map();
|
||||
shippingTrendRows.forEach(row => {
|
||||
const date = String(row.date);
|
||||
shippingByDate.set(date, {
|
||||
ordersShipped: parseInt(row.ordersShipped || 0),
|
||||
piecesShipped: parseInt(row.piecesShipped || 0),
|
||||
});
|
||||
});
|
||||
|
||||
// Generate all dates in the period range for complete trend data
|
||||
const allDatesInRange = [];
|
||||
const startDt = DateTime.fromJSDate(periodStart).setZone(TIMEZONE).startOf('day');
|
||||
const endDt = DateTime.fromJSDate(periodEnd).setZone(TIMEZONE).startOf('day');
|
||||
|
||||
let currentDt = startDt;
|
||||
while (currentDt <= endDt) {
|
||||
allDatesInRange.push(currentDt.toFormat('yyyy-MM-dd'));
|
||||
currentDt = currentDt.plus({ days: 1 });
|
||||
}
|
||||
|
||||
// Build trend data for all dates in range
|
||||
const trend = allDatesInRange.map(date => {
|
||||
const picking = pickingByDate.get(date) || { ordersPicked: 0, piecesPicked: 0 };
|
||||
const shippingData = shippingByDate.get(date) || { ordersShipped: 0, piecesShipped: 0 };
|
||||
|
||||
// Parse date string in Eastern timezone to get proper ISO timestamp
|
||||
const dateDt = DateTime.fromFormat(date, 'yyyy-MM-dd', { zone: TIMEZONE });
|
||||
|
||||
return {
|
||||
date,
|
||||
timestamp: dateDt.toISO(),
|
||||
ordersPicked: picking.ordersPicked,
|
||||
piecesPicked: picking.piecesPicked,
|
||||
ordersShipped: shippingData.ordersShipped,
|
||||
piecesShipped: shippingData.piecesShipped,
|
||||
};
|
||||
});
|
||||
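Each entry produced by this mapping has the following shape (the field names come from the code above; the values here are purely illustrative):

```javascript
// Illustrative trend entry produced by the mapping above
const sampleTrendEntry = {
  date: '2026-01-15',
  timestamp: '2026-01-15T00:00:00.000-05:00', // Eastern-zone ISO string from Luxon
  ordersPicked: 42,
  piecesPicked: 310,
  ordersShipped: 39,
  piecesShipped: 295,
};
```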
|
||||
// Get previous period data for comparison
|
||||
const previousRange = getPreviousPeriodRange(timeRange, startDate, endDate);
|
||||
let comparison = null;
|
||||
let previousTotals = null;
|
||||
|
||||
if (previousRange) {
|
||||
// Previous picking data
|
||||
const prevPickingWhere = previousRange.whereClause.replace(/date_placed/g, 'pt.createddate');
|
||||
|
||||
const [[prevPickingStatsRows], [prevOrderCountRows]] = await Promise.all([
|
||||
connection.execute(
|
||||
`SELECT
|
||||
SUM(pt.totalpieces_picked) as piecesPicked,
|
||||
SUM(TIMESTAMPDIFF(SECOND, pt.createddate, pt.closeddate)) as pickingTimeSeconds
|
||||
FROM picking_ticket pt
|
||||
WHERE ${prevPickingWhere}
|
||||
AND pt.closeddate IS NOT NULL`,
|
||||
previousRange.params
|
||||
),
|
||||
connection.execute(
|
||||
`SELECT
|
||||
COUNT(DISTINCT CASE WHEN ptb.is_sub = 0 OR ptb.is_sub IS NULL THEN ptb.orderid END) as ordersPicked
|
||||
FROM picking_ticket pt
|
||||
LEFT JOIN picking_ticket_buckets ptb ON pt.pickingid = ptb.pickingid
|
||||
WHERE ${prevPickingWhere}
|
||||
AND pt.closeddate IS NOT NULL`,
|
||||
previousRange.params
|
||||
)
|
||||
]);
|
||||
|
||||
const prevPickingStats = prevPickingStatsRows[0] || { piecesPicked: 0, pickingTimeSeconds: 0 };
|
||||
const prevOrderCount = prevOrderCountRows[0] || { ordersPicked: 0 };
|
||||
const prevPicking = {
|
||||
ordersPicked: parseInt(prevOrderCount.ordersPicked || 0),
|
||||
piecesPicked: parseInt(prevPickingStats.piecesPicked || 0),
|
||||
pickingTimeSeconds: parseInt(prevPickingStats.pickingTimeSeconds || 0)
|
||||
};
|
||||
const prevPickingHours = prevPicking.pickingTimeSeconds / 3600;
|
||||
|
||||
// Previous shipping data
|
||||
const prevShippingWhere = previousRange.whereClause.replace(/date_placed/g, 'o.date_shipped');
|
||||
const [prevShippingRows] = await connection.execute(
|
||||
`SELECT
|
||||
COUNT(DISTINCT CASE WHEN o.order_type != 8 OR o.order_type IS NULL THEN o.order_id END) as ordersShipped,
|
||||
COALESCE(SUM(o.stats_prod_pieces), 0) as piecesShipped
|
||||
FROM _order o
|
||||
WHERE ${prevShippingWhere}
|
||||
AND o.order_status IN (100, 92)`,
|
||||
previousRange.params
|
||||
);
|
||||
const prevShipping = prevShippingRows[0] || { ordersShipped: 0, piecesShipped: 0 };
|
||||
|
||||
// Calculate previous productivity
|
||||
const prevOrdersPerHour = prevPickingHours > 0 ? parseInt(prevPicking.ordersPicked || 0) / prevPickingHours : 0;
|
||||
const prevPiecesPerHour = prevPickingHours > 0 ? parseInt(prevPicking.piecesPicked || 0) / prevPickingHours : 0;
|
||||
|
||||
previousTotals = {
|
||||
ordersPicked: parseInt(prevPicking.ordersPicked || 0),
|
||||
piecesPicked: parseInt(prevPicking.piecesPicked || 0),
|
||||
pickingHours: prevPickingHours,
|
||||
ordersShipped: parseInt(prevShipping.ordersShipped || 0),
|
||||
piecesShipped: parseInt(prevShipping.piecesShipped || 0),
|
||||
ordersPerHour: prevOrdersPerHour,
|
||||
piecesPerHour: prevPiecesPerHour,
|
||||
};
|
||||
|
||||
comparison = {
|
||||
ordersPicked: calculateComparison(totalOrdersPicked, parseInt(prevPicking.ordersPicked || 0)),
|
||||
piecesPicked: calculateComparison(totalPiecesPicked, parseInt(prevPicking.piecesPicked || 0)),
|
||||
ordersShipped: calculateComparison(parseInt(shipping.ordersShipped || 0), parseInt(prevShipping.ordersShipped || 0)),
|
||||
piecesShipped: calculateComparison(parseInt(shipping.piecesShipped || 0), parseInt(prevShipping.piecesShipped || 0)),
|
||||
ordersPerHour: calculateComparison(ordersPerHour, prevOrdersPerHour),
|
||||
piecesPerHour: calculateComparison(piecesPerHour, prevPiecesPerHour),
|
||||
};
|
||||
}
|
||||
|
||||
const response = {
|
||||
dateRange,
|
||||
totals: {
|
||||
// Picking metrics
|
||||
ordersPicked: totalOrdersPicked,
|
||||
piecesPicked: totalPiecesPicked,
|
||||
ticketCount: totalTickets,
|
||||
pickingHours: totalPickingHours,
|
||||
|
||||
// Shipping metrics
|
||||
ordersShipped: parseInt(shipping.ordersShipped || 0),
|
||||
piecesShipped: parseInt(shipping.piecesShipped || 0),
|
||||
|
||||
// Productivity metrics
|
||||
ordersPerHour,
|
||||
piecesPerHour,
|
||||
avgPickingSpeed,
|
||||
},
|
||||
previousTotals,
|
||||
comparison,
|
||||
byEmployee: {
|
||||
picking: pickingByEmployee,
|
||||
shipping: shippingByEmployee,
|
||||
},
|
||||
trend,
|
||||
};
|
||||
|
||||
return { response, release };
|
||||
};
|
||||
|
||||
let result;
|
||||
try {
|
||||
result = await Promise.race([mainOperation(), timeoutPromise]);
|
||||
} catch (error) {
|
||||
if (error.message.includes('timeout')) {
|
||||
console.log(`[OPERATIONS-METRICS] Request timed out in ${Date.now() - startTime}ms`);
|
||||
throw error;
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
|
||||
const { response, release } = result;
|
||||
|
||||
if (release) release();
|
||||
|
||||
console.log(`[OPERATIONS-METRICS] Request completed in ${Date.now() - startTime}ms`);
|
||||
res.json(response);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error in /operations-metrics:', error);
|
||||
console.log(`[OPERATIONS-METRICS] Request failed in ${Date.now() - startTime}ms`);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
// Health check
|
||||
router.get('/health', async (req, res) => {
|
||||
try {
|
||||
const { connection, release } = await getDbConnection();
|
||||
await connection.execute('SELECT 1 as test');
|
||||
release();
|
||||
|
||||
res.json({
|
||||
status: 'healthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
pool: getPoolStatus(),
|
||||
});
|
||||
} catch (error) {
|
||||
res.status(500).json({
|
||||
status: 'unhealthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
error: error.message,
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Helper functions
function calculateComparison(currentValue, previousValue) {
  if (typeof previousValue !== 'number') {
    return { absolute: null, percentage: null };
  }

  const absolute = typeof currentValue === 'number' ? currentValue - previousValue : null;
  const percentage =
    absolute !== null && previousValue !== 0
      ? (absolute / Math.abs(previousValue)) * 100
      : null;

  return { absolute, percentage };
}

function getPreviousPeriodRange(timeRange, startDate, endDate) {
  if (timeRange && timeRange !== 'custom') {
    const prevTimeRange = getPreviousTimeRange(timeRange);
    if (!prevTimeRange || prevTimeRange === timeRange) {
      return null;
    }
    return getTimeRangeConditions(prevTimeRange);
  }

  const hasCustomDates = (timeRange === 'custom' || !timeRange) && startDate && endDate;
  if (!hasCustomDates) {
    return null;
  }

  const start = new Date(startDate);
  const end = new Date(endDate);

  if (Number.isNaN(start.getTime()) || Number.isNaN(end.getTime())) {
    return null;
  }

  const duration = end.getTime() - start.getTime();
  if (!Number.isFinite(duration) || duration <= 0) {
    return null;
  }

  const prevEnd = new Date(start.getTime() - 1);
  const prevStart = new Date(prevEnd.getTime() - duration);

  return getTimeRangeConditions('custom', prevStart.toISOString(), prevEnd.toISOString());
}

function getPreviousTimeRange(timeRange) {
  const map = {
    today: 'yesterday',
    thisWeek: 'lastWeek',
    thisMonth: 'lastMonth',
    last7days: 'previous7days',
    last30days: 'previous30days',
    last90days: 'previous90days',
    yesterday: 'twoDaysAgo'
  };
  return map[timeRange] || timeRange;
}

module.exports = router;
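To make the helper contracts concrete, a short sketch (the input values are illustrative; the `{ whereClause, params }` shape returned by `getTimeRangeConditions` matches how `previousRange` is used earlier in this route):

```javascript
// Illustrative only
calculateComparison(120, 100); // => { absolute: 20, percentage: 20 }
calculateComparison(120, 0);   // => { absolute: 120, percentage: null } (no baseline to divide by)

// A custom Jan 8–15 range maps to the immediately preceding window of equal length:
const prev = getPreviousPeriodRange('custom', '2024-01-08', '2024-01-15');
// equivalent to getTimeRangeConditions('custom', '2023-12-31T23:59:59.999Z', '2024-01-07T23:59:59.999Z')
```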
495
inventory-server/dashboard/acot-server/routes/payroll-metrics.js
Normal file
@@ -0,0 +1,495 @@
|
||||
const express = require('express');
|
||||
const { DateTime } = require('luxon');
|
||||
|
||||
const router = express.Router();
|
||||
const { getDbConnection, getPoolStatus } = require('../db/connection');
|
||||
|
||||
const TIMEZONE = 'America/New_York';
|
||||
|
||||
// Punch types from the database
const PUNCH_TYPES = {
  OUT: 0,
  IN: 1,
  BREAK_START: 2,
  BREAK_END: 3,
};

// Standard hours for overtime calculation (40 hours per week)
const STANDARD_WEEKLY_HOURS = 40;

// Reference pay period start date (January 25, 2026 is a Sunday, first day of a pay period)
const PAY_PERIOD_REFERENCE = DateTime.fromObject(
  { year: 2026, month: 1, day: 25 },
  { zone: TIMEZONE }
);

/**
 * Calculate the pay period that contains a given date
 * Pay periods are 14 days starting on Sunday
 * @param {DateTime} date - The date to find the pay period for
 * @returns {{ start: DateTime, end: DateTime, week1: { start: DateTime, end: DateTime }, week2: { start: DateTime, end: DateTime } }}
 */
function getPayPeriodForDate(date) {
  const dt = DateTime.isDateTime(date) ? date : DateTime.fromJSDate(date, { zone: TIMEZONE });

  // Calculate days since reference
  const daysSinceReference = Math.floor(dt.diff(PAY_PERIOD_REFERENCE, 'days').days);

  // Find which pay period this falls into (can be negative for dates before reference)
  const payPeriodIndex = Math.floor(daysSinceReference / 14);

  // Calculate the start of this pay period
  const start = PAY_PERIOD_REFERENCE.plus({ days: payPeriodIndex * 14 }).startOf('day');
  const end = start.plus({ days: 13 }).endOf('day');

  // Week 1: Sunday through Saturday
  const week1Start = start;
  const week1End = start.plus({ days: 6 }).endOf('day');

  // Week 2: Sunday through Saturday
  const week2Start = start.plus({ days: 7 }).startOf('day');
  const week2End = end;

  return {
    start,
    end,
    week1: { start: week1Start, end: week1End },
    week2: { start: week2Start, end: week2End },
  };
}
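A worked example against the code above (the sample date is illustrative): since the reference Sunday is January 25, 2026, a date ten days earlier lands in the previous 14-day window.

```javascript
// Illustrative date only
const period = getPayPeriodForDate(
  DateTime.fromObject({ year: 2026, month: 1, day: 15 }, { zone: TIMEZONE })
);
period.start.toISODate();       // '2026-01-11' (the Sunday that opens that 14-day period)
period.end.toISODate();         // '2026-01-24'
period.week1.end.toISODate();   // '2026-01-17'
period.week2.start.toISODate(); // '2026-01-18'
```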
|
||||
/**
|
||||
* Get the current pay period
|
||||
*/
|
||||
function getCurrentPayPeriod() {
|
||||
return getPayPeriodForDate(DateTime.now().setZone(TIMEZONE));
|
||||
}
|
||||
|
||||
/**
|
||||
* Navigate to previous or next pay period
|
||||
* @param {DateTime} currentStart - Current pay period start
|
||||
* @param {number} offset - Number of pay periods to move (negative for previous)
|
||||
*/
|
||||
function navigatePayPeriod(currentStart, offset) {
|
||||
const newStart = currentStart.plus({ days: offset * 14 });
|
||||
return getPayPeriodForDate(newStart);
|
||||
}
|
||||
|
||||
/**
|
||||
* Calculate working hours from timeclock entries, broken down by week
|
||||
* @param {Array} punches - Timeclock punch entries
|
||||
* @param {Object} payPeriod - Pay period with week boundaries
|
||||
*/
|
||||
function calculateHoursByWeek(punches, payPeriod) {
|
||||
// Group by employee
|
||||
const byEmployee = new Map();
|
||||
|
||||
punches.forEach(punch => {
|
||||
if (!byEmployee.has(punch.EmployeeID)) {
|
||||
byEmployee.set(punch.EmployeeID, {
|
||||
employeeId: punch.EmployeeID,
|
||||
firstname: punch.firstname || '',
|
||||
lastname: punch.lastname || '',
|
||||
punches: [],
|
||||
});
|
||||
}
|
||||
byEmployee.get(punch.EmployeeID).punches.push(punch);
|
||||
});
|
||||
|
||||
const employeeResults = [];
|
||||
let totalHours = 0;
|
||||
let totalBreakHours = 0;
|
||||
let totalOvertimeHours = 0;
|
||||
let totalRegularHours = 0;
|
||||
let week1TotalHours = 0;
|
||||
let week1TotalOvertime = 0;
|
||||
let week2TotalHours = 0;
|
||||
let week2TotalOvertime = 0;
|
||||
|
||||
byEmployee.forEach((employeeData) => {
|
||||
// Sort punches by timestamp
|
||||
employeeData.punches.sort((a, b) => new Date(a.TimeStamp) - new Date(b.TimeStamp));
|
||||
|
||||
// Calculate hours for each week
|
||||
const week1Punches = employeeData.punches.filter(p => {
|
||||
const dt = DateTime.fromJSDate(new Date(p.TimeStamp), { zone: TIMEZONE });
|
||||
return dt >= payPeriod.week1.start && dt <= payPeriod.week1.end;
|
||||
});
|
||||
|
||||
const week2Punches = employeeData.punches.filter(p => {
|
||||
const dt = DateTime.fromJSDate(new Date(p.TimeStamp), { zone: TIMEZONE });
|
||||
return dt >= payPeriod.week2.start && dt <= payPeriod.week2.end;
|
||||
});
|
||||
|
||||
const week1Hours = calculateHoursFromPunches(week1Punches);
|
||||
const week2Hours = calculateHoursFromPunches(week2Punches);
|
||||
|
||||
// Calculate overtime per week (anything over 40 hours)
|
||||
const week1Overtime = Math.max(0, week1Hours.hours - STANDARD_WEEKLY_HOURS);
|
||||
const week2Overtime = Math.max(0, week2Hours.hours - STANDARD_WEEKLY_HOURS);
|
||||
const week1Regular = week1Hours.hours - week1Overtime;
|
||||
const week2Regular = week2Hours.hours - week2Overtime;
|
||||
|
||||
const employeeTotal = week1Hours.hours + week2Hours.hours;
|
||||
const employeeBreaks = week1Hours.breakHours + week2Hours.breakHours;
|
||||
const employeeOvertime = week1Overtime + week2Overtime;
|
||||
const employeeRegular = employeeTotal - employeeOvertime;
|
||||
|
||||
totalHours += employeeTotal;
|
||||
totalBreakHours += employeeBreaks;
|
||||
totalOvertimeHours += employeeOvertime;
|
||||
totalRegularHours += employeeRegular;
|
||||
week1TotalHours += week1Hours.hours;
|
||||
week1TotalOvertime += week1Overtime;
|
||||
week2TotalHours += week2Hours.hours;
|
||||
week2TotalOvertime += week2Overtime;
|
||||
|
||||
employeeResults.push({
|
||||
employeeId: employeeData.employeeId,
|
||||
name: `${employeeData.firstname} ${employeeData.lastname}`.trim() || `Employee ${employeeData.employeeId}`,
|
||||
week1Hours: week1Hours.hours,
|
||||
week1BreakHours: week1Hours.breakHours,
|
||||
week1Overtime,
|
||||
week1Regular,
|
||||
week2Hours: week2Hours.hours,
|
||||
week2BreakHours: week2Hours.breakHours,
|
||||
week2Overtime,
|
||||
week2Regular,
|
||||
totalHours: employeeTotal,
|
||||
totalBreakHours: employeeBreaks,
|
||||
overtimeHours: employeeOvertime,
|
||||
regularHours: employeeRegular,
|
||||
});
|
||||
});
|
||||
|
||||
// Sort by total hours descending
|
||||
employeeResults.sort((a, b) => b.totalHours - a.totalHours);
|
||||
|
||||
return {
|
||||
byEmployee: employeeResults,
|
||||
totals: {
|
||||
hours: totalHours,
|
||||
breakHours: totalBreakHours,
|
||||
overtimeHours: totalOvertimeHours,
|
||||
regularHours: totalRegularHours,
|
||||
activeEmployees: employeeResults.filter(e => e.totalHours > 0).length,
|
||||
},
|
||||
byWeek: [
|
||||
{
|
||||
week: 1,
|
||||
start: payPeriod.week1.start.toISODate(),
|
||||
end: payPeriod.week1.end.toISODate(),
|
||||
hours: week1TotalHours,
|
||||
overtime: week1TotalOvertime,
|
||||
regular: week1TotalHours - week1TotalOvertime,
|
||||
},
|
||||
{
|
||||
week: 2,
|
||||
start: payPeriod.week2.start.toISODate(),
|
||||
end: payPeriod.week2.end.toISODate(),
|
||||
hours: week2TotalHours,
|
||||
overtime: week2TotalOvertime,
|
||||
regular: week2TotalHours - week2TotalOvertime,
|
||||
},
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
/**
 * Calculate hours from a set of punches
 */
function calculateHoursFromPunches(punches) {
  let hours = 0;
  let breakHours = 0;
  let currentIn = null;
  let breakStart = null;

  punches.forEach(punch => {
    const punchTime = new Date(punch.TimeStamp);

    switch (punch.PunchType) {
      case PUNCH_TYPES.IN:
        currentIn = punchTime;
        break;
      case PUNCH_TYPES.OUT:
        if (currentIn) {
          hours += (punchTime - currentIn) / (1000 * 60 * 60);
          currentIn = null;
        }
        break;
      case PUNCH_TYPES.BREAK_START:
        breakStart = punchTime;
        break;
      case PUNCH_TYPES.BREAK_END:
        if (breakStart) {
          breakHours += (punchTime - breakStart) / (1000 * 60 * 60);
          breakStart = null;
        }
        break;
    }
  });

  return { hours, breakHours };
}
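A small illustration of the pairing logic (timestamps are made up; note that break time is tracked separately and is not subtracted from `hours` here):

```javascript
// Illustrative punches for one employee on one day
const punches = [
  { TimeStamp: '2026-01-26T09:00:00', PunchType: PUNCH_TYPES.IN },
  { TimeStamp: '2026-01-26T12:00:00', PunchType: PUNCH_TYPES.BREAK_START },
  { TimeStamp: '2026-01-26T12:30:00', PunchType: PUNCH_TYPES.BREAK_END },
  { TimeStamp: '2026-01-26T17:00:00', PunchType: PUNCH_TYPES.OUT },
];
calculateHoursFromPunches(punches); // => { hours: 8, breakHours: 0.5 }
```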
|
||||
/**
 * Calculate FTE for a pay period (based on 80 hours = 1 FTE for 2-week period)
 */
function calculateFTE(totalHours) {
  const fullTimePeriodHours = STANDARD_WEEKLY_HOURS * 2; // 80 hours for 2 weeks
  return totalHours / fullTimePeriodHours;
}
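So, for example, 200 total hours across a pay period works out to 2.5 FTE:

```javascript
calculateFTE(200); // => 2.5 (200 hours / 80 hours per full-time two-week period)
calculateFTE(60);  // => 0.75
```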
|
||||
// Main payroll metrics endpoint
|
||||
router.get('/', async (req, res) => {
|
||||
const startTime = Date.now();
|
||||
console.log(`[PAYROLL-METRICS] Starting request`);
|
||||
|
||||
const timeoutPromise = new Promise((_, reject) => {
|
||||
setTimeout(() => reject(new Error('Request timeout after 30 seconds')), 30000);
|
||||
});
|
||||
|
||||
try {
|
||||
const mainOperation = async () => {
|
||||
const { payPeriodStart, navigate } = req.query;
|
||||
|
||||
let payPeriod;
|
||||
|
||||
if (payPeriodStart) {
|
||||
// Parse the provided start date
|
||||
const startDate = DateTime.fromISO(payPeriodStart, { zone: TIMEZONE });
|
||||
if (!startDate.isValid) {
|
||||
return res.status(400).json({ error: 'Invalid payPeriodStart date format' });
|
||||
}
|
||||
payPeriod = getPayPeriodForDate(startDate);
|
||||
} else {
|
||||
// Default to current pay period
|
||||
payPeriod = getCurrentPayPeriod();
|
||||
}
|
||||
|
||||
// Handle navigation if requested
|
||||
if (navigate) {
|
||||
const offset = parseInt(navigate, 10);
|
||||
if (!isNaN(offset)) {
|
||||
payPeriod = navigatePayPeriod(payPeriod.start, offset);
|
||||
}
|
||||
}
|
||||
|
||||
console.log(`[PAYROLL-METRICS] Getting DB connection...`);
|
||||
const { connection, release } = await getDbConnection();
|
||||
console.log(`[PAYROLL-METRICS] DB connection obtained in ${Date.now() - startTime}ms`);
|
||||
|
||||
// Build query for the pay period
|
||||
const periodStart = payPeriod.start.toJSDate();
|
||||
const periodEnd = payPeriod.end.toJSDate();
|
||||
|
||||
const timeclockQuery = `
|
||||
SELECT
|
||||
tc.EmployeeID,
|
||||
tc.TimeStamp,
|
||||
tc.PunchType,
|
||||
e.firstname,
|
||||
e.lastname
|
||||
FROM timeclock tc
|
||||
LEFT JOIN employees e ON tc.EmployeeID = e.employeeid
|
||||
WHERE tc.TimeStamp >= ? AND tc.TimeStamp <= ?
|
||||
AND e.hidden = 0
|
||||
AND e.disabled = 0
|
||||
ORDER BY tc.EmployeeID, tc.TimeStamp
|
||||
`;
|
||||
|
||||
const [timeclockRows] = await connection.execute(timeclockQuery, [periodStart, periodEnd]);
|
||||
|
||||
// Calculate hours with week breakdown
|
||||
const hoursData = calculateHoursByWeek(timeclockRows, payPeriod);
|
||||
|
||||
// Calculate FTE
|
||||
const fte = calculateFTE(hoursData.totals.hours);
|
||||
const activeEmployees = hoursData.totals.activeEmployees;
|
||||
const avgHoursPerEmployee = activeEmployees > 0 ? hoursData.totals.hours / activeEmployees : 0;
|
||||
|
||||
// Get previous pay period data for comparison
|
||||
const prevPayPeriod = navigatePayPeriod(payPeriod.start, -1);
|
||||
const [prevTimeclockRows] = await connection.execute(timeclockQuery, [
|
||||
prevPayPeriod.start.toJSDate(),
|
||||
prevPayPeriod.end.toJSDate(),
|
||||
]);
|
||||
|
||||
const prevHoursData = calculateHoursByWeek(prevTimeclockRows, prevPayPeriod);
|
||||
const prevFte = calculateFTE(prevHoursData.totals.hours);
|
||||
|
||||
// Calculate comparisons
|
||||
const comparison = {
|
||||
hours: calculateComparison(hoursData.totals.hours, prevHoursData.totals.hours),
|
||||
overtimeHours: calculateComparison(hoursData.totals.overtimeHours, prevHoursData.totals.overtimeHours),
|
||||
fte: calculateComparison(fte, prevFte),
|
||||
activeEmployees: calculateComparison(hoursData.totals.activeEmployees, prevHoursData.totals.activeEmployees),
|
||||
};
|
||||
|
||||
const response = {
|
||||
payPeriod: {
|
||||
start: payPeriod.start.toISODate(),
|
||||
end: payPeriod.end.toISODate(),
|
||||
label: formatPayPeriodLabel(payPeriod),
|
||||
week1: {
|
||||
start: payPeriod.week1.start.toISODate(),
|
||||
end: payPeriod.week1.end.toISODate(),
|
||||
label: formatWeekLabel(payPeriod.week1),
|
||||
},
|
||||
week2: {
|
||||
start: payPeriod.week2.start.toISODate(),
|
||||
end: payPeriod.week2.end.toISODate(),
|
||||
label: formatWeekLabel(payPeriod.week2),
|
||||
},
|
||||
isCurrent: isCurrentPayPeriod(payPeriod),
|
||||
},
|
||||
totals: {
|
||||
hours: hoursData.totals.hours,
|
||||
breakHours: hoursData.totals.breakHours,
|
||||
overtimeHours: hoursData.totals.overtimeHours,
|
||||
regularHours: hoursData.totals.regularHours,
|
||||
activeEmployees,
|
||||
fte,
|
||||
avgHoursPerEmployee,
|
||||
},
|
||||
previousTotals: {
|
||||
hours: prevHoursData.totals.hours,
|
||||
overtimeHours: prevHoursData.totals.overtimeHours,
|
||||
activeEmployees: prevHoursData.totals.activeEmployees,
|
||||
fte: prevFte,
|
||||
},
|
||||
comparison,
|
||||
byEmployee: hoursData.byEmployee,
|
||||
byWeek: hoursData.byWeek,
|
||||
};
|
||||
|
||||
return { response, release };
|
||||
};
|
||||
|
||||
let result;
|
||||
try {
|
||||
result = await Promise.race([mainOperation(), timeoutPromise]);
|
||||
} catch (error) {
|
||||
if (error.message.includes('timeout')) {
|
||||
console.log(`[PAYROLL-METRICS] Request timed out in ${Date.now() - startTime}ms`);
|
||||
throw error;
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
|
||||
const { response, release } = result;
|
||||
|
||||
if (release) release();
|
||||
|
||||
console.log(`[PAYROLL-METRICS] Request completed in ${Date.now() - startTime}ms`);
|
||||
res.json(response);
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error in /payroll-metrics:', error);
|
||||
console.log(`[PAYROLL-METRICS] Request failed in ${Date.now() - startTime}ms`);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get pay period info endpoint (for navigation without full data)
|
||||
router.get('/period-info', async (req, res) => {
|
||||
try {
|
||||
const { payPeriodStart, navigate } = req.query;
|
||||
|
||||
let payPeriod;
|
||||
|
||||
if (payPeriodStart) {
|
||||
const startDate = DateTime.fromISO(payPeriodStart, { zone: TIMEZONE });
|
||||
if (!startDate.isValid) {
|
||||
return res.status(400).json({ error: 'Invalid payPeriodStart date format' });
|
||||
}
|
||||
payPeriod = getPayPeriodForDate(startDate);
|
||||
} else {
|
||||
payPeriod = getCurrentPayPeriod();
|
||||
}
|
||||
|
||||
if (navigate) {
|
||||
const offset = parseInt(navigate, 10);
|
||||
if (!isNaN(offset)) {
|
||||
payPeriod = navigatePayPeriod(payPeriod.start, offset);
|
||||
}
|
||||
}
|
||||
|
||||
res.json({
|
||||
payPeriod: {
|
||||
start: payPeriod.start.toISODate(),
|
||||
end: payPeriod.end.toISODate(),
|
||||
label: formatPayPeriodLabel(payPeriod),
|
||||
week1: {
|
||||
start: payPeriod.week1.start.toISODate(),
|
||||
end: payPeriod.week1.end.toISODate(),
|
||||
label: formatWeekLabel(payPeriod.week1),
|
||||
},
|
||||
week2: {
|
||||
start: payPeriod.week2.start.toISODate(),
|
||||
end: payPeriod.week2.end.toISODate(),
|
||||
label: formatWeekLabel(payPeriod.week2),
|
||||
},
|
||||
isCurrent: isCurrentPayPeriod(payPeriod),
|
||||
},
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error in /payroll-metrics/period-info:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
// Health check
|
||||
router.get('/health', async (req, res) => {
|
||||
try {
|
||||
const { connection, release } = await getDbConnection();
|
||||
await connection.execute('SELECT 1 as test');
|
||||
release();
|
||||
|
||||
res.json({
|
||||
status: 'healthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
pool: getPoolStatus(),
|
||||
});
|
||||
} catch (error) {
|
||||
res.status(500).json({
|
||||
status: 'unhealthy',
|
||||
timestamp: new Date().toISOString(),
|
||||
error: error.message,
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
// Helper functions
|
||||
function calculateComparison(currentValue, previousValue) {
|
||||
if (typeof previousValue !== 'number') {
|
||||
return { absolute: null, percentage: null };
|
||||
}
|
||||
|
||||
const absolute = typeof currentValue === 'number' ? currentValue - previousValue : null;
|
||||
const percentage =
|
||||
absolute !== null && previousValue !== 0
|
||||
? (absolute / Math.abs(previousValue)) * 100
|
||||
: null;
|
||||
|
||||
return { absolute, percentage };
|
||||
}
|
||||
|
||||
function formatPayPeriodLabel(payPeriod) {
  const startStr = payPeriod.start.toFormat('MMM d');
  const endStr = payPeriod.end.toFormat('MMM d, yyyy');
  return `${startStr} – ${endStr}`;
}

function formatWeekLabel(week) {
  const startStr = week.start.toFormat('MMM d');
  const endStr = week.end.toFormat('MMM d');
  return `${startStr} – ${endStr}`;
}

function isCurrentPayPeriod(payPeriod) {
  const now = DateTime.now().setZone(TIMEZONE);
  return now >= payPeriod.start && now <= payPeriod.end;
}

module.exports = router;
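Continuing the reference example, the period starting January 25, 2026 renders as:

```javascript
const period = getPayPeriodForDate(PAY_PERIOD_REFERENCE);
formatPayPeriodLabel(period);  // 'Jan 25 – Feb 7, 2026'
formatWeekLabel(period.week1); // 'Jan 25 – Jan 31'
formatWeekLabel(period.week2); // 'Feb 1 – Feb 7'
```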
@@ -49,6 +49,9 @@ app.get('/health', (req, res) => {
app.use('/api/acot/test', require('./routes/test'));
app.use('/api/acot/events', require('./routes/events'));
app.use('/api/acot/discounts', require('./routes/discounts'));
app.use('/api/acot/employee-metrics', require('./routes/employee-metrics'));
app.use('/api/acot/payroll-metrics', require('./routes/payroll-metrics'));
app.use('/api/acot/operations-metrics', require('./routes/operations-metrics'));

// Error handling middleware
app.use((err, req, res, next) => {
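Once mounted, the new dashboards can be smoke-tested directly. A hypothetical check (the host, port, and query parameter values are assumptions, not part of this diff; the response fields come from the route code above):

```javascript
// Hypothetical smoke test for the newly mounted metric routes
const base = 'http://localhost:3000'; // assumed local dev host
(async () => {
  const ops = await fetch(`${base}/api/acot/operations-metrics?timeRange=last30days`).then(r => r.json());
  const payroll = await fetch(`${base}/api/acot/payroll-metrics`).then(r => r.json());
  console.log(ops.totals.ordersPicked, payroll.payPeriod.label);
})();
```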
@@ -33,7 +33,7 @@ const corsOptions = {
origin: function(origin, callback) {
  const allowedOrigins = [
    'http://localhost:3000',
    'https://dashboard.kent.pw',
    'https://tools.acherryontop.com'
  ];

  console.log('CORS check for origin:', origin);
@@ -0,0 +1,57 @@
|
||||
-- Migration: Make AI prompts extensible with is_singleton column
-- Date: 2024-01-19
-- Description: Removes hardcoded prompt_type CHECK constraint, adds is_singleton column
-- for dynamic uniqueness enforcement, and creates appropriate indexes.

-- 1. Drop the old CHECK constraints on prompt_type (allows any string value now)
ALTER TABLE ai_prompts DROP CONSTRAINT IF EXISTS ai_prompts_prompt_type_check;
ALTER TABLE ai_prompts DROP CONSTRAINT IF EXISTS company_required_for_specific;

-- 2. Add is_singleton column (defaults to true for backwards compatibility)
ALTER TABLE ai_prompts ADD COLUMN IF NOT EXISTS is_singleton BOOLEAN NOT NULL DEFAULT true;

-- 3. Drop ALL old unique constraints and indexes (cleanup)
-- Some were created as CONSTRAINTS (via ADD CONSTRAINT), others as standalone indexes
-- Must drop constraints first, then remaining standalone indexes

-- Drop constraints (these also remove their backing indexes)
ALTER TABLE ai_prompts DROP CONSTRAINT IF EXISTS unique_company_prompt;
ALTER TABLE ai_prompts DROP CONSTRAINT IF EXISTS idx_unique_general_prompt;
ALTER TABLE ai_prompts DROP CONSTRAINT IF EXISTS idx_unique_system_prompt;

-- Drop standalone indexes (IF EXISTS handles cases where they don't exist)
DROP INDEX IF EXISTS idx_unique_general_prompt;
DROP INDEX IF EXISTS idx_unique_system_prompt;
DROP INDEX IF EXISTS idx_unique_name_validation_system;
DROP INDEX IF EXISTS idx_unique_name_validation_general;
DROP INDEX IF EXISTS idx_unique_description_validation_system;
DROP INDEX IF EXISTS idx_unique_description_validation_general;
DROP INDEX IF EXISTS idx_unique_sanity_check_system;
DROP INDEX IF EXISTS idx_unique_sanity_check_general;
DROP INDEX IF EXISTS idx_unique_bulk_validation_system;
DROP INDEX IF EXISTS idx_unique_bulk_validation_general;
DROP INDEX IF EXISTS idx_unique_name_validation_company;
DROP INDEX IF EXISTS idx_unique_description_validation_company;
DROP INDEX IF EXISTS idx_unique_bulk_validation_company;

-- 4. Create new partial unique indexes based on is_singleton
-- For singleton types WITHOUT company (only one per prompt_type)
CREATE UNIQUE INDEX IF NOT EXISTS idx_singleton_no_company
ON ai_prompts (prompt_type)
WHERE is_singleton = true AND company IS NULL;

-- For singleton types WITH company (only one per prompt_type + company combination)
CREATE UNIQUE INDEX IF NOT EXISTS idx_singleton_with_company
ON ai_prompts (prompt_type, company)
WHERE is_singleton = true AND company IS NOT NULL;

-- 5. Add index for fast lookups by type
CREATE INDEX IF NOT EXISTS idx_prompt_type ON ai_prompts (prompt_type);

-- NOTE: After running this migration, you should:
-- 1. Delete existing prompts with old types (general, system, company_specific)
-- 2. Create new prompts with the new type naming convention:
--    - name_validation_system, name_validation_general, name_validation_company_specific
--    - description_validation_system, description_validation_general, description_validation_company_specific
--    - sanity_check_system, sanity_check_general
--    - bulk_validation_system, bulk_validation_general, bulk_validation_company_specific
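The `_company_specific` suffix in the note above is exactly what the route code later in this diff keys on; a minimal sketch of that convention:

```javascript
// Any prompt_type ending in '_company_specific' requires a company id; other types must omit it.
const isCompanySpecificType = (promptType) => promptType.endsWith('_company_specific');

isCompanySpecificType('description_validation_company_specific'); // true
isCompanySpecificType('sanity_check_system');                      // false
```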
283
inventory-server/scripts/embedding-poc.js
Normal file
@@ -0,0 +1,283 @@
|
||||
#!/usr/bin/env node
|
||||
/**
|
||||
* Embedding Proof-of-Concept Script
|
||||
*
|
||||
* Demonstrates how category embeddings work for product matching.
|
||||
* Uses OpenAI text-embedding-3-small model.
|
||||
*
|
||||
* Usage: node scripts/embedding-poc.js
|
||||
*/
|
||||
|
||||
const path = require('path');
|
||||
require('dotenv').config({ path: path.join(__dirname, '../.env') });
|
||||
|
||||
const { getDbConnection, closeAllConnections } = require('../src/utils/dbConnection');
|
||||
|
||||
// ============================================================================
|
||||
// Configuration
|
||||
// ============================================================================
|
||||
|
||||
const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
|
||||
const EMBEDDING_MODEL = 'text-embedding-3-small';
|
||||
const EMBEDDING_DIMENSIONS = 1536;
|
||||
|
||||
// Sample products to test (you can modify these)
|
||||
const TEST_PRODUCTS = [
|
||||
{
|
||||
name: "Cosmos Infinity Chipboard - Stamperia",
|
||||
description: "Laser-cut chipboard shapes featuring celestial designs for mixed media projects"
|
||||
},
|
||||
{
|
||||
name: "Distress Oxide Ink Pad - Mermaid Lagoon",
|
||||
description: "Water-reactive dye ink that creates an oxidized effect"
|
||||
},
|
||||
{
|
||||
name: "Hedwig Puffy Stickers - Paper House Productions",
|
||||
description: "3D puffy stickers featuring Harry Potter's owl Hedwig"
|
||||
},
|
||||
{
|
||||
name: "Black Velvet Watercolor Brush Size 6",
|
||||
description: "Round brush for watercolor painting with synthetic bristles"
|
||||
},
|
||||
{
|
||||
name: "Floral Washi Tape Set",
|
||||
description: "Decorative paper tape with flower patterns, pack of 6 rolls"
|
||||
}
|
||||
];
|
||||
|
||||
// ============================================================================
|
||||
// OpenAI Embedding Functions
|
||||
// ============================================================================
|
||||
|
||||
async function getEmbeddings(texts) {
|
||||
const response = await fetch('https://api.openai.com/v1/embeddings', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'Authorization': `Bearer ${OPENAI_API_KEY}`
|
||||
},
|
||||
body: JSON.stringify({
|
||||
input: texts.map(t => t.substring(0, 8000)), // Max 8k chars per text
|
||||
model: EMBEDDING_MODEL,
|
||||
dimensions: EMBEDDING_DIMENSIONS
|
||||
})
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const error = await response.json();
|
||||
throw new Error(`OpenAI API error: ${error.error?.message || response.status}`);
|
||||
}
|
||||
|
||||
const data = await response.json();
|
||||
|
||||
// Sort by index to ensure order matches input
|
||||
const sorted = data.data.sort((a, b) => a.index - b.index);
|
||||
|
||||
return {
|
||||
embeddings: sorted.map(item => item.embedding),
|
||||
usage: data.usage,
|
||||
model: data.model
|
||||
};
|
||||
}
|
||||
|
||||
// ============================================================================
// Vector Math
// ============================================================================

function cosineSimilarity(a, b) {
  let dotProduct = 0;
  let normA = 0;
  let normB = 0;

  for (let i = 0; i < a.length; i++) {
    dotProduct += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }

  return dotProduct / (Math.sqrt(normA) * Math.sqrt(normB));
}

function findTopMatches(queryEmbedding, categoryEmbeddings, topK = 10) {
  const scored = categoryEmbeddings.map(cat => ({
    ...cat,
    similarity: cosineSimilarity(queryEmbedding, cat.embedding)
  }));

  scored.sort((a, b) => b.similarity - a.similarity);

  return scored.slice(0, topK);
}
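A tiny sanity check of the similarity measure (toy 2-dimensional vectors, not real embeddings):

```javascript
cosineSimilarity([1, 0], [1, 0]);  // 1  (same direction)
cosineSimilarity([1, 0], [0, 1]);  // 0  (orthogonal)
cosineSimilarity([1, 0], [-1, 0]); // -1 (opposite direction)
```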
|
||||
// ============================================================================
|
||||
// Database Functions
|
||||
// ============================================================================
|
||||
|
||||
async function fetchCategories(connection) {
|
||||
console.log('\n📂 Fetching categories from database...');
|
||||
|
||||
// Fetch hierarchical categories (types 10-13)
|
||||
const [rows] = await connection.query(`
|
||||
SELECT
|
||||
cat_id,
|
||||
name,
|
||||
master_cat_id,
|
||||
type
|
||||
FROM product_categories
|
||||
WHERE type IN (10, 11, 12, 13)
|
||||
ORDER BY type, name
|
||||
`);
|
||||
|
||||
console.log(` Found ${rows.length} category records`);
|
||||
|
||||
// Build category paths
|
||||
const byId = new Map(rows.map(r => [r.cat_id, r]));
|
||||
const categories = [];
|
||||
|
||||
for (const row of rows) {
|
||||
const path = [];
|
||||
let current = row;
|
||||
|
||||
// Walk up the tree to build full path
|
||||
while (current) {
|
||||
path.unshift(current.name);
|
||||
current = current.master_cat_id ? byId.get(current.master_cat_id) : null;
|
||||
}
|
||||
|
||||
categories.push({
|
||||
id: row.cat_id,
|
||||
name: row.name,
|
||||
type: row.type,
|
||||
fullPath: path.join(' > '),
|
||||
embeddingText: path.join(' ') // For embedding generation
|
||||
});
|
||||
}
|
||||
|
||||
// Count by level
|
||||
const levels = {
|
||||
10: categories.filter(c => c.type === 10).length,
|
||||
11: categories.filter(c => c.type === 11).length,
|
||||
12: categories.filter(c => c.type === 12).length,
|
||||
13: categories.filter(c => c.type === 13).length,
|
||||
};
|
||||
|
||||
console.log(` Level breakdown: ${levels[10]} top-level, ${levels[11]} L2, ${levels[12]} L3, ${levels[13]} L4`);
|
||||
|
||||
return categories;
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Main Script
|
||||
// ============================================================================
|
||||
|
||||
async function main() {
|
||||
console.log('═══════════════════════════════════════════════════════════════');
|
||||
console.log(' EMBEDDING PROOF-OF-CONCEPT');
|
||||
console.log(' Model: ' + EMBEDDING_MODEL);
|
||||
console.log('═══════════════════════════════════════════════════════════════');
|
||||
|
||||
if (!OPENAI_API_KEY) {
|
||||
console.error('❌ OPENAI_API_KEY not found in environment');
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
let connection;
|
||||
|
||||
try {
|
||||
// Step 1: Connect to database
|
||||
console.log('\n🔌 Connecting to database via SSH tunnel...');
|
||||
const { connection: conn } = await getDbConnection();
|
||||
connection = conn;
|
||||
console.log(' ✅ Connected');
|
||||
|
||||
// Step 2: Fetch categories
|
||||
const categories = await fetchCategories(connection);
|
||||
|
||||
// Step 3: Generate embeddings for categories
|
||||
console.log('\n🧮 Generating embeddings for categories...');
|
||||
console.log(' This will cost approximately $' + (categories.length * 0.00002).toFixed(4));
|
||||
|
||||
const startTime = Date.now();
|
||||
|
||||
// Process in batches of 100 (OpenAI limit is 2048)
|
||||
const BATCH_SIZE = 100;
|
||||
let totalTokens = 0;
|
||||
|
||||
for (let i = 0; i < categories.length; i += BATCH_SIZE) {
|
||||
const batch = categories.slice(i, i + BATCH_SIZE);
|
||||
const texts = batch.map(c => c.embeddingText);
|
||||
|
||||
const result = await getEmbeddings(texts);
|
||||
|
||||
// Attach embeddings to categories
|
||||
for (let j = 0; j < batch.length; j++) {
|
||||
batch[j].embedding = result.embeddings[j];
|
||||
}
|
||||
|
||||
totalTokens += result.usage.total_tokens;
|
||||
console.log(` Batch ${Math.floor(i / BATCH_SIZE) + 1}/${Math.ceil(categories.length / BATCH_SIZE)}: ${batch.length} categories embedded`);
|
||||
}
|
||||
|
||||
const embeddingTime = Date.now() - startTime;
|
||||
console.log(` ✅ Generated ${categories.length} embeddings in ${embeddingTime}ms`);
|
||||
console.log(` 📊 Total tokens used: ${totalTokens} (~$${(totalTokens * 0.00002).toFixed(4)})`);
|
||||
|
||||
// Step 4: Test with sample products
|
||||
console.log('\n═══════════════════════════════════════════════════════════════');
|
||||
console.log(' TESTING WITH SAMPLE PRODUCTS');
|
||||
console.log('═══════════════════════════════════════════════════════════════');
|
||||
|
||||
for (const product of TEST_PRODUCTS) {
|
||||
console.log('\n┌─────────────────────────────────────────────────────────────');
|
||||
console.log(`│ Product: "${product.name}"`);
|
||||
console.log(`│ Description: "${product.description.substring(0, 60)}..."`);
|
||||
console.log('├─────────────────────────────────────────────────────────────');
|
||||
|
||||
// Generate embedding for product
|
||||
const productText = `${product.name} ${product.description}`;
|
||||
const { embeddings: [productEmbedding] } = await getEmbeddings([productText]);
|
||||
|
||||
// Find top matches
|
||||
const matches = findTopMatches(productEmbedding, categories, 10);
|
||||
|
||||
console.log('│ Top 10 Category Matches:');
|
||||
matches.forEach((match, i) => {
|
||||
const similarity = (match.similarity * 100).toFixed(1);
|
||||
const bar = '█'.repeat(Math.round(match.similarity * 20));
|
||||
const marker = i < 3 ? ' ✅' : '';
|
||||
console.log(`│ ${(i + 1).toString().padStart(2)}. [${similarity.padStart(5)}%] ${bar.padEnd(20)} ${match.fullPath}${marker}`);
|
||||
});
|
||||
console.log('└─────────────────────────────────────────────────────────────');
|
||||
}
|
||||
|
||||
// Step 5: Summary
|
||||
console.log('\n═══════════════════════════════════════════════════════════════');
|
||||
console.log(' SUMMARY');
|
||||
console.log('═══════════════════════════════════════════════════════════════');
|
||||
console.log(` Categories embedded: ${categories.length}`);
|
||||
console.log(` Embedding time: ${embeddingTime}ms (one-time cost)`);
|
||||
console.log(` Per-product lookup: ~${(Date.now() - startTime) / TEST_PRODUCTS.length}ms`);
|
||||
console.log(` Vector dimensions: ${EMBEDDING_DIMENSIONS}`);
|
||||
console.log(` Memory usage: ~${(categories.length * EMBEDDING_DIMENSIONS * 4 / 1024 / 1024).toFixed(2)} MB (in-memory vectors)`);
|
||||
console.log('');
|
||||
console.log(' 💡 In production:');
|
||||
console.log(' - Category embeddings are computed once and cached');
|
||||
console.log(' - Only product embedding is computed per-request (~$0.00002)');
|
||||
console.log(' - Vector search is instant (in-memory cosine similarity)');
|
||||
console.log(' - Top 10 results go to AI for final selection (~$0.0001)');
|
||||
console.log('═══════════════════════════════════════════════════════════════\n');
|
||||
|
||||
} catch (error) {
|
||||
console.error('\n❌ Error:', error.message);
|
||||
if (error.stack) {
|
||||
console.error(error.stack);
|
||||
}
|
||||
process.exit(1);
|
||||
} finally {
|
||||
await closeAllConnections();
|
||||
console.log('🔌 Database connections closed');
|
||||
}
|
||||
}
|
||||
|
||||
// Run the script
|
||||
main();
|
||||
@@ -6,6 +6,7 @@ const corsMiddleware = cors({
|
||||
'https://inventory.kent.pw',
|
||||
'http://localhost:5175',
|
||||
'https://acot.site',
|
||||
'https://tools.acherryontop.com',
|
||||
/^http:\/\/192\.168\.\d+\.\d+(:\d+)?$/,
|
||||
/^http:\/\/10\.\d+\.\d+\.\d+(:\d+)?$/
|
||||
],
|
||||
@@ -27,7 +28,7 @@ const corsErrorHandler = (err, req, res, next) => {
|
||||
res.status(403).json({
|
||||
error: 'CORS not allowed',
|
||||
origin: req.get('Origin'),
|
||||
message: 'Origin not in allowed list: https://inventory.kent.pw, https://acot.site, localhost:5175, 192.168.x.x, or 10.x.x.x'
|
||||
message: 'Origin not in allowed list: https://inventory.kent.pw, https://acot.site, https://tools.acherryontop.com, localhost:5175, 192.168.x.x, or 10.x.x.x'
|
||||
});
|
||||
} else {
|
||||
next(err);
|
||||
|
||||
@@ -51,66 +51,64 @@ router.get('/:id', async (req, res) => {
|
||||
}
|
||||
});
|
||||
|
||||
// Get prompt by type (general, system, company_specific)
|
||||
// Get prompt by type (any prompt_type value - extensible)
|
||||
router.get('/by-type', async (req, res) => {
|
||||
try {
|
||||
const { type, company } = req.query;
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
// Validate prompt type
|
||||
if (!type || !['general', 'system', 'company_specific'].includes(type)) {
|
||||
return res.status(400).json({
|
||||
error: 'Valid type query parameter is required (general, system, or company_specific)'
|
||||
});
|
||||
}
|
||||
|
||||
// For company_specific type, company ID is required
|
||||
if (type === 'company_specific' && !company) {
|
||||
|
||||
// Validate type is provided
|
||||
if (!type || typeof type !== 'string' || type.trim().length === 0) {
|
||||
return res.status(400).json({
|
||||
error: 'Company ID is required for company_specific prompt type'
|
||||
error: 'Valid type query parameter is required'
|
||||
});
|
||||
}
|
||||
|
||||
// For general and system types, company should not be provided
|
||||
if ((type === 'general' || type === 'system') && company) {
|
||||
|
||||
// For company_specific types, company ID is required
|
||||
const isCompanySpecificType = type.endsWith('_company_specific');
|
||||
if (isCompanySpecificType && !company) {
|
||||
return res.status(400).json({
|
||||
error: 'Company ID should not be provided for general or system prompt types'
|
||||
error: 'Company ID is required for company_specific prompt types'
|
||||
});
|
||||
}
|
||||
|
||||
// Build the query based on the type
|
||||
|
||||
// For non-company-specific types, company should not be provided
|
||||
if (!isCompanySpecificType && company) {
|
||||
return res.status(400).json({
|
||||
error: 'Company ID should not be provided for non-company-specific prompt types'
|
||||
});
|
||||
}
|
||||
|
||||
// Build the query based on whether company is provided
|
||||
let query, params;
|
||||
if (type === 'company_specific') {
|
||||
if (company) {
|
||||
query = 'SELECT * FROM ai_prompts WHERE prompt_type = $1 AND company = $2';
|
||||
params = [type, company];
|
||||
params = [type.trim(), company];
|
||||
} else {
|
||||
query = 'SELECT * FROM ai_prompts WHERE prompt_type = $1';
|
||||
params = [type];
|
||||
query = 'SELECT * FROM ai_prompts WHERE prompt_type = $1 AND company IS NULL';
|
||||
params = [type.trim()];
|
||||
}
|
||||
|
||||
|
||||
// Execute the query
|
||||
const result = await pool.query(query, params);
|
||||
|
||||
|
||||
// Check if any prompt was found
|
||||
if (result.rows.length === 0) {
|
||||
let errorMessage;
|
||||
if (type === 'company_specific') {
|
||||
errorMessage = `AI prompt not found for company ${company}`;
|
||||
} else {
|
||||
errorMessage = `${type.charAt(0).toUpperCase() + type.slice(1)} AI prompt not found`;
|
||||
}
|
||||
const errorMessage = company
|
||||
? `AI prompt '${type}' not found for company ${company}`
|
||||
: `AI prompt '${type}' not found`;
|
||||
return res.status(404).json({ error: errorMessage });
|
||||
}
|
||||
|
||||
|
||||
// Return the first matching prompt
|
||||
res.json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error fetching AI prompt by type:', error);
|
||||
res.status(500).json({
|
||||
res.status(500).json({
|
||||
error: 'Failed to fetch AI prompt',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
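For orientation, a hypothetical client call against the reworked by-type endpoint (the `/api/ai-prompts` mount path is an assumption; the `type` and `company` query parameters come from the handler above):

```javascript
// Hypothetical usage, for illustration only
(async () => {
  const res = await fetch(
    '/api/ai-prompts/by-type?type=description_validation_company_specific&company=123'
  );
  if (res.status === 404) {
    // No prompt of that type exists yet for company 123
  } else if (res.ok) {
    const prompt = await res.json(); // a single ai_prompts row
    console.log(prompt.prompt_type, prompt.company);
  }
})();
```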
@@ -130,27 +128,28 @@ router.post('/', async (req, res) => {
|
||||
if (!prompt_text || !prompt_type) {
|
||||
return res.status(400).json({ error: 'Prompt text and type are required' });
|
||||
}
|
||||
|
||||
// Validate prompt type
|
||||
if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
|
||||
return res.status(400).json({ error: 'Prompt type must be either "general", "company_specific", or "system"' });
|
||||
}
|
||||
|
||||
// Validate company is provided for company-specific prompts
|
||||
if (prompt_type === 'company_specific' && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for company-specific prompts' });
|
||||
|
||||
// Validate prompt_type is a non-empty string (no hardcoded list - extensible)
|
||||
if (typeof prompt_type !== 'string' || prompt_type.trim().length === 0) {
|
||||
return res.status(400).json({ error: 'Prompt type must be a non-empty string' });
|
||||
}
|
||||
|
||||
// Validate company is not provided for general or system prompts
|
||||
if ((prompt_type === 'general' || prompt_type === 'system') && company) {
|
||||
return res.status(400).json({ error: 'Company should not be provided for general or system prompts' });
|
||||
// For company-specific types (ending with _company_specific), require company
|
||||
const isCompanySpecificType = prompt_type.endsWith('_company_specific');
|
||||
if (isCompanySpecificType && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for company-specific prompt types' });
|
||||
}
|
||||
|
||||
// For non-company-specific types, company should not be provided
|
||||
if (!isCompanySpecificType && company) {
|
||||
return res.status(400).json({ error: 'Company should not be provided for non-company-specific prompt types' });
|
||||
}
|
||||
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
|
||||
const result = await pool.query(`
|
||||
INSERT INTO ai_prompts (
|
||||
prompt_text,
|
||||
@@ -160,35 +159,30 @@ router.post('/', async (req, res) => {
|
||||
RETURNING *
|
||||
`, [
|
||||
prompt_text,
|
||||
prompt_type,
|
||||
company
|
||||
prompt_type.trim(),
|
||||
company || null
|
||||
]);
|
||||
|
||||
res.status(201).json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error creating AI prompt:', error);
|
||||
|
||||
|
||||
// Check for unique constraint violations
|
||||
if (error instanceof Error && error.message.includes('unique constraint')) {
|
||||
if (error.message.includes('unique_company_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A prompt already exists for this company',
|
||||
if (error instanceof Error && error.message.includes('unique')) {
|
||||
if (error.message.includes('idx_singleton_with_company')) {
|
||||
return res.status(409).json({
|
||||
error: 'A prompt of this type already exists for this company',
|
||||
details: error.message
|
||||
});
|
||||
} else if (error.message.includes('idx_unique_general_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A general prompt already exists',
|
||||
details: error.message
|
||||
});
|
||||
} else if (error.message.includes('idx_unique_system_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A system prompt already exists',
|
||||
} else if (error.message.includes('idx_singleton_no_company')) {
|
||||
return res.status(409).json({
|
||||
error: 'A prompt of this type already exists',
|
||||
details: error.message
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
res.status(500).json({
|
||||
|
||||
res.status(500).json({
|
||||
error: 'Failed to create AI prompt',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
@@ -209,73 +203,70 @@ router.put('/:id', async (req, res) => {
|
||||
if (!prompt_text || !prompt_type) {
|
||||
return res.status(400).json({ error: 'Prompt text and type are required' });
|
||||
}
|
||||
|
||||
// Validate prompt type
|
||||
if (!['general', 'company_specific', 'system'].includes(prompt_type)) {
|
||||
return res.status(400).json({ error: 'Prompt type must be either "general", "company_specific", or "system"' });
|
||||
|
||||
// Validate prompt_type is a non-empty string (no hardcoded list - extensible)
|
||||
if (typeof prompt_type !== 'string' || prompt_type.trim().length === 0) {
|
||||
return res.status(400).json({ error: 'Prompt type must be a non-empty string' });
|
||||
}
|
||||
|
||||
// Validate company is provided for company-specific prompts
|
||||
if (prompt_type === 'company_specific' && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for company-specific prompts' });
|
||||
|
||||
// For company-specific types, require company
|
||||
const isCompanySpecificType = prompt_type.endsWith('_company_specific');
|
||||
if (isCompanySpecificType && !company) {
|
||||
return res.status(400).json({ error: 'Company is required for company-specific prompt types' });
|
||||
}
|
||||
|
||||
// Validate company is not provided for general or system prompts
|
||||
if ((prompt_type === 'general' || prompt_type === 'system') && company) {
|
||||
return res.status(400).json({ error: 'Company should not be provided for general or system prompts' });
|
||||
|
||||
// For non-company-specific types, company should not be provided
|
||||
if (!isCompanySpecificType && company) {
|
||||
return res.status(400).json({ error: 'Company should not be provided for non-company-specific prompt types' });
|
||||
}
|
||||
|
||||
const pool = req.app.locals.pool;
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not initialized');
|
||||
}
|
||||
|
||||
|
||||
// Check if the prompt exists
|
||||
const checkResult = await pool.query('SELECT * FROM ai_prompts WHERE id = $1', [id]);
|
||||
if (checkResult.rows.length === 0) {
|
||||
return res.status(404).json({ error: 'AI prompt not found' });
|
||||
}
|
||||
|
||||
|
||||
const result = await pool.query(`
|
||||
UPDATE ai_prompts
|
||||
SET
|
||||
UPDATE ai_prompts
|
||||
SET
|
||||
prompt_text = $1,
|
||||
prompt_type = $2,
|
||||
company = $3
|
||||
company = $3,
|
||||
updated_at = CURRENT_TIMESTAMP
|
||||
WHERE id = $4
|
||||
RETURNING *
|
||||
`, [
|
||||
prompt_text,
|
||||
prompt_type,
|
||||
company,
|
||||
prompt_type.trim(),
|
||||
company || null,
|
||||
id
|
||||
]);
|
||||
|
||||
res.json(result.rows[0]);
|
||||
} catch (error) {
|
||||
console.error('Error updating AI prompt:', error);
|
||||
|
||||
|
||||
// Check for unique constraint violations
|
||||
if (error instanceof Error && error.message.includes('unique constraint')) {
|
||||
if (error.message.includes('unique_company_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A prompt already exists for this company',
|
||||
if (error instanceof Error && error.message.includes('unique')) {
|
||||
if (error.message.includes('idx_singleton_with_company')) {
|
||||
return res.status(409).json({
|
||||
error: 'A prompt of this type already exists for this company',
|
||||
details: error.message
|
||||
});
|
||||
} else if (error.message.includes('idx_unique_general_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A general prompt already exists',
|
||||
details: error.message
|
||||
});
|
||||
} else if (error.message.includes('idx_unique_system_prompt')) {
|
||||
return res.status(409).json({
|
||||
error: 'A system prompt already exists',
|
||||
} else if (error.message.includes('idx_singleton_no_company')) {
|
||||
return res.status(409).json({
|
||||
error: 'A prompt of this type already exists',
|
||||
details: error.message
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
res.status(500).json({
|
||||
|
||||
res.status(500).json({
|
||||
error: 'Failed to update AI prompt',
|
||||
details: error instanceof Error ? error.message : 'Unknown error'
|
||||
});
|
||||
|
||||
@@ -347,34 +347,34 @@ async function generateDebugResponse(productsToUse, res) {
|
||||
throw new Error("Database connection not available");
|
||||
}
|
||||
|
||||
// First, fetch the system prompt using the consolidated endpoint approach
|
||||
// First, fetch the system prompt for bulk validation
|
||||
const systemPromptResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'system'
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'bulk_validation_system' AND company IS NULL
|
||||
`);
|
||||
|
||||
|
||||
if (systemPromptResult.rows.length === 0) {
|
||||
console.error("❌ No system prompt found in database");
|
||||
throw new Error("No system prompt found in database");
|
||||
console.error("❌ No bulk_validation_system prompt found in database");
|
||||
throw new Error("Missing required AI prompt: bulk_validation_system. Please add it in Settings > AI Validation Prompts.");
|
||||
}
|
||||
|
||||
const systemPrompt = systemPromptResult.rows[0];
|
||||
console.log("📝 Loaded system prompt from database, ID:", systemPrompt.id);
|
||||
console.log("📝 Loaded bulk_validation_system prompt from database, ID:", systemPrompt.id);
|
||||
|
||||
// Then, fetch the general prompt using the consolidated endpoint approach
|
||||
// Then, fetch the general prompt for bulk validation
|
||||
const generalPromptResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'general'
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'bulk_validation_general' AND company IS NULL
|
||||
`);
|
||||
|
||||
|
||||
if (generalPromptResult.rows.length === 0) {
|
||||
console.error("❌ No general prompt found in database");
|
||||
throw new Error("No general prompt found in database");
|
||||
console.error("❌ No bulk_validation_general prompt found in database");
|
||||
throw new Error("Missing required AI prompt: bulk_validation_general. Please add it in Settings > AI Validation Prompts.");
|
||||
}
|
||||
|
||||
// Get the general prompt text and info
|
||||
const generalPrompt = generalPromptResult.rows[0];
|
||||
console.log("📝 Loaded general prompt from database, ID:", generalPrompt.id);
|
||||
console.log("📝 Loaded bulk_validation_general prompt from database, ID:", generalPrompt.id);
|
||||
|
||||
// Fetch company-specific prompts if we have products to validate
|
||||
let companyPrompts = [];
|
||||
@@ -389,16 +389,16 @@ async function generateDebugResponse(productsToUse, res) {
|
||||
|
||||
if (companyIds.size > 0) {
|
||||
console.log(`🔍 Found ${companyIds.size} unique companies in products:`, Array.from(companyIds));
|
||||
|
||||
// Fetch company-specific prompts
|
||||
|
||||
// Fetch company-specific prompts for bulk validation
|
||||
const companyPromptsResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'company_specific'
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'bulk_validation_company_specific'
|
||||
AND company = ANY($1)
|
||||
`, [Array.from(companyIds)]);
|
||||
|
||||
|
||||
companyPrompts = companyPromptsResult.rows;
|
||||
console.log(`📝 Loaded ${companyPrompts.length} company-specific prompts`);
|
||||
console.log(`📝 Loaded ${companyPrompts.length} bulk_validation_company_specific prompts`);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -688,34 +688,34 @@ async function loadPrompt(connection, productsToValidate = null, appPool = null)
|
||||
throw new Error("Database connection not available");
|
||||
}
|
||||
|
||||
// Fetch the system prompt using the consolidated endpoint approach
|
||||
// Fetch the system prompt for bulk validation
|
||||
const systemPromptResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'system'
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'bulk_validation_system' AND company IS NULL
|
||||
`);
|
||||
|
||||
|
||||
if (systemPromptResult.rows.length === 0) {
|
||||
console.error("❌ No system prompt found in database");
|
||||
throw new Error("No system prompt found in database");
|
||||
console.error("❌ No bulk_validation_system prompt found in database");
|
||||
throw new Error("Missing required AI prompt: bulk_validation_system. Please add it in Settings > AI Validation Prompts.");
|
||||
}
|
||||
|
||||
const systemInstructions = systemPromptResult.rows[0].prompt_text;
|
||||
console.log("📝 Loaded system prompt from database");
|
||||
console.log("📝 Loaded bulk_validation_system prompt from database");
|
||||
|
||||
// Fetch the general prompt using the consolidated endpoint approach
|
||||
// Fetch the general prompt for bulk validation
|
||||
const generalPromptResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'general'
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'bulk_validation_general' AND company IS NULL
|
||||
`);
|
||||
|
||||
|
||||
if (generalPromptResult.rows.length === 0) {
|
||||
console.error("❌ No general prompt found in database");
|
||||
throw new Error("No general prompt found in database");
|
||||
console.error("❌ No bulk_validation_general prompt found in database");
|
||||
throw new Error("Missing required AI prompt: bulk_validation_general. Please add it in Settings > AI Validation Prompts.");
|
||||
}
|
||||
|
||||
// Get the general prompt text
|
||||
const basePrompt = generalPromptResult.rows[0].prompt_text;
|
||||
console.log("📝 Loaded general prompt from database");
|
||||
console.log("📝 Loaded bulk_validation_general prompt from database");
|
||||
|
||||
// Fetch company-specific prompts if we have products to validate
|
||||
let companyPrompts = [];
|
||||
@@ -730,16 +730,16 @@ async function loadPrompt(connection, productsToValidate = null, appPool = null)
|
||||
|
||||
if (companyIds.size > 0) {
|
||||
console.log(`🔍 Found ${companyIds.size} unique companies in products:`, Array.from(companyIds));
|
||||
|
||||
// Fetch company-specific prompts
|
||||
|
||||
// Fetch company-specific prompts for bulk validation
|
||||
const companyPromptsResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'company_specific'
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'bulk_validation_company_specific'
|
||||
AND company = ANY($1)
|
||||
`, [Array.from(companyIds)]);
|
||||
|
||||
|
||||
companyPrompts = companyPromptsResult.rows;
|
||||
console.log(`📝 Loaded ${companyPrompts.length} company-specific prompts`);
|
||||
console.log(`📝 Loaded ${companyPrompts.length} bulk_validation_company_specific prompts`);
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1186,14 +1186,14 @@ router.post("/validate", async (req, res) => {
|
||||
if (!pool) {
|
||||
console.warn("⚠️ Local database pool not available for prompt sources");
|
||||
} else {
|
||||
// Get system prompt
|
||||
// Get system prompt for bulk validation
|
||||
const systemPromptResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts WHERE prompt_type = 'system'
|
||||
SELECT * FROM ai_prompts WHERE prompt_type = 'bulk_validation_system' AND company IS NULL
|
||||
`);
|
||||
|
||||
// Get general prompt
|
||||
// Get general prompt for bulk validation
|
||||
const generalPromptResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts WHERE prompt_type = 'general'
|
||||
SELECT * FROM ai_prompts WHERE prompt_type = 'bulk_validation_general' AND company IS NULL
|
||||
`);
|
||||
|
||||
// Extract unique company IDs from products
|
||||
@@ -1206,10 +1206,10 @@ router.post("/validate", async (req, res) => {
|
||||
|
||||
let companyPrompts = [];
|
||||
if (companyIds.size > 0) {
|
||||
// Fetch company-specific prompts
|
||||
// Fetch company-specific prompts for bulk validation
|
||||
const companyPromptsResult = await pool.query(`
|
||||
SELECT * FROM ai_prompts
|
||||
WHERE prompt_type = 'company_specific'
|
||||
WHERE prompt_type = 'bulk_validation_company_specific'
|
||||
AND company = ANY($1)
|
||||
`, [Array.from(companyIds)]);
|
||||
|
||||
|
||||
434
inventory-server/src/routes/ai.js
Normal file
@@ -0,0 +1,434 @@
|
||||
/**
|
||||
* AI Routes
|
||||
*
|
||||
* API endpoints for AI-powered product validation features.
|
||||
* Provides embedding generation and similarity-based suggestions.
|
||||
*/
|
||||
|
||||
const express = require('express');
|
||||
const router = express.Router();
|
||||
const aiService = require('../services/ai');
|
||||
const { getDbConnection, closeAllConnections } = require('../utils/dbConnection');
|
||||
|
||||
// Track initialization state
|
||||
let initializationPromise = null;
|
||||
|
||||
/**
|
||||
* Ensure AI service is initialized
|
||||
* Uses lazy initialization on first request
|
||||
*/
|
||||
async function ensureInitialized() {
|
||||
if (aiService.isReady()) {
|
||||
return true;
|
||||
}
|
||||
|
||||
if (initializationPromise) {
|
||||
await initializationPromise;
|
||||
return aiService.isReady();
|
||||
}
|
||||
|
||||
initializationPromise = (async () => {
|
||||
try {
|
||||
console.log('[AI Routes] Initializing AI service...');
|
||||
|
||||
// Get database connection for taxonomy
|
||||
const { connection } = await getDbConnection();
|
||||
|
||||
const result = await aiService.initialize({
|
||||
openaiApiKey: process.env.OPENAI_API_KEY,
|
||||
groqApiKey: process.env.GROQ_API_KEY,
|
||||
mysqlConnection: connection,
|
||||
pool: null, // Will be set by setPool()
|
||||
logger: console
|
||||
});
|
||||
|
||||
if (!result.success) {
|
||||
console.error('[AI Routes] AI service initialization failed:', result.message);
|
||||
return false;
|
||||
}
|
||||
|
||||
console.log('[AI Routes] AI service initialized:', {
|
||||
...result.stats,
|
||||
groqEnabled: result.groqEnabled
|
||||
});
|
||||
return true;
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Failed to initialize AI service:', error);
|
||||
return false;
|
||||
}
|
||||
})();
|
||||
|
||||
await initializationPromise;
|
||||
return aiService.isReady();
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/ai/status
|
||||
* Get AI service status
|
||||
*/
|
||||
router.get('/status', async (req, res) => {
|
||||
try {
|
||||
const status = aiService.getStatus();
|
||||
res.json(status);
|
||||
} catch (error) {
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* POST /api/ai/initialize
|
||||
* Manually trigger initialization (also happens automatically on first use)
|
||||
*/
|
||||
router.post('/initialize', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
const status = aiService.getStatus();
|
||||
|
||||
res.json({
|
||||
success: ready,
|
||||
...status
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Initialize error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* GET /api/ai/taxonomy
|
||||
* Get all taxonomy data (categories, themes, colors) without embeddings
|
||||
*/
|
||||
router.get('/taxonomy', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
const taxonomy = aiService.getTaxonomyData();
|
||||
res.json(taxonomy);
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Taxonomy error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* POST /api/ai/embedding
|
||||
* Generate embedding for a single product
|
||||
*
|
||||
* Body: { product: { name, description, company_name, line_name } }
|
||||
* Returns: { embedding: number[], latencyMs: number }
|
||||
*/
|
||||
router.post('/embedding', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
const { product } = req.body;
|
||||
|
||||
if (!product) {
|
||||
return res.status(400).json({ error: 'Product is required' });
|
||||
}
|
||||
|
||||
const result = await aiService.getProductEmbedding(product);
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Embedding error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* POST /api/ai/embeddings
|
||||
* Generate embeddings for multiple products
|
||||
*
|
||||
* Body: { products: Array<{ name, description, company_name, line_name }> }
|
||||
* Returns: { embeddings: Array<{ index, embedding }>, latencyMs }
|
||||
*/
|
||||
router.post('/embeddings', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
const { products } = req.body;
|
||||
|
||||
if (!Array.isArray(products)) {
|
||||
return res.status(400).json({ error: 'Products array is required' });
|
||||
}
|
||||
|
||||
const result = await aiService.getProductEmbeddings(products);
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Embeddings error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* POST /api/ai/suggestions
|
||||
* Get category/theme/color suggestions for a single product
|
||||
* Generates embedding and finds similar taxonomy items
|
||||
*
|
||||
* Body: { product: { name, description, company_name, line_name }, options?: { topCategories, topThemes, topColors } }
|
||||
* Returns: { categories: Array, themes: Array, colors: Array, latencyMs }
|
||||
*/
|
||||
router.post('/suggestions', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
const { product, options } = req.body;
|
||||
|
||||
if (!product) {
|
||||
return res.status(400).json({ error: 'Product is required' });
|
||||
}
|
||||
|
||||
const suggestions = await aiService.getSuggestionsForProduct(product, options);
|
||||
res.json(suggestions);
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Suggestions error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
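
For orientation, a minimal client call to POST /api/ai/suggestions might look like the sketch below. The request/response shape follows the route above; the concrete product values and the relative base URL are illustrative assumptions.

```js
// Sketch of a client call to POST /api/ai/suggestions (example values are assumptions)
async function fetchSuggestions() {
  const res = await fetch('/api/ai/suggestions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      product: {
        name: 'Example Alphabet Stickers',              // hypothetical product
        description: 'Gold foil alphabet stickers, two sheets',
        company_name: 'Example Co',
        line_name: 'Example Line'
      },
      options: { topCategories: 10, topThemes: 5, topColors: 5 }
    })
  });

  if (!res.ok) throw new Error(`Suggestions request failed: ${res.status}`);
  // Resolves to { categories, themes, colors, latencyMs, embeddingLatencyMs, searchLatencyMs }
  return res.json();
}
```
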
|
||||
|
||||
/**
|
||||
* POST /api/ai/suggestions/batch
|
||||
* Get suggestions for multiple products
|
||||
* More efficient than calling /suggestions multiple times
|
||||
*
|
||||
* Body: { products: Array, options?: { topCategories, topThemes, topColors } }
|
||||
* Returns: { results: Array<{ index, categories, themes, colors }>, latencyMs }
|
||||
*/
|
||||
router.post('/suggestions/batch', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
const { products, options } = req.body;
|
||||
|
||||
if (!Array.isArray(products)) {
|
||||
return res.status(400).json({ error: 'Products array is required' });
|
||||
}
|
||||
|
||||
const startTime = Date.now();
|
||||
|
||||
// Generate all embeddings at once
|
||||
const { embeddings, latencyMs: embeddingLatency } = await aiService.getProductEmbeddings(products);
|
||||
|
||||
// Find suggestions for each embedding
|
||||
const results = embeddings.map(({ index, embedding }) => {
|
||||
const suggestions = aiService.findSimilarTaxonomy(embedding, options);
|
||||
return {
|
||||
index,
|
||||
...suggestions
|
||||
};
|
||||
});
|
||||
|
||||
const totalLatency = Date.now() - startTime;
|
||||
|
||||
res.json({
|
||||
results,
|
||||
latencyMs: totalLatency,
|
||||
embeddingLatencyMs: embeddingLatency,
|
||||
searchLatencyMs: totalLatency - embeddingLatency,
|
||||
productCount: products.length,
|
||||
embeddingCount: embeddings.length
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Batch suggestions error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* POST /api/ai/similar
|
||||
* Find similar taxonomy items given a pre-computed embedding
|
||||
* Useful when frontend has cached the embedding
|
||||
*
|
||||
* Body: { embedding: number[], options?: { topCategories, topThemes, topColors } }
|
||||
* Returns: { categories, themes, colors }
|
||||
*/
|
||||
router.post('/similar', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
const { embedding, options } = req.body;
|
||||
|
||||
if (!embedding || !Array.isArray(embedding)) {
|
||||
return res.status(400).json({ error: 'Embedding array is required' });
|
||||
}
|
||||
|
||||
const startTime = Date.now();
|
||||
const suggestions = aiService.findSimilarTaxonomy(embedding, options);
|
||||
|
||||
res.json({
|
||||
...suggestions,
|
||||
latencyMs: Date.now() - startTime
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Similar error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
// ============================================================================
|
||||
// INLINE AI VALIDATION ENDPOINTS (Groq-powered)
|
||||
// ============================================================================
|
||||
|
||||
/**
|
||||
* POST /api/ai/validate/inline/name
|
||||
* Validate a single product name for spelling, grammar, and naming conventions
|
||||
*
|
||||
* Body: { product: { name, company_name, company_id, line_name, description } }
|
||||
* Returns: { isValid, suggestion?, issues[], latencyMs }
|
||||
*/
|
||||
router.post('/validate/inline/name', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
if (!aiService.hasChatCompletion()) {
|
||||
return res.status(503).json({
|
||||
error: 'Chat completion not available - GROQ_API_KEY not configured'
|
||||
});
|
||||
}
|
||||
|
||||
const { product } = req.body;
|
||||
|
||||
if (!product) {
|
||||
return res.status(400).json({ error: 'Product is required' });
|
||||
}
|
||||
|
||||
// Get pool from app.locals (set by server.js)
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
const result = await aiService.runTask(aiService.TASK_IDS.VALIDATE_NAME, {
|
||||
product,
|
||||
pool
|
||||
});
|
||||
|
||||
if (!result.success) {
|
||||
return res.status(500).json({
|
||||
error: result.error || 'Validation failed',
|
||||
code: result.code
|
||||
});
|
||||
}
|
||||
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Name validation error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
|
||||
|
||||
/**
|
||||
* POST /api/ai/validate/inline/description
|
||||
* Validate a single product description for quality and guideline compliance
|
||||
*
|
||||
* Body: { product: { name, description, company_name, company_id, categories } }
|
||||
* Returns: { isValid, suggestion?, issues[], latencyMs }
|
||||
*/
|
||||
router.post('/validate/inline/description', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
if (!aiService.hasChatCompletion()) {
|
||||
return res.status(503).json({
|
||||
error: 'Chat completion not available - GROQ_API_KEY not configured'
|
||||
});
|
||||
}
|
||||
|
||||
const { product } = req.body;
|
||||
|
||||
if (!product) {
|
||||
return res.status(400).json({ error: 'Product is required' });
|
||||
}
|
||||
|
||||
// Get pool from app.locals (set by server.js)
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
const result = await aiService.runTask(aiService.TASK_IDS.VALIDATE_DESCRIPTION, {
|
||||
product,
|
||||
pool
|
||||
});
|
||||
|
||||
if (!result.success) {
|
||||
return res.status(500).json({
|
||||
error: result.error || 'Validation failed',
|
||||
code: result.code
|
||||
});
|
||||
}
|
||||
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Description validation error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
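
Because the handler passes the whole `product` object straight into the task, any extra keys the caller includes travel along with it. A rough usage sketch follows; the field values and the supplemental `additional_context` key are illustrative assumptions, not part of the documented body.

```js
// Sketch only - additional_context is a hypothetical supplemental field the caller may attach
async function validateDescription(product) {
  const res = await fetch('/api/ai/validate/inline/description', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      product: {
        name: product.name,
        description: product.description,
        company_name: product.company_name,
        company_id: product.company_id,
        categories: product.categories,
        additional_context: product.additional_context // forwarded to the task as-is
      }
    })
  });

  if (!res.ok) throw new Error(`Description validation failed: ${res.status}`);
  return res.json(); // { isValid, suggestion?, issues[], latencyMs }
}
```
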
|
||||
|
||||
/**
|
||||
* POST /api/ai/validate/sanity-check
|
||||
* Run consistency/sanity check on a batch of products
|
||||
*
|
||||
* Body: { products: Array<product data> }
|
||||
* Returns: { issues: Array<{ productIndex, field, issue, suggestion? }>, summary, latencyMs }
|
||||
*/
|
||||
router.post('/validate/sanity-check', async (req, res) => {
|
||||
try {
|
||||
const ready = await ensureInitialized();
|
||||
if (!ready) {
|
||||
return res.status(503).json({ error: 'AI service not available' });
|
||||
}
|
||||
|
||||
if (!aiService.hasChatCompletion()) {
|
||||
return res.status(503).json({
|
||||
error: 'Chat completion not available - GROQ_API_KEY not configured'
|
||||
});
|
||||
}
|
||||
|
||||
const { products } = req.body;
|
||||
|
||||
if (!Array.isArray(products) || products.length === 0) {
|
||||
return res.status(400).json({ error: 'Products array is required' });
|
||||
}
|
||||
|
||||
// Get pool from app.locals (set by server.js)
|
||||
const pool = req.app.locals.pool;
|
||||
|
||||
const result = await aiService.runTask(aiService.TASK_IDS.SANITY_CHECK, {
|
||||
products,
|
||||
pool
|
||||
});
|
||||
|
||||
if (!result.success) {
|
||||
return res.status(500).json({
|
||||
error: result.error || 'Sanity check failed',
|
||||
code: result.code
|
||||
});
|
||||
}
|
||||
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
console.error('[AI Routes] Sanity check error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
});
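
A hedged sketch of how a batch might be submitted to this endpoint; the response fields follow the Returns note in the route comment, and everything else here is assumed for illustration.

```js
// Sketch: submitting a batch to POST /api/ai/validate/sanity-check
async function runSanityCheck(products) {
  const res = await fetch('/api/ai/validate/sanity-check', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ products })
  });

  if (!res.ok) throw new Error(`Sanity check failed: ${res.status}`);
  const { issues, summary, latencyMs } = await res.json();
  // Each issue carries { productIndex, field, issue, suggestion? } per the route comment
  return { issues, summary, latencyMs };
}
```
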
|
||||
|
||||
module.exports = router;
|
||||
@@ -635,7 +635,7 @@ router.post('/upload-image', upload.single('image'), async (req, res) => {

    // Create URL for the uploaded file - using an absolute URL with domain
    // This will generate a URL like: https://acot.site/uploads/products/filename.jpg
-   const baseUrl = 'https://acot.site';
+   const baseUrl = 'https://tools.acherryontop.com';
    const imageUrl = `${baseUrl}/uploads/products/${req.file.filename}`;

    // Schedule this image for deletion in 24 hours
@@ -715,6 +715,26 @@ router.delete('/delete-image', (req, res) => {
  }
});

+// Clear all taxonomy caches
+router.post('/clear-taxonomy-cache', (req, res) => {
+  try {
+    // Clear all entries from the query cache
+    const cacheSize = connectionCache.queryCache.size;
+    connectionCache.queryCache.clear();
+
+    console.log(`Cleared ${cacheSize} entries from taxonomy cache`);
+
+    res.json({
+      success: true,
+      message: `Cache cleared (${cacheSize} entries removed)`,
+      clearedEntries: cacheSize
+    });
+  } catch (error) {
+    console.error('Error clearing taxonomy cache:', error);
+    res.status(500).json({ error: 'Failed to clear cache' });
+  }
+});
+
// Get all options for import fields
router.get('/field-options', async (req, res) => {
  try {
@@ -1394,7 +1414,7 @@ router.get('/check-upc-and-generate-sku', async (req, res) => {

    if (upcCheck.length > 0) {
      return res.status(409).json({
-       error: 'UPC already exists',
+       error: 'A product with this UPC already exists',
        existingProductId: upcCheck[0].pid,
        existingItemNumber: upcCheck[0].itemnumber
      });

@@ -194,7 +194,7 @@ router.post('/upload', upload.single('image'), async (req, res) => {
    }

    // Create URL for the uploaded file
-   const baseUrl = 'https://acot.site';
+   const baseUrl = 'https://tools.acherryontop.com';
    const imageUrl = `${baseUrl}/uploads/reusable/${req.file.filename}`;

    const pool = req.app.locals.pool;

@@ -15,6 +15,7 @@ const configRouter = require('./routes/config');
const metricsRouter = require('./routes/metrics');
const importRouter = require('./routes/import');
const aiValidationRouter = require('./routes/ai-validation');
+const aiRouter = require('./routes/ai');
const templatesRouter = require('./routes/templates');
const aiPromptsRouter = require('./routes/ai-prompts');
const reusableImagesRouter = require('./routes/reusable-images');
@@ -124,6 +125,7 @@ async function startServer() {
  app.use('/api/brands-aggregate', brandsAggregateRouter);
  app.use('/api/import', importRouter);
  app.use('/api/ai-validation', aiValidationRouter);
+ app.use('/api/ai', aiRouter);
  app.use('/api/templates', templatesRouter);
  app.use('/api/ai-prompts', aiPromptsRouter);
  app.use('/api/reusable-images', reusableImagesRouter);
82
inventory-server/src/services/ai/embeddings/similarity.js
Normal file
@@ -0,0 +1,82 @@
/**
 * Vector similarity utilities
 */

/**
 * Compute cosine similarity between two vectors
 * @param {number[]} a
 * @param {number[]} b
 * @returns {number} Similarity score between -1 and 1
 */
function cosineSimilarity(a, b) {
  if (!a || !b || a.length !== b.length) {
    return 0;
  }

  let dotProduct = 0;
  let normA = 0;
  let normB = 0;

  for (let i = 0; i < a.length; i++) {
    dotProduct += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }

  const denominator = Math.sqrt(normA) * Math.sqrt(normB);
  if (denominator === 0) return 0;

  return dotProduct / denominator;
}

/**
 * Find top K most similar items from a collection
 * @param {number[]} queryEmbedding - The embedding to search for
 * @param {Array<{id: any, embedding: number[]}>} items - Items with embeddings
 * @param {number} topK - Number of results to return
 * @returns {Array<{id: any, similarity: number}>}
 */
function findTopMatches(queryEmbedding, items, topK = 10) {
  if (!queryEmbedding || !items || items.length === 0) {
    return [];
  }

  const scored = items.map(item => ({
    id: item.id,
    similarity: cosineSimilarity(queryEmbedding, item.embedding)
  }));

  scored.sort((a, b) => b.similarity - a.similarity);

  return scored.slice(0, topK);
}

/**
 * Find matches above a similarity threshold
 * @param {number[]} queryEmbedding
 * @param {Array<{id: any, embedding: number[]}>} items
 * @param {number} threshold - Minimum similarity (0-1)
 * @returns {Array<{id: any, similarity: number}>}
 */
function findMatchesAboveThreshold(queryEmbedding, items, threshold = 0.5) {
  if (!queryEmbedding || !items || items.length === 0) {
    return [];
  }

  const scored = items
    .map(item => ({
      id: item.id,
      similarity: cosineSimilarity(queryEmbedding, item.embedding)
    }))
    .filter(item => item.similarity >= threshold);

  scored.sort((a, b) => b.similarity - a.similarity);

  return scored;
}

module.exports = {
  cosineSimilarity,
  findTopMatches,
  findMatchesAboveThreshold
};
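
As a quick illustration of how these helpers behave together, here is a toy example with made-up three-dimensional vectors (real embeddings are far larger):

```js
const { cosineSimilarity, findTopMatches } = require('./similarity');

// Toy 3-dimensional embeddings, purely for illustration
const query = [0.9, 0.1, 0.0];
const items = [
  { id: 'cardstock', embedding: [0.8, 0.2, 0.1] },
  { id: 'stickers',  embedding: [0.1, 0.9, 0.3] },
  { id: 'adhesive',  embedding: [0.0, 0.2, 0.9] }
];

console.log(cosineSimilarity(query, items[0].embedding)); // highest score of the three
console.log(findTopMatches(query, items, 2).map(m => m.id)); // ['cardstock', 'stickers']
```
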
|
||||
323
inventory-server/src/services/ai/embeddings/taxonomyEmbeddings.js
Normal file
@@ -0,0 +1,323 @@
|
||||
/**
|
||||
* Taxonomy Embedding Service
|
||||
*
|
||||
* Generates and caches embeddings for categories, themes, and colors.
|
||||
* Excludes "Black Friday", "Gifts", "Deals" categories and their children.
|
||||
*/
|
||||
|
||||
const { findTopMatches } = require('./similarity');
|
||||
|
||||
// Categories to exclude (and all their children)
|
||||
const EXCLUDED_CATEGORY_NAMES = ['black friday', 'gifts', 'deals'];
|
||||
|
||||
class TaxonomyEmbeddings {
|
||||
constructor({ provider, logger }) {
|
||||
this.provider = provider;
|
||||
this.logger = logger || console;
|
||||
|
||||
// Cached taxonomy with embeddings
|
||||
this.categories = [];
|
||||
this.themes = [];
|
||||
this.colors = [];
|
||||
|
||||
// Raw data without embeddings (for lookup)
|
||||
this.categoryMap = new Map();
|
||||
this.themeMap = new Map();
|
||||
this.colorMap = new Map();
|
||||
|
||||
this.initialized = false;
|
||||
this.initializing = false;
|
||||
}
|
||||
|
||||
/**
|
||||
* Initialize embeddings - fetch taxonomy and generate embeddings
|
||||
*/
|
||||
async initialize(connection) {
|
||||
if (this.initialized) {
|
||||
return { categories: this.categories.length, themes: this.themes.length, colors: this.colors.length };
|
||||
}
|
||||
|
||||
if (this.initializing) {
|
||||
// Wait for existing initialization
|
||||
while (this.initializing) {
|
||||
await new Promise(resolve => setTimeout(resolve, 100));
|
||||
}
|
||||
return { categories: this.categories.length, themes: this.themes.length, colors: this.colors.length };
|
||||
}
|
||||
|
||||
this.initializing = true;
|
||||
|
||||
try {
|
||||
this.logger.info('[TaxonomyEmbeddings] Starting initialization...');
|
||||
|
||||
// Fetch raw taxonomy data
|
||||
const [categories, themes, colors] = await Promise.all([
|
||||
this._fetchCategories(connection),
|
||||
this._fetchThemes(connection),
|
||||
this._fetchColors(connection)
|
||||
]);
|
||||
|
||||
this.logger.info(`[TaxonomyEmbeddings] Fetched ${categories.length} categories, ${themes.length} themes, ${colors.length} colors`);
|
||||
|
||||
// Generate embeddings in parallel
|
||||
const [catEmbeddings, themeEmbeddings, colorEmbeddings] = await Promise.all([
|
||||
this._generateEmbeddings(categories, 'categories'),
|
||||
this._generateEmbeddings(themes, 'themes'),
|
||||
this._generateEmbeddings(colors, 'colors')
|
||||
]);
|
||||
|
||||
// Store with embeddings
|
||||
this.categories = catEmbeddings;
|
||||
this.themes = themeEmbeddings;
|
||||
this.colors = colorEmbeddings;
|
||||
|
||||
// Build lookup maps
|
||||
this.categoryMap = new Map(this.categories.map(c => [c.id, c]));
|
||||
this.themeMap = new Map(this.themes.map(t => [t.id, t]));
|
||||
this.colorMap = new Map(this.colors.map(c => [c.id, c]));
|
||||
|
||||
this.initialized = true;
|
||||
this.logger.info('[TaxonomyEmbeddings] Initialization complete');
|
||||
|
||||
return {
|
||||
categories: this.categories.length,
|
||||
themes: this.themes.length,
|
||||
colors: this.colors.length
|
||||
};
|
||||
} catch (error) {
|
||||
this.logger.error('[TaxonomyEmbeddings] Initialization failed:', error);
|
||||
throw error;
|
||||
} finally {
|
||||
this.initializing = false;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Find similar categories for a product embedding
|
||||
*/
|
||||
findSimilarCategories(productEmbedding, topK = 10) {
|
||||
if (!this.initialized || !productEmbedding) {
|
||||
return [];
|
||||
}
|
||||
|
||||
const matches = findTopMatches(productEmbedding, this.categories, topK);
|
||||
|
||||
return matches.map(match => {
|
||||
const cat = this.categoryMap.get(match.id);
|
||||
return {
|
||||
id: match.id,
|
||||
name: cat?.name || '',
|
||||
fullPath: cat?.fullPath || '',
|
||||
similarity: match.similarity
|
||||
};
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Find similar themes for a product embedding
|
||||
*/
|
||||
findSimilarThemes(productEmbedding, topK = 5) {
|
||||
if (!this.initialized || !productEmbedding) {
|
||||
return [];
|
||||
}
|
||||
|
||||
const matches = findTopMatches(productEmbedding, this.themes, topK);
|
||||
|
||||
return matches.map(match => {
|
||||
const theme = this.themeMap.get(match.id);
|
||||
return {
|
||||
id: match.id,
|
||||
name: theme?.name || '',
|
||||
fullPath: theme?.fullPath || '',
|
||||
similarity: match.similarity
|
||||
};
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Find similar colors for a product embedding
|
||||
*/
|
||||
findSimilarColors(productEmbedding, topK = 5) {
|
||||
if (!this.initialized || !productEmbedding) {
|
||||
return [];
|
||||
}
|
||||
|
||||
const matches = findTopMatches(productEmbedding, this.colors, topK);
|
||||
|
||||
return matches.map(match => {
|
||||
const color = this.colorMap.get(match.id);
|
||||
return {
|
||||
id: match.id,
|
||||
name: color?.name || '',
|
||||
similarity: match.similarity
|
||||
};
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all taxonomy data (without embeddings) for frontend
|
||||
*/
|
||||
getTaxonomyData() {
|
||||
return {
|
||||
categories: this.categories.map(({ id, name, fullPath, parentId }) => ({ id, name, fullPath, parentId })),
|
||||
themes: this.themes.map(({ id, name, fullPath, parentId }) => ({ id, name, fullPath, parentId })),
|
||||
colors: this.colors.map(({ id, name }) => ({ id, name }))
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if service is ready
|
||||
*/
|
||||
isReady() {
|
||||
return this.initialized;
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Private Methods
|
||||
// ============================================================================
|
||||
|
||||
async _fetchCategories(connection) {
|
||||
// Fetch hierarchical categories (types 10-13)
|
||||
const [rows] = await connection.query(`
|
||||
SELECT cat_id, name, master_cat_id, type
|
||||
FROM product_categories
|
||||
WHERE type IN (10, 11, 12, 13)
|
||||
ORDER BY type, name
|
||||
`);
|
||||
|
||||
// Build lookup for hierarchy
|
||||
const byId = new Map(rows.map(r => [r.cat_id, r]));
|
||||
|
||||
// Find IDs of excluded top-level categories and all their descendants
|
||||
const excludedIds = new Set();
|
||||
|
||||
// First pass: find excluded top-level categories
|
||||
for (const row of rows) {
|
||||
if (row.type === 10 && EXCLUDED_CATEGORY_NAMES.includes(row.name.toLowerCase())) {
|
||||
excludedIds.add(row.cat_id);
|
||||
}
|
||||
}
|
||||
|
||||
// Multiple passes to find all descendants
|
||||
let foundNew = true;
|
||||
while (foundNew) {
|
||||
foundNew = false;
|
||||
for (const row of rows) {
|
||||
if (!excludedIds.has(row.cat_id) && excludedIds.has(row.master_cat_id)) {
|
||||
excludedIds.add(row.cat_id);
|
||||
foundNew = true;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
this.logger.info(`[TaxonomyEmbeddings] Excluding ${excludedIds.size} categories (Black Friday, Gifts, Deals and children)`);
|
||||
|
||||
// Build category objects with full paths, excluding filtered ones
|
||||
const categories = [];
|
||||
|
||||
for (const row of rows) {
|
||||
if (excludedIds.has(row.cat_id)) {
|
||||
continue;
|
||||
}
|
||||
|
||||
const path = [];
|
||||
let current = row;
|
||||
|
||||
// Walk up the tree to build full path
|
||||
while (current) {
|
||||
path.unshift(current.name);
|
||||
current = current.master_cat_id ? byId.get(current.master_cat_id) : null;
|
||||
}
|
||||
|
||||
categories.push({
|
||||
id: row.cat_id,
|
||||
name: row.name,
|
||||
parentId: row.master_cat_id,
|
||||
type: row.type,
|
||||
fullPath: path.join(' > '),
|
||||
embeddingText: path.join(' ')
|
||||
});
|
||||
}
|
||||
|
||||
return categories;
|
||||
}
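
The repeated pass over `rows` above is a simple fixed-point loop: it keeps adding children of already-excluded categories until a pass finds nothing new, which handles arbitrarily deep nesting without recursion. A stripped-down illustration with invented IDs:

```js
// Hypothetical rows: Gifts (1) -> Gift Sets (2) -> Holiday Gift Sets (3); Paper (4) is unrelated
const rows = [
  { cat_id: 1, name: 'Gifts', master_cat_id: null, type: 10 },
  { cat_id: 2, name: 'Gift Sets', master_cat_id: 1, type: 11 },
  { cat_id: 3, name: 'Holiday Gift Sets', master_cat_id: 2, type: 12 },
  { cat_id: 4, name: 'Paper', master_cat_id: null, type: 10 }
];

const excludedIds = new Set([1]); // seeded from EXCLUDED_CATEGORY_NAMES
let foundNew = true;
while (foundNew) {
  foundNew = false;
  for (const row of rows) {
    if (!excludedIds.has(row.cat_id) && excludedIds.has(row.master_cat_id)) {
      excludedIds.add(row.cat_id);
      foundNew = true;
    }
  }
}
// Pass 1 adds 2, pass 2 adds 3, pass 3 finds nothing new -> excludedIds = {1, 2, 3}
```
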
|
||||
|
||||
async _fetchThemes(connection) {
|
||||
// Fetch themes (types 20-21)
|
||||
const [rows] = await connection.query(`
|
||||
SELECT cat_id, name, master_cat_id, type
|
||||
FROM product_categories
|
||||
WHERE type IN (20, 21)
|
||||
ORDER BY type, name
|
||||
`);
|
||||
|
||||
const byId = new Map(rows.map(r => [r.cat_id, r]));
|
||||
const themes = [];
|
||||
|
||||
for (const row of rows) {
|
||||
const path = [];
|
||||
let current = row;
|
||||
|
||||
while (current) {
|
||||
path.unshift(current.name);
|
||||
current = current.master_cat_id ? byId.get(current.master_cat_id) : null;
|
||||
}
|
||||
|
||||
themes.push({
|
||||
id: row.cat_id,
|
||||
name: row.name,
|
||||
parentId: row.master_cat_id,
|
||||
type: row.type,
|
||||
fullPath: path.join(' > '),
|
||||
embeddingText: path.join(' ')
|
||||
});
|
||||
}
|
||||
|
||||
return themes;
|
||||
}
|
||||
|
||||
async _fetchColors(connection) {
|
||||
const [rows] = await connection.query(`
|
||||
SELECT color, name, hex_color
|
||||
FROM product_color_list
|
||||
ORDER BY \`order\`
|
||||
`);
|
||||
|
||||
return rows.map(row => ({
|
||||
id: row.color,
|
||||
name: row.name,
|
||||
hexColor: row.hex_color,
|
||||
embeddingText: row.name
|
||||
}));
|
||||
}
|
||||
|
||||
async _generateEmbeddings(items, label) {
|
||||
if (items.length === 0) {
|
||||
return items;
|
||||
}
|
||||
|
||||
const startTime = Date.now();
|
||||
const texts = items.map(item => item.embeddingText);
|
||||
const results = [...items];
|
||||
|
||||
// Process in batches
|
||||
let batchNum = 0;
|
||||
for await (const chunk of this.provider.embedBatchChunked(texts, { batchSize: 100 })) {
|
||||
batchNum++;
|
||||
for (let i = 0; i < chunk.embeddings.length; i++) {
|
||||
const globalIndex = chunk.startIndex + i;
|
||||
results[globalIndex] = {
|
||||
...results[globalIndex],
|
||||
embedding: chunk.embeddings[i]
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
const elapsed = Date.now() - startTime;
|
||||
this.logger.info(`[TaxonomyEmbeddings] Generated ${items.length} ${label} embeddings in ${elapsed}ms`);
|
||||
|
||||
return results;
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = { TaxonomyEmbeddings };
|
||||
386
inventory-server/src/services/ai/index.js
Normal file
@@ -0,0 +1,386 @@
|
||||
/**
|
||||
* AI Service
|
||||
*
|
||||
* Main entry point for AI functionality including:
|
||||
* - Embeddings for taxonomy suggestions (OpenAI)
|
||||
* - Chat completions for validation tasks (Groq)
|
||||
* - Task registry for AI operations
|
||||
*/
|
||||
|
||||
const { OpenAIProvider } = require('./providers/openaiProvider');
|
||||
const { GroqProvider, MODELS: GROQ_MODELS } = require('./providers/groqProvider');
|
||||
const { TaxonomyEmbeddings } = require('./embeddings/taxonomyEmbeddings');
|
||||
const { cosineSimilarity, findTopMatches } = require('./embeddings/similarity');
|
||||
const { getRegistry, TASK_IDS, registerAllTasks } = require('./tasks');
|
||||
|
||||
let initialized = false;
|
||||
let initializing = false;
|
||||
let openaiProvider = null;
|
||||
let groqProvider = null;
|
||||
let taxonomyEmbeddings = null;
|
||||
let logger = console;
|
||||
|
||||
// Store pool reference for task access
|
||||
let appPool = null;
|
||||
|
||||
/**
|
||||
* Initialize the AI service
|
||||
* @param {Object} options
|
||||
* @param {string} options.openaiApiKey - OpenAI API key (for embeddings)
|
||||
* @param {string} [options.groqApiKey] - Groq API key (for chat completions)
|
||||
* @param {Object} options.mysqlConnection - MySQL connection for taxonomy data
|
||||
* @param {Object} [options.pool] - PostgreSQL pool for prompt loading
|
||||
* @param {Object} [options.logger] - Logger instance
|
||||
*/
|
||||
async function initialize({ openaiApiKey, groqApiKey, mysqlConnection, pool, logger: customLogger }) {
|
||||
if (initialized) {
|
||||
return { success: true, message: 'Already initialized' };
|
||||
}
|
||||
|
||||
if (initializing) {
|
||||
// Wait for existing initialization
|
||||
while (initializing) {
|
||||
await new Promise(resolve => setTimeout(resolve, 100));
|
||||
}
|
||||
return { success: initialized, message: initialized ? 'Initialized' : 'Initialization failed' };
|
||||
}
|
||||
|
||||
initializing = true;
|
||||
|
||||
try {
|
||||
if (customLogger) {
|
||||
logger = customLogger;
|
||||
}
|
||||
|
||||
if (!openaiApiKey) {
|
||||
throw new Error('OpenAI API key is required');
|
||||
}
|
||||
|
||||
logger.info('[AI] Initializing AI service...');
|
||||
|
||||
// Store pool reference for tasks
|
||||
if (pool) {
|
||||
appPool = pool;
|
||||
}
|
||||
|
||||
// Create OpenAI provider (for embeddings)
|
||||
openaiProvider = new OpenAIProvider({ apiKey: openaiApiKey });
|
||||
|
||||
// Create Groq provider (for chat completions) if API key provided
|
||||
if (groqApiKey) {
|
||||
groqProvider = new GroqProvider({ apiKey: groqApiKey });
|
||||
logger.info('[AI] Groq provider initialized for chat completions');
|
||||
} else {
|
||||
logger.warn('[AI] No Groq API key provided - chat completion tasks will not be available');
|
||||
}
|
||||
|
||||
// Create and initialize taxonomy embeddings
|
||||
taxonomyEmbeddings = new TaxonomyEmbeddings({
|
||||
provider: openaiProvider,
|
||||
logger
|
||||
});
|
||||
|
||||
const stats = await taxonomyEmbeddings.initialize(mysqlConnection);
|
||||
|
||||
// Register validation tasks if Groq is available
|
||||
if (groqProvider) {
|
||||
registerValidationTasks();
|
||||
}
|
||||
|
||||
initialized = true;
|
||||
logger.info('[AI] AI service initialized', {
|
||||
...stats,
|
||||
groqEnabled: !!groqProvider,
|
||||
tasksRegistered: getRegistry().list()
|
||||
});
|
||||
|
||||
return {
|
||||
success: true,
|
||||
message: 'Initialized',
|
||||
stats,
|
||||
groqEnabled: !!groqProvider
|
||||
};
|
||||
} catch (error) {
|
||||
logger.error('[AI] Initialization failed:', error);
|
||||
return { success: false, message: error.message };
|
||||
} finally {
|
||||
initializing = false;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Register validation tasks with the task registry
|
||||
* Called during initialization if Groq is available
|
||||
*/
|
||||
function registerValidationTasks() {
|
||||
registerAllTasks(logger);
|
||||
logger.info('[AI] Validation tasks registered');
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if service is ready
|
||||
*/
|
||||
function isReady() {
|
||||
return initialized && taxonomyEmbeddings?.isReady();
|
||||
}
|
||||
|
||||
/**
|
||||
* Build weighted product text for embedding.
|
||||
* Weights the product name heavily by repeating it, and truncates long descriptions
|
||||
* to prevent verbose marketing copy from drowning out the product signal.
|
||||
*
|
||||
* @param {Object} product - Product with name, description, company, line
|
||||
* @returns {string} - Combined text for embedding
|
||||
*/
|
||||
function buildProductText(product) {
|
||||
const parts = [];
|
||||
const name = product.name?.trim();
|
||||
const description = product.description?.trim();
|
||||
const company = (product.company_name || product.company)?.trim();
|
||||
const line = (product.line_name || product.line)?.trim();
|
||||
|
||||
// Name is most important - repeat 3x to weight it heavily in the embedding
|
||||
if (name) {
|
||||
parts.push(name, name, name);
|
||||
}
|
||||
|
||||
// Company and line provide context
|
||||
if (company) {
|
||||
parts.push(company);
|
||||
}
|
||||
if (line) {
|
||||
parts.push(line);
|
||||
}
|
||||
|
||||
// Truncate description to prevent it from overwhelming the signal
|
||||
if (description) {
|
||||
const truncated = description.length > 500
|
||||
? description.substring(0, 500) + '...'
|
||||
: description;
|
||||
parts.push(truncated);
|
||||
}
|
||||
|
||||
return parts.join(' ').trim();
|
||||
}
|
||||
|
||||
/**
|
||||
* Generate embedding for a product
|
||||
* @param {Object} product - Product with name, description, company, line
|
||||
* @returns {Promise<{embedding: number[], latencyMs: number}>}
|
||||
*/
|
||||
async function getProductEmbedding(product) {
|
||||
if (!initialized || !openaiProvider) {
|
||||
throw new Error('AI service not initialized');
|
||||
}
|
||||
|
||||
const text = buildProductText(product);
|
||||
|
||||
if (!text) {
|
||||
return { embedding: null, latencyMs: 0 };
|
||||
}
|
||||
|
||||
const result = await openaiProvider.embed(text);
|
||||
|
||||
return {
|
||||
embedding: result.embeddings[0],
|
||||
latencyMs: result.latencyMs
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Generate embeddings for multiple products
|
||||
* @param {Object[]} products - Array of products
|
||||
* @returns {Promise<{embeddings: Array<{index: number, embedding: number[]}>, latencyMs: number}>}
|
||||
*/
|
||||
async function getProductEmbeddings(products) {
|
||||
if (!initialized || !openaiProvider) {
|
||||
throw new Error('AI service not initialized');
|
||||
}
|
||||
|
||||
const texts = products.map(buildProductText);
|
||||
|
||||
// Track which products have empty text
|
||||
const validIndices = texts.map((t, i) => t ? i : -1).filter(i => i >= 0);
|
||||
const validTexts = texts.filter(t => t);
|
||||
|
||||
if (validTexts.length === 0) {
|
||||
return { embeddings: [], latencyMs: 0 };
|
||||
}
|
||||
|
||||
const result = await openaiProvider.embed(validTexts);
|
||||
|
||||
// Map embeddings back to original indices
|
||||
const embeddings = validIndices.map((originalIndex, resultIndex) => ({
|
||||
index: originalIndex,
|
||||
embedding: result.embeddings[resultIndex]
|
||||
}));
|
||||
|
||||
return {
|
||||
embeddings,
|
||||
latencyMs: result.latencyMs
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Find similar taxonomy items for a product embedding
|
||||
* @param {number[]} productEmbedding
|
||||
* @param {Object} options
|
||||
* @returns {{categories: Array, themes: Array, colors: Array}}
|
||||
*/
|
||||
function findSimilarTaxonomy(productEmbedding, options = {}) {
|
||||
if (!initialized || !taxonomyEmbeddings) {
|
||||
throw new Error('AI service not initialized');
|
||||
}
|
||||
|
||||
const topCategories = options.topCategories ?? 10;
|
||||
const topThemes = options.topThemes ?? 5;
|
||||
const topColors = options.topColors ?? 5;
|
||||
|
||||
return {
|
||||
categories: taxonomyEmbeddings.findSimilarCategories(productEmbedding, topCategories),
|
||||
themes: taxonomyEmbeddings.findSimilarThemes(productEmbedding, topThemes),
|
||||
colors: taxonomyEmbeddings.findSimilarColors(productEmbedding, topColors)
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Get product embedding and find similar taxonomy in one call
|
||||
* @param {Object} product
|
||||
* @param {Object} options
|
||||
*/
|
||||
async function getSuggestionsForProduct(product, options = {}) {
|
||||
const { embedding, latencyMs: embeddingLatency } = await getProductEmbedding(product);
|
||||
|
||||
if (!embedding) {
|
||||
return {
|
||||
categories: [],
|
||||
themes: [],
|
||||
colors: [],
|
||||
latencyMs: embeddingLatency
|
||||
};
|
||||
}
|
||||
|
||||
const startSearch = Date.now();
|
||||
const suggestions = findSimilarTaxonomy(embedding, options);
|
||||
const searchLatency = Date.now() - startSearch;
|
||||
|
||||
return {
|
||||
...suggestions,
|
||||
latencyMs: embeddingLatency + searchLatency,
|
||||
embeddingLatencyMs: embeddingLatency,
|
||||
searchLatencyMs: searchLatency
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Get all taxonomy data (without embeddings) for frontend
|
||||
*/
|
||||
function getTaxonomyData() {
|
||||
if (!initialized || !taxonomyEmbeddings) {
|
||||
throw new Error('AI service not initialized');
|
||||
}
|
||||
|
||||
return taxonomyEmbeddings.getTaxonomyData();
|
||||
}
|
||||
|
||||
/**
|
||||
* Get service status
|
||||
*/
|
||||
function getStatus() {
|
||||
const registry = getRegistry();
|
||||
|
||||
return {
|
||||
initialized,
|
||||
ready: isReady(),
|
||||
hasOpenAI: !!openaiProvider,
|
||||
hasGroq: !!groqProvider,
|
||||
hasTaxonomy: !!taxonomyEmbeddings,
|
||||
taxonomyStats: taxonomyEmbeddings ? {
|
||||
categories: taxonomyEmbeddings.categories?.length || 0,
|
||||
themes: taxonomyEmbeddings.themes?.length || 0,
|
||||
colors: taxonomyEmbeddings.colors?.length || 0
|
||||
} : null,
|
||||
tasks: {
|
||||
registered: registry.list(),
|
||||
count: registry.size()
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Run an AI task by ID
|
||||
* @param {string} taskId - Task identifier from TASK_IDS
|
||||
* @param {Object} payload - Task-specific input
|
||||
* @returns {Promise<Object>} Task result
|
||||
*/
|
||||
async function runTask(taskId, payload = {}) {
|
||||
if (!initialized) {
|
||||
throw new Error('AI service not initialized');
|
||||
}
|
||||
|
||||
if (!groqProvider) {
|
||||
throw new Error('Groq provider not available - chat completion tasks require GROQ_API_KEY');
|
||||
}
|
||||
|
||||
const registry = getRegistry();
|
||||
return registry.runTask(taskId, {
|
||||
...payload,
|
||||
// Inject dependencies tasks may need
|
||||
provider: groqProvider,
|
||||
// Use pool from payload if provided (from route), fall back to stored appPool
|
||||
pool: payload.pool || appPool,
|
||||
logger
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the Groq provider instance (for direct use if needed)
|
||||
* @returns {GroqProvider|null}
|
||||
*/
|
||||
function getGroqProvider() {
|
||||
return groqProvider;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get the PostgreSQL pool (for tasks that need DB access)
|
||||
* @returns {Object|null}
|
||||
*/
|
||||
function getPool() {
|
||||
return appPool;
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if chat completion tasks are available
|
||||
* @returns {boolean}
|
||||
*/
|
||||
function hasChatCompletion() {
|
||||
return !!groqProvider;
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
// Initialization
|
||||
initialize,
|
||||
isReady,
|
||||
getStatus,
|
||||
|
||||
// Embeddings (OpenAI)
|
||||
getProductEmbedding,
|
||||
getProductEmbeddings,
|
||||
findSimilarTaxonomy,
|
||||
getSuggestionsForProduct,
|
||||
getTaxonomyData,
|
||||
|
||||
// Chat completions (Groq)
|
||||
runTask,
|
||||
hasChatCompletion,
|
||||
getGroqProvider,
|
||||
getPool,
|
||||
|
||||
// Constants
|
||||
TASK_IDS,
|
||||
GROQ_MODELS,
|
||||
|
||||
// Re-export utilities
|
||||
cosineSimilarity,
|
||||
findTopMatches
|
||||
};
|
||||
176
inventory-server/src/services/ai/prompts/descriptionPrompts.js
Normal file
@@ -0,0 +1,176 @@
|
||||
/**
|
||||
* Description Validation Prompts
|
||||
*
|
||||
* Functions for building and parsing description validation prompts.
|
||||
* System and general prompts are loaded from the database.
|
||||
*/
|
||||
|
||||
/**
|
||||
* Sanitize an issue string from AI response
|
||||
* AI sometimes returns malformed strings with escape sequences
|
||||
*
|
||||
* @param {string} issue - Raw issue string
|
||||
* @returns {string} Cleaned issue string
|
||||
*/
|
||||
function sanitizeIssue(issue) {
|
||||
if (!issue || typeof issue !== 'string') return '';
|
||||
|
||||
let cleaned = issue
|
||||
// Remove trailing backslashes (incomplete escapes)
|
||||
.replace(/\\+$/, '')
|
||||
// Fix malformed escaped quotes at end of string
|
||||
.replace(/\\",?\)?$/, '')
|
||||
// Clean up double-escaped quotes
|
||||
.replace(/\\\\"/g, '"')
|
||||
// Clean up single escaped quotes that aren't needed
|
||||
.replace(/\\"/g, '"')
|
||||
// Remove any remaining trailing punctuation artifacts
|
||||
.replace(/[,\s]+$/, '')
|
||||
// Trim whitespace
|
||||
.trim();
|
||||
|
||||
return cleaned;
|
||||
}
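
A quick illustration of the kind of malformed strings this is guarding against; the inputs below are representative examples, not captured from real model responses:

```js
// Representative malformed issue strings and their cleaned forms
console.log(sanitizeIssue('Name is misspelled\\",)'));      // -> 'Name is misspelled'
console.log(sanitizeIssue('Use \\"Sticker Sheet\\" instead')); // -> 'Use "Sticker Sheet" instead'
console.log(sanitizeIssue('Trailing escape\\\\'));           // -> 'Trailing escape'
```
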
|
||||
|
||||
/**
|
||||
* Build the user prompt for description validation
|
||||
* Combines database prompts with product data
|
||||
*
|
||||
* @param {Object} product - Product data
|
||||
* @param {string} product.name - Product name
|
||||
* @param {string} product.description - Current description
|
||||
* @param {string} [product.company_name] - Company name
|
||||
* @param {string} [product.categories] - Product categories
|
||||
* @param {Object} prompts - Prompts loaded from database
|
||||
* @param {string} prompts.general - General description guidelines
|
||||
* @param {string} [prompts.companySpecific] - Company-specific rules
|
||||
* @returns {string} Complete user prompt
|
||||
*/
|
||||
function buildDescriptionUserPrompt(product, prompts) {
|
||||
const parts = [];
|
||||
|
||||
// Add general prompt/guidelines if provided
|
||||
if (prompts.general) {
|
||||
parts.push(prompts.general);
|
||||
parts.push(''); // Empty line for separation
|
||||
}
|
||||
|
||||
// Add company-specific rules if provided
|
||||
if (prompts.companySpecific) {
|
||||
parts.push(`COMPANY-SPECIFIC RULES FOR ${product.company_name || 'THIS COMPANY'}:`);
|
||||
parts.push(prompts.companySpecific);
|
||||
parts.push(''); // Empty line for separation
|
||||
}
|
||||
|
||||
// Add product information
|
||||
parts.push('PRODUCT TO VALIDATE:');
|
||||
parts.push(`NAME: "${product.name || ''}"`);
|
||||
parts.push(`COMPANY: ${product.company_name || 'Unknown'}`);
|
||||
|
||||
if (product.categories) {
|
||||
parts.push(`CATEGORIES: ${product.categories}`);
|
||||
}
|
||||
|
||||
parts.push('');
|
||||
parts.push('CURRENT DESCRIPTION:');
|
||||
parts.push(`"${product.description || '(empty)'}"`);
|
||||
|
||||
// Add response format instructions
|
||||
parts.push('');
|
||||
parts.push('CRITICAL RULES:');
|
||||
parts.push('- If isValid is false, you MUST provide a suggestion with the improved description');
|
||||
parts.push('- If there are ANY issues, isValid MUST be false and suggestion MUST contain the corrected text');
|
||||
parts.push('- Only set isValid to true if there are ZERO issues and the description needs no changes');
|
||||
parts.push('');
|
||||
parts.push('RESPOND WITH JSON:');
|
||||
parts.push(JSON.stringify({
|
||||
isValid: 'true if perfect, false if ANY changes needed',
|
||||
suggestion: 'REQUIRED when isValid is false - the complete improved description',
|
||||
issues: ['list each problem found (empty array only if isValid is true)']
|
||||
}, null, 2));
|
||||
|
||||
return parts.join('\n');
|
||||
}
|
||||
|
||||
/**
|
||||
* Parse the AI response for description validation
|
||||
*
|
||||
* @param {Object|null} parsed - Parsed JSON from AI
|
||||
* @param {string} content - Raw response content
|
||||
* @returns {Object}
|
||||
*/
|
||||
function parseDescriptionResponse(parsed, content) {
|
||||
// If we got valid parsed JSON, use it
|
||||
if (parsed && typeof parsed.isValid === 'boolean') {
|
||||
// Sanitize issues - AI sometimes returns malformed escape sequences
|
||||
const rawIssues = Array.isArray(parsed.issues) ? parsed.issues : [];
|
||||
const issues = rawIssues
|
||||
.map(sanitizeIssue)
|
||||
.filter(issue => issue.length > 0);
|
||||
|
||||
const suggestion = parsed.suggestion || null;
|
||||
|
||||
// IMPORTANT: LLMs sometimes return contradictory data (isValid: true with issues).
|
||||
// If there are issues, treat as invalid regardless of what the AI said.
|
||||
// Also if there's a suggestion, the AI thought something needed to change.
|
||||
const isValid = parsed.isValid && issues.length === 0 && !suggestion;
|
||||
|
||||
return { isValid, suggestion, issues };
|
||||
}
|
||||
|
||||
// Handle case where isValid is a string "true"/"false" instead of boolean
|
||||
if (parsed && typeof parsed.isValid === 'string') {
|
||||
const rawIssues = Array.isArray(parsed.issues) ? parsed.issues : [];
|
||||
const issues = rawIssues
|
||||
.map(sanitizeIssue)
|
||||
.filter(issue => issue.length > 0);
|
||||
const suggestion = parsed.suggestion || null;
|
||||
const rawIsValid = parsed.isValid.toLowerCase() !== 'false';
|
||||
|
||||
// Same defensive logic: if there are issues, it's not valid
|
||||
const isValid = rawIsValid && issues.length === 0 && !suggestion;
|
||||
|
||||
return { isValid, suggestion, issues };
|
||||
}
|
||||
|
||||
// Try to extract from content if parsing failed
|
||||
try {
|
||||
// Look for isValid pattern
|
||||
const isValidMatch = content.match(/"isValid"\s*:\s*(true|false)/i);
|
||||
const isValid = isValidMatch ? isValidMatch[1].toLowerCase() === 'true' : true;
|
||||
|
||||
// Look for suggestion (might be multiline)
|
||||
const suggestionMatch = content.match(/"suggestion"\s*:\s*"((?:[^"\\]|\\.)*)"/s);
|
||||
let suggestion = suggestionMatch ? suggestionMatch[1] : null;
|
||||
if (suggestion) {
|
||||
// Unescape common escapes
|
||||
suggestion = suggestion.replace(/\\n/g, '\n').replace(/\\"/g, '"');
|
||||
}
|
||||
|
||||
// Look for issues array
|
||||
const issuesMatch = content.match(/"issues"\s*:\s*\[([\s\S]*?)\]/);
|
||||
let issues = [];
|
||||
if (issuesMatch) {
|
||||
const issuesContent = issuesMatch[1];
|
||||
const issueStrings = issuesContent.match(/"([^"]+)"/g);
|
||||
if (issueStrings) {
|
||||
issues = issueStrings
|
||||
.map(s => sanitizeIssue(s.replace(/"/g, '')))
|
||||
.filter(issue => issue.length > 0);
|
||||
}
|
||||
}
|
||||
|
||||
// Same logic: if there are issues, it's not valid
|
||||
const finalIsValid = isValid && issues.length === 0 && !suggestion;
|
||||
|
||||
return { isValid: finalIsValid, suggestion, issues };
|
||||
} catch {
|
||||
// Default to valid if we can't parse anything
|
||||
return { isValid: true, suggestion: null, issues: [] };
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
buildDescriptionUserPrompt,
|
||||
parseDescriptionResponse
|
||||
};
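
The effect of the defensive checks in `parseDescriptionResponse` is easiest to see with a contradictory response, roughly like the sketch below; the parsed object is invented for illustration:

```js
// Hypothetical contradictory AI output: claims valid but still lists issues and a rewrite
const parsed = {
  isValid: true,
  suggestion: 'A cleaner, guideline-compliant description.',
  issues: ['Description repeats the product name', '']
};

const result = parseDescriptionResponse(parsed, JSON.stringify(parsed));
// result.isValid    -> false  (issues and a suggestion are present, so the "true" is overridden)
// result.issues     -> ['Description repeats the product name']  (empty strings are dropped)
// result.suggestion -> 'A cleaner, guideline-compliant description.'
```
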
|
||||
187
inventory-server/src/services/ai/prompts/namePrompts.js
Normal file
@@ -0,0 +1,187 @@
|
||||
/**
|
||||
* Name Validation Prompts
|
||||
*
|
||||
* Functions for building and parsing name validation prompts.
|
||||
* System and general prompts are loaded from the database.
|
||||
*/
|
||||
|
||||
/**
|
||||
* Sanitize an issue string from AI response
|
||||
* AI sometimes returns malformed strings with escape sequences
|
||||
*
|
||||
* @param {string} issue - Raw issue string
|
||||
* @returns {string} Cleaned issue string
|
||||
*/
|
||||
function sanitizeIssue(issue) {
|
||||
if (!issue || typeof issue !== 'string') return '';
|
||||
|
||||
let cleaned = issue
|
||||
// Remove trailing backslashes (incomplete escapes)
|
||||
.replace(/\\+$/, '')
|
||||
// Fix malformed escaped quotes at end of string
|
||||
.replace(/\\",?\)?$/, '')
|
||||
// Clean up double-escaped quotes
|
||||
.replace(/\\\\"/g, '"')
|
||||
// Clean up single escaped quotes that aren't needed
|
||||
.replace(/\\"/g, '"')
|
||||
// Remove any remaining trailing punctuation artifacts
|
||||
.replace(/[,\s]+$/, '')
|
||||
// Trim whitespace
|
||||
.trim();
|
||||
|
||||
return cleaned;
|
||||
}
|
||||
|
||||
/**
|
||||
* Build the user prompt for name validation
|
||||
* Combines database prompts with product data
|
||||
*
|
||||
* @param {Object} product - Product data
|
||||
* @param {string} product.name - Current product name
|
||||
* @param {string} [product.company_name] - Company name
|
||||
* @param {string} [product.line_name] - Product line name
|
||||
* @param {string} [product.subline_name] - Product subline name
|
||||
* @param {string[]} [product.siblingNames] - Names of other products in the same line
|
||||
* @param {Object} prompts - Prompts loaded from database
|
||||
* @param {string} prompts.general - General naming conventions
|
||||
* @param {string} [prompts.companySpecific] - Company-specific rules
|
||||
* @returns {string} Complete user prompt
|
||||
*/
|
||||
function buildNameUserPrompt(product, prompts) {
|
||||
const parts = [];
|
||||
|
||||
// Add general prompt/conventions if provided
|
||||
if (prompts.general) {
|
||||
parts.push(prompts.general);
|
||||
parts.push(''); // Empty line for separation
|
||||
}
|
||||
|
||||
// Add company-specific rules if provided
|
||||
if (prompts.companySpecific) {
|
||||
parts.push(`COMPANY-SPECIFIC RULES FOR ${product.company_name || 'THIS COMPANY'}:`);
|
||||
parts.push(prompts.companySpecific);
|
||||
parts.push(''); // Empty line for separation
|
||||
}
|
||||
|
||||
// Add product information
|
||||
parts.push('PRODUCT TO VALIDATE:');
|
||||
parts.push(`NAME: "${product.name || ''}"`);
|
||||
parts.push(`COMPANY: ${product.company_name || 'Unknown'}`);
|
||||
parts.push(`LINE: ${product.line_name || 'None'}`);
|
||||
if (product.subline_name) {
|
||||
parts.push(`SUBLINE: ${product.subline_name}`);
|
||||
}
|
||||
|
||||
// Add sibling context for naming decisions
|
||||
if (product.siblingNames && product.siblingNames.length > 0) {
|
||||
parts.push('');
|
||||
parts.push(`OTHER PRODUCTS IN THIS LINE (${product.siblingNames.length + 1} total including this one):`);
|
||||
product.siblingNames.forEach(name => {
|
||||
parts.push(`- ${name}`);
|
||||
});
|
||||
}
|
||||
|
||||
// Add response format instructions
|
||||
parts.push('');
|
||||
parts.push('RESPOND WITH JSON:');
|
||||
parts.push(JSON.stringify({
|
||||
isValid: 'true/false',
|
||||
suggestion: 'corrected name if changes needed, or null if valid',
|
||||
issues: ['issue 1', 'issue 2 (empty array if valid)']
|
||||
}, null, 2));
|
||||
|
||||
return parts.join('\n');
|
||||
}
|
||||
|
||||
/**
|
||||
* Parse the AI response for name validation
|
||||
*
|
||||
* @param {Object|null} parsed - Parsed JSON from AI
|
||||
* @param {string} content - Raw response content
|
||||
* @returns {Object}
|
||||
*/
|
||||
function parseNameResponse(parsed, content) {
|
||||
// Debug: Log what we're trying to parse
|
||||
console.log('[parseNameResponse] Input:', {
|
||||
hasParsed: !!parsed,
|
||||
parsedIsValid: parsed?.isValid,
|
||||
parsedType: typeof parsed?.isValid,
|
||||
contentPreview: content?.substring(0, 3000)
|
||||
});
|
||||
|
||||
// If we got valid parsed JSON, use it
|
||||
if (parsed && typeof parsed.isValid === 'boolean') {
|
||||
// Sanitize issues - AI sometimes returns malformed escape sequences
|
||||
const rawIssues = Array.isArray(parsed.issues) ? parsed.issues : [];
|
||||
const issues = rawIssues
|
||||
.map(sanitizeIssue)
|
||||
.filter(issue => issue.length > 0);
|
||||
const suggestion = parsed.suggestion || null;
|
||||
|
||||
// IMPORTANT: LLMs sometimes return contradictory data (isValid: true with issues).
|
||||
// If there are issues, treat as invalid regardless of what the AI said.
|
||||
const isValid = parsed.isValid && issues.length === 0 && !suggestion;
|
||||
|
||||
return { isValid, suggestion, issues };
|
||||
}
|
||||
|
||||
// Handle case where isValid is a string "true"/"false" instead of boolean
|
||||
if (parsed && typeof parsed.isValid === 'string') {
|
||||
const rawIssues = Array.isArray(parsed.issues) ? parsed.issues : [];
|
||||
const issues = rawIssues
|
||||
.map(sanitizeIssue)
|
||||
.filter(issue => issue.length > 0);
|
||||
const suggestion = parsed.suggestion || null;
|
||||
const rawIsValid = parsed.isValid.toLowerCase() !== 'false';
|
||||
|
||||
// Same defensive logic: if there are issues, it's not valid
|
||||
const isValid = rawIsValid && issues.length === 0 && !suggestion;
|
||||
|
||||
console.log('[parseNameResponse] Parsed isValid as string:', parsed.isValid, '→', isValid);
|
||||
return { isValid, suggestion, issues };
|
||||
}
|
||||
|
||||
// Try to extract from content if parsing failed
|
||||
try {
|
||||
// Look for isValid pattern - handle both boolean and quoted string
|
||||
// Matches: "isValid": true, "isValid": false, "isValid": "true", "isValid": "false"
|
||||
const isValidMatch = content.match(/"isValid"\s*:\s*"?(true|false)"?/i);
|
||||
const isValid = isValidMatch ? isValidMatch[1].toLowerCase() === 'true' : true;
|
||||
|
||||
console.log('[parseNameResponse] Regex extraction:', {
|
||||
isValidMatch: isValidMatch?.[0],
|
||||
isValidValue: isValidMatch?.[1],
|
||||
resultIsValid: isValid
|
||||
});
|
||||
|
||||
// Look for suggestion - handle escaped quotes and null
|
||||
const suggestionMatch = content.match(/"suggestion"\s*:\s*(?:"([^"\\]*(?:\\.[^"\\]*)*)"|null)/);
|
||||
const suggestion = suggestionMatch ? (suggestionMatch[1] || null) : null;
|
||||
|
||||
// Look for issues array
|
||||
const issuesMatch = content.match(/"issues"\s*:\s*\[([\s\S]*?)\]/);
|
||||
let issues = [];
|
||||
if (issuesMatch) {
|
||||
const issuesContent = issuesMatch[1];
|
||||
const issueStrings = issuesContent.match(/"([^"]+)"/g);
|
||||
if (issueStrings) {
|
||||
issues = issueStrings
|
||||
.map(s => sanitizeIssue(s.replace(/"/g, '')))
|
||||
.filter(issue => issue.length > 0);
|
||||
}
|
||||
}
|
||||
|
||||
// Same defensive logic: if there are issues, it's not valid
|
||||
const finalIsValid = isValid && issues.length === 0 && !suggestion;
|
||||
|
||||
return { isValid: finalIsValid, suggestion, issues };
|
||||
} catch {
|
||||
// Default to valid if we can't parse anything
|
||||
return { isValid: true, suggestion: null, issues: [] };
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
buildNameUserPrompt,
|
||||
parseNameResponse
|
||||
};
|
||||
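A minimal sketch of how these two helpers are expected to pair up; the product fields, prompt text, and AI response below are illustrative placeholders, not values from the repository or database.

```javascript
// Hypothetical usage of the name prompt helpers above.
const { buildNameUserPrompt, parseNameResponse } = require('./namePrompts');

const product = {
  name: 'Widget Deluxe 12pk',              // example values only
  company_name: 'Acme Co',
  line_name: 'Widgets',
  siblingNames: ['Widget Classic 6pk']
};
const prompts = {
  general: 'Use Title Case; no trailing punctuation.', // normally loaded from ai_prompts
  companySpecific: null
};

const userPrompt = buildNameUserPrompt(product, prompts);

// parseNameResponse accepts the provider's parsed JSON (or null) plus the raw
// content string and always returns { isValid, suggestion, issues }.
const result = parseNameResponse(
  { isValid: 'true', suggestion: null, issues: [] },
  '{"isValid":"true","suggestion":null,"issues":[]}'
);
console.log(result.isValid); // true
```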
194
inventory-server/src/services/ai/prompts/promptLoader.js
Normal file
@@ -0,0 +1,194 @@
|
||||
/**
|
||||
* Prompt Loader
|
||||
*
|
||||
* Utilities to load AI prompts from the ai_prompts PostgreSQL table.
|
||||
* Supports loading prompts by base type (e.g., 'name_validation' loads
|
||||
* name_validation_system, name_validation_general, and optionally
|
||||
* name_validation_company_specific).
|
||||
*/
|
||||
|
||||
/**
|
||||
* Load a single prompt by exact type
|
||||
* @param {Object} pool - PostgreSQL pool
|
||||
* @param {string} promptType - Exact prompt type (e.g., 'name_validation_system')
|
||||
* @param {string} [company] - Company identifier (for company_specific types)
|
||||
* @returns {Promise<string|null>} Prompt text or null if not found
|
||||
*/
|
||||
async function loadPromptByType(pool, promptType, company = null) {
|
||||
try {
|
||||
let result;
|
||||
|
||||
if (company) {
|
||||
result = await pool.query(
|
||||
'SELECT prompt_text FROM ai_prompts WHERE prompt_type = $1 AND company = $2',
|
||||
[promptType, company]
|
||||
);
|
||||
} else {
|
||||
result = await pool.query(
|
||||
'SELECT prompt_text FROM ai_prompts WHERE prompt_type = $1 AND company IS NULL',
|
||||
[promptType]
|
||||
);
|
||||
}
|
||||
|
||||
return result.rows[0]?.prompt_text || null;
|
||||
} catch (error) {
|
||||
console.error(`[PromptLoader] Error loading ${promptType} prompt:`, error.message);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Load all prompts for a task type (system, general, and optionally company-specific)
|
||||
*
|
||||
* @param {Object} pool - PostgreSQL pool
|
||||
* @param {string} baseType - Base type name (e.g., 'name_validation', 'description_validation')
|
||||
* @param {string|null} [company] - Optional company ID for company-specific prompts
|
||||
* @returns {Promise<{system: string|null, general: string|null, companySpecific: string|null}>}
|
||||
*/
|
||||
async function loadPromptsByType(pool, baseType, company = null) {
|
||||
const systemType = `${baseType}_system`;
|
||||
const generalType = `${baseType}_general`;
|
||||
const companyType = `${baseType}_company_specific`;
|
||||
|
||||
// Load system and general prompts in parallel
|
||||
const [system, general] = await Promise.all([
|
||||
loadPromptByType(pool, systemType),
|
||||
loadPromptByType(pool, generalType)
|
||||
]);
|
||||
|
||||
// Load company-specific prompt if company is provided
|
||||
let companySpecific = null;
|
||||
if (company) {
|
||||
companySpecific = await loadPromptByType(pool, companyType, company);
|
||||
}
|
||||
|
||||
return {
|
||||
system,
|
||||
general,
|
||||
companySpecific
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Load name validation prompts
|
||||
* @param {Object} pool - PostgreSQL pool
|
||||
* @param {string|null} [company] - Optional company ID
|
||||
* @returns {Promise<{system: string|null, general: string|null, companySpecific: string|null}>}
|
||||
*/
|
||||
async function loadNameValidationPrompts(pool, company = null) {
|
||||
return loadPromptsByType(pool, 'name_validation', company);
|
||||
}
|
||||
|
||||
/**
|
||||
* Load description validation prompts
|
||||
* @param {Object} pool - PostgreSQL pool
|
||||
* @param {string|null} [company] - Optional company ID
|
||||
* @returns {Promise<{system: string|null, general: string|null, companySpecific: string|null}>}
|
||||
*/
|
||||
async function loadDescriptionValidationPrompts(pool, company = null) {
|
||||
return loadPromptsByType(pool, 'description_validation', company);
|
||||
}
|
||||
|
||||
/**
|
||||
* Load sanity check prompts (no company-specific variant)
|
||||
* @param {Object} pool - PostgreSQL pool
|
||||
* @returns {Promise<{system: string|null, general: string|null, companySpecific: null}>}
|
||||
*/
|
||||
async function loadSanityCheckPrompts(pool) {
|
||||
return loadPromptsByType(pool, 'sanity_check', null);
|
||||
}
|
||||
|
||||
/**
|
||||
* Load bulk validation prompts (GPT-5 validation)
|
||||
* @param {Object} pool - PostgreSQL pool
|
||||
* @param {string|null} [company] - Optional company ID
|
||||
* @returns {Promise<{system: string|null, general: string|null, companySpecific: string|null}>}
|
||||
*/
|
||||
async function loadBulkValidationPrompts(pool, company = null) {
|
||||
return loadPromptsByType(pool, 'bulk_validation', company);
|
||||
}
|
||||
|
||||
/**
|
||||
* Load bulk validation prompts for multiple companies at once
|
||||
* @param {Object} pool - PostgreSQL pool
|
||||
* @param {string[]} companyIds - Array of company IDs
|
||||
* @returns {Promise<{system: string|null, general: string|null, companyPrompts: Map<string, string>}>}
|
||||
*/
|
||||
async function loadBulkValidationPromptsForCompanies(pool, companyIds = []) {
|
||||
// Load system and general prompts
|
||||
const [system, general] = await Promise.all([
|
||||
loadPromptByType(pool, 'bulk_validation_system'),
|
||||
loadPromptByType(pool, 'bulk_validation_general')
|
||||
]);
|
||||
|
||||
// Load company-specific prompts for all provided companies
|
||||
const companyPrompts = new Map();
|
||||
|
||||
if (companyIds.length > 0) {
|
||||
try {
|
||||
const result = await pool.query(
|
||||
`SELECT company, prompt_text FROM ai_prompts
|
||||
WHERE prompt_type = 'bulk_validation_company_specific'
|
||||
AND company = ANY($1)`,
|
||||
[companyIds]
|
||||
);
|
||||
|
||||
for (const row of result.rows) {
|
||||
companyPrompts.set(row.company, row.prompt_text);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('[PromptLoader] Error loading company-specific prompts:', error.message);
|
||||
}
|
||||
}
|
||||
|
||||
return {
|
||||
system,
|
||||
general,
|
||||
companyPrompts
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Validate that required prompts exist, throw error if missing
|
||||
* @param {Object} prompts - Prompts object from loadPromptsByType
|
||||
* @param {string} baseType - Base type for error messages
|
||||
* @param {Object} options - Validation options
|
||||
* @param {boolean} [options.requireSystem=true] - Require system prompt
|
||||
* @param {boolean} [options.requireGeneral=true] - Require general prompt
|
||||
* @throws {Error} If required prompts are missing
|
||||
*/
|
||||
function validateRequiredPrompts(prompts, baseType, options = {}) {
|
||||
const { requireSystem = true, requireGeneral = true } = options;
|
||||
const missing = [];
|
||||
|
||||
if (requireSystem && !prompts.system) {
|
||||
missing.push(`${baseType}_system`);
|
||||
}
|
||||
|
||||
if (requireGeneral && !prompts.general) {
|
||||
missing.push(`${baseType}_general`);
|
||||
}
|
||||
|
||||
if (missing.length > 0) {
|
||||
throw new Error(
|
||||
`Missing required AI prompts: ${missing.join(', ')}. ` +
|
||||
`Please add these prompts in Settings > AI Validation Prompts.`
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
// Core loader
|
||||
loadPromptByType,
|
||||
loadPromptsByType,
|
||||
|
||||
// Task-specific loaders
|
||||
loadNameValidationPrompts,
|
||||
loadDescriptionValidationPrompts,
|
||||
loadSanityCheckPrompts,
|
||||
loadBulkValidationPrompts,
|
||||
loadBulkValidationPromptsForCompanies,
|
||||
|
||||
// Validation
|
||||
validateRequiredPrompts
|
||||
};
|
||||
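A short usage sketch for the loader, assuming a configured `pg` pool and an `ai_prompts` table populated as described above; the connection string and company key are placeholders.

```javascript
// Hypothetical wiring of the prompt loader.
const { Pool } = require('pg');
const {
  loadNameValidationPrompts,
  validateRequiredPrompts
} = require('./promptLoader');

async function main() {
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });

  // Loads name_validation_system, name_validation_general and, because a
  // company key is passed, name_validation_company_specific for that company.
  const prompts = await loadNameValidationPrompts(pool, 'acme-co');

  // Throws if the system or general prompt rows are missing.
  validateRequiredPrompts(prompts, 'name_validation');

  console.log('system prompt length:', prompts.system.length);
  await pool.end();
}

main().catch(err => console.error(err));
```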
128
inventory-server/src/services/ai/prompts/sanityCheckPrompts.js
Normal file
@@ -0,0 +1,128 @@
|
||||
/**
|
||||
* Sanity Check Prompts
|
||||
*
|
||||
* Functions for building and parsing batch product consistency validation prompts.
|
||||
* System and general prompts are loaded from the database.
|
||||
*/
|
||||
|
||||
/**
|
||||
* Build the user prompt for sanity check
|
||||
* Combines database prompts with product data
|
||||
*
|
||||
* @param {Object[]} products - Array of product data (limited fields for context)
|
||||
* @param {Object} prompts - Prompts loaded from database
|
||||
* @param {string} prompts.general - General sanity check rules
|
||||
* @returns {string} Complete user prompt
|
||||
*/
|
||||
function buildSanityCheckUserPrompt(products, prompts) {
|
||||
// Build a simplified product list for the prompt
|
||||
const productSummaries = products.map((p, index) => ({
|
||||
index,
|
||||
name: p.name,
|
||||
supplier: p.supplier_name || p.supplier,
|
||||
company: p.company_name || p.company,
|
||||
supplier_no: p.supplier_no,
|
||||
msrp: p.msrp,
|
||||
cost_each: p.cost_each,
|
||||
qty_per_unit: p.qty_per_unit,
|
||||
case_qty: p.case_qty,
|
||||
tax_cat: p.tax_cat_name || p.tax_cat,
|
||||
size_cat: p.size_cat_name || p.size_cat,
|
||||
themes: p.theme_names || p.themes,
|
||||
categories: p.category_names || p.categories,
|
||||
weight: p.weight,
|
||||
length: p.length,
|
||||
width: p.width,
|
||||
height: p.height
|
||||
}));
|
||||
|
||||
const parts = [];
|
||||
|
||||
// Add general prompt/rules if provided
|
||||
if (prompts.general) {
|
||||
parts.push(prompts.general);
|
||||
parts.push(''); // Empty line for separation
|
||||
}
|
||||
|
||||
// Add products to review
|
||||
parts.push(`PRODUCTS TO REVIEW (${products.length} items):`);
|
||||
parts.push(JSON.stringify(productSummaries, null, 2));
|
||||
|
||||
// Add response format
|
||||
parts.push('');
|
||||
parts.push('RESPOND WITH JSON:');
|
||||
parts.push(JSON.stringify({
|
||||
issues: [
|
||||
{
|
||||
productIndex: 0,
|
||||
field: 'msrp',
|
||||
issue: 'Description of the issue found',
|
||||
suggestion: 'Suggested fix or verification (optional)'
|
||||
}
|
||||
],
|
||||
summary: '2-3 sentences summarizing the overall product quality'
|
||||
}, null, 2));
|
||||
|
||||
parts.push('');
|
||||
parts.push('If no issues are found, return an empty issues array with a positive summary.');
|
||||
|
||||
return parts.join('\n');
|
||||
}
|
||||
|
||||
/**
|
||||
* Parse the AI response for sanity check
|
||||
*
|
||||
* @param {Object|null} parsed - Parsed JSON from AI
|
||||
* @param {string} content - Raw response content
|
||||
* @returns {Object}
|
||||
*/
|
||||
function parseSanityCheckResponse(parsed, content) {
|
||||
// If we got valid parsed JSON, use it
|
||||
if (parsed && Array.isArray(parsed.issues)) {
|
||||
return {
|
||||
issues: parsed.issues.map(issue => ({
|
||||
productIndex: issue.productIndex ?? issue.index ?? 0,
|
||||
field: issue.field || 'unknown',
|
||||
issue: issue.issue || issue.message || '',
|
||||
suggestion: issue.suggestion || null
|
||||
})),
|
||||
summary: parsed.summary || 'Review complete'
|
||||
};
|
||||
}
|
||||
|
||||
// Try to extract from content if parsing failed
|
||||
try {
|
||||
// Try to find issues array
|
||||
const issuesMatch = content.match(/"issues"\s*:\s*\[([\s\S]*?)\]/);
|
||||
let issues = [];
|
||||
|
||||
if (issuesMatch) {
|
||||
// Try to parse the array content
|
||||
try {
|
||||
const arrayContent = `[${issuesMatch[1]}]`;
|
||||
const parsedIssues = JSON.parse(arrayContent);
|
||||
issues = parsedIssues.map(issue => ({
|
||||
productIndex: issue.productIndex ?? issue.index ?? 0,
|
||||
field: issue.field || 'unknown',
|
||||
issue: issue.issue || issue.message || '',
|
||||
suggestion: issue.suggestion || null
|
||||
}));
|
||||
} catch {
|
||||
// Couldn't parse the array
|
||||
}
|
||||
}
|
||||
|
||||
// Try to find summary
|
||||
const summaryMatch = content.match(/"summary"\s*:\s*"([^"]+)"/);
|
||||
const summary = summaryMatch ? summaryMatch[1] : 'Review complete';
|
||||
|
||||
return { issues, summary };
|
||||
} catch {
|
||||
return { issues: [], summary: 'Could not parse review results' };
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
buildSanityCheckUserPrompt,
|
||||
parseSanityCheckResponse
|
||||
};
|
||||
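As a rough illustration of the round trip (the prompt text, product row, and AI reply are placeholders):

```javascript
const {
  buildSanityCheckUserPrompt,
  parseSanityCheckResponse
} = require('./sanityCheckPrompts');

const products = [
  { name: 'Widget 12pk', msrp: 4.99, cost_each: 19.99, qty_per_unit: 12 } // placeholder row
];
const prompts = { general: 'Flag cost/MSRP inversions and missing dimensions.' };

const userPrompt = buildSanityCheckUserPrompt(products, prompts);

// The parser tolerates both well-formed JSON and raw-text fallbacks.
const review = parseSanityCheckResponse(
  {
    issues: [{ productIndex: 0, field: 'msrp', issue: 'MSRP is below cost' }],
    summary: 'One pricing issue found.'
  },
  ''
);
console.log(review.issues.length); // 1
```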
203
inventory-server/src/services/ai/providers/groqProvider.js
Normal file
@@ -0,0 +1,203 @@
|
||||
/**
|
||||
* Groq Provider - Handles chat completions via Groq's OpenAI-compatible API
|
||||
*
|
||||
* Uses Groq's fast inference for real-time AI validation tasks.
|
||||
* Supports models like openai/gpt-oss-120b (complex) and openai/gpt-oss-20b (simple).
|
||||
*/
|
||||
|
||||
const GROQ_BASE_URL = 'https://api.groq.com/openai/v1';
|
||||
|
||||
// Default models
|
||||
const MODELS = {
|
||||
LARGE: 'openai/gpt-oss-120b', // For complex tasks (descriptions, sanity checks)
|
||||
SMALL: 'openai/gpt-oss-20b' // For simple tasks (name validation)
|
||||
};
|
||||
|
||||
class GroqProvider {
|
||||
/**
|
||||
* @param {Object} options
|
||||
* @param {string} options.apiKey - Groq API key
|
||||
* @param {string} [options.baseUrl] - Override base URL
|
||||
* @param {number} [options.timeoutMs=30000] - Default timeout
|
||||
*/
|
||||
constructor({ apiKey, baseUrl = GROQ_BASE_URL, timeoutMs = 30000 }) {
|
||||
if (!apiKey) {
|
||||
throw new Error('Groq API key is required');
|
||||
}
|
||||
this.apiKey = apiKey;
|
||||
this.baseUrl = baseUrl;
|
||||
this.timeoutMs = timeoutMs;
|
||||
}
|
||||
|
||||
/**
|
||||
* Send a chat completion request
|
||||
*
|
||||
* @param {Object} params
|
||||
* @param {Array<{role: string, content: string}>} params.messages - Conversation messages
|
||||
* @param {string} [params.model] - Model to use (defaults to LARGE)
|
||||
* @param {number} [params.temperature=0.3] - Response randomness (0-2)
|
||||
* @param {number} [params.maxTokens=500] - Max tokens in response
|
||||
* @param {Object} [params.responseFormat] - For JSON mode: { type: 'json_object' }
|
||||
* @param {number} [params.timeoutMs] - Request timeout override
|
||||
* @returns {Promise<{content: string, parsed: Object|null, usage: Object, latencyMs: number, model: string}>}
|
||||
*/
|
||||
async chatCompletion({
|
||||
messages,
|
||||
model = MODELS.LARGE,
|
||||
temperature = 0.3,
|
||||
maxTokens = 500,
|
||||
responseFormat = null,
|
||||
timeoutMs = this.timeoutMs
|
||||
}) {
|
||||
const started = Date.now();
|
||||
|
||||
const body = {
|
||||
model,
|
||||
messages,
|
||||
temperature,
|
||||
max_completion_tokens: maxTokens
|
||||
};
|
||||
|
||||
// Enable JSON mode if requested
|
||||
if (responseFormat?.type === 'json_object') {
|
||||
body.response_format = { type: 'json_object' };
|
||||
}
|
||||
|
||||
// Debug: Log request being sent
|
||||
console.log('[Groq] Request:', {
|
||||
model: body.model,
|
||||
temperature: body.temperature,
|
||||
maxTokens: body.max_completion_tokens,
|
||||
hasResponseFormat: !!body.response_format,
|
||||
messageCount: body.messages?.length,
|
||||
systemPromptLength: body.messages?.[0]?.content?.length,
|
||||
userPromptLength: body.messages?.[1]?.content?.length
|
||||
});
|
||||
|
||||
const response = await this._makeRequest('chat/completions', body, timeoutMs);
|
||||
|
||||
// Debug: Log raw response structure
|
||||
console.log('[Groq] Raw response:', {
|
||||
hasChoices: !!response.choices,
|
||||
choicesLength: response.choices?.length,
|
||||
firstChoice: response.choices?.[0] ? {
|
||||
finishReason: response.choices[0].finish_reason,
|
||||
hasMessage: !!response.choices[0].message,
|
||||
contentLength: response.choices[0].message?.content?.length,
|
||||
contentPreview: response.choices[0].message?.content?.substring(0, 200)
|
||||
} : null,
|
||||
usage: response.usage,
|
||||
model: response.model
|
||||
});
|
||||
|
||||
const content = response.choices?.[0]?.message?.content || '';
|
||||
const usage = response.usage || {};
|
||||
|
||||
// Attempt to parse JSON if response format was requested
|
||||
let parsed = null;
|
||||
if (responseFormat && content) {
|
||||
try {
|
||||
parsed = JSON.parse(content);
|
||||
} catch {
|
||||
// Content isn't valid JSON - try to extract JSON from markdown
|
||||
parsed = this._extractJson(content);
|
||||
}
|
||||
}
|
||||
|
||||
return {
|
||||
content,
|
||||
parsed,
|
||||
usage: {
|
||||
promptTokens: usage.prompt_tokens || 0,
|
||||
completionTokens: usage.completion_tokens || 0,
|
||||
totalTokens: usage.total_tokens || 0
|
||||
},
|
||||
latencyMs: Date.now() - started,
|
||||
model: response.model || model
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract JSON from content that might be wrapped in markdown code blocks
|
||||
* @private
|
||||
*/
|
||||
_extractJson(content) {
|
||||
// Try to find JSON in code blocks
|
||||
const codeBlockMatch = content.match(/```(?:json)?\s*([\s\S]*?)```/);
|
||||
if (codeBlockMatch) {
|
||||
try {
|
||||
return JSON.parse(codeBlockMatch[1].trim());
|
||||
} catch {
|
||||
// Fall through
|
||||
}
|
||||
}
|
||||
|
||||
// Try to find JSON object/array directly
|
||||
const jsonMatch = content.match(/(\{[\s\S]*\}|\[[\s\S]*\])/);
|
||||
if (jsonMatch) {
|
||||
try {
|
||||
return JSON.parse(jsonMatch[1]);
|
||||
} catch {
|
||||
// Fall through
|
||||
}
|
||||
}
|
||||
|
||||
return null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Make an HTTP request to Groq API
|
||||
* @private
|
||||
*/
|
||||
async _makeRequest(endpoint, body, timeoutMs) {
|
||||
const controller = new AbortController();
|
||||
const timeout = setTimeout(() => controller.abort(), timeoutMs);
|
||||
|
||||
try {
|
||||
const response = await fetch(`${this.baseUrl}/${endpoint}`, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'Authorization': `Bearer ${this.apiKey}`
|
||||
},
|
||||
body: JSON.stringify(body),
|
||||
signal: controller.signal
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const error = await response.json().catch(() => ({}));
|
||||
const message = error.error?.message || `Groq API error: ${response.status}`;
|
||||
const err = new Error(message);
|
||||
err.status = response.status;
|
||||
err.code = error.error?.code;
|
||||
// Include failed_generation if available (for JSON mode failures)
|
||||
if (error.error?.failed_generation) {
|
||||
err.failedGeneration = error.error.failed_generation;
|
||||
console.error('[Groq] JSON validation failed. Model output:', error.error.failed_generation);
|
||||
}
|
||||
throw err;
|
||||
}
|
||||
|
||||
return response.json();
|
||||
} catch (error) {
|
||||
if (error.name === 'AbortError') {
|
||||
const err = new Error(`Groq request timed out after ${timeoutMs}ms`);
|
||||
err.code = 'TIMEOUT';
|
||||
throw err;
|
||||
}
|
||||
throw error;
|
||||
} finally {
|
||||
clearTimeout(timeout);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if the provider is properly configured
|
||||
* @returns {boolean}
|
||||
*/
|
||||
isConfigured() {
|
||||
return !!this.apiKey;
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = { GroqProvider, MODELS, GROQ_BASE_URL };
|
||||
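A minimal sketch of calling the provider in JSON mode; the API key comes from the environment and the prompt text is illustrative.

```javascript
const { GroqProvider, MODELS } = require('./groqProvider');

async function main() {
  const provider = new GroqProvider({ apiKey: process.env.GROQ_API_KEY });

  const { parsed, content, usage, latencyMs } = await provider.chatCompletion({
    messages: [
      { role: 'system', content: 'You are a strict JSON-only validator.' },
      { role: 'user', content: 'Return {"ok": true} as JSON.' }
    ],
    model: MODELS.SMALL,
    temperature: 0.2,
    maxTokens: 200,
    responseFormat: { type: 'json_object' }
  });

  // `parsed` is the JSON-decoded body when JSON mode succeeds; otherwise it is
  // null and `content` still carries the raw text.
  console.log({ parsed, contentLength: content.length, usage, latencyMs });
}

main().catch(err => console.error(err));
```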
117
inventory-server/src/services/ai/providers/openaiProvider.js
Normal file
@@ -0,0 +1,117 @@
|
||||
/**
|
||||
* OpenAI Provider - Handles embedding generation
|
||||
*/
|
||||
|
||||
const EMBEDDING_MODEL = 'text-embedding-3-small';
|
||||
const EMBEDDING_DIMENSIONS = 1536;
|
||||
const MAX_BATCH_SIZE = 2048;
|
||||
|
||||
class OpenAIProvider {
|
||||
constructor({ apiKey, baseUrl = 'https://api.openai.com/v1', timeoutMs = 60000 }) {
|
||||
if (!apiKey) {
|
||||
throw new Error('OpenAI API key is required');
|
||||
}
|
||||
this.apiKey = apiKey;
|
||||
this.baseUrl = baseUrl;
|
||||
this.timeoutMs = timeoutMs;
|
||||
}
|
||||
|
||||
/**
|
||||
* Generate embeddings for one or more texts
|
||||
* @param {string|string[]} input - Text or array of texts
|
||||
* @param {Object} options
|
||||
* @returns {Promise<{embeddings: number[][], usage: Object, model: string, latencyMs: number}>}
|
||||
*/
|
||||
async embed(input, options = {}) {
|
||||
const texts = Array.isArray(input) ? input : [input];
|
||||
const model = options.model || EMBEDDING_MODEL;
|
||||
const dimensions = options.dimensions || EMBEDDING_DIMENSIONS;
|
||||
const timeoutMs = options.timeoutMs || this.timeoutMs;
|
||||
|
||||
if (texts.length > MAX_BATCH_SIZE) {
|
||||
throw new Error(`Batch size ${texts.length} exceeds max of ${MAX_BATCH_SIZE}`);
|
||||
}
|
||||
|
||||
const started = Date.now();
|
||||
|
||||
// Clean and truncate input texts
|
||||
const cleanedTexts = texts.map(t =>
|
||||
(t || '').replace(/\n+/g, ' ').trim().substring(0, 8000)
|
||||
);
|
||||
|
||||
const body = {
|
||||
input: cleanedTexts,
|
||||
model,
|
||||
encoding_format: 'float'
|
||||
};
|
||||
|
||||
// Only embedding-3 models support dimensions parameter
|
||||
if (model.includes('embedding-3')) {
|
||||
body.dimensions = dimensions;
|
||||
}
|
||||
|
||||
const response = await this._makeRequest('embeddings', body, timeoutMs);
|
||||
|
||||
// Sort by index to ensure order matches input
|
||||
const sortedData = response.data.sort((a, b) => a.index - b.index);
|
||||
|
||||
return {
|
||||
embeddings: sortedData.map(item => item.embedding),
|
||||
usage: {
|
||||
promptTokens: response.usage?.prompt_tokens || 0,
|
||||
totalTokens: response.usage?.total_tokens || 0
|
||||
},
|
||||
model: response.model || model,
|
||||
latencyMs: Date.now() - started
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Generator for processing large batches in chunks
|
||||
*/
|
||||
async *embedBatchChunked(texts, options = {}) {
|
||||
const batchSize = Math.min(options.batchSize || 100, MAX_BATCH_SIZE);
|
||||
|
||||
for (let i = 0; i < texts.length; i += batchSize) {
|
||||
const chunk = texts.slice(i, i + batchSize);
|
||||
const result = await this.embed(chunk, options);
|
||||
|
||||
yield {
|
||||
embeddings: result.embeddings,
|
||||
startIndex: i,
|
||||
endIndex: i + chunk.length,
|
||||
usage: result.usage,
|
||||
model: result.model,
|
||||
latencyMs: result.latencyMs
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
async _makeRequest(endpoint, body, timeoutMs) {
|
||||
const controller = new AbortController();
|
||||
const timeout = setTimeout(() => controller.abort(), timeoutMs);
|
||||
|
||||
try {
|
||||
const response = await fetch(`${this.baseUrl}/${endpoint}`, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'Authorization': `Bearer ${this.apiKey}`
|
||||
},
|
||||
body: JSON.stringify(body),
|
||||
signal: controller.signal
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const error = await response.json().catch(() => ({}));
|
||||
throw new Error(error.error?.message || `OpenAI API error: ${response.status}`);
|
||||
}
|
||||
|
||||
return response.json();
|
||||
} finally {
|
||||
clearTimeout(timeout);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = { OpenAIProvider, EMBEDDING_MODEL, EMBEDDING_DIMENSIONS };
|
||||
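A brief sketch of generating embeddings, both in a single call and through the chunked generator; the key and batch size are placeholders.

```javascript
const { OpenAIProvider } = require('./openaiProvider');

async function main() {
  const provider = new OpenAIProvider({ apiKey: process.env.OPENAI_API_KEY });

  // Single call: the order of embeddings matches the input order.
  const single = await provider.embed(['red ceramic mug', 'blue ceramic mug']);
  console.log(single.embeddings.length, single.embeddings[0].length); // 2 1536

  // Chunked generator for large batches.
  const texts = Array.from({ length: 250 }, (_, i) => `product ${i}`);
  for await (const chunk of provider.embedBatchChunked(texts, { batchSize: 100 })) {
    console.log(`embedded ${chunk.startIndex}-${chunk.endIndex}`, chunk.usage.totalTokens);
  }
}

main().catch(err => console.error(err));
```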
158
inventory-server/src/services/ai/tasks/descriptionValidationTask.js
Normal file
@@ -0,0 +1,158 @@
|
||||
/**
|
||||
* Description Validation Task
|
||||
*
|
||||
* Validates a product description for quality, accuracy, and guideline compliance.
|
||||
* Uses Groq with the larger model for better reasoning about content quality.
|
||||
* Loads all prompts from the database (no hardcoded prompts).
|
||||
*/
|
||||
|
||||
const { MODELS } = require('../providers/groqProvider');
|
||||
const {
|
||||
loadDescriptionValidationPrompts,
|
||||
validateRequiredPrompts
|
||||
} = require('../prompts/promptLoader');
|
||||
const {
|
||||
buildDescriptionUserPrompt,
|
||||
parseDescriptionResponse
|
||||
} = require('../prompts/descriptionPrompts');
|
||||
|
||||
const TASK_ID = 'validate.description';
|
||||
|
||||
/**
|
||||
* Create the description validation task
|
||||
*
|
||||
* @returns {Object} Task definition
|
||||
*/
|
||||
function createDescriptionValidationTask() {
|
||||
return {
|
||||
id: TASK_ID,
|
||||
description: 'Validate product description for quality and guideline compliance',
|
||||
|
||||
/**
|
||||
* Run the description validation
|
||||
*
|
||||
* @param {Object} payload
|
||||
* @param {Object} payload.product - Product data
|
||||
* @param {string} payload.product.name - Product name (for context)
|
||||
* @param {string} payload.product.description - Description to validate
|
||||
* @param {string} [payload.product.company_name] - Company name
|
||||
* @param {string} [payload.product.company_id] - Company ID for loading specific rules
|
||||
* @param {string} [payload.product.categories] - Product categories
|
||||
* @param {Object} payload.provider - Groq provider instance
|
||||
* @param {Object} payload.pool - PostgreSQL pool
|
||||
* @param {Object} [payload.logger] - Logger instance
|
||||
* @returns {Promise<Object>}
|
||||
*/
|
||||
async run(payload) {
|
||||
const { product, provider, pool, logger } = payload;
|
||||
const log = logger || console;
|
||||
|
||||
// Validate required input
|
||||
if (!product?.name && !product?.description) {
|
||||
return {
|
||||
isValid: true,
|
||||
suggestion: null,
|
||||
issues: [],
|
||||
skipped: true,
|
||||
reason: 'No name or description provided'
|
||||
};
|
||||
}
|
||||
|
||||
if (!provider) {
|
||||
throw new Error('Groq provider not available');
|
||||
}
|
||||
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not available');
|
||||
}
|
||||
|
||||
try {
|
||||
// Load prompts from database
|
||||
const companyKey = product.company_id || product.company_name || product.company;
|
||||
const prompts = await loadDescriptionValidationPrompts(pool, companyKey);
|
||||
|
||||
// Validate required prompts exist
|
||||
validateRequiredPrompts(prompts, 'description_validation');
|
||||
|
||||
// Build the user prompt with database-loaded prompts
|
||||
const userPrompt = buildDescriptionUserPrompt(product, prompts);
|
||||
|
||||
let response;
|
||||
let result;
|
||||
|
||||
try {
|
||||
// Try with JSON mode first
|
||||
response = await provider.chatCompletion({
|
||||
messages: [
|
||||
{ role: 'system', content: prompts.system },
|
||||
{ role: 'user', content: userPrompt }
|
||||
],
|
||||
model: MODELS.LARGE, // openai/gpt-oss-120b - better for content analysis
|
||||
temperature: 0.3, // Slightly higher for creative suggestions
|
||||
maxTokens: 2000, // Reasoning models need extra tokens for thinking
|
||||
responseFormat: { type: 'json_object' }
|
||||
});
|
||||
|
||||
// Log full raw response for debugging
|
||||
log.info('[DescriptionValidation] Raw AI response:', {
|
||||
parsed: response.parsed,
|
||||
content: response.content,
|
||||
contentLength: response.content?.length
|
||||
});
|
||||
|
||||
// Parse the response
|
||||
result = parseDescriptionResponse(response.parsed, response.content);
|
||||
} catch (jsonError) {
|
||||
// If JSON mode failed, check if we have failedGeneration to parse
|
||||
if (jsonError.failedGeneration) {
|
||||
log.warn('[DescriptionValidation] JSON mode failed, attempting to parse failed_generation:', {
|
||||
failedGeneration: jsonError.failedGeneration
|
||||
});
|
||||
result = parseDescriptionResponse(null, jsonError.failedGeneration);
|
||||
response = { latencyMs: 0, usage: {}, model: MODELS.LARGE };
|
||||
} else {
|
||||
// Retry without JSON mode
|
||||
log.warn('[DescriptionValidation] JSON mode failed, retrying without JSON mode');
|
||||
response = await provider.chatCompletion({
|
||||
messages: [
|
||||
{ role: 'system', content: prompts.system },
|
||||
{ role: 'user', content: userPrompt }
|
||||
],
|
||||
model: MODELS.LARGE,
|
||||
temperature: 0.3,
|
||||
maxTokens: 2000 // Reasoning models need extra tokens for thinking
|
||||
// No responseFormat - let the model respond freely
|
||||
});
|
||||
log.info('[DescriptionValidation] Raw AI response (no JSON mode):', {
|
||||
parsed: response.parsed,
|
||||
content: response.content,
|
||||
contentLength: response.content?.length
|
||||
});
|
||||
result = parseDescriptionResponse(response.parsed, response.content);
|
||||
}
|
||||
}
|
||||
|
||||
log.info(`[DescriptionValidation] Validated description for "${product.name}" in ${response.latencyMs}ms`, {
|
||||
isValid: result.isValid,
|
||||
hasSuggestion: !!result.suggestion,
|
||||
issueCount: result.issues.length
|
||||
});
|
||||
|
||||
return {
|
||||
...result,
|
||||
latencyMs: response.latencyMs,
|
||||
usage: response.usage,
|
||||
model: response.model
|
||||
};
|
||||
} catch (error) {
|
||||
log.error('[DescriptionValidation] Error:', error.message);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
TASK_ID,
|
||||
createDescriptionValidationTask
|
||||
};
|
||||
186
inventory-server/src/services/ai/tasks/index.js
Normal file
@@ -0,0 +1,186 @@
|
||||
/**
|
||||
* AI Task Registry
|
||||
*
|
||||
* Simple registry pattern for AI tasks. Each task has:
|
||||
* - id: Unique identifier
|
||||
* - run: Async function that executes the task
|
||||
*
|
||||
* This allows adding new AI capabilities without modifying core code.
|
||||
*/
|
||||
|
||||
const { createNameValidationTask, TASK_ID: NAME_TASK_ID } = require('./nameValidationTask');
|
||||
const { createDescriptionValidationTask, TASK_ID: DESC_TASK_ID } = require('./descriptionValidationTask');
|
||||
const { createSanityCheckTask, TASK_ID: SANITY_TASK_ID } = require('./sanityCheckTask');
|
||||
|
||||
/**
|
||||
* Task IDs - frozen constants for type safety
|
||||
*/
|
||||
const TASK_IDS = Object.freeze({
|
||||
// Inline validation (triggered on field blur)
|
||||
VALIDATE_NAME: NAME_TASK_ID,
|
||||
VALIDATE_DESCRIPTION: DESC_TASK_ID,
|
||||
|
||||
// Batch operations (triggered on user action)
|
||||
SANITY_CHECK: SANITY_TASK_ID
|
||||
});
|
||||
|
||||
/**
|
||||
* Task Registry
|
||||
*/
|
||||
class TaskRegistry {
|
||||
constructor() {
|
||||
this.tasks = new Map();
|
||||
}
|
||||
|
||||
/**
|
||||
* Register a task
|
||||
* @param {Object} task
|
||||
* @param {string} task.id - Unique task identifier
|
||||
* @param {Function} task.run - Async function: (payload) => result
|
||||
* @param {string} [task.description] - Human-readable description
|
||||
*/
|
||||
register(task) {
|
||||
if (!task?.id) {
|
||||
throw new Error('Task must have an id');
|
||||
}
|
||||
if (typeof task.run !== 'function') {
|
||||
throw new Error(`Task ${task.id} must have a run function`);
|
||||
}
|
||||
if (this.tasks.has(task.id)) {
|
||||
throw new Error(`Task ${task.id} is already registered`);
|
||||
}
|
||||
|
||||
this.tasks.set(task.id, task);
|
||||
return this;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get a task by ID
|
||||
* @param {string} taskId
|
||||
* @returns {Object|null}
|
||||
*/
|
||||
get(taskId) {
|
||||
return this.tasks.get(taskId) || null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if a task exists
|
||||
* @param {string} taskId
|
||||
* @returns {boolean}
|
||||
*/
|
||||
has(taskId) {
|
||||
return this.tasks.has(taskId);
|
||||
}
|
||||
|
||||
/**
|
||||
* Run a task by ID
|
||||
* @param {string} taskId
|
||||
* @param {Object} payload - Task-specific input
|
||||
* @returns {Promise<Object>} Task result
|
||||
*/
|
||||
async runTask(taskId, payload = {}) {
|
||||
const task = this.get(taskId);
|
||||
if (!task) {
|
||||
throw new Error(`Unknown task: ${taskId}`);
|
||||
}
|
||||
|
||||
try {
|
||||
const result = await task.run(payload);
|
||||
return {
|
||||
success: true,
|
||||
taskId,
|
||||
...result
|
||||
};
|
||||
} catch (error) {
|
||||
return {
|
||||
success: false,
|
||||
taskId,
|
||||
error: error.message,
|
||||
code: error.code
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* List all registered task IDs
|
||||
* @returns {string[]}
|
||||
*/
|
||||
list() {
|
||||
return Array.from(this.tasks.keys());
|
||||
}
|
||||
|
||||
/**
|
||||
* Get count of registered tasks
|
||||
* @returns {number}
|
||||
*/
|
||||
size() {
|
||||
return this.tasks.size;
|
||||
}
|
||||
}
|
||||
|
||||
// Singleton instance
|
||||
let registry = null;
|
||||
|
||||
/**
|
||||
* Get or create the task registry
|
||||
* @returns {TaskRegistry}
|
||||
*/
|
||||
function getRegistry() {
|
||||
if (!registry) {
|
||||
registry = new TaskRegistry();
|
||||
}
|
||||
return registry;
|
||||
}
|
||||
|
||||
/**
|
||||
* Reset the registry (mainly for testing)
|
||||
*/
|
||||
function resetRegistry() {
|
||||
registry = null;
|
||||
}
|
||||
|
||||
/**
|
||||
* Register all validation tasks with the registry
|
||||
* Call this during initialization after the registry is created
|
||||
*
|
||||
* @param {Object} [logger] - Optional logger
|
||||
*/
|
||||
function registerAllTasks(logger = console) {
|
||||
const reg = getRegistry();
|
||||
|
||||
// Register name validation
|
||||
if (!reg.has(TASK_IDS.VALIDATE_NAME)) {
|
||||
reg.register(createNameValidationTask());
|
||||
logger.info(`[Tasks] Registered: ${TASK_IDS.VALIDATE_NAME}`);
|
||||
}
|
||||
|
||||
// Register description validation
|
||||
if (!reg.has(TASK_IDS.VALIDATE_DESCRIPTION)) {
|
||||
reg.register(createDescriptionValidationTask());
|
||||
logger.info(`[Tasks] Registered: ${TASK_IDS.VALIDATE_DESCRIPTION}`);
|
||||
}
|
||||
|
||||
// Register sanity check
|
||||
if (!reg.has(TASK_IDS.SANITY_CHECK)) {
|
||||
reg.register(createSanityCheckTask());
|
||||
logger.info(`[Tasks] Registered: ${TASK_IDS.SANITY_CHECK}`);
|
||||
}
|
||||
|
||||
return reg;
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
// Constants
|
||||
TASK_IDS,
|
||||
|
||||
// Registry
|
||||
TaskRegistry,
|
||||
getRegistry,
|
||||
resetRegistry,
|
||||
registerAllTasks,
|
||||
|
||||
// Task factories (for custom registration)
|
||||
createNameValidationTask,
|
||||
createDescriptionValidationTask,
|
||||
createSanityCheckTask
|
||||
};
|
||||
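How the registry is expected to be used at startup and per request; the provider, pool, and product values below are placeholders.

```javascript
const { registerAllTasks, getRegistry, TASK_IDS } = require('./index');
const { GroqProvider } = require('../providers/groqProvider');
const { Pool } = require('pg');

async function main() {
  // Once, during server initialization.
  registerAllTasks(console);

  const provider = new GroqProvider({ apiKey: process.env.GROQ_API_KEY });
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });

  // Per request: runTask never throws; failures come back as { success: false, error }.
  const result = await getRegistry().runTask(TASK_IDS.VALIDATE_NAME, {
    product: { name: 'Widget Deluxe 12pk', company_name: 'Acme Co' }, // placeholder product
    provider,
    pool
  });

  if (result.success) {
    console.log(result.isValid, result.issues);
  } else {
    console.error(result.error);
  }
  await pool.end();
}

main().catch(err => console.error(err));
```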
172
inventory-server/src/services/ai/tasks/nameValidationTask.js
Normal file
@@ -0,0 +1,172 @@
|
||||
/**
|
||||
* Name Validation Task
|
||||
*
|
||||
* Validates a product name for spelling, grammar, and naming conventions.
|
||||
* Uses Groq with the smaller model for fast response times.
|
||||
* Loads all prompts from the database (no hardcoded prompts).
|
||||
*/
|
||||
|
||||
const { MODELS } = require('../providers/groqProvider');
|
||||
const {
|
||||
loadNameValidationPrompts,
|
||||
validateRequiredPrompts
|
||||
} = require('../prompts/promptLoader');
|
||||
const {
|
||||
buildNameUserPrompt,
|
||||
parseNameResponse
|
||||
} = require('../prompts/namePrompts');
|
||||
|
||||
const TASK_ID = 'validate.name';
|
||||
|
||||
/**
|
||||
* Create the name validation task
|
||||
*
|
||||
* @returns {Object} Task definition
|
||||
*/
|
||||
function createNameValidationTask() {
|
||||
return {
|
||||
id: TASK_ID,
|
||||
description: 'Validate product name for spelling, grammar, and conventions',
|
||||
|
||||
/**
|
||||
* Run the name validation
|
||||
*
|
||||
* @param {Object} payload
|
||||
* @param {Object} payload.product - Product data
|
||||
* @param {string} payload.product.name - Product name to validate
|
||||
* @param {string} [payload.product.company_name] - Company name
|
||||
* @param {string} [payload.product.company_id] - Company ID for loading specific rules
|
||||
* @param {string} [payload.product.line_name] - Product line
|
||||
* @param {string} [payload.product.description] - Description for context
|
||||
* @param {Object} payload.provider - Groq provider instance
|
||||
* @param {Object} payload.pool - PostgreSQL pool
|
||||
* @param {Object} [payload.logger] - Logger instance
|
||||
* @returns {Promise<Object>}
|
||||
*/
|
||||
async run(payload) {
|
||||
const { product, provider, pool, logger } = payload;
|
||||
const log = logger || console;
|
||||
|
||||
// Validate required input
|
||||
if (!product?.name) {
|
||||
return {
|
||||
isValid: true,
|
||||
suggestion: null,
|
||||
issues: [],
|
||||
skipped: true,
|
||||
reason: 'No name provided'
|
||||
};
|
||||
}
|
||||
|
||||
if (!provider) {
|
||||
throw new Error('Groq provider not available');
|
||||
}
|
||||
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not available');
|
||||
}
|
||||
|
||||
try {
|
||||
// Load prompts from database
|
||||
const companyKey = product.company_id || product.company_name || product.company;
|
||||
const prompts = await loadNameValidationPrompts(pool, companyKey);
|
||||
|
||||
// Debug: Log loaded prompts
|
||||
log.info('[NameValidation] Loaded prompts:', {
|
||||
hasSystem: !!prompts.system,
|
||||
systemLength: prompts.system?.length || 0,
|
||||
hasGeneral: !!prompts.general,
|
||||
generalLength: prompts.general?.length || 0,
|
||||
generalPreview: prompts.general?.substring(0, 100) || '(empty)',
|
||||
hasCompanySpecific: !!prompts.companySpecific,
|
||||
companyKey
|
||||
});
|
||||
|
||||
// Validate required prompts exist
|
||||
validateRequiredPrompts(prompts, 'name_validation');
|
||||
|
||||
// Build the user prompt with database-loaded prompts
|
||||
const userPrompt = buildNameUserPrompt(product, prompts);
|
||||
|
||||
// Debug: Log the full user prompt being sent
|
||||
log.info('[NameValidation] User prompt:', userPrompt.substring(0, 500));
|
||||
|
||||
let response;
|
||||
let result;
|
||||
|
||||
try {
|
||||
// Try with JSON mode first
|
||||
response = await provider.chatCompletion({
|
||||
messages: [
|
||||
{ role: 'system', content: prompts.system },
|
||||
{ role: 'user', content: userPrompt }
|
||||
],
|
||||
model: MODELS.LARGE, // openai/gpt-oss-120b - reasoning model
|
||||
temperature: 0.2, // Low temperature for consistent results
|
||||
maxTokens: 3000, // Reasoning models need extra tokens for thinking
|
||||
responseFormat: { type: 'json_object' }
|
||||
});
|
||||
|
||||
// Log full raw response for debugging
|
||||
log.info('[NameValidation] Raw AI response:', {
|
||||
parsed: response.parsed,
|
||||
content: response.content,
|
||||
contentLength: response.content?.length
|
||||
});
|
||||
|
||||
// Parse the response
|
||||
result = parseNameResponse(response.parsed, response.content);
|
||||
} catch (jsonError) {
|
||||
// If JSON mode failed, check if we have failedGeneration to parse
|
||||
if (jsonError.failedGeneration) {
|
||||
log.warn('[NameValidation] JSON mode failed, attempting to parse failed_generation:', {
|
||||
failedGeneration: jsonError.failedGeneration
|
||||
});
|
||||
result = parseNameResponse(null, jsonError.failedGeneration);
|
||||
response = { latencyMs: 0, usage: {}, model: MODELS.SMALL };
|
||||
} else {
|
||||
// Retry without JSON mode
|
||||
log.warn('[NameValidation] JSON mode failed, retrying without JSON mode');
|
||||
response = await provider.chatCompletion({
|
||||
messages: [
|
||||
{ role: 'system', content: prompts.system },
|
||||
{ role: 'user', content: userPrompt }
|
||||
],
|
||||
model: MODELS.SMALL,
|
||||
temperature: 0.2,
|
||||
maxTokens: 1500 // Reasoning models need extra tokens for thinking
|
||||
// No responseFormat - let the model respond freely
|
||||
});
|
||||
log.info('[NameValidation] Raw AI response (no JSON mode):', {
|
||||
parsed: response.parsed,
|
||||
content: response.content,
|
||||
contentLength: response.content?.length
|
||||
});
|
||||
result = parseNameResponse(response.parsed, response.content);
|
||||
}
|
||||
}
|
||||
|
||||
log.info(`[NameValidation] Validated "${product.name}" in ${response.latencyMs}ms`, {
|
||||
isValid: result.isValid,
|
||||
hasSuggestion: !!result.suggestion,
|
||||
issueCount: result.issues.length
|
||||
});
|
||||
|
||||
return {
|
||||
...result,
|
||||
latencyMs: response.latencyMs,
|
||||
usage: response.usage,
|
||||
model: response.model
|
||||
};
|
||||
} catch (error) {
|
||||
log.error('[NameValidation] Error:', error.message);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
TASK_ID,
|
||||
createNameValidationTask
|
||||
};
|
||||
182
inventory-server/src/services/ai/tasks/sanityCheckTask.js
Normal file
@@ -0,0 +1,182 @@
|
||||
/**
|
||||
* Sanity Check Task
|
||||
*
|
||||
* Reviews a batch of products for consistency and appropriateness.
|
||||
* Uses Groq with the larger model for complex batch analysis.
|
||||
* Loads all prompts from the database (no hardcoded prompts).
|
||||
*/
|
||||
|
||||
const { MODELS } = require('../providers/groqProvider');
|
||||
const {
|
||||
loadSanityCheckPrompts,
|
||||
validateRequiredPrompts
|
||||
} = require('../prompts/promptLoader');
|
||||
const {
|
||||
buildSanityCheckUserPrompt,
|
||||
parseSanityCheckResponse
|
||||
} = require('../prompts/sanityCheckPrompts');
|
||||
|
||||
const TASK_ID = 'sanity.check';
|
||||
|
||||
// Maximum products to send in a single request (to avoid token limits)
|
||||
const MAX_PRODUCTS_PER_REQUEST = 50;
|
||||
|
||||
/**
|
||||
* Create the sanity check task
|
||||
*
|
||||
* @returns {Object} Task definition
|
||||
*/
|
||||
function createSanityCheckTask() {
|
||||
return {
|
||||
id: TASK_ID,
|
||||
description: 'Review batch of products for consistency and appropriateness',
|
||||
|
||||
/**
|
||||
* Run the sanity check
|
||||
*
|
||||
* @param {Object} payload
|
||||
* @param {Object[]} payload.products - Array of products to check
|
||||
* @param {Object} payload.provider - Groq provider instance
|
||||
* @param {Object} payload.pool - PostgreSQL pool
|
||||
* @param {Object} [payload.logger] - Logger instance
|
||||
* @returns {Promise<Object>}
|
||||
*/
|
||||
async run(payload) {
|
||||
const { products, provider, pool, logger } = payload;
|
||||
const log = logger || console;
|
||||
|
||||
// Validate required input
|
||||
if (!Array.isArray(products) || products.length === 0) {
|
||||
return {
|
||||
issues: [],
|
||||
summary: 'No products to check',
|
||||
skipped: true
|
||||
};
|
||||
}
|
||||
|
||||
if (!provider) {
|
||||
throw new Error('Groq provider not available');
|
||||
}
|
||||
|
||||
if (!pool) {
|
||||
throw new Error('Database pool not available');
|
||||
}
|
||||
|
||||
try {
|
||||
// Load prompts from database
|
||||
const prompts = await loadSanityCheckPrompts(pool);
|
||||
|
||||
// Validate required prompts exist
|
||||
validateRequiredPrompts(prompts, 'sanity_check');
|
||||
|
||||
// If batch is small enough, process in one request
|
||||
if (products.length <= MAX_PRODUCTS_PER_REQUEST) {
|
||||
return await checkBatch(products, prompts, provider, log);
|
||||
}
|
||||
|
||||
// Otherwise, process in chunks and combine results
|
||||
log.info(`[SanityCheck] Processing ${products.length} products in chunks`);
|
||||
const allIssues = [];
|
||||
const summaries = [];
|
||||
|
||||
for (let i = 0; i < products.length; i += MAX_PRODUCTS_PER_REQUEST) {
|
||||
const chunk = products.slice(i, i + MAX_PRODUCTS_PER_REQUEST);
|
||||
const chunkOffset = i; // To adjust product indices in results
|
||||
|
||||
const result = await checkBatch(chunk, prompts, provider, log);
|
||||
|
||||
// Adjust product indices to match original array
|
||||
const adjustedIssues = result.issues.map(issue => ({
|
||||
...issue,
|
||||
productIndex: issue.productIndex + chunkOffset
|
||||
}));
|
||||
|
||||
allIssues.push(...adjustedIssues);
|
||||
summaries.push(result.summary);
|
||||
}
|
||||
|
||||
return {
|
||||
issues: allIssues,
|
||||
summary: summaries.length > 1
|
||||
? `Reviewed ${products.length} products in ${summaries.length} batches. ${allIssues.length} issues found.`
|
||||
: summaries[0],
|
||||
totalProducts: products.length,
|
||||
issueCount: allIssues.length
|
||||
};
|
||||
} catch (error) {
|
||||
log.error('[SanityCheck] Error:', error.message);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Check a single batch of products
|
||||
*
|
||||
* @param {Object[]} products - Products to check
|
||||
* @param {Object} prompts - Loaded prompts from database
|
||||
* @param {Object} provider - Groq provider
|
||||
* @param {Object} log - Logger
|
||||
* @returns {Promise<Object>}
|
||||
*/
|
||||
async function checkBatch(products, prompts, provider, log) {
|
||||
const userPrompt = buildSanityCheckUserPrompt(products, prompts);
|
||||
|
||||
let response;
|
||||
let result;
|
||||
|
||||
try {
|
||||
// Try with JSON mode first
|
||||
response = await provider.chatCompletion({
|
||||
messages: [
|
||||
{ role: 'system', content: prompts.system },
|
||||
{ role: 'user', content: userPrompt }
|
||||
],
|
||||
model: MODELS.LARGE, // openai/gpt-oss-120b - needed for complex batch analysis
|
||||
temperature: 0.2, // Low temperature for consistent analysis
|
||||
maxTokens: 2000, // More tokens for batch results
|
||||
responseFormat: { type: 'json_object' }
|
||||
});
|
||||
|
||||
result = parseSanityCheckResponse(response.parsed, response.content);
|
||||
} catch (jsonError) {
|
||||
// If JSON mode failed, check if we have failedGeneration to parse
|
||||
if (jsonError.failedGeneration) {
|
||||
log.warn('[SanityCheck] JSON mode failed, attempting to parse failed_generation');
|
||||
result = parseSanityCheckResponse(null, jsonError.failedGeneration);
|
||||
response = { latencyMs: 0, usage: {}, model: MODELS.LARGE };
|
||||
} else {
|
||||
// Retry without JSON mode
|
||||
log.warn('[SanityCheck] JSON mode failed, retrying without JSON mode');
|
||||
response = await provider.chatCompletion({
|
||||
messages: [
|
||||
{ role: 'system', content: prompts.system },
|
||||
{ role: 'user', content: userPrompt }
|
||||
],
|
||||
model: MODELS.LARGE,
|
||||
temperature: 0.2,
|
||||
maxTokens: 2000
|
||||
// No responseFormat - let the model respond freely
|
||||
});
|
||||
result = parseSanityCheckResponse(response.parsed, response.content);
|
||||
}
|
||||
}
|
||||
|
||||
log.info(`[SanityCheck] Checked ${products.length} products in ${response.latencyMs}ms`, {
|
||||
issueCount: result.issues.length
|
||||
});
|
||||
|
||||
return {
|
||||
...result,
|
||||
latencyMs: response.latencyMs,
|
||||
usage: response.usage,
|
||||
model: response.model
|
||||
};
|
||||
}
|
||||
|
||||
module.exports = {
|
||||
TASK_ID,
|
||||
createSanityCheckTask,
|
||||
MAX_PRODUCTS_PER_REQUEST
|
||||
};
|
||||
@@ -6,7 +6,7 @@
|
||||
"scripts": {
|
||||
"dev": "vite",
|
||||
"build": "tsc -b && vite build",
|
||||
"build:deploy": "tsc -b && COPY_BUILD=true vite build",
|
||||
"build:deploy": "tsc -b && COPY_BUILD=true DEPLOY_TARGET=netcup DEPLOY_PATH=/var/www/html/inventory/frontend vite build",
|
||||
"lint": "eslint .",
|
||||
"preview": "vite preview",
|
||||
"mount": "../mountremote.command"
|
||||
|
||||
913
inventory/src/components/dashboard/OperationsMetrics.tsx
Normal file
@@ -0,0 +1,913 @@
|
||||
import { useEffect, useMemo, useState } from "react";
|
||||
import { acotService } from "@/services/dashboard/acotService";
|
||||
import { Card, CardContent } from "@/components/ui/card";
|
||||
import { Button } from "@/components/ui/button";
|
||||
import {
|
||||
Select,
|
||||
SelectContent,
|
||||
SelectItem,
|
||||
SelectTrigger,
|
||||
SelectValue,
|
||||
} from "@/components/ui/select";
|
||||
import {
|
||||
Dialog,
|
||||
DialogContent,
|
||||
DialogHeader,
|
||||
DialogTitle,
|
||||
DialogTrigger,
|
||||
} from "@/components/ui/dialog";
|
||||
import { Separator } from "@/components/ui/separator";
|
||||
import {
|
||||
Table,
|
||||
TableBody,
|
||||
TableCell,
|
||||
TableHead,
|
||||
TableHeader,
|
||||
TableRow,
|
||||
} from "@/components/ui/table";
|
||||
import {
|
||||
Area,
|
||||
CartesianGrid,
|
||||
ComposedChart,
|
||||
Legend,
|
||||
Line,
|
||||
ResponsiveContainer,
|
||||
Tooltip,
|
||||
XAxis,
|
||||
YAxis,
|
||||
} from "recharts";
|
||||
import type { TooltipProps } from "recharts";
|
||||
import { Package, Truck, Gauge, TrendingUp } from "lucide-react";
|
||||
import PeriodSelectionPopover, {
|
||||
type QuickPreset,
|
||||
} from "@/components/dashboard/PeriodSelectionPopover";
|
||||
import type { CustomPeriod, NaturalLanguagePeriodResult } from "@/utils/naturalLanguagePeriod";
|
||||
import { CARD_STYLES } from "@/lib/dashboard/designTokens";
|
||||
import {
|
||||
DashboardSectionHeader,
|
||||
DashboardStatCard,
|
||||
DashboardStatCardSkeleton,
|
||||
ChartSkeleton,
|
||||
DashboardEmptyState,
|
||||
DashboardErrorState,
|
||||
TOOLTIP_STYLES,
|
||||
METRIC_COLORS,
|
||||
} from "@/components/dashboard/shared";
|
||||
|
||||
type ComparisonValue = {
|
||||
absolute: number | null;
|
||||
percentage: number | null;
|
||||
};
|
||||
|
||||
type OperationsTotals = {
|
||||
ordersPicked: number;
|
||||
piecesPicked: number;
|
||||
ticketCount: number;
|
||||
pickingHours: number;
|
||||
ordersShipped: number;
|
||||
piecesShipped: number;
|
||||
ordersPerHour: number;
|
||||
piecesPerHour: number;
|
||||
avgPickingSpeed: number;
|
||||
};
|
||||
|
||||
type OperationsComparison = {
|
||||
ordersPicked?: ComparisonValue;
|
||||
piecesPicked?: ComparisonValue;
|
||||
ordersShipped?: ComparisonValue;
|
||||
piecesShipped?: ComparisonValue;
|
||||
ordersPerHour?: ComparisonValue;
|
||||
piecesPerHour?: ComparisonValue;
|
||||
};
|
||||
|
||||
type EmployeePickingEntry = {
|
||||
employeeId: number;
|
||||
name: string;
|
||||
ticketCount: number;
|
||||
ordersPicked: number;
|
||||
piecesPicked: number;
|
||||
pickingHours: number;
|
||||
avgPickingSpeed: number | null;
|
||||
};
|
||||
|
||||
type EmployeeShippingEntry = {
|
||||
employeeId: number;
|
||||
name: string;
|
||||
ordersShipped: number;
|
||||
piecesShipped: number;
|
||||
};
|
||||
|
||||
type TrendPoint = {
|
||||
date: string;
|
||||
timestamp: string;
|
||||
ordersPicked: number;
|
||||
piecesPicked: number;
|
||||
ordersShipped: number;
|
||||
piecesShipped: number;
|
||||
};
|
||||
|
||||
type OperationsMetricsResponse = {
|
||||
dateRange?: { label?: string };
|
||||
totals: OperationsTotals;
|
||||
previousTotals?: OperationsTotals | null;
|
||||
comparison?: OperationsComparison | null;
|
||||
byEmployee: {
|
||||
picking: EmployeePickingEntry[];
|
||||
shipping: EmployeeShippingEntry[];
|
||||
};
|
||||
trend: TrendPoint[];
|
||||
};
|
||||
|
||||
type ChartSeriesKey = "ordersPicked" | "piecesPicked" | "ordersShipped" | "piecesShipped";
|
||||
|
||||
type GroupByOption = "day" | "week" | "month";
|
||||
|
||||
type ChartPoint = {
|
||||
label: string;
|
||||
timestamp: string | null;
|
||||
ordersPicked: number | null;
|
||||
piecesPicked: number | null;
|
||||
ordersShipped: number | null;
|
||||
piecesShipped: number | null;
|
||||
tooltipLabel: string;
|
||||
};
|
||||
|
||||
const chartColors: Record<ChartSeriesKey, string> = {
|
||||
ordersPicked: METRIC_COLORS.orders,
|
||||
piecesPicked: METRIC_COLORS.aov,
|
||||
ordersShipped: METRIC_COLORS.profit,
|
||||
piecesShipped: METRIC_COLORS.secondary,
|
||||
};
|
||||
|
||||
const SERIES_LABELS: Record<ChartSeriesKey, string> = {
|
||||
ordersPicked: "Orders Picked",
|
||||
piecesPicked: "Pieces Picked",
|
||||
ordersShipped: "Orders Shipped",
|
||||
piecesShipped: "Pieces Shipped",
|
||||
};
|
||||
|
||||
const SERIES_DEFINITIONS: Array<{
|
||||
key: ChartSeriesKey;
|
||||
label: string;
|
||||
}> = [
|
||||
{ key: "ordersPicked", label: SERIES_LABELS.ordersPicked },
|
||||
{ key: "piecesPicked", label: SERIES_LABELS.piecesPicked },
|
||||
{ key: "ordersShipped", label: SERIES_LABELS.ordersShipped },
|
||||
{ key: "piecesShipped", label: SERIES_LABELS.piecesShipped },
|
||||
];
|
||||
|
||||
const GROUP_BY_CHOICES: Array<{ value: GroupByOption; label: string }> = [
|
||||
{ value: "day", label: "Days" },
|
||||
{ value: "week", label: "Weeks" },
|
||||
{ value: "month", label: "Months" },
|
||||
];
|
||||
|
||||
const MONTHS = [
  "January", "February", "March", "April", "May", "June",
  "July", "August", "September", "October", "November", "December",
];

const MONTH_COUNT_LIMIT = 999;
const QUARTER_COUNT_LIMIT = 999;
const YEAR_COUNT_LIMIT = 999;

const formatMonthLabel = (year: number, monthIndex: number) => `${MONTHS[monthIndex]} ${year}`;

const formatQuarterLabel = (year: number, quarterIndex: number) => `Q${quarterIndex + 1} ${year}`;

function formatPeriodRangeLabel(period: CustomPeriod): string {
  const range = computePeriodRange(period);
  if (!range) return "";

  const start = range.start;
  const end = range.end;

  if (period.type === "month") {
    const startLabel = formatMonthLabel(start.getFullYear(), start.getMonth());
    const endLabel = formatMonthLabel(end.getFullYear(), end.getMonth());
    return period.count === 1 ? startLabel : `${startLabel} – ${endLabel}`;
  }

  if (period.type === "quarter") {
    const startQuarter = Math.floor(start.getMonth() / 3);
    const endQuarter = Math.floor(end.getMonth() / 3);
    const startLabel = formatQuarterLabel(start.getFullYear(), startQuarter);
    const endLabel = formatQuarterLabel(end.getFullYear(), endQuarter);
    return period.count === 1 ? startLabel : `${startLabel} – ${endLabel}`;
  }

  const startYear = start.getFullYear();
  const endYear = end.getFullYear();
  return period.count === 1 ? `${startYear}` : `${startYear} – ${endYear}`;
}

const formatNumber = (value: number, decimals = 0) => {
  if (!Number.isFinite(value)) return "0";
  return value.toLocaleString("en-US", {
    minimumFractionDigits: decimals,
    maximumFractionDigits: decimals,
  });
};

const formatHours = (value: number) => {
  if (!Number.isFinite(value)) return "0h";
  return `${value.toFixed(1)}h`;
};

const ensureValidCustomPeriod = (period: CustomPeriod): CustomPeriod => {
  if (period.count < 1) {
    return { ...period, count: 1 };
  }

  switch (period.type) {
    case "month":
      return {
        ...period,
        startMonth: Math.min(Math.max(period.startMonth, 0), 11),
        count: Math.min(period.count, MONTH_COUNT_LIMIT),
      };
    case "quarter":
      return {
        ...period,
        startQuarter: Math.min(Math.max(period.startQuarter, 0), 3),
        count: Math.min(period.count, QUARTER_COUNT_LIMIT),
      };
    case "year":
    default:
      return {
        ...period,
        count: Math.min(period.count, YEAR_COUNT_LIMIT),
      };
  }
};

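// The range end is computed as the start of the next period minus 1ms, so `end` is
// inclusive and covers whole months/quarters/years regardless of their length.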
function computePeriodRange(period: CustomPeriod): { start: Date; end: Date } | null {
  const safePeriod = ensureValidCustomPeriod(period);
  let start: Date;

  if (safePeriod.type === "month") {
    start = new Date(safePeriod.startYear, safePeriod.startMonth, 1, 0, 0, 0, 0);
    const endExclusive = new Date(start);
    endExclusive.setMonth(endExclusive.getMonth() + safePeriod.count);
    endExclusive.setMilliseconds(endExclusive.getMilliseconds() - 1);
    return { start, end: endExclusive };
  }

  if (safePeriod.type === "quarter") {
    const startMonth = safePeriod.startQuarter * 3;
    start = new Date(safePeriod.startYear, startMonth, 1, 0, 0, 0, 0);
    const endExclusive = new Date(start);
    endExclusive.setMonth(endExclusive.getMonth() + safePeriod.count * 3);
    endExclusive.setMilliseconds(endExclusive.getMilliseconds() - 1);
    return { start, end: endExclusive };
  }

  start = new Date(safePeriod.startYear, 0, 1, 0, 0, 0, 0);
  const endExclusive = new Date(start);
  endExclusive.setFullYear(endExclusive.getFullYear() + safePeriod.count);
  endExclusive.setMilliseconds(endExclusive.getMilliseconds() - 1);
  return { start, end: endExclusive };
}

const OperationsMetrics = () => {
|
||||
const currentDate = useMemo(() => new Date(), []);
|
||||
const currentYear = currentDate.getFullYear();
|
||||
|
||||
const [customPeriod, setCustomPeriod] = useState<CustomPeriod>({
|
||||
type: "month",
|
||||
startYear: currentYear,
|
||||
startMonth: currentDate.getMonth(),
|
||||
count: 1,
|
||||
});
|
||||
const [isLast30DaysMode, setIsLast30DaysMode] = useState<boolean>(true);
|
||||
const [isPeriodPopoverOpen, setIsPeriodPopoverOpen] = useState<boolean>(false);
|
||||
const [metrics, setMetrics] = useState<Record<ChartSeriesKey, boolean>>({
|
||||
ordersPicked: true,
|
||||
piecesPicked: false,
|
||||
ordersShipped: true,
|
||||
piecesShipped: false,
|
||||
});
|
||||
const [groupBy, setGroupBy] = useState<GroupByOption>("day");
|
||||
const [data, setData] = useState<OperationsMetricsResponse | null>(null);
|
||||
const [loading, setLoading] = useState<boolean>(true);
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
|
||||
const selectedRange = useMemo(() => {
|
||||
if (isLast30DaysMode) {
|
||||
const end = new Date(currentDate);
|
||||
const start = new Date(currentDate);
|
||||
start.setHours(0, 0, 0, 0);
|
||||
end.setHours(23, 59, 59, 999);
|
||||
start.setDate(start.getDate() - 29);
|
||||
return { start, end };
|
||||
}
|
||||
return computePeriodRange(customPeriod);
|
||||
}, [isLast30DaysMode, customPeriod, currentDate]);
|
||||
|
||||
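  // Clamp the selected range so it never runs past "now" (but never before its own start);
  // an in-progress month/quarter/year is only requested up to the current moment.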
const effectiveRangeEnd = useMemo(() => {
|
||||
if (!selectedRange) return null;
|
||||
const rangeEndMs = selectedRange.end.getTime();
|
||||
const currentMs = currentDate.getTime();
|
||||
const startMs = selectedRange.start.getTime();
|
||||
const clampedMs = Math.min(rangeEndMs, currentMs);
|
||||
const safeEndMs = clampedMs < startMs ? startMs : clampedMs;
|
||||
return new Date(safeEndMs);
|
||||
}, [selectedRange, currentDate]);
|
||||
|
||||
const requestRange = useMemo(() => {
|
||||
if (!selectedRange) return null;
|
||||
const end = effectiveRangeEnd ?? selectedRange.end;
|
||||
return {
|
||||
start: new Date(selectedRange.start),
|
||||
end: new Date(end),
|
||||
};
|
||||
}, [selectedRange, effectiveRangeEnd]);
|
||||
|
||||
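  // Refetch whenever the selected range changes; the `cancelled` flag discards responses
  // that resolve after the effect has been cleaned up (unmount or a newer request).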
useEffect(() => {
|
||||
let cancelled = false;
|
||||
|
||||
const fetchData = async () => {
|
||||
setLoading(true);
|
||||
setError(null);
|
||||
|
||||
try {
|
||||
const params: Record<string, string> = {};
|
||||
|
||||
if (isLast30DaysMode) {
|
||||
params.timeRange = "last30days";
|
||||
} else {
|
||||
if (!selectedRange || !requestRange) {
|
||||
setData(null);
|
||||
return;
|
||||
}
|
||||
params.timeRange = "custom";
|
||||
params.startDate = requestRange.start.toISOString();
|
||||
params.endDate = requestRange.end.toISOString();
|
||||
}
|
||||
|
||||
// @ts-expect-error - acotService is a JS file, TypeScript can't infer the param type
|
||||
const response = (await acotService.getOperationsMetrics(params)) as OperationsMetricsResponse;
|
||||
if (!cancelled) {
|
||||
setData(response);
|
||||
}
|
||||
} catch (err: unknown) {
|
||||
if (!cancelled) {
|
||||
let message = "Failed to load operations metrics";
|
||||
if (typeof err === "object" && err !== null) {
|
||||
const maybeError = err as { response?: { data?: { error?: unknown } }; message?: unknown };
|
||||
const responseError = maybeError.response?.data?.error;
|
||||
if (typeof responseError === "string" && responseError.trim().length > 0) {
|
||||
message = responseError;
|
||||
} else if (typeof maybeError.message === "string") {
|
||||
message = maybeError.message;
|
||||
}
|
||||
}
|
||||
setError(message);
|
||||
}
|
||||
} finally {
|
||||
if (!cancelled) {
|
||||
setLoading(false);
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
void fetchData();
|
||||
return () => { cancelled = true; };
|
||||
}, [isLast30DaysMode, selectedRange, requestRange]);
|
||||
|
||||
const cards = useMemo(() => {
|
||||
if (!data?.totals) return [];
|
||||
|
||||
const totals = data.totals;
|
||||
const comparison = data.comparison ?? {};
|
||||
|
||||
return [
|
||||
{
|
||||
key: "ordersPicked",
|
||||
title: "Orders Picked",
|
||||
value: formatNumber(totals.ordersPicked),
|
||||
description: `${formatNumber(totals.piecesPicked)} pieces`,
|
||||
trendValue: comparison.ordersPicked?.percentage,
|
||||
iconColor: "blue" as const,
|
||||
tooltip: "Total distinct orders picked (ship-together groups count as 1).",
|
||||
},
|
||||
{
|
||||
key: "ordersShipped",
|
||||
title: "Orders Shipped",
|
||||
value: formatNumber(totals.ordersShipped),
|
||||
description: `${formatNumber(totals.piecesShipped)} pieces`,
|
||||
trendValue: comparison.ordersShipped?.percentage,
|
||||
iconColor: "emerald" as const,
|
||||
tooltip: "Total orders shipped (ship-together groups count as 1).",
|
||||
},
|
||||
{
|
||||
key: "productivity",
|
||||
title: "Productivity",
|
||||
value: `${formatNumber(totals.ordersPerHour, 1)}/h`,
|
||||
description: `${formatNumber(totals.piecesPerHour, 1)} pieces/hour`,
|
||||
trendValue: comparison.ordersPerHour?.percentage,
|
||||
iconColor: "purple" as const,
|
||||
tooltip: "Orders and pieces picked per picking hour.",
|
||||
},
|
||||
{
|
||||
key: "pickingSpeed",
|
||||
title: "Picking Speed",
|
||||
value: `${formatNumber(totals.avgPickingSpeed, 1)}/h`,
|
||||
description: `${formatHours(totals.pickingHours)} picking time`,
|
||||
iconColor: "orange" as const,
|
||||
tooltip: "Average pieces picked per hour while actively picking.",
|
||||
},
|
||||
];
|
||||
}, [data]);
|
||||
|
||||
const chartData = useMemo<ChartPoint[]>(() => {
|
||||
if (!data?.trend?.length) return [];
|
||||
|
||||
const groupedData = new Map<string, {
|
||||
label: string;
|
||||
tooltipLabel: string;
|
||||
timestamp: string;
|
||||
ordersPicked: number;
|
||||
piecesPicked: number;
|
||||
ordersShipped: number;
|
||||
piecesShipped: number;
|
||||
}>();
|
||||
|
||||
data.trend.forEach((point) => {
|
||||
const date = new Date(point.timestamp);
|
||||
let key: string;
|
||||
let label: string;
|
||||
let tooltipLabel: string;
|
||||
|
||||
switch (groupBy) {
|
||||
case "week": {
|
||||
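          // Bucket by calendar week starting on Sunday (date.getDay() offset); the key is
          // the week-start date in YYYY-MM-DD form.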
const weekStart = new Date(date);
|
||||
weekStart.setDate(date.getDate() - date.getDay());
|
||||
key = weekStart.toISOString().split("T")[0];
|
||||
label = weekStart.toLocaleDateString("en-US", { month: "short", day: "numeric" });
|
||||
tooltipLabel = `Week of ${weekStart.toLocaleDateString("en-US", { month: "short", day: "numeric", year: "numeric" })}`;
|
||||
break;
|
||||
}
|
||||
case "month": {
|
||||
key = `${date.getFullYear()}-${String(date.getMonth() + 1).padStart(2, "0")}`;
|
||||
label = date.toLocaleDateString("en-US", { month: "short", year: "numeric" });
|
||||
tooltipLabel = date.toLocaleDateString("en-US", { month: "long", year: "numeric" });
|
||||
break;
|
||||
}
|
||||
default: {
|
||||
key = point.date;
|
||||
label = date.toLocaleDateString("en-US", { month: "short", day: "numeric" });
|
||||
tooltipLabel = date.toLocaleDateString("en-US", { weekday: "short", month: "short", day: "numeric", year: "numeric" });
|
||||
}
|
||||
}
|
||||
|
||||
const existing = groupedData.get(key);
|
||||
if (existing) {
|
||||
existing.ordersPicked += point.ordersPicked || 0;
|
||||
existing.piecesPicked += point.piecesPicked || 0;
|
||||
existing.ordersShipped += point.ordersShipped || 0;
|
||||
existing.piecesShipped += point.piecesShipped || 0;
|
||||
} else {
|
||||
groupedData.set(key, {
|
||||
label,
|
||||
tooltipLabel,
|
||||
timestamp: point.timestamp,
|
||||
ordersPicked: point.ordersPicked || 0,
|
||||
piecesPicked: point.piecesPicked || 0,
|
||||
ordersShipped: point.ordersShipped || 0,
|
||||
piecesShipped: point.piecesShipped || 0,
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
return Array.from(groupedData.entries())
|
||||
.sort(([a], [b]) => a.localeCompare(b))
|
||||
.map(([, group]) => ({
|
||||
label: group.label,
|
||||
timestamp: group.timestamp,
|
||||
ordersPicked: group.ordersPicked,
|
||||
piecesPicked: group.piecesPicked,
|
||||
ordersShipped: group.ordersShipped,
|
||||
piecesShipped: group.piecesShipped,
|
||||
tooltipLabel: group.tooltipLabel,
|
||||
}));
|
||||
}, [data, groupBy]);
|
||||
|
||||
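  // For custom periods that extend into the future, append the effective cutoff to the
  // label, e.g. "Q1 2025 (through Feb 12)".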
const selectedRangeLabel = useMemo(() => {
|
||||
if (isLast30DaysMode) return "Last 30 Days";
|
||||
const label = formatPeriodRangeLabel(customPeriod);
|
||||
if (!label) return "";
|
||||
|
||||
if (!selectedRange || !effectiveRangeEnd) return label;
|
||||
|
||||
const isPartial = effectiveRangeEnd.getTime() < selectedRange.end.getTime();
|
||||
if (!isPartial) return label;
|
||||
|
||||
const partialLabel = effectiveRangeEnd.toLocaleDateString("en-US", {
|
||||
month: "short",
|
||||
day: "numeric",
|
||||
});
|
||||
|
||||
return `${label} (through ${partialLabel})`;
|
||||
}, [isLast30DaysMode, customPeriod, selectedRange, effectiveRangeEnd]);
|
||||
|
||||
const hasActiveMetrics = useMemo(() => Object.values(metrics).some(Boolean), [metrics]);
|
||||
const hasData = chartData.length > 0;
|
||||
|
||||
const handleGroupByChange = (value: string) => {
|
||||
setGroupBy(value as GroupByOption);
|
||||
};
|
||||
|
||||
const toggleMetric = (series: ChartSeriesKey) => {
|
||||
setMetrics((prev) => ({
|
||||
...prev,
|
||||
[series]: !prev[series],
|
||||
}));
|
||||
};
|
||||
|
||||
const handleNaturalLanguageResult = (result: NaturalLanguagePeriodResult) => {
|
||||
if (result === "last30days") {
|
||||
setIsLast30DaysMode(true);
|
||||
return;
|
||||
}
|
||||
if (result) {
|
||||
setIsLast30DaysMode(false);
|
||||
setCustomPeriod(result);
|
||||
}
|
||||
};
|
||||
|
||||
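  // Quick presets map to CustomPeriod values; "last month" and "last quarter" wrap across
  // the year boundary (January resolves to December of the prior year, Q1 to Q4).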
  const handleQuickPeriod = (preset: QuickPreset) => {
    const now = new Date();
    const year = now.getFullYear();
    const month = now.getMonth();
    const quarter = Math.floor(month / 3);

    switch (preset) {
      case "last30days":
        setIsLast30DaysMode(true);
        break;
      case "thisMonth":
        setIsLast30DaysMode(false);
        setCustomPeriod({ type: "month", startYear: year, startMonth: month, count: 1 });
        break;
      case "lastMonth": {
        setIsLast30DaysMode(false);
        const lastMonth = month === 0 ? 11 : month - 1;
        const lastMonthYear = month === 0 ? year - 1 : year;
        setCustomPeriod({ type: "month", startYear: lastMonthYear, startMonth: lastMonth, count: 1 });
        break;
      }
      case "thisQuarter":
        setIsLast30DaysMode(false);
        setCustomPeriod({ type: "quarter", startYear: year, startQuarter: quarter, count: 1 });
        break;
      case "lastQuarter": {
        setIsLast30DaysMode(false);
        const lastQuarter = quarter === 0 ? 3 : quarter - 1;
        const lastQuarterYear = quarter === 0 ? year - 1 : year;
        setCustomPeriod({ type: "quarter", startYear: lastQuarterYear, startQuarter: lastQuarter, count: 1 });
        break;
      }
      case "thisYear":
        setIsLast30DaysMode(false);
        setCustomPeriod({ type: "year", startYear: year, count: 1 });
        break;
      default:
        break;
    }
  };

const headerActions = !error ? (
|
||||
<>
|
||||
<Dialog>
|
||||
<DialogTrigger asChild>
|
||||
<Button variant="outline" className="h-9" disabled={loading || !data?.byEmployee}>
|
||||
Details
|
||||
</Button>
|
||||
</DialogTrigger>
|
||||
<DialogContent className={`p-4 max-w-[95vw] w-fit max-h-[85vh] overflow-hidden flex flex-col ${CARD_STYLES.base}`}>
|
||||
<DialogHeader className="flex-none">
|
||||
<DialogTitle className="text-foreground">
|
||||
Operations Details
|
||||
</DialogTitle>
|
||||
</DialogHeader>
|
||||
<div className="flex-1 overflow-auto mt-6 space-y-6">
|
||||
<div>
|
||||
<h3 className="text-sm font-medium mb-2">Picking by Employee</h3>
|
||||
<div className={`rounded-lg border ${CARD_STYLES.base}`}>
|
||||
<Table>
|
||||
<TableHeader>
|
||||
<TableRow>
|
||||
<TableHead className="whitespace-nowrap px-4">Employee</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Tickets</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Orders</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Pieces</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Hours</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Speed</TableHead>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{data?.byEmployee?.picking?.map((emp) => (
|
||||
<TableRow key={emp.employeeId}>
|
||||
<TableCell className="px-4">{emp.name}</TableCell>
|
||||
<TableCell className="text-right px-4">{formatNumber(emp.ticketCount)}</TableCell>
|
||||
<TableCell className="text-right px-4">{formatNumber(emp.ordersPicked)}</TableCell>
|
||||
<TableCell className="text-right px-4">{formatNumber(emp.piecesPicked)}</TableCell>
|
||||
<TableCell className="text-right px-4">{formatHours(emp.pickingHours || 0)}</TableCell>
|
||||
<TableCell className="text-right px-4">
|
||||
{emp.avgPickingSpeed != null ? `${formatNumber(emp.avgPickingSpeed, 1)}/h` : "—"}
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
))}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{data?.byEmployee?.shipping && data.byEmployee.shipping.length > 0 && (
|
||||
<div>
|
||||
<h3 className="text-sm font-medium mb-2">Shipping by Employee</h3>
|
||||
<div className={`rounded-lg border ${CARD_STYLES.base}`}>
|
||||
<Table>
|
||||
<TableHeader>
|
||||
<TableRow>
|
||||
<TableHead className="whitespace-nowrap px-4">Employee</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Orders</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Pieces</TableHead>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{data.byEmployee.shipping.map((emp) => (
|
||||
<TableRow key={emp.employeeId}>
|
||||
<TableCell className="px-4">{emp.name}</TableCell>
|
||||
<TableCell className="text-right px-4">{formatNumber(emp.ordersShipped)}</TableCell>
|
||||
<TableCell className="text-right px-4">{formatNumber(emp.piecesShipped)}</TableCell>
|
||||
</TableRow>
|
||||
))}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
|
||||
<PeriodSelectionPopover
|
||||
open={isPeriodPopoverOpen}
|
||||
onOpenChange={setIsPeriodPopoverOpen}
|
||||
selectedLabel={selectedRangeLabel}
|
||||
referenceDate={currentDate}
|
||||
isLast30DaysActive={isLast30DaysMode}
|
||||
onQuickSelect={handleQuickPeriod}
|
||||
onApplyResult={handleNaturalLanguageResult}
|
||||
/>
|
||||
</>
|
||||
) : null;
|
||||
|
||||
return (
|
||||
<Card className={`w-full h-full ${CARD_STYLES.elevated}`}>
|
||||
<DashboardSectionHeader
|
||||
title="Operations"
|
||||
size="large"
|
||||
actions={headerActions}
|
||||
/>
|
||||
|
||||
<CardContent className="p-6 pt-0 space-y-4">
|
||||
{!error && (
|
||||
loading ? (
|
||||
<SkeletonStats />
|
||||
) : (
|
||||
cards.length > 0 && <OperationsStatGrid cards={cards} />
|
||||
)
|
||||
)}
|
||||
|
||||
{!error && (
|
||||
<div className="flex items-center flex-col sm:flex-row gap-0 sm:gap-4">
|
||||
<div className="flex flex-wrap gap-1">
|
||||
{SERIES_DEFINITIONS.map((series) => (
|
||||
<Button
|
||||
key={series.key}
|
||||
variant={metrics[series.key] ? "default" : "outline"}
|
||||
size="sm"
|
||||
onClick={() => toggleMetric(series.key)}
|
||||
>
|
||||
{series.label}
|
||||
</Button>
|
||||
))}
|
||||
</div>
|
||||
|
||||
<Separator orientation="vertical" className="h-6 hidden sm:block" />
|
||||
<Separator orientation="horizontal" className="sm:hidden w-20 my-2" />
|
||||
|
||||
<div className="flex items-center gap-2">
|
||||
<div className="text-sm text-muted-foreground">Group:</div>
|
||||
<Select value={groupBy} onValueChange={handleGroupByChange}>
|
||||
<SelectTrigger className="w-[100px]">
|
||||
<SelectValue placeholder="Group By" />
|
||||
</SelectTrigger>
|
||||
<SelectContent>
|
||||
{GROUP_BY_CHOICES.map((option) => (
|
||||
<SelectItem key={option.value} value={option.value}>
|
||||
{option.label}
|
||||
</SelectItem>
|
||||
))}
|
||||
</SelectContent>
|
||||
</Select>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{loading ? (
|
||||
<ChartSkeleton type="area" height="default" withCard={false} />
|
||||
) : error ? (
|
||||
<DashboardErrorState error={`Failed to load operations data: ${error}`} className="mx-0 my-0" />
|
||||
) : !hasData ? (
|
||||
<DashboardEmptyState
|
||||
icon={Package}
|
||||
title="No operations data available"
|
||||
description="Try selecting a different time range"
|
||||
/>
|
||||
) : (
|
||||
<div className={`h-[280px] ${CARD_STYLES.base} rounded-lg p-0 relative`}>
|
||||
{!hasActiveMetrics ? (
|
||||
<DashboardEmptyState
|
||||
icon={TrendingUp}
|
||||
title="No metrics selected"
|
||||
description="Select at least one metric to visualize."
|
||||
/>
|
||||
) : (
|
||||
<ResponsiveContainer width="100%" height="100%">
|
||||
<ComposedChart data={chartData} margin={{ top: 5, right: 15, left: 15, bottom: 5 }}>
|
||||
<defs>
|
||||
<linearGradient id="operationsOrdersPicked" x1="0" y1="0" x2="0" y2="1">
|
||||
<stop offset="5%" stopColor={chartColors.ordersPicked} stopOpacity={0.8} />
|
||||
<stop offset="95%" stopColor={chartColors.ordersPicked} stopOpacity={0.3} />
|
||||
</linearGradient>
|
||||
</defs>
|
||||
<CartesianGrid strokeDasharray="3 3" className="stroke-muted" />
|
||||
<XAxis
|
||||
dataKey="label"
|
||||
className="text-xs text-muted-foreground"
|
||||
tick={{ fill: "currentColor" }}
|
||||
/>
|
||||
<YAxis
|
||||
tickFormatter={(value: number) => formatNumber(value)}
|
||||
className="text-xs text-muted-foreground"
|
||||
tick={{ fill: "currentColor" }}
|
||||
/>
|
||||
<Tooltip content={<OperationsTooltip />} />
|
||||
<Legend formatter={(value: string) => SERIES_LABELS[value as ChartSeriesKey] ?? value} />
|
||||
|
||||
{metrics.ordersPicked && (
|
||||
<Area
|
||||
type="monotone"
|
||||
dataKey="ordersPicked"
|
||||
name={SERIES_LABELS.ordersPicked}
|
||||
stroke={chartColors.ordersPicked}
|
||||
fill="url(#operationsOrdersPicked)"
|
||||
strokeWidth={2}
|
||||
/>
|
||||
)}
|
||||
{metrics.ordersShipped && (
|
||||
<Line
|
||||
type="monotone"
|
||||
dataKey="ordersShipped"
|
||||
name={SERIES_LABELS.ordersShipped}
|
||||
stroke={chartColors.ordersShipped}
|
||||
strokeWidth={2}
|
||||
dot={false}
|
||||
activeDot={{ r: 4 }}
|
||||
connectNulls
|
||||
/>
|
||||
)}
|
||||
{metrics.piecesPicked && (
|
||||
<Line
|
||||
type="monotone"
|
||||
dataKey="piecesPicked"
|
||||
name={SERIES_LABELS.piecesPicked}
|
||||
stroke={chartColors.piecesPicked}
|
||||
strokeWidth={2}
|
||||
strokeDasharray="5 3"
|
||||
dot={false}
|
||||
activeDot={{ r: 4 }}
|
||||
connectNulls
|
||||
/>
|
||||
)}
|
||||
{metrics.piecesShipped && (
|
||||
<Line
|
||||
type="monotone"
|
||||
dataKey="piecesShipped"
|
||||
name={SERIES_LABELS.piecesShipped}
|
||||
stroke={chartColors.piecesShipped}
|
||||
strokeWidth={2}
|
||||
strokeDasharray="3 3"
|
||||
dot={false}
|
||||
activeDot={{ r: 4 }}
|
||||
connectNulls
|
||||
/>
|
||||
)}
|
||||
</ComposedChart>
|
||||
</ResponsiveContainer>
|
||||
)}
|
||||
</div>
|
||||
)}
|
||||
</CardContent>
|
||||
</Card>
|
||||
);
|
||||
};
|
||||
|
||||
type OperationsStatCardConfig = {
|
||||
key: string;
|
||||
title: string;
|
||||
value: string;
|
||||
description?: string;
|
||||
trendValue?: number | null;
|
||||
trendInverted?: boolean;
|
||||
iconColor: "blue" | "orange" | "emerald" | "purple" | "cyan" | "amber";
|
||||
tooltip?: string;
|
||||
};
|
||||
|
||||
const ICON_MAP = {
|
||||
ordersPicked: Package,
|
||||
ordersShipped: Truck,
|
||||
productivity: Gauge,
|
||||
pickingSpeed: TrendingUp,
|
||||
} as const;
|
||||
|
||||
function OperationsStatGrid({ cards }: { cards: OperationsStatCardConfig[] }) {
|
||||
return (
|
||||
<div className="grid grid-cols-2 lg:grid-cols-4 gap-4 py-4 w-full dashboard-stagger">
|
||||
{cards.map((card) => (
|
||||
<DashboardStatCard
|
||||
key={card.key}
|
||||
title={card.title}
|
||||
value={card.value}
|
||||
subtitle={card.description}
|
||||
trend={
|
||||
card.trendValue != null && Number.isFinite(card.trendValue)
|
||||
? {
|
||||
value: card.trendValue,
|
||||
moreIsBetter: !card.trendInverted,
|
||||
}
|
||||
: undefined
|
||||
}
|
||||
icon={ICON_MAP[card.key as keyof typeof ICON_MAP]}
|
||||
iconColor={card.iconColor}
|
||||
tooltip={card.tooltip}
|
||||
/>
|
||||
))}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
function SkeletonStats() {
|
||||
return (
|
||||
<div className="grid grid-cols-2 lg:grid-cols-4 gap-4 py-4 w-full">
|
||||
{Array.from({ length: 4 }).map((_, index) => (
|
||||
<DashboardStatCardSkeleton key={index} hasIcon hasSubtitle />
|
||||
))}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
const OperationsTooltip = ({ active, payload, label }: TooltipProps<number, string>) => {
|
||||
if (!active || !payload?.length) return null;
|
||||
|
||||
const basePoint = payload[0]?.payload as ChartPoint | undefined;
|
||||
const resolvedLabel = basePoint?.tooltipLabel ?? label;
|
||||
|
||||
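  // Present tooltip rows in a fixed series order rather than Recharts' payload order.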
const desiredOrder: ChartSeriesKey[] = ["ordersPicked", "piecesPicked", "ordersShipped", "piecesShipped"];
|
||||
const payloadMap = new Map(payload.map((entry) => [entry.dataKey as ChartSeriesKey, entry]));
|
||||
const orderedPayload = desiredOrder
|
||||
.map((key) => payloadMap.get(key))
|
||||
.filter((entry): entry is (typeof payload)[0] => entry !== undefined);
|
||||
|
||||
return (
|
||||
<div className={TOOLTIP_STYLES.container}>
|
||||
<p className={TOOLTIP_STYLES.header}>{resolvedLabel}</p>
|
||||
<div className={TOOLTIP_STYLES.content}>
|
||||
{orderedPayload.map((entry, index) => {
|
||||
const key = (entry.dataKey ?? "") as ChartSeriesKey;
|
||||
const rawValue = entry.value;
|
||||
const formattedValue = rawValue != null ? formatNumber(rawValue as number) : "—";
|
||||
|
||||
return (
|
||||
<div key={`${key}-${index}`} className={TOOLTIP_STYLES.row}>
|
||||
<div className={TOOLTIP_STYLES.rowLabel}>
|
||||
<span
|
||||
className={TOOLTIP_STYLES.dot}
|
||||
style={{ backgroundColor: entry.stroke || entry.color || "#888" }}
|
||||
/>
|
||||
<span className={TOOLTIP_STYLES.name}>
|
||||
{SERIES_LABELS[key] ?? entry.name ?? key}
|
||||
</span>
|
||||
</div>
|
||||
<span className={TOOLTIP_STYLES.value}>{formattedValue}</span>
|
||||
</div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
export default OperationsMetrics;
|
||||
558
inventory/src/components/dashboard/PayrollMetrics.tsx
Normal file
@@ -0,0 +1,558 @@
|
||||
import { useEffect, useMemo, useState } from "react";
|
||||
import { acotService } from "@/services/dashboard/acotService";
|
||||
import { Card, CardContent } from "@/components/ui/card";
|
||||
import { Button } from "@/components/ui/button";
|
||||
import {
|
||||
Dialog,
|
||||
DialogContent,
|
||||
DialogHeader,
|
||||
DialogTitle,
|
||||
DialogTrigger,
|
||||
} from "@/components/ui/dialog";
|
||||
import {
|
||||
Table,
|
||||
TableBody,
|
||||
TableCell,
|
||||
TableHead,
|
||||
TableHeader,
|
||||
TableRow,
|
||||
} from "@/components/ui/table";
|
||||
import {
|
||||
Bar,
|
||||
BarChart,
|
||||
CartesianGrid,
|
||||
Cell,
|
||||
Legend,
|
||||
ResponsiveContainer,
|
||||
Tooltip,
|
||||
XAxis,
|
||||
YAxis,
|
||||
} from "recharts";
|
||||
import type { TooltipProps } from "recharts";
|
||||
import { Clock, Users, AlertTriangle, ChevronLeft, ChevronRight, Calendar } from "lucide-react";
|
||||
import { CARD_STYLES } from "@/lib/dashboard/designTokens";
|
||||
import {
|
||||
DashboardSectionHeader,
|
||||
DashboardStatCard,
|
||||
DashboardStatCardSkeleton,
|
||||
ChartSkeleton,
|
||||
DashboardEmptyState,
|
||||
DashboardErrorState,
|
||||
TOOLTIP_STYLES,
|
||||
METRIC_COLORS,
|
||||
} from "@/components/dashboard/shared";
|
||||
|
||||
type ComparisonValue = {
|
||||
absolute: number | null;
|
||||
percentage: number | null;
|
||||
};
|
||||
|
||||
type PayPeriodWeek = {
|
||||
start: string;
|
||||
end: string;
|
||||
label: string;
|
||||
};
|
||||
|
||||
type PayPeriod = {
|
||||
start: string;
|
||||
end: string;
|
||||
label: string;
|
||||
week1: PayPeriodWeek;
|
||||
week2: PayPeriodWeek;
|
||||
isCurrent: boolean;
|
||||
};
|
||||
|
||||
type PayrollTotals = {
|
||||
hours: number;
|
||||
breakHours: number;
|
||||
overtimeHours: number;
|
||||
regularHours: number;
|
||||
activeEmployees: number;
|
||||
fte: number;
|
||||
avgHoursPerEmployee: number;
|
||||
};
|
||||
|
||||
type PayrollComparison = {
|
||||
hours?: ComparisonValue;
|
||||
overtimeHours?: ComparisonValue;
|
||||
fte?: ComparisonValue;
|
||||
activeEmployees?: ComparisonValue;
|
||||
};
|
||||
|
||||
type EmployeePayrollEntry = {
|
||||
employeeId: number;
|
||||
name: string;
|
||||
week1Hours: number;
|
||||
week1BreakHours: number;
|
||||
week1Overtime: number;
|
||||
week1Regular: number;
|
||||
week2Hours: number;
|
||||
week2BreakHours: number;
|
||||
week2Overtime: number;
|
||||
week2Regular: number;
|
||||
totalHours: number;
|
||||
totalBreakHours: number;
|
||||
overtimeHours: number;
|
||||
regularHours: number;
|
||||
};
|
||||
|
||||
type WeekSummary = {
|
||||
week: number;
|
||||
start: string;
|
||||
end: string;
|
||||
hours: number;
|
||||
overtime: number;
|
||||
regular: number;
|
||||
};
|
||||
|
||||
type PayrollMetricsResponse = {
|
||||
payPeriod: PayPeriod;
|
||||
totals: PayrollTotals;
|
||||
previousTotals?: PayrollTotals | null;
|
||||
comparison?: PayrollComparison | null;
|
||||
byEmployee: EmployeePayrollEntry[];
|
||||
byWeek: WeekSummary[];
|
||||
};
|
||||
|
||||
const chartColors = {
|
||||
regular: METRIC_COLORS.orders,
|
||||
overtime: METRIC_COLORS.expense,
|
||||
};
|
||||
|
||||
const formatNumber = (value: number, decimals = 0) => {
|
||||
if (!Number.isFinite(value)) return "0";
|
||||
return value.toLocaleString("en-US", {
|
||||
minimumFractionDigits: decimals,
|
||||
maximumFractionDigits: decimals,
|
||||
});
|
||||
};
|
||||
|
||||
const formatHours = (value: number) => {
|
||||
if (!Number.isFinite(value)) return "0h";
|
||||
return `${value.toFixed(1)}h`;
|
||||
};
|
||||
|
||||
const PayrollMetrics = () => {
|
||||
const [data, setData] = useState<PayrollMetricsResponse | null>(null);
|
||||
const [loading, setLoading] = useState<boolean>(true);
|
||||
const [error, setError] = useState<string | null>(null);
|
||||
const [currentPayPeriodStart, setCurrentPayPeriodStart] = useState<string | null>(null);
|
||||
|
||||
// Fetch data
|
||||
useEffect(() => {
|
||||
let cancelled = false;
|
||||
|
||||
const fetchData = async () => {
|
||||
setLoading(true);
|
||||
setError(null);
|
||||
|
||||
try {
|
||||
const params: Record<string, string> = {};
|
||||
if (currentPayPeriodStart) {
|
||||
params.payPeriodStart = currentPayPeriodStart;
|
||||
}
|
||||
|
||||
// @ts-expect-error - acotService is a JS file, TypeScript can't infer the param type
|
||||
const response = (await acotService.getPayrollMetrics(params)) as PayrollMetricsResponse;
|
||||
if (!cancelled) {
|
||||
setData(response);
|
||||
// Update the current pay period start if not set (first load)
|
||||
if (!currentPayPeriodStart && response.payPeriod?.start) {
|
||||
setCurrentPayPeriodStart(response.payPeriod.start);
|
||||
}
|
||||
}
|
||||
} catch (err: unknown) {
|
||||
if (!cancelled) {
|
||||
let message = "Failed to load payroll metrics";
|
||||
if (typeof err === "object" && err !== null) {
|
||||
const maybeError = err as { response?: { data?: { error?: unknown } }; message?: unknown };
|
||||
const responseError = maybeError.response?.data?.error;
|
||||
if (typeof responseError === "string" && responseError.trim().length > 0) {
|
||||
message = responseError;
|
||||
} else if (typeof maybeError.message === "string") {
|
||||
message = maybeError.message;
|
||||
}
|
||||
}
|
||||
setError(message);
|
||||
}
|
||||
} finally {
|
||||
if (!cancelled) {
|
||||
setLoading(false);
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
void fetchData();
|
||||
return () => { cancelled = true; };
|
||||
}, [currentPayPeriodStart]);
|
||||
|
||||
  const navigatePeriod = (direction: "prev" | "next") => {
    if (!data?.payPeriod?.start) return;

    // Calculate the new pay period start by adding/subtracting 14 days
    const currentStart = new Date(data.payPeriod.start);
    const offset = direction === "prev" ? -14 : 14;
    currentStart.setDate(currentStart.getDate() + offset);
    setCurrentPayPeriodStart(currentStart.toISOString().split("T")[0]);
  };

  const goToCurrentPeriod = () => {
    setCurrentPayPeriodStart(null); // null triggers loading current period
  };

const cards = useMemo(() => {
|
||||
if (!data?.totals) return [];
|
||||
|
||||
const totals = data.totals;
|
||||
const comparison = data.comparison ?? {};
|
||||
|
||||
return [
|
||||
{
|
||||
key: "hours",
|
||||
title: "Total Hours",
|
||||
value: formatHours(totals.hours),
|
||||
description: `${formatHours(totals.regularHours)} regular`,
|
||||
trendValue: comparison.hours?.percentage,
|
||||
iconColor: "blue" as const,
|
||||
tooltip: "Total hours worked by all employees in this pay period.",
|
||||
},
|
||||
{
|
||||
key: "overtime",
|
||||
title: "Overtime",
|
||||
value: formatHours(totals.overtimeHours),
|
||||
description: totals.overtimeHours > 0
|
||||
? `${formatNumber((totals.overtimeHours / totals.hours) * 100, 1)}% of total`
|
||||
: "No overtime",
|
||||
trendValue: comparison.overtimeHours?.percentage,
|
||||
trendInverted: true,
|
||||
iconColor: totals.overtimeHours > 0 ? "orange" as const : "emerald" as const,
|
||||
tooltip: "Hours exceeding 40 per employee per week.",
|
||||
},
|
||||
{
|
||||
key: "fte",
|
||||
title: "FTE",
|
||||
value: formatNumber(totals.fte, 2),
|
||||
description: `${formatNumber(totals.activeEmployees)} employees`,
|
||||
trendValue: comparison.fte?.percentage,
|
||||
iconColor: "emerald" as const,
|
||||
tooltip: "Full-Time Equivalents (80 hours = 1 FTE for 2-week period).",
|
||||
},
|
||||
{
|
||||
key: "avgHours",
|
||||
title: "Avg Hours",
|
||||
value: formatHours(totals.avgHoursPerEmployee),
|
||||
description: "Per employee",
|
||||
iconColor: "purple" as const,
|
||||
tooltip: "Average hours worked per active employee in this pay period.",
|
||||
},
|
||||
];
|
||||
}, [data]);
|
||||
|
||||
const chartData = useMemo(() => {
|
||||
if (!data?.byWeek) return [];
|
||||
|
||||
return data.byWeek.map((week) => ({
|
||||
name: `Week ${week.week}`,
|
||||
label: formatWeekRange(week.start, week.end),
|
||||
regular: week.regular,
|
||||
overtime: week.overtime,
|
||||
total: week.hours,
|
||||
}));
|
||||
}, [data]);
|
||||
|
||||
const hasData = data?.byWeek && data.byWeek.length > 0;
|
||||
|
||||
const headerActions = !error ? (
|
||||
<div className="flex items-center gap-2">
|
||||
<Dialog>
|
||||
<DialogTrigger asChild>
|
||||
<Button variant="outline" className="h-9" disabled={loading || !data?.byEmployee}>
|
||||
Details
|
||||
</Button>
|
||||
</DialogTrigger>
|
||||
<DialogContent className={`p-4 max-w-[95vw] w-fit max-h-[85vh] overflow-hidden flex flex-col ${CARD_STYLES.base}`}>
|
||||
<DialogHeader className="flex-none">
|
||||
<DialogTitle className="text-foreground">
|
||||
Employee Hours - {data?.payPeriod?.label}
|
||||
</DialogTitle>
|
||||
</DialogHeader>
|
||||
<div className="flex-1 overflow-auto mt-6">
|
||||
<div className={`rounded-lg border ${CARD_STYLES.base}`}>
|
||||
<Table>
|
||||
<TableHeader>
|
||||
<TableRow>
|
||||
<TableHead className="whitespace-nowrap px-4">Employee</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Week 1</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Week 2</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Total</TableHead>
|
||||
<TableHead className="text-right whitespace-nowrap px-4">Overtime</TableHead>
|
||||
</TableRow>
|
||||
</TableHeader>
|
||||
<TableBody>
|
||||
{data?.byEmployee?.map((emp) => (
|
||||
<TableRow key={emp.employeeId}>
|
||||
<TableCell className="px-4">{emp.name}</TableCell>
|
||||
<TableCell className="text-right px-4">
|
||||
<span className={emp.week1Overtime > 0 ? "text-orange-600 dark:text-orange-400 font-medium" : ""}>
|
||||
{formatHours(emp.week1Hours)}
|
||||
{emp.week1Overtime > 0 && (
|
||||
<span className="ml-1 text-xs">
|
||||
(+{formatHours(emp.week1Overtime)} OT)
|
||||
</span>
|
||||
)}
|
||||
</span>
|
||||
</TableCell>
|
||||
<TableCell className="text-right px-4">
|
||||
<span className={emp.week2Overtime > 0 ? "text-orange-600 dark:text-orange-400 font-medium" : ""}>
|
||||
{formatHours(emp.week2Hours)}
|
||||
{emp.week2Overtime > 0 && (
|
||||
<span className="ml-1 text-xs">
|
||||
(+{formatHours(emp.week2Overtime)} OT)
|
||||
</span>
|
||||
)}
|
||||
</span>
|
||||
</TableCell>
|
||||
<TableCell className="text-right px-4 font-medium">
|
||||
{formatHours(emp.totalHours)}
|
||||
</TableCell>
|
||||
<TableCell className="text-right px-4">
|
||||
{emp.overtimeHours > 0 ? (
|
||||
<span className="text-orange-600 dark:text-orange-400 font-medium">
|
||||
{formatHours(emp.overtimeHours)}
|
||||
</span>
|
||||
) : (
|
||||
<span className="text-muted-foreground">—</span>
|
||||
)}
|
||||
</TableCell>
|
||||
</TableRow>
|
||||
))}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</div>
|
||||
</div>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
|
||||
<div className="flex items-center gap-1">
|
||||
<Button
|
||||
variant="outline"
|
||||
size="icon"
|
||||
className="h-9 w-9"
|
||||
onClick={() => navigatePeriod("prev")}
|
||||
disabled={loading}
|
||||
>
|
||||
<ChevronLeft className="h-4 w-4" />
|
||||
</Button>
|
||||
<Button
|
||||
variant="outline"
|
||||
className="h-9 px-3 min-w-[180px]"
|
||||
onClick={goToCurrentPeriod}
|
||||
disabled={loading || data?.payPeriod?.isCurrent}
|
||||
>
|
||||
<Calendar className="h-4 w-4 mr-2" />
|
||||
{loading ? "Loading..." : data?.payPeriod?.label || "Loading..."}
|
||||
</Button>
|
||||
<Button
|
||||
variant="outline"
|
||||
size="icon"
|
||||
className="h-9 w-9"
|
||||
onClick={() => navigatePeriod("next")}
|
||||
disabled={loading || data?.payPeriod?.isCurrent}
|
||||
>
|
||||
<ChevronRight className="h-4 w-4" />
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
) : null;
|
||||
|
||||
return (
|
||||
<Card className={`w-full h-full ${CARD_STYLES.elevated}`}>
|
||||
<DashboardSectionHeader
|
||||
title="Payroll"
|
||||
size="large"
|
||||
actions={headerActions}
|
||||
/>
|
||||
|
||||
<CardContent className="p-6 pt-0 space-y-4">
|
||||
{!error && (
|
||||
loading ? (
|
||||
<SkeletonStats />
|
||||
) : (
|
||||
cards.length > 0 && <PayrollStatGrid cards={cards} />
|
||||
)
|
||||
)}
|
||||
|
||||
{loading ? (
|
||||
<ChartSkeleton type="bar" height="default" withCard={false} />
|
||||
) : error ? (
|
||||
<DashboardErrorState error={`Failed to load payroll data: ${error}`} className="mx-0 my-0" />
|
||||
) : !hasData ? (
|
||||
<DashboardEmptyState
|
||||
icon={Clock}
|
||||
title="No payroll data available"
|
||||
description="Try selecting a different pay period"
|
||||
/>
|
||||
) : (
|
||||
<div className={`h-[280px] ${CARD_STYLES.base} rounded-lg p-0 relative`}>
|
||||
<ResponsiveContainer width="100%" height="100%">
|
||||
<BarChart data={chartData} margin={{ top: 20, right: 20, left: 20, bottom: 5 }}>
|
||||
<CartesianGrid strokeDasharray="3 3" className="stroke-muted" />
|
||||
<XAxis
|
||||
dataKey="label"
|
||||
className="text-xs text-muted-foreground"
|
||||
tick={{ fill: "currentColor" }}
|
||||
/>
|
||||
<YAxis
|
||||
tickFormatter={(value: number) => `${value}h`}
|
||||
className="text-xs text-muted-foreground"
|
||||
tick={{ fill: "currentColor" }}
|
||||
/>
|
||||
<Tooltip content={<PayrollTooltip />} />
|
||||
<Legend />
|
||||
<Bar
|
||||
dataKey="regular"
|
||||
name="Regular Hours"
|
||||
stackId="hours"
|
||||
fill={chartColors.regular}
|
||||
/>
|
||||
<Bar
|
||||
dataKey="overtime"
|
||||
name="Overtime"
|
||||
stackId="hours"
|
||||
fill={chartColors.overtime}
|
||||
>
|
||||
{chartData.map((entry, index) => (
|
||||
<Cell
|
||||
key={`cell-${index}`}
|
||||
fill={entry.overtime > 0 ? chartColors.overtime : chartColors.regular}
|
||||
/>
|
||||
))}
|
||||
</Bar>
|
||||
</BarChart>
|
||||
</ResponsiveContainer>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{!loading && !error && data?.byWeek && data.byWeek.some(w => w.overtime > 0) && (
|
||||
<div className="flex items-center gap-2 text-sm text-orange-600 dark:text-orange-400">
|
||||
<AlertTriangle className="h-4 w-4" />
|
||||
<span>
|
||||
Overtime detected: {formatHours(data.totals.overtimeHours)} total
|
||||
({data.byEmployee?.filter(e => e.overtimeHours > 0).length || 0} employees)
|
||||
</span>
|
||||
</div>
|
||||
)}
|
||||
</CardContent>
|
||||
</Card>
|
||||
);
|
||||
};
|
||||
|
||||
function formatWeekRange(start: string, end: string): string {
  const startDate = new Date(start + "T00:00:00");
  const endDate = new Date(end + "T00:00:00");

  const startStr = startDate.toLocaleDateString("en-US", { month: "short", day: "numeric" });
  const endStr = endDate.toLocaleDateString("en-US", { month: "short", day: "numeric" });

  return `${startStr} – ${endStr}`;
}

type PayrollStatCardConfig = {
|
||||
key: string;
|
||||
title: string;
|
||||
value: string;
|
||||
description?: string;
|
||||
trendValue?: number | null;
|
||||
trendInverted?: boolean;
|
||||
iconColor: "blue" | "orange" | "emerald" | "purple" | "cyan" | "amber";
|
||||
tooltip?: string;
|
||||
};
|
||||
|
||||
const ICON_MAP = {
|
||||
hours: Clock,
|
||||
overtime: AlertTriangle,
|
||||
fte: Users,
|
||||
avgHours: Clock,
|
||||
} as const;
|
||||
|
||||
function PayrollStatGrid({ cards }: { cards: PayrollStatCardConfig[] }) {
|
||||
return (
|
||||
<div className="grid grid-cols-2 lg:grid-cols-4 gap-4 py-4 w-full dashboard-stagger">
|
||||
{cards.map((card) => (
|
||||
<DashboardStatCard
|
||||
key={card.key}
|
||||
title={card.title}
|
||||
value={card.value}
|
||||
subtitle={card.description}
|
||||
trend={
|
||||
card.trendValue != null && Number.isFinite(card.trendValue)
|
||||
? {
|
||||
value: card.trendValue,
|
||||
moreIsBetter: !card.trendInverted,
|
||||
}
|
||||
: undefined
|
||||
}
|
||||
icon={ICON_MAP[card.key as keyof typeof ICON_MAP]}
|
||||
iconColor={card.iconColor}
|
||||
tooltip={card.tooltip}
|
||||
/>
|
||||
))}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
function SkeletonStats() {
|
||||
return (
|
||||
<div className="grid grid-cols-2 lg:grid-cols-4 gap-4 py-4 w-full">
|
||||
{Array.from({ length: 4 }).map((_, index) => (
|
||||
<DashboardStatCardSkeleton key={index} hasIcon hasSubtitle />
|
||||
))}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
const PayrollTooltip = ({ active, payload, label }: TooltipProps<number, string>) => {
|
||||
if (!active || !payload?.length) return null;
|
||||
|
||||
const regular = payload.find(p => p.dataKey === "regular")?.value as number | undefined;
|
||||
const overtime = payload.find(p => p.dataKey === "overtime")?.value as number | undefined;
|
||||
const total = (regular || 0) + (overtime || 0);
|
||||
|
||||
return (
|
||||
<div className={TOOLTIP_STYLES.container}>
|
||||
<p className={TOOLTIP_STYLES.header}>{label}</p>
|
||||
<div className={TOOLTIP_STYLES.content}>
|
||||
<div className={TOOLTIP_STYLES.row}>
|
||||
<div className={TOOLTIP_STYLES.rowLabel}>
|
||||
<span
|
||||
className={TOOLTIP_STYLES.dot}
|
||||
style={{ backgroundColor: chartColors.regular }}
|
||||
/>
|
||||
<span className={TOOLTIP_STYLES.name}>Regular Hours</span>
|
||||
</div>
|
||||
<span className={TOOLTIP_STYLES.value}>{formatHours(regular || 0)}</span>
|
||||
</div>
|
||||
{overtime != null && overtime > 0 && (
|
||||
<div className={TOOLTIP_STYLES.row}>
|
||||
<div className={TOOLTIP_STYLES.rowLabel}>
|
||||
<span
|
||||
className={TOOLTIP_STYLES.dot}
|
||||
style={{ backgroundColor: chartColors.overtime }}
|
||||
/>
|
||||
<span className={TOOLTIP_STYLES.name}>Overtime</span>
|
||||
</div>
|
||||
<span className={TOOLTIP_STYLES.value}>{formatHours(overtime)}</span>
|
||||
</div>
|
||||
)}
|
||||
<div className={`${TOOLTIP_STYLES.row} border-t border-border/50 pt-1 mt-1`}>
|
||||
<div className={TOOLTIP_STYLES.rowLabel}>
|
||||
<span className={TOOLTIP_STYLES.name}>Total</span>
|
||||
</div>
|
||||
<span className={`${TOOLTIP_STYLES.value} font-semibold`}>{formatHours(total)}</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
|
||||
export default PayrollMetrics;
|
||||
@@ -8,7 +8,7 @@ import { Input } from "@/components/ui/input";
|
||||
import { Label } from "@/components/ui/label";
|
||||
import { Select, SelectContent, SelectItem, SelectTrigger, SelectValue } from "@/components/ui/select";
|
||||
import { ScrollArea } from "@/components/ui/scroll-area";
|
||||
import { DiscountPromoOption, DiscountPromoType, ShippingPromoType, ShippingTierConfig, DiscountSimulationResponse, CogsCalculationMode } from "@/types/discount-simulator";
|
||||
import { DiscountPromoOption, DiscountPromoType, ShippingPromoType, ShippingTierConfig, SurchargeConfig, DiscountSimulationResponse, CogsCalculationMode } from "@/types/discount-simulator";
|
||||
import { formatNumber } from "@/utils/productUtils";
|
||||
import { PlusIcon, X } from "lucide-react";
|
||||
import { Skeleton } from "@/components/ui/skeleton";
|
||||
@@ -35,6 +35,8 @@ interface ConfigPanelProps {
|
||||
onShippingPromoChange: (update: Partial<ConfigPanelProps["shippingPromo"]>) => void;
|
||||
shippingTiers: ShippingTierConfig[];
|
||||
onShippingTiersChange: (tiers: ShippingTierConfig[]) => void;
|
||||
surcharges: SurchargeConfig[];
|
||||
onSurchargesChange: (surcharges: SurchargeConfig[]) => void;
|
||||
merchantFeePercent: number;
|
||||
onMerchantFeeChange: (value: number) => void;
|
||||
fixedCostPerOrder: number;
|
||||
@@ -43,6 +45,7 @@ interface ConfigPanelProps {
|
||||
onCogsCalculationModeChange: (mode: CogsCalculationMode) => void;
|
||||
pointsPerDollar: number;
|
||||
redemptionRate: number;
|
||||
onRedemptionRateChange: (value: number) => void;
|
||||
pointDollarValue: number;
|
||||
onPointDollarValueChange: (value: number) => void;
|
||||
onConfigInputChange: () => void;
|
||||
@@ -65,6 +68,7 @@ const formatPercent = (value: number) => {
|
||||
};
|
||||
|
||||
const generateTierId = () => `tier-${Date.now()}-${Math.random().toString(36).slice(2, 10)}`;
|
||||
const generateSurchargeId = () => `surcharge-${Date.now()}-${Math.random().toString(36).slice(2, 10)}`;
|
||||
|
||||
const parseDateToTimestamp = (value?: string | null): number | undefined => {
|
||||
if (!value) {
|
||||
@@ -101,6 +105,8 @@ export function ConfigPanel({
|
||||
onShippingPromoChange,
|
||||
shippingTiers,
|
||||
onShippingTiersChange,
|
||||
surcharges,
|
||||
onSurchargesChange,
|
||||
merchantFeePercent,
|
||||
onMerchantFeeChange,
|
||||
fixedCostPerOrder,
|
||||
@@ -109,6 +115,7 @@ export function ConfigPanel({
|
||||
onCogsCalculationModeChange,
|
||||
pointsPerDollar,
|
||||
redemptionRate,
|
||||
onRedemptionRateChange,
|
||||
pointDollarValue,
|
||||
onPointDollarValueChange,
|
||||
onConfigInputChange,
|
||||
@@ -235,6 +242,93 @@ export function ConfigPanel({
|
||||
handleFieldBlur();
|
||||
}, [sortShippingTiers, handleFieldBlur]);
|
||||
|
||||
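  // Surcharges may be supplied without ids; the effect below backfills a stable id for
  // each entry so React keys and index-based updates stay consistent.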
  // Surcharge handlers
  useEffect(() => {
    if (surcharges.length === 0) {
      return;
    }

    const surchargesMissingIds = surcharges.some((s) => !s.id);
    if (!surchargesMissingIds) {
      return;
    }

    const normalizedSurcharges = surcharges.map((s) =>
      s.id ? s : { ...s, id: generateSurchargeId() }
    );
    onSurchargesChange(normalizedSurcharges);
  }, [surcharges, onSurchargesChange]);

const handleSurchargeUpdate = (index: number, update: Partial<SurchargeConfig>) => {
|
||||
const items = [...surcharges];
|
||||
const current = items[index];
|
||||
if (!current) {
|
||||
return;
|
||||
}
|
||||
|
||||
const surchargeId = current.id ?? generateSurchargeId();
|
||||
const merged = {
|
||||
...current,
|
||||
...update,
|
||||
id: surchargeId,
|
||||
};
|
||||
|
||||
const normalized: SurchargeConfig = {
|
||||
...merged,
|
||||
threshold: Number.isFinite(merged.threshold) ? merged.threshold ?? 0 : 0,
|
||||
maxThreshold: Number.isFinite(merged.maxThreshold) && (merged.maxThreshold ?? 0) > 0 ? merged.maxThreshold : undefined,
|
||||
amount: Number.isFinite(merged.amount) ? merged.amount ?? 0 : 0,
|
||||
};
|
||||
|
||||
items[index] = normalized;
|
||||
onSurchargesChange(items);
|
||||
};
|
||||
|
||||
const handleSurchargeRemove = (index: number) => {
|
||||
onConfigInputChange();
|
||||
const items = surcharges.filter((_, i) => i !== index);
|
||||
onSurchargesChange(items);
|
||||
};
|
||||
|
||||
const handleSurchargeAdd = () => {
|
||||
onConfigInputChange();
|
||||
const lastThreshold = surcharges[surcharges.length - 1]?.threshold ?? 0;
|
||||
const items = [
|
||||
...surcharges,
|
||||
{
|
||||
threshold: lastThreshold,
|
||||
target: "shipping" as const,
|
||||
amount: 0,
|
||||
id: generateSurchargeId(),
|
||||
},
|
||||
];
|
||||
onSurchargesChange(items);
|
||||
};
|
||||
|
||||
const sortSurcharges = useCallback(() => {
|
||||
if (surcharges.length < 2) {
|
||||
return;
|
||||
}
|
||||
|
||||
const originalIds = surcharges.map((s) => s.id);
|
||||
const sorted = [...surcharges]
|
||||
.map((s) => ({
|
||||
...s,
|
||||
threshold: Number.isFinite(s.threshold) ? s.threshold : 0,
|
||||
amount: Number.isFinite(s.amount) ? s.amount : 0,
|
||||
}))
|
||||
.sort((a, b) => a.threshold - b.threshold);
|
||||
|
||||
const orderChanged = sorted.some((s, index) => s.id !== originalIds[index]);
|
||||
if (orderChanged) {
|
||||
onSurchargesChange(sorted);
|
||||
}
|
||||
}, [surcharges, onSurchargesChange]);
|
||||
|
||||
const handleSurchargeBlur = useCallback(() => {
|
||||
sortSurcharges();
|
||||
handleFieldBlur();
|
||||
}, [sortSurcharges, handleFieldBlur]);
|
||||
|
||||
const sectionTitleClass = "text-[0.65rem] font-semibold uppercase tracking-[0.18em] text-muted-foreground";
|
||||
const sectionBaseClass = "flex flex-col rounded-md border border-border/60 bg-muted/30 px-3 py-2.5";
|
||||
@@ -244,10 +338,10 @@ export function ConfigPanel({
|
||||
const fieldClass = "flex flex-col gap-1";
|
||||
const labelClass = "text-[0.65rem] uppercase tracking-wide text-muted-foreground";
|
||||
const fieldRowClass = "flex flex-col gap-2";
|
||||
const fieldRowHorizontalClass = "flex flex-col gap-2 sm:flex-row sm:items-end sm:gap-3";
|
||||
const compactTriggerClass = "h-8 px-2 text-xs";
|
||||
const compactNumberClass = "h-8 px-2 text-sm";
|
||||
const compactWideNumberClass = "h-8 px-2 text-sm";
|
||||
const fieldRowHorizontalClass = "flex flex-col gap-2 sm:flex-row sm:gap-3";
|
||||
const compactTriggerClass = "h-8 px-1.5 text-xs";
|
||||
const compactNumberClass = "h-8 px-1.5 text-sm";
|
||||
const compactWideNumberClass = "h-8 px-1.5 text-sm";
|
||||
const metricPillClass = "flex items-center gap-1 rounded border border-border/60 bg-background px-2 py-1 text-[0.68rem] font-medium text-foreground";
|
||||
const showProductAdjustments = productPromo.type !== "none";
|
||||
const showShippingAdjustments = shippingPromo.type !== "none";
|
||||
@@ -255,8 +349,8 @@ export function ConfigPanel({
|
||||
|
||||
return (
|
||||
<Card className="w-full">
|
||||
<CardContent className="flex flex-col gap-3 px-4 py-4">
|
||||
<div className="space-y-4">
|
||||
<CardContent className="flex flex-col gap-2 px-2 py-2">
|
||||
<div className="space-y-2">
|
||||
<section className={sectionClass}>
|
||||
<div className={fieldRowClass}>
|
||||
<div className={fieldClass}>
|
||||
@@ -487,7 +581,7 @@ export function ConfigPanel({
|
||||
return (
|
||||
<div
|
||||
key={tierKey}
|
||||
className="relative grid gap-2 rounded px-2 py-2 text-xs sm:grid-cols-[minmax(0,0.9fr)_minmax(0,1.1fr)_minmax(0,1fr)_auto] sm:items-end"
|
||||
className="relative grid gap-2 rounded px-2 py-0.5 text-xs sm:grid-cols-[minmax(0,0.9fr)_minmax(0,1.1fr)_minmax(0,1fr)_auto] sm:items-end"
|
||||
>
|
||||
<div>
|
||||
<Input
|
||||
@@ -553,6 +647,114 @@ export function ConfigPanel({
|
||||
)}
|
||||
</section>
|
||||
|
||||
<section className={compactSectionClass}>
|
||||
<div className={sectionHeaderClass}>
|
||||
<span className={sectionTitleClass}>Surcharges</span>
|
||||
<Button variant="outline" size="sm" onClick={handleSurchargeAdd} className="flex items-center gap-1">
|
||||
<PlusIcon className="w-3 h-3" />
|
||||
Add surcharge
|
||||
</Button>
|
||||
</div>
|
||||
{surcharges.length === 0 ? (
|
||||
<p className="text-xs text-muted-foreground">Add surcharges to model fees at different order values.</p>
|
||||
) : (
|
||||
<ScrollArea>
|
||||
<div className="flex flex-col gap-2 pr-1 -mx-2">
|
||||
<div className="grid gap-2 px-2 py-1 text-[0.65rem] font-medium uppercase tracking-[0.18em] text-muted-foreground sm:grid-cols-[minmax(0,1fr)_minmax(0,1fr)_minmax(0,1.2fr)_minmax(0,0.8fr)_auto]">
|
||||
<div>Min</div>
|
||||
<div>Max</div>
|
||||
<div>Add To</div>
|
||||
<div>Amount</div>
|
||||
<div className="w-1.5" aria-hidden="true" />
|
||||
</div>
|
||||
{surcharges.map((surcharge, index) => {
|
||||
const surchargeKey = surcharge.id ?? `surcharge-${index}`;
|
||||
return (
|
||||
<div
|
||||
key={surchargeKey}
|
||||
className="relative grid gap-2 rounded px-2 py-2 text-xs sm:grid-cols-[minmax(0,0.9fr)_minmax(0,0.9fr)_minmax(0,1fr)_minmax(0,0.9fr)_auto] sm:items-end"
|
||||
>
|
||||
<div>
|
||||
<Input
|
||||
className={compactNumberClass}
|
||||
type="number"
|
||||
step="1"
|
||||
value={surcharge.threshold}
|
||||
onChange={(event) => {
|
||||
onConfigInputChange();
|
||||
handleSurchargeUpdate(index, {
|
||||
threshold: parseNumber(event.target.value, 0),
|
||||
});
|
||||
}}
|
||||
onBlur={handleSurchargeBlur}
|
||||
/>
|
||||
</div>
|
||||
<div>
|
||||
<Input
|
||||
className={compactNumberClass}
|
||||
type="number"
|
||||
step="1"
|
||||
placeholder="∞"
|
||||
value={surcharge.maxThreshold ?? ''}
|
||||
onChange={(event) => {
|
||||
onConfigInputChange();
|
||||
const val = event.target.value;
|
||||
handleSurchargeUpdate(index, {
|
||||
maxThreshold: val === '' ? undefined : parseNumber(val, 0),
|
||||
});
|
||||
}}
|
||||
onBlur={handleSurchargeBlur}
|
||||
/>
|
||||
</div>
|
||||
<div>
|
||||
<Select
|
||||
value={surcharge.target}
|
||||
onValueChange={(value) => {
|
||||
onConfigInputChange();
|
||||
handleSurchargeUpdate(index, { target: value as SurchargeConfig["target"] });
|
||||
}}
|
||||
>
|
||||
<SelectTrigger className={`${compactTriggerClass} w-full`}>
|
||||
<SelectValue />
|
||||
</SelectTrigger>
|
||||
<SelectContent>
|
||||
<SelectItem value="shipping">Shipping</SelectItem>
|
||||
<SelectItem value="order">Order</SelectItem>
|
||||
</SelectContent>
|
||||
</Select>
|
||||
</div>
|
||||
<div>
|
||||
<Input
|
||||
className={compactNumberClass}
|
||||
type="number"
|
||||
step="0.01"
|
||||
value={surcharge.amount}
|
||||
onChange={(event) => {
|
||||
onConfigInputChange();
|
||||
handleSurchargeUpdate(index, { amount: parseNumber(event.target.value, 0) });
|
||||
}}
|
||||
onBlur={handleSurchargeBlur}
|
||||
/>
|
||||
</div>
|
||||
<div className="w-1.5" aria-hidden="true" />
|
||||
<div className="absolute -right-0.5 top-1/2 -translate-y-1/2 flex justify-end">
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
onClick={() => handleSurchargeRemove(index)}
|
||||
className="p-1"
|
||||
>
|
||||
<X className="h-3 w-3" />
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
</ScrollArea>
|
||||
)}
|
||||
</section>
|
||||
|
||||
<section className={sectionClass}>
|
||||
<div className={sectionHeaderClass}>
|
||||
<span className={sectionTitleClass}>Order costs</span>
|
||||
@@ -614,14 +816,24 @@ export function ConfigPanel({
|
||||
<span className={sectionTitleClass}>Rewards points</span>
|
||||
</div>
|
||||
<div className={fieldRowClass}>
|
||||
<div className="grid gap-3 sm:grid-cols-2">
|
||||
<div className={fieldRowHorizontalClass}>
|
||||
<div className="flex flex-col gap-1.5">
|
||||
<span className={labelClass}>Points per $</span>
|
||||
<span className="text-sm font-medium">{Number.isFinite(pointsPerDollar) ? pointsPerDollar.toFixed(4) : '—'}</span>
|
||||
<span className="text-sm font-medium mt-1">{Number.isFinite(pointsPerDollar) ? pointsPerDollar.toFixed(4) : '—'}</span>
|
||||
</div>
|
||||
<div className="flex flex-col gap-1.5">
|
||||
<span className={labelClass}>Redemption rate</span>
|
||||
<span className="text-sm font-medium">{formatPercent(redemptionRate)}</span>
|
||||
<div className={fieldClass}>
|
||||
<Label className={labelClass}>Redemption rate (%)</Label>
|
||||
<Input
|
||||
className={compactNumberClass}
|
||||
type="number"
|
||||
step="1"
|
||||
value={Math.round(redemptionRate * 100)}
|
||||
onChange={(event) => {
|
||||
onConfigInputChange();
|
||||
onRedemptionRateChange(parseNumber(event.target.value, 90) / 100);
|
||||
}}
|
||||
onBlur={handleFieldBlur}
|
||||
/>
|
||||
</div>
|
||||
</div>
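
The redemption rate is stored as a fraction but edited as a whole percent, so the input above multiplies by 100 for display and divides by 100 on change. A minimal sketch of that round trip (`parseNumber` is the same assumed helper sketched after the surcharge section; 90 mirrors the fallback passed in the handler):

```typescript
// Stored value: fraction (e.g. 0.9). Displayed value: whole percent (e.g. 90).
const toDisplay = (redemptionRate: number): number => Math.round(redemptionRate * 100);
const fromInput = (raw: string): number => parseNumber(raw, 90) / 100;

toDisplay(0.9);  // 90
fromInput("85"); // 0.85
fromInput("");   // 0.9 (falls back to 90%)
```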
|
||||
<div className={fieldClass}>
|
||||
|
||||
@@ -1,11 +1,10 @@
|
||||
import { useCallback, useEffect, useMemo, useState, useRef } from "react";
|
||||
import { useCallback, useContext, useEffect, useMemo, useState } from "react";
|
||||
import { Check, ChevronsUpDown, Loader2 } from "lucide-react";
|
||||
import { toast } from "sonner";
|
||||
|
||||
import {
|
||||
Dialog,
|
||||
DialogContent,
|
||||
DialogDescription,
|
||||
DialogFooter,
|
||||
DialogHeader,
|
||||
DialogTitle,
|
||||
@@ -14,6 +13,7 @@ import {
|
||||
import { Button } from "@/components/ui/button";
|
||||
import { Input } from "@/components/ui/input";
|
||||
import { Label } from "@/components/ui/label";
|
||||
import { Switch } from "@/components/ui/switch";
|
||||
import {
|
||||
Command,
|
||||
CommandEmpty,
|
||||
@@ -27,9 +27,10 @@ import {
|
||||
PopoverContent,
|
||||
PopoverTrigger,
|
||||
} from "@/components/ui/popover";
|
||||
import { cn } from "@/lib/utils";
|
||||
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
|
||||
import config from "@/config";
|
||||
import { createProductCategory, type CreateProductCategoryResponse } from "@/services/apiv2";
|
||||
import { AuthContext } from "@/contexts/AuthContext";
|
||||
|
||||
type Option = {
|
||||
label: string;
|
||||
@@ -84,7 +85,11 @@ export function CreateProductCategoryDialog({
|
||||
environment = "prod",
|
||||
onCreated,
|
||||
}: CreateProductCategoryDialogProps) {
|
||||
const { user } = useContext(AuthContext);
|
||||
const hasDebugPermission = Boolean(user?.is_admin || user?.permissions?.includes("admin:debug"));
|
||||
|
||||
const [isOpen, setIsOpen] = useState(false);
|
||||
const [activeTab, setActiveTab] = useState<"line" | "subline">(defaultLineId ? "subline" : "line");
|
||||
const [companyId, setCompanyId] = useState<string>(defaultCompanyId ?? "");
|
||||
const [lineId, setLineId] = useState<string>(defaultLineId ?? "");
|
||||
const [categoryName, setCategoryName] = useState("");
|
||||
@@ -92,7 +97,8 @@ export function CreateProductCategoryDialog({
|
||||
const [isLoadingLines, setIsLoadingLines] = useState(false);
|
||||
const [lines, setLines] = useState<Option[]>([]);
|
||||
const [linesCache, setLinesCache] = useState<Record<string, Option[]>>({});
|
||||
|
||||
const [targetEnvironment, setTargetEnvironment] = useState<"dev" | "prod">(environment);
|
||||
|
||||
// Popover open states
|
||||
const [companyOpen, setCompanyOpen] = useState(false);
|
||||
const [lineOpen, setLineOpen] = useState(false);
|
||||
@@ -116,6 +122,7 @@ export function CreateProductCategoryDialog({
|
||||
setCompanyId(defaultCompanyId ?? "");
|
||||
setLineId(defaultLineId ?? "");
|
||||
setCategoryName("");
|
||||
setActiveTab(defaultLineId ? "subline" : "line");
|
||||
}
|
||||
}, [isOpen, defaultCompanyId, defaultLineId]);
|
||||
|
||||
@@ -180,8 +187,13 @@ export function CreateProductCategoryDialog({
|
||||
return;
|
||||
}
|
||||
|
||||
const parentId = lineId || companyId;
|
||||
const creationType: "line" | "subline" = lineId ? "subline" : "line";
|
||||
if (activeTab === "subline" && !lineId) {
|
||||
toast.error("Select a parent line to create a subline");
|
||||
return;
|
||||
}
|
||||
|
||||
const parentId = activeTab === "subline" ? lineId : companyId;
|
||||
const creationType = activeTab;
|
||||
|
||||
setIsSubmitting(true);
|
||||
|
||||
@@ -189,7 +201,7 @@ export function CreateProductCategoryDialog({
|
||||
const result = await createProductCategory({
|
||||
masterCatId: parentId,
|
||||
name: trimmedName,
|
||||
environment,
|
||||
environment: targetEnvironment,
|
||||
});
|
||||
|
||||
if (!result.success) {
|
||||
@@ -211,7 +223,7 @@ export function CreateProductCategoryDialog({
|
||||
}
|
||||
}
|
||||
|
||||
if (!lineId) {
|
||||
if (activeTab === "line") {
|
||||
const nextOption: Option = { label: trimmedName, value: newId ?? trimmedName };
|
||||
setLinesCache((prev) => {
|
||||
const existing = prev[companyId] ?? [];
|
||||
@@ -243,13 +255,13 @@ export function CreateProductCategoryDialog({
|
||||
const message =
|
||||
error instanceof Error
|
||||
? error.message
|
||||
: `Failed to create ${lineId ? "subline" : "product line"}.`;
|
||||
: `Failed to create ${activeTab === "line" ? "product line" : "subline"}.`;
|
||||
toast.error(message);
|
||||
} finally {
|
||||
setIsSubmitting(false);
|
||||
}
|
||||
},
|
||||
[categoryName, companyId, environment, lineId, onCreated],
|
||||
[activeTab, categoryName, companyId, targetEnvironment, lineId, onCreated],
|
||||
);
|
||||
|
||||
return (
|
||||
@@ -257,149 +269,172 @@ export function CreateProductCategoryDialog({
|
||||
<DialogTrigger asChild>{trigger}</DialogTrigger>
|
||||
<DialogContent>
|
||||
<DialogHeader>
|
||||
<DialogTitle>Create Product Line or Subline</DialogTitle>
|
||||
<DialogDescription>
|
||||
Add a new product line beneath a company or create a subline beneath an existing line.
|
||||
</DialogDescription>
|
||||
<DialogTitle>Create New Line or Subline</DialogTitle>
|
||||
</DialogHeader>
|
||||
|
||||
<form onSubmit={handleSubmit} className="space-y-4">
|
||||
{/* Company Select - Searchable */}
|
||||
<div className="space-y-2">
|
||||
<Label>Company</Label>
|
||||
<Popover open={companyOpen} onOpenChange={setCompanyOpen}>
|
||||
<PopoverTrigger asChild>
|
||||
<Button
|
||||
variant="outline"
|
||||
role="combobox"
|
||||
aria-expanded={companyOpen}
|
||||
className="w-full justify-between font-normal"
|
||||
>
|
||||
{selectedCompanyLabel || "Select a company"}
|
||||
<ChevronsUpDown className="ml-2 h-4 w-4 shrink-0 opacity-50" />
|
||||
</Button>
|
||||
</PopoverTrigger>
|
||||
<PopoverContent className="w-[--radix-popover-trigger-width] p-0" align="start">
|
||||
<Command>
|
||||
<CommandInput placeholder="Search companies..." />
|
||||
<CommandList>
|
||||
<CommandEmpty>No company found.</CommandEmpty>
|
||||
<CommandGroup>
|
||||
{companyOptions.map((company) => (
|
||||
<CommandItem
|
||||
key={company.value}
|
||||
value={company.label}
|
||||
onSelect={() => {
|
||||
setCompanyId(company.value);
|
||||
setLineId("");
|
||||
setCompanyOpen(false);
|
||||
}}
|
||||
>
|
||||
{company.label}
|
||||
{company.value === companyId && (
|
||||
<Check className="ml-auto h-4 w-4" />
|
||||
)}
|
||||
</CommandItem>
|
||||
))}
|
||||
</CommandGroup>
|
||||
</CommandList>
|
||||
</Command>
|
||||
</PopoverContent>
|
||||
</Popover>
|
||||
</div>
|
||||
<Tabs value={activeTab} onValueChange={(value) => setActiveTab(value as "line" | "subline")}>
|
||||
<TabsList className="grid w-full grid-cols-2">
|
||||
<TabsTrigger value="line">Create Line</TabsTrigger>
|
||||
<TabsTrigger value="subline">Create Subline</TabsTrigger>
|
||||
</TabsList>
|
||||
|
||||
{/* Line Select - Searchable */}
|
||||
<div className="space-y-2">
|
||||
<Label>
|
||||
Parent Line <span className="text-muted-foreground">(optional)</span>
|
||||
</Label>
|
||||
<Popover open={lineOpen} onOpenChange={setLineOpen}>
|
||||
<PopoverTrigger asChild>
|
||||
<Button
|
||||
variant="outline"
|
||||
role="combobox"
|
||||
aria-expanded={lineOpen}
|
||||
className="w-full justify-between font-normal"
|
||||
disabled={!companyId || isLoadingLines}
|
||||
>
|
||||
{!companyId
|
||||
? "Select a company first"
|
||||
: isLoadingLines
|
||||
? "Loading product lines..."
|
||||
: selectedLineLabel || "Leave empty to create a new line"}
|
||||
<ChevronsUpDown className="ml-2 h-4 w-4 shrink-0 opacity-50" />
|
||||
</Button>
|
||||
</PopoverTrigger>
|
||||
<PopoverContent className="w-[--radix-popover-trigger-width] p-0" align="start">
|
||||
<Command>
|
||||
<CommandInput placeholder="Search lines..." />
|
||||
<CommandList>
|
||||
<CommandEmpty>No line found.</CommandEmpty>
|
||||
<CommandGroup>
|
||||
{/* Option to clear selection */}
|
||||
<CommandItem
|
||||
value="none"
|
||||
onSelect={() => {
|
||||
setLineId("");
|
||||
setLineOpen(false);
|
||||
}}
|
||||
>
|
||||
<span className="text-muted-foreground">None (create new line)</span>
|
||||
{lineId === "" && <Check className="ml-auto h-4 w-4" />}
|
||||
</CommandItem>
|
||||
{lines.map((line) => (
|
||||
<CommandItem
|
||||
key={line.value}
|
||||
value={line.label}
|
||||
onSelect={() => {
|
||||
setLineId(line.value);
|
||||
setLineOpen(false);
|
||||
}}
|
||||
>
|
||||
{line.label}
|
||||
{line.value === lineId && (
|
||||
<Check className="ml-auto h-4 w-4" />
|
||||
)}
|
||||
</CommandItem>
|
||||
))}
|
||||
</CommandGroup>
|
||||
</CommandList>
|
||||
</Command>
|
||||
</PopoverContent>
|
||||
</Popover>
|
||||
{companyId && !isLoadingLines && !lines.length && (
|
||||
<p className="text-xs text-muted-foreground">
|
||||
No existing lines found for this company. A new line will be created.
|
||||
</p>
|
||||
<form onSubmit={handleSubmit} className="space-y-4 pt-4">
|
||||
{/* Company Select - shown in both tabs */}
|
||||
<div className="space-y-2">
|
||||
<Label>Company</Label>
|
||||
<Popover open={companyOpen} onOpenChange={setCompanyOpen}>
|
||||
<PopoverTrigger asChild>
|
||||
<Button
|
||||
variant="outline"
|
||||
role="combobox"
|
||||
aria-expanded={companyOpen}
|
||||
className="w-full justify-between font-normal"
|
||||
>
|
||||
{selectedCompanyLabel || "Select a company"}
|
||||
<ChevronsUpDown className="ml-2 h-4 w-4 shrink-0 opacity-50" />
|
||||
</Button>
|
||||
</PopoverTrigger>
|
||||
<PopoverContent className="w-[--radix-popover-trigger-width] p-0" align="start">
|
||||
<Command>
|
||||
<CommandInput placeholder="Search companies..." />
|
||||
<CommandList>
|
||||
<CommandEmpty>No company found.</CommandEmpty>
|
||||
<CommandGroup>
|
||||
{companyOptions.map((company) => (
|
||||
<CommandItem
|
||||
key={company.value}
|
||||
value={company.label}
|
||||
onSelect={() => {
|
||||
setCompanyId(company.value);
|
||||
setLineId("");
|
||||
setCompanyOpen(false);
|
||||
}}
|
||||
>
|
||||
{company.label}
|
||||
{company.value === companyId && (
|
||||
<Check className="ml-auto h-4 w-4" />
|
||||
)}
|
||||
</CommandItem>
|
||||
))}
|
||||
</CommandGroup>
|
||||
</CommandList>
|
||||
</Command>
|
||||
</PopoverContent>
|
||||
</Popover>
|
||||
</div>
|
||||
|
||||
<TabsContent value="line" className="space-y-4 m-0">
|
||||
<div className="space-y-2">
|
||||
<Label htmlFor="create-line-name">Line Name</Label>
|
||||
<Input
|
||||
id="create-line-name"
|
||||
value={categoryName}
|
||||
onChange={(event) => setCategoryName(event.target.value)}
|
||||
placeholder="Enter the new line name"
|
||||
/>
|
||||
</div>
|
||||
</TabsContent>
|
||||
|
||||
<TabsContent value="subline" className="space-y-4 m-0">
|
||||
{/* Parent Line Select - only shown in subline tab */}
|
||||
<div className="space-y-2">
|
||||
<Label>Parent Line</Label>
|
||||
<Popover open={lineOpen} onOpenChange={setLineOpen}>
|
||||
<PopoverTrigger asChild>
|
||||
<Button
|
||||
variant="outline"
|
||||
role="combobox"
|
||||
aria-expanded={lineOpen}
|
||||
className="w-full justify-between font-normal"
|
||||
disabled={!companyId || isLoadingLines}
|
||||
>
|
||||
{!companyId
|
||||
? "Select a company first"
|
||||
: isLoadingLines
|
||||
? "Loading product lines..."
|
||||
: selectedLineLabel || "Select a parent line"}
|
||||
<ChevronsUpDown className="ml-2 h-4 w-4 shrink-0 opacity-50" />
|
||||
</Button>
|
||||
</PopoverTrigger>
|
||||
<PopoverContent className="w-[--radix-popover-trigger-width] p-0" align="start">
|
||||
<Command>
|
||||
<CommandInput placeholder="Search lines..." />
|
||||
<CommandList>
|
||||
<CommandEmpty>No line found.</CommandEmpty>
|
||||
<CommandGroup>
|
||||
{lines.map((line) => (
|
||||
<CommandItem
|
||||
key={line.value}
|
||||
value={line.label}
|
||||
onSelect={() => {
|
||||
setLineId(line.value);
|
||||
setLineOpen(false);
|
||||
}}
|
||||
>
|
||||
{line.label}
|
||||
{line.value === lineId && (
|
||||
<Check className="ml-auto h-4 w-4" />
|
||||
)}
|
||||
</CommandItem>
|
||||
))}
|
||||
</CommandGroup>
|
||||
</CommandList>
|
||||
</Command>
|
||||
</PopoverContent>
|
||||
</Popover>
|
||||
{companyId && !isLoadingLines && !lines.length && (
|
||||
<p className="text-xs text-muted-foreground">
|
||||
No existing lines found for this company. Create a line first.
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="space-y-2">
|
||||
<Label htmlFor="create-subline-name">Subline Name</Label>
|
||||
<Input
|
||||
id="create-subline-name"
|
||||
value={categoryName}
|
||||
onChange={(event) => setCategoryName(event.target.value)}
|
||||
placeholder="Enter the new subline name"
|
||||
/>
|
||||
</div>
|
||||
</TabsContent>
|
||||
|
||||
{hasDebugPermission && (
|
||||
<div className="pt-2 pb-2 border-t">
|
||||
<div className="flex items-center gap-2">
|
||||
<Switch
|
||||
id="category-api-environment"
|
||||
checked={targetEnvironment === "dev"}
|
||||
onCheckedChange={(checked) => setTargetEnvironment(checked ? "dev" : "prod")}
|
||||
/>
|
||||
<Label htmlFor="category-api-environment" className="text-sm font-medium cursor-pointer">
|
||||
Use test API
|
||||
</Label>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="space-y-2">
|
||||
<Label htmlFor="create-category-name">Name</Label>
|
||||
<Input
|
||||
id="create-category-name"
|
||||
value={categoryName}
|
||||
onChange={(event) => setCategoryName(event.target.value)}
|
||||
placeholder="Enter the new line or subline name"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<DialogFooter className="gap-2">
|
||||
<Button type="button" variant="outline" onClick={() => setIsOpen(false)}>
|
||||
Cancel
|
||||
</Button>
|
||||
<Button type="submit" disabled={isSubmitting || !companyId}>
|
||||
{isSubmitting ? (
|
||||
<>
|
||||
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
|
||||
Creating...
|
||||
</>
|
||||
) : (
|
||||
"Create"
|
||||
)}
|
||||
</Button>
|
||||
</DialogFooter>
|
||||
</form>
|
||||
<DialogFooter className="gap-2">
|
||||
<Button type="button" variant="outline" onClick={() => setIsOpen(false)}>
|
||||
Cancel
|
||||
</Button>
|
||||
<Button
|
||||
type="submit"
|
||||
disabled={isSubmitting || !companyId || (activeTab === "subline" && !lineId)}
|
||||
>
|
||||
{isSubmitting ? (
|
||||
<>
|
||||
<Loader2 className="mr-2 h-4 w-4 animate-spin" />
|
||||
Creating...
|
||||
</>
|
||||
) : (
|
||||
`Create ${activeTab === "line" ? "Line" : "Subline"}`
|
||||
)}
|
||||
</Button>
|
||||
</DialogFooter>
|
||||
</form>
|
||||
</Tabs>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
);
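
A hedged usage sketch of the reworked dialog, based only on the props visible in this diff (`trigger`, `companies`, `defaultCompanyId`, `defaultLineId`, `environment`, `onCreated`); the handler names are illustrative and the `onCreated` payload is not shown here:

```tsx
// Usage sketch; Option entries are { label, value } as defined in this file.
<CreateProductCategoryDialog
  trigger={<Button variant="outline" size="sm">New line or subline</Button>}
  companies={companies}
  defaultCompanyId={selectedCompanyId}
  defaultLineId={selectedLineId}
  environment="prod"
  onCreated={handleCategoryCreated} // refresh taxonomy options after creation
/>
```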
@@ -93,9 +93,8 @@ export const BASE_IMPORT_FIELDS = [
|
||||
description: "Internal notions number",
|
||||
alternateMatches: ["notions #","nmc"],
|
||||
fieldType: { type: "input" },
|
||||
width: 100,
|
||||
width: 110,
|
||||
validations: [
|
||||
{ rule: "required", errorMessage: "Required", level: "error" },
|
||||
{ rule: "unique", errorMessage: "Must be unique", level: "error" },
|
||||
{ rule: "regex", value: "^[0-9]+$", errorMessage: "Must be a number", level: "error" },
|
||||
],
|
||||
@@ -106,7 +105,7 @@ export const BASE_IMPORT_FIELDS = [
|
||||
description: "Product name/title",
|
||||
alternateMatches: ["sku description","product name"],
|
||||
fieldType: { type: "input" },
|
||||
width: 500,
|
||||
width: 400,
|
||||
validations: [
|
||||
{ rule: "required", errorMessage: "Required", level: "error" },
|
||||
{ rule: "unique", errorMessage: "Must be unique", level: "error" },
|
||||
@@ -121,7 +120,7 @@ export const BASE_IMPORT_FIELDS = [
|
||||
type: "input",
|
||||
price: true
|
||||
},
|
||||
width: 100,
|
||||
width: 110,
|
||||
validations: [
|
||||
{ rule: "required", errorMessage: "Required", level: "error" },
|
||||
{ rule: "regex", value: "^[0-9]*.?[0-9]+$", errorMessage: "Must be a number", level: "error" },
|
||||
@@ -133,7 +132,7 @@ export const BASE_IMPORT_FIELDS = [
|
||||
description: "Quantity of items per individual unit",
|
||||
alternateMatches: ["inner pack", "inner", "min qty", "unit qty", "min. order qty", "supplier qty/unit"],
|
||||
fieldType: { type: "input" },
|
||||
width: 80,
|
||||
width: 100,
|
||||
validations: [
|
||||
{ rule: "required", errorMessage: "Required", level: "error" },
|
||||
{ rule: "regex", value: "^[0-9]+$", errorMessage: "Must be a number", level: "error" },
|
||||
@@ -143,12 +142,12 @@ export const BASE_IMPORT_FIELDS = [
|
||||
label: "Cost Each",
|
||||
key: "cost_each",
|
||||
description: "Wholesale cost per unit",
|
||||
alternateMatches: ["wholesale", "wholesale price", "supplier cost each", "cost each"],
|
||||
alternateMatches: ["wholesale", "wholesale price", "supplier cost each", "cost each","whls"],
|
||||
fieldType: {
|
||||
type: "input",
|
||||
price: true
|
||||
},
|
||||
width: 100,
|
||||
width: 120,
|
||||
validations: [
|
||||
{ rule: "required", errorMessage: "Required", level: "error" },
|
||||
{ rule: "regex", value: "^[0-9]*.?[0-9]+$", errorMessage: "Must be a number", level: "error" },
|
||||
@@ -158,7 +157,7 @@ export const BASE_IMPORT_FIELDS = [
|
||||
label: "Case Pack",
|
||||
key: "case_qty",
|
||||
description: "Number of units per case",
|
||||
alternateMatches: ["mc qty","case qty","case pack","box ct"],
|
||||
alternateMatches: ["mc qty","case qty","case pack","box ct","master"],
|
||||
fieldType: { type: "input" },
|
||||
width: 100,
|
||||
validations: [
|
||||
@@ -250,11 +249,11 @@ export const BASE_IMPORT_FIELDS = [
|
||||
width: 190,
|
||||
validations: [{ rule: "required", errorMessage: "Required", level: "error" }],
|
||||
},
|
||||
{
|
||||
{
|
||||
label: "COO",
|
||||
key: "coo",
|
||||
description: "2-letter country code (ISO)",
|
||||
alternateMatches: ["coo", "country of origin"],
|
||||
alternateMatches: ["coo", "country of origin", "origin"],
|
||||
fieldType: { type: "input" },
|
||||
width: 70,
|
||||
validations: [
|
||||
@@ -265,7 +264,7 @@ export const BASE_IMPORT_FIELDS = [
|
||||
label: "HTS Code",
|
||||
key: "hts_code",
|
||||
description: "Harmonized Tariff Schedule code",
|
||||
alternateMatches: ["taric","hts"],
|
||||
alternateMatches: ["taric","hts","hs code","hs code (commodity code)"],
|
||||
fieldType: { type: "input" },
|
||||
width: 130,
|
||||
validations: [
|
||||
@@ -286,7 +285,7 @@ export const BASE_IMPORT_FIELDS = [
|
||||
label: "Description",
|
||||
key: "description",
|
||||
description: "Detailed product description",
|
||||
alternateMatches: ["details/description"],
|
||||
alternateMatches: ["details/description","description of item"],
|
||||
fieldType: {
|
||||
type: "input",
|
||||
multiline: true
|
||||
@@ -312,7 +311,7 @@ export const BASE_IMPORT_FIELDS = [
|
||||
type: "multi-select",
|
||||
options: [], // Will be populated from API
|
||||
},
|
||||
width: 350,
|
||||
width: 400,
|
||||
validations: [{ rule: "required", errorMessage: "Required", level: "error" }],
|
||||
},
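
Each entry in `BASE_IMPORT_FIELDS` follows the same shape as the fields tweaked above (label, key, description, alternateMatches, fieldType, width, validations). For illustration only, a hypothetical extra field in that shape; it is not part of this change:

```typescript
// Hypothetical field following the BASE_IMPORT_FIELDS shape shown above.
{
  label: "Weight (lbs)",
  key: "weight_lbs",
  description: "Unit weight in pounds",
  alternateMatches: ["weight", "unit weight", "wt"],
  fieldType: { type: "input" },
  width: 100,
  validations: [
    { rule: "regex", value: "^[0-9]*\\.?[0-9]+$", errorMessage: "Must be a number", level: "error" },
  ],
},
```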
|
||||
{
|
||||
|
||||
@@ -47,6 +47,7 @@ export const ImageUploadStep = ({
|
||||
const hasDebugPermission = Boolean(user?.is_admin || user?.permissions?.includes("admin:debug"));
|
||||
const [targetEnvironment, setTargetEnvironment] = useState<SubmitOptions["targetEnvironment"]>("prod");
|
||||
const [useTestDataSource, setUseTestDataSource] = useState<boolean>(false);
|
||||
const [skipApiSubmission, setSkipApiSubmission] = useState<boolean>(false);
|
||||
|
||||
// Use our hook for product images initialization
|
||||
const { productImages, setProductImages, getFullImageUrl } = useProductImagesInit(data);
|
||||
@@ -177,6 +178,7 @@ export const ImageUploadStep = ({
|
||||
const submitOptions: SubmitOptions = {
|
||||
targetEnvironment,
|
||||
useTestDataSource,
|
||||
skipApiSubmission,
|
||||
};
|
||||
|
||||
await onSubmit(updatedData, file, submitOptions);
|
||||
@@ -186,7 +188,7 @@ export const ImageUploadStep = ({
|
||||
} finally {
|
||||
setIsSubmitting(false);
|
||||
}
|
||||
}, [data, file, onSubmit, productImages, targetEnvironment, useTestDataSource]);
|
||||
}, [data, file, onSubmit, productImages, targetEnvironment, useTestDataSource, skipApiSubmission]);
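
The submit callback now forwards three debug switches. From the way they are used in this file, `SubmitOptions` is presumably shaped roughly like this (an inference; the real type lives elsewhere):

```typescript
// Inferred from usage in ImageUploadStep; the actual definition is not in this diff.
type SubmitOptions = {
  targetEnvironment: "dev" | "prod"; // "Use test API" toggle
  useTestDataSource: boolean;        // "Use test database" toggle
  skipApiSubmission: boolean;        // "Skip API (Debug)" toggle, new in this change
};
```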
|
||||
|
||||
return (
|
||||
<div className="flex flex-col h-[calc(100vh-9.5rem)] overflow-hidden">
|
||||
@@ -297,27 +299,43 @@ export const ImageUploadStep = ({
|
||||
<div className="flex flex-1 flex-wrap items-center justify-end gap-6">
|
||||
{hasDebugPermission && (
|
||||
<div className="flex gap-4 text-sm">
|
||||
{!skipApiSubmission && (
|
||||
<>
|
||||
<div className="flex items-center gap-1">
|
||||
<Switch
|
||||
id="product-import-api-environment"
|
||||
checked={targetEnvironment === "dev"}
|
||||
onCheckedChange={(checked) => setTargetEnvironment(checked ? "dev" : "prod")}
|
||||
/>
|
||||
<div>
|
||||
<Label htmlFor="product-import-api-environment" className="text-sm font-medium">
|
||||
Use test API
|
||||
</Label>
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex items-center gap-1">
|
||||
<Switch
|
||||
id="product-import-api-test-data"
|
||||
checked={useTestDataSource}
|
||||
onCheckedChange={(checked) => setUseTestDataSource(checked)}
|
||||
/>
|
||||
<div>
|
||||
<Label htmlFor="product-import-api-test-data" className="text-sm font-medium">
|
||||
Use test database
|
||||
</Label>
|
||||
</div>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
<div className="flex items-center gap-1">
|
||||
<Switch
|
||||
id="product-import-api-environment"
|
||||
checked={targetEnvironment === "dev"}
|
||||
onCheckedChange={(checked) => setTargetEnvironment(checked ? "dev" : "prod")}
|
||||
id="product-import-skip-api"
|
||||
checked={skipApiSubmission}
|
||||
onCheckedChange={(checked) => setSkipApiSubmission(checked)}
|
||||
/>
|
||||
<div>
|
||||
<Label htmlFor="product-import-api-environment" className="text-sm font-medium">
|
||||
Use test API
|
||||
</Label>
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex items-center gap-1">
|
||||
<Switch
|
||||
id="product-import-api-test-data"
|
||||
checked={useTestDataSource}
|
||||
onCheckedChange={(checked) => setUseTestDataSource(checked)}
|
||||
/>
|
||||
<div>
|
||||
<Label htmlFor="product-import-api-test-data" className="text-sm font-medium">
|
||||
Use test database
|
||||
<Label htmlFor="product-import-skip-api" className="text-sm font-medium text-amber-600">
|
||||
Skip API (Debug)
|
||||
</Label>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -29,7 +29,7 @@ import {
|
||||
import { useQuery, useQueryClient } from "@tanstack/react-query"
|
||||
import config from "@/config"
|
||||
import { Button } from "@/components/ui/button"
|
||||
import { CheckCircle2, AlertCircle, EyeIcon, EyeOffIcon, ArrowRightIcon, XIcon, FileSpreadsheetIcon, LinkIcon, CheckIcon, ChevronsUpDown, Bot } from "lucide-react"
|
||||
import { CheckCircle2, AlertCircle, EyeIcon, EyeOffIcon, ArrowRightIcon, XIcon, FileSpreadsheetIcon, LinkIcon, CheckIcon, ChevronsUpDown, Bot, RefreshCw, Plus } from "lucide-react"
|
||||
import { Separator } from "@/components/ui/separator"
|
||||
import { Tooltip, TooltipContent, TooltipProvider, TooltipTrigger } from "@/components/ui/tooltip"
|
||||
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table"
|
||||
@@ -147,17 +147,17 @@ const MemoizedColumnSamplePreview = React.memo(function ColumnSamplePreview({ sa
|
||||
<FileSpreadsheetIcon className="h-4 w-4 text-muted-foreground" />
|
||||
</Button>
|
||||
</PopoverTrigger>
|
||||
<PopoverContent side="right" align="start" className="w-[250px] p-0">
|
||||
<ScrollArea className="h-[200px] overflow-y-auto">
|
||||
<PopoverContent side="right" align="start" className="w-[280px] p-0" onWheel={(e) => e.stopPropagation()}>
|
||||
<div className="max-h-[300px] overflow-y-auto overscroll-contain" style={{ overscrollBehavior: 'contain' }}>
|
||||
<div className="p-3 space-y-2">
|
||||
{samples.map((sample, i) => (
|
||||
<div key={i} className="text-sm">
|
||||
<div key={i} className="text-sm break-words">
|
||||
<span className="font-medium">{String(sample || '(empty)')}</span>
|
||||
{i < samples.length - 1 && <Separator className="w-full my-2" />}
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</ScrollArea>
|
||||
</div>
|
||||
</PopoverContent>
|
||||
</Popover>
|
||||
</div>
|
||||
@@ -611,6 +611,7 @@ const MatchColumnsStepComponent = <T extends string>({
|
||||
const { fields, autoMapHeaders, autoMapSelectValues, autoMapDistance, translations } = useRsi<T>()
|
||||
const queryClient = useQueryClient()
|
||||
const [isLoading, setIsLoading] = useState(false)
|
||||
const [isRefreshing, setIsRefreshing] = useState(false)
|
||||
|
||||
const [columns, setColumns] = useState<Columns<T>>(() => {
|
||||
// Helper function to check if a column is completely empty
|
||||
@@ -846,6 +847,43 @@ const MatchColumnsStepComponent = <T extends string>({
|
||||
[queryClient, setGlobalSelections],
|
||||
);
|
||||
|
||||
// Handle manual cache refresh
|
||||
const handleRefreshTaxonomy = useCallback(async () => {
|
||||
setIsRefreshing(true);
|
||||
try {
|
||||
// Clear backend cache
|
||||
const response = await fetch(`${config.apiUrl}/import/clear-taxonomy-cache`, {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error('Failed to clear backend cache');
|
||||
}
|
||||
|
||||
// Clear frontend React Query cache
|
||||
await Promise.all([
|
||||
queryClient.invalidateQueries({ queryKey: ['field-options'] }),
|
||||
queryClient.invalidateQueries({ queryKey: ['product-lines'] }),
|
||||
queryClient.invalidateQueries({ queryKey: ['product-lines-mapped'] }),
|
||||
queryClient.invalidateQueries({ queryKey: ['sublines'] }),
|
||||
queryClient.invalidateQueries({ queryKey: ['sublines-mapped'] }),
|
||||
]);
|
||||
|
||||
// Refetch field options immediately
|
||||
await queryClient.refetchQueries({ queryKey: ['field-options'] });
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error refreshing taxonomy:', error);
|
||||
toast.error('Failed to refresh taxonomy data');
|
||||
} finally {
|
||||
setIsRefreshing(false);
|
||||
}
|
||||
}, [queryClient]);
|
||||
|
||||
// Check if a field is covered by global selections
|
||||
const isFieldCoveredByGlobalSelections = useCallback((key: string) => {
|
||||
return (key === 'supplier' && !!globalSelections.supplier) ||
|
||||
@@ -1815,11 +1853,11 @@ const MatchColumnsStepComponent = <T extends string>({
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div className="pt-2">
|
||||
<div className="pt-4 flex items-center gap-2">
|
||||
<CreateProductCategoryDialog
|
||||
trigger={
|
||||
<Button variant="link" className="h-auto px-0 text-sm font-medium">
|
||||
+ New line or subline
|
||||
<Button variant="outline" size="sm" className="w-full">
|
||||
<Plus className="h-3 w-3" /> New line or subline
|
||||
</Button>
|
||||
}
|
||||
companies={fieldOptions?.companies || []}
|
||||
@@ -1827,6 +1865,26 @@ const MatchColumnsStepComponent = <T extends string>({
|
||||
defaultLineId={globalSelections.line}
|
||||
onCreated={handleCategoryCreated}
|
||||
/>
|
||||
|
||||
<TooltipProvider>
|
||||
<Tooltip>
|
||||
<TooltipTrigger asChild>
|
||||
<Button
|
||||
variant="outline"
|
||||
size="sm"
|
||||
className="w-full"
|
||||
onClick={handleRefreshTaxonomy}
|
||||
disabled={isRefreshing}
|
||||
>
|
||||
<RefreshCw className={`h-3 w-3 ${isRefreshing ? 'animate-spin' : ''}`} />
|
||||
Refresh data
|
||||
</Button>
|
||||
</TooltipTrigger>
|
||||
<TooltipContent className="max-w-[250px]">
|
||||
<p>Reload all suppliers, companies, lines, categories, and other taxonomy data from the database</p>
|
||||
</TooltipContent>
|
||||
</Tooltip>
|
||||
</TooltipProvider>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -1917,18 +1975,11 @@ const MatchColumnsStepComponent = <T extends string>({
|
||||
)}
|
||||
|
||||
<div className="flex items-center gap-2 ml-auto">
|
||||
<Button
|
||||
variant="outline"
|
||||
disabled={isLoading}
|
||||
onClick={() => handleOnContinue(false)}
|
||||
>
|
||||
{translations.matchColumnsStep.nextButtonTitle}
|
||||
</Button>
|
||||
<Button
|
||||
disabled={isLoading}
|
||||
onClick={() => handleOnContinue(true)}
|
||||
>
|
||||
{translations.matchColumnsStep.nextButtonTitle} (New Validation)
|
||||
{translations.matchColumnsStep.nextButtonTitle}
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
import { useCallback, useState } from "react"
|
||||
import { useCallback, useState, useMemo } from "react"
|
||||
import { SelectHeaderTable } from "./components/SelectHeaderTable"
|
||||
import { useRsi } from "../../hooks/useRsi"
|
||||
import type { RawData } from "../../types"
|
||||
@@ -11,12 +11,29 @@ type SelectHeaderProps = {
|
||||
onBack?: () => void
|
||||
}
|
||||
|
||||
/**
|
||||
* Checks if a row is completely empty (all values are undefined, null, or whitespace-only strings)
|
||||
*/
|
||||
const isRowCompletelyEmpty = (row: RawData): boolean => {
|
||||
return Object.values(row).every(val =>
|
||||
val === undefined ||
|
||||
val === null ||
|
||||
(typeof val === 'string' && val.trim() === '')
|
||||
);
|
||||
};
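
A couple of concrete cases the helper above filters versus keeps (the exact `RawData` shape is defined in `../../types` and not shown here; the check only inspects `Object.values(row)`, so the container shape does not matter):

```typescript
// Filtered out: every cell is undefined, null, or whitespace-only.
isRowCompletelyEmpty(["", "   ", null, undefined] as unknown as RawData); // true

// Kept: at least one cell carries real content.
isRowCompletelyEmpty(["A-100", "", null] as unknown as RawData); // false
```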
|
||||
|
||||
export const SelectHeaderStep = ({ data, onContinue, onBack }: SelectHeaderProps) => {
|
||||
const { translations } = useRsi()
|
||||
const { toast } = useToast()
|
||||
const [selectedRows, setSelectedRows] = useState<ReadonlySet<number>>(new Set([0]))
|
||||
const [isLoading, setIsLoading] = useState(false)
|
||||
const [localData, setLocalData] = useState<RawData[]>(data)
|
||||
|
||||
// Automatically filter out completely empty rows on initial load
|
||||
const initialFilteredData = useMemo(() => {
|
||||
return data.filter(row => !isRowCompletelyEmpty(row));
|
||||
}, [data]);
|
||||
|
||||
const [localData, setLocalData] = useState<RawData[]>(initialFilteredData)
|
||||
|
||||
const handleContinue = useCallback(async () => {
|
||||
const [selectedRowIndex] = selectedRows
|
||||
|
||||
@@ -1,21 +1,63 @@
|
||||
import { useCallback, useState } from "react"
|
||||
import { useCallback, useState, useMemo } from "react"
|
||||
import type XLSX from "xlsx"
|
||||
import * as XLSXLib from "xlsx"
|
||||
import { useRsi } from "../../hooks/useRsi"
|
||||
import { RadioGroup, RadioGroupItem } from "@/components/ui/radio-group"
|
||||
import { Label } from "@/components/ui/label"
|
||||
import { Button } from "@/components/ui/button"
|
||||
import { ChevronLeft } from "lucide-react"
|
||||
import { ScrollArea, ScrollBar } from "@/components/ui/scroll-area"
|
||||
|
||||
type SelectSheetProps = {
|
||||
sheetNames: string[]
|
||||
workbook: XLSX.WorkBook
|
||||
onContinue: (sheetName: string) => Promise<void>
|
||||
onBack?: () => void
|
||||
}
|
||||
|
||||
export const SelectSheetStep = ({ sheetNames, onContinue, onBack }: SelectSheetProps) => {
|
||||
const MAX_PREVIEW_ROWS = 10
|
||||
const MAX_PREVIEW_COLS = 8
|
||||
const MAX_CELL_LENGTH = 30
|
||||
|
||||
export const SelectSheetStep = ({ sheetNames, workbook, onContinue, onBack }: SelectSheetProps) => {
|
||||
const [isLoading, setIsLoading] = useState(false)
|
||||
const { translations } = useRsi()
|
||||
const [value, setValue] = useState(sheetNames[0])
|
||||
|
||||
// Generate preview data for each sheet
|
||||
const sheetPreviews = useMemo(() => {
|
||||
const previews: Record<string, (string | number | null)[][]> = {}
|
||||
|
||||
for (const sheetName of sheetNames) {
|
||||
const sheet = workbook.Sheets[sheetName]
|
||||
if (!sheet) continue
|
||||
|
||||
// Convert sheet to array of arrays
|
||||
const data = XLSXLib.utils.sheet_to_json<(string | number | null)[]>(sheet, {
|
||||
header: 1,
|
||||
defval: null,
|
||||
})
|
||||
|
||||
// Take first N rows and limit columns
|
||||
const previewRows = data.slice(0, MAX_PREVIEW_ROWS).map(row =>
|
||||
(row as (string | number | null)[]).slice(0, MAX_PREVIEW_COLS)
|
||||
)
|
||||
|
||||
previews[sheetName] = previewRows
|
||||
}
|
||||
|
||||
return previews
|
||||
}, [sheetNames, workbook])
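
With `header: 1`, `sheet_to_json` returns an array of row arrays rather than keyed objects, and `defval: null` fills blank cells so each preview row keeps its column positions. For a small sheet the preview data looks like this (values are illustrative):

```typescript
// Shape returned by XLSXLib.utils.sheet_to_json(sheet, { header: 1, defval: null })
// for a sheet with one header row and two data rows.
const preview: (string | number | null)[][] = [
  ["SKU", "Name", "Cost", null],
  ["A-100", "Widget", 4.5, null],
  ["A-101", "Gadget", null, "discontinued"],
];
```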
|
||||
|
||||
const truncateCell = (value: string | number | null): string => {
|
||||
if (value === null || value === undefined) return ""
|
||||
const str = String(value)
|
||||
if (str.length > MAX_CELL_LENGTH) {
|
||||
return str.slice(0, MAX_CELL_LENGTH - 1) + "…"
|
||||
}
|
||||
return str
|
||||
}
|
||||
|
||||
const handleOnContinue = useCallback(
|
||||
async (data: typeof value) => {
|
||||
setIsLoading(true)
|
||||
@@ -37,19 +79,69 @@ export const SelectSheetStep = ({ sheetNames, onContinue, onBack }: SelectSheetP
|
||||
<RadioGroup
|
||||
value={value}
|
||||
onValueChange={setValue}
|
||||
className="space-y-4"
|
||||
className="space-y-6"
|
||||
>
|
||||
{sheetNames.map((sheetName) => (
|
||||
<div key={sheetName} className="flex items-center space-x-2">
|
||||
<RadioGroupItem value={sheetName} id={sheetName} />
|
||||
<Label
|
||||
htmlFor={sheetName}
|
||||
className="text-base"
|
||||
{sheetNames.map((sheetName) => {
|
||||
const preview = sheetPreviews[sheetName] || []
|
||||
const isSelected = value === sheetName
|
||||
|
||||
return (
|
||||
<div
|
||||
key={sheetName}
|
||||
className={`rounded-lg border p-4 transition-colors cursor-pointer ${
|
||||
isSelected ? "border-primary bg-primary/5" : "border-border hover:border-muted-foreground/50"
|
||||
}`}
|
||||
onClick={() => setValue(sheetName)}
|
||||
>
|
||||
{sheetName}
|
||||
</Label>
|
||||
</div>
|
||||
))}
|
||||
<div className="flex items-center space-x-2 mb-3">
|
||||
<RadioGroupItem value={sheetName} id={sheetName} />
|
||||
<Label
|
||||
htmlFor={sheetName}
|
||||
className="text-base font-medium cursor-pointer"
|
||||
>
|
||||
{sheetName}
|
||||
</Label>
|
||||
<span className="text-xs text-muted-foreground">
|
||||
({preview.length === MAX_PREVIEW_ROWS ? 'first ' : ''}{preview.length} rows shown)
|
||||
</span>
|
||||
</div>
|
||||
|
||||
{preview.length > 0 && (
|
||||
<ScrollArea className="w-full">
|
||||
<div className="rounded border bg-muted/30">
|
||||
<table className="text-xs w-full">
|
||||
<tbody>
|
||||
{preview.map((row, rowIndex) => (
|
||||
<tr
|
||||
key={rowIndex}
|
||||
className={rowIndex === 0 ? "bg-muted/50 font-medium" : ""}
|
||||
>
|
||||
{row.map((cell, colIndex) => (
|
||||
<td
|
||||
key={colIndex}
|
||||
className="px-2 py-1 border-r border-b last:border-r-0 whitespace-nowrap max-w-[150px] overflow-hidden text-ellipsis"
|
||||
title={cell !== null ? String(cell) : ""}
|
||||
>
|
||||
{truncateCell(cell)}
|
||||
</td>
|
||||
))}
|
||||
</tr>
|
||||
))}
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
<ScrollBar orientation="horizontal" />
|
||||
</ScrollArea>
|
||||
)}
|
||||
|
||||
{preview.length === 0 && (
|
||||
<p className="text-xs text-muted-foreground italic">
|
||||
No data in this sheet
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
)
|
||||
})}
|
||||
</RadioGroup>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -1,10 +1,10 @@
|
||||
import { useCallback, useState } from "react"
|
||||
import { useCallback, useState, useEffect } from "react"
|
||||
import type XLSX from "xlsx"
|
||||
import { useQueryClient } from "@tanstack/react-query"
|
||||
import { UploadStep } from "./UploadStep/UploadStep"
|
||||
import { SelectHeaderStep } from "./SelectHeaderStep/SelectHeaderStep"
|
||||
import { SelectSheetStep } from "./SelectSheetStep/SelectSheetStep"
|
||||
import { mapWorkbook } from "../utils/mapWorkbook"
|
||||
import { ValidationStepNew } from "./ValidationStepNew"
|
||||
import { ValidationStep } from "./ValidationStep"
|
||||
import { ImageUploadStep } from "./ImageUploadStep/ImageUploadStep"
|
||||
import { MatchColumnsStep } from "./MatchColumnsStep/MatchColumnsStep"
|
||||
@@ -14,7 +14,8 @@ import { useRsi } from "../hooks/useRsi"
|
||||
import type { RawData, Data } from "../types"
|
||||
import { Progress } from "@/components/ui/progress"
|
||||
import { useToast } from "@/hooks/use-toast"
|
||||
import { addErrorsAndRunHooks } from "./ValidationStepNew/utils/dataMutations"
|
||||
import { addErrorsAndRunHooks } from "./ValidationStep/utils/dataMutations"
|
||||
import { useValidationStore } from "./ValidationStep/store/validationStore"
|
||||
|
||||
export enum StepType {
|
||||
upload = "upload",
|
||||
@@ -53,6 +54,7 @@ export type StepState =
|
||||
| {
|
||||
type: StepType.validateDataNew
|
||||
data: any[]
|
||||
file?: File
|
||||
globalSelections?: GlobalSelections
|
||||
isFromScratch?: boolean
|
||||
}
|
||||
@@ -82,6 +84,31 @@ export const UploadFlow = ({ state, onNext, onBack }: Props) => {
|
||||
onSubmit } = useRsi()
|
||||
const [uploadedFile, setUploadedFile] = useState<File | null>(null)
|
||||
const { toast } = useToast()
|
||||
const queryClient = useQueryClient()
|
||||
const resetValidationStore = useValidationStore((state) => state.reset)
|
||||
|
||||
// Fresh taxonomy data per session:
|
||||
// Invalidate caches on mount and clear on unmount to ensure fresh data each import session
|
||||
useEffect(() => {
|
||||
// On mount - invalidate import-related query caches to fetch fresh data
|
||||
queryClient.invalidateQueries({ queryKey: ['field-options'] });
|
||||
queryClient.invalidateQueries({ queryKey: ['product-lines'] });
|
||||
queryClient.invalidateQueries({ queryKey: ['product-lines-mapped'] });
|
||||
queryClient.invalidateQueries({ queryKey: ['sublines'] });
|
||||
queryClient.invalidateQueries({ queryKey: ['sublines-mapped'] });
|
||||
|
||||
return () => {
|
||||
// On unmount - remove queries from cache entirely and reset Zustand store
|
||||
queryClient.removeQueries({ queryKey: ['field-options'] });
|
||||
queryClient.removeQueries({ queryKey: ['product-lines'] });
|
||||
queryClient.removeQueries({ queryKey: ['product-lines-mapped'] });
|
||||
queryClient.removeQueries({ queryKey: ['sublines'] });
|
||||
queryClient.removeQueries({ queryKey: ['sublines-mapped'] });
|
||||
|
||||
// Reset the validation store to clear productLinesCache and sublinesCache
|
||||
resetValidationStore();
|
||||
};
|
||||
}, [queryClient, resetValidationStore]);
|
||||
const errorToast = useCallback(
|
||||
(description: string) => {
|
||||
toast({
|
||||
@@ -143,6 +170,7 @@ export const UploadFlow = ({ state, onNext, onBack }: Props) => {
|
||||
return (
|
||||
<SelectSheetStep
|
||||
sheetNames={state.workbook.SheetNames}
|
||||
workbook={state.workbook}
|
||||
onContinue={async (sheetName) => {
|
||||
if (maxRecords && exceedsMaxRecords(state.workbook.Sheets[sheetName], maxRecords)) {
|
||||
errorToast(translations.uploadStep.maxRecordsExceeded(maxRecords.toString()))
|
||||
@@ -220,36 +248,8 @@ export const UploadFlow = ({ state, onNext, onBack }: Props) => {
|
||||
/>
|
||||
)
|
||||
case StepType.validateData:
|
||||
// Always use the new ValidationStepNew component
|
||||
return (
|
||||
<ValidationStepNew
|
||||
initialData={state.data}
|
||||
file={uploadedFile || new File([], "empty.xlsx")}
|
||||
onBack={() => {
|
||||
// If we started from scratch, we need to go back to the upload step
|
||||
if (state.isFromScratch) {
|
||||
onNext({
|
||||
type: StepType.upload
|
||||
});
|
||||
} else if (onBack) {
|
||||
// Use the provided onBack function
|
||||
onBack();
|
||||
}
|
||||
}}
|
||||
onNext={(validatedData: any[]) => {
|
||||
// Go to image upload step with the validated data
|
||||
onNext({
|
||||
type: StepType.imageUpload,
|
||||
data: validatedData,
|
||||
file: uploadedFile || new File([], "empty.xlsx"),
|
||||
globalSelections: state.globalSelections
|
||||
});
|
||||
}}
|
||||
isFromScratch={state.isFromScratch}
|
||||
/>
|
||||
)
|
||||
case StepType.validateDataNew:
|
||||
// New Zustand-based ValidationStep component
|
||||
// Zustand-based ValidationStep component (both cases now use this)
|
||||
return (
|
||||
<ValidationStep
|
||||
initialData={state.data}
|
||||
@@ -282,7 +282,15 @@ export const UploadFlow = ({ state, onNext, onBack }: Props) => {
|
||||
<ImageUploadStep
|
||||
data={state.data}
|
||||
file={state.file}
|
||||
onBack={onBack}
|
||||
onBack={() => {
|
||||
// Go back to the validation step with the current data
|
||||
onNext({
|
||||
type: StepType.validateDataNew,
|
||||
data: state.data,
|
||||
file: state.file,
|
||||
globalSelections: state.globalSelections
|
||||
});
|
||||
}}
|
||||
onSubmit={(data, file, options) => {
|
||||
// Create a Result object from the array data
|
||||
const result = {
|
||||
|
||||
@@ -0,0 +1,290 @@
|
||||
/**
|
||||
* AiSuggestionBadge Component
|
||||
*
|
||||
* Displays an AI suggestion with accept/dismiss actions.
|
||||
* Used for inline validation suggestions on Name and Description fields.
|
||||
*
|
||||
* For description fields, starts collapsed (just icon + count) and expands on click.
|
||||
* For name fields, uses compact inline mode.
|
||||
*/
|
||||
|
||||
import { useState } from 'react';
|
||||
import { Check, X, Sparkles, AlertCircle, ChevronDown, ChevronUp, Info } from 'lucide-react';
|
||||
import { Button } from '@/components/ui/button';
|
||||
import {
|
||||
Tooltip,
|
||||
TooltipContent,
|
||||
TooltipProvider,
|
||||
TooltipTrigger,
|
||||
} from '@/components/ui/tooltip';
|
||||
import { cn } from '@/lib/utils';
|
||||
|
||||
interface AiSuggestionBadgeProps {
|
||||
/** The suggested value */
|
||||
suggestion: string;
|
||||
/** List of issues found (optional) */
|
||||
issues?: string[];
|
||||
/** Called when user accepts the suggestion */
|
||||
onAccept: () => void;
|
||||
/** Called when user dismisses the suggestion */
|
||||
onDismiss: () => void;
|
||||
/** Additional CSS classes */
|
||||
className?: string;
|
||||
/** Whether to show the suggestion as compact (inline) - used for name field */
|
||||
compact?: boolean;
|
||||
/** Whether to start in collapsible mode (icon + count) - used for description field */
|
||||
collapsible?: boolean;
|
||||
}
|
||||
|
||||
export function AiSuggestionBadge({
|
||||
suggestion,
|
||||
issues = [],
|
||||
onAccept,
|
||||
onDismiss,
|
||||
className,
|
||||
compact = false,
|
||||
collapsible = false
|
||||
}: AiSuggestionBadgeProps) {
|
||||
const [isExpanded, setIsExpanded] = useState(false);
|
||||
|
||||
// Compact mode for name fields - inline suggestion with accept/dismiss
|
||||
if (compact) {
|
||||
return (
|
||||
<div
|
||||
className={cn(
|
||||
'flex items-center justify-between gap-1.5 px-2 py-1 rounded-md text-xs',
|
||||
'bg-purple-50 border border-purple-200',
|
||||
'dark:bg-purple-950/30 dark:border-purple-800',
|
||||
className
|
||||
)}
|
||||
>
|
||||
<div className="flex items-start gap-1.5">
|
||||
<Sparkles className="h-3 w-3 text-purple-500 flex-shrink-0 mt-0.5" />
|
||||
|
||||
<span className="text-purple-700 dark:text-purple-300">
|
||||
{suggestion}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex items-center gap-0.5 flex-shrink-0">
|
||||
<TooltipProvider>
|
||||
<Tooltip delayDuration={200}>
|
||||
<TooltipTrigger asChild>
|
||||
<Button
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
className="h-5 w-5 p-0 text-green-600 hover:text-green-700 hover:bg-green-100"
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
onAccept();
|
||||
}}
|
||||
>
|
||||
<Check className="h-3 w-3" />
|
||||
</Button>
|
||||
</TooltipTrigger>
|
||||
<TooltipContent side="top">
|
||||
<p>Accept suggestion</p>
|
||||
</TooltipContent>
|
||||
</Tooltip>
|
||||
</TooltipProvider>
|
||||
<TooltipProvider>
|
||||
<Tooltip delayDuration={200}>
|
||||
<TooltipTrigger asChild>
|
||||
<Button
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
className="h-5 w-5 p-0 text-gray-400 hover:text-gray-600 hover:bg-gray-100"
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
onDismiss();
|
||||
}}
|
||||
>
|
||||
<X className="h-3 w-3" />
|
||||
</Button>
|
||||
</TooltipTrigger>
|
||||
<TooltipContent side="top">
|
||||
<p>Ignore</p>
|
||||
</TooltipContent>
|
||||
</Tooltip>
|
||||
</TooltipProvider>
|
||||
{/* Info icon with issues tooltip */}
|
||||
{issues.length > 0 && (
|
||||
<TooltipProvider>
|
||||
<Tooltip delayDuration={200}>
|
||||
<TooltipTrigger asChild>
|
||||
<button
|
||||
type="button"
|
||||
className="flex-shrink-0 text-purple-400 hover:text-purple-600 transition-colors"
|
||||
onClick={(e) => e.stopPropagation()}
|
||||
>
|
||||
<Info className="h-3.5 w-3.5" />
|
||||
</button>
|
||||
</TooltipTrigger>
|
||||
<TooltipContent
|
||||
side="top"
|
||||
align="start"
|
||||
className="max-w-[300px] p-2"
|
||||
>
|
||||
<div className="flex flex-col gap-1">
|
||||
<div className="text-xs font-medium text-purple-300 mb-1">
|
||||
Issues found:
|
||||
</div>
|
||||
{issues.map((issue, index) => (
|
||||
<div
|
||||
key={index}
|
||||
className="flex items-start gap-1.5 text-xs"
|
||||
>
|
||||
<AlertCircle className="h-3 w-3 mt-0.5 flex-shrink-0 text-purple-300" />
|
||||
<span>{issue}</span>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</TooltipContent>
|
||||
</Tooltip>
|
||||
</TooltipProvider>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
}
// Collapsible mode for description fields
|
||||
if (collapsible && !isExpanded) {
|
||||
return (
|
||||
<button
|
||||
type="button"
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
setIsExpanded(true);
|
||||
}}
|
||||
className={cn(
|
||||
'flex items-center gap-1.5 px-2 py-1 rounded-md text-xs',
|
||||
'bg-purple-50 border border-purple-200 hover:bg-purple-100',
|
||||
'dark:bg-purple-950/30 dark:border-purple-800 dark:hover:bg-purple-900/40',
|
||||
'transition-colors cursor-pointer',
|
||||
className
|
||||
)}
|
||||
title="Click to see AI suggestion"
|
||||
>
|
||||
<Sparkles className="h-3.5 w-3.5 text-purple-500" />
|
||||
<span className="text-purple-600 dark:text-purple-400 font-medium">
|
||||
{issues.length} {issues.length === 1 ? 'issue' : 'issues'}
|
||||
</span>
|
||||
<ChevronDown className="h-3 w-3 text-purple-400" />
|
||||
</button>
|
||||
);
|
||||
}
|
||||
|
||||
// Expanded view (default for non-compact, or when collapsible is expanded)
|
||||
return (
|
||||
<div
|
||||
className={cn(
|
||||
'flex flex-col gap-2 p-3 rounded-md',
|
||||
'bg-purple-50 border border-purple-200',
|
||||
'dark:bg-purple-950/30 dark:border-purple-800',
|
||||
className
|
||||
)}
|
||||
>
|
||||
{/* Header with collapse button if collapsible */}
|
||||
<div className="flex items-center justify-between">
|
||||
<div className="flex items-center gap-2">
|
||||
<Sparkles className="h-3.5 w-3.5 text-purple-500 flex-shrink-0" />
|
||||
<span className="text-xs font-medium text-purple-600 dark:text-purple-400">
|
||||
AI Suggestion
|
||||
</span>
|
||||
{issues.length > 0 && (
|
||||
<span className="text-xs text-purple-500 dark:text-purple-400">
|
||||
({issues.length} {issues.length === 1 ? 'issue' : 'issues'})
|
||||
</span>
|
||||
)}
|
||||
</div>
|
||||
{collapsible && (
|
||||
<Button
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
className="h-5 w-5 p-0 text-purple-400 hover:text-purple-600"
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
setIsExpanded(false);
|
||||
}}
|
||||
title="Collapse"
|
||||
>
|
||||
<ChevronUp className="h-3.5 w-3.5" />
|
||||
</Button>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Issues list */}
|
||||
{issues.length > 0 && (
|
||||
<div className="flex flex-col gap-1 text-xs">
|
||||
{issues.map((issue, index) => (
|
||||
<div
|
||||
key={index}
|
||||
className="flex items-start gap-1.5 text-purple-600 dark:text-purple-400"
|
||||
>
|
||||
<AlertCircle className="h-3 w-3 mt-0.5 flex-shrink-0 text-purple-400" />
|
||||
<span>{issue}</span>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Suggested description */}
|
||||
<div className="mt-1">
|
||||
<div className="text-xs text-purple-500 dark:text-purple-400 mb-1 font-medium">
|
||||
Suggested:
|
||||
</div>
|
||||
<div className="text-sm text-purple-700 dark:text-purple-300 leading-relaxed bg-white/50 dark:bg-black/20 rounded p-2 border border-purple-100 dark:border-purple-800">
|
||||
{suggestion}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Actions */}
|
||||
<div className="flex items-center gap-2 mt-1">
|
||||
<Button
|
||||
size="sm"
|
||||
variant="outline"
|
||||
className="h-7 px-3 text-xs bg-white border-green-300 text-green-700 hover:bg-green-50 hover:border-green-400 dark:bg-green-950/30 dark:border-green-700 dark:text-green-400"
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
onAccept();
|
||||
}}
|
||||
>
|
||||
<Check className="h-3 w-3 mr-1" />
|
||||
Accept
|
||||
</Button>
|
||||
<Button
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
className="h-7 px-3 text-xs text-gray-500 hover:text-gray-700 dark:text-gray-400"
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
onDismiss();
|
||||
}}
|
||||
>
|
||||
Dismiss
|
||||
</Button>
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
/**
|
||||
* Loading state for AI validation
|
||||
*/
|
||||
export function AiValidationLoading({ className }: { className?: string }) {
|
||||
return (
|
||||
<div
|
||||
className={cn(
|
||||
'flex items-center gap-2 px-2 py-1 rounded-md text-xs',
|
||||
'bg-purple-50 border border-purple-200',
|
||||
'dark:bg-purple-950/30 dark:border-purple-800',
|
||||
className
|
||||
)}
|
||||
>
|
||||
<div className="h-3 w-3 border-2 border-purple-500 border-t-transparent rounded-full animate-spin" />
|
||||
<span className="text-purple-600 dark:text-purple-400">
|
||||
Validating with AI...
|
||||
</span>
|
||||
</div>
|
||||
);
|
||||
}
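
A usage sketch for the two display modes described in the component's header comment; the field wiring and handlers are illustrative, not taken from a specific call site:

```tsx
// Compact inline mode, as used next to a Name cell (handlers are illustrative).
<AiSuggestionBadge
  compact
  suggestion="Stainless Steel Travel Mug, 16 oz"
  issues={["Name exceeds 80 characters", "Contains all-caps words"]}
  onAccept={() => updateCell(rowId, "name", suggestedName)}
  onDismiss={() => dismissSuggestion(rowId, "name")}
/>

// Collapsible mode for a Description cell: renders as "2 issues" until expanded.
<AiSuggestionBadge
  collapsible
  suggestion={suggestedDescription}
  issues={descriptionIssues}
  onAccept={acceptDescription}
  onDismiss={dismissDescription}
/>
```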
|
||||
@@ -242,7 +242,7 @@ const SearchableTemplateSelect: React.FC<SearchableTemplateSelectProps> = ({
|
||||
disabled={disabled}
|
||||
className={cn('w-full justify-between overflow-hidden', triggerClassName)}
|
||||
>
|
||||
<span className="truncate overflow-hidden mr-1">{getDisplayText()}</span>
|
||||
<span className="truncate overflow-hidden mr-1 text-sm font-normal">{getDisplayText()}</span>
|
||||
<ChevronsUpDown className="h-4 w-4 shrink-0 opacity-50 flex-none" />
|
||||
</Button>
|
||||
</PopoverTrigger>
|
||||
|
||||
@@ -0,0 +1,192 @@
|
||||
/**
|
||||
* Suggestion Badges Component
|
||||
*
|
||||
* Displays AI-suggested options inline for categories, themes, and colors.
|
||||
* Shows similarity scores and allows one-click selection.
|
||||
*/
|
||||
|
||||
import { Sparkles, Loader2, Plus, Check } from 'lucide-react';
|
||||
import { cn } from '@/lib/utils';
|
||||
import type { TaxonomySuggestion } from '../store/types';
|
||||
|
||||
interface SuggestionBadgesProps {
|
||||
/** Suggestions to display */
|
||||
suggestions: TaxonomySuggestion[];
|
||||
/** Currently selected values (IDs) */
|
||||
selectedValues: (string | number)[];
|
||||
/** Callback when a suggestion is clicked */
|
||||
onSelect: (id: number) => void;
|
||||
/** Whether suggestions are loading */
|
||||
isLoading?: boolean;
|
||||
/** Maximum suggestions to show */
|
||||
maxSuggestions?: number;
|
||||
/** Minimum similarity to show (0-1) */
|
||||
minSimilarity?: number;
|
||||
/** Label for the section */
|
||||
label?: string;
|
||||
/** Compact mode for smaller displays */
|
||||
compact?: boolean;
|
||||
/** Show similarity scores */
|
||||
showScores?: boolean;
|
||||
/** Custom class name */
|
||||
className?: string;
|
||||
}
|
||||
|
||||
export function SuggestionBadges({
|
||||
suggestions,
|
||||
selectedValues,
|
||||
onSelect,
|
||||
isLoading = false,
|
||||
maxSuggestions = 5,
|
||||
minSimilarity = 0,
|
||||
label = 'Suggested',
|
||||
compact = false,
|
||||
showScores = true,
|
||||
className,
|
||||
}: SuggestionBadgesProps) {
|
||||
// Filter and limit suggestions
|
||||
const filteredSuggestions = suggestions
|
||||
.filter(s => s.similarity >= minSimilarity)
|
||||
.slice(0, maxSuggestions);
|
||||
|
||||
// Don't render if no suggestions and not loading
|
||||
if (!isLoading && filteredSuggestions.length === 0) {
|
||||
return null;
|
||||
}
|
||||
|
||||
const isSelected = (id: number) => {
|
||||
return selectedValues.some(v => String(v) === String(id));
|
||||
};
|
||||
|
||||
return (
|
||||
<div className={cn('flex flex-wrap items-center gap-1.5', className)}>
|
||||
{/* Label */}
|
||||
<div className={cn(
|
||||
'flex items-center gap-1 text-purple-600 dark:text-purple-400',
|
||||
compact ? 'text-[10px]' : 'text-xs'
|
||||
)}>
|
||||
<Sparkles className={compact ? 'h-2.5 w-2.5' : 'h-3 w-3'} />
{!compact && <span className="font-medium">{label}:</span>}
</div>

{/* Loading state */}
{isLoading && (
<div className="flex items-center gap-1 text-gray-400">
<Loader2 className={cn('animate-spin', compact ? 'h-2.5 w-2.5' : 'h-3 w-3')} />
{!compact && <span className="text-xs">Loading...</span>}
</div>
)}

{/* Suggestion badges */}
{filteredSuggestions.map((suggestion) => {
const selected = isSelected(suggestion.id);
const similarityPercent = Math.round(suggestion.similarity * 100);

return (
<button
key={suggestion.id}
type="button"
onClick={() => onSelect(suggestion.id)}
disabled={selected}
className={cn(
'inline-flex items-center gap-1 rounded-full border transition-colors',
compact ? 'px-1.5 py-0.5 text-[10px]' : 'px-2 py-0.5 text-xs',
selected
? 'border-green-300 bg-green-50 text-green-700 dark:border-green-700 dark:bg-green-950 dark:text-green-400'
: 'border-purple-200 bg-purple-50 text-purple-700 hover:bg-purple-100 dark:border-purple-800 dark:bg-purple-950/50 dark:text-purple-300 dark:hover:bg-purple-900/50'
)}
title={suggestion.fullPath || suggestion.name}
>
{selected ? (
<Check className={compact ? 'h-2 w-2' : 'h-2.5 w-2.5'} />
) : (
<Plus className={compact ? 'h-2 w-2' : 'h-2.5 w-2.5'} />
)}
<span className="truncate max-w-[120px]">{suggestion.name}</span>
{showScores && !compact && (
<span className={cn(
'opacity-60',
selected ? 'text-green-600' : 'text-purple-500'
)}>
{similarityPercent}%
</span>
)}
</button>
);
})}
</div>
);
}

/**
* Inline suggestion for a single field (used inside dropdowns)
*/
interface InlineSuggestionProps {
suggestion: TaxonomySuggestion;
isSelected: boolean;
onSelect: () => void;
showScore?: boolean;
}

export function InlineSuggestion({
suggestion,
isSelected,
onSelect,
showScore = true,
}: InlineSuggestionProps) {
const similarityPercent = Math.round(suggestion.similarity * 100);

return (
<div
className={cn(
'flex items-center justify-between px-2 py-1.5 cursor-pointer',
isSelected
? 'bg-green-50 dark:bg-green-950/30'
: 'bg-purple-50/50 hover:bg-purple-100/50 dark:bg-purple-950/20 dark:hover:bg-purple-900/30'
)}
onClick={onSelect}
>
<div className="flex items-center gap-2 min-w-0">
<Sparkles className="h-3 w-3 text-purple-500 flex-shrink-0" />
<span className="truncate text-sm">
{suggestion.fullPath || suggestion.name}
</span>
</div>
<div className="flex items-center gap-2 flex-shrink-0 ml-2">
{showScore && (
<span className="text-xs text-purple-500 dark:text-purple-400">
{similarityPercent}%
</span>
)}
{isSelected ? (
<Check className="h-3.5 w-3.5 text-green-500" />
) : (
<Plus className="h-3.5 w-3.5 text-purple-400" />
)}
</div>
</div>
);
}

/**
* Suggestion section header for dropdowns
*/
interface SuggestionSectionHeaderProps {
isLoading?: boolean;
count?: number;
}

export function SuggestionSectionHeader({ isLoading, count }: SuggestionSectionHeaderProps) {
return (
<div className="flex items-center gap-2 px-2 py-1.5 text-xs font-medium text-purple-600 dark:text-purple-400 bg-purple-50/80 dark:bg-purple-950/40 border-b border-purple-100 dark:border-purple-900">
<Sparkles className="h-3 w-3" />
<span>AI Suggested</span>
{isLoading && <Loader2 className="h-3 w-3 animate-spin" />}
{!isLoading && count !== undefined && (
<span className="text-purple-400 dark:text-purple-500">({count})</span>
)}
</div>
);
}

export default SuggestionBadges;
@@ -6,7 +6,7 @@
* Note: Initialization effects are in index.tsx so they run before this mounts.
*/

import { useCallback, useMemo } from 'react';
import { useCallback, useMemo, useRef, useState } from 'react';
import { useValidationStore } from '../store/validationStore';
import {
useTotalErrorCount,
@@ -21,11 +21,16 @@ import { FloatingSelectionBar } from './FloatingSelectionBar';
import { useAiValidationFlow } from '../hooks/useAiValidation';
import { useFieldOptions } from '../hooks/useFieldOptions';
import { useTemplateManagement } from '../hooks/useTemplateManagement';
import { useCopyDownValidation } from '../hooks/useCopyDownValidation';
import { useSanityCheck } from '../hooks/useSanityCheck';
import { AiValidationProgressDialog } from '../dialogs/AiValidationProgress';
import { AiValidationResultsDialog } from '../dialogs/AiValidationResults';
import { AiDebugDialog } from '../dialogs/AiDebugDialog';
import { SanityCheckDialog } from '../dialogs/SanityCheckDialog';
import { TemplateForm } from '@/components/templates/TemplateForm';
import type { CleanRowData } from '../store/types';
import { AiSuggestionsProvider } from '../contexts/AiSuggestionsContext';
import type { CleanRowData, RowData } from '../store/types';
import type { ProductForSanityCheck } from '../hooks/useSanityCheck';

interface ValidationContainerProps {
onBack?: () => void;
@@ -56,6 +61,38 @@ export const ValidationContainer = ({
const aiValidation = useAiValidationFlow();
const { data: fieldOptionsData } = useFieldOptions();
const { loadTemplates } = useTemplateManagement();
const sanityCheck = useSanityCheck();

// Sanity check dialog state
const [sanityCheckDialogOpen, setSanityCheckDialogOpen] = useState(false);
// Debug: skip sanity check toggle (admin:debug only)
const [skipSanityCheck, setSkipSanityCheck] = useState(false);

// Handle UPC validation after copy-down operations on supplier/upc fields
useCopyDownValidation();

// Get initial products for AI suggestions (read once via ref to avoid re-fetching)
const initialProductsRef = useRef<RowData[] | null>(null);
if (initialProductsRef.current === null) {
initialProductsRef.current = useValidationStore.getState().rows;
}

// Create stable lookup functions for company/line names
const getCompanyName = useCallback((id: string): string | undefined => {
const companies = fieldOptionsData?.companies || [];
const company = companies.find(c => c.value === id);
return company?.label;
}, [fieldOptionsData?.companies]);

const getLineName = useCallback((id: string): string | undefined => {
// Lines are fetched dynamically per company, check the cache
const cache = useValidationStore.getState().productLinesCache;
for (const lines of cache.values()) {
const line = lines.find(l => l.value === id);
if (line) return line.label;
}
return undefined;
}, []);

// Convert field options to TemplateForm format
const templateFormFieldOptions = useMemo(() => {
@@ -93,70 +130,207 @@ export const ValidationContainer = ({
}
}, [onBack]);

// Build products array for sanity check
const buildProductsForSanityCheck = useCallback((): ProductForSanityCheck[] => {
const rows = useValidationStore.getState().rows;
const fields = useValidationStore.getState().fields;

// Build lookup for field options (for display names)
const getFieldLabel = (fieldKey: string, value: unknown): string | undefined => {
const field = fields.find(f => f.key === fieldKey);
if (field && field.fieldType.type === 'select' && 'options' in field.fieldType) {
const option = field.fieldType.options?.find(o => o.value === String(value));
return option?.label;
}
return undefined;
};

// Convert rows to sanity check format
return rows.map((row) => {
const product: ProductForSanityCheck = {
name: row.name as string | undefined,
supplier: row.supplier as string | undefined,
supplier_name: getFieldLabel('supplier', row.supplier),
company: row.company as string | undefined,
company_name: getFieldLabel('company', row.company),
supplier_no: row.supplier_no as string | undefined,
msrp: row.msrp as string | number | undefined,
cost_each: row.cost_each as string | number | undefined,
qty_per_unit: row.qty_per_unit as string | number | undefined,
case_qty: row.case_qty as string | number | undefined,
tax_cat: row.tax_cat as string | number | undefined,
tax_cat_name: getFieldLabel('tax_cat', row.tax_cat),
size_cat: row.size_cat as string | number | undefined,
size_cat_name: getFieldLabel('size_cat', row.size_cat),
themes: row.themes as string | undefined,
categories: row.categories as string | undefined,
weight: row.weight as string | number | undefined,
length: row.length as string | number | undefined,
width: row.width as string | number | undefined,
height: row.height as string | number | undefined,
};

// Add AI supplemental context if present (from MatchColumnsStep "AI context only" columns)
if (row.__aiSupplemental && typeof row.__aiSupplemental === 'object') {
product.additional_context = row.__aiSupplemental;
}

return product;
});
}, []);

// Handle viewing cached sanity check results
const handleViewResults = useCallback(() => {
setSanityCheckDialogOpen(true);
}, []);

// Handle running a fresh sanity check
const handleRunCheck = useCallback(() => {
const products = buildProductsForSanityCheck();
setSanityCheckDialogOpen(true);
sanityCheck.runCheck(products);
}, [sanityCheck, buildProductsForSanityCheck]);

// Handle proceeding directly to next step (skipping sanity check)
const handleProceedDirect = useCallback(() => {
handleNext();
}, [handleNext]);

// Force a new sanity check (refresh button in dialog)
const handleRefreshSanityCheck = useCallback(() => {
const products = buildProductsForSanityCheck();
sanityCheck.runCheck(products);
}, [sanityCheck, buildProductsForSanityCheck]);

// Handle proceeding after sanity check
const handleSanityCheckProceed = useCallback(() => {
setSanityCheckDialogOpen(false);
sanityCheck.clearResults();
handleNext();
}, [handleNext, sanityCheck]);

// Handle going back from sanity check dialog (keeps results cached)
const handleSanityCheckGoBack = useCallback(() => {
setSanityCheckDialogOpen(false);
// Don't clear results - keep them cached for next time
}, []);

// Handle scrolling to a specific product from sanity check issue
const handleScrollToProduct = useCallback((productIndex: number) => {
// Find the row element and scroll to it
const rowElement = document.querySelector(`[data-row-index="${productIndex}"]`);
if (rowElement) {
rowElement.scrollIntoView({ behavior: 'smooth', block: 'center' });
// Briefly highlight the row
rowElement.classList.add('ring-2', 'ring-purple-500');
setTimeout(() => {
rowElement.classList.remove('ring-2', 'ring-purple-500');
}, 2000);
}
}, []);

// Build product names lookup for sanity check dialog
// Rebuild fresh whenever dialog opens to ensure names are current after AI suggestions
const buildProductNames = useCallback(() => {
const rows = useValidationStore.getState().rows;
const names: Record<number, string> = {};
rows.forEach((row, index) => {
names[index] = (row.name as string) || `Product ${index + 1}`;
});
return names;
}, []);

const productNames = useMemo(() => buildProductNames(), [sanityCheckDialogOpen, buildProductNames]);

return (
<div className="flex flex-col h-[calc(100vh-9.5rem)] overflow-hidden">
{/* Toolbar */}
<ValidationToolbar
rowCount={rowCount}
errorCount={totalErrorCount}
rowsWithErrors={rowsWithErrorsCount}
/>
<AiSuggestionsProvider
getCompanyName={getCompanyName}
getLineName={getLineName}
initialProducts={initialProductsRef.current || undefined}
autoInitialize={!!fieldOptionsData}
>
<div className="flex flex-col h-[calc(100vh-9.5rem)] overflow-hidden">
{/* Toolbar */}
<ValidationToolbar
rowCount={rowCount}
errorCount={totalErrorCount}
rowsWithErrors={rowsWithErrorsCount}
/>

{/* Main table area */}
<div className="flex-1 overflow-hidden">
<ValidationTable />
{/* Main table area */}
<div className="flex-1 overflow-hidden">
<ValidationTable />
</div>

{/* Footer with navigation */}
<ValidationFooter
onBack={handleBack}
onProceedDirect={handleProceedDirect}
onViewResults={handleViewResults}
onRunCheck={handleRunCheck}
canGoBack={!!onBack}
canProceed={totalErrorCount === 0}
errorCount={totalErrorCount}
rowCount={rowCount}
isSanityChecking={sanityCheck.isChecking}
hasRunSanityCheck={sanityCheck.hasRun}
skipSanityCheck={skipSanityCheck}
onSkipSanityCheckChange={setSkipSanityCheck}
/>

{/* Floating selection bar - appears when rows selected */}
<FloatingSelectionBar />

{/* AI Validation dialogs */}
{aiValidation.isValidating && aiValidation.progress && (
<AiValidationProgressDialog
progress={aiValidation.progress}
onCancel={aiValidation.cancel}
/>
)}

{aiValidation.results && !aiValidation.isValidating && (
<AiValidationResultsDialog
results={aiValidation.results}
revertedChanges={aiValidation.revertedChanges}
onRevert={aiValidation.revertChange}
onAccept={aiValidation.acceptChange}
onDismiss={aiValidation.dismissResults}
/>
)}

{/* AI Debug Dialog - for viewing prompt */}
<AiDebugDialog
open={aiValidation.showDebugDialog}
onClose={aiValidation.closePromptPreview}
debugData={aiValidation.debugPrompt}
/>

{/* Sanity Check Dialog - shows cached results or runs new check */}
<SanityCheckDialog
open={sanityCheckDialogOpen}
onOpenChange={setSanityCheckDialogOpen}
isChecking={sanityCheck.isChecking}
error={sanityCheck.error}
result={sanityCheck.result}
onProceed={handleSanityCheckProceed}
onGoBack={handleSanityCheckGoBack}
onRefresh={handleRefreshSanityCheck}
onScrollToProduct={handleScrollToProduct}
productNames={productNames}
validationErrorCount={totalErrorCount}
/>

{/* Template form dialog - for saving row as template */}
<TemplateForm
isOpen={isTemplateFormOpen}
onClose={closeTemplateForm}
onSuccess={handleTemplateFormSuccess}
initialData={templateFormData as Parameters<typeof TemplateForm>[0]['initialData']}
mode="create"
fieldOptions={templateFormFieldOptions}
/>
</div>

{/* Footer with navigation */}
<ValidationFooter
onBack={handleBack}
onNext={handleNext}
canGoBack={!!onBack}
canProceed={totalErrorCount === 0}
errorCount={totalErrorCount}
rowCount={rowCount}
onAiValidate={aiValidation.validate}
isAiValidating={aiValidation.isValidating}
onShowDebug={aiValidation.showPromptPreview}
/>

{/* Floating selection bar - appears when rows selected */}
<FloatingSelectionBar />

{/* AI Validation dialogs */}
{aiValidation.isValidating && aiValidation.progress && (
<AiValidationProgressDialog
progress={aiValidation.progress}
onCancel={aiValidation.cancel}
/>
)}

{aiValidation.results && !aiValidation.isValidating && (
<AiValidationResultsDialog
results={aiValidation.results}
revertedChanges={aiValidation.revertedChanges}
onRevert={aiValidation.revertChange}
onAccept={aiValidation.acceptChange}
onDismiss={aiValidation.dismissResults}
/>
)}

{/* AI Debug Dialog - for viewing prompt */}
<AiDebugDialog
open={aiValidation.showDebugDialog}
onClose={aiValidation.closePromptPreview}
debugData={aiValidation.debugPrompt}
/>

{/* Template form dialog - for saving row as template */}
<TemplateForm
isOpen={isTemplateFormOpen}
onClose={closeTemplateForm}
onSuccess={handleTemplateFormSuccess}
initialData={templateFormData as Parameters<typeof TemplateForm>[0]['initialData']}
mode="create"
fieldOptions={templateFormFieldOptions}
/>
</div>
</AiSuggestionsProvider>
);
};
@@ -1,39 +1,61 @@
/**
* ValidationFooter Component
*
* Navigation footer with back/next buttons, AI validate, and summary info.
* Navigation footer with back/next buttons and summary info.
* After first sanity check, shows options to view results, recheck, or proceed directly.
*/

import { useState } from 'react';
import { useContext } from 'react';
import { Button } from '@/components/ui/button';
import { CheckCircle, Wand2, FileText } from 'lucide-react';
import { Protected } from '@/components/auth/Protected';
import { Dialog, DialogContent, DialogHeader, DialogTitle, DialogDescription, DialogFooter } from '@/components/ui/dialog';
import { Switch } from '@/components/ui/switch';
import { Label } from '@/components/ui/label';
import { CheckCircle, Loader2, Eye, RefreshCw } from 'lucide-react';
import {
Tooltip,
TooltipContent,
TooltipProvider,
TooltipTrigger,
} from '@/components/ui/tooltip';
import { AuthContext } from '@/contexts/AuthContext';

interface ValidationFooterProps {
onBack?: () => void;
onNext?: () => void;
/** Called to proceed directly to next step (no sanity check) */
onProceedDirect?: () => void;
/** Called to view cached sanity check results */
onViewResults?: () => void;
/** Called to run a fresh sanity check */
onRunCheck?: () => void;
canGoBack: boolean;
canProceed: boolean;
errorCount: number;
rowCount: number;
onAiValidate?: () => void;
isAiValidating?: boolean;
onShowDebug?: () => void;
/** Whether sanity check is currently running */
isSanityChecking?: boolean;
/** Whether sanity check has been run at least once */
hasRunSanityCheck?: boolean;
/** Whether to skip sanity check (debug mode) */
skipSanityCheck?: boolean;
/** Called when skip sanity check toggle changes */
onSkipSanityCheckChange?: (skip: boolean) => void;
}

export const ValidationFooter = ({
onBack,
onNext,
onProceedDirect,
onViewResults,
onRunCheck,
canGoBack,
canProceed,
errorCount,
rowCount,
onAiValidate,
isAiValidating = false,
onShowDebug,
isSanityChecking = false,
hasRunSanityCheck = false,
skipSanityCheck = false,
onSkipSanityCheckChange,
}: ValidationFooterProps) => {
const [showErrorDialog, setShowErrorDialog] = useState(false);
const { user } = useContext(AuthContext);
const hasDebugPermission = Boolean(user?.is_admin || user?.permissions?.includes('admin:debug'));

return (
<div className="flex items-center justify-between border-t bg-muted/50 px-6 py-4">
@@ -60,43 +82,91 @@ export const ValidationFooter = ({

{/* Action buttons */}
<div className="flex items-center gap-2">
{/* Show Prompt Debug - Admin only */}
{onShowDebug && (
<Protected permission="admin:debug">
<Button
variant="outline"
onClick={onShowDebug}
disabled={isAiValidating}
>
<FileText className="h-4 w-4 mr-1" />
Show Prompt
</Button>
</Protected>
{/* Skip sanity check toggle - only for admin:debug users */}
{hasDebugPermission && onSkipSanityCheckChange && !hasRunSanityCheck && (
<TooltipProvider>
<Tooltip delayDuration={300}>
<TooltipTrigger asChild>
<div className="flex items-center gap-2 mr-4">
<Switch
id="skip-sanity"
checked={skipSanityCheck}
onCheckedChange={onSkipSanityCheckChange}
/>
<Label
htmlFor="skip-sanity"
className="text-sm text-muted-foreground cursor-pointer flex items-center gap-1"
>
Skip Consistency Check
</Label>
</div>
</TooltipTrigger>
<TooltipContent side="top">
<p>Debug: Skip consistency check</p>
</TooltipContent>
</Tooltip>
</TooltipProvider>
)}

{/* AI Validate */}
{onAiValidate && (
{/* Before first sanity check: single "Next" button that runs the check */}
{!hasRunSanityCheck && !skipSanityCheck && (
<Button
variant="outline"
onClick={onAiValidate}
disabled={isAiValidating || rowCount === 0}
onClick={onRunCheck}
disabled={isSanityChecking || rowCount === 0}
>
<Wand2 className="h-4 w-4 mr-1" />
{isAiValidating ? 'Validating...' : 'AI Validate'}
Next
</Button>
)}

{/* Next button */}
{onNext && (
{/* After first sanity check: show all three options */}
{hasRunSanityCheck && !skipSanityCheck && (
<>
{/* View previous results */}
<TooltipProvider>
<Tooltip delayDuration={300}>
<TooltipTrigger asChild>
<Button
variant="outline"
onClick={onViewResults}
disabled={isSanityChecking}
>
<Eye className="h-4 w-4 mr-1" />
Review Check Results
</Button>
</TooltipTrigger>
<TooltipContent side="top">
<p>Review previous consistency check results</p>
</TooltipContent>
</Tooltip>
</TooltipProvider>

{/* Run fresh check */}
<TooltipProvider>
<Tooltip delayDuration={300}>
<TooltipTrigger asChild>
<Button
variant="outline"
onClick={onRunCheck}
disabled={isSanityChecking || rowCount === 0}
>
{isSanityChecking ? (
<Loader2 className="h-4 w-4 mr-1 animate-spin" />
) : (
<RefreshCw className="h-4 w-4 mr-1" />
)}
{isSanityChecking ? 'Checking...' : 'Check Again'}
</Button>
</TooltipTrigger>
<TooltipContent side="top">
<p>Run a fresh consistency check</p>
</TooltipContent>
</Tooltip>
</TooltipProvider>

{/* Proceed directly */}
<Button
onClick={() => {
if (canProceed) {
onNext();
} else {
setShowErrorDialog(true);
}
}}
onClick={onProceedDirect}
disabled={isSanityChecking}
title={
!canProceed
? `There are ${errorCount} validation errors`
@@ -105,36 +175,19 @@ export const ValidationFooter = ({
>
Next
</Button>

<Dialog open={showErrorDialog} onOpenChange={setShowErrorDialog}>
<DialogContent>
<DialogHeader>
<DialogTitle className="pb-3">Are you sure?</DialogTitle>
<DialogDescription>
There are still {errorCount} validation error{errorCount !== 1 ? 's' : ''} in your data.
Are you sure you want to continue?
</DialogDescription>
</DialogHeader>
<DialogFooter>
<Button
variant="outline"
onClick={() => setShowErrorDialog(false)}
>
Cancel
</Button>
<Button
onClick={() => {
setShowErrorDialog(false);
onNext();
}}
>
Continue Anyway
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
</>
)}

{/* Skip mode: just show Continue */}
{skipSanityCheck && (
<Button
onClick={onProceedDirect}
disabled={rowCount === 0}
title="Continue to image upload (sanity check skipped)"
>
Next
</Button>
)}
</div>
</div>
);
File diff suppressed because it is too large
@@ -19,7 +19,6 @@ import { Badge } from '@/components/ui/badge';
import { useValidationStore } from '../store/validationStore';
import {
useFilters,
useSelectedRowCount,
useFields,
} from '../store/selectors';
import { CreateProductCategoryDialog, type CreatedCategoryInfo } from '../../../CreateProductCategoryDialog';
@@ -38,7 +37,6 @@ export const ValidationToolbar = ({
rowsWithErrors,
}: ValidationToolbarProps) => {
const filters = useFilters();
const selectedRowCount = useSelectedRowCount();
const fields = useFields();

// State for the product search template dialog
@@ -29,6 +29,10 @@ import {
} from '@/components/ui/popover';
import type { Field, SelectOption } from '../../../../types';
import type { ValidationError } from '../../store/types';
import { useValidationStore } from '../../store/validationStore';

/** Time window (ms) during which this cell should not open after a popover closes */
const POPOVER_CLOSE_DELAY = 150;

interface ComboboxCellProps {
value: unknown;
@@ -56,6 +60,9 @@ const ComboboxCellComponent = ({
const [isLoadingOptions, setIsLoadingOptions] = useState(false);
const hasFetchedRef = useRef(false);

// Get store state for coordinating with popover close behavior
const cellPopoverClosedAt = useValidationStore((s) => s.cellPopoverClosedAt);

const stringValue = String(value ?? '');
const hasError = errors.length > 0;
const errorMessage = errors[0]?.message;
@@ -67,6 +74,10 @@ const ComboboxCellComponent = ({
// Handle popover open - trigger fetch if needed
const handleOpenChange = useCallback(
(isOpen: boolean) => {
// Block opening if a popover was just closed (click-outside behavior)
if (isOpen && Date.now() - cellPopoverClosedAt < POPOVER_CLOSE_DELAY) {
return;
}
setOpen(isOpen);
if (isOpen && onFetchOptions && options.length === 0 && !hasFetchedRef.current) {
hasFetchedRef.current = true;
@@ -76,7 +87,7 @@ const ComboboxCellComponent = ({
});
}
},
[onFetchOptions, options.length]
[onFetchOptions, options.length, cellPopoverClosedAt]
);

// Handle selection
@@ -7,7 +7,7 @@

import { useState, useCallback, useEffect, useRef, memo } from 'react';
import { Input } from '@/components/ui/input';
import { Loader2, AlertCircle } from 'lucide-react';
import { AlertCircle } from 'lucide-react';
import {
Tooltip,
TooltipContent,
@@ -19,6 +19,10 @@ import { cn } from '@/lib/utils';
import type { Field, SelectOption } from '../../../../types';
import type { ValidationError } from '../../store/types';
import { ErrorType } from '../../store/types';
import { useValidationStore } from '../../store/validationStore';

/** Time window (ms) during which this cell should not focus after a popover closes */
const POPOVER_CLOSE_DELAY = 150;

interface InputCellProps {
value: unknown;
@@ -43,6 +47,9 @@ const InputCellComponent = ({
const [isFocused, setIsFocused] = useState(false);
const inputRef = useRef<HTMLInputElement>(null);

// Get store state for coordinating with popover close behavior
const cellPopoverClosedAt = useValidationStore((s) => s.cellPopoverClosedAt);

// Sync local value with prop value when not focused
useEffect(() => {
if (!isFocused) {
@@ -70,8 +77,13 @@ const InputCellComponent = ({
);

const handleFocus = useCallback(() => {
// Block focus if a popover was just closed (click-outside behavior)
if (Date.now() - cellPopoverClosedAt < POPOVER_CLOSE_DELAY) {
inputRef.current?.blur();
return;
}
setIsFocused(true);
}, []);
}, [cellPopoverClosedAt]);

// Update store only on blur - this is when validation runs too
// Round price fields to 2 decimal places
@@ -6,10 +6,14 @@
*
* PERFORMANCE: Uses uncontrolled open state for Popover.
* Controlled open state can cause delays due to React state processing.
*
* AI SUGGESTIONS: For categories, themes, and colors fields, this component
* displays AI-powered suggestions based on product embeddings. Suggestions
* appear at the top of the dropdown with similarity scores.
*/

import { useCallback, useMemo, memo, useState } from 'react';
import { Check, ChevronsUpDown, AlertCircle } from 'lucide-react';
import { Check, ChevronsUpDown, AlertCircle, Sparkles, Loader2 } from 'lucide-react';
import { Button } from '@/components/ui/button';
import {
Command,
@@ -34,8 +38,13 @@ import { Badge } from '@/components/ui/badge';
import { Skeleton } from '@/components/ui/skeleton';
import { cn } from '@/lib/utils';
import type { Field, SelectOption } from '../../../../types';
import type { ValidationError } from '../../store/types';
import type { ValidationError, TaxonomySuggestion } from '../../store/types';
import { ErrorType } from '../../store/types';
import { useCellSuggestions } from '../../contexts/AiSuggestionsContext';
import { useValidationStore } from '../../store/validationStore';

/** Time window (ms) during which this cell should not open after a popover closes */
const POPOVER_CLOSE_DELAY = 150;

// Extended option type to include hex color values
interface MultiSelectOption extends SelectOption {
@@ -49,6 +58,8 @@ interface MultiSelectCellProps {
field: Field<string>;
options?: SelectOption[];
rowIndex: number;
/** Product's unique __index for AI suggestions */
productIndex?: string;
isValidating: boolean;
errors: ValidationError[];
onChange: (value: unknown) => void;
@@ -56,6 +67,10 @@ interface MultiSelectCellProps {
onFetchOptions?: () => void;
}

// Fields that support AI suggestions
const SUGGESTION_FIELDS = ['categories', 'themes', 'colors'] as const;
type SuggestionField = typeof SUGGESTION_FIELDS[number];

/**
* Helper to extract hex color from option
* Supports hex, hexColor, and hex_color field names
@@ -79,6 +94,7 @@ const MultiSelectCellComponent = ({
value,
field,
options = [],
productIndex,
isValidating,
errors,
onChange: _onChange, // Unused - onBlur handles both update and validation
@@ -86,6 +102,33 @@ const MultiSelectCellComponent = ({
}: MultiSelectCellProps) => {
const [open, setOpen] = useState(false);

// Get store state for coordinating with popover close behavior
const cellPopoverClosedAt = useValidationStore((s) => s.cellPopoverClosedAt);

// Handle popover open/close with check for recent popover close
const handleOpenChange = useCallback((isOpen: boolean) => {
// Block opening if a popover was just closed (click-outside behavior)
if (isOpen && Date.now() - cellPopoverClosedAt < POPOVER_CLOSE_DELAY) {
return;
}
setOpen(isOpen);
}, [cellPopoverClosedAt]);

// Get AI suggestions for categories, themes, and colors
const supportsSuggestions = SUGGESTION_FIELDS.includes(field.key as SuggestionField);
const suggestions = useCellSuggestions(productIndex || '');

// Get the right suggestions based on field type
const fieldSuggestions: TaxonomySuggestion[] = useMemo(() => {
if (!supportsSuggestions || !productIndex) return [];
switch (field.key) {
case 'categories': return suggestions.categories;
case 'themes': return suggestions.themes;
case 'colors': return suggestions.colors;
default: return [];
}
}, [supportsSuggestions, productIndex, field.key, suggestions]);

// Handle wheel scroll in dropdown - stop propagation to prevent table scroll
const handleWheel = useCallback((e: React.WheelEvent<HTMLDivElement>) => {
e.stopPropagation();
@@ -150,7 +193,7 @@ const MultiSelectCellComponent = ({

return (
<div className="relative w-full">
<Popover open={open} onOpenChange={setOpen}>
<Popover open={open} onOpenChange={handleOpenChange}>
<PopoverTrigger asChild>
<Button
variant="outline"
@@ -216,48 +259,132 @@ const MultiSelectCellComponent = ({
<ChevronsUpDown className="ml-2 h-4 w-4 shrink-0 opacity-50" />
</Button>
</PopoverTrigger>
<PopoverContent className="w-[300px] p-0" align="start">
<PopoverContent className="w-[var(--radix-popover-trigger-width)] p-0" align="start">
<Command>
<CommandInput placeholder={`Search ${field.label}...`} />
<CommandList>
<CommandEmpty>No options found.</CommandEmpty>
<div
className="max-h-[200px] overflow-y-auto overscroll-contain"
className="max-h-[250px] overflow-y-auto overscroll-contain"
onWheel={handleWheel}
>
<CommandGroup>
{options.map((option) => {
const hexColor = field.key === 'colors' ? getOptionHex(option as MultiSelectOption) : undefined;
const isWhite = hexColor ? isWhiteColor(hexColor) : false;
{/* Selected items section - floats to top of dropdown */}
{selectedValues.length > 0 && (
<CommandGroup>
<div className="flex items-center gap-2 px-2 py-1.5 text-xs font-medium text-green-600 dark:text-green-400 bg-green-50/80 dark:bg-green-950/40 border-b border-green-100 dark:border-green-900">
<Check className="h-3 w-3" />
<span>Selected ({selectedValues.length})</span>
</div>
{selectedValues.map((selectedVal) => {
const option = options.find((opt) => opt.value === selectedVal) as MultiSelectOption | undefined;
const hexColor = field.key === 'colors' && option ? getOptionHex(option) : undefined;
const isWhite = hexColor ? isWhiteColor(hexColor) : false;
const label = option?.label || selectedVal;

return (
<CommandItem
key={option.value}
value={option.label}
onSelect={() => handleSelect(option.value)}
>
<Check
className={cn(
'mr-2 h-4 w-4',
selectedValues.includes(option.value)
? 'opacity-100'
: 'opacity-0'
return (
<CommandItem
key={`selected-${selectedVal}`}
value={`selected-${label}`}
onSelect={() => handleSelect(selectedVal)}
className="bg-green-50/50 dark:bg-green-950/30"
>
<Check className="mr-2 h-4 w-4 opacity-100 text-green-600" />
{field.key === 'colors' && hexColor && (
<span
className={cn(
'inline-block h-3.5 w-3.5 rounded-full mr-2 flex-shrink-0',
isWhite && 'border border-black'
)}
style={{ backgroundColor: hexColor }}
/>
)}
/>
{/* Color circle for colors field */}
{field.key === 'colors' && hexColor && (
<span
className={cn(
'inline-block h-3.5 w-3.5 rounded-full mr-2 flex-shrink-0',
isWhite && 'border border-black'
{label}
</CommandItem>
);
})}
</CommandGroup>
)}

{/* AI Suggestions section - shown below selected items */}
{supportsSuggestions && (fieldSuggestions.length > 0 || suggestions.isLoading) && (
<CommandGroup>
<div className="flex items-center gap-2 px-2 py-1.5 text-xs font-medium text-purple-600 dark:text-purple-400 bg-purple-50/80 dark:bg-purple-950/40 border-b border-purple-100 dark:border-purple-900">
<Sparkles className="h-3 w-3" />
<span>Suggested</span>
{suggestions.isLoading && <Loader2 className="h-3 w-3 animate-spin" />}
</div>
{fieldSuggestions.slice(0, 5).map((suggestion) => {
const isSelected = selectedValues.includes(String(suggestion.id));
// Skip suggestions that are already in the Selected section
if (isSelected) return null;
const similarityPercent = Math.round(suggestion.similarity * 100);
const hexColor = field.key === 'colors'
? options.find(o => o.value === String(suggestion.id)) as MultiSelectOption | undefined
: undefined;
const suggestionHex = hexColor ? getOptionHex(hexColor) : undefined;

return (
<CommandItem
key={`suggestion-${suggestion.id}`}
value={`suggestion-${suggestion.name}`}
onSelect={() => handleSelect(String(suggestion.id))}
className="bg-purple-50/30 dark:bg-purple-950/20"
>
<div className="flex items-center gap-2 min-w-0 flex-1">
<Check className="h-4 w-4 flex-shrink-0 opacity-0" />
{/* Color circle for colors */}
{field.key === 'colors' && suggestionHex && (
<span
className={cn(
'inline-block h-3.5 w-3.5 rounded-full flex-shrink-0',
isWhiteColor(suggestionHex) && 'border border-black'
)}
style={{ backgroundColor: suggestionHex }}
/>
)}
style={{ backgroundColor: hexColor }}
/>
)}
{option.label}
</CommandItem>
);
})}
{/* Show full path for categories/themes, just name for colors */}
<span className="" title={suggestion.fullPath || suggestion.name}>
{field.key === 'colors' ? suggestion.name : (suggestion.fullPath || suggestion.name)}
</span>
</div>
<span className="text-xs text-purple-500 dark:text-purple-400 ml-2 flex-shrink-0">
{similarityPercent}%
</span>
</CommandItem>
);
})}
</CommandGroup>
)}

{/* Regular options - excludes selected items (shown in Selected section above) */}
<CommandGroup heading={selectedValues.length > 0 || (supportsSuggestions && fieldSuggestions.length > 0) ? "All Options" : undefined}>
{options
.filter((option) => !selectedValues.includes(option.value))
.map((option) => {
const hexColor = field.key === 'colors' ? getOptionHex(option as MultiSelectOption) : undefined;
const isWhite = hexColor ? isWhiteColor(hexColor) : false;

return (
<CommandItem
key={option.value}
value={option.label}
onSelect={() => handleSelect(option.value)}
>
<Check className="mr-2 h-4 w-4 opacity-0" />
{/* Color circle for colors field */}
{field.key === 'colors' && hexColor && (
<span
className={cn(
'inline-block h-3.5 w-3.5 rounded-full mr-2 flex-shrink-0',
isWhite && 'border border-black'
)}
style={{ backgroundColor: hexColor }}
/>
)}
{option.label}
</CommandItem>
);
})}
</CommandGroup>
</div>
</CommandList>
@@ -2,6 +2,7 @@
* MultilineInput Component
*
* Expandable textarea cell for long text content.
* Includes AI suggestion display when available.
* Memoized to prevent unnecessary re-renders when parent table updates.
*/

@@ -9,21 +10,46 @@ import { useState, useCallback, useRef, useEffect, memo } from 'react';
import { Textarea } from '@/components/ui/textarea';
import { cn } from '@/lib/utils';
import { Popover, PopoverTrigger, PopoverContent } from '@/components/ui/popover';
import { X, Loader2 } from 'lucide-react';
import {
Tooltip,
TooltipContent,
TooltipProvider,
TooltipTrigger,
} from '@/components/ui/tooltip';
import { X, Loader2, Sparkles, AlertCircle, Check, ChevronDown, ChevronUp } from 'lucide-react';
import { Button } from '@/components/ui/button';
import type { Field, SelectOption } from '../../../../types';
import type { ValidationError } from '../../store/types';
import { useValidationStore } from '../../store/validationStore';

/** Time window (ms) during which other cells should not open after a popover closes */
const POPOVER_CLOSE_DELAY = 150;

/** AI suggestion data for a single field */
interface AiFieldSuggestion {
isValid: boolean;
suggestion?: string | null;
issues?: string[];
}

interface MultilineInputProps {
value: unknown;
field: Field<string>;
options?: SelectOption[];
rowIndex: number;
productIndex: string;
isValidating: boolean;
errors: ValidationError[];
onChange: (value: unknown) => void;
onBlur: (value: unknown) => void;
onFetchOptions?: () => void;
isLoadingOptions?: boolean;
/** AI suggestion for this field */
aiSuggestion?: AiFieldSuggestion | null;
/** Whether AI is currently validating */
isAiValidating?: boolean;
/** Called when user dismisses/clears the AI suggestion (also called after applying) */
onDismissAiSuggestion?: () => void;
}

const MultilineInputComponent = ({
@@ -33,16 +59,44 @@ const MultilineInputComponent = ({
errors,
onChange,
onBlur,
aiSuggestion,
isAiValidating,
onDismissAiSuggestion,
}: MultilineInputProps) => {
const [popoverOpen, setPopoverOpen] = useState(false);
const [editValue, setEditValue] = useState('');
const [localDisplayValue, setLocalDisplayValue] = useState<string | null>(null);
const [aiSuggestionExpanded, setAiSuggestionExpanded] = useState(false);
const [editedSuggestion, setEditedSuggestion] = useState('');
const cellRef = useRef<HTMLDivElement>(null);
const preventReopenRef = useRef(false);
// Tracks intentional closes (close button, accept/dismiss) vs click-outside closes
const intentionalCloseRef = useRef(false);

// Get store state and actions for coordinating popover close behavior across cells
const cellPopoverClosedAt = useValidationStore((s) => s.cellPopoverClosedAt);
const setCellPopoverClosed = useValidationStore((s) => s.setCellPopoverClosed);

const hasError = errors.length > 0;
const errorMessage = errors[0]?.message;

// Check if we have a displayable AI suggestion
const hasAiSuggestion = aiSuggestion && !aiSuggestion.isValid && aiSuggestion.suggestion;
const aiIssues = aiSuggestion?.issues || [];

// Handle wheel scroll in textarea - stop propagation to prevent table scroll
const handleTextareaWheel = useCallback((e: React.WheelEvent<HTMLTextAreaElement>) => {
const target = e.currentTarget;
const { scrollTop, scrollHeight, clientHeight } = target;
const atTop = scrollTop === 0;
const atBottom = scrollTop + clientHeight >= scrollHeight - 1;

// Only stop propagation if we can scroll in the direction of the wheel
if ((e.deltaY < 0 && !atTop) || (e.deltaY > 0 && !atBottom)) {
e.stopPropagation();
}
}, []);

// Initialize localDisplayValue on mount and when value changes externally
useEffect(() => {
const strValue = String(value ?? '');
@@ -51,6 +105,18 @@ const MultilineInputComponent = ({
}
}, [value, localDisplayValue]);

// Initialize edited suggestion when AI suggestion changes
useEffect(() => {
if (aiSuggestion?.suggestion) {
setEditedSuggestion(aiSuggestion.suggestion);
}
}, [aiSuggestion?.suggestion]);

// Check if another cell's popover was recently closed (prevents immediate focus on click-outside)
const wasPopoverRecentlyClosed = useCallback(() => {
return Date.now() - cellPopoverClosedAt < POPOVER_CLOSE_DELAY;
}, [cellPopoverClosedAt]);

// Handle trigger click to toggle the popover
const handleTriggerClick = useCallback(
(e: React.MouseEvent) => {
@@ -61,6 +127,13 @@ const MultilineInputComponent = ({
return;
}

// Block opening if another popover was just closed
if (wasPopoverRecentlyClosed()) {
e.preventDefault();
e.stopPropagation();
return;
}

// Only process if not already open
if (!popoverOpen) {
setPopoverOpen(true);
@@ -68,10 +141,10 @@ const MultilineInputComponent = ({
setEditValue(localDisplayValue || String(value ?? ''));
}
},
[popoverOpen, value, localDisplayValue]
[popoverOpen, value, localDisplayValue, wasPopoverRecentlyClosed]
);

// Handle immediate close of popover
// Handle immediate close of popover (used by close button and actions - intentional closes)
const handleClosePopover = useCallback(() => {
// Only process if we have changes
if (editValue !== value || editValue !== localDisplayValue) {
@@ -83,27 +156,60 @@ const MultilineInputComponent = ({
onBlur(editValue);
}

// Mark this as an intentional close (not click-outside)
intentionalCloseRef.current = true;

// Immediately close popover
setPopoverOpen(false);
setAiSuggestionExpanded(false);

// Prevent reopening
// Prevent reopening this same cell
preventReopenRef.current = true;
setTimeout(() => {
preventReopenRef.current = false;
}, 100);
}, [editValue, value, localDisplayValue, onChange, onBlur]);

// Handle popover open/close
// Handle popover open/close (called by Radix for click-outside and escape key)
const handlePopoverOpenChange = useCallback(
(open: boolean) => {
if (!open && popoverOpen) {
handleClosePopover();
// Check if this was an intentional close (via close button or actions)
const wasIntentional = intentionalCloseRef.current;
intentionalCloseRef.current = false; // Reset for next time

if (wasIntentional) {
// Intentional close already handled by handleClosePopover
return;
}

// This is a click-outside close - save changes and signal other cells
if (editValue !== value || editValue !== localDisplayValue) {
setLocalDisplayValue(editValue);
onChange(editValue);
onBlur(editValue);
}

setPopoverOpen(false);
setAiSuggestionExpanded(false);

// Signal to other cells that a popover just closed via click-outside
setCellPopoverClosed();

preventReopenRef.current = true;
setTimeout(() => {
preventReopenRef.current = false;
}, 100);
} else if (open && !popoverOpen) {
// Block opening if another popover was just closed
if (wasPopoverRecentlyClosed()) {
return;
}
setEditValue(localDisplayValue || String(value ?? ''));
setPopoverOpen(true);
}
},
[value, popoverOpen, handleClosePopover, localDisplayValue]
[value, popoverOpen, localDisplayValue, wasPopoverRecentlyClosed, editValue, onChange, onBlur, setCellPopoverClosed]
);

// Handle direct input change
@@ -111,37 +217,100 @@ const MultilineInputComponent = ({
setEditValue(e.target.value);
}, []);

// Handle accepting the AI suggestion (possibly edited)
const handleAcceptSuggestion = useCallback(() => {
// Use the edited suggestion
setEditValue(editedSuggestion);
setLocalDisplayValue(editedSuggestion);
onChange(editedSuggestion);
onBlur(editedSuggestion);
onDismissAiSuggestion?.(); // Clear the suggestion after accepting
setAiSuggestionExpanded(false);
}, [editedSuggestion, onChange, onBlur, onDismissAiSuggestion]);

// Handle dismissing the AI suggestion
const handleDismissSuggestion = useCallback(() => {
onDismissAiSuggestion?.();
setAiSuggestionExpanded(false);
}, [onDismissAiSuggestion]);

// Calculate display value
const displayValue = localDisplayValue !== null ? localDisplayValue : String(value ?? '');

// Tooltip content - show full description or error message
const tooltipContent = errorMessage || displayValue;
const showTooltip = tooltipContent && tooltipContent.length > 30;

return (
<div className="w-full relative" ref={cellRef}>
<Popover open={popoverOpen} onOpenChange={handlePopoverOpenChange}>
<PopoverTrigger asChild>
<div
onClick={handleTriggerClick}
className={cn(
'px-2 py-1 h-8 rounded-md text-sm w-full cursor-pointer',
'overflow-hidden whitespace-nowrap text-ellipsis',
'border',
hasError ? 'border-destructive bg-destructive/5' : 'border-input',
isValidating && 'opacity-50'
<TooltipProvider>
<Tooltip delayDuration={300}>
<TooltipTrigger asChild>
<PopoverTrigger asChild>
<div
onClick={handleTriggerClick}
className={cn(
'pl-2 pr-4 py-1 rounded-md text-sm w-full cursor-pointer relative',
'overflow-hidden leading-tight h-[65px] top-0',
'border',
hasError ? 'border-destructive bg-destructive/5' : 'border-input',
hasAiSuggestion && !hasError && 'border-purple-300 bg-purple-50/50 dark:border-purple-700 dark:bg-purple-950/20',
isValidating && 'opacity-50'
)}
>
{displayValue}
{/* AI suggestion indicator - small badge in corner, clickable to open with AI expanded */}
{hasAiSuggestion && !popoverOpen && (
<button
type="button"
onClick={(e) => {
e.stopPropagation();
// Block opening if another popover was just closed
if (wasPopoverRecentlyClosed()) {
return;
}
setAiSuggestionExpanded(true);
setPopoverOpen(true);
setEditValue(localDisplayValue || String(value ?? ''));
}}
className="absolute bottom-1 right-1 flex items-center gap-1 px-1.5 py-0.5 rounded bg-purple-100 hover:bg-purple-200 dark:bg-purple-900/50 dark:hover:bg-purple-800/50 text-purple-600 dark:text-purple-400 text-xs transition-colors"
title="View AI suggestion"
>
<Sparkles className="h-3 w-3" />
<span>{aiIssues.length}</span>
</button>
)}
{/* AI validating indicator */}
{isAiValidating && (
<div className="absolute bottom-1 right-1 flex items-center gap-1 px-1.5 py-0.5 rounded bg-purple-100 dark:bg-purple-900/50">
<Loader2 className="h-3 w-3 animate-spin text-purple-500" />
</div>
)}
</div>
</PopoverTrigger>
</TooltipTrigger>
{showTooltip && !popoverOpen && (
<TooltipContent
side="top"
align="start"
className="max-w-[400px] whitespace-pre-wrap"
>
<p>{tooltipContent}</p>
</TooltipContent>
)}
title={errorMessage || displayValue}
>
{displayValue}
</div>
</PopoverTrigger>
</Tooltip>
</TooltipProvider>
<PopoverContent
className="p-0 shadow-lg rounded-md"
style={{ width: Math.max(cellRef.current?.offsetWidth || 300, 300) }}
style={{ width: Math.max(cellRef.current?.offsetWidth || 400, 400) }}
align="start"
side="bottom"
alignOffset={0}
sideOffset={4}
onInteractOutside={handleClosePopover}
sideOffset={-65}
>
<div className="flex flex-col relative">
<div className="flex flex-col">
{/* Close button */}
<Button
size="icon"
variant="ghost"
@@ -151,13 +320,96 @@ const MultilineInputComponent = ({
<X className="h-3 w-3" />
</Button>

{/* Main textarea */}
<Textarea
value={editValue}
onChange={handleChange}
className="min-h-[150px] border-none focus-visible:ring-0 rounded-md p-2 pr-8"
onWheel={handleTextareaWheel}
className="min-h-[120px] max-h-[200px] overflow-y-auto overscroll-contain border-none focus-visible:ring-0 rounded-t-md rounded-b-none pl-2 pr-4 py-1 resize-y"
placeholder={`Enter ${field.label || 'text'}...`}
autoFocus
/>

{/* AI Suggestion section */}
{hasAiSuggestion && (
<div className="border-t border-purple-200 dark:border-purple-800 bg-purple-50/80 dark:bg-purple-950/30">
{/* Collapsed header - always visible */}
<button
type="button"
onClick={() => setAiSuggestionExpanded(!aiSuggestionExpanded)}
className="w-full flex items-center justify-between px-3 py-2 hover:bg-purple-100/50 dark:hover:bg-purple-900/30 transition-colors"
>
<div className="flex items-center gap-2">
<Sparkles className="h-3.5 w-3.5 text-purple-500" />
<span className="text-xs font-medium text-purple-600 dark:text-purple-400">
AI Suggestion
</span>
<span className="text-xs text-purple-500 dark:text-purple-400">
({aiIssues.length} {aiIssues.length === 1 ? 'issue' : 'issues'})
</span>
</div>
{aiSuggestionExpanded ? (
<ChevronUp className="h-4 w-4 text-purple-400" />
) : (
<ChevronDown className="h-4 w-4 text-purple-400" />
)}
</button>

{/* Expanded content */}
{aiSuggestionExpanded && (
<div className="px-3 pb-3 space-y-3">
{/* Issues list */}
{aiIssues.length > 0 && (
<div className="flex flex-col gap-1">
{aiIssues.map((issue, index) => (
<div
key={index}
className="flex items-start gap-1.5 text-xs text-purple-600 dark:text-purple-400"
>
<AlertCircle className="h-3 w-3 mt-0.5 flex-shrink-0 text-purple-400" />
<span>{issue}</span>
</div>
))}
</div>
)}

{/* Editable suggestion */}
<div>
<div className="text-xs text-purple-500 dark:text-purple-400 mb-1 font-medium">
Suggested (editable):
</div>
<Textarea
value={editedSuggestion}
onChange={(e) => setEditedSuggestion(e.target.value)}
onWheel={handleTextareaWheel}
className="min-h-[120px] max-h-[200px] overflow-y-auto overscroll-contain text-sm bg-white dark:bg-black/20 border-purple-200 dark:border-purple-700 focus-visible:ring-purple-400 resize-y"
/>
</div>

{/* Actions */}
<div className="flex items-center gap-2">
<Button
size="sm"
variant="outline"
className="h-7 px-3 text-xs bg-white border-green-300 text-green-700 hover:bg-green-50 hover:border-green-400 dark:bg-green-950/30 dark:border-green-700 dark:text-green-400"
onClick={handleAcceptSuggestion}
>
<Check className="h-3 w-3 mr-1" />
Replace With Suggestion
</Button>
<Button
size="sm"
variant="ghost"
className="h-7 px-3 text-xs text-gray-500 hover:text-gray-700 dark:text-gray-400"
onClick={handleDismissSuggestion}
>
Dismiss
</Button>
</div>
</div>
)}
</div>
)}
</div>
</PopoverContent>
</Popover>
@@ -33,6 +33,10 @@ import { cn } from '@/lib/utils';
import type { Field, SelectOption } from '../../../../types';
import type { ValidationError } from '../../store/types';
import { ErrorType } from '../../store/types';
import { useValidationStore } from '../../store/validationStore';

/** Time window (ms) during which this cell should not open after a popover closes */
const POPOVER_CLOSE_DELAY = 150;

interface SelectCellProps {
value: unknown;
@@ -62,6 +66,9 @@ const SelectCellComponent = ({
const [isFetchingOptions, setIsFetchingOptions] = useState(false);
const hasFetchedRef = useRef(false);

// Get store state for coordinating with popover close behavior
const cellPopoverClosedAt = useValidationStore((s) => s.cellPopoverClosedAt);

// Combined loading state - either internal fetch or external loading
const isLoadingOptions = isFetchingOptions || externalLoadingOptions;

@@ -78,6 +85,10 @@ const SelectCellComponent = ({
// Handle opening the dropdown - fetch options if needed
const handleOpenChange = useCallback(
async (isOpen: boolean) => {
// Block opening if a popover was just closed (click-outside behavior)
if (isOpen && Date.now() - cellPopoverClosedAt < POPOVER_CLOSE_DELAY) {
return;
}
if (isOpen && onFetchOptions && options.length === 0 && !hasFetchedRef.current) {
hasFetchedRef.current = true;
setIsFetchingOptions(true);
@@ -89,7 +100,7 @@ const SelectCellComponent = ({
}
setOpen(isOpen);
},
[onFetchOptions, options.length]
[onFetchOptions, options.length, cellPopoverClosedAt]
);

// Handle selection
@@ -0,0 +1,429 @@
/**
* AI Suggestions Context
*
* Provides embedding-based suggestions to cells without causing re-renders.
* Uses refs to store suggestion data and callbacks, so consumers can read
* values on-demand without subscribing to state changes.
*
* PERFORMANCE: This context deliberately uses refs instead of state to avoid
* cascading re-renders through the virtualized table. Cells read suggestions
* when they need them (e.g., when dropdown opens).
*/

import React, { createContext, useContext, useRef, useCallback, useEffect, useState } from 'react';
import type { RowData, ProductSuggestions, TaxonomySuggestion } from '../store/types';

// ============================================================================
// Types
// ============================================================================

interface AiSuggestionsContextValue {
/** Check if service is initialized */
isInitialized: boolean;
/** Get suggestions for a product by index */
getSuggestions: (productIndex: string) => ProductSuggestions | undefined;
/** Check if suggestions are loading for a product */
isLoading: (productIndex: string) => boolean;
/** Trigger suggestion fetch for a product */
fetchSuggestions: (product: RowData) => void;
/** Handle field blur - refreshes suggestions if relevant field changed */
handleFieldBlur: (product: RowData, fieldKey: string) => void;
/** Get category suggestions for a product */
getCategorySuggestions: (productIndex: string) => TaxonomySuggestion[];
/** Get theme suggestions for a product */
getThemeSuggestions: (productIndex: string) => TaxonomySuggestion[];
/** Get color suggestions for a product */
getColorSuggestions: (productIndex: string) => TaxonomySuggestion[];
/** Subscribe to suggestion changes for a product (returns unsubscribe fn) */
subscribe: (productIndex: string, callback: () => void) => () => void;
/** Force refresh suggestions for all products */
refreshAll: () => void;
}

interface AiSuggestionsProviderProps {
children: React.ReactNode;
/** Get company name by ID */
getCompanyName?: (id: string) => string | undefined;
/** Get line name by ID */
getLineName?: (id: string) => string | undefined;
/** Initial products to fetch suggestions for */
initialProducts?: RowData[];
/** Whether to auto-initialize (default: true) */
autoInitialize?: boolean;
}

// ============================================================================
// Context
// ============================================================================

const AiSuggestionsContext = createContext<AiSuggestionsContextValue | null>(null);

// Fields that affect embeddings
const EMBEDDING_FIELDS = ['company', 'line', 'name', 'description'];

const API_BASE = '/api/ai';

// ============================================================================
// Provider
// ============================================================================

export function AiSuggestionsProvider({
children,
getCompanyName,
getLineName,
initialProducts,
autoInitialize = true,
}: AiSuggestionsProviderProps) {
// State for initialization status (this can cause re-render, but it's rare)
const [isInitialized, setIsInitialized] = useState(false);

// Refs for data that shouldn't trigger re-renders
const suggestionsRef = useRef<Map<string, ProductSuggestions>>(new Map());
const loadingRef = useRef<Set<string>>(new Set());
const fieldValuesRef = useRef<Map<string, Record<string, unknown>>>(new Map());
const subscribersRef = useRef<Map<string, Set<() => void>>>(new Map());

// Ref for lookup functions (updated on each render)
const lookupFnsRef = useRef({ getCompanyName, getLineName });
lookupFnsRef.current = { getCompanyName, getLineName };

/**
* Notify subscribers when suggestions change
*/
const notifySubscribers = useCallback((productIndex: string) => {
const callbacks = subscribersRef.current.get(productIndex);
if (callbacks) {
callbacks.forEach(cb => cb());
}
}, []);

/**
* Initialize the AI service
*/
const initialize = useCallback(async (): Promise<boolean> => {
if (isInitialized) return true;
|
||||
try {
|
||||
const response = await fetch(`${API_BASE}/initialize`, { method: 'POST' });
|
||||
const data = await response.json();
|
||||
|
||||
if (!response.ok || !data.success) {
|
||||
console.error('[AiSuggestions] Initialization failed:', data.error);
|
||||
return false;
|
||||
}
|
||||
|
||||
setIsInitialized(true);
|
||||
return true;
|
||||
} catch (error) {
|
||||
console.error('[AiSuggestions] Initialization error:', error);
|
||||
return false;
|
||||
}
|
||||
}, [isInitialized]);
|
||||
|
||||
/**
|
||||
* Build product data for API request
|
||||
*/
|
||||
const buildProductRequest = useCallback((product: RowData) => {
|
||||
const { getCompanyName: getCompany, getLineName: getLine } = lookupFnsRef.current;
|
||||
return {
|
||||
name: product.name,
|
||||
description: product.description,
|
||||
company_name: product.company ? getCompany?.(String(product.company)) : undefined,
|
||||
line_name: product.line ? getLine?.(String(product.line)) : undefined,
|
||||
};
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Fetch suggestions for a single product
|
||||
*/
|
||||
const fetchSuggestions = useCallback(async (product: RowData) => {
|
||||
const productIndex = product.__index;
|
||||
if (!productIndex) return;
|
||||
|
||||
// Skip if already loading
|
||||
if (loadingRef.current.has(productIndex)) return;
|
||||
|
||||
// Ensure initialized
|
||||
const ready = await initialize();
|
||||
if (!ready) return;
|
||||
|
||||
// Check if product has enough data for meaningful suggestions
|
||||
const productData = buildProductRequest(product);
|
||||
const hasText = productData.name || productData.description || productData.company_name;
|
||||
if (!hasText) return;
|
||||
|
||||
// Mark as loading
|
||||
loadingRef.current.add(productIndex);
|
||||
notifySubscribers(productIndex);
|
||||
|
||||
try {
|
||||
const response = await fetch(`${API_BASE}/suggestions`, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ product: productData }),
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error(`API error: ${response.status}`);
|
||||
}
|
||||
|
||||
const suggestions: ProductSuggestions = await response.json();
|
||||
|
||||
// Store suggestions
|
||||
suggestionsRef.current.set(productIndex, suggestions);
|
||||
|
||||
// Store field values for change detection
|
||||
fieldValuesRef.current.set(productIndex, {
|
||||
company: product.company,
|
||||
line: product.line,
|
||||
name: product.name,
|
||||
description: product.description,
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('[AiSuggestions] Fetch error:', error);
|
||||
} finally {
|
||||
loadingRef.current.delete(productIndex);
|
||||
notifySubscribers(productIndex);
|
||||
}
|
||||
}, [initialize, buildProductRequest, notifySubscribers]);
|
||||
|
||||
/**
|
||||
* Fetch suggestions for multiple products in batch
|
||||
*/
|
||||
const fetchBatchSuggestions = useCallback(async (products: RowData[]) => {
|
||||
// Ensure initialized
|
||||
const ready = await initialize();
|
||||
if (!ready) return;
|
||||
|
||||
// Filter to products that need fetching
|
||||
const productsToFetch = products.filter(p => {
|
||||
if (!p.__index) return false;
|
||||
if (loadingRef.current.has(p.__index)) return false;
|
||||
if (suggestionsRef.current.has(p.__index)) return false;
|
||||
|
||||
const productData = buildProductRequest(p);
|
||||
return productData.name || productData.description || productData.company_name;
|
||||
});
|
||||
|
||||
if (productsToFetch.length === 0) return;
|
||||
|
||||
// Mark all as loading
|
||||
productsToFetch.forEach(p => {
|
||||
loadingRef.current.add(p.__index);
|
||||
notifySubscribers(p.__index);
|
||||
});
|
||||
|
||||
try {
|
||||
const response = await fetch(`${API_BASE}/suggestions/batch`, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
products: productsToFetch.map(p => ({
|
||||
_index: p.__index,
|
||||
...buildProductRequest(p),
|
||||
})),
|
||||
}),
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
throw new Error(`API error: ${response.status}`);
|
||||
}
|
||||
|
||||
const data = await response.json();
|
||||
|
||||
// Store results
|
||||
for (const result of data.results || []) {
|
||||
const product = productsToFetch[result.index];
|
||||
if (product?.__index) {
|
||||
suggestionsRef.current.set(product.__index, {
|
||||
categories: result.categories,
|
||||
themes: result.themes,
|
||||
colors: result.colors,
|
||||
});
|
||||
|
||||
fieldValuesRef.current.set(product.__index, {
|
||||
company: product.company,
|
||||
line: product.line,
|
||||
name: product.name,
|
||||
description: product.description,
|
||||
});
|
||||
}
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('[AiSuggestions] Batch fetch error:', error);
|
||||
} finally {
|
||||
productsToFetch.forEach(p => {
|
||||
loadingRef.current.delete(p.__index);
|
||||
notifySubscribers(p.__index);
|
||||
});
|
||||
}
|
||||
}, [initialize, buildProductRequest, notifySubscribers]);
|
||||
|
||||
/**
|
||||
* Handle field blur - refresh suggestions if embedding field changed
|
||||
*/
|
||||
const handleFieldBlur = useCallback((product: RowData, fieldKey: string) => {
|
||||
if (!EMBEDDING_FIELDS.includes(fieldKey)) return;
|
||||
|
||||
const productIndex = product.__index;
|
||||
if (!productIndex) return;
|
||||
|
||||
// Check if value actually changed
|
||||
const prevValues = fieldValuesRef.current.get(productIndex);
|
||||
if (prevValues) {
|
||||
const prevValue = prevValues[fieldKey];
|
||||
const currentValue = product[fieldKey];
|
||||
if (prevValue === currentValue) return;
|
||||
}
|
||||
|
||||
// Clear existing suggestions and refetch
|
||||
suggestionsRef.current.delete(productIndex);
|
||||
fieldValuesRef.current.delete(productIndex);
|
||||
|
||||
// Debounce the fetch (simple timeout-based debounce)
|
||||
setTimeout(() => fetchSuggestions(product), 300);
|
||||
}, [fetchSuggestions]);
|
||||
|
||||
/**
|
||||
* Get suggestions for a product
|
||||
*/
|
||||
const getSuggestions = useCallback((productIndex: string): ProductSuggestions | undefined => {
|
||||
return suggestionsRef.current.get(productIndex);
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Check if loading
|
||||
*/
|
||||
const isLoading = useCallback((productIndex: string): boolean => {
|
||||
return loadingRef.current.has(productIndex);
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Get category suggestions
|
||||
*/
|
||||
const getCategorySuggestions = useCallback((productIndex: string): TaxonomySuggestion[] => {
|
||||
return suggestionsRef.current.get(productIndex)?.categories || [];
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Get theme suggestions
|
||||
*/
|
||||
const getThemeSuggestions = useCallback((productIndex: string): TaxonomySuggestion[] => {
|
||||
return suggestionsRef.current.get(productIndex)?.themes || [];
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Get color suggestions
|
||||
*/
|
||||
const getColorSuggestions = useCallback((productIndex: string): TaxonomySuggestion[] => {
|
||||
return suggestionsRef.current.get(productIndex)?.colors || [];
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Subscribe to suggestion changes
|
||||
*/
|
||||
const subscribe = useCallback((productIndex: string, callback: () => void): (() => void) => {
|
||||
if (!subscribersRef.current.has(productIndex)) {
|
||||
subscribersRef.current.set(productIndex, new Set());
|
||||
}
|
||||
subscribersRef.current.get(productIndex)!.add(callback);
|
||||
|
||||
return () => {
|
||||
subscribersRef.current.get(productIndex)?.delete(callback);
|
||||
};
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Refresh all products
|
||||
*/
|
||||
const refreshAll = useCallback(() => {
|
||||
// Clear all cached suggestions
|
||||
suggestionsRef.current.clear();
|
||||
fieldValuesRef.current.clear();
|
||||
|
||||
// If we have initial products, refetch them
|
||||
if (initialProducts && initialProducts.length > 0) {
|
||||
fetchBatchSuggestions(initialProducts);
|
||||
}
|
||||
}, [initialProducts, fetchBatchSuggestions]);
|
||||
|
||||
// Auto-initialize and fetch initial products
|
||||
useEffect(() => {
|
||||
if (!autoInitialize) return;
|
||||
|
||||
const init = async () => {
|
||||
const ready = await initialize();
|
||||
if (ready && initialProducts && initialProducts.length > 0) {
|
||||
// Small delay to avoid blocking initial render
|
||||
setTimeout(() => fetchBatchSuggestions(initialProducts), 100);
|
||||
}
|
||||
};
|
||||
|
||||
init();
|
||||
}, [autoInitialize, initialize, initialProducts, fetchBatchSuggestions]);
|
||||
|
||||
const contextValue: AiSuggestionsContextValue = {
|
||||
isInitialized,
|
||||
getSuggestions,
|
||||
isLoading,
|
||||
fetchSuggestions,
|
||||
handleFieldBlur,
|
||||
getCategorySuggestions,
|
||||
getThemeSuggestions,
|
||||
getColorSuggestions,
|
||||
subscribe,
|
||||
refreshAll,
|
||||
};
|
||||
|
||||
return (
|
||||
<AiSuggestionsContext.Provider value={contextValue}>
|
||||
{children}
|
||||
</AiSuggestionsContext.Provider>
|
||||
);
|
||||
}
|
||||
|
||||
// ============================================================================
|
||||
// Hook
|
||||
// ============================================================================
|
||||
|
||||
export function useAiSuggestionsContext(): AiSuggestionsContextValue | null {
|
||||
return useContext(AiSuggestionsContext);
|
||||
}
|
||||
|
||||
/**
* Hook for cells to get suggestions with re-render on update
* Only use this in cell components that need to display suggestions
*/
export function useCellSuggestions(productIndex: string) {
const context = useAiSuggestionsContext();
const [, forceUpdate] = useState({});

useEffect(() => {
if (!context) return;

// Subscribe to changes for this product
const unsubscribe = context.subscribe(productIndex, () => {
forceUpdate({});
});

return unsubscribe;
}, [context, productIndex]);

if (!context) {
return {
categories: [] as TaxonomySuggestion[],
themes: [] as TaxonomySuggestion[],
colors: [] as TaxonomySuggestion[],
isLoading: false,
};
}

return {
categories: context.getCategorySuggestions(productIndex),
themes: context.getThemeSuggestions(productIndex),
colors: context.getColorSuggestions(productIndex),
isLoading: context.isLoading(productIndex),
};
}

export default AiSuggestionsContext;
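
A minimal usage sketch (not part of this changeset) showing how a cell component might read suggestions through `useCellSuggestions`; the component name, props, and import path below are assumptions.

```typescript
import { useCellSuggestions } from '../contexts/AiSuggestionsContext'; // path assumed

// Hypothetical read-only list rendered inside a category cell's dropdown.
// It re-renders only when the context notifies subscribers for this product.
function CategorySuggestionList({ productIndex }: { productIndex: string }) {
  const { categories, isLoading } = useCellSuggestions(productIndex);

  if (isLoading) return <span>Loading suggestions...</span>;
  if (categories.length === 0) return null;

  return (
    <ul>
      {categories.map((s) => (
        <li key={s.id}>
          {s.name} ({Math.round(s.similarity * 100)}% match)
        </li>
      ))}
    </ul>
  );
}
```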

@@ -10,7 +10,7 @@
* Only visible to users with admin:debug permission.
*/

import { useState, useMemo } from 'react';
import { useState } from 'react';
import {
Dialog,
DialogContent,

@@ -18,7 +18,7 @@ import { Badge } from '@/components/ui/badge';
import { ScrollArea } from '@/components/ui/scroll-area';
import { Check, X, Sparkles, AlertTriangle, Info, Cpu, Brain } from 'lucide-react';
import { Protected } from '@/components/auth/Protected';
import type { AiValidationResults, AiTokenUsage, AiValidationChange } from '../store/types';
import type { AiValidationResults, AiTokenUsage } from '../store/types';

interface AiValidationResultsDialogProps {
results: AiValidationResults;

@@ -0,0 +1,315 @@
/**
* SanityCheckDialog Component
*
* Modal dialog that shows sanity check progress and results.
* Automatically triggered when user clicks Continue to next step.
*/

import {
|
||||
AlertCircle,
|
||||
CheckCircle,
|
||||
Loader2,
|
||||
AlertTriangle,
|
||||
ChevronRight,
|
||||
XCircle,
|
||||
} from 'lucide-react';
|
||||
import {
|
||||
Dialog,
|
||||
DialogContent,
|
||||
DialogFooter,
|
||||
DialogHeader,
|
||||
DialogTitle
|
||||
} from '@/components/ui/dialog';
|
||||
import { Button } from '@/components/ui/button';
|
||||
import { Badge } from '@/components/ui/badge';
|
||||
import { ScrollArea } from '@/components/ui/scroll-area';
|
||||
import type { SanityCheckResult } from '../hooks/useSanityCheck';
|
||||
|
||||
interface SanityCheckDialogProps {
|
||||
/** Whether the dialog is open */
|
||||
open: boolean;
|
||||
/** Called when dialog should close */
|
||||
onOpenChange: (open: boolean) => void;
|
||||
/** Whether the check is currently running */
|
||||
isChecking: boolean;
|
||||
/** Error message if check failed */
|
||||
error: string | null;
|
||||
/** Results of the sanity check */
|
||||
result: SanityCheckResult | null;
|
||||
/** Called when user wants to proceed despite issues */
|
||||
onProceed: () => void;
|
||||
/** Called when user wants to go back and fix issues */
|
||||
onGoBack: () => void;
|
||||
/** Called to refresh/re-run the sanity check */
|
||||
onRefresh?: () => void;
|
||||
/** Called to scroll to a specific product */
|
||||
onScrollToProduct?: (productIndex: number) => void;
|
||||
/** Product names for display (indexed by product index) */
|
||||
productNames?: Record<number, string>;
|
||||
/** Number of validation errors (required fields, etc.) */
|
||||
validationErrorCount?: number;
|
||||
}
|
||||
|
||||
export function SanityCheckDialog({
|
||||
open,
|
||||
onOpenChange,
|
||||
isChecking,
|
||||
error,
|
||||
result,
|
||||
onProceed,
|
||||
onGoBack,
|
||||
onScrollToProduct,
|
||||
productNames = {},
|
||||
validationErrorCount = 0
|
||||
}: SanityCheckDialogProps) {
|
||||
const hasSanityIssues = result?.issues && result.issues.length > 0;
|
||||
const hasValidationErrors = validationErrorCount > 0;
|
||||
const hasAnyIssues = hasSanityIssues || hasValidationErrors;
|
||||
const allClear = !isChecking && !error && !hasSanityIssues && result;
|
||||
|
||||
// Group issues by field, then by exact issue+suggestion combination for deduplication
|
||||
const issuesByField = result?.issues?.reduce((acc, issue) => {
|
||||
const field = issue.field;
|
||||
if (!acc[field]) {
|
||||
acc[field] = {};
|
||||
}
|
||||
|
||||
const key = `${issue.issue}:${issue.suggestion || ''}`;
|
||||
if (!acc[field][key]) {
|
||||
acc[field][key] = {
|
||||
field: issue.field,
|
||||
issue: issue.issue,
|
||||
suggestion: issue.suggestion,
|
||||
productIndices: []
|
||||
};
|
||||
}
|
||||
acc[field][key].productIndices.push(issue.productIndex);
|
||||
return acc;
|
||||
}, {} as Record<string, Record<string, { field: string; issue: string; suggestion?: string; productIndices: number[] }>>) || {};
|
||||
|
||||
return (
|
||||
<Dialog open={open} onOpenChange={onOpenChange}>
|
||||
<DialogContent className="sm:max-w-[600px]">
|
||||
<DialogHeader>
|
||||
<div className="flex items-center justify-between">
|
||||
<DialogTitle className="flex items-center gap-2">
|
||||
{isChecking ? (
|
||||
<>
|
||||
Running Consistency Check...
|
||||
</>
|
||||
) : error ? (
|
||||
<>
|
||||
<XCircle className="h-5 w-5 text-red-500" />
|
||||
Consistency Check Failed
|
||||
</>
|
||||
) : hasAnyIssues ? (
|
||||
<>
|
||||
<AlertTriangle className="h-5 w-5 text-amber-500" />
|
||||
{hasValidationErrors && hasSanityIssues
|
||||
? 'Validation Errors & Consistency Issues Found'
|
||||
: hasValidationErrors
|
||||
? 'Validation Errors'
|
||||
: 'Consistency Issues Found'}
|
||||
</>
|
||||
) : allClear ? (
|
||||
<>
|
||||
Continue
|
||||
</>
|
||||
) : (
|
||||
<>
|
||||
<CheckCircle className="h-5 w-5 text-green-500" />
|
||||
Consistency Check
|
||||
</>
|
||||
)}
|
||||
</DialogTitle>
|
||||
|
||||
</div>
|
||||
</DialogHeader>
|
||||
|
||||
{/* Content */}
|
||||
<div className="py-4">
|
||||
{/* Loading state */}
|
||||
{isChecking && (
|
||||
<div className="flex flex-col items-center justify-center py-8 gap-4">
|
||||
<Loader2 className="h-8 w-8 animate-spin text-purple-500" />
|
||||
<p className="text-sm text-muted-foreground">
|
||||
Analyzing {result?.totalProducts || '...'} products...
|
||||
</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Error state */}
|
||||
{error && !isChecking && (
|
||||
<div className="flex items-start gap-3 p-4 rounded-lg bg-red-50 border border-red-200">
|
||||
<AlertCircle className="h-5 w-5 text-red-500 flex-shrink-0 mt-0.5" />
|
||||
<div>
|
||||
<p className="font-medium text-red-800">Error</p>
|
||||
<p className="text-sm text-red-600 mt-1">{error}</p>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Success state - only show if no validation errors either */}
|
||||
{allClear && !hasValidationErrors && !isChecking && (
|
||||
<div className="flex items-start gap-3 p-4 rounded-lg bg-green-50 border border-green-200">
|
||||
<div>
|
||||
<p className="text-sm text-green-600 mt-1">
|
||||
{result?.summary || 'No consistency issues detected in your products.'}
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Validation errors warning */}
|
||||
{hasValidationErrors && !isChecking && (
|
||||
<div className="flex items-start gap-3 p-4 rounded-lg bg-red-50 border border-red-200 mb-2">
|
||||
<div>
|
||||
<p className="text-sm text-red-600">
|
||||
There {validationErrorCount === 1 ? 'is' : 'are'} {validationErrorCount} validation error{validationErrorCount === 1 ? '' : 's'} (required fields missing, invalid values, etc.) in your data.
|
||||
These should be fixed before continuing.
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
{hasSanityIssues && !isChecking && (
|
||||
<>
|
||||
{/* Summary */}
|
||||
{result?.summary && (
|
||||
<div className="flex items-start gap-3 p-4 rounded-lg bg-amber-50 border border-amber-200">
|
||||
<div>
|
||||
<p className="text-sm text-amber-800">{result.summary}</p>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</>
|
||||
)}
|
||||
|
||||
{/* Sanity check issues list */}
|
||||
{hasSanityIssues && !isChecking && (
|
||||
<ScrollArea className="max-h-[400px] overflow-y-auto mt-4">
|
||||
<div className="space-y-4">
|
||||
|
||||
|
||||
{/* Issues grouped by field, then by unique issue+suggestion */}
|
||||
{Object.entries(issuesByField).map(([field, groups]) => (
|
||||
<div key={field} className="space-y-2">
|
||||
<div className="flex items-center gap-2">
|
||||
<Badge variant="outline" className="text-xs font-semibold">
|
||||
{formatFieldName(field)}
|
||||
</Badge>
|
||||
<span className="text-xs text-muted-foreground">
|
||||
{Object.values(groups).reduce((sum, g) => sum + g.productIndices.length, 0)} issue{Object.values(groups).reduce((sum, g) => sum + g.productIndices.length, 0) === 1 ? '' : 's'}
|
||||
</span>
|
||||
</div>
|
||||
|
||||
{Object.entries(groups).map(([key, group]) => (
|
||||
<div
|
||||
key={key}
|
||||
className="flex items-start gap-3 p-3 rounded-lg bg-gray-50 border border-gray-200 hover:border-gray-300 transition-colors"
|
||||
>
|
||||
<AlertCircle className="h-4 w-4 text-amber-500 flex-shrink-0 mt-0.5" />
|
||||
<div className="flex-1 min-w-0">
|
||||
<div className="flex items-center gap-2 mb-2">
|
||||
<span className="text-xs text-muted-foreground">
|
||||
{group.productIndices.length} product{group.productIndices.length === 1 ? '' : 's'}
|
||||
</span>
|
||||
</div>
|
||||
<div className="flex flex-wrap items-center gap-x-1 gap-y-1 mb-2">
|
||||
{group.productIndices.map((productIndex, idx) => (
|
||||
<span key={productIndex} className="inline-flex items-center">
|
||||
<span className="text-sm font-medium">
|
||||
{productNames[productIndex] || `Product ${productIndex + 1}`}
|
||||
</span>
|
||||
{onScrollToProduct && (
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
className="h-5 px-1 text-xs text-blue-600 hover:text-blue-800"
|
||||
onClick={() => {
|
||||
onOpenChange(false);
|
||||
onScrollToProduct(productIndex);
|
||||
}}
|
||||
>
|
||||
<ChevronRight className="h-3 w-3" />
|
||||
</Button>
|
||||
)}
|
||||
{idx < group.productIndices.length - 1 && (
|
||||
<span className="text-gray-400 mr-1">,</span>
|
||||
)}
|
||||
</span>
|
||||
))}
|
||||
</div>
|
||||
<p className="text-sm text-gray-600">{group.issue}</p>
|
||||
{group.suggestion && (
|
||||
<p className="text-xs text-blue-600 mt-1">
|
||||
{group.suggestion}
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
))}
|
||||
</div>
|
||||
</ScrollArea>
|
||||
)}
|
||||
</div>
|
||||
|
||||
{/* Footer */}
|
||||
<DialogFooter>
|
||||
{isChecking ? (
|
||||
<Button variant="outline" onClick={() => onOpenChange(false)}>
|
||||
Cancel
|
||||
</Button>
|
||||
) : error ? (
|
||||
<>
|
||||
<Button variant="outline" onClick={onGoBack}>
|
||||
Go Back
|
||||
</Button>
|
||||
<Button onClick={() => onOpenChange(false)}>
|
||||
Close
|
||||
</Button>
|
||||
</>
|
||||
) : allClear && !hasValidationErrors ? (
|
||||
<Button onClick={onProceed}>
|
||||
Continue to Next Step
|
||||
<ChevronRight className="h-4 w-4 ml-1" />
|
||||
</Button>
|
||||
) : hasAnyIssues ? (
|
||||
<>
|
||||
<Button variant="outline" onClick={onGoBack}>
|
||||
Go Back & Fix
|
||||
</Button>
|
||||
<Button onClick={onProceed} variant={hasValidationErrors ? 'destructive' : 'default'}>
|
||||
Proceed Anyway
|
||||
</Button>
|
||||
</>
|
||||
) : null}
|
||||
</DialogFooter>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
);
|
||||
}
|
||||
|
||||
/**
* Format a field key into a human-readable name
*/
function formatFieldName(field: string): string {
const fieldNames: Record<string, string> = {
supplier_no: 'Supplier #',
msrp: 'MSRP',
cost_each: 'Cost Each',
qty_per_unit: 'Min Qty',
case_qty: 'Case Pack',
tax_cat: 'Tax Category',
size_cat: 'Size Category',
name: 'Name',
themes: 'Themes',
weight: 'Weight',
length: 'Length',
width: 'Width',
height: 'Height'
};

return fieldNames[field] || field.replace(/_/g, ' ').replace(/\b\w/g, c => c.toUpperCase());
}
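
A sketch (assumed wiring, not part of the diff) of how the dialog pairs with `useSanityCheck` in the step's Continue flow; component and prop names outside the two imported modules are hypothetical.

```typescript
import { useState } from 'react';
import { SanityCheckDialog } from './SanityCheckDialog'; // path assumed
import { useSanityCheck, type ProductForSanityCheck } from '../hooks/useSanityCheck'; // path assumed

// Hypothetical wrapper: open the dialog immediately, fire the batch check,
// and let the dialog render the loading / error / issue states.
function ContinueWithSanityCheck({
  products,
  validationErrorCount,
  onNextStep,
}: {
  products: ProductForSanityCheck[];
  validationErrorCount: number;
  onNextStep: () => void;
}) {
  const { isChecking, error, result, runCheck } = useSanityCheck();
  const [open, setOpen] = useState(false);

  const handleContinue = () => {
    setOpen(true);
    void runCheck(products);
  };

  return (
    <>
      <button onClick={handleContinue}>Continue</button>
      <SanityCheckDialog
        open={open}
        onOpenChange={setOpen}
        isChecking={isChecking}
        error={error}
        result={result}
        onProceed={onNextStep}
        onGoBack={() => setOpen(false)}
        validationErrorCount={validationErrorCount}
      />
    </>
  );
}
```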

@@ -12,7 +12,6 @@ import { useCallback } from 'react';
import { useValidationStore } from '../../store/validationStore';
import type { AiValidationChange, AiValidationResults, AiTokenUsage } from '../../store/types';
import type { Field, SelectOption } from '../../../../types';
import type { AiValidationResponse, AiTokenUsage as ApiTokenUsage } from './useAiApi';

/**
* Helper to convert a value to number or null

@@ -0,0 +1,145 @@
/**
* useAutoInlineAiValidation Hook
*
* Automatically triggers inline AI validation (name/description) for rows
* that have sufficient context when the validation step becomes ready.
*
* Context requirements:
* - Name validation: company + line + name value
* - Description validation: company + line + name + description value
*
* This runs once when the table is ready, firing all requests at once.
* The blur handler in ValidationTable.tsx handles subsequent validations
* when fields are edited.
*/

import { useEffect, useRef } from 'react';
|
||||
import { useValidationStore } from '../store/validationStore';
|
||||
import { useInitPhase } from '../store/selectors';
|
||||
import {
|
||||
buildNameValidationPayload,
|
||||
buildDescriptionValidationPayload,
|
||||
} from '../utils/inlineAiPayload';
|
||||
|
||||
/**
|
||||
* Trigger validation for a single field
|
||||
*/
|
||||
async function triggerValidation(
|
||||
productIndex: string,
|
||||
field: 'name' | 'description',
|
||||
payload: Record<string, unknown>
|
||||
) {
|
||||
const {
|
||||
setInlineAiValidating,
|
||||
setInlineAiSuggestion,
|
||||
markInlineAiAutoValidated,
|
||||
} = useValidationStore.getState();
|
||||
|
||||
const validationKey = `${productIndex}-${field}`;
|
||||
|
||||
// Mark as auto-validated BEFORE calling API (prevents blur handler race condition)
|
||||
markInlineAiAutoValidated(productIndex, field);
|
||||
|
||||
// Mark as validating
|
||||
setInlineAiValidating(validationKey, true);
|
||||
|
||||
const endpoint = field === 'name'
|
||||
? '/api/ai/validate/inline/name'
|
||||
: '/api/ai/validate/inline/description';
|
||||
|
||||
try {
|
||||
const response = await fetch(endpoint, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ product: payload }),
|
||||
});
|
||||
|
||||
const result = await response.json();
|
||||
|
||||
if (result.success !== false) {
|
||||
setInlineAiSuggestion(productIndex, field, {
|
||||
isValid: result.isValid ?? true,
|
||||
suggestion: result.suggestion,
|
||||
issues: result.issues || [],
|
||||
latencyMs: result.latencyMs,
|
||||
});
|
||||
}
|
||||
} catch (err) {
|
||||
console.error(`[AutoInlineAI] ${field} validation error for ${productIndex}:`, err);
|
||||
} finally {
|
||||
setInlineAiValidating(validationKey, false);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Hook that triggers inline AI validation for all rows with sufficient context
|
||||
* when the validation step becomes ready.
|
||||
*/
|
||||
export function useAutoInlineAiValidation() {
|
||||
const initPhase = useInitPhase();
|
||||
const hasRunRef = useRef(false);
|
||||
|
||||
useEffect(() => {
|
||||
// Only run when ready phase is reached, and only once
|
||||
if (initPhase !== 'ready' || hasRunRef.current) {
|
||||
return;
|
||||
}
|
||||
|
||||
hasRunRef.current = true;
|
||||
|
||||
const state = useValidationStore.getState();
|
||||
const { rows, fields, inlineAi } = state;
|
||||
|
||||
console.log('[AutoInlineAI] Starting auto-validation for', rows.length, 'rows');
|
||||
|
||||
let nameCount = 0;
|
||||
let descCount = 0;
|
||||
|
||||
// Process all rows - fire requests immediately (no batching)
|
||||
for (const row of rows) {
|
||||
const productIndex = row.__index;
|
||||
if (!productIndex) continue;
|
||||
|
||||
// Check name context: company + line + name
|
||||
const hasNameContext =
|
||||
row.company &&
|
||||
row.line &&
|
||||
row.name &&
|
||||
typeof row.name === 'string' &&
|
||||
row.name.trim();
|
||||
|
||||
// Check description context: company + line + name + description
|
||||
const hasDescContext =
|
||||
hasNameContext &&
|
||||
row.description &&
|
||||
typeof row.description === 'string' &&
|
||||
row.description.trim();
|
||||
|
||||
// Skip if already auto-validated (shouldn't happen on first run, but be safe)
|
||||
const nameAlreadyValidated = inlineAi.autoValidated.has(`${productIndex}-name`);
|
||||
const descAlreadyValidated = inlineAi.autoValidated.has(`${productIndex}-description`);
|
||||
|
||||
// Skip if currently validating (another process started validation)
|
||||
const nameCurrentlyValidating = inlineAi.validating.has(`${productIndex}-name`);
|
||||
const descCurrentlyValidating = inlineAi.validating.has(`${productIndex}-description`);
|
||||
|
||||
// Trigger name validation if context is sufficient
|
||||
if (hasNameContext && !nameAlreadyValidated && !nameCurrentlyValidating) {
|
||||
const payload = buildNameValidationPayload(row, fields, rows);
|
||||
triggerValidation(productIndex, 'name', payload);
|
||||
nameCount++;
|
||||
}
|
||||
|
||||
// Trigger description validation if context is sufficient
|
||||
if (hasDescContext && !descAlreadyValidated && !descCurrentlyValidating) {
|
||||
const payload = buildDescriptionValidationPayload(row, fields);
|
||||
triggerValidation(productIndex, 'description', payload);
|
||||
descCount++;
|
||||
}
|
||||
}
|
||||
|
||||
console.log(`[AutoInlineAI] Triggered ${nameCount} name validations, ${descCount} description validations`);
}, [initPhase]);
}

export default useAutoInlineAiValidation;
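
Since auto-validation marks each field in `inlineAi.autoValidated` before calling the API, a blur handler can cheaply skip duplicate requests. A small sketch of that guard (the real handler lives in ValidationTable.tsx and may differ; this helper is hypothetical):

```typescript
import { useValidationStore } from '../store/validationStore'; // path assumed

// Returns true only if neither the auto-validation pass nor an in-flight
// request already covers this product/field combination.
function shouldValidateOnBlur(productIndex: string, field: 'name' | 'description'): boolean {
  const { inlineAi } = useValidationStore.getState();
  const key = `${productIndex}-${field}`;
  return !inlineAi.autoValidated.has(key) && !inlineAi.validating.has(key);
}
```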

@@ -0,0 +1,140 @@
/**
* useCopyDownValidation Hook
*
* Watches for copy-down operations and triggers appropriate validations:
* - UPC-related fields (supplier, upc, barcode) -> UPC validation
* - Line field -> Inline AI validation for rows that gain sufficient context
*
* This avoids duplicating validation logic - we reuse the same code paths
* that handle individual cell blur events.
*/

import { useEffect } from 'react';
|
||||
import { useValidationStore } from '../store/validationStore';
|
||||
import { useUpcValidation } from './useUpcValidation';
|
||||
import type { Field } from '../../../types';
|
||||
import {
|
||||
buildNameValidationPayload,
|
||||
buildDescriptionValidationPayload,
|
||||
} from '../utils/inlineAiPayload';
|
||||
|
||||
/**
|
||||
* Trigger inline AI validation for a single row/field
|
||||
*/
|
||||
async function triggerInlineAiValidation(
|
||||
rowIndex: number,
|
||||
field: 'name' | 'description',
|
||||
rows: ReturnType<typeof useValidationStore.getState>['rows'],
|
||||
fields: Field<string>[],
|
||||
setInlineAiValidating: (key: string, validating: boolean) => void,
|
||||
setInlineAiSuggestion: (productIndex: string, field: 'name' | 'description', result: { isValid: boolean; suggestion?: string; issues: string[]; latencyMs?: number }) => void
|
||||
) {
|
||||
const row = rows[rowIndex];
|
||||
if (!row?.__index) return;
|
||||
|
||||
const productIndex = row.__index;
|
||||
const validationKey = `${productIndex}-${field}`;
|
||||
|
||||
setInlineAiValidating(validationKey, true);
|
||||
|
||||
// Build payload using centralized utility
|
||||
const productPayload = field === 'name'
|
||||
? buildNameValidationPayload(row, fields, rows)
|
||||
: buildDescriptionValidationPayload(row, fields);
|
||||
|
||||
const endpoint = field === 'name'
|
||||
? '/api/ai/validate/inline/name'
|
||||
: '/api/ai/validate/inline/description';
|
||||
|
||||
try {
|
||||
const response = await fetch(endpoint, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ product: productPayload }),
|
||||
});
|
||||
|
||||
const result = await response.json();
|
||||
|
||||
if (result.success !== false) {
|
||||
setInlineAiSuggestion(productIndex, field, {
|
||||
isValid: result.isValid ?? true,
|
||||
suggestion: result.suggestion,
|
||||
issues: result.issues || [],
|
||||
latencyMs: result.latencyMs,
|
||||
});
|
||||
}
|
||||
} catch (err) {
|
||||
console.error(`[InlineAI] ${field} validation error on line copy-down:`, err);
|
||||
} finally {
|
||||
setInlineAiValidating(validationKey, false);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Hook that handles validation after copy-down operations.
|
||||
* Should be called once in ValidationContainer to ensure validation runs.
|
||||
*/
|
||||
export const useCopyDownValidation = () => {
|
||||
const { validateUpc } = useUpcValidation();
|
||||
|
||||
// Subscribe to pending validations
|
||||
const pendingUpcValidation = useValidationStore((state) => state.pendingCopyDownValidation);
|
||||
const pendingInlineAiValidation = useValidationStore((state) => state.pendingInlineAiValidation);
|
||||
const clearPendingCopyDownValidation = useValidationStore((state) => state.clearPendingCopyDownValidation);
|
||||
const clearPendingInlineAiValidation = useValidationStore((state) => state.clearPendingInlineAiValidation);
|
||||
|
||||
// Handle UPC validation
|
||||
useEffect(() => {
|
||||
if (!pendingUpcValidation) return;
|
||||
|
||||
const { affectedRows } = pendingUpcValidation;
|
||||
const rows = useValidationStore.getState().rows;
|
||||
|
||||
const validationPromises = affectedRows.map(async (rowIndex) => {
|
||||
const row = rows[rowIndex];
|
||||
if (!row) return;
|
||||
|
||||
const supplierId = row.supplier ? String(row.supplier) : '';
|
||||
const upcValue = row.upc ? String(row.upc) : (row.barcode ? String(row.barcode) : '');
|
||||
|
||||
if (supplierId && upcValue) {
|
||||
await validateUpc(rowIndex, supplierId, upcValue);
|
||||
}
|
||||
});
|
||||
|
||||
Promise.all(validationPromises).then(() => {
|
||||
clearPendingCopyDownValidation();
|
||||
});
|
||||
}, [pendingUpcValidation, validateUpc, clearPendingCopyDownValidation]);
|
||||
|
||||
// Handle inline AI validation (triggered by line copy-down)
|
||||
useEffect(() => {
|
||||
if (!pendingInlineAiValidation) return;
|
||||
|
||||
const { nameRows, descriptionRows } = pendingInlineAiValidation;
|
||||
const state = useValidationStore.getState();
|
||||
const { rows, fields, setInlineAiValidating, setInlineAiSuggestion } = state;
|
||||
|
||||
console.log(`[InlineAI] Line copy-down: validating ${nameRows.length} names, ${descriptionRows.length} descriptions`);
|
||||
|
||||
const validationPromises: Promise<void>[] = [];
|
||||
|
||||
// Trigger name validation for applicable rows
|
||||
for (const rowIndex of nameRows) {
|
||||
validationPromises.push(
|
||||
triggerInlineAiValidation(rowIndex, 'name', rows, fields, setInlineAiValidating, setInlineAiSuggestion)
|
||||
);
|
||||
}
|
||||
|
||||
// Trigger description validation for applicable rows
|
||||
for (const rowIndex of descriptionRows) {
|
||||
validationPromises.push(
|
||||
triggerInlineAiValidation(rowIndex, 'description', rows, fields, setInlineAiValidating, setInlineAiSuggestion)
|
||||
);
|
||||
}
|
||||
|
||||
Promise.all(validationPromises).then(() => {
clearPendingInlineAiValidation();
});
}, [pendingInlineAiValidation, clearPendingInlineAiValidation]);
};
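
A sketch of where this hook would be mounted (container name and import path assumed); one call is enough because the hook subscribes to the store's pending-validation flags itself.

```typescript
import type { ReactNode } from 'react';
import { useCopyDownValidation } from '../hooks/useCopyDownValidation'; // path assumed

// Hypothetical container component: mounts the copy-down follow-up
// validation once for the whole validation step.
function ValidationContainer({ children }: { children: ReactNode }) {
  useCopyDownValidation();
  return <>{children}</>;
}
```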

@@ -0,0 +1,272 @@
/**
* useInlineAiValidation Hook
*
* Provides inline AI validation for product names and descriptions.
* Calls the backend Groq-powered validation endpoints.
*/

import { useState, useCallback, useRef } from 'react';

// Types for the validation results
|
||||
export interface InlineAiResult {
|
||||
isValid: boolean;
|
||||
suggestion: string | null;
|
||||
issues: string[];
|
||||
latencyMs?: number;
|
||||
}
|
||||
|
||||
export interface InlineAiValidationState {
|
||||
isValidating: boolean;
|
||||
error: string | null;
|
||||
nameResult: InlineAiResult | null;
|
||||
descriptionResult: InlineAiResult | null;
|
||||
}
|
||||
|
||||
// Product data structure for validation
|
||||
// Note: company_id is needed by backend to load company-specific prompts, but line_id/subline_id are not needed
|
||||
export interface ProductForValidation {
|
||||
name?: string;
|
||||
description?: string;
|
||||
company_name?: string;
|
||||
company_id?: string; // Needed by backend for prompt loading (not sent to AI model)
|
||||
line_name?: string;
|
||||
subline_name?: string;
|
||||
categories?: string;
|
||||
// Sibling context for naming decisions
|
||||
siblingNames?: string[];
|
||||
}
|
||||
|
||||
// Debounce delay in milliseconds
|
||||
const DEBOUNCE_DELAY = 500;
|
||||
|
||||
/**
|
||||
* Hook for inline AI validation of product fields
|
||||
*/
|
||||
export function useInlineAiValidation() {
|
||||
const [state, setState] = useState<InlineAiValidationState>({
|
||||
isValidating: false,
|
||||
error: null,
|
||||
nameResult: null,
|
||||
descriptionResult: null
|
||||
});
|
||||
|
||||
// Track pending requests for cancellation
|
||||
const abortControllerRef = useRef<AbortController | null>(null);
|
||||
const debounceTimerRef = useRef<NodeJS.Timeout | null>(null);
|
||||
|
||||
/**
|
||||
* Validate a product name
|
||||
*/
|
||||
const validateName = useCallback(async (product: ProductForValidation): Promise<InlineAiResult | null> => {
|
||||
if (!product.name?.trim()) {
|
||||
return null;
|
||||
}
|
||||
|
||||
// Cancel any pending request
|
||||
if (abortControllerRef.current) {
|
||||
abortControllerRef.current.abort();
|
||||
}
|
||||
|
||||
const controller = new AbortController();
|
||||
abortControllerRef.current = controller;
|
||||
|
||||
setState(prev => ({ ...prev, isValidating: true, error: null }));
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/ai/validate/inline/name', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ product }),
|
||||
signal: controller.signal
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorData = await response.json().catch(() => ({}));
|
||||
throw new Error(errorData.error || `Validation failed: ${response.status}`);
|
||||
}
|
||||
|
||||
const result = await response.json();
|
||||
|
||||
const aiResult: InlineAiResult = {
|
||||
isValid: result.isValid ?? true,
|
||||
suggestion: result.suggestion || null,
|
||||
issues: result.issues || [],
|
||||
latencyMs: result.latencyMs
|
||||
};
|
||||
|
||||
setState(prev => ({
|
||||
...prev,
|
||||
isValidating: false,
|
||||
nameResult: aiResult,
|
||||
error: null
|
||||
}));
|
||||
|
||||
return aiResult;
|
||||
} catch (error) {
|
||||
if ((error as Error).name === 'AbortError') {
|
||||
// Request was cancelled, ignore
|
||||
return null;
|
||||
}
|
||||
|
||||
const message = (error as Error).message || 'Validation failed';
|
||||
setState(prev => ({
|
||||
...prev,
|
||||
isValidating: false,
|
||||
error: message
|
||||
}));
|
||||
|
||||
return null;
|
||||
}
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Validate a product description
|
||||
*/
|
||||
const validateDescription = useCallback(async (product: ProductForValidation): Promise<InlineAiResult | null> => {
|
||||
if (!product.name?.trim() && !product.description?.trim()) {
|
||||
return null;
|
||||
}
|
||||
|
||||
// Cancel any pending request
|
||||
if (abortControllerRef.current) {
|
||||
abortControllerRef.current.abort();
|
||||
}
|
||||
|
||||
const controller = new AbortController();
|
||||
abortControllerRef.current = controller;
|
||||
|
||||
setState(prev => ({ ...prev, isValidating: true, error: null }));
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/ai/validate/inline/description', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ product }),
|
||||
signal: controller.signal
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorData = await response.json().catch(() => ({}));
|
||||
throw new Error(errorData.error || `Validation failed: ${response.status}`);
|
||||
}
|
||||
|
||||
const result = await response.json();
|
||||
|
||||
const aiResult: InlineAiResult = {
|
||||
isValid: result.isValid ?? true,
|
||||
suggestion: result.suggestion || null,
|
||||
issues: result.issues || [],
|
||||
latencyMs: result.latencyMs
|
||||
};
|
||||
|
||||
setState(prev => ({
|
||||
...prev,
|
||||
isValidating: false,
|
||||
descriptionResult: aiResult,
|
||||
error: null
|
||||
}));
|
||||
|
||||
return aiResult;
|
||||
} catch (error) {
|
||||
if ((error as Error).name === 'AbortError') {
|
||||
// Request was cancelled, ignore
|
||||
return null;
|
||||
}
|
||||
|
||||
const message = (error as Error).message || 'Validation failed';
|
||||
setState(prev => ({
|
||||
...prev,
|
||||
isValidating: false,
|
||||
error: message
|
||||
}));
|
||||
|
||||
return null;
|
||||
}
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Debounced name validation - call on blur or after typing stops
|
||||
*/
|
||||
const validateNameDebounced = useCallback((product: ProductForValidation) => {
|
||||
// Clear existing timer
|
||||
if (debounceTimerRef.current) {
|
||||
clearTimeout(debounceTimerRef.current);
|
||||
}
|
||||
|
||||
debounceTimerRef.current = setTimeout(() => {
|
||||
validateName(product);
|
||||
}, DEBOUNCE_DELAY);
|
||||
}, [validateName]);
|
||||
|
||||
/**
|
||||
* Debounced description validation
|
||||
*/
|
||||
const validateDescriptionDebounced = useCallback((product: ProductForValidation) => {
|
||||
// Clear existing timer
|
||||
if (debounceTimerRef.current) {
|
||||
clearTimeout(debounceTimerRef.current);
|
||||
}
|
||||
|
||||
debounceTimerRef.current = setTimeout(() => {
|
||||
validateDescription(product);
|
||||
}, DEBOUNCE_DELAY);
|
||||
}, [validateDescription]);
|
||||
|
||||
/**
|
||||
* Clear validation results
|
||||
*/
|
||||
const clearResults = useCallback(() => {
|
||||
// Cancel any pending requests
|
||||
if (abortControllerRef.current) {
|
||||
abortControllerRef.current.abort();
|
||||
abortControllerRef.current = null;
|
||||
}
|
||||
|
||||
if (debounceTimerRef.current) {
|
||||
clearTimeout(debounceTimerRef.current);
|
||||
debounceTimerRef.current = null;
|
||||
}
|
||||
|
||||
setState({
|
||||
isValidating: false,
|
||||
error: null,
|
||||
nameResult: null,
|
||||
descriptionResult: null
|
||||
});
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Clear name result only
|
||||
*/
|
||||
const clearNameResult = useCallback(() => {
|
||||
setState(prev => ({ ...prev, nameResult: null }));
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Clear description result only
|
||||
*/
|
||||
const clearDescriptionResult = useCallback(() => {
|
||||
setState(prev => ({ ...prev, descriptionResult: null }));
|
||||
}, []);
|
||||
|
||||
return {
|
||||
// State
|
||||
isValidating: state.isValidating,
|
||||
error: state.error,
|
||||
nameResult: state.nameResult,
|
||||
descriptionResult: state.descriptionResult,
|
||||
|
||||
// Actions - immediate
|
||||
validateName,
|
||||
validateDescription,
|
||||
|
||||
// Actions - debounced
|
||||
validateNameDebounced,
|
||||
validateDescriptionDebounced,
|
||||
|
||||
// Clear
clearResults,
clearNameResult,
clearDescriptionResult
};
}
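
A usage sketch (not part of the diff) of the debounced name check from a blur handler; the row shape and import path are assumptions, and `company_id` is passed because the backend uses it to load company-specific prompts.

```typescript
import { useInlineAiValidation, type ProductForValidation } from '../hooks/useInlineAiValidation'; // path assumed

// Hypothetical name cell: debounce the AI check on blur and surface the
// latest suggestion next to the input.
function NameCellWithAiHint({ row }: {
  row: { name?: string; company?: string; companyName?: string; lineName?: string };
}) {
  const { validateNameDebounced, nameResult, isValidating } = useInlineAiValidation();

  const handleBlur = () => {
    const product: ProductForValidation = {
      name: row.name,
      company_id: row.company,
      company_name: row.companyName,
      line_name: row.lineName,
    };
    validateNameDebounced(product);
  };

  return (
    <div>
      <input defaultValue={row.name} onBlur={handleBlur} />
      {isValidating && <span>Checking...</span>}
      {nameResult?.suggestion && <span>Suggestion: {nameResult.suggestion}</span>}
    </div>
  );
}
```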

@@ -0,0 +1,250 @@
/**
* useSanityCheck Hook
*
* Runs batch sanity check on products before proceeding to next step.
* Checks for consistency and appropriateness across products.
*
* Results are cached locally - clicking "Sanity Check" again shows cached
* results without making a new API call. Use "Refresh" to force a new check.
*/

import { useState, useCallback, useRef } from 'react';

// Types for sanity check results
|
||||
export interface SanityIssue {
|
||||
productIndex: number;
|
||||
field: string;
|
||||
issue: string;
|
||||
suggestion?: string;
|
||||
}
|
||||
|
||||
export interface SanityCheckResult {
|
||||
issues: SanityIssue[];
|
||||
summary: string;
|
||||
latencyMs?: number;
|
||||
totalProducts?: number;
|
||||
issueCount?: number;
|
||||
/** Timestamp when check was run */
|
||||
checkedAt?: number;
|
||||
}
|
||||
|
||||
export interface SanityCheckState {
|
||||
isChecking: boolean;
|
||||
error: string | null;
|
||||
result: SanityCheckResult | null;
|
||||
hasRun: boolean;
|
||||
}
|
||||
|
||||
// Product data for sanity check (simplified structure)
|
||||
export interface ProductForSanityCheck {
|
||||
name?: string;
|
||||
supplier?: string;
|
||||
supplier_name?: string;
|
||||
company?: string;
|
||||
company_name?: string;
|
||||
supplier_no?: string;
|
||||
msrp?: string | number;
|
||||
cost_each?: string | number;
|
||||
qty_per_unit?: string | number;
|
||||
case_qty?: string | number;
|
||||
tax_cat?: string | number;
|
||||
tax_cat_name?: string;
|
||||
size_cat?: string | number;
|
||||
size_cat_name?: string;
|
||||
themes?: string;
|
||||
theme_names?: string;
|
||||
categories?: string;
|
||||
category_names?: string;
|
||||
weight?: string | number;
|
||||
length?: string | number;
|
||||
width?: string | number;
|
||||
height?: string | number;
|
||||
additional_context?: Record<string, string>; // AI supplemental columns from MatchColumnsStep
|
||||
}
|
||||
|
||||
/**
|
||||
* Hook for batch sanity check of products
|
||||
*/
|
||||
export function useSanityCheck() {
|
||||
const [state, setState] = useState<SanityCheckState>({
|
||||
isChecking: false,
|
||||
error: null,
|
||||
result: null,
|
||||
hasRun: false
|
||||
});
|
||||
|
||||
// Track pending request for cancellation
|
||||
const abortControllerRef = useRef<AbortController | null>(null);
|
||||
|
||||
/**
|
||||
* Run sanity check on products
|
||||
*/
|
||||
const runCheck = useCallback(async (products: ProductForSanityCheck[]): Promise<SanityCheckResult | null> => {
|
||||
if (!products || products.length === 0) {
|
||||
return {
|
||||
issues: [],
|
||||
summary: 'No products to check'
|
||||
};
|
||||
}
|
||||
|
||||
// Cancel any pending request
|
||||
if (abortControllerRef.current) {
|
||||
abortControllerRef.current.abort();
|
||||
}
|
||||
|
||||
const controller = new AbortController();
|
||||
abortControllerRef.current = controller;
|
||||
|
||||
setState(prev => ({
|
||||
...prev,
|
||||
isChecking: true,
|
||||
error: null,
|
||||
hasRun: true
|
||||
}));
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/ai/validate/sanity-check', {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({ products }),
|
||||
signal: controller.signal
|
||||
});
|
||||
|
||||
if (!response.ok) {
|
||||
const errorData = await response.json().catch(() => ({}));
|
||||
throw new Error(errorData.error || `Sanity check failed: ${response.status}`);
|
||||
}
|
||||
|
||||
const data = await response.json();
|
||||
|
||||
const result: SanityCheckResult = {
|
||||
issues: data.issues || [],
|
||||
summary: data.summary || 'Check complete',
|
||||
latencyMs: data.latencyMs,
|
||||
totalProducts: products.length,
|
||||
issueCount: data.issues?.length || 0,
|
||||
checkedAt: Date.now()
|
||||
};
|
||||
|
||||
setState(prev => ({
|
||||
...prev,
|
||||
isChecking: false,
|
||||
result,
|
||||
error: null
|
||||
}));
|
||||
|
||||
return result;
|
||||
} catch (error) {
|
||||
if ((error as Error).name === 'AbortError') {
|
||||
// Request was cancelled
|
||||
return null;
|
||||
}
|
||||
|
||||
const message = (error as Error).message || 'Sanity check failed';
|
||||
setState(prev => ({
|
||||
...prev,
|
||||
isChecking: false,
|
||||
error: message
|
||||
}));
|
||||
|
||||
return null;
|
||||
}
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Cancel the current check
|
||||
*/
|
||||
const cancelCheck = useCallback(() => {
|
||||
if (abortControllerRef.current) {
|
||||
abortControllerRef.current.abort();
|
||||
abortControllerRef.current = null;
|
||||
}
|
||||
|
||||
setState(prev => ({
|
||||
...prev,
|
||||
isChecking: false
|
||||
}));
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Clear results and reset state
|
||||
*/
|
||||
const clearResults = useCallback(() => {
|
||||
if (abortControllerRef.current) {
|
||||
abortControllerRef.current.abort();
|
||||
abortControllerRef.current = null;
|
||||
}
|
||||
|
||||
setState({
|
||||
isChecking: false,
|
||||
error: null,
|
||||
result: null,
|
||||
hasRun: false
|
||||
});
|
||||
}, []);
|
||||
|
||||
/**
|
||||
* Get issues for a specific product index
|
||||
*/
|
||||
const getIssuesForProduct = useCallback((productIndex: number): SanityIssue[] => {
|
||||
if (!state.result?.issues) return [];
|
||||
return state.result.issues.filter(issue => issue.productIndex === productIndex);
|
||||
}, [state.result]);
|
||||
|
||||
/**
|
||||
* Get issues grouped by field
|
||||
*/
|
||||
const getIssuesByField = useCallback((): Record<string, SanityIssue[]> => {
|
||||
if (!state.result?.issues) return {};
|
||||
|
||||
return state.result.issues.reduce((acc, issue) => {
|
||||
const field = issue.field;
|
||||
if (!acc[field]) {
|
||||
acc[field] = [];
|
||||
}
|
||||
acc[field].push(issue);
|
||||
return acc;
|
||||
}, {} as Record<string, SanityIssue[]>);
|
||||
}, [state.result]);
|
||||
|
||||
/**
|
||||
* Check if there are any issues
|
||||
*/
|
||||
const hasIssues = state.result?.issues && state.result.issues.length > 0;
|
||||
|
||||
/**
|
||||
* Check if the sanity check passed (ran with no issues)
|
||||
*/
|
||||
const passed = state.hasRun && !state.isChecking && !state.error && !hasIssues;
|
||||
|
||||
/**
|
||||
* Check if we have cached results that can be displayed
|
||||
*/
|
||||
const hasCachedResults = state.hasRun && state.result !== null && !state.isChecking;
|
||||
|
||||
return {
|
||||
// State
|
||||
isChecking: state.isChecking,
|
||||
error: state.error,
|
||||
result: state.result,
|
||||
hasRun: state.hasRun,
|
||||
hasIssues,
|
||||
passed,
|
||||
hasCachedResults,
|
||||
|
||||
// Computed
|
||||
issues: state.result?.issues || [],
|
||||
summary: state.result?.summary || null,
|
||||
issueCount: state.result?.issueCount || 0,
|
||||
checkedAt: state.result?.checkedAt || null,
|
||||
|
||||
// Actions
|
||||
runCheck,
|
||||
cancelCheck,
|
||||
clearResults,
|
||||
|
||||
// Helpers
getIssuesForProduct,
getIssuesByField
};
}
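
A sketch (assumptions: loose row shape, import path) of mapping validated rows into `ProductForSanityCheck`, with the AI-supplemental columns carried along as `additional_context`:

```typescript
import type { ProductForSanityCheck } from '../hooks/useSanityCheck'; // path assumed

// The row shape here is deliberately loose; the point is that the
// header -> value map captured as __aiSupplemental travels with each
// product so the batch check sees the same extra context.
type LooseRow = Record<string, unknown> & { __aiSupplemental?: Record<string, string> };

function toSanityCheckProducts(rows: LooseRow[]): ProductForSanityCheck[] {
  return rows.map((row) => ({
    name: typeof row.name === 'string' ? row.name : undefined,
    supplier_no: typeof row.supplier_no === 'string' ? row.supplier_no : undefined,
    msrp: row.msrp as string | number | undefined,
    cost_each: row.cost_each as string | number | undefined,
    additional_context: row.__aiSupplemental,
  }));
}
```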

@@ -154,7 +154,7 @@ export const useUpcValidation = () => {
// Set specific error for conflicts
if (result.code === 'conflict') {
setError(rowIndex, 'upc', {
message: 'UPC already exists in database',
message: 'A product with this UPC already exists',
level: 'error',
source: ErrorSource.Upc,
type: ErrorType.Unique,
@@ -262,7 +262,7 @@ export const useUpcValidation = () => {

if (result.code === 'conflict') {
setError(index, 'upc', {
message: 'UPC already exists in database',
message: 'A product with this UPC already exists',
level: 'error',
source: ErrorSource.Upc,
type: ErrorType.Unique,

@@ -18,6 +18,7 @@ import { useTemplateManagement } from './hooks/useTemplateManagement';
|
||||
import { useUpcValidation } from './hooks/useUpcValidation';
|
||||
import { useValidationActions } from './hooks/useValidationActions';
|
||||
import { useProductLines } from './hooks/useProductLines';
|
||||
import { useAutoInlineAiValidation } from './hooks/useAutoInlineAiValidation';
|
||||
import { BASE_IMPORT_FIELDS } from '../../config';
|
||||
import config from '@/config';
|
||||
import type { ValidationStepProps } from './store/types';
|
||||
@@ -120,6 +121,9 @@ export const ValidationStep = ({
|
||||
const { validateAllRows } = useValidationActions();
|
||||
const { prefetchAllLines } = useProductLines();
|
||||
|
||||
// Auto inline AI validation - triggers when ready phase is reached
|
||||
useAutoInlineAiValidation();
|
||||
|
||||
// Fetch field options
|
||||
const { data: fieldOptions, isLoading: optionsLoading, error: optionsError } = useQuery({
|
||||
queryKey: ['field-options'],
|
||||
@@ -128,6 +132,9 @@ export const ValidationStep = ({
|
||||
retry: 2,
|
||||
});
|
||||
|
||||
// Get current store state to check if we're returning to an already-initialized store
|
||||
const storeRows = useValidationStore((state) => state.rows);
|
||||
|
||||
// Initialize store with data
|
||||
useEffect(() => {
|
||||
console.log('[ValidationStep] Init effect running, initStartedRef:', initStartedRef.current, 'initPhase:', initPhase);
|
||||
@@ -140,6 +147,17 @@ export const ValidationStep = ({
|
||||
return;
|
||||
}
|
||||
|
||||
// IMPORTANT: Skip initialization if we're returning to an already-ready store
|
||||
// This happens when navigating back from ImageUploadStep - the store still has
|
||||
// all the validated data, so we don't need to re-run the initialization sequence.
|
||||
// We check that the store is 'ready' and has matching row count to avoid
|
||||
// false positives from stale store data.
|
||||
if (initPhase === 'ready' && storeRows.length === initialData.length && storeRows.length > 0) {
|
||||
console.log('[ValidationStep] Skipping init - returning to already-ready store with', storeRows.length, 'rows');
|
||||
initStartedRef.current = true;
|
||||
return;
|
||||
}
|
||||
|
||||
initStartedRef.current = true;
|
||||
|
||||
console.log('[ValidationStep] Starting initialization with', initialData.length, 'rows');
|
||||
@@ -154,7 +172,7 @@ export const ValidationStep = ({
|
||||
console.log('[ValidationStep] Calling initialize()');
|
||||
initialize(rowData, BASE_IMPORT_FIELDS as unknown as Field<string>[], file);
|
||||
console.log('[ValidationStep] initialize() called');
|
||||
}, [initialData, file, initialize, initPhase]);
|
||||
}, [initialData, file, initialize, initPhase, storeRows.length]);
|
||||
|
||||
// Update fields when options are loaded
|
||||
// CRITICAL: Check store state (not ref) because initialize() resets the store
|
||||
|
||||
@@ -5,7 +5,7 @@
* only to the state they need, preventing unnecessary re-renders.
*/

import { useMemo, useCallback } from 'react';
import { useMemo } from 'react';
import { useValidationStore } from './validationStore';
import type { RowData, ValidationError } from './types';

@@ -21,7 +21,7 @@ export interface RowData {
__original?: Record<string, unknown>; // Original values before AI changes
__corrected?: Record<string, unknown>; // AI-corrected values
__changes?: Record<string, boolean>; // Fields changed by AI
__aiSupplemental?: string[]; // AI supplemental columns from MatchColumnsStep
__aiSupplemental?: Record<string, string>; // AI supplemental columns from MatchColumnsStep (header -> value)

// Standard fields (from config.ts)
supplier?: string;
@@ -151,6 +151,27 @@ export interface CopyDownState {
|
||||
targetRowIndex: number | null; // Hover preview - which row the user is hovering on
|
||||
}
|
||||
|
||||
/**
|
||||
* Tracks rows that need UPC validation after copy-down completes.
|
||||
* This allows reusing the existing validateUpc logic instead of duplicating it.
|
||||
*/
|
||||
export interface PendingCopyDownValidation {
|
||||
fieldKey: string;
|
||||
affectedRows: number[];
|
||||
}
|
||||
|
||||
/**
|
||||
* Tracks rows that need inline AI validation after line copy-down.
|
||||
* When line is copied to rows that already have company + name/description,
|
||||
* those rows now have sufficient context for validation.
|
||||
*/
|
||||
export interface PendingInlineAiValidation {
|
||||
/** Row indices that need name validation */
|
||||
nameRows: number[];
|
||||
/** Row indices that need description validation */
|
||||
descriptionRows: number[];
|
||||
}
|
||||
|
||||
// =============================================================================
|
||||
// Dialog State Types
|
||||
// =============================================================================
|
||||
@@ -218,6 +239,80 @@ export interface AiValidationState {
|
||||
revertedChanges: Set<string>; // Format: "productIndex:fieldKey"
|
||||
}
|
||||
|
||||
// =============================================================================
|
||||
// AI Suggestions Types (Embedding-based)
|
||||
// =============================================================================
|
||||
|
||||
export interface TaxonomySuggestion {
|
||||
id: number;
|
||||
name: string;
|
||||
fullPath?: string;
|
||||
similarity: number;
|
||||
}
|
||||
|
||||
export interface ProductSuggestions {
|
||||
categories: TaxonomySuggestion[];
|
||||
themes: TaxonomySuggestion[];
|
||||
colors: TaxonomySuggestion[];
|
||||
latencyMs?: number;
|
||||
}
|
||||
|
||||
export interface AiSuggestionsState {
|
||||
initialized: boolean;
|
||||
initializing: boolean;
|
||||
/** Map of product __index to their embedding */
|
||||
embeddings: Map<string, number[]>;
|
||||
/** Map of product __index to their suggestions */
|
||||
suggestions: Map<string, ProductSuggestions>;
|
||||
/** Products currently being processed */
|
||||
processing: Set<string>;
|
||||
/** Last error if any */
|
||||
error: string | null;
|
||||
}
|
||||
|
||||
// =============================================================================
|
||||
// Inline AI Validation Types (Groq-powered)
|
||||
// =============================================================================
|
||||
|
||||
/**
|
||||
* Result from inline AI validation (name or description)
|
||||
*/
|
||||
export interface InlineAiValidationResult {
|
||||
isValid: boolean;
|
||||
suggestion?: string;
|
||||
issues: string[];
|
||||
latencyMs?: number;
|
||||
}
|
||||
|
||||
/**
|
||||
* Per-product inline AI suggestions (keyed by product __index)
|
||||
*/
|
||||
export interface InlineAiSuggestion {
|
||||
name?: InlineAiValidationResult;
|
||||
description?: InlineAiValidationResult;
|
||||
/** Whether the suggestion has been dismissed by the user */
|
||||
dismissed?: {
|
||||
name?: boolean;
|
||||
description?: boolean;
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* State for inline AI validation
|
||||
*/
|
||||
export interface InlineAiState {
|
||||
/** Map of product __index to their inline suggestions */
|
||||
suggestions: Map<string, InlineAiSuggestion>;
|
||||
/** Products currently being validated (format: "productIndex-field") */
|
||||
validating: Set<string>;
|
||||
/**
|
||||
* Fields that have been auto-validated on load (format: "productIndex-field")
|
||||
* This prevents re-validation when blur fires for a field that was just auto-validated,
|
||||
* and prevents auto-validation from firing for fields that were manually edited.
|
||||
*/
|
||||
autoValidated: Set<string>;
|
||||
}
|
||||
|
||||
// =============================================================================
|
||||
// Initialization Types
|
||||
// =============================================================================
|
||||
@@ -292,6 +387,8 @@ export interface ValidationState {
|
||||
|
||||
// === Copy-Down Mode ===
|
||||
copyDownMode: CopyDownState;
|
||||
pendingCopyDownValidation: PendingCopyDownValidation | null;
|
||||
pendingInlineAiValidation: PendingInlineAiValidation | null;
|
||||
|
||||
// === Dialogs ===
|
||||
dialogs: DialogState;
|
||||
@@ -302,8 +399,15 @@ export interface ValidationState {
|
||||
// === AI Validation ===
|
||||
aiValidation: AiValidationState;
|
||||
|
||||
// === Inline AI Validation (Groq) ===
|
||||
inlineAi: InlineAiState;
|
||||
|
||||
// === File (for output) ===
|
||||
file: File | null;
|
||||
|
||||
// === UI State ===
|
||||
/** Timestamp when a MultilineInput popover was last closed (for click-outside behavior) */
|
||||
cellPopoverClosedAt: number;
|
||||
}
|
||||
|
||||
// =============================================================================
|
||||
@@ -376,6 +480,8 @@ export interface ValidationActions {
|
||||
cancelCopyDown: () => void;
|
||||
completeCopyDown: (targetRowIndex: number) => void;
|
||||
setTargetRowHover: (rowIndex: number | null) => void;
|
||||
clearPendingCopyDownValidation: () => void;
|
||||
clearPendingInlineAiValidation: () => void;
|
||||
|
||||
// === Dialogs ===
|
||||
setDialogs: (updates: Partial<DialogState>) => void;
|
||||
@@ -396,9 +502,22 @@ export interface ValidationActions {
|
||||
clearAiValidation: () => void;
|
||||
storeOriginalValues: () => void;
|
||||
|
||||
// === Inline AI Validation ===
|
||||
setInlineAiSuggestion: (productIndex: string, field: 'name' | 'description', result: InlineAiValidationResult) => void;
|
||||
dismissInlineAiSuggestion: (productIndex: string, field: 'name' | 'description') => void;
|
||||
acceptInlineAiSuggestion: (productIndex: string, field: 'name' | 'description') => void;
|
||||
clearInlineAiSuggestion: (productIndex: string, field?: 'name' | 'description') => void;
|
||||
setInlineAiValidating: (productIndex: string, validating: boolean) => void;
|
||||
markInlineAiAutoValidated: (productIndex: string, field: 'name' | 'description') => void;
|
||||
isInlineAiAutoValidated: (productIndex: string, field: 'name' | 'description') => boolean;
|
||||
|
||||
// === Output ===
|
||||
getCleanedData: () => CleanRowData[];
|
||||
|
||||
// === UI State ===
|
||||
/** Called when a MultilineInput popover closes to prevent immediate focus on other cells */
|
||||
setCellPopoverClosed: () => void;
|
||||
|
||||
// === Reset ===
|
||||
reset: () => void;
|
||||
}
|
||||
|
||||
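For orientation: the `suggestions` map above is keyed by the row's `__index`, while `validating` and `autoValidated` are documented as using a composite `"productIndex-field"` key. A minimal selector sketch built on these types (the helper name and its import path are illustrative, not part of this diff):

```typescript
import type { InlineAiState, InlineAiValidationResult } from './types';

// Illustrative helper: read the current inline AI status for one row/field pair.
export function getInlineAiStatus(
  inlineAi: InlineAiState,
  productIndex: string,
  field: 'name' | 'description'
): { result?: InlineAiValidationResult; validating: boolean; autoValidated: boolean; dismissed: boolean } {
  const suggestion = inlineAi.suggestions.get(productIndex); // suggestions Map is keyed by row __index
  const key = `${productIndex}-${field}`;                    // composite key documented for the Sets
  return {
    result: suggestion?.[field],
    validating: inlineAi.validating.has(key),
    autoValidated: inlineAi.autoValidated.has(key),
    dismissed: suggestion?.dismissed?.[field] ?? false,
  };
}
```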
@@ -29,6 +29,7 @@ import type {
AiValidationResults,
CopyDownState,
DialogState,
InlineAiValidationResult,
} from './types';
import type { Field, SelectOption } from '../../../types';

@@ -57,6 +58,9 @@ const initialCopyDownState: CopyDownState = {
targetRowIndex: null,
};

// Fields that require UPC validation when changed via copy-down
const UPC_VALIDATION_FIELDS = ['supplier', 'upc', 'barcode'];

const initialDialogState: DialogState = {
templateFormOpen: false,
templateFormData: null,
@@ -105,6 +109,8 @@ const getInitialState = (): ValidationState => ({

// Copy-Down Mode
copyDownMode: { ...initialCopyDownState },
pendingCopyDownValidation: null,
pendingInlineAiValidation: null,

// Dialogs
dialogs: { ...initialDialogState },
@@ -120,8 +126,18 @@ const getInitialState = (): ValidationState => ({
revertedChanges: new Set(),
},

// Inline AI Validation (Groq)
inlineAi: {
suggestions: new Map(),
validating: new Set(),
autoValidated: new Set(),
},

// File
file: null,

// UI State
cellPopoverClosedAt: 0,
});

// =============================================================================
@@ -245,13 +261,45 @@ export const useValidationStore = create<ValidationStore>()(

copyDown: (fromRowIndex: number, fieldKey: string, toRowIndex?: number) => {
set((state) => {
const sourceValue = state.rows[fromRowIndex]?.[fieldKey];
const sourceRow = state.rows[fromRowIndex];
const sourceValue = sourceRow?.[fieldKey];
if (sourceValue === undefined) return;

const endIndex = toRowIndex ?? state.rows.length - 1;
const isInlineAiField = fieldKey === 'name' || fieldKey === 'description';

// For inline AI fields, check if source was validated/dismissed
// If so, we'll mark targets as autoValidated to skip re-validation
let sourceIsDismissed = false;
if (isInlineAiField && sourceRow?.__index) {
const sourceSuggestion = state.inlineAi.suggestions.get(sourceRow.__index);
sourceIsDismissed = sourceSuggestion?.dismissed?.[fieldKey as 'name' | 'description'] ?? false;
}

for (let i = fromRowIndex + 1; i <= endIndex; i++) {
if (state.rows[i]) {
state.rows[i][fieldKey] = sourceValue;
const targetRow = state.rows[i];
if (targetRow) {
targetRow[fieldKey] = sourceValue;

// For name/description fields:
// 1. Mark as autoValidated so blur won't re-validate
// 2. Clear any existing suggestion for this field (value changed)
// 3. Set dismissed state based on source (if source was dismissed, targets are also valid)
if (isInlineAiField && targetRow.__index) {
const field = fieldKey as 'name' | 'description';
state.inlineAi.autoValidated.add(`${targetRow.__index}-${field}`);

// Clear existing suggestion and set dismissed state
const existing = state.inlineAi.suggestions.get(targetRow.__index) || {};
state.inlineAi.suggestions.set(targetRow.__index, {
...existing,
[field]: undefined, // Clear the suggestion for this field
dismissed: {
...existing.dismissed,
[field]: sourceIsDismissed, // Copy dismissed state from source
},
});
}
}
}
});
@@ -574,9 +622,13 @@ export const useValidationStore = create<ValidationStore>()(
const hasValue = sourceValue !== null && sourceValue !== '' &&
!(Array.isArray(sourceValue) && sourceValue.length === 0);

// Track affected rows for UPC validation
const affectedRows: number[] = [];

for (let i = sourceRowIndex + 1; i <= targetRowIndex; i++) {
if (state.rows[i]) {
state.rows[i][fieldKey] = cloneValue(sourceValue);
affectedRows.push(i);

// Clear validation errors for this field if value is non-empty
if (hasValue) {
@@ -596,6 +648,52 @@ export const useValidationStore = create<ValidationStore>()(

// Reset copy-down mode
state.copyDownMode = { ...initialCopyDownState };

// If this field affects UPC validation, store the affected rows
// so a hook can trigger validation using the existing validateUpc function
if (UPC_VALIDATION_FIELDS.includes(fieldKey) && affectedRows.length > 0) {
state.pendingCopyDownValidation = {
fieldKey,
affectedRows,
};
}

// If line is being copied, check which rows now have sufficient context
// for inline AI validation (company + line + name/description)
if (fieldKey === 'line' && affectedRows.length > 0) {
const nameRows: number[] = [];
const descriptionRows: number[] = [];

for (const rowIdx of affectedRows) {
const row = state.rows[rowIdx];
if (!row?.__index) continue;

// Check if row has company + line (just set) + name
const hasNameContext = row.company && sourceValue && row.name &&
typeof row.name === 'string' && row.name.trim();

if (hasNameContext) {
// Check if name hasn't been dismissed
const suggestion = state.inlineAi.suggestions.get(row.__index);
const nameIsDismissed = suggestion?.dismissed?.name;
if (!nameIsDismissed) {
nameRows.push(rowIdx);
}

// Check if description also has sufficient context
const hasDescContext = row.description &&
typeof row.description === 'string' && row.description.trim();
const descIsDismissed = suggestion?.dismissed?.description;
if (hasDescContext && !descIsDismissed) {
descriptionRows.push(rowIdx);
}
}
}

if (nameRows.length > 0 || descriptionRows.length > 0) {
state.pendingInlineAiValidation = { nameRows, descriptionRows };
}
}
});
},

@@ -607,6 +705,18 @@ export const useValidationStore = create<ValidationStore>()(
});
},

clearPendingCopyDownValidation: () => {
set((state) => {
state.pendingCopyDownValidation = null;
});
},

clearPendingInlineAiValidation: () => {
set((state) => {
state.pendingInlineAiValidation = null;
});
},

// =========================================================================
// Dialogs
// =========================================================================
@@ -726,6 +836,113 @@ export const useValidationStore = create<ValidationStore>()(
});
},

// =========================================================================
// Inline AI Validation (Groq)
// =========================================================================

setInlineAiSuggestion: (productIndex: string, field: 'name' | 'description', result: InlineAiValidationResult) => {
// Debug: Log what we're setting
console.log('[Store] setInlineAiSuggestion called:', {
productIndex,
field,
result,
});

set((state) => {
const existing = state.inlineAi.suggestions.get(productIndex) || {};
const newSuggestion = {
...existing,
[field]: result,
dismissed: {
...existing.dismissed,
[field]: false, // Reset dismissed state when new suggestion arrives
},
};
state.inlineAi.suggestions.set(productIndex, newSuggestion);
state.inlineAi.validating.delete(`${productIndex}-${field}`);

// Debug: Log what's in the Map now
console.log('[Store] After set, suggestions Map has:', productIndex, state.inlineAi.suggestions.get(productIndex));
});
},

dismissInlineAiSuggestion: (productIndex: string, field: 'name' | 'description') => {
set((state) => {
const existing = state.inlineAi.suggestions.get(productIndex);
if (existing) {
state.inlineAi.suggestions.set(productIndex, {
...existing,
dismissed: {
...existing.dismissed,
[field]: true,
},
});
}
});
},

acceptInlineAiSuggestion: (productIndex: string, field: 'name' | 'description') => {
set((state) => {
const suggestion = state.inlineAi.suggestions.get(productIndex)?.[field];
if (suggestion?.suggestion) {
// Find the row by __index and update the field
const rowIndex = state.rows.findIndex((row: RowData) => row.__index === productIndex);
if (rowIndex >= 0) {
state.rows[rowIndex][field] = suggestion.suggestion;
}
// Mark as dismissed after accepting
const existing = state.inlineAi.suggestions.get(productIndex);
if (existing) {
state.inlineAi.suggestions.set(productIndex, {
...existing,
dismissed: {
...existing.dismissed,
[field]: true,
},
});
}
}
});
},

clearInlineAiSuggestion: (productIndex: string, field?: 'name' | 'description') => {
set((state) => {
if (field) {
const existing = state.inlineAi.suggestions.get(productIndex);
if (existing) {
const { [field]: _, ...rest } = existing;
if (Object.keys(rest).length === 0 || (Object.keys(rest).length === 1 && 'dismissed' in rest)) {
state.inlineAi.suggestions.delete(productIndex);
} else {
state.inlineAi.suggestions.set(productIndex, rest);
}
}
} else {
state.inlineAi.suggestions.delete(productIndex);
}
});
},

setInlineAiValidating: (productIndex: string, validating: boolean) => {
set((state) => {
if (validating) {
state.inlineAi.validating.add(productIndex);
} else {
state.inlineAi.validating.delete(productIndex);
}
});
},

markInlineAiAutoValidated: (productIndex: string, field: 'name' | 'description') => {
set((state) => {
state.inlineAi.autoValidated.add(`${productIndex}-${field}`);
});
},

isInlineAiAutoValidated: (productIndex: string, field: 'name' | 'description') => {
return get().inlineAi.autoValidated.has(`${productIndex}-${field}`);
},

// =========================================================================
// Output
// =========================================================================
@@ -739,6 +956,16 @@ export const useValidationStore = create<ValidationStore>()(
});
},

// =========================================================================
// UI State
// =========================================================================

setCellPopoverClosed: () => {
set((state) => {
state.cellPopoverClosedAt = Date.now();
});
},

// =========================================================================
// Reset
// =========================================================================

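The `pendingCopyDownValidation` slice written by `completeCopyDown` above is meant to be drained by a hook that reuses the existing UPC validation and then calls `clearPendingCopyDownValidation`; `pendingInlineAiValidation` would be consumed the same way. A minimal consumer sketch (the hook name and the `runUpcValidation` callback are assumptions, not part of this diff):

```typescript
import { useEffect } from 'react';
import { useValidationStore } from './validationStore';

// Illustrative consumer: watch the pending copy-down slice, run validation, then clear it.
export function usePendingCopyDownValidation(
  runUpcValidation: (rowIndex: number) => Promise<void> // assumed callback wrapping the existing validateUpc logic
) {
  const pending = useValidationStore((state) => state.pendingCopyDownValidation);
  const clearPending = useValidationStore((state) => state.clearPendingCopyDownValidation);

  useEffect(() => {
    if (!pending) return;
    // Validate every affected row, then reset the slice so the effect doesn't re-fire.
    Promise.all(pending.affectedRows.map((rowIndex) => runUpcValidation(rowIndex)))
      .finally(() => clearPending());
  }, [pending, runUpcValidation, clearPending]);
}
```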
@@ -1,7 +1,7 @@
import type { Data, Fields, Info, RowHook, TableHook } from "../../../types"
import type { Meta, Errors } from "../../ValidationStepNew/types"
import type { Data, Fields, Info, RowHook, TableHook, Meta, Errors } from "../../../types"
import { v4 } from "uuid"
import { ErrorSources, ErrorType } from "../../../types"
import { normalizeCountryCode } from "./countryUtils"


type DataWithMeta<T extends string> = Data<T> & Meta & {
@@ -57,6 +57,21 @@ export const addErrorsAndRunHooks = async <T extends string>(
}
}

// Normalize country of origin (coo) to 2-letter ISO codes
processedData.forEach((row) => {
const coo = (row as Record<string, unknown>).coo
if (typeof coo === "string" && coo.trim()) {
const raw = coo.trim()
const normalized = normalizeCountryCode(raw)
if (normalized) {
(row as Record<string, unknown>).coo = normalized
} else if (raw.length === 2) {
// Uppercase 2-letter values as fallback
(row as Record<string, unknown>).coo = raw.toUpperCase()
}
}
})

fields.forEach((field) => {
const fieldKey = field.key as string
field.validations?.forEach((validation) => {

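For clarity, the coo normalization above runs before the field validations: recognized values resolve to ISO alpha-2, unrecognized two-letter values are upper-cased, and anything else passes through unchanged for the validators to flag. A standalone sketch of that behavior (what `normalizeCountryCode` actually resolves is an assumption, not taken from `countryUtils`):

```typescript
// Illustrative only - assumes normalizeCountryCode maps full names and alpha-3 codes to alpha-2.
declare function normalizeCountryCode(value: string): string | undefined;

function normalizeCoo(raw: string): string {
  const value = raw.trim();
  const normalized = normalizeCountryCode(value);
  if (normalized) return normalized;                  // recognized value -> ISO alpha-2
  if (value.length === 2) return value.toUpperCase(); // e.g. "us" -> "US" fallback
  return raw;                                         // unrecognized values pass through unchanged
}
```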
@@ -0,0 +1,202 @@
/**
* Inline AI Validation Payload Builder
*
* Centralized utility for building payloads sent to the inline AI validation endpoints.
* This ensures consistent payload structure across all validation triggers:
* - Blur handler in ValidationTable
* - Auto-validation on page load
* - Copy-down validation
* - Template application
*
* Note: IDs are not included as the AI model can't look them up - only names are useful.
*/

import type { RowData } from '../store/types';
import type { Field, SelectOption } from '../../../types';
import { useValidationStore } from '../store/validationStore';

/**
* Helper to look up field option label from field definitions
*/
export function getFieldLabel(
fields: Field<string>[],
fieldKey: string,
val: unknown
): string | undefined {
const fieldDef = fields.find(f => f.key === fieldKey);
if (fieldDef && fieldDef.fieldType.type === 'select' && 'options' in fieldDef.fieldType) {
const option = fieldDef.fieldType.options?.find(o => o.value === String(val));
return option?.label;
}
return undefined;
}

/**
* Look up line name from the productLinesCache
* Line options are loaded dynamically per-company and stored in a separate cache
*/
function getLineName(companyId: string | number, lineId: string | number): string | undefined {
const { productLinesCache } = useValidationStore.getState();
const lineOptions = productLinesCache.get(String(companyId)) as SelectOption[] | undefined;
if (lineOptions) {
const option = lineOptions.find(o => o.value === String(lineId));
return option?.label;
}
return undefined;
}

/**
* Look up subline name from the sublinesCache
* Subline options are loaded dynamically per-line and stored in a separate cache
*/
function getSublineName(lineId: string | number, sublineId: string | number): string | undefined {
const { sublinesCache } = useValidationStore.getState();
const sublineOptions = sublinesCache.get(String(lineId)) as SelectOption[] | undefined;
if (sublineOptions) {
const option = sublineOptions.find(o => o.value === String(sublineId));
return option?.label;
}
return undefined;
}

/**
* Compute sibling product names for naming context.
* Siblings are products with the same company + line (+ subline if set).
*/
export function computeSiblingNames(
row: RowData,
allRows: RowData[]
): string[] {
const siblingNames: string[] = [];

if (!row.company || !row.line) {
return siblingNames;
}

const companyId = String(row.company);
const lineId = String(row.line);
const sublineId = row.subline ? String(row.subline) : null;

for (const otherRow of allRows) {
// Skip self
if (otherRow.__index === row.__index) continue;

// Must match company and line
if (String(otherRow.company) !== companyId) continue;
if (String(otherRow.line) !== lineId) continue;

// If current product has subline, siblings must match subline too
if (sublineId && String(otherRow.subline) !== sublineId) continue;

// Add name if it exists
if (otherRow.name && typeof otherRow.name === 'string' && otherRow.name.trim()) {
siblingNames.push(otherRow.name);
}
}

return siblingNames;
}

/**
* Payload for name validation endpoint
*/
export interface NameValidationPayload {
name: string;
company_name?: string;
company_id?: string; // Needed by backend to load company-specific prompts (not sent to AI)
line_name?: string;
subline_name?: string;
siblingNames?: string[];
[key: string]: unknown;
}

/**
* Payload for description validation endpoint
*/
export interface DescriptionValidationPayload {
name: string;
description: string;
company_name?: string;
company_id?: string; // Needed by backend to load company-specific prompts (not sent to AI)
categories?: string;
[key: string]: unknown;
}

/**
* Options for overriding row values when building payloads
*/
export interface PayloadOverrides {
name?: string;
description?: string;
line?: string | number; // Line ID override (for line change handler)
}

/**
* Build payload for name validation API
*
* @param row - The row data
* @param fields - Field definitions for label lookup
* @param allRows - All rows for sibling computation
* @param overrides - Optional value overrides (e.g., new name from blur handler, new line from line change)
*/
export function buildNameValidationPayload(
row: RowData,
fields: Field<string>[],
allRows: RowData[],
overrides?: PayloadOverrides
): NameValidationPayload {
// Use override line for sibling computation if provided
const effectiveRow = overrides?.line !== undefined
? { ...row, line: String(overrides.line) }
: row;
const siblingNames = computeSiblingNames(effectiveRow, allRows);

// Determine line_name - use override if provided
// Line options are stored in productLinesCache (keyed by company ID), not field options
const lineValue = overrides?.line ?? row.line;
const lineName = row.company && lineValue
? getLineName(row.company, lineValue)
: undefined;

// Subline options are stored in sublinesCache (keyed by line ID), not field options
const sublineName = lineValue && row.subline
? getSublineName(lineValue, row.subline)
: undefined;

return {
name: overrides?.name ?? String(row.name || ''),
company_name: row.company ? getFieldLabel(fields, 'company', row.company) : undefined,
company_id: row.company ? String(row.company) : undefined, // For backend prompt loading
line_name: lineName,
subline_name: sublineName,
siblingNames: siblingNames.length > 0 ? siblingNames : undefined,
};
}

/**
* Build payload for description validation API
*
* @param row - The row data
* @param fields - Field definitions for label lookup
* @param overrides - Optional value overrides (e.g., from blur handler)
*/
export function buildDescriptionValidationPayload(
row: RowData,
fields: Field<string>[],
overrides?: PayloadOverrides
): DescriptionValidationPayload {
const payload: DescriptionValidationPayload = {
name: overrides?.name ?? String(row.name || ''),
description: overrides?.description ?? String(row.description || ''),
company_name: row.company ? getFieldLabel(fields, 'company', row.company) : undefined,
company_id: row.company ? String(row.company) : undefined, // For backend prompt loading
categories: row.categories as string | undefined,
};

// Add AI supplemental context if present (from MatchColumnsStep "AI context only" columns)
if (row.__aiSupplemental && typeof row.__aiSupplemental === 'object') {
payload.additional_context = row.__aiSupplemental;
}

return payload;
}
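Putting the builders together, a caller such as the blur handler or the auto-validation hook would build the payload and post it to the name validation endpoint, storing the result via the inline AI actions; `buildDescriptionValidationPayload` is used the same way. A minimal sketch, with the endpoint path and the `postJson` helper as assumptions rather than code from this diff:

```typescript
import type { Field } from '../../../types';
import type { InlineAiValidationResult } from '../store/types';
import { useValidationStore } from '../store/validationStore';
import { buildNameValidationPayload } from './inlineAiPayload';

// Assumed transport helper - the real code may route through a shared API client instead.
async function postJson<T>(url: string, body: unknown): Promise<T> {
  const res = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Validation request failed: ${res.status}`);
  return res.json() as Promise<T>;
}

// Illustrative trigger: validate one row's name and store the result.
export async function validateRowName(rowIndex: number, fields: Field<string>[]) {
  const { rows, setInlineAiValidating, setInlineAiSuggestion } = useValidationStore.getState();
  const row = rows[rowIndex];
  if (!row?.__index) return;

  setInlineAiValidating(row.__index, true);
  const payload = buildNameValidationPayload(row, fields, rows);
  // '/api/ai/validate-name' is an assumed endpoint path, not taken from this diff.
  const result = await postJson<InlineAiValidationResult>('/api/ai/validate-name', payload);
  setInlineAiSuggestion(row.__index, 'name', result);
}
```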