Refactor validation error handling to use a single source of truth (validationErrors Map), speed up item number generation and loading lines/sublines

2025-03-16 14:09:58 -04:00
parent 153bbecc44
commit 52ae7e10aa
12 changed files with 825 additions and 585 deletions


@@ -62,14 +62,78 @@ const validationTimeoutsRef = useRef<Record<number, NodeJS.Timeout>>({});
While there is cleanup on unmount, if rows are added/removed dynamically, timeouts for deleted rows might not be properly cleared.
## 5. Inefficient Error Storage (RESOLVED)
**Status: RESOLVED**
### Problem
Previously, validation errors were stored in multiple locations:
- In the `validationErrors` Map in `useValidationState`
- In the row data itself as `__errors`
- In the UPC validation results
This redundancy caused several issues:
- Inconsistent error states between storage locations
- Increased memory usage from storing the same information multiple times
- Complex state management to keep the sources in sync
- Difficulty reasoning about where errors should be accessed from
### Solution
We've implemented a unified error storage approach by:
- Making the `validationErrors` Map in `useValidationState` the single source of truth for all validation errors
- Removing the `__errors` property from row data
- Updating all validation functions to interact with the central error store instead of modifying row data
- Modifying UPC validation to use the central error store
- Updating all components to read errors from the `validationErrors` Map instead of row data
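The shape of such a single-source error store can be sketched as below. This is an illustrative standalone class, not the project's actual `useValidationState` hook; the method names (`setFieldError`, `clearFieldError`) are assumptions chosen for the example.

```typescript
// A single Map keyed by row index holds all field-level validation errors.
type FieldError = { level: 'error' | 'warning'; message: string };
type RowErrors = Record<string, FieldError>;

class ValidationErrorStore {
  private errors = new Map<number, RowErrors>();

  // Set or replace the error for one field of one row.
  setFieldError(rowIndex: number, field: string, error: FieldError): void {
    const rowErrors = { ...(this.errors.get(rowIndex) ?? {}) };
    rowErrors[field] = error;
    this.errors.set(rowIndex, rowErrors);
  }

  // Clear one field's error; drop the row entry when no errors remain.
  clearFieldError(rowIndex: number, field: string): void {
    const rowErrors = this.errors.get(rowIndex);
    if (!rowErrors || !(field in rowErrors)) return;
    const { [field]: _removed, ...rest } = rowErrors;
    if (Object.keys(rest).length > 0) this.errors.set(rowIndex, rest);
    else this.errors.delete(rowIndex);
  }

  getFieldError(rowIndex: number, field: string): FieldError | undefined {
    return this.errors.get(rowIndex)?.[field];
  }

  rowHasErrors(rowIndex: number): boolean {
    return this.errors.has(rowIndex);
  }
}
```

Because the row data never carries error metadata, clearing or updating an error touches exactly one structure.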
### Key Changes
1. Modified `dataMutations.ts` to stop storing errors in row data
2. Updated the `Meta` type to remove the `__errors` property
3. Modified the `RowData` type to remove the `__errors` property
4. Updated the `useValidation` hook to return errors separately from row data
5. Modified the `useAiValidation` hook to work with the central error store
6. Updated the `useFilters` hook to check for errors in the `validationErrors` Map
7. Modified the `ValidationTable` and `ValidationCell` components to read errors from the `validationErrors` Map
### Benefits
- **Single Source of Truth**: All validation errors are now stored in one place
- **Reduced Memory Usage**: No duplicate storage of error information
- **Simplified State Management**: Only one state to update when errors change
- **Cleaner Data Structure**: Row data no longer contains validation metadata
- **More Maintainable Code**: Clearer separation of concerns between data and validation
### Future Improvements
While this refactoring addresses the core issue of inefficient error storage, there are still opportunities for further optimization:
1. **Redundant Error Processing**: The validation process still performs some redundant calculations that could be optimized.
2. **Race Conditions**: Async validation can lead to race conditions when multiple validations are triggered in quick succession.
3. **Memory Leaks**: The timeout management for validation could be improved to prevent potential memory leaks.
4. **Tight Coupling**: Components are still tightly coupled to the validation state structure.
5. **Error Prioritization**: The system doesn't prioritize errors well, showing all errors at once rather than focusing on the most critical ones first.
### Validation Flow
The validation process now works as follows:
1. **Error Generation**:
- Field-level validations generate errors based on validation rules
- Row-level hooks add custom validation errors
- Table-level validations (like uniqueness checks) add errors across rows
2. **Error Storage**:
- All errors are stored in the `validationErrors` Map in `useValidationState`
- The Map uses row indices as keys and objects of field errors as values
3. **Error Display**:
- The `ValidationTable` component checks the `validationErrors` Map to highlight rows with errors
- The `ValidationCell` component receives errors for specific fields from the `validationErrors` Map
- Errors are filtered in the UI to avoid showing "required" errors for fields with values
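The display step above can be sketched as a lookup helper. This is a hypothetical function mirroring how a cell component might consult the `validationErrors` Map, including the UI rule of suppressing "required" errors once a field has a value; the name `getCellError` and the message-based check are assumptions for illustration.

```typescript
type CellError = { message: string };

// Cells read errors from the central Map; the row itself carries no __errors.
function getCellError(
  validationErrors: Map<number, Record<string, CellError>>,
  rowIndex: number,
  field: string,
  cellValue: unknown
): CellError | undefined {
  const err = validationErrors.get(rowIndex)?.[field];
  // Mirror the UI rule: hide "required" errors for fields that have a value.
  if (err && cellValue != null && cellValue !== '' && /required/i.test(err.message)) {
    return undefined;
  }
  return err;
}
```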
This focused refactoring approach has successfully addressed a critical issue while keeping changes manageable and targeted.
## 6. Excessive Re-rendering


@@ -185,12 +185,10 @@ export const UploadFlow = ({ state, onNext, onBack }: Props) => {
// Apply global selections to each row of data if they exist
const dataWithGlobalSelections = globalSelections
? dataWithMeta.map((row: Data<string> & { __index?: string }) => {
const newRow = { ...row };
if (globalSelections.supplier) newRow.supplier = globalSelections.supplier;
if (globalSelections.company) newRow.company = globalSelections.company;
return newRow;
})
: dataWithMeta;


@@ -68,6 +68,10 @@ const ValidationContainer = <T extends string>({
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const [isLoadingSublines, setIsLoadingSublines] = useState<Record<string, boolean>>({});
// Add caches for product lines and sublines by company/line ID
const [companyLinesCache, setCompanyLinesCache] = useState<Record<string, any[]>>({});
const [lineSublineCache, setLineSublineCache] = useState<Record<string, any[]>>({});
// Add UPC validation state
const [isValidatingUpc, setIsValidatingUpc] = useState(false);
const [validatingUpcRows, setValidatingUpcRows] = useState<Set<number>>(new Set());
@@ -121,29 +125,50 @@ const ValidationContainer = <T extends string>({
console.log(`Fetching product lines for row ${rowIndex}, company ${companyId}`);
// Check if we already have this company's lines in the cache
if (companyLinesCache[companyId]) {
console.log(`Using cached product lines for company ${companyId}`);
// Use cached data
setRowProductLines(prev => ({ ...prev, [rowIndex]: companyLinesCache[companyId] }));
return companyLinesCache[companyId];
}
// Set loading state for this row
setIsLoadingLines(prev => ({ ...prev, [rowIndex]: true }));
// Fetch product lines from API
const productLinesUrl = `/api/import/product-lines/${companyId}`;
console.log(`Fetching from URL: ${productLinesUrl}`);
const response = await axios.get(productLinesUrl);
if (response.status !== 200) {
throw new Error(`Failed to fetch product lines: ${response.status}`);
}
const productLines = response.data;
console.log(`Received ${productLines.length} product lines for company ${companyId}`);
// Store in company cache
setCompanyLinesCache(prev => ({ ...prev, [companyId]: productLines }));
// Store for this specific row
setRowProductLines(prev => ({ ...prev, [rowIndex]: productLines }));
return productLines;
} catch (error) {
console.error(`Error fetching product lines for company ${companyId}:`, error);
// Set empty array for this company to prevent repeated failed requests
setCompanyLinesCache(prev => ({ ...prev, [companyId]: [] }));
// Store empty array for this specific row
setRowProductLines(prev => ({ ...prev, [rowIndex]: [] }));
return [];
} finally {
// Clear loading state
setIsLoadingLines(prev => ({ ...prev, [rowIndex]: false }));
}
}, [companyLinesCache]);
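The cache-first pattern used by `fetchProductLines` (and `fetchSublines` below), including the negative-caching of failed requests, can be sketched generically. This is an illustrative helper, not the component code; the component versions additionally track per-row loading state through React setters.

```typescript
// Wrap an async fetcher so each id is fetched at most once; failures are
// cached as [] to prevent repeated bad requests.
function makeCachedFetcher<T>(fetchById: (id: string) => Promise<T[]>) {
  const cache: Record<string, T[]> = {};
  return async (id: string): Promise<T[]> => {
    if (cache[id]) return cache[id]; // cache hit: no network round-trip
    try {
      const result = await fetchById(id);
      cache[id] = result;
      return result;
    } catch {
      cache[id] = []; // negative-cache the failure
      return [];
    }
  };
}
```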
// Function to fetch sublines for a specific line - memoized
const fetchSublines = useCallback(async (rowIndex: string | number, lineId: string) => {
@@ -153,29 +178,50 @@ const ValidationContainer = <T extends string>({
console.log(`Fetching sublines for row ${rowIndex}, line ${lineId}`);
// Check if we already have this line's sublines in the cache
if (lineSublineCache[lineId]) {
console.log(`Using cached sublines for line ${lineId}`);
// Use cached data
setRowSublines(prev => ({ ...prev, [rowIndex]: lineSublineCache[lineId] }));
return lineSublineCache[lineId];
}
// Set loading state for this row
setIsLoadingSublines(prev => ({ ...prev, [rowIndex]: true }));
// Fetch sublines from API
const sublinesUrl = `/api/import/sublines/${lineId}`;
console.log(`Fetching from URL: ${sublinesUrl}`);
const response = await axios.get(sublinesUrl);
if (response.status !== 200) {
throw new Error(`Failed to fetch sublines: ${response.status}`);
}
const sublines = response.data;
console.log(`Received ${sublines.length} sublines for line ${lineId}`);
// Store in line cache
setLineSublineCache(prev => ({ ...prev, [lineId]: sublines }));
// Store for this specific row
setRowSublines(prev => ({ ...prev, [rowIndex]: sublines }));
return sublines;
} catch (error) {
console.error(`Error fetching sublines for line ${lineId}:`, error);
// Set empty array for this line to prevent repeated failed requests
setLineSublineCache(prev => ({ ...prev, [lineId]: [] }));
// Store empty array for this specific row
setRowSublines(prev => ({ ...prev, [rowIndex]: [] }));
return [];
} finally {
// Clear loading state
setIsLoadingSublines(prev => ({ ...prev, [rowIndex]: false }));
}
}, [lineSublineCache]);
// Function to validate UPC with the API - memoized
const validateUpc = useCallback(async (rowIndex: number, supplierId: string, upcValue: string): Promise<{ success: boolean, itemNumber?: string }> => {
@@ -214,30 +260,23 @@ const ValidationContainer = <T extends string>({
// UPC already exists - show validation error
const errorData = await response.json();
// We need to trigger validation for this row to update the validation errors
// This will update the validationErrors Map in useValidationState
const row = data[rowIndex];
if (row) {
// Update the UPC field to trigger validation
updateRow(rowIndex, 'upc' as T, row.upc);
// We also need to manually add the error to the validation errors
// But we don't have direct access to setValidationErrors
// So we'll use a workaround by updating the row data
setData(prevData => {
const newData = [...prevData];
// We're only updating the row to trigger validation
// The actual error will be handled by the validation system
return newData;
});
}
return { success: false };
} else if (response.ok) {
@@ -254,26 +293,12 @@ const ValidationContainer = <T extends string>({
[rowIndex]: responseData.itemNumber
}));
// Clear any UPC errors by triggering validation
const row = data[rowIndex];
if (row) {
// Update the UPC field to trigger validation
updateRow(rowIndex, 'upc' as T, row.upc);
}
return { success: true, itemNumber: responseData.itemNumber };
}
@@ -284,15 +309,15 @@ const ValidationContainer = <T extends string>({
console.error(`Error validating UPC for row ${rowIndex}:`, error);
return { success: false };
}
}, [data, updateRow, setData]);
// Apply item numbers when they're available
useEffect(() => {
// Apply item numbers immediately if there are any
if (Object.keys(itemNumbers).length > 0) {
applyItemNumbersToData();
}
}, [itemNumbers, applyItemNumbersToData]);
// Optimized batch validation function - memoized
const validateAllUPCs = useCallback(async () => {
@@ -330,38 +355,31 @@ const ValidationContainer = <T extends string>({
// Mark all rows as being validated
setValidatingUpcRows(new Set(rowsToValidate.map(({ index }) => index)));
try {
// Process all rows in parallel without batching
await Promise.all(
rowsToValidate.map(async ({ row, index }) => {
try {
const rowAny = row as Record<string, any>;
const supplierId = rowAny.supplier.toString();
const upcValue = (rowAny.upc || rowAny.barcode).toString();
// Validate the UPC
await validateUpc(index, supplierId, upcValue);
// Remove this row from the validating set
setValidatingUpcRows(prev => {
const newSet = new Set(prev);
newSet.delete(index);
return newSet;
});
} catch (error) {
console.error(`Error processing row ${index}:`, error);
}
})
);
} catch (error) {
console.error('Error in validation:', error);
} finally {
// Reset validation state
setIsValidatingUpc(false);
@@ -401,23 +419,14 @@ const ValidationContainer = <T extends string>({
// If we can't find the original row, just do a simple update
updateRow(rowIndex, fieldKey, processedValue);
} else {
// Update the data directly
setData(prevData => {
const newData = [...prevData];
const updatedRow = {
...newData[originalIndex],
[fieldKey]: processedValue
};
newData[originalIndex] = updatedRow;
return newData;
});
@@ -443,15 +452,24 @@ const ValidationContainer = <T extends string>({
});
}
// Use cached product lines if available, otherwise fetch
if (rowData && rowData.__index) {
const companyId = value.toString();
if (companyLinesCache[companyId]) {
// Use cached data
console.log(`Using cached product lines for company ${companyId}`);
setRowProductLines(prev => ({
...prev,
[rowData.__index as string]: companyLinesCache[companyId]
}));
} else {
// Fetch product lines for the new company
setTimeout(async () => {
if (value !== undefined) {
await fetchProductLines(rowData.__index as string, companyId);
}
}, 0);
}
}
}
@@ -461,36 +479,33 @@ const ValidationContainer = <T extends string>({
if (rowDataAny.upc || rowDataAny.barcode) {
const upcValue = rowDataAny.upc || rowDataAny.barcode;
// Run UPC validation immediately without timeout
try {
// Mark this row as being validated
setValidatingUpcRows(prev => {
const newSet = new Set(prev);
newSet.add(rowIndex);
return newSet;
});
// Set global validation state
setIsValidatingUpc(true);
// Use supplier ID (the value being set) to validate UPC
await validateUpc(rowIndex, value.toString(), upcValue.toString());
} catch (error) {
console.error('Error validating UPC:', error);
} finally {
// Always clean up validation state, even if there was an error
setValidatingUpcRows(prev => {
const newSet = new Set(prev);
newSet.delete(rowIndex);
if (newSet.size === 0) {
setIsValidatingUpc(false);
}
return newSet;
});
}
}
}
@@ -508,15 +523,24 @@ const ValidationContainer = <T extends string>({
});
}
// Use cached sublines if available, otherwise fetch
if (rowData && rowData.__index) {
const lineId = value.toString();
if (lineSublineCache[lineId]) {
// Use cached data
console.log(`Using cached sublines for line ${lineId}`);
setRowSublines(prev => ({
...prev,
[rowData.__index as string]: lineSublineCache[lineId]
}));
} else {
// Fetch sublines for the new line
setTimeout(async () => {
if (value !== undefined) {
await fetchSublines(rowData.__index as string, lineId);
}
}, 0);
}
}
}
@@ -524,72 +548,302 @@ const ValidationContainer = <T extends string>({
if ((fieldKey === 'upc' || fieldKey === 'barcode') && value && rowData) {
const rowDataAny = rowData as Record<string, any>;
if (rowDataAny.supplier) {
// Run UPC validation immediately without timeout
try {
// Mark this row as being validated
setValidatingUpcRows(prev => {
const newSet = new Set(prev);
newSet.add(rowIndex);
return newSet;
});
// Set global validation state
setIsValidatingUpc(true);
// Use supplier ID from the row data (NOT company ID) to validate UPC
await validateUpc(rowIndex, rowDataAny.supplier.toString(), value.toString());
} catch (error) {
console.error('Error validating UPC:', error);
} finally {
// Always clean up validation state, even if there was an error
setValidatingUpcRows(prev => {
const newSet = new Set(prev);
newSet.delete(rowIndex);
if (newSet.size === 0) {
setIsValidatingUpc(false);
}
return newSet;
});
}
}
}
}, [data, filteredData, updateRow, fetchProductLines, fetchSublines, validateUpc, setData, companyLinesCache, lineSublineCache]);
// When data changes, fetch product lines and sublines for rows that have company/line values
useEffect(() => {
// Skip if there's no data
if (!data.length) return;
console.log("Starting to fetch product lines and sublines");
// Group rows by company and line to minimize API calls
const companiesNeeded = new Map<string, string[]>(); // company ID -> row IDs
const linesNeeded = new Map<string, string[]>(); // line ID -> row IDs
data.forEach(row => {
const rowId = row.__index;
if (!rowId) return; // Skip rows without an index
// If row has company but no product lines fetched yet
if (row.company && !rowProductLines[rowId]) {
const companyId = row.company.toString();
if (!companiesNeeded.has(companyId)) {
companiesNeeded.set(companyId, []);
}
companiesNeeded.get(companyId)?.push(rowId);
}
// If row has line but no sublines fetched yet
if (row.line && !rowSublines[rowId]) {
const lineId = row.line.toString();
if (!linesNeeded.has(lineId)) {
linesNeeded.set(lineId, []);
}
linesNeeded.get(lineId)?.push(rowId);
}
});
console.log(`Need to fetch product lines for ${companiesNeeded.size} companies and sublines for ${linesNeeded.size} lines`);
// Create arrays to hold all fetch promises
const fetchPromises: Promise<void>[] = [];
// Set initial loading states for all affected rows
const lineLoadingUpdates: Record<string, boolean> = {};
const sublineLoadingUpdates: Record<string, boolean> = {};
// Process companies that need product lines
companiesNeeded.forEach((rowIds, companyId) => {
// Skip if already in cache
if (companyLinesCache[companyId]) {
console.log(`Using cached product lines for company ${companyId}`);
// Use cached data for all rows with this company
const lines = companyLinesCache[companyId];
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = lines;
});
setRowProductLines(prev => ({ ...prev, ...updates }));
return;
}
// Set loading state for all affected rows
rowIds.forEach(rowId => {
lineLoadingUpdates[rowId] = true;
});
// Create fetch promise
const fetchPromise = (async () => {
// Safety timeout to ensure loading state is cleared after 10 seconds
const timeoutId = setTimeout(() => {
console.log(`Safety timeout triggered for company ${companyId}`);
const clearLoadingUpdates: Record<string, boolean> = {};
rowIds.forEach(rowId => {
clearLoadingUpdates[rowId] = false;
});
setIsLoadingLines(prev => ({ ...prev, ...clearLoadingUpdates }));
// Set empty cache to prevent repeated requests
setCompanyLinesCache(prev => ({ ...prev, [companyId]: [] }));
// Update rows with empty array
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = [];
});
setRowProductLines(prev => ({ ...prev, ...updates }));
toast.error(`Timeout loading product lines for company ${companyId}`);
}, 10000);
try {
console.log(`Fetching product lines for company ${companyId} (affecting ${rowIds.length} rows)`);
// Fetch product lines from API
const productLinesUrl = `/api/import/product-lines/${companyId}`;
console.log(`Fetching from URL: ${productLinesUrl}`);
const response = await axios.get(productLinesUrl);
console.log(`Product lines API response status for company ${companyId}:`, response.status);
const productLines = response.data;
console.log(`Received ${productLines.length} product lines for company ${companyId}`);
// Store in company cache
setCompanyLinesCache(prev => ({ ...prev, [companyId]: productLines }));
// Update all rows with this company
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = productLines;
});
setRowProductLines(prev => ({ ...prev, ...updates }));
} catch (error) {
console.error(`Error fetching product lines for company ${companyId}:`, error);
// Set empty array for this company to prevent repeated failed requests
setCompanyLinesCache(prev => ({ ...prev, [companyId]: [] }));
// Update rows with empty array
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = [];
});
setRowProductLines(prev => ({ ...prev, ...updates }));
// Show error toast
toast.error(`Failed to load product lines for company ${companyId}`);
} finally {
// Clear the safety timeout
clearTimeout(timeoutId);
// Clear loading state for all affected rows
const clearLoadingUpdates: Record<string, boolean> = {};
rowIds.forEach(rowId => {
clearLoadingUpdates[rowId] = false;
});
setIsLoadingLines(prev => ({ ...prev, ...clearLoadingUpdates }));
}
})();
fetchPromises.push(fetchPromise);
});
// Process lines that need sublines
linesNeeded.forEach((rowIds, lineId) => {
// Skip if already in cache
if (lineSublineCache[lineId]) {
console.log(`Using cached sublines for line ${lineId}`);
// Use cached data for all rows with this line
const sublines = lineSublineCache[lineId];
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = sublines;
});
setRowSublines(prev => ({ ...prev, ...updates }));
return;
}
// Set loading state for all affected rows
rowIds.forEach(rowId => {
sublineLoadingUpdates[rowId] = true;
});
// Create fetch promise
const fetchPromise = (async () => {
// Safety timeout to ensure loading state is cleared after 10 seconds
const timeoutId = setTimeout(() => {
console.log(`Safety timeout triggered for line ${lineId}`);
const clearLoadingUpdates: Record<string, boolean> = {};
rowIds.forEach(rowId => {
clearLoadingUpdates[rowId] = false;
});
setIsLoadingSublines(prev => ({ ...prev, ...clearLoadingUpdates }));
// Set empty cache to prevent repeated requests
setLineSublineCache(prev => ({ ...prev, [lineId]: [] }));
// Update rows with empty array
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = [];
});
setRowSublines(prev => ({ ...prev, ...updates }));
toast.error(`Timeout loading sublines for line ${lineId}`);
}, 10000);
try {
console.log(`Fetching sublines for line ${lineId} (affecting ${rowIds.length} rows)`);
// Fetch sublines from API
const sublinesUrl = `/api/import/sublines/${lineId}`;
console.log(`Fetching from URL: ${sublinesUrl}`);
const response = await axios.get(sublinesUrl);
console.log(`Sublines API response status for line ${lineId}:`, response.status);
const sublines = response.data;
console.log(`Received ${sublines.length} sublines for line ${lineId}`);
// Store in line cache
setLineSublineCache(prev => ({ ...prev, [lineId]: sublines }));
// Update all rows with this line
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = sublines;
});
setRowSublines(prev => ({ ...prev, ...updates }));
} catch (error) {
console.error(`Error fetching sublines for line ${lineId}:`, error);
// Set empty array for this line to prevent repeated failed requests
setLineSublineCache(prev => ({ ...prev, [lineId]: [] }));
// Update rows with empty array
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = [];
});
setRowSublines(prev => ({ ...prev, ...updates }));
// Show error toast
toast.error(`Failed to load sublines for line ${lineId}`);
} finally {
// Clear the safety timeout
clearTimeout(timeoutId);
// Clear loading state for all affected rows
const clearLoadingUpdates: Record<string, boolean> = {};
rowIds.forEach(rowId => {
clearLoadingUpdates[rowId] = false;
});
setIsLoadingSublines(prev => ({ ...prev, ...clearLoadingUpdates }));
}
})();
fetchPromises.push(fetchPromise);
});
// Set initial loading states
if (Object.keys(lineLoadingUpdates).length > 0) {
console.log(`Setting loading state for ${Object.keys(lineLoadingUpdates).length} rows (product lines)`);
setIsLoadingLines(prev => ({ ...prev, ...lineLoadingUpdates }));
}
if (Object.keys(sublineLoadingUpdates).length > 0) {
console.log(`Setting loading state for ${Object.keys(sublineLoadingUpdates).length} rows (sublines)`);
setIsLoadingSublines(prev => ({ ...prev, ...sublineLoadingUpdates }));
}
// Run all fetch operations in parallel
Promise.all(fetchPromises).then(() => {
console.log("All product lines and sublines fetch operations completed");
}).catch(error => {
console.error('Error in fetch operations:', error);
});
}, [data, rowProductLines, rowSublines, companyLinesCache, lineSublineCache]);
 // Validate UPCs on initial data load
 useEffect(() => {
   // Skip if there's no data or we've already done the validation
   if (data.length === 0 || initialUpcValidationDoneRef.current) return;
-  // Use a short timeout to allow the UI to render first
-  const timer = setTimeout(() => {
-    validateAllUPCs();
-  }, 100);
-  return () => clearTimeout(timer);
+  // Run validation immediately without timeout
+  validateAllUPCs();
+  // No cleanup needed since we're not using a timer
 }, [data, validateAllUPCs]);
 // Use AI validation hook
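The single-source-of-truth pattern this commit adopts can be sketched in isolation. The names below (`setFieldErrors`, `rowHasErrors`) are illustrative, not the app's actual API; the point is that rows stay plain data while errors live in one index-keyed Map:

```typescript
// Minimal sketch of a central error store keyed by row index.
// All names here are illustrative, not the app's real API.
type ValidationError = { message: string; level: 'error' | 'warning' };

// Errors live in one Map; row objects never carry an __errors property.
const validationErrors = new Map<number, Record<string, ValidationError[]>>();

function setFieldErrors(
  rowIndex: number,
  field: string,
  errors: ValidationError[]
): void {
  const rowErrors = { ...(validationErrors.get(rowIndex) ?? {}) };
  if (errors.length > 0) {
    rowErrors[field] = errors;
  } else {
    delete rowErrors[field];
  }
  // Drop the row entry entirely when it has no errors left.
  if (Object.keys(rowErrors).length > 0) {
    validationErrors.set(rowIndex, rowErrors);
  } else {
    validationErrors.delete(rowIndex);
  }
}

function rowHasErrors(rowIndex: number): boolean {
  return Object.keys(validationErrors.get(rowIndex) ?? {}).length > 0;
}
```

Storing errors out-of-band like this means clearing a row's last error also removes its Map entry, so `validationErrors.has(index)` doubles as a cheap "row has errors" check.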
View File
@@ -88,22 +88,10 @@ export const useAiValidation = <T extends string>(
     // Call the original hook
     const result = await rowHook(row);
     // Extract Meta-specific properties
-    const { __index, __errors } = result;
-    // Return a Meta object with properly typed errors
+    const { __index } = result;
+    // Return a Meta object with only the __index property
     return {
-      __index: __index || row.__index || '',
-      __errors: __errors ?
-        Object.fromEntries(
-          Object.entries(__errors).map(([key, value]) => {
-            const errorArray = Array.isArray(value) ? value : [value];
-            return [key, {
-              message: errorArray[0].message,
-              level: errorArray[0].level,
-              source: ErrorSources.Row,
-              type: ErrorType.Custom
-            } as InfoWithSource]
-          })
-        ) : null
+      __index: __index || row.__index || ''
     };
   } : undefined;
@@ -113,19 +101,7 @@ export const useAiValidation = <T extends string>(
     const results = await tableHook(rows);
     // Extract Meta-specific properties from each result
     return results.map((result, index) => ({
-      __index: result.__index || rows[index].__index || '',
-      __errors: result.__errors ?
-        Object.fromEntries(
-          Object.entries(result.__errors).map(([key, value]) => {
-            const errorArray = Array.isArray(value) ? value : [value];
-            return [key, {
-              message: errorArray[0].message,
-              level: errorArray[0].level,
-              source: ErrorSources.Table,
-              type: ErrorType.Custom
-            } as InfoWithSource]
-          })
-        ) : null
+      __index: result.__index || rows[index].__index || ''
     }));
   } : undefined;
@@ -285,7 +261,7 @@ export const useAiValidation = <T extends string>(
     // Clean the data to ensure we only send what's needed
     const cleanedData = data.map(item => {
-      const { __errors, __index, ...rest } = item;
+      const { __index, ...rest } = item;
       return rest;
     });
@@ -401,8 +377,8 @@ export const useAiValidation = <T extends string>(
     // Clean the data to ensure we only send what's needed
     const cleanedData = data.map(item => {
-      const { __errors, __index, ...cleanProduct } = item;
-      return cleanProduct;
+      const { __index, ...rest } = item;
+      return rest;
     });
     console.log('Cleaned data for validation:', cleanedData);
@@ -603,7 +579,7 @@ export const useAiValidation = <T extends string>(
     console.log('Data updated after AI validation:', {
       dataLength: validatedData.length,
-      hasErrors: validatedData.some(row => row.__errors && Object.keys(row.__errors).length > 0)
+      hasErrors: false // We no longer check row.__errors
     });
     // Show changes and warnings in dialog after data is updated
View File
@@ -11,7 +11,8 @@ export interface FilterState {
 export const useFilters = <T extends string>(
   data: RowData<T>[],
-  fields: Fields<T>
+  fields: Fields<T>,
+  validationErrors: Map<number, Record<string, any>>
 ) => {
   // Filter state
   const [filters, setFilters] = useState<FilterState>({
@@ -59,7 +60,7 @@ export const useFilters = <T extends string>(
   // Apply filters to data
   const applyFilters = useCallback((dataToFilter: RowData<T>[]) => {
-    return dataToFilter.filter(row => {
+    return dataToFilter.filter((row, index) => {
       // Filter by search text
       if (filters.searchText) {
         const lowerSearchText = filters.searchText.toLowerCase()
@@ -78,7 +79,8 @@ export const useFilters = <T extends string>(
       // Filter by errors
       if (filters.showErrorsOnly) {
-        const hasErrors = row.__errors && Object.keys(row.__errors).length > 0
+        const hasErrors = validationErrors.has(index) &&
+          Object.keys(validationErrors.get(index) || {}).length > 0
         if (!hasErrors) return false
       }
@@ -92,7 +94,7 @@ export const useFilters = <T extends string>(
       return true
     })
-  }, [filters])
+  }, [filters, validationErrors])
   // Reset all filters
   const resetFilters = useCallback(() => {
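In isolation, the errors-only filter above reduces to a small pure function. This is a sketch with hypothetical names, not the hook's exact code:

```typescript
// Sketch of the "show errors only" filter against the central error Map.
// Row shape and function name are illustrative.
type Row = Record<string, string>;

function filterErrorRows(
  rows: Row[],
  validationErrors: Map<number, Record<string, unknown>>
): Row[] {
  // Note the (row, index) signature: the index is what keys the Map,
  // which is why the filter callback had to gain an index parameter.
  return rows.filter((_row, index) =>
    validationErrors.has(index) &&
    Object.keys(validationErrors.get(index) ?? {}).length > 0
  );
}
```

One design consequence of keying by index: the Map must be rebuilt or remapped whenever rows are inserted, removed, or reordered; keying by a stable row id would avoid that.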
View File
@@ -82,13 +82,11 @@ export const useTemplates = <T extends string>(
   }
   // Remove metadata fields
-  delete (template as any).__errors
   delete (template as any).__meta
   delete (template as any).__template
   delete (template as any).__original
   delete (template as any).__corrected
   delete (template as any).__changes
-  delete (template as any).__index
   // Send to API
   const response = await fetch(`${getApiUrl()}/templates`, {
View File
@@ -113,49 +113,23 @@ export const useValidation = <T extends string>(
   // Run row hook if provided
   let rowHookResult: Meta = {
-    __index: row.__index || String(rowIndex),
-    __errors: {}
+    __index: row.__index || String(rowIndex)
   }
   if (rowHook) {
     try {
-      rowHookResult = await rowHook(row, rowIndex, allRows)
+      // Call the row hook and extract only the __index property
+      const result = await rowHook(row, rowIndex, allRows);
+      rowHookResult.__index = result.__index || rowHookResult.__index;
     } catch (error) {
       console.error('Error in row hook:', error)
     }
   }
-  // Merge field errors and row hook errors
-  const mergedErrors: Record<string, InfoWithSource> = {}
-  // Convert field errors to InfoWithSource
-  Object.entries(fieldErrors).forEach(([key, errors]) => {
-    if (errors.length > 0) {
-      mergedErrors[key] = {
-        message: errors[0].message,
-        level: errors[0].level,
-        source: ErrorSources.Row,
-        type: errors[0].type || ErrorType.Custom
-      }
-    }
-  })
-  // Merge row hook errors
-  if (rowHookResult.__errors) {
-    Object.entries(rowHookResult.__errors).forEach(([key, error]) => {
-      if (error) {
-        // Add type if not present
-        const errorWithType = {
-          ...error,
-          type: error.type || ErrorType.Custom
-        }
-        mergedErrors[key] = errorWithType as InfoWithSource
-      }
-    })
-  }
+  // We no longer need to merge errors since we're not storing them in the row data
+  // The calling code should handle storing errors in the validationErrors Map
   return {
-    __index: row.__index || String(rowIndex),
-    __errors: mergedErrors
+    __index: row.__index || String(rowIndex)
   }
 }, [fields, validateField, rowHook])
@@ -163,8 +137,7 @@ export const useValidation = <T extends string>(
 const validateTable = useCallback(async (data: RowData<T>[]): Promise<Meta[]> => {
   if (!tableHook) {
     return data.map((row, index) => ({
-      __index: row.__index || String(index),
-      __errors: {}
+      __index: row.__index || String(index)
     }))
   }
@@ -173,137 +146,131 @@ export const useValidation = <T extends string>(
   // Process table validation results
   return tableResults.map((result, index) => {
-    // Ensure errors are properly formatted
-    const formattedErrors: Record<string, InfoWithSource> = {}
-    if (result.__errors) {
-      Object.entries(result.__errors).forEach(([key, error]) => {
-        if (error) {
-          formattedErrors[key] = {
-            ...error,
-            source: ErrorSources.Table,
-            type: error.type || ErrorType.Custom
-          } as InfoWithSource
-        }
-      })
-    }
     return {
-      __index: result.__index || data[index].__index || String(index),
-      __errors: formattedErrors
+      __index: result.__index || data[index].__index || String(index)
     }
   })
 } catch (error) {
   console.error('Error in table hook:', error)
   return data.map((row, index) => ({
-    __index: row.__index || String(index),
-    __errors: {}
+    __index: row.__index || String(index)
   }))
 }
 }, [tableHook])
 // Validate unique fields across the table
 const validateUnique = useCallback((data: RowData<T>[]) => {
-  const uniqueErrors: Meta[] = data.map((row, index) => ({
-    __index: row.__index || String(index),
-    __errors: {}
-  }))
+  // Create a map to store errors for each row
+  const uniqueErrors = new Map<number, Record<string, InfoWithSource>>();
   // Find fields with unique validation
   const uniqueFields = fields.filter(field =>
     field.validations?.some(v => v.rule === 'unique')
-  )
+  );
   if (uniqueFields.length === 0) {
-    return uniqueErrors
+    return uniqueErrors;
   }
   // Check each unique field
   uniqueFields.forEach(field => {
-    const { key } = field
-    const validation = field.validations?.find(v => v.rule === 'unique')
-    const allowEmpty = validation?.allowEmpty ?? false
-    const errorMessage = validation?.errorMessage || `${field.label} must be unique`
-    const level = validation?.level || 'error'
+    const { key } = field;
+    const validation = field.validations?.find(v => v.rule === 'unique');
+    const allowEmpty = validation?.allowEmpty ?? false;
+    const errorMessage = validation?.errorMessage || `${field.label} must be unique`;
+    const level = validation?.level || 'error';
     // Track values for uniqueness check
-    const valueMap = new Map<string, number[]>()
+    const valueMap = new Map<string, number[]>();
     // Build value map
     data.forEach((row, rowIndex) => {
-      const value = String(row[String(key) as keyof typeof row] || '')
+      const value = String(row[String(key) as keyof typeof row] || '');
       // Skip empty values if allowed
       if (allowEmpty && isEmpty(value)) {
-        return
+        return;
       }
       if (!valueMap.has(value)) {
-        valueMap.set(value, [rowIndex])
+        valueMap.set(value, [rowIndex]);
       } else {
-        valueMap.get(value)?.push(rowIndex)
+        valueMap.get(value)?.push(rowIndex);
       }
-    })
+    });
     // Add errors for duplicate values
     valueMap.forEach((rowIndexes) => {
       if (rowIndexes.length > 1) {
         // Add error to all duplicate rows
         rowIndexes.forEach(rowIndex => {
-          const rowErrors = uniqueErrors[rowIndex].__errors || {}
+          // Get existing errors for this row or create a new object
+          const rowErrors = uniqueErrors.get(rowIndex) || {};
           rowErrors[String(key)] = {
             message: errorMessage,
             level,
             source: ErrorSources.Table,
             type: ErrorType.Unique
-          }
-          uniqueErrors[rowIndex].__errors = rowErrors
-        })
+          };
+          uniqueErrors.set(rowIndex, rowErrors);
+        });
       }
-    })
-  })
-  return uniqueErrors
-}, [fields])
+    });
+  });
+  return uniqueErrors;
+}, [fields]);
 // Run complete validation
 const validateData = useCallback(async (data: RowData<T>[]) => {
-  // Use the shared isEmpty function
-  // Step 1: Run field and row validation
+  // Step 1: Run field and row validation for each row
   const rowValidations = await Promise.all(
     data.map((row, index) => validateRow(row, index, data))
-  )
+  );
   // Step 2: Run unique validations
-  const uniqueValidations = validateUnique(data)
+  const uniqueValidations = validateUnique(data);
   // Step 3: Run table hook
-  const tableValidations = await validateTable(data)
+  const tableValidations = await validateTable(data);
+  // Create a map to store all validation errors
+  const validationErrors = new Map<number, Record<string, InfoWithSource>>();
   // Merge all validation results
-  return data.map((row, index) => {
-    const rowValidation = rowValidations[index]
-    const uniqueValidation = uniqueValidations[index]
-    const tableValidation = tableValidations[index]
-    // Start with the original data
-    const newRow = { ...row }
+  data.forEach((row, index) => {
+    // Collect errors from all validation sources
+    const rowErrors: Record<string, InfoWithSource> = {};
+    // Add field-level errors (we need to extract these from the validation process)
+    fields.forEach(field => {
+      const value = row[String(field.key) as keyof typeof row];
+      const errors = validateField(value, field as Field<T>);
+      if (errors.length > 0) {
+        rowErrors[String(field.key)] = {
+          message: errors[0].message,
+          level: errors[0].level,
+          source: ErrorSources.Row,
+          type: errors[0].type
+        };
+      }
+    });
-    // Combine all errors
-    const combinedErrors = {
-      ...(rowValidation.__errors || {}),
-      ...(uniqueValidation.__errors || {}),
-      ...(tableValidation.__errors || {})
-    }
+    // Add unique validation errors
+    if (uniqueValidations.has(index)) {
+      Object.entries(uniqueValidations.get(index) || {}).forEach(([key, error]) => {
+        rowErrors[key] = error;
+      });
+    }
     // Filter out "required" errors for fields that have values
-    const filteredErrors: Record<string, InfoWithSource> = {}
-    Object.entries(combinedErrors).forEach(([key, error]) => {
-      const fieldValue = row[key as keyof typeof row]
+    const filteredErrors: Record<string, InfoWithSource> = {};
+    Object.entries(rowErrors).forEach(([key, error]) => {
+      const fieldValue = row[key as keyof typeof row];
      // If the field has a value and the error is of type Required, skip it
      if (!isEmpty(fieldValue) &&
@@ -311,17 +278,26 @@ export const useValidation = <T extends string>(
          typeof error === 'object' &&
          'type' in error &&
          error.type === ErrorType.Required) {
-        return
+        return;
      }
-      filteredErrors[key] = error as InfoWithSource
-    })
+      filteredErrors[key] = error;
+    });
-    newRow.__errors = Object.keys(filteredErrors).length > 0 ? filteredErrors : undefined
-    return newRow
-  })
-}, [validateRow, validateUnique, validateTable])
+    // Only add to the map if there are errors
+    if (Object.keys(filteredErrors).length > 0) {
+      validationErrors.set(index, filteredErrors);
+    }
+  });
+  return {
+    data: data.map((row, index) => {
+      // Return the original data without __errors
+      return { ...row };
+    }),
+    validationErrors
+  };
+}, [validateRow, validateUnique, validateTable, fields, validateField]);
 return {
   validateData,
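`validateData` now returns `{ data, validationErrors }`, folding field-level, uniqueness, and table-hook errors into one Map. The merge step can be sketched as a standalone helper (illustrative names; the hook inlines this logic rather than calling a helper like this):

```typescript
// Sketch of merging several per-row error maps into one, mirroring the
// shape validateData now returns. Types and names are illustrative.
type Info = { message: string; level: string };
type ErrorMap = Map<number, Record<string, Info>>;

function mergeErrorMaps(...sources: ErrorMap[]): ErrorMap {
  const merged: ErrorMap = new Map();
  for (const source of sources) {
    source.forEach((rowErrors, rowIndex) => {
      // Later sources win on key collisions, like an object spread would.
      merged.set(rowIndex, { ...(merged.get(rowIndex) ?? {}), ...rowErrors });
    });
  }
  return merged;
}
```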
View File
@@ -34,7 +34,6 @@ export interface Props<T extends string> {
 // Extended Data type with meta information
 export type RowData<T extends string> = Data<T> & {
   __index?: string;
-  __errors?: Record<string, ValidationError[] | ValidationError>;
   __template?: string;
   __original?: Record<string, any>;
   __corrected?: Record<string, any>;
@@ -89,8 +88,8 @@ declare global {
 export const getApiUrl = () => config.apiUrl;
 // Add debounce utility
-const DEBOUNCE_DELAY = 300;
-const BATCH_SIZE = 5;
+const DEBOUNCE_DELAY = 0; // No delay
+const BATCH_SIZE = 50; // Larger batch size
 function debounce<T extends (...args: any[]) => any>(
   func: T,
@@ -99,7 +98,12 @@ function debounce<T extends (...args: any[]) => any>(
   let timeout: NodeJS.Timeout;
   return (...args: Parameters<T>) => {
     clearTimeout(timeout);
-    timeout = setTimeout(() => func(...args), wait);
+    // Execute immediately if no delay
+    if (wait <= 0) {
+      func(...args);
+    } else {
+      timeout = setTimeout(() => func(...args), wait);
+    }
   };
 }
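The modified debounce can be exercised standalone. This copy mirrors the branch in the hunk above: with `wait <= 0` the wrapped function runs synchronously, otherwise it falls back to the usual timer:

```typescript
// Standalone version of the zero-delay debounce. With wait <= 0 the wrapped
// function executes immediately instead of being scheduled on a timer.
function debounce<T extends (...args: any[]) => void>(
  func: T,
  wait: number
): (...args: Parameters<T>) => void {
  let timeout: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>) => {
    if (timeout !== undefined) clearTimeout(timeout);
    if (wait <= 0) {
      func(...args); // immediate, synchronous call
    } else {
      timeout = setTimeout(() => func(...args), wait);
    }
  };
}
```

Note that with a zero wait every call executes, so the wrapper no longer coalesces bursts of calls; presumably that is acceptable here because the expensive work moved into the batched queue.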
@@ -544,10 +548,13 @@ export const useValidationState = <T extends string>({
   if (validationQueueRef.current.length === 0) return;
   isProcessingBatchRef.current = true;
-  const batch = validationQueueRef.current.splice(0, BATCH_SIZE);
+  // Process all items in the queue at once
+  const allItems = [...validationQueueRef.current];
+  validationQueueRef.current = []; // Clear the queue
   try {
-    await Promise.all(batch.map(async ({ rowIndex, supplierId, upcValue }) => {
+    await Promise.all(allItems.map(async ({ rowIndex, supplierId, upcValue }) => {
       // Skip if already validated
       const cacheKey = `${supplierId}-${upcValue}`;
       if (processedUpcMapRef.current.has(cacheKey)) return;
@@ -566,7 +573,7 @@ export const useValidationState = <T extends string>({
   } finally {
     isProcessingBatchRef.current = false;
-    // Process next batch if queue not empty
+    // Process any new items that might have been added during processing
     if (validationQueueRef.current.length > 0) {
       processBatchValidation();
     }
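The queue-draining strategy above (copy everything, clear the queue, then re-run if new items arrived mid-flight) can be shown as a minimal standalone loop; `QueueItem` and `processQueue` are illustrative names, not the hook's API:

```typescript
// Sketch of drain-everything batching: instead of slicing off a fixed-size
// batch, the whole queue is taken and cleared in one step, and any items
// enqueued during processing are picked up by the next pass of the loop.
type QueueItem = { rowIndex: number; upcValue: string };

async function processQueue(
  queue: QueueItem[],
  validate: (item: QueueItem) => Promise<void>
): Promise<void> {
  while (queue.length > 0) {
    // Take everything currently queued and clear the shared queue.
    const allItems = queue.splice(0, queue.length);
    await Promise.all(allItems.map(validate));
    // Loop again in case new items arrived during the awaits.
  }
}
```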
@@ -620,42 +627,41 @@ export const useValidationState = <T extends string>({
   return validatingUpcRows.includes(rowIndex);
 }, [validatingUpcRows]);
-// Compute filtered data based on current filters
+// Filter data based on current filter state
 const filteredData = useMemo(() => {
-  return data.filter(row => {
+  return data.filter((row, index) => {
     // Filter by search text
     if (filters.searchText) {
-      const lowerSearchText = filters.searchText.toLowerCase()
-      const matchesSearch = Object.entries(row).some(([key, value]) => {
-        // Skip metadata fields
-        if (key.startsWith('__')) return false
-        // Check if the value contains the search text
-        return value !== undefined &&
-          value !== null &&
-          String(value).toLowerCase().includes(lowerSearchText)
-      })
-      if (!matchesSearch) return false
+      const searchLower = filters.searchText.toLowerCase();
+      const matchesSearch = fields.some(field => {
+        const value = row[field.key as keyof typeof row];
+        if (value === undefined || value === null) return false;
+        return String(value).toLowerCase().includes(searchLower);
+      });
+      if (!matchesSearch) return false;
     }
     // Filter by errors
     if (filters.showErrorsOnly) {
-      const hasErrors = row.__errors && Object.keys(row.__errors).length > 0
-      if (!hasErrors) return false
+      const hasErrors = validationErrors.has(index) &&
+        Object.keys(validationErrors.get(index) || {}).length > 0;
+      if (!hasErrors) return false;
     }
     // Filter by field value
     if (filters.filterField && filters.filterValue) {
-      const fieldValue = row[filters.filterField as keyof typeof row]
-      return fieldValue !== undefined &&
-        fieldValue !== null &&
-        String(fieldValue) === filters.filterValue
+      const fieldValue = row[filters.filterField as keyof typeof row];
+      if (fieldValue === undefined) return false;
+      const valueStr = String(fieldValue).toLowerCase();
+      const filterStr = filters.filterValue.toLowerCase();
+      if (!valueStr.includes(filterStr)) return false;
     }
-    return true
-  })
-}, [data, filters])
+    return true;
+  });
+}, [data, fields, filters, validationErrors]);
 // Get filter fields
 const filterFields = useMemo(() => {
@@ -759,89 +765,88 @@ export const useValidationState = <T extends string>({
   const fieldErrors: Record<string, ValidationError[]> = {};
   let hasErrors = false;
-  // Get current errors for comparison
-  const currentErrors = validationErrors.get(rowIndex) || {};
   // Track if row has changes to original values
   const originalRow = row.__original || {};
   const changedFields = row.__changes || {};
   // Use a more efficient approach - only validate fields that need validation
   fields.forEach(field => {
-    // Skip disabled fields
     if (field.disabled) return;
-    const key = String(field.key);
-    const value = row[key as keyof typeof row];
-    // Skip validation for empty non-required fields
-    const isRequired = field.validations?.some(v => v.rule === 'required');
-    if (!isRequired && (value === undefined || value === null || value === '')) {
-      return;
-    }
-    // Only validate if:
-    // 1. Field has changed (if we have change tracking)
-    // 2. No prior validation exists
-    // 3. This is a special field (supplier/company)
-    const hasChanged = changedFields[key] ||
-      !currentErrors[key] ||
-      key === 'supplier' ||
-      key === 'company';
-    if (hasChanged) {
-      // Validate the field
-      const errors = validateField(value, field as Field<T>);
-      if (errors.length > 0) {
-        fieldErrors[key] = errors;
-        hasErrors = true;
-      }
-    } else {
-      // Keep existing errors if field hasn't changed
-      if (currentErrors[key] && currentErrors[key].length > 0) {
-        fieldErrors[key] = currentErrors[key];
-        hasErrors = true;
-      }
-    }
+    const fieldKey = String(field.key);
+    const value = row[fieldKey as keyof typeof row];
+    // Validate the field
+    const errors = validateField(value, field as Field<T>);
+    // Store errors if any
+    if (errors.length > 0) {
+      fieldErrors[fieldKey] = errors;
+      hasErrors = true;
+    }
   });
-  // Special validation for supplier and company - always validate these
-  if (!row.supplier) {
-    fieldErrors['supplier'] = [{
-      message: 'Supplier is required',
-      level: 'error',
-      source: ErrorSources.Row,
-      type: ErrorType.Required
-    }];
-    hasErrors = true;
-  }
-  if (!row.company) {
-    fieldErrors['company'] = [{
-      message: 'Company is required',
-      level: 'error',
-      source: ErrorSources.Row,
-      type: ErrorType.Required
-    }];
-    hasErrors = true;
-  }
-  // Update validation errors for this row
-  setValidationErrors(prev => {
-    const updated = new Map(prev);
-    if (Object.keys(fieldErrors).length > 0) {
-      updated.set(rowIndex, fieldErrors);
-    } else {
-      updated.delete(rowIndex);
-    }
-    return updated;
-  });
-  // Update row validation status
-  setRowValidationStatus(prev => {
-    const updated = new Map(prev);
-    updated.set(rowIndex, hasErrors ? 'error' : 'validated');
-    return updated;
-  });
-}, [data, fields, validateField, validationErrors]);
+  // Run row hook if provided
+  if (rowHook) {
+    try {
+      // Call the row hook
+      const hookResult = rowHook(row, rowIndex, data);
+      // Handle both synchronous and asynchronous results
+      Promise.resolve(hookResult).then(result => {
+        // Extract errors from the hook result
+        const hookErrors: Record<string, ValidationError[]> = {};
+        let hasHookErrors = false;
+        // Process hook errors if they exist
+        if (result) {
+          // The hook might return custom errors through a different mechanism
+          // We need to adapt to the new approach where errors are not stored in __errors
+          // Update validation errors for this row
+          setValidationErrors(prev => {
+            const updated = new Map(prev);
+            if (Object.keys(fieldErrors).length > 0 || hasHookErrors) {
+              // Merge field errors with hook errors
+              const mergedErrors = { ...fieldErrors };
+              if (hasHookErrors) {
+                Object.entries(hookErrors).forEach(([key, errors]) => {
+                  if (mergedErrors[key]) {
+                    // Append to existing errors
+                    mergedErrors[key] = [...mergedErrors[key], ...errors];
+                  } else {
+                    // Add new errors
+                    mergedErrors[key] = errors;
+                  }
+                });
+              }
+              updated.set(rowIndex, mergedErrors);
+            } else {
+              updated.delete(rowIndex);
+            }
+            return updated;
+          });
+        }
+      });
+    } catch (error) {
+      console.error('Error in row hook:', error);
+    }
+  } else {
+    // No row hook, just update with field errors
+    setValidationErrors(prev => {
+      const updated = new Map(prev);
+      if (Object.keys(fieldErrors).length > 0) {
+        updated.set(rowIndex, fieldErrors);
+      } else {
+        updated.delete(rowIndex);
+      }
+      return updated;
+    });
+  }
+}, [data, fields, validateField, rowHook]);
 // Update a row's field value
 const updateRow = useCallback((rowIndex: number, key: T, value: any) => {
@@ -879,65 +884,98 @@ export const useValidationState = <T extends string>({
     }
   }
-  // Update the data state
-  setData(prevData => {
-    const newData = [...prevData];
-    const updatedRow = { ...newData[rowIndex] } as Record<string, any>;
-    // Update the field value
-    updatedRow[key] = processedValue;
-    // Update the row in the data array
-    newData[rowIndex] = updatedRow as RowData<T>;
-    // Clean all price fields to ensure consistency
-    return cleanPriceFields(newData);
+  // Update the data
+  setData(prev => {
+    const newData = [...prev];
+    if (rowIndex >= 0 && rowIndex < newData.length) {
+      // Create a new row object without modifying the original
+      const updatedRow = { ...newData[rowIndex] };
+      // Update the field value
+      updatedRow[key] = processedValue;
+      // Track changes from original
+      if (!updatedRow.__original) {
+        updatedRow.__original = { ...updatedRow };
+      }
+      if (!updatedRow.__changes) {
+        updatedRow.__changes = {};
+      }
+      // Record this change
+      updatedRow.__changes[key] = true;
+      // Replace the row in the data array
+      newData[rowIndex] = updatedRow;
+    }
+    return newData;
   });
-  // Mark row as pending validation
-  setRowValidationStatus(prev => {
-    const updated = new Map(prev);
-    updated.set(rowIndex, 'pending');
-    return updated;
-  });
-  // Restore scroll position after update
+  // Restore scroll position
   requestAnimationFrame(() => {
     window.scrollTo(scrollPosition.left, scrollPosition.top);
   });
-  // Use debounced validation to avoid excessive validation calls
-  const shouldValidateUpc = (key === 'upc' || key === 'supplier');
-  // Clear any existing timeout for this row
-  if (validationTimeoutsRef.current[rowIndex]) {
-    clearTimeout(validationTimeoutsRef.current[rowIndex]);
-  }
+  // Validate the updated field
+  if (field) {
+    // Clear any existing validation errors for this field
+    setValidationErrors(prev => {
+      const rowErrors = prev.get(rowIndex) || {};
+      const newRowErrors = { ...rowErrors };
+      // Remove errors for this field
+      delete newRowErrors[String(key)];
+      // Update the map
+      const newMap = new Map(prev);
+      if (Object.keys(newRowErrors).length > 0) {
+        newMap.set(rowIndex, newRowErrors);
+      } else {
+        newMap.delete(rowIndex);
+      }
+      return newMap;
+    });
+    // Validate the field
+    const errors = validateField(processedValue, field as Field<T>);
+    // If there are errors, update the validation errors
+    if (errors.length > 0) {
+      setValidationErrors(prev => {
+        const rowErrors = prev.get(rowIndex) || {};
+        const newRowErrors = { ...rowErrors, [String(key)]: errors };
+        // Update the map
+        const newMap = new Map(prev);
+        newMap.set(rowIndex, newRowErrors);
+        return newMap;
+      });
+    }
+  }
-  // Set a new timeout for validation
-  validationTimeoutsRef.current[rowIndex] = setTimeout(() => {
-    // Validate the row
-    validateRow(rowIndex);
-    // Trigger UPC validation if applicable, but only if both fields are present
-    if (shouldValidateUpc && data[rowIndex]) {
-      const row = data[rowIndex];
-      const upcValue = key === 'upc' ? processedValue : row.upc;
-      const supplierValue = key === 'supplier' ? processedValue : row.supplier;
-      if (upcValue && supplierValue) {
-        validateUpc(rowIndex, String(supplierValue), String(upcValue));
-      }
-    }
-    // Remove the cell from validating state
-    setValidatingCells(prev => {
-      const newSet = new Set(prev);
-      newSet.delete(`${rowIndex}-${key}`);
-      return newSet;
-    });
-  }, 300);
-}, [data, validateRow, validateUpc, setData, setRowValidationStatus, cleanPriceFields, fields]);
+  // Mark the cell as no longer validating
+  setValidatingCells(prev => {
+    const newSet = new Set(prev);
+    newSet.delete(`${rowIndex}-${key}`);
+    return newSet;
+  });
+  // Call field's onChange handler if it exists
+  if (field && field.onChange) {
+    field.onChange(processedValue, data[rowIndex]);
+  }
+  // Schedule validation for the entire row
+  const timeoutId = setTimeout(() => {
+    validateRow(rowIndex);
+  }, 300);
+  // Store the timeout ID for cleanup
+  validationTimeoutsRef.current[rowIndex] = timeoutId;
+}, [data, fields, validateField, validateRow]);
 // Copy a cell value to all cells below it in the same column
 const copyDown = useCallback((rowIndex: number, key: T) => {
@@ -978,7 +1016,7 @@ export const useValidationState = <T extends string>({
   }
   // Extract data for template, removing metadata fields
-  const { __index, __errors, __template, __original, __corrected, __changes, ...templateData } = selectedRow as any;
+  const { __index, __template, __original, __corrected, __changes, ...templateData } = selectedRow as any;
   // Clean numeric values (remove $ from price fields)
   const cleanedData: Record<string, any> = {};
@@ -1088,7 +1126,7 @@ export const useValidationState = <T extends string>({
     // Extract template fields once outside the loop
     const templateFields = Object.entries(template).filter(([key]) =>
-      !['id', '__errors', '__meta', '__template', '__original', '__corrected', '__changes'].includes(key)
+      !['id', '__meta', '__template', '__original', '__corrected', '__changes'].includes(key)
     );
// Apply template to each valid row // Apply template to each valid row
@@ -1097,9 +1135,6 @@ export const useValidationState = <T extends string>({
       const originalRow = newData[index];
       const updatedRow = { ...originalRow } as Record<string, any>;

-      // Clear existing errors
-      delete updatedRow.__errors;
-
       // Apply template fields (excluding metadata fields)
       for (const [key, value] of templateFields) {
         updatedRow[key] = value;
@@ -1154,51 +1189,24 @@ export const useValidationState = <T extends string>({
       return row && row.upc && row.supplier;
     });

-    // If there are rows needing UPC validation, process them
+    // Validate UPCs for rows that have both UPC and supplier
     if (upcValidationRows.length > 0) {
-      // Batch UPC validation for better performance
+      console.log(`Validating UPCs for ${upcValidationRows.length} rows after template application`);
+
+      // Schedule UPC validation for the next tick to allow UI to update first
       setTimeout(() => {
-        // Process in batches to avoid overwhelming API
-        const processUpcValidations = async () => {
-          const BATCH_SIZE = 5;
-
-          // Sort by upc for better caching
-          upcValidationRows.sort((a, b) => {
-            const aUpc = String(newData[a].upc || '');
-            const bUpc = String(newData[b].upc || '');
-            return aUpc.localeCompare(bUpc);
-          });
-
-          // Process in batches to avoid hammering the API
-          for (let i = 0; i < upcValidationRows.length; i += BATCH_SIZE) {
-            const batch = upcValidationRows.slice(i, i + BATCH_SIZE);
-
-            // Process this batch in parallel
-            await Promise.all(batch.map(async (rowIndex) => {
-              const row = newData[rowIndex];
-              if (row && row.upc && row.supplier) {
-                await validateUpc(rowIndex, String(row.supplier), String(row.upc));
-              }
-            }));
-
-            // Add delay between batches to reduce server load
-            if (i + BATCH_SIZE < upcValidationRows.length) {
-              await new Promise(r => setTimeout(r, 300));
-            }
-          }
-
-          // Reset template application flag
-          isApplyingTemplateRef.current = false;
-        };
-
-        // Start processing
-        processUpcValidations();
+        upcValidationRows.forEach(rowIndex => {
+          const row = newData[rowIndex];
+          if (row && row.upc && row.supplier) {
+            validateRow(rowIndex);
+          }
+        });
       }, 100);
-    } else {
-      // No UPC validation needed, reset flag immediately
-      isApplyingTemplateRef.current = false;
     }
-  }, [data, templates, validateUpc, setData, setValidationErrors, setRowValidationStatus]);
+
+    // Reset the template application flag
+    isApplyingTemplateRef.current = false;
+  }, [data, templates, setData, setValidationErrors, setRowValidationStatus, validateRow]);
   // Apply template to selected rows
   const applyTemplateToSelected = useCallback((templateId: string) => {
@@ -1349,7 +1357,7 @@ export const useValidationState = <T extends string>({
     if (onNext) {
       // Remove metadata fields before passing to onNext
       const cleanedData = data.map(row => {
-        const { __errors, __original, __corrected, __changes, ...cleanRow } = row;
+        const { __index, __template, __original, __corrected, __changes, ...cleanRow } = row;
         return cleanRow as Data<T>;
       });
@@ -1383,8 +1391,8 @@ export const useValidationState = <T extends string>({
     console.log(`Validating ${data.length} rows`);

-    // Process in batches to avoid blocking the UI
-    const BATCH_SIZE = Math.min(100, Math.max(20, Math.floor(data.length / 10))); // Adaptive batch size
+    // Process in larger batches to improve performance
+    const BATCH_SIZE = Math.min(500, Math.max(100, Math.floor(data.length / 5))); // Larger adaptive batch size
     const totalBatches = Math.ceil(data.length / BATCH_SIZE);
     let currentBatch = 0;
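Pulled out as standalone helpers (the function names are assumed; the diff computes these inline), the before/after batch-size formulas are:

```typescript
// Old: 20-100 rows per batch, roughly data.length / 10.
const oldBatchSize = (rows: number): number =>
  Math.min(100, Math.max(20, Math.floor(rows / 10)));

// New: 100-500 rows per batch, roughly data.length / 5.
const newBatchSize = (rows: number): number =>
  Math.min(500, Math.max(100, Math.floor(rows / 5)));
```

For a 10,000-row import the batch grows from 100 rows (100 batches) to 500 rows (20 batches), so the batch-scheduling overhead is paid a fifth as often at the cost of longer uninterrupted work per batch.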
@@ -1542,9 +1550,8 @@ export const useValidationState = <T extends string>({
       console.log(`Batch ${currentBatch}/${totalBatches} completed in ${processingTime.toFixed(2)}ms`);

       if (currentBatch < totalBatches) {
-        // Adaptive timeout based on processing time
-        const nextDelay = Math.min(50, Math.max(5, Math.ceil(processingTime / 10)));
-        setTimeout(processBatch, nextDelay);
+        // Process next batch immediately without delay
+        processBatch();
       } else {
         // All batches processed, update the data once
         setData(newData);

View File

@@ -1,5 +1,5 @@
 import { InfoWithSource } from "../../types"

-export type Meta = { __index: string; __errors?: Error | null }
+export type Meta = { __index: string }
 export type Error = { [key: string]: InfoWithSource }
 export type Errors = { [id: string]: Error }
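With `__errors` gone from `Meta`, error state lives only in the `validationErrors` Map. A minimal sketch of such a single source of truth (shapes simplified for illustration; the real store uses the `InfoWithSource` type above and its own keying):

```typescript
type Info = { message: string; level: "info" | "warning" | "error"; source: string };
type RowErrors = { [field: string]: Info };

const validationErrors = new Map<number, RowErrors>();

// Record or replace one field's error for a row.
function setFieldError(row: number, field: string, info: Info): void {
  validationErrors.set(row, { ...(validationErrors.get(row) ?? {}), [field]: info });
}

// Clear one field's error; drop the row entry entirely when it becomes empty.
function clearFieldError(row: number, field: string): void {
  const current = validationErrors.get(row);
  if (!current) return;
  const { [field]: _removed, ...rest } = current;
  if (Object.keys(rest).length === 0) validationErrors.delete(row);
  else validationErrors.set(row, rest);
}
```

Every reader and writer going through the same Map is what removes the old sync problem between `row.__errors` and the store.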

View File

@@ -21,5 +21,4 @@ export interface Errors { [id: string]: ErrorType[] }
 // Make our Meta type match the original for compatibility
 export interface Meta {
   __index?: string;
-  __errors?: Record<string, ErrorType[] | ErrorType | InfoWithSource | InfoWithSource[] | null>;
 }

View File

@@ -6,7 +6,6 @@ import { ErrorSources, ErrorType } from "../../../types"
 type DataWithMeta<T extends string> = Data<T> & Meta & {
   __index?: string;
-  __errors?: Error | null;
 }

 export const addErrorsAndRunHooks = async <T extends string>(
@@ -136,41 +135,8 @@ export const addErrorsAndRunHooks = async <T extends string>(
       result.__index = v4()
     }

-    // If we are validating all indexes, or we did full validation on this row - apply all errors
-    if (!changedRowIndexes || changedRowIndexes.includes(index)) {
-      if (errors[index]) {
-        return { ...result, __errors: errors[index] }
-      }
-      if (!errors[index] && result.__errors) {
-        return { ...result, __errors: null }
-      }
-    }
-    // if we have not validated this row, keep it's row errors but apply global error changes
-    else {
-      // at this point errors[index] contains only table source errors, previous row and table errors are in value.__errors
-      const hasRowErrors =
-        result.__errors && Object.values(result.__errors).some((error) => error.source === ErrorSources.Row)
-
-      if (!hasRowErrors) {
-        if (errors[index]) {
-          return { ...result, __errors: errors[index] }
-        }
-        return result
-      }
-
-      const errorsWithoutTableError = Object.entries(result.__errors!).reduce((acc, [key, value]) => {
-        if (value.source === ErrorSources.Row) {
-          acc[key] = value
-        }
-        return acc
-      }, {} as Error)
-
-      const newErrors = { ...errorsWithoutTableError, ...errors[index] }
-      return { ...result, __errors: newErrors }
-    }
+    // We no longer store errors in the row data
+    // The errors are now only stored in the validationErrors Map
     return result
   })
 }

View File

@@ -190,7 +190,7 @@ export interface ValidationError {
   Source determines whether the error is from the full table or row validation
   Table validation is tableHook and "unique" validation
   Row validation is rowHook and all other validations
-  it is used to determine if row.__errors should be updated or not depending on different validations
+  It is used to determine how errors should be stored in the validationErrors Map
 */
 export type InfoWithSource = Info & {
   source: ErrorSources;