Refactor validation error handling to use a single source of truth (the validationErrors Map); speed up item number generation and line/subline loading

2025-03-16 14:09:58 -04:00
parent 153bbecc44
commit 52ae7e10aa
12 changed files with 825 additions and 585 deletions

View File

@@ -62,14 +62,78 @@ const validationTimeoutsRef = useRef<Record<number, NodeJS.Timeout>>({});
While there is cleanup on unmount, if rows are added/removed dynamically, timeouts for deleted rows might not be properly cleared.
## 5. Inefficient Error Storage
## 5. Inefficient Error Storage (RESOLVED)
Errors are stored in multiple places:
- In the `validationErrors` Map
**Status: RESOLVED**
### Problem
Previously, validation errors were stored in multiple locations:
- In the `validationErrors` Map in `useValidationState`
- In the row data itself as `__errors`
- In the UPC validation results
This duplication makes it harder to maintain a single source of truth and could lead to inconsistencies.
This redundancy caused several issues:
- Inconsistent error states across the different storage locations
- Increased memory usage from storing the same information multiple times
- Complex state management to keep the sources in sync
- Difficulty reasoning about where errors should be accessed from
### Solution
We've implemented a unified error storage approach (see the sketch after this list) by:
- Making the `validationErrors` Map in `useValidationState` the single source of truth for all validation errors
- Removing the `__errors` property from row data
- Updating all validation functions to interact with the central error store instead of modifying row data
- Modifying UPC validation to use the central error store
- Updating all components to read errors from the `validationErrors` Map instead of row data
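A minimal sketch of the resulting store shape (simplified; the helper functions are illustrative, not the hook's actual API):

```typescript
import { InfoWithSource } from "../types"; // import path assumed for the sketch

// Single source of truth: row index -> field key -> error info.
type ValidationErrors = Map<number, Record<string, InfoWithSource>>;

// Illustrative helper: set one field error immutably.
function setFieldError(
  errors: ValidationErrors,
  rowIndex: number,
  fieldKey: string,
  error: InfoWithSource
): ValidationErrors {
  const next = new Map(errors);
  next.set(rowIndex, { ...(next.get(rowIndex) ?? {}), [fieldKey]: error });
  return next;
}

// Illustrative helper: clear one field error, dropping rows with no errors left.
function clearFieldError(
  errors: ValidationErrors,
  rowIndex: number,
  fieldKey: string
): ValidationErrors {
  const next = new Map(errors);
  const rowErrors = { ...(next.get(rowIndex) ?? {}) };
  delete rowErrors[fieldKey];
  if (Object.keys(rowErrors).length > 0) next.set(rowIndex, rowErrors);
  else next.delete(rowIndex);
  return next;
}
```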
### Key Changes
1. Modified `dataMutations.ts` to stop storing errors in row data
2. Updated the `Meta` type to remove the `__errors` property
3. Modified the `RowData` type to remove the `__errors` property (both type changes are sketched after this list)
4. Updated the `useValidation` hook to return errors separately from row data
5. Modified the `useAiValidation` hook to work with the central error store
6. Updated the `useFilters` hook to check for errors in the `validationErrors` Map
7. Modified the `ValidationTable` and `ValidationCell` components to read errors from the `validationErrors` Map
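The type change behind items 2–3 appears later in this diff; condensed:

```typescript
// Before: errors traveled with each row's metadata.
// export type Meta = { __index: string; __errors?: Error | null }

// After: metadata carries only the row index; errors live solely in
// the validationErrors Map (row index -> field key -> error).
export type Meta = { __index: string }
```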
### Benefits
- **Single Source of Truth**: All validation errors are now stored in one place
- **Reduced Memory Usage**: No duplicate storage of error information
- **Simplified State Management**: Only one state to update when errors change
- **Cleaner Data Structure**: Row data no longer contains validation metadata
- **More Maintainable Code**: Clearer separation of concerns between data and validation
### Future Improvements
While this refactoring addresses the core issue of inefficient error storage, there are still opportunities for further optimization:
1. **Redundant Error Processing**: The validation process still performs some redundant calculations that could be optimized.
2. **Race Conditions**: Async validation can lead to race conditions when multiple validations are triggered in quick succession.
3. **Memory Leaks**: The timeout management for validation could be improved to prevent potential memory leaks.
4. **Tight Coupling**: Components are still tightly coupled to the validation state structure.
5. **Error Prioritization**: The system doesn't prioritize errors well, showing all errors at once rather than focusing on the most critical ones first.
### Validation Flow
The validation process now works as follows:
1. **Error Generation**:
- Field-level validations generate errors based on validation rules
- Row-level hooks add custom validation errors
- Table-level validations (like uniqueness checks) add errors across rows
2. **Error Storage**:
- All errors are stored in the `validationErrors` Map in `useValidationState`
- The Map uses row indices as keys and objects of field errors as values
3. **Error Display**:
- The `ValidationTable` component checks the `validationErrors` Map to highlight rows with errors
- The `ValidationCell` component receives errors for specific fields from the `validationErrors` Map (see the read-path sketch below)
- Errors are filtered in the UI to avoid showing "required" errors for fields with values
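A hedged sketch of the read path used by `ValidationTable` and `ValidationCell` (helper names are illustrative):

```typescript
import { InfoWithSource } from "../types"; // import path assumed for the sketch

// Look up the error for one row/field pair, as ValidationCell does.
function getCellError(
  validationErrors: Map<number, Record<string, InfoWithSource>>,
  rowIndex: number,
  fieldKey: string
): InfoWithSource | undefined {
  return (validationErrors.get(rowIndex) ?? {})[fieldKey];
}

// A row is highlighted when it has at least one error, as ValidationTable does.
function rowHasErrors(
  validationErrors: Map<number, Record<string, InfoWithSource>>,
  rowIndex: number
): boolean {
  return Object.keys(validationErrors.get(rowIndex) ?? {}).length > 0;
}
```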
This focused refactoring approach has successfully addressed a critical issue while keeping changes manageable and targeted.
## 6. Excessive Re-rendering

View File

@@ -185,12 +185,10 @@ export const UploadFlow = ({ state, onNext, onBack }: Props) => {
// Apply global selections to each row of data if they exist
const dataWithGlobalSelections = globalSelections
? dataWithMeta.map((row: Data<string> & { __errors?: any; __index?: string }) => {
? dataWithMeta.map((row: Data<string> & { __index?: string }) => {
const newRow = { ...row };
if (globalSelections.supplier) newRow.supplier = globalSelections.supplier;
if (globalSelections.company) newRow.company = globalSelections.company;
if (globalSelections.line) newRow.line = globalSelections.line;
if (globalSelections.subline) newRow.subline = globalSelections.subline;
return newRow;
})
: dataWithMeta;

View File

@@ -68,6 +68,10 @@ const ValidationContainer = <T extends string>({
// eslint-disable-next-line @typescript-eslint/no-unused-vars
const [isLoadingSublines, setIsLoadingSublines] = useState<Record<string, boolean>>({});
// Add caches for product lines and sublines by company/line ID
const [companyLinesCache, setCompanyLinesCache] = useState<Record<string, any[]>>({});
const [lineSublineCache, setLineSublineCache] = useState<Record<string, any[]>>({});
// Add UPC validation state
const [isValidatingUpc, setIsValidatingUpc] = useState(false);
const [validatingUpcRows, setValidatingUpcRows] = useState<Set<number>>(new Set());
@@ -121,29 +125,50 @@ const ValidationContainer = <T extends string>({
console.log(`Fetching product lines for row ${rowIndex}, company ${companyId}`);
// Check if we already have this company's lines in the cache
if (companyLinesCache[companyId]) {
console.log(`Using cached product lines for company ${companyId}`);
// Use cached data
setRowProductLines(prev => ({ ...prev, [rowIndex]: companyLinesCache[companyId] }));
return companyLinesCache[companyId];
}
// Set loading state for this row
setIsLoadingLines(prev => ({ ...prev, [rowIndex]: true }));
// Fetch product lines from API
const response = await fetch(`/api/import/product-lines/${companyId}`);
if (!response.ok) {
const productLinesUrl = `/api/import/product-lines/${companyId}`;
console.log(`Fetching from URL: ${productLinesUrl}`);
const response = await axios.get(productLinesUrl);
if (response.status !== 200) {
throw new Error(`Failed to fetch product lines: ${response.status}`);
}
const productLines = await response.json();
console.log(`Received product lines for row ${rowIndex}:`, productLines);
const productLines = response.data;
console.log(`Received ${productLines.length} product lines for company ${companyId}`);
// Store the product lines for this specific row
// Store in company cache
setCompanyLinesCache(prev => ({ ...prev, [companyId]: productLines }));
// Store for this specific row
setRowProductLines(prev => ({ ...prev, [rowIndex]: productLines }));
return productLines;
} catch (error) {
console.error('Error fetching product lines:', error);
console.error(`Error fetching product lines for company ${companyId}:`, error);
// Set empty array for this company to prevent repeated failed requests
setCompanyLinesCache(prev => ({ ...prev, [companyId]: [] }));
// Store empty array for this specific row
setRowProductLines(prev => ({ ...prev, [rowIndex]: [] }));
return [];
} finally {
// Clear loading state
setIsLoadingLines(prev => ({ ...prev, [rowIndex]: false }));
}
}, []);
}, [companyLinesCache]);
// Function to fetch sublines for a specific line - memoized
const fetchSublines = useCallback(async (rowIndex: string | number, lineId: string) => {
@@ -153,29 +178,50 @@ const ValidationContainer = <T extends string>({
console.log(`Fetching sublines for row ${rowIndex}, line ${lineId}`);
// Check if we already have this line's sublines in the cache
if (lineSublineCache[lineId]) {
console.log(`Using cached sublines for line ${lineId}`);
// Use cached data
setRowSublines(prev => ({ ...prev, [rowIndex]: lineSublineCache[lineId] }));
return lineSublineCache[lineId];
}
// Set loading state for this row
setIsLoadingSublines(prev => ({ ...prev, [rowIndex]: true }));
// Fetch sublines from API
const response = await fetch(`/api/import/sublines/${lineId}`);
if (!response.ok) {
const sublinesUrl = `/api/import/sublines/${lineId}`;
console.log(`Fetching from URL: ${sublinesUrl}`);
const response = await axios.get(sublinesUrl);
if (response.status !== 200) {
throw new Error(`Failed to fetch sublines: ${response.status}`);
}
const sublines = await response.json();
console.log(`Received sublines for row ${rowIndex}:`, sublines);
const sublines = response.data;
console.log(`Received ${sublines.length} sublines for line ${lineId}`);
// Store the sublines for this specific row
// Store in line cache
setLineSublineCache(prev => ({ ...prev, [lineId]: sublines }));
// Store for this specific row
setRowSublines(prev => ({ ...prev, [rowIndex]: sublines }));
return sublines;
} catch (error) {
console.error('Error fetching sublines:', error);
console.error(`Error fetching sublines for line ${lineId}:`, error);
// Set empty array for this line to prevent repeated failed requests
setLineSublineCache(prev => ({ ...prev, [lineId]: [] }));
// Store empty array for this specific row
setRowSublines(prev => ({ ...prev, [rowIndex]: [] }));
return [];
} finally {
// Clear loading state
setIsLoadingSublines(prev => ({ ...prev, [rowIndex]: false }));
}
}, []);
}, [lineSublineCache]);
// Function to validate UPC with the API - memoized
const validateUpc = useCallback(async (rowIndex: number, supplierId: string, upcValue: string): Promise<{ success: boolean, itemNumber?: string }> => {
@@ -214,30 +260,23 @@ const ValidationContainer = <T extends string>({
// UPC already exists - show validation error
const errorData = await response.json();
// Update the validation errors in the main data
// This is necessary for errors to display correctly
// We need to trigger validation for this row to update the validation errors
// This will update the validationErrors Map in useValidationState
const row = data[rowIndex];
if (row) {
// Update the UPC field to trigger validation
updateRow(rowIndex, 'upc' as T, row.upc);
// We also need to manually add the error to the validation errors
// But we don't have direct access to setValidationErrors
// So we'll use a workaround by updating the row data
setData(prevData => {
const newData = [...prevData];
const rowToUpdate = newData.find((_, idx) => idx === rowIndex);
if (rowToUpdate) {
const fieldKey = 'upc' in rowToUpdate ? 'upc' : 'barcode';
// Only update the errors field
newData[rowIndex] = {
...rowToUpdate,
__errors: {
...(rowToUpdate.__errors || {}),
[fieldKey]: {
level: 'error',
message: `UPC already exists (${errorData.existingItemNumber})`,
source: ErrorSources.Upc,
type: ErrorType.Unique
}
}
};
}
// We're only updating the row to trigger validation
// The actual error will be handled by the validation system
return newData;
});
}
return { success: false };
} else if (response.ok) {
@@ -254,26 +293,12 @@ const ValidationContainer = <T extends string>({
[rowIndex]: responseData.itemNumber
}));
// Clear any UPC errors if they exist (this requires updating the main data)
setData(prevData => {
const newData = [...prevData];
const rowToUpdate = newData.find((_, idx) => idx === rowIndex);
if (rowToUpdate && rowToUpdate.__errors) {
const updatedErrors = { ...rowToUpdate.__errors };
delete updatedErrors.upc;
delete updatedErrors.barcode;
// Only update if errors need to be cleared
if (Object.keys(updatedErrors).length !== Object.keys(rowToUpdate.__errors).length) {
newData[rowIndex] = {
...rowToUpdate,
__errors: Object.keys(updatedErrors).length > 0 ? updatedErrors : undefined
};
return newData;
// Clear any UPC errors by triggering validation
const row = data[rowIndex];
if (row) {
// Update the UPC field to trigger validation
updateRow(rowIndex, 'upc' as T, row.upc);
}
}
return prevData; // Return unchanged if no error updates needed
});
return { success: true, itemNumber: responseData.itemNumber };
}
@@ -284,15 +309,15 @@ const ValidationContainer = <T extends string>({
console.error(`Error validating UPC for row ${rowIndex}:`, error);
return { success: false };
}
}, [data, setData]);
}, [data, updateRow, setData]);
// Apply item numbers when validation is complete
// Apply item numbers when they're available
useEffect(() => {
if (!isValidatingUpc && Object.keys(itemNumbers).length > 0) {
// Only update the main data state once all validation is complete
// Apply item numbers immediately if there are any
if (Object.keys(itemNumbers).length > 0) {
applyItemNumbersToData();
}
}, [isValidatingUpc, itemNumbers, applyItemNumbersToData]);
}, [itemNumbers, applyItemNumbersToData]);
// Optimized batch validation function - memoized
const validateAllUPCs = useCallback(async () => {
@@ -330,16 +355,10 @@ const ValidationContainer = <T extends string>({
// Mark all rows as being validated
setValidatingUpcRows(new Set(rowsToValidate.map(({ index }) => index)));
// Process the rows in batches for better performance
const BATCH_SIZE = 10;
try {
for (let i = 0; i < rowsToValidate.length; i += BATCH_SIZE) {
const batch = rowsToValidate.slice(i, Math.min(i + BATCH_SIZE, rowsToValidate.length));
// Process this batch in parallel
// Process all rows in parallel without batching
await Promise.all(
batch.map(async ({ row, index }) => {
rowsToValidate.map(async ({ row, index }) => {
try {
const rowAny = row as Record<string, any>;
const supplierId = rowAny.supplier.toString();
@@ -359,9 +378,8 @@ const ValidationContainer = <T extends string>({
}
})
);
}
} catch (error) {
console.error('Error in batch validation:', error);
console.error('Error in validation:', error);
} finally {
// Reset validation state
setIsValidatingUpc(false);
@@ -401,23 +419,14 @@ const ValidationContainer = <T extends string>({
// If we can't find the original row, just do a simple update
updateRow(rowIndex, fieldKey, processedValue);
} else {
// Create a new row with the updated field
const updatedRow = {
...data[originalIndex],
[fieldKey]: processedValue
};
// Clear any validation errors for this field
if (updatedRow.__errors && updatedRow.__errors[String(fieldKey)]) {
const updatedErrors = { ...updatedRow.__errors };
delete updatedErrors[String(fieldKey)];
updatedRow.__errors = Object.keys(updatedErrors).length > 0 ? updatedErrors : undefined;
}
// Update the data directly
setData(prevData => {
const newData = [...prevData];
const updatedRow = {
...newData[originalIndex],
[fieldKey]: processedValue
};
newData[originalIndex] = updatedRow;
return newData;
});
@@ -443,17 +452,26 @@ const ValidationContainer = <T extends string>({
});
}
// Fetch product lines for the new company if rowData has __index
// Use cached product lines if available, otherwise fetch
if (rowData && rowData.__index) {
// Use setTimeout to make this non-blocking
const companyId = value.toString();
if (companyLinesCache[companyId]) {
// Use cached data
console.log(`Using cached product lines for company ${companyId}`);
setRowProductLines(prev => ({
...prev,
[rowData.__index as string]: companyLinesCache[companyId]
}));
} else {
// Fetch product lines for the new company
setTimeout(async () => {
// Ensure value is not undefined before calling toString()
if (value !== undefined) {
await fetchProductLines(rowData.__index as string, value.toString());
await fetchProductLines(rowData.__index as string, companyId);
}
}, 0);
}
}
}
// If updating supplier field AND there's a UPC value, validate UPC
if (fieldKey === 'supplier' && value && rowData) {
@@ -461,9 +479,7 @@ const ValidationContainer = <T extends string>({
if (rowDataAny.upc || rowDataAny.barcode) {
const upcValue = rowDataAny.upc || rowDataAny.barcode;
// Run UPC validation in a non-blocking way - with a slight delay
// This allows the UI to update with the selected value first
setTimeout(async () => {
// Run UPC validation immediately without timeout
try {
// Mark this row as being validated
setValidatingUpcRows(prev => {
@@ -490,7 +506,6 @@ const ValidationContainer = <T extends string>({
return newSet;
});
}
}, 200); // Slight delay to let the UI update first
}
}
@@ -508,24 +523,32 @@ const ValidationContainer = <T extends string>({
});
}
// Fetch sublines for the new line if rowData has __index
// Use cached sublines if available, otherwise fetch
if (rowData && rowData.__index) {
// Use setTimeout to make this non-blocking
const lineId = value.toString();
if (lineSublineCache[lineId]) {
// Use cached data
console.log(`Using cached sublines for line ${lineId}`);
setRowSublines(prev => ({
...prev,
[rowData.__index as string]: lineSublineCache[lineId]
}));
} else {
// Fetch sublines for the new line
setTimeout(async () => {
// Ensure value is not undefined before calling toString()
if (value !== undefined) {
await fetchSublines(rowData.__index as string, value.toString());
await fetchSublines(rowData.__index as string, lineId);
}
}, 0);
}
}
}
// If updating UPC/barcode field AND there's a supplier value, validate UPC
if ((fieldKey === 'upc' || fieldKey === 'barcode') && value && rowData) {
const rowDataAny = rowData as Record<string, any>;
if (rowDataAny.supplier) {
// Run UPC validation in a non-blocking way
setTimeout(async () => {
// Run UPC validation immediately without timeout
try {
// Mark this row as being validated
setValidatingUpcRows(prev => {
@@ -552,44 +575,275 @@ const ValidationContainer = <T extends string>({
return newSet;
});
}
}, 200); // Slight delay to let the UI update first
}
}
}, [data, filteredData, updateRow, fetchProductLines, fetchSublines, validateUpc, setData]);
}, [data, filteredData, updateRow, fetchProductLines, fetchSublines, validateUpc, setData, companyLinesCache, lineSublineCache]);
// When data changes, fetch product lines and sublines for rows that have company/line values
useEffect(() => {
// Skip if there's no data
if (!data.length) return;
// Process each row to set up initial line/subline options
console.log("Starting to fetch product lines and sublines");
// Group rows by company and line to minimize API calls
const companiesNeeded = new Map<string, string[]>(); // company ID -> row IDs
const linesNeeded = new Map<string, string[]>(); // line ID -> row IDs
data.forEach(row => {
const rowId = row.__index;
if (!rowId) return; // Skip rows without an index
// If row has company but no product lines fetched yet, fetch them
// If row has company but no product lines fetched yet
if (row.company && !rowProductLines[rowId]) {
fetchProductLines(rowId, row.company.toString());
const companyId = row.company.toString();
if (!companiesNeeded.has(companyId)) {
companiesNeeded.set(companyId, []);
}
companiesNeeded.get(companyId)?.push(rowId);
}
// If row has line but no sublines fetched yet, fetch them
// If row has line but no sublines fetched yet
if (row.line && !rowSublines[rowId]) {
fetchSublines(rowId, row.line.toString());
const lineId = row.line.toString();
if (!linesNeeded.has(lineId)) {
linesNeeded.set(lineId, []);
}
linesNeeded.get(lineId)?.push(rowId);
}
});
}, [data, rowProductLines, rowSublines, fetchProductLines, fetchSublines]);
console.log(`Need to fetch product lines for ${companiesNeeded.size} companies and sublines for ${linesNeeded.size} lines`);
// Create arrays to hold all fetch promises
const fetchPromises: Promise<void>[] = [];
// Set initial loading states for all affected rows
const lineLoadingUpdates: Record<string, boolean> = {};
const sublineLoadingUpdates: Record<string, boolean> = {};
// Process companies that need product lines
companiesNeeded.forEach((rowIds, companyId) => {
// Skip if already in cache
if (companyLinesCache[companyId]) {
console.log(`Using cached product lines for company ${companyId}`);
// Use cached data for all rows with this company
const lines = companyLinesCache[companyId];
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = lines;
});
setRowProductLines(prev => ({ ...prev, ...updates }));
return;
}
// Set loading state for all affected rows
rowIds.forEach(rowId => {
lineLoadingUpdates[rowId] = true;
});
// Create fetch promise
const fetchPromise = (async () => {
// Safety timeout to ensure loading state is cleared after 10 seconds
const timeoutId = setTimeout(() => {
console.log(`Safety timeout triggered for company ${companyId}`);
const clearLoadingUpdates: Record<string, boolean> = {};
rowIds.forEach(rowId => {
clearLoadingUpdates[rowId] = false;
});
setIsLoadingLines(prev => ({ ...prev, ...clearLoadingUpdates }));
// Set empty cache to prevent repeated requests
setCompanyLinesCache(prev => ({ ...prev, [companyId]: [] }));
// Update rows with empty array
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = [];
});
setRowProductLines(prev => ({ ...prev, ...updates }));
toast.error(`Timeout loading product lines for company ${companyId}`);
}, 10000);
try {
console.log(`Fetching product lines for company ${companyId} (affecting ${rowIds.length} rows)`);
// Fetch product lines from API
const productLinesUrl = `/api/import/product-lines/${companyId}`;
console.log(`Fetching from URL: ${productLinesUrl}`);
const response = await axios.get(productLinesUrl);
console.log(`Product lines API response status for company ${companyId}:`, response.status);
const productLines = response.data;
console.log(`Received ${productLines.length} product lines for company ${companyId}`);
// Store in company cache
setCompanyLinesCache(prev => ({ ...prev, [companyId]: productLines }));
// Update all rows with this company
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = productLines;
});
setRowProductLines(prev => ({ ...prev, ...updates }));
} catch (error) {
console.error(`Error fetching product lines for company ${companyId}:`, error);
// Set empty array for this company to prevent repeated failed requests
setCompanyLinesCache(prev => ({ ...prev, [companyId]: [] }));
// Update rows with empty array
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = [];
});
setRowProductLines(prev => ({ ...prev, ...updates }));
// Show error toast
toast.error(`Failed to load product lines for company ${companyId}`);
} finally {
// Clear the safety timeout
clearTimeout(timeoutId);
// Clear loading state for all affected rows
const clearLoadingUpdates: Record<string, boolean> = {};
rowIds.forEach(rowId => {
clearLoadingUpdates[rowId] = false;
});
setIsLoadingLines(prev => ({ ...prev, ...clearLoadingUpdates }));
}
})();
fetchPromises.push(fetchPromise);
});
// Process lines that need sublines
linesNeeded.forEach((rowIds, lineId) => {
// Skip if already in cache
if (lineSublineCache[lineId]) {
console.log(`Using cached sublines for line ${lineId}`);
// Use cached data for all rows with this line
const sublines = lineSublineCache[lineId];
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = sublines;
});
setRowSublines(prev => ({ ...prev, ...updates }));
return;
}
// Set loading state for all affected rows
rowIds.forEach(rowId => {
sublineLoadingUpdates[rowId] = true;
});
// Create fetch promise
const fetchPromise = (async () => {
// Safety timeout to ensure loading state is cleared after 10 seconds
const timeoutId = setTimeout(() => {
console.log(`Safety timeout triggered for line ${lineId}`);
const clearLoadingUpdates: Record<string, boolean> = {};
rowIds.forEach(rowId => {
clearLoadingUpdates[rowId] = false;
});
setIsLoadingSublines(prev => ({ ...prev, ...clearLoadingUpdates }));
// Set empty cache to prevent repeated requests
setLineSublineCache(prev => ({ ...prev, [lineId]: [] }));
// Update rows with empty array
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = [];
});
setRowSublines(prev => ({ ...prev, ...updates }));
toast.error(`Timeout loading sublines for line ${lineId}`);
}, 10000);
try {
console.log(`Fetching sublines for line ${lineId} (affecting ${rowIds.length} rows)`);
// Fetch sublines from API
const sublinesUrl = `/api/import/sublines/${lineId}`;
console.log(`Fetching from URL: ${sublinesUrl}`);
const response = await axios.get(sublinesUrl);
console.log(`Sublines API response status for line ${lineId}:`, response.status);
const sublines = response.data;
console.log(`Received ${sublines.length} sublines for line ${lineId}`);
// Store in line cache
setLineSublineCache(prev => ({ ...prev, [lineId]: sublines }));
// Update all rows with this line
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = sublines;
});
setRowSublines(prev => ({ ...prev, ...updates }));
} catch (error) {
console.error(`Error fetching sublines for line ${lineId}:`, error);
// Set empty array for this line to prevent repeated failed requests
setLineSublineCache(prev => ({ ...prev, [lineId]: [] }));
// Update rows with empty array
const updates: Record<string, any[]> = {};
rowIds.forEach(rowId => {
updates[rowId] = [];
});
setRowSublines(prev => ({ ...prev, ...updates }));
// Show error toast
toast.error(`Failed to load sublines for line ${lineId}`);
} finally {
// Clear the safety timeout
clearTimeout(timeoutId);
// Clear loading state for all affected rows
const clearLoadingUpdates: Record<string, boolean> = {};
rowIds.forEach(rowId => {
clearLoadingUpdates[rowId] = false;
});
setIsLoadingSublines(prev => ({ ...prev, ...clearLoadingUpdates }));
}
})();
fetchPromises.push(fetchPromise);
});
// Set initial loading states
if (Object.keys(lineLoadingUpdates).length > 0) {
console.log(`Setting loading state for ${Object.keys(lineLoadingUpdates).length} rows (product lines)`);
setIsLoadingLines(prev => ({ ...prev, ...lineLoadingUpdates }));
}
if (Object.keys(sublineLoadingUpdates).length > 0) {
console.log(`Setting loading state for ${Object.keys(sublineLoadingUpdates).length} rows (sublines)`);
setIsLoadingSublines(prev => ({ ...prev, ...sublineLoadingUpdates }));
}
// Run all fetch operations in parallel
Promise.all(fetchPromises).then(() => {
console.log("All product lines and sublines fetch operations completed");
}).catch(error => {
console.error('Error in fetch operations:', error);
});
}, [data, rowProductLines, rowSublines, companyLinesCache, lineSublineCache]);
// Validate UPCs on initial data load
useEffect(() => {
// Skip if there's no data or we've already done the validation
if (data.length === 0 || initialUpcValidationDoneRef.current) return;
// Use a short timeout to allow the UI to render first
const timer = setTimeout(() => {
// Run validation immediately without timeout
validateAllUPCs();
}, 100);
return () => clearTimeout(timer);
// No cleanup needed since we're not using a timer
}, [data, validateAllUPCs]);
// Use AI validation hook

View File

@@ -88,22 +88,10 @@ export const useAiValidation = <T extends string>(
// Call the original hook
const result = await rowHook(row);
// Extract Meta-specific properties
const { __index, __errors } = result;
// Return a Meta object with properly typed errors
const { __index } = result;
// Return a Meta object with only the __index property
return {
__index: __index || row.__index || '',
__errors: __errors ?
Object.fromEntries(
Object.entries(__errors).map(([key, value]) => {
const errorArray = Array.isArray(value) ? value : [value];
return [key, {
message: errorArray[0].message,
level: errorArray[0].level,
source: ErrorSources.Row,
type: ErrorType.Custom
} as InfoWithSource]
})
) : null
__index: __index || row.__index || ''
};
} : undefined;
@@ -113,19 +101,7 @@ export const useAiValidation = <T extends string>(
const results = await tableHook(rows);
// Extract Meta-specific properties from each result
return results.map((result, index) => ({
__index: result.__index || rows[index].__index || '',
__errors: result.__errors ?
Object.fromEntries(
Object.entries(result.__errors).map(([key, value]) => {
const errorArray = Array.isArray(value) ? value : [value];
return [key, {
message: errorArray[0].message,
level: errorArray[0].level,
source: ErrorSources.Table,
type: ErrorType.Custom
} as InfoWithSource]
})
) : null
__index: result.__index || rows[index].__index || ''
}));
} : undefined;
@@ -285,7 +261,7 @@ export const useAiValidation = <T extends string>(
// Clean the data to ensure we only send what's needed
const cleanedData = data.map(item => {
const { __errors, __index, ...rest } = item;
const { __index, ...rest } = item;
return rest;
});
@@ -401,8 +377,8 @@ export const useAiValidation = <T extends string>(
// Clean the data to ensure we only send what's needed
const cleanedData = data.map(item => {
const { __errors, __index, ...cleanProduct } = item;
return cleanProduct;
const { __index, ...rest } = item;
return rest;
});
console.log('Cleaned data for validation:', cleanedData);
@@ -603,7 +579,7 @@ export const useAiValidation = <T extends string>(
console.log('Data updated after AI validation:', {
dataLength: validatedData.length,
hasErrors: validatedData.some(row => row.__errors && Object.keys(row.__errors).length > 0)
hasErrors: false // We no longer check row.__errors
});
// Show changes and warnings in dialog after data is updated

View File

@@ -11,7 +11,8 @@ export interface FilterState {
export const useFilters = <T extends string>(
data: RowData<T>[],
fields: Fields<T>
fields: Fields<T>,
validationErrors: Map<number, Record<string, any>>
) => {
// Filter state
const [filters, setFilters] = useState<FilterState>({
@@ -59,7 +60,7 @@ export const useFilters = <T extends string>(
// Apply filters to data
const applyFilters = useCallback((dataToFilter: RowData<T>[]) => {
return dataToFilter.filter(row => {
return dataToFilter.filter((row, index) => {
// Filter by search text
if (filters.searchText) {
const lowerSearchText = filters.searchText.toLowerCase()
@@ -78,7 +79,8 @@ export const useFilters = <T extends string>(
// Filter by errors
if (filters.showErrorsOnly) {
const hasErrors = row.__errors && Object.keys(row.__errors).length > 0
const hasErrors = validationErrors.has(index) &&
Object.keys(validationErrors.get(index) || {}).length > 0
if (!hasErrors) return false
}
@@ -92,7 +94,7 @@ export const useFilters = <T extends string>(
return true
})
}, [filters])
}, [filters, validationErrors])
// Reset all filters
const resetFilters = useCallback(() => {
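With the new signature, call sites pass the shared error Map alongside data and fields. A minimal call-site sketch (the destructured result names are assumptions):
// validationErrors comes from useValidationState's central store
const { filters, setFilters, applyFilters, resetFilters } = useFilters(data, fields, validationErrors);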

View File

@@ -82,13 +82,11 @@ export const useTemplates = <T extends string>(
}
// Remove metadata fields
delete (template as any).__errors
delete (template as any).__meta
delete (template as any).__template
delete (template as any).__original
delete (template as any).__corrected
delete (template as any).__changes
delete (template as any).__index
// Send to API
const response = await fetch(`${getApiUrl()}/templates`, {

View File

@@ -113,49 +113,23 @@ export const useValidation = <T extends string>(
// Run row hook if provided
let rowHookResult: Meta = {
__index: row.__index || String(rowIndex),
__errors: {}
__index: row.__index || String(rowIndex)
}
if (rowHook) {
try {
rowHookResult = await rowHook(row, rowIndex, allRows)
// Call the row hook and extract only the __index property
const result = await rowHook(row, rowIndex, allRows);
rowHookResult.__index = result.__index || rowHookResult.__index;
} catch (error) {
console.error('Error in row hook:', error)
}
}
// Merge field errors and row hook errors
const mergedErrors: Record<string, InfoWithSource> = {}
// Convert field errors to InfoWithSource
Object.entries(fieldErrors).forEach(([key, errors]) => {
if (errors.length > 0) {
mergedErrors[key] = {
message: errors[0].message,
level: errors[0].level,
source: ErrorSources.Row,
type: errors[0].type || ErrorType.Custom
}
}
})
// Merge row hook errors
if (rowHookResult.__errors) {
Object.entries(rowHookResult.__errors).forEach(([key, error]) => {
if (error) {
// Add type if not present
const errorWithType = {
...error,
type: error.type || ErrorType.Custom
}
mergedErrors[key] = errorWithType as InfoWithSource
}
})
}
// We no longer need to merge errors since we're not storing them in the row data
// The calling code should handle storing errors in the validationErrors Map
return {
__index: row.__index || String(rowIndex),
__errors: mergedErrors
__index: row.__index || String(rowIndex)
}
}, [fields, validateField, rowHook])
@@ -163,8 +137,7 @@ export const useValidation = <T extends string>(
const validateTable = useCallback(async (data: RowData<T>[]): Promise<Meta[]> => {
if (!tableHook) {
return data.map((row, index) => ({
__index: row.__index || String(index),
__errors: {}
__index: row.__index || String(index)
}))
}
@@ -173,137 +146,131 @@ export const useValidation = <T extends string>(
// Process table validation results
return tableResults.map((result, index) => {
// Ensure errors are properly formatted
const formattedErrors: Record<string, InfoWithSource> = {}
if (result.__errors) {
Object.entries(result.__errors).forEach(([key, error]) => {
if (error) {
formattedErrors[key] = {
...error,
source: ErrorSources.Table,
type: error.type || ErrorType.Custom
} as InfoWithSource
}
})
}
return {
__index: result.__index || data[index].__index || String(index),
__errors: formattedErrors
__index: result.__index || data[index].__index || String(index)
}
})
} catch (error) {
console.error('Error in table hook:', error)
return data.map((row, index) => ({
__index: row.__index || String(index),
__errors: {}
__index: row.__index || String(index)
}))
}
}, [tableHook])
// Validate unique fields across the table
const validateUnique = useCallback((data: RowData<T>[]) => {
const uniqueErrors: Meta[] = data.map((row, index) => ({
__index: row.__index || String(index),
__errors: {}
}))
// Create a map to store errors for each row
const uniqueErrors = new Map<number, Record<string, InfoWithSource>>();
// Find fields with unique validation
const uniqueFields = fields.filter(field =>
field.validations?.some(v => v.rule === 'unique')
)
);
if (uniqueFields.length === 0) {
return uniqueErrors
return uniqueErrors;
}
// Check each unique field
uniqueFields.forEach(field => {
const { key } = field
const validation = field.validations?.find(v => v.rule === 'unique')
const allowEmpty = validation?.allowEmpty ?? false
const errorMessage = validation?.errorMessage || `${field.label} must be unique`
const level = validation?.level || 'error'
const { key } = field;
const validation = field.validations?.find(v => v.rule === 'unique');
const allowEmpty = validation?.allowEmpty ?? false;
const errorMessage = validation?.errorMessage || `${field.label} must be unique`;
const level = validation?.level || 'error';
// Track values for uniqueness check
const valueMap = new Map<string, number[]>()
const valueMap = new Map<string, number[]>();
// Build value map
data.forEach((row, rowIndex) => {
const value = String(row[String(key) as keyof typeof row] || '')
const value = String(row[String(key) as keyof typeof row] || '');
// Skip empty values if allowed
if (allowEmpty && isEmpty(value)) {
return
return;
}
if (!valueMap.has(value)) {
valueMap.set(value, [rowIndex])
valueMap.set(value, [rowIndex]);
} else {
valueMap.get(value)?.push(rowIndex)
valueMap.get(value)?.push(rowIndex);
}
})
});
// Add errors for duplicate values
valueMap.forEach((rowIndexes) => {
if (rowIndexes.length > 1) {
// Add error to all duplicate rows
rowIndexes.forEach(rowIndex => {
const rowErrors = uniqueErrors[rowIndex].__errors || {}
// Get existing errors for this row or create a new object
const rowErrors = uniqueErrors.get(rowIndex) || {};
rowErrors[String(key)] = {
message: errorMessage,
level,
source: ErrorSources.Table,
type: ErrorType.Unique
}
};
uniqueErrors[rowIndex].__errors = rowErrors
})
uniqueErrors.set(rowIndex, rowErrors);
});
}
})
})
});
});
return uniqueErrors
}, [fields])
return uniqueErrors;
}, [fields]);
// Run complete validation
const validateData = useCallback(async (data: RowData<T>[]) => {
// Use the shared isEmpty function
// Step 1: Run field and row validation
// Step 1: Run field and row validation for each row
const rowValidations = await Promise.all(
data.map((row, index) => validateRow(row, index, data))
)
);
// Step 2: Run unique validations
const uniqueValidations = validateUnique(data)
const uniqueValidations = validateUnique(data);
// Step 3: Run table hook
const tableValidations = await validateTable(data)
const tableValidations = await validateTable(data);
// Create a map to store all validation errors
const validationErrors = new Map<number, Record<string, InfoWithSource>>();
// Merge all validation results
return data.map((row, index) => {
const rowValidation = rowValidations[index]
const uniqueValidation = uniqueValidations[index]
const tableValidation = tableValidations[index]
data.forEach((row, index) => {
// Collect errors from all validation sources
const rowErrors: Record<string, InfoWithSource> = {};
// Start with the original data
const newRow = { ...row }
// Add field-level errors (we need to extract these from the validation process)
fields.forEach(field => {
const value = row[String(field.key) as keyof typeof row];
const errors = validateField(value, field as Field<T>);
// Combine all errors
const combinedErrors = {
...(rowValidation.__errors || {}),
...(uniqueValidation.__errors || {}),
...(tableValidation.__errors || {})
if (errors.length > 0) {
rowErrors[String(field.key)] = {
message: errors[0].message,
level: errors[0].level,
source: ErrorSources.Row,
type: errors[0].type
};
}
});
// Add unique validation errors
if (uniqueValidations.has(index)) {
Object.entries(uniqueValidations.get(index) || {}).forEach(([key, error]) => {
rowErrors[key] = error;
});
}
// Filter out "required" errors for fields that have values
const filteredErrors: Record<string, InfoWithSource> = {}
const filteredErrors: Record<string, InfoWithSource> = {};
Object.entries(combinedErrors).forEach(([key, error]) => {
const fieldValue = row[key as keyof typeof row]
Object.entries(rowErrors).forEach(([key, error]) => {
const fieldValue = row[key as keyof typeof row];
// If the field has a value and the error is of type Required, skip it
if (!isEmpty(fieldValue) &&
@@ -311,17 +278,26 @@ export const useValidation = <T extends string>(
typeof error === 'object' &&
'type' in error &&
error.type === ErrorType.Required) {
return
return;
}
filteredErrors[key] = error as InfoWithSource
})
filteredErrors[key] = error;
});
newRow.__errors = Object.keys(filteredErrors).length > 0 ? filteredErrors : undefined
// Only add to the map if there are errors
if (Object.keys(filteredErrors).length > 0) {
validationErrors.set(index, filteredErrors);
}
});
return newRow
})
}, [validateRow, validateUnique, validateTable])
return {
data: data.map((row, index) => {
// Return the original data without __errors
return { ...row };
}),
validationErrors
};
}, [validateRow, validateUnique, validateTable, fields, validateField]);
return {
validateData,

View File

@@ -34,7 +34,6 @@ export interface Props<T extends string> {
// Extended Data type with meta information
export type RowData<T extends string> = Data<T> & {
__index?: string;
__errors?: Record<string, ValidationError[] | ValidationError>;
__template?: string;
__original?: Record<string, any>;
__corrected?: Record<string, any>;
@@ -89,8 +88,8 @@ declare global {
export const getApiUrl = () => config.apiUrl;
// Add debounce utility
const DEBOUNCE_DELAY = 300;
const BATCH_SIZE = 5;
const DEBOUNCE_DELAY = 0; // No delay
const BATCH_SIZE = 50; // Larger batch size
function debounce<T extends (...args: any[]) => any>(
func: T,
@@ -99,7 +98,12 @@ function debounce<T extends (...args: any[]) => any>(
let timeout: NodeJS.Timeout;
return (...args: Parameters<T>) => {
clearTimeout(timeout);
// Execute immediately if no delay
if (wait <= 0) {
func(...args);
} else {
timeout = setTimeout(() => func(...args), wait);
}
};
}
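Since DEBOUNCE_DELAY is now 0, the wrapper calls through synchronously. A small usage sketch (the wrapped callback is the hook's own validateRow; the row index is illustrative):
// With wait = 0 the call runs immediately; with wait > 0 it is deferred.
const debouncedValidate = debounce((rowIndex: number) => validateRow(rowIndex), DEBOUNCE_DELAY);
debouncedValidate(42); // executes synchronously when DEBOUNCE_DELAY === 0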
@@ -544,10 +548,13 @@ export const useValidationState = <T extends string>({
if (validationQueueRef.current.length === 0) return;
isProcessingBatchRef.current = true;
const batch = validationQueueRef.current.splice(0, BATCH_SIZE);
// Process all items in the queue at once
const allItems = [...validationQueueRef.current];
validationQueueRef.current = []; // Clear the queue
try {
await Promise.all(batch.map(async ({ rowIndex, supplierId, upcValue }) => {
await Promise.all(allItems.map(async ({ rowIndex, supplierId, upcValue }) => {
// Skip if already validated
const cacheKey = `${supplierId}-${upcValue}`;
if (processedUpcMapRef.current.has(cacheKey)) return;
@@ -566,7 +573,7 @@ export const useValidationState = <T extends string>({
} finally {
isProcessingBatchRef.current = false;
// Process next batch if queue not empty
// Process any new items that might have been added during processing
if (validationQueueRef.current.length > 0) {
processBatchValidation();
}
@@ -620,42 +627,41 @@ export const useValidationState = <T extends string>({
return validatingUpcRows.includes(rowIndex);
}, [validatingUpcRows]);
// Compute filtered data based on current filters
// Filter data based on current filter state
const filteredData = useMemo(() => {
return data.filter(row => {
return data.filter((row, index) => {
// Filter by search text
if (filters.searchText) {
const lowerSearchText = filters.searchText.toLowerCase()
const matchesSearch = Object.entries(row).some(([key, value]) => {
// Skip metadata fields
if (key.startsWith('__')) return false
// Check if the value contains the search text
return value !== undefined &&
value !== null &&
String(value).toLowerCase().includes(lowerSearchText)
})
if (!matchesSearch) return false
const searchLower = filters.searchText.toLowerCase();
const matchesSearch = fields.some(field => {
const value = row[field.key as keyof typeof row];
if (value === undefined || value === null) return false;
return String(value).toLowerCase().includes(searchLower);
});
if (!matchesSearch) return false;
}
// Filter by errors
if (filters.showErrorsOnly) {
const hasErrors = row.__errors && Object.keys(row.__errors).length > 0
if (!hasErrors) return false
const hasErrors = validationErrors.has(index) &&
Object.keys(validationErrors.get(index) || {}).length > 0;
if (!hasErrors) return false;
}
// Filter by field value
if (filters.filterField && filters.filterValue) {
const fieldValue = row[filters.filterField as keyof typeof row]
return fieldValue !== undefined &&
fieldValue !== null &&
String(fieldValue) === filters.filterValue
const fieldValue = row[filters.filterField as keyof typeof row];
if (fieldValue === undefined) return false;
const valueStr = String(fieldValue).toLowerCase();
const filterStr = filters.filterValue.toLowerCase();
if (!valueStr.includes(filterStr)) return false;
}
return true
})
}, [data, filters])
return true;
});
}, [data, fields, filters, validationErrors]);
// Get filter fields
const filterFields = useMemo(() => {
@@ -759,72 +765,77 @@ export const useValidationState = <T extends string>({
const fieldErrors: Record<string, ValidationError[]> = {};
let hasErrors = false;
// Get current errors for comparison
const currentErrors = validationErrors.get(rowIndex) || {};
// Track if row has changes to original values
const originalRow = row.__original || {};
const changedFields = row.__changes || {};
// Use a more efficient approach - only validate fields that need validation
fields.forEach(field => {
// Skip disabled fields
if (field.disabled) return;
const key = String(field.key);
const value = row[key as keyof typeof row];
const fieldKey = String(field.key);
const value = row[fieldKey as keyof typeof row];
// Skip validation for empty non-required fields
const isRequired = field.validations?.some(v => v.rule === 'required');
if (!isRequired && (value === undefined || value === null || value === '')) {
return;
}
// Only validate if:
// 1. Field has changed (if we have change tracking)
// 2. No prior validation exists
// 3. This is a special field (supplier/company)
const hasChanged = changedFields[key] ||
!currentErrors[key] ||
key === 'supplier' ||
key === 'company';
if (hasChanged) {
// Validate the field
const errors = validateField(value, field as Field<T>);
// Store errors if any
if (errors.length > 0) {
fieldErrors[key] = errors;
fieldErrors[fieldKey] = errors;
hasErrors = true;
}
} else {
// Keep existing errors if field hasn't changed
if (currentErrors[key] && currentErrors[key].length > 0) {
fieldErrors[key] = currentErrors[key];
hasErrors = true;
}
}
});
// Special validation for supplier and company - always validate these
if (!row.supplier) {
fieldErrors['supplier'] = [{
message: 'Supplier is required',
level: 'error',
source: ErrorSources.Row,
type: ErrorType.Required
}];
hasErrors = true;
}
if (!row.company) {
fieldErrors['company'] = [{
message: 'Company is required',
level: 'error',
source: ErrorSources.Row,
type: ErrorType.Required
}];
hasErrors = true;
}
// Run row hook if provided
if (rowHook) {
try {
// Call the row hook
const hookResult = rowHook(row, rowIndex, data);
// Handle both synchronous and asynchronous results
Promise.resolve(hookResult).then(result => {
// Extract errors from the hook result
const hookErrors: Record<string, ValidationError[]> = {};
let hasHookErrors = false;
// Process hook errors if they exist
if (result) {
// The hook might return custom errors through a different mechanism
// We need to adapt to the new approach where errors are not stored in __errors
// Update validation errors for this row
setValidationErrors(prev => {
const updated = new Map(prev);
if (Object.keys(fieldErrors).length > 0 || hasHookErrors) {
// Merge field errors with hook errors
const mergedErrors = { ...fieldErrors };
if (hasHookErrors) {
Object.entries(hookErrors).forEach(([key, errors]) => {
if (mergedErrors[key]) {
// Append to existing errors
mergedErrors[key] = [...mergedErrors[key], ...errors];
} else {
// Add new errors
mergedErrors[key] = errors;
}
});
}
updated.set(rowIndex, mergedErrors);
} else {
updated.delete(rowIndex);
}
return updated;
});
}
});
} catch (error) {
console.error('Error in row hook:', error);
}
} else {
// No row hook, just update with field errors
setValidationErrors(prev => {
const updated = new Map(prev);
if (Object.keys(fieldErrors).length > 0) {
@@ -834,14 +845,8 @@ export const useValidationState = <T extends string>({
}
return updated;
});
// Update row validation status
setRowValidationStatus(prev => {
const updated = new Map(prev);
updated.set(rowIndex, hasErrors ? 'error' : 'validated');
return updated;
});
}, [data, fields, validateField, validationErrors]);
}
}, [data, fields, validateField, rowHook]);
// Update a row's field value
const updateRow = useCallback((rowIndex: number, key: T, value: any) => {
@@ -879,65 +884,98 @@ export const useValidationState = <T extends string>({
}
}
// Update the data state
setData(prevData => {
const newData = [...prevData];
const updatedRow = { ...newData[rowIndex] } as Record<string, any>;
// Update the data
setData(prev => {
const newData = [...prev];
if (rowIndex >= 0 && rowIndex < newData.length) {
// Create a new row object without modifying the original
const updatedRow = { ...newData[rowIndex] };
// Update the field value
updatedRow[key] = processedValue;
// Update the row in the data array
newData[rowIndex] = updatedRow as RowData<T>;
// Track changes from original
if (!updatedRow.__original) {
updatedRow.__original = { ...updatedRow };
}
// Clean all price fields to ensure consistency
return cleanPriceFields(newData);
if (!updatedRow.__changes) {
updatedRow.__changes = {};
}
// Record this change
updatedRow.__changes[key] = true;
// Replace the row in the data array
newData[rowIndex] = updatedRow;
}
return newData;
});
// Mark row as pending validation
setRowValidationStatus(prev => {
const updated = new Map(prev);
updated.set(rowIndex, 'pending');
return updated;
});
// Restore scroll position after update
// Restore scroll position
requestAnimationFrame(() => {
window.scrollTo(scrollPosition.left, scrollPosition.top);
});
// Use debounced validation to avoid excessive validation calls
const shouldValidateUpc = (key === 'upc' || key === 'supplier');
// Validate the updated field
if (field) {
// Clear any existing validation errors for this field
setValidationErrors(prev => {
const rowErrors = prev.get(rowIndex) || {};
const newRowErrors = { ...rowErrors };
// Clear any existing timeout for this row
if (validationTimeoutsRef.current[rowIndex]) {
clearTimeout(validationTimeoutsRef.current[rowIndex]);
// Remove errors for this field
delete newRowErrors[String(key)];
// Update the map
const newMap = new Map(prev);
if (Object.keys(newRowErrors).length > 0) {
newMap.set(rowIndex, newRowErrors);
} else {
newMap.delete(rowIndex);
}
// Set a new timeout for validation
validationTimeoutsRef.current[rowIndex] = setTimeout(() => {
// Validate the row
validateRow(rowIndex);
return newMap;
});
// Trigger UPC validation if applicable, but only if both fields are present
if (shouldValidateUpc && data[rowIndex]) {
const row = data[rowIndex];
const upcValue = key === 'upc' ? processedValue : row.upc;
const supplierValue = key === 'supplier' ? processedValue : row.supplier;
// Validate the field
const errors = validateField(processedValue, field as Field<T>);
if (upcValue && supplierValue) {
validateUpc(rowIndex, String(supplierValue), String(upcValue));
// If there are errors, update the validation errors
if (errors.length > 0) {
setValidationErrors(prev => {
const rowErrors = prev.get(rowIndex) || {};
const newRowErrors = { ...rowErrors, [String(key)]: errors };
// Update the map
const newMap = new Map(prev);
newMap.set(rowIndex, newRowErrors);
return newMap;
});
}
}
// Remove the cell from validating state
// Mark the cell as no longer validating
setValidatingCells(prev => {
const newSet = new Set(prev);
newSet.delete(`${rowIndex}-${key}`);
return newSet;
});
// Call field's onChange handler if it exists
if (field && field.onChange) {
field.onChange(processedValue, data[rowIndex]);
}
// Schedule validation for the entire row
const timeoutId = setTimeout(() => {
validateRow(rowIndex);
}, 300);
}, [data, validateRow, validateUpc, setData, setRowValidationStatus, cleanPriceFields, fields]);
// Store the timeout ID for cleanup
validationTimeoutsRef.current[rowIndex] = timeoutId;
}, [data, fields, validateField, validateRow]);
// Copy a cell value to all cells below it in the same column
const copyDown = useCallback((rowIndex: number, key: T) => {
@@ -978,7 +1016,7 @@ export const useValidationState = <T extends string>({
}
// Extract data for template, removing metadata fields
const { __index, __errors, __template, __original, __corrected, __changes, ...templateData } = selectedRow as any;
const { __index, __template, __original, __corrected, __changes, ...templateData } = selectedRow as any;
// Clean numeric values (remove $ from price fields)
const cleanedData: Record<string, any> = {};
@@ -1088,7 +1126,7 @@ export const useValidationState = <T extends string>({
// Extract template fields once outside the loop
const templateFields = Object.entries(template).filter(([key]) =>
!['id', '__errors', '__meta', '__template', '__original', '__corrected', '__changes'].includes(key)
!['id', '__meta', '__template', '__original', '__corrected', '__changes'].includes(key)
);
// Apply template to each valid row
@@ -1097,9 +1135,6 @@ export const useValidationState = <T extends string>({
const originalRow = newData[index];
const updatedRow = { ...originalRow } as Record<string, any>;
// Clear existing errors
delete updatedRow.__errors;
// Apply template fields (excluding metadata fields)
for (const [key, value] of templateFields) {
updatedRow[key] = value;
@@ -1154,51 +1189,24 @@ export const useValidationState = <T extends string>({
return row && row.upc && row.supplier;
});
// If there are rows needing UPC validation, process them
// Validate UPCs for rows that have both UPC and supplier
if (upcValidationRows.length > 0) {
// Batch UPC validation for better performance
console.log(`Validating UPCs for ${upcValidationRows.length} rows after template application`);
// Schedule UPC validation for the next tick to allow UI to update first
setTimeout(() => {
// Process in batches to avoid overwhelming API
const processUpcValidations = async () => {
const BATCH_SIZE = 5;
// Sort by upc for better caching
upcValidationRows.sort((a, b) => {
const aUpc = String(newData[a].upc || '');
const bUpc = String(newData[b].upc || '');
return aUpc.localeCompare(bUpc);
});
// Process in batches to avoid hammering the API
for (let i = 0; i < upcValidationRows.length; i += BATCH_SIZE) {
const batch = upcValidationRows.slice(i, i + BATCH_SIZE);
// Process this batch in parallel
await Promise.all(batch.map(async (rowIndex) => {
upcValidationRows.forEach(rowIndex => {
const row = newData[rowIndex];
if (row && row.upc && row.supplier) {
await validateUpc(rowIndex, String(row.supplier), String(row.upc));
validateRow(rowIndex);
}
}));
// Add delay between batches to reduce server load
if (i + BATCH_SIZE < upcValidationRows.length) {
await new Promise(r => setTimeout(r, 300));
}
}
// Reset template application flag
isApplyingTemplateRef.current = false;
};
// Start processing
processUpcValidations();
});
}, 100);
} else {
// No UPC validation needed, reset flag immediately
isApplyingTemplateRef.current = false;
}
}, [data, templates, validateUpc, setData, setValidationErrors, setRowValidationStatus]);
// Reset the template application flag
isApplyingTemplateRef.current = false;
}, [data, templates, setData, setValidationErrors, setRowValidationStatus, validateRow]);
// Apply template to selected rows
const applyTemplateToSelected = useCallback((templateId: string) => {
@@ -1349,7 +1357,7 @@ export const useValidationState = <T extends string>({
if (onNext) {
// Remove metadata fields before passing to onNext
const cleanedData = data.map(row => {
const { __errors, __original, __corrected, __changes, ...cleanRow } = row;
const { __index, __template, __original, __corrected, __changes, ...cleanRow } = row;
return cleanRow as Data<T>;
});
@@ -1383,8 +1391,8 @@ export const useValidationState = <T extends string>({
console.log(`Validating ${data.length} rows`);
// Process in batches to avoid blocking the UI
const BATCH_SIZE = Math.min(100, Math.max(20, Math.floor(data.length / 10))); // Adaptive batch size
// Process in larger batches to improve performance
const BATCH_SIZE = Math.min(500, Math.max(100, Math.floor(data.length / 5))); // Larger adaptive batch size
const totalBatches = Math.ceil(data.length / BATCH_SIZE);
let currentBatch = 0;
@@ -1542,9 +1550,8 @@ export const useValidationState = <T extends string>({
console.log(`Batch ${currentBatch}/${totalBatches} completed in ${processingTime.toFixed(2)}ms`);
if (currentBatch < totalBatches) {
// Adaptive timeout based on processing time
const nextDelay = Math.min(50, Math.max(5, Math.ceil(processingTime / 10)));
setTimeout(processBatch, nextDelay);
// Process next batch immediately without delay
processBatch();
} else {
// All batches processed, update the data once
setData(newData);

View File

@@ -1,5 +1,5 @@
import { InfoWithSource } from "../../types"
export type Meta = { __index: string; __errors?: Error | null }
export type Meta = { __index: string }
export type Error = { [key: string]: InfoWithSource }
export type Errors = { [id: string]: Error }

View File

@@ -21,5 +21,4 @@ export interface Errors { [id: string]: ErrorType[] }
// Make our Meta type match the original for compatibility
export interface Meta {
__index?: string;
__errors?: Record<string, ErrorType[] | ErrorType | InfoWithSource | InfoWithSource[] | null>;
}

View File

@@ -6,7 +6,6 @@ import { ErrorSources, ErrorType } from "../../../types"
type DataWithMeta<T extends string> = Data<T> & Meta & {
__index?: string;
__errors?: Error | null;
}
export const addErrorsAndRunHooks = async <T extends string>(
@@ -136,41 +135,8 @@ export const addErrorsAndRunHooks = async <T extends string>(
result.__index = v4()
}
// If we are validating all indexes, or we did full validation on this row - apply all errors
if (!changedRowIndexes || changedRowIndexes.includes(index)) {
if (errors[index]) {
return { ...result, __errors: errors[index] }
}
if (!errors[index] && result.__errors) {
return { ...result, __errors: null }
}
}
// if we have not validated this row, keep its row errors but apply global error changes
else {
// at this point errors[index] contains only table source errors, previous row and table errors are in value.__errors
const hasRowErrors =
result.__errors && Object.values(result.__errors).some((error) => error.source === ErrorSources.Row)
if (!hasRowErrors) {
if (errors[index]) {
return { ...result, __errors: errors[index] }
}
return result
}
const errorsWithoutTableError = Object.entries(result.__errors!).reduce((acc, [key, value]) => {
if (value.source === ErrorSources.Row) {
acc[key] = value
}
return acc
}, {} as Error)
const newErrors = { ...errorsWithoutTableError, ...errors[index] }
return { ...result, __errors: newErrors }
}
// We no longer store errors in the row data
// The errors are now only stored in the validationErrors Map
return result
})
}

View File

@@ -190,7 +190,7 @@ export interface ValidationError {
Source determines whether the error is from the full table or row validation
Table validation is tableHook and "unique" validation
Row validation is rowHook and all other validations
it is used to determine if row.__errors should be updated or not depending on different validations
It is used to determine how errors should be stored in the validationErrors Map
*/
export type InfoWithSource = Info & {
source: ErrorSources;