feat(ui): Better many columns support in CSV tables import #2172
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 552de7d7e7
4 issues found across 8 files
Prompt for AI agents (all issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="frontend/src/lib/tables.ts">
<violation number="1" location="frontend/src/lib/tables.ts:69">
P2: Duplicate CSV headers overwrite the first mapping because `mapping[header]` is reassigned for every occurrence. The team's CSV import guidance is to drop duplicates and keep the first occurrence, so later duplicates should be ignored rather than replacing the initial mapping.
(Based on your team's feedback about dropping duplicate CSV headers.) [FEEDBACK_USED]</violation>
</file>
<file name="tracecat/tables/importer.py">
<violation number="1" location="tracecat/tables/importer.py:60">
P2: `csv.DictReader` emits a `None` key when a row has more values than headers (e.g., an unquoted comma in a cell). Calling `normalize_csv_header(None)` here will raise `AttributeError` because `str.lstrip()` is called on `None`. Filter out `None` keys before normalizing, e.g.:
```python
normalized_row = {
normalize_csv_header(key): value
for key, value in csv_row.items()
if key is not None
}
```</violation>
</file>
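For context, a minimal standalone repro of this failure mode (not the project's importer code; `key.strip()` stands in here for the project's `normalize_csv_header`):
```python
import csv
import io

# A row with more values than headers: csv.DictReader stores the
# overflow under the key None (the default restkey).
reader = csv.DictReader(io.StringIO("name,email\nalice,a@x.com,extra"))
row = next(reader)
print(row)  # {'name': 'alice', 'email': 'a@x.com', None: ['extra']}

# Filtering out the None key, as suggested above, avoids calling a
# string method on None during normalization.
normalized = {key.strip(): value for key, value in row.items() if key is not None}
print(normalized)  # {'name': 'alice', 'email': 'a@x.com'}
```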
<file name="tracecat/tables/router.py">
<violation number="1" location="tracecat/tables/router.py:745">
P2: Rejecting CSV files with duplicate headers conflicts with the CSV import requirement to drop duplicates and keep the first occurrence. This should discard duplicates instead of raising a validation error.
(Based on your team's feedback about CSV imports handling duplicate headers.) [FEEDBACK_USED]</violation>
</file>
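One way to implement the drop-duplicates guidance is a first-occurrence filter over the parsed header row (a sketch; `dedupe_headers` is a hypothetical helper, not an existing function in the codebase):
```python
def dedupe_headers(headers: list[str]) -> list[str]:
    """Keep the first occurrence of each header; drop later duplicates."""
    seen: set[str] = set()
    kept: list[str] = []
    for header in headers:
        if header not in seen:
            seen.add(header)
            kept.append(header)
    return kept

print(dedupe_headers(["id", "name", "id", "email"]))  # ['id', 'name', 'email']
```
Applying this before validation means duplicate-header files import cleanly instead of being rejected.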
<file name="frontend/src/components/tables/table-import-csv-dialog.tsx">
<violation number="1" location="frontend/src/components/tables/table-import-csv-dialog.tsx:130">
P2: Auto-mapping now fills `columnMapping` with "skip" entries for every header, which makes the form valid even when the user hasn’t mapped any column. This lets submissions through despite the “map at least one column” requirement. Tighten validation (or disable submit) to require at least one non-"skip" mapping.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
1 issue found across 6 files (changes from recent commits).
Prompt for AI agents (all issues)
<file name="tracecat/tables/router.py">
<violation number="1" location="tracecat/tables/router.py:750">
P3: Duplicate header columns should be dropped rather than preserved under synthetic keys; keeping them as hidden columns conflicts with the team’s CSV import guidance and can reintroduce unexpected data in rows.
(Based on your team's feedback about dropping duplicate CSV headers.) [FEEDBACK_USED]</violation>
</file>
Summary by cubic
Improves CSV import accuracy with fuzzy auto-mapping and stricter validation, reducing mapping errors and improving feedback. Also makes the insert-row dialog scrollable and keeps column mapping stable as files or schemas change.
Written for commit 90e5e96. Summary will update on new commits.