Payload CMS UUID already exists error even though field is unique #16257
17aakanksha17 asked this question in Q&A · Unanswered · 0 replies
I have been working with Payload for a while, but I encounter this issue almost every time. When I add data to a component, it works initially, but after a few entries it throws an error saying that the UUID already exists, even though the UUID field is unique. After that, even if I delete the entry it still doesn’t work, and I have to delete the entire component.
The error is: “The following field is invalid: id.”
Unable to push
Here is the exact chain of events:
Step 1 — You add a new link row and click Publish
Payload generates a UUID (id) for each array item (e.g. id: "abc-123" for the new link row) and bundles them into the PATCH request body.
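A minimal sketch of this step (the field and block names are taken from the table names later in this post and are otherwise hypothetical; Payload's real internals differ): the row id is minted once when the row is added to form state, so every PATCH built from that state, including a retry, carries the same id.

```typescript
// Toy model of a form assigning a stable UUID to each new array row.
import { randomUUID } from "crypto";

type LinkRow = { id: string; label: string; url: string };

function addLinkRow(rows: LinkRow[], url: string): LinkRow[] {
  // The id is generated once, when the row is added to form state...
  return [...rows, { id: randomUUID(), label: "", url }];
}

function buildPatchBody(rows: LinkRow[]) {
  // ...and every PATCH built from that state carries the same ids.
  return { blocks: [{ blockType: "toute_header", links: rows }] };
}

const rows = addLinkRow([], "https://example.com");
const firstAttempt = buildPatchBody(rows);
const retry = buildPatchBody(rows); // retry after a validation failure

console.log(firstAttempt.blocks[0].links[0].id === retry.blocks[0].links[0].id); // true
```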
Step 2 — Payload starts a DB transaction
It begins writing the new published version:
Inserts row into _clinic_header_v → gets version id = 10
Inserts block row into _clinic_header_v_blocks_toute_header → gets id = 5
Inserts link row into _clinic_header_v_blocks_toute_header_links with id = "abc-123"
Step 3 — Required validation fires and fails
label is empty → Payload throws a validation error and rolls back the transaction.
The rows are deleted, but Postgres sequences are never rolled back: the sequence counters for those tables remain permanently advanced (past 5, 10, etc.).
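The non-transactional behavior of sequences can be mimicked with a toy in-memory model (a sketch, not Postgres itself): the rollback restores the rows, but the counter keeps its place.

```typescript
// Toy model: like Postgres nextval, the counter advances even if the
// surrounding transaction later rolls back.
class ToyTable {
  private seq = 0;
  rows = new Map<number, string>();

  insert(value: string): number {
    this.seq += 1; // advances unconditionally, like a real sequence
    this.rows.set(this.seq, value);
    return this.seq;
  }

  rollback(ids: number[]) {
    for (const id of ids) this.rows.delete(id); // the rows disappear...
    // ...but this.seq is deliberately left alone, like a real sequence
  }
}

const versions = new ToyTable();
const written: number[] = [];
for (let i = 0; i < 10; i++) written.push(versions.insert(`draft ${i}`));
versions.rollback(written); // validation failed: undo the writes

console.log(versions.rows.size); // 0: the rows are gone
const retryId = versions.insert("retry");
console.log(retryId); // 11: the counter kept its place
```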
Step 4 — You fill in the label and try to Publish again
Payload sends the exact same block data — same UUIDs generated client-side from the same form state, including id: "abc-123" for the link row.
Payload now tries to write again. The version row gets id = 11 this time (the sequence advanced), the block row gets id = 6, and it tries to insert the link row with id = "abc-123".
But "abc-123" already exists — it was written in step 2 before the rollback... or was it?
The real culprit — autosave race
While the failed publish was happening, autosave fired independently (every 3 seconds, before we raised the interval to 60 seconds) and successfully wrote its own draft version with the same block IDs from the same form state, including id: "abc-123". Autosave succeeded because it doesn't enforce required; it's just a draft save.
Now the DB has "abc-123" in the autosave version. When publish tries to write "abc-123" into the same version sub-table, Postgres sees a duplicate → "Value must be unique on id".
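The collision itself can be sketched with the same kind of toy model (real enforcement is Postgres's unique index on the version sub-table): autosave lands the id first, then the retried publish trips over it.

```typescript
// Toy unique index: the first writer of an id wins and the second gets an
// error, mirroring "Value must be unique on id" from Postgres.
class UniqueIdTable {
  private ids = new Set<string>();

  insert(id: string) {
    if (this.ids.has(id)) throw new Error(`Value must be unique on id: ${id}`);
    this.ids.add(id);
  }
}

const linkVersions = new UniqueIdTable();

// Autosave fires during the failed publish and writes the draft row first.
linkVersions.insert("abc-123");

// The retried publish sends the same client-generated id...
let publishError: string | null = null;
try {
  linkVersions.insert("abc-123");
} catch (e) {
  publishError = (e as Error).message;
}

console.log(publishError); // "Value must be unique on id: abc-123"
```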
Summary
required: true → validation fails on publish → partial DB writes happen (or autosave races in) → same UUIDs are attempted again → unique constraint violation on block/item id.
The fix is simple: don't set required: true inside array items, so publish validation never fails mid-write and the race never starts.
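In Payload field-config terms (a sketch; the field and block names are inferred from the table names above and may not match the real config), that means dropping required: true from fields inside the array, and moving any row check somewhere it can't race autosave, such as a hypothetical helper run from a publish-time hook:

```typescript
// Sketch of the array field without `required: true` on its children.
const linksField = {
  name: "links",
  type: "array",
  fields: [
    { name: "label", type: "text" }, // no `required: true`: publish can't
    { name: "url", type: "text" },   // fail mid-write over an empty label
  ],
} as const;

// Hypothetical helper: collect warnings instead of failing the write, so it
// can be surfaced from a hook that only runs on publish.
function warnOnEmptyLabels(rowsToCheck: Array<{ label?: string }>): string[] {
  return rowsToCheck
    .map((row, i) => (row.label ? null : `links[${i}]: label is empty`))
    .filter((m): m is string => m !== null);
}

const warnings = warnOnEmptyLabels([{ label: "Home" }, { label: "" }]);
console.log(warnings); // one warning, for the empty second row
```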