
Make points upsert ordering deterministic#7130

Merged
agourlay merged 2 commits into dev from conserve-initial-points-ordering-on-insert on Aug 25, 2025

Conversation

@agourlay
Member

This PR makes the point ordering on upsert deterministic by preserving the order provided by the user.

This improves the debugging and testing experience while taking us one more step towards a more deterministic code base.

@agourlay agourlay requested a review from timvisee August 25, 2025 10:34
Member

@KShivendu KShivendu left a comment


Good idea. Thanks for the insights!

@qdrant qdrant deleted a comment from coderabbitai bot Aug 25, 2025
@coderabbitai
Contributor

coderabbitai bot commented Aug 25, 2025

📝 Walkthrough

  • Replaces two-step construction of points map and derived ids with a single-pass build that outputs (ordered ids, per-id points map), preserving input order.
  • Updates the upsert path to use ordered ids with apply_points_with_conditional_move, and uses the per-id data map for upsert_with_payload.
  • Inserts new points afterward into the smallest appendable segment, with a guard to ensure a write segment exists.
  • Adds a comment noting the conservation of initial point order.
  • No changes to exported/public signatures.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (2)
lib/collection/src/collection_manager/segments_updater.rs (2)

575-597: Avoid duplicate inserts and skewed “updated” count when input contains duplicate IDs

When the input has duplicate point IDs that don’t yet exist, new_point_ids currently yields duplicates. That leads to inserting the same point multiple times and incorrectly increasing the “updated” count (the second insert becomes a replace). Deduplicate new IDs stably before the insertion loop.

I recommend adding a test that upserts duplicates like [1, 1, 2, 1] and asserts: (a) only one insert for 1 happens, (b) final state matches the last occurrence, (c) updated count isn’t inflated.

Apply this diff:

-    // Insert new points, which was not updated or existed
-    let new_point_ids = ids.iter().copied().filter(|x| !updated_points.contains(x));
+    // Insert new points, which were not updated or existed (stable de-dup to avoid double inserts)
+    let mut seen_new = AHashSet::with_capacity(ids.len());
+    let new_point_ids: Vec<PointIdType> = ids
+        .iter()
+        .copied()
+        .filter(|id| !updated_points.contains(id))
+        .filter(|id| seen_new.insert(*id))
+        .collect();
@@
-        for point_id in new_point_ids {
+        for point_id in new_point_ids {
             let point = points_map[&point_id];
             res += usize::from(upsert_with_payload(
                 &mut write_segment,
                 op_num,
                 point_id,
                 point.get_vectors(),
                 point.payload.as_ref(),
                 hw_counter,
             )?);
         }
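The stable de-duplication the comment proposes can be exercised in isolation. A minimal sketch, with `dedup_new_ids` as an illustrative stand-in for the suggested filter chain, `u64` standing in for `PointIdType`, and `std::collections::HashSet` in place of `AHashSet` (the filter logic is identical):

```rust
use std::collections::HashSet;

// Keep only the first occurrence of each id that is not already updated,
// preserving input order (stable de-dup).
fn dedup_new_ids(ids: &[u64], updated_points: &HashSet<u64>) -> Vec<u64> {
    let mut seen_new = HashSet::with_capacity(ids.len());
    ids.iter()
        .copied()
        .filter(|id| !updated_points.contains(id))
        .filter(|id| seen_new.insert(*id))
        .collect()
}

fn main() {
    // The review's suggested test shape: upsert duplicates like [1, 1, 2, 1]
    // where 2 already exists; only one insert for 1 should remain.
    let updated: HashSet<u64> = [2].into_iter().collect();
    assert_eq!(dedup_new_ids(&[1, 1, 2, 1], &updated), vec![1]);

    // With nothing updated, order of first occurrences is preserved.
    let none: HashSet<u64> = HashSet::new();
    assert_eq!(dedup_new_ids(&[3, 1, 3, 2], &none), vec![3, 1, 2]);
}
```

Because `points_map` is last-write-wins, the single insert per id still carries the last occurrence's data, matching the "final state matches the last occurrence" expectation above.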

581-582: Nit: grammar in error message

Plural agreement: “segments exist”.

-            CollectionError::service_error("No appendable segments exists, expected at least one")
+            CollectionError::service_error("No appendable segments exist, expected at least one")

📥 Commits

Reviewing files that changed from the base of the PR and between 4e8e9da and 5c41af0.

📒 Files selected for processing (1)
  • lib/collection/src/collection_manager/segments_updater.rs (1 hunks)

Comment on lines +541 to 546

    // Conserve initial order of points
    let (ids, points_map) = points
        .into_iter()
        .map(|p| (p.id, (p.id, p)))
        .collect::<(Vec<_>, AHashMap<_, _>)>();


⚠️ Potential issue

Fix: collect::<(Vec<_>, AHashMap<_, _>)>() will not compile; use unzip() with explicit types

Iterator::collect produces a single collection. There is no FromIterator for tuples of collections, so this won’t compile. Use Iterator::unzip with an explicit type to build (Vec<PointIdType>, AHashMap<PointIdType, &PointStructPersisted>) in one pass while preserving input order for ids and last-write-wins for the map.

Apply this diff:

-    // Conserve initial order of points
-    let (ids, points_map) = points
-        .into_iter()
-        .map(|p| (p.id, (p.id, p)))
-        .collect::<(Vec<_>, AHashMap<_, _>)>();
+    // Conserve initial order of points (ids); points_map is last-write-wins on duplicates
+    let (ids, points_map): (Vec<PointIdType>, AHashMap<PointIdType, &'a PointStructPersisted>) =
+        points
+            .into_iter()
+            .map(|p| (p.id, (p.id, p)))
+            .unzip();
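The `unzip`-based build can be verified standalone. A minimal sketch, with `std::collections::HashMap` standing in for `AHashMap` (both implement `Extend`, which is what `unzip` needs) and plain `(u64, &str)` tuples standing in for `PointStructPersisted`:

```rust
use std::collections::HashMap;

// Single pass over the input: `ids` keeps the input order (duplicates
// included), while the map is last-write-wins on duplicate ids.
fn build_ordered(points: Vec<(u64, &'static str)>) -> (Vec<u64>, HashMap<u64, &'static str>) {
    points.into_iter().map(|(id, p)| (id, (id, p))).unzip()
}

fn main() {
    let (ids, points_map) = build_ordered(vec![(3, "c"), (1, "a"), (3, "z")]);
    assert_eq!(ids, vec![3, 1, 3]); // input order conserved
    assert_eq!(points_map[&3], "z"); // last write wins in the map
}
```

`unzip` works here because it splits an iterator of pairs into two collections built via `Default` + `Extend`; a tuple of collections has no `FromIterator` impl, which is why the `collect` form fails to compile.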

@agourlay agourlay requested a review from timvisee August 25, 2025 12:47
@agourlay agourlay merged commit 1e81d4f into dev Aug 25, 2025
16 checks passed
@agourlay agourlay deleted the conserve-initial-points-ordering-on-insert branch August 25, 2025 12:49
timvisee pushed a commit that referenced this pull request Aug 26, 2025
* Make points upsert ordering deterministic

* use Iterator magic (thx Tim)

3 participants