
crimson/osd: Introduce maybe_handle_osd_maps() #51860

Closed
Matan-B wants to merge 9 commits into ceph:main from Matan-B:wip-matanb-crimson-maybe-handle-osdmap

Conversation


Matan-B (Contributor) commented May 31, 2023

  • Before the change:

    handled osd_map messages may have overlapping epoch ranges.
    Although we try to avoid processing the same map more than once,
    we can only be sure that it is safe to skip an already handled
    osdmap *after* it has been processed (and written to the superblock).
    
  • Logs:

INFO  2023-05-15 12:28:50,249 [shard 0] osd - handle_osd_map epochs [1..5], i have 0, src has [1..5]                                                                       
INFO  2023-05-15 12:28:51,245 [shard 0] osd - handle_osd_map epochs [5..6], i have 5, src has [1..6]                                                                       
INFO  2023-05-15 12:28:51,246 [shard 0] osd - handle_osd_map epochs [6..6], i have 6, src has [1..6]                                                                       
INFO  2023-05-15 12:28:55,309 [shard 0] osd - handle_osd_map epochs [6..7], i have 6, src has [1..7]                                                                       
INFO  2023-05-15 12:28:58,339 [shard 0] osd - handle_osd_map epochs [7..8], i have 7, src has [1..8]                                                                       
INFO  2023-05-15 12:29:04,389 [shard 0] osd - handle_osd_map epochs [8..10], i have 8, src has [8..10]                                                                     
INFO  2023-05-15 12:29:04,390 [shard 0] osd - handle_osd_map epochs [8..10], i have 10, src has [8..10]                                                                    
INFO  2023-05-15 12:29:08,256 [shard 0] osd - handle_osd_map epochs [10..12], i have 10, src has [10..12]
INFO  2023-05-15 12:29:08,257 [shard 0] osd - handle_osd_map epochs [10..12], i have 12, src has [10..12]
INFO  2023-05-15 12:29:08,367 [shard 0] osd - handle_osd_map epochs [12..13], i have 12, src has [1..13]
INFO  2023-05-15 12:29:09,368 [shard 0] osd - handle_osd_map epochs [13..14], i have 13, src has [1..14]
INFO  2023-05-15 12:29:14,256 [shard 0] osd - handle_osd_map epochs [14..15], i have 14, src has [14..15]
INFO  2023-05-15 12:29:14,257 [shard 0] osd - handle_osd_map epochs [14..15], i have 15, src has [14..15]
INFO  2023-05-15 12:29:14,259 [shard 0] osd - handle_osd_map epochs [14..16], i have 15, src has [14..16]
INFO  2023-05-15 12:29:14,261 [shard 0] osd - handle_osd_map epochs [14..16], i have 16, src has [14..16]
INFO  2023-05-15 12:29:18,430 [shard 0] osd - handle_osd_map epochs [16..17], i have 16, src has [16..17]
INFO  2023-05-15 12:29:18,431 [shard 0] osd - handle_osd_map epochs [16..17], i have 17, src has [16..17]
INFO  2023-05-15 12:29:25,405 [shard 0] osd - handle_osd_map epochs [17..18], i have 17, src has [1..18]
INFO  2023-05-15 12:29:26,409 [shard 0] osd - handle_osd_map epochs [19..19], i have 18, src has [1..19]
INFO  2023-05-15 12:29:26,411 [shard 0] osd - handle_osd_map epochs [18..19], i have 19, src has [1..19]
INFO  2023-05-15 12:29:27,411 [shard 0] osd - handle_osd_map epochs [20..20], i have 19, src has [1..20]
INFO  2023-05-15 12:29:27,452 [shard 0] osd - handle_osd_map epochs [19..20], i have 20, src has [1..20]
INFO  2023-05-15 12:29:29,360 [shard 0] osd - handle_osd_map epochs [20..21], i have 20, src has [20..21]
INFO  2023-05-15 12:29:29,361 [shard 0] osd - handle_osd_map epochs [20..21], i have 21, src has [20..21]
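The duplicated ranges above (e.g. `[8..10]` handled twice in a row) can be reproduced with a small sketch. This is illustrative only, not the crimson code path: `epochs_to_handle` stands in for the old behavior of processing a message's full advertised range.

```python
def epochs_to_handle(first, last):
    """Old behavior (sketch): the full advertised [first..last] range
    is processed, even when part of it was already handled."""
    return list(range(first, last + 1))

# Two messages advertising intersecting ranges, as in the log:
# "epochs [8..10], i have 8" followed by "epochs [8..10], i have 10".
msg1 = epochs_to_handle(8, 10)
msg2 = epochs_to_handle(8, 10)

# Every epoch in the intersection is processed more than once.
overlap = sorted(set(msg1) & set(msg2))
print(overlap)  # [8, 9, 10]
```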

  • After the change:

    each handled epoch range is now exclusive, since already handled epochs are omitted earlier (in `calc_new_maps`)
    
  • Logs:

INFO  2023-05-31 14:48:53,777 [shard 0] osd - handle_osd_map epochs [1..5], i have 0, src has [1..5]                                                                       
INFO  2023-05-31 14:48:54,766 [shard 0] osd - handle_osd_map epochs [6..6], i have 5, src has [1..6]                                                                       
INFO  2023-05-31 14:48:58,558 [shard 0] osd - handle_osd_map epochs [7..7], i have 6, src has [1..7]                                                                       
INFO  2023-05-31 14:49:01,866 [shard 0] osd - handle_osd_map epochs [8..8], i have 7, src has [1..8]                                                                       
INFO  2023-05-31 14:49:07,948 [shard 0] osd - handle_osd_map epochs [9..10], i have 8, src has [8..10]                                                                     
INFO  2023-05-31 14:49:11,784 [shard 0] osd - handle_osd_map epochs [11..12], i have 10, src has [10..12]                                                                  
INFO  2023-05-31 14:49:12,780 [shard 0] osd - handle_osd_map epochs [13..13], i have 12, src has [1..13]                                                                   
INFO  2023-05-31 14:49:13,781 [shard 0] osd - handle_osd_map epochs [14..14], i have 13, src has [1..14]                                                                   
INFO  2023-05-31 14:49:17,784 [shard 0] osd - handle_osd_map epochs [15..15], i have 14, src has [14..15]
INFO  2023-05-31 14:49:17,788 [shard 0] osd - handle_osd_map epochs [16..16], i have 15, src has [14..16]
INFO  2023-05-31 14:49:22,862 [shard 0] osd - handle_osd_map epochs [17..17], i have 16, src has [16..17]
INFO  2023-05-31 14:50:58,018 [shard 0] osd - handle_osd_map epochs [18..18], i have 17, src has [1..18]
INFO  2023-05-31 14:50:59,021 [shard 0] osd - handle_osd_map epochs [19..19], i have 18, src has [1..19]
INFO  2023-05-31 14:51:04,816 [shard 0] osd - handle_osd_map epochs [20..20], i have 19, src has [19..20]
INFO  2023-05-31 14:51:05,785 [shard 0] osd - handle_osd_map epochs [21..21], i have 20, src has [20..21]
INFO  2023-05-31 14:51:22,016 [shard 0] osd - handle_osd_map epochs [22..22], i have 21, src has [1..22]
INFO  2023-05-31 14:51:23,067 [shard 0] osd - handle_osd_map epochs [23..23], i have 22, src has [1..23]
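A minimal sketch of the clamping described above. The real logic lives in crimson's `calc_new_maps`; the function and parameter names here are illustrative, with `newest_map` standing in for the superblock's newest handled epoch.

```python
def calc_new_range(msg_first, msg_last, newest_map):
    """Return the exclusive range of genuinely new epochs to handle,
    or None when the message carries nothing new."""
    start = max(msg_first, newest_map + 1)
    if start > msg_last:
        return None  # everything already handled; safe to skip
    return (start, msg_last)

# "i have 8, src has [8..10]" -> only [9..10] is handled.
print(calc_new_range(8, 10, 8))   # (9, 10)
# A fully stale message is skipped outright.
print(calc_new_range(8, 10, 12))  # None
```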

This also results in exclusive PGAdvanceMap `from` and `to` epochs:

DEBUG 2023-05-31 14:11:12,742 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=443 to=444)): created
DEBUG 2023-05-31 14:11:13,607 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=444 to=445)): created
DEBUG 2023-05-31 14:11:14,603 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=445 to=446)): created
DEBUG 2023-05-31 14:11:15,615 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=446 to=447)): created
DEBUG 2023-05-31 14:11:17,304 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=447 to=448)): created
DEBUG 2023-05-31 14:11:17,633 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=448 to=449)): created
DEBUG 2023-05-31 14:11:18,627 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=449 to=450)): created
DEBUG 2023-05-31 14:11:19,633 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=450 to=451)): created
DEBUG 2023-05-31 14:11:20,706 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=451 to=452)): created
DEBUG 2023-05-31 14:11:21,638 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=452 to=453)): created
DEBUG 2023-05-31 14:11:22,875 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=453 to=454)): created
DEBUG 2023-05-31 14:11:24,413 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=454 to=455)): created
DEBUG 2023-05-31 14:11:24,692 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=455 to=456)): created
DEBUG 2023-05-31 14:11:26,708 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=456 to=458)): created
DEBUG 2023-05-31 14:11:27,710 [shard 0] osd - pg_advance_map(id=0, detail=PGAdvanceMap(pg=1.0 from=458 to=459)): created
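The stepping shown in those log lines (each `from` equals the previous `to`, so ranges never overlap) can be sketched as follows; this is an illustration only, not the actual PGAdvanceMap operation:

```python
def advance_pg(from_epoch, to_epoch):
    """Yield each map epoch a PG consumes: exclusive of from_epoch
    (already applied) and inclusive of to_epoch."""
    for e in range(from_epoch + 1, to_epoch + 1):
        yield e

# from=443 to=444 applies exactly one new map: epoch 444.
print(list(advance_pg(443, 444)))  # [444]
# from=456 to=458 applies epochs 457 and 458.
print(list(advance_pg(456, 458)))  # [457, 458]
```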

Fixes: https://tracker.ceph.com/issues/59165

Contribution Guidelines

Checklist

  • Tracker (select at least one)
    • References tracker ticket
    • Very recent bug; references commit where it was introduced
    • New feature (ticket optional)
    • Doc update (no ticket needed)
    • Code cleanup (no ticket needed)
  • Component impact
    • Affects Dashboard, opened tracker ticket
    • Affects Orchestrator, opened tracker ticket
    • No impact that needs to be tracked
  • Documentation (select at least one)
    • Updates relevant documentation
    • No doc update is appropriate
  • Tests (select at least one)

Matan-B added 9 commits May 31, 2023 14:55
Following 3642364, it is safe to assume that we handle
exclusive osdmaps. As a result, we are able to schedule a
PGAdvanceMap with a bounded and exclusive range.

Signed-off-by: Matan Breizman <mbreizma@redhat.com>
Signed-off-by: Matan Breizman <mbreizma@redhat.com>
Signed-off-by: Matan Breizman <mbreizma@redhat.com>
Added logs
`if` case reconstructed; epoch is unsigned.

Signed-off-by: Matan Breizman <mbreizma@redhat.com>
…ional

Signed-off-by: Matan Breizman <mbreizma@redhat.com>
Signed-off-by: Matan Breizman <mbreizma@redhat.com>
Signed-off-by: Matan Breizman <mbreizma@redhat.com>
Signed-off-by: Matan Breizman <mbreizma@redhat.com>
Matan-B commented Jun 1, 2023

Matan-B commented Jun 7, 2023

Closed in favor of: #51961

Matan-B closed this Jun 7, 2023