
test/librados/aio_cxx: Fix Pool EIO flag tests #49278

Merged
ljflores merged 1 commit into ceph:main from
Matan-B:wip-matanb-EIO-flag-fix
Dec 12, 2022

Conversation

@Matan-B (Contributor) commented Dec 6, 2022

  • When tested with multiple OSDs, the SimplePoolEIO test fails because the order of error reply messages is not deterministic; the test is removed.

  • The timeout was adjusted so the test cannot hang indefinitely.

Fixes: https://tracker.ceph.com/issues/58173

Signed-off-by: Matan Breizman mbreizma@redhat.com


github-actions bot added the tests label Dec 6, 2022
@Matan-B (Contributor, Author) commented Dec 6, 2022

@ljflores (Member) commented Dec 6, 2022

I think we should put this through a full rados suite run, just to say we covered all the bases. @neha-ojha @Matan-B let me know if you guys think the above tests are sufficient though.

@ljflores (Member) commented Dec 6, 2022

Also @Matan-B does this require any crimson verification?

@athanatos (Contributor) commented

@ljflores Let's get the normal classic suite passing first, we can circle back to crimson afterwards.

@Matan-B (Contributor, Author) commented Dec 7, 2022

> Also @Matan-B does this require any crimson verification?

Crimson:
https://pulpito.ceph.com/matan-2022-12-06_18:05:44-crimson-rados-main-distro-crimson-smithi/

@ljflores (Member) commented Dec 12, 2022

Rados suite review: https://pulpito.ceph.com/?branch=wip-yuri3-testing-2022-12-06-1211

Failures (all unrelated to this PR):
1. https://tracker.ceph.com/issues/57311
2. https://tracker.ceph.com/issues/58098
3. https://tracker.ceph.com/issues/58096
4. https://tracker.ceph.com/issues/52321
5. https://tracker.ceph.com/issues/57731
6. https://tracker.ceph.com/issues/57546
7. https://tracker.ceph.com/issues/58256

Details:
1. rook: ensure CRDs are installed first - Ceph - Orchestrator
2. qa/workunits/rados/test_crash.sh: crashes are never posted - Ceph - RADOS
3. test_cluster_set_reset_user_config: NFS mount fails due to missing ceph directory - Ceph - Orchestrator
4. qa/tasks/rook times out: 'check osd count' reached maximum tries (90) after waiting for 900 seconds - Ceph - Orchestrator
5. Problem: package container-selinux conflicts with udica < 0.2.6-1 provided by udica-0.2.4-1 - Infrastructure
6. rados/thrash-erasure-code: wait_for_recovery timeout due to "active+clean+remapped+laggy" pgs - Ceph - RADOS
7. ObjectStore/StoreTestSpecificAUSize.SpilloverTest/2: Expected: (logger->get(l_bluefs_slow_used_bytes)) >= (16 * 1024 * 1024), actual: 0 vs 16777216 - Ceph - RADOS

@ljflores ljflores merged commit 1ef4b49 into ceph:main Dec 12, 2022