Bug #66627
[crimson] OSD device class not being displayed in ceph osd tree output
Status: Closed
Description
It has long been observed that the `ceph osd tree` output in Crimson builds does not display the CLASS column for deployed OSDs.
Crimson Build - https://shaman.ceph.com/repos/ceph/main/959f007d8363ae90c178cd7468327630cff19e73/crimson/322105/
[root@bruuni011 cephadm]# ceph osd tree
2024-06-21T01:35:59.244+0000 7ff5f2e4b640 -1 WARNING: the following dangerous and experimental features are enabled: crimson
2024-06-21T01:35:59.244+0000 7ff5f2e4b640 -1 WARNING: the following dangerous and experimental features are enabled: crimson
ID  CLASS  WEIGHT    TYPE NAME           STATUS  REWEIGHT  PRI-AFF
-1         27.65482  root default
-6          4.07547      host bruuni003
 6          1.35849          osd.6           up   1.00000  1.00000
13          1.35849          osd.13          up   1.00000  1.00000
19          1.35849          osd.19          up   1.00000  1.00000
-4          4.07547      host bruuni004
 0          1.35849          osd.0           up   1.00000  1.00000
 7          1.35849          osd.7           up   1.00000  1.00000
15          1.35849          osd.15          up   1.00000  1.00000
-5          4.07547      host bruuni006
 3          1.35849          osd.3           up   1.00000  1.00000
10          1.35849          osd.10          up   1.00000  1.00000
16          1.35849          osd.16          up   1.00000  1.00000
-8          4.07547      host bruuni007
 5          1.35849          osd.5           up   1.00000  1.00000
12          1.35849          osd.12          up   1.00000  1.00000
18          1.35849          osd.18          up   1.00000  1.00000
-3          4.07547      host bruuni008
 2          1.35849          osd.2           up   1.00000  1.00000
 8          1.35849          osd.8           up   1.00000  1.00000
14          1.35849          osd.14          up   1.00000  1.00000
-2          3.20200      host bruuni010
 1          1.60100          osd.1           up   1.00000  1.00000
 9          1.60100          osd.9           up   1.00000  1.00000
-7          4.07547      host bruuni012
 4          1.35849          osd.4           up   1.00000  1.00000
11          1.35849          osd.11          up   1.00000  1.00000
17          1.35849          osd.17          up   1.00000  1.00000
Crimson version
[root@bruuni011 ~]# cephadm shell -- ceph versions
Inferring fsid 5af11440-2f67-11ef-8994-0cc47af96462
Inferring config /var/lib/ceph/5af11440-2f67-11ef-8994-0cc47af96462/mon.bruuni011/config
Using ceph image with id '2eb5b9b9dec4' and tag '959f007d8363ae90c178cd7468327630cff19e73-crimson' created on 2024-06-19 22:08:49 +0000 UTC
quay.ceph.io/ceph-ci/ceph@sha256:ec558a0c745d88b7c3d182bf971d9adbb13337bc8fedc887ddba296f2a6b264f
2024-06-24T01:58:52.661+0000 7ff327bf3640 -1 WARNING: the following dangerous and experimental features are enabled: crimson
2024-06-24T01:58:52.662+0000 7ff327bf3640 -1 WARNING: the following dangerous and experimental features are enabled: crimson
{
"mon": {
"ceph version 19.0.0-4410-g959f007d (959f007d8363ae90c178cd7468327630cff19e73) squid (dev)": 3
},
"mgr": {
"ceph version 19.0.0-4410-g959f007d (959f007d8363ae90c178cd7468327630cff19e73) squid (dev)": 3
},
"osd": {
"ceph version 19.0.0-4410-g959f007d (959f007d8363ae90c178cd7468327630cff19e73) squid (dev)": 20
},
"rgw": {
"ceph version 19.0.0-4410-g959f007d (959f007d8363ae90c178cd7468327630cff19e73) squid (dev)": 7
},
"overall": {
"ceph version 19.0.0-4410-g959f007d (959f007d8363ae90c178cd7468327630cff19e73) squid (dev)": 33
}
}
By contrast, the OSD class is displayed correctly by a RHCS Reef build on the same set of hardware:
[root@bruuni011 cephadm]# cephadm shell -- ceph versions
Inferring fsid 8ca8b7d4-31d0-11ef-9462-0cc47af96462
Inferring config /var/lib/ceph/8ca8b7d4-31d0-11ef-9462-0cc47af96462/mon.bruuni011/config
Using ceph image with id '5412073bd769' and tag 'latest' created on 2024-05-31 19:37:19 +0000 UTC
registry.redhat.io/rhceph/rhceph-7-rhel9@sha256:579e5358418e176194812eeab523289a0c65e366250688be3f465f1a633b026d
{
"mon": {
"ceph version 18.2.1-194.el9cp (04a992766839cd3207877e518a1238cdbac3787e) reef (stable)": 3
},
"mgr": {
"ceph version 18.2.1-194.el9cp (04a992766839cd3207877e518a1238cdbac3787e) reef (stable)": 3
},
"osd": {
"ceph version 18.2.1-194.el9cp (04a992766839cd3207877e518a1238cdbac3787e) reef (stable)": 20
},
"overall": {
"ceph version 18.2.1-194.el9cp (04a992766839cd3207877e518a1238cdbac3787e) reef (stable)": 26
}
}
[root@bruuni011 cephadm]# ceph osd tree
ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF
-1 27.65482 root default
-5 4.07547 host bruuni003
4 ssd 1.35849 osd.4 up 1.00000 1.00000
11 ssd 1.35849 osd.11 up 1.00000 1.00000
16 ssd 1.35849 osd.16 up 1.00000 1.00000
-13 4.07547 host bruuni004
3 ssd 1.35849 osd.3 up 1.00000 1.00000
8 ssd 1.35849 osd.8 up 1.00000 1.00000
14 ssd 1.35849 osd.14 up 1.00000 1.00000
-15 4.07547 host bruuni006
2 ssd 1.35849 osd.2 up 1.00000 1.00000
9 ssd 1.35849 osd.9 up 1.00000 1.00000
15 ssd 1.35849 osd.15 up 1.00000 1.00000
-11 4.07547 host bruuni007
5 ssd 1.35849 osd.5 up 1.00000 1.00000
12 ssd 1.35849 osd.12 up 1.00000 1.00000
18 ssd 1.35849 osd.18 up 1.00000 1.00000
-7 4.07547 host bruuni008
6 ssd 1.35849 osd.6 up 1.00000 1.00000
13 ssd 1.35849 osd.13 up 1.00000 1.00000
19 ssd 1.35849 osd.19 up 1.00000 1.00000
-3 3.20200 host bruuni010
0 ssd 1.60100 osd.0 up 1.00000 1.00000
7 ssd 1.60100 osd.7 up 1.00000 1.00000
-9 4.07547 host bruuni012
1 ssd 1.35849 osd.1 up 1.00000 1.00000
10 ssd 1.35849 osd.10 up 1.00000 1.00000
17 ssd 1.35849 osd.17 up 1.00000 1.00000
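Beyond eyeballing the table, the discrepancy can be checked programmatically. Below is a minimal sketch that parses `ceph osd tree -f json` output and lists OSDs with no device class set. The `nodes`/`type`/`device_class` field names match the real JSON format of that command; the `SAMPLE` data itself is made up for illustration and does not come from the clusters above.

```python
import json

# Illustrative stand-in for `ceph osd tree -f json` output (the values are
# invented; the field names follow the command's real JSON schema).
SAMPLE = """
{
  "nodes": [
    {"id": -1, "name": "default",   "type": "root", "children": [-6]},
    {"id": -6, "name": "bruuni003", "type": "host", "children": [6, 13]},
    {"id": 6,  "name": "osd.6",  "type": "osd", "device_class": "",    "status": "up"},
    {"id": 13, "name": "osd.13", "type": "osd", "device_class": "ssd", "status": "up"}
  ],
  "stray": []
}
"""

def osds_missing_class(tree_json: str) -> list:
    """Return names of OSD nodes whose device_class is absent or empty."""
    tree = json.loads(tree_json)
    return [n["name"] for n in tree.get("nodes", [])
            if n.get("type") == "osd" and not n.get("device_class")]

if __name__ == "__main__":
    # On an affected Crimson cluster every OSD shows up here; on the
    # Reef cluster above the list would be empty.
    print(osds_missing_class(SAMPLE))
```

On a live cluster one would feed it the output of `ceph osd tree -f json` instead of `SAMPLE`.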
Updated by Hanliang Xu over 1 year ago
Hey Harsh and Matan! As a first-time contributor to Ceph, I'd love to work on this. I've already gone through most of the Developer Guide and plan to submit a pull request for this by this weekend. Thank you!
Updated by kenan al-shamie over 1 year ago
Hey, I spoke to Hanliang on LinkedIn and he doesn't mind me taking over this, as he's having issues getting Ceph to run on his Mac. I'll likely start working on this now.
Updated by kenan al-shamie over 1 year ago
Gonna have to drop this due to other priorities coming up - this issue is now up for grabs again!
Updated by MOHIT AGRAWAL over 1 year ago
- Status changed from New to Fix Under Review
- Pull request ID set to 60747
Updated by MOHIT AGRAWAL over 1 year ago
- Assignee changed from MOHIT AGRAWAL to MOHIT AGRAWAL
Updated by MOHIT AGRAWAL over 1 year ago
- Status changed from Fix Under Review to Resolved
Updated by Upkeep Bot 8 months ago
- Merge Commit set to 3caa54235d723db575bf0e3fa3e3af00ebd37469
- Fixed In set to v19.3.0-6195-g3caa54235d7
- Upkeep Timestamp set to 2025-07-10T23:09:52+00:00
Updated by Upkeep Bot 8 months ago
- Fixed In changed from v19.3.0-6195-g3caa54235d7 to v19.3.0-6195-g3caa54235d
- Upkeep Timestamp changed from 2025-07-10T23:09:52+00:00 to 2025-07-14T22:42:42+00:00
Updated by Upkeep Bot 5 months ago
- Released In set to v20.2.0~1587
- Upkeep Timestamp changed from 2025-07-14T22:42:42+00:00 to 2025-11-01T01:17:22+00:00