Bug #62934
Status: Closed
unittest_osdmap (Subprocess aborted) during OSDMapTest.BUG_42485
% Done: 100%
Description
full log: https://jenkins.ceph.com/job/ceph-pull-requests/122260/consoleText
205/283 Test #194: unittest_osdmap ...........................Subprocess aborted***Exception: 41.07 sec
[==========] Running 32 tests from 3 test suites.
[----------] Global test environment set-up.
[----------] 27 tests from OSDMapTest
[ RUN ] OSDMapTest.Create
[ OK ] OSDMapTest.Create (1 ms)
[ RUN ] OSDMapTest.Features
[ OK ] OSDMapTest.Features (1 ms)
[ RUN ] OSDMapTest.MapPG
osdmap.pool_max==2
[ OK ] OSDMapTest.MapPG (1 ms)
[ RUN ] OSDMapTest.MapFunctionsMatch
[ OK ] OSDMapTest.MapFunctionsMatch (0 ms)
[ RUN ] OSDMapTest.PrimaryIsFirst
[ OK ] OSDMapTest.PrimaryIsFirst (1 ms)
[ RUN ] OSDMapTest.PGTempRespected
[ OK ] OSDMapTest.PGTempRespected (1 ms)
[ RUN ] OSDMapTest.PrimaryTempRespected
[ OK ] OSDMapTest.PrimaryTempRespected (0 ms)
[ RUN ] OSDMapTest.CleanTemps
[ OK ] OSDMapTest.CleanTemps (1 ms)
[ RUN ] OSDMapTest.KeepsNecessaryTemps
[ OK ] OSDMapTest.KeepsNecessaryTemps (1 ms)
[ RUN ] OSDMapTest.PrimaryAffinity
pool 1 size 3 expect_primary 1666
any: [6095,5782,4528,5310,5003,3282]
first: [1875,1564,1562,2655,1406,938]
primary: [1875,1564,1562,2655,1406,938]
any: [6095,5782,4528,5310,5003,3282]
first: [1875,1564,1562,2655,1406,938]
primary: [0,0,2967,2968,2501,1564]
any: [6095,5782,4528,5310,5003,3282]
first: [1875,1564,1562,2655,1406,938]
primary: [938,0,2499,2812,2344,1407]
expect 1250
pool 2 size 3 expect_primary 1666
any: [5939,5937,3751,4374,4529,5470]
first: [1874,2031,1251,1408,1093,2343]
primary: [1874,2031,1251,1408,1093,2343]
any: [5939,5937,3751,4374,4529,5470]
first: [0,0,2033,2188,2498,3281]
primary: [0,0,2033,2188,2498,3281]
any: [5939,5937,3751,4374,4529,5470]
first: [1250,0,1720,1876,2186,2968]
primary: [1250,0,1720,1876,2186,2968]
expect 1250
[ OK ] OSDMapTest.PrimaryAffinity (870 ms)
[ RUN ] OSDMapTest.get_osd_crush_node_flags
[ OK ] OSDMapTest.get_osd_crush_node_flags (1 ms)
[ RUN ] OSDMapTest.parse_osd_id_list
Expected option value to be integer, got 'foo'invalid osd id 'foo'expected numerical value, got: -12invalid osd id '-12'[ OK ] OSDMapTest.parse_osd_id_list (0 ms)
[ RUN ] OSDMapTest.CleanPGUpmaps
2023-09-21T17:58:56.370+0000 7f7689961640 -1 verify_upmap multiple osds 2,3 come from same failure domain -5
2023-09-21T17:58:56.370+0000 7f7689961640 0 check_pg_upmaps verify_upmap of pg 3.0 returning -22
2023-09-21T17:58:56.374+0000 7f7689160640 -1 verify_upmap multiple osds 4,5 come from same failure domain -6
2023-09-21T17:58:56.374+0000 7f7689160640 0 check_pg_upmaps verify_upmap of pg 3.0 returning -22
[ OK ] OSDMapTest.CleanPGUpmaps (26 ms)
[ RUN ] OSDMapTest.BUG_38897
[ OK ] OSDMapTest.BUG_38897 (5 ms)
[ RUN ] OSDMapTest.BUG_40104
clean_pg_upmaps (~10000 pg_upmap_items) latency:6s
[ OK ] OSDMapTest.BUG_40104 (39942 ms)
[ RUN ] OSDMapTest.BUG_42052
[ OK ] OSDMapTest.BUG_42052 (3 ms)
[ RUN ] OSDMapTest.BUG_42485
pgid [26,12,54,46] up [26,12,54,46]
from 26 to 30
pgid 3.2 up [16,25,53,31]
from 16 to 40
from 31 to 0
pure virtual method called
terminate called without an active exception
*** Caught signal (Aborted) **
in thread 7f768895f640 thread_name:clean_upmap_tp
ceph version Development (no_version) reef (stable)
1: /home/jenkins-build/build/workspace/ceph-pull-requests/build/bin/unittest_osdmap(+0x2d70f3) [0x5620cad660f3]
2: /lib/x86_64-linux-gnu/libc.so.6(+0x42520) [0x7f76919f9520]
3: pthread_kill()
4: raise()
5: abort()
6: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xa2bfe) [0x7f7691d8abfe]
7: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae28c) [0x7f7691d9628c]
8: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae2f7) [0x7f7691d962f7]
9: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xaf025) [0x7f7691d97025]
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x16) [0x5620cacb65d6]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x7a5) [0x7f76939c54a5]
12: (ThreadPool::WorkThread::entry()+0x1a) [0x7f76939cb2fa]
13: (Thread::entry_wrapper()+0x84) [0x7f769399af14]
14: (Thread::_entry_func(void*)+0x15) [0x7f769399ae75]
15: /lib/x86_64-linux-gnu/libc.so.6(+0x94b43) [0x7f7691a4bb43]
16: /lib/x86_64-linux-gnu/libc.so.6(+0x126a00) [0x7f7691adda00]
2023-09-21T17:59:36.383+0000 7f768895f640 -1 *** Caught signal (Aborted) **
in thread 7f768895f640 thread_name:clean_upmap_tp
ceph version Development (no_version) reef (stable)
1: /home/jenkins-build/build/workspace/ceph-pull-requests/build/bin/unittest_osdmap(+0x2d70f3) [0x5620cad660f3]
2: /lib/x86_64-linux-gnu/libc.so.6(+0x42520) [0x7f76919f9520]
3: pthread_kill()
4: raise()
5: abort()
6: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xa2bfe) [0x7f7691d8abfe]
7: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae28c) [0x7f7691d9628c]
8: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae2f7) [0x7f7691d962f7]
9: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xaf025) [0x7f7691d97025]
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x16) [0x5620cacb65d6]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x7a5) [0x7f76939c54a5]
12: (ThreadPool::WorkThread::entry()+0x1a) [0x7f76939cb2fa]
13: (Thread::entry_wrapper()+0x84) [0x7f769399af14]
14: (Thread::_entry_func(void*)+0x15) [0x7f769399ae75]
15: /lib/x86_64-linux-gnu/libc.so.6(+0x94b43) [0x7f7691a4bb43]
16: /lib/x86_64-linux-gnu/libc.so.6(+0x126a00) [0x7f7691adda00]
NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this.
--- begin dump of recent events ---
-94> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command assert hook 0x5620cb15e600
-93> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command abort hook 0x5620cb15e600
-92> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command leak_some_memory hook 0x5620cb15e600
-91> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command perfcounters_dump hook 0x5620cb15e600
-90> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command 1 hook 0x5620cb15e600
-89> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command perf dump hook 0x5620cb15e600
-88> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command perfcounters_schema hook 0x5620cb15e600
-87> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command perf histogram dump hook 0x5620cb15e600
-86> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command 2 hook 0x5620cb15e600
-85> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command perf schema hook 0x5620cb15e600
-84> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command counter dump hook 0x5620cb15e600
-83> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command counter schema hook 0x5620cb15e600
-82> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command perf histogram schema hook 0x5620cb15e600
-81> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command perf reset hook 0x5620cb15e600
-80> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command config show hook 0x5620cb15e600
-79> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command config help hook 0x5620cb15e600
-78> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command config set hook 0x5620cb15e600
-77> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command config unset hook 0x5620cb15e600
-76> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command config get hook 0x5620cb15e600
-75> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command config diff hook 0x5620cb15e600
-74> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command config diff get hook 0x5620cb15e600
-73> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command injectargs hook 0x5620cb15e600
-72> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command log flush hook 0x5620cb15e600
-71> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command log dump hook 0x5620cb15e600
-70> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command log reopen hook 0x5620cb15e600
-69> 2023-09-21T17:58:55.458+0000 7f7690e1e2c0 5 asok(0x5620cb19e740) register_command dump_mempools hook 0x5620cb22f048
-68> 2023-09-21T17:58:56.358+0000 7f768a162640 1 BUG_40104::clean_upmap_tp worker finish
-67> 2023-09-21T17:58:56.358+0000 7f7689160640 1 BUG_40104::clean_upmap_tp worker finish
-66> 2023-09-21T17:58:56.358+0000 7f768b965640 1 BUG_40104::clean_upmap_tp worker finish
-65> 2023-09-21T17:58:56.358+0000 7f768895f640 1 BUG_40104::clean_upmap_tp worker finish
-64> 2023-09-21T17:58:56.358+0000 7f768815e640 1 BUG_40104::clean_upmap_tp worker finish
-63> 2023-09-21T17:58:56.358+0000 7f768a963640 1 BUG_40104::clean_upmap_tp worker finish
-62> 2023-09-21T17:58:56.358+0000 7f768b164640 1 BUG_40104::clean_upmap_tp worker finish
-61> 2023-09-21T17:58:56.362+0000 7f7689961640 1 BUG_40104::clean_upmap_tp worker finish
-60> 2023-09-21T17:58:56.362+0000 7f7689160640 1 BUG_40104::clean_upmap_tp worker finish
-59> 2023-09-21T17:58:56.362+0000 7f7689961640 1 BUG_40104::clean_upmap_tp worker finish
-58> 2023-09-21T17:58:56.362+0000 7f768b965640 1 BUG_40104::clean_upmap_tp worker finish
-57> 2023-09-21T17:58:56.362+0000 7f768a162640 1 BUG_40104::clean_upmap_tp worker finish
-56> 2023-09-21T17:58:56.362+0000 7f768a963640 1 BUG_40104::clean_upmap_tp worker finish
-55> 2023-09-21T17:58:56.362+0000 7f768895f640 1 BUG_40104::clean_upmap_tp worker finish
-54> 2023-09-21T17:58:56.362+0000 7f768b164640 1 BUG_40104::clean_upmap_tp worker finish
-53> 2023-09-21T17:58:56.362+0000 7f768815e640 1 BUG_40104::clean_upmap_tp worker finish
-52> 2023-09-21T17:58:56.362+0000 7f768895f640 1 BUG_40104::clean_upmap_tp worker finish
-51> 2023-09-21T17:58:56.362+0000 7f768815e640 1 BUG_40104::clean_upmap_tp worker finish
-50> 2023-09-21T17:58:56.362+0000 7f768a162640 1 BUG_40104::clean_upmap_tp worker finish
-49> 2023-09-21T17:58:56.362+0000 7f768a963640 1 BUG_40104::clean_upmap_tp worker finish
-48> 2023-09-21T17:58:56.362+0000 7f768b164640 1 BUG_40104::clean_upmap_tp worker finish
-47> 2023-09-21T17:58:56.362+0000 7f7689160640 1 BUG_40104::clean_upmap_tp worker finish
-46> 2023-09-21T17:58:56.362+0000 7f7689961640 1 BUG_40104::clean_upmap_tp worker finish
-45> 2023-09-21T17:58:56.366+0000 7f768b965640 1 BUG_40104::clean_upmap_tp worker finish
-44> 2023-09-21T17:58:56.370+0000 7f768b965640 1 BUG_40104::clean_upmap_tp worker finish
-43> 2023-09-21T17:58:56.370+0000 7f768b164640 1 BUG_40104::clean_upmap_tp worker finish
-42> 2023-09-21T17:58:56.370+0000 7f768895f640 1 BUG_40104::clean_upmap_tp worker finish
-41> 2023-09-21T17:58:56.370+0000 7f7689160640 1 BUG_40104::clean_upmap_tp worker finish
-40> 2023-09-21T17:58:56.370+0000 7f7689961640 1 BUG_40104::clean_upmap_tp worker finish
-39> 2023-09-21T17:58:56.370+0000 7f768a963640 1 BUG_40104::clean_upmap_tp worker finish
-38> 2023-09-21T17:58:56.370+0000 7f768815e640 1 BUG_40104::clean_upmap_tp worker finish
-37> 2023-09-21T17:58:56.370+0000 7f768a162640 1 BUG_40104::clean_upmap_tp worker finish
-36> 2023-09-21T17:58:56.370+0000 7f7689961640 -1 verify_upmap multiple osds 2,3 come from same failure domain -5
-35> 2023-09-21T17:58:56.370+0000 7f7689961640 0 check_pg_upmaps verify_upmap of pg 3.0 returning -22
-34> 2023-09-21T17:58:56.370+0000 7f768a963640 1 BUG_40104::clean_upmap_tp worker finish
-33> 2023-09-21T17:58:56.370+0000 7f768b965640 1 BUG_40104::clean_upmap_tp worker finish
-32> 2023-09-21T17:58:56.370+0000 7f768b164640 1 BUG_40104::clean_upmap_tp worker finish
-31> 2023-09-21T17:58:56.370+0000 7f768895f640 1 BUG_40104::clean_upmap_tp worker finish
-30> 2023-09-21T17:58:56.370+0000 7f768a162640 1 BUG_40104::clean_upmap_tp worker finish
-29> 2023-09-21T17:58:56.370+0000 7f7689160640 1 BUG_40104::clean_upmap_tp worker finish
-28> 2023-09-21T17:58:56.370+0000 7f7689961640 1 BUG_40104::clean_upmap_tp worker finish
-27> 2023-09-21T17:58:56.370+0000 7f768815e640 1 BUG_40104::clean_upmap_tp worker finish
-26> 2023-09-21T17:58:56.374+0000 7f7689160640 -1 verify_upmap multiple osds 4,5 come from same failure domain -6
-25> 2023-09-21T17:58:56.374+0000 7f7689160640 0 check_pg_upmaps verify_upmap of pg 3.0 returning -22
-24> 2023-09-21T17:58:56.374+0000 7f768815e640 1 BUG_40104::clean_upmap_tp worker finish
-23> 2023-09-21T17:58:56.374+0000 7f7689961640 1 BUG_40104::clean_upmap_tp worker finish
-22> 2023-09-21T17:58:56.374+0000 7f768a963640 1 BUG_40104::clean_upmap_tp worker finish
-21> 2023-09-21T17:58:56.374+0000 7f768b965640 1 BUG_40104::clean_upmap_tp worker finish
-20> 2023-09-21T17:58:56.374+0000 7f768b164640 1 BUG_40104::clean_upmap_tp worker finish
-19> 2023-09-21T17:58:56.378+0000 7f768a162640 1 BUG_40104::clean_upmap_tp worker finish
-18> 2023-09-21T17:58:56.378+0000 7f7689160640 1 BUG_40104::clean_upmap_tp worker finish
-17> 2023-09-21T17:58:56.378+0000 7f768895f640 1 BUG_40104::clean_upmap_tp worker finish
-16> 2023-09-21T17:59:36.307+0000 7f768b164640 1 BUG_40104::clean_upmap_tp worker finish
-15> 2023-09-21T17:59:36.307+0000 7f768a162640 1 BUG_40104::clean_upmap_tp worker finish
-14> 2023-09-21T17:59:36.307+0000 7f768b965640 1 BUG_40104::clean_upmap_tp worker finish
-13> 2023-09-21T17:59:36.307+0000 7f7689160640 1 BUG_40104::clean_upmap_tp worker finish
-12> 2023-09-21T17:59:36.307+0000 7f768815e640 1 BUG_40104::clean_upmap_tp worker finish
-11> 2023-09-21T17:59:36.307+0000 7f768895f640 1 BUG_40104::clean_upmap_tp worker finish
-10> 2023-09-21T17:59:36.307+0000 7f768a963640 1 BUG_40104::clean_upmap_tp worker finish
-9> 2023-09-21T17:59:36.307+0000 7f7689961640 1 BUG_40104::clean_upmap_tp worker finish
-8> 2023-09-21T17:59:36.323+0000 7f768815e640 1 BUG_40104::clean_upmap_tp worker finish
-7> 2023-09-21T17:59:36.323+0000 7f768895f640 1 BUG_40104::clean_upmap_tp worker finish
-6> 2023-09-21T17:59:36.323+0000 7f768a963640 1 BUG_40104::clean_upmap_tp worker finish
-5> 2023-09-21T17:59:36.323+0000 7f768a162640 1 BUG_40104::clean_upmap_tp worker finish
-4> 2023-09-21T17:59:36.323+0000 7f768b965640 1 BUG_40104::clean_upmap_tp worker finish
-3> 2023-09-21T17:59:36.323+0000 7f768b164640 1 BUG_40104::clean_upmap_tp worker finish
-2> 2023-09-21T17:59:36.323+0000 7f7689961640 1 BUG_40104::clean_upmap_tp worker finish
-1> 2023-09-21T17:59:36.327+0000 7f7689160640 1 BUG_40104::clean_upmap_tp worker finish
0> 2023-09-21T17:59:36.383+0000 7f768895f640 -1 *** Caught signal (Aborted) **
in thread 7f768895f640 thread_name:clean_upmap_tp
ceph version Development (no_version) reef (stable)
1: /home/jenkins-build/build/workspace/ceph-pull-requests/build/bin/unittest_osdmap(+0x2d70f3) [0x5620cad660f3]
2: /lib/x86_64-linux-gnu/libc.so.6(+0x42520) [0x7f76919f9520]
3: pthread_kill()
4: raise()
5: abort()
6: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xa2bfe) [0x7f7691d8abfe]
7: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae28c) [0x7f7691d9628c]
8: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae2f7) [0x7f7691d962f7]
9: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xaf025) [0x7f7691d97025]
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x16) [0x5620cacb65d6]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x7a5) [0x7f76939c54a5]
12: (ThreadPool::WorkThread::entry()+0x1a) [0x7f76939cb2fa]
13: (Thread::entry_wrapper()+0x84) [0x7f769399af14]
14: (Thread::_entry_func(void*)+0x15) [0x7f769399ae75]
15: /lib/x86_64-linux-gnu/libc.so.6(+0x94b43) [0x7f7691a4bb43]
16: /lib/x86_64-linux-gnu/libc.so.6(+0x126a00) [0x7f7691adda00]
NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this.
--- logging levels ---
0/ 5 none
0/ 1 lockdep
0/ 1 context
1/ 1 crush
1/ 5 mds
1/ 5 mds_balancer
1/ 5 mds_locker
1/ 5 mds_log
1/ 5 mds_log_expire
1/ 5 mds_migrator
0/ 1 buffer
0/ 1 timer
0/ 1 filer
0/ 1 striper
0/ 1 objecter
0/ 5 rados
0/ 5 rbd
0/ 5 rbd_mirror
0/ 5 rbd_replay
0/ 5 rbd_pwl
0/ 5 journaler
0/ 5 objectcacher
0/ 5 immutable_obj_cache
0/ 5 client
1/ 5 osd
0/ 5 optracker
0/ 5 objclass
1/ 3 filestore
1/ 3 journal
0/ 0 ms
1/ 5 mon
0/10 monc
1/ 5 paxos
0/ 5 tp
1/ 5 auth
1/ 5 crypto
1/ 1 finisher
1/ 1 reserver
1/ 5 heartbeatmap
1/ 5 perfcounter
1/ 5 rgw
1/ 5 rgw_sync
1/ 5 rgw_datacache
1/ 5 rgw_access
1/ 5 rgw_dbstore
1/ 5 rgw_flight
1/ 5 javaclient
1/ 5 asok
1/ 1 throttle
0/ 0 refs
1/ 5 compressor
1/ 5 bluestore
1/ 5 bluefs
1/ 3 bdev
1/ 5 kstore
4/ 5 rocksdb
4/ 5 leveldb
1/ 5 fuse
2/ 5 mgr
1/ 5 mgrc
1/ 5 dpdk
1/ 5 eventtrace
1/ 5 prioritycache
0/ 5 test
0/ 5 cephfs_mirror
0/ 5 cephsqlite
0/ 5 seastore
0/ 5 seastore_onode
0/ 5 seastore_odata
0/ 5 seastore_omap
0/ 5 seastore_tm
0/ 5 seastore_t
0/ 5 seastore_cleaner
0/ 5 seastore_epm
0/ 5 seastore_lba
0/ 5 seastore_fixedkv_tree
0/ 5 seastore_cache
0/ 5 seastore_journal
0/ 5 seastore_device
0/ 5 seastore_backref
0/ 5 alienstore
1/ 5 mclock
0/ 5 cyanstore
1/ 5 ceph_exporter
1/ 5 memstore
-2/-2 (syslog threshold)
99/99 (stderr threshold)
--- pthread ID / name mapping for recent threads ---
7f768815e640 / clean_upmap_tp
7f768895f640 / clean_upmap_tp
7f7689160640 / clean_upmap_tp
7f7689961640 / clean_upmap_tp
7f768a162640 / clean_upmap_tp
7f768a963640 / clean_upmap_tp
7f768b164640 / clean_upmap_tp
7f768b965640 / clean_upmap_tp
7f7690e1e2c0 / unittest_osdmap
max_recent 500
max_new 1000
log_file
--- end dump of recent events ---
Updated by Radoslaw Zarzynski over 2 years ago
Evidence from Neha shows it's not a recent thing -- it was seen in 2021: https://github.com/ceph/ceph/pull/41848#issuecomment-862208913.
Updated by Ronen Friedman over 2 years ago
Possibly the same as https://tracker.ceph.com/issues/63310, and fixed by https://github.com/ceph/ceph/pull/54177
Updated by Radoslaw Zarzynski over 2 years ago
The patch was merged quite recently, in the 44th week of 2023. Let's observe!
Updated by Casey Bodley about 2 years ago
still seeing these on main:
The following tests FAILED:
198 - unittest_osdmap (Subprocess aborted)
ex. https://jenkins.ceph.com/job/ceph-pull-requests/127638/testReport/junit/projectroot.src.test/osd/unittest_osdmap/
-1> 2024-01-16T16:07:48.078+0000 7f6baa7a5640 1 BUG_40104::clean_upmap_tp worker finish
0> 2024-01-16T16:07:48.086+0000 7f6ba6f9e640 -1 *** Caught signal (Aborted) **
in thread 7f6ba6f9e640 thread_name:clean_upmap_tp
ceph version Development (no_version) squid (dev)
1: /home/jenkins-build/build/workspace/ceph-pull-requests/build/bin/unittest_osdmap(+0x2d7ac2) [0x5564deab2ac2]
2: /lib/x86_64-linux-gnu/libc.so.6(+0x42520) [0x7f6bb0834520]
3: pthread_kill()
4: raise()
5: abort()
6: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xa4f26) [0x7f6bb0bcbf26]
7: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xb6d9c) [0x7f6bb0bddd9c]
8: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xb6e07) [0x7f6bb0bdde07]
9: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xb7b35) [0x7f6bb0bdeb35]
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x16) [0x5564de9ff326]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x7a5) [0x7f6bb2996635]
12: (ThreadPool::WorkThread::entry()+0x1a) [0x7f6bb299c63a]
13: (Thread::entry_wrapper()+0x84) [0x7f6bb296ac84]
14: (Thread::_entry_func(void*)+0x15) [0x7f6bb296abe5]
15: /lib/x86_64-linux-gnu/libc.so.6(+0x94b43) [0x7f6bb0886b43]
16: /lib/x86_64-linux-gnu/libc.so.6(+0x126a00) [0x7f6bb0918a00]
NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this.
Updated by Kefu Chai about 2 years ago
OSDMapTest.BUG_43124 this time
[ RUN ] OSDMapTest.BUG_43124
ID CLASS WEIGHT TYPE NAME
-1 200.00000 root default
...
pgid 3.0 up [54,70,63,117,88,106,148,124,138,26,6,11]
from 54 to 30
pure virtual method called
terminate called without an active exception
*** Caught signal (Aborted) **
in thread 7fa2a0156640 thread_name:clean_upmap_tp
ceph version Development (no_version) squid (dev)
1: /home/jenkins-build/build/workspace/ceph-pull-requests/build/bin/unittest_osdmap(+0x2ed41e) [0x55af41e1d41e]
2: /lib/x86_64-linux-gnu/libc.so.6(+0x42520) [0x7fa2a71e9520]
3: pthread_kill()
4: raise()
5: abort()
6: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xa2bfe) [0x7fa2a7578bfe]
7: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae28c) [0x7fa2a758428c]
8: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae2f7) [0x7fa2a75842f7]
9: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xaf025) [0x7fa2a7585025]
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x16) [0x55af41d64d46]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x7a5) [0x7fa2a93092f5]
12: (ThreadPool::WorkThread::entry()+0x1a) [0x7fa2a930f2fa]
13: (Thread::entry_wrapper()+0x84) [0x7fa2a92dd984]
14: (Thread::_entry_func(void*)+0x15) [0x7fa2a92dd8e5]
15: /lib/x86_64-linux-gnu/libc.so.6(+0x94b43) [0x7fa2a723bb43]
16: /lib/x86_64-linux-gnu/libc.so.6(+0x126a00) [0x7fa2a72cda00]
2024-02-08T16:03:41.903+0000 7fa2a0156640 -1 *** Caught signal (Aborted) **
in thread 7fa2a0156640 thread_name:clean_upmap_tp
ceph version Development (no_version) squid (dev)
1: /home/jenkins-build/build/workspace/ceph-pull-requests/build/bin/unittest_osdmap(+0x2ed41e) [0x55af41e1d41e]
2: /lib/x86_64-linux-gnu/libc.so.6(+0x42520) [0x7fa2a71e9520]
3: pthread_kill()
4: raise()
5: abort()
6: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xa2bfe) [0x7fa2a7578bfe]
7: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae28c) [0x7fa2a758428c]
8: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xae2f7) [0x7fa2a75842f7]
9: /lib/x86_64-linux-gnu/libstdc++.so.6(+0xaf025) [0x7fa2a7585025]
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x16) [0x55af41d64d46]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x7a5) [0x7fa2a93092f5]
12: (ThreadPool::WorkThread::entry()+0x1a) [0x7fa2a930f2fa]
13: (Thread::entry_wrapper()+0x84) [0x7fa2a92dd984]
14: (Thread::_entry_func(void*)+0x15) [0x7fa2a92dd8e5]
15: /lib/x86_64-linux-gnu/libc.so.6(+0x94b43) [0x7fa2a723bb43]
16: /lib/x86_64-linux-gnu/libc.so.6(+0x126a00) [0x7fa2a72cda00]
NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this.
Updated by Radoslaw Zarzynski about 2 years ago
The tested PR was https://github.com/ceph/ceph/pull/55428 (main), so this is clearly a different issue than https://tracker.ceph.com/issues/63310 mentioned earlier.
Updated by Radoslaw Zarzynski about 2 years ago
Note from scrub: Nitzan is adding more context. In short: this might be a test issue.
Updated by Nitzan Mordechai about 2 years ago
Since all of these failures report "pure virtual method called", I don't think this is related to the OSDMap itself, but to the ThreadPool work queue, which is getting the wrong value in some way.
Updated by Casey Bodley almost 2 years ago
- Backport set to squid
Seeing unittest_osdmap (Subprocess aborted) failures on squid too; tagged for backport.
Updated by Laura Flores almost 2 years ago
- Assignee set to Laura Flores
I'll assign this and take a look.
Updated by Laura Flores almost 2 years ago
I ran this test multiple times in a local environment and didn't hit the issue. I will next look into evaluating the past environments where this was hit and see if I can reconstruct it.
Updated by Rongqi Sun almost 2 years ago · Edited
Seems like OSDMapTest.BUG_43124 has the same issue. Ref: https://tracker.ceph.com/issues/66244
Updated by Radoslaw Zarzynski almost 2 years ago
- Assignee changed from Laura Flores to MOHIT AGRAWAL
Updated by Radoslaw Zarzynski almost 2 years ago
- Related to Bug #66244: 202 - unittest_osdmap (Subprocess aborted) added
Updated by MOHIT AGRAWAL almost 2 years ago
- Pull request ID set to 57988
void ThreadPool::worker(WorkThread *wt)
{
  ...
      if (item) {
        processing++;
        ldout(cct,12) << "worker wq " << wq->name << " start processing " << item
                      << " (" << processing << " active)" << dendl;
        ul.unlock();
        TPHandle tp_handle(cct, hb, wq->timeout_interval.load(), wq->suicide_interval.load());
        tp_handle.reset_tp_timeout();
        wq->_void_process(item, tp_handle);  // ===> item has become dangling after this line
        ul.lock();
        wq->_void_process_finish(item);
        processing--;
        ldout(cct,15) << "worker wq " << wq->name << " done processing " << item
                      << " (" << processing << " active)" << dendl;
        if (_pause || _draining)
          _wait_cond.notify_all();
        did = true;
        break;
      }
  ...
}
And the _process function looks like this:

void ParallelPGMapper::WQ::_process(Item *i, ThreadPool::TPHandle &h)
{
  ldout(m->cct, 20) << __func__ << " " << i->job << " pool " << i->pool
                    << " [" << i->begin << "," << i->end << ")"
                    << " pgs " << i->pgs
                    << dendl;
  if (!i->pgs.empty())
    i->job->process(i->pgs);
  else
    i->job->process(i->pool, i->begin, i->end);
  i->job->finish_one();
  delete i;
}
As we can see, worker calls _void_process, which invokes ParallelPGMapper::WQ::_process. That function deletes the object (delete i), yet worker then passes the same pointer to _void_process_finish(item). So the pointer (item) is dangling by the time _void_process_finish is called: a use-after-free.
Updated by MOHIT AGRAWAL almost 2 years ago
- Status changed from New to Fix Under Review
Updated by Radoslaw Zarzynski over 1 year ago
- Status changed from Fix Under Review to Pending Backport
Updated by Rongqi Sun over 1 year ago
@MOHIT AGRAWAL Seems like OSDMapTest.BUG_42485 still fails sometimes:
[ RUN ] OSDMapTest.BUG_42485
pgid [26,12,54,46] up [26,12,54,46]
from 26 to 30
pgid 3.2 up [16,25,53,31]
from 16 to 40
from 31 to 0
pure virtual method called
terminate called without an active exception
*** Caught signal (Aborted) **
in thread ffffad0b5d60 thread_name:clean_upmap_tp
ceph version Development (no_version) squid (dev)
1: /home/jenkins-build/build/workspace/ceph-pull-requests-arm64/build/bin/unittest_osdmap(+0x2e5c88) [0xaaaae83a5c88]
2: __kernel_rt_sigreturn()
3: /lib/aarch64-linux-gnu/libc.so.6(+0x7f200) [0xffffb340f200]
4: raise()
5: abort()
6: (__gnu_cxx::__verbose_terminate_handler()+0x124) [0xffffb36bb364]
7: /lib/aarch64-linux-gnu/libstdc++.so.6(+0xa8a0c) [0xffffb36b8a0c]
8: /lib/aarch64-linux-gnu/libstdc++.so.6(+0xa8a70) [0xffffb36b8a70]
9: __cxa_deleted_virtual()
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x20) [0xaaaae82f36dc]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x5f8) [0xffffb5623bfc]
12: (ThreadPool::WorkThread::entry()+0x24) [0xffffb56293a8]
13: (Thread::entry_wrapper()+0xa0) [0xffffb55fa31c]
14: (Thread::_entry_func(void*)+0x18) [0xffffb55fa268]
15: /lib/aarch64-linux-gnu/libc.so.6(+0x7d5c8) [0xffffb340d5c8]
16: /lib/aarch64-linux-gnu/libc.so.6(+0xe5edc) [0xffffb3475edc]
2024-06-28T07:04:02.396-0400 ffffad0b5d60 -1 *** Caught signal (Aborted) **
in thread ffffad0b5d60 thread_name:clean_upmap_tp
ceph version Development (no_version) squid (dev)
1: /home/jenkins-build/build/workspace/ceph-pull-requests-arm64/build/bin/unittest_osdmap(+0x2e5c88) [0xaaaae83a5c88]
2: __kernel_rt_sigreturn()
3: /lib/aarch64-linux-gnu/libc.so.6(+0x7f200) [0xffffb340f200]
4: raise()
5: abort()
6: (__gnu_cxx::__verbose_terminate_handler()+0x124) [0xffffb36bb364]
7: /lib/aarch64-linux-gnu/libstdc++.so.6(+0xa8a0c) [0xffffb36b8a0c]
8: /lib/aarch64-linux-gnu/libstdc++.so.6(+0xa8a70) [0xffffb36b8a70]
9: __cxa_deleted_virtual()
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x20) [0xaaaae82f36dc]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x5f8) [0xffffb5623bfc]
12: (ThreadPool::WorkThread::entry()+0x24) [0xffffb56293a8]
13: (Thread::entry_wrapper()+0xa0) [0xffffb55fa31c]
14: (Thread::_entry_func(void*)+0x18) [0xffffb55fa268]
15: /lib/aarch64-linux-gnu/libc.so.6(+0x7d5c8) [0xffffb340d5c8]
16: /lib/aarch64-linux-gnu/libc.so.6(+0xe5edc) [0xffffb3475edc]
NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this.
--- begin dump of recent events ---
-94> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command assert hook 0xaaab27562070
-93> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command abort hook 0xaaab27562070
-92> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command leak_some_memory hook 0xaaab27562070
-91> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command perfcounters_dump hook 0xaaab27562070
-90> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command 1 hook 0xaaab27562070
-89> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command perf dump hook 0xaaab27562070
-88> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command perfcounters_schema hook 0xaaab27562070
-87> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command perf histogram dump hook 0xaaab27562070
-86> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command 2 hook 0xaaab27562070
-85> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command perf schema hook 0xaaab27562070
-84> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command counter dump hook 0xaaab27562070
-83> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command counter schema hook 0xaaab27562070
-82> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command perf histogram schema hook 0xaaab27562070
-81> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command perf reset hook 0xaaab27562070
-80> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command config show hook 0xaaab27562070
-79> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command config help hook 0xaaab27562070
-78> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command config set hook 0xaaab27562070
-77> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command config unset hook 0xaaab27562070
-76> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command config get hook 0xaaab27562070
-75> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command config diff hook 0xaaab27562070
-74> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command config diff get hook 0xaaab27562070
-73> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command injectargs hook 0xaaab27562070
-72> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command log flush hook 0xaaab27562070
-71> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command log dump hook 0xaaab27562070
-70> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command log reopen hook 0xaaab27562070
-69> 2024-06-28T07:02:22.081-0400 ffffb66ad020 5 asok(0xaaab275e3660) register_command dump_mempools hook 0xaaab27692a88
-68> 2024-06-28T07:02:23.965-0400 ffffac8a5d60 1 BUG_40104::clean_upmap_tp worker finish
-67> 2024-06-28T07:02:23.965-0400 ffffab885d60 1 BUG_40104::clean_upmap_tp worker finish
-66> 2024-06-28T07:02:23.965-0400 ffffad0b5d60 1 BUG_40104::clean_upmap_tp worker finish
-65> 2024-06-28T07:02:23.965-0400 ffffab075d60 1 BUG_40104::clean_upmap_tp worker finish
-64> 2024-06-28T07:02:23.965-0400 ffffaa055d60 1 BUG_40104::clean_upmap_tp worker finish
-63> 2024-06-28T07:02:23.965-0400 ffffaa865d60 1 BUG_40104::clean_upmap_tp worker finish
-62> 2024-06-28T07:02:23.965-0400 ffffa9845d60 1 BUG_40104::clean_upmap_tp worker finish
-61> 2024-06-28T07:02:23.965-0400 ffffac095d60 1 BUG_40104::clean_upmap_tp worker finish
-60> 2024-06-28T07:02:23.965-0400 ffffac8a5d60 1 BUG_40104::clean_upmap_tp worker finish
-59> 2024-06-28T07:02:23.965-0400 ffffa9845d60 1 BUG_40104::clean_upmap_tp worker finish
-58> 2024-06-28T07:02:23.965-0400 ffffac095d60 1 BUG_40104::clean_upmap_tp worker finish
-57> 2024-06-28T07:02:23.969-0400 ffffad0b5d60 1 BUG_40104::clean_upmap_tp worker finish
-56> 2024-06-28T07:02:23.969-0400 ffffab885d60 1 BUG_40104::clean_upmap_tp worker finish
-55> 2024-06-28T07:02:23.969-0400 ffffaa865d60 1 BUG_40104::clean_upmap_tp worker finish
-54> 2024-06-28T07:02:23.969-0400 ffffab075d60 1 BUG_40104::clean_upmap_tp worker finish
-53> 2024-06-28T07:02:23.969-0400 ffffaa055d60 1 BUG_40104::clean_upmap_tp worker finish
-52> 2024-06-28T07:02:23.969-0400 ffffad0b5d60 1 BUG_40104::clean_upmap_tp worker finish
-51> 2024-06-28T07:02:23.969-0400 ffffaa865d60 1 BUG_40104::clean_upmap_tp worker finish
-50> 2024-06-28T07:02:23.969-0400 ffffac095d60 1 BUG_40104::clean_upmap_tp worker finish
-49> 2024-06-28T07:02:23.969-0400 ffffab885d60 1 BUG_40104::clean_upmap_tp worker finish
-48> 2024-06-28T07:02:23.969-0400 ffffaa055d60 1 BUG_40104::clean_upmap_tp worker finish
-47> 2024-06-28T07:02:23.969-0400 ffffab075d60 1 BUG_40104::clean_upmap_tp worker finish
-46> 2024-06-28T07:02:23.969-0400 ffffa9845d60 1 BUG_40104::clean_upmap_tp worker finish
-45> 2024-06-28T07:02:23.969-0400 ffffac8a5d60 1 BUG_40104::clean_upmap_tp worker finish
-44> 2024-06-28T07:02:23.973-0400 ffffad0b5d60 1 BUG_40104::clean_upmap_tp worker finish
-43> 2024-06-28T07:02:23.973-0400 ffffaa055d60 1 BUG_40104::clean_upmap_tp worker finish
-42> 2024-06-28T07:02:23.973-0400 ffffac095d60 1 BUG_40104::clean_upmap_tp worker finish
-41> 2024-06-28T07:02:23.973-0400 ffffac8a5d60 1 BUG_40104::clean_upmap_tp worker finish
-40> 2024-06-28T07:02:23.973-0400 ffffab075d60 1 BUG_40104::clean_upmap_tp worker finish
-39> 2024-06-28T07:02:23.973-0400 ffffa9845d60 1 BUG_40104::clean_upmap_tp worker finish
-38> 2024-06-28T07:02:23.985-0400 ffffab885d60 1 BUG_40104::clean_upmap_tp worker finish
-37> 2024-06-28T07:02:23.997-0400 ffffaa865d60 1 BUG_40104::clean_upmap_tp worker finish
-36> 2024-06-28T07:02:23.997-0400 ffffac8a5d60 -1 verify_upmap multiple osds 2,3 come from same failure domain -5
-35> 2024-06-28T07:02:23.997-0400 ffffac8a5d60 0 check_pg_upmaps verify_upmap of pg 3.0 returning -22
-34> 2024-06-28T07:02:23.997-0400 ffffac8a5d60 1 BUG_40104::clean_upmap_tp worker finish
-33> 2024-06-28T07:02:23.997-0400 ffffaa865d60 1 BUG_40104::clean_upmap_tp worker finish
-32> 2024-06-28T07:02:24.009-0400 ffffac095d60 1 BUG_40104::clean_upmap_tp worker finish
-31> 2024-06-28T07:02:24.013-0400 ffffab075d60 1 BUG_40104::clean_upmap_tp worker finish
-30> 2024-06-28T07:02:24.017-0400 ffffab885d60 1 BUG_40104::clean_upmap_tp worker finish
-29> 2024-06-28T07:02:24.025-0400 ffffa9845d60 1 BUG_40104::clean_upmap_tp worker finish
-28> 2024-06-28T07:02:24.025-0400 ffffad0b5d60 1 BUG_40104::clean_upmap_tp worker finish
-27> 2024-06-28T07:02:24.025-0400 ffffaa055d60 1 BUG_40104::clean_upmap_tp worker finish
-26> 2024-06-28T07:02:24.037-0400 ffffac095d60 -1 verify_upmap multiple osds 4,5 come from same failure domain -6
-25> 2024-06-28T07:02:24.037-0400 ffffac095d60 0 check_pg_upmaps verify_upmap of pg 3.0 returning -22
-24> 2024-06-28T07:02:24.037-0400 ffffac8a5d60 1 BUG_40104::clean_upmap_tp worker finish
-23> 2024-06-28T07:02:24.037-0400 ffffa9845d60 1 BUG_40104::clean_upmap_tp worker finish
-22> 2024-06-28T07:02:24.037-0400 ffffac095d60 1 BUG_40104::clean_upmap_tp worker finish
-21> 2024-06-28T07:02:24.041-0400 ffffab075d60 1 BUG_40104::clean_upmap_tp worker finish
-20> 2024-06-28T07:02:24.041-0400 ffffaa865d60 1 BUG_40104::clean_upmap_tp worker finish
-19> 2024-06-28T07:02:24.041-0400 ffffaa055d60 1 BUG_40104::clean_upmap_tp worker finish
-18> 2024-06-28T07:02:24.041-0400 ffffab885d60 1 BUG_40104::clean_upmap_tp worker finish
-17> 2024-06-28T07:02:24.045-0400 ffffad0b5d60 1 BUG_40104::clean_upmap_tp worker finish
-16> 2024-06-28T07:04:02.192-0400 ffffaa865d60 1 BUG_40104::clean_upmap_tp worker finish
-15> 2024-06-28T07:04:02.192-0400 ffffaa055d60 1 BUG_40104::clean_upmap_tp worker finish
-14> 2024-06-28T07:04:02.196-0400 ffffab075d60 1 BUG_40104::clean_upmap_tp worker finish
-13> 2024-06-28T07:04:02.196-0400 ffffad0b5d60 1 BUG_40104::clean_upmap_tp worker finish
-12> 2024-06-28T07:04:02.196-0400 ffffab885d60 1 BUG_40104::clean_upmap_tp worker finish
-11> 2024-06-28T07:04:02.196-0400 ffffa9845d60 1 BUG_40104::clean_upmap_tp worker finish
-10> 2024-06-28T07:04:02.196-0400 ffffac095d60 1 BUG_40104::clean_upmap_tp worker finish
-9> 2024-06-28T07:04:02.216-0400 ffffac8a5d60 1 BUG_40104::clean_upmap_tp worker finish
-8> 2024-06-28T07:04:02.256-0400 ffffab075d60 1 BUG_40104::clean_upmap_tp worker finish
-7> 2024-06-28T07:04:02.256-0400 ffffab885d60 1 BUG_40104::clean_upmap_tp worker finish
-6> 2024-06-28T07:04:02.256-0400 ffffad0b5d60 1 BUG_40104::clean_upmap_tp worker finish
-5> 2024-06-28T07:04:02.264-0400 ffffaa865d60 1 BUG_40104::clean_upmap_tp worker finish
-4> 2024-06-28T07:04:02.264-0400 ffffac095d60 1 BUG_40104::clean_upmap_tp worker finish
-3> 2024-06-28T07:04:02.272-0400 ffffa9845d60 1 BUG_40104::clean_upmap_tp worker finish
-2> 2024-06-28T07:04:02.272-0400 ffffaa055d60 1 BUG_40104::clean_upmap_tp worker finish
-1> 2024-06-28T07:04:02.272-0400 ffffac8a5d60 1 BUG_40104::clean_upmap_tp worker finish
0> 2024-06-28T07:04:02.396-0400 ffffad0b5d60 -1 *** Caught signal (Aborted) **
in thread ffffad0b5d60 thread_name:clean_upmap_tp
ceph version Development (no_version) squid (dev)
1: /home/jenkins-build/build/workspace/ceph-pull-requests-arm64/build/bin/unittest_osdmap(+0x2e5c88) [0xaaaae83a5c88]
2: __kernel_rt_sigreturn()
3: /lib/aarch64-linux-gnu/libc.so.6(+0x7f200) [0xffffb340f200]
4: raise()
5: abort()
6: (__gnu_cxx::__verbose_terminate_handler()+0x124) [0xffffb36bb364]
7: /lib/aarch64-linux-gnu/libstdc++.so.6(+0xa8a0c) [0xffffb36b8a0c]
8: /lib/aarch64-linux-gnu/libstdc++.so.6(+0xa8a70) [0xffffb36b8a70]
9: __cxa_deleted_virtual()
10: (ThreadPool::WorkQueue<ParallelPGMapper::Item>::_void_dequeue()+0x20) [0xaaaae82f36dc]
11: (ThreadPool::worker(ThreadPool::WorkThread*)+0x5f8) [0xffffb5623bfc]
12: (ThreadPool::WorkThread::entry()+0x24) [0xffffb56293a8]
13: (Thread::entry_wrapper()+0xa0) [0xffffb55fa31c]
14: (Thread::_entry_func(void*)+0x18) [0xffffb55fa268]
15: /lib/aarch64-linux-gnu/libc.so.6(+0x7d5c8) [0xffffb340d5c8]
16: /lib/aarch64-linux-gnu/libc.so.6(+0xe5edc) [0xffffb3475edc]
NOTE: a copy of the executable, or `objdump -rdS <executable>` is needed to interpret this.
--- logging levels ---
0/ 5 none
0/ 1 lockdep
0/ 1 context
1/ 1 crush
1/ 5 mds
1/ 5 mds_balancer
1/ 5 mds_locker
1/ 5 mds_log
1/ 5 mds_log_expire
1/ 5 mds_migrator
3/ 5 mds_quiesce
0/ 1 buffer
0/ 1 timer
0/ 1 filer
0/ 1 striper
0/ 1 objecter
0/ 5 rados
0/ 5 rbd
0/ 5 rbd_mirror
0/ 5 rbd_replay
0/ 5 rbd_pwl
0/ 5 journaler
0/ 5 objectcacher
0/ 5 immutable_obj_cache
0/ 5 client
1/ 5 osd
0/ 5 optracker
0/ 5 objclass
1/ 3 filestore
1/ 3 journal
0/ 0 ms
1/ 5 mon
0/10 monc
1/ 5 paxos
0/ 5 tp
1/ 5 auth
1/ 5 crypto
1/ 1 finisher
1/ 1 reserver
1/ 5 heartbeatmap
1/ 5 perfcounter
1/ 5 rgw
1/ 5 rgw_sync
1/ 5 rgw_datacache
1/ 5 rgw_access
1/ 5 rgw_dbstore
1/ 5 rgw_flight
1/ 5 rgw_lifecycle
1/ 5 rgw_notification
1/ 5 javaclient
1/ 5 asok
1/ 1 throttle
0/ 0 refs
1/ 5 compressor
1/ 5 bluestore
1/ 5 bluefs
1/ 3 bdev
1/ 5 kstore
4/ 5 rocksdb
1/ 5 fuse
2/ 5 mgr
1/ 5 mgrc
1/ 5 dpdk
1/ 5 eventtrace
1/ 5 prioritycache
0/ 5 test
0/ 5 cephfs_mirror
0/ 5 cephsqlite
0/ 5 crimson_interrupt
0/ 5 seastore
0/ 5 seastore_onode
0/ 5 seastore_odata
0/ 5 seastore_omap
0/ 5 seastore_tm
0/ 5 seastore_t
0/ 5 seastore_cleaner
0/ 5 seastore_epm
0/ 5 seastore_lba
0/ 5 seastore_fixedkv_tree
0/ 5 seastore_cache
0/ 5 seastore_journal
0/ 5 seastore_device
0/ 5 seastore_backref
0/ 5 alienstore
1/ 5 mclock
0/ 5 cyanstore
1/ 5 ceph_exporter
1/ 5 memstore
1/ 5 trace
0/ 5 ceph_dedup
-2/-2 (syslog threshold)
99/99 (stderr threshold)
--- pthread ID / name mapping for recent threads ---
ffffa9845d60 / clean_upmap_tp
ffffaa055d60 / clean_upmap_tp
ffffaa865d60 / clean_upmap_tp
ffffab075d60 / clean_upmap_tp
ffffab885d60 / clean_upmap_tp
ffffac095d60 / clean_upmap_tp
ffffac8a5d60 / unittest_osdmap
ffffad0b5d60 / clean_upmap_tp
ffffb66ad020 / unittest_osdmap
max_recent 500
max_new 1000
log_file
--- end dump of recent events ---
Updated by MOHIT AGRAWAL over 1 year ago
I remember that last time I ran the test case almost 20 times in a loop and it did not crash. To reproduce it, I will have to set up an environment again and check.
Updated by Rongqi Sun over 1 year ago
MOHIT AGRAWAL wrote in #note-20:
I remember that last time I ran the test case almost 20 times in a loop and it did not crash. To reproduce it, I will have to set up an environment again and check.
Thanks so much. BTW, it was reproduced in an ARM64 environment.
Updated by Upkeep Bot over 1 year ago
- Copied to Backport #66779: squid: unittest_osdmap (Subprocess aborted) during OSDMapTest.BUG_42485 added
Updated by Upkeep Bot over 1 year ago
- Tags (freeform) set to backport_processed
Updated by Sridhar Seshasayee about 1 year ago
@MOHIT AGRAWAL The issue is present in Reef as well. See: https://jenkins.ceph.com/job/ceph-pull-requests/149459/consoleFull#-6313945356733401c-e9d0-4737-9832-6594c5da0afa
for PR: https://github.com/ceph/ceph/pull/60981.
So it would be good to backport this fix to Reef as well.
Updated by MOHIT AGRAWAL about 1 year ago
Sridhar Seshasayee wrote in #note-24:
@MOHIT AGRAWAL The issue is present in Reef as well. See: https://jenkins.ceph.com/job/ceph-pull-requests/149459/consoleFull#-6313945356733401c-e9d0-4737-9832-6594c5da0afa
for PR: https://github.com/ceph/ceph/pull/60981. So it would be good to backport this fix to Reef as well.
Yes, it is backported via https://tracker.ceph.com/issues/67235
Updated by Upkeep Bot 9 months ago
- Merge Commit set to cd090c6dc06ca8e69b4ee1da8f30b59b6683ed84
- Fixed In set to v19.3.0-2802-gcd090c6dc06
- Upkeep Timestamp set to 2025-07-09T16:09:31+00:00
Updated by Upkeep Bot 8 months ago
- Fixed In changed from v19.3.0-2802-gcd090c6dc06 to v19.3.0-2802-gcd090c6dc0
- Upkeep Timestamp changed from 2025-07-09T16:09:31+00:00 to 2025-07-14T17:41:46+00:00
Updated by Upkeep Bot 5 months ago
- Released In set to v20.2.0~2683
- Upkeep Timestamp changed from 2025-07-14T17:41:46+00:00 to 2025-11-01T00:58:24+00:00
Updated by MOHIT AGRAWAL 3 months ago
- Status changed from Pending Backport to Closed
The pull request has been merged.