abnormal behavior of globalMemoryControl #41057

@XuHuaiyu

Description

Bug Report

  1. The global memory controller starts to kill the IndexMerge query. Note the SQL digest "3bfcbb" in the following log.
[2023/02/02 14:24:20.841 +00:00] [WARN] [servermemorylimit.go:126] ["global memory controller tries to kill the top1 memory consumer"] [connID=1655338944872580399] ["sql digest"=3bfcbbdb1904fde9cc2d62d889d1f0c381695a075524598096bbf6b6eccf8f9a] ["sql text"=] [tidb_server_memory_limit=11510510320] ["heap inuse"=12128239616] ["sql memory usage"=8105754037]
  2. As the log shows, after about 5s the query exits with the error "Out Of Memory Quota!".
|2023-02-02 22:24:29 | [2023/02/02 14:24:29.503 +00:00] [INFO] [conn.go:1152] ["command dispatched failed"] [conn=1655338944872580399] [connInfo="id:1655338944872580399, addr:71.69.163.118:49352 status:10, collation:utf8mb4_unicode_ci, user:root"] [command=Execute] [status="inTxn:0, autocommit:1"] [sql="      select      from  where \n    "] [txn_mode=PESSIMISTIC] [timestamp=439182385056317441] [err="Out Of Memory Quota![conn_id=1655338944872580399]\ngithub.com/pingcap/tidb/executor.(*recordSet).Next.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/adapter.go:149\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:884\ngithub.com/pingcap/tidb/util/memory.(*PanicOnExceed).Action\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/action.go:148\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume.func2\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:449\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:461\ngithub.com/pingcap/tidb/executor.(*SelectionExec).Next\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/executor.go:1585\ngithub.com/pingcap/tidb/executor.Next\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/executor.go:328\ngithub.com/pingcap/tidb/executor.(*ExecStmt).next\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/adapter.go:1142\ngithub.com/pingcap/tidb/executor.(*recordSet).Next\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/adapter.go:153\ngithub.com/pingcap/tidb/server.(*tidbResultSet).Next\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/driver_tidb.go:423\ngithub.com/pingcap/tidb/server.(*clientConn).writeChunks\n\t/home/jenkins/
agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/conn.go:2305\ngithub.com/pingcap/tidb/server.(*clientConn).writeResultset\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/conn.go:2248\ngithub.com/pingcap/tidb/server.(*clientConn).executePreparedStmtAndWriteResult\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/conn_stmt.go:296\ngithub.com/pingcap/tidb/server.(*clientConn).executePlanCacheStmt\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/conn_stmt.go:221\ngithub.com/pingcap/tidb/server.(*clientConn).handleStmtExecute\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/conn_stmt.go:213\ngithub.com/pingcap/tidb/server.(*clientConn).dispatch\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/conn.go:1401\ngithub.com/pingcap/tidb/server.(*clientConn).Run\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/conn.go:1123\ngithub.com/pingcap/tidb/server.(*Server).onConn\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/server/server.go:625\nruntime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1594\n      select\n             from       "]
 | 2023-02-02 22:24:24 | [2023/02/02 14:24:24.175 +00:00] [ERROR] [misc.go:91] ["panic in the recoverable goroutine"] [r="\"Out Of Memory Quota![conn_id=1655338944872580399]\""] ["stack trace"="github.com/pingcap/tidb/util.WithRecovery.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:93\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:884\ngithub.com/pingcap/tidb/util/memory.(*PanicOnExceed).Action\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/action.go:148\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume.func2\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:449\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:461\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).executeTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1182\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).pickAndExecTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1128\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1.1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:620\ngithub.com/pingcap/tidb/util.WithRecovery\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:96\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:614"]
| 2023-02-02 22:24:24 | [2023/02/02 14:24:24.175 +00:00] [ERROR] [index_merge_reader.go:1146] ["panic in IndexMergeReaderExecutor indexMergeTableWorker: Out Of Memory Quota![conn_id=1655338944872580399]"] [conn=1655338944872580399]
| 2023-02-02 22:24:23 | [2023/02/02 14:24:23.309 +00:00] [ERROR] [misc.go:91] ["panic in the recoverable goroutine"] [r="\"Out Of Memory Quota![conn_id=1655338944872580399]\""] ["stack trace"="github.com/pingcap/tidb/util.WithRecovery.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:93\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:884\ngithub.com/pingcap/tidb/util/memory.(*PanicOnExceed).Action\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/action.go:148\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume.func2\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:449\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:461\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).executeTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1182\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).pickAndExecTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1128\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1.1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:620\ngithub.com/pingcap/tidb/util.WithRecovery\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:96\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:614"]
| 2023-02-02 22:24:23 | [2023/02/02 14:24:23.309 +00:00] [ERROR] [index_merge_reader.go:1146] ["panic in IndexMergeReaderExecutor indexMergeTableWorker: Out Of Memory Quota![conn_id=1655338944872580399]"] [conn=1655338944872580399]
| 2023-02-02 22:24:21 | [2023/02/02 14:24:21.728 +00:00] [ERROR] [misc.go:91] ["panic in the recoverable goroutine"] [r="\"Out Of Memory Quota![conn_id=1655338944872580399]\""] ["stack trace"="github.com/pingcap/tidb/util.WithRecovery.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:93\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:884\ngithub.com/pingcap/tidb/util/memory.(*PanicOnExceed).Action\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/action.go:148\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume.func2\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:449\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:461\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).executeTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1182\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).pickAndExecTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1128\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1.1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:620\ngithub.com/pingcap/tidb/util.WithRecovery\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:96\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:614"]
| 2023-02-02 22:24:21 | [2023/02/02 14:24:21.728 +00:00] [ERROR] [index_merge_reader.go:1146] ["panic in IndexMergeReaderExecutor indexMergeTableWorker: Out Of Memory Quota![conn_id=1655338944872580399]"] [conn=1655338944872580399]
| 2023-02-02 22:24:21 | [2023/02/02 14:24:21.626 +00:00] [ERROR] [misc.go:91] ["panic in the recoverable goroutine"] [r="\"Out Of Memory Quota![conn_id=1655338944872580399]\""] ["stack trace"="github.com/pingcap/tidb/util.WithRecovery.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:93\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:884\ngithub.com/pingcap/tidb/util/memory.(*PanicOnExceed).Action\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/action.go:148\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume.func2\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:449\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:461\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).executeTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1182\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).pickAndExecTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1128\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1.1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:620\ngithub.com/pingcap/tidb/util.WithRecovery\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:96\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:614"]
| 2023-02-02 22:24:21 | [2023/02/02 14:24:21.626 +00:00] [ERROR] [index_merge_reader.go:1146] ["panic in IndexMergeReaderExecutor indexMergeTableWorker: Out Of Memory Quota![conn_id=1655338944872580399]"] [conn=1655338944872580399]
| 2023-02-02 22:24:21 | [2023/02/02 14:24:21.436 +00:00] [ERROR] [misc.go:91] ["panic in the recoverable goroutine"] [r="\"Out Of Memory Quota![conn_id=1655338944872580399]\""] ["stack trace"="github.com/pingcap/tidb/util.WithRecovery.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:93\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:884\ngithub.com/pingcap/tidb/util/memory.(*PanicOnExceed).Action\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/action.go:148\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume.func2\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:449\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:461\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).executeTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1182\ngithub.com/pingcap/tidb/executor.(*indexMergeTableScanWorker).pickAndExecTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:1128\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1.1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:620\ngithub.com/pingcap/tidb/util.WithRecovery\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/misc.go:96\ngithub.com/pingcap/tidb/executor.(*IndexMergeReaderExecutor).startIndexMergeTableScanWorker.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/executor/index_merge_reader.go:614"]
| 2023-02-02 22:24:21 | [2023/02/02 14:24:21.436 +00:00] [ERROR] [index_merge_reader.go:1146] ["panic in IndexMergeReaderExecutor indexMergeTableWorker: Out Of Memory Quota![conn_id=1655338944872580399]"] [conn=1655338944872580399]
| 2023-02-02 22:24:21 | [2023/02/02 14:24:21.307 +00:00] [ERROR] [coprocessor.go:899] ["copIteratorWork meet panic"] [r="\"Out Of Memory Quota![conn_id=1655338944872580399]\""] ["stack trace"="github.com/pingcap/tidb/store/copr.(*copIteratorWorker).handleTask.func1\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/store/copr/coprocessor.go:901\nruntime.gopanic\n\t/usr/local/go/src/runtime/panic.go:884\ngithub.com/pingcap/tidb/util/memory.(*PanicOnExceed).Action\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/action.go:148\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume.func2\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:449\ngithub.com/pingcap/tidb/util/memory.(*Tracker).Consume\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/util/memory/tracker.go:461\ngithub.com/pingcap/tidb/store/copr.(*copIteratorWorker).sendToRespCh\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/store/copr/coprocessor.go:789\ngithub.com/pingcap/tidb/store/copr.(*copIteratorWorker).handleCopResponse\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/store/copr/coprocessor.go:1236\ngithub.com/pingcap/tidb/store/copr.(*copIteratorWorker).handleCopPagingResult\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/store/copr/coprocessor.go:1092\ngithub.com/pingcap/tidb/store/copr.(*copIteratorWorker).handleTaskOnce\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/store/copr/coprocessor.go:1031\ngithub.com/pingcap/tidb/store/copr.(*copIteratorWorker).handleTask\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/store/copr/coprocessor.go:912\ngithub.com/pingcap/tidb/store/copr.(*copIteratorWorker).run\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb/store/copr/coprocessor.go:622"]
  3. Strangely, the slow log shows this query as having executed successfully, even though it had actually been killed earlier.
# Time: 2023-02-02T14:24:29.502648941Z
# Txn_start_ts: 439182385056317441
# User@Host: root[root] @ 71.69.163.118 [71.69.163.118]
# Conn_ID: 1655338944872580399
# Query_time: 26.11588911
# Parse_time: 0
# Compile_time: 0.000971646
# Rewrite_time: 0.000100893
# Optimize_time: 0.000720741
# Wait_TS: 0.000014217
# Cop_time: 73.410496269 Process_time: 1075.273 Wait_time: 104.362 Request_count: 504 Process_keys: 286534 Total_keys: 375339 Get_snapshot_time: 65.973 Rocksdb_key_skipped_count: 286534 Rocksdb_block_cache_hit_count: 1627389 Rocksdb_block_read_count: 28128 Rocksdb_block_read_byte: 1182400040 Rocksdb_block_read_time: 32.703
# Index_names: []
# Is_internal: false
# Digest: 3bfcbbdb1904fde9cc2d62d889d1f0c381695a075524598096bbf6b6eccf8f9a
# Stats: :pseudo
# Num_cop_tasks: 504
# Cop_proc_avg: 2.133478174 Cop_proc_p90: 4.2620000000000005 Cop_proc_max: 5.6370000000000005 Cop_proc_addr: db-tikv-0.db-tikv-peer.tidb1379661944636804070.svc:20160
# Cop_wait_avg: 0.20706746 Cop_wait_p90: 0.578 Cop_wait_max: 1.198 Cop_wait_addr: db-tikv-0.db-tikv-peer.tidb1379661944636804070.svc:20160
# Mem_max: 9187921973
# Prepared: true
# Plan_from_cache: false
# Plan_from_binding: false
# Has_more_results: false
# KV_total: 1451.511250334
# PD_total: 0.461488112
# Backoff_total: 0
# Write_sql_response_total: 17.76074209
# Result_rows: 4096
# Succ: true
# IsExplicitTxn: false
# IsSyncStatsFailed: false
select  from  where ( );
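The contradiction above can be checked mechanically: the slow-log entry is a block of `# Key: value` comment lines (`# Succ: true`, `# Mem_max: 9187921973`, ...), so the `Succ` and `Digest` fields can be pulled out and compared against the kill warnings in the server log. A minimal illustrative parser for these header lines (not TiDB's own slow-log reader, which lives in `infoschema`/`executor`):

```go
package main

import (
	"fmt"
	"strings"
)

// parseSlowLogFields extracts "# Key: value" header fields from a TiDB
// slow-log entry. Illustrative only; it takes the first ": " per line, so
// multi-field lines like "# Cop_time: ... Process_time: ..." keep only
// their first key.
func parseSlowLogFields(entry string) map[string]string {
	fields := make(map[string]string)
	for _, line := range strings.Split(entry, "\n") {
		line = strings.TrimPrefix(strings.TrimSpace(line), "# ")
		if k, v, ok := strings.Cut(line, ": "); ok {
			fields[k] = v
		}
	}
	return fields
}

func main() {
	entry := "# Conn_ID: 1655338944872580399\n" +
		"# Mem_max: 9187921973\n" +
		"# Succ: true"
	f := parseSlowLogFields(entry)
	// The entry claims success even though the server log shows the
	// connection was killed ~9s before this timestamp.
	fmt.Println(f["Succ"], f["Mem_max"])
}
```

Running this over the entry above yields `Succ=true` for the same `Conn_ID` that the memory controller had already signalled to kill, which is the inconsistency reported here.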
  4. It is also strange that the global memory controller believed its kill attempt had failed and kept trying to kill the query again and again. Even weirder, the SQL digest changed during this phase and reverted at the end, and we cannot find any query in the slow log using the keyword "1272ccc8974ef".
2023-02-02 22:25:16	
[2023/02/02 14:25:16.619 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 55s"] [connID=1655338944872580399] ["sql digest"=3bfcbbdb1904fde9cc2d62d889d1f0c381695a075524598096bbf6b6eccf8f9a] ["sql text"=] ["sql memory usage"=7457620074]
2023-02-02 22:25:11	
[2023/02/02 14:25:11.539 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 50s"] [connID=1655338944872580399] ["sql digest"=3bfcbbdb1904fde9cc2d62d889d1f0c381695a075524598096bbf6b6eccf8f9a] ["sql text"=] ["sql memory usage"=4549173213]
2023-02-02 22:25:06	
[2023/02/02 14:25:06.515 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 45s"] [connID=1655338944872580399] ["sql digest"=3bfcbbdb1904fde9cc2d62d889d1f0c381695a075524598096bbf6b6eccf8f9a] ["sql text"=] ["sql memory usage"=1683859172]
2023-02-02 22:25:01	
[2023/02/02 14:25:01.505 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 40s"] [connID=1655338944872580399] ["sql digest"=3bfcbbdb1904fde9cc2d62d889d1f0c381695a075524598096bbf6b6eccf8f9a] ["sql text"=] ["sql memory usage"=76444864]
2023-02-02 22:24:56	
[2023/02/02 14:24:56.504 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 35s"] [connID=1655338944872580399] ["sql digest"=1272ccc8974ef9606af239358199593c07135fe06aed7ad6d1235104003e4328] ["sql text"=] ["sql memory usage"=0]
2023-02-02 22:24:51	
[2023/02/02 14:24:51.404 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 30s"] [connID=1655338944872580399] ["sql digest"=1272ccc8974ef9606af239358199593c07135fe06aed7ad6d1235104003e4328] ["sql text"=] ["sql memory usage"=0]
2023-02-02 22:24:46	
[2023/02/02 14:24:46.305 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 25s"] [connID=1655338944872580399] ["sql digest"=1272ccc8974ef9606af239358199593c07135fe06aed7ad6d1235104003e4328] ["sql text"=] ["sql memory usage"=0]
2023-02-02 22:24:41	
[2023/02/02 14:24:41.204 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 20s"] [connID=1655338944872580399] ["sql digest"=1272ccc8974ef9606af239358199593c07135fe06aed7ad6d1235104003e4328] ["sql text"=] ["sql memory usage"=0]
2023-02-02 22:24:36	
[2023/02/02 14:24:36.105 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 15s"] [connID=1655338944872580399] ["sql digest"=1272ccc8974ef9606af239358199593c07135fe06aed7ad6d1235104003e4328] ["sql text"=] ["sql memory usage"=0]
2023-02-02 22:24:31	
[2023/02/02 14:24:31.004 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 10s"] [connID=1655338944872580399] ["sql digest"=1272ccc8974ef9606af239358199593c07135fe06aed7ad6d1235104003e4328] ["sql text"=] ["sql memory usage"=0]
2023-02-02 22:24:25	
[2023/02/02 14:24:25.927 +00:00] [WARN] [servermemorylimit.go:96] ["global memory controller failed to kill the top-consumer in 5s"] [connID=1655338944872580399] ["sql digest"=3bfcbbdb1904fde9cc2d62d889d1f0c381695a075524598096bbf6b6eccf8f9a] ["sql text"=] ["sql memory usage"=9171754741]
2023-02-02 22:24:21	
[2023/02/02 14:24:21.306 +00:00] [WARN] [tracker.go:458] ["global memory controller, NeedKill signal is received successfully"] [connID=1655338944872580399]
2023-02-02 22:24:20	
[2023/02/02 14:24:20.841 +00:00] [WARN] [servermemorylimit.go:126] ["global memory controller tries to kill the top1 memory consumer"] [connID=1655338944872580399] ["sql digest"=3bfcbbdb1904fde9cc2d62d889d1f0c381695a075524598096bbf6b6eccf8f9a] ["sql text"=] [tidb_server_memory_limit=11510510320] ["heap inuse"=12128239616] ["sql memory usage"=8105754037]
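The warnings above fire on a fixed 5s cadence ("failed to kill the top-consumer in 5s" through "... in 55s"), i.e. after sending the kill signal the controller re-checks the victim every tick and escalates until the connection is gone. A minimal sketch of that watchdog shape (hypothetical names; the real loop is in `servermemorylimit.go` and the bug is precisely that its "victim exited" check never succeeds here):

```go
package main

import "fmt"

// watchKill simulates a memory controller that has sent a kill signal and
// re-checks the victim every tickSec seconds, emitting an escalating
// warning until exited reports the victim is gone. Hypothetical sketch,
// not TiDB's implementation.
func watchKill(tickSec int, exited func(elapsedSec int) bool) []string {
	var warnings []string
	for elapsed := tickSec; !exited(elapsed); elapsed += tickSec {
		warnings = append(warnings,
			fmt.Sprintf("failed to kill the top-consumer in %ds", elapsed))
	}
	return warnings
}

func main() {
	// Mirror the log excerpt: the victim is still "alive" for 55s,
	// so the controller warns eleven times (5s, 10s, ..., 55s).
	w := watchKill(5, func(elapsed int) bool { return elapsed > 55 })
	fmt.Println(len(w), w[0], w[len(w)-1])
}
```

In the incident above the loop kept firing even after the statement had been aborted, seemingly because the controller tracks the victim by `connID` while the session had already moved on (digest swapped to `1272ccc...`, "sql memory usage"=0), so the "kill succeeded" condition it was waiting for was never observed.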
  5. Finally, the tidb-server crashed because of OOM.

(screenshot: 2023-02-05 11:11:31)
