With support for NativeCallableAttribute now being public (#33005), some additional cleanup can occur. The UMThunkStub() calls written in assembly can now be replaced by the newly added JIT helpers for Reverse P/Invoke Enter/Exit.
runtime/src/coreclr/src/vm/jithelpers.cpp
Lines 5064 to 5095 in a1af0f2
```cpp
EXTERN_C void JIT_ReversePInvokeEnter(ReversePInvokeFrame* frame)
{
    _ASSERTE(frame != NULL);

    Thread* thread = GetThreadNULLOk();

    // If a thread instance exists and is in the
    // correct GC mode attempt a quick transition.
    if (thread != NULL
        && !thread->PreemptiveGCDisabled())
    {
        // Manually inline the fast path in Thread::DisablePreemptiveGC().
        thread->m_fPreemptiveGCDisabled.StoreWithoutBarrier(1);
        if (g_TrapReturningThreads.LoadWithoutBarrier() == 0)
        {
            frame->currentThread = thread;
            return;
        }
    }

    JIT_ReversePInvokeEnterRare(frame);
}

EXTERN_C void JIT_ReversePInvokeExit(ReversePInvokeFrame* frame)
{
    _ASSERTE(frame != NULL);
    _ASSERTE(frame->currentThread == GetThread());

    // Manually inline the fast path in Thread::EnablePreemptiveGC().
    // This is a trade off with GC suspend performance. We are opting
    // to make this exit faster.
    frame->currentThread->m_fPreemptiveGCDisabled.StoreWithoutBarrier(0);
}
```
The x86 case must still use UMThunkStub(), but all other platform-specific stubs can now be removed.