
Searched refs:smp_mb__after_spinlock (Results 1 – 22 of 22) sorted by relevance

/linux-6.6.21/tools/memory-model/litmus-tests/
MP+polockmbonce+poacquiresilsil.litmus  6  * Do spinlocks combined with smp_mb__after_spinlock() provide order
18 smp_mb__after_spinlock();
Z6.0+pooncelock+poonceLock+pombonce.litmus  6  * This litmus test demonstrates how smp_mb__after_spinlock() may be
27 smp_mb__after_spinlock();
README  74  Protect the access with a lock and an smp_mb__after_spinlock()
153 As above, but with smp_mb__after_spinlock() immediately
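
The pattern these tests and the README entries probe can be boiled down to a small litmus test. The sketch below is hypothetical (it is not one of the in-tree files above); it asks whether a store made before the lock must become visible before a store made after smp_mb__after_spinlock():

	C MP+storebeforelock+mbafterspinlock-sketch

	(* Hypothetical sketch, not an in-tree test. *)

	{}

	P0(int *x, int *y, spinlock_t *mylock)
	{
		WRITE_ONCE(*x, 1);
		spin_lock(mylock);
		smp_mb__after_spinlock();
		WRITE_ONCE(*y, 1);
		spin_unlock(mylock);
	}

	P1(int *x, int *y)
	{
		int r0;
		int r1;

		r0 = smp_load_acquire(y);
		r1 = READ_ONCE(*x);
	}

	exists (1:r0=1 /\ 1:r1=0)

Because smp_mb__after_spinlock() is a full fence rather than a plain acquire, the store to x must propagate before the store to y, so the LKMM should forbid the listed outcome: an acquire reader that sees y==1 also sees x==1.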
/linux-6.6.21/kernel/kcsan/
selftest.c  148  KCSAN_CHECK_READ_BARRIER(smp_mb__after_spinlock()); in test_barrier()
177 KCSAN_CHECK_WRITE_BARRIER(smp_mb__after_spinlock()); in test_barrier()
209 KCSAN_CHECK_RW_BARRIER(smp_mb__after_spinlock()); in test_barrier()
kcsan_test.c  578  KCSAN_EXPECT_READ_BARRIER(smp_mb__after_spinlock(), true); in test_barrier_nothreads()
623 KCSAN_EXPECT_WRITE_BARRIER(smp_mb__after_spinlock(), true); in test_barrier_nothreads()
668 KCSAN_EXPECT_RW_BARRIER(smp_mb__after_spinlock(), true); in test_barrier_nothreads()
/linux-6.6.21/arch/xtensa/include/asm/
spinlock.h  18  #define smp_mb__after_spinlock() smp_mb() macro
/linux-6.6.21/arch/csky/include/asm/
spinlock.h  10  #define smp_mb__after_spinlock() smp_mb() macro
/linux-6.6.21/arch/arm64/include/asm/
spinlock.h  12  #define smp_mb__after_spinlock() smp_mb() macro
/linux-6.6.21/arch/powerpc/include/asm/
spinlock.h  14  #define smp_mb__after_spinlock() smp_mb() macro
/linux-6.6.21/arch/riscv/include/asm/
barrier.h  72  #define smp_mb__after_spinlock() RISCV_FENCE(iorw,iorw) macro
/linux-6.6.21/tools/memory-model/Documentation/
locking.txt  185  of smp_mb__after_spinlock():
199 smp_mb__after_spinlock();
212 This addition of smp_mb__after_spinlock() strengthens the lock
214 In other words, the addition of the smp_mb__after_spinlock() prohibits
recipes.txt  160  of smp_mb__after_spinlock():
174 smp_mb__after_spinlock();
187 This addition of smp_mb__after_spinlock() strengthens the lock acquisition
ordering.txt  160  o smp_mb__after_spinlock(), which provides full ordering subsequent
explanation.txt  2752  smp_mb__after_spinlock(). The LKMM uses fence events with special
2764 smp_mb__after_spinlock() orders po-earlier lock acquisition
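
These documentation hits all describe the same recipe: spin_lock() by itself is only an acquire, and adding smp_mb__after_spinlock() immediately after it upgrades the acquisition to a full barrier, so even accesses that precede the lock in program order are ordered against accesses inside the critical section. A simplified paraphrase of that recipe (hypothetical variables x, y, r0, r1 and lock mylock; not the exact code quoted at the line numbers above):

	void cpu0(void)
	{
		WRITE_ONCE(x, 1);		/* store before the lock */
		spin_lock(&mylock);
		smp_mb__after_spinlock();	/* full barrier, not just acquire */
		r0 = READ_ONCE(y);
		spin_unlock(&mylock);
	}

	void cpu1(void)
	{
		WRITE_ONCE(y, 1);
		smp_mb();
		r1 = READ_ONCE(x);
	}

With the fence in place the store-buffering outcome r0 == 0 && r1 == 0 should be prohibited; with a bare spin_lock() it is not, because the acquire does not order the earlier WRITE_ONCE(x, 1) against the later READ_ONCE(y).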
/linux-6.6.21/include/linux/
spinlock.h  175  #ifndef smp_mb__after_spinlock
176 #define smp_mb__after_spinlock() kcsan_mb() macro
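
Read together with the arch/ hits above, this shows the override scheme: an architecture whose lock acquisition already implies full ordering (x86, notably absent from these results, gets that from its locked RMW instructions) can leave the generic fallback, which only emits a KCSAN annotation, while the others map the macro to smp_mb() or an explicit fence. A condensed paraphrase, not the verbatim headers:

	/* arch/<arch>/include/asm/spinlock.h -- e.g. arm64, powerpc, csky, xtensa */
	#define smp_mb__after_spinlock()	smp_mb()

	/* include/linux/spinlock.h -- generic fallback, instrumentation only */
	#ifndef smp_mb__after_spinlock
	#define smp_mb__after_spinlock()	kcsan_mb()
	#endif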
/linux-6.6.21/tools/memory-model/
linux-kernel.bell  33  'after-spinlock (*smp_mb__after_spinlock*) ||
linux-kernel.def  25  smp_mb__after_spinlock() { __fence{after-spinlock}; }
/linux-6.6.21/kernel/
kthread.c  1471  smp_mb__after_spinlock(); in kthread_unuse_mm()
exit.c  558  smp_mb__after_spinlock(); in exit_mm()
/linux-6.6.21/kernel/rcu/
tree_nocb.h  1052  smp_mb__after_spinlock(); /* Timer expire before wakeup. */ in do_nocb_deferred_wakeup_timer()
/linux-6.6.21/Documentation/RCU/
whatisRCU.rst  659  smp_mb__after_spinlock();
685 been able to write-acquire the lock otherwise. The smp_mb__after_spinlock()
/linux-6.6.21/kernel/sched/
core.c  1814  smp_mb__after_spinlock(); in uclamp_sync_util_min_rt_default()
4235 smp_mb__after_spinlock(); in try_to_wake_up()
6615 smp_mb__after_spinlock(); in __schedule()
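
The try_to_wake_up() and __schedule() hits are the classic pairing: the sleeper stores its task state with a full barrier (set_current_state()) before checking the wait condition, and the waker stores the condition, acquires p->pi_lock, and relies on smp_mb__after_spinlock() so that the condition store cannot be reordered after its read of the task state. A rough sketch of that shape, with the state check and wakeup call reduced to hypothetical helpers (this is not the in-tree code):

	/* Sleeper side (roughly what schedule() callers arrange). */
	set_current_state(TASK_UNINTERRUPTIBLE);	/* state store + full barrier */
	if (!READ_ONCE(cond))
		schedule();

	/* Waker side (roughly the try_to_wake_up() shape). */
	WRITE_ONCE(cond, 1);
	raw_spin_lock_irqsave(&p->pi_lock, flags);
	smp_mb__after_spinlock();	/* order the cond store before the state read */
	if (task_is_sleeping(p))	/* hypothetical stand-in for the p->state check */
		queue_wakeup(p);	/* hypothetical stand-in for the wakeup path */
	raw_spin_unlock_irqrestore(&p->pi_lock, flags);

Either the waker sees the sleeper's state and wakes it, or the sleeper sees cond set and never blocks; the full barrier on each side is what rules out both of them missing the other's store.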