glibc/sysdeps/x86_64/nptl
Noah Goldstein 653c12c7d8 x86: Cleanup pthread_spin_{try}lock.S
Save a jmp on the lock path coming from an initial failure in
pthread_spin_lock.S.  This costs 4 bytes of code, but since the
function still fits in the same number of 16-byte blocks (the default
function alignment) it has no effect on the total binary size of
libc.so (unchanged after this commit).
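
The change is purely an assembly code-layout tweak (falling through into
the wait loop instead of taking an extra jmp after the first failed
attempt).  A minimal C sketch of the test-and-test-and-set shape of that
lock path, with illustrative names only (not glibc's internal API):

    #include <stdatomic.h>

    /* Sketch of the lock path: one atomic exchange up front, and on
       failure spin on plain loads until the lock looks free before
       retrying the exchange.  The spin on relaxed loads avoids
       hammering the cache line with atomic read-modify-write
       operations while the lock is held.  */
    static void
    spin_lock_sketch (atomic_int *lock)
    {
      /* Fast path: grab the lock with a single atomic exchange.  */
      if (atomic_exchange_explicit (lock, 1, memory_order_acquire) == 0)
        return;

      /* Slow path: wait, then retry.  */
      for (;;)
        {
          while (atomic_load_explicit (lock, memory_order_relaxed) != 0)
            /* busy-wait */;
          if (atomic_exchange_explicit (lock, 1, memory_order_acquire) == 0)
            return;
        }
    }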

pthread_spin_trylock was using a CAS, which is often more expensive,
where a simple xchg works.
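
The reason an unconditional exchange suffices: writing 1 into an
already-locked word leaves it locked, so no compare is needed.  A hedged
C sketch of the idea (illustrative names, not the actual assembly in
pthread_spin_trylock.S):

    #include <stdatomic.h>
    #include <errno.h>

    /* Trylock without CAS: exchange in 1 and look at the old value.  */
    static int
    spin_trylock_sketch (atomic_int *lock)
    {
      if (atomic_exchange_explicit (lock, 1, memory_order_acquire) == 0)
        return 0;        /* Lock acquired.  */
      return EBUSY;      /* Someone else holds it.  */
    }

On x86, xchg with a memory operand is implicitly locked, so it avoids
the extra setup of the expected value that cmpxchg requires.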

Full check passes on x86-64.
2022-10-03 14:13:49 -07:00
Makefile Update copyright dates with scripts/update-copyrights 2022-01-01 11:40:24 -08:00
pthread_mutex_backoff.h nptl: Add backoff mechanism to spinlock loop 2022-05-09 14:38:40 -07:00
pthread_spin_init.c
pthread_spin_lock.S x86: Cleanup pthread_spin_{try}lock.S 2022-10-03 14:13:49 -07:00
pthread_spin_trylock.S x86: Cleanup pthread_spin_{try}lock.S 2022-10-03 14:13:49 -07:00
pthread_spin_unlock.S Update copyright dates with scripts/update-copyrights 2022-01-01 11:40:24 -08:00
pthread-offsets.h
tcb-access.h Update copyright dates with scripts/update-copyrights 2022-01-01 11:40:24 -08:00
tcb-offsets.sym
tls.h Update copyright dates with scripts/update-copyrights 2022-01-01 11:40:24 -08:00