lib/crypto: x86/sha1: Migrate optimized code into library
author	Eric Biggers <ebiggers@kernel.org>
	Sat, 12 Jul 2025 23:23:04 +0000 (16:23 -0700)
committer	Eric Biggers <ebiggers@kernel.org>
	Mon, 14 Jul 2025 18:28:35 +0000 (11:28 -0700)
commit	f3d6cb3dc0394b866bc0d1e15157ce45844cf3d3
tree	49d272f7f91c2cb5dfe9339107b8b487aa47f664
parent	c751059985e02467c7fa6b14676c1d56d089b3cc
lib/crypto: x86/sha1: Migrate optimized code into library

Instead of exposing the x86-optimized SHA-1 code via x86-specific
crypto_shash algorithms, just implement the sha1_blocks() library
function.  This is much simpler, it makes the SHA-1 library functions
x86-optimized, and it fixes the longstanding issue where the
x86-optimized SHA-1 code was disabled by default.  SHA-1 remains
available through crypto_shash, but individual architectures no longer
need to handle it.
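
As a rough illustration of the approach (this is a minimal sketch, not
the actual contents of lib/crypto/x86/sha1.h; the symbol names, the
static key, and the generic fallback helper below are assumptions made
for the example, and only one SIMD code path is shown), the glue boils
down to something like:

/*
 * Sketch only: the real header also wires up the AVX, AVX2, and SHA-NI
 * code paths and selects between them at boot.
 */
#include <asm/fpu/api.h>
#include <linux/cache.h>
#include <linux/jump_label.h>
#include <linux/linkage.h>
#include <linux/types.h>

struct sha1_block_state;	/* SHA-1 state words; see <crypto/sha1.h> */

/* Assembly entry point; name assumed for the sketch. */
asmlinkage void sha1_transform_ssse3(struct sha1_block_state *state,
				     const u8 *data, size_t nblocks);

/* Generic C fallback from lib/crypto; name assumed for the sketch. */
void sha1_blocks_generic(struct sha1_block_state *state,
			 const u8 *data, size_t nblocks);

static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_ssse3);

static void sha1_blocks(struct sha1_block_state *state,
			const u8 *data, size_t nblocks)
{
	if (static_branch_likely(&have_ssse3) && likely(irq_fpu_usable())) {
		/* SIMD registers may only be touched inside this section. */
		kernel_fpu_begin();
		sha1_transform_ssse3(state, data, nblocks);
		kernel_fpu_end();
	} else {
		sha1_blocks_generic(state, data, nblocks);
	}
}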

To match sha1_blocks(), change the type of the nblocks parameter of the
assembly functions from int to size_t.  The assembly functions actually
already treated it as size_t.
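
For one of the assembly entry points, the prototype change is roughly
the following (illustrative, not quoted from the patch; only the change
of the block-count type is the point):

-asmlinkage void sha1_transform_ssse3(struct sha1_state *state,
-				     const u8 *data, int blocks);
+asmlinkage void sha1_transform_ssse3(struct sha1_state *state,
+				     const u8 *data, size_t nblocks);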

Reviewed-by: Ard Biesheuvel <ardb@kernel.org>
Link: https://lore.kernel.org/r/20250712232329.818226-14-ebiggers@kernel.org
Signed-off-by: Eric Biggers <ebiggers@kernel.org>
arch/x86/crypto/Kconfig
arch/x86/crypto/Makefile
arch/x86/crypto/sha1_ssse3_glue.c [deleted file]
lib/crypto/Kconfig
lib/crypto/Makefile
lib/crypto/x86/sha1-avx2-asm.S [moved from arch/x86/crypto/sha1_avx2_x86_64_asm.S with 98% similarity]
lib/crypto/x86/sha1-ni-asm.S [moved from arch/x86/crypto/sha1_ni_asm.S with 90% similarity]
lib/crypto/x86/sha1-ssse3-and-avx.S [moved from arch/x86/crypto/sha1_ssse3_asm.S with 97% similarity]
lib/crypto/x86/sha1.h [new file with mode: 0644]