git.ipfire.org Git - thirdparty/kernel/stable.git/commitdiff
crypto: x86/sha256-avx2 - add missing vzeroupper
author    Eric Biggers <ebiggers@google.com>
          Sat, 6 Apr 2024 00:26:09 +0000 (20:26 -0400)
committer Greg Kroah-Hartman <gregkh@linuxfoundation.org>
          Sun, 16 Jun 2024 11:32:02 +0000 (13:32 +0200)
[ Upstream commit 57ce8a4e162599cf9adafef1f29763160a8e5564 ]

Since sha256_transform_rorx() uses ymm registers, execute vzeroupper
before returning from it.  This is necessary to avoid reducing the
performance of SSE code.
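As a sketch of the general pattern (not the kernel routine itself; the
function name here is hypothetical), any x86-64 assembly function that
dirties the upper halves of the ymm registers should issue vzeroupper
before returning, so that later legacy-SSE code does not pay the
AVX-to-SSE transition penalty on affected Intel CPUs:

```asm
SYM_FUNC_START(example_avx2_fn)
	vmovdqu	(%rdi), %ymm0	# any 256-bit op dirties the upper 128 bits
	# ... AVX2 work using ymm registers ...
	vzeroupper		# zero upper halves before returning to SSE code
	RET
SYM_FUNC_END(example_avx2_fn)
```

This mirrors the one-line fix below: sha256_transform_rorx() restored
its callee-saved GPRs but returned without clearing the ymm upper state.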

Fixes: d34a460092d8 ("crypto: sha256 - Optimized sha256 x86_64 routine using AVX2's RORX instructions")
Signed-off-by: Eric Biggers <ebiggers@google.com>
Acked-by: Tim Chen <tim.c.chen@linux.intel.com>
Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
Signed-off-by: Sasha Levin <sashal@kernel.org>
arch/x86/crypto/sha256-avx2-asm.S

index 3439aaf4295d2b6383ba369a991b5c3dff177d45..81c8053152bb93090a0e7fce96c4562f7ef4bcb3 100644 (file)
@@ -711,6 +711,7 @@ done_hash:
        popq    %r13
        popq    %r12
        popq    %rbx
+       vzeroupper
        RET
 SYM_FUNC_END(sha256_transform_rorx)