git.ipfire.org Git - thirdparty/kernel/stable.git/commitdiff
crypto: x86/sha256-avx2 - add missing vzeroupper
author    Eric Biggers <ebiggers@google.com>
          Sat, 6 Apr 2024 00:26:09 +0000 (20:26 -0400)
committer Greg Kroah-Hartman <gregkh@linuxfoundation.org>
          Sun, 16 Jun 2024 11:39:17 +0000 (13:39 +0200)
[ Upstream commit 57ce8a4e162599cf9adafef1f29763160a8e5564 ]

Since sha256_transform_rorx() uses ymm registers, execute vzeroupper
before returning from it.  This is necessary to avoid reducing the
performance of SSE code.

Fixes: d34a460092d8 ("crypto: sha256 - Optimized sha256 x86_64 routine using AVX2's RORX instructions")
Signed-off-by: Eric Biggers <ebiggers@google.com>
Acked-by: Tim Chen <tim.c.chen@linux.intel.com>
Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
Signed-off-by: Sasha Levin <sashal@kernel.org>
diff --git a/arch/x86/crypto/sha256-avx2-asm.S b/arch/x86/crypto/sha256-avx2-asm.S
index 9bcdbc47b8b4beeee9f8caaf7a825cd09bea8311..f7d72877685598e8c90165b2f45f30911ba76fd7 100644
--- a/arch/x86/crypto/sha256-avx2-asm.S
+++ b/arch/x86/crypto/sha256-avx2-asm.S
@@ -710,6 +710,7 @@ done_hash:
        popq    %r13
        popq    %r12
        popq    %rbx
+       vzeroupper
        RET
 SYM_FUNC_END(sha256_transform_rorx)